10 GbE Ethernet, super slow speed


Solved by JorgeB


Hi everybody,

I have two 10 GbE Ethernet cards (2x Mellanox Technologies MT26448) plus a DAC cable connecting my computer to my Unraid server.

I ran iperf3 and the throughput is good enough for me, but when I try to copy any file from my PC to the Unraid server, it goes really slowly...

I'm trying to copy some .vdi disk images from my VirtualBox folder (~10-30 GB each); they are stored on a SATA3 SSD that performs pretty well...

The destination in Unraid is the RAID-1 SSD cache pool; both SSDs are connected to SATA3 ports on the motherboard (Supermicro X10SLL-F) and both are almost brand new (bought 3 weeks ago)...

I don't think the problem is on the Windows PC, as it has a Ryzen 3950X, 64 GB of RAM, and all drives are NVMe SSDs plus one SATA3 SSD.

 

As you can see here, the speed seems fine in the iperf3 tests (with the Kaspersky antivirus enabled)...

[screenshot: iperf3 results]

 

I already tried disabling the antivirus, but I can't understand why it runs at around 50 MB/s...

Initially the speed is quite good (~900 MB/s), but it immediately drops down to 35-50 MB/s...

[screenshot: file copy speed dropping during the transfer]

 

I'm also attaching the diagnostics zip file, maybe it helps!

endys-diagnostics-20221125-0149.zip

 

Thanks everybody! :)


Make sure that "flow control" is enabled on all cards (and on any switches in between). Windows has a bad habit of sending an initial burst, and if that isn't paced out properly it assumes the line is impaired somewhere along the way and slows down.

 

Also watch the error counters on the cards. Even "direct connection" cables are not immune to faults, and maybe you got one that isn't properly supported by your cards.

This kind of slow speed is usually the result of lost packets and the retransmissions they cause.
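On the Unraid side, both things (pause frames and error counters) can be checked with ethtool. A minimal sketch, assuming the interface is called eth0 (replace it with whatever `ip link` shows for your Mellanox card):

```shell
#!/bin/sh
# IFACE is a placeholder; substitute your 10 GbE interface from `ip link`
IFACE="${IFACE:-eth0}"

# Show whether RX/TX pause (flow control) frames are negotiated on the link
PAUSE="$(ethtool -a "$IFACE" 2>/dev/null || echo "could not query $IFACE")"
echo "$PAUSE"

# Dump NIC counters, keeping only error/drop/pause related ones;
# values that keep rising during a transfer point at a bad cable or DAC
ethtool -S "$IFACE" 2>/dev/null | grep -Ei 'err|drop|pause' || true
```

Run it before and during a large copy; the interesting part is whether the error/drop counters move, not their absolute values.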

 

iperf3 is a good tool, but retry it with much larger transfers to see whether a large amount of data can be sustained as well.
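For example, a longer parallel run exercises sustained throughput much better than the default 10-second single-stream test. A sketch, with the server address as a placeholder for your Unraid IP:

```shell
#!/bin/sh
# SERVER is a placeholder; use your Unraid server's actual IP address
SERVER="${SERVER:-192.168.1.100}"

if command -v iperf3 >/dev/null 2>&1; then
    # 60 seconds instead of the default 10, with four parallel streams,
    # which is closer to what a long bulk file copy looks like
    iperf3 -c "$SERVER" -t 60 -P 4 || true
    # -R reverses direction (server sends), to test the other path too
    iperf3 -c "$SERVER" -t 60 -P 4 -R || true
else
    echo "iperf3 not installed"
fi
```

If the short test is fast but the 60-second run sags, that also points toward pacing or buffering problems rather than raw link speed.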

 

1 hour ago, JorgeB said:

That might just be a device limitation; enable turbo write and transfer directly to the array, and you should get >100 MB/s.

Thank you for your help @JorgeB :)
Unfortunately, after enabling turbo write, the transfer speed is still the same...

Btw, I'm writing to the SSD cache pool, so why should turbo write improve speed in this case?

Thank you!

[screenshot: transfer speed after enabling turbo write]

1 minute ago, endystrike said:

Btw, I'm writing to the SSD cache pool, so why should turbo write improve speed in this case?

Because you are using Kingston SA400s, which are among the slowest SSDs available. Not all SSDs can sustain 500 MB/s writes; in fact most can't. Unfortunately I missed that your array also won't be very fast, since there's a 1TB disk, but I'm pretty sure you are seeing low speeds due to device limits. That's why it starts reasonably fast while the data is being cached to RAM, then slows down once it's limited by the device's write speed.

15 minutes ago, JorgeB said:

Because you are using Kingston SA400s, which are among the slowest SSDs available. Not all SSDs can sustain 500 MB/s writes; in fact most can't. Unfortunately I missed that your array also won't be very fast, since there's a 1TB disk, but I'm pretty sure you are seeing low speeds due to device limits. That's why it starts reasonably fast while the data is being cached to RAM, then slows down once it's limited by the device's write speed.

Well, it seems you're right: the SSD speed is trashy...

I did a test: I copied and pasted a VM image within the same cache drive...

After the first 2-3 minutes, it drops from 300 MB/s to 50-60 MB/s...

It seems I'll have to ask Amazon for a refund on the SSDs and get a different model...

[screenshot: copy speed within the cache drive]
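That drop pattern (fast for a couple of minutes, then a sharp fall) is typical of a DRAM-less drive like the SA400 exhausting its SLC write cache. A rough way to measure the sustained rate on the server itself, with the network out of the picture, is dd. The target path below is just an example; point it at a share that lives on the cache pool, and raise `count` until the reported rate stops falling:

```shell
#!/bin/sh
# TARGET is an example path; use a file on the cache pool to test the SSDs
TARGET="${TARGET:-/tmp/ddtest.bin}"

# Write 1 GiB of zeros; conv=fdatasync forces the data to the device
# before dd reports its rate, so RAM caching doesn't inflate the number
dd if=/dev/zero of="$TARGET" bs=1M count=1024 conv=fdatasync

rm -f "$TARGET"
```

If the rate dd reports at larger sizes matches the 50-60 MB/s you saw over the network, the drives really are the bottleneck.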


@ConnerVT thank you for that!

Unfortunately I don't have any M.2 slot on the motherboard, and my PCIe slots are full, so I can't add an M.2 drive to my build...

I have 3 PCIe slots on my Supermicro X10SLL-F:

  • 1x PCIe x8 in an x16 slot, where I put my Nvidia Quadro P620 for Plex
  • 1x PCIe x8 in an x8 slot, where I put the LSI SAS 9211-8i
  • 1x PCIe x4 in an x8 slot, where I put my Mellanox 10 GbE Ethernet card...

So the only solution for me was to use SATA SSDs, and I've now set the cache pool to RAID-0 to improve performance...

