Slow 10GbE Speeds


Guest


Hi, I just installed a 10 Gb/s NIC in both my Unraid server and my PC. Over normal gigabit Ethernet everything was fine, but over the direct 10GbE link between the server and my PC I'm only getting about 150 MB/s, even though the server has two SSDs in RAID 0 and my PC has an NVMe drive.

 

The hardware I'm using is:

Chelsio N320E-SR - Server

Mellanox ConnectX-2 - PC

Cisco SFP-H10GB-ACU7M - Cable

 

Looking in Windows settings and in Unraid, the link is reported as 10 Gb/s, and all the controller lights are flashing to indicate a link.
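On the Unraid side the negotiated speed can also be checked from a terminal. A minimal sketch, assuming the 10GbE interface shows up as eth0 (it may be eth1 or similar on your box):

```shell
# Print the negotiated speed/duplex for the assumed interface name.
IFACE=eth0
if command -v ethtool >/dev/null 2>&1 && ethtool "$IFACE" >/dev/null 2>&1; then
    ethtool "$IFACE" | grep -E 'Speed|Duplex|Link detected'
else
    echo "ethtool/$IFACE not available on this machine"
fi
```

A healthy link should show `Speed: 10000Mb/s` and `Duplex: Full`; anything lower means the link itself trained down, not just the file transfer.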

 

 

1 hour ago, 1812 said:

I feel like this is a windows problem.

 

 

You might consider swapping the 10GbE cards with one another.

OK, update: I unplugged and replugged the original cards and it seems to be working now, but the speeds are still very slow.

 

2 hours ago, MarkPla7z said:

I am writing to the SSD cache. I even created a cache-only share to test, but the speeds fluctuated between 50 MB/s and 300 MB/s: https://gyazo.com/ad1fa6269a09877737522da8de428154

 

What type of file(s) are these? Is it the same in both directions?

 

also verify on the dashboard that your connection is 10000 Mb/s

55 minutes ago, 1812 said:

 

What type of file(s) are these? Is it the same in both directions?

 

also verify on the dashboard that your connection is 10000 Mb/s

They are video files, and both my dashboard and Windows Explorer say 10000 Mb/s. Do you think the NIC could be thermal throttling? My PC is water-cooled, so not much air passes over the card.

 

14 hours ago, johnnie.black said:

Looking at the iperf results, it appears to me that there's something very wrong with the 10GbE network; you're not even close to getting 100 Mbit/s. That points to a hardware issue, so try different cables and different NICs. Testing with iperf should get you close to line speed.

I've tried putting both NICs in my computer and running iperf between them in a closed loop with two different cables, and both got about 4 Gbit/s, which is expected since my PC doesn't have enough PCIe lanes to run both at full speed. That means the NICs and the cables are fine; it's something to do with the connection between my PC and the server.
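For reference, a typical two-machine run looks like the sketch below (iperf3 is assumed here rather than old iperf2; the server IP is a placeholder). Note that 10 Gbit/s divided by 8 bits per byte gives the raw ceiling in MB/s:

```shell
# On the Unraid server:   iperf3 -s
# On the Windows PC:      iperf3 -c <server-ip> -t 30 -P 4
# Reverse direction:      iperf3 -c <server-ip> -t 30 -P 4 -R
# Raw line-rate ceiling for a 10 Gbit/s link, in MB/s:
echo $((10000 / 8))   # prints 1250
```

Real SMB file copies land somewhat below that 1250 MB/s figure because of protocol overhead, but they should be nowhere near as low as the numbers in this thread.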

8 minutes ago, johnnie.black said:

That doesn't make much sense; iperf is only getting 20 Mbit/s, and even if you don't have enough lanes you must have at least one, and one PCIe 2.0 lane is capable of around 3200 Mbit/s.

No no, you misunderstood. I took all the hardware out of the server, put it into my PC, connected the cables, and ran an iperf test: it got 4 Gbit/s, which is what it should get since each card didn't have all the PCIe lanes it needed. That means the problem is with the server, because now that I've put the other card back in the server the iperf test is back to 20 Mbit/s, and I've tried multiple cards in my server. Could it be my network settings? https://gyazo.com/da97c8950c79ef3909bfe777fccf0f2c
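One way to check the lane theory on the server itself is to compare each NIC's negotiated PCIe link against what the card supports. A sketch, assuming a Linux shell on the Unraid box:

```shell
# LnkCap = what the card supports, LnkSta = what it actually trained at;
# a card that trained at fewer lanes or a lower PCIe generation than its
# capability explains reduced (but not 20 Mbit/s low) iperf numbers.
if command -v lspci >/dev/null 2>&1; then
    lspci -vv 2>/dev/null | grep -E 'Ethernet controller|LnkCap:|LnkSta:' || true
else
    echo "lspci not available on this machine"
fi
# One PCIe 2.0 lane moves 500 MB/s before protocol overhead:
echo $((500 * 8))   # prints 4000 (Mbit/s raw; ~3200 usable, as noted above)
```

Running `lspci -vv` as root shows the full capability blocks; as a regular user some fields may be hidden.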


No, default settings should get speeds much faster than gigabit out of the box; using jumbo frames might improve things some more, but nothing else is needed. It might be a compatibility issue between the NIC and the server. Look for a BIOS update and try a different PCIe slot if available.

 

P.S. please upload pics to the forum directly, don't use external sites.
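If you do try jumbo frames, the MTU has to be raised on both ends (and on any switch in the path), otherwise large frames get silently dropped. A sketch for the Linux side, with the interface name an assumption:

```shell
IFACE=eth0
# Needs root and real hardware; falls through harmlessly otherwise.
ip link set dev "$IFACE" mtu 9000 2>/dev/null || echo "could not set MTU on $IFACE"
# Verify end-to-end with a do-not-fragment ping from the other machine;
# a 9000-byte MTU minus 28 bytes of IP+ICMP headers leaves this payload:
echo $((9000 - 28))   # prints 8972, i.e. ping -M do -s 8972 <server-ip>
```

On the Windows side the equivalent check is `ping -f -l 8972 <server-ip>`; if that fails while a normal ping works, some hop in the path is still at MTU 1500.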

14 minutes ago, johnnie.black said:

No, default settings should get speeds much faster than gigabit out of the box; using jumbo frames might improve things some more, but nothing else is needed. It might be a compatibility issue between the NIC and the server. Look for a BIOS update and try a different PCIe slot if available.

 

P.S. please upload pics to the forum directly, don't use external sites.

I'm going to try putting both NICs in the server and running an iperf test like that, but I don't know why my server would have any issues; it's a ProLiant DL360p G8. Just did the test, and for some reason I'm getting a bandwidth of 23.6 Gbit/s, so I really don't know what the issue is.
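One caveat that likely explains the 23.6 Gbit/s number: when both iperf endpoints live on the same host, Linux routes the traffic over the loopback device, so the NICs and the cable are never actually exercised. Forcing real wire traffic requires isolating one NIC in a network namespace; a sketch, where the interface names and addresses are assumptions:

```shell
# Same-host iperf without namespaces measures loopback, not the wire:
#   ip netns add nictest
#   ip link set eth1 netns nictest
#   ip netns exec nictest ip addr add 192.168.50.2/24 dev eth1
#   ip netns exec nictest ip link set eth1 up
#   ip addr add 192.168.50.1/24 dev eth0
#   ip netns exec nictest iperf3 -s &
#   iperf3 -c 192.168.50.2
MSG="same-host iperf without namespaces tests loopback, not the NICs"
echo "$MSG"
```

With the namespace in place, the iperf traffic has to leave eth0, cross the cable, and come back in through eth1, so the result reflects the actual cards.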

 

Just now, johnnie.black said:

And both have the same problem? Then there's something wrong with the server.

 

I have Mellanox on 5 Unraid servers working without any issues.

Do you know if it's an HP DL360p G8 problem? I see you have a thread on HP ProLiant problems, so I figured you would know.

1 hour ago, MarkPla7z said:

Do you know if it's an HP DL360p G8 problem? I see you have a thread on HP ProLiant problems, so I figured you would know.

Not an HP problem. I've used Solarflare and Mellanox cards hitting 500 MB/s.

 

If you want to test, install Windows bare metal on the server and check. That would be my next step.

Link to comment
Just now, 1812 said:

Not an HP problem. I've used Solarflare and Mellanox cards hitting 500 MB/s.

 

If you want to test, install Windows bare metal on the server and check. That would be my next step.

Then it's really weird. I've tested all the hardware, so could it be some Windows/Unraid issue?

 

