Guest Posted December 20, 2018

Hi, I just installed a 10Gb NIC in my Unraid server and in my PC. Over normal gigabit Ethernet everything was fine, but over the preferred 10GbE link between the server and my PC I'm only getting around 150 MB/s, even though the server has two SSDs in RAID 0 and my PC has an NVMe drive.

The hardware I'm using:
Chelsio N320E-SR - server
Mellanox ConnectX-2 - PC
Cisco SFP-H10GB-ACU7M - cable

Looking in Windows settings and in Unraid, the link is reported as 10Gbps and all the controller lights are flashing to indicate a link.
1812 Posted December 29, 2018

I feel like this is a Windows problem. You might consider swapping the two 10GbE cards with one another.
Guest Posted December 29, 2018

1 hour ago, 1812 said: I feel like this is a Windows problem. You might consider swapping the two 10GbE cards with one another.

OK, update: I unplugged and replugged the original cards and it seems to be working now, however the speeds are very slow.
1812 Posted December 29, 2018

37 minutes ago, MarkPla7z said: OK, update: I unplugged and replugged the original cards and it seems to be working now, however the speeds are very slow.

Are you writing to the array or to an SSD cache? Define "very slow".
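Before blaming the network at all, it is worth checking what the cache pool itself can sustain with a local write on the server, bypassing the network entirely. A minimal sketch, assuming the usual Unraid cache mount point of /mnt/cache (adjust the path if yours differs):

```shell
# Write 4 GiB straight to the cache pool on the server itself.
# oflag=direct bypasses the page cache so the figure reflects the SSDs,
# not RAM. dd prints the achieved throughput when it finishes.
dd if=/dev/zero of=/mnt/cache/speedtest.bin bs=1M count=4096 oflag=direct

# Clean up the test file afterwards.
rm /mnt/cache/speedtest.bin
```

If this local figure is already low, the network is not the bottleneck.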
Guest Posted December 29, 2018

4 minutes ago, 1812 said: Are you writing to the array or to an SSD cache? Define "very slow".

I am writing to the SSD cache. I even created a cache-only share to test, however the speeds were fluctuating between 50 MB/s and 300 MB/s. https://gyazo.com/ad1fa6269a09877737522da8de428154
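One thing worth keeping straight here: link speeds are quoted in megabits per second (Mb/s) while Windows Explorer shows megabytes per second (MB/s), a factor of 8 apart. A quick sanity check of the theoretical ceilings:

```python
# Link speeds (Mb/s, megabits) vs. file-copy speeds (MB/s, megabytes).
# 8 bits = 1 byte, so divide by 8 to get the transfer-rate ceiling.

def link_ceiling_mb_per_s(link_mbit: float) -> float:
    """Theoretical maximum file-transfer rate in MB/s for a link in Mbit/s."""
    return link_mbit / 8

print(link_ceiling_mb_per_s(1000))   # gigabit:  125.0 MB/s
print(link_ceiling_mb_per_s(10000))  # 10GbE:   1250.0 MB/s
```

So a 300 MB/s copy already exceeds what gigabit could ever do, but is still well short of the roughly 1250 MB/s ceiling of a healthy 10GbE link.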
Guest Posted December 29, 2018

Running iperf, I'm getting faster speeds over my 1Gbps cable than over my 10Gbps NICs. https://gyazo.com/9e914da7558f13e8d0bf70a948c646e7
1812 Posted December 29, 2018

2 hours ago, MarkPla7z said: I am writing to the SSD cache. I even created a cache-only share to test, however the speeds were fluctuating between 50 MB/s and 300 MB/s.

What type of file(s) are these? Is it the same in both directions? Also verify on the dashboard that your connection is 10000 Mb/s.
Guest Posted December 29, 2018

55 minutes ago, 1812 said: What type of file(s) are these? Is it the same in both directions? Also verify on the dashboard that your connection is 10000 Mb/s.

They are video files, and both the dashboard and Windows Explorer do say 10000 Mb/s. Do you think the NIC could be thermal throttling? My PC is watercooled, so not much air passes over it.
JorgeB Posted December 29, 2018

Looking at the iperf results, it appears there's something very wrong with the 10GbE network: you're not even close to getting 100Mbit/s. That points to an issue with the hardware, so try different cables and different NICs. Testing with iperf should get you close to line speed.
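For reference, a clean point-to-point iperf run looks something like the sketch below. It assumes iperf3 is installed on both ends (on Unraid it is commonly added via a plugin), and 192.168.1.10 is just a placeholder for the server's 10GbE address:

```shell
# On the Unraid server: start iperf3 in server mode.
iperf3 -s

# On the Windows PC: run a 30-second test with 4 parallel streams,
# which helps saturate a 10GbE link that a single stream may not fill.
iperf3 -c 192.168.1.10 -t 30 -P 4

# Repeat in reverse mode (-R) so the server sends and the PC receives,
# to check both directions of the link.
iperf3 -c 192.168.1.10 -t 30 -P 4 -R
```

On working hardware both directions should report somewhere near 9.4 Gbit/s; numbers in the tens of Mbit/s mean the problem is below the filesystem and SMB layers entirely.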
Guest Posted December 30, 2018

14 hours ago, johnnie.black said: Looking at the iperf results, it appears there's something very wrong with the 10GbE network: you're not even close to getting 100Mbit/s. That points to an issue with the hardware, so try different cables and different NICs. Testing with iperf should get you close to line speed.

I've tried putting both NICs in my computer and running iperf between them in a closed loop with two different cables, and both got 4Gbps, since my PC doesn't have enough PCIe lanes to run both at full speed. That means the NICs and the cables are fine; it's something to do with the connection between my PC and the server.
JorgeB Posted December 30, 2018

That doesn't make much sense. iperf is only getting 20Mbit/s, and even if you don't have enough lanes you must have at least one lane, and one PCIe 2.0 lane is capable of around 3200Mbit/s.
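The back-of-envelope arithmetic behind that figure: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, leaving 4 Gbit/s of raw payload per lane, and protocol overhead cuts usable throughput further. The 0.8 efficiency factor below is a rough assumption, not a spec value:

```python
# Rough usable bandwidth of a PCIe 2.0 link, per JorgeB's estimate.
# 5 GT/s per lane, 8b/10b encoding (8 payload bits per 10 line bits),
# times an assumed ~0.8 factor for packet/protocol overhead.

def pcie2_usable_gbit(lanes: int, efficiency: float = 0.8) -> float:
    raw_per_lane = 5.0 * 8 / 10   # 4.0 Gbit/s per lane after 8b/10b
    return lanes * raw_per_lane * efficiency

print(pcie2_usable_gbit(1))  # one lane: ~3.2 Gbit/s
print(pcie2_usable_gbit(4))  # x4 slot: ~12.8 Gbit/s, enough for 10GbE
```

Even a single starved lane is two orders of magnitude above the 20 Mbit/s being observed, which is why lane count can't explain the result.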
Guest Posted December 30, 2018

8 minutes ago, johnnie.black said: That doesn't make much sense. iperf is only getting 20Mbit/s, and even if you don't have enough lanes you must have at least one lane, and one PCIe 2.0 lane is capable of around 3200Mbit/s.

No no, you misunderstood. I took all the hardware out of the server, put it into my PC, connected the cables, and ran an iperf test: it got 4Gbps, which is what it should get since each card didn't have all the PCIe lanes it needed. That means the problem is with the server, because now that I've put the other card back in the server, the iperf test is back to 20Mbit/s, and I've tried multiple cards in the server. Could it be my network settings? https://gyazo.com/da97c8950c79ef3909bfe777fccf0f2c
JorgeB Posted December 30, 2018

Ah, OK. And if you put the NIC back in the server, iperf goes down to 20Mbit/s again? If yes, the server is the problem.
Guest Posted December 30, 2018

Just now, johnnie.black said: Ah, OK. And if you put the NIC back in the server, iperf goes down to 20Mbit/s again? If yes, the server is the problem.

Do you have any idea what may be causing it? These are my network settings: https://gyazo.com/da97c8950c79ef3909bfe777fccf0f2c
JorgeB Posted December 30, 2018

No, the default settings should give speeds much faster than gigabit out of the box; enabling jumbo frames might improve things a bit more, but nothing else is needed. It might be a compatibility issue between the NIC and the server. Look for a BIOS update, and try a different PCIe slot if one is available.

P.S. Please upload pictures to the forum directly, don't use external sites.
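The jumbo-frames suggestion, sketched out; the MTU has to be raised on both ends (and on any switch in between), otherwise large packets get dropped. eth0 is a placeholder interface name:

```shell
# On the Unraid server (also settable in the GUI under
# Settings > Network Settings > Desired MTU):
ip link set dev eth0 mtu 9000

# Verify the whole path end-to-end with a non-fragmenting ping.
# 8972 = 9000 minus the 28 bytes of IP + ICMP headers.
#   From Windows:  ping -f -l 8972 <server-ip>
#   From Linux:    ping -M do -s 8972 <server-ip>
```

If the large ping fails while a normal ping works, something on the path is still at MTU 1500.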
Guest Posted December 30, 2018

14 minutes ago, johnnie.black said: No, the default settings should give speeds much faster than gigabit out of the box; enabling jumbo frames might improve things a bit more, but nothing else is needed. It might be a compatibility issue between the NIC and the server. Look for a BIOS update, and try a different PCIe slot if one is available.

I'm going to try putting both NICs in the server and running an iperf test that way, but I don't know why my server would have any issues; it's a ProLiant DL360p Gen8. Just did the test, and for some reason I'm getting a bandwidth of 23.6Gbit/s, so I really don't know what the issue is.
Guest Posted December 30, 2018

And now it's back to normal: the first iperf reading is 20Mbps, the rest 0Mbps.
JorgeB Posted December 30, 2018

IMO your best bet would be to try a different brand of NIC.
Guest Posted December 30, 2018

Just now, johnnie.black said: IMO your best bet would be to try a different brand of NIC.

I have Mellanox and Chelsio, both of which have been said to work with Unraid.
JorgeB Posted December 30, 2018

And both have the same problem? Then there's something wrong with the server. I have Mellanox NICs in 5 Unraid servers working without any issues.
Guest Posted December 30, 2018

Just now, johnnie.black said: And both have the same problem? Then there's something wrong with the server. I have Mellanox NICs in 5 Unraid servers working without any issues.

Do you know if it's an HP DL360p Gen8 problem? I see you have a thread on HP ProLiant problems, so I figured you would know.
JorgeB Posted December 30, 2018

That's not my thread; I don't have any HP servers, I use Supermicro boards only.
Guest Posted December 30, 2018

Just now, johnnie.black said: That's not my thread; I don't have any HP servers, I use Supermicro boards only.

Oh sorry, that was the other guy in this thread.
1812 Posted December 30, 2018

1 hour ago, MarkPla7z said: Do you know if it's an HP DL360p Gen8 problem? I see you have a thread on HP ProLiant problems, so I figured you would know.

Not an HP problem. I've used Solarflare and Mellanox cards hitting 500MB/s. If you want to test, install Windows bare metal on the server and check. That would be my next step.
Guest Posted December 30, 2018

Just now, 1812 said: Not an HP problem. I've used Solarflare and Mellanox cards hitting 500MB/s. If you want to test, install Windows bare metal on the server and check. That would be my next step.

Then it's really weird. I've tested all the hardware, so could it be some Windows/Unraid issue?