10GbE Performance



I have two Unraid servers and just upgraded to a 10GbE/2.5GbE switch, which is fed by a DAC from my UDM-SE.

 

(Version: 6.10.0-rc2) AMD 3900X - 2.5GbE card (Cat6 to the server's 2.5GbE port)

(Version: 6.10.0-rc2) Dell R630 with 2x E5-2620 v3 - Intel X540 10GbE card (the server's 10GbE RJ45 port runs over Cat6 to an SFP+ to RJ45 transceiver in the switch)

 

Each server runs a fully updated Windows 10 VM. On the AMD server I can max out my internet connection at 1.4Gbps down on a speedtest, but on the Dell server I can't seem to get over 1Gbps, even though Unraid says the Dell is connected at 10GbE. I know the Dell is older equipment, but why the slowness?

 

I copied a 6GB ISO from my laptop, which is hooked up at 2.5GbE to the same switch, and was able to write to the AMD server at 280 MB/s (maxing out my NVMe). I did the same test with the Dell, which has a 2.5" SSD; I know it's not as fast as the NVMe, but I can only get 80 MB/s on the transfer.

 

Something is bottlenecking the Dell and I can't tell what it is. Any ideas on why the Dell server is slow?
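For what it's worth, 80 MB/s is only about 640 Mbps, so that transfer is not even saturating a gigabit link, let alone 2.5GbE. A quick way to double-check the negotiated link speed from the Unraid console, as a minimal sketch that assumes the 10GbE port shows up as eth2 (use whatever interface name Network Settings lists):

```
# Show the negotiated speed, duplex and link state of the 10GbE port.
# Replace eth2 with the interface name shown under Settings -> Network Settings.
ethtool eth2 | grep -E 'Speed|Duplex|Link detected'

# A healthy 10GbE link should report: Speed: 10000Mb/s, Duplex: Full, Link detected: yes
```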

 



I am having a similar issue. I have a Dell R730 and am only pulling 1.5Gbps. Try running iperf3 on both machines to rule out the drives. If you haven't used iperf3 before, Spaceinvader One has a video here: https://www.youtube.com/watch?v=DF5IOvitw4I. Let me know your readings. Maybe we can figure this out for both of us.
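If it helps, a minimal iperf3 run looks something like this (the IP address is a placeholder for the Dell's address); this tests raw network throughput with no disks involved:

```
# On the Dell (server side):
iperf3 -s

# On the laptop or the other server (client side); replace 192.168.1.50 with
# the Dell's IP. -t 30 runs for 30 seconds, -P 4 uses 4 parallel streams.
iperf3 -c 192.168.1.50 -t 30 -P 4

# Add -R to test the reverse direction (Dell -> client) as well:
iperf3 -c 192.168.1.50 -t 30 -P 4 -R
```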

On 3/5/2022 at 10:35 PM, emuhack said:

Something is bottlenecking the Dell and I can't tell what it is. Any ideas on why the Dell server is slow?

 

Some people experience speed degradation due to bonding.

Under Interface Rules, change your 10G NIC to become eth0 instead of eth2 (you need to restart after this).

Disable bonding and use the interface with bridging enabled.
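Before changing anything, it can help to see how the bond is currently set up from the console; a quick sketch, assuming the default bond name bond0:

```
# Show the active bonding mode and which NICs are enslaved to the bond
cat /proc/net/bonding/bond0

# One-line overview of all interfaces, their state and MAC addresses
ip -br link show
```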

 

5 hours ago, bonienl said:

 

Some people experience speed degradation due to bonding.

Under Interface Rules, change your 10G NIC to become eth0 instead of eth2 (you need to restart after this).

Disable bonding and use the interface with bridging enabled.

 

Is this what you are talking about?

[screenshot of the Network Settings page showing the bonding configuration]

1 hour ago, xxnumbxx said:

Yes, you are bonding your connections. Change the bonding setting to no, and change the bonding members to eth0.

Well, that was a hot mess; I had to blow away my network config on the flash drive four times.

 

I'm guessing that the card I got is either bad or the drivers are not loading correctly. In the list of ports, the MAC addresses are all the same for each port, yet at the bottom of the screen they all have individual ones. When I make the changes you recommended, the server boots with no IP and I can't access anything via the GUI. I have set it back the way it was, and when I get some money and time I will look at another card for the server. The weird thing, though, is that it does say connected at 10Gb on the main Unraid page.
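One note on the identical MACs: when ports are enslaved in a bond, Linux normally sets all slaves to the bond's MAC address, so matching MACs in the port list don't necessarily mean the card is bad. The burned-in address of each port can still be read with ethtool (interface names here are placeholders):

```
# Print the permanent (burned-in) MAC of each port, which stays unique
# even while bonding overrides the active MAC address.
ethtool -P eth0
ethtool -P eth2
```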


 

After work I will try iperf to see what the actual speeds are.

11 minutes ago, xxnumbxx said:

You need to change the adapter assignment for eth2 to eth0 so Unraid uses it. If you go to Settings -> Network Settings you should see the settings below. Change the NIC MAC for eth2 to eth0 and reboot. Unraid will then use your 10Gb connection.

 

[screenshot of the Interface Rules / NIC assignment section in Network Settings]

Here goes nothing!
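For anyone who prefers the command line: as far as I know, the Interface Rules assignments are written to network-rules.cfg under /boot/config on the flash drive. Assuming it uses the usual udev persistent-net rule format, a remapped entry would look roughly like this (the MAC addresses are placeholders):

```
# /boot/config/network-rules.cfg (illustrative only: MACs are placeholders)
# Binding the 10GbE port's MAC to eth0 makes Unraid treat it as the primary NIC.
SUBSYSTEM=="net", ACTION=="add", DRIVERS=="?*", ATTR{address}=="aa:bb:cc:dd:ee:01", NAME="eth0"
SUBSYSTEM=="net", ACTION=="add", DRIVERS=="?*", ATTR{address}=="aa:bb:cc:dd:ee:02", NAME="eth1"
```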


You, sir, deserve a beer (or a Red Bull if beer is not your thing, lol).

 

That worked! I ran a speedtest on the VM; it's not as fast as my newer server, but I expected that. BOOM!

 



FYI, it appears Interface Rules was removed in the 6.10.2 release, so you can no longer change to eth0.

 

Edit: Disregard; it appears to be a known bug that many are reporting now, so a fix should be coming. They did not remove it on purpose.

