Lousy SMB throughput on 10gig


I've been happy with Unraid for some time on gigabit Ethernet, with throughput saturating the gigabit connection. I recently bought a couple of 10G cards to try out in the server, and SMB has drastically slowed down (~250 Mbit/s). iperf3 shows full throughput to a Windows machine. The Windows machine has a 2.5G USB adapter connected over Ethernet to a 10G switch, and the server is connected to the switch via a DAC cable. The 10G cards in the server are a Mellanox ConnectX-2 and a QLogic Corp. cLOM8214, each in a true x8 PCIe slot.

Things I've tried:
Ran iperf3 tests (roughly the commands sketched below).
Removed the bonded connection, so the 10G interface is not bonded to anything else.
Set the MTU to 9000.
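
Roughly what that testing looks like from the Unraid console; the interface name (eth0) and the addresses below are placeholders rather than my exact setup:

# start an iperf3 server on Unraid, then run "iperf3 -c <unraid-ip>" and "iperf3 -c <unraid-ip> -R"
# from the Windows client to test both directions
iperf3 -s

# confirm the MTU change actually took effect on the 10G interface
ip link show eth0 | grep mtu

# check that jumbo frames pass end to end without fragmenting (8972 = 9000 minus 28 bytes of IP/ICMP headers)
ping -M do -s 8972 <client-ip>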

 

Is this a configuration error, a hardware issue, or a bug?


It is certainly possible to get near 10G link speed. See results below.

 

Copying a 20G file from Unraid (cache pool) to workstation (NVMe disk) using SMB:

[screenshot: transfer speed]

 

And this is copying a 20G file from workstation (NVMe disk) to Unraid (cache pool) using SMB:

[screenshot: transfer speed]
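
For what it's worth, a 20G test file like this is easy to generate on the cache pool; the share path below is just an example:

# instantly allocate a 20 GiB file of zeros (XFS/btrfs; if your pool's filesystem doesn't support fallocate, use the dd line instead)
fallocate -l 20G /mnt/cache/testshare/test20G.bin
# dd if=/dev/zero of=/mnt/cache/testshare/test20G.bin bs=1M count=20480 status=progress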

 

How are you testing?

Keep in mind that both source and destination must be fast enough to fully load the link.
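
If you want to rule the disks in or out on the Unraid side, a quick sequential test shows what the cache pool itself can sustain; the path is just an example, and zeros will give inflated numbers on a pool with compression enabled:

# sequential write with direct I/O (bypasses the page cache)
dd if=/dev/zero of=/mnt/cache/testshare/ddtest.bin bs=1M count=20480 oflag=direct status=progress

# sequential read back, also with direct I/O
dd if=/mnt/cache/testshare/ddtest.bin of=/dev/null bs=1M iflag=direct status=progress

# clean up
rm /mnt/cache/testshare/ddtest.bin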
