Slower Network Write Speed to NVMe Cache Disk



I recently switched my cache drive from an older 250GB Crucial SSD to a 500GB Samsung 960 EVO NVMe SSD using a PCIe-to-NVMe adapter. The weird thing is, even though the NVMe drive is faster, I'm seeing slower speeds from it, particularly when I transfer files to it over my gigabit hard-wired network. With the SATA SSD I could saturate my network every time, transferring at 113MB/s both read and write. But with the NVMe SSD the fastest I can write to it over the network is 84MB/s, while I still get 113MB/s reads from it, so I know it's not the network. Here is the hardware I'm using:

 

ASRock 990FX Extreme9 w/ latest firmware - I'm using PCIe slot #1, which is a PCIe x16 slot

This PCIe-to-NVMe adapter - which is capable of 1500+MB/s writes

Samsung 960 EVO NVMe SSD - which can easily max out the adapter

 

I enabled jumbo frames in Unraid's settings but that didn't make a difference. Any suggestions would be highly appreciated. Thank you!

 


It's probably useful to attach Diagnostics (Tools -> Diagnostics -> attach zip file).

 

Have you done an actual network-only test (e.g. iperf) and/or a storage-only test (e.g. the DiskSpeed docker, or even dd / rsync)?
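For the storage-only side, something like this from the Unraid console gives a rough sequential-write number. This is just a sketch: I'm assuming your cache pool mounts at /mnt/cache and the file name is arbitrary; oflag=direct bypasses RAM caching so you measure the SSD itself.

dd if=/dev/zero of=/mnt/cache/ddtest.bin bs=1M count=4096 oflag=direct   # ~4GB direct sequential write, reports MB/s when done
rm /mnt/cache/ddtest.bin                                                 # clean up the test file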

 

Also, as a side note, you've sort of misunderstood how the adapter works.

Given that your mobo is PCIe 2.0, the M.2 device is an x4 device, and the adapter itself is basically just rewiring, your M.2 will only run at PCIe 2.0 x4 speed, i.e. a theoretical max of 2GB/s. All those claimed speed numbers on the listing are meaningless.
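Concretely, the 2GB/s ceiling is just lane arithmetic: PCIe 2.0 signals at 5 GT/s per lane with 8b/10b encoding, so only 8 of every 10 bits on the wire are data:

\[ 4\ \text{lanes} \times 5\,\mathrm{GT/s} \times \tfrac{8}{10} = 16\,\mathrm{Gbit/s} = 2\,\mathrm{GB/s} \]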

Not that it should have any impact on your 84MB/s discussion here, but I thought I'd mention it.


Watch the SpaceInvader One video on iperf testing. Without an actual test, you can't completely rule out network issues.

 

With the 84MB/s, what sort of data are you copying over? Did you trim it? What's the reported temperature? Are you sure it's being written to cache and not to the array?
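For reference, a basic iperf3 run boils down to two commands; <unraid-ip> here is a placeholder for your server's address:

iperf3 -s                  # on the Unraid server, from the console or SSH
iperf3 -c <unraid-ip>      # on the Windows client, from the folder containing iperf3.exe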


I transfer mostly video files, ranging from 1GB to 60GB, and they always max out at 84MB/s now. The screenshot shows a transfer I just did. As you can see, it hits 84MB/s right away and sits there, almost like there's a bottleneck somewhere. I am positive it's going to the cache drive. I monitored the cache drive temp in Unraid during the transfer and it stayed at 88°F the whole time. I have the SSD TRIM app installed and set to run every 4 hours.
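As far as I know the TRIM app just schedules the standard Linux trim command, so for good measure I can also run it by hand from the console (assuming the cache mounts at /mnt/cache):

fstrim -v /mnt/cache   # -v reports how much space was trimmed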

 

I'll watch that video and do the tests, but it's highly unlikely to be a network issue when I've been doing hardwired transfers to this server at 113MB/s for over 5 years now, and the ONLY thing that changed was switching my SSD cache drive from SATA to NVMe.

[Screenshot attached: Screenshot_20200330-114246_Remote Desktop.jpg]


Finally had some time to watch the video and run iperf, and shockingly it is the network causing the slowdown.

C:\iperf3>iperf3 -c 192.168.1.3
Connecting to host 192.168.1.3, port 5201
[  4] local 192.168.1.2 port 58770 connected to 192.168.1.3 port 5201
[ ID] Interval           Transfer     Bandwidth
[  4]   0.00-1.00   sec  84.0 MBytes   704 Mbits/sec
[  4]   1.00-2.00   sec  83.9 MBytes   704 Mbits/sec
[  4]   2.00-3.00   sec  84.0 MBytes   705 Mbits/sec
[  4]   3.00-4.00   sec  83.9 MBytes   704 Mbits/sec
[  4]   4.00-5.00   sec  83.8 MBytes   703 Mbits/sec
[  4]   5.00-6.00   sec  84.0 MBytes   704 Mbits/sec
[  4]   6.00-7.00   sec  84.0 MBytes   704 Mbits/sec
[  4]   7.00-8.00   sec  84.0 MBytes   704 Mbits/sec
[  4]   8.00-9.00   sec  83.9 MBytes   704 Mbits/sec
[  4]   9.00-10.00  sec  83.5 MBytes   700 Mbits/sec
- - - - - - - - - - - - - - - - - - - - - - - - -
[ ID] Interval           Transfer     Bandwidth
[  4]   0.00-10.00  sec   839 MBytes   704 Mbits/sec                  sender
[  4]   0.00-10.00  sec   839 MBytes   704 Mbits/sec                  receiver

iperf Done.

It just makes no sense to me, since literally nothing about my network has changed since switching from the SATA to the NVMe SSD. But the numbers line up: iperf is moving 84 MBytes per second, which is exactly the 84MB/s wall my transfers hit.


Aaaand I solved it. I decided to make sure I had the latest NIC driver installed on my Windows 10 source machine, and sure enough, after installing it I am now transferring at 113MB/s to my Unraid server again.

 

Thank you everyone for helping me troubleshoot and leading me down the right path to fix the issue!

