JonathanM Posted February 2, 2019

9 minutes ago, squirrelslikenuts said:
Bare metal Windows 7 is what all my testing is being done with.

Sorry, I was going by what you posted:

4 hours ago, squirrelslikenuts said:
A Windows VM on one machine (on the NVMe cache drive) writing to the other server's SMB NVMe share (or SSD share) reports over 100 MB/s transfer. The same Windows VM copying data from the other server's SMB NVMe share maxes out at 65 MB/s.
JorgeB Posted February 3, 2019

I'll do some more testing when I have the time, but I did a first test today expecting full gigabit reads and writes, which should be between 110 and 114 MB/s, and I'm definitely not getting that on reads. It's not the hardware, as I can get much faster read speeds with 10GbE. Different hardware can make a bigger difference, as the OP is seeing, but there really does appear to be some issue with read speed over SMB, at least in some cases. And it's not the first time: some releases ago I could only get fast speeds with some of my servers by forcing SMB to 2.02, though that doesn't appear to help now.
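For reference, pinning the SMB dialect on the server side is done through Samba's `server max protocol` option; on Unraid this typically goes into the "SMB Extras" box (written to `/boot/config/smb-extra.conf`). This is only a sketch of the workaround described above, and the exact UI location may differ by release:

```ini
; Cap the negotiated SMB dialect at 2.02 for all clients.
; On Unraid: Settings > SMB > SMB Extras (smb-extra.conf).
[global]
    server max protocol = SMB2_02
```

Clients that have already cached a higher dialect may need to disconnect and remap the share before the change takes effect.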
limetech Posted February 3, 2019

1 hour ago, johnnie.black said:
...it does really appear that there's some issue with the read speed over SMB, at least in some cases...

Is this with the Intel ixgbe driver?
squirrelslikenuts Posted February 3, 2019 (Author)

SOLVED

I've made progress. I found an Acer i5-650 system in the basement with 8 GB of RAM and threw Windows 10 on an SSD into it. After applying all the Windows updates and adding an Intel PCIe network card, I was able to achieve this, with no changes to the unRAID server. Tested on unRAID 6.6.6.

Previous tests were with higher-end hardware, but a Windows 7 system with the onboard NIC. No magic config: a fresh install of Windows 10 and an Intel PCIe network card. That's it. I will test with the onboard NIC in that system and report back.

First pic is WRITES TO the unRAID server. Second pic is READS FROM the unRAID server.

Kinks worked out, I'm ready to buy.
squirrelslikenuts Posted February 3, 2019 (Author)

5 hours ago, jonathanm said:
Sorry, I was going by what you posted.

Sorry, that was the last test I had done, but my main original post was tested from bare metal Windows 7.
JorgeB Posted February 3, 2019

7 hours ago, limetech said:
Is this with the Intel ixgbe driver?

No, my 10GbE NICs are Mellanox and my gigabit NICs are Intel, though apparently using an Intel NIC fixed the issue for the OP.
Marshalleq Posted February 3, 2019

So when you say solved, do you know what was causing it? Or did you just prove Unraid can perform with the right client? If the former, I'm keen to understand. Thanks.
squirrelslikenuts Posted February 3, 2019 (Author)

5 hours ago, Marshalleq said:
So when you say solved, you know what was causing it? Or you just proved Unraid can perform with the right client? If the former, I'm keen to understand. Thanks.

Unfortunately, no. And it pisses me off. The client (i7-3820, Asus Maximus motherboard, 32 GB RAM, Intel SSD boot drive, all WD Black drives) was a Windows 7 system. It has served me well for 6 years (since the last re-install), and it can WRITE to various servers (Ubuntu, FreeNAS and unRAID) at over 100 MB/s. When reading from the arrays, it would max out at 65 MB/s like clockwork, across 3 different server OSes (with a slight bump in speed reading from Ubuntu).

I changed 4 variables (yes, I know that's bad lol) at once to get a solid 112 MB/s R/W speed:

- Different hardware (lower-power Acer prebuilt, i5/8GB/120GB SSD)
- Different OS: Windows 10 (albeit fully reinstalled and "fresh")
- Different network card
- Different port/cable on the switch

I will not dedicate more than 1 more hour to tracking down what went wrong, as I was looking for an excuse to upgrade to Windows 10 and take advantage of installing natively (without kludges) on an NVMe boot drive.

The offending client that was capped at 65 MB/s read speeds from unRAID (network RX) was using an onboard Intel 82579V gigabit network adapter. I'm unaware if this has known issues with unRAID, but the server shouldn't care what chipset of card is on the other end as long as it can handle GbE.

My goal was to get full speed from unRAID, and I have. If that requires a different network card or a different OS, so be it.
limetech Posted February 3, 2019

2 hours ago, squirrelslikenuts said:
the server shouldn't care what chipset of card is on the other end as long as it can handle GbE

Correct. If you google "windows network throttling" you'll find many ways Windows (the client) could be responsible for this.
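One commonly cited client-side knob is Windows' multimedia network throttling (`NetworkThrottlingIndex`), which by default limits non-multimedia network traffic while multimedia tasks run. As a sketch only (this is a well-known registry value, but it was never confirmed as the cause in this thread), disabling it looks like:

```ini
Windows Registry Editor Version 5.00

; Disable multimedia-class network throttling entirely.
; The default value is 10 (packets per millisecond); 0xffffffff turns it off.
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Multimedia\SystemProfile]
"NetworkThrottlingIndex"=dword:ffffffff
```

A reboot is generally needed for the change to take effect, and as with any registry edit, back up the key first.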
squirrelslikenuts Posted February 3, 2019 (Author)

19 minutes ago, limetech said:
Correct. If you google 'windows network throttling' there are many ways windows (the client) could be responsible for this.

I just found it interesting that it would throttle to exactly 65 MB/s, EVERY TIME.
limetech Posted February 3, 2019

26 minutes ago, squirrelslikenuts said:
I just found it interesting that it would throttle to 65 MB/s exactly, EVERY TIME.

That seems tell-tale of an imposed speed limit.
squirrelslikenuts Posted February 3, 2019 (Author)

2 hours ago, limetech said:
Seems tell-tale of an imposed speed limit.

I also find it interesting that I can squeeze out a few more Mbit if I use FTP instead of SMB. I assume this is due to Samba overhead?
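For a rough sense of where the gigabit ceiling comes from, a back-of-the-envelope calculation (assuming standard 1500-byte MTU frames and plain IPv4/TCP headers, before any SMB or FTP framing on top) gives the familiar upper bound that the 110-114 MB/s figures in this thread sit just below:

```python
# Estimate the best-case TCP payload rate on gigabit Ethernet.
# This bounds what any protocol (SMB, FTP) can achieve; SMB's own
# framing shaves a little more off than FTP's, hence the small gap.
LINE_RATE = 1_000_000_000   # bits/s on the wire

MTU = 1500                  # bytes of IP payload per Ethernet frame
ETH_OVERHEAD = 38           # preamble 8 + header 14 + FCS 4 + inter-frame gap 12
IP_TCP_HEADERS = 40         # IPv4 (20) + TCP (20), no options

wire_bytes = MTU + ETH_OVERHEAD       # 1538 bytes on the wire per frame
payload = MTU - IP_TCP_HEADERS        # 1460 bytes of TCP payload per frame

tcp_rate = LINE_RATE / 8 * payload / wire_bytes
print(f"Max TCP payload rate: {tcp_rate / 1e6:.1f} MB/s")  # ~118.7 MB/s
```

Anything the application-layer protocol adds (SMB headers, signing, FTP control traffic) comes out of that ~118.7 MB/s budget, which is why real-world SMB transfers top out around 110-114 MB/s.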
Marshalleq Posted February 4, 2019

You can see in the comments of the article below that you're not the only one. Some of it relates to read speed, though the case is different in that it's not Windows. Changing to NFS sometimes helps, but in all cases the Mac is slower than Windows or Linux. It seems to point to the open-source implementations of the protocols behaving differently, and Unraid uses those. Just food for thought.

https://www.tech-knowhow.com/2017/01/mac-os-network-transfer-speed-still-broken-sierra/
JorgeB Posted February 4, 2019

4 minutes ago, Marshalleq said:
You can see in the comments of the below article, you're not the only one

Yeah, this happens with various OSes. I saw the same with my FreeNAS server when I had it: SMB reads are noticeably slower than writes over 10GbE, and I see the same with Unraid. I never really worried about it since it's fast enough, and it looks like it's a Samba "feature", at least on some hardware configs.
Marshalleq Posted February 4, 2019

It was also a problem with AFP. Of course, running Windows on that same hardware was fine. And apparently it's been getting worse with each iteration of macOS.
Joseph Posted September 22, 2019

On 2/3/2019 at 2:30 AM, johnnie.black said:
No, my 10GbE NICs are Mellanox, gigabit are Intel, though apparently using an Intel NIC fixed the issue for the OP.

I've been fighting slow network/transfer speeds for a while and I'm beginning to think it's due to the onboard NICs on both unRAID boxes. Can someone recommend a solid Intel dual-port 1GbE card I should try? I apologize if this is the wrong thread for this type of post, but it seemed like the right topic. Thanks!