theunraidhomeuser Posted May 24, 2021 (edited)

Setup:
- 8 x 14 TB Seagate Exos, 1 of which is a parity drive
- 1 Samsung 970 Pro 1 TB cache SSD
- 1 PCIe SATA controller
- Ryzen 5 3600 CPU
- MSI ATX 490 motherboard
- 64 GB DDR4 RAM

Hi there,

I occasionally move large files to the NAS, and the cache drive fills up because the mover can't move the files to the HDDs quickly enough. I noticed that the mover only writes to one disk at a time while the other disks sit idle. Wouldn't it be fair to assume that if the faster SSD wrote to multiple disks at the same time, the mover would be far more efficient? I feel I'm missing the benefit of having 8 drives spinning, and the I/O bottleneck is quite annoying, as it always interrupts my file operations when the cache drive fills up too quickly.

I'm on a 1 Gbit/s LAN, so I move files in to the NAS at a reasonable 100 MB/s, but I feel the SSD only writes at about half that speed to a single hard drive... Has anybody else had this issue and found a solution? I use the onboard SATA for 4 drives and a PCIe SATA controller for the other 4. Usually everything works fine, and the mover speed is pretty much the same on the motherboard SATA controller and the PCIe one.

Is this a performance limitation by design? I use 1 parity drive to keep things in sync; is that maybe the reason why only one hard drive can be written to at any time? Either way it's really frustrating.

Thanks folks, appreciate the help as always!

Edited May 24, 2021 by theunraidhomeuser
correction
JorgeB Posted May 24, 2021

4 hours ago, theunraidhomeuser said:
Would it not be fair to assume that if the faster SSD were to write to multiple disks at the same time, the mover would be far more efficient?

No, it would be slower, since parity would need to be updated concurrently for all of them.

4 hours ago, theunraidhomeuser said:
I'm on a 1Gbit/s LAN so moving files IN to the NAS at reasonable 100 MB/s and I feel the SSD only writes at about half that speed to a single hard drive... Has anybody else had that issue and found a solution?

That's about the max speed you can get with gigabit. For large writes your array disks with turbo write enabled should be able to keep up with gigabit, so it's better to just transfer directly to the array.
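For context on why parity serializes writes: with the default read-modify-write method, every write costs a read plus a write on both the data disk and the parity disk, and all streams contend for the single parity drive. "Turbo write" (reconstruct write) instead reads the other data disks and writes data + parity in one pass. The supported way to switch is Settings → Disk Settings; as a sketch, the same tunable can be poked from the console (assuming the usual `mdcmd` path on Unraid):

```shell
# Reconstruct write ("turbo write"): read the OTHER data disks, then write
# data + parity in one pass, instead of read-modify-write on two disks.
# Requires all array disks to be spun up.
# 1 = reconstruct write, 0 = read/modify/write
/usr/local/sbin/mdcmd set md_write_method 1

# Check the current value:
/usr/local/sbin/mdcmd status | grep md_write_method
```

The setting does not persist across reboots unless changed in the GUI, so treat the CLI form as a quick experiment rather than permanent configuration.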
theunraidhomeuser Posted May 24, 2021 Author

1 hour ago, JorgeB said:
For large writes your array disks with turbo write enabled should be able to keep up with gigabit, so it's better to just transfer directly to the array.

That's a great idea, I didn't realize the disks could cope with that 1 Gbit influx... will try and report back.
6of6 Posted May 26, 2021

I only use cache for "download/process" stuff. Anything that "should" be saved to the array goes directly to the array. I haven't worked on hard drives in a long time, but the "Turbo Write" algorithm... well... it makes as much sense as UnRAID itself. 6.
theunraidhomeuser Posted May 26, 2021 Author

Thanks folks. I've had turbo write enabled since the very first day, but it's still not a big win. Disabling the cache will solve the I/O bottleneck but will potentially still be slower than the gigabit LAN connection, as the write speeds are somewhat slow and, again, go to just one disk at a time. I really didn't think about this impact of the parity drive at the beginning, as I thought the array would be able to cope with data being written to multiple disks at a time. So I guess I'll just test without cache for now. Thanks everyone.
JorgeB Posted May 26, 2021

2 hours ago, theunraidhomeuser said:
as the write speeds are somewhat slow and again just to one disk at a time…

Your disks should be fast enough for gigabit to be the bottleneck when transferring with turbo write; I can transfer to one of my arrays at 200 MB/s+ sustained with disks slower than yours.
theunraidhomeuser Posted May 26, 2021 Author

I don't get anywhere close to those speeds... hm, I need to troubleshoot over the weekend. It can't even be competing drives, as only one drive at a time (2 with parity) is written to. My drives are getting a bit warm, so I'll try a different case to see if that helps. Thanks!
JorgeB Posted May 26, 2021

Note that with gigabit you can't get more than around 115 MB/s. Diags saved during a transfer might show something; it's also a good idea to run a single-stream iperf test to check network bandwidth.
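A single-stream iperf3 run looks like the sketch below, along with the back-of-envelope math behind the ~115 MB/s figure (the ~8% protocol-overhead factor is a rough assumption, not a measurement):

```shell
# Single-stream throughput test (assumes iperf3 on both ends; <server-ip> is a placeholder):
#   on the Unraid server:  iperf3 -s
#   on the client:         iperf3 -c <server-ip> -P 1 -t 30

# Gigabit ceiling: 1000 Mbit/s over 8 bits per byte, minus ~8% TCP/IP + SMB overhead:
awk 'BEGIN { printf "raw: %.0f MB/s, effective: ~%.0f MB/s\n", 1000/8, 1000*0.92/8 }'
```

If the iperf number comes in well under ~940 Mbit/s on a single stream, the network (cabling, switch, NIC offload settings) is worth chasing before the disks.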
theunraidhomeuser Posted May 26, 2021 Author

1 hour ago, JorgeB said:
Note that with gigabit you can't get more than around 115MB/s, diags saved during a transfer might show something, also good idea to run a single stream iperf test to check network bandwidth.

Fully aware, but I see speeds of 50 MB/s...
JorgeB Posted May 26, 2021

19 minutes ago, theunraidhomeuser said:
Fully aware, but I see speeds of 50 MB/s..

Those would be normal with turbo write disabled.
theunraidhomeuser Posted May 26, 2021 Author

17 minutes ago, JorgeB said:
Those would be normal with turbo write disabled.

Agreed, but I see this WITH turbo write enabled...
JorgeB Posted May 26, 2021

Then:

1 hour ago, JorgeB said:
also good idea to run a single stream iperf test to check network bandwidth.
theunraidhomeuser Posted May 26, 2021 Author

3 hours ago, JorgeB said:
Then

iperf yields 933 Mbit/s, which is in line with expectations. I think the next bottleneck would be the SATA controller or BIOS settings (MSI board).
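With iperf clearing the network, one way to isolate the disk/controller path is a local sequential write straight to an array disk, bypassing SMB entirely. A sketch (on the server you would point `of=` at something like `/mnt/disk1/`; `/tmp` is used here only as a stand-in path so the commands run anywhere):

```shell
# Local sequential write test: takes the network and SMB out of the equation.
# On the real server, target an array disk mount, e.g. of=/mnt/disk1/testfile.
dd if=/dev/zero of=/tmp/unraid_write_test bs=1M count=256 conv=fsync 2>&1 | tail -n1

# Clean up the test file afterwards.
rm -f /tmp/unraid_write_test
```

The last line of dd's output reports the sustained rate; running it once against a disk on the onboard SATA ports and once against a disk on the PCIe card would show whether the controller is the bottleneck.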
theunraidhomeuser Posted July 21, 2021 Author Share Posted July 21, 2021 On 5/26/2021 at 12:27 PM, JorgeB said: Note that with gigabit you can't get more than around 115MB/s, diags saved during a transfer might show something, also good idea to run a single stream iperf test to check network bandwidth. yes, I'm aware, I'm talking about speeds like 30 MB/s.. so far below that amount. I'd be happy if I managed sustained 110MB/s Quote Link to comment