Yousty

  1. I actually figured out how to do it on my own and it's super easy. Figured I'd post the solution here in case anyone else finds this thread via Googling like I did. I'm assuming everyone is following the guide on OmerTu's GitHub and that you're at step B.3. Just create a new Docker container and add the following port and variables to it.
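     As a rough illustration only, here is what a container along those lines looks like as a docker run command. The image name, the 8099 port, and every variable name and value below are placeholders I've filled in for the sketch; take the exact port and variables from step B.3 of OmerTu's guide.

       # Hypothetical docker run equivalent of an Unraid container template.
       # All names and values here are placeholders -- copy the real port
       # and variables from the guide on OmerTu's GitHub.
       docker run -d \
         --name=googlehomekodi \
         -p 8099:8099 \
         -e AUTH_TOKEN='choose-a-token' \
         -e KODI_IP='192.168.1.50' \
         -e KODI_PORT='8080' \
         -e KODI_USER='kodi' \
         -e KODI_PASSWORD='secret' \
         --restart=unless-stopped \
         omertu/googlehomekodi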
  2. Did you ever get this figured out, sonofdbn? I would like to set it up as well, but I'm not sure where to begin.
  3. Thank you!!! I updated my AMC input and output directories to use storage, as shown in the picture, and move now works!
  4. I'm hoping somebody can help. I have FileBot set up and working perfectly. I download files from my seedbox to a temp folder (mnt/user/temp) and then have AMC rename and move the files to where they belong (mnt/user/Movies or TV Shows), because I don't need the files in the temp directory after they're renamed/moved. However, I've noticed that the move command requires reading and writing the entirety of the file, which seems odd to me since it's staying on the cache drive (until Unraid's mover runs and moves the files to the array). Usually it's not that big of a deal, but a lot of times…
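     (For what it's worth, the full read/write is expected with that layout: a move between two /mnt/user paths goes through Unraid's user-share FUSE layer, which turns the move into a copy-and-delete even when both paths live on the same cache drive. Pointing the input and output at the same underlying mount, as worked out above, lets it become a plain rename.) For reference, a minimal sketch of the kind of AMC call being described; the paths and format strings are illustrative, not taken from this thread:

       # Hedged example of a FileBot AMC invocation along these lines.
       # Paths and --def formats are placeholders; --action move is the
       # behavior under discussion.
       filebot -script fn:amc \
         --action move \
         -non-strict \
         --output "/mnt/user" \
         --def movieFormat="Movies/{n} ({y})/{n} ({y})" \
               seriesFormat="TV Shows/{n}/Season {s}/{n} - {s00e00} - {t}" \
         "/mnt/user/temp"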
  5. Aaaand I solved it. Decided to make sure I had the latest NIC software installed on my Windows 10 source machine, and sure enough, after installing it I am now transferring at 113MB/s to my Unraid server. Thank you everyone for helping me troubleshoot and leading me down the right path to fix the issue!
  6. Finally had some time to watch the video and run iperf, and shockingly it is the network causing the slowdown.

       C:\iperf3>iperf3 -c 192.168.1.3
       Connecting to host 192.168.1.3, port 5201
       [  4] local 192.168.1.2 port 58770 connected to 192.168.1.3 port 5201
       [ ID] Interval           Transfer     Bandwidth
       [  4]   0.00-1.00   sec  84.0 MBytes   704 Mbits/sec
       [  4]   1.00-2.00   sec  83.9 MBytes   704 Mbits/sec
       [  4]   2.00-3.00   sec  84.0 MBytes   705 Mbits/sec
       [  4]   3.00-4.00   sec  83.9 MBytes   704 Mbits/sec
       [  4]   4.00-5.00   sec  83.8 MBytes   703 Mbits/sec
       [  4]   5.00-6.00   sec  8…
  7. I transfer mostly video files, ranging from 1GB to 60GB, and they always max out at 84MB/s now. The screenshot shows a transfer I just did. As you can see, it hits 84MB/s right away and sits there, almost like there's a bottleneck somewhere. I am positive it's going to the cache drive. I monitored the cache drive temp in Unraid during the transfer and it stayed at 88°F the whole time. I have the SSD TRIM app installed and set to run every 4 hours. I'll watch that video and do the tests, but it's highly unlikely it's a network issue when I've been doing hardwired transfers to this server…
  8. I have attached my Diagnostics report. Yes, I ran the DiskSpeed docker, but as far as I can tell it only benchmarks read speed, which it measured at 1,502MB/s throughout the test. I'm not terribly familiar with iperf, so I'm unsure what to run, but I figured my network wasn't the issue since I always maxed out the network with my previous SSD. nas-diagnostics-20200330-1001.zip
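     For anyone else unsure what to run: a basic iperf3 throughput test between the desktop and the server looks like the sketch below. The IP is an example; -s, -c, and -R are standard iperf3 options.

       # On the Unraid server: listen for test connections.
       iperf3 -s
       # On the Windows machine: push TCP traffic at the server
       # (replace 192.168.1.3 with the server's address).
       iperf3 -c 192.168.1.3
       # Same test in the reverse direction (server -> client).
       iperf3 -c 192.168.1.3 -R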
  9. I recently switched my cache drive from an older 250GB Crucial SSD to a 500GB Samsung 960 EVO NVMe SSD using a PCIe-to-NVMe adapter. The weird thing is, even though the NVMe drive is faster, I'm seeing slower speeds from it, particularly when I transfer files to it over my gigabit hard-wired network. With the SATA SSD I could saturate the network every time, transferring at 113MB/s for both reads and writes. But with the NVMe SSD the fastest I can write to it over the network is 84MB/s, while I still get 113MB/s reads from it, so I know it's not the network. Here is the hardware I'm using: …
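     One way to separate the drive from the network in a case like this is to benchmark writes locally from the Unraid console; a sketch, with a placeholder file path (oflag=direct bypasses the page cache so the reported rate reflects the drive itself):

       # Write 4GB straight to the cache drive and report throughput.
       dd if=/dev/zero of=/mnt/cache/ddtest.bin bs=1M count=4096 oflag=direct
       # Clean up the test file afterwards.
       rm /mnt/cache/ddtest.bin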
  10. Yes, I'm using Ethernet. And yes, I'm in a cold climate with dry air. What could possibly be causing a bad ground?
  11. This build worked fine for a few months, then started exhibiting this issue. I just unplugged and re-plugged every power connection in the case; I guess we'll see if that resolves the issue.
  12. Yeah, I have some SATA power splitters to be able to power all the hard drives.
  13. Can somebody please try to help me solve an issue I've been having for the last several months that's driving me crazy?! For some reason my Unraid server will hang whenever the tower is touched: we're talking placing something on top of it, or even something as simple as the Roomba bumping into it. When that happens it becomes completely unresponsive, even though I can still hear the fans running. There is nothing on the monitor, and trying to access the webUI fails. This has happened about 20 times in the past few months, and the only way to fix it is a hard shutdown with the…
  14. Actually, the worst thing that can happen is frying the motherboard leads going to the CPU fan header due to high current draw, but the odds of that with only 3 fans total are pretty slim.