Posts posted by shimi269
-
I put a GTX 760 into my setup today, but for some reason the only data that displays on the dashboard is Temperature and Fan Speed.
-
After extensive testing, the issue is now resolved: it was just a matter of changing the cache drive's file system from BTRFS to XFS.
-
I've been experiencing freezes and slowdowns when downloading via LFTP (installed through NerdPack) on Unraid. I have managed to narrow it down to some IO issues, as shown in the screenshots, but I'm at a loss as to what is causing them. I have tried running the download with Docker turned on and off, and also with the paths set to '/mnt/user' and '/mnt/cache', but BTRFS still spikes to 99% usage; if downloading to 'user', SHFS also spikes to 99%, and if I download to 'cache', BTRFS still spikes to 99% but so does LFTP (as seen in the screenshot below).
The dips/freezes in the network match the spikes in the storage writes, as well as the 99% usage spikes (screenshot above). I download to the cache only, which mover then empties every night. My Docker containers are stored on a separate SSD, and I've ruled them out as the cause by disabling Docker, running LFTP, and still getting the issue.
Server Spec:
i7 2600k
32GB RAM
x2 250GB Crucial SSDs
x4 8TB WD Drives
-
I've got an issue that I just can't seem to find the root cause of. I have recently upgraded to 1Gb fibre, and everything can download at the full advertised speed apart from LFTP. Plain FTP can reach the full bandwidth (used in the pictures just to show the contrast), but LFTP will freeze randomly, the speed will drop, and then spike. I have attached pictures of 3 different runs to showcase this.
I don't know how LFTP works internally, but I'm guessing that it downloads to RAM(?) and then gets pushed to the SSD cache? I'm thinking this because the SSD writes correlate with the network and CPU dips. But if that is the case, how can I fix this issue to get my full bandwidth?
I have also attached the test script I am using for now, but I have tried many variations which still give me the same issue.
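For anyone wanting to try a similar test: a minimal script along these lines reproduces the setup described above. The host, credentials, file name, and destination path here are placeholders, not the actual ones from the attached script.

```shell
#!/bin/bash
# Hypothetical LFTP test script -- host, user, and paths are placeholders.
# Pulls one large test file with segmented (pget) transfers so throughput
# drops and stalls are easy to observe in the Unraid dashboard.
lftp -u "$FTP_USER,$FTP_PASS" "ftp://seedbox.example.invalid" <<'EOF'
set pget:default-n 8
pget -n 8 -O /mnt/cache/downloads /testfiles/1GB.bin
quit
EOF
```

Varying the segment count (`-n`) and the destination ('/mnt/cache' vs '/mnt/user') is the quickest way to see whether the stalls follow the storage path or the connection itself.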
-
14 hours ago, Energen said:
Instead of using max protocol, I'm using
server min protocol = SMB3_11
client min protocol = SMB3_11
I don't know if it actually causes me any problems or not ... guess I never really bothered to look / investigate... maybe I should try to get the best settings possible, but boy, that's a headache to speed test everything reliably...
Just pull down a file from your SMB share; if you are saturating your network speed, then it's all good. I believe this isn't an issue on all machines running Windows 10; it just happened to be the issue with mine.
-
Update 2: Issue is fixed. It was an SMB3 transfer issue (a known cause of this on Windows 10); I put 'max protocol = SMB2_02' in SMB Extras under Settings.
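For anyone hitting the same thing: that line goes into the Samba extra configuration box (Settings → SMB → SMB Extras). Assuming it is meant to apply globally, the fragment would look something like:

```
[global]
   max protocol = SMB2_02
```

In newer Samba versions the same option is also spelled 'server max protocol'. Note that capping the protocol at SMB2_02 disables SMB3 features such as encryption and multichannel, which is the trade-off for working around the Windows 10 client issue.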
-
Update: I ran iperf, both as client and server, and it saturated the full gigabit connection. So I'm still at a loss as to why the transfer speeds between the server and my PC are so slow on a wired connection. A Windows 10 issue, maybe?
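For anyone wanting to run the same check, an iperf3 test along these lines (the server IP below is a placeholder) separates raw network throughput from SMB/disk behaviour:

```shell
# On the Unraid server (iperf3 is installable via NerdPack): listen mode
iperf3 -s

# On the PC, point at the server's LAN IP (placeholder address).
# -t 30 runs for 30 seconds; -R reverses direction so both the
# upload and download paths get exercised.
iperf3 -c 192.168.1.100 -t 30
iperf3 -c 192.168.1.100 -t 30 -R
```

If iperf3 saturates the link in both directions while SMB transfers stay slow, the NIC, cable, and switch are effectively ruled out, leaving the SMB protocol layer or the client as the suspect.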
-
My Unraid server is connected via Ethernet (1000 Mbps) and so is my PC. When I transfer a file from Unraid to my PC (via SMB shares) I get 2 MB/s down, and when I transfer a file from my PC to Unraid I get 40 MB/s up.
When I instead use the wireless connection on my PC (5GHz channel) and transfer a file from Unraid to my PC (via SMB shares), I get 50 MB/s down, and from my PC to Unraid I get 70 MB/s up.
I have no idea why a wired connection is giving me much slower speeds!
-
@jimmydrac1 Did you ever find a solution for this?
Upgraded to 6.10 and now new folders are created with wrong user group
in General Support
Posted
Any updates on this? I'm having the exact same issue and still can't find a remedy for it.