jordanmw · Posted January 21, 2019

After adding two 1 TB Western Digital NVMe drives to my motherboard, all operations seem to freeze the OS. I added these drives to pass through to a couple of Windows 10 VMs, but everything I do after adding them causes my rig to freeze. I can boot into maintenance mode, but any formatting of the drives or starting of the array freezes the system and requires a hard reboot. Both my BIOS and Unraid recognize the drives, but I can't get past initializing the array without a full freeze. I can't even collect log files when it happens. Any assistance would be appreciated.
trurl · Posted January 22, 2019

Does the command line still work when this happens? Have you tried tailing the syslog to see if anything is output when it freezes? See the Need Help sticky pinned near the top of this subforum for ideas on gathering more information (if possible).
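For anyone finding this thread later, tailing the log from a console session looks roughly like this (a minimal sketch; `/var/log/syslog` is the usual location on Unraid, but verify on your release):

```shell
# Show the most recent entries, then follow the log live so the last
# messages written before a hard freeze stay visible on the console:
tail -n 50 -f /var/log/syslog
```

Running this on a directly attached console (not SSH) helps, since an SSH session dies silently when the network stack freezes.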
jordanmw · Posted January 22, 2019 (Author)

No, shell commands and the GUI both become unresponsive. I haven't tried tailing the syslog yet, but I removed one drive and got the same result. If I remove both, I'm back to normal and the array starts. I should also mention that when I first installed them, Unraid made me reassign the cache drives, but the array still failed to start.
jordanmw · Posted January 22, 2019 (Author)

I guess I'll grab a diagnostics dump and upload it when I get home. Has anyone had experience with the WD 1 TB Black NVMe drives on X399 with Unraid?
jordanmw · Posted January 23, 2019 (Author)

OK, so with either drive in either slot, the OS freezes after a few minutes without my doing anything. The array isn't even started, and I can't get a diagnostics dump because the system freezes before it completes. Anyone have ideas on things to try here?
jordanmw · Posted January 23, 2019 (Author)

So if I can't run diagnostics without the system freezing, what is my next step? If I remove both drives, I'm back to normal; the array starts and works as usual. Is there some way to run diagnostics on startup and save the file onto the USB drive? Has anyone else had similar issues after adding an NVMe drive? Someone?
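One possible way to capture evidence before the freeze (a sketch only; the `diagnostics` command and the `/boot` flash mount exist on recent Unraid releases, but verify on your version):

```shell
# Try the built-in collector first; on Unraid it writes a zip under /boot/logs,
# which lives on the USB flash drive and therefore survives a hard reboot.
diagnostics

# If the collector never completes before the freeze, periodically snapshot
# the syslog to flash instead, so the last copy written is preserved:
while true; do
  cp /var/log/syslog /boot/syslog-snapshot.txt
  sleep 10
done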
trurl · Posted January 23, 2019

Did you try tailing the syslog?
jordanmw · Posted January 23, 2019 (Author)

Yeah, it starts tailing, but the system freezes before anything meaningful appears. Same with diagnostics: a few seconds after starting it, the system freezes and it never completes.
jordanmw · Posted January 23, 2019 (Author)

Well, apparently I managed to get an NVMe drive that has issues with Linux. I saw this in an Amazon review: "I've had issues trying to access this drive in Ubuntu 18.04. Freezes the system. Not sure if its a driver or bios or something else. Western Digital forums and Ubuntu forums show other people have had similar issues with this drive. Still works great in Windows 10 though." So I guess I need to find out what needs to be added to use the drive properly. Once I find what's needed, I hope LT will be willing to add it to the distro.
jordanmw · Posted January 23, 2019 (Author)

Here are some reports of the issues with Linux distros and the WD Black NVMe: https://community.wd.com/t/linux-support-for-wd-black-nvme-2018/225446/5

Other users of X399 boards say I need to add this kernel parameter to the boot config: nvme_core.default_ps_max_latency_us=5500. Apparently some people have a BIOS setting that can be changed instead, but I'm not seeing that option on my X399 Taichi. Normally I would just add this to GRUB; what do I need to do on Unraid to enable it?
saarg · Posted January 24, 2019

You could try adding it to the syslinux.cfg file, on the append line of the boot entry you use. On the Main page, click on the flash drive and scroll down to the syslinux.cfg section; you can edit it in the webGUI.
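The edit would look roughly like this (a sketch based on the stock Unraid syslinux.cfg; your labels and append line may differ, so only add the nvme_core parameter to whatever is already on that line):

```
label Unraid OS
  menu default
  kernel /bzimage
  append nvme_core.default_ps_max_latency_us=5500 initrd=/bzroot
```

A reboot is needed for the change to take effect; afterwards, `cat /proc/cmdline` should show the parameter if it was applied.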
jordanmw · Posted January 24, 2019 (Author)

5 hours ago, saarg said: "You could try adding it to the syslinux.cfg file, on the append line of the boot entry you use. On the Main page, click on the flash drive and scroll down to the syslinux.cfg section; you can edit it in the webGUI."

Thanks saarg, I'll give it a shot tonight. I was going to try last night but ended up hosting a LAN party on the rig with Monster Hunter World as the game of choice. So much 4-player fun, especially since they finally patched in 21:9 support!
jordanmw · Posted January 25, 2019 (Author)

Looks like a BIOS update is what sets things right. The 3.50 BIOS release notes only mention M.2 RAID enhancements, but it evidently added what was required for the NVMe drives to function properly. Once again, Unraid is not the culprit. Threadripper still has some issues being worked out, but at least they're fixing them quickly. I still maintain that this is probably the most value-conscious build I have ever pulled off.
This topic is now archived and is closed to further replies.