MichaelAnders Posted June 9, 2020

Hi all,

I've been playing around all day with Unraid, creating some RAID0 drives for performance reasons. Just before I was about to
- merge the last two disks
- assign an existing disk as second parity
- allow a full rebuild

I luckily ran Unraid again to double-check that all was good. But it's not: one of my drives states "device is disabled", and I'm not sure why. I've also detached the disk from the RAID controller and plugged it into the mobo; that didn't change anything.

Attached are the diagnostics. The HDD in question is Disk 8, sdb. From what I can see:
1) SMART looks fine
2) the disk is present in dmesg
3) "dd if=/dev/sdb of=/dev/null" shows the disk being accessed, and I can hear it as well

Any ideas?

tower-diagnostics-20200609-1610.zip
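As a sketch, the three checks described above can be wrapped into one small script. The guards are there so it degrades gracefully if a tool or the device is absent; smartctl being installed is an assumption, and /dev/sdb is just the device from this post.

```shell
# Sketch of the three health checks from the post above, with guards.
# Assumptions: smartctl (smartmontools) may or may not be installed,
# and /dev/sdb is the disk in question; adjust for your system.
quick_disk_check() {
    dev="$1"
    if [ ! -b "$dev" ]; then
        echo "$dev not present; skipping"
        return 0
    fi
    command -v smartctl >/dev/null 2>&1 && smartctl -H "$dev"   # SMART overall health
    dmesg 2>/dev/null | grep -i "${dev##*/}" | tail -n 5        # does the kernel see it?
    dd if="$dev" of=/dev/null bs=1M count=64 2>&1 | tail -n 1   # raw read test
}
quick_disk_check /dev/sdb
```

Note that all three checks passing only shows the disk is readable now; as the replies below make clear, none of them reveal the historical write failure that caused the disable.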
JorgeB Posted June 9, 2020

Diags are from after rebooting, so we can't see what happened, but the disk looks fine.
MichaelAnders Posted June 9, 2020 (Author)

Thanks for the quick reply! But now you've got me confused... The log is from the current state, after rebooting and before mounting the disks (which I will now do, and then back up the disk to another PC).

If the disk is fine, why is Unraid stating it is disabled? Shouldn't Unraid state why disk 8 (although "fine") is disabled, maybe in some log? I didn't see anything. If not, it would be great to see something like "disk 8 disabled due to XYZ" in the logs, making it easier to narrow down the problem.
itimpi Posted June 9, 2020

A disk gets disabled when a write to it fails for some reason. How to handle a disabled disk is described here in the online documentation.
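Since a disable event is triggered by a failed write, the syslog from before the reboot is where the cause shows up (on Unraid, /var/log/syslog lives in RAM and does not survive a reboot, which is why the earlier diags showed nothing). A hedged sketch of searching for it, with message patterns that are assumptions based on typical Unraid md-driver log lines, demonstrated here against a canned sample:

```shell
# Sketch: search a syslog for the write/read error that disabled a disk.
# The patterns are assumptions modelled on typical Unraid kernel log lines.
find_disk_errors() {
    grep -iE 'disk[0-9]+ (write|read) error|failed command' "$1"
}

# Demonstration against a canned sample log:
SAMPLE="$(mktemp)"
cat > "$SAMPLE" <<'EOF'
Jul  4 22:01:01 Tower kernel: ata3.00: failed command: WRITE FPDMA QUEUED
Jul  4 22:01:02 Tower kernel: md: disk1 write error, sector=975720
Jul  4 22:01:03 Tower kernel: md: recovery thread: exit status: -4
EOF
find_disk_errors "$SAMPLE"   # on a live server: find_disk_errors /var/log/syslog
rm -f "$SAMPLE"
```

The same grep works on the syslog file inside a diagnostics zip, as long as the diagnostics were captured before rebooting.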
JorgeB Posted June 9, 2020

> 1 hour ago, MichaelAnders said: Shouldn't Unraid state "why" disk 8 (although "fine") is disabled?

If it happens again, grab the diags before rebooting.
uDrew Posted July 4, 2020

I've got the same problem.

upc03-diagnostics-20200705-0045.zip
trurl Posted July 4, 2020

Looks like disk1 has completely disconnected. Check all connections, both ends, SATA and power, including splitters. Then post new diagnostics.
trurl Posted July 4, 2020

Also looks like the emulated disk1 is unmountable. Don't do anything else without further advice.
uDrew Posted July 5, 2020

This is the second time, after I changed to a new motherboard because the old one (H270-ITX/AC) was broken.

New motherboard: AION-8800
New expansion card: ORICO PCI-E SATA expansion card with 5 SATA ports

Last time, it reported disk 4 as disabled. I then did the following:
1. removed disk 4 from the array
2. started the array, then stopped it
3. added disk 4 back to the array and started it
4. the system rebuilt the array
5. during the rebuild, many errors were found on parity, disk 1 and disk 2
6. after the rebuild finished, I restarted the machine and everything was back to normal
7. I bought a new hard disk to try to copy data off the array
8. after several hours, it reported disk 1 as disabled

upc03-diagnostics-20200705-0958.zip
trurl Posted July 5, 2020

Those diagnostics are just a continuation of the previous ones. You need to shut down and check the connections, then reboot and post new diagnostics.
uDrew Posted July 5, 2020

Okay, I did the following steps:
1. shut down
2. checked that the connections are correct
3. turned the machine on
4. the LED lights for every SATA port are on
5. generated a diagnostics report: upc03-diagnostics-20200705-1311.zip
6. started the array
7. generated a diagnostics report: upc03-diagnostics-20200705-1313.zip
8. screen capture: Screenshot from 2020-07-05 13-14-55.png

Thanks a lot!

upc03-diagnostics-20200705-1313.zip
upc03-diagnostics-20200705-1311.zip
JorgeB Posted July 5, 2020

Check the filesystem on the emulated disk1. If the contents look correct after it's fixed, and only if that's true, you can rebuild on top. I would also suggest replacing/swapping the cables on disk1 before rebuilding, to rule them out.
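A hedged sketch of that filesystem check, under the assumption that with the array started in maintenance mode the emulated disk1 is exposed unmounted as /dev/md1 (the exact device name can differ between Unraid versions):

```shell
# Sketch: check the filesystem on an emulated data disk.
# Assumption: array started in maintenance mode, disk1 exposed as /dev/md1.
# -n is a read-only dry run; only repair for real after reviewing its report.
check_emulated_disk() {
    dev="$1"
    if ! command -v xfs_repair >/dev/null 2>&1; then
        echo "xfs_repair not installed; skipping"
        return 0
    fi
    if [ ! -b "$dev" ]; then
        echo "$dev is not a block device; skipping"
        return 0
    fi
    xfs_repair -n "$dev"    # dry run: report problems, write nothing
    # xfs_repair "$dev"     # the actual repair, after reviewing the dry run
}
check_emulated_disk /dev/md1
```

Running the repair against the md device (rather than the raw sdX partition) matters on Unraid, because writes through the md device keep parity in sync.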
uDrew Posted July 5, 2020

I did the following, but drive 1 is still disabled.
1. shut the machine down
2. changed all cables
3. started the machine
4. started the array in maintenance mode
5. fixed disk 1: xfs_repair -d
6. rebooted the machine
7. started the machine in maintenance mode and normally (tried both)
8. drive 1 is still disabled
9. generated a diagnostics report: upc03-diagnostics-20200705-1648.zip
10. screen capture: Screenshot from 2020-07-05 16-46-00.png

upc03-diagnostics-20200705-1648.zip
JorgeB Posted July 5, 2020

> 24 minutes ago, uDrew said: but drive 1 is still disabled.

Once a disk gets disabled it needs to be rebuilt. Since the emulated disk is now mounting, if the contents look correct you can rebuild on top: https://wiki.unraid.net/Troubleshooting#Re-enable_the_drive