
2 drives marked as disabled, need help getting running again



I noticed today that two of my drives are marked as disabled. One of them is a parity drive, the other is a data drive in the array. I am running dual parity. I am looking for advice on properly assessing whether my drives are faulty and need to be replaced. I ran short SMART tests on both drives and they came back normal. Also of note, I have two extra drives that are currently in the array but empty, so I could repurpose these to replace the two disabled drives if they are bad. If I need to do that, what order would be best to tackle this? Parity or data drive first?
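In case it's useful, the short tests can also be repeated from a terminal with smartctl; /dev/sdX below is just a placeholder for whichever of the two drives is being checked:

   # start a short self-test on a drive (replace sdX with the actual device)
   smartctl -t short /dev/sdX

   # once it finishes, review the full report; the attributes worth checking are
   # Reallocated_Sector_Ct, Current_Pending_Sector and Offline_Uncorrectable
   smartctl -a /dev/sdX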

 

I have not had a drive failure before in Unraid, so I don't want to take any uneducated steps that could permanently lose data.

 


Unfortunately I didn't think to grab diagnostics before shutting down the system.
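For next time, I'll grab them either from Tools > Diagnostics in the web GUI or by running the diagnostics command from a terminal before shutting down (as far as I can tell the zip gets written to the logs folder on the flash drive):

   # generate a diagnostics zip (should land in /boot/logs on the flash drive)
   diagnostics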

 

In the Main tab there were counts in the Errors column on the affected drives. I want to say between 4,000 and 5,000 on parity2 and a few hundred on disk5. I'm assuming these errors triggered the disable.

 

If I do a rebuild on top, which way would be safest? Parity first, or data drive first? Would it make sense to rebuild onto one of the empty drives rather than on top of the existing drives?

 

 

