
2x Drive Failure Sanity Check



I had two HDDs show up as disabled overnight. One is the second parity drive; the other is a data drive. Both were active in a mover operation overnight. I just want to make sure the failures are legit and aren't caused by something else. I've moved the drives to different backplanes in my case, although they're all still going through the same HBA card, and they both still showed up as disabled.

 

I've attached the SMART reports for both drives as well as my system diagnostics files. Does anything look funny to the experts out there?

 

If the drive failures are real, should I rebuild the parity drive first or the data drive? Thanks in advance for any help with this.

zeus-smart-20210629-1223 (1).zip zeus-smart-20210629-1223.zip zeus-diagnostics-20210629-1208.zip


For future reference, Diagnostics already includes SMART for all attached disks, syslog, and many other things. Take a look.
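As a rough sketch of what that means in practice (the folder names below are typical for recent Unraid releases and may differ on yours), you can list the archive contents straight from the console:

unzip -l zeus-diagnostics-20210629-1208.zip
# expect folders such as smart/ (per-disk SMART reports), logs/ (syslog), config/, system/, and shares/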

 

SMART for both disks looks OK. Syslog resets on reboot so unless you have an older syslog we can't see what happened. Bad connections are much, much more common than bad disks.
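If you want to double-check a disk by hand, smartctl from the console reports the same data the GUI download contains. This is only a sketch; /dev/sdX is a placeholder for the actual device:

smartctl -a /dev/sdX            # full SMART attributes plus the drive's error log
smartctl -t short /dev/sdX      # optionally queue a short self-test (a couple of minutes)
smartctl -l selftest /dev/sdX   # read the self-test results afterwards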

 

5 minutes ago, c010rb1indusa said:

still showed up as disabled

You have to rebuild them. You can rebuild both at once.

 

https://wiki.unraid.net/Manual/Storage_Management#Rebuilding_a_drive_onto_itself

2 hours ago, trurl said:

Bad connections are much, much more common than bad disks. ... You have to rebuild them. You can rebuild both at once.

 

Okay, I will try other connection options and see if anything changes.

 

2 hours ago, trurl said:

Also, your appdata has files on the array.

 

I know, there's a folder there that I can't delete via the console or with tools like Krusader.

2 hours ago, JorgeB said:

The 1st LSI should be updated to the latest firmware; this one has known issues:


Jun 29 10:04:09 Zeus kernel: mpt2sas_cm0: LSISAS2008: FWVersion(20.00.04.00), ChipRevision(0x03), BiosVersion(07.39.02.00)

 

2nd one is fine:


Jun 29 10:04:09 Zeus kernel: mpt2sas_cm1: LSISAS2008: FWVersion(20.00.07.00), ChipRevision(0x03), BiosVersion(07.39.02.00)
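For anyone wanting to confirm which firmware each HBA is running, before and after flashing, the kernel lines above can be pulled straight out of the log. This assumes the messages are still in the kernel ring buffer; otherwise grep /var/log/syslog instead:

dmesg | grep -i "mpt2sas.*FWVersion"
# the first controller should report 20.00.07.00 (or the newest P20 build) once reflashed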

 

 

Thank you for pointing this out. Will do this and report back.


Updated my LSI firmware, so that should be good to go now. I also tried the on-board data connections, but still no luck booting without the drives showing as disabled; the system can still see them. Followed @constructor's link on rebuilding a drive onto itself, and it's rebuilding now. Will report back tomorrow morning when it's hopefully completed.
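In case it helps anyone watching a rebuild from the console rather than the Main page, here is a rough sketch. mdcmd is Unraid's array control tool; the variable names below are what recent releases expose, so they may differ on older versions:

mdcmd status | grep -E "mdResyncPos|mdResyncSize"   # position vs. total size gives percent complete
tail -f /var/log/syslog                             # watch for read/write errors on the rebuilding disks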

44 minutes ago, trurl said:

You can't move or delete open files. Go to Settings - Docker, disable it, then try again to move or delete.
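If the folder still refuses to delete after Docker is stopped, something else may be holding files in it open. A quick sketch for finding the culprit; the path is only an example, and lsof/fuser should both be present on a stock Unraid install:

lsof +D /mnt/user/appdata/stuck-folder   # list every process with a file open under that folder
fuser -v /mnt/user/appdata/stuck-folder  # alternative view of processes using the directory
# once nothing shows up, rm -r on the folder should go through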

 

Thanks, will give that a try.

