dual red-balled drives, how to proceed?



Would either of these allow me to move or copy that data to another disk, if it's still there?

 

Also, since I've been trying to move it after it failed, I suspect I'm really just moving the 'data' that parity has calculated, and that the data on the disk itself has not changed.  If so, and if I can see the disk outside the array, what's going to happen if I try to move it all, considering about 1/3 of the data has already been 'moved' onto other disks?  I'll have LOTS of duplicate files.  How will I 'fix' that problem?

 

I'm really not very happy right now :(


If you copy the data from the md volume, it will keep parity intact.  If you access any of the drives via their base device, such as /dev/sdX, you will be operating outside parity and will need to rebuild it.
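To illustrate the difference, here is a minimal sketch of copying through the parity-protected mount point. The /mnt/diskN paths are the usual unRAID per-disk mounts (assumed here; temp directories stand in for them so the example is self-contained). Writes through those mounts go via the md driver and keep parity in sync, whereas writing to a raw /dev/sdX bypasses it.

```shell
#!/bin/sh
# Stand-ins for unRAID mount points: SRC ~ /mnt/disk9 (emulated disk),
# DST ~ /mnt/disk3 (a known-good disk).  These are assumptions for the sketch.
SRC=$(mktemp -d)
DST=$(mktemp -d)
echo "movie data" > "$SRC/film.mkv"

# Copying via the mount points keeps parity intact on unRAID; writing to
# the base device (/dev/sdX) directly would invalidate parity instead.
cp -a "$SRC/." "$DST/"

ls "$DST"
```

On a real array you would substitute the actual /mnt/disk9 and /mnt/disk3 paths (and probably use rsync for a large move), but the principle is the same: stay on the md side.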

 

If you copied and updated files on the emulated disk, then rebuilt parity and brought that disk back in, you would end up with dups, since the files would still reside on the real disk.

 

It is a tough call.  Usually when my setup gets too out of sorts, I verify that all the data disks look valid, build a new config around the data disks, and let parity rebuild on the existing parity drive.

 

I wonder if your 'unformatted' status is coming from it thinking the format is something different than what it actually is.

 

I found a quick way to get parity back, if you think all the data disks are OK and most of the parity is valid: start with a new config, put all the disks back into their original slots, and then start the array with parity marked as valid.  This will bring the array back online; it then runs a parity check and will correct any parity that doesn't match.

 

You may still have to deal with the dups after the array is back, but producing a file list on each drive and comparing the relative filenames will let you find and remove the dups...
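The file-list comparison above can be sketched like this. It assumes unRAID-style per-disk mounts (e.g. /mnt/disk3 and /mnt/disk9); temp directories stand in for them here so the example runs anywhere. Listing relative paths from inside each mount and intersecting the sorted lists with `comm` gives the candidate duplicates.

```shell
#!/bin/sh
# D1 and D2 stand in for two disk mount points such as /mnt/disk3 and /mnt/disk9.
D1=$(mktemp -d)
D2=$(mktemp -d)
mkdir -p "$D1/tv" "$D2/tv"
echo a > "$D1/tv/ep1.mkv"
echo a > "$D2/tv/ep1.mkv"   # same relative path on both disks = candidate dup
echo b > "$D2/tv/ep2.mkv"   # unique to the second disk

# Build a sorted relative-path list per disk (cd first so paths are relative).
L1=$(mktemp); L2=$(mktemp)
( cd "$D1" && find . -type f | sort ) > "$L1"
( cd "$D2" && find . -type f | sort ) > "$L2"

# comm -12 prints only the lines common to both sorted lists.
comm -12 "$L1" "$L2"
```

This only flags matching names and paths; whether the contents are actually identical still has to be checked before deleting anything.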

 


hmmm...

 

yeah, not much fun right now.

 

I'm trying to move data around on the known good disks, to free one of them up, and to get some of my 'scattered' files back together.  Once I have that done, I'll try to mount disk9 outside the array, then try to move everything onto the newly empty drive, then deal with finding duplicates, and eliminating them.

 

Is there any script for this duplicate repair, or is it a nightmare 'do it by hand' thing?  If I can't automate it, I'll probably just forget it and replace the missing data, as I suspect that will take about as long but require much less of my time.

 

:(

 

thanks again for all the help.

 

I sent jonp a PM yesterday, and emailed Tom & jonp this morning, but haven't heard anything back yet.  I know they are busy with testing, so I'm not surprised, nor upset, but it's frustrating looking at the potential disaster, and not knowing how best to proceed.


Actually, from what I remember, I think unmenu has a duplicate report you can click and run.  I think the latest version works on v6...

There's actually a couple different duplicate detection threads running around if you search the forum. The problem is people seem to define duplicate differently depending on their circumstances, filename vs. content.

 

In this case, you would probably want to find and list all duplicate name and relative path instances, then compare content and delete one if they are binary identical. That is completely different from someone wanting to find binary dupes with different names, or dupe names with different content.
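A minimal sketch of that first case, deleting a copy only when the files are byte-identical. The mount points are assumed (temp directories stand in for, say, /mnt/disk3 and the recovered disk9); `cmp -s` does the binary comparison, and files with matching names but differing content are deliberately left for manual review.

```shell
#!/bin/sh
# D1 ~ a known-good disk, D2 ~ the recovered disk (assumed roles for the sketch).
D1=$(mktemp -d)
D2=$(mktemp -d)
echo same > "$D1/a.txt"; echo same    > "$D2/a.txt"   # true binary duplicate
echo good > "$D1/b.txt"; echo corrupt > "$D2/b.txt"   # same name, different content

# For every file on the recovered disk that also exists (same relative path)
# on the good disk, delete the recovered copy only if the bytes match.
( cd "$D2" && find . -type f ) | while read -r f; do
    if [ -f "$D1/$f" ] && cmp -s "$D1/$f" "$D2/$f"; then
        rm "$D2/$f"   # binary identical: safe to drop one copy
    fi
done

ls "$D2"   # only the differing file remains, kept for hand inspection
```

Filenames with spaces survive the `read -r` loop here, but names containing newlines would not; for a real cleanup pass, `find -print0` with a null-delimited loop is safer.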

 

In a file recovery situation, you may have many identical file names, but only 1 with good content, so you would have to look through the entire list by hand to figure out which was corrupt and which was the good copy.

 

If the data is re-creatable, it may be easier to clean house and start over.



Thanks.  Once I make room and get disk9 out of the array (one way or the other), I'll dig through the forums and try to find the duplicates threads.  Ideally I can run something, get a report, find the 'exact' duplicates, test a few to make sure everything still works as it should (video plays, image opens without corruption, etc.), and then delete the exact duplicates.

 

Then I'll look for duplicate filenames, see if I have a good and a bad copy of any more files, and clean as necessary.

 

If I have any duplicates with different names, that's something I will worry about another day.

 

How the 'cleaning' of disk9 goes will determine at what point I 'clean house' and re-create what I lost.

