rutherford Posted August 25, 2021
Recently I had another XFS file system error and had to run the XFS repair tool. This time around it left 320 GB of newly created lost+found weirdo files. Other than making yet another backup of my pictures and music collection, what's up with this loss of files? I mean, this is unRaid: we've got an array, we've got a disk pool, we've got a personal server at home, yet we're still vulnerable to file loss. Is this a type of loss I can avoid? Is there a tweak I can make to my system, like moving the whole thing to btrfs? Or pull the corrupted drive, simulate the drive with the array, and restore the entire thing like that? Am I missing something? Thanks!
JorgeB Posted August 25, 2021
If you keep having file system corruption without an apparent reason, there's likely an underlying hardware issue, like bad RAM.
trurl Posted August 25, 2021
1 hour ago, dkerlee said:
"Pull the corrupted drive, simulate the drive with the array, restore the entire thing like that?"
A rebuild cannot fix corruption: parity reconstructs the disk exactly as it was, corrupted files included.
rutherford Posted August 25, 2021 (Author)
Thanks for the replies. It's happened once before, but that was a couple of years ago. This time I think it was tied to a power outage. I figured that if there were something as simple as rebuilding the drive, it would have been in the documentation or a Spaceinvaders video! You guys can see where I'm coming from here, though, right?
trurl Posted August 25, 2021
1 hour ago, dkerlee said:
"This time, I think it was tied to a power outage."
Get a UPS.
rutherford Posted August 25, 2021 (Author)
I have a UPS. Maybe it's set up wrong? I'll double-check. But the original question remains: what, if anything, can I do to mitigate these file corruption problems?
itimpi Posted August 26, 2021
By far the most common cause of file corruption is RAM-related issues causing in-memory corruption before data is written out to the drives. An untidy shutdown (e.g. after a power loss) can also cause incomplete or bad writes to a drive, although these are normally recoverable by the file system repair utilities. This is why a UPS is strongly recommended, so you can avoid untidy shutdowns after a power loss.

The other thing is to have some way of detecting corruption in individual files, either by using BTRFS for array disks or by generating checksums (e.g. with the Dynamix File Integrity plugin), so that when corruption is detected you know which files are affected and can restore them from your backups.

A key point is that you should always have backups of anything important that would be a problem to lose: whatever precautions you take, there is always the potential for data loss if something unforeseen happens on the server.
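To show what a do-it-yourself checksum manifest looks like, here is a rough sketch of the approach a plugin like Dynamix File Integrity automates. The file names and paths below are made up for the demo; a real run would target a share like /mnt/user/pictures.

```shell
# Hypothetical demo share standing in for a real user share.
mkdir -p /tmp/checksum_demo/pictures
echo "holiday photo data" > /tmp/checksum_demo/pictures/img001.jpg
cd /tmp/checksum_demo
# Record a checksum for every file in the share
find pictures -type f -exec md5sum {} + > pictures.md5
# Later, verify: any silently corrupted file shows up as FAILED
md5sum -c pictures.md5    # prints "pictures/img001.jpg: OK"
```

Re-running the `md5sum -c` step on a schedule (and after any repair) tells you exactly which files to pull from backup, instead of guessing which of the lost+found fragments mattered.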