DougG

Members
  • Content Count

    9
  • Joined

  • Last visited

Community Reputation

0 Neutral

About DougG

  • Rank
    Newbie
  1. johnnie.black, thank you. Updating to xfsprogs 5.0.0 let the filesystem check finish, and upon reboot my array came back up.
  2. I have cancelled and re-run it twice more, and let it run for over 36 hours total. Still the same... resetting inode 4321574511 nlinks from 11 to 9 Any ideas?
  3. Everything had been running fine. I upgraded to 6.7.0 and rebooted, and now Disk 3 is "Unmountable: No file system". I ran xfs_repair -n, then xfs_repair, and am currently running xfs_repair -L. It's been running for 8+ hours and seems to be stuck on "resetting inode 4321574511 nlinks from 11 to 9". Any help would be appreciated. dransik-diagnostics-20190516-0910.zip
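For anyone landing here with the same symptom, the repair sequence described above can be sketched as a few console commands. The device path /dev/md3 is an assumption based on the Disk 3 slot, and these should only be run with the array started in maintenance mode; this is a sketch of the steps, not official Unraid guidance.

```shell
# Sketch of the xfs_repair sequence described above.
# /dev/md3 is an ASSUMED device path for Disk 3 -- confirm yours first.
# Run from the Unraid console with the array in maintenance mode.

xfs_repair -n /dev/md3   # dry run: report problems, change nothing
xfs_repair /dev/md3      # attempt the actual repair
xfs_repair -L /dev/md3   # last resort: zero the log (recent metadata may be lost)
```

Repairing the md device (rather than the raw sd* partition) keeps parity in sync with the changes xfs_repair makes.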
  4. Appreciate all your help on this. The drive (after it successfully mounted and rebuilt) went into a read-only state. I had to run xfs_repair, but now everything is checking out. I ordered a new PSU; it seems like that will solve all my issues.
  5. Thanks for all your input; it has rebuilt and is back to normal. Now I need to purchase a PSU that will handle 9+ drives. I've been doing some research. Many say a 350 W unit is big enough; I have a 250 W now. I do not think I will ever go above 10 drives. I have 2 questions. 1) How much benefit do you actually get from 2 parity drives? 2) How should I size a PSU for an approx 10-drive system? (2 drives are SSDs) Again, thanks for all your help.
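On the sizing question, a rough back-of-the-envelope estimate is peak draw at spin-up plus headroom. The wattage figures below are assumptions for illustration (roughly 30 W spin-up per 3.5" HDD, 5 W per SSD, 100 W for board/CPU/fans), not measured values for this build:

```python
# Rough PSU sizing sketch; all wattage figures are ASSUMED, not measured.
HDD_SPINUP_W = 30    # assumed peak draw of one 3.5" HDD at spin-up
SSD_W = 5            # assumed SSD draw
BASE_SYSTEM_W = 100  # assumed CPU + motherboard + fans
HEADROOM = 1.3       # assumed 30% margin over peak

def psu_estimate_watts(hdds: int, ssds: int) -> float:
    """Estimated PSU rating: worst-case simultaneous spin-up plus headroom."""
    peak = hdds * HDD_SPINUP_W + ssds * SSD_W + BASE_SYSTEM_W
    return peak * HEADROOM

# 8 spinning drives + 2 SSDs, as described in the post
print(round(psu_estimate_watts(8, 2)))  # → 455
```

By this estimate a quality ~450-550 W unit would cover a 10-drive box comfortably, which is consistent with the "350 W is big enough" advice only if drives are staggered at spin-up.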
  6. Interesting. I have taken a 2nd computer's power supply and moved 3 hard drives over to it, and it looks like it might work. It's been rebuilding the drive for about 5 minutes now, where before it would fail after 10-20 seconds. The rebuild must draw more power than a pre-clear. I bet my original drive might still be good too; I guess that can become a 2nd parity drive. I'll keep you updated on its success/failure, but it's looking a lot better already.
  7. Appreciate the input. I tried a 2nd SATA port on the motherboard, then I tried one on the expansion card. I had been running fine for a year. I just pre-cleared the drive again, swapped SATA cables, and checked to make sure the connections are solid. Everything seems fine before I add it to the array, but as soon as I add it back as Disk 1, it fails within minutes.
  8. Hello All, hope you can help. I had a disk (disk 1) fail. I removed it and purchased a new one (WD Red WD30EFRX). Everything is running fine (in emulated mode on disk 1). When I put the new drive in and assign it, it starts the rebuild and gets errors within seconds. I purchased another drive (same model), thinking the first was bad, and put that one in. Same thing: the drive goes to Disabled / Emulated. I then changed the SATA connector to a different SATA port and assigned the disk to disk 1. Same thing happens. I'm kind of at a loss as to how to get the drive back to "good". Attached is my diagnostics report. Thanks in advance. dransik-diagnostics-20190130-2131.zip