Tisrok

Members · Posts: 12
Everything posted by Tisrok

  1. Everything worked out. Parity was successfully rebuilt. Old parity drive swapped, old data drive swapped, everything is running perfectly and no data loss.
  2. Thanks to you and the mod, this worked. I rebuilt the parity on the formerly removed 1TB drive (about two hours). Verified everything was in order, which it was (Unraid is awesome). Then followed the procedure, but starting with the parity drive: stopped the array, unassigned the 4TB parity drive, shut down, installed the 12TB parity drive, booted up, assigned the new drive to the old drive's slot, and started the array. Rebuilding currently; should take about 20 hours for my array. After it completes, I'll install the new data drive using the same process. Thanks again.
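As a sanity check on a rebuild-time estimate like the ~20 hours above: the rebuild has to read/write the new parity drive's full capacity, so the time is roughly capacity divided by sustained speed. A minimal sketch, assuming a hypothetical average rebuild speed of ~170 MB/s (real speeds vary by drive, controller, and load):

```shell
# Rough rebuild-time estimate for a 12TB parity drive.
# SPEED_MBS is an assumed average sustained speed, not a measured value.
SIZE_TB=12
SPEED_MBS=170
# Drive makers use decimal units: 1 TB = 1,000,000 MB
SECONDS_TOTAL=$(( SIZE_TB * 1000000 / SPEED_MBS ))
HOURS=$(( SECONDS_TOTAL / 3600 ))
echo "Estimated rebuild time: ~${HOURS} hours"   # ~19 hours at 170 MB/s
```

Slower average speeds (drives slow down toward the inner tracks) push the real figure toward the 20-hour mark reported above.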
  3. Thank you for the help, that sounds like the logical solution. I'll give it a try.
  4. Greetings, I'm having issues replacing some drives with new, larger ones. My current config: a 4TB parity drive, several 4TB data drives, and a few 1TB data drives. I'm trying to swap one of those 1TB data drives for a 12TB one, and then the parity drive from 4TB to 12TB afterward. I followed the "safer method" procedure in the Unraid docs: stopped the array, replaced a 1TB data drive with the new 12TB one, assigned the new drive to the old slot in the GUI, and started the array. The array gives a start error. First it said "wrong disk"; now it just says "Disk in parity slot is not biggest." If I were to swap that 4TB parity drive for the new 12TB one first, correct me if I'm wrong, but wouldn't it then fail to rebuild my data drive? Not sure what I'm missing here, but any help would be appreciated. Thanks in advance.
  5. I've confirmed the card is working as intended. I don't have enough SATA ports on my motherboard to support my array without it. I've narrowed the issue down to something being wrong with the parity: with all drives (including my current Western Digital parity drive) set as data drives, except for my pool/cache SSD, the array starts and works fine. As soon as I move any drive into the parity slot and attempt to start the array, it hangs on "mounting disks..." and crashes.
  6. Yes, all drives except the cache were running off the card. My apologies, disregard the StarTech bit; I was thinking of my other RAID card. This one is a GLOTRENDS PCIe 2.0 x4 to SATA III 8-port card, compatible with Linux/Windows/etc. The power supply should be fine: it's a brand new modular Corsair 750 W, replacing my older Corsair 750 that worked under roughly the same load. Between all my components at load, it shouldn't be drawing more than 450 watts.
  7. I removed my parity drive and both new Seagate drives, and with the remaining six drives the array works fine. Tested files ranging from 5 MB to 4 GB without issue. I'm really at a loss here. Additional info I forgot to note: my first steps were to swap all SATA cables, and the power supply is brand new out of the box.
  8. I upgraded my build from an older Intel setup to an AMD FX8350 on an MSI 970 Gaming motherboard and swapped two aging (but still working) drives for two new Seagate 1TB hard drives. I have 9 drives in my array: 1 parity drive, 7 data drives, and 1 pool/cache SSD. I use an 8-port PCI-E StarTech RAID card along with some of the SATA ports on the motherboard. This configuration worked just fine before the hardware changes. Issues since the upgrade: First, being unable to connect or even establish link lights; after a few reboots that issue miraculously went away with no changes to the configuration or BIOS. Second, one of the Seagate drives would either give an "unable to mount" error or completely crash when starting the array. I re-formatted the drive and performed all checks, which passed; the mounting issue went away, but it crashes every single time I start the array. So I removed that drive from the array, started again, and the array actually started and worked, for a few minutes. Once I start writing anything, things get wonky and crash. To observe this, I ran a continuous ping while rebooting the machine, starting the array, and moving files (attached screenshot). Constant, steady 3-4 ms response times; once I start moving files, it continued at 3-4 ms for about a minute, then timed out, came back but decelerated into 1000-3000 ms response times, then became completely unreachable/locked up, only fixable via a hard power cycle. I've tried moving drives around, removing the parity temporarily, removing my cache drive, etc. Basically any hardware configuration you can imagine in my setup ends in a crash at worst or an infinite "mounting drives" or "starting services" at best. Any help at all would be greatly appreciated. tower-diagnostics-20210525-2053.zip
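The continuous-ping observation above can be scripted so each probe gets a timestamp, making it easier to line up the exact moment of lock-up against what the array was doing. A minimal sketch; the IP address and log filename are placeholders, not values from the original post:

```shell
#!/bin/sh
# Log one timestamped ping result per second to spot when the server locks up.
# 192.168.1.100 and ping_log.txt are placeholders - substitute your own.
HOST=192.168.1.100
LOG=ping_log.txt
while true; do
    # -c 1: single probe; -W 2: give up after 2 seconds
    if RTT=$(ping -c 1 -W 2 "$HOST" | grep -o 'time=[0-9.]* ms'); then
        echo "$(date '+%H:%M:%S') $RTT" >> "$LOG"
    else
        echo "$(date '+%H:%M:%S') TIMEOUT" >> "$LOG"
    fi
    sleep 1
done
```

Run it from a second machine while starting the array and copying files; the log then shows the healthy 3-4 ms stretch, the first timeout, and the slide into multi-second responses described above.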
  9. Thanks for the reply. The network issue somehow resolved itself after a few reboots (no configuration or hardware changes...). I'm now having other issues with starting the array and crashing, but that'd be for a new thread.
  10. Latest version as of today. Furthermore, "ifconfig eth0" returns "device not found".
  11. I just switched from a very old Intel build to an AMD FX8350 along with an MSI 970 Gaming motherboard. The components are all working as they should, except there are no link lights/no connection when using the Ethernet port on the motherboard. Using a USB Ethernet dongle, I can get link lights but still can't establish a connection. Verified that the LAN controller is enabled in the BIOS. Running "ifconfig eth0" returns "device not found". "ifconfig" with the dongle connected shows a "br0" device, but it pulls a totally wrong IP (wrong subnet). The furthest I've gone is editing the network config file and manually setting a static IP, but that didn't work either. I'm not a Linux user, so I have no idea how to manually fix the issue. If I had to guess, I'd say it's a driver issue, but there aren't any official MSI drivers for Linux. Any help would be greatly appreciated.
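For a missing onboard NIC like this, a few standard Linux commands show whether the kernel sees the controller at all, which driver (if any) is bound to it, and how to assign a static IP by hand. A sketch only; the interface name eth0 and the 192.168.1.x addresses are placeholders that must match your actual interface and LAN subnet:

```shell
# Does the kernel see the Ethernet controller on the PCI bus at all?
lspci -nn | grep -i ethernet
# "Kernel driver in use:" in this output shows whether a driver is bound.
lspci -k

# List the interfaces the kernel actually created
# (the onboard port may not be named eth0).
ip link show

# If the interface exists but is down with no address, bring it up and
# assign a static IP manually. Placeholder names/addresses below.
ip link set eth0 up
ip addr add 192.168.1.50/24 dev eth0
ip route add default via 192.168.1.1
```

If `lspci` lists the controller but `lspci -k` shows no "Kernel driver in use" line for it, that points at the driver-issue guess in the post above; if the controller is missing from `lspci` entirely, the problem is below the OS (BIOS or hardware).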