
About rottenpotatoes


  1. So that was simple. When I opened both my second drive and my primary, I noticed that the EFI folder had a "-" character in front of it. I removed it and it booted fine in UEFI mode. Thanks all.
  2. So I RMA'd my RAM and got the replacements today. I popped them in and booted to Memtest on the second USB. I didn't see any errors after running a few minutes. I swapped back to my primary drive, the one that said not bootable, and that is still an issue. How do I get my drive bootable again? It had all my configurations, plugins, and such.
  3. I spoke too soon. Now I am getting errors like crazy on Test #8 (moving inversions, 32-bit pattern).
  4. I used another flash drive, made another bootable Unraid, and started Memtest. It would not work when booted in UEFI mode, but when I booted legacy, it went right to the utility. It's been running for a couple of hours now without error.
  5. I tried to go ahead and boot into UEFI mode to run that Memtest, but it fails to even try to boot. Whereas legacy mode comes up with an error message saying the device is non-bootable, when I choose UEFI mode in the boot menu it returns immediately, as if it didn't even try, and the screen just flickers like it refreshed. So now, after checking that box, I can't boot into my flash drive at all. And to answer Squid's question: no, I have not tried alternative cables, router ports, or Ethernet jacks on the motherboard.
  6. Actually, when I try to boot in non-UEFI mode it says the device is not bootable. I checked the BIOS and made sure that UEFI is set to 'Other OS', not 'Windows UEFI', and changed the Compatibility Support Module to 'UEFI and Legacy OpROM'.
  7. No, I have not run any previously. I just disabled that check box on my drive and rebooted. Is there a specific test I need to run?
  8. I rebooted into regular mode and I have attached this diagnostic file. unraid-diagnostics-20191002-1648.zip
  9. Although I was almost certain that I already had this set properly, I rebooted and went in to verify that the network stack was disabled in the BIOS. It is, and I attached a screenshot just to be thorough. I am now booted back into safe mode awaiting further instructions.
  10. I am using the standard Unraid image on the stable channel.
  11. That was interesting. So I stopped my Docker service and, after shutting down my VMs that auto-boot, the VM service. I then went to network settings to check 2 and 3 above. It was already set to IPv4 only, so I tried to change bonding to no. When I tried to hit save, my machine crashed again. I took a photo of my monitor and I'll attach it. I had to do a hard reset. After boot, I went back again to disable bonding. This time, when I tried to hit apply, I got an error stating that my flash drive was not mounted in R/W mode. So this time I did a reboot from the Main page. Upon this boot I was able to apply the bonding setting. Then I rebooted and entered safe mode. I downloaded my diagnostics and have attached the new file. unraid-diagnostics-20191002-1536.zip
  12. I bounced my server and then grabbed a fresh diagnostics zip. I have attached it. unraid-diagnostics-20191002-1442.zip
  13. I removed the NerdPack plugin and then grabbed a fresh diagnostics zip. I have attached it. unraid-diagnostics-20191002-1205.zip
  14. To the best of my knowledge, nothing major has changed. There have been VM updates and Docker updates, but I've been on 6.7.2 for a while now. I had added a new Docker container, but in my troubleshooting I turned it off and disabled autostart. That still did not fix my crashing. To the best of my knowledge, I am number 3, but nothing hardware-related has changed in quite a while.
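For reference, the bonding change applied in post 11 is persisted on the flash device rather than only in memory, which is why the "not mounted in R/W mode" error blocked the save. To the best of my understanding it lives in the network config file on the flash (key names here are a sketch from memory and may differ by Unraid version):

```
# /boot/config/network.cfg (sketch; exact keys may vary by Unraid version)
USE_DHCP="yes"
BONDING="no"
BRIDGING="yes"
```

This is why a clean reboot (remounting the flash read-write) let the setting apply where the earlier attempt failed.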
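For anyone who lands here with the same symptom: the fix described in post 1 amounts to renaming the EFI folder on the flash drive so the firmware can find it again. A minimal sketch of that rename, assuming the folder shows up as "-EFI" as described in the post (the demo below works on a throwaway temp directory rather than the real flash mount, usually /boot on a running server, so it is safe to run anywhere):

```shell
# Sketch of the fix from post 1: a "-" prefixed to the EFI folder name
# disables UEFI booting from the flash; removing it re-enables it.
# Demonstrated on a temp directory standing in for the flash drive.
flash=$(mktemp -d)
mkdir "$flash/-EFI"          # the "disabled" folder as it appeared on the flash

# Strip the leading "-" so the UEFI firmware can locate the boot files.
mv "$flash/-EFI" "$flash/EFI"

ls "$flash"                  # prints: EFI
```

On a real server you would perform the same rename against the flash mount point (or via a file manager on another machine with the USB stick plugged in), then reboot and select the UEFI boot entry.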