ptmurphy

Everything posted by ptmurphy

  1. Been running for a week or so with no issues after turning off C-States in the BIOS. Weird how this popped up. This server had been running for 3 years without the issue, but the 6.12.3 and 6.12.4 versions seem to have introduced it on my server. Like you said, probably a Linux kernel issue. Thank you for the help!
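     (A note for anyone landing here with the same symptom: a quick way to confirm the BIOS change took effect is to look at the idle states the kernel exposes. A minimal sketch; the exact state names vary by CPU and BIOS.)

     ```
     # List the idle states the kernel sees; with C-States disabled in the BIOS,
     # the deeper states (C3/C6, etc.) should be missing or never entered.
     cat /sys/devices/system/cpu/cpu0/cpuidle/state*/name

     # Usage counters: a deep state with a non-zero count is still being entered.
     grep . /sys/devices/system/cpu/cpu0/cpuidle/state*/usage
     ```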
  2. I will give this a try and see what happens. This server has been running without issues for about 3 years. Would it be normal for the problem to start after such a long period of stable operation?
  3. In the last few days I have noticed Unraid becoming unresponsive after a few hours. On startup everything works fine - shares, dockers, plugins, VMs, etc. However, after 2-3 hours it goes totally unresponsive - the computer is still on, but I can't ping the IP address and nothing is displayed on the screen. I did some searching and found that the Docker Custom Network Type being set to MACVLAN can cause issues, so I changed that to IPVLAN, but the problem is still happening. I am running Unraid version 6.12.4 (updated while the problem was happening on 6.12.3). Any other suggestions on what to look for? I have attached the diagnostics in case that will help. tower-diagnostics-20231105-1942.zip
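     (For anyone troubleshooting the same macvlan symptom: the driver a custom Docker network actually uses can be confirmed from the console. A minimal sketch; the network name br0 is just the common default and may differ on your system.)

     ```
     # List Docker networks and the driver each one uses
     docker network ls
     # Inspect a specific network to confirm macvlan vs. ipvlan
     docker network inspect br0 | grep -i '"Driver"'
     ```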
  4. I will keep an eye on it and see if it happens again. Updated to 6.12.3 today, so hopefully that stabilizes things a bit.
  5. Thanks Jonathan. Appreciate the input.
  6. Every once in a while one of my cache drives (in a pool by itself) also shows as an unassigned device. When this happens, nothing on the cache drive is accessible. A reboot fixes it, but any idea what could cause this?
  7. I have reached a point where my VM vdisk sizes are getting too large, taking up almost all of my cache drive, so I want to move them (or at least the main Windows 11 VM that takes up most of the space) to another location. I am putting in a 2TB Samsung 980 NVMe M.2 drive and was curious what recommendations this group would have. I have thought of these three options and am not sure what makes the most sense... 1. Upgrade my cache drive to the 2TB, removing the old 1TB SSD cache, and basically run as I am today with VMs on the cache drive (going from a 1TB SSD to a 2TB NVMe) 2. Add the NVMe drive (leaving the current cache for cache and Docker only), using the NVMe to store the vdisks. 3. Add the NVMe drive (leaving the current cache for cache, smaller VMs, and Docker only), converting the Windows vdisk to a physical disk and running it on the NVMe drive. 4. Other thoughts or suggestions? What would you do?
  8. This issue has not happened since I got the syslog server running, so I will close for now and re-open or create a new case if it starts happening again. For now I'll just assume it was a problem with a plugin or a Docker container and that an update resolved it.
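     (For reference, the move in options 1 or 2 comes down to copying the vdisk with the VM shut down and then pointing the VM at the new path. A minimal sketch, assuming example paths - /mnt/cache and a hypothetical /mnt/nvme pool; copying sparsely keeps the image from ballooning to its full allocated size.)

     ```
     # With the VM stopped, copy the vdisk to the new pool, preserving sparseness
     rsync -avh --sparse /mnt/cache/domains/Windows11/vdisk1.img /mnt/nvme/domains/Windows11/

     # Alternatively, qemu-img can copy (and convert formats) with progress shown
     qemu-img convert -p -O raw /mnt/cache/domains/Windows11/vdisk1.img /mnt/nvme/domains/Windows11/vdisk1.img
     ```

     After the copy, edit the VM's disk path to point at the new location before starting it.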
  9. OK, I got it up and running, logging to a cache-only share, and will let it run until the problem happens again.
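     (A quick sanity check that the capture is actually writing, sketched under the assumption of a share named syslog - the share name and log filename pattern will depend on how the syslog server was configured.)

     ```
     # Write a test entry, then confirm it lands in the capture file
     logger "syslog capture test"
     tail -n 5 /mnt/user/syslog/syslog-*.log
     ```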
  10. I upgraded to 6.10.0-RC2 specifically to get TPM support in the VM BIOS for Windows 11. Everything went well on the upgrade, and everything works fine, with one problem. About once a day my system becomes unresponsive - it locks up or crashes. The web interface doesn't respond, there is nothing on the primary display, and I looked through the logs but nothing seems to show up; maybe I am missing something (diagnostics attached). The only way I can get the system back is to force a reset by holding down the power button. Once that happens, everything starts fine and works, and then I have to manually stop the parity check. I did let the parity check finish this morning just in case. I looked through the support threads to see if there were any plugins that could cause this, but I don't see anything. Any ideas what could be going on? Unraid-diagnostics-20220213-2124.zip
  11. Thanks for the reply. I did finally get it to work. It seems like Unraid didn't like trying to switch from VNC to passing through the video after initially creating the VM. I started from scratch, passed through the video card, and just for safety's sake passed through the VBIOS as well.
  12. Does it get all the way through installing? If it hangs at the beginning or in the middle of the installation, I had a similar problem; when installing, choose the "Safe Video" (or similarly named) mode.
  13. I successfully upgraded my video card (from an Nvidia 970 to a 3060 Ti) and mostly everything is working (Unraid, Windows VM, Plex). However, I have set up a Ubuntu VM and am having a little problem. When using the 3060 Ti as the video card, all works well until I install the Nvidia drivers in Ubuntu. Without the Nvidia driver it displays in 4K, etc. I tried to install the Nvidia driver, and upon reboot I just get a black screen. Note that the Nvidia 3060 Ti is a secondary card. When I set the VM back to VNC for the video card, the VM will only boot into installation mode, trying to re-install Ubuntu. Any suggestions?
  14. I did the upgrade and it was fairly simple. I took the following steps, and it wasn't too bad. After swapping the video card, I did have a bit of trouble getting my machine to boot from the USB drive initially, but that was another issue. 1. Backed up my VM image file, VM XML settings file, and my Unraid USB stick to be safe 2. Removed Nvidia drivers from the VM 3. Shut down the VM 4. Shut down the server and swapped out the video card 5. Edited the VM setup to pass through the RTX 3060 Ti and its corresponding sound card 6. Reinstalled Nvidia drivers on the VM. When you boot into the VM for the first time, you will probably be stuck at a low resolution and get error 43 in Device Manager (assuming your VM is Windows) until you do step 6 and reinstall your video drivers from Nvidia. That is all it was for me. Everything went well and all is working as expected. Hope that helps...
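     (In case someone hits the same black screen: one recovery path is to drop to a text console and switch to Ubuntu's packaged drivers instead of the .run installer from Nvidia. A sketch only, assuming a standard Ubuntu install.)

     ```
     # From a text console (Ctrl+Alt+F3) after the black screen:
     sudo apt purge '^nvidia-'        # remove the broken driver packages
     sudo apt autoremove
     sudo ubuntu-drivers autoinstall  # install the recommended packaged driver
     sudo reboot
     ```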
  15. I had a similar question a few days ago in this thread... I haven't received any replies, so tomorrow I am going to proceed with the upgrade. I will report back here with any problems I run into and pointers based on my experience.
  16. I have finally secured a new video card and am planning to swap out my old one; I want to make sure I do this correctly and understand how to do it before I start. I have two cards currently in the system... 1. Nvidia GeForce GT 710 - used by Unraid 2. Nvidia GeForce GTX 970 - passed through to the VM, not using a VBIOS dump since it is a second card. I am using PCIe ACS override (set to "Both") to pass through an NVMe drive, a SATA controller (for a Blu-ray drive), and a USB controller for the mouse/keyboard and any other attached device such as an external drive. I am replacing the GTX 970 with an Nvidia RTX 3060 Ti. Here is what I was planning to do... 1. Remove Nvidia drivers from the VM 2. Shut down the server and swap out the video card 3. Edit the VM setup to pass through the RTX 3060 Ti 4. Reinstall Nvidia drivers on the VM. Are there any steps I am missing so that Unraid forgets the old card, sees the new card, and allows it to be passed through to the VM?
  17. That seems to have fixed the problem. I guess I should have tried the drivers first thing, but given that nothing changed except the version of Unraid, and the VM worked with the older driver before, it never crossed my mind that upgrading Unraid would affect the video driver in a VM. Thank you for the suggestion and help! I now have a Windows 11 VM successfully running.
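     (One check that may help between steps 2 and 3: after the swap, confirm the new card's PCI address and its companion audio function before editing the VM, since both should be passed through together. A sketch; the addresses and IDs shown are illustrative only.)

     ```
     # Find the new GPU and its audio function (your addresses/IDs will differ)
     lspci -nn | grep -i nvidia
     # e.g. 09:00.0 VGA compatible controller ... [10de:2486]
     #      09:00.1 Audio device ... [10de:228b]
     ```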
  18. I upgraded from 6.9.2 to 6.10.0-RC2 to convert a Windows 10 VM to Windows 11 for testing. However, when I start the Windows 10 VM, the video is locked at 800x600 and Windows doesn't seem to recognize the video card - an NVIDIA 970. The passthrough seems to be working since the video does display. Any suggestions on where to start? I have attached my diagnostics file. tower-diagnostics-20220131-1154.zip
  19. Thank you for the information. I hooked the drive back up in the USB enclosure and it still worked, so it seems the 3.3V power on pin 3 was my issue. I did cover pin 3 (more accurately pins 1-3 - dang, those pins are small; couldn't get the 1-pin-wide tape to stick, but covering 1-3 worked), plugged the drive in, and it worked! Again, appreciate the help!
  20. I am adding a fourth 8TB drive (1 parity, 3 data) to my array. All 3 previous drives have been shucked WD EasyStore drives and went into the array with no problems. This fourth drive (the same 8TB shucked WD EasyStore) works over USB, as well as in an external eSATA dock, but when connected to an internal SATA power cable it does not power up (I have tried several cables that other working drives are attached to, and still no power to that drive). In my research, it appears that there is a new standard on some drives where, if 3.3 volts are applied, the drive does not turn on, as referenced in this thread... However, I can't find how to actually fix this. It appears that the third pin on the SATA cable/connection to the drive needs to be disabled. Is that correct? If so, what's the best way to do that without any permanent damage? Also, how do I verify that this is actually the issue?
  21. Worked like a charm! Funny how sometimes the issues that seem complex end up being the easiest to solve. I thought I was going to be deep inside Unraid/Linux bootup scripts.
  22. Well, I'll be darned... I didn't know such a thing existed! I have one on order and will give that a go when it gets here on Thursday. Thank you very much!!!
  23. Interesting issue here, and I debated whether to put this here or in the VM section. Given that it seems to be an issue with Unraid switching the primary GPU, I thought maybe General Support was the place... feel free to move if needed. Here is the issue. I have a server set up with two video cards (both different versions of the Nvidia GT 710). I hook both monitors up and Unraid sees the card in the first slot as the primary GPU, which is what I want. I then pass the second video card through to the Windows 10 VM. This works perfectly as long as both monitors are connected and on during the boot process. The trouble begins when I remove the monitor connected to the primary video card (the one Unraid is using) and then reboot. When the system starts, if there is no monitor on the first card, Unraid takes over the second card as primary, which then prevents the VM from starting. I don't really want or need the second monitor hooked up and on all of the time, and would like to just use it for troubleshooting if needed. Any idea how to force Unraid to use the first card, even if a monitor is not hooked to it? Hardware: Ryzen 9 3900X processor; Asus AM4 TUF Gaming X570-Plus; 32GB RAM; 1GB Nvidia GT 710 in the PCIe slot closest to the processor (x16 slot); 2GB Nvidia GT 710 in the other x16 PCIe slot. I have searched the forums and watched many videos (most from SpaceInvader One) and haven't been able to find a resolution. Thank you for any direction on this issue!
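     (A software-side alternative sometimes suggested for this situation is binding the second card to vfio-pci at boot so Unraid never claims it. A sketch only; the vendor:device IDs are illustrative and must come from your own lspci -nn output, and because vfio-pci.ids binds by ID this only helps if the two cards report different IDs.)

     ```
     # Append to the kernel "append" line in /boot/syslinux/syslinux.cfg
     vfio-pci.ids=10de:128b,10de:0e0f
     ```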
  24. I use a program called Bvckup 2 on all of my computers to back up to the Unraid server. It moves only files that have changed (if you wish), and you can schedule the backups anywhere from real-time (any time a file change is detected) to once a day, week, or month. It is a very small program, very fast, easy to configure, and I have never had a problem with it. The licensed version is very cheap.
  25. Energen - thank you for the reply. Windows would never install at all... It would get to a certain stage and then error out with the error shown in the first message. The driver part was probably a bit of confusion on my part as a last-ditch effort. I did finally get it to work by downloading Windows a third time, creating a third ISO file, and installing from that. Not sure why it worked, but I did find a message on another board where someone had the same issue and it was a corrupt ISO installation file. It seems very unlikely that happened to me twice, but the third download did work.
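     (For anyone hitting the same install error: comparing the downloaded ISO's hash against the one Microsoft publishes would catch a corrupt download before the install. A minimal sketch; the filename is hypothetical.)

     ```
     # On Windows
     certutil -hashfile Win11.iso SHA256
     # On Linux / the Unraid console
     sha256sum Win11.iso
     ```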