iphillips
Members · 5 posts
  1. Meh, maybe not nightmares, but I'm starting to pull my hair out a bit. I'm trying to run a couple of VMs as daily-use computers. One, a Windows 10 machine, is used primarily for gaming (mostly emulation: Dolphin, CEMU, etc.). It was working very well until unRAID 6.9, when things got weird. I've had to edit the registry to explicitly turn on MSI signalling for my GPU (a 1050 Ti), etc. etc., to keep things from crashing constantly, but I've now got it working well. My Mac VM is coming along nicely too: Big Sur is passing through to my Radeon R9.

     USB passthrough is killing me, though. My X99 motherboard has a pair of Intel USB controllers. I've fiddled with BIOS settings to get them listed separately in unRAID without resorting to unsafe interrupts or any of that other nonsense. I'm giving one of them to unRAID; the other is stubbed, and my Windows VM is playing nicely with it.

     When it comes to adding another controller for the Mac, though, things go sideways. After fighting with a few cards that don't seem to handle PCI reset properly (one VL805-based, one FL1100-based, and one FL1100-based with a VL812 hub controller), I just picked up a Renesas uPD720201-based card that behaves properly when shutting down and starting VMs. Of course, unless I'm doing something wrong, it seems uPD720201 cards are not natively supported by macOS. I've tried assigning it to the Windows VM instead, but then that VM started locking up after 15 seconds or so. I restarted in safe mode and ensured that MSI signalling was enabled for that card, but I can't seem to get it to work.

     So: I'm four USB cards in, and zero for four. The Mac seems to do okay with the onboard X99 controller, but I apparently need that for the Windows VM. In the meantime I'm making do by passing individual USB devices through the Renesas card to the Mac VM, but that's not a long-term solution. I'm not posting logs or XML yet, since I'm looking for more of a plan of action.

     6.9 and beyond have made my Windows VM incredibly fragile with my current hardware setup, so I don't want to mess with that much. Is it possible to boot unRAID from a PCI USB card and stub both onboard controllers? Is there another PCIe USB card I should be looking at instead (I have an x4 slot available) that the Mac will play nice with? Should I give up entirely and find a new hobby? Would love some feedback. Thanks!
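For reference, here's roughly how I've been identifying the two controllers before deciding which one to stub. The sample lspci output below is illustrative, not copied from my board:

```shell
# Illustrative lspci -nn output (made-up sample, not my actual X99 devices).
sample='00:14.0 USB controller [0c03]: Intel Corporation C610/X99 xHCI Controller [8086:8d31]
00:1a.0 USB controller [0c03]: Intel Corporation C610/X99 USB EHCI #2 [8086:8d2d]
03:00.0 VGA compatible controller [0300]: NVIDIA Corporation GP107 [10de:1c82]'

# On a live system this would just be: lspci -nn | awk '/USB controller/ {print $1}'
# Each printed PCI address can then be ticked for stubbing under
# Tools -> System Devices (which, as of 6.9, writes to /boot/config/vfio-pci.cfg).
echo "$sample" | awk '/USB controller/ {print $1}'
```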
  2. Quick update -- this link, posted by mikeg_321, turned out to be the key. Enabling MSI interrupts for the GPU in question after booting into safe mode did the trick. So far it looks like we have a happy ending!
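For anyone who finds this thread later: the change in question is the usual MSISupported registry tweak, applied under the GPU's device instance key. A sketch of the .reg form is below -- the device instance path is a placeholder you'd read from your own Device Manager ("Device instance path" property), not an actual key from my system:

```
Windows Registry Editor Version 5.00

; <device-instance-path> is a placeholder, e.g. something like
; VEN_10DE&DEV_xxxx&SUBSYS_...\... for an Nvidia GPU -- copy yours exactly.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Enum\PCI\<device-instance-path>\Device Parameters\Interrupt Management\MessageSignaledInterruptProperties]
"MSISupported"=dword:00000001
```

Setting the value back to 0 (or deleting it) reverts to line-based interrupts if the driver misbehaves.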
  3. Slammed my head against a brick wall for months trying to solve this problem. X99 motherboard, Haswell-E CPU, Nvidia 1050 Ti GPU. 6.8.3 works perfectly, but both 6.9.2 and 6.10 completely hose my Windows 10 VM with a video TDR failure when initializing the GPU. Is anyone at Limetech aware of this issue? Because it's starting to look as though 6.8.3 is the last version I'll ever be able to use, and I'm already having trouble with apps requiring 6.9.
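If anyone wants to compare notes: one thing worth checking from the unRAID side is whether the guest driver actually brought MSI up on the passed-through GPU once it loads. The capability line below is a made-up sample, not from my card:

```shell
# Illustrative line from lspci -v for a GPU (sample, not my hardware).
sample='Capabilities: [68] MSI: Enable+ Count=1/1 Maskable- 64bit+'

# Live equivalent: lspci -vs 03:00.0 | grep 'MSI:'
# "Enable+" means the device is using message-signalled interrupts;
# "Enable-" means it fell back to legacy line interrupts.
echo "$sample" | grep -o 'MSI: Enable[+-]'
```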
  4. Long-time unRAID user here. I haven't really had any serious problems until now, but I'm ready to give up over this one. I had a working Windows 10 VM until recently, when a USB card started to get finicky. Before I realized it was a hardware problem, I decided to wipe the VM and start from scratch. Big mistake, apparently.

     Intel Haswell-E system. I'm passing through an Nvidia GTX 1050 Ti and one of my motherboard's onboard USB controllers; both are stubbed. The VBIOS image for the 1050 Ti is valid and gave me no problems before (the Nvidia card was set as the primary GPU). I'm installing Windows 10 Home from an official ISO, downloaded from Microsoft one week ago. I have a valid license, though activation seems to have no bearing on the issue.

     I've tried installing via VNC, applying all updates, and then installing drivers. This results in some interesting behaviour -- notably, whatever happens, after about five minutes the VM will crash and get caught in a boot loop. After a few reboots, I'm greeted with the recovery screen. Switching back to VNC and no longer passing through the GPU allows Windows to boot normally again. The boot loops begin a few minutes after either attempting to install the latest drivers off a USB stick, or installing the VirtIO network driver -- presumably because Windows then finds an Nvidia driver on its own to install.

     To rule out a bad GPU, I swapped the GTX 1050 Ti out for a GTX 950; it exhibits the same maddening behaviour. The GPU is either primary or secondary, depending on what I'm trying that particular minute. (I also have an AMD R9 that I use for a macOS VM. I've tried passing it to the Windows VM; video works but sound does not, though I haven't fought much of that battle.) Both GPUs are stubbed, the XML is edited with the appropriate multifunction='on', and I've triple-checked that the bus/function settings are correct.

     Not sure what other info might be helpful (logs, XML, etc.), but here's my VM XML from my latest failed attempt and my IOMMU groups. unRAID is set to boot in legacy mode; "Permit UEFI boot" is deselected. Any help would be immensely appreciated. VM XML.txt IOMMU Groups.txt VFIO PCI log.txt
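For completeness, this is the shape of the hostdev XML I'm describing: the GPU and its HDMI audio function on the same virtual slot, with multifunction='on' on function 0. The bus/slot numbers here are placeholders, not my actual addresses:

```
<!-- Placeholder addresses: source bus 0x03 is where the GPU sits on the
     host; guest bus/slot 0x05/0x00 is arbitrary but must match between
     the two functions. -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
  </source>
  <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0' multifunction='on'/>
</hostdev>
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x03' slot='0x00' function='0x1'/>
  </source>
  <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x1'/>
</hostdev>
```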
  5. Generally happy with unRAID, but one problem has been puzzling me lately. I'm running 6.7.2 on a six-core Haswell-E with a few VMs set up. Two are headless: one Linux and one macOS. Three have passed-through video cards: one running macOS, which is always on, and a Windows 10 VM that trades places with a LibreELEC VM depending on whether I'm gaming or not.

     All the VMs are smooth and stable. I can game on the Windows machine while Photoshopping on the macOS machine all day, no hiccups. The problem is when I start switching between LibreELEC and Windows. They use the same hardware setup -- passed-through video card, USB card, etc. -- except that I allocate less memory and fewer CPU cores to the LibreELEC VM. After switching back and forth a few times, it'll eventually throw an execution error when starting the one I want, telling me that a resource is unavailable. Shortly afterwards, unRAID locks up completely, and I have to hard-reset the server and check parity yet again. Also, when the server comes back up after this event, the VM tab is invariably missing, and I need to go back into Settings and re-enable it. Any advice? With this one exception, everything seems to be running perfectly.
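In case it's relevant, this is roughly the teardown order I've been assuming is safe when trading hardware between the two VMs: make sure the first domain is fully off before starting the second, rather than firing the start immediately. VM names below are made up, and the state function is a stand-in so the wait loop can be shown self-contained:

```shell
# With libvirt this would be:
#   virsh shutdown "Windows 10"
#   until [ "$(virsh domstate 'Windows 10')" = "shut off" ]; do sleep 2; done
#   virsh start "LibreELEC"
# Stand-in for `virsh domstate` so the wait loop below actually runs here:
state() { echo "shut off"; }

# Poll until the domain reports fully off (not merely shutting down),
# so QEMU has released the GPU/USB devices before the next VM claims them.
until [ "$(state)" = "shut off" ]; do sleep 2; done
echo "safe to start the next VM"
```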