Juxsta Posted December 15, 2020 (Author) On 12/10/2020 at 6:32 PM, turnipisum said: Try this one. I think it's correct. windows-vm.xml (6.59 kB) Hey, it turns out you can only move PCI devices that are on bus 0x00 to a slot greater than 0x00. If the bus is greater than 0x00, then the slot must be 0x00. Here's my modified config, which loads up but still gives me an error. Let me know if you notice anything wrong: windows-vm-turnip.xml
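For anyone hitting the same libvirt validation error, the rule described above looks roughly like this in the VM XML. These are illustrative `<address>` entries only; the actual domain, bus, slot, and function values depend on your own device layout:

```xml
<!-- A device on the root bus (bus 0x00) may use any free slot above 0x00 -->
<address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>

<!-- A device behind a pcie-root-port (bus > 0x00) must sit at slot 0x00 -->
<address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
```

On a Q35 machine, each pcie-root-port carries a single device, which is why libvirt rejects non-zero slots once the bus number is above 0x00.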
gray squirrel Posted December 25, 2020 So I'm still suspicious of the VBIOS, as I could never get the GPU-Z method to work. Since you're on an ITX board, you can't fit two GPUs. There is a new guide for this that will show you how to dump it directly in Unraid: https://youtu.be/FWn6OCWl63o
chay Posted January 1, 2021 (edited) FWIW, I've had success passing through a 3090 to a Windows VM without needing to dump the VBIOS.

I originally had a single 2080 Ti and had various issues setting up the Windows VMs. I followed various tutorials from SpaceInvaderOne and LinusTechTips, but that didn't fully solve the issue. I also needed to try some of the recommendations in https://forums.unraid.net/topic/85374-gigabyte-x570-aorus-elite-pro-wifi-ultra-tips-tricks/page/2/ as well as https://forums.unraid.net/topic/77609-guide-pass-through-primary-gpu-headless-unraid-fix-usb-panic-problems-and-linux-guest-audiovideo-stutter/ and was eventually able to pass through the 2080 Ti to the Windows VM. For the 2080 Ti, I did need to dump the VBIOS.

I then purchased the 3090, so I currently have both a 2080 Ti and a 3090 in my system. I'm stubbing both and passing them to separate VMs (the 3090 to the Windows VM and the 2080 Ti to an Ubuntu VM). Unraid doesn't have access to any GPU, which in my experience normally requires you to dump the VBIOS for the graphics cards, but for me it didn't: the 3090 worked fine without it. That said, it's being passed to a Windows VM that was already set up with the 2080 Ti and had updated drivers, so that may have helped.

Also FWIW, whenever I update the NVIDIA drivers in the Windows VM, the screen goes black and stays that way (this happens with both the 2080 Ti and the 3090 for me). I've found that if I just wait 10-20 minutes (longer than the update takes to finish) and then restart the VM, it works again with the driver updated. I'm not sure why that happens, but everything works correctly after the restart, so I haven't looked much further into it. Edited January 1, 2021 by chay (clarity)
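On the "stubbing both" point: on older Unraid releases this was typically done by binding the cards to vfio-pci on the kernel command line in syslinux.cfg. A rough sketch; the vendor:device IDs below are placeholders, so substitute the ones shown for your cards under Tools > System Devices:

```
label Unraid OS
  kernel /bzimage
  append vfio-pci.ids=10de:xxxx,10de:yyyy initrd=/bzroot
```

Newer Unraid versions (6.9+) can instead bind devices to vfio-pci with checkboxes in the System Devices page, which avoids editing syslinux.cfg by hand.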
chay Posted January 1, 2021 Also, have you tried an older version of Q35 for the machine type? For me it's working with Q35-4.2, and I see you're using Q35-5.1 in the initial screenshot. IIRC, earlier this year I tried creating a new Windows VM with an updated version and had issues.
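If you want to try the older machine type, it's set in the `<os>` section of the VM XML (edit the XML view of the VM, or pick the version from the Machine dropdown in the Unraid VM form). A minimal sketch, assuming a Q35 Windows template:

```xml
<os>
  <type arch='x86_64' machine='pc-q35-4.2'>hvm</type>
</os>
```

Note that only machine versions your QEMU build actually ships will be accepted; the dropdown in the Unraid UI lists the valid ones.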
Aistis4 Posted January 15, 2021 This one worked for me: https://mathiashueber.com/fighting-error-43-nvidia-gpu-virtual-machine/#comments
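For reference, the core of the Error 43 fix in guides like the one linked is hiding the hypervisor from the NVIDIA driver in the VM's `<features>` section. A sketch of the commonly suggested settings (the `vendor_id` value is an arbitrary 12-character placeholder, not something mandated by the guide):

```xml
<features>
  <acpi/>
  <hyperv>
    <!-- Spoof the Hyper-V vendor ID so the NVIDIA driver doesn't detect a VM -->
    <vendor_id state='on' value='1234567890ab'/>
  </hyperv>
  <kvm>
    <!-- Hide the KVM hypervisor signature from the guest -->
    <hidden state='on'/>
  </kvm>
</features>
```

Recent NVIDIA drivers (465+) officially support passthrough on consumer GPUs, so on newer setups these workarounds may no longer be necessary.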