NVIDIA GPU passthrough


WRX48


Ever since I watched Linus from LTT, I've wanted to use a VM for my gaming needs and a small nettop PC for basic web browsing, much like how the Shadow box does it. I also just like having the physical machine in a closet so I don't see it; I did the RGB thing and discovered the hard way that it's too much of a distraction for me when using the system.

Anyway, every time I pass through either a 1650 Super or a 1060 6GB card to a Windows 10 VM, the NVIDIA driver kills itself and I'm left with that awful 800x600 "Microsoft Basic Display Adapter" as my default, since Windows has stopped the card from working. I can't even remote into it; I have to use VNC to operate the VM. I have tried everything: watching his videos really closely to see if I missed anything, and also watching SpaceInvaderOne's video series on the subject, to no avail. I got to the point where I gave up and just built a basic workstation-looking gaming PC. But I ultimately want to get this going so that my wife and I have one physical machine and less clutter. I'm looking on eBay for a used server with a Xeon CPU in it for the core count, as the enthusiast Intel platform is out of my budget.

 

Help!!!


You don't say much about the system, but I have NVIDIA passthrough working fine with a 1650 (regular), a Quadro P1000 and a 1060 3GB. I run these as remote gaming VMs with Parsec.

 

I use SeaBIOS and the i440fx-4.2 machine type for the VM, as I keep my main system on the stable Unraid version, currently 6.8.3.
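
For reference, on my setup the relevant bit of the VM's XML looks roughly like this (a minimal sketch; the exact machine string depends on the QEMU version your Unraid release ships):

  <os>
    <type arch='x86_64' machine='pc-i440fx-4.2'>hvm</type>
    <!-- no <loader> element here, so the VM boots with SeaBIOS rather than OVMF -->
    <boot dev='hd'/>
  </os>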

 

I have multiple cards, so the card passed through is not the primary boot card.
The card(s) are stubbed using the VFIO-PCI CFG plugin (a manual alternative is sketched below).
Installed Windows.
Installed the VirtIO drivers.
Installed the NVIDIA driver.
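
If you ever want to stub the card without the plugin, the usual manual route on Unraid is to add the GPU's vendor:device IDs to the append line in /boot/syslinux/syslinux.cfg. A rough sketch; the 10de:xxxx IDs below are placeholders, so substitute whatever Tools > System Devices (or lspci -nn) reports for your GPU and its HDMI audio function:

  label Unraid OS
    menu default
    kernel /bzimage
    append vfio-pci.ids=10de:1f82,10de:10fa initrd=/bzroot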

 

I have a dummy HDMI plug connected to each card so it powers up correctly.

 

I find that sometimes, when making changes to the VM, it will suddenly ignore the fixed IP set in Windows, but I can usually still connect with RDP if I check the router for the VM's MAC address. Ideally I reserve the IP against that MAC in the router, which is more stable than plain DHCP.
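
If you need the VM's MAC address without digging through the XML, virsh on the Unraid console will print it (assuming the VM is named "Windows 10"; substitute your own VM name). The MAC column is the one to reserve against a fixed IP in the router:

  virsh domiflist "Windows 10"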

 

What is your system spec?

So, my setup was an AMD R5 2600, 16GB of RAM, and an Asus TUF B450 Gaming motherboard. It sounds to me like, given the constraints of the motherboard/case, it wouldn't have worked the way I wanted it to. What motherboard/case combo are you running? My problem is that installing the NVIDIA driver itself goes fine; I reboot, and as soon as I do, the driver detects it's in a VM and kills itself. I will have to get an ATX/E-ATX motherboard and case that will accommodate the changes and report back. The R5 has no iGPU, so that's probably my main problem.

11 hours ago, WRX48 said:

So, my setup was an AMD R5 2600, 16GB of RAM, and an Asus TUF B450 Gaming motherboard. It sounds to me like, given the constraints of the motherboard/case, it wouldn't have worked the way I wanted it to. What motherboard/case combo are you running? My problem is that installing the NVIDIA driver itself goes fine; I reboot, and as soon as I do, the driver detects it's in a VM and kills itself. I will have to get an ATX/E-ATX motherboard and case that will accommodate the changes and report back. The R5 has no iGPU, so that's probably my main problem.

My system spec is in my sig, but in case you're on mobile: it's an E5-2660 v3 in a Supermicro X10SRA, so HEDT, though my test machine, which currently has the 1060, uses an X99 board with the same chip. I deliberately went for PCIe lanes over single-core performance, as I only needed relatively low-power remote gaming VMs. One of these pretty much just runs whichever Roblox game my son is 'boosting' in 24/7, so it's not at all demanding.

 

For cases, I'm using Antec P101S cases, as there are 8 PCIe expansion slots in the backplane, which helps with an ATX HEDT board where the lowest (7th) slot is an x8 PCIe slot, since you can still fit a 1.5- or 2-slot-wide GPU there. I've actually moved one of the HDD modules from the test server to the main server, so I now have 10 HDD slots + 2 SSD slots + 4 further SSD mounts that I 3D-printed and attached to the PSU tunnel.

 

If you are using a single GPU, or the primary GPU, did you supply a VGA BIOS? These can be downloaded from TechPowerUp. You can also try disabling Hyper-V in the VM settings (see the sketch below). There are some threads on NVIDIA errors, but once I got around the changing IP address for RDP and needing a dummy plug to fake a monitor, I haven't had many other issues with NVIDIA. My AMD GPUs have been somewhat more fickle, resulting in me pulling one from the main server and using the spare Quadro for the gaming VM.
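
For what it's worth, Unraid's "Hyper-V: No" toggle just strips the <hyperv> enlightenments out of the XML; the other tweaks people commonly use for the driver-detects-a-VM problem are spoofing the Hyper-V vendor string, hiding the KVM signature, and pointing the passed-through card at a vBIOS ROM. A rough sketch of what those look like in the VM XML (the PCI address, ROM path and vendor string are placeholders, and newer NVIDIA drivers may not need the hiding tricks at all):

  <features>
    <acpi/>
    <apic/>
    <hyperv>
      <relaxed state='on'/>
      <vapic state='on'/>
      <spinlocks state='on' retries='8191'/>
      <!-- spoof the Hyper-V vendor string so the guest doesn't advertise KVM -->
      <vendor_id state='on' value='none'/>
    </hyperv>
    <kvm>
      <!-- hide the KVM signature from the guest -->
      <hidden state='on'/>
    </kvm>
  </features>

  <hostdev mode='subsystem' type='pci' managed='yes'>
    <source>
      <!-- bus/slot of the GPU being passed through; placeholder address -->
      <address domain='0x0000' bus='0x0b' slot='0x00' function='0x0'/>
    </source>
    <!-- vBIOS dumped from the card or downloaded from TechPowerUp; placeholder path -->
    <rom file='/mnt/user/isos/vbios/gtx1650.rom'/>
  </hostdev>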

 

There was some info that some AGESA versions on the AMD platform are better than others, so some BIOS versions may work better; however, it's not something I've needed to deal with as yet.

