Changed GPUs - Win10 VM hangs on boot - switched back no luck


dexxy


Hey all,

 

I have (had) a totally serviceable Windows 10 VM that I was passing a Radeon 290x Tri-X OC through to. It was loud and power hungry, so I swapped it out for a Radeon RX 570. The first time through, everything worked fine -- Windows immediately recognized the new card and updated drivers. I started getting visual artifacts, though, and shut down the machine. From that point on I haven't been able to get the VM to launch with a GPU passed through. The Windows logo shows and the dots spin until it freezes; or, if I try to pass through a card with a specified BIOS file, Windows isn't able to see the OS and tells me I can't use a system restore point. I switched back to the 290x but still can't get the VM to boot with the GPU passed through. I can VNC in fine, and I can get the passthrough working on an old vdisk backup from a few months ago, but I'd really like to avoid reverting to that old backup.
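For reference, specifying the BIOS (vbios) file happens in the VM's XML, in the GPU's hostdev block. Something like the fragment below, though the PCI address and rom path here are placeholders -- yours will differ:

```xml
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <!-- placeholder PCI address; match it to your GPU's slot -->
    <address domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
  </source>
  <!-- placeholder path to the dumped vbios file -->
  <rom file='/mnt/user/isos/vbios.rom'/>
</hostdev>
```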

 

So, any advice here? Would a different card (Nvidia?) help?

 

thanks!


@dexxy For quite a while, some of the newer AMD drivers had issues when installed in a VM. I guess the 290x wasn't using that driver version and was fine. I don't know for sure whether the Windows driver is your issue, but back then most people had problems with the AMD one. Try installing a driver released around October 2018 or earlier; that worked for most people. A couple of people reported that by using Q35 as the machine type for the VM they had no issues. Maybe test this, but keep in mind it's possible you'll lose your Windows activation and have to reactivate the VM. Test with a fresh install or a copy of your VM first, before messing up your existing one.
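For anyone trying the Q35 suggestion: the machine type lives in the &lt;os&gt; block of the VM's XML. A sketch of what it looks like (the exact version suffix depends on the QEMU build your unRAID version ships):

```xml
<os>
  <!-- switch from the default i440fx to Q35; version string varies -->
  <type arch='x86_64' machine='pc-q35-3.1'>hvm</type>
</os>
```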


Finally got it working! A good 20 hours of troubleshooting, but I finally have it up and running. I ended up swapping in a GeForce 1070. I ran into some issues with PCIe slots, though: I couldn't get the card to show up in my system devices list. I ended up pulling every PCIe card except the GPU, and then it appeared. Then I slowly plugged everything back in, but couldn't get both of my USB 3.0 controllers recognized. I finally straightened that out; for anyone who comes across this in the future:

 

System:

Gigabyte GA-7PESH2 server board

2x E5-2667 v1 Xeons

32GB RAM at 1600MHz

Nvidia 1070

USB 3.0 controller with four rear-facing ports

USB 3.0 controller with two internal headers (for case USB ports)

 

Basically, this is a mobo quirk. The GA-7PESH2 splits up its PCIe slots: four of the five (the topmost four) go directly to CPU0, while the remaining x1 slot goes into the PCH chipset and then to CPU0. My cards were from the same manufacturer and seemed to trip up unRAID when they were both going directly into CPU0. I split the cards up, with one in slot 4 and one in slot 5 (the x1 slot). Everything is working fine now, both the GPU and the USB controllers.
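If anyone wants to sanity-check grouping before physically shuffling cards, here's a rough sketch assuming a standard Linux sysfs layout (`list_iommu_groups` is just a helper name I made up; it walks `/sys/kernel/iommu_groups` and prints one line per device):

```shell
# Sketch: list every PCI device by IOMMU group. Devices in the same
# group generally have to be passed through together, so this shows
# whether your slot arrangement actually isolates the GPU.
# The sysfs root is a parameter only so it can be pointed at a test tree.
list_iommu_groups() {
  root="${1:-/sys}"
  for dev in "$root"/kernel/iommu_groups/*/devices/*; do
    [ -e "$dev" ] || continue   # skip cleanly if no groups exist
    group=$(basename "$(dirname "$(dirname "$dev")")")
    echo "group $group: $(basename "$dev")"
  done
}

# On the host itself, just run: list_iommu_groups
# Separately, "lspci -tv" shows which bridge each slot hangs off.
```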

