GPU passthrough issue



Problem description:

When I boot my Win 10 VM with the GPU passed through, the screen goes black and nothing else happens. The VM works just fine with VNC graphics.

 

My Setup and Settings:

I checked that IOMMU and HVM are both enabled. My VM is running as Q35-2.5 with the OVMF BIOS.

I am trying to pass through a GTX 970 (Inno3D HerculeZ) along with 6 cores of a 6700K, on an MSI Z170 Krait Gaming (the "old" one).
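In case it helps with checking the isolation side: this is a rough sketch (standard Linux sysfs paths, not unRaid-specific commands) for listing devices by IOMMU group from the console, so you can confirm the GTX 970's video and HDMI-audio functions sit in a clean group:

```shell
# List every PCI device grouped by IOMMU group; the group numbers and
# addresses will differ on every board.
for d in /sys/kernel/iommu_groups/*/devices/*; do
    g=${d#*/iommu_groups/}; g=${g%%/*}   # extract the group number from the path
    echo "Group $g: $(lspci -nns "${d##*/}")"
done | sort -V
```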

 

Any ideas what might help at this point?


I have barely had time to debug it, but I'm seeing a similar issue on a similar system: a Core i7-6700, an ASRock Z170 Extreme7+ motherboard, and an Asus Strix GTX 970, running unRaid 6.2b18. The BIOS is set to use the discrete GPU rather than the integrated one on boot, and IOMMU and HVM are both enabled as well.

 

I can see the text-mode console output from unRaid on boot, but as soon as I start the Win 10 VM to install it, I get a black screen and nothing else.

 

[Edit: forgot to add that I'm using the default "Windows 10" VM template that's included with unRaid 6.2b18 and I haven't messed with the config. I'm passing in 6 cores and about 20GB of RAM to the VM.]

 

One thing that occurred to me was that I might need to switch cables or something, as I'm using fairly old HDMI ones. I have a monitor being shipped to me and was planning to experiment with the DisplayPort interface on the GPU instead once that arrives. I'll follow up if I discover anything.

 

Are there logs or anything inside unRaid that I can look at to help diagnose the problem?
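On the log question: a few places worth checking from the unRaid console (paths assumed from a standard libvirt/KVM layout; the VM name below is just the default template name):

```shell
# Per-VM QEMU log: shows device-assignment and vfio errors at VM start.
tail -n 50 "/var/log/libvirt/qemu/Windows 10.log"

# Kernel messages about VFIO binding and IOMMU faults.
dmesg | grep -iE 'vfio|iommu'

# The main syslog, filtered to libvirt entries.
grep -i libvirt /var/log/syslog
```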


Also having this issue on the following build:

 

Intel Core i7-3970X
Gigabyte X79-UP4
32GB RAM
MSI GTX 980
EVGA GeForce 210
GeForce 8600

 

The 980 is not working at all, even when a ROM file is provided. I've tried both SeaBIOS and OVMF with no luck.

 

I was able to create another VM with the 210 and had no issues.
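Since the ROM is already being provided: for comparison, this is roughly how a dumped vBIOS is referenced in the VM's libvirt XML. The PCI address and file path here are examples only, not taken from this build:

```xml
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <!-- Example address: replace with the 980's actual bus/slot/function -->
    <address domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
  </source>
  <!-- Example path: point this at the ROM dumped from your own card -->
  <rom file='/mnt/user/vbios/gtx980.rom'/>
</hostdev>
```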


I was able to get farther with this by setting up the onboard GPU to be the main display in my BIOS instead of the external GTX 970 video card. After rebooting, when I passed the external card to the VM it wound up displaying Windows. (Now I'm having trouble with getting the VirtIO drivers to load, but that's a separate issue.)

 

At the same time I swapped out the video cable I was using to connect, so possibly that made a difference too.
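One more check that may be relevant here: from the unRaid console you can see which kernel driver grabbed each video card, since the passed-through card should not be held by the host. A minimal sketch, assuming lspci is available as it is on unRaid:

```shell
# Show each VGA device and the kernel driver bound to it; the card being
# passed through should show vfio-pci (or no driver), not nouveau/nvidia.
lspci -k | grep -A 2 -i 'vga'
```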


I ran into a similar issue with my AMD W7100, until I noticed that the indicator lights on my keyboard were acting strangely. After a lot of experimenting I found that even though the screen was black, Windows was still trying to load, but the GPU never got an activation signal. My fix was:

 

1) Force stop VM

2) Change graphics to VNC. One of two things will happen: you will either boot straight into Windows as normal, or you will get an EFI boot screen. If you get the boot screen, type EXIT, choose Boot Device, then highlight either "EFI device" or "EFI device 1" and it will boot Windows

3) Shut down Windows from within the VNC session

4) Restart your unRAID server

5) Edit the VM to boot with your GPU again; it should now kick in and boot as normal
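For anyone who prefers the console, the steps above can be sketched as virsh commands. This is only a sketch, assuming the VM uses the default template name; on unRaid the same actions are available from the web UI, and because step 4 reboots the whole host the function below is defined but deliberately not called:

```shell
recover_vm() {
    virsh destroy "Windows 10"     # 1) force-stop the VM
    # 2) switch graphics to VNC in the VM editor, then boot and follow the
    #    EFI boot-device steps if the boot screen appears
    virsh start "Windows 10"
    virsh shutdown "Windows 10"    # 3) clean shutdown from inside VNC
    reboot                         # 4) restart the unRAID host
    # 5) edit the VM back to the passed-through GPU and start it again
}
# Run `recover_vm` manually once the VM name matches yours.
```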

 

This is an issue with some AMD cards not resetting properly on shutdown; it could apply to your card too, so it's worth a try.

 

