GPU Passthrough giving Black Screen with any VM



Hi,

 

I have been trying to use Unraid as a desktop, with GPU passthrough for my VMs. The VMs work fine over VNC; I have successfully installed Windows 10 and Ubuntu 19.10 VMs that way. However, I get no signal to my monitor whenever I start a VM with GPU passthrough.

 

Here is my setup:

- Asus ROG Strix 470-f Gaming motherboard.

- AMD Ryzen 7 2700x

- Nvidia GT 520 in PCIe slot 1 (used as my main graphics card; the Unraid GUI displays on the monitor connected to this card)

- Nvidia GTX 1080 Ti in PCIe slot 2 (the card I want to pass through; the monitor connected to it shows only a black screen)

- 16GB RAM

 

I do not have integrated graphics, which is why I put my old GT 520 in this computer: it handles Unraid, leaving the 1080 Ti free for passthrough. AMD-Vi (IOMMU) is enabled in the BIOS.
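As a quick sanity check, AMD-Vi and the IOMMU grouping of the two cards can be inspected from the Unraid terminal with something like the following (a generic sketch, not taken from the attached diagnostics):

    dmesg | grep -i -e AMD-Vi -e IOMMU    # confirm the IOMMU actually came up
    # list every PCI device by IOMMU group; ideally the GT 520 and the
    # 1080 Ti (plus its HDMI audio) end up in separate groups
    for d in /sys/kernel/iommu_groups/*/devices/*; do
        g=${d#*/iommu_groups/}; g=${g%%/*}
        printf 'IOMMU group %s: ' "$g"
        lspci -nns "${d##*/}"
    done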

 

Here is what I tried:

- ACS Override did not solve the issue

- The problem happens whether I create the VM with OVMF or SeaBIOS

- I dumped my 1080 Ti's ROM using SpaceInvader's tutorial, which didn't solve it either

- I tried setting the VM to a Q35 machine

 

Attached are the syslog and the VMs' XML files.

 

Thank you for your help.

tower-diagnostics-20200126-1052.zip vm_windows.xml vm_ubuntu.xml

Link to comment

First try passing through both the GPU and its HDMI audio device. Passing through just one usually doesn't work.

 

Failing that, try stubbing the 1080 Ti by adding this to your syslinux config after "append":

 vfio-pci.ids=10de:1b06,10de:10ef 

Reboot and try again (passing through both the GPU and its HDMI audio).
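For reference, the edited section of /boot/syslinux/syslinux.cfg would look roughly like this (the two IDs are the 1080 Ti and its HDMI audio from the line above; confirm your own with lspci -nn):

    label Unraid OS
      menu default
      kernel /bzimage
      append vfio-pci.ids=10de:1b06,10de:10ef initrd=/bzroot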

 

Failing that, back up your motherboard BIOS settings, then update to the latest BIOS and try again.

Link to comment

I had a similar issue once when trying to use two graphics cards. I don't remember exactly what I did to make it work, but I ditched the idea because the gaming performance running off the second PCIe slot was horrible.

 

Try using only the 1080 Ti, in the first slot. Run Unraid headless, make a user script to stub the card after boot, and then try passing it to a VM.
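A minimal sketch of such a user script, assuming the 1080 Ti's GPU and audio functions sit at 0000:01:00.0 and 0000:01:00.1 (placeholder addresses; check yours with lspci):

    #!/bin/bash
    # bind the 1080 Ti (GPU + HDMI audio) to vfio-pci after boot
    modprobe vfio-pci
    for dev in 0000:01:00.0 0000:01:00.1; do
        # detach from whatever driver claimed the device at boot
        if [ -e "/sys/bus/pci/devices/$dev/driver" ]; then
            echo "$dev" > "/sys/bus/pci/devices/$dev/driver/unbind"
        fi
        # tell the kernel this device should use vfio-pci, then re-probe it
        echo vfio-pci > "/sys/bus/pci/devices/$dev/driver_override"
        echo "$dev" > /sys/bus/pci/drivers_probe
    done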

Edited by Ver7o
Link to comment
  • 5 months later...

Sorry to hijack this oldish thread, but I'm having a similar issue with the Macinabox Catalina VM. It works really well using the VNC viewer, but after I pass through my AMD Vega 56 GPU, my console turns black and I can't get the macOS VM to see/use the GPU. The VM says it started, but of course I can't see the Mac boot loader, and once it's up I can't Splashtop into the Mac; it never becomes available in the Splashtop app except when I use VNC, which of course means the GPU isn't being passed through. I've manually edited the XML to make sure all the changes made in form mode are correctly reflected, and I've read and tried all the other posted tips and possible solutions with no joy. I'm really struggling and would appreciate any assistance! Here are my diagnostics with the anonymized XML embedded: server1-diagnostics-20200727-1918.zip

Edited by PeteManhardt
Had to correct the statement that I'm unable to remote into the running VM after passing the GPU through; it's only when I run it with VNC graphics that I can
Link to comment

Hey, this might be a silly question, but during passthrough are you keeping VNC enabled? I have the exact same issue when I have both VNC and a GPU enabled for a VM. Disabling VNC makes everything just work for me.
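For anyone wondering what "disabling VNC" looks like in practice: in the VM's XML it comes down to having no VNC graphics or emulated video device at all, i.e. removing elements like the ones below so the passed-through GPU is the only display adapter (a generic libvirt example, not taken from anyone's template):

    <!-- remove these so only the passed-through GPU remains -->
    <graphics type='vnc' port='-1' autoport='yes' listen='0.0.0.0'/>
    <video>
      <model type='qxl'/>
    </video>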

Edit:
Also, I just reviewed your setup and it's very similar to mine, except I have four 1080 Tis. I had this exact problem on two different systems, both due to chipset limitations.

On my first system, the M.2 slot shared its connection with one of the PCIe slots, so both couldn't function at the same time. (Weirdly enough it was one of the middle PCIe slots, which confused me even more.)

My second system was advertised as having multiple PCIe 4.0 slots, but I later found out that they couldn't all run at PCIe 4.0 at the same time. The fourth slot, holding my last card, drops to PCIe 1.0 when the other three are in use, which produced a black screen whenever I tried to pass it through, effectively making my fourth 1080 Ti little more than a brick.

Edited by neogenesisrevo
Link to comment

I was considering trying to figure out how to get VNC working alongside GPU passthrough, but I hadn't tried it, so thanks for sharing that you weren't successful. I have an onboard ASPEED VGA that I use via the iKVM console, and a single AMD Vega 56 GPU (which Catalina supports) connected to a display; that display shows no output during the Unraid boot or at any point through the VM startup. I'm not seeing any hardware conflicts in the logs. On your thought about PCIe versions: I know the slot the Vega is in (the only PCIe slot being used) is a full PCIe 3.x slot, so I'm stumped. The thing is, as soon as I edit the XML to remove the Vega and use VNC, it works really well. I'm simply passing the Vega through so Catalina recognizes it and can use it for video editing, as a birthday 'gift' for my teenage son; days later, it's sort of backfiring on me :(

 

Anyways, thanks for the reply!

Link to comment

SOLVED! Thanks to Sparkie's post referenced below, I was able to resolve my issue by doing the following:

I changed my XML as per his post, adding the multifunction tag with the audio device occupying the same bus/slot (though I didn't include the ROM entry). I then changed Unraid's flash drive to boot legacy instead of UEFI, set the ASUS BIOS to CSM for both legacy and UEFI, and then to legacy for all listed items except the flash disk. Unraid booted with no errors, which I could watch from the iKVM. I then started my Macinabox VM again with no errors and could see it on the display via the Vega, logged into the VM from my Win10 PC via Splashtop, and it worked, showing the Vega in the System Information graphics output!
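For anyone replicating the multifunction change, the two hostdev blocks in the VM's XML end up looking roughly like this, with the GPU on function 0x0 (multifunction='on') and its audio device on the same virtual bus/slot at function 0x1. The source and target addresses below are placeholders; use the ones from your own system:

    <hostdev mode='subsystem' type='pci' managed='yes'>
      <driver name='vfio'/>
      <source>
        <address domain='0x0000' bus='0x0b' slot='0x00' function='0x0'/>
      </source>
      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0' multifunction='on'/>
    </hostdev>
    <hostdev mode='subsystem' type='pci' managed='yes'>
      <driver name='vfio'/>
      <source>
        <address domain='0x0000' bus='0x0b' slot='0x00' function='0x1'/>
      </source>
      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x1'/>
    </hostdev>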

 

Now, I know that without the ROM entry I have an issue: whenever I stop the VM, the Vega fans go to full blast, and I get an execution error when I try to restart the VM, which I resolve by restarting the Unraid server. For me and my son this is the only VM we'll be hosting, since he's getting into video editing/rendering, so the VM has all the CPU cores and RAM allocated anyway; it's not a big deal to occasionally restart Unraid, since it's connected to a UPS and a whole-house generator and will otherwise run non-stop. If it becomes a bigger deal, we'll tackle it then. He's happy, and that's all that matters :)
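For reference, the ROM entry mentioned above is a single <rom> element inside the GPU's hostdev block, pointing at a dumped vBIOS file (the path here is just an example); whether it also cures the fan/restart behaviour will depend on the card:

    <hostdev mode='subsystem' type='pci' managed='yes'>
      <driver name='vfio'/>
      <source>
        <address domain='0x0000' bus='0x0b' slot='0x00' function='0x0'/>
      </source>
      <rom file='/mnt/user/isos/vbios/vega56.rom'/>
      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0' multifunction='on'/>
    </hostdev>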

 

Link to comment
  • 8 months later...
On 7/27/2020 at 5:43 PM, neogenesisrevo said:

Hey, this might be a silly question, but during passthrough are you keeping VNC enabled? I have the exact same issue when I have both VNC and a GPU enabled for a VM. Disabling VNC makes everything just work for me.

How do you disable VNC? I assumed they were mutually exclusive options, but I'm getting the same black screen with GPU passthrough and would like to try disabling VNC if I can figure out how.

Link to comment
  • 9 months later...

Hi all, 

 

I'm desperate for help on this one. I've been trying for days to get a VM with GPU passthrough working, without any success. I carefully followed SpaceInvader's videos, and Superboki's videos as well, with no luck. I've had a VM working with VNC without any issues, but the problems start when I try to set it up with passthrough. I am able to get Windows to show up for a couple of seconds or minutes, and then the screen goes black. Here's my setup:

 

Asus B560 Prime A

Intel i5 11600K (6 cores passed through)

Asus Nvidia GeForce GTX 780 (primary GPU; iGPU dedicated to Unraid)

16GB RAM at 3200MHz (12GB passed through)

 

I forced my BIOS to use the iGPU as the primary display to free up my GPU; to do that, I am booting Unraid in UEFI mode. When rebooting, Unraid sometimes fails to start, so I need to restart it and then it works. I've added a vBIOS, just as SpaceInvader's guide requires, and edited it using a hex editor. In the latest VM I tried, I made sure not to install any VNC driver as the video card. It's getting frustrating, as I feel I'm close to getting it but can't figure out what's missing.

 

Thanks for your help!

 

 

 

 

Edited by startsomewhere
Link to comment
13 hours ago, startsomewhere said:

I am able to get Windows to show up for a couple of seconds or minutes, and then the screen goes black

The GPU is not set as multifunction.

You didn't run the VM before attaching the diagnostics, so there's no info on what's happening.

The GPU is not isolated (bound to vfio).

You may need to dump and use your vBIOS.

Run the VM, then attach diagnostics if you are not able to fix it yourself.
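For the isolation step, the vendor:device IDs needed for a vfio-pci.ids entry (as in the syslinux edit earlier in this thread) can be read straight from lspci; the output below is only an illustration of what a GTX 780 and its HDMI audio might report. On recent Unraid releases the same binding can also be done by ticking the devices under Tools > System Devices.

    lspci -nn | grep -i nvidia
    # example output (your addresses and IDs will differ):
    # 01:00.0 VGA compatible controller [0300]: NVIDIA Corporation GK110 [GeForce GTX 780] [10de:1004] (rev a1)
    # 01:00.1 Audio device [0403]: NVIDIA Corporation GK110 HDMI Audio [10de:0e1a] (rev a1)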

Link to comment
