Windows 10 guest VM fails when installing ATI RX480 drivers


Neo_x


Hi everybody

 

I am at my wits' end here and hope somebody with a similar system/setup can provide guidance - especially since, with an AMD card, it seems like I should not be encountering this :/

I am trying to get a stable Windows 10 VM running for gaming, in an attempt to save on buying additional hardware for a second machine.

 

 

 

Some notes:

  • Asus Prime X370 Pro (latest BIOS loaded)
  • Ryzen 2700X CPU
  • HIS RX 480 graphics card (the only card installed; no onboard graphics available)
  • 2 x 8-port SAS cards (thus all three PCIe x16 slots are occupied)
  • 64GB RAM (Corsair LPX 3000MHz C15 memory kit for DDR4 systems) - I believe the BIOS at default settings is not running this at 3000MHz yet (will play around with adjusting this once I get a stable solution going)
  • No ACS multifunction enabled (all devices I want to pass through are in separate IOMMU groups - see the one-liner after this list). ACS multifunction was tested, however, with no change in the issues below.
  • 64-bit Radeon Software Adrenalin Edition 18.10.2 (Oct 25) was used
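
For anyone wanting to double-check their groupings, the standard sysfs walk below lists every device together with its IOMMU group (nothing Unraid-specific here):

```
#!/bin/bash
# List every PCI device with its IOMMU group, to confirm the GPU (and its
# HDMI audio function) sit in a group of their own.
shopt -s nullglob
for g in /sys/kernel/iommu_groups/*; do
    echo "IOMMU group ${g##*/}:"
    for d in "$g"/devices/*; do
        echo -e "\t$(lspci -nns "${d##*/}")"
    done
done
```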

 

First I tried a UEFI setup (UEFI enabled in the BIOS as well as on the Unraid flash drive), creating an OVMF Windows 10 template with the graphics card passed through, including a captured vBIOS (the one downloaded from TechPowerUp didn't work). Starting the VM with the OVMF BIOS did not produce any output on the connected monitor - not sure why.
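
In case it helps anyone reproduce the setup: a vBIOS capture is typically done from the Unraid console along these lines (a sketch - the PCI address and destination path are examples, and the dump can fail while the card is driving the host console, which is a complication when it is the only GPU in the box):

```
# Run with the VM stopped. Find your GPU's address with: lspci | grep -i vga
cd /sys/bus/pci/devices/0000:03:00.0   # example address -- substitute yours
echo 1 > rom                           # make the ROM readable
cat rom > /mnt/user/isos/rx480.rom     # example destination path
echo 0 > rom                           # lock it again
```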

 

A few guides I read through recommended disabling UEFI and creating the VM using SeaBIOS (I also disabled Hyper-V as part of my troubleshooting).

This presented me with the normal Windows installation, which then allowed me to get to the desktop, install the VirtIO drivers for the three unknown devices (the network adapter especially), and disable Windows UAC.

After this I restarted the VM, downloaded the ATI drivers, and installed them. At roughly 60% (usually the point where a normal computer's display goes blank for a few seconds before the driver install resumes), the passed-through display switched off and never returned.


KVM XML and diagnostics attached

 

To match the time entries in the diagnostics syslog, please see below:

 

  • 19:35 - start VM (default VGA driver only - RX480 passed through)
  • 19:44 - installation started (Custom; graphics and audio driver only)
  • 19:47 - roughly 60% - screen goes dark, no further output from the VM
  • 19:52 - attempt a normal VM stop via the Unraid web GUI - spinning icon for 20 seconds, then it shows the normal started icon again
    • nothing in the VM logs or syslog to indicate any issues (see the host-side check after this list for another place to look)
  • 19:54 - force stop and start VM
    • Windows logo is shown, spinning icon, then it freezes
    • no error in the VM or system log
    • have to force stop; not recoverable
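
For anyone wanting to dig deeper on the host side, checks along these lines might surface VFIO or IOMMU errors that never reach the Unraid syslog (a sketch - the PCI address is an example, substitute your GPU's):

```
# Look for VFIO / AMD-Vi (IOMMU) messages around the time of the hang.
dmesg | grep -iE 'vfio|AMD-Vi' | tail -n 50
# Confirm the GPU is still bound to vfio-pci rather than left orphaned.
lspci -k -s 03:00.0
```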

 

 

I am really not sure what I could be doing wrong here. Does anybody have some additional pointers I could try?

 

Thank you team!

 

 

storage-diagnostics-20181101-2031.zip

windows 10 .xml


/bump

Anybody have ideas I could try?

I attempted to switch to a pre-installed Windows 10 image with drivers already loaded, where some progress was made, but it still seemed unstable (no crashes, but I was unable to start any software/games that required 3D).

 

I will try to install the drivers again, or, if all else fails, see if I can get a secondary card installed (the challenge being that I would need to sacrifice an 8-port SAS card, which will reduce my array capacity by 8 drives :( )

 


Is this the issue where some graphics cards fail to reset properly? I believe it usually affects people who pass a card through to one VM, then stop that VM and try to pass the card through to a different VM. Since the driver installation is effectively resetting the card, perhaps it's the same issue?
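
If it is the reset problem, one thing that sometimes helps between VM runs is a soft re-plug of the device through sysfs - a sketch, with an example PCI address (find yours with lspci):

```
# With the VM stopped: remove the GPU from the PCI tree, then rescan the bus.
echo 1 > /sys/bus/pci/devices/0000:03:00.0/remove
echo 1 > /sys/bus/pci/rescan
# If the card still doesn't initialise cleanly afterwards, a full host
# reboot is often the only reliable reset for cards of this generation.
```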

 

Note that your three PCIe x16 slots are actually configured as x8 (v3), x8 (v3) and x4 (maximum, v2) from top to bottom.

The bottom slot may receive as few as x2 PCIe v2 lanes if you use the x1 slots that share lanes with it.
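
You can verify what each slot actually negotiated with lspci (example address shown):

```
# LnkCap = what the slot/card can do; LnkSta = the width/speed in use now.
lspci -vv -s 03:00.0 | grep -E 'LnkCap:|LnkSta:'
```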

 

Do you really need so many disks? Fewer, larger-capacity disks perform much better than many lower-capacity ones. If you're aiming for many high-capacity disks, then maybe you need a dedicated NAS box and a dedicated VM box.

 

There are lots of cheap single-slot graphics cards that you could use to dedicate a GPU to Unraid. You can even get them with an x1 connector, or use a small adapter card that allows an x16 low-profile card to fit in an x1 full-height slot.


I had a similar situation when running my VMs with passthrough using any method except one: non-UEFI Unraid, with the VM running a Q35-2.10 machine type, an OVMF BIOS, and a known-valid ROM for my video card. The ROM is especially crucial if you're taking the GPU away from Unraid (running it headless) for your VM. I had to try a few different video BIOSes (ROMs) to find one that would work.
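
A quick sanity check before trusting a dumped ROM, as a sketch (the path is an example, and the commented XML line shows where a known-valid ROM gets wired into the VM):

```
# A valid PCI option ROM starts with the bytes 55 aa.
xxd /mnt/user/isos/rx480.rom | head -n 1   # expect '55aa' right after the offset column
# Then point the VM at it via `virsh edit <vm-name>`, inside the GPU's
# <hostdev> element:
#   <rom file='/mnt/user/isos/rx480.rom'/>
```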

Edited by metathias
