NVIDIA Corporation GP107 [GeForce GTX 1050] (rev a1) - Multiple Monitor Support Not Working


Sparkie

Solved by ghost82

It looks like I have lost multiple-monitor functionality on my Windows 10 VM. I was running Unraid 6.9 and everything was working perfectly: two 1080p monitors and an Epson projector. While I was away, a power failure occurred that lasted over 20 minutes, at which point my UPS shut down. On restart, Unraid came up OK but my Win10 VM was no longer functional. On logging in to the GUI, Unraid flagged that I was running an unsupported version and recommended upgrading to the latest, which I did. After a reboot everything looked OK, but the VM still would not start. Realizing my VirtIO drivers were probably out of date, I upgraded to virtio-win-0.1.225-1.iso and tried restarting the VM; no joy.

I edited the VM and changed the BIOS setting to OVMF TPM (I could not get plain OVMF to work).

The Machine parameter is set to i440fx-7.1 (the latest; previously it was 6.1, though I am not absolutely certain).

The video card is passed through to the VM along with its sound device.

I tried rebooting and it worked: Windows 10 started OK, but with only one monitor. I installed the latest NVIDIA drivers for the 1050 graphics card and still no joy; only one of the three monitors works. I opened the NVIDIA Control Panel and clicked on multiple monitors, and it shows the 3 connected monitors with an option to select the other two besides the working one.

Now the problem:

When clicking on one of the other monitors I get the following message: "This GPU supports 1 display".

This is not true, because I previously ran it with three monitors (the card has a DVI port, an HDMI port, and a DisplayPort).

If I click on another monitor and click Apply, that monitor works but the other two do not, even though the NVIDIA driver sees all three.

BTW, I updated all the drivers from the new virtio ISO EXCEPT the Balloon driver; I could not find how to do that one. I did see somewhere on the forums that you should see an entry under Device Manager for Other devices, but I do not see that. How do I update the Balloon driver?

Also, I have passed the graphics card through to the VM with an edited ROM BIOS with the header section removed. So I know that is good; it worked before with no problems.
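(In case the ROM dump ever needs to be redone, here is a minimal sketch of that header-stripping step, assuming the usual technique of discarding everything before the first valid 0x55AA expansion-ROM signature, i.e. what the common hex-editor method does. The file names are placeholders.)

# Minimal sketch: strip the NVIDIA-specific header from a dumped vBIOS by
# scanning for a 0x55AA expansion-ROM signature whose PCI data structure
# pointer (ROM offset 0x18) lands on a valid "PCIR" block, then keeping
# everything from that signature onward.
data = open("gtx1050ti_dump.rom", "rb").read()

start = None
i = data.find(b"\x55\xaa")
while i != -1:
    pcir = i + int.from_bytes(data[i + 0x18:i + 0x1a], "little")
    if data[pcir:pcir + 4] == b"PCIR":   # valid PCI data structure => real ROM start
        start = i
        break
    i = data.find(b"\x55\xaa", i + 1)

if start is None:
    raise SystemExit("no valid ROM signature found")
open("gtx1050ti_clean.rom", "wb").write(data[start:])
print(f"stripped {start} header bytes")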

So to summarize:

1. How do I reinstate multiple-monitor support?

2. How do I update the Balloon driver from the VirtIO ISO (it is mounted as drive E: in Windows 10)?

 

Any help on this issue would be greatly appreciated.

Thanks

 


I would try all the combinations of a dual-monitor setup instead of 3; it could well be a hardware issue, like a short. As a first try I would go with HDMI + DisplayPort.

As for the balloon driver, you should find under Device Manager --> System devices a PCI standard RAM controller; click on it and manually update the driver, pointing it to the virtio ISO.

Or it could be listed as VirtIO Balloon Driver, in System devices.

Note that if you run the virtio-win guest tools exe inside the VM instead of manually updating the drivers, it should update all the drivers automatically.

If no balloon driver is installed, it could be located under unknown devices.
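(If clicking through Device Manager is awkward over remote access, here is a minimal sketch of the same idea scripted from inside the guest, assuming the virtio-win ISO is still mounted as E: and uses its usual Balloon\w10\amd64 layout; the drive letter and OS folder are assumptions to verify, and it needs an elevated prompt.)

# Minimal sketch: locate the VirtIO balloon INF on the mounted virtio-win ISO
# (assumed at E:) and stage/install it with pnputil, which ships with Windows 10.
import subprocess
from pathlib import Path

inf = Path("E:/") / "Balloon" / "w10" / "amd64" / "balloon.inf"

if inf.exists():
    # requires an elevated prompt; /add-driver ... /install stages and installs
    subprocess.run(["pnputil", "/add-driver", str(inf), "/install"], check=True)
else:
    print("balloon.inf not found; check the ISO drive letter and OS folder")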


@ghost82 OK, I have done some testing. I tried all the combinations you suggested; same result.

I swapped out the video card, this time an Asus Phoenix GTX 1050 Ti with 4GB RAM instead of the Gigabyte GTX 1050 with 2GB RAM.

Again, the same result: only one monitor out of three can be enabled at a time.

I disconnected the projector; same result, only one monitor out of two can be enabled at a time.

I double-checked the template for the VM and everything looks OK (I did not check the XML).

So I started to suspect the Win10 image was damaged somehow. So...

I created a new Windows 11 VM and fired that up.

Set it all up with the virtual display. Win 11 worked fine with all virtio drivers installed.

Installed the TeamViewer client for remote access.

Went in and selected the Asus video card and its associated sound card.

Updated the template and started the VM.

Checking "display settings" I see three monitors (previously reconnected all three) via TeamViewer, working remotely.

Accessing via TeamViewer I could log in OK, so video via the Asus card was working fine (although I was only seeing one screen).

But I could not verify whether all three monitors were in actual fact displaying.

I shut down the VM after Win11 wanted to do some updates, and I have not been able to restart the VM ever since.

The VM just hangs. I blew away that VM and created a new W11 VM again. The virtual display works just fine, but after selecting the video card and associated sound, the VM just hangs.

I have not been able to get video working with the video card since. I rebooted the server several times with the same result, and even powered down the server in case the card needed a full power-off; no joy.

Now, I don't know if changes have been made to the XML for VMs with graphics card passthrough in Unraid 6.11.1 or 6.10.x (I was running 6.9.x previously).

This is the only video card in the Unraid server.

In 6.9.x you had to hand-edit the XML to add the multifunction parameter to the video (43.00.00) and edit the slot & function for the sound (43.00.01).

Do you know if the necessity to edit the XML has been removed in 6.10.x or 6.11.1 for a passed-through video card with sound?

 

Here is a snippet from the current XML for the video and associated sound. Note there is no multifunction parameter included, which I previously had to add per SpaceInvaderOne's video instructions for a Windows 10 VM. His recent video for Windows 11 makes no mention of hand-editing the XML; he just selects the video card and associated sound, clicks update, and starts the VM with no problem. Hence my question: is this no longer a problem, or is there some workaround 'under the hood' we are not seeing? Here is the XML snippet for video and associated sound:

 

<hostdev mode='subsystem' type='pci' managed='yes'>
      <driver name='vfio'/>
      <source>
        <address domain='0x0000' bus='0x43' slot='0x00' function='0x0'/> <!-- MY EDIT: video line, no multifunction parameter -->
      </source>
      <alias name='hostdev0'/>
      <rom file='/mnt/user/domains/Windows 10/Asus.EditChapel.GTX1050Ti.4096.171212.rom'/>
      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
    </hostdev>
    <hostdev mode='subsystem' type='pci' managed='yes'>
      <driver name='vfio'/>
      <source>
        <address domain='0x0000' bus='0x43' slot='0x00' function='0x1'/> <!-- MY EDIT: this is the sound -->
      </source>

 

Is the above correct for only one video card in the system, being passed through to the VM?

Sorry for being long-winded, but any suggestions would be greatly appreciated.

At this point I am just trying to get the VM up and running again on the video card, which would be fantastic; then I can go to the site to verify everything.

Cheers

 

14 hours ago, Sparkie said:

In 6.9.x you had to hand-edit the XML to add the multifunction parameter to the video (43.00.00) and edit the slot & function for the sound (43.00.01).

Do you know if the necessity to edit the XML has been removed in 6.10.x or 6.11.1 for a passed-through video card with sound?

Yes, it's still necessary, so it should be:

    <hostdev mode='subsystem' type='pci' managed='yes'>
      <driver name='vfio'/>
      <source>
        <address domain='0x0000' bus='0x43' slot='0x00' function='0x0'/>
      </source>
      <alias name='hostdev0'/>
      <rom file='/mnt/user/domains/Windows 10/Asus.EditChapel.GTX1050Ti.4096.171212.rom'/>
      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0' multifunction='on'/> 
    </hostdev>
    <hostdev mode='subsystem' type='pci' managed='yes'>
      <driver name='vfio'/>
      <source>
        <address domain='0x0000' bus='0x43' slot='0x00' function='0x1'/>
      </source>
      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x1'/>
    </hostdev>
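(Since the Unraid GUI can rewrite this XML when the template is updated, a quick way to confirm the edit stuck is to parse the domain XML, e.g. exported with virsh dumpxml, and check the pairing; a minimal sketch follows, with the file name as a placeholder.)

# Minimal sketch: check a libvirt domain XML for a GPU + HDMI-audio passthrough
# pair, verifying the guest-side <address> of both hostdevs shares one bus/slot
# and that function 0x0 carries multifunction='on'.
import xml.etree.ElementTree as ET

tree = ET.parse("win10.xml")          # e.g. saved from: virsh dumpxml "Windows 10"
guest_addrs = [hd.find("address").attrib
               for hd in tree.findall(".//hostdev[@type='pci']")
               if hd.find("address") is not None]

by_slot = {}
for a in guest_addrs:
    by_slot.setdefault((a["bus"], a["slot"]), []).append(a)

for (bus, slot), funcs in by_slot.items():
    if len(funcs) > 1:
        f0 = next((f for f in funcs if f["function"] == "0x0"), None)
        ok = f0 is not None and f0.get("multifunction") == "on"
        print(f"bus {bus} slot {slot}: {len(funcs)} functions, multifunction ok: {ok}")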

Attach a diagnostics zip just after you start a VM that hangs.

Was the vbios working correctly with 3 monitors? Attach the vbios too.

 

The only thing I could suggest, to see if the issue is caused by the VM or by the GPU, is to install Windows bare metal with all the drivers on a spare HD and check the 3 monitors.


@ghost82 Thanks for the help again. The server at issue is at a remote site and I am managing it remotely, hence the delay in getting back with the current situation.

 

1. I updated the XML as you suggested (and as SpaceInvaderOne originally noted) and can now boot the VM again. Thanks for that. Partial success.

2. I did install Windows bare metal as you suggested with the video card in question, and Windows now has multi-monitor (3 monitors) support again.

3. Going back to the VM, it still has only one-monitor support, so the bare-metal experiment verifies the video card (GeForce GTX 1050 Ti with 4GB RAM) is working correctly.

4. In the VM in 3) above I was working without passing through the vbios. I wanted to keep the variables to a minimum and then add it back in once 3-monitor support was working. I will try again with the passed-through vbios, attached below.

5. Regarding diagnostics: I did not attach any, as after performing the XML edits you recommended the VM starts OK, just with only one monitor.

6. I was thinking about installing a Linux VM to see if that might work with 3-monitor support. I don't know if Linux (say Ubuntu) will auto-detect the card or revert to a basic resolution if no NVIDIA GeForce 1050 drivers are present when Ubuntu starts; Windows automatically detects the video card being added and loads the appropriate drivers. I have checked NVIDIA's website and they do have drivers available for Linux. I would set up Linux via VNC access through the VM, then once it is set up and working OK, add the 1050 with sound, edit the XML, restart, and see what happens. Will report when complete.

Thanks again...

 

Asus.EditChapel.GTX1050Ti.4096.171212.rom

  • Solution

I would suggest completely uninstalling the NVIDIA drivers with DDU (Display Driver Uninstaller) and installing them again, maybe testing different versions, starting with the version that works on bare metal. Make sure to first delete all NVIDIA devices (even hidden ones) in Windows Device Manager too.

The vbios should be OK; it contains valid legacy and EFI vbioses.
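(For anyone wanting to double-check their own dump the same way, here is a minimal sketch that walks the standard PCI expansion-ROM image chain and reports each image's code type, so you can see whether both a legacy x86 and an EFI image are present. It assumes the standard PCIR layout; the file name is the attachment above.)

# Minimal sketch: enumerate images in a vBIOS dump via the PCI data structure
# ("PCIR") of each 0x55AA-signed image, printing the code type and length.
CODE_TYPES = {0x00: "legacy x86", 0x03: "EFI"}

data = open("Asus.EditChapel.GTX1050Ti.4096.171212.rom", "rb").read()
off = 0
while off + 0x1a <= len(data) and data[off:off + 2] == b"\x55\xaa":
    pcir = off + int.from_bytes(data[off + 0x18:off + 0x1a], "little")
    if pcir + 0x16 > len(data):
        break
    code_type = data[pcir + 0x14]
    # image length at PCIR offset 0x10, in 512-byte units
    length = int.from_bytes(data[pcir + 0x10:pcir + 0x12], "little") * 512
    print(f"image at {off:#x}: {CODE_TYPES.get(code_type, hex(code_type))}, {length} bytes")
    if data[pcir + 0x15] & 0x80 or length == 0:   # bit 7 marks the last image
        break
    off += length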

