[GUIDE] Fix Nvidia Code 43 Issue on Nvidia GPU


Siwat2545


I'm running into this issue also. I even got the driver version that is mentioned in the guide, and also downloaded the older Windows 1607 build, and still have no luck. Are there settings that I need to add in the XML? I tried this, from Reddit:

<kvm>
  <hidden state='on'/>
</kvm>

Not seeing any change; still getting Code 43 on every driver version I try. Does anyone know how to bind the card to vfio-pci, as Steve asked? I'm dead in the water with my 4-GPU setup.
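
In case it helps anyone else stuck here, the usual way (on this era of Unraid, at least) to stub a card for vfio-pci is to add its IDs to the kernel append line in syslinux.cfg. This is only a sketch; the IDs below are placeholders, so use whatever lspci -nn reports for your own GPU and its HDMI audio function:

# find the [vendor:device] IDs of the GPU and its audio function
lspci -nn | grep -i nvidia

# then add them to the append line in /boot/syslinux/syslinux.cfg, e.g.
append vfio-pci.ids=10de:1b81,10de:10f0 initrd=/bzroot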

 

Has anyone had success with this since the 388.0 driver versions?

Edited by jordanmw
Link to comment

I did a couple of tests. I have an EVGA 1050 Ti in my first PCIe slot and a 1080 Ti in my third slot. The 1080 Ti is mainly used for a gaming VM and the 1050 Ti for the Linux VMs I use. For the Linux VMs I don't have to use an extra VBIOS; just choosing Q35 works out of the box.

The 1080 Ti I can pass through to a Win10 VM without any modifications: create the VM (i440fx, OVMF) with VNC, install the OS, add the GPU + audio afterwards, install the driver, remove VNC, done. For the card in the first slot, if I want to pass that through to a Win10 VM I have to do the same as before, except that I also have to pass the card's VBIOS through to get it working without Error 43. I got the VBIOS for the EVGA 1050 Ti from TechPowerUp and modified it as SpaceInvader One describes in his video. Pass the modified VBIOS through, remove VNC, done, and it works.

But if VNC is still enabled I keep getting the error. So after installing your OS, make sure to enable RDP or install TeamViewer to remotely access your VM, and then remove VNC. One more thing to mention: for me it makes no difference whether the Hyper-V option is on or off, or whether I edit the XML manually; it works either way.
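
If you edit the XML directly instead of using the VM form, pointing the GPU at a VBIOS file is just a rom element inside its hostdev. A rough sketch only; the PCI address and file path below are placeholders, not from my system:

<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
  <rom file='/mnt/user/isos/vbios/1050ti_edited.rom'/>
</hostdev>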

 

 

Link to comment
  • 1 month later...

My Win10 VM with a GTX 1070 had been working flawlessly for the past two years.

 

Then, I updated both the motherboard's BIOS and Unraid to 6.6.5 yesterday. Now I'm getting the Code 43 error in Windows Device Manager. I've tried most of the above steps, short of flashing the graphics card's BIOS (never had to do that before).

 

Any ideas? Hoping it's a 6.6.5 thing and the next update will fix it...

Link to comment

Same here, but I have managed to get myself up and running. I first went down the driver-patching route, but it didn't work: you have to stay in test-signing mode, and it breaks games.

 

So first, stop Windows from automatically updating drivers via the Hardware tab in Advanced System Properties. Then I ran Display Driver Uninstaller to clean out all traces of the previous drivers:

 

https://www.guru3d.com/files-details/display-driver-uninstaller-download.html

 

Then find driver version 378.78 and install it, and also set this in the VM's XML:

 

<kvm>
  <hidden state='on'/>
</kvm>

 

Not sure which part did the trick, but I am up and running again. The only fly in the ointment might be whether that 1070 of yours is supported by those drivers. Although, given that it had been working flawlessly, it does seem strange that it suddenly broke without you changing the drivers in the VM...

 

 

Link to comment
1 hour ago, ToXIc said:

Is there anywhere specific to put this?

Sorry, it sits under <features>.

 

Mine looks like this

 

  <features>
    <acpi/>
    <apic/>
    <hyperv>
      <relaxed state='on'/>
      <vapic state='on'/>
      <spinlocks state='on' retries='8191'/>
      <vendor_id state='on' value='123456789ab'/>
    </hyperv>
    <kvm>
      <hidden state='on'/>
    </kvm>
    <vmport state='off'/>
  </features>

Link to comment

Just in case anybody has had the same experience as me: none of the posted suggestions or steps helped me while attempting to fix this GPU passthrough problem (on both Mac and Win10 VMs). After countless hours of troubleshooting, my solution (frustratingly simple) was booting Unraid in legacy mode instead of UEFI mode. The moment I booted into legacy and fired up the VMs, I had no issues installing the appropriate Nvidia drivers on either operating system's VM.

 

I'm not sure if this is common knowledge, as I just recently delved into Unraid, but hopefully it helps someone.
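
If you'd rather flip it from the console than the Main > Flash page, the boot mode is just controlled by the name of the EFI folder on the flash drive; a trailing dash disables UEFI boot. Roughly, assuming the stock Unraid flash layout:

# a trailing dash on the EFI folder disables UEFI boot
mv /boot/EFI /boot/EFI-
# rename it back to EFI (and re-enable UEFI in the motherboard) to boot UEFI again
mv /boot/EFI- /boot/EFI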

Edited by Kudagra
Link to comment
  • 1 month later...
On 11/25/2018 at 7:39 PM, Kudagra said:

Just in case anybody has had the same experience as me: none of the posted suggestions or steps helped me while attempting to fix this GPU passthrough problem (on both Mac and Win10 VMs). After countless hours of troubleshooting, my solution (frustratingly simple) was booting Unraid in legacy mode instead of UEFI mode. The moment I booted into legacy and fired up the VMs, I had no issues installing the appropriate Nvidia drivers on either operating system's VM.

 

I'm not sure if this is common knowledge, as I just recently delved into Unraid, but hopefully it helps someone.

Your suggestion worked for me, but I had to do one more thing.  

 

First I switched Unraid to legacy mode by clicking the flash drive under Main, scrolling to the bottom, and unticking "Permit UEFI boot mode". I rebooted and then brought my array and VMs back online, but was still experiencing Code 43 on the VM assigned to the primary graphics card. I also tried reinstalling the Nvidia drivers at this point, but that didn't work.

I then also had to redo the reflink for my VM's .img file from the master image. I have three Windows VMs with three graphics cards, and to save space on my cache pool I created reflinks for all of them:

cp --reflink /path/to/vdisk.img /path/to/snapshot.img
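# --reflink makes this a copy-on-write clone: the new .img shares blocks with the source (on btrfs/XFS) until it is modified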

Once I did that and booted it back up, everything was back to normal. (Doing this before switching to Legacy Mode in the GUI had no effect)

 

So why did I have to do this? I had replaced my USB stick because I broke it while installing a new hard drive. I was able to reinsert it and take a backup, but didn't want to continue using a physically broken USB stick. My assumption is that the new USB stick wanted to boot UEFI by default for some reason, as I loaded my backup the same way as originally (via the make_bootable script) and not through the new fancy USB creator (which didn't work). Since all three of my Windows VMs had been working for a year and a half, I knew where to focus (I had also noticed the boot-up process looked different, i.e. it was in UEFI mode).

 

Hopefully this helps the next person.

Link to comment
  • 1 month later...

It appears I have been struck by the Code 43 issue. I moved my GTX 1080 to a different PCIe slot and my VMs, Windows 10 and Linux, would start, but there was no display output. I set the primary video display to VNC with the 1080 as secondary. Looking via VNC, I could see the error on the 1080 in Windows Device Manager. The Linux VMs crash soon after boot, so I can't see what they are doing.

 

I tried using a newly edited ROM but it didn't resolve the issue. Hyper-V is set to off in the VM template, and obviously that option isn't there for the Linux VMs.

 

At the moment I can't get display output on any VMs.

 

Any suggestions?

Link to comment

Thankfully it was a short-lived issue, though the solution was bizarre.

 

I read someone on the net suggesting TeamViewer. I accessed the VM using VNC and installed TeamViewer. I uninstalled the GTX 1080, then restarted the VM. Once it restarted I grabbed the TeamViewer password and closed the VNC client. Accessing the VM via TeamViewer, I updated the driver. Once I restarted the VM, the 1080 worked with no problems. For some reason the Linux VMs also began to work; no idea why. They didn't display video before, but once I sorted the Windows 10 device the Linux ones started working too.

Link to comment
8 minutes ago, jj_uk said:

nothing seems to work for me, uuugh.

I really don't know if I fixed mine or it just came good. I could only see Code 43 when I used VNC, and I've read that using VNC will often show Code 43 anyway. I tried to update the driver using Device Manager via VNC, but most buttons were greyed out. I could uninstall the 1080 though, so I did. After a reboot I noticed the buttons in Device Manager for the 1080 were no longer greyed out. So I closed VNC and accessed the VM via TeamViewer, and updated the driver through Device Manager. The moment I did, my displays started working. I rebooted the VM and everything is working normally.

 

I don't know what this had to do with the Linux VMs though.

Link to comment

Hmm. I'm accessing the VM with Google Remote Desktop. I can see the error in Device Manager.

 

How dare Nvidia restrict how I choose to use MY graphics card. I'm probably going to send it back for a refund and buy a different brand. Recommendations? 

 

My current card is a 1050 Ti - that's all I need really.

 

Edit: Looks like Nvidia is the main manufacturer of cards and the others don't support all games, so I need to get this working. Ugh!

Edited by jj_uk
Link to comment

Just in case this helps someone...

 

I recently upgraded my motherboard, CPU, and RAM, but kept the same 1050 Ti I had been using before. Previously, to get the video card to pass through, I had to disable Hyper-V and use the edited VBIOS method described above. I was also already booting in legacy mode, and had separated out my IOMMU groups so that the video card was in its own group. I'm also passing through the audio portion of the GPU as the sound card.
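
(If anyone wants to double-check their own grouping, the generic sysfs loop works fine from the Unraid console; nothing here is specific to my box:)

for d in /sys/kernel/iommu_groups/*/devices/*; do
  g=${d#*/iommu_groups/}; g=${g%%/*}
  printf 'IOMMU group %s: ' "$g"
  lspci -nns "${d##*/}"
done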

 

After the hardware upgrade, I was again getting Error 43. After a few days of googling around and trying different things, what ended up working for me was actually REMOVING the manual VBIOS override and letting it use the default. Not sure why this worked, but hey, I'm happy.

 

For what it's worth, I was upgrading from an old Dell T5610 with dual Xeon E5-2680 v2 (20 cores, 40 threads total) to an ASUS Z390 board with an i9-9900K. Good luck!

 

 

Link to comment
On 2/23/2019 at 4:40 PM, darrenyorston said:

I really don't know if I fixed mine or it just came good. I could only see Code 43 when I used VNC, and I've read that using VNC will often show Code 43 anyway. I tried to update the driver using Device Manager via VNC, but most buttons were greyed out. I could uninstall the 1080 though, so I did. After a reboot I noticed the buttons in Device Manager for the 1080 were no longer greyed out. So I closed VNC and accessed the VM via TeamViewer, and updated the driver through Device Manager. The moment I did, my displays started working. I rebooted the VM and everything is working normally.

 

I don't know what this had to do with the Linux VMs though.

This was similar to my experience: I had to remove the VNC video device and assign the physical card, then did the remaining configuration on a physically attached monitor.

Link to comment
  • 2 months later...

So, I'm about out of ideas. (This is on my Dell T310 Server, specs in my signature)

Still trying to get an Asus 1030 card to pass through, and having no joy.

I've watched SpaceInvader One's videos a dozen times. (Thank you!)

Made over a dozen VMs with various settings.

The unRAID server boot mode is set to Legacy.

Downloaded the Asus 1030 BIOS from TechPowerUp and edited it, as shown.

I'm able to get Win 10 Pro to boot, video to show up, and fully load. (Login works.)

I can even get to the point where the card is recognized as a 1030.

But any driver I try only leaves me with:

"Windows has stopped this device because it has reported problems. (Code 43)"

-----------------------

Within the VM:

Passing through the host CPU (Intel Xeon, from 1 to 6 CPUs), or using the emulated CPU.

Using OVMF BIOS to boot (it will not boot under SeaBIOS).

Machine type is i440fx-2.8, 2.9, or 2.11 (it will not boot under i440fx-3.0).

Hyper-V is set to no.

Using either SD or HDD to boot makes no difference (although IDE seems to work best, occasionally)

I'm passing through the card and its HDMI sound device as well (and sometimes a second sound card).

The video ROM is either included or not (and including it does seem to make the boot process more stable).

 

Also tried a similar setup with Windows 7, but it crashed on boot-up.

Most of the time I can only use the VM with the Microsoft Basic Display (800x600) driver.

Do ATI Radeon cards have the same issues?

From some posts elsewhere, it looks like they are having similar issues with VMs too.

Edited by rollieindc
Link to comment
4 hours ago, steve1977 said:

How many GPUs do you have? I have the same issue. Everything works as long as I have only one GPU; I can pass it through. Once I add a second GPU to my mobo (not even using it for anything), the first one no longer passes through.

Just the one (Asus 1030). There is the onboard graphics chip, but I'm not using it, although I have not tried disabling it ... yet.

Link to comment
4 hours ago, Squid said:

This is what I did: dumped my own BIOS as per this video: https://www.youtube.com/watch?v=mM7ntkiUoPk

 

After that, Windows still showed Code 43, but I downloaded and installed the Nvidia drivers directly and then everything worked out for me.


That and disabling the motherboard graphics chip are all I have left to try, but that will have to wait a bit. (Family life calls!)
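
For reference, the dump that video walks through essentially reads the card's ROM back through sysfs while nothing is using the GPU. A rough sketch only; the PCI address and output path below are placeholders for your own:

# substitute your card's address from System Devices (e.g. 01:00.0)
cd /sys/bus/pci/devices/0000:01:00.0/
echo 1 > rom                                  # allow the ROM to be read
cat rom > /mnt/user/isos/vbios/gpu_dump.rom   # dump it to a file
echo 0 > rom                                  # lock it again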

Link to comment
