Ubuntu Server VM broken after adding GPU passthrough (Unraid 6.12.8)



I have two VMs in Unraid: Windows 11 for gaming and Ubuntu Server for development. The idea is that the GPU is normally passed through to Windows 11, and when I want to run GPU-accelerated dev workloads I can shut down Windows, pass the GPU to the Ubuntu VM, and restart it. I had done this in the past (maybe six months ago) and it worked fine, although I was using a different GPU back then. Today I attempted it again, and after rebooting the Ubuntu VM it was completely unresponsive. It does not connect to my network (I can't see it in the router, can't ping it, can't SSH in), and it does not stop gracefully; I have to force stop it to shut it down. I eventually force stopped it and removed the GPU passthrough to see if that would help, but the behavior is the same, and the VNC console simply says "Guest has not initialized the display (yet)". I'm not entirely sure how to fix this and would appreciate any help. Here are a screenshot of the VM tab, the VM log, and the XML file:

Unfortunately I did not save the XML prior to making changes (rookie mistake, I know).

Screenshot 2024-08-23 at 10.49.11 AM.png

ubuntu.log ubuntu.xml

 

Edit: The previous GPU was also a 40-series card, but I had removed all the NVIDIA drivers from the VM and was planning to reinstall them with the new GPU passed in. I also didn't make any changes to the XML directly to pass the GPU; I used the GUI form view to switch the graphics card from VNC to the 4080 and back.
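
For context, this is roughly what that GUI switch does to the domain XML: it swaps the VNC graphics/video section for VFIO hostdev entries for the GPU (and usually its audio function). The PCI addresses below are placeholders for illustration, not the ones from my actual config.

With VNC selected:

<graphics type='vnc' port='-1' autoport='yes' websocket='-1' listen='0.0.0.0'>
  <listen type='address' address='0.0.0.0'/>
</graphics>
<video>
  <model type='qxl' ram='65536' vram='65536' vgamem='16384' heads='1' primary='yes'/>
</video>

With the 4080 selected, the section above is removed and replaced with something like:

<hostdev mode='subsystem' type='pci' managed='yes'>
  <driver name='vfio'/>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
  <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
</hostdev>
<hostdev mode='subsystem' type='pci' managed='yes'>
  <driver name='vfio'/>
  <source>
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x1'/>
  </source>
  <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
</hostdev>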


I was able to fix this by creating a new VM that uses the same vdisk and share mount. Here is the XML for the new, working version; I would love some help figuring out what the issue was.

ubuntu_working.xml 
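
For anyone trying the same fix: the key parts of the new VM's XML are just the disk source pointing at the existing vdisk image and the 9p mount for the share. The paths below follow the default Unraid layout and are examples, not copied from my setup.

<disk type='file' device='disk'>
  <driver name='qemu' type='raw' cache='writeback'/>
  <source file='/mnt/user/domains/ubuntu/vdisk1.img'/>
  <target dev='vda' bus='virtio'/>
  <boot order='1'/>
</disk>
<filesystem type='mount' accessmode='passthrough'>
  <source dir='/mnt/user/projects'/>
  <target dir='projects'/>
</filesystem>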

This is also still without GPU passthrough; I will test that next, but on another new VM rather than by editing this one.

 

Edit: Spoke too soon. I can use VNC now, but SSH is still not working; I will continue debugging and report back.


OK, final update: both new VMs (one with the GPU passed through, one without) are working fine. The unreachability was because I had forgotten to change the network source from virbr0 to br0. I would still appreciate any insights into what went wrong with the first VM, but at least everything is working.
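
For reference, this is the kind of change I mean in the domain XML: virbr0 is libvirt's NAT bridge, while br0 is the Unraid bridge that puts the VM on the LAN, so the source bridge line needs to point at br0. The MAC address below is a placeholder.

<interface type='bridge'>
  <mac address='52:54:00:xx:xx:xx'/>
  <source bridge='br0'/>
  <model type='virtio'/>
</interface>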

