SmokeyColes Posted December 15, 2021

Hi, I have an Ubuntu VM with a VFIO GTX 1660 Ventus OC 6GB. I am passing through the Nvidia video, sound, USB, and one more function (they are bound together). My override is multifunction, and the bindings show the green dots.

I have tried:
- the SpaceInvader vbios script - the script fails, so that's not an option
- no vbios at all
- a downloaded bios edited with a hex editor

If I purge all NVIDIA drivers through VNC and then boot Ubuntu with GPU passthrough, the TianoCore logo loads with an Ubuntu logo, then I get a solid black screen (no flashing cursor). I can't input or do anything except force-stop the VM. If I instead purge the NVIDIA drivers and then run sudo apt-get install nvidia-driver-470, I get a black screen with a flashing underscore character. Whether I use the vbios seems to make no difference.

I have searched the Unraid forums and tried many things, but I am at a loss now. One option I want to try is booting Ubuntu in recovery (safe) mode; the problem is I have no idea how to do this within a VM. If someone knows the solution, I would be so grateful. Up to now every one of my VM attempts has failed, and I'm close to quitting Unraid altogether; I have a good setup, and it is gutting to see every effort fail so badly with no help or support.
SmokeyColes Posted December 15, 2021

Bump - does anyone know how I can force the recovery process in Ubuntu within a VM?
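For anyone who lands here with the same question: the recovery entry lives under "Advanced options" in the GRUB menu, which Ubuntu hides by default, so on a VM you never get a chance to pick it over VNC. A minimal sketch of the /etc/default/grub change that makes the menu show - this operates on a local copy so it's self-contained; the real file path and the follow-up `sudo update-grub` are what you'd use on the actual VM:

```shell
#!/bin/sh
# Sketch only: work on a local copy; on the real VM the file is
# /etc/default/grub and you finish with `sudo update-grub`.
grub_copy=grub.example
printf 'GRUB_TIMEOUT_STYLE=hidden\nGRUB_TIMEOUT=0\n' > "$grub_copy"

# Show the menu for 5 seconds so "Advanced options -> recovery mode"
# can be selected from the VNC console during boot.
sed -i 's/^GRUB_TIMEOUT_STYLE=.*/GRUB_TIMEOUT_STYLE=menu/' "$grub_copy"
sed -i 's/^GRUB_TIMEOUT=.*/GRUB_TIMEOUT=5/' "$grub_copy"
cat "$grub_copy"
```

Holding Esc (or Shift on BIOS boots) during startup can also bring the menu up, but the timing is fiddly through VNC, so making the menu permanent is easier while debugging.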
SmokeyColes Posted December 16, 2021

OK, so this is what I did, and I resolved it myself. It requires lots of test reboots and switching between KVM VNC and GPU output. This is by no means a science, but if it helps others (rather than getting no response), please use it, and like!! Before you start, though, use a backup VM plugin or script so you have a backup to fall back on (even if the backup is in a bad state).

The idea is that I literally tried everything: the display manager (your Linux login screen), the GPU drivers, and an attack on GRUB. I could create a new VM and see passthrough working there, so I knew the problem was software on the broken VM. It is a real shame that Unraid has no way to force a boot from a live CD post-install, and you sure as hell won't get support! For me the GPU vbios did nothing, so I didn't use it, and the official Nvidia drivers don't play well at all on my system. It took me days of effort... lots of learning!

So here are some useful commands. I really recommend getting SSH installed any way you can - that is my number 1 tip; number 2 is to back up the VM vdisk. Things are much less bleak when you know the VM is talking back to you and not just stalled on a black screen. I've tried to make the process sequential, but truthfully you may need to jump around the steps.
Basics:
=======
sudo apt update
sudo apt install openssh-server [allows you to SSH into a black-screened VM for a terminal on another PC]
sudo apt-get install --reinstall ubuntu-desktop gnome-shell ubuntu-gnome-desktop unity [reinstalls everything past the display manager]
sudo reboot

Display Manager:
================
cat /etc/X11/default-display-manager [mine showed /usr/sbin/lightdm]
sudo apt purge gdm3 lightdm
sudo apt-get install lightdm [OR] sudo apt install gdm3 [preferred]
systemctl disable lightdm
systemctl enable gdm
sudo dpkg-reconfigure lightdm [OR] sudo dpkg-reconfigure gdm3 [select gdm3]
sudo nano /etc/gdm3/custom.conf [uncomment WaylandEnable=false if needed]
sudo service gdm start
sudo reboot

Drivers:
========
sudo apt-get remove --purge '^nvidia-.*'
sudo apt-get remove --purge xserver-xorg*
sudo rm /etc/X11/xorg.conf [this was posted on every site; the file didn't even exist for me, but I reckon this is a regular problem]
sudo ubuntu-drivers devices [shows available GPU drivers]
sudo apt-get install --reinstall nvidia-common
sudo apt-get install <driver from above>, e.g. sudo apt-get install xserver-xorg-video-nouveau
sudo apt autoremove
sudo apt install nvidia-prime
sudo prime-select nvidia
nvidia-settings
sudo reboot

GRUB:
=====
sudo nano /etc/default/grub [change GRUB_CMDLINE_LINUX_DEFAULT="quiet splash" to GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nouveau.modeset=0"]
sudo update-grub
sudo reboot

Boot Repair App (I didn't try this, but found it):
==================================================
sudo apt-add-repository ppa:yannubuntu/boot-repair
sudo apt-get update
sudo apt-get install -y boot-repair
boot-repair
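After the GRUB step it's worth confirming the change actually reached the kernel, since a typo in /etc/default/grub fails silently. On the real VM you'd read /proc/cmdline over SSH; the sample line below is a hypothetical stand-in so the sketch is self-contained:

```shell
#!/bin/sh
# Sketch: verify nouveau.modeset=0 made it onto the kernel command line.
# On the real VM, replace the sample with:  cmdline=$(cat /proc/cmdline)
cmdline='BOOT_IMAGE=/vmlinuz root=/dev/sda2 ro quiet splash nouveau.modeset=0'

msg="nouveau KMS still enabled"
case " $cmdline " in
  *" nouveau.modeset=0 "*) msg="nouveau KMS disabled" ;;
esac
echo "$msg"
```

If it reports the parameter missing after a reboot, re-check the quoting in GRUB_CMDLINE_LINUX_DEFAULT and re-run sudo update-grub.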
rvijay007 Posted December 29, 2021

Thanks for your post, and sorry you weren't getting any help. I experienced the exact same issue when trying to install a VM for Ubuntu Desktop 20.04.3 (minimal installation). Without setting up the graphics card (an NVIDIA 1080 Ti), everything worked fine and I was able to do many other parts of the setup, which already included SSH via `sudo apt install ssh`. Once I added the video card, though, I hit the issue you describe: starting the VM led to the TianoCore boot screen, then a black screen, then a small black screen in the upper-left corner, then a fully black screen. Luckily you pointed out that I should still be able to SSH in, and luckily I had set that up before hooking up the graphics card. I should point out that removing the graphics card from the VM definition makes the VM work properly again, so it's not hard to get back to a working state; only the VNC graphics option was used in that case (the default when creating a new Unraid VM).

Given that you listed many steps, I took some educated guesses as to what might be high-value to start with:

1. SSH'd in from another computer
2. sudo apt-get install --reinstall ubuntu-desktop gnome-shell ubuntu-gnome-desktop unity [reinstalls everything past the display manager]
3. sudo apt-get install --reinstall nvidia-common
4. sudo apt-get install xserver-xorg-video-nouveau
5. sudo apt install nvidia-settings
6. sudo nano /etc/default/grub [added nouveau.modeset=0 to the config as you wrote]
7. sudo update-grub
8. sudo reboot

Upon rebooting, the VNC screen was a bit off, but at least the login screen was readable. Once logged in, the screen refreshed and looked proper again. It seems that everything hinges on getting the nouveau driver installed. Afterwards, I went to Software & Updates > Additional Drivers.
I selected the latest NVIDIA driver that is "proprietary, tested" instead of nouveau, and then removed the GRUB line addition from your instructions (i.e. reverted steps 6/7). Rebooted and everything still works, so I'm not sure whether those lines are needed.

What I don't understand is why Ubuntu needs to render the display via the graphics card when I still have the VNC graphics adapter enabled. I don't want Ubuntu to use the secondary NVIDIA card for rendering the display, as I need it only for compute. If anyone knows how I can configure Ubuntu Desktop to use only the VNC graphics adapter for display purposes, please let me know.
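One way to see which GPU is actually rendering the desktop is the OpenGL renderer string reported by `glxinfo -B` (from the mesa-utils package). A sketch parsing a sample line - the sample output below is hypothetical; on the VM you'd feed in the real glxinfo output instead:

```shell
#!/bin/sh
# On the real VM:  glxinfo -B | grep 'renderer string'
# The line below is a hypothetical sample so this runs standalone.
sample='OpenGL renderer string: llvmpipe (LLVM 12.0.0, 256 bits)'

# Strip the label; what remains names the active renderer.
renderer=${sample#OpenGL renderer string: }
echo "$renderer"
# "llvmpipe" means software rendering (the VNC/QXL path); an NVIDIA
# device name here means the passed-through card is driving the desktop.
```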
ghost82 Posted December 29, 2021

rvijay007 said: "I don't want Ubuntu to use the NVIDIA secondary graphics card for rendering the display as I need it only for compute."

I would check the xorg.conf file, especially the Device section, where it should be possible to force display output to the card you want by specifying the BusID (for example, BusID "PCI:3:0:0" - the bus:slot:function of the video function inside the VM; find it with the lspci command). Delete everything related to your secondary GPU. Make a backup first, as there is a real possibility of breaking the system.
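A sketch of that suggestion, assuming a hypothetical lspci address of "03:00.0" for the card you want driving the display - substitute the address `lspci | grep -i vga` reports in your VM. One gotcha: lspci prints the address in hex, while the xorg.conf BusID wants decimal, so the snippet converts before printing an example Device section:

```shell
#!/bin/sh
# "03:00.0" is a hypothetical lspci address; use your own from
# `lspci | grep -i vga` inside the VM. lspci is hex, BusID is decimal.
addr="03:00.0"
bus=$((0x${addr%%:*}))      # "03" -> 3
rest=${addr#*:}             # "00.0"
slot=$((0x${rest%%.*}))     # "00" -> 0
fn=$((0x${rest##*.}))       # "0"  -> 0
busid="PCI:$bus:$slot:$fn"

# Example Device section for /etc/X11/xorg.conf; the Identifier and
# Driver values are placeholders, not taken from the thread.
cat <<EOF
Section "Device"
    Identifier "Card0"
    Driver     "qxl"
    BusID      "$busid"
EndSection
EOF
```

With only this Device section present (and nothing referencing the NVIDIA card), Xorg should render on the specified adapter while the NVIDIA card stays free for compute. As ghost82 says, back up the original xorg.conf first.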