Fairly new to unRaid, but getting the hang of things thanks to Spaceinvaderone's excellent videos. I've learned a lot from him, so I decided to upgrade my Dell T30 to something a little beefier. Now running a Ryzen 7 3700X with 32GB RAM, an Asus TUF X570 w/WiFi, an Nvidia RTX 2070, a 256GB NVMe cache drive, a 500GB WD Black NVMe holding a bare-metal Windows install to run as a VM as well, one 10TB parity drive, and one 10TB and one 8TB data drive with all my media. I'm sure it's overkill hardware-wise, but I wanted something that would last for a while, and the 8 cores of the 3700X run circles around the old Xeon 1225.
Now that you have an understanding of my hardware setup, hopefully someone can tell me how to fix a problem; it may be that I'm just slow, or ignorant of the obvious.
Problem: I followed Spaceinvaderone's excellent video on booting a Windows 10 bare-metal install as a VM under unRaid. I was able to pass through all the necessary USB controllers, keyboards, NICs, and NVMe drives, get it to boot properly, and install the virtio drivers needed to make everything work. The Nvidia drivers were installed under the bare-metal Win 10 install, but I'm assuming I will have to install them again once I pass through the GPU, correct? I followed Spaceinvaderone's instructions to the letter on passing through the GPU and the other necessary items, even getting into the IOMMU reconfiguration that needed to be done, as the IOMMU grouping looks like this:
IOMMU group 23: [10de:1f07] 0a:00.0 VGA compatible controller: NVIDIA Corporation TU106 [GeForce RTX 2070 Rev. A] (rev a1)
[10de:10f9] 0a:00.1 Audio device: NVIDIA Corporation TU106 High Definition Audio Controller (rev a1)
[10de:1ada] 0a:00.2 USB controller: NVIDIA Corporation TU106 USB 3.1 Host Controller (rev a1)
[10de:1adb] 0a:00.3 Serial bus controller [0c80]: NVIDIA Corporation TU106 USB Type-C Port Policy Controller (rev a1)
Edited the syslinux append line on the unRaid boot drive to the following to separate out the devices:
append initrd=/bzroot vfio-pci.ids=10de:1f07,10de:10f9,10de:1ada,10de:1adb
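For anyone checking their own setup against mine, here is a small sketch of how I sanity-check that line: rebuild the vfio-pci.ids= string from the four vendor:device IDs and compare it to what is actually in the append line (the IDs below are just the ones from my IOMMU group; substitute your own). After a reboot, lspci -nnk on each device should also show "Kernel driver in use: vfio-pci".

```shell
# The four vendor:device IDs from IOMMU group 23 above.
ids="10de:1f07 10de:10f9 10de:1ada 10de:1adb"

# Join them into the exact vfio-pci.ids= value for the append line.
append="vfio-pci.ids=$(echo $ids | tr ' ' ',')"
echo "$append"

# After rebooting, each device should report vfio-pci as its driver, e.g.:
#   lspci -nnk -d 10de:1f07 | grep 'Kernel driver in use'
# (left commented out since it only means anything on the unRaid box itself)
```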
I am able to start the VM and connect and log in through VNC without issue. When I change the VM's Graphics Card setting from VNC to the NVIDIA GPU and boot the VM, it shows a green arrow and says "Started". I have tried to initiate an RDP session into the VM, but it gives me the "Remote Desktop cannot find the computer WinVM" error and won't connect. The VM has a Cat5 connection to the router. Will RDP not connect unless someone has logged into the Windows session first? How would I get RDP to connect to the Windows login screen and then log in? Or am I missing something on the Windows side that needs to be set? My credentials are good, as I can connect and log in when graphics is set to VNC.
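In case it helps anyone point me at a fix: my understanding is that RDP does listen at the Windows login screen as long as Remote Desktop is enabled and allowed through the firewall inside the guest, so nobody has to be logged in first. The standard way to force it on from an elevated prompt inside the VM (reachable while graphics is still set to VNC) would be something like the below; these are stock Windows settings (the fDenyTSConnections registry value and the built-in "remote desktop" firewall rule group), nothing unRaid-specific, and I'd welcome corrections if I have that wrong:

```
rem Enable Remote Desktop (0 = allow connections)
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Terminal Server" /v fDenyTSConnections /t REG_DWORD /d 0 /f

rem Open the built-in Remote Desktop firewall rule group
netsh advfirewall firewall set rule group="remote desktop" new enable=Yes
```

The "cannot find the computer WinVM" error also smells like name resolution rather than RDP itself, so trying the VM's IP address instead of the hostname might be worth a shot.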
Also, can I connect the GPU in the server directly to a monitor and have it output the Windows desktop? I have a monitor with multiple inputs, so the GPU is connected to it over HDMI and a laptop over Mini DisplayPort. I don't see any output on the HDMI input while the VM is running. When just unRaid is running I can see where the bootloader ran through and started unRaid, but as soon as I start the VM with GPU passthrough, the screen goes black.
Any insight here would be appreciated. Like I said, fairly new and probably missing something super simple.