batesman73 Posted December 21, 2019
Hi, I read several things about this but didn't find an answer that really fits my needs. Or at least I don't really understand NVIDIA GRID or AMD GIM. Anyhow, what I'm thinking about is replacing some of my computers with a beefy Unraid machine. The thing I'm unsure about is the graphics, and I see two options.
1. I don't know how to set this up, but is it possible to share a graphics card simultaneously between two running VMs? Let's say use HDMI port 1 for a general Linux/Windows box and port 2 for LibreELEC, and also put audio through them.
2. Use several graphics cards and pass them through. The question here would be: what are the least power-demanding graphics cards (mostly idle)? I live in Germany, so electricity costs a lot more than in the States.
Can anyone point me to a wiki or something similar?
meep Posted December 21, 2019
1 hour ago, batesman73 said: Hi, I read several things about this but didn't find an answer that really fit my needs. […]
Hi, you won't be able to share a single GPU with two different VMs running simultaneously, regardless of how many outputs it has. You could share a GPU across multiple VMs, provided they don't run at the same time, but that's not what you asked.
I can't answer the 'least power-demanding graphics cards' question, sorry. You might need to first establish what you want to do with your VMs and find out which GPU features you need. I once ran an unRaid server with 3x Windows 8 VMs, all using HD5xxx single-slot fanless cards. That system sipped power, but I wouldn't be running games or rendering or anything too demanding in those VMs. Figure out what horsepower you need, and for what, and that will help narrow down the search.
batesman73 Posted December 22, 2019
Mmmhmh, in general I think this technology (NVIDIA GRID or AMD MxGPU) exists, but I didn't find anything related to KVM. From what I read, it is possible to use it with Xen and VMware. Look here: https://www.brianmadden.com/opinion/AMD-MxGPU-aims-to-give-GRID-a-run-for-its-money But at least for now it's obviously still in its infancy.
The use case would be: host for a home server running several Docker containers (including transcoding for Emby, but not many streams), plus VMs for my wife's and my general-purpose/office PCs, LibreELEC, a retro-gaming machine and so forth.
batesman73 Posted December 26, 2019
Hi there, coming back to this topic... I'm currently in the process of changing my systems. If I go the route with two separate graphics cards, I still have questions. Maybe someone could answer this. The current plan is to use an Nvidia GTX 1050 for Unraid itself and to support HW transcoding in an Emby container. A second GPU (AMD RX 560) will be used for retro-gaming and multimedia VMs in passthrough mode. In addition, I want to use the system as a daily driver for my Linux/Debian system. Is there a way (VNC or something faster) to use the Unraid system in graphical mode (on the Nvidia card) and display a VM fullscreen with acceptable speed? I know the easiest way would be to use an additional GPU and pass that through, but as I don't have demanding tasks (only office work and surfing), there should be enough power in that system.
gacpac Posted November 17, 2020
This is something that, if Limetech manages to implement it, will be great news. Only VMware does that for now.
KBlast Posted January 7, 2021
I am also interested in taking one GPU, virtualizing it into multiple vGPU resource pools, then sharing those pools with different VMs. For example: a 10GB card split into 10 x 1GB vGPUs, shared across 10 Windows VMs, each thinking it has its own discrete 1GB GPU. Is this possible now on Unraid?
gray squirrel Posted January 7, 2021
The RTX 3000 cards supposedly support SR-IOV from a hardware point of view, but Nvidia would need to enable it (they won't, as it's an enterprise feature), and Unraid would also need to support it. Although it would be cool to have one GPU and split it up, it's a very fringe case. Fit two or three GPUs, or use this guide to share the GPU (not at the same time).
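For anyone curious what "splitting" a GPU actually looks like on a KVM host, here is a minimal sketch of the mediated-device (mdev) mechanism that NVIDIA vGPU uses under Linux. The PCI address (0000:0b:00.0) and profile name (nvidia-256) are placeholder examples, and a vGPU-capable (GRID-class) driver would already have to be loaded for any of the sysfs paths to exist:

```shell
# Sketch only: where KVM vGPU (mediated device) creation lives in sysfs.
# The PCI address and profile name below are illustrative placeholders.

# Build the sysfs path used to create one vGPU instance of a given
# profile on a given physical GPU.
mdev_create_path() {
    gpu=$1
    profile=$2
    echo "/sys/bus/pci/devices/$gpu/mdev_supported_types/$profile/create"
}

# On a real host (as root, vGPU driver loaded) you would first list
# the profiles the card offers:
#   ls /sys/bus/pci/devices/0000:0b:00.0/mdev_supported_types
# then create an instance by writing a fresh UUID to the create node:
#   uuidgen > "$(mdev_create_path 0000:0b:00.0 nvidia-256)"
# That UUID is what the VM definition references as an mdev hostdev.
mdev_create_path 0000:0b:00.0 nvidia-256
```

This is only the plumbing; without the licensed driver the `mdev_supported_types` directory never appears, which is why consumer cards can't do this out of the box.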
Yeyo53 Posted January 16, 2021
Regarding this topic: would it be possible to share a GTX 1060 with the Plex container (90% of the time), but have it locked by the VM if I start a Windows VM (10% of the time)?
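The usual way this kind of "one at a time" handover works on a plain Linux/KVM host is to release the card from the host NVIDIA driver (which the container uses) and bind it to vfio-pci before the VM starts. A sketch, with a placeholder PCI address (check yours with `lspci`); the function only prints the commands as a dry run, and on a real system you would pipe its output to `sh` as root:

```shell
# Dry-run sketch of moving a GPU from the host driver to vfio-pci so
# a VM can claim it. The PCI addresses below are placeholders.
vfio_rebind_cmds() {
    dev=$1
    echo "echo $dev > /sys/bus/pci/devices/$dev/driver/unbind"
    echo "echo vfio-pci > /sys/bus/pci/devices/$dev/driver_override"
    echo "echo $dev > /sys/bus/pci/drivers_probe"
}

# Both the video function and the HDMI audio function must move:
vfio_rebind_cmds 0000:01:00.0
vfio_rebind_cmds 0000:01:00.1
# To hand the card back to the container afterwards, write an empty
# string into driver_override and reprobe the device.
```

Note this only works cleanly if nothing on the host (the Plex transcode, a console framebuffer) is still holding the device when you unbind.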
mSedek Posted March 31, 2021
Nvidia yesterday released a driver that unlocks GPU virtualization, so what do I need to do to share my 2070 Super with multiple VMs at the same time?
SimonF Posted March 31, 2021 Share Posted March 31, 2021 (edited) The announce doesnt support vGPU, just that its now offically supported as passthru to a winows vm with latest drivers Edited March 31, 2021 by SimonF Quote Link to comment
iamnypz Posted April 11, 2021
Nvidia's Virtualization Unlocked On Gaming GPUs via Hack | Tom's Hardware
SimonF Posted April 11, 2021
3 hours ago, iamnypz said: Nvidia's Virtualization Unlocked On Gaming GPUs via Hack | Tom's Hardware
From what I have read, the hack spoofs the vendor ID for the GPUs, but I think you still have to buy licences for the vGPU drivers, etc., and it doesn't look like a straightforward install.
iamnypz Posted April 11, 2021
Ahhh dayum!
jonp Posted April 12, 2021
Just a friendly reminder, everyone: discussing how to circumvent other vendors' licensing will get content removed and users banned from this forum. Tread lightly here.
GreenEyedMonster Posted April 24, 2021
I believe this is worth investigating. The GitHub has an explanation of how to do it. (I'm just not technical enough to figure it out...) You do have to pay for the vendor licensing after 90 days, so it's worth a shot to try it out and see if it's worth it. The cost of entry isn't that high either: $100 + $25 a year. I know it would save me buying a more expensive motherboard...
max2veg Posted April 25, 2021
Virgil 3D GPU project (from the GNOME Boxes GitLab wiki, "3daccel" page):
Virgil is a research project to investigate the possibility of creating a virtual 3D GPU for use inside QEMU virtual machines, which allows the guest operating system to use the capabilities of the host GPU to accelerate 3D rendering. The plan is to have a guest GPU that is fully independent of the host GPU. The project is currently investigating the desktop virtualization use case only: the viewer, host and guest all running on the same machine (i.e. workstation or laptop). Some areas are in scope for future investigation but are not being looked at at this time. It can run a desktop and most 3D games thrown at it.
So it seems the current limitation is that the VMs and viewer apps must all run on the same machine (the host / VM server)... but it's a step forward.
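For reference, turning Virgil on in plain QEMU (GNOME Boxes does something similar under the hood) comes down to two flags; everything else here is an illustrative invocation with made-up image name and sizes, and it needs QEMU built with virglrenderer, a host GPU with working OpenGL, and a guest kernel with the virtio-gpu driver:

```shell
# The two flags that enable Virgil on recent QEMU (6.1+; older
# versions use "-device virtio-vga,virgl=on" instead):
VIRGL_FLAGS='-device virtio-vga-gl -display gtk,gl=on'

# Illustrative full invocation (disk image name and sizes made up):
#   qemu-system-x86_64 -enable-kvm -m 4096 -smp 4 \
#       $VIRGL_FLAGS -drive file=linux-guest.qcow2,if=virtio
echo "$VIRGL_FLAGS"
```

Because the rendering happens host-side and the frames go to a local display, this is also exactly why the viewer has to sit on the same machine as the VM.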
djpain Posted April 25, 2021 Share Posted April 25, 2021 (edited) On 4/24/2021 at 2:54 PM, GreenEyedMonster said: I believe this is worth investigating. The GitHub has an explanation on how to do it. (I'm just not technical enough to figure it out...) You do have to pay for the vendor licensing after 90 days. So worth a shot to try it out and see if worth it. The cost of entry isn't that high either... $100+ $25 a year. I know it would save me buying a more expensive motherboard... I've been reading the vGPU wiki and the instructions for other linux os don't seem too hard. I'll give it a shot and report back. EDIT: You need a corporate email to sign up Edited April 26, 2021 by djpain Quote Link to comment
Maor Posted June 17, 2021
Seems like we will get a user-friendly tutorial within a couple of days.
Mobius71 Posted July 31, 2021
Found a good video demonstrating using one GPU for two VMs, but it's done with Hyper-V. There might be some correlations to be made with KVM on Unraid, but I'm not well-versed enough to know. Still thought it was interesting.
Mobius71 Posted July 31, 2021
Well, I should've watched the previous post before I posted, as the video Maor posted references the link I posted. Sorry for the redundant info, everyone.
GreenEyedMonster Posted August 10, 2021
Nope! I think if we could do a nested Windows VM with Hyper-V enabled, that "should" work. Getting the nested VM with Hyper-V working is the challenge. It would be really cool if we could do it!
namtr0 Posted August 30, 2021
Yeah, that's not going to work unless someone does the vGPU hack and verifies getting it working on Unraid. Hyper-V requires Gen 2 VMs to use GPU-P, which cannot be done nested. I was able to verify that Windows sees the GPU if it's passed through, and it even reports it as partitionable, but you can't create a Gen 2 VM on the nested Hyper-V, so that's where it stops. You can't run Unraid on Hyper-V either, due to USB issues. You can DDA a USB controller through to the VM, but it still fails to boot because Hyper-V simply doesn't allow USB boot of a guest. It's a dead end all around. I'm looking to move to Proxmox, use the vGPU hack, and just run Unraid nested. I would love for anyone with good KVM experience to look at the vGPU hack that works on Proxmox and figure out how to get it working on Unraid...
razrbk Posted January 12, 2022
If I followed correctly, the main difference between Proxmox and Unraid is that Proxmox lets the user play with the kernel and DKMS, while Unraid doesn't. You can build your own kernel, but it's much more complex on Unraid. The vGPU mods require blacklisting the kernel GPU drivers in order to install the vGPU kernel drivers, and this is impossible without asking the boss. Limetech could be reluctant to add the feature, as it's 'borderline' with Nvidia's driver licence. But I agree that the core purpose of Unraid is resource virtualization, and vGPU will by necessity be the next mandatory integration to survive in this business.
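For context, the "blacklisting the kernel GPU drivers" step on a general Linux host is just a modprobe config fragment like the one below (file name and driver list illustrative); the point is to keep the stock drivers off the card so a replacement vGPU driver can claim it first, which is exactly the step a locked-down kernel makes hard:

```shell
# /etc/modprobe.d/blacklist-gpu.conf -- illustrative example of
# keeping the stock GPU drivers from binding to the card
blacklist nouveau
blacklist nvidiafb
# after editing, rebuild the initramfs and reboot so the change
# takes effect before the drivers load
```

On Unraid there is no DKMS and the kernel ships prebuilt, so even this small step needs a custom kernel or vendor support, which is razrbk's point.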
glockmane Posted December 27, 2022
Please add this feature. I've got an RTX 3090 with 24GB of VRAM, and I would love to split it into 4 vGPUs.