Dual gaming rig build issues



Hi community!

 

I was fascinated a while back when Linus Tech Tips built that dual gaming rig, and a week ago they posted another video on the matter, so I pulled the trigger and bought a new gaming rig consisting of:

- Gigabyte Aorus Elite AC mobo with the latest F5c BIOS

- Intel 10th gen Core i9-10900K CPU

- HyperX Fury Black DDR4-3200 CL16 dual channel 64 GB

- 2 x Samsung EVO NVMe 500 GB

- 2 x Gigabyte GTX 1660 Ti 6 GB

- Seasonic Focus GX, 80+ Gold, 750W

 

The first issue I hit was that the latest stable release, 6.8.3, is old enough that it doesn't support the onboard NIC, so I had to switch to the 6.9.0-beta25 release. Then, while waiting for the graphics cards to arrive, I put everything else together and created the two gaming VMs using Windows 10. That went well, and I was hopeful I could complete the task.

But another showstopper turned up once the two graphics cards were in and I couldn't make everything work. I watched SpaceInvaderOne's videos on how to rip the ROM from the card, did so, and also managed to hex edit it to cut out the unusable starting portion. I was quite shocked to see that I had to trim it down from 1,047,040 bytes to 881,664 bytes.

VM1 with the 1st graphics card just wouldn't work (in fact, I couldn't get the BIOS out of that card at all; the dumped file would be empty). VM2 with the 2nd graphics card worked fine: it could see the passed-through graphics and install the NVIDIA drivers without issues. I left the integrated graphics enabled with a monitor hooked to it at all times, both to be able to issue console commands and to keep the two discrete cards free for passthrough, so I don't know where, or if, I failed to do something on the configuration side.
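For reference, the dump itself can be done straight from the Unraid console via sysfs. This is only a rough sketch: the PCI address 0000:01:00.0 and the output path are just examples and need to be replaced with the address lspci shows for the card being dumped and a folder of your choice.

cd /sys/bus/pci/devices/0000:01:00.0/
echo 1 > rom                                   # make the option ROM readable
cat rom > /mnt/user/isos/vbios/gpu_dump.rom    # example path -- copy the ROM somewhere on the array
echo 0 > rom                                   # lock the ROM again

Depending on where a ROM comes from, everything before the 0x55AA option ROM signature may still have to be cut away in a hex editor, which is the trimming step from the video.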

 

The 3rd thing that didn't work right at first: after adding the graphics cards there was just too much hardware and the integrated NIC wouldn't work anymore, so I had to resort to disabling the integrated sound and the SATA controller. That isn't a big issue, since I don't plan to use the onboard sound and each VM has its own dedicated NVMe drive; for the internal disk array I used two USB sticks instead, one for data and one for parity.
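In case it helps anyone checking the same thing, this is the usual loop for listing IOMMU groups from the console (nothing Unraid-specific, just sysfs plus lspci); it makes it easy to see which devices end up grouped together after hardware changes like this:

for g in /sys/kernel/iommu_groups/*; do
    echo "IOMMU group ${g##*/}:"
    for d in "$g"/devices/*; do
        echo -e "\t$(lspci -nns "${d##*/}")"    # every device in the group, with vendor/device IDs
    done
done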

 

Another interesting thing to note is that I confirmed the hardware and software run fine without Unraid by simply booting directly from either of the dedicated NVMe drives. After just two minutes of Windows reconfiguring itself to detect everything and a reboot, I had everything ready to test.

Keeping in mind that I am quite a newbie and only started documenting and implementing this a week ago, I turn to this fine community to help me out and hopefully get the whole thing working.


Hello Cristian,

 

As someone who has also recently gone through this (i7-10700K, Asus Z490-A Prime, 6.9.0 Beta 25), I personally chose to go the route of passing the entire NVMe drive through to the VM (configured through VFIO).

 

If I may suggest, it might be easier for you to put Windows on each NVMe drive and complete the entire Windows setup bare metal, then boot back into Unraid and pass each NVMe (and its hardware: GPU, etc.) to its own independent VM. You'll still need to load the correct drivers once back in Unraid, but that's pretty easy.
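Just to illustrate what the binding boils down to, here is a rough manual sketch; the address 0000:02:00.0 is an example, not your real one, and the drive must not be in use by Unraid when it's detached:

modprobe vfio-pci
DEV=0000:02:00.0                                          # example address -- take yours from lspci -nn
echo vfio-pci > /sys/bus/pci/devices/$DEV/driver_override
echo "$DEV" > /sys/bus/pci/devices/$DEV/driver/unbind     # detach it from the nvme driver
echo "$DEV" > /sys/bus/pci/drivers_probe                  # re-probe so vfio-pci claims it

In practice you'd want the binding to happen at boot rather than by hand, but the manual version shows what "configured through VFIO" means.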

 

Hopefully this helps somewhat.


Hi @Dolce

 

I did in fact separate the two VMs, each with its own NVMe drive and GPU. I also passed through all of the GPU's related multifunction devices that are exposed to Unraid (video, audio, USB-C controller). Bare metal, both boot and work fine. But inside Unraid only the 2nd VM works; the 1st doesn't boot, due to something related to the GPU.
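For what it's worth, a quick way to double-check that every function of a card is visible and bound the way you expect is to list the whole slot together with the driver in use for each function (the slot 01:00 below is just an example):

lspci -nnk -s 01:00     # shows .0 video, .1 audio and any USB functions on that slot,
                        # each with its "Kernel driver in use" line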


I think I may have found the issue. It seems that my first graphics card keeps jumping from one ID to the other (01 and 02).
When the first card's ID is 01 it displays "(rev a1)", and when the ID is 02 it shows up as "(rev ff)".
The second card seems to be stable at ID 04 with "(rev a1)".
I managed to boot both VMs just once, when they were displaying different revisions. This is very strange.

Q1: Is this an issue because the cards are supposed to be identical (should I just put in two different cards)?
Q2: Can I make their IDs "sticky" somehow while keeping both identical cards?

[Attached screenshot: Untitled.png]
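From what I've read, "(rev ff)" in lspci usually means the card has stopped answering on the bus (every config space read comes back as 0xFF), typically after a failed reset, rather than the card really changing revision. A remove and rescan sometimes brings it back without a full reboot; the address below is just an example:

lspci -nn | grep -i nvidia                            # check which card currently reads back as rev ff
echo 1 > /sys/bus/pci/devices/0000:01:00.0/remove     # drop the unresponsive function
echo 1 > /sys/bus/pci/rescan                          # rescan the bus so it gets re-enumerated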
 

