Cristian Sava

  1. Yes, I have 3 monitors. One is for the onboard gfx; the other two are for the Nvidia cards. They are all connected simultaneously.
  2. I think I may have found the issue. My first graphics card keeps jumping from one PCI ID to the other (01 and 02). When the first card's ID is 01 it displays "(rev a1)", and when its ID is 02 it shows "(rev ff)". The second card seems stable at ID 04 with "(rev a1)". I managed to boot both VMs just once, when they displayed different revisions. This is very strange. Q1: is this an issue, given that the cards are supposed to be identical (should I just put in two different cards)? Q2: can I make their IDs "sticky" somehow if I keep them both identical?
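For what it's worth, "(rev ff)" in `lspci` output usually means the device is not responding on the bus at all (e.g. it failed to come out of reset), rather than being a genuinely different revision. A minimal sketch of how one might spot that symptom by parsing `lspci` output; the sample lines below are hypothetical, mirroring the IDs and revisions described above:

```python
import re

# Hypothetical `lspci` output mirroring the symptom described: the same
# GTX 1660 Ti appearing at different bus IDs with different revisions.
SAMPLE_LSPCI = """\
01:00.0 VGA compatible controller: NVIDIA Corporation TU116 [GeForce GTX 1660 Ti] (rev a1)
02:00.0 VGA compatible controller: NVIDIA Corporation TU116 [GeForce GTX 1660 Ti] (rev ff)
04:00.0 VGA compatible controller: NVIDIA Corporation TU116 [GeForce GTX 1660 Ti] (rev a1)
"""

def find_dead_gpus(lspci_output: str) -> list[str]:
    """Return PCI addresses whose revision reads 'ff' (device not responding)."""
    dead = []
    for line in lspci_output.splitlines():
        m = re.match(r"^(\S+) .*\(rev ([0-9a-f]{2})\)$", line)
        if m and m.group(2) == "ff":
            dead.append(m.group(1))
    return dead

print(find_dead_gpus(SAMPLE_LSPCI))  # → ['02:00.0']
```

On a live system you would feed this the real output of `lspci`; a card that shows "rev ff" is worth reseating or checking for reset/power issues before blaming the VM config.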
  3. Hi @Dolce. I did in fact separate the two VMs, each with its own NVMe drive and GPU. Once more, I passed through all of the GPU's related multifunction devices that are exposed to Unraid (video, audio, USB-C controller). On bare metal everything works fine and both boot, but inside Unraid only the 2nd VM works; the 1st doesn't boot due to something related to the GPU.
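The "pass through all related multifunction devices" step above boils down to grouping PCI functions by slot so video, audio, and USB-C always travel to the same VM together. A small sketch of that grouping, using hypothetical `lspci` lines for a GTX 1660 Ti (the exact addresses are illustrative):

```python
from collections import defaultdict

# Hypothetical `lspci` lines for the two cards: the first GPU exposes
# several functions (video, HDMI audio, USB controller, UCSI) that should
# all be passed to the same VM.
SAMPLE_LSPCI = """\
01:00.0 VGA compatible controller: NVIDIA Corporation TU116 [GeForce GTX 1660 Ti]
01:00.1 Audio device: NVIDIA Corporation TU116 High Definition Audio Controller
01:00.2 USB controller: NVIDIA Corporation TU116 USB 3.1 Host Controller
01:00.3 Serial bus controller: NVIDIA Corporation TU116 USB Type-C UCSI Controller
04:00.0 VGA compatible controller: NVIDIA Corporation TU116 [GeForce GTX 1660 Ti]
"""

def group_functions(lspci_output: str) -> dict[str, list[str]]:
    """Group PCI addresses by slot (bus:device) so every function of a
    multifunction device can be assigned to the same VM."""
    groups = defaultdict(list)
    for line in lspci_output.splitlines():
        addr = line.split()[0]         # e.g. "01:00.1"
        slot = addr.rsplit(".", 1)[0]  # e.g. "01:00"
        groups[slot].append(addr)
    return dict(groups)

for slot, funcs in group_functions(SAMPLE_LSPCI).items():
    print(slot, "→", funcs)
```

Each slot's full function list is what should end up bound to vfio-pci and attached to one VM; splitting a slot's functions between host and guest is a common cause of one VM refusing to start.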
  4. Hi community! I was fascinated a while back when Linus Tech Tips built that dual gaming rig, and a week ago they posted another video on the matter, so I pulled the trigger and bought a new gaming rig consisting of:
     - Gigabyte Aorus Elite AC mobo with the latest F5c BIOS
     - Intel 10th gen Core i9 10900K CPU
     - HyperX Fury Black DDR4 3200 MHz CL16 dual channel 64 GB
     - 2 x NVMe Samsung EVO 500 GB
     - 2 x Gigabyte GTX 1660 Ti 6 GB
     - Seasonic Focus GX, 80+ Gold, 750 W

The first issue I hit was that the latest stable release, 6.8.3, is old enough that it doesn't support the onboard NIC, so I had to switch to the 6.9.0-beta25 release. Then, while waiting for the graphics cards to arrive, I put everything else together and created the two gaming VMs using Windows 10. That went well and I was hopeful I could complete the task. But another showstopper turned up when, with the two graphics cards installed, I couldn't make everything work. I watched spaceinvaderone's videos on how to rip the ROM from the card and did so, and also managed to hex edit it to cut out the unusable starting portion. I was quite shocked to see that I had to trim it down from 1,047,040 bytes to 881,664 bytes. VM1 with the 1st graphics card just wouldn't work (in fact I couldn't get the BIOS out of it; the file would be empty). VM2 with the 2nd graphics card worked fine: it could see the passed-through graphics and install the Nvidia drivers without issues. I left the integrated graphics enabled and kept a monitor hooked to it at all times so I could pass console commands and keep the two discrete cards free, so I don't know where (or if) I failed to do something on the configuration side.

The 3rd thing that didn't work at first was that, after adding the graphics cards, there was just too much hardware and the integrated NIC wouldn't work anymore, so I had to resort to disabling the integrated sound and SATA controller. That isn't a big issue, as I don't plan to use the onboard sound and the VMs each have a dedicated NVMe drive, so for the internal disk array I used two USB sticks: one for data and one for parity. Another interesting thing to note: I confirmed the hardware and software run fine without Unraid by directly booting either of the dedicated NVMe drives; after just two minutes of Windows reconfiguring itself to detect everything, plus a reboot, I had everything ready to test. Bearing in mind that I am quite a newbie and only started documenting and implementing this a week ago, I turn to this fine community to help me out and hopefully make the whole thing work.
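The hex-edit step from spaceinvaderone's video (cutting the NVIDIA-specific header off the dumped vBIOS so only the loadable expansion-ROM image remains) can also be scripted. A minimal sketch, assuming the usual manual heuristic: find the 0x55AA signature that has the ASCII marker "VIDEO" shortly after it and drop everything before it. The 256-byte search window and the synthetic demo bytes are my assumptions, not values from the post:

```python
def trim_gpu_rom(data: bytes) -> bytes:
    """Strip the vendor header from a dumped Nvidia vBIOS so the file starts
    at the 0x55AA expansion-ROM signature (the part a VM can actually load)."""
    SIG = b"\x55\xaa"
    pos = 0
    while True:
        pos = data.find(SIG, pos)
        if pos == -1:
            raise ValueError("no expansion-ROM signature found")
        # The real ROM image carries the ASCII marker 'VIDEO' shortly after
        # the signature; 256 bytes is an assumed search window.
        if b"VIDEO" in data[pos:pos + 256]:
            return data[pos:]
        pos += 1

# Synthetic demo bytes (a real dump would come from the ROM-dump step,
# e.g. open("gtx1660ti.rom", "rb").read()):
fake_dump = b"NVIDIA-HEADER-JUNK" + b"\x55\xaa\x7f" + b"VIDEO...rest of ROM image"
trimmed = trim_gpu_rom(fake_dump)
print(len(fake_dump), "->", len(trimmed))  # the header bytes are gone
```

This mirrors the manual procedure (search for "VIDEO", delete everything before the preceding 0x55) and would explain a size drop like 1,047,040 → 881,664 bytes; it cannot help with VM1, though, since that card's dump came out empty in the first place.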