AlexSterk

Members · 6 posts
  1. I did get it fixed, but I really don't remember how. Sorry, it's been two years. I don't really use it anymore anyway, because it just wasn't what I had hoped it would be.
  2. Someone informed me that the reason OVMF did not work is that my Windows installation apparently could not boot via UEFI. After running 'mbr2gpt' this was resolved, and I no longer got the Interactive Shell, just the boot logo. Unfortunately, the problem with the blue screens is still not resolved; it is exactly the same as it was with SeaBIOS.
  3. More updates: booting the VM gave me a BSOD with INACCESSIBLE_BOOT_DEVICE. After another two attempts it went into Automatic Repair (which failed with "Boot critical file: f:\boot\resources\custom\bootres.dll is corrupt"), and from there I was able to boot into safe mode. Once there, I rebooted and the machine came up just fine, with no BSOD and no safe mode (further reboots were fine too). But when I then went back to bare metal, I got the same BSOD while booting. The same safe-mode trick worked, but then the VM got a BSOD again. Apparently I can't have both without a BSOD: it's either the VM or bare metal, and I can switch between them by booting into safe mode once. I don't know what to do, and I'm contemplating whether to stay on the VM (and probably lose some performance) or just go bare metal and only go through all this trouble when I actually need a VM, for example when I want to run two machines side by side. Does anyone know where to go from here?
  4. UPDATE 2: Turns out SeaBIOS was working; it just took a really long time for the screen to become active. However, attempting to boot my already existing Windows installation through a VM causes a blue screen with "inaccessible boot device". Because of this, I'm in the process of installing a fresh copy of Windows in a VM. I guess I'll just have to reinstall my games, or potentially share my Steam drive with the VMs, if possible. If anyone has a fix for the blue screen, let me know!
  5. UPDATE:
     * OVMF does not work with VNC instead of the GPU.
     * SeaBIOS DOES work with VNC instead of the GPU.
     * I followed this guide (https://forums.unraid.net/topic/56049-video-guide-how-to-easily-passthough-a-nvidia-gpu-as-primary-without-dumping-your-own-vbios/) by spaceinvaderone to dump my GPU's ROM. I'm not sure it was necessary, and it still doesn't work with either OVMF or SeaBIOS. (EDIT: Turns out this wasn't necessary, since I boot unRaid with my Intel iGPU.)
     * Using both VNC and the GPU in SeaBIOS does not work; the VNC window says something about the guest not having a display turned on.
  6. (Full disclosure: I also made a Reddit post about this; posting here as well for visibility: https://www.reddit.com/r/unRAID/comments/b8ogc7/ovmf_keeps_booting_to_uefi_interactive_shell_and/?ref=share&ref_source=link)
     I've really been wanting to try out unRaid, mainly for its VM feature. My idea was to run my PC with a bare-metal Windows 10 installation and also be able to boot into that same installation from a VM (as explained in this guide by spaceinvaderone: https://forums.unraid.net/topic/72338-video-guide-how-to-dual-boot-baremetal-windows-and-unraid-then-boot-the-same-windows-as-a-vm/). That way I could use my PC as normal, but when a friend comes over we could both play games together (I plan on getting another GPU for this, but I first want to get this working before I spend money on it).
     However, no matter how I change the settings, every time I boot the VM with OVMF, I get kicked to the UEFI Interactive Shell. Typing 'exit' and selecting Continue just kicks me back to the shell. When I try a VM with SeaBIOS, my screen doesn't even receive any input (blank). I've also tried 'real' VMs with a Windows install ISO, but I have the same problem: I can't even boot into the installation disks. I have a screen connected to my CPU's integrated graphics for unRaid itself, and I selected my dedicated GPU for the VM.
     Furthermore, when I try changing i440fx to Q35, I get the following error in the unRaid webUI: "XML error: The PCI controller with index='0' must be model='pcie-root' for this machine type, but model='pci-root' was found instead". I don't know what to do anymore. Is it my hardware?
     My hardware:
     * ASRock Z370M Pro4
     * Intel i7-8700K (iGPU for unRaid)
     * Nvidia GTX 1050 Ti (GPU for the VM; note: it has multiple screens attached, not sure if that matters)
     * 1 SSD with the Windows installation
     * 1 HDD as a data drive for the Windows installation
     * 1 HDD in the unRaid array
     I don't plan on using unRaid for anything else yet, so I didn't get any extra disks etc. I just want to get my specific use case working first. Does anyone know what to do?!
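For anyone landing here with the same UEFI Interactive Shell symptom: the 'mbr2gpt' fix mentioned in post 2 converts an MBR system disk to GPT so the existing Windows install can boot under OVMF/UEFI. A minimal sketch of the usual invocation, run from an elevated Command Prompt inside the running Windows installation (back up first; this modifies the partition table):

```
:: Check whether the system disk can be converted without data loss
mbr2gpt /validate /allowFullOS

:: If validation passes, convert the MBR disk to GPT
mbr2gpt /convert /allowFullOS
```

After conversion the firmware must boot in UEFI mode (bare metal) or via OVMF (VM); legacy/CSM and SeaBIOS will no longer boot that disk.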
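The "boot into safe mode once to recover" dance described in post 3 can also be done without hammering F8 or waiting for Automatic Repair, by toggling the safeboot flag from an elevated Command Prompt. A sketch, not a fix for the underlying INACCESSIBLE_BOOT_DEVICE (which is typically Windows re-detecting the storage controller when it moves between bare metal and the VM's virtual hardware):

```
:: Force the next boot into minimal safe mode
bcdedit /set {current} safeboot minimal

:: After the safe-mode boot, remove the flag to boot normally again
bcdedit /deletevalue {current} safeboot
```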
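For reference on post 5: when a dumped vBIOS is used for passthrough, it is attached in the VM's XML (unRaid's VM editor exposes this as the "Graphics ROM BIOS" field) via a `<rom>` element inside the GPU's `<hostdev>`. A sketch, assuming a hypothetical ROM path `/mnt/user/isos/gtx1050ti.rom` and the GPU at PCI address 01:00.0 (both would differ on another system):

```xml
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <!-- Host PCI address of the passed-through GPU -->
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
  <!-- Path to the dumped vBIOS; hypothetical example path -->
  <rom file='/mnt/user/isos/gtx1050ti.rom'/>
</hostdev>
```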
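On the Q35 error in post 6: the message means the VM's XML still contains the i440fx-style root PCI controller, which libvirt rejects for a Q35 machine type. The mismatch looks like this (illustrative fragment; recreating the VM with Q35 selected from the start usually generates the correct controller automatically):

```xml
<!-- i440fx machine types use a legacy PCI root: -->
<controller type='pci' index='0' model='pci-root'/>

<!-- Q35 machine types require a PCIe root instead: -->
<controller type='pci' index='0' model='pcie-root'/>
```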