Win10 VM boot issues upgrading from 6.9.2 to 6.11.5


smick

Hi Everyone,

 

 

I have an UnRaid setup that I use primarily as a Windows 10 computer. I have successfully updated UnRaid from 6.9.2 to 6.11.5 and it boots. However, I am unable to get my Windows 10 VM working. Previously, I had ACS override set to "Both" to separate out my video card. After much experimentation I have disabled ACS override. In System Devices I see my video card and most, but not all, of the devices I need for the VM in their own IOMMU groups. I checked the ones I could and rebooted UnRaid.

 

When I launch my Windows 10 VM, it fails to boot and I'm getting spammed with this message:

 

2022-12-12T17:19:50.325553Z qemu-system-x86_64: vfio_region_write(0000:29:00.0:region3+0x3ce6, 0x0,1) failed: Device or resource busy

 

That's as far as I'm able to get. My first priority is to get the VM to boot. The second priority is to pass through the device below, which is now grouped in IOMMU group 14 and not available. This is a dual Ethernet NIC add-on card. I hope to get this working once I can successfully boot the VM.

 

[8086:10c9] 28:00.0 Ethernet controller: Intel Corporation 82576 Gigabit Network Connection (rev 01)

[8086:10c9] 28:00.1 Ethernet controller: Intel Corporation 82576 Gigabit Network Connection (rev 01)
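
For reference, the IOMMU groups can also be listed from the UnRaid terminal with a small shell loop (a generic sketch, not specific to my board):

for g in /sys/kernel/iommu_groups/*; do
    echo "IOMMU group ${g##*/}:"
    for d in "$g"/devices/*; do
        lspci -nns "${d##*/}"    # numeric IDs plus device names, like the listing above
    done
done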

 

Thanks!

 

Steve

 

tower-diagnostics-20221212-1120.zip

Link to comment

I feel like I'm going backwards. Before I got 6.11.5 to boot, I'd reboot the computer and get nothing, no text at all, as if the BIOS didn't start. Finally I restored the old 6.9.2 and got it to boot. I then updated to 6.11.5 and everything seemed stable, except Win10 wouldn't boot.

 

Now my computer spontaneously turns off from inside UnRaid, and when I try to reboot, it's back to the no-BIOS issue with no text on the screen. Eventually it powers off. I have been able to pull the flash drive, insert it after the BIOS starts, and get it to boot, but after a short time it turns itself off again.

 

About a year ago I had an issue and posted to the forum, and it turned out to be my power supply, which I replaced with a Corsair RM750x. Does this sound like another power supply issue? I'm running out of ideas. Do I need a new computer? Arggg.

 

Any help appreciated.

 

Steve

Link to comment

Do you know what method you're booting in? Legacy or EFI?

To change to EFI, on the "Main" page click on the "Flash" drive, check "Permit UEFI boot mode" at the bottom of the page, then reboot.

 

If you run into issues, the manual way to disable this, say from another computer, is to rename the EFI folder on the flash drive from "EFI" to "EFI-" (that's a hyphen at the end).
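
A rough way to check and change this from a terminal, if that's easier (the /mnt/flash mount point below is just an example for when the stick is plugged into another Linux machine):

# On a running UnRaid box: this directory only exists when the system booted in UEFI mode
[ -d /sys/firmware/efi ] && echo "Booted UEFI" || echo "Booted legacy"

# Manual fallback from another Linux machine with the flash drive mounted at /mnt/flash (example path):
# renaming the EFI folder disables UEFI boot
mv /mnt/flash/EFI /mnt/flash/EFI-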

 

Hope that helps.

Link to comment

Thanks for the reply. Your post made me realize that it was booting in UEFI mode. I could tell something had changed because UnRaid's boot screens looked different, with smaller text, etc. It also explains why my hardware was grouped differently under System Devices. Thanks for the heads up!

 

Good news bad news:

 

Changing it to boot in legacy mode just caused it to not find a bootable OS. I tried running the make bootable script on both Mac and Windows, but it still wouldn't boot.

 

The good news is that I was able to go back to the previous 6.9.1 version, which did successfully boot, and Windows booted too! The emergency is over, but I still have some concerns/questions.

 

1) I'm deathly afraid to update UnRaid, at least on this computer that I use for Windows. I have another UnRaid server I use as a file server that is on 6.11.5, and it has always updated successfully. To me it seems like a security issue to not update UnRaid, which is why I tried to force an upgrade (again). Should I be worried about this?

 

2) I don't understand why my previous backup of 6.9.2 didn't boot Windows after failing on the 6.11.5 upgrade attempt. It doesn't seem that going back to the backup is guaranteed to work, at least not when it comes to booting Windows. Am I wrong on this, or have I missed a technical detail?

 

3) It sure would be nice to be able to use a separate flash device to attempt upgrades. That way I wouldn't have to write over my "good" flash drive to experiment. Is there a way to do this?

 

4) It seems that, as long as my computer doesn't have any more spontaneous power-downs, my issue was booting in UEFI mode. Do other people experience UEFI bugs like this?

 

I do love UnRaid and appreciate the support provided by the forums. Cheers!

Link to comment

I took another crack at it yesterday. As long as I boot in legacy mode, 6.11.5 comes up and I don't have the power-down issue I had when booting in UEFI. The IOMMU groups in System Devices match what I see in 6.9.2. My Windows 10 VM seems to boot, since the log looks normal, but the screens are black. Performing a shutdown from the VM manager works as expected. My VM is set up to boot with SeaBIOS. I have a test Win10 VM that boots with OVMF, and that doesn't work either.

tower-diagnostics-20221214-1150.v6.11.5.Win10ScreenBlack.zip

Edited by smick
Added diagnostics
Link to comment
On 12/14/2022 at 11:04 AM, PassTheSalt said:

I've also been having issues with a Windows 10 VM. I had it working before, but after changing USB devices it won't start, and I don't see anything useful in the logs. I wonder if there are just some issues with 6.11.x?

 

I would bet on it. I'm another user with a Win 10 VM that starts fine in 6.10.3, but any 6.11.x version I've tried to upgrade to has left my VM behaving *very* oddly.

 

 

 

Link to comment

I checked the VM logs and there is nothing that could explain why the output from the video card is just blank. Also, I can't even remote into the VM. The VM is running but not replying to any pings with the video card passed through, yet it boots up fine with VNC.
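
A couple of generic commands from the UnRaid terminal that can help confirm whether the guest is actually up (the IP address below is only an example; substitute your VM's address):

virsh list --all        # confirm libvirt thinks the VM is running
ping -c 4 192.168.1.50  # example VM IP; no reply suggests the guest never finished booting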

 

This must be a common problem with the new version. Any ideas?

Link to comment

From the diag file: qemu directory: Windows 10.txt

 

This was in the log when I tried 6.11.5 in EFI boot mode and started the VM.

 

0000:29:00.0:region3+0x109623, 0x0,1) failed: Device or resource busy
2022-12-12T17:20:05.957319Z qemu-system-x86_64: vfio_region_write(0000:29:00.0:region3+0x109624, 0x0,1) failed: Device or resource busy
2022-12-12T17:20:05.957331Z qemu-system-x86_64: vfio_region_write(0000:29:00.0:region3+0x109625, 0x0,1) failed: Device or resource busy
2022-12-12T17:20:05.957342Z qemu-system-x86_64: vfio_region_write(0000:29:00.0:region3+0x109626, 0x0,1) failed: Device or resource busy

...

 

Anybody have any ideas on what would cause the top line failure?
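
For what it's worth, the device's driver binding and any related kernel messages can be checked from the UnRaid terminal (generic commands, using the address from the log above):

lspci -nnk -s 29:00.0            # "Kernel driver in use" should show vfio-pci for a passed-through device
dmesg | grep -iE 'vfio|29:00'    # any related kernel messages around VM start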

 

Steve

 

Link to comment

I'm also having a similar issue.

 

After some upgrade, I'm not sure which one (I'm currently on version 6.11.5), I'm not able to boot into my Win10 VM via VNC; I just get a black screen.

I have this VM as a testing VM, so I only use it a few times a month or two.

 

It's on an M2 SSD via "spaces win clover", per this guide

 

 

 

Even if I take the Unraid OS USB out and boot directly from the M2 SSD, Windows loads fine... so it's definitely some issue with Unraid, but I have no clue where.

 

Other VMs, like Ubuntu and Debian, show up just fine via VNC.

 

Guys, any help please?

@SpaceInvaderOne

@limetech

 

Thanks

unraidtower-diagnostics-20221223-1349.zip

Link to comment

Guys, it seems like I got it working; I'm not sure which setting helped in the end.

I was watching a new video on a Win11 install and he upgraded VirtIO there, so I tried it :)

 

 

1. Download the latest VirtIO drivers ISO.

[screenshot]

 

2. Then I did:

- Machine: set to the latest 7.1 (it was 7.0 before); see the XML sketch below

- VirtIO Drivers ISO: the latest one you just downloaded

- VM Console WS Port: here I previously had "-1", maybe because I had edited it as text back when this field wasn't present in the UI.

On my other Linux VMs the values were 5701 and up.
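
For anyone editing the VM's XML directly instead of using the form view, the machine version lives on this line (a sketch; whether it is i440fx or Q35 depends on how the VM was created):

<!-- inside the <os> section of the VM's XML (XML view, or virsh edit <vm-name>) -->
<type arch='x86_64' machine='pc-i440fx-7.1'>hvm</type>
<!-- Q35-based VMs use machine='pc-q35-7.1' instead -->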

 

Try it and let me know if it's working now ;)

 

 

[screenshot]

Link to comment
  • 2 weeks later...
1 hour ago, mrtech213 said:

I updated the VirtIO driver and the VM is still not booting up with the video card. Is the new Unraid version having issues with an NVIDIA GeForce RTX 3060?

 

@limetech

@SpaceInvaderOne

 

 

Can anyone help with these?

Have you tried with VNC as primary and the 3060 as secondary? Then look in Device Manager to see the state of the driver for the 3060.

 

Also try pcie_aspm=off in the syslinux config.
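
Roughly, that means adding it to the append line of the default boot entry in /boot/syslinux/syslinux.cfg (Main -> Flash -> Syslinux Configuration); your existing append line may already carry other options:

label Unraid OS
  menu default
  kernel /bzimage
  append pcie_aspm=off initrd=/bzroot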

Edited by SimonF
Link to comment
1 hour ago, SimonF said:

Have you tried with VNC as primary and the 3060 as secondary? Then look in Device Manager to see the state of the driver for the 3060.

@SimonF

 

So the VM is booting up with VNC as the primary and the 3060 as the secondary. Looking in Device Manager under Display adapters, the video card is listed. I checked for driver updates via GeForce Experience and everything is up to date.

 

[screenshot]

 

 

 

Not sure why the VM doesn't boot up with the video card as the primary.

 

 

 

I tried pcie_aspm=off in the syslinux config and the VM is still not booting with the video card as primary.

Edited by mrtech213
Link to comment
19 minutes ago, mrtech213 said:

@SimonF

So the VM is booting up with VNC as the primary and the 3060 as the secondary. [...] I tried pcie_aspm=off in the syslinux config and the VM is still not booting with the video card as primary.

Can you show System Devices from the Tools menu for the 3060?

Link to comment
