smick

Members
  • Posts: 36
  • Joined
  • Last visited

smick's Achievements

Noob (1/14)

1 Reputation

4 Community Answers

  1. Thanks! This was my problem. My rootshare was working and all of a sudden stopped. Somehow the SMB security setting must have changed.
  2. I had a rootshare set up in Unassigned Devices with a mapped network drive in Windows. The mapped drive could no longer find the UnRaid rootshare. It was mapped to \\MYUNRAID\user-pool.rootshare. On the server I couldn't unmount the rootshare without restarting the array. I deleted it and created a new one, and it put it in user/rootshare/user-pool.rootshare. Now I can't map that in Windows. What am I missing? myunraid-diagnostics-20230227-2220.zip
  3. Success! I've finally got my troubled 6.9.2 setup upgraded to 6.11.5. In the end it took the latest mobo BIOS, UnRaid booting legacy, and a new NVidia GT1030. The BIOS update let me get away from ACS override, since my video card and audio are now in their own IOMMU groups. I now have two UnRaid servers running GT1030s and can run the Windows VM on either, just in case. Any good ideas on how best to use one to back up the other, data-wise? The mobo is an MSI Carbon Pro AC with a Ryzen 2700X. By the way, the reason for the GT1030 is that it's the cheapest video card I can find that does 4K. The downside is it can't be used in Plex for HW decodes.
     Regarding vfio binding in System Devices: it doesn't seem to matter whether I bind the video card or not. Should I bind everything available so I can use it safely in VMs, bind only what I know I need, or doesn't it matter, since anything I can bind isn't being used by UnRaid anyway? (A small script for double-checking the IOMMU grouping is sketched after these posts.)
     I would also suggest that when we do a manual flash backup it would be nice to have a pop-up option to change the file name to describe the conditions. That would be useful for troubleshooting and would make it easy to restore milestone backups. JMTC.
     Also, for problems like this, UEFI vs. legacy booting is critical. Legacy is the older mode; UEFI is newer and, in my case, shows smaller text. On the flash drive, a directory named "EFI" enables UEFI boot, while renaming it to "EFI-" forces legacy boot. My luck has been with legacy. I used a Mac and a virtual PC to create boot flashes, and the most reliable method for me was not to use the flash creator but instead to format the flash FAT32 on the PC, unzip the backup, copy it to the flash, and run the respective make-bootable script from the base flash directory. Sometimes UnRaid backs up as a directory and sometimes as a zip file, so PAY ATTENTION. Look at that flash drive and try to understand the file structure; that will go a long way toward resolving mysterious boot issues. Remember EFI vs. EFI-.
     If you have a random power-down issue, you might have the Dynamix S3 Sleep plugin installed with sleep enabled. TURN IT OFF in the settings if updating your mobo BIOS. It could be that the old BIOS did not support S3 sleep and the new one does!
     I didn't have to specify a "Graphics ROM BIOS" in my VM, though I was ready to do that after downloading one from techpowerup.com and clearing the header per the SpaceInvaderOne video. So what does that setting do exactly? Does the specified ROM supersede the BIOS on the card, or does it provide a secondary source of information to UnRaid?
     Also, double-check that you have HVM and IOMMU enabled in the BIOS and shown as enabled, ESPECIALLY if you updated your BIOS. It might have changed! Get this from the info icon at the top of the screen. HVM goes by many names; for my BIOS it was SVM, and it moved after the BIOS update.
  4. Here's an update on my problem that started the thread. I have my server upgraded to 6.11.5 with the latest MB BIOS but still have persistent issues passing through a graphics card to the Win10 VM.
     1) I was able to move my Win10 VM to another UnRaid server, which boots correctly using my NVidia GT1030. It works fine, although the NVidia audio occasionally stutters.
     2) I have added a Radeon RX-560 to my Tower machine. I was able to get Win10 to boot in 6.9.2 with the Radeon, but it was unstable and then started freezing during the rotating-dots portion of Windows boot. After upgrading to 6.11.5 it will boot with VNC but not with a video card. I have tried both legacy and UEFI boot modes with the RX-560. Previous attempts with the NVidia GT1030 also failed on this motherboard.
     I thought it would be interesting to compare the VM boot logs between the two machines. I have two sets of comparison images below, with the left side being the Tower non-working VM and the right side the working VM on my other server. There is an obvious section missing in the non-working version! (A quick way to diff the two logs as plain text is sketched after these posts.) Any ideas greatly appreciated. The top is legacy mode, the bottom is UEFI. Sorry for the tiny text. tower-diagnostics-20230104-1047.zip
  5. I received this error on 6.11.5 and UnRaid advised I post my diagnostics for review. Diagnostics are attached. Thanks. myunraid-diagnostics-20230104-0902.zip
  6. I found the solution: Settings -> Sleep Settings were enabled. It was "sleeping" shortly after I booted UnRaid. This happened after upgrading to 6.11.5. I may have been booting in UEFI rather than legacy without realizing it and the feature just didn't work in legacy.
  7. Thanks. I was able to get into the BIOS and change the boot order.
  8. I have an MSI B450 Tomahawk Max with a Ryzen 2600 running 6.11.5. It has ALL my files on it, so I need to get it to boot. Here's what I did: I had a single HD5450 graphics card and a 2-port SATA card with 2 cache SSDs attached. The board is supposed to support 2 graphics cards, so I tried plugging in a Radeon RX-560 as a second card. In doing this, I had to move the SATA card to a different slot to make room for the new graphics card. Now it won't boot. I see the SATA controller find the two cache drives and then it says "Press any key to exit". That's it; it just hangs, and if I press the power button it immediately turns off. I can't seem to get into the BIOS. The last thing I did before I rebooted was assign the HD5450 to a Win11 VM to check passthrough, and it worked. That's when I rebooted with the second video card. I tried pulling the new Radeon card and moving the SATA card back to its original location. It still won't boot. I tried deleting /boot/config/vfio-pci.cfg because I had my HD5450 and all the other options in System Devices checked before I rebooted. It still doesn't boot. I pulled the SATA card out completely; it doesn't boot either. Please help!!! I am stuck. Steve
  9. From the diag file, qemu directory, Windows 10.txt. This was in the log when I tried 6.11.5 in EFI boot mode and started the VM:
     0000:29:00.0:region3+0x109623, 0x0,1) failed: Device or resource busy
     2022-12-12T17:20:05.957319Z qemu-system-x86_64: vfio_region_write(0000:29:00.0:region3+0x109624, 0x0,1) failed: Device or resource busy
     2022-12-12T17:20:05.957331Z qemu-system-x86_64: vfio_region_write(0000:29:00.0:region3+0x109625, 0x0,1) failed: Device or resource busy
     2022-12-12T17:20:05.957342Z qemu-system-x86_64: vfio_region_write(0000:29:00.0:region3+0x109626, 0x0,1) failed: Device or resource busy
     ...
     Anybody have any ideas on what would cause the top-line failure? (A small script for summarizing these repeated errors is sketched after these posts.) Steve
  10. Did you check the VM log? Mine appears to start and stop normally, I just don't get any output from the video card. This must be a common problem. Any ideas anybody?
  11. Any ideas on why a 6.11.5 Win10 VM would seemingly start normally but the screens stay black?
  12. I suspect the power-down issue is related to the BIOS. I had the issue booting UEFI. Booting legacy mode is stable.
  13. I took another crack at it yesterday. As long as I boot in legacy mode, 6.11.5 comes up and I don't have the power-down issue I had booting in UEFI. The IOMMU groups in System Devices match what I see in 6.9.2. My Windows 10 VM seems to boot, since the log looks normal, but the screens are black. Performing a shutdown from the VM manager works as expected. My VM is set up to boot with SeaBIOS. I have a test Win10 VM that boots with OVMF, and that doesn't work either. tower-diagnostics-20221214-1150.v6.11.5.Win10ScreenBlack.zip
  14. Thanks for the reply. Your post made me realize that it was booting in UEFI mode. I could tell something was different because UnRaid's boot screens looked different, with smaller text, etc. It also explains why my hardware was grouped differently under System Devices. Thanks for the heads up! Good news, bad news: changing it to boot legacy just caused it to not find a bootable OS. I tried running the make-bootable script on both Mac and Windows, but it just wouldn't boot. The good news is that I was able to go back to the previous 6.9.1 version that did successfully boot, and Windows booted too! The emergency is over, but I still have some concerns/questions.
     1) I'm deathly afraid to update UnRaid, at least on this computer that I use for Windows. I have another UnRaid server I use as a file server that is on 6.11.5, and it has always updated successfully. To me it seems like a security issue to not update UnRaid, which is why I tried to force an upgrade (again). Should I be worried about this?
     2) I don't understand why my previous backup of 6.9.2 didn't boot Windows after failing on the 6.11.5 upgrade attempt. For me, it doesn't seem that going back to the backup is guaranteed to work, at least not when it comes to booting Windows. Am I wrong on this, or did I miss a technical detail?
     3) It sure would be nice to be able to use a separate flash device to attempt upgrades. That way I wouldn't have to write over my "good" flash drive to experiment. Is there a way to do this?
     4) It seems that as long as my computer doesn't have any more spontaneous power-downs, my issue was booting in UEFI mode. Do other people experience UEFI bugs like this? (A quick way to check which mode the flash is set up for is sketched after these posts.)
     I do love UnRaid and appreciate the support provided by the forums. Cheers!
  15. Here's my diagnostics from earlier in the day when it just had the Win10 VM issue. tower-diagnostics-20221212-1120.zip
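
Not part of the original posts: for the IOMMU-group question in post 3, here is a minimal Python sketch of one way to list the IOMMU groups on the Unraid host and confirm that the GT1030's video and audio functions ended up in their own groups. It only assumes the standard Linux sysfs layout under /sys/kernel/iommu_groups and that lspci is available on the server's terminal.

    #!/usr/bin/env python3
    # List IOMMU groups and the PCI devices in each one (run on the Unraid host).
    # /sys/kernel/iommu_groups is only populated when IOMMU is enabled in the
    # BIOS and the kernel, so an empty or missing directory is itself a clue.
    import os
    import subprocess

    GROUPS = "/sys/kernel/iommu_groups"

    def device_name(addr):
        # Ask lspci for a human-readable description; fall back to the raw address.
        try:
            out = subprocess.run(["lspci", "-s", addr], capture_output=True, text=True)
            return out.stdout.strip() or addr
        except FileNotFoundError:
            return addr

    for group in sorted(os.listdir(GROUPS), key=int):
        print("IOMMU group " + group + ":")
        for dev in sorted(os.listdir(os.path.join(GROUPS, group, "devices"))):
            print("   " + device_name(dev))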
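
Also not from the original posts: for the log comparison in post 4, a small sketch that diffs the two VM logs as text instead of screenshots, so the missing section shows up as removed lines. The two file names are placeholders for logs copied off each server.

    #!/usr/bin/env python3
    # Compare two VM log files line by line and print a unified diff, so the
    # section present only in the working boot stands out.
    import difflib
    from pathlib import Path

    # Placeholder file names; copy each server's VM log here before running.
    broken = Path("tower-win10.log").read_text().splitlines()
    working = Path("server2-win10.log").read_text().splitlines()

    for line in difflib.unified_diff(broken, working,
                                     fromfile="tower (not working)",
                                     tofile="server2 (working)",
                                     lineterm=""):
        print(line)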
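
For the repeated vfio_region_write failures in post 9, a hedged sketch that summarizes the errors by device, region, and reason rather than scrolling through thousands of near-identical lines. The log path is an assumption; libvirt normally writes VM logs under /var/log/libvirt/qemu/<VM name>.log.

    #!/usr/bin/env python3
    # Count repeated vfio_region_write failures in a qemu VM log and report
    # which PCI device and region they hit. The log path is a placeholder.
    import re
    from collections import Counter

    PATTERN = re.compile(r"vfio_region_write\((\S+?):region(\d+)\+0x[0-9a-f]+.*failed: (.+)")

    counts = Counter()
    with open("/var/log/libvirt/qemu/Windows 10.log") as log:
        for line in log:
            m = PATTERN.search(line)
            if m:
                device, region, reason = m.groups()
                counts[(device, region, reason)] += 1

    for (device, region, reason), n in counts.most_common():
        print(f"{n:6d}x  {device} region {region}: {reason}")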
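
Finally, for the UEFI-vs-legacy confusion in posts 3 and 14, a small sketch that reports how the flash drive is currently set up, based on the EFI vs. EFI- directory naming described in post 3. It assumes the flash is mounted at /boot, as on a booted Unraid server; the motherboard's own UEFI/CSM setting still decides what actually happens at power-on.

    #!/usr/bin/env python3
    # Report whether the Unraid flash drive (mounted at /boot) is set up for
    # UEFI boot (directory named "EFI") or legacy boot (renamed to "EFI-").
    import os

    FLASH = "/boot"  # assumed mount point of the Unraid flash drive

    if os.path.isdir(os.path.join(FLASH, "EFI")):
        print("Flash is configured for UEFI boot (EFI directory present).")
    elif os.path.isdir(os.path.join(FLASH, "EFI-")):
        print("Flash is configured for legacy boot (EFI- directory present).")
    else:
        print("No EFI or EFI- directory found under /boot; check the mount.")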