[SOLVED] Unraid won't (re)boot - bad flash drive?


Recommended Posts

Until yesterday I ran version 6.8.3, but I had problems passing through an Nvidia GPU to VMs (all OSes) and had to do more than one hard reboot because the whole system stalled. So after seeing Spaceinvader One's new video on 6.9 beta 22 and its feature to isolate hardware under System Devices, I thought I'd give it a go. Yesterday evening a scheduled parity check ran; when it finished, I went to System Devices, isolated the GPU I want to pass through, and rebooted.

 

Halfway through the reboot it gets stuck (see attached image).

 

So I don't have any syslogs or other files to attach, but thankfully I did make a backup of my flash drive before upgrading. :)

 

I hope someone can tell me whether this is just a corrupt flash drive, and why it happened.

 

Some personal background: I am relatively new to Unraid, currently using my system only as a file server, and I'm also a novice in Linux.

 

All help is appreciated and have a great day.

IMG_0260.jpg

Edited by LeonardS
Link to comment

Maybe a stupid consideration, but if you isolate the GPU, shouldn't you expect to see nothing on screen after the vfio-pci message?

Isolation means that Unraid (the host) cannot use the isolated device. I'm quite sure that if you connect to Unraid from a second device on the same network, you will find that it's not stuck; it simply cannot use the isolated GPU.
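To confirm this from a second machine, you can SSH in and check which kernel driver owns the GPU: an isolated device shows `vfio-pci` instead of a display driver. A minimal sketch of the check is below; on a real box you would point the path at your actual device (e.g. `/sys/bus/pci/devices/0000:01:00.0`), but here a mock sysfs tree is built in a temp directory so the logic itself can be demonstrated anywhere. The PCI address `0000:01:00.0` is just an assumed example.

```shell
#!/bin/sh
# Sketch: determine which kernel driver is bound to a PCI device.
# Real path on an Unraid host would be /sys/bus/pci/devices/0000:01:00.0;
# here we mock the sysfs layout so the snippet runs anywhere.

mock=$(mktemp -d)
mkdir -p "$mock/drivers/vfio-pci" "$mock/devices/0000:01:00.0"
# sysfs exposes the bound driver as a symlink named "driver"
ln -s "$mock/drivers/vfio-pci" "$mock/devices/0000:01:00.0/driver"

DEV_PATH="$mock/devices/0000:01:00.0"   # swap in the real sysfs path
driver=$(basename "$(readlink "$DEV_PATH/driver")")
echo "Kernel driver in use: $driver"    # prints: Kernel driver in use: vfio-pci

rm -rf "$mock"
```

Equivalently, `lspci -nnk` on the host prints a "Kernel driver in use:" line per device, which is the quickest way to spot a vfio-bound GPU.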

Assuming you only have one GPU, try not binding it to vfio and instead pass the GPU BIOS in the XML of your virtual machines. Hopefully the GPU will then switch from the host to the guest and back again. (With my Nvidia Titan Black the GPU works when Unraid 6.8.3 boots and is correctly passed through to the VM, but it doesn't switch back to Unraid once the VM is shut down. That's not a real problem for me because I only use the VM, which is set to autoboot, and from inside the VM I can manage Unraid. So once the VM is shut down I shut Unraid down too, either with the mechanical button on my case or from my mobile on the same Wi-Fi network, since Unraid has an ethernet/Wi-Fi bridge attached. On another system running Manjaro Linux it works both ways: Manjaro --> VM and VM --> Manjaro.)
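For reference, passing the GPU BIOS in the VM's XML is done with a `<rom file='...'/>` element inside the GPU's `<hostdev>` entry in the libvirt domain definition. A sketch, where the PCI address `0000:01:00.0` and the vBIOS path are assumptions you would replace with your own:

```xml
<!-- GPU passthrough entry in the VM's libvirt XML (Unraid: VM > Edit > XML view) -->
<hostdev mode='subsystem' type='pci' managed='yes'>
  <source>
    <!-- host PCI address of the GPU: example value, replace with yours -->
    <address domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
  </source>
  <!-- path to a dumped vBIOS file: example path, replace with yours -->
  <rom file='/mnt/user/isos/vbios/gpu.rom'/>
</hostdev>
```

The ROM file is a dump of the card's vBIOS; supplying it lets the guest re-initialize a GPU that the host has already touched during boot.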

If you have two GPUs, use the primary one for the host (it can be the integrated one too; if so, set it as primary in the BIOS) and bind the secondary GPU to vfio, to be used for your VMs.

Edited by ghost82
  • Thanks 1
Link to comment
2 hours ago, ghost82 said:

Maybe a stupid consideration, but if you isolate the gpu, shouldn't you expect to read nothing after the vfio-pci?

Isolation means that unraid (the host) cannot use the isolated device; I'm quite sure that if you connect to unraid from a second device on the same network you will find that it's not stuck, it simply cannot use the isolated gpu.

That was not stupid at all, but quite logical. Indeed, I could connect from a second device on the network. After unselecting the devices under System Devices and clicking the 'Bind...' button again (there is no Unbind button, by the way), another reboot and it all works again. Thanks! :-))

 

3 hours ago, ghost82 said:

Assuming you only have 1 gpu, try to not bind the gpu to vfio and pass the gpu bios in the xml of your virtual machines

I have tried that, and I won't try it again.

 

Once again, thanks, much appreciated!!

Have a great day.

Link to comment
  • JorgeB changed the title to [SOLVED] Unraid won't (re)boot - bad flash drive?
