Everything posted by zionlion

  1. Hi UNRAIDers, I have an issue with my server and I don't really know how to solve it. My Unraid server has been running fine for more than a year, so in general everything is set up and functional. Of course you shouldn't change a running system, but stupid as I am, I had to add a FireWire PCIe card (cheap $20 model with a VIA chipset) to my server. This card introduced two issues:

     • I am unable to pass the card through to my Win 10 VM. The card works as expected when booting directly (bare metal) into the Win 10 VM.
     • The presence of the card prevents a complete shutdown of my PC from within Win 10. The power LED stays on and the case fans keep running.

     Regarding problem 1: I have tried different approaches to pass the card through. First, I opted for the vfio-pci.cfg, as the card is in its own IOMMU group (20, see attached picture) and this method already worked well for my GPU, NVMe and onboard USB controller. Unfortunately, with this method Unraid cannot boot at all. It seems to get stuck at the stage where vfio-pci.cfg is read (I think), but the syslog doesn't show any errors or hints towards the exact problem. The consequence is that the PC gets restarted and the Unraid thumb drive is no longer detected. I have to plug it into my laptop and remove the vfio-pci.cfg to get Unraid to boot again. (A sketch of what such a vfio-pci.cfg entry looks like is included after these posts.)

     The other thing I tried was to simply add the following to the Win 10 VM XML to pass the card through directly, without adding it to the vfio-pci.cfg:

         <hostdev mode='subsystem' type='pci' managed='yes'>
           <driver name='vfio'/>
           <source>
             <address domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
           </source>
           <address type='pci' domain='0x0000' bus='0x00' slot='0x09' function='0x0'/>
         </hostdev>

     This resulted in the server crashing (no freeze, just a hard restart) whenever I started the VM. Does anyone have experience with these cards and an idea why the passthrough might fail?

     Regarding problem 2: The FireWire card works nicely when booting bare metal into my Win 10 VM. I am using the default Win 10 driver (VIA 1394 OHCI Compliant Host Controller). As described, I am not able to completely shut down the computer from within the bare-metal Win 10. After shutdown, the power LED and the fans keep running, and the power button doesn't do anything in this state anymore; I can only do a long press to hard switch off the PC. This only happens with the bare-metal boot. Switching off the VM when running under Unraid is not a problem, and the Unraid server also switches off the PC completely. So I don't think it is a wrong setting in Win 10 or the BIOS (of course I also checked those settings). I have the feeling there is some driver or hardware issue with Win 10 in combination with the FireWire card that leads to the described behaviour. Additionally, I tried installing the legacy drivers (https://www.studio1productions.com/Articles/Firewire-1.htm), which have been described as helping with these cards, but for me they only led to blue screens.

     Has anyone used this type of FireWire card in a similar setting and could help me with these problems?

     Lastly, some basic info about my server:
     Unraid Plus, version 6.9.2
     Motherboard: ASRock X570 Phantom Gaming 4, BIOS version P4.20
     Processor: AMD Ryzen 7 3700X 8-Core @ 3.6 GHz
  2. I have the same issue. The reason some sites work and others don't is which of them support IPv6: my VM gets an IPv6 address, but no IPv4 address. Any ideas how to fix that?
  3. I am experiencing a similar problem with incorrectly displayed high CPU load and crashes of Unraid when starting or stopping my Win 10 VM. Can you tell me what your solution ("pcie_no_flr=1022:149c,1022:1487") does? What are these PCIe devices? I couldn't find anything about them in your diagnostic files. (A sketch of where such a kernel parameter would typically go is included after these posts.)
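
Note on the vfio-pci.cfg mentioned in post 1: below is a minimal sketch of what a binding entry for the card at PCI address 04:00.0 could look like. The exact syntax depends on how the binding is done (the older VFIO-PCI Config plugin versus the binding built into Unraid 6.9+), and the 1106:3044 vendor:device ID is only an assumption for a VIA-chipset FireWire controller, so treat this as an illustration rather than a verified config.

Older plugin-style entry in /boot/config/vfio-pci.cfg (PCI address only):

    BIND=0000:04:00.0

Unraid 6.9+ style entry (PCI address plus vendor:device ID, with 1106:3044 assumed here):

    BIND=0000:04:00.0|1106:3044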
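
Note on the "pcie_no_flr=1022:149c,1022:1487" parameter asked about in post 3: on Unraid, extra kernel parameters like this are normally appended to the boot line in /boot/syslinux/syslinux.cfg (Main > Flash > Syslinux Configuration in the web UI). The sketch below only shows where such a parameter would sit on the append line; whether pcie_no_flr is actually supported depends on the kernel build, and the two values are PCI vendor:device IDs quoted from the post (1022 is AMD's vendor ID), not something verified here.

    label Unraid OS
      menu default
      kernel /bzimage
      append pcie_no_flr=1022:149c,1022:1487 initrd=/bzroot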