Fraddles

Members
  • Posts: 8
  • Joined
  • Last visited


  1. Oh, I am well aware that I am playing with edge cases... Your driver works like a dream, far less hassle to set up than a plain Debian install... Kudos to you! Thanks! EDIT: I have resolved my issues with the new container... all working nicely.
  2. One small update... the newly released version of the wyoming-whisper container broke my config... Still debugging... Cheers.
  3. Sorry, some additional detail... I am using CUDA with the `rhasspy/wyoming-whisper` container to do STT, as part of HA's voice assistant setup. I did not install anything on the Unraid host, or in the container itself... I copied the required files from a Debian install I had been playing with before trying out Unraid. Details on that install can be found here: https://github.com/Fraddles/Home-Automation/tree/main/Voice-Assistant Most of the setup on that page is not required with Unraid and this driver, but as previously noted the cuDNN libraries are still needed. Not all of them, though; I just copied the four listed files into a subfolder in my docker appdata and mapped them into the container. There do seem to be a few options for 3rd-party containers with GPU support baked in, or I could build my own, but I prefer to use the 'official' containers where possible. Mapping a handful of static files in is not a difficult task. Cheers.
  4. I had the same issue with missing libraries when trying to use CUDA for Speech-To-Text with Whisper. The solution I used was to collect the missing library files from a machine that had them installed (at least some of them come from the `nvidia-cudnn` package) and then map them into the container (section out of my docker-compose file):
     ```yaml
         volumes:
           - /mnt/docker/appdata/CUDA/libcudnn_ops_infer.so.8.5.0:/usr/lib/x86_64-linux-gnu/libcudnn_ops_infer.so.8:ro
           - /mnt/docker/appdata/CUDA/libcudnn_cnn_infer.so.8.5.0:/usr/lib/x86_64-linux-gnu/libcudnn_cnn_infer.so.8:ro
           - /mnt/docker/appdata/CUDA/libcublas.so.11.11.3.6:/usr/lib/x86_64-linux-gnu/libcublas.so.11:ro
           - /mnt/docker/appdata/CUDA/libcublasLt.so.11.11.3.6:/usr/lib/x86_64-linux-gnu/libcublasLt.so.11:ro
     ```
     Cheers.
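     If you need to track those files down on a donor machine first, something like this can help. This is a sketch, assuming a Debian-based donor system with the NVIDIA libraries already installed; the paths and library names are just the usual install locations, so adjust as needed:
     ```shell
     # List the cuDNN / cuBLAS shared objects present on the donor machine
     find /usr -name 'libcudnn*so*' -o -name 'libcublas*so*' 2>/dev/null

     # Ask dpkg which installed package owns one of them
     # (only works if the library was installed via a .deb package)
     dpkg -S libcudnn_ops_infer.so.8
     ```
     Whatever `find` turns up is what you copy into your appdata folder and map into the container.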
  5. Not sure if it covers your use case, but for Home Assistant, to expose a single fixed IP and have a private network for containers to communicate, I did it like this:
     ```yaml
     version: '3.9'
     ###
     networks:
       hass:
       bond0:
         external: true
     ###
     services:
       homeassistant:
         container_name: Home-Assistant
         image: ghcr.io/home-assistant/home-assistant:stable
         networks:
           bond0:
             ipv4_address: 192.168.xx.xx
           hass:
         volumes:
           - /mnt/docker/appdata/HASS/Home-Assistant:/config
         restart: always
       ##
       whisper:
         container_name: Whisper
         image: rhasspy/wyoming-whisper
         networks:
           - hass
         volumes:
           - /mnt/docker/appdata/HASS/Whisper:/data
         command: --model small-int8 --language en --beam-size 5
         restart: always
       ##
       piper:
         container_name: Piper
         image: rhasspy/wyoming-piper
         networks:
           - hass
         volumes:
           - /mnt/docker/appdata/HASS/Piper:/data
         command: --voice en-us-ryan-medium
         restart: always
     ```
     All three containers can communicate via the `hass` network, and the HA container has a fixed IP that is exposed via macvlan. Although not in this example (and not yet migrated to Unraid), my InfluxDB stack uses a similar setup, with InfluxDB itself on a static macvlan IP, but the Adminer container exposed via port mapping on the host IP. Just add a `ports:` section as usual... Cheers.
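     For the port-mapped case mentioned at the end, the service just gets a `ports:` section alongside its network. A minimal sketch, with the caveat that this Adminer service definition is illustrative rather than my actual config (8080 is the `adminer` image's default HTTP port):
     ```yaml
       adminer:
         container_name: Adminer
         image: adminer
         networks:
           - hass          # private network, as in the example above
         ports:
           - "8080:8080"   # host port : container port, on the host IP
         restart: always
     ```
     The container then talks to its database over the private network, while the web UI is reachable on the host's IP at port 8080.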
  6. @nraygun I did not have them mounted, but yes, if they were mounted I would unmount them first. I am thinking of using a similar system to yours for offline backups once I get this all set up how I want it. Cheers.
  7. Having recently set up my first Unraid server in a Dell Vostro, I tested this...
     * StarTech 3-bay 3.5" in 5.25" trayless, and 2× 2-bay 2.5" trayless...
     * 5 bays connected to motherboard SATA ports, 2 bays connected to an ASM1062 mini-PCIe card.
     With 2 SSDs installed as a pool, and one 3.5" disk to allow me to start the array (all on mobo ports), I was happily able to swap disks in and out of the other four bays while benchmarking the disks I had on hand. I did not even bother to spin them down before ejecting them... Cheers.
  8. I would suggest that it is showing you the serial number of the RAID card, not the actual disks. You may need to use a tool from the card vendor (HP) to view the actual disk serial numbers, or you may be able to see them in the card's BIOS... Cheers.
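     As an alternative to the vendor tool, smartmontools can often query the physical disks sitting behind an HP Smart Array controller directly. A sketch, assuming the controller shows up as `/dev/sg0` and you want the first attached disk; both the device node and the disk index vary by system:
     ```shell
     # Show identity info (model, serial, firmware) for the first physical
     # disk behind an HP Smart Array controller. 'cciss' is smartmontools'
     # driver type for these cards; bump the index (cciss,1 ...) for the
     # other disks.
     smartctl -i -d cciss,0 /dev/sg0
     ```
     If that works, `smartctl -a` with the same options will give you full SMART data per disk as well.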