Rhynri

Members
  • Content Count

    55
Community Reputation

12 Good

About Rhynri

  • Rank
    Newbie

Recent Profile Visitors

776 profile views
  1. Geforce GPU Passthrough For Windows Virtual Machine (Beta) I was looking up something unrelated and stumbled upon this Nvidia link. This could potentially mean an end to Code 43 issues, although the text at the bottom makes me wonder whether you can pass through a primary card under this regime. Worth a test for a brave soul; I use my VM as a work machine and can't afford to brick it right this second, but I'll test once I have time if someone else hasn't by then.
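In the meantime, the workaround most people use for Code 43 is hiding the hypervisor from the Nvidia driver in the libvirt domain XML — a minimal sketch (the vendor_id value is arbitrary, anything non-default works):

```xml
<features>
  <hyperv>
    <!-- spoof the Hyper-V vendor ID so the driver doesn't detect a VM -->
    <vendor_id state='on' value='1234567890ab'/>
  </hyperv>
  <kvm>
    <!-- hide the KVM signature from the guest -->
    <hidden state='on'/>
  </kvm>
</features>
```

If the driver change in that Nvidia link lands, this fragment should become unnecessary.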
  2. Same here! We meet again, @testdasi. It's almost like we use unraid in similar manners.
  3. I recently switched from having disk images to passing through NVME controllers. This can drag your VM across NUMA nodes if the drive is on a different node from the rest of the VM hardware. I had a well-behaved VM (memory-wise) that now apportions a bit of its memory across nodes.
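For anyone seeing the same cross-node memory spread, libvirt can pin guest memory (and vCPUs) to the node that owns the passed-through controller — a minimal sketch, assuming the NVMe controller and host cores 0-3 sit on node 0:

```xml
<vcpu placement='static'>4</vcpu>
<cputune>
  <!-- pin each vCPU to a host core on the controller's node -->
  <vcpupin vcpu='0' cpuset='0'/>
  <vcpupin vcpu='1' cpuset='1'/>
  <vcpupin vcpu='2' cpuset='2'/>
  <vcpupin vcpu='3' cpuset='3'/>
</cputune>
<numatune>
  <!-- force all guest memory onto node 0 -->
  <memory mode='strict' nodeset='0'/>
</numatune>
```

The node and core numbers here are assumptions; check your own topology (e.g. with `lscpu` or `numactl --hardware`) before pinning.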
  4. @jonp - Am I correct in thinking that we have this in 6.8.0-rc3? Edit: Actually it looks like a lot of these "unscheduled" things are already present. Maybe time for some pruning?
  5. Must have just been a fluke on my part, I'll delete this.
  6. @SpaceInvaderOne - What is that screen you are using for your server name and youtube view count? It's super cool and I'd love to have one to display various home-automation data like power consumption and solar production, and real-time data from my Weatherflow station.
  7. Can confirm. I use a 2010 MBA as my media acquisition device and get 17-18 MB/sec over SMB to my server, which is great considering how old that wifi chip is.
  8. +1 for cloning+history for XML. Reverting states is a pain. @glennv - as a side note, GUI editor is getting better at not blowing away custom changes. Can't remember the critical OSX ones to tell you if it's blowing them up or not.
  9. Not sure if this got added since I last did a swap. If so, please disregard. I have my drives in external USB docks. I used to have them all in one stacked dock (all four), but then I had to spin up the whole thing just to access one. When I moved them, I had to do it one at a time because Unraid saw them as new drives (because of the different controllers), despite the fact it could easily see the serial number under drive info. Can we have Unraid suggest drives it has already seen in the array, in the correct positions, via serial numbers? I'm intending to move them all into the case, but
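The request above amounts to keying array slots on drive serial numbers rather than controller-dependent device IDs. A toy Python sketch of the matching logic (the names and data are illustrative, not Unraid internals):

```python
def suggest_slots(known: dict, detected: dict) -> dict:
    """Map detected drives back to their previous array slots by serial.

    `known` maps serial -> slot from the last valid array config;
    `detected` maps device node -> serial as currently enumerated.
    Drives with unknown serials are simply not suggested.
    """
    return {dev: known[serial]
            for dev, serial in detected.items()
            if serial in known}

# Illustrative data: one drive moved to a new dock, one brand-new drive.
known = {"WD-ABC123": "disk1", "WD-DEF456": "disk2"}
detected = {"/dev/sdb": "WD-DEF456", "/dev/sdc": "WD-XYZ999"}
print(suggest_slots(known, detected))  # {'/dev/sdb': 'disk2'}
```

The point is that the serial survives a controller swap even when the device path and controller-reported ID do not.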
  10. Do you still need to have the extra root hub verbiage in the xml?
  11. What made you decide to start making videos for the community?
  12. Yeah, I think I bricked my cache data trying to add the drive back in the wrong way too. >.<
  13. Hello! I'd like to submit a feature request for a setting that prevents the array from starting if there is an issue with the cache drive/array. I recently noticed that my motherboard was missing a molex power plug, so I shut down the system and popped the plug in. Somewhere along the way, I bumped a plug on my U.2-mounted NVME drive, loosening it just enough to take it offline. Upon starting Unraid, the array started as normal, but obviously the cache array was offline. My cache array is a BTRFS software RAID: Data, RAID0: total=1.11TiB, used=1.11TiB
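The requested guard could work by refusing to start the array when the cache pool reports missing members. A hedged Python sketch that checks `btrfs filesystem show` output for the "missing" marker btrfs prints when a pool device is absent (the sample output is illustrative):

```python
def cache_pool_healthy(fi_show_output: str) -> bool:
    """Return False if `btrfs filesystem show` reports missing devices.

    A sketch of the requested pre-start check, not Unraid code: btrfs
    prints a '*** Some devices missing' line for a degraded pool.
    """
    return "missing" not in fi_show_output.lower()

# Illustrative output from a degraded two-device pool:
degraded = """Label: 'cache'  uuid: 00000000-0000-0000-0000-000000000000
\tTotal devices 2 FS bytes used 1.11TiB
\tdevid    1 size 931.51GiB used 931.51GiB path /dev/nvme0n1p1
\t*** Some devices missing
"""

print(cache_pool_healthy(degraded))  # False -> refuse to start the array
```

In a real implementation the check would run before array start and surface a warning instead of silently bringing the array up without its cache.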
  14. Unraid has been an absolute lifesaver when it comes to managing my home tech infrastructure. I’ve consolidated so much into one system it’s not even funny. And the support you guys give to your users is unreal.
  15. @binhex - Found a solution for that scanning issue I had to manually lock Plex back to 1.14 for. (It was a while back.) Edit: The problem showed up in the logs only as:
      Jun 03, 2019 14:55:55.967 [0x151e50971740] WARN - Scanning the location /media/[Library Name] did not complete
      Jun 03, 2019 14:55:55.967 [0x151e50971740] DEBUG - Since it was an incomplete scan, we are not going to whack missing media
      One of my scanners (Hama) was silently failing on certain files. I had to put a bunch of debugs into the .py files to sort it, but once I realized tha
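For anyone chasing a similar silent scanner failure: the scanner framework tends to swallow per-file exceptions, so a small wrapper that logs and re-raises makes the failing file visible without scattering debug prints through the .py files. A sketch with illustrative names (`scan_fn` stands in for a scanner's entry point, not a real Plex API):

```python
import logging

logging.basicConfig(level=logging.DEBUG)
log = logging.getLogger("scanner-debug")

def debug_scan(scan_fn, path):
    """Call a scanner entry point and surface exceptions with the
    offending path, instead of letting them fail silently."""
    try:
        return scan_fn(path)
    except Exception:
        # log the full traceback plus the path that triggered it
        log.exception("scanner failed on %s", path)
        raise
```

Pointing this at each library location quickly narrows a "scan did not complete" warning down to the specific files the scanner chokes on.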