RichB

Everything posted by RichB

  1. Just checking in on the games/gameplay side of the gaming server question here... Some cool ones I've started to zero in on include Atlas (pirate-themed first-person survival, building, and exploring, with multiplayer capability), Mindustry, and Satisfactory, but I'm still looking. Myst-like first-person with puzzles rather than combat-centric gameplay is preferred. Bringing in the multiplayer co-op side, quests and battles from time to time would be OK, but could I play solo, maybe with an NPC/AI squad? The intent is a private server; I have no interest in hardcore griefers constantly killing, stealing, or destroying things. Thanks all.
  2. Been wondering about PokerTH, an open-source poker game. Not having Docker-specific experience, does this translate well into a good self-hosted game server app here? I searched and couldn't find anything related to a traditional playing-card game server. Thanks for all your work here!
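     For what it's worth, here's the kind of thing I was picturing, just as a very rough, untested sketch of running PokerTH's dedicated server in a plain Docker container. It assumes the Debian/Ubuntu pokerth-server package is still available, that the server binary is pokerth_server, and that it listens on TCP 7234; all of those are assumptions on my part that would need checking against the PokerTH docs.

        # Untested sketch: package name, binary name, and port 7234 are my assumptions.
        # A real setup would bake this into a small image instead of installing at container start.
        docker run -d --name pokerth \
          -p 7234:7234 \
          -v /mnt/user/appdata/pokerth:/root/.pokerth \
          ubuntu:22.04 \
          bash -c "apt-get update && apt-get install -y pokerth-server && pokerth_server"

     If something along these lines is a sensible fit for the Unraid/CA way of doing things, that's really what I'm asking.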
  3. Hope the lounge is the appropriate place for this post; if not, just let me know and apologies... I've just set up Unraid and am exploring all it can do. I'm interested in learning about and trying a self-hosted game server and see a few options in the CA apps. Wondering if the Docker approach is best for my setup, given it's a mini PC, versus a VM?
     My setup: an AMD Ryzen 7 5800H mini PC (8 cores, 16 threads, 3.2 to 4.4GHz) with integrated Radeon Vega 8 graphics (2GB memory I think, 8 graphics cores?); 64GB DDR4 RAM; ZFS array with 1 parity drive; cache pool with one 2TB NVMe cache drive and one 2TB Samsung SSD.
     I want to try to self-host a gaming server for solo play, with friends maybe joining infrequently; then maybe they don't for a while, but the game still keeps moving forward smoothly as I play solo. Ideally, they can rejoin later when they want, still be part of my group, and hop in where I'm at.
     I like puzzle games mostly (explore, then solve), plus casual tower defense and maybe some building/exploring/surviving, but not just an intense shoot-'em-up weapons fest. Not sure what games like this come to anyone's mind. I'm not a big gamer, so I don't have much recent background. I really liked the Myst series back in the day. Not into virtual RPG board games.
     Great options would have easy install/setup and some control to make things easier at the beginning (enemies, attacks, resources) so solo play isn't a slaughter or a grind while I figure things out. Looking for your success stories on Unraid and good matches based on the above. Thanks in advance! I really appreciate the community here.
  4. Follow up: TL;DR - The Ryzen 7 5800H mini PC can run ~4 simultaneous 1080p hardware transcodes before choking (higher-bitrate 1080p originals taken down to 1080p 8Mbps, not 4K transcodes), and 10+ direct play streams. The CPU was the constraint on the HW transcode count, not the GPU. If you really need to do many HW transcodes, go with Intel gen 12+ (with QuickSync) and 16 to 32GB RAM. You can get away with gen 8+ as well and may find better deals. Good luck.
     I was able to fire up a bunch of streams to test performance informally. The browser was on the same local network, latest version of Chrome. In each window I navigated to the Plex UI, picked a movie with a quality/bitrate higher than 1080p 8Mbps, and forced it to 1080p 8Mbps. I was able to get up to 4 hardware transcodes going before the attempt to fire up a 5th failed. I did not see any perceptible buffering, but I didn't run it for that long. The GPU got up to about 20-30% max, so it was CPU constrained, I think. The CPU was showing a bunch of red and orange, with the overall % at 70%+ and close to 100 at times. Another factor in CPU load may have been that the new Unraid build was creating the parity disk; I was seeing a lot of iowait on the CPU. I also have around 20 Docker containers running, and 1 VM that was off at the time, so this mini PC can handle a lot. A rough monitoring loop for this kind of test is sketched below.
     I also tried opening a bunch in direct play to see how many streams I could get. I got close to 10 1080p streams and then had to get back to work. For me, this inexpensive $330 mini PC (with 32GB RAM and a 1TB gen 3 NVMe) was a great little home lab PC to try things out on.
     PS: Don't tell the fanatics, but I got a USB 3 (gen 1) 8-bay Syba DAS enclosure and am running the full array from it, with SMART data working and no drops or errors so far. Fingers crossed. I had two 4-bay DAS enclosures, but the drive identification in the array slots was getting confused, and I think splitting across USB controllers was the issue. So far no problem with the new one. We shall see. Good luck all.
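     For anyone repeating this kind of informal test, this is roughly the monitoring loop I mean, run from an Unraid terminal while the streams are playing. It assumes the Radeon TOP plugin is installed so the radeontop binary exists on the host; just a sketch, not a polished tool.

        # Print a GPU usage snapshot and the CPU summary line every 5 seconds.
        while true; do
          date
          radeontop -d - -l 1 | tail -n 1      # one dump line of GPU engine usage
          top -bn1 | grep -i 'cpu(s)'          # overall CPU load, including iowait ("wa")
          sleep 5
        done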
  5. Well, I've found out that there are a multitude of potential issues in using Unraid with multiple USB DAS enclosures, so my little homelab has to do some growing up, it seems. I have one more trick up my sleeve, an 8-bay Syba USB 3 DAS enclosure, before I rule it out altogether. The nice thing about this one is that it's physically power-switched, so there's no soft power sync over USB from the PC. I'll mark this as the solution to let others see what a mess multiple USB enclosures can be. Will post follow-ups if my 8-bay enclosure does what I want. Thanks to those who replied.
  6. Follow-up weirdness, and don't poke at me for my homelab build... TL;DR: I think the USB 3 DAS enclosures are confusing the IDs/drives Unraid sees across the two enclosures. One DAS works fine and the UI acts normal with array disk assignments.
     I have a mini PC with two 4-bay USB 3 DAS enclosures, to leverage existing stuff and try this out. So I built with the first 4-bay and the four 4TB drives. That went nicely. I then decided to grab the 2nd DAS enclosure and its drives. I dropped the one 4TB parity drive out of the first DAS/array, plugged in the 2nd DAS, and migrated all the data to the shares on the now 3-drive/no-parity array. I also backed up key files onto another external WD 4TB USB drive, just in case. Now I'm ready to format the old drives, which were NTFS, and add them to the array.
     Here's where the weirdness begins. If I try to add drives from the original array to the new array setup, the UI gets wonky. It won't let me add some of the original 4TB drives into the array. When I unplug the 2nd DAS, all of the larger drives I migrated from drop off, of course, and magically all four of the 4TB drives in the 1st DAS get recognized without an issue. The moment the 2nd DAS gets plugged in, the UI doesn't allow me to add some of the 4TB drives to the array.
     I think this is getting into USB weirdness zone. Any ideas on how to make this work? Since both DAS enclosures are identical, could some IDs be identical, causing Unraid to get confused on drive IDs? Would a USB hub in front of one help make them unique? Please don't make fun of this goofiness; I'm trying to save some money versus buying a whole new build for $$$. Thanks for any advice, and sorry for the long story.
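     For anyone who wants to dig in, this is what I figured I'd run from the Unraid terminal to see whether the two identical enclosures are presenting clashing identifiers. These are standard Linux commands, nothing Unraid-specific, so treat it as a rough diagnostic sketch.

        ls -l /dev/disk/by-id/ | grep -v part    # each data disk should have its own unique id/serial entry
        lsblk -o NAME,MODEL,SERIAL,SIZE,TRAN     # duplicate or blank SERIAL values would be the red flag
        lsusb -t                                 # shows how the two enclosures hang off the USB controllers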
  7. I'm new to ZFS and Unraid, so I don't know the nuances. I just want to add the three 4TB drives that were in the functioning array back in alongside the new drives. They were each formatted individually as ZFS. The array device UI slots will not let me select the drives I want into each slot. It's forcing a new 8TB into the Disk 1 slot, and for some of the existing drives from the former array it won't let me add them in any slot. Sorry, I'm confused.
  8. FYI, I'm trying to add the original and 'new' drives back in the Array Devices section, and the UI is not letting me add some of them, or is forcing the 8TB drives into slots. Really weird behavior. Did I break the array? Do I have to create a new 1-drive array and then do another New Config, preserving no drive assignments (or cache only)? This feels like really weird behavior and is confusing.
  9. I had the original three 4TB drives in a ZFS array with a 4TB parity drive, so they are presently ZFS with data on them I don't want to lose. Then I decided I wanted to expand the array using more drives I had in a Windows DAS setup. I did New Config and removed the parity drive to speed the migration. I migrated the files, and now I want to build a new array with all the drives listed and make the 14TB my parity drive.
     So you're saying to add all of the drives I want into the array first (including the parity drive at the same time?), then format the ones that need it. Or should I add all but the 14TB parity, format them, then, when done, stop the array and add in the parity drive, which will build from scratch? Will this have any destructive effect on the existing three 4TB drives with data, presently formatted as ZFS? Does the disk number/order of the new array matter relative to the original slots held by the three 4TB drives? Thank you very much for your reply and help!
  10. I'm in the process of the first steps of building my unRAID server. So far I had created an array of three 4TB NAS drives with no parity drive (yet). I had a few other drives from a Win/NTFS DAS setup whose content I migrated over using Krusader. Then I wanted to take the older drives and add them to the array. What I did next: I stopped the existing array, enabled destructive mode in Unassigned Devices, and cleared the partitions on the old disks. I then went to Tools, New Config, and applied. I went back to Main and was going to format the new (old) drives I want to add to the array.
     Now they each have a format option next to them, and the prior drives in the array have a grayed-out 'Mount' button. When I go to format the drives to be added as ZFS, it asks me for a Pool Name. Should I have this value already, or is this a brand new pool name? This confused me, and I don't want to destroy the existing array drives' content in anything I do. Should I just make up a pool name and start formatting the new drives? How will this affect the drives that were in the old array config? I'm scared to break something that causes data loss on the three original drives in the array.
     Finally, in the original array the first three drives were the 4TB ones, in order sdb, sdc, sdd (now shown below with the grey Mount buttons). When I assign drives to slots above at the end, does the old-to-new slot order matter? I may have only selected to keep the cache assignments preserved in the New Config screen : ( If I can get past the format/ZFS pool name step, should I first add all the drives above to the array itself in any order as step 1, then, after confirming they're added, stop the array again and add the largest 14TB as the parity drive, or just do it all at once? Sorry for the long post; I'm just nervous, this being my first time adding new disks, and I don't want to lose anything. Thanks in advance!!!
     FYI: The UI seems to not allow me to put the 4TB disks back in their original slots in the array, which were Disk 1-3, corresponding to the sdb-sdd drives. It seems to force the unformatted 8TB below into the Disk 1 slot. Maybe I have to format the new ones first?
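     Side note: before I touch New Config again, I was going to confirm from the terminal that the three original ZFS disks still look intact. Treat this as a rough sketch; on Unraid each ZFS-formatted array disk is its own single-device pool, and with the array stopped these commands only list what the system can see, they don't change anything.

        lsblk -o NAME,SIZE,FSTYPE,LABEL /dev/sd[b-d]   # the original 4TB disks should show zfs_member partitions
        zpool import                                   # read-only scan listing pools available for import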
  11. Sonarr and MP4 Automator: how do I get post-Sonarr processing to optimize for direct play under this Docker? I'm new to Unraid and still have Python and MP4 Automator installed, running postsonarr.py, on my other machine. Is there a way to replicate that while keeping Unraid stock and not getting the setup wiped in future Docker updates? I've seen mentions of a Handbrake docker using a drop folder, but I'm not familiar with how the workflow goes from Sonarr to the completed downloads folder and then moves to the final TV folder tree with the show name. I'm a little confused by all this. My setup is a mini PC homelab with an AMD iGPU, so I'd love my ...arr files to be direct-play optimized from the get-go. A rough sketch of the hook I'm imagining is below. Thanks all, and sorry if I missed this above or elsewhere.
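     This is the kind of Connect > Custom Script hook I had in mind, just a sketch of the moving parts and not a drop-in replacement for MP4 Automator. The sonarr_eventtype and sonarr_episodefile_path variables are what Sonarr exports to custom scripts; the watch-folder path is made up and would have to match whatever the Handbrake container is configured to watch.

        #!/bin/bash
        # Sketch of a Sonarr custom script: copy each imported episode into a
        # drop folder that an automatic-conversion container watches.
        [ "$sonarr_eventtype" = "Download" ] || exit 0
        cp -n "$sonarr_episodefile_path" /mnt/user/transcode/watch/

     The missing piece, as I understand it, is getting the converted file back into the final TV folder tree with the show name, which is the part I'm really asking about.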
  12. This is a quick post to try to help those newer to Unraid, like me, set up a Plex container and enable hardware transcoding (HW TC) using a modern AMD Radeon iGPU, often found in mini PCs. I'm setting up a low-power home lab to learn and get some utility out of unRAID. I was doing my initial setup and walking through the steps to get HW TC to work. Key steps below:
     • Install the Radeon TOP plugin. (Optional, but suggested) Install the GPU Statistics plugin.
     • Install the Plex-Media-Server app (official from Plex). This is the only version I'm aware of that natively supports HW TC without other extra parameters and craziness. Ensure the repository is plexinc/pms-docker:plexpass. I believe Plex HW TC support is only available in the Plex Pass build; I have a lifetime membership, so I think the Plex server claim step ties back to that, possibly.
     • Go to the bottom of the template page, select '+ Add another Path, Port, Variable, Label or Device', and choose 'Device'. Enter in the Name field: AMD GPU. Enter in the Value field: /dev/dri:/dev/dri. Save, then Done when complete.
     • Go to the Plex UI, Server settings > General, and Check for Updates to make sure you have the latest. Then make sure Beta is selected under Server Update Channel and save, ensuring you're fully up to date with Beta selected.
     • I tend to like cycling Docker altogether when I make major changes, to ensure the instances all reflect them, so I go to Settings > Docker, set Enable Docker to No, Apply... wait for things to cycle down, then change it back to Yes, Apply, Done.
     • In your Plex server UI, go to Settings > Transcoder > 'Hardware transcoding device' and select your specific GPU.
     • Key step to verify: start a movie in Plex, then force a transcode by changing the playback settings to a different value. Then in the Plex UI go to Dashboard > Now Playing and expand the details; you will know HW TC is working by seeing '(hw)' at the end of the transcode line.
     At this point, I think those with modern AMD Ryzen iGPUs will have hardware transcoding enabled in an unRAID Plex docker container! For reference, my setup is: unRAID 6.12.4, AMD Ryzen 7 5800H based mini PC with a Radeon (Vega 8 based) GPU. Hope this helps others.
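     For anyone who prefers to see it outside the template UI, here is roughly what those settings boil down to as a plain docker run. It's a sketch, not the exact command Unraid generates; the claim token and the appdata/media host paths are placeholders you'd swap for your own.

        # --device /dev/dri passes the AMD iGPU through, which is what enables HW transcoding.
        # PLEX_CLAIM is a placeholder token from https://plex.tv/claim; adjust the host paths to your shares.
        docker run -d --name plex \
          --network host \
          --device /dev/dri:/dev/dri \
          -e PLEX_CLAIM="claim-XXXXXXXX" \
          -v /mnt/user/appdata/plex:/config \
          -v /mnt/user/media:/data \
          plexinc/pms-docker:plexpass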