s449's Achievements
  1. About three times in the last year, my server has become randomly inaccessible. Plex is down, the webGUI is inaccessible, and my SuperMicro IPMI interface is inaccessible. I think it usually happens overnight. I finally got a spare monitor, so this time I plugged it into my machine to debug and saw just a black screen. I assume it's related to this: I do have an Nvidia 1060 GPU passed through to Plex for hardware encoding/decoding. But all of that debugging aside, the bigger issue is that when I push the physical power button on my Node 804, it does nothing. Usually when I initiate shutdown via the webGUI, I immediately hear some beeps and it goes down. But I've waited 10+ minutes after hitting the physical button: no beeps, no shutdown. I've pushed it a number of times and it doesn't seem to work. However, if I hold it for around 5 seconds, it does an unclean shutdown (like the power cord was ripped out). So it does seem to be connected somehow, but maybe not connected the right way? Maybe this is simply a hardware issue, or maybe something is keeping it stuck, like "[software] is preventing shutdown" on Windows. Any ideas? Thanks!
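One way to narrow down whether the button press even reaches the OS (versus a front-panel wiring problem) is to watch for the ACPI power-button event from a terminal. A rough sketch, assuming acpid and its `acpi_listen` tool are present on your build:

```shell
# Sketch: check whether the power-button press reaches the OS at all.
# If acpid is running and the header is wired correctly, pressing the
# button should produce a "button/power" event; silence suggests a
# hardware/front-panel issue rather than a stuck shutdown.
if pgrep -x acpid >/dev/null 2>&1; then
    echo "acpid is running; run 'acpi_listen', press the power button once,"
    echo "and watch for a button/power event line."
else
    echo "acpid is not running, so button presses may never reach the OS."
fi
```

If the event shows up but the clean shutdown still never starts, that points at something on the software side hanging the shutdown sequence instead.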
  2. Edit: Rewriting this entire post because I finally understand what happened. My original post mentioned how my dockers randomly had (?) before upgrading to 6.10.0, and so on. Light-bulb moment from finding this reddit comment on Google. Before updating, I also ran Cleanup Appdata and mass-deleted everything, as is my habit and as has never been an issue. I now vaguely remember one of the entries having dockerMan on it, which I didn't recognize at the time. I've since learned what that is, and learned my lesson the hard way. But no clue why that showed up today for the first time. Any ideas? Regardless, my /boot/config/plugins/dockerMan/templates-user/ folder is empty. I have a flash backup, but it's from March 2021, so it's missing some things. I have the Unraid.net My Servers plugin, but it's only giving me the option to download a flash backup from a time after I deleted the folder. Luckily I can just re-create these, but a few of them had some pretty unique variables and configs I'm worried I won't remember. Any chance of getting an older flash backup from My Servers? Any chance of some sort of data recovery I can do?
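For anyone wanting to guard against this in the future, one option is a small scheduled script (e.g. via the User Scripts plugin) that snapshots the templates folder with a dated filename, so a stray delete is recoverable from more than one point in time. A minimal sketch; the destination share in the example is hypothetical, adjust to taste:

```shell
# Sketch: tar up the dockerMan user templates with a dated filename so
# an accidental delete is recoverable.
backup_templates() {
    src=$1    # e.g. /boot/config/plugins/dockerMan/templates-user
    dest=$2   # e.g. /mnt/user/backups/dockerman (hypothetical share)
    mkdir -p "$dest"
    tar -czf "$dest/templates-$(date +%F).tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
}

# Example (run on the server):
# backup_templates /boot/config/plugins/dockerMan/templates-user /mnt/user/backups/dockerman
```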
  3. This might be an Unraid issue, but is there any way to hide the Location column, or in general to truncate columns instead of line-wrapping them? The UX feels more difficult with varying row heights.
  4. Not sure if this changed or it's something weird on my end, but I just installed this and by default my GitHub folder is owned by "65534". The folders inside it are part of the group "nobody", so I think something went weird, like a UID overflow. I ran: sudo chown -R nobody:users GitHub/ But after restarting the container, it just went back to 65534:users. I only saw this docker today in the Community Apps spotlight, but it looks like it hasn't been updated in a year, with no responses here. Is this project abandoned? Do you need someone to take it over?
  5. This seemed to stop working for me. I haven't changed anything, but the logs are now just:

Writing shreddit.yml file...
Writing praw.ini file...
Traceback (most recent call last):
  File "/usr/local/bin/shreddit", line 33, in <module>
    sys.exit(load_entry_point('shreddit==6.0.7', 'console_scripts', 'shreddit')())
  File "/usr/local/lib/python3.9/dist-packages/shreddit/app.py", line 44, in main
    shredder = Shredder(default_config, args.user)
  File "/usr/local/lib/python3.9/dist-packages/shreddit/shredder.py", line 28, in __init__
    self._connect()
  File "/usr/local/lib/python3.9/dist-packages/shreddit/shredder.py", line 80, in _connect
    self._r = praw.Reddit(self._user, check_for_updates=False, user_agent="python:shreddit:v6.0.4")
  File "/usr/local/lib/python3.9/dist-packages/praw/reddit.py", line 150, in __init__
    raise ClientException(required_message.format(attribute))
praw.exceptions.ClientException: Required configuration setting 'client_id' missing.
This setting can be provided in a praw.ini file, as a keyword argument to the `Reddit` class constructor, or as an environment variable.

Looking inside the Docker, my praw.ini file is empty. But my docker config is set up with all that information. Maybe it's not connecting somehow?
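For what it's worth, that traceback just means PRAW found no credentials when it loaded praw.ini. A populated file normally looks something like the sketch below; the `[shreddit]` section name and all values are placeholders/assumptions here, and the section name has to match the site name shreddit passes to `praw.Reddit`. If the container is supposed to template this file from the docker variables, an empty file points at that templating step failing:

```shell
# Sketch: what a filled-in praw.ini looks like. The section name
# "[shreddit]" and every value below are placeholders, not known-good
# settings for this container.
write_praw_ini() {
    cat > "$1" <<'EOF'
[shreddit]
client_id=YOUR_CLIENT_ID
client_secret=YOUR_CLIENT_SECRET
username=YOUR_REDDIT_USERNAME
password=YOUR_REDDIT_PASSWORD
EOF
}

# Example:
# write_praw_ini /path/to/praw.ini
```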
  6. Having the same issue. Came here to say so and saw this, what are the odds? I tried an Xbox Core Controller over Bluetooth to my Apple TV 4K, worked in Steam Big Picture menus just not in any games. I also tried touch controller on my iPhone, same deal. Interestingly mouse controls with the touch screen did work fine. Both scenarios do work when streaming from other devices on my network which leads me to believe it’s an issue with this container specifically. Everything else is working flawlessly for me, though. I’m really excited about this project.
  7. Awesome container, really excited to test this. I do have this issue on Safari; it doesn't happen in Firefox. I can still use the VNC, it's just in the way. No big deal, I can set up in Firefox and shouldn't need to touch it again. Thank you for making this!
  8. I read this a while back, and a few weeks ago I finally decided to give it a try. I'm the main user of my server and don't plan on sharing Dockers with people outside of my home (except for Plex). I used a reverse proxy for my own convenience. So it did make me wonder: why am I opening more ports and adding more steps, more points of failure, and an increased security risk, all for convenience? How convenient is a reverse proxy, really? After over a month of ditching the reverse proxy and exclusively using WireGuard to access my server from outside my network, I can confidently say it's been barely a thought. I would definitely recommend others consider giving it a try if your situation is similar. It's not at all annoying the few times I need to right-click > activate an icon in my taskbar, or open up the WireGuard app > toggle on. Hopefully one day I can add a pfSense router to my home network for another level of convenience, but for now I'm very happy simplifying my setup. I've even ditched Nextcloud for SyncThing because of it and have been very happy. So much less maintenance between ditching the two. I remember I used to waste entire afternoons debugging the reverse proxy on a few especially difficult Dockers and never succeeding. Thanks for this post!
  9. Wow! This is exciting. I've been interested in this for months. How does this compare to Calibre? And does it support sending books to Kindle over email? I wonder if this app is overkill for someone who reads maybe 1-3 books a year and doesn't follow any authors in particular. I just like the idea of sticking to the *arr ecosystem. Thanks for this! Excited to give it a try regardless.
  10. My whole backup solution is a bit messy right now. Currently I use Rclone, Google Drive, and BackBlaze B2. The way it works is that I have about 70GB of important data (documents, music/video projects, graphics, etc.) in a Google Drive folder on my Mac. If there's anything I'm done with, like a finished video project, it gets manually transferred to "cold storage" on Unraid and eventually up to B2 through Rclone (User Scripts plugin, once a day). But for security, I keep an offline version of Google Drive on my server and also back that up to B2. So basically, for my daily-use files: desktop Google Drive folder -> Google Drive cloud -> Unraid Google Drive share -> BackBlaze B2. And of course I also have more cold storage shares on Unraid that also go to BackBlaze B2.

What I'm thinking is skipping the Google Drive middle-man and just using Rclone from my Mac to Unraid shares. Maybe even making a "warm storage" intake share on Unraid so I can still manually move stuff to the cold storage shares. But how well would Rclone work for this? I know it's CLI only, which is fine, I'm comfortable with that, but I'm worried it won't be as robust. As in, I won't really know about errors or issues. I'm also worried about how it handles constant file changes, like if I'm working on a music project in a folder that Rclone touches and files get frequently created/deleted. Also, what would happen if an Rclone Mac -> Unraid sync happens at the same time as the Unraid -> B2 sync? Do they communicate with each other somehow?

Is this worth looking into? Any tips would be appreciated. I'm trying to de-Google and simplify a bit in general. How do you all manage your daily storage and cold storage backups?
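On the "how robust is it" worry: rclone can log to a file and can skip files that were modified recently, which covers most of the in-flight-project concern. A sketch of the Mac -> Unraid leg, assuming an rclone remote named `unraid:` that has been configured separately (the remote name and both paths are placeholders). As for the two syncs overlapping, rclone jobs don't coordinate with each other, so the usual approach is simply to schedule the Unraid -> B2 job well after the Mac -> Unraid job:

```shell
# Sketch: push a Mac folder to an Unraid share via a pre-configured
# rclone remote. "unraid:" and both example paths are placeholders.
sync_to_unraid() {
    src=$1
    dest=$2
    rclone sync "$src" "$dest" \
        --min-age 15m \
        --log-file "$HOME/rclone-sync.log" \
        --log-level INFO \
        --dry-run
}
# --min-age 15m  skips files modified in the last 15 minutes, so a music
#                project you're actively saving isn't copied mid-write.
# --log-file     keeps a persistent record of errors to review later.
# --dry-run      prints what would change; drop it once the runs look right.

# Example:
# sync_to_unraid "$HOME/Projects" unraid:warm-storage/projects
```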
  11. Hey, is there any way to run this with the OBS virtual webcam? I'm not seeing "Start Virtual Camera". I'm trying to use it to stream video files from my server through "VLC Video Source" into Discord. It sounds like no matter what, I'll need to run Discord + OBS on a VM and tunnel the stream to it. But I assume that setup would perform better, since with an OBS docker I can pass through my GPU (which is shared with Plex) and use the VM solely to upload the stream.
  12. Hello, lately I've been having an issue where some of my torrents from public trackers get stuck at 99.9% done. This then doesn't trigger the stop-ratio rule, so I end up endlessly uploading and wasting bandwidth. I try to be a good seeder, but not that good lol. The only manual solution I've found: if I right-click on the torrent and do "Force re-check", rTorrent will throw this error: And then it will crash: But once it's back up, it's instantly fixed: Does anyone know what's going on here? It seems like the torrent is actually done on rTorrent's end and it's just not being reported to ruTorrent? And therefore not reported to Sonarr for import.
  13. I set my media share to not touch the cache. But yesterday I thought, I don't remember why, let me set it to use the cache. Today I realized why. Mover doesn't seem to like the hard-linked files on the cache, and now they're stuck. Mover won't touch those files no matter how many times I invoke it. So I figured I could manually move the files in /mnt/. I know the hard links will break, but I'll just re-import with Sonarr/Radarr, no big deal. My question is, what's the right terminal command? I wanna make sure I don't mess anything up. Would I use: mv /mnt/cache/media /mnt/disk4/media ? Thanks for the help!
  14. Yes I did, finally! Thanks to this: Looks like you don't need the /mnt/user/ part of the path. I ended up adding some exclusions, and my full excluded-folders list is this: binhex-plex/Plex Media Server/Crash Reports/,binhex-plex/Plex Media Server/Logs/,binhex-plex/Plex Media Server/Media/,binhex-plex/Plex Media Server/Cache/,.DS_Store,*.log,*.log.*,*.tmp,.Recycle.Bin Adding that took my appdata backup from 4 hours to 15 minutes, and from 149 GB (compressed) to 11.1 GB (compressed).