sauso

Members
  • Content Count: 51
  • Joined
  • Last visited

Community Reputation: 9 Neutral

About sauso
  • Rank: Advanced Member

  1. Hey Mate. Yep it's still for sale. Shoot me a private message if you want.
  2. Hey guys, located in Geelong, Australia, I have a U-NAS 810A micro-ATX case with a 350W Gold Seasonic PSU. Only selling as I have upgraded to a full-depth rack case. This case is awesome and extremely space efficient. Pictures of the case can be found here: http://www.u-nas.com/xcart/product.php?productid=17640 Comes with 2x 120mm fans and 1x 60mm fan. Will also include SATA cables and an x16 PCIe extender. Asking for $300 plus postage. Will post some actual photos tomorrow.
  3. Sorry Kaizac, been offline for a couple of weeks. Mergerfs command below: mergerfs /mnt/user/Media:/mnt/user/mount_rclone/google_vfs/Media:/mnt/user/mount_rclone/nachodrive/Media /mnt/user/mount_unionfs/google_vfs/Media -o rw,async_read=false,use_ino,allow_other,func.getattr=newest,category.action=all,category.create=ff,cache.files=partial,dropcacheonclose=true
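For anyone reading along, here's my rough breakdown of what those mergerfs options do (my understanding only — check the mergerfs docs before copying):

```shell
# Branches: local /mnt/user/Media first, then the two rclone mounts, merged
# at the unionfs mount point (same paths as the command above).
#
# rw                    - mount the union read-write
# async_read=false      - synchronous reads (commonly advised for rclone mounts)
# use_ino               - let mergerfs supply its own inode values
# allow_other           - let non-root users access the FUSE mount
# func.getattr=newest   - stat() returns the newest version across branches
# category.action=all   - actions (chmod, rename, ...) apply to all branches
# category.create=ff    - create new files on the first branch (local disk)
# cache.files=partial   - page-cache mode that keeps mmap working
# dropcacheonclose=true - drop a file's page cache when it is closed
```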
  4. This isn't exactly Unraid related, but if anyone is interested, Google Cloud gives away $300 of credits to try it out. I spun up an Ubuntu server, put SABnzbd on it, and installed rclone pointing to my team drive (6 mounts with 6 different accounts shared to the one team drive). I set Radarr to point to my GCP SAB server and then wrote a simple post-processing shell script that uploads to one of the 6 mounts depending on the time. Based on the upload speed I'm getting from GCP to GDrive (around 45-50 MBps), I rotate the mount every 4 hours; that way I don't hit the 750GB per user per day limit. The shell script does an rclone move from my cloud VM to my Google Drive, and as the folder is mounted in the union on my local server, Radarr just hard links it (mergerfs FTW). Side note: I pause downloads during post-processing, as I found I sometimes had malformed downloads, and this started to build a post-processing queue which, if left unattended, would make my GCP server run out of disk space. Radarr then has a remote path mapping to translate my cloud mount path to the local path. Should be able to get through my backlog in a few days!
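The rotation logic is roughly this — a hypothetical sketch, where the remote names (gdrive1..gdrive6) and paths are placeholders, not my actual config:

```shell
#!/bin/sh
# Pick one of 6 rclone remotes based on the hour of day, switching every
# 4 hours, so no single account exceeds the 750GB/day upload limit.
pick_mount() {
  hour=$((10#$1))   # force base-10 so "08"/"09" aren't read as octal
  echo "gdrive$(( (hour / 4) % 6 + 1 ))"
}

# The post-processing step then moves the finished download, e.g.:
#   rclone move /downloads/complete "$(pick_mount "$(date +%H)")":/Media
```

Six remotes x 4-hour windows covers the full 24-hour day, so each account only ever gets one window's worth of uploads.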
  5. Try looking in the logs and see if you can see what's going on when it evaluates.
  6. Run the docker run command in the terminal and see what happens.
  7. Yeah, no idea how I missed it! I was getting decent times already, but now I'd even say it's faster than my spin-up time!
  8. Wow. I was also missing --vfs-cache-mode writes from my mount and have just seen a drastic improvement in playback with that one change. Also, after I fixed the docker not running in the script, everything is working perfectly. So happy this finally got done!
  9. Hey @DZMM, I seem to be having problems with the docker run in the script. It works perfectly if I run it manually, but from the script I get a TTY error. Removing the -it seems to make it work. Not sure if you are seeing the same issue.
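For anyone hitting the same error, this is my understanding of the fix: `docker run -t` asks for a TTY, which doesn't exist when the script runs non-interactively (e.g. from cron or the user scripts plugin). The container/image names below are placeholders, not from the actual script:

```shell
# Interactive terminal session - a TTY is available, so -it works:
#   docker run -it --name myapp myimage
#
# From a script or cron job - no TTY is attached, so drop -it
# (use -d to run detached instead):
#   docker run -d --name myapp myimage
```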
  10. Oh man, this will be awesome. I had a bunch of files in my drive whose modified dates I wanted to change, and it downloaded each file locally before changing it. Will give this a shot tomorrow.
  11. My egg will only work on Python 3.8 and Deluge 2; it won't work on 1.x.
  12. Oh man, they look awesome! Cheers!
  13. I just updated and mine's been stable for the last 10 mins. How many torrents are you seeding?