Akatsuki Posted March 28, 2022
8 hours ago, thekiefs said: I have that same structure, but hard links don't work. I'm using Transmission.
Transmission download location: /mnt/user/mergerfs/downloads/torrents/complete/movies/file.mkv
Radarr media location: /mnt/user/mergerfs/gd/Movies/movie folder/file.mkv
Root folders in Radarr show /mnt/user/mergerfs/gd/Movies. Here are the rest of my settings:
Just confirming, but are both /mnt/user paths mapped to /data, and do you have hardlinks enabled in Radarr/Sonarr (under Settings --> Media Management)?
thekiefs Posted March 28, 2022
3 hours ago, Akatsuki said: Just confirming, but are both /mnt/user paths mapped to /data...
Yes, to both.
kesm Posted March 28, 2022
On 3/25/2022 at 12:34 PM, Akatsuki said: Your destination in the Flood download client config in Sonarr should be /user/mount_mergerfs/downloads/complete/sonarr/ I believe. That should hopefully get it sorted.
Worked for me, thanks for the suggestion!
[email protected] Posted April 2, 2022
Hi guys, sorry if this is already answered, but I can't find anywhere in this topic how to add multiple service accounts so I can upload more than 750GB per day. Is this part of the script okay, or how do you add multiple service account remotes to upload more than 750GB a day?
RcloneCommand="move"
RcloneRemoteName="gdrive_vfs"
RcloneUploadRemoteName="gdrive_upload_vfs1"
RcloneUploadRemoteName="gdrive_upload_vfs2"
RcloneUploadRemoteName="gdrive_upload_vfs3"
RcloneUploadRemoteName="gdrive_upload_vfs4"
RcloneUploadRemoteName="gdrive_upload_vfs5"
RcloneUploadRemoteName="gdrive_upload_vfs6"
RcloneUploadRemoteName="gdrive_upload_vfs7"
RcloneUploadRemoteName="gdrive_upload_vfs8"
RcloneUploadRemoteName="gdrive_upload_vfs9"
RcloneUploadRemoteName="gdrive_upload_vfs10"
LocalFilesShare="/mnt/user/local"
RcloneMountShare="/mnt/user/mount_rclone"
MinimumAge="15m"
ModSort="ascending"
DZMM (author) Posted April 2, 2022
5 hours ago, [email protected] said: how can I add multiple service accounts so I can upload more than 750GB per day?...
Instructions are on GitHub: https://github.com/BinsonBuzz/unraid_rclone_mount
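For what it's worth, repeating RcloneUploadRemoteName= like that can't work in bash: each assignment overwrites the previous one, so only the last remote would ever be used. A minimal sketch of the behaviour (the remote names are just examples from the post above):

```shell
#!/bin/bash
# In bash, assigning to the same variable repeatedly keeps only the last value -
# the earlier "remotes" are silently discarded.
RcloneUploadRemoteName="gdrive_upload_vfs1"
RcloneUploadRemoteName="gdrive_upload_vfs2"
RcloneUploadRemoteName="gdrive_upload_vfs10"
echo "$RcloneUploadRemoteName"   # prints gdrive_upload_vfs10 - the others are gone
```

That's why the approach shown later in this thread rotates a single upload remote through different service-account .json files with a counter, rather than defining ten remote-name variables.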
[email protected] Posted April 2, 2022 (edited)
17 hours ago, DZMM said: Instructions are on GitHub: https://github.com/BinsonBuzz/unraid_rclone_mount
Edit: I get it now - I have to set up AutoRclone first to make it work. Thank you.
Edited April 3, 2022 by [email protected]
thekiefs Posted April 3, 2022 (edited)
On 3/27/2022 at 11:51 PM, thekiefs said: Yes, to both.
Any other ideas on fixing hard links? I'm stuck. It also affects Sonarr.
Edited April 3, 2022 by thekiefs
DZMM (author) Posted April 3, 2022
1 hour ago, thekiefs said: Any other ideas on fixing hard links? I'm stuck...
Dunno, maybe try /user? Maybe using /mnt/user does something odd.
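One quick way to rule things out is to create a hard link by hand and compare inode numbers: hard links only work within a single filesystem, so if the download and media paths resolve to different filesystems (or the container path mappings split them), ln will fail or the *arr apps will silently fall back to copying. A minimal sketch - the directory here is a temporary stand-in for your real share:

```shell
#!/bin/bash
# Create a file and a hard link to it, then compare inode numbers.
# Identical inodes mean both names point at the same data on disk,
# i.e. hard links work on this filesystem.
dir=$(mktemp -d)                      # stand-in for e.g. a path under /mnt/user
echo test > "$dir/original.mkv"
ln "$dir/original.mkv" "$dir/linked.mkv"
ino1=$(stat -c %i "$dir/original.mkv")
ino2=$(stat -c %i "$dir/linked.mkv")
if [ "$ino1" = "$ino2" ]; then
    echo "hardlink OK"
else
    echo "hardlink FAILED"
fi
rm -r "$dir"
```

Running the same test with the file created in the torrent path and the link created in the media path would show whether the two locations can share an inode at all.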
tungndtt Posted April 4, 2022
Hello, thank you for your work. It took me a week to get this far, but it still doesn't work for me. I've now run the script and my Google Drive files appear in /user/mount_mergerfs/gdrive and /user/mount_rclone/gdrive, but I can't find /user/mount_unionfs anywhere. Please guide me on what to do next.
DZMM (author) Posted April 4, 2022
2 hours ago, tungndtt said: ...I can't find /user/mount_unionfs anywhere...
The files are in the right place - /user/mount_mergerfs/gdrive - as you are using mergerfs, not unionfs (mergerfs replaced unionfs).
tungndtt Posted April 4, 2022
2 hours ago, DZMM said: The files are in the right place - /user/mount_mergerfs/gdrive...
That means I should point Plex's libraries at mergerfs instead of unionfs, right?
DZMM (author) Posted April 4, 2022
8 minutes ago, tungndtt said: That means I should point Plex's libraries at mergerfs instead of unionfs, right?
Yes.
neow Posted April 8, 2022
Hello, I saw that Google's business suite is changing. Do you think it will impact storage for Plex, and the price? Thanks.
Logopeden Posted April 10, 2022
Hello all. First off, thanks for the amazing work on this. I'm new to most of this. I've been trying to get the service account part to work, but I just don't know how. I'd appreciate it if someone could send me a DM and try to help me out. I've been trying to get AutoRclone to work, but no luck so far. What I want to do: I've got the "normal" script running and working, and I want to transfer this "normal" setup over to service accounts.
Akatsuki Posted April 11, 2022
13 hours ago, Logopeden said: I've been trying to get the service account part to work, but I just don't know how...
Have you tried reading the "Optional: create service accounts" section on the GitHub page? https://github.com/BinsonBuzz/unraid_rclone_mount
belizejackie Posted April 11, 2022
On 2/12/2022 at 9:50 AM, DZMM said: This weekend I managed to successfully integrate a seedbox into my setup, and I'm sharing how I did it. I purchased a cheap seedbox as my Plex streams were taking up too much bandwidth; I've gone from a 1000/1000 --> 360/180 --> 1000/120 connection, so it's been a pain trying to balance the bandwidth and file-space requirements of moving files from /local to the cloud each day while keeping enough bandwidth for Plex, backup jobs, etc.
My setup now is:
1. Seedbox downloading to /home/user/local/nzbget and /home/user/local/rutorrent
2. rclone script running each minute to move files from /home/user/local/nzbget --> tdrive_vfs:seedbox/nzbget and sync files from /home/user/local/rutorrent --> tdrive:seedbox/rutorrent (torrent files need to stay for seeding)
3. Added a remote path to the *arrs to look in /user/mount/mergerfs/tdrive_vfs for files in /home/user/local (thanks @Akatsuki)
It's working perfectly so far, as my local setup hasn't changed, with rclone polling locally for changes that have occurred in the cloud.
Here's my script - I've stripped out all the options as I don't need them:
#!/bin/bash
######################
### Upload Script ####
######################
### Version 0.95.5 ###
######################

# REQUIRED SETTINGS
ModSort="ascending" # "ascending" oldest files first, "descending" newest files first

# Add extra commands or filters
Command1="--exclude _unpack/**"
Command2="--fast-list"
Command3=""
Command4=""
Command5=""
Command6=""
Command7=""
Command8=""

# OPTIONAL SETTINGS
CountServiceAccounts="14"

####### END SETTINGS #######

echo "$(date "+%d.%m.%Y %T") INFO: *** Starting Core Upload Script ***"

####### create directory for script files #######
mkdir -p /home/user/rclone/remotes/tdrive_vfs

####### Check if script already running ##########
echo "$(date "+%d.%m.%Y %T") INFO: *** Starting rclone_upload script ***"
if [[ -f "/home/user/rclone/remotes/tdrive_vfs/upload_running" ]]; then
    echo "$(date "+%d.%m.%Y %T") INFO: Exiting as script already running."
    exit
else
    echo "$(date "+%d.%m.%Y %T") INFO: Script not running - proceeding."
    touch /home/user/rclone/remotes/tdrive_vfs/upload_running
fi

####### Rotating serviceaccount.json file #######
cd /home/user/rclone/remotes/tdrive_vfs/
CounterNumber=$(find -name 'counter*' | cut -c 11,12)
CounterCheck="1"
if [[ "$CounterNumber" -ge "$CounterCheck" ]]; then
    echo "$(date "+%d.%m.%Y %T") INFO: Counter file found."
else
    echo "$(date "+%d.%m.%Y %T") INFO: No counter file found. Creating counter_1."
    touch /home/user/rclone/remotes/tdrive_vfs/counter_1
    CounterNumber="1"
fi
ServiceAccount="--drive-service-account-file=/home/user/rclone/service_accounts/sa_spare_upload$CounterNumber.json"
echo "$(date "+%d.%m.%Y %T") INFO: Adjusted service_account_file for upload remote to sa_spare_upload${CounterNumber}.json based on counter ${CounterNumber}."
####### Transfer files ##########
# Upload nzbget files
/usr/local/bin/rclone move /home/user/local tdrive_vfs: $ServiceAccount \
    --config=/home/user/.config/rclone/rclone.conf --user-agent="external" -vv \
    --order-by modtime,$ModSort --min-age 1m \
    $Command1 $Command2 $Command3 $Command4 $Command5 $Command6 $Command7 $Command8 \
    --drive-chunk-size=128M --transfers=4 --checkers=8 \
    --exclude rutorrent/** --exclude deluge/** \
    --exclude *fuse_hidden* --exclude *_HIDDEN --exclude .recycle** --exclude .Recycle.Bin/** \
    --exclude *.backup~* --exclude *.partial~* \
    --drive-stop-on-upload-limit --delete-empty-src-dirs \
    --log-file=/home/user/rclone/upload_log.txt

# Sync rutorrent files
/usr/local/bin/rclone sync /home/user/local/seedbox/rutorrent tdrive_vfs:seedbox/rutorrent $ServiceAccount \
    --config=/home/user/.config/rclone/rclone.conf --user-agent="external" -vv \
    --order-by modtime,$ModSort --min-age 1m \
    $Command1 $Command2 $Command3 $Command4 $Command5 $Command6 $Command7 $Command8 \
    --drive-chunk-size=128M --transfers=4 --checkers=8 \
    --exclude *fuse_hidden* --exclude *_HIDDEN --exclude .recycle** --exclude .Recycle.Bin/** \
    --exclude *.backup~* --exclude *.partial~* \
    --drive-stop-on-upload-limit \
    --log-file=/home/user/rclone/sync_log.txt

####### Remove Control Files ##########
# update counter and remove other control files
if [[ "$CounterNumber" == "$CountServiceAccounts" ]]; then
    rm /home/user/rclone/remotes/tdrive_vfs/counter_*
    touch /home/user/rclone/remotes/tdrive_vfs/counter_1
    echo "$(date "+%d.%m.%Y %T") INFO: Final counter used - resetting loop and created counter_1."
else
    rm /home/user/rclone/remotes/tdrive_vfs/counter_*
    CounterNumber=$((CounterNumber+1))
    touch /home/user/rclone/remotes/tdrive_vfs/counter_$CounterNumber
    echo "$(date "+%d.%m.%Y %T") INFO: Created counter_${CounterNumber} for next upload run."
fi

# remove dummy files and replace directories
rm /home/user/rclone/remotes/tdrive_vfs/upload_running
mkdir -p /home/user/local/seedbox/nzbget/completed

echo "$(date "+%d.%m.%Y %T") INFO: Script complete"
exit

Do you have a solution? After I mount my drive, it takes up all my bandwidth every day - it keeps downloading even when it's idle!
Logopeden Posted April 11, 2022 (edited)
16 hours ago, Akatsuki said: Have you tried reading the "Optional: create service accounts" section on the GitHub page?...
Yes, but I kind of get stuck on the install part - unsure how to proceed (don't want to mess things up).
Edit: I managed to make the SAs and uploaded them to \appdata\other\rclone\service_accounts. At first I got 404 errors, which I gathered to be a permission problem. Then I changed the sharing setting so that anyone with the link can edit, but that only created another crypt folder. Is there someone who can guide me further?
Edited April 12, 2022 by Logopeden
DZMM (author) Posted April 11, 2022
58 minutes ago, belizejackie said: Do you have a solution? After I mount my drive, it takes up all my bandwidth every day...
Probably Plex - stop each docker one by one to see what's hammering your mount.
Logopeden Posted April 12, 2022 (edited)
@DZMM Hello, with a bit of tinkering I got service accounts to work. Just wanted to let you know that when you use "My Drive" rather than Shared Drives, you get duplicated crypt folders (one from the old setup/the original owner, the other from the account the drive is shared with). After some googling I found this is a known problem when transferring to a folder that is "shared with me". Is there a way to stop this happening? My skills aren't good enough to apply the fix below. In the other forum I found this workaround:
Instead of
rclone copy --drive-shared-with-me "Drive:example shared with me.txt" Drive:Backups
write
rclone copy "Drive,shared_with_me:example shared with me.txt" Drive:Backups
and all should be well.
Edited April 12, 2022 by Logopeden
DZMM (author) Posted April 12, 2022
2 hours ago, Logopeden said: ...when you use "My Drive" rather than Shared Drives, you get duplicated crypt folders...
Sorry, I didn't really understand the SA account creation bit myself and I think I fluked it! I can't remember who added the instructions to the GitHub, as it wasn't me.
Logopeden Posted April 12, 2022
48 minutes ago, DZMM said: Sorry, I didn't really understand the SA account creation bit myself...
OK. Do you think this is something that could be fixed in the future, if someone smarter than me looked at it? The link is https://github.com/rclone/rclone/issues/2675 - it describes the same problem I'm having. Hope it helps.
DZMM (author) Posted April 12, 2022
3 hours ago, Logopeden said: Do you think this is something that could be fixed in the future...
I've never had this problem (carefully) transferring files between shared drives.
Gr0undfall Posted April 14, 2022
Hi all, I'm hoping someone can help me - I'm having some issues getting this set up correctly. The problem is that when I run the mount script, it looks like it works fine and the remote share is mounted, but nothing populates from Google Drive. If I run the mount command from the terminal, the mount is added and the files/folders populate. In either situation, when I do an rclone lsd on my remote in the terminal, I can see all my directories. Has anyone experienced anything similar? Thanks in advance.
DZMM (author) Posted April 14, 2022
2 hours ago, Gr0undfall said: ...when I run the mount script... nothing populates from Google Drive...
Post your script and logs. Also, are you running the script in the background?
southloven Posted April 28, 2022
I just got a second mount up and running, and it hit me that I may have a problem with restarts. If docker containers have to be started after the mounts are established, how do you make sure both mounts have completed before starting the dockers? Has anyone addressed this, or is it not a problem?
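One common approach - a sketch, assuming your mount script drops a check file inside each mount once it is live (DZMM's script creates a "mountcheck" file for exactly this purpose) - is to have the docker-start step poll for that file in every mount before launching any containers:

```shell
#!/bin/bash
# Block until every listed directory contains the "mountcheck" file
# that the mount script creates once the rclone mount is live.
wait_for_mounts() {
    for dir in "$@"; do
        until [ -f "$dir/mountcheck" ]; do
            sleep 5    # re-check every 5 seconds until this mount appears
        done
    done
}

# Example usage (the paths and container names are illustrative):
# wait_for_mounts /mnt/user/mount_mergerfs/gdrive_vfs /mnt/user/mount_mergerfs/tdrive_vfs
# docker start plex sonarr radarr
```

Because the function only returns once every listed mount has its check file, the docker start line can't run against an empty mount point, no matter which of the two mounts comes up last.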