Guide: How To Use Rclone To Mount Cloud Drives And Play Files


DZMM

Recommended Posts

8 hours ago, thekiefs said:

 

I have that same structure but hard links don't work. I'm using Transmission.

 

transmission download location - /mnt/user/mergerfs/downloads/torrents/complete/movies/file.mkv

radarr media location - /mnt/user/mergerfs/gd/Movies/movie folder/file.mkv

 

Root folders in Radarr show /mnt/user/mergerfs/gd/Movies

 

Here are the rest of my settings:

[screenshots: Transmission download settings and Radarr settings]

 

Just confirming: are both /mnt/user paths mapped to the same /data path inside the containers, and do you have "Use Hardlinks instead of Copy" enabled in Radarr/Sonarr (under Settings --> Media Management)? Hard links can only work if both paths are on the same filesystem inside the container.

[screenshot: Radarr Media Management hardlink setting]
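A quick way to test this is from the download container's console: hard links only work when both paths sit on the same filesystem inside the container, so a failed link tells you the mappings point at different mounts. A minimal sketch, assuming the /data mapping above (the test filename and exact subfolders are just examples):

# run inside the container console; paths assume the /data mapping above
touch /data/downloads/torrents/complete/movies/linktest
ln /data/downloads/torrents/complete/movies/linktest /data/Movies/linktest
# "Invalid cross-device link" here means the two paths are on different
# filesystems inside the container, so hard links can never work
stat -c '%i' /data/downloads/torrents/complete/movies/linktest /data/Movies/linktest
# matching inode numbers confirm the hard link worked; clean up afterwards
rm /data/downloads/torrents/complete/movies/linktest /data/Movies/linktest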

Link to comment
On 3/25/2022 at 12:34 PM, Akatsuki said:

 

Your destination in the Flood download client config in Sonarr should be /user/mount_mergerfs/downloads/complete/sonarr/, I believe. That should hopefully get it sorted :)

Worked for me, thanks for the suggestion!

Link to comment

Hi guys, sorry if this is already answered, but I can't find anywhere in this topic how to add multiple service accounts so I can upload more than 750GB per day.

 

Is this part of the script okay, or how do you add multiple service account remotes to upload more than 750GB a day?

RcloneCommand="move" 
RcloneRemoteName="gdrive_vfs" 
RcloneUploadRemoteName="gdrive_upload_vfs1"
RcloneUploadRemoteName="gdrive_upload_vfs2"
RcloneUploadRemoteName="gdrive_upload_vfs3"
RcloneUploadRemoteName="gdrive_upload_vfs4"
RcloneUploadRemoteName="gdrive_upload_vfs5"
RcloneUploadRemoteName="gdrive_upload_vfs6"
RcloneUploadRemoteName="gdrive_upload_vfs7"
RcloneUploadRemoteName="gdrive_upload_vfs8"
RcloneUploadRemoteName="gdrive_upload_vfs9"
RcloneUploadRemoteName="gdrive_upload_vfs10"
LocalFilesShare="/mnt/user/local" 
RcloneMountShare="/mnt/user/mount_rclone" 
MinimumAge="15m" 
ModSort="ascending" 

 

Link to comment
5 hours ago, [email protected] said:

Hi guys, sorry if this is already answered, but I can't find anywhere in this topic how to add multiple service accounts so I can upload more than 750GB per day.

 

Is this part of the script okay, or how do you add multiple service account remotes to upload more than 750GB a day?

RcloneCommand="move" 
RcloneRemoteName="gdrive_vfs" 
RcloneUploadRemoteName="gdrive_upload_vfs1"
RcloneUploadRemoteName="gdrive_upload_vfs2"
RcloneUploadRemoteName="gdrive_upload_vfs3"
RcloneUploadRemoteName="gdrive_upload_vfs4"
RcloneUploadRemoteName="gdrive_upload_vfs5"
RcloneUploadRemoteName="gdrive_upload_vfs6"
RcloneUploadRemoteName="gdrive_upload_vfs7"
RcloneUploadRemoteName="gdrive_upload_vfs8"
RcloneUploadRemoteName="gdrive_upload_vfs9"
RcloneUploadRemoteName="gdrive_upload_vfs10"
LocalFilesShare="/mnt/user/local" 
RcloneMountShare="/mnt/user/mount_rclone" 
MinimumAge="15m" 
ModSort="ascending" 

 

Instructions are on GitHub

 

https://github.com/BinsonBuzz/unraid_rclone_mount
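For what it's worth, repeating RcloneUploadRemoteName like in the snippet above won't work - in bash each assignment simply overwrites the previous one, so only gdrive_upload_vfs10 would ever be used. The script keeps a single upload remote and rotates the service account .json per run instead; a simplified sketch of the idea (the counter and service_accounts paths here are assumptions - see the GitHub instructions for the real settings):

# one upload remote, many service account files - rotate the json each run
CounterNumber=$(cat /mnt/user/appdata/other/rclone/counter)   # e.g. 1..10; path is an example
SA="--drive-service-account-file=/mnt/user/appdata/other/rclone/service_accounts/sa_gdrive_upload${CounterNumber}.json"
rclone move /mnt/user/local/gdrive_vfs gdrive_vfs: $SA --min-age 15m
# then increment the counter, wrapping back to 1 after the last account,
# so each run uploads under a different account's 750GB/day quota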

 

 

Link to comment

Hello,

 

Thank you for your work.

It took me a week to get this far, but it still doesn't work for me.

I ran the script and my Google Drive files appear in /user/mount_mergerfs/gdrive and /user/mount_rclone/gdrive, but I can't find /user/mount_unionfs anywhere.

 

Please guide me on what to do next.

Link to comment
2 hours ago, tungndtt said:

Hello,

 

Thank you for your work.

It took me a week to get this far, but it still doesn't work for me.

I ran the script and my Google Drive files appear in /user/mount_mergerfs/gdrive and /user/mount_rclone/gdrive, but I can't find /user/mount_unionfs anywhere.

 

Please guide me on what to do next.

The files are in the right place - /user/mount_mergerfs/gdrive - as you are using mergerfs, not unionfs (mergerfs replaced unionfs in the current scripts).
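If you want to double-check which union filesystem you're actually running, something like this should show it (output varies a little by version):

findmnt -t fuse.mergerfs
# or: mount | grep mergerfs
# /mnt/user/mount_mergerfs/gdrive should be listed with fstype fuse.mergerfs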

Link to comment

Hello all.

 

First off, thanks for the amazing work on this.

 

I'm new to most of this. I have been trying to get the service account part to work but just don't know how to do it.
I'd appreciate it if someone could send me a DM and try to help me out.

I've been trying to get AutoRclone to work, but yeah...

 

 

-

What I want to do:

 

I've got the "normal" script running and working.

I want this "normal" script transferred over to service accounts.

 

 

Link to comment
13 hours ago, Logopeden said:

Hello all.

 

First off, thanks for the amazing work on this.

 

I'm new to most of this. I have been trying to get the service account part to work but just don't know how to do it.
I'd appreciate it if someone could send me a DM and try to help me out.

I've been trying to get AutoRclone to work, but yeah...

 

 

-

What I want to do:

 

I've got the "normal" script running and working.

I want this "normal" script transferred over to service accounts.

 

 

 

Have you tried reading the "Optional: create service accounts" section on the GitHub: https://github.com/BinsonBuzz/unraid_rclone_mount

Link to comment
On 2/12/2022 at 9:50 AM, DZMM said:

I've managed this weekend to successfully integrate a seedbox into my setup and I'm sharing how I did it. 

 

I've purchased a cheap seedbox because my Plex streams were taking up too much bandwidth - I've gone from a 1000/1000 --> 360/180 --> 1000/120 connection - so it's been a pain each day trying to balance the bandwidth and file-space demands of moving files from /local to the cloud against having enough bandwidth for Plex, backup jobs etc.

 

My setup now is:

 

1. Seedbox downloading to /home/user/local/nzbget and /home/user/local/rutorrent

2. rclone script running each minute to move files from /home/user/local/nzbget --> tdrive_vfs:seedbox/nzbget and sync files from /home/user/local/rutorrent --> tdrive_vfs:seedbox/rutorrent (torrent files need to stay for seeding)

3. added a remote path mapping in ***arr to look in /user/mount_mergerfs/tdrive_vfs for files in /home/user/local (thanks @Akatsuki)

 

[screenshot: remote path mapping settings]

 

It's working perfectly so far, as my local setup hasn't changed, with rclone polling for changes that have occurred in the cloud.

 

Here's my script - I've stripped out all the options as I don't need them:

#!/bin/bash

######################
### Upload Script ####
######################
### Version 0.95.5 ###
######################

# REQUIRED SETTINGS
ModSort="ascending" # "ascending" oldest files first, "descending" newest files first

# Add extra commands or filters
Command1="--exclude _unpack/**"
Command2="--fast-list"
Command3=""
Command4=""
Command5=""
Command6=""
Command7=""
Command8=""

# OPTIONAL SETTINGS

CountServiceAccounts="14"

####### END SETTINGS #######

echo "$(date "+%d.%m.%Y %T") INFO: *** Starting Core Upload Script ***"

####### create directory for script files #######
mkdir -p /home/user/rclone/remotes/tdrive_vfs

#######  Check if script already running  ##########
echo "$(date "+%d.%m.%Y %T") INFO: *** Starting rclone_upload script ***"
if [[ -f "/home/user/rclone/remotes/tdrive_vfs/upload_running" ]]; then
	echo "$(date "+%d.%m.%Y %T") INFO: Exiting as script already running."
	exit
else
	echo "$(date "+%d.%m.%Y %T") INFO: Script not running - proceeding."
	touch /home/user/rclone/remotes/tdrive_vfs/upload_running
fi

####### Rotating serviceaccount.json file #######

cd /home/user/rclone/remotes/tdrive_vfs/
CounterNumber=$(find . -name 'counter_*' | cut -c 11-12)   # extract N from ./counter_N
CounterCheck="1"
if [[ "$CounterNumber" -ge "$CounterCheck" ]];then
	echo "$(date "+%d.%m.%Y %T") INFO: Counter file found."
else
	echo "$(date "+%d.%m.%Y %T") INFO: No counter file found . Creating counter_1."
	touch /home/user/rclone/remotes/tdrive_vfs/counter_1
	CounterNumber="1"
fi
ServiceAccount="--drive-service-account-file=/home/user/rclone/service_accounts/sa_spare_upload$CounterNumber.json"
echo "$(date "+%d.%m.%Y %T") INFO: Adjusted service_account_file for upload remote to ${ServiceAccountFile}${CounterNumber}.json based on counter ${CounterNumber}."

#######  Transfer files  ##########

# Upload nzbget files
/usr/local/bin/rclone move /home/user/local tdrive_vfs: $ServiceAccount --config=/home/user/.config/rclone/rclone.conf --user-agent="external" -vv --order-by modtime,$ModSort --min-age 1m $Command1 $Command2 $Command3 $Command4 $Command5 $Command6 $Command7 $Command8 --drive-chunk-size=128M --transfers=4 --checkers=8 --exclude rutorrent/** --exclude deluge/** --exclude *fuse_hidden* --exclude *_HIDDEN --exclude .recycle** --exclude .Recycle.Bin/** --exclude *.backup~* --exclude *.partial~* --drive-stop-on-upload-limit --delete-empty-src-dirs  --log-file=/home/user/rclone/upload_log.txt

# Sync rutorrent files
/usr/local/bin/rclone sync /home/user/local/seedbox/rutorrent tdrive_vfs:seedbox/rutorrent $ServiceAccount --config=/home/user/.config/rclone/rclone.conf --user-agent="external" -vv --order-by modtime,$ModSort --min-age 1m $Command1 $Command2 $Command3 $Command4 $Command5 $Command6 $Command7 $Command8 --drive-chunk-size=128M --transfers=4 --checkers=8 --exclude *fuse_hidden* --exclude *_HIDDEN --exclude .recycle** --exclude .Recycle.Bin/** --exclude *.backup~* --exclude *.partial~* --drive-stop-on-upload-limit --log-file=/home/user/rclone/sync_log.txt

#######  Remove Control Files  ##########

# update counter and remove other control files

	if [[ "$CounterNumber" == "$CountServiceAccounts" ]];then
		rm /home/user/rclone/remotes/tdrive_vfs/counter_*
		touch /home/user/rclone/remotes/tdrive_vfs/counter_1
		echo "$(date "+%d.%m.%Y %T") INFO: Final counter used - resetting loop and created counter_1."
	else
		rm /home/user/rclone/remotes/tdrive_vfs/counter_*
		CounterNumber=$((CounterNumber+1))
		touch /home/user/rclone/remotes/tdrive_vfs/counter_$CounterNumber
		echo "$(date "+%d.%m.%Y %T") INFO: Created counter_${CounterNumber} for next upload run."
	fi

# remove dummy files and replace directories
rm /home/user/rclone/remotes/tdrive_vfs/upload_running
mkdir -p /home/user/local/seedbox/nzbget/completed
echo "$(date "+%d.%m.%Y %T") INFO: Script complete"

exit
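For the "running each min" part, a simple cron entry is all that's needed - the upload_running check file at the top of the script makes overlapping runs exit immediately. Something along these lines in crontab -e on the seedbox (the script path is an example):

# run the upload script every minute; the lock file stops overlapping runs
* * * * * /home/user/rclone/rclone_upload.sh >/dev/null 2>&1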

 

Do you have any solution? After I mount my drive, it takes up all my bandwidth every day.

It keeps downloading even when it's idle!

Link to comment
16 hours ago, Akatsuki said:

 

Have you tried reading the "Optional: create service accounts" section on the GitHub: https://github.com/BinsonBuzz/unraid_rclone_mount

Yes, but I kinda get stuck on the install part - unsure how to proceed (don't want to mess up).

 

Edit: Managed to create the service accounts and uploaded them to \appdata\other\rclone\service_accounts

 

At first I got 404 errors, which I gathered to be a permission error.

Then I changed the sharing setting so that anyone with the link can edit, but that only made another crypt folder.

 

Is there someone who can guide me any further?

Edited by Logopeden
Link to comment
58 minutes ago, belizejackie said:

Do you have any solution? After I mount my drive, it takes up all my bandwidth every day.

It keeps downloading even when it's idle!

Probably Plex - stop each Docker container one by one to see what's hammering your mount.
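Another way to narrow it down without stopping anything is to look at what has files open on the mount (assuming lsof is installed - note it can take a while on a large mount; the log path below is a placeholder for wherever your mount script's --log-file points):

# show processes with files open under the rclone mount
lsof +D /mnt/user/mount_rclone 2>/dev/null
# or watch the mount log to see which paths are being opened
tail -f /path/to/rclone/mount.log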

Link to comment

@DZMM

Hello

 

With a bit of tinkering I got service accounts to work.

Just wanted to let you know that when you use "My Drive" rather than shared drives, you get duplicated crypt folders (one from the old setup/the original owner, the other from the account the drive is shared with).

After some googling I found this is a known problem when transferring to a folder that is "shared with me".

 

Is there a way to stop this from happening? My skills aren't good enough to make the fix below work.

 

In the other forum I found this workaround:

"Instead of

rclone copy --drive-shared-with-me "Drive:example shared with me.txt" Drive:Backups

write

rclone copy "Drive,shared_with_me:example shared with me.txt" Drive:Backups

and all should be well."

 

 

Edited by Logopeden
Link to comment
2 hours ago, Logopeden said:

@DZMM

Hello

 

With a bit of tinkering I got service accounts to work.

Just wanted to let you know that when you use "My Drive" rather than shared drives, you get duplicated crypt folders (one from the old setup/the original owner, the other from the account the drive is shared with).

After some googling I found this is a known problem when transferring to a folder that is "shared with me".

 

Is there a way to stop this from happening? My skills aren't good enough to make the fix below work.

 

In the other forum I found this workaround:

"Instead of

rclone copy --drive-shared-with-me "Drive:example shared with me.txt" Drive:Backups

write

rclone copy "Drive,shared_with_me:example shared with me.txt" Drive:Backups

and all should be well."

 

 

Sorry, I didn't really understand the SA creation bit myself and I think I fluked it! I can't remember who added the instructions to the GitHub, as it wasn't me.

Link to comment
48 minutes ago, DZMM said:

Sorry, I didn't really understand the SA creation bit myself and I think I fluked it! I can't remember who added the instructions to the GitHub, as it wasn't me.

OK. Do you think this is something that can be fixed in the future?

If someone smarter than me can look at this, the link is:

 

https://github.com/rclone/rclone/issues/2675

 

It describes the problem that is happening with me as well.

 

Hope it helps :)

Link to comment

Hi all,

 

I'm hoping someone can help me; I'm having some issues getting this set up correctly.

The problem is that when I run the mount script, it looks like it works fine - the remote share is mounted - but nothing populates from Google Drive.

If I run the mount command from the terminal, the mount is added and the files/folders populate.

In either situation, when I do an rclone lsd on my remote in the terminal, I can see all my directories.

 

Has anyone experienced anything similar?

 

Thanks in advance.

Link to comment
2 hours ago, Gr0undfall said:

Hi all,

 

I'm hoping someone can help me; I'm having some issues getting this set up correctly.

The problem is that when I run the mount script, it looks like it works fine - the remote share is mounted - but nothing populates from Google Drive.

If I run the mount command from the terminal, the mount is added and the files/folders populate.

In either situation, when I do an rclone lsd on my remote in the terminal, I can see all my directories.

 

Has anyone experienced anything similar?

 

Thanks in advance.

Post your script and logs.

 

Also, are you running the script in the background?

Link to comment
  • 2 weeks later...

I just got a 2nd mount up and running, and it hit me that I may have a problem with restarts. If Docker containers have to be started after the mounts are established, how do you make sure both mounts have completed before starting the containers? Has anyone addressed this, or is it not a problem?
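For what it's worth, the mount script only starts its listed containers after it has verified the mount via a check file, so one way to handle two mounts is a small wrapper that waits for both before starting anything. A sketch, assuming the mountcheck filename the script uses (container names are examples):

#!/bin/bash
# wait until both rclone mounts have their check file before starting containers
while [[ ! -f /mnt/user/mount_rclone/gdrive/mountcheck ]] || \
      [[ ! -f /mnt/user/mount_rclone/tdrive/mountcheck ]]; do
    sleep 5
done
docker start plex sonarr radarr   # example container names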

Link to comment
