Guide: How To Use Rclone To Mount Cloud Drives And Play Files


DZMM

Recommended Posts

1 hour ago, Spladge said:

 

[screenshot: My Drive - Google Drive storage usage]

 

Server-side renames and stability at scale. Otherwise, whatever works. I use an rclone VFS and mergerfs combo on my other servers, but not on unraid; that one is currently set up as per the guide in this thread.

 

Do you use teamdrives? I can't check my space used anymore. I don't have a PB like you though - insane! I think I'm around 250-350TB.

 

I'd love to ditch unionfs as well, both for better delete support and because it doesn't support hardlinks. Mergerfs does, but it isn't available on unraid, so I'm waiting for rclone union to be beefed up into an all-in-one solution.

Link to comment

Not really using teamdrives - and that figure includes a trash that hasn't been emptied, plus a whole bunch of messy duplicate files I am slowly trying to sort out. I wouldn't be surprised if I could delete a couple of hundred TB out of that.

 

I am using teamdrives, but looking at how I might organise things going forward. Half the stuff I have isn't even scanned into Plex or Emby, just sitting there for the sake of it.

 

To check space you can use an rclone command on the remote:
https://rclone.org/commands/rclone_size/
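
A minimal example, assuming a remote named gdrive: (substitute your own remote name):

# count objects and total size on the remote, or just one folder
rclone size gdrive:
rclone size gdrive:tv_shows

# where the backend supports it, this reports quota/usage without walking every file
rclone about gdrive: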

Edited by Spladge
Link to comment

Still don't get why it's better. I don't want to rename anything. And how's deleting better? Never had stability problems either. But maybe those come with some PBs?

 

If I understand my setup correctly, I don't even need that delete script, because everywhere my data lives is RW. And if it's RW, it can delete?!

 

But an all-in-one solution is of course probably best.

Edited by nuhll
Link to comment

I'm trying to jump into rclone headfirst. I have been using Unraid for years on an HP N54L, but it's time to upgrade to offsite storage. I have read your guide @DZMM and it's the most comprehensive guide on setting up rclone with G Suite I have found. It's weird that there are almost no complete guides on setting this up. That said, I would love to jump on a Discord screen share with someone with experience and do this in one hit.

Link to comment

Ok, I have got rclone up and running using @DZMM's awesome scripts, but I have several questions and things I would like to clarify.

 

1. How would I make rclone upload TV shows and movies located in /mnt/user/MULTIMEDIA/Movies and /mnt/user/MULTIMEDIA/TV Shows respectively that are older than 7 days?

 

2. How do I then point Sonarr and Radarr to /mnt/user/mount_unionfs/google_vfs/tv_shows? I assume some sort of symlink would need to be made; I'm running both these programs on Windows currently.

 

Sorry for the possibly stupid questions.

Link to comment

1. You use the provided upload script (change accordingly).

2. Read the tutorial: you need to combine your local storage with the remote storage via unionfs. The third directory is what you enter in your programs, which at that point no longer know whether a file is local or remote. No need for symlinks or anything - see the sketch below.

 

Just work through the tutorial step by step, and change things accordingly (like age, speed, directories). If you get stuck, ask, and say at which exact step you have problems. It's very hard to set up the first time. But it's worth it... (I mean, unlimited storage, hello? :D)
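
Roughly, the three directories look like this (paths taken from the guide; your share names may differ):

# 1. local staging - new files land here and are uploaded later (RW branch)
/mnt/user/rclone_upload/google_vfs

# 2. rclone mount of the cloud remote (RO branch)
/mnt/user/mount_rclone/google_vfs

# 3. merged unionfs view of both - point your programs (Sonarr/Radarr/Plex) here
/mnt/user/mount_unionfs/google_vfs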

Edited by nuhll
Link to comment
On 3/16/2019 at 1:08 PM, DZMM said:

If you're confident you won't be changing files, e.g. using Sonarr/Radarr, then yes you could use RW for everything. If not, RW on the mount would cause too many potential problems IMO.

Till now I haven't noticed any problems.

 

I do delete files from time to time, e.g. if a new version comes out. How should that be a problem? Shouldn't it just delete it on the remote, because it's RW, or locally, because it's also RW, and then just reupload the new version? (I'm just curious; it's running perfectly fine, I think.)

Link to comment
1 minute ago, Bolagnaise said:

Thank you for the assistance. Someone should make a video on this lol - maybe when I work it all out, I will.

 

At some point an all-in-one solution will come, it just takes time. I don't really understand it, but currently we're using unionfs and rclone; at some point rclone should include all of it. So it will be easier, I guess.

Link to comment
5 minutes ago, nuhll said:

At some point an all-in-one solution will come, it just takes time. I don't really understand it, but currently we're using unionfs and rclone; at some point rclone should include all of it. So it will be easier, I guess.

As time has shown repeatedly, if it is easy to do and becomes mainstream, say goodbye.

Link to comment
1 minute ago, shimlapinks said:

As time has shown repeatedly, if it is easy to do and becomes mainstream, say goodbye.

What should be goodbye? Google? Then we use another service; there will always be a way. And once your data is online at some point, the transfer isn't really the problem.

Edited by nuhll
Link to comment
On 11/30/2018 at 12:58 AM, DZMM said:

it doesn't matter as no real files exist on your array for those folders - just virtual files.

 

Just add the current folders to your unionfs mount and then move them manually behind the scenes to your rclone_upload folder, where they'll get uploaded over time, e.g.

 


# branches in priority order: local uploads (RW), rclone cloud mount (RO), existing local media (RO),
# merged into a single view at /mnt/user/mount_unionfs/google_vfs
unionfs -o cow,allow_other,direct_io,auto_cache,sync_read \
  /mnt/user/rclone_upload/google_vfs=RW:/mnt/user/mount_rclone/google_vfs=RO:/mnt/user/OLD_MEDIA_LOCATION=RO \
  /mnt/user/mount_unionfs/google_vfs

Then add the unionfs folders to Sonarr etc. You just have to make sure that the filepaths match up, e.g.

 

- /mnt/user/mount_rclone/google_vfs/TV Show 1

- /mnt/user/Old/Media_location/TV Show 1

I think this is exactly what I’m trying to achieve.

 

So for a large move I would manually move my TV show and movie folders from their current share /mnt/user/multimedia to /mnt/user/mount_rclone and then point Sonarr, Radarr, and Plex to the unionfs share. Does this bypass the filtering in the upload script? I only want to upload newer files that are 7 days old, but nothing before that, as I want to keep that in local storage.

Link to comment
13 hours ago, Bolagnaise said:

I think this is exactly what I’m trying to achieve.

 

So for a large move I would manually move my TV show and movie folders from their current share /mnt/user/multimedia to /mnt/user/mount_rclone and then point Sonarr, Radarr, and Plex to the unionfs share. Does this bypass the filtering in the upload script? I only want to upload newer files that are 7 days old, but nothing before that, as I want to keep that in local storage.

Add --min-age 7d to your rclone upload command 
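
For example, a sketch using the staging path from the guide and a hypothetical remote named gdrive: (substitute your own remote and paths):

# move only files older than 7 days; anything newer stays local until it ages in
rclone move /mnt/user/rclone_upload/google_vfs gdrive: --min-age 7d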

Link to comment
8 hours ago, DZMM said:

Add --min-age 7d to your rclone upload command 

Just to ask another question: how would I script it so that it would only upload files with a min age of 2 days but also a max age of 2 days? So it will only upload new files that fall within that min/max 2-day age window - or is that not possible?

Link to comment
Just now, Bolagnaise said:

Just to ask another question: how would I script it so that it would only upload files with a min age of 2 days but also a max age of 2 days? So it will only upload new files that fall within that min/max 2-day age window - or is that not possible?

What do you mean?

 

If you set the script to --max-age 2d then it will only upload files newer than 2 days; --min-age 2d is the opposite and only uploads files older than 2 days. You can combine both to upload just the files in between.
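
A sketch of the combined window, again with a hypothetical gdrive: remote - both filters stack:

# upload only files older than 2 days but newer than 7 days
rclone move /mnt/user/rclone_upload/google_vfs gdrive: --min-age 2d --max-age 7d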

Link to comment
1 minute ago, nuhll said:

If I'm correct it uses the "created" date. (If I download old Linux movies, they get uploaded even if I set it to 1y.)

I wouldn't bother uploading such fresh data.

Well, min age 30m means, if I understand it correctly, that all data that is at least 30 min old is uploaded. I ran this script and tested it, and it started to upload TBs of data, so I switched to max age 2d and it is now taking a long time to filter out files. I initially tried max age 7d and it was also taking a long time, so I thought I would reduce it to 2d for a quick test to see what it would upload. I'm currently waiting for the script to finish.

Link to comment
2 minutes ago, Bolagnaise said:

Well, min age 30m means, if I understand it correctly, that all data that is at least 30 min old is uploaded. I ran this script and tested it, and it started to upload TBs of data, so I switched to max age 2d and it is now taking a long time to filter out files. I initially tried max age 7d and it was also taking a long time, so I thought I would reduce it to 2d for a quick test to see what it would upload. I'm currently waiting for the script to finish.

Yes, that's correct. You can adjust the workers (threads) to speed it up, if you have spare CPU.
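
In rclone terms the "workers" are the --transfers and --checkers flags; a sketch with illustrative values (tune to your CPU and bandwidth; remote name hypothetical):

# run more parallel uploads and more parallel file checkers
rclone move /mnt/user/rclone_upload/google_vfs gdrive: --min-age 7d --transfers 8 --checkers 16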

Link to comment
1 hour ago, SoloLab said:

Hello, 

 

So, I have this set up - thank you very much.

 

But I have a question about where I create the subdirectories `movies` & `shows`, etc.

 

If I create those folders in the upload directory, they're deleted once the upload is done. Should I create them in the rclone mount instead?

Link to comment
