Guide: How To Use Rclone To Mount Cloud Drives And Play Files


DZMM


23 minutes ago, sdamaged said:

Guys, is this a working strategy to back up pictures and videos to my Google Drive?

This is for streaming files, not backing up. But yes, you could use it for moving your videos to the cloud - I'm not sure about moving photos as I haven't tried that yet.

On 7/11/2019 at 3:54 PM, sdamaged said:

Guys, is this a working strategy to back up pictures and videos to my Google Drive?

It is. I have stopped using Crashplan and switched to using Google. The only caveat is there's a hard limit of 400k files per tdrive (soft limit of about 200k, beyond which performance becomes a real pain). If you think a portion of your backup can go above 100k files, split it out into its own tdrive.

18 hours ago, testdasi said:

It is. I have stopped using Crashplan and switched to using Google. The only caveat is there's a hard limit of 400k files per tdrive (soft limit of about 200k, beyond which performance becomes a real pain). If you think a portion of your backup can go above 100k files, split it out into its own tdrive.

What performance problems have you seen beyond 200k files? I've just checked and I'm at around 70k items for my Plex media library tdrive, but I have another tdrive for backups which is probably over 200k files due to all the little system files.

 

06.07.2019 00:50:41 PLEX LIBRARY STATS
Media items in Libraries
Library = Movies
  Items = 16147

Library = TV - Adults
  Items = 34549

Library = TV - Kids
  Items = 19054

 


This is my consolidated video-files-only drive. It's a lot of stuff, and still 100,000 objects short of the teamdrive limit.
Music really starts to add up if you start collecting that too.

rclone size video: --fast-list
Total objects: 299984
Total size: 705.433 TBytes (775631319410200 Bytes)

Note that folders count as part of the file count, and so does stuff in the trash - I've hit the limit a few times because of items sitting in the trash.

Edited by Spladge
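(Side note, not from the post above: since trashed items count, you can check how much is sitting in the trash on a drive remote with something like the following - remotename: is a placeholder, as in the cleanup commands further down the thread.)

rclone size remotename: --drive-trashed-only --fast-list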
1 hour ago, Spladge said:

This is my consolidated video-files-only drive. It's a lot of stuff, and still 100,000 objects short of the teamdrive limit.
Music really starts to add up if you start collecting that too.


rclone size video: --fast-list
Total objects: 299984
Total size: 705.433 TBytes (775631319410200 Bytes)

Note that folders count as part of the file count, and so does stuff in the trash - I've hit the limit a few times because of items sitting in the trash.

Ahh, I didn't realise folders counted. My backup tdrive is further away from the limit than I thought:

 

root@Highlander:/mnt/user/public# rclone size tdrive_vfs: --fast-list
Total objects: 72443
Total size: 322.704 TBytes (354816683643653 Bytes)
root@Highlander:/mnt/user/public# rclone size tdrive_backup_vfs: --fast-list
Total objects: 80245
Total size: 8.307 TBytes (9133993530852 Bytes)

Edit: are you sure folders count as objects? My movie structure is /movies/movie_name, so each of my 16k movies has an associated folder - my object count should be over 80k, not 72k.

Edited by DZMM
21 minutes ago, DZMM said:

Ahh, I didn't realise folders counted. My backup tdrive is further away from the limit than I thought:

 


root@Highlander:/mnt/user/public# rclone size tdrive_vfs: --fast-list
Total objects: 72443
Total size: 322.704 TBytes (354816683643653 Bytes)
root@Highlander:/mnt/user/public# rclone size tdrive_backup_vfs: --fast-list
Total objects: 80245
Total size: 8.307 TBytes (9133993530852 Bytes)

Edit: are you sure folders count as objects? My movie structure is /movies/movie_name, so each of my 16k movies has an associated folder - my object count should be over 80k, not 72k.

I'm currently on a backup of 2TB with almost 400k files (370k)... I thought backing up my cache drive would be a good idea, forgetting that Plex appdata is a huge number of small files.

 

Currently also getting the limit exceeded error, so I'm pretty sure rclone doesn't count folders as objects, but Gdrive does.
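(One way to sanity-check that, assuming a remote named remotename: - count the directories separately and add them to the object count rclone size reports; the sum should be closer to what Gdrive counts against the limit.)

rclone lsf remotename: --dirs-only -R --fast-list | wc -l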

42 minutes ago, Kaizac said:

I'm currently on a backup of 2TB with almost 400k files (370k)... I thought backing up my cache drive would be a good idea, forgetting that Plex appdata is a huge number of small files.

 

Currently also getting the limit exceeded error, so I'm pretty sure rclone doesn't count folders as objects, but Gdrive does.

Thanks.

 

My tdrive_backup_vfs includes my Plex appdata backup, but I use the CA Backup plugin, so it's just one big tar file.

14 minutes ago, DZMM said:

Thanks.

 

My tdrive_backup_vfs includes my Plex appdata backup, but I use the CA Backup plugin, so it's just one big tar file.

Yeah, I do the same, but I thought backing up my whole cache was a nice addition. I was wrong ;).

 

On another note, I haven't had any memory problems for a few months now. Maybe rclone changed something, because I'm never running out of memory. Hope it's better for others as well.


Hi, at the moment I'm testing this as an alternative to using Stablebit CloudDrive. So far the setup appears to be working OK, except I'm having an issue with buffering. When playing a movie, for instance, I can see that the file plays in Plex and that it has a visible buffer, but after 30-odd minutes playback will stop. Pressing play again in Plex will make it play for another 3 seconds or so, then pause again.

 

Would this be an issue with my rclone mount itself, or a Plex Media Server issue?

 

Here is my current rclone mount command from my unRAID script:

 

Quote

rclone mount --allow-other --buffer-size 1G --dir-cache-time 72h --drive-chunk-size 512M --fast-list --log-level INFO --vfs-read-chunk-size 64M --vfs-read-chunk-size-limit off gdrive_media_vfs: /mnt/user/mount_rclone/google_vfs &

Any help would be appreciated

Thanks
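(A couple of side notes on that command: as far as I know --fast-list has no effect on mount, and --buffer-size is allocated per open file, so 1G can add up quickly with several streams open. A leaner sketch of the same mount - same remote and paths as in the post above, just trimmed flags:)

rclone mount --allow-other --buffer-size 256M --dir-cache-time 72h --drive-chunk-size 512M --log-level INFO --vfs-read-chunk-size 64M --vfs-read-chunk-size-limit off gdrive_media_vfs: /mnt/user/mount_rclone/google_vfs &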


Are you using a Shield to test playback by any chance?

The best way to test that I have found is using a wired PC with the PMP app: https://www.plex.tv/media-server-downloads/#plex-app

@Kaizac Try emptying the trash on your drives if you are getting the limit exceeded error
 

rclone delete remotename: --fast-list --drive-trashed-only --drive-use-trash=false -v --transfers 50

Add the "--dry-run" flag if you want to test-drive it first.
It may also be useful to dedupe, particularly video files.
 

rclone dedupe skip remotename: -v --drive-use-trash=false --no-traverse --transfers=50

And remove empty directories

 

rclone rmdirs remotename: -v --drive-use-trash=false --fast-list --transfers=50

Again, add --dry-run if you want to see what it will do first.
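(An addition of mine, not from the post above: rclone also has a built-in cleanup command, which on Google Drive empties the trash in one step.)

rclone cleanup remotename: -v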


Yes, the Shield had its networking broken in an update earlier this year - high-bitrate files in particular. There is a fix in the beta firmware, but I think they may have closed access to it for now. Hopefully that means it is shipping in the next update. I think I linked the page to apply for firmware access a few pages back if you want to try your luck.


@DZMM Thanks for all those scripts - it is greatly appreciated.

I played around a few months ago with Google Apps for Business in a similar setup, but using a Windows solution instead of rclone (Stablebit CloudDrive).

I ended up building my current unRAID server, but because it is growing exponentially in size, the cost of HDDs is becoming very high.

I am considering testing your solution, so I am super interested in your tutorial...

What is the purpose of the teamdrive? Do you use it and merge it into Plex at some point, or is it only there to provide a second 750GB/day allowance?

I was scared at the time that Google would one day enforce the per-user limit and that I would lose all my content. Do you have a backup strategy somewhere? I saw that you had a second Gdrive?

Thanks again for sharing all this stuff!

Edited by yendi
Just now, nuhll said:

Yes, he uses it to get around the limits.

 

But I don't see how a "normal user" would hit these limits (after the initial upload).

So it adds another 750GB on top of the initial user's allowance, and the teamshare itself is never used? So with this you could upload 1500GB a day? That is great!

I have 1000/400 fiber internet, and during my tests I was hitting this limit every day (initial upload).

2 hours ago, yendi said:

So it adds another 750GB on top of the initial user's allowance, and the teamshare itself is never used? So with this you could upload 1500GB a day? That is great!

I have 1000/400 fiber internet, and during my tests I was hitting this limit every day (initial upload).

No. The Team Drive is shared storage which multiple users have access to, so you get 750GB per day per user connected to it. It's not just extra capacity added to a specific account.
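(To make that concrete, a minimal sketch of two remotes pointing at the same Team Drive in rclone.conf - the team_drive ID and tokens below are placeholders, not from this thread. Uploads through tdrive_user1: count against user 1's 750GB/day; uploads through tdrive_user2: count against user 2's.)

[tdrive_user1]
type = drive
scope = drive
token = {user 1's oauth token}
team_drive = 0AAbbCCddEEffGGhh

[tdrive_user2]
type = drive
scope = drive
token = {user 2's oauth token}
team_drive = 0AAbbCCddEEffGGhh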

4 hours ago, Kaizac said:

No. The Team Drive is shared storage which multiple users have access to, so you get 750GB per day per user connected to it. It's not just extra capacity added to a specific account.

Quote

 tdrive: - a teamdrive remote.  Note: you need to use a different gmail/google account to the one above (which creates and shares the Team Drive) to create the token - any google account will do.  I recommend creating a 2nd client_id using this account

  • When I request a client_id + secret, should I request it from a different gmail account?
  • Will this remote always stay empty? I don't understand the purpose of this remote - could you please elaborate?

  • How do I do the initial upload? Copy media into /mnt/user/mount_unionfs/google_vfs/xxxxx/?
  • How do I see upload progress (do local files get deleted once the upload finishes, for example)? I have 40TB+ to upload, so I want to plan this:
    • Is there a way to make a symlink or something similar to do a continuous upload over a few weeks and keep my Plex as it is in parallel, so I could switch once everything has been uploaded?
  • How does rclone work with the cache? I have an SSD where all downloads go, and the cache is emptied every night. Should I disable the cache now?
  • Why is there a script for Radarr and not for Sonarr? I don't see where in the tutorial it is used.

Thanks for the help!

 

EDIT: I started right after work and I'm kinda stuck:

Here is my rclone config:

[gdrive]
type = drive
client_id = XXXXXXXXXXXXXXXXXXXXXXXX.apps.googleusercontent.com
client_secret = XXXXXXXXXXXXXXXXXXXXXXXX
scope = drive
token = {"access_token":"XXXXXXXXXXXXXXXXXXXXXXXXXX","token_type":"Bearer","refresh_token":"1/n-7ZOV5GTQUhOYNW_8txP2xIFciNSN6sOtCxjbvSbEQ","expiry":"2019-07-23T19:16:10.643780944+02:00"}

[gdrive_media_vfs]
type = crypt
remote = gdrive:crypt
filename_encryption = standard
directory_name_encryption = true
password = XXXXXXXXXXXXXXXXXXXXXXXXXXXXX
password2 = XXXXXXXXXXXXXXXXXXXXXXXXXXX

When I input the command for mountcheck:

root@Kanard:~# rclone copy mountcheck gdrive_media_vfs: -vv --no-traverse
2019/07/23 18:59:45 DEBUG : rclone: Version "v1.48.0-073-g266600db-beta" starting with parameters ["rcloneorig" "--config" "/boot/config/plugins/rclone-beta/.rclone.conf" "copy" "mountcheck" "gdrive_media_vfs:" "-vv" "--no-traverse"]
2019/07/23 18:59:45 DEBUG : Using config file from "/boot/config/plugins/rclone-beta/.rclone.conf"
2019/07/23 18:59:46 DEBUG : mountcheck: Size and modification time the same (differ by -365.632µs, within tolerance 1ms)
2019/07/23 18:59:46 DEBUG : mountcheck: Unchanged skipping
2019/07/23 18:59:46 INFO  : 
Transferred:             0 / 0 Bytes, -, 0 Bytes/s, ETA -
Errors:                 0
Checks:                 1 / 1, 100%
Transferred:            0 / 0, -
Elapsed time:        1.1s

2019/07/23 18:59:46 DEBUG : 5 go routines active
2019/07/23 18:59:46 DEBUG : rclone: Version "v1.48.0-073-g266600db-beta" finishing with parameters ["rcloneorig" "--config" "/boot/config/plugins/rclone-beta/.rclone.conf" "copy" "mountcheck" "gdrive_media_vfs:" "-vv" "--no-traverse"]

But then, when I try to start the mount script, I get an error:

23.07.2019 18:44:56 INFO: mounting rclone vfs.
2019/07/23 18:44:58 Fatal error: Can not open: /mnt/user/mount_rclone/google_vfs: open /mnt/user/mount_rclone/google_vfs: no such file or directory
23.07.2019 18:45:01 CRITICAL: rclone gdrive vfs mount failed - please check for problems.

I tried running just the rclone mount command, but I get the same path error.

 

If I manually create the path, I can run the command, but it hangs in the SSH window (I never get the prompt back), so I assume there is something fishy. Permissions issues?

 

Thanks

Edited by yendi
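(For anyone hitting the same two symptoms: rclone mount needs the mountpoint directory to exist before it starts, and by default it runs in the foreground until unmounted - that's why the prompt never comes back. A minimal sketch using the same paths as above:)

mkdir -p /mnt/user/mount_rclone/google_vfs
rclone mount gdrive_media_vfs: /mnt/user/mount_rclone/google_vfs --allow-other &

(rclone mount also has a --daemon flag if you'd rather let it background itself.)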
25 minutes ago, nuhll said:

So everything works perfectly; I'm just not sure if there is a problem, because I am seeing high download traffic when Plex, Radarr, Sonarr and NZBGet are doing nothing.

 

Is there any way to check what rclone is doing? (I've looked via SSH and it's definitely coming from rclone.)

Radarr or Sonarr may be updating the modified date on the file. To do this, the entire file is pulled locally, the time is updated, and then the file is uploaded again. I turned this functionality off in Radarr/Sonarr.

 

To answer your question though: enable verbose logging in rclone.

Edited by markrudling
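(In practice that means adding something like the flags below to the mount command - these are standard rclone flags, though the log path is only an example - then watching the log while Plex/Radarr/Sonarr are idle to see which files rclone is touching.)

--log-level DEBUG --log-file /mnt/user/appdata/other/rclone/rclone.log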
On 7/23/2019 at 2:43 PM, yendi said:
[...]

Post your mount script, please.
