Guide: How To Use Rclone To Mount Cloud Drives And Play Files


DZMM

Recommended Posts

1 hour ago, Kaizac said:

@DZMM just noticed my other mounts were not working while the main one (which has --rc) does. When I run the mount command on its own it works. Was that also what you were experiencing and why you removed --rc?

 

EDIT: it seems it was glitching out and not all mount commands came through. So I put a sleep between every mount, rebooted, and it works now. I am always getting a docker daemon error on startup though. I've already put in a sleep of 60 seconds but that doesn't seem to solve it. Don't you have this issue?

 

Strangely enough, it seems I just got API banned and neither API works for streaming. So maybe the ban happens at the TD level and not at the API/user level.

How many mounts do you have?  I just have one rclone team drive mount and 1 for unionfs - you only need to mount the team drive once. 

 

I ditched --rc as it only helps populate the dir-cache a little faster after mounting, but it was causing problems with remounts that I couldn't be bothered to work out, as the benefits are small.  Yes, a 5-10s sleep after each mount helps avoid any problems.
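
As an aside, a minimal sketch of what staggered mount commands might look like in an Unraid user script (the remote names and mount paths below are placeholders, not DZMM's actual values):

# mount the team drive once, then pause so the next mount doesn't race it
rclone mount --allow-other gdrive_media_vfs: /mnt/user/mount_rclone/google_vfs &
sleep 10

# second mount (e.g. a personal drive), again followed by a short sleep
rclone mount --allow-other gdrive_personal: /mnt/user/mount_rclone/google_personal &
sleep 10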

 

I'm doing OK with my 4 uploads to one team drive for about 40 hours so far - I've done way more than 750GB to the one TD, and more than 750GB/account, so I think your problems are probably because of your bad mounts.

Link to comment
12 minutes ago, francrouge said:

I did what you told me and Plex seems to like that.

 

Plex has scanned 2100 movies so it's good and it continues.

 

thx for help

It was probably restarting Plex in the act of changing to slave, as per my suggestion, that fixed your problem - not the change to slave itself.

Link to comment
1 hour ago, DZMM said:

How many mounts do you have?  I just have one rclone team drive mount and 1 for unionfs - you only need to mount the team drive once. 

 

I ditched --rc as it only helps populate the dir-cache a little faster after mounting, but it was causing problems with remounts that I couldn't be bothered to work out, as the benefits are small.  Yes, a 5-10s sleep after each mount helps avoid any problems.

 

I'm doing OK with my 4 uploads to one team drive for about 40 hours so far - I've done way more than 750GB to the one TD, and more than 750GB/account, so I think your problems are probably because of your bad mounts.

I have 3 mounts: Gdrive personal drive, Gdrive Team Drive and Gdrive Team Drive for Bazarr. I've just tested it with another of my APIs and it also has problems playing media on Gdrive. Some media plays (but it's more recent media, not sure why that matters). So it seems that bans happen at the Team Drive level and not at the API/user level.

 

That's really unfortunate because Bazarr is a great piece of software, but getting banned at random is really bad.

Link to comment

Well, you can be (API) banned on the upload and the download side. Once you get banned on one side the other side still works, which is nice. But I think my bans happened because I'm running both Emby and Bazarr, which are both analyzing files to get subtitles. So I've disabled Emby for subtitles now.

 

Another explanation could be that my several reboots and remounts, which all used --rc and cached the directory, caused problems because Google was seeing too many directory listings.

 

What happens in my case now is that I just can't play video files - it throws an error that it can't open the file. At around 00:00 PST I think it's reset and everything works again.

 

For now I will keep my API for playback and my API for Bazarr separate, since it does distribute the API hits. But I'm afraid the bans happen based on the files themselves and not so much on users/APIs (the quotas are very hard to hit anyway).
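
As an illustration, a hypothetical rclone.conf excerpt with two remotes pointing at the same team drive but using separate Google API client IDs (all names, IDs and tokens below are placeholders):

# two remotes for the same team drive, each with its own client ID/secret
# so Bazarr's traffic is counted separately from playback traffic
[gdrive_playback]
type = drive
client_id = playback-client-id.apps.googleusercontent.com
client_secret = playback-client-secret
team_drive = TEAM_DRIVE_ID
token = {"access_token":"..."}

[gdrive_bazarr]
type = drive
client_id = bazarr-client-id.apps.googleusercontent.com
client_secret = bazarr-client-secret
team_drive = TEAM_DRIVE_ID
token = {"access_token":"..."}

Bazarr would then be pointed at a mount of gdrive_bazarr while Plex/Emby use gdrive_playback, so each application's hits show up under its own client ID in the Google API console.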

Link to comment

Last problem I see is this:

2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 3: Removing directory
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 3: Failed to Rmdir: remove /mnt/user/Archiv/Serien/Silicon Valley/Staffel 3: directory not empty
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 2: Removing directory
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 2: Failed to Rmdir: remove /mnt/user/Archiv/Serien/Silicon Valley/Staffel 2: directory not empty
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 1: Removing directory
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 1: Failed to Rmdir: remove /mnt/user/Archiv/Serien/Silicon Valley/Staffel 1: directory not empty
2018/12/23 19:05:20 DEBUG : Silicon Valley: Removing directory
2018/12/23 19:05:20 DEBUG : Silicon Valley: Failed to Rmdir: remove /mnt/user/Archiv/Serien/Silicon Valley: directory not empty

 

It tries to remove every directory and fails... why? Why does it try to remove non-empty directories? I understand that it removes empty directories after uploading, but why at every run? (It is also creating a lot of directories at each run?!)

Edited by nuhll
Link to comment
3 hours ago, nuhll said:

Last problem I see is this:

2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 3: Removing directory
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 3: Failed to Rmdir: remove /mnt/user/Archiv/Serien/Silicon Valley/Staffel 3: directory not empty
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 2: Removing directory
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 2: Failed to Rmdir: remove /mnt/user/Archiv/Serien/Silicon Valley/Staffel 2: directory not empty
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 1: Removing directory
2018/12/23 19:05:20 DEBUG : Silicon Valley/Staffel 1: Failed to Rmdir: remove /mnt/user/Archiv/Serien/Silicon Valley/Staffel 1: directory not empty
2018/12/23 19:05:20 DEBUG : Silicon Valley: Removing directory
2018/12/23 19:05:20 DEBUG : Silicon Valley: Failed to Rmdir: remove /mnt/user/Archiv/Serien/Silicon Valley: directory not empty

 

It tries to remove every directory and fails... why? Why does it try to remove non-empty directories? I understand that it removes empty directories after uploading, but why at every run? (It is also creating a lot of directories at each run?!)

I think that's just the wording for --delete-empty-src-dirs, which isn't great - rather than saying 'directory not empty, I'm not doing anything' the command logs this.  Have you looked in the directories listed to verify it's not a problem?
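
For reference, a typical upload call that produces those DEBUG lines might look like this (the remote name here is a placeholder; the source path comes from the log above):

# move files to the team drive, then try to prune source directories that are
# now empty; directories that still contain files just log
# "Failed to Rmdir ... directory not empty" at DEBUG level and are left alone
rclone move /mnt/user/Archiv gdrive_td: --delete-empty-src-dirs --log-level DEBUG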

Link to comment
1 hour ago, Kaizac said:

@DZMM did you ever experiment with syncing 2 team drives? I can't test it right now because of my ban, so I will do that tomorrow.

What I'm thinking is to run a full duplicate of the main TD. I'm curious if transfers are near instant and don't count towards the daily quota.

 

Why do you want to sync TDs? I'm running with just one, with 4 user uploads, and it's going well with no issues at all after nearly 2 days running.

Link to comment
Just now, DZMM said:

Why do you want to sync TDs? I'm running with just one, with 4 user uploads, and it's going well with no issues at all after nearly 2 days running.

Because I want to have a failover when I get API banned. Then I can at least still access my files, whereas now I just can't watch anything which is quite annoying.

Link to comment
10 minutes ago, Kaizac said:

Because I want to have a failover when I get API banned. Then I can at least still access my files, whereas now I just can't watch anything which is quite annoying.

I moved about 70TB in 24 hours, i.e. more than the 10TB outgoing limit, so internal Google transfers don't count towards the outbound transfer quota.

 

I'm not sure how you'd automate keeping teamdrives in sync, though.  I think something went wrong with your setup, e.g. something downloading for analysis in the background, or something trying to connect too fast.  Have you checked the API console to see if there are any clues?

Link to comment
2 minutes ago, DZMM said:

I moved about 70TB in 24 hours, i.e. more than the 10TB outgoing limit, so internal Google transfers don't count towards the outbound transfer quota.

 

I'm not sure how you'd automate keeping teamdrives in sync, though.  I think something went wrong with your setup, e.g. something downloading for analysis in the background, or something trying to connect too fast.  Have you checked the API console to see if there are any clues?

I know the internal transfers don't count towards the quota, but I wonder if it's the same for copying. Tomorrow I will test it with rclone sync and see how fast it is.
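
For anyone testing this, a basic sketch of the command (remote names are placeholders) would be along these lines:

# one-way copy of the main team drive into a second "failover" team drive;
# note that depending on the rclone version the data may be pulled down and
# re-uploaded rather than copied server-side, which is why it can end up
# counting against the upload quota
rclone sync gdrive_main_td: gdrive_failover_td: --fast-list --log-level INFO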

 

I think the ban I got today was just an accident because of all the remounting and such. But Bazarr still generates such high API hits (about 200k per day) that it's just a risk. Animosity over on the rclone forums also looked at my logs before, and all we can see is that files just get analyzed and opened and closed many times. And it's known that Google doesn't like that behaviour. Plex also risks the same behaviour, since Animosity saw that in his own logs as well.

Link to comment

Unfortunately copying/syncing TD to TD is counted as uploading so that's not an option.

 

@DZMM what are your current cron timings on the upload scripts? Did you also create a script for each upload user/API or did you put it all in one script? I'm getting rate limited and I think it's because I'm running an upload script every 20 minutes, which generates a lot of API hits due to the checkers.

Link to comment
12 minutes ago, Kaizac said:

Unfortunately copying/syncing TD to TD is counted as uploading so that's not an option.

 

@DZMM what are your current cron timings on the upload scripts? Did you also create a script for each upload user/API or did you put it all in one script? I'm getting rate limited and I think it's because I'm running an upload script every 20 minutes, which generates a lot of API hits due to the checkers.

I added a sleep 30m before deleting the upload check file in the script to prevent too many checks. The upload script itself gets called every 5 minutes.

 

# wait 30 minutes, then remove the dummy check file so overlapping cron runs exit early
sleep 30m

rm /mnt/user/appdata/other/rclone/rclone_upload

### upload script end
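
For context, a fuller sketch of the pattern nuhll is describing might look like the following (the remote name, source path and bandwidth limit are assumptions; only the check-file path comes from the snippet above):

#!/bin/bash
# exit early if a previous run is still uploading or inside its 30-minute hold-off
if [[ -f /mnt/user/appdata/other/rclone/rclone_upload ]]; then
    echo "upload already running or cooling down - exiting"
    exit 0
fi
# create the dummy file so overlapping cron runs skip themselves
touch /mnt/user/appdata/other/rclone/rclone_upload

# move new files to the team drive (remote name and source path are examples)
rclone move /mnt/user/rclone_upload gdrive_td: --delete-empty-src-dirs --bwlimit 9M

# keep the dummy file around a while longer, then remove it
sleep 30m
rm /mnt/user/appdata/other/rclone/rclone_upload

### upload script end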

Edited by nuhll
Link to comment
6 hours ago, Kaizac said:

 

@DZMM what are your current cron timings on the upload scripts? Did you also create a script for each upload user/API or did you put it all in one script? I'm getting rate limited and I think it's because I'm running an upload script every 20 minutes, which generates a lot of API hits due to the checkers.

- 4 scripts, one for each API + unique client ID (each runs against one of movies_adults; movies_kids & backup; movies_uhd; tv_shows), all running every hour.

 

On average, 3 of the 4 scripts are running at any given time because of the backlog.
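
To illustrate, hourly schedules for four such scripts could look like the crontab entries below (the script paths are made up; on Unraid this is usually done through the User Scripts plugin's custom cron schedule):

# four upload scripts, one per API/client ID, each run once an hour
# (staggered start minutes are optional but spread out the checker load)
0  * * * * /boot/config/scripts/upload_movies_adults.sh
15 * * * * /boot/config/scripts/upload_movies_kids_backup.sh
30 * * * * /boot/config/scripts/upload_movies_uhd.sh
45 * * * * /boot/config/scripts/upload_tv_shows.sh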

Link to comment
On 12/25/2018 at 8:45 PM, DZMM said:

- 4 scripts, one for each API + unique client ID (each runs against one of movies_adults; movies_kids & backup; movies_uhd; tv_shows), all running every hour.

 

On average, 3 of the 4 scripts are running at any given time because of the backlog.

Thanks! I discovered that my upload scripts had reverted to a previous version and weren't working. Now it's all running as expected again :).

Link to comment
On 12/22/2018 at 3:50 PM, Kaizac said:

Locally to create the backup.

@Kaizac you might need to create another teamdrive/upload script for your backup.  My backup just took my main teamdrive over 400k files (probably all the little Plex files and my photo album backup), so I've just created a new teamdrive for my backup, separate from my media teamdrive.

Link to comment
4 hours ago, DZMM said:

@Kaizac you might need to create another teamdrive/upload script for your backup.  My backup just took my main teamdrive over 400k files (probably all the little Plex files and my photo album backup), so I've just created a new teamdrive for my backup, separate from my media teamdrive.

Yeah, I've done that from the beginning thankfully - its own TD with its own API. Also, I'm keeping all my metadata and subtitles local in a separate folder from my rclone_upload, to keep rclone scans/uploads fast and limit the number of cloud folders.

 

Also, someone on the rclone forums mentioned the high number of API hits from Bazarr was a bug and is fixed in the development branch. So I've switched to that and my API hits have dropped immensely! No API bans anymore at the moment :).

Link to comment
6 minutes ago, francrouge said:

Hi guys

I've got a question for you: how many streams are you able to have with Gdrive?

Thx

Sent from my Pixel 2 XL using Tapatalk
 

That totally depends on your download speed and the processing power of your server. If you want to stream outside of your home it also depends on your upload speed. If you can mostly direct stream (no transcoding) it will depend 90% on your download/upload speed.

Link to comment
