Guide: How To Use Rclone To Mount Cloud Drives And Play Files


DZMM

Recommended Posts

Rebooted Unraid earlier today and am now getting an error. Everything loads fine and mounts correctly on the first attempt. The next two times the script runs it looks good. The third time it runs I get this error:

 

17.12.2020 21:00:02 INFO: Creating gdrive_media_vfs mergerfs mount.
mv: cannot stat '/mnt/user/mount_unionfs/gdrive_media_vfs': Transport endpoint is not connected
mkdir: cannot stat '/mnt/user/mount_unionfs/gdrive_media_vfs': Transport endpoint is not connected
fuse: bad mount point `/mnt/user/mount_unionfs/gdrive_media_vfs': Socket not connected
17.12.2020 21:00:02 INFO: Checking if gdrive_media_vfs mergerfs mount created.
17.12.2020 21:00:02 CRITICAL: gdrive_media_vfs mergerfs mount failed.  Stopping dockers.

 

Any attempt to re-run the script results in the same error. Any idea what's causing this and how to prevent it in the future?

 

EDIT: noticed in my system log the following:

`mergerfs[20393]: segfault at 14966c44cff8 ip 000014966de79151 sp 000014966c44cff0 error 6 in mergerfs[14966de46000+b5000]`

Edited by privateer
3 hours ago, privateer said:

Rebooted Unraid earlier today and am now getting an error. Everything loads fine and mounts correctly on the first attempt. The next two times the script runs it looks good. The third time it runs I get this error:

 

17.12.2020 21:00:02 INFO: Creating gdrive_media_vfs mergerfs mount.
mv: cannot stat '/mnt/user/mount_unionfs/gdrive_media_vfs': Transport endpoint is not connected
mkdir: cannot stat '/mnt/user/mount_unionfs/gdrive_media_vfs': Transport endpoint is not connected
fuse: bad mount point `/mnt/user/mount_unionfs/gdrive_media_vfs': Socket not connected
17.12.2020 21:00:02 INFO: Checking if gdrive_media_vfs mergerfs mount created.
17.12.2020 21:00:02 CRITICAL: gdrive_media_vfs mergerfs mount failed.  Stopping dockers.

 

Any attempt to re-run the script results in the same error. Any idea what's causing this and how to prevent it in the future?

 

EDIT: noticed in my system log the following:

`mergerfs[20393]: segfault at 14966c44cff8 ip 000014966de79151 sp 000014966c44cff0 error 6 in mergerfs[14966de46000+b5000]`

Have you looked in the file manager to see if the folder exists?  I had a similar error a few weeks ago where, when I looked at the folder in PuTTY, it was borked and had a ? next to it.  I don't know why, but rebooting fixed it.
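For anyone who hits this and can't wait for a reboot: "Transport endpoint is not connected" usually means a dead FUSE mount, and a lazy unmount will often clear it so the mount script can be re-run. A rough sketch (the path is this thread's example; `is_stale_mount` is just an illustrative helper, not part of the guide's scripts):

```shell
# Returns success if stat on the path reports the classic error from a
# dead FUSE mount ("Transport endpoint is not connected").
is_stale_mount() {
    stat "$1" 2>&1 | grep -q "Transport endpoint is not connected"
}

# If the mergerfs mount is stale, lazy-unmount it so the mount script
# can recreate it without a reboot:
if is_stale_mount /mnt/user/mount_unionfs/gdrive_media_vfs; then
    fusermount -uz /mnt/user/mount_unionfs/gdrive_media_vfs
fi
```

The `-z` (lazy) flag detaches the mount even if something still has files open on it, which is usually the case when dockers were using the mount when it died.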

6 hours ago, DZMM said:

Have you looked in the file manager to see if the folder exists?  I had a similar error a few weeks ago where, when I looked at the folder in PuTTY, it was borked and had a ? next to it.  I don't know why, but rebooting fixed it.

 

I had the same marking. It's like the connection to the gdrive was timing out - had nothing to do with the script running.

 

I tried rebooting a few times, including manually unmounting everything, removing the local folders and remounting, all to no avail. Waited 6 hours, tried again, and it works like a charm. Must have been on Google's side.


Something is definitely going on... even after a reboot, it came back. I'm in the middle of an upload on my paltry 400Mbit down/40Mbit up connection... so I'm definitely not hitting the quota - even though I do have service accounts.

 

Wonder if this is related to the stuff that happened last week with Google... maybe they are putting in some other sort of throttling?

1 minute ago, axeman said:

Something is definitely going on... even after a reboot, it came back. I'm in the middle of an upload on my paltry 400Mbit down/40Mbit up connection... so I'm definitely not hitting the quota - even though I do have service accounts.

 

Wonder if this is related to the stuff that happened last week with Google... maybe they are putting in some other sort of throttling?

I don't think the issue is at Google's end, as the problem is with the MergerFS mount - the rclone mounts seem to be behaving

2 minutes ago, DZMM said:

I don't think the issue is at Google's end, as the problem is with the MergerFS mount - the rclone mounts seem to be behaving

Yes - excellent observation. I see the mergerfs GitHub shows some activity from 5 days ago on fuse.c - maybe there was a breaking change? MergerFS gets installed at each array reboot, right?

28 minutes ago, axeman said:

Yes - excellent observation. I see the mergerfs GitHub shows some activity from 5 days ago on fuse.c - maybe there was a breaking change? MergerFS gets installed at each array reboot, right?

Yep - maybe a reboot was needed to get the latest version.

15 minutes ago, privateer said:

Well it worked for about 4 hours and now has gone back to being broken again.

 

?gdrive_media_vfs in red for me. Noticed when all my dockers turned back off.

What version of Unraid are you using?  I've had problems with anything above Beta25 - lockups, slow Plex - and I think the mergerFS problems I had might have been when using Beta25+.


Hi Folks! I've read the guide - https://rclone.org/onedrive/ - and watched this video to try and work out, in simple but granular steps, how to get my "Pictures" folder in Unraid to use OneDrive as a backup solution: https://www.youtube.com/watch?v=-b9Ow2iX2DQ&feature=emb_logo&ab_channel=SpaceinvaderOne

 

The goal is to set up the pictures folder so that every change made in "Unraid Pictures" (add/delete/etc.) syncs to the OneDrive folder. I've made a OneDrive account specifically for this task and have a full TB to use, and I can get rclone to recognize the OneDrive account using 'rclone lsd XXXX'.

 

What I can't get working is the mounting part. When following the video at 8:20, I do not have a mount script - there's simply nothing there. I guess what I want to do is mount my Unraid Pictures folder as my OneDrive folder? There seems to be nothing in my /mnt folder, though...

 

I do not want two-way syncing, only syncing from Unraid to the OneDrive account - so this should be fairly easy, right?

 

 

EDIT - I just realized I posted earlier this week; sorry for the double post, but at least this one has more explanation. Sorry about that, guys.

Edited by sannitig

Interestingly - it looks like mergerFS files aren't going to the "local" branch but are being uploaded directly...

 

Quote

2020/12/20 23:31:56 INFO : TV_Classics/MyShow/tvshow.nfo: Copied (replaced existing)
2020/12/20 23:31:56 INFO : TV_Classics/MyShow/tvshow.nfo: vfs cache: upload succeeded try #1

Seeing lots of these from a Sonarr refresh. 

6 hours ago, sannitig said:

Hi Folks! I've read the guide - https://rclone.org/onedrive/ - and watched this video to try and work out, in simple but granular steps, how to get my "Pictures" folder in Unraid to use OneDrive as a backup solution: https://www.youtube.com/watch?v=-b9Ow2iX2DQ&feature=emb_logo&ab_channel=SpaceinvaderOne

 

The goal is to set up the pictures folder so that every change made in "Unraid Pictures" (add/delete/etc.) syncs to the OneDrive folder. I've made a OneDrive account specifically for this task and have a full TB to use, and I can get rclone to recognize the OneDrive account using 'rclone lsd XXXX'.

 

What I can't get working is the mounting part. When following the video at 8:20, I do not have a mount script - there's simply nothing there. I guess what I want to do is mount my Unraid Pictures folder as my OneDrive folder? There seems to be nothing in my /mnt folder, though...

 

I do not want two-way syncing, only syncing from Unraid to the OneDrive account - so this should be fairly easy, right?

 

 

EDIT - I just realized I posted earlier this week; sorry for the double post, but at least this one has more explanation. Sorry about that, guys.

As @BRiT pointed out, everything you need can be found at the start of this thread - although I think this solution is overkill if you are just backing up or syncing a few photos, as the solution in this thread is to optimise Plex playback from Google Drive. 

 

It probably can be re-used for OneDrive, but if you want to learn how to back up a photos folder using rclone, I'd read the rclone sync page https://rclone.org/commands/rclone_sync/ as I don't see why you even need to mount.  If you need help, please create a new thread.
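To illustrate, a one-way backup doesn't need a mount at all - `rclone sync` alone does it. A minimal sketch, assuming a remote named "onedrive" created via `rclone config` and example paths (the command is echoed here as a preview; drop the `echo`, and then the `--dry-run` flag once the preview looks right):

```shell
SRC="/mnt/user/Pictures"   # local share to back up (example path)
DST="onedrive:Pictures"    # remote name from `rclone config` (assumed)

# `sync` makes DST mirror SRC, deleting remote files that were removed
# locally; use `rclone copy` instead if you never want remote deletions.
echo rclone sync "$SRC" "$DST" --dry-run -v
```

Run on a schedule (e.g. via the User Scripts plugin) this gives exactly the one-way Unraid-to-OneDrive sync described above.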

3 hours ago, axeman said:

Interestingly - it looks like mergerFS files aren't going to the "local" branch but are being uploaded directly...

 

Seeing lots of these from a Sonarr refresh. 

Are you sure this is 'live' ?  I think the log is showing what happened when your upload script kicked in

2 hours ago, DZMM said:

Are you sure this is 'live' ?  I think the log is showing what happened when your upload script kicked in

Yeah, it's live. I just tried watching a show, and Emby must've updated the .nfo file, because the log shows similar entries (the upload script hasn't been run since a reboot).

 

Maybe smaller files that are already in the rclone cache get uploaded directly?

 

When I copy a large (new) file to the mergerFs mount, it does exactly as expected. The file goes to the corresponding folder on the "local" share. 

 

I'll test some more in a few hours. 
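If it helps narrow things down, this behaviour would be consistent with how the guide's mergerfs mount is laid out. A sketch of the branch ordering (paths are this thread's examples; check your own mount script for the exact options):

```shell
mergerfs /mnt/user/local/gdrive_media_vfs:/mnt/user/mount_rclone/gdrive_media_vfs \
    /mnt/user/mount_unionfs/gdrive_media_vfs \
    -o category.create=ff,...
```

With the `ff` (first found) create policy, brand-new files land on the first branch (the local disk) and wait for the upload script. But a write to a file that only exists on the rclone branch - e.g. Sonarr or Emby rewriting a tvshow.nfo that's already in the cloud - goes to that branch, so rclone's VFS cache uploads it directly, which matches the log lines quoted above.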


So it seems I've been hit with a ban - "Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded" - when trying to access my files, although I can't figure out why exactly.

 

Looking at the quota stats on https://console.developers.google.com/apis/api/drive.googleapis.com/quotas I don't see myself even getting close to the quota. I've also tried creating a new client ID/secret to bypass this, but I'm still getting the same error back.

 

[screenshots: Google API Console quota graphs]

I also have a completely different Team Drive using different credentials, and that seems to have been hit with a ban as well.

 

Any ideas?

Edited by teh0wner
So it seems I've been hit with a ban - "Error 403: The download quota for this file has been exceeded., downloadQuotaExceeded" - when trying to access my files, although I can't figure out why exactly.
 
Looking at the quota stats on https://console.developers.google.com/apis/api/drive.googleapis.com/quotas I don't see myself even getting close to the quota. I've also tried creating a new client ID/secret to bypass this, but I'm still getting the same error back.
 
[screenshots: Google API Console quota graphs]
 
Any ideas?
You get 750GB per day of upload - I don't know if that includes downloads as well.

Sent from my Pixel 2 XL using Tapatalk


First off, I want to say the setup for this is super easy and I appreciate your hard work, DZMM. I started from scratch on a smaller server this weekend and I was able to have this up and running in no time. 

 

I'm seeing that my cache drive is being eaten up by the "mount_rclone" share, even after reset. My cache size is set to 400GB, which I copied from the scripts. Is this normal/expected? Is there a need for it to be so large? What exactly is cache doing here? 

[screenshot: cache drive usage showing the mount_rclone share]

1 hour ago, drogg said:

First off, I want to say the setup for this is super easy and I appreciate your hard work, DZMM. I started from scratch on a smaller server this weekend and I was able to have this up and running in no time. 

 

I'm seeing that my cache drive is being eaten up by the "mount_rclone" share, even after reset. My cache size is set to 400GB, which I copied from the scripts. Is this normal/expected? Is there a need for it to be so large? What exactly is cache doing here? 

[screenshot: cache drive usage showing the mount_rclone share]

Glad you got it all up and running (with no help!) easily.

 

The cache filling up quickly is something I'm keeping an eye on, by manually browsing the cache on my server every now and then to see what's in there.  My cache is getting populated mainly by Plex's overnight scheduled jobs, i.e. analysing files that haven't been accessed by users.

 

I'm trying to track how long something I've actually watched stays in the cache - if it's getting flushed within a day (or even hours), I'm probably going to turn the cache off.  E.g. I've just checked, and some of the stuff I watched just last night isn't in the cache 17 hours later...

 

I'm hesitant to increase the cache size to improve the hit rate, as that's a lot of data to hold (I have 7 teamdrives, so I'm already caching over 2TB) just for a slightly faster launch time and better seeking every now and then...  My server is doing a lot of scheduled work as I've decided to turn thumbnails back on, so maybe it'll settle down a bit in a month or two.
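For anyone wanting to tune this rather than turn the cache off: the size and retention of the VFS cache are controlled by two rclone mount flags. A sketch with illustrative values (400G matches the scripts in this thread; the remote name is the thread's example):

```shell
rclone mount gdrive_media_vfs: /mnt/user/mount_rclone/gdrive_media_vfs \
    --vfs-cache-mode full \
    --vfs-cache-max-size 400G \
    --vfs-cache-max-age 336h
```

Once the cache hits `--vfs-cache-max-size`, rclone evicts the least recently used files first - which would explain last night's viewing disappearing within hours if overnight scheduled jobs are pulling enough data to hit the cap. `--vfs-cache-max-age` additionally evicts anything unaccessed for that long, regardless of size.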

 

Edited by DZMM
