Guide: How To Use Rclone To Mount Cloud Drives And Play Files


DZMM

Recommended Posts

11 minutes ago, Tuftuf said:

I'm not having much luck with the unmount script on array stop; I'm having to manually run the fusermount -uz command each time. I've let people start using Plex again, so I don't plan to stop the array again just yet.

The unmount script doesn't have any fusermount commands, as the new script structure makes this difficult (mount locations are variable).  The script is intended to be a cleanup script to be run at array start.

 

Do you need it to run at array stop?  If so, just add your own fusermount commands to the script.
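For anyone wanting to add this themselves, a minimal sketch of what the fusermount additions might look like. The mount paths below are examples only; substitute your own RcloneMountShare / MergerfsMountShare locations:

```shell
#!/bin/bash
# Example only - adjust the paths to match your own mount locations.
for mnt in /mnt/user/mount_rclone/gdrive_vfs /mnt/user/mount_mergerfs/gdrive_vfs; do
    if mountpoint -q "$mnt"; then
        fusermount -uz "$mnt"   # lazy unmount, same as the manual command
        echo "Unmounted $mnt"
    else
        echo "$mnt not mounted - nothing to do"
    fi
done
```

The mountpoint check avoids fusermount errors for paths that were never mounted in the first place.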

Link to comment
On 3/16/2020 at 9:30 AM, DZMM said:

The unmount script doesn't have any fusermount commands, as the new script structure makes this difficult (mount locations are variable).  The script is intended to be a cleanup script to be run at array start.

 

Do you need it to run at array stop?  If so, just add your own fusermount commands to the script.

 

My array was not stopping, and I blamed this when I couldn't quite work out where the fusermount command was. I'll have to see if something else is causing it not to stop, as this looks to be unrelated. I don't plan on stopping it just yet; it's serving its purpose. My main focus is getting things ready to back it all up.

Link to comment

Big thank you for all the hard work put into this container and the scripts. Posts here have helped me a lot in understanding and resolving issues I previously had with mount_unionfs and mount_mergerfs.

 

Last night I got mount_mergerfs up and running, and 5 folders/files uploaded successfully to mount_rclone. There are a couple hundred GB waiting in the local mount. A further 5 folders uploaded, but empty, and I keep receiving this in the upload log.

 

Quote

18.03.2020 21:47:01 INFO: *** Rclone move selected. Files will be moved from /mnt/user/local/gdrive_vfs for gdrive_vfs ***
18.03.2020 21:47:01 INFO: *** Starting rclone_upload script for gdrive_vfs ***
18.03.2020 21:47:01 INFO: Exiting as script already running.
Script Finished Wed, 18 Mar 2020 21:47:01 +1030

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_custom_plugin/log.txt

 

I did a couple of shutdowns last night and am not sure if this error is a result of an unclean shutdown. I have a stock upload script apart from changing RcloneUploadRemoteName="gdrive_vfs" to match the RcloneRemoteName.

 

What should I be doing to fix it? Thanks

Link to comment
10 minutes ago, faulksy said:

Big thank you for all the hard work put into this container and the scripts. Posts here have helped me a lot in understanding and resolving issues I previously had with mount_unionfs and mount_mergerfs.

 

Last night I got mount_mergerfs up and running, and 5 folders/files uploaded successfully to mount_rclone. There are a couple hundred GB waiting in the local mount. A further 5 folders uploaded, but empty, and I keep receiving this in the upload log.

 

 

I did a couple of shutdowns last night and am not sure if this error is a result of an unclean shutdown. I have a stock upload script apart from changing RcloneUploadRemoteName="gdrive_vfs" to match the RcloneRemoteName.

 

What should I be doing to fix it? Thanks

Delete the checker file in the appdata/other/rclone folder. It's named something like "upload_running".
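If you're unsure where that file lives, a sketch like this can clear it. The path below is an assumption based on the default script settings; check your own appdata layout and RcloneUploadRemoteName before running anything:

```shell
#!/bin/bash
# Assumed default location - verify against your own appdata folder first.
CHECKER="/mnt/user/appdata/other/rclone/remotes/gdrive_vfs/upload_running"

if [ -f "$CHECKER" ]; then
    rm -f "$CHECKER"
    echo "Removed stale checker file: $CHECKER"
else
    echo "No checker file at $CHECKER - nothing to remove"
fi
```

The [ -f ] guard keeps the script safe to re-run: it only deletes the file if it actually exists.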

Link to comment
18 minutes ago, Kaizac said:

Yeah, "upload_running" is the checker file for uploads. Delete it and you should be able to upload.

It seemed to get going again, but the same error appeared in the log. I have the script set to run hourly, which accounts for the 1st and 3rd events. 4 new empty folders were added to mount_rclone, but no files.

 

Quote

18.03.2020 21:25:15 INFO: *** Starting rclone_upload script for gdrive_vfs ***
18.03.2020 21:25:15 INFO: Exiting as script already running.
Script Finished Wed, 18 Mar 2020 21:25:15 +1030

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_custom_plugin/log.txt

Script Starting Wed, 18 Mar 2020 21:47:01 +1030

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_custom_plugin/log.txt

18.03.2020 21:47:01 INFO: *** Rclone move selected. Files will be moved from /mnt/user/local/gdrive_vfs for gdrive_vfs ***
18.03.2020 21:47:01 INFO: *** Starting rclone_upload script for gdrive_vfs ***
18.03.2020 21:47:01 INFO: Exiting as script already running.
Script Finished Wed, 18 Mar 2020 21:47:01 +1030

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_custom_plugin/log.txt

Script Starting Wed, 18 Mar 2020 22:41:04 +1030

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_custom_plugin/log.txt

18.03.2020 22:41:04 INFO: *** Rclone move selected. Files will be moved from /mnt/user/local/gdrive_vfs for gdrive_vfs ***
18.03.2020 22:41:04 INFO: *** Starting rclone_upload script for gdrive_vfs ***
18.03.2020 22:41:04 INFO: Script not running - proceeding.
18.03.2020 22:41:04 INFO: Checking if rclone installed successfully.
18.03.2020 22:41:04 INFO: rclone installed successfully - proceeding with upload.
18.03.2020 22:41:04 INFO: Uploading using upload remote gdrive_vfs
18.03.2020 22:41:04 INFO: *** Using rclone move - will add --delete-empty-src-dirs to upload.
====== RCLONE DEBUG ======
Script Starting Wed, 18 Mar 2020 22:47:01 +1030

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_custom_plugin/log.txt

18.03.2020 22:47:01 INFO: *** Rclone move selected. Files will be moved from /mnt/user/local/gdrive_vfs for gdrive_vfs ***
18.03.2020 22:47:01 INFO: *** Starting rclone_upload script for gdrive_vfs ***
18.03.2020 22:47:01 INFO: Exiting as script already running.
Script Finished Wed, 18 Mar 2020 22:47:01 +1030

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_custom_plugin/log.txt

 

Link to comment
7 minutes ago, faulksy said:

It seemed to get going again, but the same error appeared in the log. I have the script set to run hourly, which accounts for the 1st and 3rd events. 4 new empty folders were added to mount_rclone, but no files.

 

 

I think you made a spelling error somewhere. In your earlier posts you wrote gdrive_vsf instead of gdrive_vfs.

Link to comment
7 minutes ago, faulksy said:

It seemed to get going again, but the same error appeared in the log. I have the script set to run hourly, which accounts for the 1st and 3rd events. 4 new empty folders were added to mount_rclone, but no files.

What are you trying to upload? Had your prior run managed to finish its upload before you started the 2nd one?

Link to comment
3 minutes ago, testdasi said:

What are you trying to upload? Had your prior run managed to finish its upload before you started the 2nd one?

Video files ranging from 2-8 GB in size. After Kaizac helped me delete the checker file, I manually ran the script. The log was:

Quote

18.03.2020 22:41:04 INFO: *** Rclone move selected. Files will be moved from /mnt/user/local/gdrive_vfs for gdrive_vfs ***
18.03.2020 22:41:04 INFO: *** Starting rclone_upload script for gdrive_vfs ***
18.03.2020 22:41:04 INFO: Script not running - proceeding.
18.03.2020 22:41:04 INFO: Checking if rclone installed successfully.
18.03.2020 22:41:04 INFO: rclone installed successfully - proceeding with upload.
18.03.2020 22:41:04 INFO: Uploading using upload remote gdrive_vfs
18.03.2020 22:41:04 INFO: *** Using rclone move - will add --delete-empty-src-dirs to upload.
====== RCLONE DEBUG ======

 

The scheduled hourly script ran and hit the same issue:

Quote

18.03.2020 22:47:01 INFO: *** Rclone move selected. Files will be moved from /mnt/user/local/gdrive_vfs for gdrive_vfs ***
18.03.2020 22:47:01 INFO: *** Starting rclone_upload script for gdrive_vfs ***
18.03.2020 22:47:01 INFO: Exiting as script already running.
Script Finished Wed, 18 Mar 2020 22:47:01 +1030

Full logs for this script are available at /tmp/user.scripts/tmpScripts/rclone_custom_plugin/log.txt

 

No files were uploaded at all in that time. 4 media folders were created in mount_rclone, but no files. The storage used on my gdrive hasn't changed since last night.

Link to comment
13 minutes ago, Kaizac said:

I think you made a spelling error somewhere. In your earlier posts you wrote gdrive_vsf instead of gdrive_vfs.

Just a typo here. My upload script is OK. I'm not knowledgeable enough, so I have to keep things simple.

Quote

# REQUIRED SETTINGS
RcloneCommand="move" # choose your rclone command e.g. move, copy, sync
RcloneRemoteName="gdrive_vfs" # Name of rclone remote mount WITHOUT ':'.
RcloneUploadRemoteName="gdrive_vfs" # If you have a second remote created for uploads put it here.  Otherwise use the same remote as RcloneRemoteName.
LocalFilesShare="/mnt/user/local" # location of the local files without trailing slash you want to rclone to use
RcloneMountShare="/mnt/user/mount_rclone" # where your rclone mount is located without trailing slash  e.g. /mnt/user/mount_rclone
MinimumAge="15m" # sync files suffix ms|s|m|h|d|w|M|y
ModSort="ascending" # "ascending" oldest files first, "descending" newest files first
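For reference, those settings translate roughly into an rclone call along these lines. This is a sketch only: --min-age, --order-by and --delete-empty-src-dirs are real rclone options, but the actual command the script builds includes additional flags (bandwidth limits, filters, etc.):

```shell
# Rough equivalent of the configured upload (sketch - the real script adds more flags)
rclone move /mnt/user/local/gdrive_vfs gdrive_vfs: \
    --min-age 15m \
    --order-by modtime,ascending \
    --delete-empty-src-dirs
```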

 

Link to comment
9 minutes ago, Kaizac said:

Go to that upload.log file; it should show what is happening. It's in appdata/other/rclone.

It goes to add things and then deletes them.

Quote

2020/03/17 21:46:48 INFO  : Starting bandwidth limiter at 12MBytes/s
2020/03/17 21:46:48 INFO  : Starting HTTP transaction limiter: max 8 transactions/s with burst 1
2020/03/17 21:46:51 INFO  : Encrypted drive 'gdrive_vfs:': Waiting for checks to finish
2020/03/17 21:46:51 INFO  : Encrypted drive 'gdrive_vfs:': Waiting for transfers to finish
2020/03/17 21:47:10 INFO  : movies/HD/Back To The Sea (2012)/Back To The Sea (2012)-fanart.jpg: Copied (new)
2020/03/17 21:47:10 INFO  : movies/HD/Back To The Sea (2012)/Back To The Sea (2012)-fanart.jpg: Deleted
2020/03/17 21:47:11 INFO  : movies/HD/Back To The Sea (2012)/Back To The Sea (2012).nfo: Copied (new)
2020/03/17 21:47:11 INFO  : movies/HD/Back To The Sea (2012)/Back To The Sea (2012).nfo: Deleted
2020/03/17 21:47:48 NOTICE: Scheduled bandwidth change. Limit set to 12MBytes/s
2020/03/17 22:09:08 INFO  : movies/HD/Back To The Sea (2012)/Back To The Sea (2012).avi: Copied (new)
2020/03/17 22:09:08 INFO  : movies/HD/Back To The Sea (2012)/Back To The Sea (2012).avi: Deleted
2020/03/17 22:22:46 INFO  : movies/HD/Bee Movie (2007)/Bee Movie (2007).avi: Copied (new)
2020/03/17 22:22:46 INFO  : movies/HD/Bee Movie (2007)/Bee Movie (2007).avi: Deleted
2020/03/17 22:29:02 INFO  : movies/HD/A Cinderella Story Once Upon a Song (2011)/A Cinderella Story Once Upon a Song (2011).avi: Copied (new)
2020/03/17 22:29:02 INFO  : movies/HD/A Cinderella Story Once Upon a Song (2011)/A Cinderella Story Once Upon a Song (2011).avi: Deleted
2020/03/17 22:57:46 INFO  : movies/HD/Big (1988)/Big (1988).mkv: Copied (new)
2020/03/17 22:57:46 INFO  : movies/HD/Big (1988)/Big (1988).mkv: Deleted
2020/03/17 23:02:06 INFO  : movies/HD/Fantastic Four (2005)/Fantastic Four (2005).mkv: Copied (new)
2020/03/17 23:02:07 INFO  : movies/HD/Fantastic Four (2005)/Fantastic Four (2005).mkv: Deleted
2020/03/17 23:02:29 INFO  : movies/HD/Animal Kingdom (2010)/Animal Kingdom (2010).mkv: Copied (new)
2020/03/17 23:02:29 INFO  : movies/HD/Animal Kingdom (2010)/Animal Kingdom (2010).mkv: Deleted
2020/03/18 22:41:04 INFO  : Starting bandwidth limiter at 12MBytes/s
2020/03/18 22:41:04 INFO  : Starting HTTP transaction limiter: max 8 transactions/s with burst 1
2020/03/18 22:41:19 INFO  : Encrypted drive 'gdrive_vfs:': Waiting for checks to finish
2020/03/18 22:41:19 INFO  : Encrypted drive 'gdrive_vfs:': Waiting for transfers to finish
2020/03/18 22:42:04 NOTICE: Scheduled bandwidth change. Limit set to 12MBytes/s
 

 

Link to comment
1 minute ago, faulksy said:

It goes to add things and then deletes them.

 

Then the "error" is expected.

One of the uploads is still running (as Kaizac said, 12 MB/s is rather slow), so naturally the next run would stop.

The whole point of the upload control file is exactly this scenario, i.e. to avoid running multiple simultaneous uploads of the same files from the same source.
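The control file is the standard lock-file idiom; roughly like this (the path and messages are illustrative, not the script's exact ones):

```shell
#!/bin/bash
# Illustrative lock-file pattern - path and messages are examples only.
LOCKFILE="/tmp/upload_running"

if [ -f "$LOCKFILE" ]; then
    echo "Exiting as script already running."
    exit 0
fi

touch "$LOCKFILE"
trap 'rm -f "$LOCKFILE"' EXIT   # clear the lock even if the upload fails

# ... the rclone move would run here ...
echo "Upload finished."
```

The caveat of a plain touch-file lock: if the machine resets mid-upload, the trap never fires, the file survives, and it blocks the next run — which is exactly the stale-checker situation discussed earlier in the thread.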

 

You shouldn't be running the upload script on an hourly schedule with such a slow connection to be honest.

At least don't run it on a schedule until everything has been uploaded.

Link to comment
28 minutes ago, testdasi said:

You shouldn't be running the upload script on an hourly schedule with such a slow connection to be honest.

At least don't run it on a schedule until everything has been uploaded.

Actually any schedule is fine - that's why the checker file is there to stop additional upload jobs starting if there's an instance already running.  If you don't make user script changes while an instance is running, the logs should still keep updating to show you what's happening.

Link to comment
20 minutes ago, DZMM said:

Actually any schedule is fine - that's why the checker file is there to stop additional upload jobs starting if there's an instance already running.  If you don't make user script changes while an instance is running, the logs should still keep updating to show you what's happening.

I could have explained it more clearly.

The rclone upload job could be terminated silently (out of memory is the most frequent cause I have seen), in which case the upload script exiting because of a supposedly ongoing upload is actually an indication that something is amiss.

This would be missed by users if the script is run on a regular schedule during the initial massive upload. Most users don't check the upload log (or network stats, etc.) and just assume things are running along in the background.

Link to comment

Thanks for putting this together; it saves me days of uploads. I have tried to access my encrypted files from a Windows PC using a copy of the rclone config file, but I am unable to find the media folders or files. I was originally using PlexGuide, and I used the same SAs in my upload script. I am using copy instead of move, as I would like to have a copy on both Unraid and my gdrive.

Link to comment

I use VLANs at home, and this caused all the traffic to leave via the management address even after binding rclone to an interface within the upload script. The fix was to add a second routing table and routes for the IP I assigned it.

 

The subnet is 192.168.100.0/24

The gateway is 192.168.100.1

The IP assigned to rclone upload is 192.168.100.90

 

echo "1 rt2" >> /etc/iproute2/rt_tables
ip route add 192.168.100.0/24 dev br0.100 src 192.168.100.90 table rt2
ip route add default via 192.168.100.1 dev br0.100 table rt2
ip rule add from 192.168.100.90/32 table rt2
ip rule add to 192.168.100.90/32 table rt2
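To confirm the rules and routes took effect, these read-only commands can help (rt2 matching the table name added above):

```shell
# Read-only checks - no changes are made.
ip rule show | grep rt2 || echo "no rt2 rules found"
ip route show table rt2 || echo "table rt2 is empty or missing"
```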

 

Edited by Tuftuf
Link to comment
