[Plugin] rclone


Waseh

Recommended Posts

1 hour ago, Jamesmct said:

Sorry all if this is a dumb question.  I'm not sure if I'm searching for the right things.  Here goes:

The Rclone app configuration has the config script, Rclone custom script, mount script and unmount script.

 

My question is: when and where can I invoke these scripts? My backups don't appear to be running, and I'd like to mount the storage, but I don't see it mounted the way I wrote it in the script.

Here is what I would recommend doing:

  1. Sounds like you already have the rclone plugin installed. Great....but just ignore all that stuff in the configuration/settings page for it. Those are just examples to help you get the idea. Just use a command line terminal and type "rclone config" to set up your remotes.
  2. Use the User Scripts plugin for Unraid to create your scripts. For example, a mount script....you create the mount script in the User Scripts plugin, then just click the "Run in Background" button to have that mount script running the whole time (a bare-bones example is sketched below). Or if you have some sort of rclone sync/copy script you want to run on a regular basis....you can have it run on a schedule using the User Scripts plugin too. It uses the cron scheduling format.
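For reference, a bare-bones User Scripts mount script could look something like this. It is only a sketch: the remote name "gdrive" and the mount point /mnt/disks/gdrive are placeholders, so swap in whatever you named your remote in "rclone config".

#!/bin/bash
# Minimal rclone mount sketch for the User Scripts plugin.
# "gdrive" and /mnt/disks/gdrive are placeholders -- use your own remote and path.
MOUNT_POINT="/mnt/disks/gdrive"

# Make sure the mount point exists
mkdir -p "$MOUNT_POINT"

# Only start the mount if nothing is already mounted there
if ! mountpoint -q "$MOUNT_POINT"; then
    rclone mount --allow-other gdrive: "$MOUNT_POINT" &
fi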
  • Thanks 1
Link to comment
2 hours ago, Stupifier said:

Here is what I would recommend doing:

  1. Sounds like you already have the rclone plugin installed. Great....but just ignore all that stuff in the configuration/settings page for it. Those are just examples to help you get the idea. Just use a command line terminal and type "rclone config" to set up your remotes.
  2. Use the User Scripts plugin for Unraid to create your scripts. For example, a mount script....you create the mount script in the User Scripts plugin, then just click the "Run in Background" button to have that mount script running the whole time. Or if you have some sort of rclone sync/copy script you want to run on a regular basis....you can have it run on a schedule using the User Scripts plugin too. It uses the cron scheduling format.

Thank you! After looking around a little more I also noticed there's a button to copy it over to User Scripts. Works like a charm.

Link to comment

I just set up 3 new Unraid servers. For the life of me I am having no luck getting the rclone plugin installed. Other plugins and Dockers install just fine. One of the servers is in a different state on a different backbone.

 

plugin: installing: https://raw.githubusercontent.com/Waseh/rclone-unraid/master/plugin/rclone.plg
plugin: downloading https://raw.githubusercontent.com/Waseh/rclone-unraid/master/plugin/rclone.plg
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/master/plugin/rclone.plg ... done
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/master/archive/rclone-2019.10.13-x86_64-1.txz ... done

+==============================================================================
| Installing new package /boot/config/plugins/rclone/install/rclone-2019.10.13-bundle.txz
+==============================================================================

Verifying package rclone-2019.10.13-bundle.txz.
Installing package rclone-2019.10.13-bundle.txz:
PACKAGE DESCRIPTION:
Package rclone-2019.10.13-bundle.txz installed.
Downloading rclone
Downloading certs
Download failed - No existing archive found - Try again later
plugin: run failed: /bin/bash retval: 1

Edited by joshopkins
Link to comment
  • 3 weeks later...

I have configured rclone, but am running into some issues after mounting. My share shows up, /mnt/disks/OneDrive, but is completely empty.

 

Mount script:

mkdir -p /mnt/disks/OneDrive

#This section mounts the various cloud storage into the folders that were created above.

rclone mount --max-read-ahead 1024k --allow-other onedrive: /mnt/disks/OneDrive &

 

rclone lsd output:

root@BigRig:~# rclone lsd OneDrive:
          -1 2020-03-15 20:55:57      3298 Desktop Backgrounds
          -1 2020-03-14 16:44:55         3 Documents
          -1 2020-03-14 18:27:32         2 School

 

What do I need to do to resolve this?

 

Also, if I want to move this share so that it is visible on my network machines, do I just need to change /disks to /user?

 

Link to comment
33 minutes ago, mihcox said:

I have configured rclone, but am running into some issues after mounting. My share shows up, /mnt/disks/OneDrive, but is completely empty.

 

Mount script:

mkdir -p /mnt/disks/OneDrive

#This section mounts the various cloud storage into the folders that were created above.

rclone mount --max-read-ahead 1024k --allow-other onedrive: /mnt/disks/OneDrive &

 

rclone lsd output:

root@BigRig:~# rclone lsd OneDrive:
          -1 2020-03-15 20:55:57      3298 Desktop Backgrounds
          -1 2020-03-14 16:44:55         3 Documents
          -1 2020-03-14 18:27:32         2 School

 

What do I need to do to resolve this?

 

Also, if I want to move this share so that it is visible on my network machines, do I just need to change /disks to /user?

 

You have a case issue. According to your terminal output, you named the rclone remote "OneDrive", not "onedrive", so the mount command needs to use "OneDrive:". That should fix that issue.
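For example, keeping the exact flags from your script, the corrected mount line should look like this:

rclone mount --max-read-ahead 1024k --allow-other OneDrive: /mnt/disks/OneDrive &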

Now to the follow-up question about making it visible on the network.

  1. You should always mount rclone remotes to /mnt/disks/remote_name......like you are already doing. Do NOT ever mount rclone remotes to user share locations.
  2. In the Unraid Web UI...go to Settings--->SMB. Enable SMB, and in the extras field add the share definition you want. Here is an example:
    [global]
    force user = nobody
    [google]
    path = /mnt/disks/google
    comment =
    browseable = yes
    # Public
    public = yes
    read only=no
    writeable=yes
    writable=yes
    write ok=yes
    guest ok = yes
    vfs objects =

    You can just do a Google search about SMB settings if you want to learn more. And obviously, there are also the NFS and AFP network sharing protocols.....I only gave you an example of the SMB one because I share with Windows PCs.
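One extra tip: if you want to sanity-check the Samba syntax after adding the extras, you can run testparm against the generated config from a terminal. Just a sketch; the path below is the usual Samba config location on Unraid and may differ on your box:

testparm -s /etc/samba/smb.conf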

  • Like 1
Link to comment
  • 2 weeks later...

I have this syntax, but it looks like only the first line runs and the second line never does. The first one has already synced up to 3TB, but the second one is still at 700GB.

rclone sync /mnt/user/yummy/@Red cRed:@Red
rclone sync /mnt/user/yummy/@White cWhite:@White

 

 

Link to comment



I have this syntax, but it looks like only the first line runs and the second line never does. The first one has already synced up to 3TB, but the second one is still at 700GB.
rclone sync /mnt/user/yummy/@Red cRed:@Red
rclone sync /mnt/user/yummy/@White cWhite:@White

 
 



If this is Google Drive, the maximum upload per day is approximately 750GB.
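If that is the case, one way to work around it (just a sketch, not tested against your remotes) is to throttle the upload speed so a full day of syncing stays under the quota:

# 8.5 MB/s for 24 hours works out to roughly 734 GB, just under the daily cap
rclone sync /mnt/user/yummy/@Red cRed:@Red --bwlimit 8.5M
rclone sync /mnt/user/yummy/@White cWhite:@White --bwlimit 8.5M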
Link to comment
  • 2 weeks later...

I am facing a problem after I moved away from my test unRaid and set up my actual unRaid. I wanted to set up rclone here too. I had watched the video from Spaceinvader and adapted everything to my needs, but I cannot sync: when I run "rclone sync -v /mnt/user/documents nas-dokumente:test" I get the following message.

 

2020/04/19 23:57:38 ERROR : ftp://192.168.1.13:21/test: not deleting files as there were IO errors
2020/04/19 23:57:38 ERROR : ftp://192.168.1.13:21/test: not deleting directories as there were IO errors
2020/04/19 23:57:38 ERROR : Attempt 3/3 failed with 2 errors and: update stor: 553 test/3dMARK 1060 GTX ohne OC.PNG: Permission denied.
2020/04/19 23:57:38 Failed to sync with 2 errors: last error was: update stor: 553 test/3dMARK 1060 GTX ohne OC.PNG: Permission denied.
root@Tower:~# rclone sync -v /mnt/user/Backup/test nas-dokumente
2020/04/19 23:57:58 INFO  : Local file system at /root/nas-dokumente: Waiting for checks to finish
2020/04/19 23:57:58 INFO  : Local file system at /root/nas-dokumente: Waiting for transfers to finish
2020/04/19 23:57:58 INFO  : Waiting for deletions to finish
2020/04/19 23:57:58 INFO  : 

I thought it might be due to the permissions on my Synology, but with an FTP program I can connect fine and can also create directories.

 

Regards Maggi

Link to comment
  • 3 weeks later...

Hi folks,

After having a Gsuite account for over a year (with the view to setting up Rclone to back up my Unraid server) I finally got RClone set up and working over this weekend!!  I've got a crypt sync running currently for one of my shares (tv shows) syncing to a remote folder with the following command:

rclone sync -v "/mnt/user/TV Shows/" secure:backup/tvshows --bwlimit "07:00,3M 23:00,off"

Works great so far.

My question is, is there an easy way to  "add more sources/shares" to this same script, to sync all of my other shares as well?

I don't know enough about code etc. to figure it out, and I'm really confused by includes, excludes, --files-from etc. If a --files-from txt file is the way to do it (presuming you list your source directories in that text file and point rclone to that file to read the sources?), where do I place that file? In the rclone folder itself, or elsewhere?

 

I'm really sorry for the probably stupid question, but I've searched for most of the day and just got more and more confused 🙂

 

Thanks,

Joe

 

Link to comment
5 hours ago, joenitro said:

Hi folks,

After having a Gsuite account for over a year (with the view to setting up Rclone to back up my Unraid server) I finally got RClone set up and working over this weekend!!  I've got a crypt sync running currently for one of my shares (tv shows) syncing to a remote folder with the following command:


rclone sync -v "/mnt/user/TV Shows/" secure:backup/tvshows --bwlimit "07:00,3M 23:00,off"

Works great so far.

My question is, is there an easy way to  "add more sources/shares" to this same script, to sync all of my other shares as well?

I don't know enough about code etc. to figure it out, and I'm really confused by includes, excludes, --files-from etc. If a --files-from txt file is the way to do it (presuming you list your source directories in that text file and point rclone to that file to read the sources?), where do I place that file? In the rclone folder itself, or elsewhere?

 

I'm really sorry for the probably stupid question, but I've searched for most of the day and just got more and more confused 🙂

 

Thanks,

Joe

 

Best thing to do in these situations is to test by including the -n flag in whatever rclone command you use. That flag is for "dry runs".........it makes it extremely easy to test and do trial and error until you figure out the exact command that does it for you.

You are on the right track though with the includes/excludes flags. Sorry, I don't have the patience to tell you EXACTLY how to do what you want but.....you should be able to do something like this:
 

rclone sync -v "/mnt/user/" "secure:backup/" --bwlimit "07:00,3M 23:00,off" --include "share_name01/**" --include "share_name02/**" -n

Just look at that closely and reference back to the "rclone filtering" help pages....I did not test that command..but again....I added the -n flag there so you can trial and error until you get it right!
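And since you asked about putting the list in a text file: an equivalent (also untested) variation is --include-from, where the file just holds one include pattern per line. Assuming you save it somewhere persistent like /boot/config/rclone_shares.txt (that path is only an example), the file would contain:

share_name01/**
share_name02/**

and the command becomes:

rclone sync -v "/mnt/user/" "secure:backup/" --bwlimit "07:00,3M 23:00,off" --include-from /boot/config/rclone_shares.txt -n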

Edited by Stupifier
  • Like 1
Link to comment
7 hours ago, Stupifier said:

Best thing to do in these situations is to test by including the -n flag in whatever rclone command you use. That flag is for "dry runs".........it makes it extremely easy to test and do trial and error until you figure out the exact command that does it for you.

You are on the right track though with the includes/excludes flags. Sorry, I don't have the patience to tell you EXACTLY how to do what you want but.....you should be able to do something like this:
 


rclone sync -v "/mnt/user/" "secure:backup/" --bwlimit "07:00,3M 23:00,off" --include "share_name01/**" --include "share_name02/**" -n

Just look at that closely and reference back to the "rclone filtering" help pages....I did not test that command..but again....I added the -n flag there so you can trial and error until you get it right!

 

That's fantastic and more than helpful, thank you very much 🙂    I don't really like being spoon-fed as I like to learn and figure out stuff by myself as much as possible.   Will give it a go, especially with the -n flag.

 

Thanks again!

Link to comment
  • 2 weeks later...

Hi All,

 

I am seeing a strange thing and wondering if I am missing a setting or a flag or something. 
 

I recently set up rclone to move files from a seedbox server to a local server over SFTP. Files get added into the directory on the seedbox server, and I've set up a script that checks every so often for new files and starts the move process. Smaller files transfer without issues, but larger files (10GB+) seem to slow down after about 10 minutes of transferring, and my CPU utilization stays high.
 

The syntax I’m using is:

rclone move (Seedbox):(remote directory location) (local directory location) --min-age 1m --delete-empty-src-dirs --stats-one-line -P --stats 5s
 

It doesn't matter whether the command is issued from the CLI or through the script; the same thing happens, where it goes from ~250 Mbps down to ~50 Mbps down to ~5 Mbps until it finishes.
 

If anyone has any insight it would be appreciated.

Link to comment
1 minute ago, IKWeb said:

Not sure if I am making a Noob mistake here, I have installed the plugin and then SSH into my UNRAID server and run rclone config but I get an error that says 

 

-bash: rclone: command not found
 

What am I missing? 

That would suggest the plugin didn't install correctly. Have you tried reinstalling it? Does it show any errors during install? Does the plug-in show up in the web gui?

Link to comment
5 minutes ago, Waseh said:

That would suggest the plugin didn't install correctly. Have you tried reinstalling it? Does it show any errors during install? Does the plug-in show up in the web gui?

Looks like I made a mistake and installed the wrong version. Now when I try and install it I get this. 

----------Stable Branch installed----------
Uninstall Stable branch to install Beta!
plugin: run failed: /bin/bash retval: 1

 

I need to work out how to uninstall that version and then try again. 
 

Link to comment
3 minutes ago, IKWeb said:

OK, that's finally removed. Can I ask @Waseh, do you know if I want to set up Amazon S3, do I need to boot the server into GUI mode?

Whenever asked....always try to authenticate "headless". The whole requirement to do all of this in GUI mode is extremely outdated (if you're talking about a SpaceInvaderOne guide). I don't think you should ever have to boot into GUI mode to do any rclone config setups.
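For what it's worth, S3-type remotes don't involve a browser at all -- "rclone config" just asks for the access key and secret key. For the OAuth-based remotes (Google Drive, OneDrive, etc.) the usual headless flow is roughly this (a sketch from memory, check the rclone docs for your exact remote):

# On the headless Unraid box, inside "rclone config", answer "n" to "Use auto config?"
# Then, on any machine that has a browser and rclone installed:
rclone authorize "drive"
# Sign in through the browser it opens, then copy the token it prints
# and paste it back into the waiting prompt on the Unraid box.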

Link to comment
1 minute ago, Stupifier said:

Whenever asked....always try to authenticate "headless". The whole requirement to do all of this in GUI mode is extremely outdated (if you're talking about a SpaceInvaderOne guide). I don't think you should ever have to boot into GUI mode to do any rclone config setups.

Thank you @Stupifier - will try it out via terminal and see how I get on.

It was the SpaceInvader video I watched - but I am aware that is now something like 3 years out of date.

Link to comment

OK, so I got it installed, configured, and set up as a share that can be viewed. When I try to copy something to it, though, I get these errors. Any idea what I have done wrong?
[screenshot of the copy errors]

 

Now this is way beyond my programming knowledge - I am a Windows Server guy 🙂

I have added this flag to my mount script: --vfs-cache-mode writes

So my mount script looks like this:


rclone mount --max-read-ahead 1024k --allow-other --vfs-cache-mode writes amazons3:urvault /mnt/disks/amazon &

 

Does this look right? Files now copy up without any error, but they stick at 99% for a few minutes and then finish. The test file I tried was 300MB and it stopped at 99% for about 2 minutes.

Edited by IKWeb
Link to comment
OK, so I got it installed, configured, and set up as a share that can be viewed. When I try to copy something to it, though, I get these errors. Any idea what I have done wrong?
[screenshot of the copy errors]
 
Now this is way beyond my programming knowledge - I am a Windows Server guy

I have added this flag to my mount script: --vfs-cache-mode writes

So my mount script looks like this:

rclone mount --max-read-ahead 1024k --allow-other --vfs-cache-mode writes amazons3:urvault /mnt/disks/amazon &
 
Does this look right? Files now copy up without any error, but they stick at 99% for a few minutes and then finish. The test file I tried was 300MB and it stopped at 99% for about 2 minutes.
Research "rclone copy" and/or "rclone sync" commands.

Typically, "rclone mount" is really optimized only for reads, not so much writes.
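For example, instead of dragging files onto the mount, something along these lines uploads straight to the remote. The source path is just a placeholder, and I'm reusing the "amazons3:urvault" remote from your mount script:

# Upload the contents of a local folder to the S3 remote, showing progress
rclone copy -P /mnt/user/some_share amazons3:urvault/some_share
# Use "rclone move" instead if the local copies should be deleted after a successful upload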
  • Like 1
Link to comment

Nice plugin! I'm gonna try it out and see if I can put it to use for storing backups to Google Drive and maybe eventually media files.

 

I'm curious if there are long-term users of this plugin / Google Drive who have also stored (encrypted) media on Gdrive, and how that has been working out for them. Will Google eventually delete your data? I assume they wouldn't know it is media as long as you encrypt it, so you would be safe, right?

 

Link to comment

I have Google Drive and an encrypted folder mounted as google and secure. I set this up following Spaceinvaderone's tutorial. The problem I am having is with the unmount script: 'fusermount -u /mnt/disks/secure' returns 'Invalid argument' and can't properly unmount the secure folder. How can I fix this error?

Link to comment
