thomast_88

[Support] Rclone (beta)


Docker for Rclone - a command line program to sync files and directories to and from various cloud services.

 

Application Name: Rclone

Application Site: http://rclone.org/

Docker Hub: https://hub.docker.com/r/tynor88/rclone/

GitHub: https://github.com/tynor88/docker-rclone/tree/dev

unRAID Template: https://github.com/tynor88/docker-templates

Setup/Configuration: http://rclone.org/docs/

Supported Cloud Services:

  • Google Drive
  • Amazon S3
  • Openstack Swift / Rackspace cloud files / Memset Memstore
  • Dropbox
  • Google Cloud Storage
  • Amazon Drive
  • Microsoft OneDrive
  • Hubic
  • Backblaze B2
  • Yandex Disk
  • The local filesystem

Creating the initial config file (.rclone.conf)

docker exec -it Rclone rclone --config="/config/.rclone.conf" config
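Running that command starts rclone's interactive config wizard and writes the result to /config/.rclone.conf. As a rough sketch of what to expect (the remote name "remote" and the Google Drive type are just examples, and the token is filled in during the OAuth step):

```shell
# Start the interactive config wizard inside the running container
# ("Rclone" is the container name from the unRAID template):
docker exec -it Rclone rclone --config="/config/.rclone.conf" config

# The wizard writes one section per remote to /config/.rclone.conf,
# roughly like this (illustrative values only):
#
#   [remote]
#   type = drive
#   client_id =
#   client_secret =
#   token = {"access_token":"...","expiry":"..."}
```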

 

Feel free to post any questions/issues/requests relating to this docker in this thread.

 

<CAUTION>This is still a beta so don't expect everything to work. Currently I have it running with Amazon Cloud Drive + Dropbox sync, and it's working fine.</CAUTION>


Mounting encrypted volumes in this docker to the host or other dockers is not supported. This is not a missing feature in this docker container but a limitation in docker itself.

 

To mount encrypted volumes to the host or other Docker containers, please use the new docker: Rclone-mount.



Thank you, working great.

 

Is there any way to add multiple folders to sync?

Maybe the best approach is to use cron to call a script that we can add our own commands to?

 

Have a file like this

rclone copy /pictures remote:pictures
rclone copy /documents remote:documents
rclone copy /video remote:video

etc..

And then we just add the paths as needed.

 

Also, is there any way to set the cron script to run less frequently? I don't need it every minute; every hour is okay for me. Right now it is just spamming my log. Maybe a variable, in plain cron format, with a note like "Edit at your own risk, default value: */1 * * * *".

What do you think?


@thomast_88  Thanks! This is awesome. I'll try it this weekend. I echo Bjonness406's comments. Both requests would be things I'd like to see added, if possible.


Also, is there any way to set the cron script to run less frequently? I don't need it every minute; every hour is okay for me. Right now it is just spamming my log. Maybe a variable, in plain cron format, with a note like "Edit at your own risk, default value: */1 * * * *".

What do you think?

 

I actually already implemented that - I just didn't update the template. Pass the variable CRON_SCHEDULE with a cron expression that suits you, e.g. 0 * * * * for every hour. Every minute is a bit overkill - I might change the default to 1 hour anyway.

 

Update: I updated the template now, you should be able to see it if you tick "Advanced View" when setting up the container.
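For anyone setting the container up by hand rather than through the template, the same variable can be passed on the docker command line; the volume paths below are placeholders for your own setup:

```shell
# Run the container with an hourly sync instead of the every-minute default.
# CRON_SCHEDULE uses standard cron fields:
# minute hour day-of-month month day-of-week
docker run -d --name Rclone \
  -e CRON_SCHEDULE="0 * * * *" \
  -v /mnt/user/appdata/rclone:/config \
  -v /mnt/user/sync:/data \
  tynor88/rclone
```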

 



We definitely need multiple folders, I just haven't come up with a smart way to implement it yet. Having a file like the one @Bjonness406 is suggesting would work, but it's not that user-friendly. I'd prefer a way through variables. I'll think about it and do some more work on it this weekend.

 

Custom commands would be nice as well - I see you are using 'copy' while I'm using 'sync' :-)

 

If you come up with anything, let me know. You can follow the development of the container in the dev branch.



Well, I don't think Rclone is super user-friendly at all, since it doesn't have a GUI.

 

What about having a file where you can add your own commands, but add some start examples?

Like this:

 

# Remove the "#" from the lines you want to use.
# Change "remote" to whatever you named your remote when setting up the config file (Onedrive, Amazon, Google, backup2, etc.)

#rclone copy /pictures remote:pictures # Copies the files from "/pictures" to "pictures" on the remote.
#rclone copy /documents remote:documents
#rclone copy /video remote:video

#rclone sync /example remote:example # Syncs the content between both destinations; needs RW permission.

Or you could have a variable called "Remote" where you specify the name of your remote.

 

If you do this, I can write a guide on how to set up the config file from a Windows PC at least (probably Mac and Linux too), and then how to add it correctly to the rclone commands. (I will write it for OneDrive since that's what I use, but it will probably be the same for most of the others too.)


How about using the User Scripts plugin to send a command to your docker to execute the folder copy? You can put multiple commands in the user script itself, and you can run User Scripts on its built-in scheduler.



Wouldn't that interfere with the Docker philosophy? :)



Thanks for the suggestions. For now I've made multiple commands/folders really simple (but it works...).

 

There is a new environment variable you can pass called SYNC_COMMAND, which basically overrides the default sync command. So now you can do:

 

SYNC_COMMAND="rclone sync /data remote:/data && rclone sync /data2 remote:/data2"
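A side note on the && chaining, sketched here with plain shell (no rclone involved): each subsequent command only runs if the previous one exited successfully, so one failed sync stops the rest of the chain instead of silently continuing.

```shell
# '&&' only runs the right-hand command when the left-hand one exits 0.
mkdir -p /tmp/rclone_demo
true  && echo done > /tmp/rclone_demo/first.txt     # 'true' succeeds, file is written
false && echo done > /tmp/rclone_demo/second.txt \
      || echo "second step skipped: previous command failed"

ls /tmp/rclone_demo    # only first.txt exists
```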

 

You cannot use variables inside the custom sync command. I've not figured out how to expand them yet :-).

 

Let me know how it works.


Let me know how it works.

Working fine :D

 

I'd still prefer to either use the "built-in command" (the one you set up), or make your own file with commands. Nine commands in that little text field gets very cluttered :P

If not, I can live with it.

 

Thanks!


I assume that if I set a configuration password (which encrypts the config file), this would likely cause the docker to fail? I guess I'll try and report back.

 

Edit: yup, doesn't work ;D

2016/09/26 03:00:01 Failed to read password: inappropriate ioctl for device



Can you try doing this and see if it works (taken from the rclone documentation):

 

If it is safe in your environment, you can set the RCLONE_CONFIG_PASS environment variable to contain your password, in which case it will be used for decrypting the configuration.

 

If you are running rclone inside a script, you might want to disable password prompts. To do that, pass the parameter --ask-password=false to rclone. This will make rclone fail instead of asking for a password if RCLONE_CONFIG_PASS doesn’t contain a valid password.

 

UPDATE: I just tested by passing the RCLONE_CONFIG_PASS environment variable to the container, and it decrypts the config file fine :D I'll add this to the documentation.
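For reference, a sketch of passing the variable when creating the container (in the unRAID UI this is an extra environment variable; the password value and volume path are placeholders for your own setup):

```shell
# Supply the config password so rclone can decrypt /config/.rclone.conf
# without an interactive prompt:
docker run -d --name Rclone \
  -e RCLONE_CONFIG_PASS="your-config-password" \
  -v /mnt/user/appdata/rclone:/config \
  tynor88/rclone
```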


Really excited to test this. I've been looking for a way to move my Plex library to play from the cloud, encrypted, and this might be it.


I'm confused how to set this up with Amazon Cloud Drive.

 

I'm assuming I would want Amazon to be the sync destination, but how would I add it? I mean, it's not mounted yet.



Everything you need appears to be in the OP: the link to the rclone docs (which provide a breakdown of how to configure each cloud service) and the command to execute to write the config file.


I think this does exactly what I want, which is to back up my files to Amazon Cloud Drive, encrypt them, and allow me to mount the drive.

 

I've done the initial configuration (I think) and I'm trying to do a --dry-run, but I can't work out how to do it. If I want to do a dry run with the test folder /mnt/user/test (I put /mnt/user/ as the data path in the docker), syncing to an 'unraid' folder on acd remote 'secret' (creating unraid/test on acd), I think I type:

 

rclone sync --dry-run /test secret:unraid

 

For the life of me I can't get this to work via SSH. Do I have to put something in front of rclone sync, like:

 

docker exec -it Rclone rclone sync --dry-run  /test acd:unraid

 

Sorry for the rookie question; I'm not familiar with SSH, but once someone explains how to run the rclone commands on unRAID I think I'll be able to follow the rclone documentation for the rest.

 

Thanks in advance

 



Do you get an error message of some sort?



As I mentioned earlier, you can override the default sync command by specifying the variable SYNC_COMMAND.

 

So try:

SYNC_COMMAND="rclone sync --dry-run /test acd:/unraid"

 

I assume you have mounted /test correctly and set up the Amazon Cloud Drive destination with the name "acd".

 

Show the log output after this has run.
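A tip while testing: you don't have to wait for cron at all; a one-off dry run can be fired from the host, and adding -v makes rclone log per-file detail so you can confirm the paths and remote name are right:

```shell
# One-off verbose dry run inside the container, without touching the schedule
# (container name "Rclone" as in the OP):
docker exec -it Rclone rclone --config="/config/.rclone.conf" \
  sync -v --dry-run /test acd:/unraid
```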



Thanks for the help. Log attached:

 

[s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] 10-adduser: executing...

GID/UID
-------------------------------------
User uid: 911
User gid: 911
-------------------------------------

[cont-init.d] 10-adduser: exited 0.
[cont-init.d] 40-config: executing...
crontab => 0 * * * * /app/rclone.sh
[cont-init.d] 40-config: exited 0.
[cont-init.d] done.
[services.d] starting services
[services.d] done.

 

What I've done so far:

1. installed docker with /data <-- /mnt/user and sync <-- /mnt/user/Backup

2. set up remote 'acd' and then remote 'secret' by choosing option 5 to encrypt/decrypt a remote and choosing 'acd'; added encryption password and salt password

3. tried to sync /mnt/user/Max to the unraid folder I've set up on acd as a test. Tried:

 

SYNC_COMMAND="rclone sync /Max secret:unraid"

 

and

 

SYNC_COMMAND="rclone sync /data/Max secret:unraid"

 

No errors in PuTTY, but nothing appears in the unraid folder on acd.


Ahh, just spotted a missing forward slash on /unraid. One more go.

 

Update: no change. Tried all of the below from the PuTTY prompt:

 

SYNC_COMMAND="rclone sync /Max secret:/unraid"
SYNC_COMMAND="rclone sync /data/Max secret:/unraid"
SYNC_COMMAND="rclone sync /Max acd:/unraid"
SYNC_COMMAND="rclone sync /data/Max acd:/unraid"

 



The command only gets executed every hour. You can change "Cron Schedule:" to the value below, then wait a minute and see what the output of the command is.

Cron Schedule: * * * * *



Getting warning messages now:

 

2016/09/27 13:50:00 Attempt 1/3 failed with 1 errors and: mkdir /mnt/user: permission denied
2016/09/27 13:50:00 Attempt 2/3 failed with 1 errors and: mkdir /mnt/user: permission denied
2016/09/27 13:50:00 Attempt 3/3 failed with 1 errors and: mkdir /mnt/user: permission denied
2016/09/27 13:50:00 Failed to sync: mkdir /mnt/user: permission denied

 

I've changed /data <-- /mnt/user to /data <-- /mnt/user/Media to see if that helps, and I'm trying

 

SYNC_COMMAND="rclone sync /data/Music secret:/unraid"
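A hedged guess about the mkdir /mnt/user errors above: only the container-side paths exist inside the container, so any sync command referencing the host path /mnt/user/... will fail; it has to use /data/... instead. A quick way to verify what the container actually sees:

```shell
# List the mapped mount point from inside the container
# (container name "Rclone" as used earlier in the thread):
docker exec Rclone ls /data

# The sync command should then reference the container-side path, e.g.:
#   SYNC_COMMAND="rclone sync /data/Music secret:/unraid"
```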

 



Can you post a screenshot of the docker configuration page with "Advanced View" enabled?

