[Support] Rclone (beta)



I would still prefer to either use the "built-in command" (the one you set up) or make your own file with commands. 9 commands in that little text file gets very cluttered :P

Nvm, you can add the path to your own script to the "Sync Command" variable; you just need to run "chmod +x" on it.

Added a request to have "chmod +x /config/Rclone.sh" run automatically, so it's easier for users to add their own file.
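For anyone who wants to try a custom script in the meantime, a minimal sketch might look something like this (paths and remote name are just examples, adjust them to your own setup):

#!/bin/sh
# /config/Rclone.sh - example custom sync script
# Dry-run first so nothing is uploaded until the output looks right
rclone --config="/config/.rclone.conf" sync --dry-run /data/Documents acd:/unraid/Documents
rclone --config="/config/.rclone.conf" sync --dry-run /data/Photos acd:/unraid/Photos

Then make it executable with: chmod +x /config/Rclone.sh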

Link to comment

Can you post a screenshot of the docker configuration page with "Advanced View" enabled?

 


 

hmm didn't seem to embed http://screenshot.co/#!/e9b3388263

 

In Data Path you specified "/mnt/user/". That means the entire directory will be backed up. Are you sure about this?

 

In "Sync Destination" you need yo write the destination to the REMOTE specified when you made the configuration for Amazon Cloud Drive. E.g. "acd"

 

In Sync Command write: "rclone sync --dry-run /data acd:/unraid"

 

This will back up the entire contents of /mnt/user/ to a folder called "unraid" on Amazon Cloud Drive.
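Once the dry-run output looks right, the same command without the flag does the actual transfer:

rclone sync /data acd:/unraid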

Link to comment

I'm confused how to set this up with Amazon Cloud Drive.

 

I'm assuming I would want Amazon to be the sync destination but how would I add it? I mean, it's not mounted yet.

 

Everything you need appears to be in the OP -  the link to the rclone docs (which then provides a breakdown of how to configure each cloud service) and then the command to execute to write the config file.

 

I'll take another look. It was 1AM when I was looking at it.

Link to comment

Is the config wrong by default? It won't run because it cannot find /root/.rclone.conf, but the docker image defaults this directory to /config.

Is that just me?

 

Read the OP, in particular this section:

Creating the initial config file (.rclone.conf)

docker exec -it Rclone rclone --config="/config/.rclone.conf" config

 

It'd be pretty nice though if the container could automagically figure out your preferred cloud service, username and password ;)

Link to comment

Is the config wrong by default? It won't run because it cannot find /root/.rclone.conf, but the docker image defaults this directory to /config.

Is that just me?

 

Read the OP, in particular this section:

Creating the initial config file (.rclone.conf)

docker exec -it Rclone rclone --config="/config/.rclone.conf" config

 

It'd be pretty nice though if the container could automagically figure out your preferred cloud service, username and password ;)

 

I didn't really look at the command, but that's what I copied and pasted, and it still gave me the error. I think it was a fluke though - it worked when I reinstalled.

Link to comment

 

In Data Path you specified "/mnt/user/". That means the entire directory will be backed up. Are you sure about this?

 

 

I thought I could specify sub-folders with the SYNC_COMMAND? Or do I have to create /data2 for /mnt/user/folder1, /data3 for /mnt/user/folder2, etc.?
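(I was assuming something along these lines would work, since /data maps to /mnt/user/: rclone sync /data/folder1 acd:/unraid/folder1)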

 

 

In "Sync Destination" you need yo write the destination to the REMOTE specified when you made the configuration for Amazon Cloud Drive. E.g. "acd"

 

 

Fixed - I've now gone with 'crypt'.

 

 

In Sync Command write: "rclone sync --dry-run /data acd:/unraid"

 

I'm still getting no files added to Amazon. I'm using a small test folder, /mnt/user/Max, mapped to /data, and:

 

SYNC_COMMAND="rclone sync /data crypt:/unraid"

 

The logs keep saying files are being transferred, but when I log into Amazon Cloud Drive nothing is there.

logs.txt

Link to comment

apologies for monopolising this thread, but this is confusing the hell out of me.

 

I have the following remotes created:

 

Current remotes:

Name                 Type
====                 ====
acd                  amazon cloud drive
secret               crypt

 

After installing and setting up remotes,

 

- if I put 'acd' in the docker sync destination field, my files upload unencrypted to the root folder of Amazon

- if I put 'secret' in, the files get encrypted but saved in /mnt/user/appdata/Rclone/acd and not uploaded!!

 

How do I get encrypted files synced to Amazon?  Thanks

 

 

rclone.txt

Link to comment

apologies for monopolising this thread, but this is confusing the hell out of me.

 

I have the following remotes created:

 

Current remotes:

Name                 Type
====                 ====
acd                  amazon cloud drive
secret               crypt

 

After installing and setting up remotes,

 

- if I put 'acd' in the docker sync destination field, my files upload unencrypted to the root folder of Amazon

- if I put 'secret' in, the files get encrypted but saved in /mnt/user/appdata/Rclone/acd and not uploaded!!

 

How do I get encrypted files synced to Amazon?  Thanks

 

I'm having this issue as well, but I don't know if the encrypted files are getting saved anywhere. When I copy unencrypted it takes a few seconds, completes fine, and the files are there; but if I try it encrypted it immediately reports success and nothing actually happens.

Link to comment

One other question. I am thinking about using this and testing it with Plex, encrypting my data and storing it remotely for Plex to stream. However I am not sure if this is possible because this is a docker image. Is there any way to mount the remote drive unencrypted for the Plex docker to see?

 

once I've got the encryption working, I'm going to have a crack at this:

 

http://rclone.org/commands/rclone_mount/
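Presumably something along these lines (untested - remote name and mount point are just examples):

mkdir -p /data/acd-mount
rclone mount --config=/config/.rclone.conf --allow-other secret: /data/acd-mount &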

Link to comment

One other question. I am thinking about using this and testing it with Plex, encrypting my data and storing it remotely for Plex to stream. However I am not sure if this is possible because this is a docker image. Is there any way to mount the remote drive unencrypted for the Plex docker to see?

 

once I've got the encryption working, I'm going to have a crack at this:

 

http://rclone.org/commands/rclone_mount/

 

Yeah, but what I'm trying to wrap my head around is whether that mount inside the docker will be available to unRAID, and if so, whether it will be seen decrypted. I'm sure the mount can be seen; it's the second part I'm concerned about.

Link to comment

How do I get to the browser step to authorize rclone with ACD? I'm stuck at:

If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth
Log in and authorize rclone for access
Waiting for code...

 

I've swapped out the IP for the server's, and the docker is set to bridge mode, and still nothing. It's probably something obvious that I'm just not seeing.

Link to comment

How do I get to the browser step to authorize rclone with ACD? I'm stuck at:

If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth
Log in and authorize rclone for access
Waiting for code...

 

I've swapped out the IP for the server's, and the docker is set to bridge mode, and still nothing. It's probably something obvious that I'm just not seeing.

 

Yeah, it is obvious you missed it. We all do from time to time. Choose the option where it says "Say N if you are working on a remote or headless machine".

Link to comment

How do I get to the browser step to authorize rclone with ACD? I'm stuck at:

If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth
Log in and authorize rclone for access
Waiting for code...

 

I've swapped out the IP for the server's, and the docker is set to bridge mode, and still nothing. It's probably something obvious that I'm just not seeing.

 

Yeah, it is obvious you missed it. Choose the option where it says "Say N if you are working on a remote or headless machine".

too trigger happy I guess.  Thanks!

Link to comment

How do I get to the browser step to authorize rclone with ACD? I'm stuck at:

If your browser doesn't open automatically go to the following link: http://127.0.0.1:53682/auth
Log in and authorize rclone for access
Waiting for code...

 

I've swapped out the IP for the server's, and the docker is set to bridge mode, and still nothing. It's probably something obvious that I'm just not seeing.

 

Yeah, it is obvious you missed it. Choose the option where it says "Say N if you are working on a remote or headless machine".

too trigger happy I guess.  Thanks!

 

No sweat, we all do it. I tried the same way you did even though I knew it was headless, thinking I could just substitute the IP. You'll need to do the auth part on another non-headless machine with rclone though. Easy enough, and there is a Windows version.
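If I remember right, the flow is: answer N when rclone asks about automatic config, then on the machine with a browser run

rclone authorize "amazon cloud drive"

and paste the token it prints back into the config prompt on the server.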

Link to comment

apologies for monopolising this thread, but this is confusing the hell out of me.

 

I have the following remotes created:

 

Current remotes:

Name                 Type
====                 ====
acd                  amazon cloud drive
secret               crypt

 

After installing and setting up remotes,

 

- if I put 'acd' in the docker sync destination field, my files upload unencrypted to the root folder of Amazon

- if I put 'secret' in, the files get encrypted but saved in /mnt/user/appdata/Rclone/acd and not uploaded!!

 

How do I get encrypted files synced to Amazon?  Thanks

 

You're not still running with the --dry-run flag, are you?

 

Can you post the config file (without your credentials, of course)?

 

For me, encryption to ACD is working perfectly. I've tested both with and without filename encryption.
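For comparison, a crypt setup that works typically looks roughly like this (names are just examples; note that the crypt remote points at the underlying remote with a colon and, optionally, a folder):

[acd]
type = amazon cloud drive
client_id =
client_secret =
token = {...}

[secret]
type = crypt
remote = acd:encrypted
filename_encryption = standard
password = ***
password2 = ***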

Link to comment

apologies for monopolising this thread, but this is confusing the hell out of me.

 

I have the following remotes created:

 

Current remotes:

Name                 Type
====                 ====
acd                  amazon cloud drive
secret               crypt

 

After installing and setting up remotes,

 

- if I put 'acd' in the docker sync destination field, my files upload unencrypted to the root folder of Amazon

- if I put 'secret' in, the files get encrypted but saved in /mnt/user/appdata/Rclone/acd and not uploaded!!

 

How do I get encrypted files synced to Amazon?  Thanks

 

You're not still running with the --dry-run flag, are you?

 

Can you post the config file (without your credentials, of course)?

 

For me, encryption to ACD is working perfectly. I've tested both with and without filename encryption.

I took --dry-run out and used a test directory for /data to see if it worked.

 

Here are my files.

 

Config:

 

[acd]
type = amazon cloud drive
client_id = 
client_secret = 
token = {"DELETED"}

[secret]
type = crypt
remote = acd
filename_encryption = standard
password = autogenerated
password2 = autogenerated

 

within rclone:

 

Current remotes:

Name                 Type
====                 ====
acd                  amazon cloud drive
secret               crypt

 

docker settings:

 

http://screenshot.co/#!/82ca7f6eb1

 

Screenshot of the encrypted files going to appdata, not Amazon Cloud Drive:

 

http://screenshot.co/#!/5b19624ac0

 

Update: 

 

I'm going to leave it running for a bit longer than usual, as it might be using the appdata folder to store files while they are being encrypted and uploaded. If that's the case I'll have to install it in a different folder, as my appdata share is set to cache-only and won't have enough space.

logs.txt

Link to comment

I'm going to leave it running for a bit longer than usual, as it might be using the appdata folder to store files while they are being encrypted and uploaded. If that's the case I'll have to install it in a different folder, as my appdata share is set to cache-only and won't have enough space.

 

I don't think so. I tried backing up 600GB from a non-cache share and it went fine despite my cache drive having only 70GB of free space. Something else is wrong.

 

Your configuration looks fine to me. Let me know what happens if you let it run for a bit longer.

 

EDIT: Can you try setting the config path to /mnt/cache/appdata/rclone/config/

 

Sometimes docker apps have problems with the "user" shares.
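(The docker run equivalent would be a volume mapping along the lines of -v /mnt/cache/appdata/rclone/config:/config, assuming the container side stays /config as in the OP.)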

Link to comment


 

EDIT: Can you try setting the config path to /mnt/cache/appdata/rclone/config/

 

Sometimes docker apps have problems with the "user" shares.

 

I had this problem with another docker so I thought you had the solution, but it still didn't work  :(

 

I really hope someone cracks this, as this would be a great addition to my unRAID setup.

Link to comment

One other question. I am thinking about using this and testing it with Plex, encrypting my data and storing it remotely for Plex to stream. However I am not sure if this is possible because this is a docker image. Is there any way to mount the remote drive unencrypted for the Plex docker to see?

 

once I've got the encryption working, I'm going to have a crack at this:

 

http://rclone.org/commands/rclone_mount/

 

I just tested the mount function and it works for encrypted remotes. However, I can only see the mounted content within the container; outside the container I cannot see it. What could be causing that?

Link to comment
