[Plugin] rclone


So I managed to get an encrypted folder.  Still playing around with it.

 

So everything I put in my shared folder is only stored on the cloud and not locally?  Just trying to understand the best way to utilize this with my existing shares.  Any best practices? 


It seems that Amazon is throwing a fit about how long the encrypted file names are. Is there anything I can do to shorten the file names without having to change the filenames of the data I am backing up through Rclone? 


So after playing with this for a bit I figured out how to sync it to Amazon. If I run a command like this:

 

rclone sync --transfers=10 --bwlimit 5M '/mnt/user/Console/Atari.2600/' encrypted:'Console/Atari.2600'

 

It successfully writes the files to the encrypted drive, but I don't see any of the files in the local mount.  If I put the files in the local mount they get uploaded and I can see them.  If I do a copy from /mnt/user/Console/Atari.2600/ to /mnt/disks/Console/Atari.2600/ it copies the files, but I get a lot of file system errors.  I definitely think rclone is best for syncing; I just don't understand why I can't see the files in my local mount.
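One way to narrow this down (using the remote from the sync command above; the mountpoint /mnt/disks/Console is an assumption based on the paths mentioned) is to list the remote directly, bypassing the mount, and then remount so the mount re-reads the remote:

```shell
# List what is actually on the crypt remote, independent of the FUSE mount.
rclone ls encrypted:'Console/Atari.2600'

# If the files show up there but not in the mount, the mount is serving a
# stale directory listing; unmounting and remounting forces a fresh read.
fusermount -u /mnt/disks/Console
rclone mount --allow-other encrypted: /mnt/disks/Console &
```

These commands assume a configured remote named encrypted: and an existing mountpoint, so they're a sketch of the debugging approach rather than something to paste verbatim.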


Is this the correct place to put the "-L" flag?

 

rclone mount --max-read-ahead 1024k --allow-other -L amazon: /mnt/disks/Amazon &

 

I only added the "-L"; this works fine without it. I just want to make sure that adding the "-L" where I put it will allow rclone to follow symlinks properly.


I updated the stable branch 2 hours ago so 1.36 should be live :)


I am looking (and I've read this thread and the docker thread) at installing rclone to back up all my important files to Amazon Drive, but one thing I can't seem to get my head around: what happens in the event of accidental deletion/corruption of local files, or being infected with ransomware? From my reading, rclone makes an exact copy of the local folder you point it at. If this is the case, wouldn't it also delete/rename any backup versions when it next runs?

 

Also, is there any sort of version control built in?

 

Thanks!

 

[EDIT] So I think I've answered my own questions. No versioning control and if local files get nixed, backups do as well?... If this is the case, what is the purpose of this exactly? I suppose having a cloud version of all your files for ease of access?... I feel I'm missing something here...

 

Perhaps I should reword this to: "What does running rclone do for you?" :)

1 hour ago, DoeBoye said:

[EDIT] So I think I've answered my own questions. No versioning control and if local files get nixed, backups do as well?... If this is the case, what is the purpose of this exactly? I suppose having a cloud version of all your files for ease of access?... I feel I'm missing something here...

 

Perhaps I should reword this to: "What does running rclone do for you?" :)

 

 

If you use rclone sync, it will make your cloud storage provider match your local folder, so deletions are mirrored too.  But there is an easy way to get around this: just use rclone copy, and any file you delete locally is not deleted from the cloud.  Copy would still overwrite a file if it was changed, though.  So no protection from ransomware.

 

There is a backup command that is being discussed.

https://forum.rclone.org/t/whats-the-latest-on-the-backup-command-development/632/5
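The sync/copy distinction described above can be sketched with two commands (the paths and remote name here are illustrative, not from anyone's actual setup):

```shell
# sync makes the destination an exact mirror of the source:
# a file deleted locally is also deleted on the remote on the next run.
rclone sync /mnt/user/Documents amazon:Documents

# copy only adds new files and updates changed ones; it never deletes on
# the remote, so files deleted locally survive in the cloud copy.
rclone copy /mnt/user/Documents amazon:Documents
```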

 


Most cloud providers have some kind of version control. If you get infected with ransomware and the files get overwritten just restore them from the deleted files on your cloud provider. There are ways to do what you need. You just need to plan it out.

5 minutes ago, hugenbdd said:

If you use rclone sync, it will make your cloud storage provider match your local folder, so deletions are mirrored too.  But there is an easy way to get around this: just use rclone copy, and any file you delete locally is not deleted from the cloud.  Copy would still overwrite a file if it was changed, though.  So no protection from ransomware.

 

Ooooh! That sounds better!.... Does that mean though that the entirety of the folder being watched would get uploaded every time, or does it still look at modified date etc?

 

also, maybe you would get ransomware protection, because wouldn't local files that were renamed by ransomware appear to be new files to rclone (if using the copy function)?

6 hours ago, DoeBoye said:

 

Ooooh! That sounds better!.... Does that mean though that the entirety of the folder being watched would get uploaded every time, or does it still look at modified date etc?

 

also, maybe you would get ransomware protection, because wouldn't local files that were renamed by ransomware appear to be new files to rclone (if using the copy function)?

I'm not sure how ransomware works.  I would assume if it changes the file name, then yes, you are good.

 

No, it should not re-upload files in a directory that have already been uploaded successfully.  However, if they change, then yes, they're uploaded and overwritten.
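One way to confirm this behaviour for yourself is --dry-run, which reports what rclone would transfer without actually uploading anything (paths illustrative):

```shell
# Nothing is transferred; rclone just logs the actions it would take.
# On a second run after a successful copy, unchanged files are skipped.
rclone copy --dry-run /mnt/user/Documents amazon:Documents
```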

20 hours ago, bobbintb said:

Most cloud providers have some kind of version control. If you get infected with ransomware and the files get overwritten just restore them from the deleted files on your cloud provider. There are ways to do what you need. You just need to plan it out.

Thanks for pointing that out! I hadn't really priced out options other than Amazon (because I already had Prime), but it looks like Crashplan and Backblaze both have unlimited packages priced around the same as Amazon's unlimited offer, and they provide version control (unlike Amazon).

 

The description of Rclone mentions Backblaze B2. Has anyone tried the Backblaze Personal Backup with Rclone?

6 hours ago, tr0lll said:

Any plans for a gui?

 

Cheers

 

I know it has been discussed here before. I don't think anything has come of it yet. It's been a while since I've looked into it. I know there are GUI frontends out there but I don't think any of them are webUIs yet. There are talks of official webUIs though.


It's not something I intend to implement. :)
The (very) basic GUI for the configuration and scripts is as far as I'm able (and willing ;) ) to take it. If anyone at any point wants to chip in, it would be much appreciated :)


Anyone else having issues with it just hanging at the end of a large upload to ACD? I've got --verbose set and I see this sort of thing -- I imagine it's been going on like this for hours, and the file itself doesn't actually show as uploaded in ACD.

 


2017/04/13 15:36:14 INFO : 
Transferred: 87.177 GBytes (848.749 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h55m1.2s
Transferring:
* ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:37:14 INFO : 
Transferred: 87.177 GBytes (848.276 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h56m1.2s
Transferring:
* ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:38:14 INFO : 
Transferred: 87.177 GBytes (847.805 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h57m1.2s
Transferring:
* ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:39:14 INFO : 
Transferred: 87.177 GBytes (847.333 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h58m1.2s
Transferring:
* ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:40:14 INFO : 
Transferred: 87.177 GBytes (846.861 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h59m1.2s
Transferring:
* ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:41:14 INFO : 
Transferred: 87.177 GBytes (846.392 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 30h0m1.2s
Transferring:
* ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

 


This plugin spawned from my initial docker which didn't support sharing the rclone FUSE mount. With Rclone-mount it's now possible.

 

I'm not trying to hijack this thread, just want to let people know that I've created a Docker, which can expose the FUSE mount to the host and other docker containers, for those people who prefer dockerizing.


I'm having a basic setup issue. I was following the YouTube setup guide made for this plugin: https://www.youtube.com/watch?v=-b9Ow2iX2DQ

and everything appears to work fine until I run the command to check whether the encrypted folder has been created on the Amazon drive. The encrypted folder, in this case named secure, shows up in rclone (if I type rclone config it shows the following):

 

root@Tower:/# rclone config  

Current remotes:

 

Name                 Type

====                 ====

EncryptedACD         crypt

amazon               amazon cloud drive

secure               crypt

testfolder           crypt

 

(all of the crypt remotes are my attempts to create a remote that actually includes a folder on the Amazon drive)

 

However if I type in 

 

rclone lsd amazon:

 

I get the following

 

root@Tower:/# rclone lsd amazon:

          -1 2016-11-19 09:46:59        -1 Plex Cloud Sync

          -1 2016-11-12 02:04:12        -1 Pictures

          -1 2016-11-12 02:04:12        -1 Videos

          -1 2016-11-12 02:04:12        -1 Documents

          -1 2017-05-03 18:50:10        -1 new

 

I checked to see if rclone was able to create folders on amazon: by typing

 

rclone mkdir amazon:new

 

As you can see above, this was successful.

 

Not sure what I'm doing wrong here. 

 

Edit: By manually creating the encrypted folder using rclone and then mounting it, I was able to confirm that everything is working as it should.
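For anyone hitting the same thing: a crypt remote wraps a folder on the underlying remote, and that folder won't show up in a plain listing until something is written through the crypt remote. A minimal sketch of the manual-creation step, assuming secure is configured to point at amazon:secure:

```shell
# Writing through the crypt remote creates its wrapped folder...
rclone mkdir secure:

# ...after which the secure folder appears in the underlying listing.
rclone lsd amazon:
```

The remote names here mirror the config output above; adjust to your own setup.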

 

 


Does anyone know if there is some temp location for files during a sync job between unRAID and ACD?

I'm getting spammed with "no space left on device" when trying to do a sync.

Failed to copy: Write /root/secure/test/xyz.avi: no space left on device

Files are a couple of gigs in size.

Setup is a VMware virtual machine on ESXi 6.5 running off a passed-through USB, with the disk controller passed through to that VM for storage.
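The error names a local path (/root/...), so the first thing to check is which local filesystem is actually full; as far as I know, rclone streams uploads for a plain sync rather than staging them in a temp directory. A quick check (mount points shown are generic examples, not from this setup):

```shell
# Show free space on the filesystems backing the relevant paths.
df -h / /tmp
```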


Just read the post, and it might not be all doom and gloom:

Quote

Update 2017-05-20

I have now heard back from Amazon. Rclone has been banned for having the encrypted secrets in the source code.

I've asked for new credentials so I can build an auth server and get rclone going again - I haven't heard back on that reply yet, but it is plausible we could get rclone running again with Amazon Drive in the not too distant future.

Thank you all for your patience

Nick

 


I used this plugin and followed the excellent video gridrunner made. It works great, except Plex can't see anything that is in the encrypted /Media/Disks/TVCloud/ directory, where Media is mapped to /mnt in the docker app. I also mapped it to /Media/Disks/MGDrive/TV/ and it sees the few things I have there. This is the unencrypted GDrive with a few media files for testing Plex Cloud. The other is the encrypted folder.

 

I can see the encrypted folder in the network share, and I can see and play the video files, but they are not seen when going through Plex. Any ideas?

 

Thanks

 

32 minutes ago, Aric said:

I used this plugin and followed the excellent video gridrunner made. It works great, except Plex can't see anything that is in the encrypted /Media/Disks/TVCloud/ directory, where Media is mapped to /mnt in the docker app. I also mapped it to /Media/Disks/MGDrive/TV/ and it sees the few things I have there. This is the unencrypted GDrive with a few media files for testing Plex Cloud. The other is the encrypted folder.

 

I can see the encrypted folder in the network share, and I can see and play the video files, but they are not seen when going through Plex. Any ideas?

 

Thanks

 

You'll have to post your settings. 

