RAINMAN Posted March 16, 2017 (edited) So I managed to get an encrypted folder. Still playing around with it. So is everything I put in my shared folder only stored on the cloud and not locally? Just trying to understand the best way to utilize this with my existing shares. Any best practices? Edited March 17, 2017 by RAINMAN
DazedAndConfused Posted March 17, 2017 (edited) It seems that Amazon is throwing a fit about how long the encrypted file names are. Is there anything I can do to shorten the file names without having to change the filenames of the data I am backing up through Rclone? Edited March 18, 2017 by DazedAndConfused
RAINMAN Posted March 18, 2017 So after playing with this for a bit I figured out how to sync it to Amazon. If I run a command like this: rclone sync --transfers=10 --bwlimit 5M '/mnt/user/Console/Atari.2600/' encrypted:'Console/Atari.2600' it successfully writes the files to the encrypted drive, but I don't see any of the files in the local mount. If I put the files in the local mount they get uploaded and I can see them. If I do a copy from /mnt/user/Console/Atari.2600/ to /mnt/disks/Console/Atari.2600/ it copies the files, but I get a lot of file system errors. I definitely think rclone is best to sync; I just don't understand why I can't see the files in my local mount.
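One likely explanation for the post above is that the FUSE mount caches directory listings, so files written to the remote by a separate sync process don't appear until the cache expires. A hedged sketch, reusing the remote name and paths from the post (the mount point and cache interval are illustrative, and the flag name is per the rclone mount docs — older versions may differ):

```shell
# Verify the files actually reached the remote, bypassing the mount's
# directory cache entirely:
rclone ls encrypted:Console/Atari.2600

# Remount with a shorter directory-cache window so changes made outside
# the mount show up sooner:
fusermount -u /mnt/disks/Console
rclone mount --dir-cache-time 5m encrypted: /mnt/disks/Console &
```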
DazedAndConfused Posted March 18, 2017 Is this the correct place to put the "-L" flag? rclone mount --max-read-ahead 1024k --allow-other -L amazon: /mnt/disks/Amazon & I only added the "-L"; this works fine without it. I just want to make sure that adding the "-L" where I put it will allow rclone to follow symlinks properly.
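Worth noting for the question above: per the rclone docs, -L is shorthand for --copy-links, which tells the *local filesystem* backend to follow symlinks, so it mainly matters on transfers whose source is local disk rather than on a mount of a cloud remote. A sketch (paths are illustrative):

```shell
# -L / --copy-links acts on the local side: symlinked files and
# directories under the source are uploaded as the things they point to.
rclone sync -L /mnt/user/data amazon:data

# Placement-wise, rclone accepts global flags anywhere on the command
# line; grouping them before the remote and mount point, as in the post,
# is the usual convention:
rclone mount --max-read-ahead 1024k --allow-other amazon: /mnt/disks/Amazon &
```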
Waseh Posted March 19, 2017 I updated the stable branch 2 hours ago so 1.36 should be live
SpaceInvaderOne Posted March 20, 2017 12 hours ago, Waseh said: I updated the stable branch 2 hours ago so 1.36 should be live @Waseh great, thank you.
DoeBoye Posted March 22, 2017 (edited) I am looking (and I've read this thread and the docker thread) at installing rclone to back up all my important files to Amazon Drive, but one thing I can't seem to get my head around: what happens in the event of accidental deletion/corruption of local files, or being infected with ransomware? From my reading, rclone makes an exact copy of the local folder you point it at. If this is the case, wouldn't it also delete/rename any backup versions when it next runs? Also, is there any sort of version control built in? Thanks! [EDIT] So I think I've answered my own questions. No version control, and if local files get nixed, backups do as well?... If this is the case, what is the purpose of this exactly? I suppose having a cloud version of all your files for ease of access?... I feel I'm missing something here... Perhaps I should reword this to: "What does running rclone do for you?" Edited March 22, 2017 by DoeBoye
hugenbdd Posted March 22, 2017 1 hour ago, DoeBoye said: [EDIT] So I think I've answered my own questions. No versioning control and if local files get nixed, backups do as well?... If this is the case, what is the purpose of this exactly? I suppose having a cloud version of all your files for ease of access?... I feel I'm missing something here... Perhaps I should reword this to: "What does running rclone do for you?" If you use rclone sync it will overwrite what's on your cloud storage provider. But there is an easy way to get around this: just use rclone copy. That way any file you delete locally is not synced up to the cloud. Copy would still overwrite a file if it was changed, though, so no protection from ransomware. There is a backup command that is being discussed. https://forum.rclone.org/t/whats-the-latest-on-the-backup-command-development/632/5
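The sync/copy distinction described above can be sketched as follows, reusing the encrypted remote name from earlier in the thread (the source path is illustrative):

```shell
# 'sync' makes the destination match the source exactly: files deleted
# locally are also deleted on the remote.
rclone sync /mnt/user/important encrypted:important

# 'copy' only adds and updates files, never deleting on the destination,
# so local deletions are not propagated (changed files are still overwritten).
rclone copy /mnt/user/important encrypted:important

# --dry-run previews what either command would do without touching data.
rclone sync --dry-run /mnt/user/important encrypted:important
```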
bobbintb Posted March 22, 2017 Most cloud providers have some kind of version control. If you get infected with ransomware and the files get overwritten just restore them from the deleted files on your cloud provider. There are ways to do what you need. You just need to plan it out.
DoeBoye Posted March 22, 2017 Share Posted March 22, 2017 5 minutes ago, hugenbdd said: If you use rclone sync it will overwrite what's on your cloud storage provider. But there is an easy way to get around this. Just use rclone copy that way any "file" you delete is not snyced up to the cloud. Copy would overwrite a file if it was changed though. So no protection from ransom ware. Ooooh! That sounds better!.... Does that mean though that the entirety of the folder being watched would get uploaded every time, or does it still look at modified date etc? also, maybe you would get ransomware protection, because wouldn't local files that were renamed by ransomware appear to be new files to rclone (if using the copy function)? Quote Link to comment
hugenbdd Posted March 23, 2017 6 hours ago, DoeBoye said: Ooooh! That sounds better!.... Does that mean though that the entirety of the folder being watched would get uploaded every time, or does it still look at modified date etc? also, maybe you would get ransomware protection, because wouldn't local files that were renamed by ransomware appear to be new files to rclone (if using the copy function)? I'm not sure how ransomware works. I would assume if it changes the file name, then yes, you are good. No, it should not upload the same files in a directory that have already successfully been uploaded. However, if they change, then yes, they're uploaded and overwritten.
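A quick way to see what rclone considers already uploaded, reusing the encrypted remote name from earlier posts (the paths are illustrative):

```shell
# 'check' compares source and destination (size, plus hashes where the
# backend supports them) without transferring anything:
rclone check /mnt/user/important encrypted:important

# a dry-run copy lists only the files rclone would actually upload,
# i.e. new files and files whose size/modtime no longer match:
rclone copy --dry-run /mnt/user/important encrypted:important
```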
DoeBoye Posted March 23, 2017 (edited) 20 hours ago, bobbintb said: Most cloud providers have some kind of version control. If you get infected with ransomware and the files get overwritten just restore them from the deleted files on your cloud provider. There are ways to do what you need. You just need to plan it out. Thanks for pointing that out! I hadn't really priced out options other than Amazon (because I already had Prime), but it looks like Crashplan and Backblaze both have unlimited packages priced around the same as Amazon's unlimited files offer, and they provide version control (unlike Amazon). The description of Rclone mentions Backblaze B2. Has anyone tried Backblaze Personal Backup with Rclone? Edited March 23, 2017 by DoeBoye spelling
tr0lll Posted April 7, 2017 Any plans for a gui? Cheers
bobbintb Posted April 7, 2017 (edited) 6 hours ago, tr0lll said: Any plans for a gui? Cheers I know it has been discussed here before. I don't think anything has come of it yet. It's been a while since I've looked into it. I know there are GUI frontends out there but I don't think any of them are webUIs yet. There are talks of official webUIs though. Edited April 7, 2017 by bobbintb
Waseh Posted April 7, 2017 It's not something I intend to implement. The (very) basic GUI for the configuration and scripts is as far as I'm able (and willing) to take it. If anyone at any point wants to chip in regarding it, it would be much appreciated
drumstyx Posted April 13, 2017 Anyone else having issues with it just hanging at the end of a large upload to ACD? I've got --verbose set and I see this sort of thing -- I imagine it's been going on like this for hours, and the file itself doesn't actually show as uploaded in ACD.

2017/04/13 15:36:14 INFO :
Transferred: 87.177 GBytes (848.749 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h55m1.2s
Transferring:
 * ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:37:14 INFO :
Transferred: 87.177 GBytes (848.276 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h56m1.2s
Transferring:
 * ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:38:14 INFO :
Transferred: 87.177 GBytes (847.805 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h57m1.2s
Transferring:
 * ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:39:14 INFO :
Transferred: 87.177 GBytes (847.333 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h58m1.2s
Transferring:
 * ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:40:14 INFO :
Transferred: 87.177 GBytes (846.861 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 29h59m1.2s
Transferring:
 * ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s

2017/04/13 15:41:14 INFO :
Transferred: 87.177 GBytes (846.392 kBytes/s)
Errors: 0
Checks: 51
Transferred: 4
Elapsed time: 30h0m1.2s
Transferring:
 * ...f.Duty.Advanced.Warfare.MULTi6-PROPHET.tar: 100% done, 0 Bytes/s, ETA: 0s
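For stalls like the one in the log above, one hedged workaround is to raise rclone's I/O timeout and allow more retries, since ACD was known to take a long time finalising very large uploads. The flag names are real rclone global flags, but the values below are illustrative, not tuned, and the paths are placeholders:

```shell
# --timeout raises the per-operation I/O timeout; --retries re-runs the
# whole command on failure; --low-level-retries retries individual
# HTTP operations such as the upload-completion call.
rclone copy --timeout 2h --retries 5 --low-level-retries 20 \
    /mnt/user/archive encrypted:archive
```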
thomast_88 Posted April 29, 2017 (edited) This plugin spawned from my initial docker which didn't support sharing the rclone FUSE mount. With Rclone-mount it's now possible. I'm not trying to hijack this thread, just want to let people know that I've created a Docker, which can expose the FUSE mount to the host and other docker containers, for those people who prefer dockerizing. Edited April 29, 2017 by thomast_88
jude Posted May 3, 2017 (edited) I'm having a basic setup issue. I was following the YouTube setup guide made for this plugin https://www.youtube.com/watch?v=-b9Ow2iX2DQ and everything appears to work fine until I run the command to check if the encrypted folder has been created on the Amazon drive. The encrypted folder, in this case named secure, shows up in rclone (if I type in rclone config it shows the following):

root@Tower:/# rclone config
Current remotes:

Name         Type
====         ====
EncryptedACD crypt
amazon       amazon cloud drive
secure       crypt
testfolder   crypt

(all of the crypt folders are me attempting to create a remote that actually includes a folder in the amazon drive) However, if I type in rclone lsd amazon: I get the following:

root@Tower:/# rclone lsd amazon:
-1 2016-11-19 09:46:59 -1 Plex Cloud Sync
-1 2016-11-12 02:04:12 -1 Pictures
-1 2016-11-12 02:04:12 -1 Videos
-1 2016-11-12 02:04:12 -1 Documents
-1 2017-05-03 18:50:10 -1 new

I checked to see if rclone was able to create folders on amazon: by typing rclone mkdir amazon:new; as you can see above this was successful. Not sure what I'm doing wrong here. Edit: by manually creating the encrypted folder using rclone and then mounting it, I was able to confirm that everything is working as it should. Edited May 3, 2017 by jude
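For reference, the way a crypt remote ends up tied to a folder on the Amazon remote, which the edit above describes getting working, is that the crypt entry wraps another remote plus a path. A sketch using the names from the post (the config-file excerpt follows rclone's documented crypt format; the file's location varies by rclone version):

```shell
# In rclone's config file, the crypt remote points at amazon: plus a subpath:
#   [secure]
#   type = crypt
#   remote = amazon:secure
#   filename_encryption = standard
#   ...
# The target folder appears on ACD once it is created or written through:
rclone mkdir secure:
rclone lsd amazon:     # 'secure' should now be listed at the ACD root
rclone ls secure:      # decrypted view of the folder's contents
```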
brklynmark Posted May 6, 2017 @jude Thanks for the edit! What kind of upload speeds are you getting on the encrypted folders?
Maze Posted May 8, 2017 Does anyone know if there is some temp location for files during a sync job between unRAID and ACD? I'm getting spammed with "no space left on device" when trying to do a sync: Failed to copy: Write /root/secure/test/xyz.avi: no space left on device Files are a couple of gigs in size. Setup is a VMware virtual machine on ESXi 6.5 running off a passed-through USB, with the disk controller passed through to that VM for the storage.
SpaceInvaderOne Posted May 23, 2017 Rclone has been banned from ACD, see https://forum.rclone.org/t/rclone-has-been-banned-from-amazon-drive/2314/24
DZMM Posted May 23, 2017 Share Posted May 23, 2017 10 hours ago, gridrunner said: Rclone has been banned from ACD see https://forum.rclone.org/t/rclone-has-been-banned-from-amazon-drive/2314/24 Bummer just as I was about to get my FTTP connection :-( It was bound to happen eventually as I can imagine how much usage Amazon have been seeing from mounted drives Quote Link to comment
DZMM Posted May 23, 2017 Share Posted May 23, 2017 just read the post and it might not be all doom and gloom Quote Update 2017-05-20 I have now heard back from Amazon. Rclone has been banned for having the encrypted secrets in the source code. I've asked for new credentials so I can build an auth server and get rclone going again - I haven't heard back on that reply yet, but it is plausible we could get rclone running again with Amazon Drive in the not too distant future. Thank you all for your patience Nick Quote Link to comment
Aric Posted May 31, 2017 I used this plugin and followed the excellent video gridrunner made. It works great except Plex can't see anything that is in the encrypted /Media/Disks/TVCloud/ directory, where Media is mapped to /mnt in the docker app. I also mapped it to /Media/Disks/MGDrive/TV/ and it sees the few things I have there. That is the unencrypted GDrive with a few media files for testing Plex Cloud; the other is the encrypted folder. I can see the encrypted folder in the network share and I can see and play the video files, but they are not seen when going through Plex. Any ideas? Thanks
bobbintb Posted May 31, 2017 32 minutes ago, Aric said: I used this plugin, followed the excellent video gridrunner made. It works great except plex cant see anything that is in the encrypted /Media/Disks/TVCloud/ Directory where Media is mapped to /mnt in the docker app. I also mapped it to /Media/Disks/MGDrive/TV/ and it sees the few things I have there. This is the unencrypted GDrive with a few media files for testing Plex Cloud. The other is the encypted folder. I can see the encrypted folder in the network share and I can see and play the video files but they are not seen when going thru plex. Any ideas? Thanks You'll have to post your settings.