thomast_88

Community Developer · 246 posts

Everything posted by thomast_88

  1. What I am trying to do is store encrypted data in Amazon Cloud Drive and remotely mount it (no local copy) for UnRAID to be able to access. Is it even technically possible to mount ACD within this docker so UnRAID can access it? I'm not sure of Docker's abilities in that regard. If not, it only does half of what I need it to, which is fine; I'd just like to know one way or the other so I don't waste my time trying to make something work that just isn't possible. I installed FUSE in the Docker container yesterday and tried to use the rclone mount function to mount the remotely encrypted ACD content. It worked fine inside the docker, but when I tried to mount the encrypted content inside a docker volume (to be able to see it outside docker / from other dockers), the folder was empty outside the docker. I think this is a limitation in Docker itself. I had to run the docker with the --privileged flag to be able to use FUSE mounts.
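For reference, a rough sketch of what the steps above might look like on the command line. The container name, image, paths, and the remote name "secret" are illustrative assumptions, not values from the template:

```shell
# Run the container privileged so FUSE mounts are allowed inside it
# (container/image names and paths are placeholders for illustration)
docker run -d --name rclone --privileged \
  -v /mnt/user/appdata/rclone/config:/config \
  tynor88/rclone

# Inside the container, mount the encrypted remote onto a folder
# (the mountpoint /mnt/acd must already exist in the container)
docker exec -it rclone rclone --config="/config/.rclone.conf" \
  mount secret: /mnt/acd --allow-other
```

As described in the post, a mount created this way stays visible only inside the container; making it visible to the host is what the separate Rclone-mount docker mentioned later in this thread addresses.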
  2. I think for people who are comfortable with editing config files and installing stuff on their server, User Scripts Plugin gives you all the flexibility you need?
  3. Not yet supported. You can run many sync commands, but they will all run on the same cron schedule. I can edit mine. Can you show the commands you are trying to execute, together with the error messages? Not exactly sure what you mean by 1-2 transfers? Rclone is for creating an exact copy of a folder on your server in the cloud. It will keep doing that on the defined cron schedule, keeping the remote up to date whether you delete or add new files. Yes, unless you provide an already set-up .rclone.conf, there is no way to overcome this.
  4. once I've got the encryption working, I'm going to have a crack at this: http://rclone.org/commands/rclone_mount/ I just tested the mount function and it works for encrypted remotes. However, I can only see the mounted content within the container; outside the container I cannot see it. What could that be? That's what I was concerned about. Try adding the mount drive as another path. Not sure if it will work though. I appreciate the work done with this, but honestly I'm thinking this kind of app would be better done as a plugin instead of a docker. Being CLI, it would be much easier for the user to configure via the plugin page, and there wouldn't be any issues trying to get a share mounted in the docker visible to unRAID. This docker is certainly not for advanced scenarios like mounting cloud FUSE volumes (at the moment at least...). If you want to use a homemade rclone script instead, I suggest you take a look at the User Scripts plugin. It will allow you to run the script at predefined intervals.
  5. Already supported. See this http://lime-technology.com/forum/index.php?topic=52033.msg501339#msg501339
  6. Ignore this for now. This is from a pull request @Bjonness406 just sent me. It will be changed, so that you won't be getting any warning in the future.
  7. once I've got the encryption working, I'm going to have a crack at this: http://rclone.org/commands/rclone_mount/ I just tested the mount function and it works for encrypted remotes. However I can only see the mounted content within the container, outside the container I cannot see it. What could that be?
  8. I don't think so. I tried backing up 600 GB from a non-cache share and it went fine despite my cache drive having only 70 GB of free space. Something else is wrong. Your configuration looks fine to me. Let me know what happens if you let it run for a bit longer. EDIT: Can you try setting the config path to /mnt/cache/appdata/rclone/config/ Sometimes docker apps have problems with the "user" shares.
  9. You are not running with the --dry-run flag? Can you post the config file (without your credentials of course). For me encryption to ACD is working perfectly. I've both tested with filename encryption and without filename encryption.
  10. Read the OP, and in particular this section: Creating the initial config file (.rclone.conf) docker exec -it Rclone rclone --config="/config/.rclone.conf" config It'd be pretty nice though if the container could automagically find out your preferred cloud service, username and password.
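The config step is interactive; a sketch of the flow, where the remote name "acd" and the menu answers are illustrative (rclone's prompts may differ slightly by version):

```shell
# Open rclone's interactive configuration inside the running container
docker exec -it Rclone rclone --config="/config/.rclone.conf" config

# In the interactive menu you would then, for example:
#   n) New remote
#   name> acd                     # any name; referenced later as "acd:"
#   storage> Amazon Cloud Drive   # pick the cloud service from the list
#   ... follow the OAuth prompts ...
#
# The resulting remote is then usable in sync commands, e.g.:
#   rclone --config="/config/.rclone.conf" sync /data acd:/unraid
```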
  11. hmm didn't seem to embed http://screenshot.co/#!/e9b3388263 In Data Path you specified "/mnt/user/". That means the entire directory will be backed up. Are you sure about this? In "Sync Destination" you need to write the destination to the REMOTE specified when you made the configuration for Amazon Cloud Drive, e.g. "acd". In Sync Command write: "rclone sync --dry-run /data acd:/unraid" This will back up the entire contents of /mnt/user/ to a folder called "unraid" on Amazon Cloud Drive.
  12. The command is only getting executed every hour. You can change "Cron Schedule:" to this, then wait a minute and see what the output of the command is. Cron Schedule: * * * * * Getting warning messages now: 2016/09/27 13:50:00 Attempt 1/3 failed with 1 errors and: mkdir /mnt/user: permission denied 2016/09/27 13:50:00 Attempt 2/3 failed with 1 errors and: mkdir /mnt/user: permission denied 2016/09/27 13:50:00 Attempt 3/3 failed with 1 errors and: mkdir /mnt/user: permission denied 2016/09/27 13:50:00 Failed to sync: mkdir /mnt/user: permission denied I've changed /data <-- /mnt/user to /data <-- /mnt/user/Media to see if that helps, and I'm trying SYNC_COMMAND="rclone sync /data/Music secret:/unraid" Can you make a screenshot of the docker configuration page with "Advanced View" chosen?
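For anyone unfamiliar with the five fields of a standard cron expression mentioned above:

```shell
# ┌ minute (0-59)
# │ ┌ hour (0-23)
# │ │ ┌ day of month (1-31)
# │ │ │ ┌ month (1-12)
# │ │ │ │ ┌ day of week (0-6, Sunday = 0)
# │ │ │ │ │
# * * * * *    every minute (handy for debugging, as suggested above)
# 0 * * * *    at minute 0 of every hour (a common hourly schedule)
```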
  13. As I mentioned earlier, you can override the default sync command by specifying the variable SYNC_COMMAND. So try: SYNC_COMMAND="rclone sync --dry-run /test acd:/unraid" I suppose you have mounted /test correctly and set up the Amazon Cloud Drive destination with the name "acd". Show the log output after this has run.
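A sketch of how the override might be passed when creating the container (image name and host paths are assumptions). Assuming the container hands SYNC_COMMAND to a shell, `&&` chaining behaves exactly as in plain sh:

```shell
# Pass the override when creating the container (illustrative names/paths)
# docker run -d --name rclone \
#   -v /mnt/user/test:/test \
#   -e SYNC_COMMAND="rclone sync --dry-run /test acd:/unraid" \
#   tynor88/rclone

# The chaining behaviour itself is plain sh: commands run in order,
# and each later command only runs if the previous one succeeded
SYNC_COMMAND='echo first && echo second'
sh -c "$SYNC_COMMAND"
```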
  14. Can you try doing this and see if it works (taken from the rclone documentation): If it is safe in your environment, you can set the RCLONE_CONFIG_PASS environment variable to contain your password, in which case it will be used for decrypting the configuration. If you are running rclone inside a script, you might want to disable password prompts. To do that, pass the parameter --ask-password=false to rclone. This will make rclone fail instead of asking for a password if RCLONE_CONFIG_PASS doesn't contain a valid password. UPDATE: I just tested by passing the RCLONE_CONFIG_PASS environment variable to the container, and it decrypts the config file fine. I'll add this to the documentation.
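A sketch of passing that variable to the container (container name, image, and paths are assumptions for illustration):

```shell
# Supply the config password so rclone can decrypt /config/.rclone.conf
# without prompting; --ask-password=false (in the sync command) would make
# rclone fail loudly instead of hanging on a prompt if the password is wrong
docker run -d --name rclone \
  -e RCLONE_CONFIG_PASS="your-config-password" \
  -v /mnt/user/appdata/rclone/config:/config \
  tynor88/rclone
```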
  15. We definitely need multiple folders, I just haven't come up with a smart way to implement it yet. Having a file like @Bjonness406 is suggesting would work, but it's not that user-friendly. I prefer a way through variables. I'll think about it and do some more work on it this weekend. Custom commands would be nice as well - I see you are using 'copy' while I'm using 'sync' :-) If you come up with anything, let me know. You can follow the development of the container in the dev branch.

     Well, I don't think Rclone is super user-friendly at all, since it doesn't have a GUI. What about having a file where you can add your own commands, but with some starter examples? Like this:

     #Take away the "#" for the lines you want to use.
     #Change "remote" to what you called your remote during setup of the config file (Onedrive, Amazon, Google, backup2 etc.)
     #rclone copy /pictures remote:pictures # Copies the files from "/pictures" to "pictures" on the remote.
     #rclone copy /documents remote:documents
     #rclone copy /video remote:video
     #rclone sync /example remote:example # Syncs the content between both destinations; needs RW permission.

     Or you could have a variable called "Remote" where you specify what you have as remote. If you do this, I can write a guide on how to set up the config file from a Windows PC at least (probably Mac and Linux too), and then add it correctly to the rclone commands. (I will make that for Onedrive since I use that, but it will probably be the same for most of the other ones too.)

     Thanks for the suggestions. For now I made the multiple commands / folders really simple (but it works...). There is a new environment variable you can pass called "SYNC_COMMAND" where you basically override the default sync command. So now you can do: SYNC_COMMAND="rclone sync /data remote:/data && rclone sync /data2 remote:/data2" You cannot use variables inside the custom sync command. I've not figured out how to expand them yet :-). Let me know how it works.
  16. Wouldn't that interfere with the Docker philosophy?
  17. We definitely need multiple folders, I just haven't come up with a smart way to implement it yet. Having a file like @Bjonness406 is suggesting would work, but it's not that user-friendly. I prefer a way through variables. I'll think about it and do some more work on it this weekend. Custom commands would be nice as well - I see you are using 'copy' while I'm using 'sync' :-) If you come up with anything, let me know. You can follow the development of the container in the dev branch.
  18. I actually already implemented that - I just didn't update the template. Pass the variable CRON_SCHEDULE with a cron expression which suits you, e.g. 0 * * * * for every hour. Every minute is a bit overkill - I might change the default to 1 hour anyway. Update: I updated the template now; you should be able to see it if you tick "Advanced View" when setting up the container.
  19. Docker Containers:
     Rclone - A command line program to sync files and directories to and from various cloud services.
     UnoEuro DNS - Keep your DNS records for your own domains updated with this UnoEuro DDNS script. UnoEuro provides a free DNS service for your private domains.
     GitLab - Tools for modern developers. GitLab unifies chat, issues, code review, CI and CD into a single UI.
     Socat - A relay for bidirectional data transfer between two independent data channels.
     Please post any questions/issues/requests relating to these dockers in their respective topics (click on the name and you will be redirected).
  20. Mounting encrypted volumes in this docker to the host or other dockers is not supported. This is not a missing feature in this docker container but a limitation in docker itself. For mounting encrypted volumes to the host or other docker containers please use the new docker: Rclone-mount
  21. I have created a CA template and released my beta version of the docker here: https://lime-technology.com/forum/index.php?topic=52033.0 Maybe this thread should be merged?
  22. Docker for Rclone - a command line program to sync files and directories to and from various cloud services.

     Application Name: Rclone
     Application Site: http://rclone.org/
     Docker Hub: https://hub.docker.com/r/tynor88/rclone/
     Github: https://github.com/tynor88/docker-rclone/tree/dev
     unRAID Template: https://github.com/tynor88/docker-templates
     Setup/Configuration: http://rclone.org/docs/

     Supported Cloud Services:
     Google Drive
     Amazon S3
     Openstack Swift / Rackspace cloud files / Memset Memstore
     Dropbox
     Google Cloud Storage
     Amazon Drive
     Microsoft One Drive
     Hubic
     Backblaze B2
     Yandex Disk
     The local filesystem

     Creating the initial config file (.rclone.conf):
     docker exec -it Rclone rclone --config="/config/.rclone.conf" config

     Feel free to post any questions/issues/requests relating to this docker in this thread.

     <CAUTION>This is still a beta so don't expect everything to work. Currently I have it running with Amazon Cloud Drive + Dropbox sync, and it's working fine.</CAUTION>
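Putting the pieces from this thread together, a first run might look something like the sketch below. Only the config command comes from the OP; the image name, host paths, and the CRON_SCHEDULE / SYNC_COMMAND variables are assumptions based on other posts in this thread:

```shell
# Create the container with a config volume, a data volume, and the
# optional schedule/command overrides discussed elsewhere in this thread
docker run -d --name Rclone \
  -v /mnt/cache/appdata/rclone/config:/config \
  -v /mnt/user/Media:/data \
  -e CRON_SCHEDULE="0 * * * *" \
  -e SYNC_COMMAND="rclone sync --dry-run /data acd:/unraid" \
  tynor88/rclone

# Then create the initial config interactively
docker exec -it Rclone rclone --config="/config/.rclone.conf" config
```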
  23. If you exec into your container with docker exec -it rclone /bin/bash can you actually see the /config/rclone.sh file? It seems from the error that it is not present at that location. I have a POC version of mine running which you can check out here: https://github.com/tynor88/docker-rclone/tree/dev Currently it is hardcoded for my needs (only works with my config), but I will expand it and make it more dynamic so you can specify config through environment variables.
  24. I've started some initial work on a docker for this, which can run at some predefined schedules. Hopefully I can have something ready for testing next week. Try to map: /mnt/user/cache/appdata/rclone/config -> /config That works for me.
  25. How did you authorize the docker app? Did you install rclone on your PC to get the key and then paste it in the docker afterwards? Asking a mod to split this out of the thread; we are way OT. I just ran it from my own PC, as the correct ports were not exported in the docker. Rclone comes with a built-in webserver which can do the application authorization. Yes, we definitely need a new topic for this discussion.