[Support] binhex - Rclone


Recommended Posts

Is the sync function instantaneous? Files that I have changed at the source don't show up at the destination until I restart the Docker container. I have tried setting RCLONE_OPERATION to sync, but to no avail.

 

Any idea?

 

Edit: Changing RCLONE_SLEEP_PERIOD to 0.01h (or 1 minute) does the trick, but it seems to run the rclone start script every minute, which isn't ideal, I suppose. Any chance I can make it instant without having to configure the sleep period?

 

Thanks a lot!

 

Edited by fishermanG
Link to comment
No way to do this other than what you're doing with changing the sleep period
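
For reference, both settings discussed here are environment variables on the container. A minimal sketch of how they might be passed on the command line; the variable names come from this thread, the values are illustrative, and the image name and paths are assumptions (on Unraid these would normally be filled in through the container template):

docker run -d \
  --name=rclone \
  -v /mnt/user:/media \
  -e RCLONE_REMOTE_NAME=my-remote \
  -e RCLONE_MEDIA_SHARES=/media/Documents \
  -e RCLONE_OPERATION=sync \
  -e RCLONE_SLEEP_PERIOD=0.5h \
  binhex/arch-rclone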

Sent from my CLT-L09 using Tapatalk

Link to comment
On 9/6/2021 at 9:52 PM, binhex said:

i get it, hmm, interesting. so i think rclone can do this, but i would need to do some additional coding: probably another env var, something like RCLONE_DIRECTION, which would have values of 'localtoremote', 'remotetolocal', or 'both', then additional code to detect the env var and perform the required sync operation. it would take a bit of testing but should be possible i think. no promises, but i might take a look in the next few days.

Is this option still coming? I work out of OneDrive on multiple PCs and would like to back up my OneDrive to Unraid, as I don't sync all files on the PCs I'm working on, to save space.

 

Cheers

 

Link to comment
1 hour ago, cappapp said:

Is this option still coming? I work out of OneDrive on multiple PCs and would like to back up my OneDrive to Unraid, as I don't sync all files on the PCs I'm working on, to save space.

 

Cheers

 

it's sitting on my drive, not finished at present. i need more time to complete it, so it MIGHT be coming, time permitting.

Link to comment
19 minutes ago, cappapp said:

Have I configured that right?

RCLONE_MEDIA_SHARES is configured correctly, but RCLONE_REMOTE_NAME is not; this should be the name of the remote you chose when you ran through the rclone wizard to create the rclone configuration file. if you can't remember what you set it to, open your rclone config file with a text editor and look at the value in square brackets; that is the remote name.
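
To illustrate, a OneDrive entry in the rclone config file looks roughly like this (the remote name 'onedrive' and the values are placeholders):

[onedrive]
type = onedrive
token = {"access_token":"...","expiry":"..."}
drive_id = ...
drive_type = business

With a config like that, RCLONE_REMOTE_NAME would be set to onedrive, i.e. whatever is between the square brackets.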

Link to comment
1 hour ago, binhex said:

RCLONE_MEDIA_SHARES is configured correctly, but RCLONE_REMOTE_NAME is not; this should be the name of the remote you chose when you ran through the rclone wizard to create the rclone configuration file. if you can't remember what you set it to, open your rclone config file with a text editor and look at the value in square brackets; that is the remote name.

Oh, the name I set specifically for OneDrive. OK, that worked. I can see the files on OneDrive now :) Thanks.

 

So if I want to repeat the process for copying the same files to Google Drive too, does that mean I need another whole Docker?
Or can I put multiple entries in this field, for the other configs that are setup too?

 

Cheers

 

Link to comment
2 minutes ago, cappapp said:

does that mean I need another whole Docker?
Or can I put multiple entries in this field, for the other configs that are setup too?

you can put multiples in, simply comma separate the 'remote' names, e.g.:-

RCLONE_REMOTE_NAME=onedrive-business-encrypt,gsuite-encrypt

 

Link to comment
8 minutes ago, binhex said:

you can put multiples in, simply comma separate the 'remote' names, e.g.:-

RCLONE_REMOTE_NAME=onedrive-business-encrypt,gsuite-encrypt

 

Awesome, thanks. 

That'll make it interesting if you do update to copy the reverse way. :P

 

Goodluck :)

Edited by cappapp
Link to comment
28 minutes ago, cappapp said:

Awesome, thanks. 

That'll make it interesting if you do update to copy the reverse way. :P

 

Goodluck :)

lol, yeah that's very true. well, i am not responsible for any loss of data; you give a man a shotgun, it's up to him whether he shoots himself in the foot with it or not 🙂

Link to comment

Hopefully someone can answer this: the sync is creating an extra subdirectory when syncing.

 

container path media is set to: /mnt/user

RCLONE_MEDIA_SHARES is set to: /media/Documents

 

the sync is creating and syncing to root (my drive) --> media --> Documents 

 

How do I just make it sync to the root of the Google Drive? I tried setting RCLONE_MEDIA_SHARES to "/media/" as well as "/Documents/", which errors out in the log.

 

Link to comment
 
See this issue, same problem:- https://github.com/binhex/arch-rclone/issues/1
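
To spell out what is being observed above (an illustration of the reported behaviour, with 'gdrive' as a placeholder remote name, not a documented mapping):

# host path /mnt/user is mapped to /media inside the container
# RCLONE_MEDIA_SHARES=/media/Documents therefore points at /mnt/user/Documents on the host
# the upload target is built as <remote>:<share path>, so the files land in
# gdrive:/media/Documents, i.e. a 'media/Documents' folder at the root of the drive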

Sent from my CLT-L09 using Tapatalk

Link to comment
23 hours ago, binhex said:

See this issue, same problem:- https://github.com/binhex/arch-rclone/issues/1

Sent from my CLT-L09 using Tapatalk
 

Thank you, I think I am okay with how this works.

 

In this thread, I saw a couple of folks ask about mounting without getting an answer. Is there a preferred way to mount? Will the mount section in the GUI work, or is something like a user script or another plugin the better approach?

Link to comment
It depends what you are trying to achieve. If you simply want to verify that the files have been uploaded, then the UI should work perfectly fine (it does for me). If, on the other hand, you want to mount to pass through to something like Plex, then you will need to either drop to the console of the container or use the rclone plugin.

Sent from my CLT-L09 using Tapatalk


Link to comment

Hi there,

 

I'm currently running the rclone plugin (not docker container) to sync to my Google drive. I set it up last week (via SpaceInvaderOne's video) and it worked great for 2TB worth of data. I could see and access the cloud drives in Krusader (under disks) with no issues.

 

I restarted my server to install an Nvidia driver update, and when the server came back up, the cloud drives would not mount. I tried to manually run the script to mount the cloud drives based on the help output it specified (rclone mount remote:path /path/to/mountpoint [flags]). "encrypt" is just the encrypted remote for Google Drive; both remotes go to the same account, just different folders. Here is the script from user scripts that I ran:

 

#!/bin/bash
#----------------------------------------------------------------------------
# first section makes the folders for the mount in the /mnt/disks folder so docker containers can have access
# there are 4 entries below as in the video I had 4 remotes amazon, dropbox, google and secure
# you only need as many as what you need to mount for dockers or a network share

mkdir -p /mnt/disks/Google
mkdir -p /mnt/disks/encrypt

# This section mounts the various cloud storage into the folders that were created above.

rclone mount Google: /mnt/disks/Google --allow-non-empty --max-read-ahead 1024k --allow-other
rclone mount encrypt: /mnt/disks/encrypt --allow-non-empty --max-read-ahead 1024k --allow-other  
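
(A side note on the mount commands above: rclone mount runs in the foreground by default, so a script like this will block on the first mount line until that mount is unmounted. If both mounts should come up from a single script run, they can be backgrounded; a sketch assuming the installed rclone supports the --daemon flag, otherwise appending '&' to each line has a similar effect:)

rclone mount Google: /mnt/disks/Google --allow-non-empty --max-read-ahead 1024k --allow-other --daemon
rclone mount encrypt: /mnt/disks/encrypt --allow-non-empty --max-read-ahead 1024k --allow-other --daemon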

 

When trying to run the script right now (9/23 @ 12:16pm), the screen just sits there and nothing happens. Checking the logs, it just says

Script Starting Sep 20, 2021 17:39.18

Full logs for this script are available at /tmp/user.scripts/tmpScripts/Rclone Mount Script/log.txt

Script Finished Sep 20, 2021 17:39.18

Full logs for this script are available at /tmp/user.scripts/tmpScripts/Rclone Mount Script/log.txt

 

 

I say all of that just to give background on where I'm at. I would like to set up binhex's docker container running rclone so that I don't have to run the commands directly off the server. I will be doing some large file transfers that may take a few days with my internet upload speed, so would like to be able to shut off / restart the remote computer that I access the webGUI of the server with, and still be able to see the progress of the upload. I've read that I could use a "screen" command, but if there is a way to do it through this docker container it seems like that would be an easier and more straightforward option.

 

I would like it to run as a sync, so that it adds missing files to the cloud storage and deletes cloud files if they are not on the local server (not just blindly copy everything each time it syncs).

 

I also would like the ability for Plex to be able to pull from the cloud repository should I need to.

 

So: now that I have the remotes set up (they did work at one point) and I already have the Google Drive API keys and such configured, where do I start on getting the drives to mount and setting up the Docker container instead of just the command-line plugin? If I need to use the plugin for this, that's fine; I'm just trying to figure out the best way to set everything up.

Edited by Arcaeus
Link to comment

I seem to be having an issue trying to set this up with Backblaze. When inspecting the log files I get this:

 

cat local-to-remote-b2:kronos-backup-combined-media-backup.txt
2021/09/27 00:34:59 Failed to create file system for "b2:kronos-backup:/media/backup": you must use bucket "kronos-backup" with this application key

 

I've got my `RCLONE_REMOTE_NAME` set to `b2:kronos-backup`, but I'm not sure that's the correct way of doing this. Any thoughts?

Link to comment
11 minutes ago, kronoskoders said:

I've got my `RCLONE_REMOTE_NAME` set to `b2:kronos-backup`, but I'm not sure that's the correct way of doing this. Any thoughts?

this is not correct, i think your remote name will be 'b2'; i'm assuming 'kronos-backup' is a root folder on backblaze, right?

Link to comment
On 9/23/2021 at 5:21 PM, Arcaeus said:

so would like to be able to shut off / restart the remote computer that I access the webGUI of the server with, and still be able to see the progress of the upload.

this docker image includes the rclone web ui, so you can watch progress using that; i do just this on my smartphone.

 

On 9/23/2021 at 5:21 PM, Arcaeus said:

I would like it run in a sync situation, so that it adds missing files to the cloud storage, and deletes cloud files if they are not on the local server (not just blind copy everything each time it syncs).

yep you can also do this, just set the RCLONE_OPERATION to 'sync'
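
For clarity on what 'sync' means here, this is standard rclone behaviour (the paths and the remote name 'gdrive' below are illustrative):

# copy: adds and updates files at the destination, never deletes anything there
rclone copy /media/Documents gdrive:Documents
# sync: makes the destination match the source, deleting files that no longer exist locally
rclone sync /media/Documents gdrive:Documents --dry-run   # --dry-run previews the changes first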

On 9/23/2021 at 5:21 PM, Arcaeus said:

where do I start on getting the drives to mount and setting up the docker container instead of just the command line plugin?

so this is not implemented at the moment, and tbh i'm not 100% sure it's possible, as you are in essence mounting a drive in a container, then sharing that to the host via a bind mount, and then sharing that again back to another container. it's a bit gnarly and currently untested, so for mounting you might need to use the plugin, at least for the time being.

Link to comment
19 minutes ago, binhex said:

this is not correct, i think your remote name will be 'b2'; i'm assuming 'kronos-backup' is a root folder on backblaze, right?

Putting `b2` in there returns this error:

 

2021/09/27 01:21:07 Failed to create file system for "b2:/media/backup": you must use bucket "kronos-backup" with this application key
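
For context, the error is coming from the application key rather than the remote name: the key is restricted to the 'kronos-backup' bucket, so any B2 path that does not start with that bucket is rejected. With the B2 backend the remote name goes before the colon and the bucket after it, e.g. (remote 'b2', bucket 'kronos-backup' as above):

rclone lsd b2:                                             # list buckets visible to the key
rclone ls b2:kronos-backup                                 # list objects inside the bucket
rclone sync /media/backup b2:kronos-backup/media/backup    # sync into a path inside the bucket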

 

Link to comment

Ok, yeah, I figured out the problem. It looks like you're appending an additional `:` to the end of the remote name, which won't work for buckets in Backblaze. I altered your start.sh script to this and it started to work:

 

function set_run_sync_flags(){

	# if the remote name already contains a ':' (e.g. a bucket-style remote such as
	# 'b2:kronos-backup') then do not append another ':' before the share path
	if [[ "${rclone_remote_name_item}" == *":"* ]]; then
		rclone_remote_name="${rclone_remote_name_item}${rclone_media_shares_item}"
	else
		rclone_remote_name="${rclone_remote_name_item}:${rclone_media_shares_item}"
	fi

	# local --> remote
	if [[ "${RCLONE_DIRECTION}" == 'localtoremote' || "${RCLONE_DIRECTION}" == 'both' ]]; then

		# with the web ui enabled the operation takes srcFs/dstFs parameters,
		# otherwise plain source and destination paths are passed to the cli
		if [[ "${ENABLE_WEBUI}" == 'yes' ]]; then
			sync_direction="srcFs=${rclone_media_shares_item} dstFs=${rclone_remote_name}"
		else
			sync_direction="${rclone_media_shares_item} ${rclone_remote_name}"
		fi

		echo "[info] Running rclone ${RCLONE_OPERATION} for local media share '${rclone_media_shares_item}' to remote '${rclone_remote_name}'..."
		run_rclone
		echo "[info] rclone ${RCLONE_OPERATION} finished"

	fi

	# remote --> local
	if [[ "${RCLONE_DIRECTION}" == 'remotetolocal' || "${RCLONE_DIRECTION}" == 'both' ]]; then

		if [[ "${ENABLE_WEBUI}" == 'yes' ]]; then
			sync_direction="srcFs=${rclone_remote_name} dstFs=${rclone_media_shares_item}"
		else
			sync_direction="${rclone_remote_name} ${rclone_media_shares_item}"
		fi

		echo "[info] Running rclone ${RCLONE_OPERATION} from remote '${rclone_remote_name}' to local share '${rclone_media_shares_item}'..."
		run_rclone
		echo "[info] rclone ${RCLONE_OPERATION} finished"

	fi

}

 

This seems to be working for my scenario. I'm not sure if this is within the realm of your support, but it would be nice to have an option to set the bucket in between the remote name and the media shares.
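
A rough sketch of what such an option could look like in set_run_sync_flags(), purely as a suggestion (RCLONE_BUCKET is a hypothetical variable name, not an existing option of this image):

	# optionally insert a bucket between the remote name and the share path,
	# e.g. RCLONE_REMOTE_NAME=b2 and RCLONE_BUCKET=kronos-backup
	# produce 'b2:kronos-backup/media/backup' for a share of '/media/backup'
	if [[ -n "${RCLONE_BUCKET}" ]]; then
		rclone_remote_name="${rclone_remote_name_item}:${RCLONE_BUCKET}${rclone_media_shares_item}"
	else
		rclone_remote_name="${rclone_remote_name_item}:${rclone_media_shares_item}"
	fi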

Link to comment
