[Plugin] rclone


Waseh

Recommended Posts

6 minutes ago, Waseh said:

Sounds like you didn't have the new version installed. 

Are you getting any errors when you reinstall?

Thanks for the reply.

I have my plugins set to update automatically.

 

Just rebooted the server again and the plugin isn't there. So I've installed rclone-beta again, but now I get this:

 

User@Tower:/# rclone config
/usr/sbin/rclone: line 21: rcloneorig: command not found

 

 

---------------------------------------------

Contents of /usr/sbin/rclone:

 

 


#!/bin/bash
log=false
args=()
for i in "$@" ; do
    if [[ $i = "--log" ]] ; then
        log=true
        continue
    fi
    if [[ $i = "-l" ]] ; then
        log=true
        continue
    fi
    args+=("$i")
done

config=/boot/config/plugins/rclone-beta/.rclone.conf
logfile=/boot/config/plugins/rclone-beta/logs/rclone-$(date "+%Y%m%d").log
if [ "$log" = true ] && [ ${#args[@]} -ge 1 ]; then
    rcloneorig --config "$config" "${args[@]}" >> "$logfile" 2>&1
else
    rcloneorig --config "$config" "$@"
fi
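The `--log`/`-l` handling in the wrapper above can be exercised in isolation. Below is a sketch that reproduces the same flag parsing but echoes the result instead of invoking `rcloneorig`, so the logic can be checked on any machine; `parse_rclone_args` is a name introduced here purely for illustration.

```shell
#!/bin/bash
# parse_rclone_args: same --log/-l extraction as the wrapper above,
# but prints the parsed result instead of calling rcloneorig.
parse_rclone_args() {
    local log=false
    local args=()
    local i
    for i in "$@"; do
        # Consume the logging flags; everything else is passed through
        if [[ $i == "--log" || $i == "-l" ]]; then
            log=true
            continue
        fi
        args+=("$i")
    done
    echo "log=$log args=${args[*]}"
}

parse_rclone_args -l lsd remote:   # prints: log=true args=lsd remote:
```

Note that the append uses `args+=("$i")` with quotes; without them, arguments containing spaces would be split into multiple words.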

Edited by binary
Link to comment
12 minutes ago, Waseh said:

Having your plugins update automatically won't do much good if rclone is running when the update is triggered. 

 

The install didn't succeed. Try rebooting again if you cannot reinstall and/or you don't get any errors when you install 

Thanks. Uninstalled...rebooted...installed and rebooted again.

All back to normal, thanks.

Link to comment
  • 2 weeks later...

Hi,

I use rclone to copy the IP cam footage that is FTP'd to my Unraid server up to Google Drive every 5 minutes.

It's been working well for ages.

I am slowly implementing QOS rules on my router.

As I assume anything sent (uploaded) from Unraid goes out from the same IP address (all the Dockers etc.), the rclone traffic is bundled up with any streaming I do from this same IP address to remote clients.

Does anyone know what ports rclone connects to Google Drive with? A fixed source port would be good for differentiating the traffic.

Thanks.

 

Googled it: the source port is random and the target is port 443 (so I can't use that).

Edited by vw-kombi
Link to comment
7 hours ago, vw-kombi said:

Hi,

I use rclone to copy the IP cam footage that is FTP'd to my Unraid server up to Google Drive every 5 minutes.

It's been working well for ages.

I am slowly implementing QOS rules on my router.

As I assume anything sent (uploaded) from Unraid goes out from the same IP address (all the Dockers etc.), the rclone traffic is bundled up with any streaming I do from this same IP address to remote clients.

Does anyone know what ports rclone connects to Google Drive with? A fixed source port would be good for differentiating the traffic.

Thanks.

 

Googled it: the source port is random and the target is port 443 (so I can't use that).

Ah, looks like you're trying to do the same thing as me. I QoSed the GDrive IP addresses, but they change so often; I have even entered the whole subnet with a /16, but even that host IP changes. Port 443 is a catch-all which, like you said, is no good. I wonder if there is a Docker container, instead of a plugin, that lets you change the IP that rclone uses.

Link to comment
1 minute ago, xhaloz said:

Is there a way to assign a different IP address to rclone?  I want to QOS my traffic for it.  This is different from the --bwlimit flag.  I tried --bind but got an error.

Find an rclone Docker container that works for you, or fire up rclone in a VM. Those are your only options; I don't think plugins can be assigned their own IP.

Link to comment

After following the video, the unmount scripts are not working at all. I'm curious as to why, and whether this could be noted via some kind of annotation on the video, considering he outright deletes the default ones.

I keep getting this error:
fusermount: failed to unmount /mnt/disks/OneDriveJ: Invalid argument
fusermount: failed to unmount /mnt/disks/ODJsecure: Invalid argument
Script Finished Fri, 09 Nov 2018 16:45:22 -0800

 

edit:

Also,

When I view the OneDrive storage, the file manager reflects the available storage in the top bar. For the encrypted container, this doesn't change from the root storage. I'm assuming the encrypted container is not connecting to the cloud storage?

Solved this after a reboot, still entirely unclear as to why it was required.

 

It's not helpful to link a video in the second comment of this thread if it is outdated. At least update it via annotations.

 

The OneDrive connection itself worked perfectly, and I thank you for this effort.

 

What exactly is the path to the full logs? I don't see them in any folder called tmp.

Edited by zjosh86
Link to comment
1 hour ago, zjosh86 said:

After following the video, the unmount scripts are not working at all. I'm curious as to why, and whether this could be noted via some kind of annotation on the video, considering he outright deletes the default ones.

I keep getting this error:
fusermount: failed to unmount /mnt/disks/OneDriveJ: Invalid argument
fusermount: failed to unmount /mnt/disks/ODJsecure: Invalid argument
Script Finished Fri, 09 Nov 2018 16:45:22 -0800

 

Try this alternate unmount command (example below)

 

umount -l /mnt/disks/google_mount
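The fallback above can be combined with the normal FUSE unmount. Below is a minimal sketch (not from the video or the plugin) that tries `fusermount -u` first, falls back to a lazy `umount -l`, and skips paths that aren't mounted at all, checked via /proc/mounts; `unmount_rclone` is a name introduced here for illustration, and the mount points are the ones from the post above.

```shell
#!/bin/bash
# unmount_rclone: attempt a clean FUSE unmount, then a lazy unmount;
# report paths that are not mounted at all (checked via /proc/mounts).
unmount_rclone() {
    local m
    for m in "$@"; do
        if grep -qs " $m " /proc/mounts; then
            fusermount -u "$m" 2>/dev/null || umount -l "$m"
        else
            echo "not mounted: $m"
        fi
    done
}

# Mount points from the post above:
unmount_rclone /mnt/disks/OneDriveJ /mnt/disks/ODJsecure
```

A lazy unmount detaches the mount point immediately and cleans up once nothing is using it, which is why it tends to succeed where `fusermount` reports "Invalid argument" or "busy".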

 

Link to comment
  • 2 weeks later...

Hi All,

 

I'm currently copying data from my Unraid server to a secure remote (Google Drive) using the command below in my User Scripts:

 

rclone --transfers=32 --checkers=16 --drive-chunk-size=16384k --bwlimit 8M copy /mnt/user/foo/ secure:foo

 

Once the transfer is complete, I plan to delete the source data. With that in mind, what is the best way to verify that everything uploaded correctly before deleting the local data from my Unraid server? I believe Google Drive supports MD5 via rclone; I'm just not sure whether there's something I have to specify.

 

Any guidance would be greatly appreciated! 

Link to comment
2 minutes ago, newoski said:

How is that relevant? Data is data. I'd prefer not to have to source it and organize it again, thus the backup/question

 

🙂

If the data only exists in one place, you don't have a backup. And relying on a large corporation to keep your data safe for you, well, that's your call.

 

You could checksum the files, but if they get altered or corrupted after the fact, you will have no recourse to fix them. Perhaps using par2 sets with some level of redundancy, whatever you feel comfortable with.

 

As far as checking to be sure they were uploaded intact, the only way to be sure is to download them and run another checksum and compare it to the original. Checking them on the destination only gets you so far, you have to bring them back locally to use them, so until you know they can be retrieved intact, you can never be sure.
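For completeness, rclone does ship `check` (compare source and destination) and `cryptcheck` (for crypt remotes such as `secure:`, where plain hashes aren't visible) subcommands for this. The manifest-based approach described above can be sketched as follows; the paths here are throwaway temp directories so the steps can be run anywhere, with the actual rclone upload/download step left as a comment.

```shell
#!/bin/bash
# Checksum workflow sketch: record md5sums before uploading, then
# verify the retrieved data against the manifest afterwards.
workdir=$(mktemp -d)
mkdir -p "$workdir/src" "$workdir/restored"
echo "camera footage" > "$workdir/src/cam1.bin"

# 1. Record checksums of everything before uploading:
(cd "$workdir/src" && find . -type f -exec md5sum {} + > "$workdir/manifest.md5")

# 2. Upload (e.g. rclone copy "$workdir/src" secure:foo), and later
#    download to a scratch dir; here we just copy to simulate that.
cp "$workdir/src/cam1.bin" "$workdir/restored/"

# 3. Verify the restored copy against the original manifest:
(cd "$workdir/restored" && md5sum -c "$workdir/manifest.md5")
```

Keeping the manifest alongside the remote copy also gives you a baseline to re-check against later, independent of what the cloud provider reports.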

Link to comment
1 minute ago, jonathanm said:

If the data only exists in one place, you don't have a backup. And relying on a large corporation to keep your data safe for you, well, that's your call.

 

You could checksum the files, but if they get altered or corrupted after the fact, you will have no recourse to fix them. Perhaps using par2 sets with some level of redundancy, whatever you feel comfortable with.

 

As far as checking to be sure they were uploaded intact, the only way to be sure is to download them and run another checksum and compare it to the original. Checking them on the destination only gets you so far, you have to bring them back locally to use them, so until you know they can be retrieved intact, you can never be sure.

I have them on a local hard drive and the tower. I'd like to switch to the local machine plus GDrive. I'm seeing lots of rclone commands for verification. You're saying none are designed to check integrity against the source?

Link to comment

Forgive the newbie question:

 

Having a bit of trouble with rclone setup, specifically authorizing Dropbox. I am working on a remote machine: I select the option during setup and log in to the terminal on my remote machine, but I cannot authorize Dropbox. It does not redirect my browser, and the http://127.0.0.1 address just times out.

 

I'm sure this is simple, but would appreciate some assistance.

 

Thanks.

Link to comment
1 minute ago, sisren said:

Forgive the newbie question:

 

Having a bit of trouble with rclone setup, specifically authorizing Dropbox. I am working on a remote machine: I select the option during setup and log in to the terminal on my remote machine, but I cannot authorize Dropbox. It does not redirect my browser, and the http://127.0.0.1 address just times out.

 

I'm sure this is simple, but would appreciate some assistance.

 

Thanks.

 

What command do you run on your local box with rclone to authorize Dropbox?

 

Link to comment
4 hours ago, Dimtar said:

 

What command do you run on your local box with rclone to authorize Dropbox?

 

rclone authorize "dropbox"

 

Edit: I've tried both connecting via 127.x.x.x:53682/auth and plugging in my Unraid server's address with the same port.

 

Edited by sisren
Link to comment
