Waseh

[Plugin] rclone

623 posts in this topic


You should install it through the community applications plugin.

 

Also, thanks for pointing out that the link in the first post is outdated - I'll get that sorted.


Hey everyone. I've been having an issue with rclone for a month now. It comes and goes, but is nonetheless really annoying. With much help from this community I was able to get rclone to work; I have been sending files to it regularly and I use it with Plex, but here is what is happening (and keep in mind, nothing has changed with the scripts, the OS, or rclone): every 2 days I lose the mount and can no longer connect to G Suite. The only way to get access back is to fully restart the server; running the unmount and mount scripts again fails to fix it. Now, it's possible my unmount script is bunk, but still, I lose the connection all the time. I thought maybe it was an API issue, or maybe I was using too many connections to the G Suite drive, but I am not uploading to it at the time and I only have maybe 5 or 6 connections to it from Plex. Can anyone shed any light on why this might be happening or what I can do to stop it? I think it's a great service, but this is driving me nuts and I'm borderline going back to using hard disks - they are just less hassle. Thanks in advance.

On 6/24/2019 at 8:37 PM, statecowboy said:

Can someone please help me stop the currently running rclone process so I can update? I've tried killing processes, but I'm not sure I'm doing it right and I don't want to screw anything up. Here's what I get when attempting to update the plugin.

 

plugin: updating: rclone.plg

+==============================================================================
| Skipping package rclone-2018.08.25-bundle (already installed)
+==============================================================================

Downloading rclone
Downloading certs
Archive: /boot/config/plugins/rclone-beta/install/rclone-beta-latest.zip
creating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/rclone.1 
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/README.txt 
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/git-log.txt 
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/README.html 
inflating: /boot/config/plugins/rclone-beta/install/rclone-v1.48.0-011-g276f8ccc-beta-linux-amd64/rclone 
Copy failed - is rclone running?
plugin: run failed: /bin/bash retval: 1

I was actually running into the same issue, and I resolved mine by uninstalling the plugin, then reinstalling from the Community Apps page. You could give it a shot and see if it works for you.

 

I just checked the logs on my only rclone script beforehand, to be sure it was not in the middle of running, then uninstalled after I confirmed it wasn't running.

6 hours ago, ShortBusHero said:

I was actually running into the same issue, and I resolved mine by uninstalling the plugin, then reinstalling from the Community Apps page. You could give it a shot and see if it works for you.

 

I just checked the logs on my only rclone script beforehand, to be sure it was not in the middle of running, then uninstalled after I confirmed it wasn't running.

Yeah, I shut down the script and waited a week at one point but it was still running.  Maybe I can wait longer.  I'd just like to kill it somehow and be done with it so I can upgrade.  Thanks for the input.

4 hours ago, statecowboy said:

Yeah, I shut down the script and waited a week at one point but it was still running.  Maybe I can wait longer.  I'd just like to kill it somehow and be done with it so I can upgrade.  Thanks for the input.

Ok.....straight up: I don't experience any of these issues. My rclone beta updates on Unraid just fine, so I'll just outline what I do:

  1. Use the rclone-beta plugin (not Docker or anything else).
  2. I use rclone mount commands in scripts with the User Scripts plugin. You click "Run in background" with these scripts.
  3. When I want to stop rclone, all I ever do is run the command below. The path is wherever your rclone mount is mounted.
  4. You can verify rclone is killed by searching through htop in a terminal.
  5. Then you can update the rclone-beta plugin....and it WILL update. I've never had an issue with this.

 

umount -l  /mnt/disks/google

 

I'm running the very latest rclone beta....I just did this and it worked just fine. Frankly, my rclone-beta plugin updates automatically for me along with the rest of my plugins. I have no idea why you all are having these issues.
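The unmount-then-verify steps above could be rolled into one small user script. This is only a sketch: the mount point /mnt/disks/google is taken from the command above, and rclone_stopped is a helper name I made up.

```shell
#!/bin/bash
# Lazily unmount the rclone mount, then confirm no rclone process
# is left before updating the plugin.

# Succeeds (exit 0) only when no process named "rclone" remains.
rclone_stopped() {
    ! pgrep -x rclone > /dev/null
}

umount -l /mnt/disks/google 2>/dev/null

if rclone_stopped; then
    echo "rclone stopped - safe to update the plugin"
else
    echo "rclone still running - check htop before updating"
fi
```

The `pgrep -x` check is the same thing as scrolling through htop by hand, just automated.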

Edited by Stupifier

13 hours ago, statecowboy said:

Yeah, I shut down the script and waited a week at one point but it was still running.  Maybe I can wait longer.  I'd just like to kill it somehow and be done with it so I can upgrade.  Thanks for the input.

Simpler solution. Try updating right after a reboot?

Simpler solution. Try updating right after a reboot?
Nobody likes/wants to reboot their unRAID...but yes, that would work too.


Hi.

 

I am using this plugin to backup some stuff to my gdrive.

 

Is there any way to get a notification on a completed backup, or if there was an error during the backup? I would prefer the notification to be sent to Discord using a webhook, but Pushover or email is OK too.

 

I ended up with a script like this. Is there anything I should change in it?

 

rsync -ahv --delete --partial --ignore-existing source destination

2 hours ago, ProphetSe7en said:
Hi.
 
I am using this plugin to backup some stuff to my gdrive.
 
Is there any way to get a notification on a completed backup, or if there was an error during the backup? I would prefer the notification to be sent to Discord using a webhook, but Pushover or email is OK too.
 
I ended up with the script like this. Is there anything I should change for it?
 
rsync -ahv --delete --partial --ignore-existing source destination

Mmmm... there is no option in rclone for anything like that. But you can use an rclone flag to log output to a file, and then write a bash script that notifies via webhook (or whatever you want) when your rclone command finishes with a specific exit code. Then just tail the last couple of lines of the log file along with your webhook, so you can quickly see whether the rclone sync succeeded or not.

 

Here's just an example of something I quickly pulled from GitHub (source); I modified it slightly to make it easier.

 

#!/bin/bash

#######################
# INPUT VARIABLES
#######################
source=
destination=
maxtransfer=
rcloneflags=
logfile=
#######################
# END INPUT VARIABLES
#######################

rclone sync "$source" "$destination" \
  --drive-server-side-across-configs \
  --max-transfer "$maxtransfer" \
  --fast-list --size-only -vP --stats 5s \
  --tpslimit 4 --tpslimit-burst 20 --max-backlog 1000000 \
  --stats-file-name-length 0 $rcloneflags \
  --log-file="$logfile"
echo; echo "FINISHED sync from $source to $destination"; echo

# Sends a notification to my phone via the Join app for Android
wget -q "https://joinjoaomgcd.appspot.com/_ah/api/messaging/v1/sendPush?deviceNames=Phone&text=rclone_sync_complete&apikey=my_secret_key" > /dev/null

 

Edited by Stupifier


Thank you. Need to look into that and see what I can make of it.

 

I have made a script for testing Discord messages. This one sends a message to Discord; all it does is show a bell, ping my user, and post the text "this is a test". Now I need to figure out how to integrate it into the backup script to get the correct message.

 

curl -X POST "webhookurl" \
            -H "Content-Type: application/json"  \
            -d '{"username":"borg", "content":":bell: Hey <@userid>  This is a test"}' 

There is also a script that uses borg + rclone. At the end it sends an email on any error, or when the backup has finished without errors. It should be possible to change this to use Discord; I just don't know how yet.
https://pastebin.com/8WGmJgiQ
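Putting the pieces from the last two posts together (rclone's exit code plus the webhook curl), a sketch could look like the following. The "webhookurl" and <@userid> strings are the placeholders from the test script above, and build_message is a hypothetical helper name, not anything from rclone or Discord.

```shell
#!/bin/bash
# Sketch: turn the backup command's exit code into a Discord message,
# then POST it with the same curl call as the test script.

# Builds the notification text from an exit code ($1).
build_message() {
    if [ "$1" -eq 0 ]; then
        echo ":bell: backup finished without errors"
    else
        echo ":warning: backup FAILED (exit code $1)"
    fi
}

# Run the real backup here and capture its exit code, e.g.:
# rclone sync /mnt/user/Data remote:backup --log-file=/tmp/rclone.log
# status=$?
status=0   # stand-in so the sketch runs on its own

curl -X POST "webhookurl" \
    -H "Content-Type: application/json" \
    -d "{\"username\":\"borg\", \"content\":\"Hey <@userid> $(build_message $status)\"}" \
    || echo "webhook POST failed"
```

Swapping the email block in the borg script for a call like this should get you the Discord notification instead.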


Ya.....you're on the right track. Just go down the rabbit hole with that and experiment. I glanced at it, but I'm not much of a coder. Post up if you figure out an easy way to fire a command on rclone success/fail.


Hi guys.

 

Having a nightmare with permissions, and hopefully someone can help. I've naturally followed SpaceInvader's video, which helped.

 

I have successfully connected to Google Drive and Mega.

 

However, mounting and sharing via SMB is where I start having problems. Here are the commands I'm trying (manually or through User Scripts):


 

mkdir -p /mnt/disks/mega

rclone mount --max-read-ahead 1024k --allow-other mega: /mnt/disks/mega &

 

So it creates the directory, then mounts it to /mnt/disks/mega. This completes fine.

 

SMB Config: 

 

[mega]
      path = /mnt/disks/mega
      comment =
      browseable = yes
      # Public
      public = yes
      writeable = yes
      vfs objects =

 

But when I browse, I cannot write files to this share at all. Checking permissions in SSH gives this:

 

drwxrwxrwx 1 root root 0 Jul 26 15:56 mega/
 

Anyone had similar issues? I tried searching without success. Thanks.

 

 

4 hours ago, bitmass said:

Hi guys.

 

Having a nightmare with permissions, and hopefully someone can help. I've naturally followed SpaceInvader's video, which helped.

 

I have successfully connected to Google Drive and Mega.

 

However, mounting and sharing via SMB is where I start having problems. Here are the commands I'm trying (manually or through User Scripts):


 


mkdir -p /mnt/disks/mega

rclone mount --max-read-ahead 1024k --allow-other mega: /mnt/disks/mega &

 

So it creates the directory, then mounts it to /mnt/disks/mega. This completes fine.

 

SMB Config: 

 


[mega]
      path = /mnt/disks/mega
      comment =
      browseable = yes
      # Public
      public = yes
      writeable = yes
      vfs objects =

 

But when I browse, I cannot write files to this share at all. Checking permissions in SSH gives this:

 

drwxrwxrwx 1 root root 0 Jul 26 15:56 mega/
 

Anyone had similar issues? I tried searching without success. Thanks.

 

 

The general wisdom is that rclone mounts are NOT reliable for writes and should only be used for reads. That is not to say that ALL writes to an rclone mount will fail; I'm just saying you will have an inconsistent write experience. Sometimes it will work, but more often than not it will not.

 

If you need to put something onto cloud storage, use rclone copy/sync commands, OR use Google Drive File Stream aka GDFS (for Google Drive storage only, of course).


Can someone answer a question for me? Since using rclone with my Plex/Unraid server, I have noticed an overwhelming amount of RAM usage, even when only one stream from my rclone mount is in use. I have 24GB of RAM. The only Dockers are Plex and FileZilla (which is off), plus 2 VMs (also off), and usually I am at about 30-40% used, but when someone is streaming from the mount it shoots up to the high 90s, even with just one stream. Is there any way to use my SSD as a cache for rclone instead of my RAM, or would that not work? Thanks.


Is it possible to use this to sync an entire share to a Google Drive account, so that any edit to files locally would automatically be uploaded to the cloud?

On 9/10/2018 at 4:13 AM, vw-kombi said:

 

Yep - you are correct. That is IP 5.153.250.7, which is in the block list for my firewall... It was not manually entered by me; it is part of Skynet's firewall rules, so it must be in some sort of list. I will whitelist it and update shortly.

 

I'm currently having a similar issue to one an earlier user had, except that I can connect fine and download the file through curl just fine. I have tried everything that was done for that user and I still get nothing. I would love any insight into this.

plugin: installing: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg
plugin: downloading https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg ... done

+==============================================================================
| Skipping package rclone-2018.08.25-bundle (already installed)
+==============================================================================

Downloading rclone
Downloading certs
Download failed - No existing archive found - Try again later
plugin: run failed: /bin/bash retval: 1

Updating Support Links

 

 

--Update--

So I ran the plugin install from the command line and this is what I get.

 

plugin install https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg
plugin: installing: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg
plugin: downloading https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg ... done

+==============================================================================
| Skipping package rclone-2018.08.25-bundle (already installed)
+==============================================================================

Downloading rclone
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
Warning: Transient problem: timeout Will retry in 2 seconds. 3 retries left.
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
Warning: Transient problem: timeout Will retry in 2 seconds. 2 retries left.
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
Warning: Transient problem: timeout Will retry in 2 seconds. 1 retries left.
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
curl: (28) Resolving timed out after 5000 milliseconds
Downloading certs
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
Warning: Transient problem: timeout Will retry in 2 seconds. 3 retries left.
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
Warning: Transient problem: timeout Will retry in 2 seconds. 2 retries left.
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
Warning: Transient problem: timeout Will retry in 2 seconds. 1 retries left.
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0
curl: (28) Resolving timed out after 5000 milliseconds
Download failed - No existing archive found - Try again later
plugin: run failed: /bin/bash retval: 1

 

Edited by Visorak
Adding more information to get assistance.


Hi,

I've installed the rclone plugin to synchronize a folder sitting on my Unraid server to cloud storage.

I'm using rclone together with the User Scripts plugin. The script consists of the following command:

"rclone sync /mnt/user/Data/Media/Bilder /mnt/disks/HiDrive/Media/Bilder/"

I've configured User Scripts to run it every night at 2:00am.

In general everything is working (mounting, syncing); both source and destination folders are in sync.

However, the sync process takes almost 4 hours and saturates the internet connection, although nothing has changed in the source folder (so it seems to me that this is the time rclone needs just to check whether there are any changes).

We're talking about ~150GB in ~40,000 files (photo archive).

 

Any advice on how this can be accelerated?

 

Regards,

Hocky

3 hours ago, hocky said:

Hi,

I've installed the rclone plugin to synchronize a folder sitting on my Unraid server to cloud storage.

I'm using rclone together with the User Scripts plugin. The script consists of the following command:

"rclone sync /mnt/user/Data/Media/Bilder /mnt/disks/HiDrive/Media/Bilder/"

I've configured User Scripts to run it every night at 2:00am.

In general everything is working (mounting, syncing); both source and destination folders are in sync.

However, the sync process takes almost 4 hours and saturates the internet connection, although nothing has changed in the source folder (so it seems to me that this is the time rclone needs just to check whether there are any changes).

We're talking about ~150GB in ~40,000 files (photo archive).

 

Any advice on how this can be accelerated?

 

Regards,

Hocky

So you are doing two things wrong:

  1. You mounted an rclone remote and want to write to that mount. I'll just link you to my post from just over a week ago. Don't write to rclone mounts; instead, write to the rclone remote.
  2. You are using rclone copy/sync as you should for writing into your cloud storage location, BUT the destination you gave is where you mounted your rclone cloud storage. Instead, use your remote as the destination. For example, if in `rclone config` you set up your rclone remote with the name "google_drive", then this would be a simple sync command:

    rclone sync /mnt/user/Data/Media/Bilder google_drive:Media/Bilder/ -vP
    -v for verbose logs, so you can see wtf is going on
    -P to show live progress

 

 

If you don't like doing it this way, the alternative is to use GDFS (Google Drive File Stream); it is more reliable for writing directly to a local filesystem location where cloud storage is mounted.

 

The rclone documentation REALLY needs to make this point clear to people, because everyone is making this mistake. Everyone just wants to mount their cloud storage and start writing to it easily. It sort of works... but it is just NOT reliable and you'll get a crap experience.


Can anyone help me figure out how the new Google Photos sync works?

 

All I want to do is keep a copy of my Google Photos photos/videos in a folder on an Unraid share. I tried variations, but this is what I thought would work, and I'm getting errors:

 

rclone sync -v google-photos-username:/ /mnt/user/Multimedia/Photos/test/

 

I thought this would just sync everything, but I get the error:

 

2019/08/07 12:40:09 ERROR : upload: error reading source directory: directory not found

 

Can anyone see what I'm doing wrong?

 

Edit1: if I use the command:

 

rclone sync -v google-photos-username:/media/all/ /mnt/user/Multimedia/Photos/test/

 

It thinks for a few minutes and then I start seeing errors like: 

 

2019/08/07 13:00:04 ERROR : : error reading source directory: couldn't list files: Quota exceeded for quota metric 'photoslibrary.googleapis.com/all_requests' and limit 'ApiCallsPerProjectPerDay' of service 'photoslibrary.googleapis.com' for consumer 'project_number:REDACTED'. (429 RESOURCE_EXHAUSTED)

 

 

For reference: https://tip.rclone.org/googlephotos/

Edited by Coolsaber57


@Stupifier

Thanks a lot for your comment. I'll have a look at writing directly to the cloud storage.


So I changed the custom script command from

rclone sync /mnt/user/Data/Media/Bilder /mnt/disks/HiDrive/Media/Bilder/

to

rclone sync /mnt/user/Data/Media/Bilder HiDrive:users/tkhidrive/Media/Bilder/ -v

A couple of test runs look promising - I'm now down to 1m15s for a sync without changes, which took 4h before. 🙂

Thanks a lot, @Stupifier, for your excellent support!
 

On 8/8/2019 at 3:16 AM, hocky said:

So I changed the custom script command from

rclone sync /mnt/user/Data/Media/Bilder /mnt/disks/HiDrive/Media/Bilder/

to

rclone sync /mnt/user/Data/Media/Bilder HiDrive:users/tkhidrive/Media/Bilder/ -v

A couple of test runs look promising - I'm now down to 1m15s for a sync without changes, which took 4h before. 🙂

Thanks a lot, @Stupifier, for your excellent support!
 

You're welcome, and don't worry.....your problem is WAYYY too common. Almost every other post is about something similar to this. Everyone seems to want to use rclone that way, and it just doesn't work like that.

On 8/7/2019 at 11:41 AM, Coolsaber57 said:

Can anyone help me figure out how the new Google Photos sync works?

 

Can anyone see what I'm doing wrong?

 

Edit1: if I use the command:

 


rclone sync -v google-photos-username:/media/all/ /mnt/user/Multimedia/Photos/test/

 

It thinks for a few minutes and then I start seeing errors like: 

 


2019/08/07 13:00:04 ERROR : : error reading source directory: couldn't list files: Quota exceeded for quota metric 'photoslibrary.googleapis.com/all_requests' and limit 'ApiCallsPerProjectPerDay' of service 'photoslibrary.googleapis.com' for consumer 'project_number:REDACTED'. (429 RESOURCE_EXHAUSTED)

 

 

For reference: https://tip.rclone.org/googlephotos/

 

I'm having this EXACT same problem and it is driving me crazy. Would love to get some help figuring out what the deal is.

Edited by sunbear


Hi there... are there any known issues with the Jottacloud remote? I can't seem to authenticate against the Jottacloud servers anymore with rclone...

 

rclone v1.48.0
- os/arch: linux/amd64
- go version: go1.12.6

 

2019/08/14 06:24:10 Failed to create file system for "ula-jottacloud:": couldn't get account info: failed to get endpoint url: error 401: org.springframework.security.authentication.AuthenticationCredentialsNotFoundException: Token mismatch! (Unauthorized)

 

I tried everything, including removing the remote from the config and recreating it... to no avail...

 

any help welcome!

