[Plugin] rclone


Waseh

Recommended Posts

Hi everyone,

 

Having some problems getting rclone-beta installed on 6.6.7

 

I'm currently receiving the error below when I attempt to install it via the Apps tab:

plugin: installing: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg
plugin: downloading https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg ... done
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/archive/rclone-beta-2019.10.13-x86_64-1.txz ... done

+==============================================================================
| Installing new package /boot/config/plugins/rclone-beta/install/rclone-2019.10.13-bundle.txz
+==============================================================================

Verifying package rclone-2019.10.13-bundle.txz.
Installing package rclone-2019.10.13-bundle.txz:
PACKAGE DESCRIPTION:
Package rclone-2019.10.13-bundle.txz installed.
Downloading rclone
Downloading certs
Download failed - No existing archive found - Try again later
plugin: run failed: /bin/bash retval: 1

Updating Support Links

 

Running the following command in Terminal:

 

curl --connect-timeout 10 --retry 3 --retry-delay 2 --retry-max-time 30 -o /boot/config/plugins/rclone/install/rclone-current.zip https://downloads.rclone.org/rclone-current-linux-amd64.zip

 

 

I'm receiving the following:

 

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0Warning: Failed to create the file
Warning: /boot/config/plugins/rclone/install/rclone-current.zip: No such file
Warning: or directory
  0 11.1M    0  4735    0     0    857      0  3:47:54  0:00:05  3:47:49  1060
curl: (23) Failed writing body (0 != 4735)

 

I've disabled the network firewall, restarted multiple times, and deleted the rclone-beta folder on the flash drive.

 

Any recommendations on how to resolve this/get rclone installed successfully? 
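For what it's worth, the curl warning ("Failed to create the file ... No such file or directory") usually means the target directory doesn't exist yet, so creating it before retrying the download is worth a try (paths taken from the plugin's install location above):

```shell
# Create the install directory the plugin expects, then retry the download
mkdir -p /boot/config/plugins/rclone/install
curl --connect-timeout 10 --retry 3 --retry-delay 2 \
    -o /boot/config/plugins/rclone/install/rclone-current.zip \
    https://downloads.rclone.org/rclone-current-linux-amd64.zip
```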

  • 4 weeks later...

Hello guys. I seem to be facing a strange issue. I tried to search for a similar issue in this thread but failed, so I hope I'm not repeating the question.

 

I configured my first FTP remote and set the concurrency limit to 1 (I need a strict limit of one file at a time in this case). When I try to sync content from it to my local share, it always transfers one random file and then hangs, as if it were waiting for something. When I run the command with the -P switch, I see exactly that: one file appears to be done, the rest of the pipeline is not moving, the ETA keeps growing, the average speed keeps decreasing, and the current transfer stays at 0. The Transferred counter is not growing either, which is why I think it is waiting for the previous transfer to complete even though it already has. Any ideas? Your help would be very much appreciated. Please see the picture below:

 

[screenshot: rclone sync -P output showing one completed transfer and the rest stalled]

 

This issue does not occur when using SMB, i.e. mounting the share and then syncing between two local folders without any remote.

Thank you.

Hello guys. I seem to be facing a strange issue. [...]
I'd ask in the official rclone forums. I think you'll get better help with your problem there.

I am backing up 820GB (150k files) nightly to B2 and keep exceeding the free 2500 Class C Transaction limit.  I bumped this limit big time on my initial upload and added the --fast-list flag to my command and it made a huge difference but for whatever reason I am going over my daily limit still.

 

My cron job runs nightly and the stats show 20-30% cpu usage for 60+ minutes to complete the sync.  I am updating very little each night but do want to make sure everything stays synced.  How can I reduce the Class C transactions, specifically list_file_names?

14 minutes ago, ur6969 said:

I am backing up 820GB (150k files) nightly to B2 and keep exceeding the free 2500 Class C Transaction limit.  I bumped this limit big time on my initial upload and added the --fast-list flag to my command and it made a huge difference but for whatever reason I am going over my daily limit still.

 

My cron job runs nightly and the stats show 20-30% cpu usage for 60+ minutes to complete the sync.  I am updating very little each night but do want to make sure everything stays synced.  How can I reduce the Class C transactions, specifically list_file_names?

This is probably another question I'd ask in the official rclone forums, since it's more about general rclone operation than the Unraid plugin itself.

 

But... I dunno, maybe look into the --tpslimit and --tpslimit-burst flags.
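For reference, --tpslimit and --tpslimit-burst are real rclone global flags; a sketch of how they could be combined for a B2 sync (the remote name, bucket, and limit values here are placeholders to adjust for your setup):

```shell
# Cap API transactions per second to stay under B2's Class C allowance;
# --fast-list trades memory for fewer list_file_names calls.
rclone sync /mnt/user/Backups b2remote:my-bucket \
    --fast-list \
    --tpslimit 4 \
    --tpslimit-burst 1
```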

20 hours ago, Stupifier said:
23 hours ago, Bartist said:
Hello guys. I seem to be facing a strange issue. [...]

I'd ask in the official rclone forums. I think you'll get better help with your problem there.

I sorted it out. I ditched the FTP remote approach. Instead I mounted the SMB share locally and check size only: "rclone sync <smb mount> <local folder> --size-only --transfers 1 --checkers 1". Works like a charm.


is this plugin still working?

 

I am getting:

plugin: installing: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/rclone.plg
plugin: downloading https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/rclone.plg
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/rclone.plg ... done
plugin: downloading: http://beta.rclone.org/v1.34-11-gbf243f3/rclone-v1.34-11-gbf243f3β-linux-amd64.zip ... failed (Invalid URL / Server error response)
plugin: wget: http://beta.rclone.org/v1.34-11-gbf243f3/rclone-v1.34-11-gbf243f3β-linux-amd64.zip download failure (Invalid URL / Server error response)

3 hours ago, bar1 said:

is this plugin still working?

 

I am getting:

plugin: installing: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/rclone.plg
plugin: downloading https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/rclone.plg
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/rclone.plg ... done
plugin: downloading: http://beta.rclone.org/v1.34-11-gbf243f3/rclone-v1.34-11-gbf243f3β-linux-amd64.zip ... failed (Invalid URL / Server error response)
plugin: wget: http://beta.rclone.org/v1.34-11-gbf243f3/rclone-v1.34-11-gbf243f3β-linux-amd64.zip download failure (Invalid URL / Server error response)

Try using the rclone plugin and not the rclone beta.


@bar1 You are not using the Community Applications addon, are you?

The links to the plgs in the first post were outdated. They have been updated now, but you really should consider using Community Applications instead of installing manually.

Edited by Waseh
On 1/25/2020 at 8:52 PM, Waseh said:

@bar1 You are not using the Community Applications addon, are you?

The links to the plgs in the first post were outdated. They have been updated now, but you really should consider using Community Applications instead of installing manually.

OK, noted. I will install your plugin from CA.

I already have the rclone mount docker up and running... so hopefully it won't conflict.


I have set up rclone to run once a day, and it seems to start sometime between 1 AM and 5 AM (My local time). I have a 5 MB/s speed limiter. 

 

What is weird is that if rclone is already running a backup, it sometimes starts a second backup, and that really messes up my router and bandwidth.

 

Is there a way to make sure that if rclone is running, it will not start again? 


 

Quote

First instance of backup

 

2020/02/01 12:04:40 INFO :
Transferred: 488.304G / 727.036 GBytes, 67%, 4.848 MBytes/s, ETA 14h23s
Errors: 2 (retrying may help)
Checks: 2157 / 2157, 100%
Transferred: 22 / 49, 45%
Elapsed time: 28h38m56.3s
Transferring:
* Filmer/Batman & Robin ….mkv: 38% /67.497G, 1.245M/s, 9h26m59s
* Filmer/Batman (1989)/B…umkv: 37% /66.613G, 1.329M/s, 8h56m56s
* Filmer/Dragged Across ….mkv: 40% /36.956G, 1.449M/s, 4h20m2s
* Filmer/Extremely Wicke….A.mkv: 46% /28.272G, 895.955k/s, 4h53m56s

 

Second instance of backup:

 


2020/02/01 12:05:04 INFO :
Transferred: 116.424G / 779.134 GBytes, 15%, 4.466 MBytes/s, ETA 1d18h12m39s
Errors: 1 (retrying may help)
Checks: 2165 / 2165, 100%
Transferred: 12 / 58, 21%
Elapsed time: 7h24m56.2s
Transferring:
* Filmer/Aquaman (2018)/….mkv: 19% /57.404G, 1.204M/s, 10h51m43s
* Filmer/Batman v Superm….mkv: 39% /76.825G, 1.595M/s, 8h17m2s
* Filmer/Big Hero 6 (201….mkv: 59% /48.279G, 1.695M/s, 3h16m40s
* Filmer/Blade Trinity (….mkv: 92% /29.054G, 158.344k/s, 4h0m31s

 

Edited by Halvliter
added question
8 minutes ago, Halvliter said:

I have set up rclone to run once a day, and it seems to start sometime between 1 AM and 5 AM (My local time). I have a 5 MB/s speed limiter. 

 

What is weird is that if rclone is already running a backup, it sometimes starts a second backup, and that really messes up my router and bandwidth.

 

Is there a way to make sure that if rclone is running, it will not start again? 

There's nothing weird about it. One of your backups has been running for more than 24 hours, so it overlaps with the "once a day" schedule.

You have to write your script properly to detect this scenario.

18 minutes ago, testdasi said:

There's nothing weird about it. One of your backups has been running for more than 24 hours, so it overlaps with the "once a day" schedule.

You have to write your script properly to detect this scenario.

Thank you for the quick reply. I've tried to google it, but I'm not sure if I'm on the right track:

 

Quote

#!/bin/bash
if pidof -o %PPID -x "rclone-cron.sh"; then
    exit 1
fi
rclone sync …
exit
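Another common pattern (not from any guide I've followed; the lock path and sync command here are placeholders) is to let flock handle the mutual exclusion instead of matching process names:

```shell
#!/bin/bash
# Hold an exclusive lock for the life of the script; a second instance
# fails the non-blocking flock immediately and exits instead of
# starting another sync.
exec 200>/tmp/rclone-backup.lock
flock -n 200 || { echo "Backup already running"; exit 1; }

# Placeholder sync command - substitute your own source, remote, and limit
rclone sync /mnt/user/Backups remote:backups --bwlimit 5M
```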

 

Edited by Halvliter
38 minutes ago, testdasi said:

I prefer to use the status-file method, e.g.:

  1. Check if a status file exists. If yes, exit the script.
  2. If the file does not exist, touch the status file (to create it), then do the work, and once all done delete the status file.

I find this method works really well with cross-script signalling.
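The two steps above can be sketched as a script (the status-file path and the sync command are placeholders; the trap is an extra safeguard so the marker is removed even if the sync fails):

```shell
#!/bin/bash
STATUS=/tmp/rclone-backup.running   # hypothetical status-file path

# 1. If the status file exists, a previous run is still going - exit.
if [ -e "$STATUS" ]; then
    exit 1
fi

# 2. Create the status file, do the work, and delete it when done.
touch "$STATUS"
trap 'rm -f "$STATUS"' EXIT

# Placeholder sync command - substitute your own source and remote
rclone sync /mnt/user/Backups remote:backups --bwlimit 5M
```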

Thanks again!

Too complex for me right now, but I will remember it when I have some extra time to learn how to do it. I've just used SpaceInvaderOne's guide for now.


I am trying to track down higher-than-normal API calls on B2 with rclone. I have been in the official rclone forums and they told me what to log. However, when I add --log-file to the script I get a log but am unable to open it. When I try to open it from my W10 PC I get the following:

 

You do not have permission to open this file.  See the owner of the file or an administrator to obtain permission.

 

The log file is saved to "/mnt/user/Folder/log.txt" where "Folder" is a normal folder I access files with from the PC. It's there, I can see it, but it won't let me open it. I've searched but I've given up!

On 2/8/2020 at 11:07 PM, ur6969 said:

The log file is saved to "/mnt/user/Folder/log.txt" where "Folder" is a normal folder I access files with from the PC. It's there, I can see it, but it won't let me open it. I've searched but I've given up!

rclone creates the log with weird permissions, so you have to use the console (either via SSH or the local CLI) to change the permissions and owner.
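From the console that might look like this (using the path from the post above; nobody:users is Unraid's usual share ownership, so adjust if yours differs):

```shell
# Give the rclone log the standard Unraid share ownership
chown nobody:users /mnt/user/Folder/log.txt
# Make it readable from SMB clients
chmod 644 /mnt/user/Folder/log.txt
```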

2 minutes ago, torch2k said:

Is there any way to mount with rclone but have the mounted files owned by nobody:users?

 

I'm not able to see anything over SMB/shares due to the ownership of mounted files being root.

Example command. Specifically, pay attention to the --uid, --gid, and --default-permissions flags:

rclone mount \
   --allow-other \
   --dir-cache-time 96h \
   --fast-list \
   --drive-chunk-size 512M \
   --buffer-size 256M \
   --vfs-read-chunk-size 128M \
   --vfs-read-chunk-size-limit off \
   --uid 99 --gid 100 \
   --default-permissions \
   --user-agent="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.131 Safari/537.36" \
   rclone_remote: /mnt/disks/rclone_remote

 

On 2/10/2020 at 1:23 PM, testdasi said:

rclone creates the log with weird permissions, so you have to use the console (either via SSH or the local CLI) to change the permissions and owner.

I have looked and even searched again given the context of what you are saying but cannot find any information.  Can you give me any more details or point me in the right direction?  Thanks


Sorry all if this is a dumb question.  I'm not sure if I'm searching for the right things.  Here goes:

The Rclone app configuration has the config script, Rclone custom script, mount script and unmount script.

 

My question is: when/where can I invoke these scripts? I don't see that my backups are running, and I'd like to mount the storage, but I don't see it mounted as I wrote it in the script.

