[Plugin] rclone


Waseh

Recommended Posts

Question: when I do a Plex scan of any media in my library, my array goes undefined. Some other users stated it could be rclone causing the problem. I tried to find the answer but still no luck.

 

here is my rclone cache script: rclone mount "google cache": /mnt/user/Media/GoogleDrive --allow-other --dir-cache-time 1000h --log-level INFO --timeout=1h --umask 002 --user-agent shedman214 --cache-dir=/mnt/user/Media/GoogleDrive_cache --vfs-cache-mode writes --vfs-cache-max-size 1G --vfs-cache-max-age 20s &

syslog.txt

Link to comment

@animeking

This could be related. If your problem is indeed tied to that issue, I'm not sure you'll be able to get this to work until a fix is deployed.

Did you check how much memory rclone is using before the array becomes unresponsive? You could run htop over SSH and look for "rcloneorig".

If rclone is filling your memory then the question is probably better suited for the rclone forum.
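
If you'd rather grab a quick snapshot than watch htop, something like this works too (a minimal sketch; "rcloneorig" is the process name mentioned above):

# list the most memory-hungry processes and pick out rclone (RSS column is in KB)
ps aux --sort=-rss | grep -i rclone | head -n 5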

Edited by Waseh
Link to comment
On 12/30/2019 at 1:45 AM, rahmor said:

Hi everyone,

 

Having some problems getting rclone-beta installed on 6.6.7

 

Currently receiving the below error when I attempt to install it via the Apps tab


plugin: installing: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg
plugin: downloading https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/plugin/rclone.plg ... done
plugin: downloading: https://raw.githubusercontent.com/Waseh/rclone-unraid/beta/archive/rclone-beta-2019.10.13-x86_64-1.txz ... done

+==============================================================================
| Installing new package /boot/config/plugins/rclone-beta/install/rclone-2019.10.13-bundle.txz
+==============================================================================

Verifying package rclone-2019.10.13-bundle.txz.
Installing package rclone-2019.10.13-bundle.txz:
PACKAGE DESCRIPTION:
Package rclone-2019.10.13-bundle.txz installed.
Downloading rclone
Downloading certs
Download failed - No existing archive found - Try again later
plugin: run failed: /bin/bash retval: 1

Updating Support Links

 

Running the following command in Terminal:

 


curl --connect-timeout 10 --retry 3 --retry-delay 2 --retry-max-time 30 -o /boot/config/plugins/rclone/install/rclone-current.zip https://downloads.rclone.org/rclone-current-linux-amd64.zip

 

 

I'm receiving the following:

 


  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:--  0:00:05 --:--:--     0Warning: Failed to create the file
Warning: /boot/config/plugins/rclone/install/rclone-current.zip: No such file
Warning: or directory
  0 11.1M    0  4735    0     0    857      0  3:47:54  0:00:05  3:47:49  1060
curl: (23) Failed writing body (0 != 4735)

 

I've disabled the network firewall, restarted multiple times, and deleted the rclone-beta folder on the flash drive.

 

Any recommendations on how to resolve this/get rclone installed successfully? 

Did you ever figure this out? I am in the same situation.
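
One thing worth checking: the curl warnings ("Failed to create the file ... No such file or directory") suggest the install directory doesn't exist when curl tries to write. A minimal test, reusing the exact path from the command above:

mkdir -p /boot/config/plugins/rclone/install
curl --connect-timeout 10 --retry 3 --retry-delay 2 --retry-max-time 30 -o /boot/config/plugins/rclone/install/rclone-current.zip https://downloads.rclone.org/rclone-current-linux-amd64.zip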

Link to comment
On 9/22/2020 at 9:55 AM, Lilarcor said:

For the latest version, will it re-download rclone after each reboot? I experienced that issue when using the 2019 version.

 

I pushed a new version yesterday that will only re-download rclone at boot if a newer version is available :) 

Link to comment

Hey folks,

Can someone please help me out? I had issues using this in the past, but I'm committed to making it work now.

I am using Google Drive.

 

I am not interested in mounting the drive, just in running a daily script. I want something as close to a true backup as possible, to protect against ransomware or accidental deletion.

What would be the best script for this? I want to be able to restore easily if ransomware comes along and I don't catch it for a while; I need to be able to pull down the files from before it hit... I already have the User Scripts plugin installed. Thank you so much!

 

With encryption, of course. Thanks again.

Edited by maxse
Link to comment

I am trying to add more remotes to my config, but after adding a total of about 34 drives I seem to have hit a character limit in Edit Config.
When I add remotes via the command-line process (painfully slow when you have 34 drives), it gets past this character limit, so I know it isn't a file limitation but a limit in the Unraid web interface's Edit Config page.

Where is the config file located, so I can add more remotes and bypass the web interface?
Also, can the character limit be increased?

 

Link to comment
6 hours ago, ServerNewbi said:

I am trying to add more remotes to my config, but after adding a total of about 34 drives I seem to have hit a character limit in Edit Config.
When I add remotes via the command-line process (painfully slow when you have 34 drives), it gets past this character limit, so I know it isn't a file limitation but a limit in the Unraid web interface's Edit Config page.

Where is the config file located, so I can add more remotes and bypass the web interface?
Also, can the character limit be increased?

 

In a terminal, run "rclone config file" to find the location of your rclone.conf file, then edit it manually.
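
For example (a sketch; the path in the output varies by install and is only illustrative):

rclone config file
# Configuration file is stored at:
# /boot/config/plugins/rclone/.rclone.conf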

Link to comment
13 minutes ago, maxse said:

Anyone? There is so much to learn and read, and I'm not sure which script to use for the purposes I mentioned above...

 

rclone is just a mount/download/upload tool......it does not contain any special backup functions like you are asking for. You are looking for one of these:

  1. A script/tool which performs a backup AND moves that backup to Google Drive (using rclone)
  2. Any backup script/tool...then you manually move/copy the backup to Google Drive (using rclone)...people here can gladly supply the rclone command to copy/move that backup directory onto Google Drive....that part is dead simple though (see the sketch after this list).
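
For the "dead simple" part, something like this is all it takes (a minimal sketch; "gdrive:" stands in for whatever you named your remote):

rclone copy /mnt/user/backups gdrive:backups -v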

In any event.....this is a support thread for rclone help....not so much backup script help. I'd ask for help somewhere else. I think that is why you are not receiving responses here.....

Edited by Stupifier
Link to comment

I understand what you're saying, but all I could find on here is how to mount, play files from the mount, etc. I was looking for suggestions like how to limit file transfer speeds for Google Drive so I don't get banned; I remember reading something about it here a while ago but can't find it anymore... There is also a switch, --backup-dir or something like that, that can place the modified files into a directory with that date instead of overwriting them. I was hoping someone here already had a script for something like this and could help.

Thanks

Link to comment
6 minutes ago, maxse said:

I understand what you're saying, but all I could find on here is how to mount, play files from the mount, etc. I was looking for suggestions like how to limit file transfer speeds for Google Drive so I don't get banned; I remember reading something about it here a while ago but can't find it anymore... There is also a switch, --backup-dir or something like that, that can place the modified files into a directory with that date instead of overwriting them. I was hoping someone here already had a script for something like this and could help.

Thanks

I don't personally know too much about the --backup-dir flag in rclone.....but limiting file transfer speeds is done with the "--bwlimit 2M" flag. That will limit you to 2 MB/s transfer speed. More flags here:
https://rclone.org/flags/
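
For example (a sketch; "gdrive:" is a placeholder remote name):

rclone copy /mnt/user/Media gdrive:Media --bwlimit 2M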

Not sure if you are limiting yourself to just the Unraid forum in your search........but I would broaden it out to basically anything Linux-based. There should be tons of scripts out there for you to find. I personally use something called "rdiff-backup"....it doesn't back up to the cloud natively....but that is no big deal....you can manually move/copy the backup to the cloud when it's done.

Edited by Stupifier
Link to comment
On 10/7/2020 at 8:53 PM, maxse said:

Can someone please help me out? I had issues using this in the past, but I'm committed to making it work now.

I am using Google Drive.

 

I am not interested in mounting the drive, just in running a daily script. I want something as close to a true backup as possible, to protect against ransomware or accidental deletion.

What would be the best script for this? I want to be able to restore easily if ransomware comes along and I don't catch it for a while; I need to be able to pull down the files from before it hit... I already have the User Scripts plugin installed. Thank you so much!

 

With encryption, of course. Thanks again.

 

Many websites claim that Google Drive is safe against ransomware, as a deleted file can be restored through the web GUI, and if a file becomes encrypted you can restore the original because Google saves 100 versions of each file. But I'm not sure. What happens if the attacker replaces the file with a 1 kB file and re-uploads it 101 times? The Google FAQ only says it "may" recover files, not that it "can".

 

This means that if your server has been attacked, it has an authenticated connection to your Google Drive, and through that the attacker is able to overwrite all the data 101 times, and the files are gone (feel free to contradict me).

 

As it is not possible to mount only a specific subfolder, the only ransomware-safe solution would be to share the data with a second Google Drive account that automatically moves the data out of the shared folder (this could be done by a password-protected VM or a Raspberry Pi in a different network segment). Depending on the size of the files, this could be done with a free account (for uploads) and a paid one (the account that moves the data).

 

But maybe it's even better to use Google Drive only as a disaster backup against theft, earthquake, fire, etc., and not rely on ransomware safety.

 

Against ransomware you should guard your Unraid server itself. Suggestions:

- Use only one client to log into Unraid with your root account; this client should run a different OS than your other clients, and ideally it should have no internet access.

- User shares should be backed up inside the server to a different share that is not accessible to users (rsync).

- Client backups should be picked up, not uploaded. As an example: I add a share on my Windows PC for the folder "Users"; this share is read-only and password protected. Through Unassigned Devices this share is mounted on my Unraid server, and rsync picks up the files. That way Unraid has only read access to the client, and the client has no access at all (except to the user share, which is on a different disk). A minimal sketch follows below.
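
Something like this (a sketch; the mount point and target share are placeholders for however you named them):

# pull the client's files from the read-only SMB mount into a protected share
rsync -av --delete /mnt/remotes/WINPC_Users/ /mnt/user/clientbackup/winpc/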

 

 

But in the end, the only truly ransomware-safe protection is an offline backup.

 

 

Edited by mgutt
Link to comment

Folks, I almost got it! I am using the script that @mgutt posted, and it is working great!

rclone copy /mnt/user/abc secure:abc --create-empty-src-dirs --ignore-checksum --bwlimit 8M --checkers 2 --transfers 1 -vv --stats 10s

 

My only issue (and I have been reading all day and can't figure it out) is how to have rclone create a subfolder in that directory with the date and time stamp attached to the folder name, and place files that were modified or deleted from the source into that folder. I understand I have to use --backup-dir somewhere, but I don't know how to get it to add the date and time stamp... This would be helpful if ransomware encrypts files and a sync starts before I have the chance to catch it, or in the case of accidental deletion...

 

I checked restoring from the trash in Google Drive directly; however, with an encrypted drive it's hard to figure out which files I would need to restore... It would be best to create this kind of folder, so in the case of disaster I can just mount it, drag and drop, and restore the original unmodified files.

Link to comment

Thanks bud. I did read it. Of course it's possible that ransomware will take control of Google Drive and overwrite files 100 times. But with the --backup-dir command, the original files would still keep nesting in the backup directory, unless this ransomware runs a delete command for the drive? I'm also not planning on mounting it, so hopefully the chances of that happening are much lower.

 

I actually saw that exact post and added it to your script: rclone copy /mnt/user/backups secure:backups --backup-dir remote:backups/`date -I` --create-empty-src-dirs --ignore-checksum --bwlimit 8M --checkers 2 --transfers 1 -vv --stats 10s

 

That's what I wrote exactly, and I get an error and it won't run at all.

 

Also, out of curiosity: since Google Drive does record previous versions, but this is an encrypted remote, how would I know which files to restore to their previous versions, since everything is encrypted when browsing the drive directly?

 

Thank you so much!

Link to comment
12 minutes ago, maxse said:

The original files would still keep nesting in the backup directory, unless this ransomware runs a delete command for the drive?

Correct. Now you should ask yourself: if you were an attacker and wanted to earn some Bitcoins, would you leave some dirs in the victim's cloud untouched, especially ones with "backup" in the name? ^^

 

This backup-dir is an ordinary subdir in the Google Drive. There is nothing special about it. Google Drive does not know that rclone created it for special backup purposes.

 

Edited by mgutt
Link to comment

Right, I understand. I thought you meant to use the fact that Google keeps multiple versions of a file without using --backup-dir. I see now it's important, because that folder would be easy to navigate, especially in the event of accidental deletion or a changed file, etc...

 

I am really hoping that Unraid comes up with snapshots soon. I am strongly considering xpenology because of its snapshots, at least trying it out in a VM, but I digress.

 

Could you please help me out with the script though? I can't get it to work :(

 

BTW this is the error that I get

 

Failed to make fs for --backup-dir "remote:backups/2020-10-09": didn't find section in config file
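
That error means rclone is looking for a remote literally named "remote", which isn't in your config; --backup-dir has to point at a configured remote, and it can't overlap the destination path either. A sketch using the "secure:" remote from above ("backups_old" is just an example name):

rclone copy /mnt/user/backups secure:backups --backup-dir secure:backups_old/$(date -I) --create-empty-src-dirs --ignore-checksum --bwlimit 8M --checkers 2 --transfers 1 -vv --stats 10s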

Edited by maxse
Link to comment

BTW, hypothetically, has anyone ever been able to get this to work with radarr or the like, so that no files are stored locally? I'm thinking that radarr would put the finished file in a dummy folder, then that file would be moved to the cloud, and Plex could be pointed to the separately mounted cloud drive...

But then what would happen with radarr? Wouldn't it think the file is "missing" and try to fetch it again? I understand you wouldn't want radarr pointing at the mounted cloud drive, because you should only read from the mount, not write to it... Does anyone have a setup like this that works?

Link to comment
5 minutes ago, maxse said:

BTW, hypothetically, has anyone ever been able to get this to work with radarr or the like, so that no files are stored locally? I'm thinking that radarr would put the finished file in a dummy folder, then that file would be moved to the cloud, and Plex could be pointed to the separately mounted cloud drive...

But then what would happen with radarr? Wouldn't it think the file is "missing" and try to fetch it again? I understand you wouldn't want radarr pointing at the mounted cloud drive, because you should only read from the mount, not write to it... Does anyone have a setup like this that works?

You're a bit late to the game. People have been doing this for the better part of 3 years.....countless projects on GitHub and other places.

Link to comment
