[Plugin] rclone


Waseh


Hey everyone, I am looking to set up something similar to what I have seen used for Plex servers etc., however I only want to offload data to my Google Drive. That said, I don't even need access to the data on my server once it's uploaded, nor do I want it encrypted. Does anyone have a guide, or can point me in the right direction?

 

I.e., a share on my server called "gdrive": whenever I drop a file in said share, it uploads it to my Google Drive with no encryption, then deletes it from the "gdrive" share.
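For what it's worth, a minimal User Scripts sketch of that workflow might look like this (the remote name gdrive, the share path, and the destination folder are all assumptions; adjust to your setup):

```shell
#!/bin/bash
# Sketch: push everything in the local "gdrive" share to Google Drive,
# then delete the local copies. Assumes a remote named "gdrive" was
# already created with `rclone config`.
SRC="/mnt/user/gdrive"
DEST="gdrive:uploads"

# --min-age skips files that may still be mid-copy into the share;
# --delete-empty-src-dirs tidies up the leftover folder structure.
CMD="rclone move $SRC $DEST --min-age 15m --delete-empty-src-dirs -v"
echo "Would run: $CMD"

# Only actually run on a box that has rclone installed
if command -v rclone >/dev/null 2>&1; then
    $CMD
fi
```

Schedule it every few minutes with the User Scripts plugin and the share empties itself after each upload.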

1 hour ago, Archemedees said:


Research a GitHub project called cloudplow. It does exactly what you want. I will NOT walk you through setting it up; review the README carefully.

 

You MAY have to use a Docker version of cloudplow if Unraid has issues installing it the standard way.


Hi, I am currently facing an issue with an rclone-mounted Google Drive that is shared to Windows as an SMB share (mapped network drive).

 

Issue: if I try to copy or do any action on this drive, it gives me a permission error: "You need permission to perform this action".

 

Screenshot: https://i.imgur.com/sAnLaeu.png

 

Please take a look at the thread below, where I have described the issue in detail with all the logs/screenshots/commands. Thanks.

 

Thread: https://forums.unraid.net/topic/100567-rclone-permission-and-writefilehandlewrite-issue/

 

Thanks!

 

P.S.: If you want, I can post the whole thing in this thread too, in case you cannot provide support over in the General Support section. Please let me know!

 

Edited by learningunraid
  • 2 weeks later...

Hey All

 

Having an issue with rclone, using: 

 

Unraid Nvidia 6.8.3

Rclone 1.53.3

 

I have a simple user script which runs: rclone sync -v /mnt/user/Media/ /mnt/disks/offload

 

This syncs my media to a Synology backup box, which is mounted via Unassigned Devices at the offload folder.

 

The error is pretty obvious: "Failed to copy: write: no space left on device".

 

However, the device has 29 TB free.

 

Any thoughts would be much appreciated. 

 

This had been working great for a few years, until this month.
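One common cause of that exact symptom on Unraid (offered as a guess, not a diagnosis): if the Unassigned Devices mount drops, /mnt/disks/offload becomes a plain folder on the small RAM-backed rootfs, which fills up almost immediately. A guard like this in the user script would rule that out:

```shell
#!/bin/bash
# Only sync if the Unassigned Devices share is really mounted; otherwise
# rclone would write into the RAM-backed rootfs and fail with
# "no space left on device" even though the NAS has plenty free.
DEST="/mnt/disks/offload"

if grep -qs " $DEST " /proc/mounts; then
    echo "$DEST is mounted, starting sync"
    command -v rclone >/dev/null 2>&1 && rclone sync -v /mnt/user/Media/ "$DEST"
else
    echo "$DEST is not mounted, aborting"
fi
```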

Edited by purplechris
  • 2 weeks later...

Hey,

 

I've been using rclone for a while on Google Drive (Encrypted), and have now decided to take it all locally -- ~50TB in total.

 

What's the right way to think about copying all of this locally? I know the 750GB/day limit will make this take forever, so thinking about the correct way to set up rclone to copy data from a remote to local.

 

Presumably the plan would need to handle internet downtime / happen automatically, so I guess that I'd want the reverse of rclone/mergerfs.

 

 

So the setup seems like it'd be:

 

1. stop uploading data to google drive

2. long-running rclone ? script to copy to a local directory

 

My config looks like

 

[gdrive]
type = drive
client_id = x
client_secret = x
scope = drive
root_folder_id =
service_account_file =
token = x

[gcrypt]
type = crypt
remote = gdrive:/encrypt
filename_encryption = standard
directory_name_encryption = true
password = x
password2 = x

 

and my unraid script is https://pastebin.com/uKWECVSx
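Not the author's script, but a sketch of what step 2 could look like, assuming the gcrypt remote above and a hypothetical local target path. rclone copy skips files that already exist at the destination, so after internet downtime you can simply rerun it and it resumes:

```shell
#!/bin/bash
# Bulk pull from the crypt remote to local storage. Rerunning after an
# interruption is safe: already-copied files are skipped.
SRC="gcrypt:"
DEST="/mnt/user/restore"   # hypothetical local target

CMD="rclone copy $SRC $DEST --transfers 4 --checkers 8 \
  --retries 10 --low-level-retries 20 \
  -v --log-file /mnt/user/appdata/rclone-restore.log"
echo "Would run: $CMD"

if command -v rclone >/dev/null 2>&1; then
    eval "$CMD"
else
    echo "rclone not installed on this machine"
fi
```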

11 minutes ago, willm said:


  1. 750 GB/day is for UPLOADS to Google. It's about 10 TB/day for DOWNLOADS from Google.
  2. You can work around these limits with Google Service Accounts (no, I won't explain how to set that up).
  3. There are scripts on GitHub, such as sasync, to help you download to local. These scripts can cycle service accounts so you can download/upload a lot per day. Search GitHub to find them.

Thanks for the suggestion, Stupifier. I didn't realize the download limit was 10 TB. I had been thinking about the service-account approach, but with the 10 TB limit I don't think it's needed; I could fully saturate my gigabit down connection and not hit 10 TB a day.

 

I've started a typical rclone sync from my decrypted mount to a new directory. It seems to be running fine ATM, weeks to go :)

18 minutes ago, willm said:


50 TB of total data. I'd do your transfer in chunks... I'd also run it in a tmux/screen terminal session to avoid disconnects.
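A small sketch of the tmux part (the session name is arbitrary; the copy command and paths are placeholders from the earlier posts):

```shell
#!/bin/bash
# Run the long copy inside a detached tmux session so an SSH disconnect
# doesn't kill the transfer.
SESSION="rclone-copy"

if command -v tmux >/dev/null 2>&1; then
    tmux new-session -d -s "$SESSION" \
        'rclone copy gcrypt: /mnt/user/restore -v'
    echo "started detached session $SESSION"
else
    echo "tmux not installed"
fi
# Reattach later with:  tmux attach -t rclone-copy
```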


Thank you so much for this. It is running great, syncing to an offsite MinIO Unraid server. I just need some help, please, with --backup-dir.

 

I want to set this up as a true backup, so that if I'm hit by ransomware and the script runs that night, I can still get my files back.

 

Right now I have it set up as --backup-dir remotename:deleted, so all of the files that have changed go there. I will periodically purge this directory.

 

But I want something more specific: I would like each changed file to go into a folder in the "deleted" backup directory that has the date the file was changed as part of its name. So every time a file is changed, a subdirectory with the date of the change is created within the "deleted" backup directory.

This would make it easier to sort through and figure out which date's version of a file I should revert to.
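That dated-folder layout is usually done by expanding a date into the --backup-dir value each run, something like this (remote and paths follow the post above; treat it as a sketch, not a drop-in script):

```shell
#!/bin/bash
# Each run sends changed/deleted files to a per-date folder, e.g.
# remotename:deleted/2021-01-15, instead of one big "deleted" pile.
STAMP=$(date +%Y-%m-%d)
BACKUP_DIR="remotename:deleted/$STAMP"
echo "Using --backup-dir $BACKUP_DIR"

if command -v rclone >/dev/null 2>&1; then
    rclone sync /mnt/user/backup remotename:backup \
        --backup-dir "$BACKUP_DIR" -v
else
    echo "rclone not installed on this machine"
fi
```

Note that --backup-dir has to point at the same remote as the destination (here both are remotename:), which matches how it's already set up.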

 

I understand that this is possible, and I've read the docs over at rclone.org, but I'm a newbie and there are some weird symbols involved that I just don't know how to add to my script.

Can you guys please help me out with this?

 

Thanks so much!


Hi... this thread is 32 pages long, so I'm not sure if this has been discussed. I am sure it has, but I couldn't find it...

 

I watched SpaceInvaderOne's rclone tutorial.

 

 

In it he tells us to boot into GUI mode, but my server is completely headless and I am not able to put a monitor and keyboard on it without a ton of effort...

 

I was hoping there is a way to install rclone without GUI mode... He does mention briefly that there is, but does not actually show how to do it.

 

TL;DR: How do I install and set up the plugin without GUI mode turned on?

 

9 hours ago, questionbot said:


 

Type "rclone config" in a terminal session and follow the prompts. When you reach the prompt that asks if you are on a headless machine, say yes, and just follow along. It'll ask you to open a link, authorize your account, and paste a code back into the terminal. That's all.

 

FWIW, this is the exact same procedure for any headless device; Unraid is not special, so you can Google any setup tutorial for help with this. I'm not exactly sure why SpaceInvaderOne had all the GUI-mode discussion in his video... but that video is very old.
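For anyone finding this later, the headless flow described above boils down to three steps (sketched below; `rclone authorize` is rclone's documented helper for this):

```shell
#!/bin/bash
# Headless OAuth sketch, not meant to run unattended:
#
# 1) On the Unraid box over SSH:
#      rclone config      # new remote -> drive -> answer "n" to "Use auto config?"
#
# 2) On any desktop that has a browser AND rclone installed:
#      rclone authorize "drive"
#    (a browser window opens; log in to the Google account)
#
# 3) Paste the token that `authorize` prints back into the SSH prompt.
STEPS="config,authorize,paste-token"
echo "headless flow: $STEPS"
```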

Edited by Stupifier
1 hour ago, JNCK said:


I’m trying to copy files from my Unraid server to Google Drive using a VPN without having to route all the traffic of my Unraid box through a VPN.


Sent from my iPhone using Tapatalk

No idea how to do that, but seriously... upload the files to Google Drive without the VPN. Google truly does NOT care. This isn't like seeding torrents; you don't need a VPN in this instance.


Hi, all,

 

In another thread I posted about huge amounts of data being used by the Unraid box (its static IP address, as shown in BandwidthD and ntop, shows 150-250 GB received per night!).

 

I have the 2020.09.29 version of this rclone plugin from Waseh's Repository.

 

I have a Google Drive mounted and am able to use Plex and Emby with the rclone-mounted gdrive.

 

I disabled this plugin (rclone), rebooted Unraid, and the huge nightly downloads stopped.

 

So the culprit is the rclone plugin. Or is it? Could it be some bug? Could it be the Plex or Emby plugin doing something stupid? (The Plex and Emby database syncs finished weeks or a couple of months ago.)

 

Is there any way I can find out what is happening? What is downloading 200 GB or more of data every night, and why?

 

I'm disabling the Plex Docker image to see if it happens tonight. If yes, I'll disable Emby next. But in the meantime, I'm hoping for some pointers from you guys.
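One way to get hard evidence in the meantime (a sketch; the remote name, mountpoint, and log path are assumptions) is to restart the mount with verbose logging and see which files are being read overnight:

```shell
#!/bin/bash
# Remount with a verbose log so overnight reads are recorded.
LOG="/mnt/user/appdata/rclone/mount.log"
echo "Would log mount activity to $LOG"

if command -v rclone >/dev/null 2>&1; then
    rclone mount gdrive: /mnt/disks/gdrive -v --log-file "$LOG" --allow-other &
else
    echo "rclone not installed on this machine"
fi
# Next morning, skim the log to see which files were read overnight;
# if they are Plex/Emby library items, the scanner is the likely culprit.
```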

 

Thank you!

  • 2 weeks later...

When trying to follow the spaceinvader one tutorial, I get this error when I run rclone config via ssh:

 

Failed to load config file "/boot/config/plugins/rclone/.rclone.conf": could not parse line: rclone config

 

The config file only contains the words "rclone config"; I'm not sure if that's normal. I removed and re-added the plugin, but I'm not sure what else to try.
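In case it helps, that error means the plugin's config file literally contains the text of the command instead of remote definitions, so rclone can't parse it. A sketch of the usual fix (path taken from the error message above):

```shell
#!/bin/bash
# The file should hold [remote] sections, not the words "rclone config".
# Back it up, truncate it to an empty (parseable) file, then rerun setup.
CONF="/boot/config/plugins/rclone/.rclone.conf"

if [ -f "$CONF" ]; then
    cp "$CONF" "$CONF.bak"   # keep a copy just in case
    : > "$CONF"              # truncate to empty
    echo "reset $CONF; now rerun: rclone config"
else
    echo "$CONF not found on this machine"
fi
```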

