Hey,
I've been using rclone with Google Drive (encrypted) for a while, and have now decided to bring it all back locally, about 50TB in total.
What's the right way to think about copying all of this down? I know the 750GB/day limit will make this take forever, so I'm trying to work out the correct way to set up rclone to copy data from the remote to local storage.
Presumably the plan needs to handle internet downtime and run automatically, so I guess I want the reverse of my current rclone/mergerfs upload setup.
So the setup seems like it'd be:
1. Stop uploading data to Google Drive
2. Run a long-running rclone copy script to pull everything into a local directory (something like the sketch below?)
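For step 2, this is roughly what I had in mind, a minimal, untested sketch assuming it runs as an Unraid User Scripts job on a cron schedule; the destination path, lock file location, and the 700G cap are placeholders I made up, not tested values. Since rclone copy skips files that already exist locally (matching size/modtime), rerunning it after internet downtime should just pick up where it left off, and --max-transfer with --cutoff-mode soft should stop each run cleanly if the daily quota applies to downloads too:

#!/bin/bash
# Pull script sketch (untested). Uses the "gcrypt" remote from the config
# below and an assumed local destination of /mnt/user/media_restore.

LOCKFILE=/var/run/rclone-pull.lock

# Skip this run if a previous pull is still going
exec 9>"$LOCKFILE"
flock -n 9 || exit 0

rclone copy gcrypt: /mnt/user/media_restore \
  --transfers 8 \
  --checkers 16 \
  --max-transfer 700G \
  --cutoff-mode soft \
  --log-file /mnt/user/appdata/rclone/pull.log \
  --log-level INFO

The idea being that cron keeps kicking it off, flock prevents overlapping runs, and rclone's skip-if-already-copied behaviour makes each rerun effectively a resume.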
My config looks like this:
[gdrive]
type = drive
client_id = x
client_secret = x
scope = drive
root_folder_id =
service_account_file =
token = x
[gcrypt]
type = crypt
remote = gdrive:/encrypt
filename_encryption = standard
directory_name_encryption = true
password = x
password2 = x
and my current Unraid script is https://pastebin.com/uKWECVSx
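One other thing I'm assuming: because the pull goes through the gcrypt remote rather than gdrive:/encrypt, the files should land decrypted locally, and once everything is down something like rclone cryptcheck should be able to verify the local copy against the encrypted remote (same placeholder path as in the sketch above):

rclone cryptcheck /mnt/user/media_restore gcrypt: --log-file /mnt/user/appdata/rclone/cryptcheck.log

Does that all sound reasonable, or is there a better-established way to do a big pull like this?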