willm

Everything posted by willm

  1. Thanks for the suggestion Stupifier. I didn't realize the download limit was 10TB -- I had been thinking about the service account approach, but with the 10TB limit I don't think it's needed. I could fully saturate my gigabit down connection and still not hit 10TB a day. I've started a typical rclone sync from my decrypted mount to a new directory (the kind of command I'm running is sketched below, after these posts). It seems to be running fine ATM -- weeks to go.
  2. Hey, I've been using rclone for a while on Google Drive (Encrypted), and have now decided to take it all locally -- ~50TB in total. What's the right way to think about copying all of this locally? I know the 750GB/day limit will make this take forever, so I'm thinking about the correct way to set up rclone to copy data from the remote to local. Presumably the plan would need to handle internet downtime / happen automatically, so I guess I'd want the reverse of the usual rclone/mergerfs upload setup. The setup seems like it'd be:

     1. Stop uploading data to Google Drive.
     2. Run some kind of long-running rclone script to copy everything to a local directory (a sketch of what I mean is below, after these posts).

     My config looks like:

     [gdrive]
     type = drive
     client_id = x
     client_secret = x
     scope = drive
     root_folder_id =
     service_account_file =
     token = x

     [gcrypt]
     type = crypt
     remote = gdrive:/encrypt
     filename_encryption = standard
     directory_name_encryption = true
     password = x
     password2 = x

     and my unraid script is https://pastebin.com/uKWECVSx
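
A minimal sketch of the kind of one-shot sync command referred to in post 1, pulling the decrypted gcrypt remote down to local storage. Only the gcrypt: remote name comes from the config above; the destination path, log path, and tuning values are assumptions to adjust for your own setup.

    #!/bin/bash
    # Pull everything from the decrypted crypt remote to local disk.
    # /mnt/user/media is a placeholder destination -- adjust to your array path.
    # --max-transfer / --cutoff-mode keep a single run under the ~10TB/day
    # download quota; --fast-list reduces Drive API listing calls on big trees.
    rclone sync gcrypt: /mnt/user/media \
      --transfers 8 \
      --checkers 16 \
      --fast-list \
      --max-transfer 9T \
      --cutoff-mode soft \
      --log-level INFO \
      --log-file /mnt/user/appdata/rclone/sync-to-local.log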
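For step 2 of the plan in post 2 (a long-running copy that survives internet downtime and restarts itself), a sketch of a simple retry wrapper. This is not the actual unraid script linked above; the paths, remote name, and one-hour retry interval are placeholders. It can be run from the unraid User Scripts plugin or cron, and because rclone sync only transfers what is missing, each retry just resumes where the last attempt stopped.

    #!/bin/bash
    # Keep retrying the sync so downtime or a hit quota doesn't need
    # manual intervention; exits once a run completes cleanly.
    SRC="gcrypt:"
    DST="/mnt/user/media"
    LOG="/mnt/user/appdata/rclone/sync-to-local.log"

    until rclone sync "$SRC" "$DST" \
        --transfers 8 \
        --checkers 16 \
        --fast-list \
        --max-transfer 9T \
        --cutoff-mode soft \
        --log-level INFO \
        --log-file "$LOG"; do
      echo "$(date): sync interrupted, retrying in 1 hour" >> "$LOG"
      sleep 1h
    done
    echo "$(date): sync completed cleanly" >> "$LOG"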