My Experience With Cloud Backup So Far


statecowboy


I thought I would share my experience trying to set up a cloud backup solution for my array.  It has not been as easy as I had hoped.  I initially planned on using CrashPlan for Small Business; I'm now with G Suite (Google Drive).  With both of these options there are some issues that people need to be aware of.

 

CrashPlan Pro (AKA CrashPlan for Small Business) - Cost is $10/month for unlimited backup capacity.  Setup is relatively straightforward using the CrashPlan Pro docker (I used @Djoss's docker and it worked great).  However, I was experiencing very slow uploads, and after discussing with their support they informed me that users should expect 1-5 Mbps upload speed.  That was unacceptable to me, as I have a lot of data and the initial upload would have taken 90 days according to the CrashPlan app.
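
In case it helps anyone, the container setup really is minimal.  Here is a sketch based on the documented usage of @Djoss's jlesage/crashplan-pro image; the host paths are examples only, so adjust them to your own appdata share and array:

```bash
# Minimal sketch, assuming @Djoss's image is jlesage/crashplan-pro.
# Port 5800 serves the web GUI; the host paths are examples only.
docker run -d \
    --name=crashplan-pro \
    -p 5800:5800 \
    -v /mnt/user/appdata/crashplan-pro:/config:rw \
    -v /mnt/user:/storage:ro \
    jlesage/crashplan-pro
```

Mounting the array read-only (:ro) is a nice safety net, since the backup client never needs to write to it.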

 

G Suite (Google Drive) - Signing up for a G Suite business account requires a domain.  I already had one, so that was not an issue.  Getting everything set up is relatively straightforward (you must confirm you own the domain by adding records through your registrar).  G Suite also gives you access to business email with your domain ([email protected]), which is a nice perk.  Storage is supposedly limited to 1TB per user until you have 5 users, at which point it becomes unlimited (in other words, to get unlimited you would in theory need 5 × $10 = $50/month worth of licenses).  However, as a lot of folks know, this has not been enforced, and people with just one account have been getting unlimited storage.  I found this to be the case in my situation as well; my storage limit shows as Unlimited.  I am using rclone to do the actual syncing and it's been working beautifully.  I used SpaceInvader's ( @gridrunner) video for the setup (love his videos).
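
For anyone who skips the video, the rclone side boils down to two steps: create a Google Drive remote with the interactive wizard, then sync a share to it.  A minimal sketch, assuming a remote named "gdrive" (the remote name and paths are examples):

```bash
# One-time interactive setup: pick "drive" (Google Drive) as the
# storage type and give the remote a name, e.g. "gdrive".
rclone config

# Mirror a share to the remote.  -v shows progress;
# --transfers raises the number of parallel uploads.
rclone sync /mnt/user/photos gdrive:backup/photos --transfers 8 -v
```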

 

All that said - I was getting very fast upload speeds (basically saturating my upload bandwidth on a gigabit connection).  When I woke up this morning, however, I had received an error from rclone: "Error 403 - User Rate Limit Exceeded".  I contacted Google support and they informed me that they enforce a 750GB-per-day limit on uploads.  So it looks like I will need to run my sync scripts a few times during the first couple of weeks to get everything uploaded, since Google resets the limit each day.  Of note, they also told me that during the trial period you are limited to 750GB total upload, so if you are still in trial, your upload limit may not reset.  In my case, I asked to forgo the trial and pay now.
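
If you would rather not babysit it, rclone can enforce the cap itself.  A sketch using two real rclone flags (the 740G margin and the 8.5M rate are my own numbers, not anything Google publishes):

```bash
# Option 1: run daily from cron; rclone stops once ~740GB has been
# uploaded, safely under Google's 750GB/day cap.
rclone sync /mnt/user/photos gdrive:backup/photos --max-transfer 740G -v

# Option 2: leave one long-running sync going, throttled so that
# 8.5 MB/s x 86,400 s/day ~= 734GB/day stays under the cap.
rclone sync /mnt/user/photos gdrive:backup/photos --bwlimit 8.5M -v
```

Note that --max-transfer is a relatively recent addition, so check that your rclone build has it.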

 

This information is probably common knowledge, but I figured I would share it in case it helps someone trying to figure out a good cloud backup solution.  These things seem to evolve quickly, so this is my experience as of this date.

 


It is kind of funny, we go years with no backups and no worries, but then once we finally decide we need a backup, we feel more at risk than ever during the initial upload :) 

 

What I like about CrashPlan is that it is a true backup solution, with unlimited versions and restoration of deleted files. I don't know about rclone and Google specifically, but in a typical syncing scenario, if you delete a file locally it will be deleted from the remote system at the next sync. Or if your local files are encrypted by malware, the sync tool will dutifully overwrite your remote files with the encrypted local ones, rendering the remote copy useless too.

 

To help deal with the long initial upload to Crashplan you can create backup sets. Put all your family photos and personal documents in the highest priority set and they will be backed up first, then put your Movies and TV Shows in a lower priority set and let them get backed up over time.

 

As a Comcast customer, my biggest concern is not the upload speed, but the amount of data to be uploaded. Their 1TB/month transfer cap counts uploads too, and the fees for exceeding that are excessive.

19 minutes ago, ljm42 said:

What I like about CrashPlan is that it is a true backup solution, with unlimited versions and restoration of deleted files. I don't know about rclone and Google specifically, but in a typical syncing scenario, if you delete a file locally it will be deleted from the remote system at the next sync. Or if your local files are encrypted by malware, the sync tool will dutifully overwrite your remote files with the encrypted local ones, rendering the remote copy useless too.

That is the case with rclone and Google Drive as well, you're right.  It's just something I need to remind myself to be mindful of.  You can use the copy command instead, which avoids this.
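
To make the difference concrete (remote name and paths are examples): sync makes the remote an exact mirror, so local deletions and malware-encrypted files propagate on the next run, while copy only ever adds or updates.  There is also a --backup-dir flag that moves anything sync would delete or overwrite into an archive folder on the remote, which recovers some of the versioning described above:

```bash
# Exact mirror: deletes remote files that no longer exist locally.
rclone sync /mnt/user/photos gdrive:backup/photos

# Additive only: never deletes anything on the remote.
rclone copy /mnt/user/photos gdrive:backup/photos

# Mirror, but shunt deleted/replaced files into a dated archive
# folder on the same remote instead of discarding them.
rclone sync /mnt/user/photos gdrive:backup/photos \
    --backup-dir gdrive:archive/$(date +%Y-%m-%d)
```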

8 hours ago, gridrunner said:

I would really recommend Duplicati for backups. It's cross-platform and we have a container from linuxserver for it on unRAID. All backups are encrypted.

https://www.duplicati.com/

Duplicati is great, but I prefer my backup to be an actual replica of my data in the cloud rather than a backup composed of a number of compressed files.  With rclone simply cloning the contents of the array, Google Drive's sharing capabilities make sharing family photos, videos, etc. very easy.

 

Edit - I think perhaps setting Duplicati to "no encryption" would do what I've described above?  I may give that a shot.
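
If anyone else wants to test that idea, Duplicati's command-line client takes a --no-encryption option.  A sketch with a local destination for simplicity (a Google Drive destination would use Duplicati's own storage URL; the paths here are hypothetical).  One caveat: even with encryption off, Duplicati still packs data into compressed volume files, so the remote copy would not be a browsable replica the way an rclone copy is:

```bash
# Hypothetical paths; backs up a share with encryption disabled.
# Duplicati still chunks and compresses data into volume files.
duplicati-cli backup file:///mnt/disks/backup_target /mnt/user/photos \
    --no-encryption=true
```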

 

 

  • 1 year later...
On 2/24/2018 at 12:47 AM, SpaceInvaderOne said:

I would really recommend Duplicati for backups. It's cross-platform and we have a container from linuxserver for it on unRAID. All backups are encrypted.

https://www.duplicati.com/

Hey Ed, saw your Duplicati video. I have installed it on all three unRAID servers and am currently uploading one server's shares, encrypted in 4GB volumes, to G Suite right now. Thank you again, I will have to buy you another beer soon! I love the encryption part. I had to call my ISP to get an upgrade, as my main server is 80TB with about 69TB of data, and at that 750GB/day limit it will take about 1 year to upload... I have that one set to upload last...

So I created multiple backup jobs, but I'm not using scheduling for now, as I want to make sure the initial backup is fully done before I schedule the much smaller incremental backups it will run afterwards.

What I heard is that if you talk to Google you can ask them to uncap your upload limit if you explain the reason. I will be calling them tomorrow, since my domain email isn't working yet anyway (it's been over two days since I created the MX records on AWS).

On 2/24/2018 at 8:59 AM, statecowboy said:

Duplicati is great, but I prefer my backup to be an actual replica of my data in the cloud rather than a backup composed of a number of compressed files.  With rclone simply cloning the contents of the array, Google Drive's sharing capabilities make sharing family photos, videos, etc. very easy.

 

Edit - I think perhaps setting Duplicati to "no encryption" would do what I've described above?  I may give that a shot.

 

 

I guess that depends on how private you want your backups to be. Encryption hides file names, and the compression is important if you have limited storage.
