The state of various backup solutions


I am growing a bit depressed seeing the state of some of the backup apps I have tried so far: half-baked, unreliable, downright scary. I don't mean to be a Debbie Downer about the hard work being put into them, but man, I would be scared to trust my data to any of these solutions. So far I have tried, and failed with:

 

CrashPlan - I have a 4TB backup that I can't restore with the Unraid app. I am stuck

CloudBerry - Horrible speed, 2-4 megs upload. I stopped testing due to lack of performance - over scp/ssh/sftp

Duplicacy - Good speed/performance but unreliable restores. Various errors, annoying to even troubleshoot - over scp/ssh/sftp

Duplicati - Good speed/performance but can't restore large 2TB backups - locked database - downright depressed and not willing to troubleshoot further - over scp/ssh/sftp

 

I used Arq with my Mac mini server and it is rock solid, but it won't run on Unraid, so I am stuck, unless I mount the network shares and do it that way.

 

Any suggestions for a rock-solid app that gives me performance and reliability? What do you guys use? I need something that will work over scp/ssh/sftp.

 

Many thanks, gents.


Hi, I've been holding off on replying to this topic because I've been in the same boat, trying to figure out a viable long-term solution, and I recently came across one that is working well for me so far.

 

Once CrashPlan was discontinued, I gave up on a native UNRAID cloud backup solution a few months back and went with Backblaze. I was using a Windows 10 machine to sync my unRAID shares with software called "SyncFolders" and a fleet of USB 3 portable drives outside of UNRAID. Needless to say, it was a bit much, but it worked.

 

I recently came across both Rclone (file syncing) and Duplicati (true block-based backups). I tried both, and I ended up using Duplicati with an encrypted cloud destination, specifically G Suite (Google Drive for Business). They currently offer unlimited storage for $12/month, and they do not enforce the 5-user minimum. However, you do have to own your own domain to use this option. I already had one, so no big deal.

 

The only catch with Google Drive is that you can only upload 750GB/day. I hit exactly 725.4GB before Duplicati started throwing server-side errors. I have since throttled my uploads to 8 MB/s to keep it under this ceiling (the math works out to 8 MB/s x 86,400 s/day = 691,200 MB, or about 691.2 GB/day; 9 MB/s puts it over, and the parameter has to be a whole number). This should keep Duplicati happy and support uninterrupted backups during my initial upload set. It would never be a problem once all files are initially backed up, but it is an interesting facet of this solution's workflow.
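
For anyone setting up something similar, this is roughly how that throttle looks with Duplicati's command-line client. Treat it as a sketch only: the destination URL, authid, passphrase and source path are placeholders, not my actual job, and the same throttle option is also available in the web UI's advanced settings.

    # Sketch: Duplicati backup to Google Drive with uploads capped at 8 MB/s
    # (authid, folder and paths are placeholders; substitute your own values)
    duplicati-cli backup \
      "googledrive://unraid-backup?authid=YOUR_AUTH_ID" \
      /mnt/user/Backups/ \
      --passphrase="YOUR_ENCRYPTION_PASSPHRASE" \
      --throttle-upload=8MB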

 

Other than that, I have had no issues. Restores have worked in my testing with various file types and sizes. I'll be testing a full 8TB restore soon, once the initial backup set is completed. Hopefully I won't run into the issues you did. I am interested to hear what specifically your issues were with Duplicati and restores.

 

It looks like you have not tried Rclone yet, so it may be worth a shot. Here are some great tutorials by @SpaceInvaderOne.

 

 

Great tutorial if you just want a straightforward encrypted cloud sync via Rclone:
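
In case the embedded video doesn't survive, the gist is roughly this. The remote names ("gdrive", "gdrive-crypt") and paths are placeholders for whatever you chose during rclone config, so treat it as a sketch rather than the exact steps from the video:

    # 1) Create a Google Drive remote and a crypt remote that wraps it
    rclone config    # add "gdrive" (type: drive) and "gdrive-crypt" (type: crypt)

    # 2) Sync a share to the encrypted remote, with an optional bandwidth cap
    rclone sync /mnt/user/Documents gdrive-crypt:Documents \
      --bwlimit 8M --transfers 4 --progress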

 

 

 

Here is another way of performing block-based backups (true backups, not syncing) with Duplicati. You can use the same cloud services, or even go UNRAID to UNRAID. This is what I am currently using.

 

 

 

 

Hopefully this helps you and/or others who will be going down this path. I do intend to use Duplicati to back up UNRAID to UNRAID as well. Let me know if you have any questions. Good luck!

 


No problemo.

It seems to work well enough for me, at least in the sense that I'm confident I can restore whatever I have in cold storage.

It's definitely not a real-time solution, but then I don't treat my unRAID server as a device that gets daily changes and needs daily backups. Mine is more of a warm-storage "vault"; a gigantic WORM device.

Things that are touched daily are on a completely different server, with a completely different backup/restore model.


So, Duplicati is terrible. I couldn't even get my initial 7TB backup set to complete due to all sorts of database/block issues. Basically, if you interrupt the initial set in any way (like gracefully pausing, or stopping after the current file is finished, which is ***BROKEN***), the entire thing fails. I tried 7 times to get it done, and each time some kind of fatal exception would occur. At one point I had uploaded 2TB, only to have it fail once I stopped it (gracefully) per the documentation, in the hopes that I could resume it later. After reading many posts on their own forums, many others have had issues getting very large initial backup sets completed. I was hopeful, but now I'm just depressed, ha ha.

 

Now I'm on to the paid Duplicacy. So far I'm not really impressed, but I'm trying the same 7TB backup set now. It does appear to be able to start/stop the initial set via a native partial-backup snapshot function, so I am hopeful this product will succeed where the free one failed. However, I am trying to restore a 30GB backup I did last night, and it has been stuck on "Starting" for like 30 minutes, so... not looking great.

 

I'll report back my findings. At this point, my only other option if this fails is to try RClone, or go back to my Windows 10/Backblaze solution.

 

@johnwhicker, I didn't want to believe that ALL of these options are nonviable, but you appear to be correct. Which completely SUCKS!


I use Duplicacy for local backups (network share and USB). I recently restored two ~60GB VMs as a test, one from each source. It worked fine, although you can't navigate away from the restore screen or it'll stop. Other than that and the hassle of the initial setup, it has worked fairly well.

 

One tip: add the password as a container variable, or automated backups after a container update will fail pending a manual log-in.
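
For reference, the Duplicacy CLI reads the storage password from the DUPLICACY_PASSWORD environment variable, so adding it boils down to something like the run command below. The image name and paths are placeholders for whatever template you use, and if you named your storage, the variable may instead be DUPLICACY_<STORAGENAME>_PASSWORD; in the Unraid GUI this is just an extra "Variable" entry on the container's edit page.

    # Sketch of the equivalent docker run; the Unraid template adds the same -e entry
    docker run -d --name duplicacy \
      -e DUPLICACY_PASSWORD='your-storage-password' \
      -v /mnt/user/appdata/duplicacy:/config \
      -v /mnt/user:/source:ro \
      your/duplicacy-web-image    # placeholder image name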

On 9/1/2020 at 6:46 PM, CS01-HS said:

Add the password as a container variable, or automated backups after a container update will fail pending a manual log-in.

Thanks for this tip. I've set it up like this now.

On 9/1/2020 at 11:50 AM, falconexe said:

Now I'm on to the paid Duplicacy. So far I'm not really impressed, but I'm trying the same 7TB backup set now. [...] I'll report back my findings.

Any success with Duplicacy backing up your 7TB worth of data to Google Drive?


I'd love to hear an update on this. I went with CloudBerry and it's horrible. It's taken me over a month to back up 4TB of data and it's STILL not done. It constantly errors out and runs at a snail's pace. I paid for the Personal license, maybe that's why. I'd love to hear about CloudBerry alternatives that just WORK.

 

**UPDATE

I found these instructions for rclone that seem to be pretty close... but not great:

https://www.reddit.com/r/unRAID/comments/g71391/howto_run_official_rclone_docker_and_backup_to/

I can't seem to figure out where in my bucket it's uploading to. I also can't figure out the encryption, but I haven't really looked into it much yet. I'd also like the ability to name more than just one directory I want to back up.
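
For what it's worth, all three of those are controllable on the rclone command line. This is only a sketch; the remote names ("gdrive", "gdrive-crypt"), bucket paths and filter file are placeholder guesses, not something taken from that guide:

    # Where it uploads: everything after the colon is the path inside the remote
    rclone sync /mnt/user/Documents gdrive:unraid-backup/Documents --progress

    # Encryption: create a "crypt" remote wrapping gdrive (via rclone config),
    # then sync to the crypt remote instead
    rclone sync /mnt/user/Documents gdrive-crypt:Documents --progress

    # Multiple directories: sync from /mnt/user with a filter file
    rclone sync /mnt/user gdrive-crypt:unraid-backup --filter-from /boot/rclone-filter.txt
    # where /boot/rclone-filter.txt contains, e.g.:
    #   + /Documents/**
    #   + /Photos/**
    #   - *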

 

On 12/17/2020 at 8:26 AM, falconexe said:

Ha ha. Actually, YES. I was able to upload 16TB so far and have had no issues. A bit complex to get all the settings just right, but once it is locked in, it’s fire and forget.

Are you still happy with Duplicacy? I'm in the situation where I've set up rclone with gdrive (for Plex) and am now looking for a solution to back up my shares to Google Drive as well. Currently, I can't decide whether to just copy the shares and upload them to Google Drive every week with a script (1 TB of data) or use a solution like Duplicacy.

1 minute ago, Symon said:

Are you still happy with Duplicacy? [...] I can't decide whether to just copy the shares and upload them to Google Drive every week with a script (1 TB of data) or use a solution like Duplicacy.

 

I've had no issues with Duplicacy. I did test some extensive restores and everything checksummed correctly. However, for such a small amount of data, I would just rclone the differentials; that way you have direct file access (even though the files are encrypted).
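
If you go the script route, a weekly User Scripts job along these lines would cover it. rclone only transfers new and changed files on each run, so after the first upload the weekly passes are effectively differentials. The remote name and share list are placeholders:

    #!/bin/bash
    # Weekly push of selected shares to an encrypted rclone remote
    for share in Documents Photos Nextcloud; do
      rclone sync "/mnt/user/$share" "gdrive-crypt:$share" \
        --bwlimit 8M --transfers 4 \
        --log-file /mnt/user/appdata/rclone/weekly.log
    done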

 

Another thought: I REALLY love Tresorit. It's pricey, but it is THE ABSOLUTE BEST box-type sync with high-quality encryption out there. It blows Dropbox out of the water. If you are under 2TB, I'd go Tresorit and use a Windows machine with a program called SyncFoldersPro. This is how I back up my NFO files and metadata for Kodi, etc. I use SyncFoldersPro to simultaneously back up (with advanced filters) to my onsite backup server and to Tresorit online backup. Works like a charm!

 
