The state of various backup solutions


DivideBy0


I am growing a bit depressed at the state of the backup apps I have tried so far: half-baked, unreliable, downright scary. I don't mean to be a Debbie Downer on the hard work being put into them, but man, I would be scared to trust my data to any of these solutions. So far I have tried, and failed with:

 

CrashPlan - I have a 4TB backup that I can't restore with the Unraid app. I am stuck

CloudBerry - Horrible speed (2-4 MB/s upload). I stopped testing due to the lack of performance - over scp/ssh/sftp

Duplicacy - Good speed/performance but unreliable restores. Various errors, annoying to even troubleshoot - over scp/ssh/sftp

Duplicati - Good speed/performance but can't restore large 2TB backups (locked database). Downright depressed and not willing to troubleshoot further - over scp/ssh/sftp

 

I used ArQ with my Mac mini server and it is rock solid, but it won't run on Unraid, so I am stuck - unless I mount the network shares and do it that way.

 

Any suggestions for a rock-solid app that gives me both performance and reliability? What do you guys use? I need something that will work over scp/ssh/sftp.

 

Many thanks gents

  • 2 weeks later...

Hi, I've been holding off on replying to this topic because I've been in the same boat, trying to figure out a viable long-term solution, and I recently came across one that is working well for me so far.

 

After CrashPlan was discontinued, I gave up on a native UNRAID cloud backup solution a few months back and went with Backblaze. I was using a Windows 10 machine to sync my unRAID shares with software called "SyncFolders" and a fleet of USB3 portable drives outside of UNRAID. Needless to say, it was a bit much, but it worked.

 

I recently came across both Rclone (file syncing) and Duplicati (true block-based backups). I tried both, and I ended up using Duplicati with an encrypted cloud destination, specifically G Suite (Google Drive for Business). They currently offer unlimited storage for $12/month, and they do not enforce the five-user minimum. However, you do have to own your own domain to use this solution. I already had one, so no big deal.

 

The only catch with Google Drive is that you can only upload 750GB/day. I hit exactly 725.4GB before Duplicati started throwing server-side errors. I have since throttled my uploads to 8 MB/s to stay under this ceiling (the math works out to 691.2 GB/day: 8 MB/s x 86,400 s/day = 691,200 MB/day = 691.2 GB/day; 9 MB/s puts it over at 777.6 GB/day, and the throttle parameter has to be a whole number). This should keep Duplicati happy and support uninterrupted backups during my initial upload set. This would never be a problem once all files are initially backed up, but it is an interesting facet of this solution's workflow.
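For anyone re-running that throttle math with a different cap or speed, here is a minimal Python sketch of the same arithmetic (the 750 GB/day limit is the one quoted above; the decimal MB-to-GB conversion is an assumption chosen to match the 691.2 GB figure):

    SECONDS_PER_DAY = 86_400
    DAILY_CAP_GB = 750       # Google Drive upload cap mentioned above
    MB_PER_GB = 1_000        # decimal conversion, matching the 691.2 GB/day figure

    def gb_per_day(throttle_mb_s):
        """GB uploaded in one day at a constant throttle (MB/s)."""
        return throttle_mb_s * SECONDS_PER_DAY / MB_PER_GB

    print(gb_per_day(8))     # 691.2 -> under the cap
    print(gb_per_day(9))     # 777.6 -> over the cap

    # Largest whole-number throttle that stays under the daily cap
    print(max(t for t in range(1, 100) if gb_per_day(t) < DAILY_CAP_GB))   # 8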

 

Other than that, I have had no issues. Restoring works in my testing with various file types and sizes. I'll be testing a full 8TB restore soon, once the initial backup set is completed. Hopefully I won't run into the issues you did. I am interested to hear what specifically your issues were with Duplicati and restores.

 

It looks like you have not tried Rclone yet, so it may be worth a shot. Here are some great tutorials by @SpaceInvaderOne.

 

 

Great tutorial if you just want a straightforward encrypted cloud sync via Rclone:

 

 

 

Here is another way of performing block-based backups (true backups, not syncing) with Duplicati. You can use the same cloud services, or even go UNRAID to UNRAID. This is what I am currently using.

 

 

 

 

Hopefully this helps you and/or others who will be going down this path. I do intend to use Duplicati to back up UNRAID to UNRAID as well. Let me know if you have any questions. Good luck!

 


No problemo.

It seems to work well enough for me, at least in the sense that I'm confident I can restore whatever I have in cold storage.

It's definitely not a real-time solution, but then I don't treat my unRAID server as a device that gets daily changes and needs the same for backups.  Mine is more of a warm storage "vault"; a gigantic WORM device.

Things that are touched daily are on a completely different server, with a completely different backup/restore model.


So, Duplicati is terrible. I couldn't even get my initial 7TB backup set to complete due to all sorts of database/block issues. Basically, if you interrupt the initial set in any way (like graceful pausing, or stopping after the current file finishes - ***BROKEN***), the entire thing fails. I tried 7 times to get it done, and each time some kind of fatal exception would occur. I uploaded 2TB at one point, only to have it fail once I stopped it (gracefully, per the documentation) in the hopes that I could resume it later. After reading many posts on their own forums, it's clear many others have had issues getting very large initial backup sets completed. I was hopeful, but now just depressed ha ha.

 

Now I'm on to the paid Duplicacy. So far, not really impressed, but I'm trying the same 7TB backup set now. It does appear to be able to start/stop the initial set via a native partial backup snapshot function, so I am hopeful this product will succeed where the other free one failed. However, I am trying to restore a 30GB backup I did last night, and it is stuck on "Starting" for like 30 minutes, so.....not looking great.

 

I'll report back my findings. At this point, my only other option if this fails is to try RClone, or go back to my Windows 10/Backblaze solution.

 

@johnwhicker, I didn't want to believe that ALL of these options are nonviable, but you appear to be correct. Which completely SUCKS!


I use Duplicacy for local backups (network share and USB). I recently restored two ~60GB VMs as a test, one from each source. It worked fine, although you can't navigate away from the restore screen or it'll stop. Other than that and the hassle of the initial setup, it's worked fairly well.

 

Add the password as a container variable, or automated backups after a container update will fail pending a manual log-in.

On 9/1/2020 at 6:46 PM, CS01-HS said:

I use Duplicacy for local backups (network share and USB). I recently restored two ~60GB VMs as a test, one from each source. It worked fine, although you can't navigate away from the restore screen or it'll stop. Other than that and the hassle of the initial setup, it's worked fairly well.

 

Add the password as a container variable, or automated backups after a container update will fail pending a manual log-in.

Thanks for this tip. I've set it up like this now.

  • 3 months later...
On 9/1/2020 at 11:50 AM, falconexe said:

So, Duplicati is terrible. I couldn't even get my initial 7TB backup set to complete due to all sorts of database/block issues. Basically, if you interrupt the initial set in any way (like graceful pausing, or stopping after the current file finishes - ***BROKEN***), the entire thing fails. I tried 7 times to get it done, and each time some kind of fatal exception would occur. I uploaded 2TB at one point, only to have it fail once I stopped it (gracefully, per the documentation) in the hopes that I could resume it later. After reading many posts on their own forums, it's clear many others have had issues getting very large initial backup sets completed. I was hopeful, but now just depressed ha ha.

 

Now I'm on to the paid Duplicacy. So far, not really impressed, but I'm trying the same 7TB backup set now. It does appear to be able to start/stop the initial set via a native partial backup snapshot function, so I am hopeful this product will succeed where the other free one failed. However, I am trying to restore a 30GB backup I did last night, and it is stuck on "Starting" for like 30 minutes, so.....not looking great.

 

I'll report back my findings. At this point, my only other option if this fails is to try RClone, or go back to my Windows 10/Backblaze solution.

 

@johnwhicker, I didn't want to believe that ALL of these options are nonviable, but you appear to be correct. Which completely SUCKS!

Any success with Duplicacy backing up your 7TB worth of data to Google Drive?

  • 4 months later...

I'd love to hear an update on this. I went with CloudBerry and it's horrible. It's taken me over a month to back up 4TB of data and it's STILL not done. It constantly errors out and runs at a snail's pace. I paid for the Personal license - maybe that's why. I'd love to hear about CloudBerry alternatives that just WORK.

 

**UPDATE

I found these instructions for rclone that seem to be pretty close... but not great:

https://www.reddit.com/r/unRAID/comments/g71391/howto_run_official_rclone_docker_and_backup_to/

I can't seem to figure out where in my bucket it's uploading to. I also can't figure out the encryption, but I haven't really looked into it much yet. I'd also like the ability to name more than just one directory I want to back up.

 

  • 3 weeks later...
On 12/17/2020 at 8:26 AM, falconexe said:

Ha ha. Actually, YES. I have been able to upload 16TB so far and have had no issues. It was a bit complex to get all the settings just right, but once it is locked in, it's fire and forget.

Are you still happy with Duplicacy? I'm in the situation now that I've set up rclone with Gdrive (Plex) and am looking for a solution to back up my shares to Gdrive as well. Currently, I can't decide whether to just copy the shares and upload them to GDrive every week with a script (1 TB of data) or use a solution like Duplicacy.

1 minute ago, Symon said:

Are you still happy with Duplicacy? I'm in the situation now that I've set up rclone with Gdrive (Plex) and am looking for a solution to back up my shares to Gdrive as well. Currently, I can't decide whether to just copy the shares and upload them to GDrive every week with a script (1 TB of data) or use a solution like Duplicacy.

 

I've had no issues with Duplicacy. I did test some extensive restores, and everything worked out and checksummed correctly. However, for such a small amount of data, I would just RClone the differentials. That way you have direct file access (even though the files are encrypted).
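If you go the weekly-script route, here is a minimal Python sketch of what that job could look like (the share path and the remote name gdrive-crypt are placeholders, and it assumes an encrypted rclone remote has already been configured):

    import subprocess

    # Placeholders - point these at your own share and configured rclone remote.
    SHARE = "/mnt/user/documents"
    REMOTE = "gdrive-crypt:unraid-backup/documents"

    # "rclone sync" makes the remote match the source, so only new or changed
    # files (the differentials) are uploaded on each run.
    subprocess.run(["rclone", "sync", SHARE, REMOTE], check=True)

Scheduled weekly (for example via the User Scripts plugin), that keeps the GDrive copy current without re-uploading the full 1 TB each time.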

 

Another thought: I REALLY love TRESORIT. It's pricey, but it is THE ABSOLUTE BEST box-type sync with high-quality encryption out there. It blows Dropbox out of the water. If you are under 2TB, I'd go with Tresorit and use a Windows machine with a program called SyncFoldersPro. This is how I back up my NFO files and metadata for Kodi, etc. I use SyncFoldersPro to simultaneously back up (with advanced filters) to my onsite backup server and to Tresorit online backup. Works like a charm!

 

  • 3 months later...
  • 4 weeks later...
  • 3 weeks later...

I will throw my hat in the ring here.

I have always had a backup strategy in my household to protect the important stuff.

I was always a big fan of Acronis and ran it from my Windows machine for file-based backups.

 

I have been searching for a decent file backup utility within Unraid Docker.

 

I stumbled onto Duplicacy and decided to give it a try.

 

I was able to back up about 1 TB from my Unraid server to a Synology box on my local network.

I was averaging about 20 MB/s, but I suspect the Synology was the bottleneck.

 

I tested the restore functionality - no issues with that.

 

I added scheduled jobs within Duplicacy and they are running without issue.

 

I am ready to take it to the next level:

1. Purchase a personal license.

2. Sign up for a Backblaze account.

3. Set up secondary jobs for each backup and point them to Backblaze for true 3-2-1 protection.

 

Has anyone had experience with Duplicacy and cloud backups?

I'm not sure how long the cloud backups will take - do they have to be contiguous?

I have the CA AppData backup process, which shuts down the Docker service.

Not sure what would happen to a backup that is running - would a long-running job complete?
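For a rough sense of how long the initial cloud copy might take, a quick back-of-the-envelope estimate in Python (the sustained upload speed is a placeholder - plug in your own line's real figure):

    SECONDS_PER_DAY = 86_400
    DATA_TB = 1.0       # roughly the size of the local backup set described above
    UPLOAD_MB_S = 20    # placeholder: sustained upload speed to the cloud in MB/s

    total_mb = DATA_TB * 1_000_000                  # decimal TB -> MB
    days = total_mb / UPLOAD_MB_S / SECONDS_PER_DAY
    print(f"~{days:.1f} days")                      # ~0.6 days at 20 MB/s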

 

 

21 hours ago, a12vman said:

I stumbled onto Duplicacy and decided to give it a try.

 

I was able to back up about 1 TB from my Unraid server to a Synology box on my local network.

I was averaging about 20 MB/s, but I suspect the Synology was the bottleneck.

 

Over SMB? Should be faster. I get about double that with a theoretically slower setup (odroid-HC2 over WebDAV.)

 

21 hours ago, a12vman said:

3. Set up secondary jobs for each backup and point them to Backblaze for true 3-2-1 protection.

 

You probably want to set this up as a copy job (initialize the Backblaze storage as copy-compatible, and consider bit-identical).

 

Check out the forum: https://forum.duplicacy.com

 

One piece of advice: keep a copy of Duplicacy's appdata outside of both the CA AppData backups and the Duplicacy backups themselves.

In case of catastrophic failure, you'll need a working Duplicacy to restore, and you don't want anything (e.g. missing encryption keys) preventing that.

(This seems like it would apply to backup software generally.)
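A minimal Python sketch of that kind of side copy (the source path is the typical location mentioned later in the thread; the destination is a placeholder for an offline drive or another box):

    import shutil
    from pathlib import Path

    # Typical Duplicacy appdata location on Unraid (see further down the thread).
    SRC = Path("/mnt/cache/appdata/Duplicacy")
    # Placeholder destination - an offline USB drive, a remote server, etc.
    DEST = Path("/mnt/disks/offline_usb/duplicacy-appdata")

    # Copy the whole directory so the config and encryption keys survive even if
    # every other backup is unreachable.
    shutil.copytree(SRC, DEST, dirs_exist_ok=True)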


OK, the Duplicacy data you are referring to - is it located at \Cache\Appdata\Duplicacy\Cache\Localhost\?

 

Are you suggesting that in the event of appdata corruption, my Duplicacy functionality would work if I:

 

1. Re-install duplicacy Docker.

2.  Restore Backup DB to Appdata\Duplicacy\Cache\Localhost

 

 

  • 2 weeks later...
On 10/30/2021 at 7:22 AM, a12vman said:

OK, the Duplicacy data you are referring to - is it located at \Cache\Appdata\Duplicacy\Cache\Localhost\?

 

Are you suggesting that in the event of appdata corruption, my Duplicacy functionality would work if I:

 

1. Re-install duplicacy Docker.

2.  Restore Backup DB to Appdata\Duplicacy\Cache\Localhost

 

 

 

Sorry for the delay, just saw this.

 

Actually I meant Duplicacy's full appdata directory.

On unRAID that's typically: /mnt/cache/appdata/Duplicacy

 

I run the Backup/Restore Appdata plugin weekly, which backs up all my containers' appdata directories (and my flash drive) to the array, so for simple corruption I'd just restore from that.

 

I'm talking about catastrophic failure - your server is struck by lightning, stolen, etc.

 

I believe everything necessary to recreate a container is either on flash or in appdata. So I take those two backups, created by the backup plugin, and save them elsewhere – an offline flash drive, remote server, etc.


OK, thanks. I am also using CA Backup/Restore, but it stopped working and I haven't figured out why.

 

I wish there were a way to back up Docker containers without stopping the Docker engine.

 

/usr/bin/tar: Error is not recoverable: exiting now
Starting binhex-delugevpn
Starting binhex-krusader
Starting binhex-plexpass
Starting binhex-sabnzbd
Starting binhex-sonarr
Starting duplicacy
Starting firefox
Backup/Restore Complete. tar Return Value:
Backup / Restore Completed

  • 8 months later...

What are people using these days? I tried Rclone and it's not finding all my files in a dir, and thus only backing up a fraction of what's there. Can't figure out why. Doesn't seem to be permissions related either. Fiddled with it for 18h+ and didn't figure it out.

 

Off-the-shelf NAS solutions and the likes of TrueNAS have replication and synchronization facilities built into them. I feel like this is the missing piece of the puzzle for me with Unraid. It's a big piece, but it's largely there.

