CrashPlan Home Ending



Bumping an older thread here.

Is there a good set of solutions to replace CrashPlan yet?

In our particular case, we would have two unRAID servers in different locations (my home and my parents'), with individual PCs and smartphones at each location backing up to the respective unRAID servers. What we would ideally like to do is have the two unRAID servers back up to each other (essentially providing both on-site and off-site backup), with each user able to access only their own data. File versioning is a definite plus, as is allowing the users to access their data over the internet.

Has the community arrived at a solution or set of solutions that accomplishes this fairly simply?

Thanks in advance,

Ari


Wow. Setting up Minio and Duplicati was incredibly easy.

I set up Minio using the unRAID Docker container available from "topdockercat". The only setting I needed to change was to point it at a folder on unRAID to store the Minio "buckets".

 

[Screenshot: Minio Docker container settings in unRAID]

 

(in this case I made a folder called "dup", and did not share it in any way)

 

[Screenshot: the "dup" folder on the unRAID server]
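In case anyone wants the same thing outside the unRAID template, a rough docker run equivalent would look like the sketch below; the container name and host path are just from my setup, so adjust to yours:

# Approximate manual equivalent of the unRAID Minio template (names and paths are assumptions)
docker run -d --name minio \
  -p 9000:9000 \
  -v /mnt/user/dup:/data \
  minio/minio server /data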

 

Then I just made a generic bucket in Minio:

 

[Screenshot: creating a bucket in the Minio web UI]

 

I got the Minio access and secret keys from the Docker log in unRAID.
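With the keys in hand, bucket creation can also be scripted with the Minio client (mc) instead of clicking through the web UI. A hedged sketch, where "unraid", "tower", the keys, and the bucket name are all placeholders for my setup:

# Register the Minio endpoint with the mc client, then create and list a bucket (all names are placeholders)
mc config host add unraid http://tower:9000 YOUR_ACCESS_KEY YOUR_SECRET_KEY
mc mb unraid/backups
mc ls unraid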

 

Then I installed a Duplicati client on one of my local client computers and was able to make a backup by pointing it at the unRAID box on the LAN. The first backup compressed 60GB of files into about 30GB of Duplicati archive files, and took almost an hour.
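For reference, the same backup can be expressed on the Duplicati command line. This is a sketch from memory of Duplicati 2's S3 options (s3-server-name, auth-username/auth-password), so double-check it against the GUI's "Export as command-line" before relying on it; the hostname, bucket, keys, and paths are placeholders:

# Hedged sketch of a Duplicati 2 backup to a Minio bucket over the LAN
# (verify the option names via the GUI's "Export as command-line"; all names and keys are placeholders)
duplicati-cli backup \
  "s3://backups/laptop?s3-server-name=tower:9000&use-ssl=false&auth-username=YOUR_ACCESS_KEY&auth-password=YOUR_SECRET_KEY" \
  ~/Documents \
  --passphrase="a-long-unique-passphrase"

On Windows the executable is Duplicati.CommandLine.exe rather than duplicati-cli.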

 

I set up a dynamic DNS hostname and set my router to keep it updated. Then I forwarded a random external port (picked via http://cubicspot.blogspot.com/2016/04/need-random-tcp-port-number-for-your.html ) through NAT on my router to the unRAID box's LAN IP and Minio's default port 9000.
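Before trusting a backup to it, the forwarded port can be checked from outside the LAN with curl; the hostname and port below are placeholders for the dynamic DNS name and the random forwarded port:

# From a machine outside the LAN (e.g. on a phone hotspot); placeholders throughout
curl -i http://myhome.ddns.example.com:12345/
# An S3-style XML error response (e.g. AccessDenied) means Minio answered;
# a connection timeout means the port forward isn't working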

Then I tested the backup from outside my LAN. Updating the backup only took a couple of minutes (over a cellular internet connection). Pretty good!

 

I'm impressed by how easy this was. Thanks for all the assistance.

 

Now I suspect I must be missing something. Do I need to do anything to increase security? It seems Minio requires the access key and secret key to even get in, and the Duplicati backups are themselves encrypted.
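One hardening step worth considering beyond the keys: the S3 traffic itself is plain HTTP here, so the keys cross the internet unencrypted even though the backup contents are encrypted by Duplicati. Minio can serve HTTPS if a certificate and key are placed in its certs directory; a hedged sketch for the Docker setup, where the host-side cert path is my assumption (public.crt and private.key are the filenames Minio expects):

# Re-run the container with a certs directory mounted so Minio serves HTTPS
# (host path is an assumption; Minio looks for public.crt/private.key in ~/.minio/certs)
docker run -d --name minio \
  -p 9000:9000 \
  -v /mnt/user/dup:/data \
  -v /mnt/user/appdata/minio/certs:/root/.minio/certs \
  minio/minio server /data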

 

Ari

1 hour ago, adoucette said:

Wow. Setting up Minio and Duplicati was incredibly easy. [...] Now I suspect I must be missing something? Do I need to do anything to increase security?

What do you use Minio for? Isn't Duplicati enough?

30 minutes ago, izarkhin said:

 

What do you use Minio for? Isn't Duplicati enough?

I believe it's because Duplicati doesn't support backing up to SMB shares. So you install Minio on unRAID to present a share as an Amazon S3-compatible service that Duplicati can target.
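That's easy to sanity-check, too: because Minio speaks the S3 protocol, any stock S3 client can talk to it if you override the endpoint. A hedged example with the AWS CLI, where the hostname, keys, and bucket are placeholders:

# Any S3 client works against Minio if you pass an explicit endpoint (placeholders throughout)
export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY
aws --endpoint-url http://tower:9000 s3 ls s3://backups/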

On 2018-04-16 at 6:24 PM, adoucette said:

 

Also to allow remote backup.

Agreed. I just started using Duplicati myself to test it out. You can back up to shares on unRAID without using Minio: just use the local path option and manually set the path to the UNC path, as sketched below.
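Concretely, that just means typing something like this into the destination path field (server and share names are placeholders):

# In Duplicati's "Local folder or drive" destination, enter the UNC path manually, e.g.:
\\tower\backups\duplicati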

 

So Minio is only required if you want to be able to back up remotely. However, I could imagine that using Minio might make things a little easier. That'll be my next test.


So I've been playing with Duplicati to back up my everyday machine to my unRAID server (using the local directory option to go direct to an SMB share). It's working as expected. My next step will be to convert the local install to a service so it'll run without anyone logged in; a sketch of that is below. After that I'll add Duplicati to my unRAID server and back that up to the cloud, probably following @gridrunner's guide. So I'd be curious whether anyone has experience with Duplicati backing up its own dblock/dindex/etc. files.
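On the service conversion: Duplicati ships a Windows service wrapper, so on a Windows client the conversion should be roughly the following, run from an elevated prompt in the Duplicati install folder (a sketch; the service name "Duplicati" is my assumption, so check your version's docs):

:: Install and start Duplicati as a Windows service (run elevated from the install directory)
Duplicati.WindowsService.exe install
net start Duplicati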

 

That said, has anyone had any experience with Duplicacy? The few reviews I've seen look decent. The licensing model looks strange (the CLI is free for personal use, but you pay for the GUI and/or commercial use), though it's not particularly expensive. I'd also be curious whether people have gotten it up and running on unRAID.
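For anyone else evaluating it, the free Duplicacy CLI workflow is short. A hedged sketch, where the snapshot id and the SFTP storage URL (pointing at a share on the unRAID box) are placeholders:

# Hedged sketch of the Duplicacy CLI basics (snapshot id and storage URL are placeholders)
cd ~/Documents
duplicacy init my-docs sftp://backup@tower//mnt/user/backups/duplicacy
duplicacy backup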


I'm trying out Nextcloud and have pointed a Duplicati backup at its WebDAV storage, but I'm getting an error in Duplicati: "The remote server returned an error: (413) Request Entity Too Large."

I know that I set the following in the letsencrypt nginx site conf:

client_max_body_size 512M;
proxy_buffering off;
proxy_request_buffering off;

And I set the following in the Nextcloud nginx site conf:

# Path to the root of your installation
root /config/www;
# set max upload size
client_max_body_size 512M;
fastcgi_buffers 64 4K;
proxy_buffering off;
proxy_request_buffering off;

I've restarted the unRAID server (and the Docker containers) since making these changes, but I still get that error in Duplicati.

I can upload large files to the Nextcloud instance through the web interface (I just tested with a 1.2GB .iso file). Why am I getting this error with Duplicati?
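One way to narrow down where the 413 comes from is to bypass Duplicati and PUT a large file straight at the same WebDAV URL Duplicati targets, using curl (credentials, host, and filename below are placeholders):

# PUT a large file directly at Nextcloud's WebDAV endpoint (placeholders throughout)
curl -v -u myuser:mypassword -T big.iso \
  "https://cloud.example.com/remote.php/dav/files/myuser/big.iso"

If that succeeds over the exact URL Duplicati uses but Duplicati still gets the 413, it may be worth checking for a second client_max_body_size hiding in a location block of either conf, and comparing the limits against Duplicati's upload volume size (50MB by default).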
