
[Support] Linuxserver.io - Duplicati


I just completed my first backup of 1.7 TB to a USB3 drive. I'm running 500KB blocks, 250MB remote volumes and no encryption. The rest of my settings are default. It took 2d18h to complete. I've used solutions that were slower, but this still concerns me unless someone here can verify that this is expected.
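For anyone comparing settings, here's roughly what that configuration would look like as a Duplicati command-line run (flag names taken from Duplicati's CLI options; the storage URL and source path are placeholders):

```shell
# Hypothetical CLI equivalent of the settings above: 500KB blocks,
# 250MB remote volumes, no encryption. A larger --blocksize can speed
# up big initial backups at the cost of less granular deduplication.
duplicati-cli backup file:///mnt/usb-backup /mnt/user/data \
  --blocksize=500KB \
  --dblock-size=250MB \
  --no-encryption=true
```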

 

I did a test restore after the backup completed. The first thing I noticed was how slow it was to browse through the backups. It took 30-45 seconds to expand each directory that I drilled down through. I chose a 7GB file to test with, which took about 20 minutes to restore. From what I observed with the information the UI provides, the actual restoration of the file took about three minutes while the rest of the time was spent verifying before and after.

 

Is this kind of behavior typical of Duplicati, or is there something I could be doing wrong?

 

 


Not sure if this is expected, but I see very similar slowness during verification. I've been trying to back up ~650GB to B2 storage. Periodically I get a message that a timeout has happened, which causes the backup to stop; it then spends 24-30 hours "verifying backup" before it starts actually uploading data again.

 

I'm a bit concerned that this will continue even after I have successfully completed my backup.  If so, I'll have to go find something else to use.

 

I've checked my bandwidth and CPU and cannot see why it takes so long to verify.

 

david

On 26/08/2017 at 9:28 AM, Gog said:

 

Yeah, I figured with proftpd restricting access to the backup directory it wasn't too bad, but it still leaves me a bit twitchy. I'm trying to switch to SFTP now but I'm fighting with the public key authentication method. I'll post info if I figure it out.

 

It's working now over SFTP.

 

Credit goes to SlrG for the sFTP encryption key setup: 

 

Then simply copy sftp_rsa_key to both the SFTP server and the Duplicati client.

For FileZilla, load sftp_rsa_key via Edit -> Settings -> SFTP -> Add key file and select sftp_rsa_key.

For Duplicati, configure the destination as an SFTP site using port 2222, and in the advanced properties set ssh-keyfile to the path of sftp_rsa_key.
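For anyone following along, the key generation side of this sketches out to something like the following (filenames and paths are illustrative, not from SlrG's original guide):

```shell
# Generate an RSA keypair with no passphrase for the backup user.
ssh-keygen -t rsa -b 4096 -N "" -f sftp_rsa_key

# Install the public half on the SFTP server for that user:
cat sftp_rsa_key.pub >> /path/to/sftp-user/.ssh/authorized_keys

# The private key (sftp_rsa_key) is what Duplicati's ssh-keyfile
# option and FileZilla's key-file setting should point to.
```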

On 8/26/2017 at 10:03 AM, lovingHDTV said:

I asked and got an answer from the Duplicati forums and verified that email works just fine in the docker:

 

https://forum.duplicati.com/t/how-to-set-up-email-notification/233/12

 

hopefully this helps someone,

david

 

Thanks! That thread helped me find the issue I was having :-)

(a certificate mismatch on the SSL connection) I just couldn't find a way to see the error output without it.

On 8/30/2017 at 11:37 AM, Phastor said:

I just completed my first backup of 1.7 TB to a USB3 drive. I'm running 500KB blocks, 250MB remote volumes and no encryption. The rest of my settings are default. It took 2d18h to complete. I've used solutions that were slower, but this still concerns me unless someone here can verify that this is expected.

 

I did a test restore after the backup completed. The first thing I noticed was how slow it was to browse through the backups. It took 30-45 seconds to expand each directory that I drilled down through. I chose a 7GB file to test with, which took about 20 minutes to restore. From what I observed with the information the UI provides, the actual restoration of the file took about three minutes while the rest of the time was spent verifying before and after.

 

Is this kind of behavior typical of Duplicati, or is there something I could be doing wrong?

 

 

 

On 8/30/2017 at 1:17 PM, lovingHDTV said:

Not sure if this is expected, but I see very similar slowness during verification. I've been trying to back up ~650GB to B2 storage. Periodically I get a message that a timeout has happened, which causes the backup to stop; it then spends 24-30 hours "verifying backup" before it starts actually uploading data again.

 

I'm a bit concerned that this will continue even after I have successfully completed my backup.  If so, I'll have to go find something else to use.

 

I've checked my bandwidth and CPU and cannot see why it takes so long to verify.

 

david

 

I believe it is just stupidly slooooow. I was considering duplicati, but I'm now leaning towards tossing down 30 bucks for cloudberry.

 

 

 


Duplicati works perfectly for me, even backing up to an external hard drive via USB. Yes, it's slow, but USB 2 only manages around 40 MB/s in practice, so I wouldn't expect miracles.


4 hours ago, allanp81 said:

Duplicati works perfectly for me, even backing up to an external hard drive via USB. Yes, it's slow, but USB 2 only manages around 40 MB/s in practice, so I wouldn't expect miracles.

 

 

I'm actually using USB3.

 

The actual transfer of the remote volumes to the drive is pretty quick; each 250 MB volume only takes a couple of seconds to be moved to the drive. It's the creation of the volumes before they are flushed to the drive that takes forever. Each volume is generated at about 10 MB/s.

Edited by Phastor


Just a comparison update: I paid 30 bucks for CloudBerry. It encrypts and compresses using multi-threading at 10-70 MB/s depending on file type/size on my server. Unencrypted runs at 75-100 MB/s. Duplicati unencrypted only ever ran at 5-6 MB/s, and 4-5 MB/s with encryption/compression.

 

And that was the issue for me: encryption and compression operating so slowly. If I were only doing cloud uploads on a 10-20 Mbps line, then 5 MB/s would exceed the upload speed anyway. But my local backups (10GbE) and remote ones (200 Mbps) compel me to use something that crunches the backup faster and more efficiently, especially when the initial backup of a 2+ TB difference is 20 hours vs 105, and with weekly backups of 2-400GB.

 

Otherwise, I would have stuck with duplicati. It's a cool (free) product. Just not fast enough for me!

Edited by 1812

10 hours ago, 1812 said:

Just a comparison update: I paid 30 bucks for CloudBerry. It encrypts and compresses using multi-threading at 10-70 MB/s depending on file type/size on my server. Unencrypted runs at 75-100 MB/s. Duplicati unencrypted only ever ran at 5-6 MB/s, and 4-5 MB/s with encryption/compression.

 

And that was the issue for me: encryption and compression operating so slowly. If I were only doing cloud uploads on a 10-20 Mbps line, then 5 MB/s would exceed the upload speed anyway. But my local backups (10GbE) and remote ones (200 Mbps) compel me to use something that crunches the backup faster and more efficiently, especially when the initial backup of a 2+ TB difference is 20 hours vs 105, and with weekly backups of 2-400GB.

 

Otherwise, I would have stuck with duplicati. It's a cool (free) product. Just not fast enough for me!

 

I might have to take a look at Cloudberry then. I have another drive I can test it with without losing what I've done so far with Duplicati.

 

It would be great if it addresses the speed issue. Another thing I would love is a restore option to skip files that already exist; Duplicati only offers the options to overwrite or create a duplicate. Does CloudBerry do this?


My upload speed is 2 Mbit and my backup size is about 170GB, so my Duplicati backup is still going from a week ago. It's had quite a few interruptions, which have resulted in quite long verifications (multiple hours at a time) before continuing with the upload.

 

I'm getting CloudBerry ready to go once Duplicati is done. I do note that the Linux edition has quite a limited choice of backup locations compared to the others, which is pretty disappointing.

4 hours ago, scytherbladez said:

My upload speed is 2 Mbit and my backup size is about 170GB, so my Duplicati backup is still going from a week ago. It's had quite a few interruptions, which have resulted in quite long verifications (multiple hours at a time) before continuing with the upload.

 

I'm getting CloudBerry ready to go once Duplicati is done. I do note that the Linux edition has quite a limited choice of backup locations compared to the others, which is pretty disappointing.

 

Then perhaps boot up a VM and install the freeware on that to give it a go? Since I'm backing up to my own remote hardware (local & remote), it wasn't a consideration for me.

 

4 hours ago, Phastor said:

Another thing I would love is a restore option to skip files that already exist; Duplicati only offers the options to overwrite or create a duplicate. Does CloudBerry do this?

 

I've only tested individual file restoration after deletion, but from the menu it appears it will only replace missing files unless you specify overwrite existing on restore.

 

[Screenshot from 2017-09-04 06:18:21]

 

 

 

The two things I have noticed (in local backups) that may be important when considering CloudBerry:

1. Encryption/compression keeps the file name intact rather than obfuscating it.
2. CloudBerry's versioning stores whole file versions rather than references, so it keeps the entire file again instead of just the changes. This means more space, but it also means that if one version's file is corrupted, you lose only that version.

 

 

 

I'll stop hijacking this thread now...

 

 

Edited by 1812

Quote

The two things I have noticed (in local backups) that may be important when considering CloudBerry:

1. Encryption/compression keeps the file name intact rather than obfuscating it.
2. CloudBerry's versioning stores whole file versions rather than references, so it keeps the entire file again instead of just the changes. This means more space, but it also means that if one version's file is corrupted, you lose only that version.

 

I wouldn't take much issue with full file versions being stored on change, since I mostly have audio and video. The only changes made to those are renames and moves, which I would imagine (and hope) wouldn't trigger a full new version.


So I have done a few backups now, but "source" is always bigger than "backup".

Does anyone know the reason for this?

The backups don't seem to be missing any files.

1 hour ago, allanp81 said:

Compression? De-duping?

I actually did not think about the compression :P
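For what it's worth, compression alone can easily make "backup" smaller than "source". A quick shell demo of the effect (the temp file paths are arbitrary):

```shell
# Create a 1 MiB file of zeros -- highly compressible,
# like many text, log, and config files.
dd if=/dev/zero of=/tmp/compress_demo.bin bs=1024 count=1024 2>/dev/null

# Compress it and compare sizes; the gzip copy is a tiny
# fraction of the original.
gzip -c /tmp/compress_demo.bin > /tmp/compress_demo.bin.gz
echo "original:   $(wc -c < /tmp/compress_demo.bin) bytes"
echo "compressed: $(wc -c < /tmp/compress_demo.bin.gz) bytes"
```

Deduplication stacks on top of this: identical blocks across files are stored once, shrinking the backup further.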


I have 440GB of data I need to back up to another NAS; file sizes range from kilobytes to gigabytes. Is Duplicati the right solution for me, or will it struggle with that volume of data?

5 minutes ago, Spies said:

I have 440GB of data I need to back up to another NAS; file sizes range from kilobytes to gigabytes. Is Duplicati the right solution for me, or will it struggle with that volume of data?

 

My best times were 5 MB/s. So if you can live with longer backup times than some other programs, you'll be fine.


I have one drive with an existing set of backups on it. I want to make a second so I can rotate them to have an off-site cold backup.

 

I really don't want to take another three days to generate another backup of my 1.7TB from scratch on another drive, so I had the idea of copying the backups from one drive to another (a few hours of work rather than days), then exporting/importing its configuration and changing the name and destination. I know you can move backups to another location by doing this, but what about making an exact copy? Could two exact instances of the same backups exist under different names?

 

To test if this would work, I made a small test backup set. I ran it once and copied the resulting backups to another location. I then exported the configuration for that backup and imported it into another new set under a different name. I changed the destination to the location that I made the copy of the backup to and tried running it. It failed the first time, referencing files on the remote location that didn't match. After running a repair on it, it ran successfully on the second attempt.
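For anyone repeating this, the workflow above sketches out to something like the following (paths are placeholders, and --dbpath is assumed to point the copied configuration at a fresh local database):

```shell
# Clone the existing backup files to the second drive.
rsync -a /mnt/disks/backup-a/duplicati/ /mnt/disks/backup-b/duplicati/

# Repair reconciles Duplicati's local database with the remote files,
# which resolves the "files didn't match" error on the first run.
duplicati-cli repair file:///mnt/disks/backup-b/duplicati \
  --dbpath=/config/backup-b-copy.sqlite
```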

 

So now that I know this works, before I go and do this with my live backup, I just wanted to make sure: will I run into any wonky issues later on?

Edited by Phastor


Has anyone setup Duplicati to connect to a Minio docker (s3 compatible backend) via the letsencrypt docker? 

 

I've attempted to set this up; hitting Minio via HTTPS in a browser works fine, no errors, but when I try to point Duplicati at it I get the following error:

Failed to connect: The request signature we calculated does not match the signature you provided. Check your key and signing method.

 

Pointing Duplicati directly at the Minio docker via non-SSL works fine.

 

I wasn't sure which docker container thread I should be asking this in, but any help would be appreciated.
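One thing worth checking: S3 v4 request signatures cover the Host header, so if the letsencrypt/nginx proxy rewrites it, Minio's signature check will fail with exactly that error. A sketch of the relevant nginx directives (the upstream name and port are assumptions about your setup, not a verified fix):

```nginx
# Pass the original Host header through to Minio; rewriting it
# invalidates S3 v4 signatures computed by the client.
location / {
    proxy_pass http://minio:9000;   # hypothetical upstream name/port
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
}
```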


@Tango, I'm new to unRAID and Docker (and Duplicati - thanks to CrashPlan).  I have my box set up and a few 'standard' containers going but searching for "Minio" in the Community Applications "Apps" returns nothing.  How did you get Minio installed as a docker in the first place?

 

I found another post from July (I think) with a link to a supposed Minio template on github, but the link is dead now.  :-(

21 hours ago, JonMikelV said:

@Tango, I'm new to unRAID and Docker (and Duplicati - thanks to CrashPlan).  I have my box set up and a few 'standard' containers going but searching for "Minio" in the Community Applications "Apps" returns nothing.  How did you get Minio installed as a docker in the first place?

 

I found another post from July (I think) with a link to a supposed Minio template on github, but the link is dead now.  :-(

I had problems with that too.

What you need to do is go to the unRAID server's Settings -> Community Applications -> General Settings and set "Enable additional search results from dockerHub?" to Yes.

Then go to "Apps" and search for "minio".

When it does not find anything, click on the whale icon ("Get More Results From DockerHub") and you will get a lot of results.

The one I'm using is from "topdockercat".

 

:)

 

I would also like to know how to put the minio server behind the letsencrypt docker

Edited by isvein


isvein, thanks for the "Enable additional search results from dockerHub" tip!

 

Regarding your Duplicati issues, it may be related to your cloud provider. I recall a topic on the Duplicati forums about getting lots of errors specifically from box.com. I believe the "solution" was to switch from their API to WebDAV.

https://forum.duplicati.com/t/benchmarking-different-storage-providers/601/3

 


Trying to browse through folders under the Restore tab is painful. Every time you try to drill down into a folder, it takes nearly two minutes to think before it actually does it. It doesn't matter how large the folder is or how many files are in it. It does this with every single folder, and it seems to get longer with every additional snapshot taken.

 

Is this normal?


To be honest, I haven't had to use the browse feature in restores much, so I don't know if it's normal or not. When I did need something, I used the Search button to filter down to just the files I cared about, and that didn't seem to take too long.

 

I'd suggest you try asking over at their forums (https://forum.duplicati.com/), where I know the main developers are pretty active.
