[Support] Linuxserver.io - Duplicati



Hey all,

Does anyone have experience backing up to B2 on a gigabit connection? Mine seems to max out around 50 Mb/s (bits, not bytes, to be clear). I also tested with CloudBerry and Backblaze's speed test, which gave similar results.

I've seen recommendations to increase the number of threads in the backup software - is that possible with Duplicati?
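(For anyone finding this later: Duplicati 2 does expose upload parallelism through its advanced options. A rough sketch of the knobs involved - verify the option names against your version's "Advanced options" list; the bucket and credentials are placeholders:)

```shell
# Sketch: raise Duplicati's parallelism for a B2 backup.
# Option names are from Duplicati 2's advanced options; verify them
# in your version's web UI before relying on this.
#   --asynchronous-concurrent-upload-limit : parallel volume uploads
#   --dblock-size : larger remote volumes mean fewer per-file round trips
duplicati-cli backup \
  "b2://mybucket/myfolder?auth-username=KEY_ID&auth-password=APP_KEY" \
  /source/path \
  --asynchronous-concurrent-upload-limit=8 \
  --dblock-size=100MB
```

The same options can be set per-job in the web UI under Advanced options.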

Edited by acosmichippo
Popular Posts

Hi Guys. I have made a video about setting up and configuring Duplicati on unRAID for cloud and network backups.  

Application Name: Duplicati
Application Site: https://www.duplicati.com/
Docker Hub: https://hub.docker.com/r/linuxserver/duplicati/
Github: https://github.com/linuxserver/docker-duplic

Same problem here. Temporary solution that I came up with is to edit the docker settings and change repository to an older version: "linuxserver/duplicati:v2.0.4.23-2.0.4.23_beta_2019-07-14-ls27"


13 minutes ago, jonathanm said:

How is duplicati getting access to /mnt/user? Normally that wouldn't be mapped.

Thanks partner.  No, it wasn't mapped.  I had to play with the source path in the settings and that fixed it.  I switched to /mnt/ and it works now.  I never understood the file permission stuff.

 

Does the "privileged" switch give access to everything?  How does it work?
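For context: --privileged mostly lifts kernel capability and device restrictions; it does not map any host paths for you. The container can only see paths you explicitly bind-mount. A hedged sketch (paths are illustrative, not taken from this thread):

```shell
# The container only sees what you bind-mount; --privileged does not
# map any host paths for you. Mapping the share read-only is enough
# for a backup source. Paths here are illustrative.
docker run -d --name=duplicati \
  -e PUID=99 -e PGID=100 \
  -p 8200:8200 \
  -v /mnt/user/appdata/duplicati:/config \
  -v /mnt/user:/source:ro \
  linuxserver/duplicati
```

The `:ro` on the source mapping is a nice safety net: Duplicati can read everything to back it up but can't modify the originals.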

  • 3 weeks later...

I decided to give Duplicati a try as my backup solution.  I have been using unRAID for the last 10 years, but I stepped up my game in the last year by setting up several dockers along with a few VMs on my quad-Xeon with 64GB RAM.

 

I installed the Docker and scheduled a 220GB backup from unRAID to a remote SMB share (Synology box) for 3am.  When I woke up this morning it looked as if the backup was complete (~220GB of Duplicati files in my Synology share).  I attempted to access the Duplicati web UI but it won't come up - stuck on loading.

 

I stopped and restarted the Docker - same result.  I stopped the array and rebooted the server - same result; I can't access the web UI.

 

No errors in the Docker log and no errors in syslog. Where do I go next?

Edited by a12vman
  • 2 weeks later...

I get a warning from within the Duplicati GUI when I configure a backup with run-script-after. After clicking on the warning, the log reports success and gives no details on what went wrong. The script itself works, but I suppose it never reaches "exit 0"? Any help is appreciated.

MESSAGE="$DUPLICATI__OPERATIONNAME $DUPLICATI__PARSED_RESULT"
TITLE="$DUPLICATI__backup_name"

APP_TOKEN="hidden"
USER_TOKEN="hidden"

curl 'https://api.pushover.net/1/messages.json' -X POST -d "token=$APP_TOKEN&user=$USER_TOKEN&message=\"$MESSAGE\"&title=\"$TITLE\""

exit 0

 


OK, never mind - got it working without any warnings or errors using the Pushover API example code for Unix. In case anyone wants to get notifications when backups occur, here is my script, set as "run-script-after" in the advanced options of the backup. Just add your token and user.

 

TITLE="Duplicati ($DUPLICATI__backup_name)"
MESSAGE="$DUPLICATI__OPERATIONNAME $DUPLICATI__PARSED_RESULT"

curl -s \
  --form-string "token=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" \
  --form-string "user=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" \
  --form-string "message=$MESSAGE" \
  --form-string "title=$TITLE" \
  https://api.pushover.net/1/messages.json

exit 0
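One note for anyone copying this: the script has to live somewhere the container can see, be executable, and be referenced by its container path. A sketch, assuming a /config/scripts folder (the folder name is an assumption, not from the post above):

```shell
# Assumed layout: the script is saved inside the container's /config
# mapping. Make it executable, then set the backup job's advanced option:
#   run-script-after = /config/scripts/pushover.sh
chmod +x /config/scripts/pushover.sh
```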

 

 

Edited by bigbangus
spelling
  • 3 weeks later...

A public service announcement regarding duplicati.

 

Duplicati is still in beta (which is what this container runs).

I recently lost my cache drive, which meant I lost all my docker containers and my Duplicati config. No sweat - I had a backup of the config.

After restoring the config, I tried to restore a backup of the ~20GB file containing my docker backups. Since the database of files was lost with the Duplicati docker, Duplicati has to rebuild the database.

Whatever code performs this task is broken. A 40GB restore from a 600GB backup is on track to take about 20 days. I brought all the files down from the cloud, and even running from local disk the restore looks like it will take about 9 days.

This seems to be a well-known problem with this software. Google "duplicati recreating database" for similar stories.

If you are relying on Duplicati for your primary backup, I recommend you google the above and test your backup (delete your docker container / Duplicati config, and try a restore as if you had a catastrophic failure).
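To make the "test your backup" advice concrete, here is a hedged sketch of a cold-restore rehearsal from the command line. The storage URL, passphrase, and paths are placeholders, and the option names should be verified against your Duplicati version:

```shell
# Rehearse a catastrophic failure: restore with NO local database,
# so Duplicati must rebuild state from the remote volumes. Time this -
# it is exactly the slow path described above.
# Storage URL, passphrase, and paths are placeholders.
duplicati-cli restore \
  "b2://mybucket/backup?auth-username=KEY_ID&auth-password=APP_KEY" \
  "*" \
  --passphrase="YOUR_PASSPHRASE" \
  --restore-path=/tmp/restore-test \
  --no-local-db=true
```

If that rehearsal takes days, you have learned something important while your data still exists.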

 

Sent from my SM-N960W using Tapatalk

 

 

 

 

On 9/29/2020 at 7:17 PM, bobo89 said:

A public service announcement regarding duplicati. Duplicati is still in beta... [quoted post trimmed; full text above]
Thanks for the PSA. I had several error messages myself when there were DB issues, and ended up deleting everything and starting fresh, but I never had to simulate a catastrophic failure. I only back up small datasets compared to your use case, but this is still worrisome, as the stable version of Duplicati is not even recommended anymore... so there are not a lot of alternatives in this Duplicati world. Do you know of alternative tools that are:
1. free
2. able to deduplicate
3. able to store to the cloud as encrypted, split files
4. not beta, and can actually restore without taking weeks
5. not command-line only, i.e. has a free GUI (Duplicacy is a close alternative, but with the CLI version being the only free one, it is not intuitive to use, especially for restores via the command line)

 

 

 

 

 

Thanks for the PSA. [...] Do you know of alternative tools that are free, can deduplicate, store to the cloud encrypted and split, are not beta, and are not command-line only? [quoted post trimmed; full text above]
Some recommendations I have seen were to use Borg, and to just use rclone to copy over the files (or the rclone upload/mount/unmount script used on the forums). I haven't fully explored that yet, but will report back on how it works.
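For anyone curious, the shape of that borg + rclone approach is roughly the following - a sketch with placeholder paths and remote names, not a drop-in script; test a restore before trusting it:

```shell
# Rough shape of the borg + rclone approach. Paths and the rclone
# remote name are placeholders.
# 1) Deduplicated, encrypted backup into a local borg repository:
borg init --encryption=repokey /mnt/user/backups/borg-repo   # one-time setup
borg create --stats --compression zstd \
  /mnt/user/backups/borg-repo::'appdata-{now:%Y-%m-%d}' \
  /mnt/user/appdata

# 2) Push the repository to cloud storage with rclone:
rclone sync /mnt/user/backups/borg-repo remote:borg-repo
```

Borg keeps its own index locally, so restores don't involve the database-rebuild pain described earlier in the thread; the trade-off is wiring the cloud step yourself.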

Duplicati is out for me.

Sent from my SM-N960W using Tapatalk

  • 3 weeks later...

Hi,

I want to back up to an external hard disk, but when I turn it off and put it away, the Duplicati docker goes down.
How do I set it up so I can swap external hard drives and still have the path available? I want to use one name for 7 hard drives, one for each day of the week.
While I'm at it, can I tell Duplicati that a backup should start when a particular disk is attached?
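On the "start a backup when a disk is attached" part: one pattern on unRAID is an Unassigned Devices mount script that kicks off the job. This is only a sketch - the $ACTION values, job ID, and REST endpoint are assumptions to verify against your setup (a helper tool like duplicati-client may be more robust):

```shell
#!/bin/bash
# Sketch of an Unassigned Devices mount script: when the rotating
# backup disk is mounted, kick off a Duplicati job. The $ACTION values,
# API path, and job ID below are assumptions -- check the Unassigned
# Devices docs and your Duplicati version's REST API before relying
# on this.
case "$ACTION" in
  ADD|MOUNT)
    # Give every rotating disk the same label (e.g. "backup") so UD
    # mounts each of the 7 disks at the same path:
    #   /mnt/disks/backup
    # and the Duplicati destination never changes.
    curl -s -X POST "http://localhost:8200/api/v1/backup/1/run"
    ;;
esac
```

Labeling all 7 disks identically is the key trick for the "one name for 7 hard drives" requirement, independent of how the job gets triggered.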

 

[screenshot]

On 10/1/2020 at 10:32 AM, xxxliqu1dxxx said:

Thanks for the PSA. [...] Do you know of alternative tools that are free, can deduplicate, store to the cloud encrypted and split, are not beta, and are not command-line only? [quoted post trimmed; full text above]
Nothing else exists that matches all those requirements. Borg does have a GUI, Vorta, but you still need another program to get your backups to the cloud, like rclone.

You can check out this thread for more details if you decide to go the borg + rclone route, but you will have to modify the scripts for your use.

  • 2 weeks later...

Question. 

 

What's the best way to back up appdata using Duplicati?

Currently I'm backing up the appdata folder directly with all the dockers running. It surprises me that it compressed from 20GB to 10GB.

The other option is to use the Appdata Backup plugin, which saves to tar.gz, and then have Duplicati upload that - but my backup size would be huge then, because I keep the last 4 copies in the folder.

With Duplicati I only want the last 2 going to a Mega.nz account.

  • 4 weeks later...

I'm struggling to decide where to put the Duplicati temp folder on the server. I use the appdata backup app, which backs up the appdata folder. Duplicati stores its databases in its appdata folder, so that needs to be backed up.

The temp folder for Duplicati doesn't need backing up, so inside its appdata folder is the wrong place for it - and it's making my appdata backup tar files huge!

Would it be OK to set it to the server cache drive, e.g. /mnt/cache/?

I thought about using the server's RAM ("/tmp/") but didn't want it to run out of RAM during a backup.
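FWIW, a middle ground is to bind-mount the container's /tmp to a scratch folder on the cache drive, so temp data stays out of appdata without eating RAM. A sketch (paths illustrative):

```shell
# Keep Duplicati's temp files out of appdata (so they stay out of the
# appdata backup tars) without risking RAM: map the container's /tmp
# to a scratch folder on the cache drive. Paths are illustrative.
docker run -d --name=duplicati \
  -v /mnt/user/appdata/duplicati:/config \
  -v /mnt/cache/duplicati-tmp:/tmp \
  linuxserver/duplicati
```

The same mapping can be added as an extra path in the unRAID docker template instead of a raw `docker run`.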

 

  • 4 weeks later...
Why is Duplicati creating and storing these files in the directory I am trying to back up?  I left it running for about 2 weeks while I was vacationing in Mexico, and when I got back the drive was full.  WTF?

[screenshot of the backup directory]

Those are the encrypted chunks that constitute your backup. Are you sure you set up the right destination for your backup?

Sent from my SM-N960W using Tapatalk

20 hours ago, bobo89 said:

Those are the encrypted chunks that constitute your backup. Are you sure you setup the right destination for your backup?

Sent from my SM-N960W using Tapatalk
 

Yes, I am sure - the destination was right, to the cloud.  Perhaps it creates a local copy before it uploads, and the upload speed couldn't keep up with the local copies? I was backing up about 5TB and the drive is 8TB. It was full in 2 weeks while I was out traveling :(
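A hedged guess at the mechanism, plus the knobs that may help: Duplicati stages volumes in its temp folder before upload, and if that folder sits on the source drive the staging backlog can grow. The option names below are from Duplicati 2's advanced options - verify them in your version:

```shell
# Duplicati stages dblock volumes locally before upload; if uploads
# can't keep up, the staging area grows. These advanced options move
# the staging area off the source drive and cap how many volumes
# queue ahead of the uploader. Verify the option names in your version.
#   --tempdir                   : where temporary volumes are written
#   --asynchronous-upload-limit : max volumes staged ahead of upload
duplicati-cli backup "<storage-url>" /source \
  --tempdir=/mnt/cache/duplicati-tmp \
  --asynchronous-upload-limit=4
```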

Edited by johnwhicker

I am getting the following error over and over. I have tried restoring the container by re-downloading it, but that does not seem to fix the issue.

I am not sure how long this has been going on for, but it stops the container from launching properly. Any ideas?

 

 

A serious error occurred in Duplicati: System.UnauthorizedAccessException: Access to the path "/tmp/HttpServer" is denied.
at System.IO.Directory.CreateDirectoriesInternal (System.String path) [0x0005e] in <254335e8c4aa42e3923a8ba0d5ce8650>:0
at System.IO.Directory.CreateDirectory (System.String path) [0x0008f] in <254335e8c4aa42e3923a8ba0d5ce8650>:0
at HttpServer.HttpServer.Init () [0x0010a] in <bed89f1655ee48029f6d6812f54c58ad>:0
at HttpServer.HttpServer.Start (System.Net.IPAddress address, System.Int32 port) [0x00026] in <bed89f1655ee48029f6d6812f54c58ad>:0
at Duplicati.Server.WebServer.Server..ctor (System.Collections.Generic.IDictionary`2[TKey,TValue] options) [0x00215] in <c5f097a49c0a4f1fb0f93cf3f5f218b1>:0
at Duplicati.Server.Program.StartWebServer (System.Collections.Generic.Dictionary`2[TKey,TValue] commandlineOptions) [0x00000] in <c5f097a49c0a4f1fb0f93cf3f5f218b1>:0
at Duplicati.Server.Program.RealMain (System.String[] _args) [0x00227] in <c5f097a49c0a4f1fb0f93cf3f5f218b1>:0
A serious error occurred in Duplicati: System.UnauthorizedAccessException: Access to the path "/tmp/HttpServer" is denied.
 

12 minutes ago, doma_2345 said:

I am getting the following error over and over: System.UnauthorizedAccessException: Access to the path "/tmp/HttpServer" is denied. [quoted post and stack trace trimmed; full text above]

I have managed to fix this with the following, which I found on the Duplicati forum from 2018:

 

[screenshot of the fix]
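Since the screenshot doesn't survive as text: the workaround usually cited for this error is making the server's temp directory writable by the container user. A hedged sketch - this may not be the exact fix from the screenshot:

```shell
# Hedged workaround for "Access to the path /tmp/HttpServer is denied":
# the server needs a temp directory the container user can write to.
# One commonly cited approach is pointing TMPDIR at a folder inside
# /config (which the abc user owns). Treat this as a sketch, not the
# canonical fix.
mkdir -p /config/tmp
# then add an environment variable to the container template:
#   TMPDIR = /config/tmp
```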

  • 2 weeks later...

Has something changed in the last update? My local backups run to a network NAS (it is very old, granted) which can no longer be accessed in Duplicati. unRAID can see the NAS, and it is mounted and can be accessed:

 

[screenshot]

 

It is configured, and the docker has not been edited since I created it (a few years ago):

 

[screenshot]

 

However, the destination is no longer accessible in Duplicati:

 

[screenshot]

 

Any ideas guys?

 

EDIT - Figured it out. It seems the mounted share is now in /remotes and not /disks. I changed the docker setting and it's now working again.

Edited by showstopper

After resolving the missing HttpServer folder error, the latest update broke it again, and I had to update my template to remove that fix. That gained me access to the GUI; however, I am now getting the following error.

 

I backed up one of my backup tasks and deleted it. I then tried restoring that backup, and the file import failed. I then added the backup task manually, but that also did not resolve the issue. Any ideas?

 

[screenshot of the error]

 

On 9/30/2020 at 9:17 AM, bobo89 said:

A public service announcement regarding duplicati. Duplicati is still in beta... [quoted post trimmed; full text above]
Is this still a current issue? I've been looking for a backup solution for my server and was thinking that Duplicati was it; however, after reading the posts on this forum, I am having my doubts...

On 1/19/2021 at 6:07 AM, Munce31 said:

Is this still a current issue? [quoted post trimmed; full text above]

Haven't tried it, but I'm of the opinion you should really test your backup anyway. Try Duplicati and pretend you had a catastrophic failure: how quickly can you restore using it?

On 1/14/2021 at 5:26 PM, rampage said:

This doesn't seem to work for me - I'm getting an HTTP ERROR 400 when opening the web UI. Do I have to change anything in the default template?

I'm getting the same HTTP 400 error.  Anyone know what's going on?

 

Edit:  It doesn't work in Chrome.

 

Found a solution from user fnwc:

When in the web UI in Chrome, right-click → Inspect, go to the "Application" tab, click the "Clear Site Data" button, then reload.

Edited by jmmille
