[Plugin] CA Appdata Backup / Restore v2.5


KluthR

Recommended Posts

2 hours ago, Jclendineng said:

On a brand new install? OK, that's one way to look at it. Let's go with that then :) I'll call it my fault until it's updated. Appreciate the help!

A brand new install of what? If it’s the latest stable branch (6.11.5) then it does work for most people and you should post more info to get to the bottom of your issue.

 

If it’s the newly released RC (6.12.0-RC1) then you can expect that by the time 6.12 reaches stable any issues will be corrected.

Link to comment

I had this running flawlessly for quite a while. I installed SmokePing a few weeks ago, and for some reason it refuses to start back up automatically after the backup process, but it always starts once I launch it from the GUI. I've excluded it for now in hopes of getting a successful backup and cleaning out the old ones. Any ideas?
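In case it's useful, this is roughly how I've been checking it by hand after a backup run (SmokePing is the container name as it shows in my Docker tab; adjust if yours differs):

# what state Docker thinks the container is in after the backup finishes
docker inspect -f '{{.State.Status}} (exit {{.State.ExitCode}})' SmokePing

# start it and watch the first log lines for a startup error
docker start SmokePing
docker logs --tail 50 SmokePing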

Link to comment

Hi,

For a few days now I have been getting the following message in the logs after each backup, along with this notification:

Quote

Backup of appData complete - Errors occurred

 

In the syslog I can only see this:

 

[screenshot: syslog output]

 

And here are my Settings:

 

[screenshot: plugin settings]

 

Perhaps the next update could include an option to delete the old backups even when an error occurs.
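In the meantime I am cleaning up the old sets by hand with something like this (the path and the 30-day cutoff are only examples from my setup, not the plugin's defaults):

# first only list backup set folders older than 30 days
find "/mnt/user/Backups/CA_Backup" -mindepth 1 -maxdepth 1 -type d -mtime +30

# then run the same command with -exec to actually remove them, once the list looks right
find "/mnt/user/Backups/CA_Backup" -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -r {} +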

Maybe someone can help me. Thanks!

Link to comment
On 3/11/2023 at 11:29 AM, KluthR said:

No, that's just right. Both settings can lead to broken backups: the first one if backups are not verified, and the second if errors occur and are silently discarded.

 

The BackupMainV2 name will be gone with the new version.

Is this right? As worded, it sounds like setting "Verify Backups?" to "Yes" can result in broken backups. But isn't the point of verification to ensure that they aren't broken?

Link to comment

Oh, no, no. Setting it to "No" could cause it (because the backup is not verified). If the main tar process says it's all OK, the chances are very low that it is broken; verification is just an extra layer of security. The wording is wrong and has already been fixed for the upcoming update.

  • Thanks 1
Link to comment
18 minutes ago, MikaelTarquin said:

Is this right? As worded, it sounds like setting "Verify Backups?" to "Yes" can result in broken backups. But isn't the point of verification to ensure that they aren't broken?

The problem is that many people have dockers running that they don't want to stop for the time needed to perform the backup. This often results in a failed verification, because the backup doesn't match the original data once it has changed during the process. The test doesn't know what changed, only that the two data sets don't match. If the verification fails, the backup isn't saved. The end result is no backup.

 

Personally, I stop my dockers before running the backup and never have any issues. For people who don’t want to do this they can at least get a backup by disabling verification even if there is a chance that the backup may not be valid.
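If it helps to picture it, the verify step is essentially a compare of the finished archive against the live folder, roughly like this (paths simplified; this isn't the plugin's exact command line):

# create the archive from appdata
tar -cf /mnt/user/backups/CA_backup.tar -C /mnt/user/appdata .

# compare the archive against what is on disk now; any file a running
# container touched after the create pass shows up as a difference
tar --diff -f /mnt/user/backups/CA_backup.tar -C /mnt/user/appdata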

  • Like 2
Link to comment
3 hours ago, wgstarks said:

For people who don’t want to do this they can at least get a backup by disabling verification even if there is a chance that the backup may not be valid.

 

I honestly can't understand why people do this.  Which is worse?  Not having a backup or having a backup file which is invalid?

 

Link to comment
4 hours ago, ConnerVT said:

 

I honestly can't understand why people do this.  Which is worse?  Not having a backup or having a backup file which is invalid?

 

Sometimes the partially valid backup contains all that is really needed to recover. The parts that change rapidly may not be "valuable", in the sense that restoring an older copy of some files in the archive may not be a breaking issue, versus not having any data at all.

 

The ability to keep a backup that technically isn't complete, but is complete enough, is better than nothing. Chances are, even if the backup doesn't verify, it's still usable enough for disaster recovery, especially if you keep multiple dates of backups.

 

Some (many?) containers have the option to keep internal database backups, and since those files are pretty much guaranteed to be stable even when the container is still running and changing the active database, the "invalid" backup still contains a valid backup made by the app itself that can be used.
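As a purely illustrative example (the container name is borrowed from a log further down this thread, and the path and password variable are assumptions based on the official image's defaults), a scheduled dump inside a MariaDB container writes a file that holds still on disk, so even an archive that fails verification usually carries a consistent copy of it:

# nightly dump written into the container's data directory (mapped to appdata),
# where the plugin's archive will pick it up
docker exec MariaDB-Official sh -c \
  'mysqldump --all-databases -u root -p"$MYSQL_ROOT_PASSWORD" > /var/lib/mysql/nightly-dump.sql'

Scheduled once a day via cron or the User Scripts plugin, that dump changes far less often than the live database files, so it tends to survive a hot backup intact.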

 

Making backups of running containers is complicated; it's not a black-and-white issue like your statement seems to imply.

Link to comment
7 minutes ago, JonathanM said:

The ability to keep a backup that technically isn't complete, but is complete enough, is better than nothing. Chances are, even if the backup doesn't verify, it's still usable enough for disaster recovery, especially if you keep multiple dates of backups.

 

Is the unknown "usable enough" something that one would wish to inject into the middle of a disaster recovery? That stressful time when one is attempting to identify just what went wrong? You are an experienced troubleshooter. Many have skills at the more basic "replace with a known good part" level. Others whom you help here have little idea what they are doing. And it is not unreasonable that what they are troubleshooting is not even related to the missing data in the "complete enough" backup dataset, which may then compound the issue further.

 

I feel comfortable with my previous comment.

Link to comment
10 hours ago, ConnerVT said:

Is the unknown "usable enough" something that one would wish to inject into the middle of a disaster recovery?

Preferably not, but it's at least something to work with versus complete loss. "Sorry, we tried everything we could" is preferable to "You have no backup archives at all, so you are hosed."

Link to comment

Hello, can anyone help me, please? Whenever the server runs the CA auto backup, it reports some errors. Looking through the log, I only found this:

[22.03.2023 03:08:47] tar verify failed!

and

[22.03.2023 03:13:13] A error occurred somewhere. Not deleting old backup sets of appdata

 

I'm using the plugin from Robin Kluth, version 2023.01.28. I'm attaching the full backup log here. It's always like this.
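For what it's worth, one thing I can try while waiting for ideas is simply listing the archive, to see whether the file itself is readable or whether only the compare against live files failed (the path below is a placeholder, not my real one):

# exit code 0 means tar can read the whole archive;
# non-zero would point at a damaged file rather than files changing mid-backup
tar -tvf "/mnt/user/backups/appdata/<backup folder>/<archive>.tar" > /dev/null
echo $?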

backup.log

Link to comment

Update:

I finished the main part: backup :)

All is working well (at least it seems so).

 

Currently working on the last things for the restore:

[screenshot: restore page in progress]

 

 

Beta very soon.

 

Please also note that backups created with previous plugin versions are NOT supported!

 

The first thing beta testers should do: test this in any browser other than Firefox. I have only tested everything in FF so far...

Edited by KluthR
  • Like 1
Link to comment
On 3/20/2023 at 9:54 PM, ConnerVT said:

Is the unknown "usable enough" something that one would wish to inject into the middle of a disaster recovery? 

That depends on the circumstance. 
 

I know that my containers would support recovery from a hot backup, so it’s nice having the option to do that instead of being forced to stop services. I’d rather have the option than have it decided for me. 
 

In a more ideal world, we could back up from a file system snapshot and get both hot backups and verification, but that doesn't seem to be on the unRAID roadmap for a while.
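For what it's worth, if appdata lives on a btrfs cache pool and is its own subvolume, a manual version of that is already possible today. Just a sketch, not something the plugin does:

# read-only snapshot is near-instant, so containers can keep running
btrfs subvolume snapshot -r /mnt/cache/appdata /mnt/cache/appdata_snap

# archive and verify against the frozen snapshot instead of the live data
tar -cf "/mnt/user/backups/appdata_hot.tar" -C /mnt/cache/appdata_snap .
tar --diff -f "/mnt/user/backups/appdata_hot.tar" -C /mnt/cache/appdata_snap

# drop the snapshot when finished
btrfs subvolume delete /mnt/cache/appdata_snap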

Link to comment

  

I keep getting an error on my appdata backup (errors below). Tar files are being made; I see 80GB files in the backup directory. When I go to restore my appdata, it only allows me to select the non-errored backup from 2022-08-01. Any folder names that contain 'error' can't be used. I assume that if I rename the folder it will work, but I'm curious whether someone could advise what the error is or how to investigate it. I don't need the backups at the moment, just doing some housekeeping on the server.

 

[20.03.2023 03:00:01] Backup of appData starting. This may take awhile
[20.03.2023 03:00:01] AdGuard-Home set to not be stopped by ca backup's advanced settings. Skipping
[20.03.2023 03:00:01] Stopping binhex-krusader...  done! (took 0 seconds)
[20.03.2023 03:00:01] Not stopping binhex-minidlna: Not started! [ / Created]
[20.03.2023 03:00:01] Stopping binhex-prowlarr...  done! (took 0 seconds)
[20.03.2023 03:00:01] Not stopping changedetection.io: Not started! [ / Created]
[20.03.2023 03:00:01] Stopping EmbyServer...  done! (took 4 seconds)
[20.03.2023 03:00:05] Stopping MariaDB-Official...  done! (took 1 seconds)
[20.03.2023 03:00:06] Stopping nextcloud...  done! (took 4 seconds)
[20.03.2023 03:00:10] Stopping Nginx-Proxy-Manager-Official...  done! (took 4 seconds)
[20.03.2023 03:00:14] Stopping overseerr...  done! (took 4 seconds)
[20.03.2023 03:00:18] Not stopping Portainer-CE: Not started! [ / Created]
[20.03.2023 03:00:18] Stopping qbittorrent...  done! (took 8 seconds)
[20.03.2023 03:00:26] Stopping radarr...  done! (took 4 seconds)
[20.03.2023 03:00:30] Stopping Sonarr...  done! (took 4 seconds)
[20.03.2023 03:00:34] Not stopping syncthing: Not started! [ / Created]
[20.03.2023 03:00:34] unifi-controller set to not be stopped by ca backup's advanced settings. Skipping
[20.03.2023 03:00:34] Stopping unpackerr...  done! (took 4 seconds)
[20.03.2023 03:00:38] Stopping Unraid-API...  done! (took 1 seconds)
[20.03.2023 03:00:39] Not stopping wikijs: Not started! [ / Created]
[20.03.2023 03:00:39] Backing up libvirt.img to /mnt/user/system/libvert/
[20.03.2023 03:00:39] Using Command: /usr/bin/rsync  -avXHq --delete  --log-file="/var/lib/docker/unraid/ca.backup2.datastore/appdata_backup.log" "/mnt/user/system/libvert/libvirt.img" "/mnt/user/system/libvert/" > /dev/null 2>&1
2023/03/20 03:00:39 [3469] building file list
2023/03/20 03:00:39 [3469] sent 68 bytes  received 12 bytes  160.00 bytes/sec
2023/03/20 03:00:39 [3469] total size is 1,073,741,824  speedup is 13,421,772.80
[20.03.2023 03:00:39] Backing Up appData from /mnt/user/appdata/ to /mnt/user0/backups/appdata backup/[email protected]
[20.03.2023 03:00:39] Separate archives disabled! Saving into one file.
[20.03.2023 03:00:39] Backing Up
[20.03.2023 03:17:35] Verifying Backup 
./unifi-controller/data/db/diagnostic.data/metrics.2023-03-19T00-01-15Z-00000: Mod time differs
./unifi-controller/data/db/diagnostic.data/metrics.2023-03-19T00-01-15Z-00000: Size differs
./unifi-controller/data/db/diagnostic.data/metrics.interim: Mod time differs
./unifi-controller/data/db/diagnostic.data/metrics.interim: Size differs
./unifi-controller/data/db/WiredTiger.turtle: Mod time differs
./unifi-controller/data/db/WiredTiger.turtle: Contents differ
./unifi-controller/data/db/journal/WiredTigerLog.0000000468: Mod time differs
./unifi-controller/data/db/journal/WiredTigerLog.0000000468: Contents differ
./unifi-controller/data/db/WiredTiger.wt: Mod time differs
./unifi-controller/data/db/WiredTiger.wt: Contents differ
./unifi-controller/data/db/sizeStorer.wt: Mod time differs
./unifi-controller/data/db/sizeStorer.wt: Contents differ
./unifi-controller/data/db/collection-101--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/collection-101--632181887443358492.wt: Contents differ
./unifi-controller/data/db/collection-215--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/collection-215--632181887443358492.wt: Contents differ
./unifi-controller/data/db/collection-179--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/collection-179--632181887443358492.wt: Contents differ
./unifi-controller/data/db/collection-323-3780217339629934391.wt: Mod time differs
./unifi-controller/data/db/collection-323-3780217339629934391.wt: Contents differ
./unifi-controller/data/db/collection-206--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/collection-206--632181887443358492.wt: Contents differ
./unifi-controller/data/db/collection-186--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/collection-186--632181887443358492.wt: Contents differ
./unifi-controller/data/db/collection-190--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/collection-190--632181887443358492.wt: Contents differ
./unifi-controller/data/db/collection-36--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/collection-36--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-102--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-102--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-103--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-103--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-104--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-104--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-105--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-105--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-106--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-106--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-107--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-107--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-158--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-158--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-165--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-165--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-180--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-180--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-187--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-187--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-188--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-188--632181887443358492.wt: Size differs
./unifi-controller/data/db/index-189--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-189--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-191--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-191--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-192--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-192--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-193--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-193--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-207--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-207--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-208--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-208--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-209--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-209--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-216--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-216--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-217--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-217--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-218--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-218--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-41--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-41--632181887443358492.wt: Contents differ
./unifi-controller/data/db/index-42--632181887443358492.wt: Mod time differs
./unifi-controller/data/db/index-42--632181887443358492.wt: Contents differ
./unifi-controller/data/db/collection-0--8116669766152780928.wt: Mod time differs
./unifi-controller/data/db/collection-0--8116669766152780928.wt: Contents differ
./unifi-controller/data/db/index-1--8116669766152780928.wt: Mod time differs
./unifi-controller/data/db/index-1--8116669766152780928.wt: Contents differ
./unifi-controller/data/db/index-2--8116669766152780928.wt: Mod time differs
./unifi-controller/data/db/index-2--8116669766152780928.wt: Contents differ
./unifi-controller/data/db/index-1-622574346060731590.wt: Mod time differs
./unifi-controller/data/db/index-1-622574346060731590.wt: Contents differ
./unifi-controller/data/db/index-15-622574346060731590.wt: Mod time differs
./unifi-controller/data/db/index-15-622574346060731590.wt: Contents differ
./unifi-controller/data/db/index-16-622574346060731590.wt: Mod time differs
./unifi-controller/data/db/index-16-622574346060731590.wt: Size differs
[20.03.2023 03:36:50] tar verify failed!
[20.03.2023 03:36:50] done
[20.03.2023 03:36:50] Starting qbittorrent... (try #1)  done!
[20.03.2023 03:36:50] Waiting 2 seconds before carrying on
[20.03.2023 03:36:52] Starting Unraid-API... (try #1)  done!
[20.03.2023 03:36:53] Waiting 2 seconds before carrying on
[20.03.2023 03:36:55] Starting Nginx-Proxy-Manager-Official... (try #1)  done!
[20.03.2023 03:36:55] Waiting 2 seconds before carrying on
[20.03.2023 03:36:57] Starting unpackerr... (try #1)  done!
[20.03.2023 03:36:59] Starting binhex-prowlarr... (try #1)  done!
[20.03.2023 03:36:59] Waiting 2 seconds before carrying on
[20.03.2023 03:37:01] Starting radarr... (try #1)  done!
[20.03.2023 03:37:02] Waiting 1 seconds before carrying on
[20.03.2023 03:37:03] Starting Sonarr... (try #1)  done!
[20.03.2023 03:37:03] Waiting 1 seconds before carrying on
[20.03.2023 03:37:04] Starting overseerr... (try #1)  done!
[20.03.2023 03:37:05] Waiting 2 seconds before carrying on
[20.03.2023 03:37:07] Starting EmbyServer... (try #1)  done!
[20.03.2023 03:37:10] Starting MariaDB-Official... (try #1)  done!
[20.03.2023 03:37:10] Waiting 2 seconds before carrying on
[20.03.2023 03:37:12] Starting nextcloud... (try #1)  done!
[20.03.2023 03:37:12] Waiting 2 seconds before carrying on
[20.03.2023 03:37:14] Starting binhex-krusader... (try #1)  done!
[20.03.2023 03:37:15] Waiting 2 seconds before carrying on
[20.03.2023 03:37:17] A error occurred somewhere. Not deleting old backup sets of appdata
[20.03.2023 03:37:17] Backup / Restore Completed
Link to comment

Running Unraid 6.11.5, CA Appdata Backup 2023.01.28

So this seems related to other posts I've seen here, but I don't have any exclusions for stopping containers. They're all set to stop, back up, then restart, yet for the past two days I've gotten errors about `Error while stopping container! Code: Container already started`, on a single, different container each time, and then of course a verification difference on the affected container. Logs from the past two nights are attached.

 

I can stop the containers manually without issue, so I'm not sure why CA Appdata Backup is listing them as "already started".
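For reference, this is roughly how I've been checking things by hand around the time the error appears (the container name is whichever one failed that night):

# what Docker itself reports for the container's state
docker inspect -f '{{.State.Status}}' <container>

# stop it manually with a generous timeout and check the exit code
docker stop -t 30 <container>
echo $?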

backup.log backup.log

Link to comment
