rsync Incremental Backup



On 1/18/2022 at 11:51 PM, schwarzzrawhcs said:

I did that by restricting the maximal SMB protocol to SMB1. But now I can't connect it as a remote share in Unraid.

Try mounting it manually with a fixed SMB protocol version. Example:

mkdir /mnt/remotes/syno
mount -t cifs -o "username=MyUserName,password=myPassword,vers=1.0" //DISKSTATION/Sharename /mnt/remotes/syno
 

 

 

On 12/30/2021 at 8:27 PM, mgutt said:

 

Ok, I solved it as follows:

 

  # stop docker containers that mount the source path
  host_path="$src_path"
  if [[ $( uname -r ) == *"Unraid"* ]]; then
    # on Unraid, replace the pool/disk name with ".*" so the regex below
    # matches /mnt/user, /mnt/cache and /mnt/diskX alike
    host_path=${src_path/\/mnt\/$( echo "$src_path" | grep -oP "(?<=/mnt/).*(?=/)")//mnt/.*}
  fi
  # docker_mounts holds "container_id:bind_source" pairs (built elsewhere in the script)
  for container_id in $(echo "$docker_mounts" | grep -oP ".*?(?=:$host_path)" | uniq); do
    container_name=$(docker container inspect -f '{{ .Name }}' "$container_id")
    container_name=${container_name//\/}  # strip the leading slash from the name
    echo "Stop container $container_name as it uses $host_path"
    #docker stop "$container_id"
  done

 

Now I need to expand this to support SSH sources as well.

 

 

I want to try this because I get failed backups when I back up my appdata folder. Is it possible to start the stopped containers again after the backup has finished?

11 minutes ago, MX-Hero said:

 

I want to try this because I get failed backups when I back up my appdata folder. Is it possible to start the stopped containers again after the backup has finished?

It will autostart in the next version, which will be released in a few days.
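In the meantime, a minimal sketch of how such an auto-restart could look (this is not the script's actual implementation; the appdata path and the matching logic are assumptions):

```shell
#!/bin/bash
# Hypothetical sketch: stop only the containers that use the backed-up path,
# remember them, and restart exactly those after the backup has finished.
stopped_ids=()
for container_id in $(docker ps -q 2>/dev/null); do
  # list all bind-mount sources of the container and check for the backup path
  if docker container inspect -f '{{range .Mounts}}{{.Source}} {{end}}' "$container_id" \
      | grep -q "/mnt/user/appdata"; then
    docker stop "$container_id"
    stopped_ids+=("$container_id")
  fi
done

# ... the rsync backup would run here ...

for container_id in "${stopped_ids[@]}"; do
  docker start "$container_id"
done
```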

18 minutes ago, MX-Hero said:

 

I want to try this because I get failed backups when I back up my appdata folder. Is it possible to start the stopped containers again after the backup has finished?

The way I handle my appdata folder, along with Plex, is to back it up to a backup share with appdata backup folders once a week. I do this with other folders that are not really live-backup friendly; that way the script backs up a nice clean tar file.

Edited by bclinton
  • 2 weeks later...

I tried your script, but I am getting this error:

 

Could not obtain last backup! (/mnt/user/appdata)
Error: ()!

 

my paths

# backup source to destination
backup_jobs=(
  # source                          # destination
  "/mnt/user/appdata"               "/mnt/user/backup/unraid/appdata"
)

 

 

I changed the destination to /mnt/disk/backup/ and this worked.

Now it also worked with the user share path....

Edited by Archonw
  • 2 weeks later...

Thanks for the latest version! In my understanding, the existing "backup chain" should still be used with the new version - is that correct?

So far I have used version 0.6. I updated one backup job to the new script as a test, and now it looks like the backup chain is being re-created (like a new initial backup), which ate a lot of additional space.

If I switch every job to the new script, my backup disk will be full before all the jobs finish.

Am I doing something wrong? Please let me know.

 

Many thanks!

Edited by toasti

Hello,

Maybe someone else can use this.

I want to stop and restart all my Docker containers while the script is running. I used these commands:

 

# #####################################
# Settings
# #####################################
# stop every container before the backup jobs start
docker container stop $(docker container list -qa)

 

done
# after the script's main loop ("done"), start all containers again
docker container start $(docker container list -qa)

 

 

Now the persistent data of my database container is backed up in a consistent state as well.
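One caveat with the commands above: `docker container list -qa` also lists containers that were already stopped, so the start command would launch those too. A variant that restarts only the containers that were actually running could look like this (a sketch, not the poster's exact code):

```shell
# only act on docker when it is actually available
if command -v docker >/dev/null; then
  # remember only the currently running containers (no -a flag)
  running=$(docker container list -q)
  if [ -n "$running" ]; then
    docker container stop $running

    # ... the backup jobs would run here ...

    docker container start $running
  fi
fi
```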

 

On 1/2/2022 at 5:36 PM, mgutt said:

The old backups will be preserved, but on the next backup hardlinking (the space saving) will not work. If you still want that, you need to move the files of your most recent backup into the parent dir as follows:

mv /mnt/diskX/backup/daten/20220102_044001/daten/* /mnt/diskX/backup/daten/20220102_044001

 

@toasti

Look above. You need to move the last backup one dir up, so the new version finds the last backup.
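One detail to watch out for: a plain `mv …/*` does not move hidden files (dotfiles). A small helper that moves those as well could look like this (a sketch; `daten` is the example share name from the post above):

```shell
# move the contents of the last backup's "daten" subdir one level up,
# including hidden files, then remove the emptied subdir
move_up() {
  local last="$1"    # e.g. /mnt/diskX/backup/daten/20220102_044001
  shopt -s dotglob   # make * match hidden files too
  mv "$last/daten/"* "$last/"
  shopt -u dotglob
  rmdir "$last/daten"
}

# example call (path is the one from the post above):
# move_up /mnt/diskX/backup/daten/20220102_044001
```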

10 hours ago, mgutt said:

@toasti

Look above. You need to move the last backup one dir up, so the new version finds the last backup.

OK, thanks, but would it not be possible to change the script instead?

I don't understand why it is not 100% compatible with the old version - was that not possible?

1 hour ago, toasti said:

OK, thanks but is it not possible to change the script instead?

Maybe users were confused by the additional subdir with the same name as the target dir.

 

I'll check if the script could rename the last backup automatically to enhance backwards compatibility.


Yes, that's the case, but it was nothing that bothered me. I just accepted it.

It is more confusing, and not ideal, that we have to tweak the backups when upgrading to the new version.

 

I will now try the "fix" and hope it will work.

 

But what about the old backup states in the chain? Will the cleanup still work, given that the backups before the last one were not moved to the new location?

 

Edit:

I moved the latest backup as suggested and then wanted to check whether the script does the cleanup correctly, but the log is not complete.

I back up 2 folders in this job, and I can only see the details of the second folder's backup, for example which backup states were deleted and which ones were preserved.

Edited by toasti
  • 2 weeks later...

Hello all,

thank you for the script @mgutt!
 

I have a question, because the backup keeps producing an error for one folder. It is a Docker container in appdata.

The following error message appears:

Error: >f+++++++++ speedtest/www/vendor/symfony/http-kernel/Exception/PreconditionFailedHttpException.php
>f+++++++++ speedtest/www/vendor/symfony/process/Exception/ProcessFailedException.php
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1330) [sender 3.2.3] (23)!

 

Is it possible to exclude certain folders from the backup? I could do without this container in the backup.


Hello,

I have been using the script for a while now and am finally getting around to setting up the excludes. What would be the best way to delete the directories I want to exclude from all of the previous backups? Would an rm -rf with a wildcard at the date directory level do it?

Edited by epilot5280
16 hours ago, epilot5280 said:

Would an rm -rf with a wildcard at the date directory level do it?

Yes, if the subdir is at the same level in each backup. But if you want to delete every dir named "temp" at different levels, this won't work. In that case I would use find, as mentioned here:

https://stackoverflow.com/a/13032747/318765

 

Instead of -f, use -i to test the command (-i means "interactive": you have to confirm each deletion).
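Applied to these backups, that could look like the following (a demo on a scratch tree; in practice `backup_root` would be something like /mnt/diskX/backup, and "temp" is just an example name):

```shell
# build a scratch tree that mimics two backup states with "temp" dirs
# at different depths
backup_root=$(mktemp -d)
mkdir -p "$backup_root/20220101/appdata/temp" "$backup_root/20220102/plex/cache/temp"

# delete every directory named "temp", regardless of its depth;
# -depth processes children first, so find never descends into deleted dirs
find "$backup_root" -depth -type d -name "temp" -exec rm -rf {} +
```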

On 3/1/2022 at 11:31 AM, LyDjane said:

Hello all,

thank you for the script @mgutt!
 

I have a question, because the backup keeps producing an error for one folder. It is a Docker container in appdata.

The following error message appears:

Error: >f+++++++++ speedtest/www/vendor/symfony/http-kernel/Exception/PreconditionFailedHttpException.php
>f+++++++++ speedtest/www/vendor/symfony/process/Exception/ProcessFailedException.php
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1330) [sender 3.2.3] (23)!

 

Is it possible to exclude certain folders from the backup? I could do without this container in the backup.

Can nobody help here? :(

47 minutes ago, LyDjane said:

Can nobody help here? :(

The way I back up the appdata folder is with the backup/restore appdata plugin, storing the result in a backup share along with the Plex backup and various other backup files. I then back up that backup share, instead of the appdata folder, with the rsync script.

Edited by bclinton
On 3/3/2022 at 1:38 PM, bclinton said:

The way I back up the appdata folder is with the backup/restore appdata plugin, storing the result in a backup share along with the Plex backup and various other backup files. I then back up that backup share, instead of the appdata folder, with the rsync script.

That's what I do as well.
I would also like to have the full and incremental backup running, though.

On 3/3/2022 at 1:38 PM, bclinton said:

I then back up the backup share instead of the appdata folder with the rsync script.

The appdata Backup plugin creates tar files which differ on every backup. So you don't benefit from hardlinks?!

28 minutes ago, mgutt said:

The appdata Backup plugin creates tar files which differ on every backup. So you don't benefit from hardlinks?!

Yep....but mine are only about 5 gigs and I have room to spare on my backup drive. 

 

23 hours ago, bclinton said:

Yep....but mine are only about 5 gigs and I have room to spare on my backup drive. 

With PhotoPrism and other Docker containers, my appdata is around 400 GB.

 

So it would be nice if rsync could exclude folders. :)

On 3/6/2022 at 6:54 PM, mgutt said:

Copy this line:

--exclude="Cache/" 

 

And as an example modify it as follows:

--exclude="dir/this-subdir-will-be-ignored/"

 

 

Like this?

  "/mnt/user/appdata --exclude=/mnt/user/appdata/plex"

 

And if I want to exclude multiple folders?

Thanks!!!
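For what it's worth, rsync itself accepts multiple `--exclude` options, and the patterns are relative to the source directory rather than absolute paths. A small demo with scratch directories (the folder names plex and photoprism are just examples):

```shell
# build a scratch source with two folders that should be skipped
src=$(mktemp -d); dst=$(mktemp -d)
mkdir -p "$src/plex" "$src/photoprism/cache" "$src/mariadb"
touch "$src/plex/db" "$src/photoprism/cache/thumb" "$src/mariadb/data"

# exclude patterns are relative to the source, so "plex/" instead of
# the full /mnt/user/appdata/plex path
rsync -a \
  --exclude="plex/" \
  --exclude="photoprism/cache/" \
  "$src/" "$dst/"
```

Multiple folders are excluded simply by repeating the `--exclude` option.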

 

Edited by LyDjane
