rsync Incremental Backup



Wow, many thanks for the complete solution!

I set it up, but I am not sure if it really works on my server, because tcpdump is not recognized as a command.

I presume I have to install this first, right?


Popular Posts

The following script creates incremental backups of a share to a target path using rsync. As an example, the default setting creates backups of the shares "Music" and "Photos" to the path "/mnt/

Good point. I will upgrade the script so it supports multiple sources and complete paths (not only a share name).

Ah ok. You passed multiple source paths, so it uses one log for all of them. This bug will be solved in the next version. I already fixed it, but I'm still testing the next release.


Another question... how did you solve giving your backup user access to your Windows profile and things like Desktop, Documents, etc.?

Did you add the user to the admin group? Add NTFS read permissions for the profile folder?

Neither seems ideal...


Add a new local user named "backup" with a strong password through the Windows settings. After that, right-click on "Documents" > "Share with...", select the new user from the drop-down and set it to "read only". Repeat this step for all dirs you want to share. Do not share the root user profile dir: it contains the hidden "AppData" dir and a massive number of other files which aren't needed.

 

After that you can mount this "SMB server" through the Unassigned Devices plugin.


That is clear ;-) I did the same, but the problem is that other users have no NTFS permissions for your profile folder.

I always use advanced sharing; maybe the easier sharing option also sets NTFS permissions. Not sure, I haven't used it in years.

 

EDIT: Yes, tested it: NTFS permissions are automatically set with the easier sharing option.

I will create a group, add the Unraid backup user to it, and grant the permissions to the group.

That way I stay flexible if I want to change anything for the user accounts...

It's always best practice to use groups.

 

By the way, I created a share for the complete C$ because I also want to back up other folders in my profile, including some program folders in AppData. In the script I just appended the path and everything works fine...

Edited by toasti
18 minutes ago, toasti said:

By the way, I created a share for the complete C$ because I also want to back up other folders in my profile, including some program folders in AppData. In the script I just appended the path and everything works fine...

Ok, but it will take much more time, and the logs will contain several permission errors because of hidden redirects (junctions). But you will find them in the backup log ;)

Link to post

No problem, I will check this as soon as I have added the first backups within AppData.

 

The way this works now is really brilliant.

 

I'm just thinking about how to do a second copy which I keep in the office and refresh every week with a new backup.

Probably the easiest way is to create two more scripts that copy the data to a second and third external drive.

Same config, just another target path, started manually.

 

 

Edited by toasti

Actually, the latest versions of UD relinquished spin-down control to unRAID, so disks in UD spin down according to the default unRAID spin-down timer.

 

Maybe you just have to wait longer, or maybe you're encountering the same problem I did: the USB drive returns an error when spin-down is called, so even though it's spun down, unRAID thinks it's active.

 

I have a fix but I'm not confident enough in it to really recommend it:

 

 


I found a bug. My PC is mounted through SMB and the backup runs every 5 hours. Now I found out that my sleeping PC returned a successful backup, although rsync copied only the root dir of the SMB mount and returned no other errors:

 

Create backup of /mnt/remotes/DESKTOP-I0HHMD9_Users
Backup path has been set to /mnt/user/Backup/marc/remotes/DESKTOP-I0HHMD9_Users
Create incremental backup 20210321_000001 by using last backup 20210320_200001
sending incremental file list
DESKTOP-I0HHMD9_Users/

Number of files: 1 (dir: 1)
Number of created files: 0
Number of deleted files: 0
Number of regular files transferred: 0
Total file size: 0 bytes
Total transferred file size: 0 bytes
Literal data: 0 bytes
Matched data: 0 bytes
File list size: 0
File list generation time: 0.001 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 75
Total bytes received: 27

sent 75 bytes  received 27 bytes  204.00 bytes/sec
total size is 0  speedup is 0.00

 

 

The backup dir is empty:


 

The last backup exists:


 

This bug is not dangerous, because after powering on my PC, the new backup is created as usual:


 

But it's not incremental anymore, as it was compared against the last (empty) backup.

 

I don't know why the SMB mount returns an empty result. Shouldn't it return "host is offline"?! I need to check this further while my PC is sleeping.

 

Maybe it's best to mark backups as failed if they contain only one empty dir.
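One way to implement that check: count the entries below the snapshot and rename it if nothing but the top-level dir came across. A sketch with a hypothetical helper name, not the script's actual code:

```shell
#!/bin/bash
# Sketch: flag a snapshot as failed if it contains at most one (empty) dir.
# mark_if_empty is a hypothetical helper, not part of the actual script.
mark_if_empty() {
    local backup_path="$1"
    local entry_count
    # count all files and dirs below the snapshot root
    entry_count=$(find "$backup_path" -mindepth 1 | wc -l)
    if [ "$entry_count" -le 1 ]; then
        mv "$backup_path" "${backup_path}_failed"
    fi
}
```

A later run would then skip "_failed" snapshots when picking the incremental reference, so the next backup stays incremental.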

 

 


Just to be sure: I need a BTRFS-formatted external hard disk for this to work correctly (hard links, inodes, etc.), right?

Yesterday I had the idea to map my external disk (connected to my W10 machine, NTFS-formatted and BitLocker-encrypted) to Unraid and use your script to make a backup copy which I want to store in my office.
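Since the increments depend on hard links, a quick way to verify that a candidate target filesystem supports them is to create one and inspect the link count. A sketch (hypothetical helper; GNU `stat` assumed):

```shell
#!/bin/bash
# Sketch: check whether the filesystem behind a dir supports hard links.
check_hardlinks() {
    local dir="$1"
    echo test > "$dir/.hl_test_a"
    # ln fails on filesystems without hard link support (e.g. FAT)
    if ! ln "$dir/.hl_test_a" "$dir/.hl_test_b" 2>/dev/null; then
        rm -f "$dir/.hl_test_a"
        return 1
    fi
    local links
    links=$(stat -c %h "$dir/.hl_test_a")   # link count should now be 2
    rm -f "$dir/.hl_test_a" "$dir/.hl_test_b"
    [ "$links" -eq 2 ]
}
```

Usage would be something like `check_hardlinks /mnt/disks/backup && echo "hard links OK"` against the mounted target.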

On 1/3/2021 at 2:40 AM, mgutt said:

Ah ok. You passed multiple source paths, so it uses one log for all of them. This bug will be solved in the next version. I already fixed it, but I'm still testing the next release ;)

Hello,

 

I used the script today in v0.6 and I still have the problem with the log files when using multiple source paths.

Same result as in v0.3:

- the first log file is filled with the logs of all source path backups

- only the last log file is correct, containing just the log of the last source path backup.

 

Fred.

1 hour ago, hf00 said:

i still have the problem with the log files when using multiple source paths :

Yes, this bug is still present. I tried to find out how to stop forwarding the script output to a file, but wasn't successful. I think I will try to forward the output to a dedicated function. Maybe I can replace only the filename so "exec" isn't called multiple times.
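One possible shape for that approach (a sketch only, names and paths are hypothetical): wrap each source's backup in a function and redirect that function's whole output, instead of calling `exec` once for the entire script:

```shell
#!/bin/bash
# Sketch: a separate log per source path without repeated "exec" calls.
# backup_source and the example paths are hypothetical.
backup_source() {
    local src="$1" log="$2"
    {
        echo "Create backup of $src"
        # ... the rsync call for this source would go here ...
    } > "$log" 2>&1
}

# each call gets its own log file:
# backup_source "/mnt/user/Music"  "/mnt/user/logs/Music.log"
# backup_source "/mnt/user/Photos" "/mnt/user/logs/Photos.log"
```

The brace group redirects everything the function writes (stdout and stderr) into the per-source log, so nothing leaks into the previous source's file.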

  • 2 weeks later...

Hi all... I have noticed that I am unable to unmount the USB disk even after the backup script finishes. I run it in background mode and this apparently affects the unmounting of the drive. I click unmount on the UD entry, it pauses a split second and stays mounted. Looking at the drive settings below, it shows the backup script. Is there a command I can run to stop the script manually?
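Assuming the script (or a child rsync) is still running in the background and keeping the disk busy, something along these lines should find and stop it; the helper name, script name, and mount path are hypothetical:

```shell
#!/bin/bash
# Sketch: stop all processes whose command line matches a pattern,
# so the disk is no longer busy and can be unmounted.
stop_script() {
    local pattern="$1"
    pkill -f "$pattern" 2>/dev/null
}

# example (script name is an assumption):
# stop_script "backup.sh" && sleep 2
# then check what still holds the mount point open, e.g.:
# lsof /mnt/disks/backup
```

If `lsof` still lists processes on the mount point afterwards, those are what block the unmount.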

 

 


  • 2 weeks later...

Hi,

I tried the script and it works wonderfully.

One thing I would like to see added is the possibility to exclude folders.

For example ".Recycle.Bin", or other specific folders like home videos in user shares.
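Until that lands in the script, the underlying rsync call already supports this via `--exclude`. A hedged sketch (the wrapper name and patterns are examples, not the script's real code):

```shell
#!/bin/bash
# Sketch: an rsync wrapper that skips unwanted folders via --exclude.
backup_with_excludes() {
    local src="$1" dst="$2"
    rsync -a \
        --exclude=".Recycle.Bin/" \
        --exclude="Home Videos/" \
        "$src/" "$dst/"
}
```

In rsync's pattern rules, a trailing slash restricts the pattern to directories, and a leading slash would anchor it to the transfer root instead of matching anywhere in the tree.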

On 3/22/2021 at 11:19 AM, mgutt said:

I found a bug. My PC is mounted through SMB and the backup runs every 5 hours. Now I found out that my sleeping PC returned a successful backup, although rsync copied only the root dir of the SMB mount and returned no other errors:

 





Create backup of /mnt/remotes/DESKTOP-I0HHMD9_Users
Backup path has been set to /mnt/user/Backup/marc/remotes/DESKTOP-I0HHMD9_Users
Create incremental backup 20210321_000001 by using last backup 20210320_200001
sending incremental file list
DESKTOP-I0HHMD9_Users/

Number of files: 1 (dir: 1)
Number of created files: 0
Number of deleted files: 0
Number of regular files transferred: 0
Total file size: 0 bytes
Total transferred file size: 0 bytes
Literal data: 0 bytes
Matched data: 0 bytes
File list size: 0
File list generation time: 0.001 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 75
Total bytes received: 27

sent 75 bytes  received 27 bytes  204.00 bytes/sec
total size is 0  speedup is 0.00

 

 

 

 

Where do I get that summary? I doubt the backup went through... it is a very large folder (~22 TB).

The log I got looks like this:

 

Create backup of /mnt/remotes/192.168.1.3_Pics
Backup path has been set to /mnt/HDD/Pics/remotes/192.168.1.3_Pics
Create full backup 20210508_062306
sending incremental file list
192.168.1.3_Pics/
192.168.1.3_Pics/Treesize.xml
192.168.1.3_Pics/Untitled.dfd
192.168.1.3_Pics/duplicates.xml
192.168.1.3_Pics/-- new entries !/
192.168.1.3_Pics/-- new entries !/2021-07.05 001.jpg
192.168.1.3_Pics/-- new entries !/2021-07.05 002.jpg

....

Roughly 1 TB was copied over, then it suddenly stopped.

Did the script stop somewhere without an error entry?

**

Update: running the script makes the browser tab crash, and after that the script is closed...

Where can I dig out the error?

 

 

 

 

 

 

Edited by Dtrain
58 minutes ago, Dtrain said:

Update: running the script makes the browser tab crash, and after that the script is closed...

Where can I dig out the error?

 

Do not execute the script through the browser, only in the background or through cron. The user scripts plugin is not memory-friendly enough to display a huge amount of logs in the popup.
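For reference, two ways to start it outside the browser; the script path below is an example and depends on where the user scripts plugin stores your script:

```shell
# Run detached in the background (survives closing the terminal):
#   nohup bash /boot/config/plugins/user.scripts/scripts/backup/script > /var/log/backup.log 2>&1 &
#
# Or schedule it via cron, e.g. every 5 hours (crontab entry):
#   0 */5 * * * bash /boot/config/plugins/user.scripts/scripts/backup/script > /var/log/backup.log 2>&1
```

Either way, the output goes to a log file instead of the plugin's popup, so the browser never has to render it.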

7 hours ago, johnwhicker said:

allows ssh / scp / sftp?

I'm working on that. Please do not modify my script: someone tried that, and because of the final cleanup through "rm -r" all his data was lost. I want to change everything to rsync. After that, SSH / rsync daemon will work, too.

5 hours ago, mgutt said:

I'm working on that. Please do not modify my script: someone tried that, and because of the final cleanup through "rm -r" all his data was lost. I want to change everything to rsync. After that, SSH / rsync daemon will work, too.

Thanks much Sir

On 5/15/2021 at 10:13 AM, mgutt said:

I'm working on that. Please do not modify my script: someone tried that, and because of the final cleanup through "rm -r" all his data was lost. I want to change everything to rsync. After that, SSH / rsync daemon will work, too.

 

Do you mean you want to change everything to rclone?

Is it possible to make backups encrypted to my cloud storage?

 

 

Thank you very much for your great Work!!!

Edited by schoppehermann
2 hours ago, schoppehermann said:

Do you mean, you want to change everything to rclone?

Rclone? I wrote rsync ;)

 

2 hours ago, schoppehermann said:

Is it possible to make Backups encrypted to my Cloud Storage?

Not at the moment.

