rsync Incremental Backup



You could try a different method to mount the target. The best option would be to enable the rsync daemon.
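A minimal daemon setup on the target could look roughly like this (module name, path and hostname are placeholders, just a sketch, not a complete or hardened config):

# /etc/rsyncd.conf on the target
[backup]
    path = /mnt/user/Backup
    read only = no
    uid = nobody
    gid = users

# start the daemon on the target
rsync --daemon

# the source side then addresses the module directly instead of a mounted path, e.g.
rsync -av --stats /mnt/user/appdata rsync://tower/backup/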

 

But it would be nice if you could test raising "rename_timeout" to, for example, "1000". Maybe we can find a proper timeout for your setup.


Perhaps it's something completely different. I run that script through User Scripts. Is there a difference between calling a script through User Scripts and from the command line?

 

***EDIT*** The script didn't work from the command line either. Forget about it. I don't want to butt into this thread too much. My Linux knowledge is somewhere around 0.01%, so it's OK if I can't get it to work. I've lived with that for nearly three decades.

 


Hi,

I tried to back up one share (initial backup). It did not finish. I ran the script, but it was not running in the background, so I guess it stopped because of that. Maybe I touched something when I went back to the scripts page. I will try again tomorrow.
When it ends, does the pop-up report that the backup finished?
Also, I noticed that while it was running I could not change the share permissions (read/write) of the other shares. Is this normal?
Regards


1 hour ago, luca2 said:

so I guess it stopped because of that

No need to guess. Open the logs, they will update live.

 

1 hour ago, luca2 said:

When it ends, does the pop-up report that the backup finished?

By default only on failures. If you want all notifications, set notification to "1".

 

1 hour ago, luca2 said:

I could not change the share permissions (read/write) of the other shares. Is this normal?

That does not make sense. rsync only copies files and does not use any Unraid or SMB processes. Maybe your system was overloaded?!


I will collect some errors with code 23 so we can decide later which ones are only soft errors. This one was critical, as the complete folder had no permissions:

Script Starting Nov 10, 2020 11:29.06

Full logs for this script are available at /tmp/user.scripts/tmpScripts/backup__w10_desktop/log.txt

Create backup of /mnt/disks/DESKTOP-TOG_Downloads
Backup path has been set to /mnt/user/Backup/disks/DESKTOP-TOG_Downloads
Create full backup 20201110_112906
sending incremental file list
rsync: readdir("/mnt/disks/DESKTOP-TOG_Downloads"): Permission denied (13)
DESKTOP-TOG_Downloads/

Number of files: 1 (dir: 1)
Number of created files: 1 (dir: 1)
Number of deleted files: 0
Number of regular files transferred: 0
Total file size: 0 bytes
Total transferred file size: 0 bytes
Literal data: 0 bytes
Matched data: 0 bytes
File list size: 0
File list generation time: 0.001 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 93
Total bytes received: 20

sent 93 bytes received 20 bytes 226.00 bytes/sec
total size is 0 speedup is 0.00
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1189) [sender=3.1.3]
Script Finished Nov 10, 2020 11:29.15

And this one is only a soft error, as it skipped just two files because of permission problems while the rest was transferred successfully:

Create backup of /mnt/disks/DESKTOP-TOG_Desktop
Backup path has been set to /mnt/user/Backup/disks/DESKTOP-TOG_Desktop
Create full backup 20201110_114046
sending incremental file list
DESKTOP-TOG_Desktop/
DESKTOP-TOG_Desktop/Fortnite.url
...
rsync: send_files failed to open "/mnt/disks/DESKTOP-TOG_Desktop/Youtube/MAGIX Video deluxe Premium.lnk": Permission denied (13)
DESKTOP-TOG_Desktop/Youtube/MKVToolNix GUI.lnk
...
DESKTOP-TOG_Desktop/Youtube/Videos.lnk
rsync: send_files failed to open "/mnt/disks/DESKTOP-TOG_Desktop/Youtube/ffmpegyag.lnk": Permission denied (13)

Number of files: 12 (reg: 10, dir: 2)
Number of created files: 12 (reg: 10, dir: 2)
Number of deleted files: 0
Number of regular files transferred: 10
Total file size: 10,031 bytes
Total transferred file size: 10,031 bytes
Literal data: 7,569 bytes
Matched data: 0 bytes
File list size: 0
File list generation time: 0.001 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 8,385
Total bytes received: 222

sent 8,385 bytes received 222 bytes 17,214.00 bytes/sec
total size is 10,031 speedup is 1.17
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1189) [sender=3.1.3]
Preserve failed backup: .20201110_114046

And again a soft error (3 of 23 failed):

Create backup of /mnt/disks/DESKTOP-TOG_Documents
Backup path has been set to /mnt/user/Backup/disks/DESKTOP-TOG_Documents
Create full backup 20201110_114046
sending incremental file list
rsync: readdir("/mnt/disks/DESKTOP-TOG_Documents/Eigene Bilder"): Permission denied (13)
rsync: readdir("/mnt/disks/DESKTOP-TOG_Documents/Eigene Musik"): Permission denied (13)
rsync: readdir("/mnt/disks/DESKTOP-TOG_Documents/Eigene Videos"): Permission denied (13)
DESKTOP-TOG_Documents/
DESKTOP-TOG_Documents/desktop.ini
DESKTOP-TOG_Documents/Blackmagic Design/
...
DESKTOP-TOG_Documents/Movie Studio 17.0 Platinum Projekte/

Number of files: 43 (reg: 23, dir: 20)
Number of created files: 43 (reg: 23, dir: 20)
Number of deleted files: 0
Number of regular files transferred: 23
Total file size: 8,724,190,138 bytes
Total transferred file size: 8,724,190,138 bytes
Literal data: 8,724,190,138 bytes
Matched data: 0 bytes
File list size: 0
File list generation time: 0.001 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 8,726,323,061
Total bytes received: 553

sent 8,726,323,061 bytes received 553 bytes 108,401,535.58 bytes/sec
total size is 8,724,190,138 speedup is 1.00
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1189) [sender=3.1.3]
Preserve failed backup: .20201110_114046

Soft error because of a timeout (the client was shut down):

Create backup of /mnt/disks/DESKTOP-TOG_Videos
Backup path has been set to /mnt/user/Backup/disks/DESKTOP-TOG_Videos
Create incremental backup 20201110_124033 by using last backup 20201110_123139
sending incremental file list
rsync: link_stat "/mnt/disks/DESKTOP-TOG_Videos" failed: Host is down (112)

Number of files: 0
Number of created files: 0
Number of deleted files: 0
Number of regular files transferred: 0
Total file size: 0 bytes
Total transferred file size: 0 bytes
Literal data: 0 bytes
Matched data: 0 bytes
File list size: 0
File list generation time: 20.369 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 20
Total bytes received: 19

sent 20 bytes received 19 bytes 1.90 bytes/sec
total size is 0 speedup is 0.00
rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1189) [sender=3.1.3]
Preserve failed backup: .20201110_124033
Preserve daily backup: 20201110_123139
Keep multiple backups per day: 20201110_114208
Script Finished Nov 10, 2020 12:40.53

Full logs for this script are available at /tmp/user.scripts/tmpScripts/backup__w10_desktop/log.txt
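If a specific code 23 case turns out to be only a soft error, it could be skipped the same way as code 24 by extending the script's skip_errors setting (just an illustration of the mechanism, not a recommendation yet):

skip_errors=(0 23 24) # 0 = success, 23 = partial transfer due to error, 24 = files vanished during transfer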

 

On 11/6/2020 at 11:55 PM, mgutt said:

No need to guess. Open the logs, they will update live.

 

By default only on failures. If you want all notifications, set notification to "1".

 

That does not make sense. rsync only copies files and does not use any Unraid or SMB processes. Maybe your system was overloaded?!

I could spend some time testing this script. I tried to back up 383 GB and still ran into some trouble.

 

Regarding the interrupted backup process, I observed that it consumes excessive RAM. I first tried with a Windows 10 x64 laptop with only 4 GB of RAM; the script stopped after a couple of minutes. Then I tried with a Linux VM running 6 cores and 9 GB of RAM + 9 GB of swap space. I was monitoring RAM consumption, and it keeps eating up RAM until it consumes all physical RAM and also some swap space. At some point it finds a balance, consuming only 15% RAM and 19% swap space. Anyway, after running for some time (340 GB of the 380 GB backed up), my VM froze for a moment and the script ended without copying the full 380 GB. I looked at the log file in the new folder created by this script and there is not a single error reported. Maybe there is a different log we could look at?

 

Also, I can confirm I cannot edit any share while the script is running.

 

Let me know which logs you need to look at.

 

3 hours ago, luca2 said:

Regarding the interrupted backup process, I observed that it consumes excessive RAM. I first tried with a Windows 10 x64 laptop with only 4 GB of RAM; the script stopped after a couple of minutes. Then I tried with a Linux VM running 6 cores and 9 GB of RAM + 9 GB of swap space. I was monitoring RAM consumption, and it keeps eating up RAM until it consumes all physical RAM and also some swap space.

You executed my bash script in Windows? How? ^^

 

And why do you use a Linux VM to execute the script?!

 

And how did you check RAM usage (to be sure it's related to rsync)?

 

I would use this command:

ps aux --sort -rss | head -20 | cut -c 1-120

Which returns:

USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
nobody   18752  0.3  1.3 2351916 211688 ?      Sl   15:12   0:12 /usr/lib/plexmediaserver/Plex Media Server
root     19284  5.1  0.8 148784 140808 ?       S    16:07   0:45 rsync -av --stats /mnt/user/appdata /mnt/user/backup/Sh
root      3264  1.6  0.4 916696 71784 ?        Ssl  Oct30 399:22 /usr/local/sbin/shfs /mnt/user -disks 127 2048000000 -o
root      3418  0.1  0.4 1048860 71084 ?       Sl   Oct30  32:58 /usr/bin/dockerd -p /var/run/dockerd.pid --log-opt max-
nobody   18770  0.2  0.3 1708836 59548 ?       SNl  15:12   0:10 Plex Plug-in [com.plexapp.system] /usr/lib/plexmediaser
root      3343  0.0  0.3  80072 49920 ?        S    Oct30   0:15 /usr/sbin/winbindd -D
nobody   19393  0.0  0.2 888964 43176 ?        Sl   15:13   0:03 Plex Plug-in [com.plexapp.agents.thetvdb] /usr/lib/plex
nobody   19306  0.0  0.2 886884 40912 ?        Sl   15:12   0:03 Plex Plug-in [com.plexapp.agents.themoviedb] /usr/lib/p
root      3434  0.1  0.2 694900 36548 ?        Ssl  Oct30  25:06 containerd --config /var/run/docker/containerd/containe
root      5479  0.0  0.2 152380 35316 ?        Sl   16:21   0:00 docker stats --no-stream --format={{.ID}};{{.CPUPerc}};
nobody   19301  0.0  0.2 878456 34612 ?        Sl   15:12   0:02 Plex Plug-in [tv.plex.agents.movie] /usr/lib/plexmedias
nobody   19250  0.0  0.2 879588 33864 ?        Sl   15:12   0:02 Plex Plug-in [tv.plex.agents.music] /usr/lib/plexmedias
nobody   19084  0.0  0.2 878840 32820 ?        Sl   15:12   0:02 Plex Plug-in [org.musicbrainz.agents.music] /usr/lib/pl
root     19049  0.0  0.1 104624 26512 ?        SL   16:06   0:00 /usr/bin/php /usr/local/emhttp/plugins/user.scripts/sta
root      3853  0.0  0.1 1269128 20596 ?       Sl   Oct30   0:07 /usr/sbin/libvirtd -d -l -f /etc/libvirt/libvirtd.conf 
nobody   18868  0.0  0.1 367924 18048 ?        Sl   15:12   0:00 /usr/lib/plexmediaserver/Plex Tuner Service /usr/lib/pl
root      3326  0.0  0.0  51740 15300 ?        Ss   Oct30   0:02 /usr/sbin/smbd -D
root     19286  7.6  0.0 146496 13828 ?        S    16:07   1:07 rsync -av --stats /mnt/user/appdata /mnt/user/backup/Sh
root     19285  4.0  0.0  83872 13012 ?        S    16:07   0:36 rsync -av --stats /mnt/user/appdata /mnt/user/backup/Sh

 

As you can see, one rsync process has a CPU load of 5.1% and a RAM usage of 0.8% (of 16 GB).

 

If you see a higher RAM usage in the dashboard, it could be related to Unraid itself, as it runs from RAM. To check all those paths, execute this:

df -h | grep -E "tmpfs|Filesystem" && echo "Size    Path" && du -hsx --exclude=/{proc,sys,dev} /* | grep -v '^0'

It returns for me (while rsync runs):

Filesystem        Size  Used Avail Use% Mounted on
tmpfs              32M  504K   32M   2% /run
devtmpfs          7.7G     0  7.7G   0% /dev
tmpfs             7.7G     0  7.7G   0% /dev/shm
tmpfs             128M  616K  128M   1% /var/log
tmpfs             1.0M     0  1.0M   0% /mnt/disks
Size    Path
11M     /bin
258M    /boot
12M     /etc
6.6M    /lib
22M     /lib64
16K     /root
504K    /run
20M     /sbin
46M     /tmp
584M    /usr
4.5M    /var

As you can see, there is no really high RAM usage on my system, but maybe you will find something unusual on yours. For example, "/tmp" is used for the script's logs.

 

In my experience, Unraid has really bad user share access performance. You could try to bypass this by using a direct disk path as your backup path. This means instead of "/mnt/user/Backup" use "/mnt/disk3/Backup".
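In the script's settings that would just be (disk number is only an example):

backup_path="/mnt/disk3/Backup" # direct disk path instead of the user share /mnt/user/Backup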

On 11/15/2020 at 3:51 PM, mgutt said:

You executed my bash script in Windows? How? ^^

 

And why do you use a Linux VM to execute the script?!

On Windows/Linux I executed the script from the browser connection to Unraid, using Andrew Zawadzki's "User Scripts" plugin.

 

On 11/15/2020 at 3:51 PM, mgutt said:

And how did you check RAM usage (to be sure it's related to rsync)?

I checked the RAM usage with Windows' Task Manager and Linux's equivalent.

 

Now I ran a backup again, running the script as I did before, but this time I selected the option "run in background" and added 2 more cores to Unraid. I checked the CPU and RAM usage with the commands you provided. As you posted, the CPU usage is around 12%-18% and the RAM usage 0.2%. The backup ran without any problems. Great!

 

I guess if I start the script using Andrew Zawadzki's "User Scripts" plugin but do not select "run in background", it eats up all my RAM. Not Unraid's RAM, but the RAM of the PC I use to connect to Unraid's GUI via browser.

 

This is what I get when the script finishes:

Number of files: 57,852 (reg: 52,321, dir: 5,531)
Number of created files: 57,852 (reg: 52,321, dir: 5,531)
Number of deleted files: 0
Number of regular files transferred: 52,321
Total file size: 169,921,241,556 bytes
Total transferred file size: 169,921,241,556 bytes
Literal data: 169,921,241,556 bytes
Matched data: 0 bytes
File list size: 2,883,435
File list generation time: 0.001 seconds
File list transfer time: 0.000 seconds
Total bytes sent: 169,966,133,523
Total bytes received: 1,038,847

sent 169,966,133,523 bytes received 1,038,847 bytes 72,188,223.56 bytes/sec
total size is 169,921,241,556 speedup is 1.00
Try #1 to make backup visible
Preserve daily backup: 20201117_114137
Script Finished Nov 17, 2020 12:20.52

Full logs for this script are available at /tmp/user.scripts/tmpScripts/backuptoextHDD_data/log.txt

I don't know if the output is what I should get. I selected "notification=1" in the script. But as you can see, the log contains "Try #1 to make backup visible"... what does it mean?

 

Anyway, good news: everything went smoothly and with no errors. Thanks for your work.

 

 

3 hours ago, luca2 said:

I guess if I start the script using Andrew Zawadzki's "User Scripts" plugin but do not select "run in background", it eats up all my RAM. Not Unraid's RAM, but the RAM of the PC I use to connect to Unraid's GUI via browser

OK, now I understand what you did. Yes, this is related to the User Scripts plugin. The browser is eating your RAM while displaying the (huge) log output. This is something you should report to Andrew. I think he needs to limit the output to a specific number of lines to avoid this.
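As a workaround you could follow the log from a terminal instead of keeping the browser page open, e.g. (using the log path from your post above):

tail -f /tmp/user.scripts/tmpScripts/backuptoextHDD_data/log.txt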

 

3 hours ago, luca2 said:

I don't know if the output is what I should get. I selected "notification=1" in the script. But as you can see, the log contains "Try #1 to make backup visible"... what does it mean?

Everything is fine. In the new version this text will be clearer. Explanation: a backup is created in a hidden folder like ".hidden" and is only renamed if the backup was successful. That way subsequent backups ignore failed ones, and the user can see directly which backups are trustworthy and which are not.
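A simplified sketch of the pattern (illustrative only; the real script additionally checks skip_errors, retries the rename and keeps a limited number of failed backups):

new_backup="$(date +%Y%m%d_%H%M%S)"
mkdir -p "${backup_path}/.${new_backup}"                     # write into a hidden folder
rsync -av --stats "${source_path}" "${backup_path}/.${new_backup}"
if [[ $? -eq 0 ]]; then
    # only a successful backup is renamed and becomes visible
    mv "${backup_path}/.${new_backup}" "${backup_path}/${new_backup}"
fi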

 

In this example the last two backup folders contain failed backups:

[Screenshot: list of backup folders, the last two are hidden "."-prefixed folders from failed backups]

 


Hello @mgutt,

 

Thanks for this script, I use it successfully to back up multiple shares to an Unassigned Device.

 

Now I don't understand how the clean-up process initializes $year, $month and $day to the current date for correct backup deletion.

last_year=$year
last_month=$month
last_day=$day
year=${backup:0:4}
month=${backup:4:2}
day=${backup:6:2}
if [ "$last_day" = "$day" ] && [ "$last_month" = "$month" ] && [ "$last_year" = "$year" ]; then
    echo "Keep multiple backups per day: $backup"
    continue
...

And excuse my poor scripting knowledge if my question is obvious. 😞

 

Thank you and happy new year.

Fred.

 

On 10/18/2020 at 3:14 PM, mgutt said:

ls -tA "${backup_path}/" | while read backup;

@hf00

As you can see, "ls" scans the backup path for all created backups. "t" sorts them by time and "A" let it return hidden folders as well.

 

Then "while" starts looping through all the found backup folders. And the part you quoted, extracts the day, month and year out of the backup folder name. "0:4" means for example the first four chars of the folder name which is the year.


OK, I understand.

 

One last question about the log files. My backup has 9 source paths and I noticed that:

- the log file from the first source path's backup ends up containing the logs of all nine backups

- the log file from the second source path ends up containing the logs from the second to the last backup

...

- only the log file from the last backup is OK, containing only its own data!

 

It's not blocking, but the logs become very long and not practical to read.

Is it possible to limit each log file to its own backup's data?

 

Thanks for the help.

Fred

I'm not sure what you mean by "old logs": the logs of my previous backups in November have the same problem, and they are not affected by today's backup.

 

It looks like every generated log stays open while the script is running. Then every "echo" instruction writes to all log files open at that time, in my case seven log files at the end.

 

Perhaps it's a problem on my Unraid. If not, how can we "close" a log file before generating a new one?

 

Fred.

6 hours ago, hf00 said:

It looks like every generated log stays open while the script is running

Ah, OK. You passed multiple source paths, and because of that it uses one log for all source paths. This bug will be solved in the next version. I already fixed it, but I'm still testing the next release ;)
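Until then, one plausible explanation and workaround (an untested sketch, not necessarily the fix of the next release): every "exec &> >(tee ...)" leaves the new tee writing into the previous tee's pipe, so the output chains back into all older logs. Saving the original stdout/stderr once and restoring them before each new redirect should avoid that:

exec 3>&1 4>&2    # before the loop: keep the original stdout/stderr
for source_path in "${source_paths[@]}"; do
    # ...
    exec 1>&3 2>&4                                             # detach from the previous log's tee
    exec &> >(tee "${backup_path}/.${new_backup}/backup.log")  # open only this backup's log
    # ...
done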


Yes, I use multiple sources, but it doesn't result in only one log being created.

 

There is one log file in each destination folder, but as I explained, the content grows with each loop:

23 hours ago, hf00 said:

My backup has 9 source paths and I noticed that:

- the log file from the first source path's backup ends up containing the logs of all nine backups

- the log file from the second source path ends up containing the logs from the second to the last backup

...

- only the log file from the last backup is OK, containing only its own data!

I'm probably not being very clear, but perhaps you can reproduce it. 😞

 


Hi Mgutt,

 

I'm really loving your script and think it might become really useful for my use case: backing up via rsync over SSH on port 8888 to an external server.

I'm getting close, but I do have some issues. The first is that the log file isn't uploaded to the server, but placed in the folder of the backup script (in my case /var/backups/backup_script.sh).

 

Using this script, rsync starts as expected but hasn't been able to finish yet due to a random broken pipe error... Any help would be greatly appreciated!

 

 

#!/bin/bash
# #####################################
# Script:      rsync Incremental Backup v0.3
# Description: Creates incremental backups and deletes outdated versions
# Author:      Marc Gutt
# 
# Changelog:
# 0.3
# - rsync returns summary
# - typo in notification corrected
# - skip some rsync errors (defaults are "0" = skip on success and "24" = skip if some files vanish from the source while transfer)
# - add timeout for backup renaming https://forums.unraid.net/topic/97958-rsync-incremental-backup/?tab=comments#comment-910188
# 0.2
# - use full path for source and destination
# - backup multiple paths
# - unraid notification on success is now optional
# 0.1
# - initial release
# 
# ######### Settings ##################
source_paths=(
    "/media/data"
)
backup_path="username@hostname:~/backup"
days=14 # preserve backups of the last X days
months=12 # preserve backups of the first day of the last X month
years=3 # preserve backups of the first january of the last X years
fails=3 # preserve the recent X failed backups
notification=0
skip_errors=(0 24) # https://linux.die.net/man/1/rsync#:~:text=Exit%20Values
rename_timeout=100
# #####################################
# 
# ######### Script ####################
# make script race condition safe
if [[ -d "/tmp/${0///}" ]] || ! mkdir "/tmp/${0///}"; then exit 1; fi; trap 'rmdir "/tmp/${0///}"' EXIT;
# check user settings
backup_root_path=$([[ "${backup_path: -1}" == "/" ]] && echo "${backup_path%?}" || echo "$backup_path")
# loop through all source paths
for source_path in "${source_paths[@]}"; do
    echo "Create backup of $source_path"
    backup_path="$backup_root_path"
    # check user settings
    source_path=$([[ "${source_path: -1}" == "/" ]] && echo "${source_path%?}" || echo "$source_path")
    # shorten the backup path
    if [[ $source_path == "/mnt/user"* ]]; then
        backup_path+="/Shares${source_path#'/mnt/user'}"
        echo "Backup path has been set to $backup_path"
    elif [[ $source_path == "/mnt"* ]]; then
        backup_path+="${source_path#'/mnt'}"
        echo "Backup path has been set to $backup_path"
    fi
    # new backup timestamp
    new_backup="$(date +%Y%m%d_%H%M%S)"
    # create directory tree as rsync is not able to do that (https://askubuntu.com/a/561239/227119)
    mkdir -p "${backup_path}/.${new_backup}"
    # create log file
    exec &> >(tee "${backup_path}/.${new_backup}/backup.log")
    # obtain most recent backup
    last_backup=$(ls -t "${backup_path}" | head -n1)
    # create incremental backup
    if [[ -n "${last_backup}" ]]; then
        echo "Create incremental backup ${new_backup} by using last backup ${last_backup}"
        rsync -av --stats --delete -e "ssh -p 8888 -i /home/username/.ssh/id_rsa" --link-dest="${backup_path}/${last_backup}" "${source_path}" "${backup_path}/.${new_backup}"
    else
        echo "Create full backup ${new_backup}"
        # create very first backup
        rsync -av --stats -e "ssh -p 8888 -i /home/username/.ssh/id_rsa" "${source_path}" "${backup_path}/.${new_backup}"
    fi
    rsync_status=$?
    job_name="$(dirname "$0")"
    job_name="$(basename "$job_name")"
    if [[ "${skip_errors[@]}" =~ "${rsync_status}" ]]; then
        if [ "$notification" = "1" ]; then
            /usr/local/emhttp/webGui/scripts/notify -i normal -s "Backup done." -d "Job $job_name:${backup_path}/${new_backup} successfully finished."
        fi
        # make backup visible
        rename_try=1
        while true; do
            sleep 1
            echo "Try #${rename_try} to make backup visible"
            mv "${backup_path}/.${new_backup}" "${backup_path}/${new_backup}"
            mv_status=$?
            if [[ $mv_status -eq 0 ]]; then
                break
            fi
            rename_try=$(($rename_try+1))
            if [[ $rename_try -ge rename_timeout ]]; then
                /usr/local/emhttp/webGui/scripts/notify -i alert -s "Backup failed!" -d "Job $job_name:${backup_path}/${new_backup} failed because rename timeout has been reached!"
                break;
            fi
        done
    else
        /usr/local/emhttp/webGui/scripts/notify -i alert -s "Backup failed!" -d "Job $job_name:${backup_path}/${new_backup} failed (error ${rsync_status})!"
    fi
    # clean up
    ls -tA "${backup_path}/" | while read backup; do
        if [ "${backup:0:1}" = "." ]; then
            if [ "$fails" -gt "0" ]; then
                echo "Preserve failed backup: $backup"
                fails=$(($fails-1))
                continue
            fi
            echo "Delete failed backup: $backup"
            rm -r "${backup_path}/${backup}"
            continue
        fi
        last_year=$year
        last_month=$month
        last_day=$day
        year=${backup:0:4}
        month=${backup:4:2}
        day=${backup:6:2}
        if [ "$last_day" = "$day" ] && [ "$last_month" = "$month" ] && [ "$last_year" = "$year" ]; then
            echo "Keep multiple backups per day: $backup"
            continue
        fi
        # preserve yearly backups
        if [ "$month" = "01" ] && [ "$day" = "01" ] && [ "$years" -gt "0" ]; then
            echo "Preserve yearly backup: $backup"
            years=$(($years-1))
            continue
        fi
        # preserve monthly backups
        if [ "$day" = "01" ] && [ "$months" -gt "0" ]; then
            echo "Preserve monthly backup: $backup"
            months=$(($months-1))
            continue
        fi
        # preserve daily backups
        if [ "$days" -gt "0" ]; then
            echo "Preserve daily backup: $backup"
            days=$(($days-1))
            continue
        fi
        echo "Delete $backup"
        rm -r "${backup_path}/${backup}"
    done
done




 


Greetings! Thanks for the hard work on this script. I started to use CloudBerry because it was pretty easy to use, but I would really like to get to know rsync and learn something new. I installed the script, edited my USB drive path and changed the path to my source files. I have about 3 TB that I am putting on an 8 TB drive. It is copying the files over as I type this, but I got this error at the start of the script:

rsync: [generator] chown "/mnt/disks/offsite1/Shares/data/media/.20210214_182358/media" failed: Operation not permitted (1)
media/
media/Family Videos/
media/Family Videos/1999 - backstreet boys 1.avi
rsync: [generator] chown "/mnt/disks/offsite1/Shares/data/media/.20210214_182358/media/Family Videos/Camera Videos" failed: Operation not permitted (1)

 

Later, as it was backing up, I started receiving the error for every file. The files seem to have a "." preceding the name:

 

rsync: [receiver] chown "/mnt/disks/offsite1/Shares/data/media/.20210214_182358/media/Family Videos/Camera Videos/.Back Yard20160207-202448-1454898288.mp4.KLlKik" failed: Operation not permitted (1)
media/Family Videos/Camera Videos/Back Yard20160207-202748-1454898468.mp4

 

I wasn't sure what exactly this is telling me. The data came from my old Synology backup drive. I had already deleted all of the @eaDir folders that were everywhere and giving me rsync errors earlier on a movie folder backup, until I removed the -og from my script. I assume I should be using different parameters but wasn't exactly sure which ones.

 

Sorry if this is something I should already know :)

 

Update - I changed the rsync parameters to -rvlptD and the errors are gone now; it appears to be backing up fine.
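For reference, with that change the rsync calls in the script would look roughly like this (-rvlptD is -a plus -v but without -o/-g, so rsync no longer tries to chown files on the destination; just a sketch):

rsync -rvlptD --stats --delete --link-dest="${backup_path}/${last_backup}" "${source_path}" "${backup_path}/.${new_backup}"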

