Shell script to perform backups using rsync



Update (10/02/13): just a note to say the email functionality depends on your system already having email configured, for example by installing the simplefeatures email notify plugin.

Update (14/12/12): thanks to trurl and dalben for their assistance in testing the script.  Thanks to them, the script is probably in decent enough form to be used as-is by others.  If you do use this script and find issues, or can help add to it, please post your thoughts.  The latest version of the script is attached to this post.

 

I've been using rsync for some time to perform localised backups, and although it's been working well, I thought I'd have a stab at making it more robust.  Recently I discovered that my external USB drive (where I back up to) had become detached, and since I didn't have any error logging, it went unnoticed for a few days.  I set out to create a script with a small amount of error checking so that if the target location doesn't exist, it logs this to the syslog and sends me an email about the failure.  Likewise, it does the same if rsync comes back with a non-successful return code.

 

Before I could proceed, I needed to consolidate an earlier shell script, which contained a large number of separate rsync jobs, into one.  That meant figuring out the include / exclude rsync parameters - which, by the way, I now hate with a passion.  Once I'd figured this out, I was able to create a file containing all the directories I wanted backed up, which let me work on a more generic backup script.

 

My unRAID config is fairly simple...  I have a bunch of top-level user shares (Pictures, Dropbox, Music, Apps, etc.) that I want to back up, and in addition I have a "Services" user share which contains a number of sub-directories where my apps live, such as Sab, Sickbeard and Couchpotato.  So everything I need can be referenced from /mnt/user.

 

I started off by creating a file of the directories I want backed up under the /mnt/user directory:

 

rsync_include_list:

Services/
Services/airvideo/***
Services/couchpotatoserver/***
Services/denyhosts/***
Services/dropbox/***
Services/iTunes/***
Services/lms/***
Services/MediaMonkey/***
Services/playlists/***
- Services/sabnzbd/Downloads/incomplete
Services/sabnzbd/***
Services/sickbeard/***
Apps/***
Dropbox/***
Music/***
Music\ Videos/***
Music2/***
Pictures/***
Stuff/***

 

NB: you can create comments in the include file by starting a line with #.  Helpful if you want to test with only a subset of your total directories, or to put in some reminders to self.  I'm not going to explain the format as it pretty much speaks for itself.  The "***" tells rsync to back up the directory, any files it finds in that directory, as well as any sub-directories and their files.  Basically, everything.
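For example, a cut-down list for testing might look like this, with the unwanted lines simply commented out:

# testing - only backing up Pictures for now
#Services/
#Apps/***
Pictures/***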

 

Two things to point out:

 

1. You need to specify parent dirs if you only intend rsync to back up a subset of directories.  So, for example, in my include list, directories like Apps and Pictures are easy as I want everything.  But for Services, there are a number of directories I don't want, so for that area I needed to specify just the ones I did.  Had I not explicitly included the parent dir (the line "Services/"), rsync wouldn't touch any of the Services sub-directories.

 

2. On the subject of directories you may not want: rsync lets you precede each entry with "+" or "-".  So even though this is an include list, you can also exclude directories.  In the above example, I specifically tell rsync not to back up the incomplete directory under my sabnzbd directory.

 

The other piece of the puzzle lies in the rsync command within the script below.  There's an "--exclude=*" parameter which excludes everything else, as otherwise rsync will happily process every sub-directory in /mnt/user, regardless of whether it's in your include list or not.
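For reference, the rsync call at the heart of the script ends up looking roughly like the line below ($source, $target, $rsyncincludes and $rsynclog all come from the script's configurable options; later versions in this thread also add --delete):

rsync -avih --stats --include-from=$rsyncincludes --exclude=* --log-file=/tmp/$rsynclog $source $target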

 

I've tried to keep the script reasonably generic so that it could be reusable without too many changes other than changing the configurable options.

 

 

Regards,

overbyrn

rsync_backup_to_external.sh.txt

Link to comment

Thanks.

 

I have been using rsync to back up critical files to an eSATA-mounted ntfs-3g volume which I rotate offsite monthly. Your script will help make this better. I look forward to trying it and to any further development.

 

One problem I have run into trying to use ntfs-3g for this is incompatible characters in filenames. I have some music files with accented or other "foreign" characters in them that were put on unRAID from Windows; they work just fine with Windows, but rsync won't copy them. I have done some research on this and it looks like there may be some options in rsync, or possibly mount, that might address this, but I have not actually gotten around to trying anything out yet. If anyone has any experience with this, please respond.

Link to comment

trurl, I was interested to see if I could replicate your issue, but it seems to be working OK for me.  As a test I created a subdir on my cache disk [/mnt/cache/test].  From a Windows PC, I browsed to this directory, right-clicked, and created a new file using ALT+146 on the keypad to produce the character "Æ".  So now I have a file called "Æ.txt".

 

From command line I performed an rsync from this location to my target which is an NTFS formatted USB external disk, mounted via SNAP.

 

rsync -rav --stats --log-file=/mnt/cache/Services/logs/log2.log /mnt/cache/test /mnt/disk/WDExt/test

 

Checked the target and it definitely copied the file across. The log file says the same:

2012/12/09 08:09:49 [15953] building file list

2012/12/09 08:09:49 [15953] cd+++++++++ test/

2012/12/09 08:09:49 [15953] >f+++++++++ test/Æ.txt

2012/12/09 08:09:49 [15953] Number of files: 2

2012/12/09 08:09:49 [15953] Number of files transferred: 1

2012/12/09 08:09:49 [15953] Total file size: 0 bytes

2012/12/09 08:09:49 [15953] Total transferred file size: 0 bytes

2012/12/09 08:09:49 [15953] Literal data: 0 bytes

2012/12/09 08:09:49 [15953] Matched data: 0 bytes

2012/12/09 08:09:49 [15953] File list size: 58

2012/12/09 08:09:49 [15953] File list generation time: 0.001 seconds

2012/12/09 08:09:49 [15953] File list transfer time: 0.000 seconds

2012/12/09 08:09:49 [15953] Total bytes sent: 114

2012/12/09 08:09:49 [15953] Total bytes received: 35

2012/12/09 08:09:49 [15953] sent 114 bytes  received 35 bytes  298.00 bytes/sec

2012/12/09 08:09:49 [15953] total size is 0  speedup is 0.00

 

For what it's worth, I'm using ntfs-3g-2010.3.6-i486-1.txz which I think originally came from Unmenu.

 

Link to comment

For what it's worth, I'm using ntfs-3g-2010.3.6-i486-1.txz which I think originally came from Unmenu.

That is the ntfs-3g I have always used as well.

 

I just tried this with some of my existing files that had been giving me problems and it worked for me too.

 

It's been a while since I last tried this and I have made a lot of changes but none that I thought should have fixed the problem I used to have. I upgraded unRAID from 4.7 to 5.0rc8a and installed a lot of plugins but it's not obvious that has anything to do with it.

 

The only other change that seems like it might be relevant is that I changed the translation in PuTTY to UTF-8 from its default of ISO-8859-1. I say this because character translation was exactly what kept coming up in my research of this problem, but relative to ntfs-3g and rsync, not to PuTTY. I thought the translation setting in PuTTY would only affect the way things got displayed, not the way they worked.

 

I don't know but it looks like my problem might be solved. I will try a full offsite backup using your script and see what happens.

 

Just started your script and it appears to be working beautifully. My offsite backup will be about 500GB so won't know until later if my problem files get transferred or not.

 

Thanks again.

 

 

 

 

Link to comment

My offsite backup using your script completed. It looks like it successfully did all 830GB in two of my user folders, which is what I intended for it to do.

 

For some reason it sent me the "rsync exited with a non-zero return code" email and I don't see any log file.

 

Here is the "user variables" section in my script. This is the only section I made any changes to.

 

######################################################################
# DEFINE USER VARIABLES
######################################################################
# set global date var
logdate=$(date +%d-%m-%y)

# define the source location. must end in slash to work correctly
source=/mnt/user/

# define target backup location
target=/mnt/user/Backup/offsite

# define rsync include file name / location
rsyncincludes=/boot/scripts/rsync_include_list

# define log dir
logdir=/mnt/user/Backup/logs

# define log name as composite of static text and logdate variable
logname=rsync_offsite_log_$logdate.tgz

 

rsync_include_list is

 

Mary/***
Rick/***

 

I'm not up to speed yet on bash scripting. /mnt/user/Backup is a share where I put backup images. This is where I did my ntfs-3g mount and also where I pointed my logdir. It is on a separate disk from the user shares in my rsync_include_list. None of these shares are cached.

Let me know if you would like me to try anything else.

 

Thanks

 

 

Link to comment

trurl,

 

Thanks for giving it a try and letting me know how you got on.  I'm not immediately sure why rsync exited with a non-zero exit code.  I must admit I was fairly lazy and simply made the assumption that if rsync didn't exit with return code 0, then "something" went wrong.  I don't really care what the something is, other than to get an email saying it failed so I know to follow it up.  We need to know what the return code was before we can troubleshoot it further.  See below...

 

Currently, the script only creates a log if rsync completes successfully (i.e. return code = 0).  That's something to consider, as I suppose there's value in having the rsync log even if it fails.
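The check itself is just the usual shell pattern of capturing rsync's exit status; a minimal sketch of the idea (the mail command and helper names here are illustrative rather than lifted from the attached script):

rsync -avih --stats --include-from=$rsyncincludes --exclude=* --log-file=/tmp/$rsynclog $source $target
rc=$?
if [ $rc -ne 0 ]; then
  # record the failure in syslog and send an alert email
  logger "rsync_backup: rsync exited with non-zero return code ($rc)"
  echo "Error: rsync exited with non-zero return code ($rc), please check backup status." | mail -s "rsync backup failed" root
fi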

 

I've made some modifications to the script.  Try the copy at this link:

http://dl.dropbox.com/u/572553/UnRAID/rsync_backup_to_external.sh

 

I've simplified the temp log creation as I was trying to be too clever for my own good and it was leaving files behind in /tmp.

 

I've added three config options:

1) verbose - when set to true, it'll log to syslog when run from cron, or alternatively you get the same log messages on the screen if running from the command line.  I've also changed the error logging to include the rsync return code, so we'll see what is happening there.

2) dryrun - when set to true, it'll do everything but perform file transfers.  Useful for testing.

3) email - turns on / off the emailing if there's an error

 

IMPORTANT CHANGE:

I added the --delete rsync parameter.  I had wrongly left this out, as I used to have it enabled my end.  Be aware, this means any files at the target location that are no longer at the source location will be DELETED.  You may want to review this in case it doesn't suit your needs.

 

There's also a subtle change to the tar command used to create the final log file: the file extension (.tgz, .bz2) on the logname now dictates what compression is used when creating the tar.
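(GNU tar can do this via its -a / --auto-compress option, which picks the compression program from the archive name; roughly the following, though the exact line in the script may differ:)

# compression (gzip, bzip2, ...) is chosen from the extension of $logname
tar -caf $logdir/$logname -C /tmp $rsynclog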

 

In the script on the link above, dryrun is enabled, as is verbose logging & email.  You just need to re-enter the config data for your setup.

 

I suggest you run the script from the command line, as that'll help you understand what's happening.  Give it a go and let me know how you get on.
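(In other words, invoke it directly first and only wire it into cron once you're happy; the script path below is just an example, not necessarily where you keep it:)

# run interactively to watch the verbose output
bash /boot/scripts/rsync_backup_to_external.sh

# then schedule it, e.g. a nightly cron entry:
# 30 3 * * * /boot/scripts/rsync_backup_to_external.sh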

 

 

Regards,

overbyrn

Link to comment

I ran your new script with dry-run="" verbose="true" email="true"

I got the error email again, with the return code this time:

 

Error: rsync exited with non-zero return code (23), please check backup status.

 

The log file did not appear in the place I had specified; I found it instead in /tmp.

 

Looking through the logfile I found several files were not copied. It is the same problem I had before.

 

2012/12/10 21:12:43 [5846] rsync: recv_generator: failed to stat "/mnt/user/Backup/offsite/Rick/Music/Lossy/Steve Hackett/Momentum/12 ~ Bourée.mp3": Invalid or incomplete multibyte or wide character (84)

 

When I try to ls this file in PuTTY it comes out as 12 ~ Bour?e.mp3

 

Apparently some of the filenames with special characters are copying OK and others are not. When I thought my problem had gone away it was just because I had not chosen the right example to test.

 

Don't know how these got named in the first place. I will see if I can rename them to something that works and try again.

 

Link to comment

rsync exit codes:

http://bluebones.net/2007/06/rsync-exit-codes/ - 23 = "Partial transfer due to error"

Sounds right given the problem you're seeing with unicode utf-8 characters. 

 

This seems less about rsync and more about how unicode characters are being handled in general. 

 

I did a couple of tests:

1. Created a test file "12 ~ Bourée.mp3" via a Windows PC on an unRAID array disk.  Set PuTTY to UTF-8.  From the command line, listed the file:

root@Eddie:/mnt/user/Apps/TestUTF8# ls

12\ ~\ Bourée.mp3

2. rsynced the above file to an external ntfs-3g mounted drive using the same PuTTY command line:

root@Eddie:/mnt/user/Apps/TestUTF8# rsync -ravi --delete --stats /mnt/user/Apps/TestUTF8 /mnt/disk/WDExt/utf8test

sending incremental file list

.d...pog... TestUTF8/

.f...pog... TestUTF8/12 ~ Bourée.mp3

 

What version of unRAID are you using?

What is your unRAID locale set to?  (issue "locale" from command line)

What is the method/command you're using to mount your Offsite disk to /mnt/user/Backups/Offsite?

 

 

Update: trurl, grab a fresh copy of the script from the dropbox location.  I noticed I had an error where dryrun would always run even if set to blank.  I've also specifically set "LANG=en_US.utf8" right before the rsync takes place.  Perhaps that'll help.
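(i.e. something along these lines immediately before the rsync call; a sketch rather than the exact lines from the script:)

# force a UTF-8 locale so rsync handles multibyte filenames consistently
export LANG=en_US.utf8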

 

 

Link to comment

What version of unRAID are you using?

5.0rc8a

What is your unRAID locale set to?  (issue "locale" from command line)

That command gives a bunch of lines but they are all set to "en_US.UTF-8"

What is the method/command you're using to mount your Offsite disk to /mnt/user/Backups/Offsite?

It has been mounted for several days so I don't have that command line handy anymore but I took it from this wiki page:

Mounting an external USB drive having an existing NTFS file system in READ/WRITE mode to transport files from/to unRaid server

so it must have been something like:

mkdir /mnt/user/Backup/offsite
mount -t ntfs-3g -o umask=111,dmask=000  /dev/sdc1 /mnt/user/Backup/offsite

 

Thanks for the new version. The previous one did indeed only do dry-run. I don't entirely understand the code but it looks like the log only gets packaged if rsync is successful. In any case I had to go find it in /tmp instead of $logdir/$logname.

 

Anyway...

 

After staying up way too late beating my head against the wall I have determined that there are only a few files affected by this problem (thanks to the log).  There are plenty of other files (my wife has some "world" music in her iTunes) with "foreign" characters that are not affected by this problem. It seems the main thing the problem files all have in common is that they were all downloads and so were originally created on some other system. I have been trying to rename them from Windows but that doesn't seem to work very well. I even tried to retag one of the files in MediaMonkey and that didn't work either. Even though these filenames are displayed "correctly" (whatever that means in this context) in Windows, Windows does not seem to be able to tell unRAID what file it is talking about when it asks unRAID (or SAMBA) to manipulate the file.

 

I am going to try renaming these few files from the unRAID (linux) command line in PuTTY, then I will try the new version of the script.

 

Thanks

 

Link to comment

OK. I have been playing with this some more today. Ran it a few times while trying to fix the filenames. The first thing I tried was fixing the filenames in my source files. Then I noticed that it seemed to have created some of the bad filenames on my destination so I deleted all of those. The last run it seemed to complete without error. I think at least some of the errors may have been due to trying to do the compare on the NTFS volume rather than the copy from the array.

 

I tweaked the script a little with some copy/paste so I could get an email and a log package on both success and fail. I am running it again to test. I also set my logdir to my Dropbox (thanks for the plugin) so I could access it after I got the email.

 

It is appending the log rather than creating a new one and that was a little confusing at first as I was trying to fix some things I had already fixed because they were coming up in a search of the log. Also, the log is still hanging around in /tmp after completion and it can get large. I deleted it from /tmp and the old logs in my logdir. Might need some more work to get it to quit appending and clean up the old log in /tmp.

rsync_backup_to_external.sh.txt

Link to comment

Thanks for this.  My rsync backup scripts are so basic that they are a disaster waiting to happen.

 

My requirements are a little different, and before I go tying myself in knots I wanted to run a couple of things past you.

 

My destination is on another networked device.  I'm assuming the following target is fine:

 

[email protected]::data

 

To gain access to that network device, rsync needs a password.  It currently starts like this:

 

rsync --password-file=/mnt/cache/apps/rsync/.rsyncpwd

 

I'm assuming I can change your script to the following without any issues:

 

  else
    rsync --password-file=/mnt/cache/apps/rsync/.rsyncpwd -avih --delete --stats --include-from=$rsyncincludes --exclude=* --log-file=/tmp/$rsynclog $source $target 
  fi

 

Not sure if you feel like coding the location and name of a password file for rsync, but that would be cool.
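(If that option ever gets added, it would presumably just be another user variable; a hypothetical sketch, not something the current script does:)

# define rsync daemon password file (only needed for remote daemon targets)
rsyncpwd=/mnt/cache/apps/rsync/.rsyncpwd

rsync --password-file=$rsyncpwd -avih --delete --stats --include-from=$rsyncincludes --exclude=* --log-file=/tmp/$rsynclog $source $target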

 

*******************

 

Thought I would try it before I posted.  I am getting errors:

 

[->] Subject: ### rsync_backup_to_external.sh" ###
[->]
[->] Error: target backup location ([email protected]::data) does not exist!
[->] .
[<-] 250 2.0.0 OK 1355316866 o5sm15777877paz.32
[->] QUIT
[<-] 221 2.0.0 closing connection o5sm15777877paz.32

 

The destination is there as my old code works fine:

 

rsync --password-file=/mnt/cache/apps/rsync/.rsyncpwd -vrHltD --delete --stats /mnt/user/data/ [email protected]::data 

and the logs show it's working.

 

I tried replacing $target with [email protected]::data in your script but I got the same errors.

 

Any ideas?  I've modified both rsync commands in your script to run with the password file:

 

rsync --password-file=/mnt/cache/apps/rsync/.rsyncpwd -avih --dry-run --delete --stats --include-from=$rsyncincludes --exclude=* --log-file=/tmp/$rsynclog $source $target 

Link to comment

OK. I have been playing with this some more today. Ran it a few times while trying to fix the filenames. The first thing I tried was fixing the filenames in my source files. Then I noticed that it seemed to have created some of the bad filenames on my destination so I deleted all of those. The last run it seemed to complete without error. I think at least some of the errors may have been due to trying to do the compare on the NTFS volume rather than the copy from the array.

 

I tweaked the script a little with some copy/paste so I could get an email and a log package on both success and fail. I am running it again to test. I also set my logdir to my Dropbox (thanks for the plugin) so I could access it after I got the email.

 

It is appending the log rather than creating a new one and that was a little confusing at first as I was trying to fix some things I had already fixed because they were coming up in a search of the log. Also, the log is still hanging around in /tmp after completion and it can get large. I deleted it from /tmp and the old logs in my logdir. Might need some more work to get it to quit appending and clean up the old log in /tmp.

 

trurl, I've had another hack at the script and made some more changes (for the better, I hope):

 

Temp rsync log and final log file overhaul.  It wasn't great, was it  ;)

 

I wanted a way to run the script multiple times during the same day and not have a ton of logs to show for it.  So what I've done is have rsync create its log to /tmp using a combination of the script name + script PID.  That gives me a unique log per script invocation.

 

It then creates the final log as a compressed file (.tar.gz) to the logdir and logname of your choosing.  This is pretty much the same as before, just that I've hardcoded the fact it's going to be a .tar.gz file. (makes the next step easier)

 

For each subsequent time the script is run, it'll add that rsync log to the final compressed log archive file.  The assumption is that the log name will always be a composite of static text + date.  So with luck you should only end up with one .tar.gz log file in any given day, which contains one or more rsync logs.
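(In outline, the logging now works like the sketch below; the variable names are illustrative and the real script may differ in detail:)

# one temp log per invocation: script name + PID
rsynclog=$(basename $0).$$.log

# fold this run's log into today's archive; a gzipped tar can't be appended
# to directly, so unpack, append, repack
archive=$logdir/$logname            # e.g. rsync_offsite_log_14-12-12.tar.gz
if [ -f $archive ]; then
  gunzip $archive
  tar -rf ${archive%.gz} -C /tmp $rsynclog
  gzip ${archive%.gz}
else
  tar -czf $archive -C /tmp $rsynclog
fi
rm -f /tmp/$rsynclog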

 

I've tested it here and it seems to work ok.  So far every time I run the script, the temp rsync logs that are created in /tmp are being deleted once the script completes.  I'm really hoping I've squashed that issue where files in /tmp were getting left behind.

 

Sounds like you've got some real funnies where Unicode files are concerned.  Not sure I can say much more on that side of things, I'm afraid.

 

The previous dropbox link contains the latest script amendments.

 

 

Regards,

overbyrn

Link to comment

My destination is on another networked device.  I'm assuming the following target is fine:

[email protected]::data

dalben, I've rarely used rsync across remote systems so I'm a bit rusty on what to do.  How does your example above connect to the remote rsyncd process? 

 

I seem to recall when I last set it up, it was between two Unraid systems, where one of these was running the rsync daemon.  Mostly setup like WeeboTech outlines in post 3 of this thread: http://lime-technology.com/forum/index.php?topic=3159

 

But the syntax of that rsync command looks nothing like yours so I'm assuming perhaps you're doing it over ssh? 

 

Happy to help if you can bring me up to speed on how your setup is configured.  For what it's worth, I did some thinking and realised my script is going to fail at the first point, where it tests for target directory presence.  No good when the location is remote!  But that got me thinking that in fact I don't need the test in place at all anymore, not since I rejigged the script: regardless of the reason it fails, a log will be produced and errors reported if configured.

 

I set up an rsync daemon on a test server and, using the script, have been able to remotely back up to this location.  Again, this is basically configured the way WeeboTech outlined.  So for the moment, in my script for target I have

target=rsync://tower/mnt/disk1/testbackup
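(For anyone wanting to replicate the test setup, a bare-bones daemon config on the receiving server looks something like the sketch below; this is a generic example rather than WeeboTech's exact config from the linked thread:)

# /etc/rsyncd.conf on the backup target server
[mnt]
    path = /mnt
    read only = no
    uid = root
    gid = root

The daemon is then started on that server with "rsync --daemon", after which rsync://tower/mnt/... paths on the sending side resolve beneath /mnt.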

 

Here's the copy of the script I've used: https://dl.dropbox.com/u/572553/UnRAID/v3.sh

 

 

Regards,

overbyrn

Link to comment

I ran the new version twice this afternoon and can report that log packaging and cleanup is working as intended. Both times rsync completed without errors so I haven't really tested what happens when it fails but at least it looks like I managed to get all of my filenames cleaned up.

 

After the first run I didn't get an email so I looked at the script again and noticed that there was no email for success (I just love getting notifications from my unRAID) so I put in an email on success before the second run and that worked too.

 

One thing that didn't work was Dropbox. After each run I waited for the log to show up in my Dropbox but it was stuck on the unRAID side. I think Dropbox had probably crashed due to OOM or something. A lot of OOM going around these days as people try to squeeze more apps into their server. I tried to restart Dropbox from the webGUI but that resulted in SAMBA crashing as well and I had to reboot. I didn't bother to capture the syslog. I don't know if the size of the rsync log (about 16MB on each run) had anything to do with this or not since I am not really clear on all this Linux memory management, low memory, swap, etc. I do have a swap file and 4GB RAM. I will investigate further if it continues to be a problem.

Link to comment

My destination is on another networked device.  I'm assuming the following target is fine:

[email protected]::data

dalben, I've rarely used rsync across remote systems so I'm a bit rusty on what to do.  How does your example above connect to the remote rsyncd process? 

 

Thanks.  That worked after I added the password file location into the command.

 

I want to make one change, which is to turn the include variable and file into an exclude.  I find in my case that a) the list is a lot smaller and b) if I create a new root directory I'll never remember to add it to the backup include list.  I'll hack away at that this afternoon.
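(Roughly speaking, that swap would mean replacing the include handling with something like the line below, where $rsyncexcludes is a hypothetical variable pointing at a file of paths to skip; with no include list, the --exclude=* catch-all goes away too:)

rsync --password-file=/mnt/cache/apps/rsync/.rsyncpwd -avih --delete --stats --exclude-from=$rsyncexcludes --log-file=/tmp/$rsynclog $source $target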

 

Thanks again for the script.

Link to comment

Thanks.  That worked after I added the password file location into the command.

 

I want to make one change, which is to turn the include variable and file into an exclude.  I find in my case that a) the list is a lot smaller and b) if I create a new root directory I'll never remember to add it to the backup include list.  I'll hack away at that this afternoon.

Glad to hear it's working for you now.  I don't think it's possible to make a one-size-fits-all rsync script, so please feel free to modify mine for how you need things.  I really only ever made this thread to semi-document my own rsync experience in the hope it might help others.

 

As for swapping include with exclude, don't forget you can also include and exclude from the file you specify in either the include-from= or exclude-from= parameters.  Precede each line with plus or minus and this'll tell rsync what you want included/excluded.  For instance, in my include-from file I don't want a subdir of sabnzbd, so my "rsync_include_list" looks like this:

 

Services/
Services/airvideo/***
Services/couchpotatoserver/***
Services/denyhosts/***
Services/dropbox/***
Services/iTunes/***
Services/lms/***
Services/MediaMonkey/***
Services/playlists/***
- Services/sabnzbd/Downloads/incomplete
Services/sabnzbd/***
Services/sickbeard/***
Apps/***
Dropbox/***
Music/***
Music\ Videos/***
Music2/***
Pictures/***
Stuff/***

 

The lines without + or - could just as easily have + at the start, but it's kind of implied in that it's the include-from parameter which is reading this file and not exclude-from.

 

Regards,

overbyrn

Link to comment

As for swapping include with exclude, don't forget you can also include and exclude from the file you specify in either the include-from= or exclude-from= parameters.  Precede each line with plus or minus and this'll tell rsync what you want included/excluded.  For instance, in my include-from file I don't want a subdir of sabnzbd, so my "rsync_include_list" looks like this:

 

Services/
Services/airvideo/***
Services/couchpotatoserver/***
Services/denyhosts/***
Services/dropbox/***
Services/iTunes/***
Services/lms/***
Services/MediaMonkey/***
Services/playlists/***
- Services/sabnzbd/Downloads/incomplete
Services/sabnzbd/***
Services/sickbeard/***
Apps/***
Dropbox/***
Music/***
Music\ Videos/***
Music2/***
Pictures/***
Stuff/***

 

The lines without + or - could just as easily have + at the start, but it's kind of implied in that it's the include-from parameter which is reading this file and not exclude-from.

 

Regards,

overbyrn

 

Ah, that I didn't know.  That will make things cleaner.  Thanks.

Link to comment
  • 1 month later...
dalben, I've rarely used rsync across remote systems so I'm a bit rusty on what to do.  How does your example above connect to the remote rsyncd process? 

 

I seem to recall when I last set it up, it was between two Unraid systems, where one of these was running the rsync daemon.  Mostly setup like WeeboTech outlines in post 3 of this thread: http://lime-technology.com/forum/index.php?topic=3159

 

But the syntax of that rsync command looks nothing like yours so I'm assuming perhaps you're doing it over ssh? 

 

Happy to help if you can bring me up to speed on how your setup is configured.  For what it's worth, I did some thinking and realised my script is going to fail at the first point, where it tests for target directory presence.  No good when the location is remote!  But that got me thinking that in fact I don't need the test in place at all anymore, not since I rejigged the script: regardless of the reason it fails, a log will be produced and errors reported if configured.

 

I set up an rsync daemon on a test server and, using the script, have been able to remotely back up to this location.  Again, this is basically configured the way WeeboTech outlined.  So for the moment, in my script for target I have

target=rsync://tower/mnt/disk1/testbackup

Here's the copy of the script I've used: https://dl.dropbox.com/u/572553/UnRAID/v3.sh

 

 

Regards,

overbyrn

 

 

overbyrn,

 

 

I would like to see the script you referred to in the above message. Would you mind making it available again through your dropbox?

 

 

thanks,

 

 

ken

Link to comment
The emails are sent to the same address as the system notification emails. This can be set up in the simplefeatures email notify plugin. There is also an unMenu package, but I don't use that method anymore since I upgraded from 4.7.

 

Thanks trurl.  I did not realize I needed to install the simplefeatures email notify plugin.

Link to comment

Rsync does not seem to like something about some Mac application files. I have an unRAID share for the installers of all the Mac applications I use. When I try to rsync the share I get this:

 

 

root@Vault:~# rsync -avP rsync://Tower/mnt/user/Files/Software-Mac/ /mnt/user/Files/Software-Mac | grep fail

rsync: failed to set times on "/mnt/user/Files/Software-Mac/Coda/Coda 2.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr_CA.lproj": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/Coda/Coda 2.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/pt.lproj": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/FX Photo Studio PRO.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr_CA.lproj": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/FX Photo Studio PRO.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr.lproj/fr.lproj": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/Google Drive.app/Contents/Resources/lib/python2.6/site.py": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/MDRP.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr.lproj/fr.lproj": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/MDRP.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr.lproj/fr_CA.lproj": Too many levels of symbolic links (40)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/MacDVD Ripper Pro/MDRP.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr.lproj/fr.lproj": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/MacDVD Ripper Pro/MDRP.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr.lproj/fr_CA.lproj": Too many levels of symbolic links (40)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/TwistedWave/TwistedWave.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr_CA.lproj": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/TwistedWave/TwistedWave.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr.lproj/fr.lproj": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/Videobox/Videobox.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr_CA.lproj": No such file or directory (2)

rsync: failed to set times on "/mnt/user/Files/Software-Mac/Videobox/Videobox.app/Contents/Frameworks/Sparkle.framework/Versions/A/Resources/fr.lproj/fr.lproj": No such file or directory (2)

rsync error: some files/attrs were not transferred (see previous errors) (code 23) at main.c(1508) [generator=3.0.7]

 

 

Has anyone else seen this?

Link to comment
  • 1 year later...

I have built my second unRAID server just for backup purposes and was trying to find the best backup approach - I think this is the one I want  ;D

 

I want to back up only a couple of user shares from tower1 -->> tower2 via NFS. Both servers are attached to the same network switch.

 

I have modified the initial script as well as the rsync_include_list, but I might need another file that overbyrn mentioned. So here's my question:

 

overbyrn, the script you referred to is not available anymore. Would you mind making it available again?

 

Thanks a lot in advance.

Link to comment