nerdbot

Local and Cloud Backup advice


Hi all,

 

I've been evaluating Unraid for about a week now and, for my needs, I'm quite impressed.  I'll most likely be picking up a license soon.  I've already installed the Community Applications, Nerd Tools, CA Backup/Restore, and Unassigned Devices plugins and configured them.

 

About a decade back, I set up a simple home backup using rsnapshot on a Linux server and rsync installed on the Windows and Mac computers in my house.  I liked having daily, weekly, and monthly incremental backups, but I was missing the offsite portion of any good backup system.  Cloud storage was still relatively new and expensive for consumers, and I wasn't aware of any easy way to encrypt/decrypt what I stored in the cloud.  Fast forward to now and there are so many options, so I'm revisiting the idea, especially with Unraid.

 

I plan to use rsnapshot again on my Unraid server, keeping daily, weekly, and monthly snapshots on the array.  I'd only be using this setup for really important files (family pictures, important documents, etc.), so I don't expect this storage to grow too quickly (I'm looking at about 100GB right now).  With rsnapshot and local storage, snapshots are space-efficient because files that haven't changed between backups are stored as hard links, but it's not clear to me how I'd achieve the same thing in cloud storage.
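(For anyone unfamiliar, the hard-link trick can be sketched in a few lines; the paths here are made up. GNU `cp -al` copies a directory tree as hard links, which is essentially how rsnapshot rotates snapshots without duplicating unchanged files.)

```shell
#!/bin/sh
# Minimal sketch of the hard-link trick behind rsnapshot's space efficiency.
# Paths are hypothetical. `cp -al` copies a directory tree as hard links,
# so an unchanged file appears in every snapshot but is stored only once.
set -e
BASE=$(mktemp -d)

mkdir -p "$BASE/daily.1"
echo "family photo" > "$BASE/daily.1/photo.jpg"

# Rotate: the new snapshot starts as hard links into the previous one
cp -al "$BASE/daily.1" "$BASE/daily.0"

# Both snapshots list photo.jpg, but the link count shows one copy on disk
stat -c %h "$BASE/daily.0/photo.jpg"    # prints 2
```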

 

From the research I've done so far, I should be able to just use something like Rclone, Duplicati, or Borg and upload the latest weekly backup (assuming the top-level directory name remains unchanged).  Rclone/Duplicati/Borg will determine which files have changed and upload only those to a cloud storage service like Backblaze or CrashPlan, and then Backblaze/CrashPlan will handle versioning the files for me.  So I won't explicitly have daily, weekly, and monthly rsnapshot directories in cloud storage, but I'll instead be able to see different versions of files as they change over time.  Does that sound about right?  If so, will I be able to do this with any cloud storage service, or is there a particular feature I need to look for when choosing a company?
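(One way this could look with rclone alone; the remote name and paths below are hypothetical, not from this thread. rclone only transfers files whose size/modification time differ from the remote copy, and its `--backup-dir` flag moves each overwritten or deleted file into a dated directory, so file history lives on your side rather than depending on a provider's native versioning.)

```shell
# Hypothetical sketch: incremental upload plus self-managed versioning.
# "b2:" is an assumed remote name configured beforehand with `rclone config`.
rclone sync /mnt/user/important b2:my-bucket/current \
    --backup-dir "b2:my-bucket/old/$(date +%Y-%m-%d)" \
    -v
```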

 

Or maybe a better question: is there a better way to achieve my goal?  (I realize rsnapshot may be a really old way of doing things.)


Hey, sorry nobody has replied to you.  I've become very interested in the topic of the "best backup" solution for unRAID.

 

Duplicati on the surface seems pretty awesome.  However, a lot of users, myself included, ran into nothing but errors.  It still needs a lot of heavy development.

 

I wrote a guide on combining RCLONE+BORG.  Borg does a really great job of creating your repos while utilizing compression, deduplication, encryption, pruning, etc.  You would then use RCLONE to push that BORG repo up to your offsite storage.  The guide is here: https://www.reddit.com/r/unRAID/comments/9md2hh/tutorial_rclone_borg_for_your_awesome_backup_needs/

 

Lately, though, I stumbled across Duplicacy.  This piece of software is pretty awesome: it's faster than BORG in some benchmarks AND it uploads to the cloud or locally without needing RCLONE.  @walle wrote a pretty cool guide here: https://forums.unraid.net/topic/73796-solved-install-duplicacy-install-binary/?tab=comments#comment-687737&searchlight=1
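(For reference, the Duplicacy CLI workflow is only a few commands. The snapshot id "my-backups" and the storage URL below are made-up placeholders, not from this thread.)

```shell
# Hypothetical sketch of the Duplicacy CLI workflow.
cd /mnt/user/important

# Tie this directory to a storage backend as a Duplicacy repository
duplicacy init my-backups b2://my-bucket

# Create a deduplicated snapshot (add -e at init time for encryption)
duplicacy backup -stats

# List snapshots, or restore a given revision
duplicacy list
duplicacy restore -r 1
```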

 

Let me know if you have any specific questions.

 


Hi xhaloz, thanks for the response.  I went back to working on my rsnapshot script to back up the devices in the house to the Unraid server, then got sidetracked with some other Unraid issues as well as my regular day-to-day, so I haven't reached the off-site portion of my backup plan yet.  I'll definitely look into the links you provided.  Re: Duplicacy, would I just need the CLI license, which is $20/year?

22 hours ago, nerdbot said:

[...] Re: Duplicacy, would I just need the CLI license, which is $20/year?

 

Quote

Free for personal use or commercial trial

Source: https://github.com/gilbertchen/duplicacy#license

 

Just download the binary and you are good to go. My post about my installation doesn't cover how to work with Duplicacy, but there are guides like this one that give an idea of how to work with it.

On 1/22/2019 at 7:00 AM, xhaloz said:

I wrote a guide on combining RCLONE+BORG. [...]

 

Hi

 

I was looking into the borg+rclone setup and it looks like this is what I need.

 

Is it possible to change the script to send Discord notifications instead of email notifications when a backup errors or finishes?

On 1/22/2019 at 1:00 AM, xhaloz said:

I wrote a guide on combining RCLONE+BORG. [...] The guide is here: https://www.reddit.com/r/unRAID/comments/9md2hh/tutorial_rclone_borg_for_your_awesome_backup_needs/

Do you happen to have the Borg tutorial stored anywhere? The Reddit link is deleted.


Hi... I think you've had some good feedback there, but I'll just add my experience coming from Windows.

 

First, Duplicati needs work... it's nowhere near as good as Windows backup, especially for bare-metal restores.

 

I looked at CloudBerry, and that had promise, but it seems to have a 5TB limit?

 

I ended up writing my own rsync backup and restore script, but I'm still not happy, as it makes a straight copy rather than one I can restore from a specific point in time.

 

There is one advantage to not being on Windows, though: in theory it's a lot harder to get hit by malware, so it does have its plus points.

 

I'm in the process of building a new server and I'm pretty sure I'll be back to this topic again, hence my interest in this thread and the outcome.

 

Terran

On 7/12/2019 at 2:56 AM, ProphetSe7en said:

[...] Is it possible to change the script to send Discord notifications instead of email notifications when a backup errors or finishes?

Super late reply, but yes, you can get Discord notifications via its Slack-compatible webhook endpoint.  You'd need to check the Discord docs for the details, but it's super easy.  The command line to fire off the notification:

#!/bin/sh
############
# Discord's /slack webhook endpoint accepts Slack-style JSON payloads.
WEBH_URL="https://discordapp.com/api/webhooks/<MYDISCORDWEBHOOKNUMBER>/<MYOTHERDISCORDWEBHOOKNUMBER>/slack"
APP_NAME="unRAID Server"
TITLE="$1"
MESSAGE="$2"

############
# Expand any \n escapes passed in the arguments
TITLE=$(echo -e "$TITLE")
MESSAGE=$(echo -e "$MESSAGE")
curl -X POST --header 'Content-Type: application/json' \
    -d "{\"username\": \"$APP_NAME\", \"text\": \"*$TITLE* \n $MESSAGE\"}" "$WEBH_URL" 2>&1

 

 

On 11/10/2019 at 6:03 PM, ffhelllskjdje said:

 

Do you happen to have the Borg tutorial stored anywhere? The Reddit link is deleted.

Yeah, I can provide my borg script here.  If you need help with it, let me know.  Borg makes a local backup and rclone clones it off site, which gives you three copies of your data, two of them local.  The script also will not re-run if rclone hasn't finished its last operation (slow internet) or if a parity sync is running.  The key factor in not having everything constantly re-checked by Borg is --files-cache=mtime,size: every time I ran Borg, it would index files that hadn't changed, and this option fixed that.  It has to do with unRAID's constantly changing inode values.  The Borg docs are very good (https://borgbackup.readthedocs.io/en/stable/).

Let me know if you get stuck.  Obviously this script won't work until you set up your repository.

 

#!/bin/sh
LOGFILE="/boot/logs/TDS-Log.txt"
LOGFILE2="/boot/logs/Borg-RClone-Log.txt"

# Exit if borg or rclone is already running
if pgrep "borg" > /dev/null || pgrep "rclone" > /dev/null
then
    echo "$(date "+%m-%d-%Y %T") : Backup already running, exiting" 2>&1 | tee -a $LOGFILE
    exit
fi

# Exit if a parity sync is running
#PARITYCHK=$(/root/mdcmd status | egrep STARTED)
#if [[ $PARITYCHK == *"STARTED"* ]]; then
#    echo "Parity check running, exiting"
#    exit
#fi


# This is the location where Borg will store the backup data
export BORG_REPO='/mnt/disks/Backups/Borg/'

# This is the location you want Rclone to send the BORG_REPO to
export CLOUDDEST='GDrive:/Backups/borg/TDS-Repo-V2/'


# Set this so you won't be asked for your repository passphrase:
export BORG_PASSPHRASE='<MYENCRYPTIONKEYPASSWORD>'

# ...or this to ask an external program to supply the passphrase (I leave this blank):
#export BORG_PASSCOMMAND=''

# I store the cache on the cache drive instead of /tmp so Borg has persistent records after a reboot.
export BORG_CACHE_DIR='/mnt/user/appdata/borg/cache/'
export BORG_BASE_DIR='/mnt/user/appdata/borg/'

# Back up the most important directories into an archive
# (I keep a list of excluded directories in the Excluded.txt file)
SECONDS=0


echo "$(date "+%m-%d-%Y %T") : Borg backup has started" 2>&1 | tee -a $LOGFILE
borg create                         \
    --verbose                       \
    --info                          \
    --list                          \
    --filter AMEx                   \
    --files-cache=mtime,size        \
    --stats                         \
    --show-rc                       \
    --compression lz4               \
    --exclude-caches                \
    --exclude-from /mnt/disks/Backups/Borg/Excluded.txt \
    $BORG_REPO::'{hostname}-{now}'  \
    /mnt/user/Archive               \
    /mnt/disks/Backups/unRAID-Auto-Backup \
    /mnt/user/Backups               \
    /mnt/user/Nextcloud             \
    /mnt/user/system/               \
    >> $LOGFILE2 2>&1

backup_exit=$?

# Use the `prune` subcommand to maintain 7 daily, 4 weekly and 6 monthly
# archives of THIS machine. The '{hostname}-' prefix is very important to
# limit prune's operation to this machine's archives and not apply to
# other machines' archives also:
#echo "$(date "+%m-%d-%Y %T") : Borg pruning has started" 2>&1 | tee -a $LOGFILE
borg prune                          \
    --list                          \
    --prefix '{hostname}-'          \
    --show-rc                       \
    --keep-daily    7               \
    --keep-weekly   4               \
    --keep-monthly  6               \
    >> $LOGFILE2 2>&1

prune_exit=$?
#echo "$(date "+%m-%d-%Y %T") : Borg pruning has completed" 2>&1 | tee -a $LOGFILE

# Use the highest exit code as the global exit code
global_exit=$(( backup_exit > prune_exit ? backup_exit : prune_exit ))

# Only sync to the cloud if there were no errors
if [ ${global_exit} -eq 0 ]
then
    borgstart=$SECONDS
    echo "$(date "+%m-%d-%Y %T") : Borg backup completed in $(($borgstart / 3600))h:$(($borgstart % 3600 / 60))m:$(($borgstart % 60))s" 2>&1 | tee -a $LOGFILE

    # Reset timer
    SECONDS=0
    echo "$(date "+%m-%d-%Y %T") : Rclone Borg sync has started" >> $LOGFILE
    rclone sync $BORG_REPO $CLOUDDEST -P --stats 1s -v 2>&1 | tee -a $LOGFILE2
    rclonestart=$SECONDS
    echo "$(date "+%m-%d-%Y %T") : Rclone Borg sync completed in $(($rclonestart / 3600))h:$(($rclonestart % 3600 / 60))m:$(($rclonestart % 60))s" 2>&1 | tee -a $LOGFILE
# All other errors
else
    echo "$(date "+%m-%d-%Y %T") : Borg exited with error code:" $global_exit 2>&1 | tee -a $LOGFILE
fi
exit ${global_exit}

 

