Synology Custom Remote Backup Solution

Created a personalized rsync incremental remote backup solution.

  • Tested with DSM 4.1 (and a little with 4.0)
  • Uses default rsync
  • Does not require any additional packages (e.g. ipkg)
  • Uses SSH transport with key-based authentication for secure transfers
Still To Do
  • Create an age-off script
  • Combine the Full and Incremental backup scripts
Some details
The scripts below use rsync's "--link-dest" option, so each incremental backup takes up very little space compared to the first.  An initial full backup is required, which is why I currently have 2 scripts.  I believe they can easily be combined, but this is the initial solution.
Hard links are pretty nifty.  Google them and how rsync's "--link-dest" option works.
I do not consider myself an advanced Linux user, so there are probably a number of best practices this solution misses; not deliberately ignored, just unknown to me.

This system should work in both directions between 2 Synology boxes.  I've only implemented it in a single direction thus far, but reversing it should be pretty simple.

Full backup Script
  • Needs to be run the first time to do the initial full backup
  • If possible, run it locally across a LAN first, then do the incremental backups over the WAN
#!/bin/sh

#
# Updated:
#

#
# Changes
#
#
# Future Ideas
# - If there's an rsync error, automatic retry
#

RSYNC=/usr/syno/bin/rsync

#
# Need the directory where the script runs
#
shDir="$( cd "$( dirname "$0" )" && pwd )"

#
# Config files of interest
#
confRemHost="rem_host.sh"
confSrcDirs="src_dirs.conf"
confLastFile="last_backup.conf"

#
# Read in remote host info
#
. ${shDir}/${confRemHost}

#
# Misc Variables
#
vDate=`date +%Y-%m-%dT%H%M`
dirLogBackup="/volume1/<your directory path>/backup_logs"
dirBckupName="Initial_${vDate}"
vRsyncOpts="--archive --partial"
vLogLvl="--verbose"
dirLogs="$shDir"
vLogF=$dirLogs/rsync_${vDate}.log
vErr="no"
vMaxRetries=5

if [ -f "${shDir}/is.test" ]; then
    dirLogBackup=${dirLogBackup}/test
    rDir_base=${rDir_base}/test
fi

if [ ! -d "$dirLogBackup" ]; then
    mkdir -p "$dirLogBackup"
    chmod 777 "$dirLogBackup"
fi

exec > "$vLogF" 2>&1

if [ ! -f "${shDir}/${confSrcDirs}" ]; then
    echo ---
    echo "--- $shDir/$confSrcDirs not found. Exiting..."
    echo ---
    exit 1
fi

#
# Loop through each directory to backup
# (There may be a better way to do this, but this works)
#
for myD in `cat $shDir/$confSrcDirs`
do
    echo ---
    echo "--- Starting directory backup: $myD"
    echo ---

    if [ -d "$myD" ]; then
        $RSYNC $vRsyncOpts $vLogLvl -e "ssh -i ${rUserKey}" \
            $myD $rUser@$rHost:$rDir_base/$dirBckupName
        vRC=$?  # capture exit status before the next test clobbers $?

        if [ $vRC -ne 0 ]; then
            vErr="yes"
            echo "ERR ($vRC) : $myD" >> $dirLogs/rsync_${vDate}.err

            # Could test here for exit code 23, Partial Transfer Error,
            # and retry in a while loop
        fi
    else
        echo
        echo "--- WARN: Directory $myD does not exist"
        echo
    fi

    echo ---
    echo "--- Completed directory backup: $myD"
    echo ---
    echo
done

#
# Some cleanup / completion stuff
#
if [ "$vErr" = "no" ]; then
    echo $dirBckupName > $shDir/$confLastFile # track last backup dir
else
    chmod 733 $shDir/*.err
    mv $shDir/rsync_*.err $dirLogBackup # save off err file
fi

#
# Want to move log file to new location
#
chmod 733 $shDir/*.log
mv $shDir/rsync_*.log $dirLogBackup
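The automatic-retry idea mentioned in the script's comments could be sketched as a small wrapper function (a hypothetical helper, not part of the original script; rsync exits with code 23 on a partial transfer):

```shell
# Hypothetical retry wrapper: re-run a command while it exits 23
# (rsync's partial-transfer code), up to a maximum number of tries.
retry_on_23() {
    vMax=$1
    shift
    vTry=0
    while :; do
        "$@"
        vRC=$?
        # Stop on success, or on an error that a retry will not fix
        [ $vRC -ne 23 ] && return $vRC
        vTry=$((vTry + 1))
        [ $vTry -ge $vMax ] && return $vRC
        echo "--- exit 23, retrying ($vTry of $vMax)"
    done
}

# Inside the loop, the rsync call could then become something like:
# retry_on_23 $vMaxRetries $RSYNC $vRsyncOpts $vLogLvl ... \
#     $myD $rUser@$rHost:$rDir_base/$dirBckupName
```

This would finally put the currently unused vMaxRetries variable to work.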
Incremental Backup Script
  • Adds the "--temp-dir" and "--link-dest" rsync options compared to the Full backup script
#!/bin/sh

#
# Updated:
#

#
# Changes
#
# - Added ulimit change to 30720
#

#
# Future Ideas
# - If there's an rsync error, automatic retry
#

RSYNC=/usr/syno/bin/rsync

#
# Need the directory where the script runs
#
shDir="$( cd "$( dirname "$0" )" && pwd )"

#
# Config files of interest
#
confRemHost="rem_host.sh"
confSrcDirs="src_dirs.conf"
confLastFile="last_backup.conf"

#
# Read in remote host info
#
. ${shDir}/${confRemHost}

#
# Misc Variables
#
vDate=`date +%Y-%m-%dT%H%M`
dirLogBackup="/volume1/<your directory path>/backup_logs"
dirBckupName="Incremental_${vDate}"
vRsyncOpts="--archive --partial --delete"
vLogLvl="--verbose"
dirLogs="$shDir"
vLogF=$dirLogs/rsync_${vDate}.log
vErr="no"
vMaxRetries=5
dirTemp="/volume1/<your directory path>/backup_temp" # staging area for partial transfers

if [ -f "${shDir}/is.test" ]; then
    dirLogBackup=${dirLogBackup}/test
    rDir_base=${rDir_base}/test
fi

if [ ! -d "$dirLogBackup" ]; then
    mkdir -p "$dirLogBackup"
    chmod 777 "$dirLogBackup"
fi

exec > "$vLogF" 2>&1

if [ ! -f "${shDir}/${confLastFile}" ]; then
    echo ---
    echo "--- $confLastFile not found. Exiting..."
    echo ---
    exit 42
fi

if [ ! -f "${shDir}/${confSrcDirs}" ]; then
    echo ---
    echo "--- $shDir/$confSrcDirs not found. Exiting..."
    echo ---
    exit 1
fi

vLastDir=`cat $shDir/$confLastFile`

#
# Loop through each directory to backup
# (There may be a better way to do this, but this works)
#
for myD in `cat $shDir/$confSrcDirs`
do
    echo ---
    echo "--- Starting directory backup: $myD"
    echo ---

    if [ -d "$myD" ]; then
        $RSYNC $vRsyncOpts $vLogLvl -e "ssh -i ${rUserKey}" \
            --temp-dir=$dirTemp \
            --link-dest=$rDir_base/$vLastDir \
            $myD $rUser@$rHost:$rDir_base/$dirBckupName
        vRC=$?  # capture exit status before the next test clobbers $?

        if [ $vRC -ne 0 ]; then
            vErr="yes"
            echo "ERR ($vRC) : $myD" >> $dirLogs/rsync_${vDate}.err

            # Could test here for exit code 23, Partial Transfer Error,
            # and retry in a while loop
        fi
    else
        echo
        echo "--- WARN: Directory $myD does not exist"
        echo
    fi

    echo ---
    echo "--- Completed directory backup: $myD"
    echo ---
    echo
done

#
# Some cleanup / completion stuff
#
if [ "$vErr" = "no" ]; then
    echo $dirBckupName > $shDir/$confLastFile # track last backup dir
else
    chmod 733 $shDir/*.err
    mv $shDir/rsync_*.err $dirLogBackup # save off err file
fi

#
# Want to move log file to new location
#
chmod 733 $shDir/*.log
mv $shDir/rsync_*.log $dirLogBackup
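Each run of this script creates a new Incremental_* directory on the remote box, so the age-off script from the to-do list could look something like the following sketch (the function name and keep-count policy are my assumptions; it would run on the destination box):

```shell
# Hypothetical age-off: keep the newest $2 backup directories under
# base directory $1 and delete the rest. This works because the
# Incremental_<date> names sort chronologically.
age_off() {
    vBase=$1
    vKeep=$2
    cd "$vBase" || return 1
    # List oldest-first, drop the last $vKeep (the newest), delete the rest
    ls -d Incremental_* 2>/dev/null | sort | head -n -"$vKeep" |
    while read -r vOld; do
        echo "Aging off: $vOld"
        rm -rf "$vOld"
    done
}

# Example (hypothetical path):
# age_off "/volume1/<base backup directory>" 14
```

Note that because unchanged files are hard links, deleting an old backup directory only frees the files that exist nowhere else.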

Additional Required Files

  • src_dirs.conf
    • Located in the same directory as both scripts
    • Used by both scripts
    • Simple text file
    • Separate directory on each line
      • These are the directories that you want backed up
      • Sync’d individually via the “for” loop in each script
  • last_backup.conf
    • Located in the same directory as both scripts
    • Used by both scripts
    • Simple text file
    • Should only have 1 line
      • Name of the last directory where your last backup was placed
      • Only updated if rsync completes with zero / no errors
    • Used primarily by the Incremental backup script: its single line is fed to rsync's "--link-dest" option to implement the incremental backup
  • rem_host.sh
##
#
# Variables Defining the remote host / destination server
#
##

rHost=<ip/hostname>
rUser=<remote user, needs to exist on remote box, should not be root/admin>
rDir_base=/volume1/<base backup directory> # target base directory
rUserKey="/<directory tree to private key>/.ssh/<private-key>"
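For reference, here is what the two plain-text files might contain (directory names and the backup-name line are examples only):

```
# src_dirs.conf -- one source directory per line
/volume1/photo
/volume1/music

# last_backup.conf -- single line, maintained by the scripts
Incremental_2013-05-04T0230
```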

 

Additional Optional Files

  • There is a little test / debug code in each script.
  • File: is.test
    • If it exists, the backup scripts put backups and logs into separate "test" directories relative to the normal directories
    • If used, make sure both "test" locations exist first, or an error will occur
    • This made it easier for me to develop and test the scripts in one directory and then just "cp" them to a "live" directory
    • All I had to do in the dev directory was "touch is.test"

FYI

  • Private key authentication
    • Used openssl to create the private key.  Nothing special in the creation of the key, so a quick Google search on the process should work fine
    • When setting up, I had some problems getting things to work for the user I created to run the rsyncs
      • It's totally possible that, in my troubleshooting and Synology / DSM ignorance, I did (or did not) do something I was (or was not) supposed to do
    • The key thing to remember is to place the ".ssh/backup-key" file in the backup user's home directory
      • In my case, for some reason the user's home in "/etc/passwd" was not pointing where I expected, so I changed it and restarted SSH
        • Should be able to use "synoservice" to restart SSH
    • Once I figured this out, everything else with configuring this went smoothly.  Luckily I'd had some experience in this area at work, so that may have helped.
  • This entire solution works for me and my system setup.  I imagine there are a number of configurations where it will not work "out of the box".
    • It may have problems when trying to rsync directories that are located on external devices, mounted from remote hosts, or similar.

2 thoughts on "Synology Custom Remote Backup Solution"

  1. So, I have two Synology units that I’m running a Synology backup job to (one is backing up to the other). I changed maybe 2 gigs of stuff, and the backup took 22 hours. It really lagged on the “metadata” part, which was the bulk of the time.

    My question is…..would this script essentially do the same thing but in minutes rather than hours? Including “metadata” (permissions, etc)?

  2. I've actually never used the built-in Synology backup solutions, so I cannot do a comparison.

    What I can say is that, if you have a good connection between your 2 DiskStations and you only had 2GB of new data, I cannot see my rsync solution taking long. Unfortunately I never tested the speed of this solution, but I'd imagine 2GB would take less than 2 hours (and I feel I'm being quite conservative with that estimate).

    Since I've never used Synology's built-in backup solutions, I'm not sure what is meant by "metadata". It's entirely possible that Synology's solution is sending additional data to provide more advanced capabilities.

    My rsync solution is pretty straightforward. It just does incremental backups, no extra capabilities, and if you actually had to take advantage of any backups, my solution is not meant to be "user friendly"…

    Probably better to say it's not "consumer friendly", since I tried to make it simple to do a restore… manually. I did not design any automatic, GUI, or otherwise more "friendly" method.
