Backup script via rsync
I needed to back up data somehow and somewhere, and to do it so that the CPUs are not loaded, disk space is not wasted, the backups are rotated, and they are delivered conveniently. I had always used fsbackup before, but I wanted to get away from archiving. To solve the problem I used rsync and the file system's hard link mechanism (the so-called hardlinks).
Architecture: there is a standalone server with a large hard drive, and the script runs on it. There are many different servers accessible over ssh, each of which has the public key of the user the backup script runs as added to ~/.ssh/authorized_keys.
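A minimal sketch of the key setup this architecture assumes (the commands and the client host are illustrative, not taken from the original post): a dedicated key is generated on the backup server and its public part is appended to authorized_keys on each client.

# on the backup server, as the user that will run the backup script
ssh-keygen -t rsa -N "" -f /usr/local/backup/.ssh/id_rsa
# push the public key to the client so passwordless ssh works for rsync
ssh-copy-id -i /usr/local/backup/.ssh/id_rsa.pub user@mydomain.com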
Logic of work: at the scheduled time the script synchronizes, over ssh, the contents of a folder on the remote server into the domain.com/latest folder, then copies it into a folder named after today's date while creating hard links to the files, and finally deletes folders whose creation date is older than 7 days. Because only the contents of the directory are synchronized, the database has to be dumped by cron on the client machine before rsync picks up the files.
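For illustration, a client-side cron sketch of that database dump; the schedule, database name and dump path are assumptions, not part of the original setup. The only requirement is that the dump lands inside the directory rsync will pick up, before the backup server connects.

# hypothetical client-side crontab: dump the database into the synced folder at 02:30
# credentials are assumed to live in ~/.my.cnf
30 2 * * * mysqldump --single-transaction mydb | gzip > /path/to/public_html/db_dump.sql.gz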
Pros:
- uses less space than differential backups and no more space than incremental backups (a quick way to verify the actual disk usage of the hardlinked snapshots is sketched after this list)
- it loads the CPU less, because no archivers are used (compression can still be performed on the fly when transmitting over the network)
- it has a fairly detailed log format and alerts about errors via email
- it is resistant to hacking or complete destruction of the client machine: since the backup server pulls the data and the client has no access to it, an attacker on the client cannot harm the backups in any way
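A quick check of the hard-link savings (paths are illustrative): within a single invocation du counts every inode only once, so after the first snapshot each additional dated folder adds only the files that actually changed. Compression over the wire is just a matter of adding -z to the rsync options in the script.

# per-snapshot sizes: the first dated folder carries the full data, later ones only the deltas
du -sh /usr/local/backup/domains/mydomain.com/*
# total space used by latest plus a week of snapshots
du -sh /usr/local/backup/domains/mydomain.com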
Question:
- when the script was first published I was not able to get an authoritative opinion on the effectiveness of this approach, so I would be glad if you shared your thoughts...
#!/bin/sh
# simple rsync backup script written by farmal.in 2011-01-21
#
# latest backup is always in $SDIR/domains/$domain/latest folder
# all backups which are older than 7 days would be deleted
# backup.ini file can't contain comments, empty lines and spaces in domain names
#
# example of a GOOD backup.ini:
# mydomain.com user@mydomain.com:/path/to/public_html
#
SDIR="/usr/local/backup"
SKEY="$SDIR/.ssh/id_rsa"
SLOG="$SDIR/backup.log"
PID_FILE="$SDIR/backup.pid"
ADMIN_EMAIL="email@domain.com"
if [ -e "$PID_FILE" ]; then
    echo "this task is already running or the previous run ended with errors on `hostname`" | mail -s "Some mess with backups on `hostname`..." "$ADMIN_EMAIL"
    exit 1
fi
touch "$PID_FILE"
# redirect all further output to the logfile
exec >> "$SLOG" 2>&1
# parsing backup.ini file into $domain and $from variables
cat "$SDIR/backup.ini" | while read -r domain from ; do
    destination="$SDIR/domains/$domain"
    # download a fresh copy into the 'latest' directory
    echo -e "`date` *** $domain backup started">>$SLOG
    # start counting rsync worktime
    start=$(date +%s)
    # on rsync failure: notify the admin and skip to the next domain
    rsync --archive --one-file-system --delete -e "ssh -i $SKEY" "$from" "$destination/latest" || { echo -e "Error when rsyncing $domain. \n\n For more information see $SLOG:\n\n `tail $SLOG`" | mail -s "rsync error" "$ADMIN_EMAIL"; continue; }
    finish=$(date +%s)
    echo -e "`date` *** RSYNC worked for $((finish - start)) seconds">>$SLOG
    # clone the fresh copy into a folder named after today's date by hardlinking
    cp --archive --link "$destination/latest" "$destination/`date +%F`"
    # delete all dated copies older than 7 days (by inode change time), but never 'latest'
    find "$destination" -maxdepth 1 -ctime +7 -type d -path "$destination/????-??-??" -exec rm -r -f {} \;
    echo "`date` *** The size of $domain/latest is now `du -sh $destination/latest | awk '{print $1}'` ">>$SLOG
    echo -e "`date` *** $domain backup ended">>$SLOG
    echo -e "`date` *** Total allocated `du -sh $destination | awk '{print $1}'`">>$SLOG
    echo -e "------------------------------------------------------------------">>$SLOG
done
rm "$PID_FILE"
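For completeness, a hedged usage sketch: a crontab entry on the backup server and a restore of one domain from a dated snapshot. The script path, schedule, snapshot date and restore target are assumptions for illustration; the layout under the dated folder mirrors whatever path was listed in backup.ini.

# hypothetical crontab entry on the backup server: run the backup nightly at 03:00
0 3 * * * /usr/local/backup/backup.sh

# restoring is a plain copy back over ssh from the wanted snapshot
rsync --archive -e "ssh -i /usr/local/backup/.ssh/id_rsa" \
    /usr/local/backup/domains/mydomain.com/2011-01-21/public_html/ user@mydomain.com:/path/to/public_html/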