Backup script for hosted Linux

It doesn't take much time to initially set up a Linux machine with a hosting provider, but it takes a lot of time to configure it… Backing up my precious databases and configuration is pretty essential. I learned that the hard way…

So I created a simple script for my servers which puts all the precious information into a backup file and also deletes all existing backup files older than 14 days. This backup file is then downloaded every day to a NAS in my home network, so I have a copy at home; there I delete old backups only once a month and can also push another copy to cloud-based storage.

Yep, as a Boomer I am probably still more cautious than I should be – or just too nerdy 🙂
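By the way, the NAS side is not part of the script. Just as a minimal sketch of what such a daily pull plus the monthly cleanup could look like – hostname, user and the target path are placeholders, and rsync is only one option your NAS may or may not offer:

#!/bin/sh

# pull the latest backup files from the server (placeholder host, user and target path)
rsync -av youruser@yourserver.example:/home/youruser/BACKUPS/ /volume1/backups/yourserver/

# on the NAS side, keep the copies for roughly a month before deleting them
find /volume1/backups/yourserver/ -name '*.tgz' -mtime +31 -exec rm {} \;

The server-side script itself looks like this: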

#!/bin/sh

# date for the backup filename
backupDate=$(date +%Y%m%d%H%M)

# file name for the backup
backupFileName=$(hostname -f)-$backupDate.tgz

# where to put the backup (could be big)
backupFileLocation="/home/youruser/BACKUPS/"
backupFile=$backupFileLocation$backupFileName

# which directories to back up
backupDIRs="etc var/vmail var/log var/backups home/WWW var/lib/rspamd/dkim var/lib/fail2ban"

# let's change to a tmp directory
cd /tmp

# what release were we using
lsb_release -a > /var/backups/lsb_release 2>&1

# which packages were installed
dpkg --get-selections > /var/backups/packagesInstalled

# backup the postgres database (dump first to a directory the postgres user can write to, then move the dump to /var/backups)
sudo -u postgres pg_dumpall -f /var/lib/postgresql/13/main/pgBackupDump; mv /var/lib/postgresql/13/main/pgBackupDump /var/backups/

# backup the mysql/mariadb database
mysqldump -A > /var/backups/mysqlDump

# backup the script itself
cp /root/backupServer.sh /var/backups/

# create the actual backup - the actual magic
# ($backupDIRs stays unquoted on purpose so it expands into the individual directories)
tar czf "$backupFile" -C / $backupDIRs

# delete all backups older than 14 days
find "$backupFileLocation" -name '*.tgz' -mtime +14 -exec rm {} \;
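Restoring from such a backup is not covered here, but as a rough sketch (the archive name below is a placeholder; the paths inside it are the ones the script above creates):

# unpack the archive into a scratch directory first, then copy things back selectively
mkdir -p /tmp/restore
tar xzf /home/youruser/BACKUPS/yourhost-202401011200.tgz -C /tmp/restore

# on Debian/Ubuntu the recorded package selection can be replayed
dpkg --set-selections < /tmp/restore/var/backups/packagesInstalled
apt-get dselect-upgrade

# reload the database dumps created above
sudo -u postgres psql -f /tmp/restore/var/backups/pgBackupDump postgres
mysql < /tmp/restore/var/backups/mysqlDump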

The script itself is really simple. Usually I also encrypt the file with GPG right away, so it’s secured before it gets uploaded to any cloud storage.
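Appended at the end of the script, that could look something like this (the recipient key is a placeholder for whatever key you use):

# optional: encrypt the freshly created archive before it leaves the machine
gpg --encrypt --recipient backup@example.org "$backupFile"

gpg --symmetric would be the key-less alternative, at the cost of handling a passphrase on the server.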

There is room for improvement, but so far it has already saved my ass several times 🙂