[color=#ff0000][size=6]If your computer/server fails, you’ll want remote backups. Here’s how to do it.[/size][/color]
I just wanted to share with the community a script I wrote to automatically back up SuiteCRM every day. The script creates a compressed backup every morning at 4 AM (but you can tweak it to run as frequently or infrequently as you like). Backups are kept both on the local machine running SuiteCRM and on two remote drives - in my case Google Drive and Nextcloud (but you can choose whichever remote drives you have access to). Backups are kept for the past 7 days; backups older than 7 days are deleted, except for the backup created on Monday morning (you can change this to whatever day you’d like). Monday backups are kept indefinitely.
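Each backup file the script below creates is named with a minute-resolution timestamp produced by the “date” command, so files sort chronologically by name. You can see what a stamp looks like by running:

```shell
# Prints the current time as YYYYMMDDHHMM, the naming scheme used for
# the .tar.gz and .sql.gz backup files below.
date +"%Y%m%d%H%M"
```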
I’m running SuiteCRM on Debian 9 (GNU/Linux), but this should work the same on macOS/BSD and could be tweaked to run on Windows, too.
You’ll need to download an open-source backup tool called “rclone”. Per https://rclone.org/install/, on GNU/Linux, macOS, or BSD systems, type from your terminal:
curl https://rclone.org/install.sh | sudo bash
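You can confirm the install worked by asking rclone for its version; the snippet below falls back to a message if the binary isn’t on your PATH:

```shell
# Check that rclone is installed and reachable.
if command -v rclone >/dev/null 2>&1; then
    rclone version
else
    echo "rclone not found - rerun the install script"
fi
```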
(You may need to install the “Go” package, if it isn’t already installed on your system.) On Debian-based systems, type:
sudo apt install golang
Save the following bash script on the computer/server that is running SuiteCRM. I’ve named the file backup.sh and placed it in the /home/username/ directory, but you can name it what you like and place it where you like.
#!/bin/bash
# Archive the SuiteCRM application files and take a MariaDB dump.
tar -czf /home/username/backup/suitecrm/daily/$(date -d "today" +"%Y%m%d%H%M").tar.gz /opt/suitecrm/
mysqldump suitecrmDATABASE -u suitecrmUSER --password='CHANGEthisTOyourUSERpassword' | gzip -c > /home/username/backup/suitecrm/daily/$(date -d "today" +"%Y%m%d%H%M").sql.gz
# Copy all backups created on a Monday to the 'weekly' directory
# ('date +%u' prints 1 for Monday, which is what the '= 1' test below checks).
find /home/username/backup/suitecrm/daily/ -type f -exec sh -c '[ "$(date +%u -d @"$(stat -c %Y "$1")")" = 1 ] && rclone copy "$1" /home/username/backup/suitecrm/weekly/' -- {} \;
# First check that all daily backups are present locally (8 days x 2 files = 16,
# plus the 'total' line from 'ls -l' makes 17 lines) before removing remote daily
# backups older than one week. Then remove the local copies.
SC=$(ls -l /home/username/backup/suitecrm/daily/ | wc -l)
if [ "$SC" -gt 16 ]
then
find /home/username/backup/suitecrm/daily/ -type f -mtime +7 -exec sh -c 'rclone delete "gdrive:suitecrm/daily/$(basename "$1")"' -- {} \;
find /home/username/backup/suitecrm/daily/ -type f -mtime +7 -exec sh -c 'rclone delete "nextcloud:suitecrm/daily/$(basename "$1")"' -- {} \;
find /home/username/backup/suitecrm/daily/ -type f -mtime +7 -exec sh -c 'rm "$1"' -- {} \;
fi
# Copy the local backups to the two remote drives.
rclone copy /home/username/backup/suitecrm/weekly/ gdrive:suitecrm/weekly/
rclone copy /home/username/backup/suitecrm/daily/ gdrive:suitecrm/daily/
rclone copy /home/username/backup/suitecrm/weekly/ nextcloud:suitecrm/weekly/
rclone copy /home/username/backup/suitecrm/daily/ nextcloud:suitecrm/daily/
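A backup is only as good as your ability to restore it, so here is a sketch of the restore side, demonstrated on a throwaway archive under $HOME (the paths, the 202401150400 stamp, and the demo file are hypothetical - for a real restore, point tar at your backup directory and extract with -C / since the archive stores paths relative to the root):

```shell
# Build a small demo archive the same way the backup script does.
mkdir -p "$HOME/restore-demo/opt/suitecrm"
echo "demo" > "$HOME/restore-demo/opt/suitecrm/config.php"
tar -czf "$HOME/restore-demo/202401150400.tar.gz" -C "$HOME/restore-demo" opt/suitecrm

# Restoring: extract the archive at the target root.
# (For a real backup you would use: tar -xzf backup.tar.gz -C / )
mkdir -p "$HOME/restore-demo/target"
tar -xzf "$HOME/restore-demo/202401150400.tar.gz" -C "$HOME/restore-demo/target"

# The database half of a real restore would look like:
#   gunzip -c 202401150400.sql.gz | mysql suitecrmDATABASE -u suitecrmUSER -p
```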
Make sure that you update the code above with the correct information for your database user, database name, database password, suiteCRM application files location, remote drive locations, directory names, etc. Also make sure that any directories that you reference have been created on both your local machine and your remote drives.
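For example, the local directories can all be created in one go (shown here under $HOME - substitute the paths you used in the script), and the remote equivalents can be created with rclone:

```shell
# Create the local daily and weekly backup directories.
mkdir -p "$HOME/backup/suitecrm/daily" "$HOME/backup/suitecrm/weekly"

# The matching remote directories, once your remotes are configured:
#   rclone mkdir gdrive:suitecrm/daily
#   rclone mkdir gdrive:suitecrm/weekly
#   rclone mkdir nextcloud:suitecrm/daily
#   rclone mkdir nextcloud:suitecrm/weekly
```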
If you haven’t already, you’ll need to configure your remote drives with rclone:
rclone config
When adding a “new remote” with rclone, choose the “Webdav” option for Nextcloud and the “Google Drive” option for Google Drive (for me these were options 25 and 12, but the numbers vary between rclone versions, so pick whichever matches).
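Once configured, you can confirm that both remotes exist and that their names match the ones used in the script:

```shell
# Lists the configured remotes; "gdrive:" and "nextcloud:" (or whatever
# names you chose during "rclone config") should both appear.
rclone listremotes
```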
Lastly, we’ll use “cron” to automatically run this script in the background every morning. From a terminal type:
crontab -e
At the bottom of the file type:
0 4 * * * /bin/bash /home/username/backup.sh
Then save and exit your text editor.
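Before trusting cron with it, it’s worth sanity-checking the script once by hand (hypothetical commands, assuming the paths above):

```shell
# Parse the script without executing it, to catch syntax errors early.
bash -n /home/username/backup.sh && echo "syntax OK"

# Then run it once for real and watch for errors:
#   bash /home/username/backup.sh
# and confirm the cron entry was saved:
#   crontab -l
```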