Backup Script for Logs Using Rclone
Overview
This Bash script automates the backup of log files from a backend application to S3-compatible storage using rclone. The script performs the following tasks:
- Moves logs older than 48 hours from the backend API logs directory to an S3 bucket.
- Copies all logs from the Multi PG logs directory to an S3 bucket.
- Logs all actions performed into a log file.
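Each run writes into a timestamped, per-server prefix, so the resulting bucket layout looks roughly like the sketch below (the date, IP, and file names are hypothetical example values; the paths are derived from the script in Step 4):

<bucket>/logs/paytring/backend/
    2024-01-15_00-00-01/          # DATE_TIME of that run (example)
        203.0.113.10/             # PUBLIC_IP of the server (example)
            laravel.log           # backend API logs moved here (example name)
            app2/
                worker.log        # Multi PG logs copied here (example name)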
Prerequisites
Before running this script, ensure that you have:
- A Linux server with Bash support.
- rclone installed and configured.
- Access to an S3-compatible storage.
Installation and Setup
Step 1: Install Rclone
If rclone is not installed, use the following command to install it:
curl https://rclone.org/install.sh | sudo bash
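To confirm the installation succeeded, check the installed version:

rclone version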
Step 2: Configure Rclone
Run the following command to configure rclone with your storage provider:
rclone config
Follow the prompts to set up your remote storage. In this script, we assume the remote storage is named rcloneprodapi1.
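If you prefer a non-interactive setup (for example, in provisioning scripts), recent rclone versions also accept the remote's parameters on the command line via rclone config create. The provider, region, and credential values below are placeholders, and the exact key=value syntax may differ across rclone versions:

rclone config create rcloneprodapi1 s3 provider=AWS region=us-east-1 access_key_id=YOUR_ACCESS_KEY secret_access_key=YOUR_SECRET_KEY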
Step 3: Create Necessary Directories
Ensure the following directories exist and have the correct permissions:
sudo mkdir -p /var/log/rclone
sudo chmod 755 /var/log/rclone
Step 4: Save the Script
Save the following script as /usr/local/bin/log_backup.sh:
#!/bin/bash
# Get the current date-time in YYYY-MM-DD_HH-MM-SS format
DATE_TIME=$(date +"%Y-%m-%d_%H-%M-%S")
# Get the public IP of the server (used to keep each host's logs separate in the bucket)
PUBLIC_IP=$(curl -s ifconfig.me)
########## BACKEND API ##########
# Define the backend API log directory
LOG_DIR="/var/www/backend/storage/logs"
# Define the S3 bucket path
S3_BUCKET="rcloneprodapi1:<bucket>/logs/paytring/backend"
# Find and move logs older than 48 hours to the S3 bucket.
# -mtime +1 matches files at least two full days (48 hours) old;
# -maxdepth 1 keeps the move from sweeping up the app2 logs handled below.
find "$LOG_DIR" -maxdepth 1 -type f -mtime +1 -exec rclone move {} "$S3_BUCKET/$DATE_TIME/$PUBLIC_IP/" --log-file=/var/log/rclone/logs.log \;
echo "Moved API logs to S3 on $DATE_TIME" >> /var/log/rclone/logs.log
########## MULTI PG ##########
# Define the Multi PG (app2) log directory
APP2_LOG_DIR="/var/www/backend/storage/logs/app2"
# Define the S3 bucket path
APP2_S3_BUCKET="rcloneprodapi1:<bucket>/logs/paytring/backend"
# Copy all app2 logs to the S3 bucket (copy, not move: the local files stay in place)
find "$APP2_LOG_DIR" -type f -exec rclone copy {} "$APP2_S3_BUCKET/$DATE_TIME/$PUBLIC_IP/app2" --log-file=/var/log/rclone/logs.log \;
# Log the operation
echo "Copied MULTI PG logs to S3 on $DATE_TIME" >> /var/log/rclone/logs.log
Step 5: Make the Script Executable
Run the following command to make the script executable:
sudo chmod +x /usr/local/bin/log_backup.sh
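With the permissions in place, run the script once by hand and check that it exits cleanly before scheduling it:

sudo /usr/local/bin/log_backup.sh
echo $?    # prints the script's exit status; 0 indicates success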
Step 6: Schedule the Script Using Cron
To automate the script execution, add a cron job:
sudo crontab -e
Add the following line to run the script every day at midnight:
0 0 * * * /usr/local/bin/log_backup.sh
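If you also want to capture any output the script itself prints (outside rclone's log file), redirect stdout and stderr in the cron entry; the cron.log path below is just an example:

0 0 * * * /usr/local/bin/log_backup.sh >> /var/log/rclone/cron.log 2>&1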
Verification
To verify the script execution, check the log file:
tail -f /var/log/rclone/logs.log
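You can also list the bucket itself to confirm the files arrived (replace <bucket> with your bucket name, as in the script):

rclone ls "rcloneprodapi1:<bucket>/logs/paytring/backend" | head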
Conclusion
This script helps automate the backup of backend logs to S3 using rclone. It ensures that logs are moved or copied securely and maintains a log file for tracking operations.