I have been running a couple of instances of Teslamate - one locally on a server at home and another in Azure in an Ubuntu VM (see 👉 this blog post for details). For the local instance, I back up the data to a NAS and then to an offsite backup. For the Azure instance, I run backups several times a day and copy the data to OneDrive. This gives me a data backup in case the VM crashes or the data gets corrupted.
Work and AI have kept me busy, and I have not had a chance to write about it until now. I have a ton of storage on OneDrive, which seemed the most logical place to store the backups. Multiple options are available to back up data to OneDrive; the one that has been working consistently and well for me is `rclone`.
What is Rclone?
Rclone is an open-source command-line tool for managing files across various cloud storage services. It supports over 70 providers, including Google Drive, Amazon S3, Dropbox, and Microsoft OneDrive. Rclone allows users to perform operations like copying, moving, syncing, and deleting files, as well as more advanced tasks such as encryption, compression, and chunking. Additionally, it can mount cloud storage as a local drive, facilitating seamless file access and management.
Known as the “Swiss army knife of cloud storage,” Rclone is highly versatile and can be integrated into scripts for automating complex file operations. Its ability to act as a bridge between different cloud providers makes it an invaluable tool for efficiently transferring files and managing cloud storage environments.
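As a small taste of what that looks like in practice, here are a few common rclone operations. These are illustrative only; the remote name `onedrive` and the paths are placeholders:
# Copy a local directory to the remote (only transfers new or changed files)
rclone copy /home/amit/data onedrive:backups/data
# Make the remote an exact mirror of a local directory (deletes extras on the remote)
rclone sync /home/amit/data onedrive:backups/data
# Recursively list the files on the remote
rclone ls onedrive:backups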
Step 1: Setting up Rclone
The base assumption is that you have a working instance of Teslamate running in a docker container on a Linux machine. Rclone is also available for Windows and Mac, and you can follow the [instructions](https://rclone.org/install/) to install it on those platforms. First, let's install `rclone` on your system. You can download and install it using the following command:
curl https://rclone.org/install.sh | sudo bash
Step 1.1 - Configuring Rclone with OneDrive ☁️
After installing `rclone`, we need to configure it to use OneDrive. Rclone has detailed steps on how to configure OneDrive; I show some of them here. Run the following command to start the configuration process:
rclone config
You’ll be prompted with several questions about setting up a new remote. Here is an example configuration for OneDrive:
No remotes found - make a new one
n) New remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
n/r/c/s/q> n
name> onedrive
Type of storage to configure.
Choose a number from below, or type in your own value.
...
24 / Microsoft OneDrive
\ "onedrive"
...
Storage> 24
Microsoft App Client Id
Leave blank normally.
client_id>
Microsoft App Client Secret
Leave blank normally.
client_secret>
Edit advanced config? (y/n)
y) Yes
n) No
y/n> n
Use auto config?
* Say Y if not sure
* Say N if you are working on a remote or headless machine
y) Yes
n) No
y/n> y
For the last prompt (auto config), if you are on a headless machine (e.g., you SSH'd into a server to set this up), you can say n, and rclone will give you a URL to open in a browser to authenticate, as these remotes require OAuth2. On the host from which you are connected, you can authenticate and get the token, which you can paste into the terminal window. If the browser does not open, you can open your local browser and navigate to http://127.0.0.1:53682/auth.
If you are on SSH, you might need to create an SSH tunnel over port 53682 to your local machine using the following command:
ssh -L localhost:53682:localhost:53682 username@remote_server
For more details on how to do this, refer to the rclone remote setup documentation.
It is important to remember the name you give to the remote, as you will need it in the backup script later. In our example, we called it `onedrive`.
Also, note that the `rclone` configuration is stored in the `.config` directory in the user's home directory. The configuration file is named `rclone.conf` and contains the details of the remotes you have configured. You can view the configuration file by running:
cat ~/.config/rclone/rclone.conf
To test the configuration, you can run the following command:
rclone ls onedrive:
This should list the files and directories in your OneDrive account. If you see the list of files, the configuration is successful. If you encounter any errors, review the configuration and ensure the authentication details are correct.
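A couple of other handy sanity checks, using the same remote name:
# List only the top-level directories (quicker than a full recursive listing)
rclone lsd onedrive:
# Show the storage quota OneDrive reports for the account
rclone about onedrive: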
Step 1.2 - Mounting Remote Directory with Rclone 📁
Now that we have configured `rclone` with OneDrive, we can mount the remote directory to a local directory on the system. This allows us to access the remote files as if they were local files. Note that this is not a regular mount: the rclone mount is not a real drive but a FUSE-based mount. To mount the remote directory, use the following command:
rclone mount --config=/home/amit/.config/rclone/rclone.conf --vfs-cache-mode full --allow-non-empty --allow-other --daemon --dir-perms 0777 car_backup:car_data2 /home/amit/onedrive
Let’s break down the command:
- Replace `/home/amit/.config/rclone/rclone.conf` with the path to your rclone configuration file.
- Replace `car_backup:car_data2` with the remote name (configured in Step 1.1) and the directory you want to mount.
- The local directory `/home/amit/onedrive` is where the remote directory will be mounted (you might need to create a new folder for this). You can choose any local directory for this purpose.
- The `--daemon` flag runs the mount in the background, and the `--dir-perms 0777` flag sets the directory permissions.
- The `--allow-non-empty` flag allows mounting over a non-empty directory.
- And finally, the `--allow-other` flag allows other users to access the mounted directory.
Once the command is executed, you should see the remote directory mounted to the local directory, and you can access the files in the remote directory as local files. For example, you can list the files in the mounted directory using the `ls` command.
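A couple of quick checks, assuming the example paths above:
# List the contents of the mounted directory
ls -la /home/amit/onedrive
# Confirm the FUSE mount is active
mount | grep rclone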
Step 1.3 - Persistent Mounting with Rclone 💾
It is important to note that the `rclone mount` command is not persistent across reboots, so if your VM reboots for some reason, you will lose the mount. To make the mount persistent, you can add the command to the system's startup scripts or use a tool like `systemd` to manage the mount as a service. We will create a `systemd` service file to manage the mount.
To do this, create a new service file using the following command:
sudo nano /etc/systemd/system/rclonemount.service
Add the following content to the file:
[Unit]
Description=rclonemount
AssertPathIsDirectory=/home/amit/onedrive
After=network-online.target
Wants=network-online.target
[Service]
Type=simple
User=amit
Group=amit
ExecStartPre=/bin/sleep 10
ExecStartPre=/bin/mkdir -p /home/amit/onedrive
ExecStartPre=/bin/chmod 777 /home/amit/onedrive
ExecStart=/usr/bin/rclone mount \
--config=/home/amit/.config/rclone/rclone.conf \
--vfs-cache-mode full \
--allow-other \
--dir-perms 0777 \
car_backup:/car_data2 /home/amit/onedrive
ExecStop=/bin/fusermount -uz /home/amit/onedrive
Restart=always
RestartSec=10
Environment=HOME=/home/amit
Environment=USER=amit
[Install]
WantedBy=default.target
In the above service file, the `ExecStartPre` commands create the directory and set the permissions before mounting the remote directory. The `ExecStart` command mounts the remote directory to the local directory. The `ExecStop` command unmounts the directory when the service is stopped. The `Restart` and `RestartSec` settings ensure the service is restarted in case of failure.
After creating the service file, reload the systemd daemon using the following command:
sudo systemctl daemon-reload
To test the service, start it using the following command:
sudo systemctl start rclonemount
And check the status of the service to ensure it is running:
sudo systemctl status rclonemount
If everything is working correctly, you should see the status of the service as `active (running)`.
Note: Press Q to quit.
And finally, if the service is running without any errors, enable it to start at boot:
sudo systemctl enable rclonemount
Of course, the service can be tested by rebooting the system and checking if the remote directory is mounted automatically.
Yay! Congratulations! You have successfully configured `rclone` to mount the remote directory as a service. The remote directory will now be mounted automatically on system startup. 👍
Step 2 - Basic Backup Script 🛠️
Now that we have `rclone` set up and mounted, we can create a basic backup script to back up the Teslamate data to OneDrive. The script will perform the following steps:
- Back up the Teslamate database.
- Compress the backed-up file.
- Move the compressed file to the remote backup directory using `rclone`.
This will be a simple bash script that can be run manually or scheduled using a cron job. Here's an example of the basic backup script. Create a new file named `backup.sh` and add the following content to it:
#!/bin/bash
# Basic backup script
SOURCEDIR=/home/amit/teslamate # Source directory where the data is located
BACKUPDIR=/home/amit/onedrive # Remote directory mounted via rclone
FILENAME=teslamate-$(date +%Y-%m-%d-%H%M%S).bak.gz # Date-time stamp filename
# Change to the source directory so docker compose can find the project
cd $SOURCEDIR || exit 1
# Perform the database dump and compress it
docker compose exec -T database pg_dump -U teslamate teslamate | gzip > $FILENAME
# Move the compressed file to the remote backup directory
rclone move $FILENAME $BACKUPDIR
You will need to replace `SOURCEDIR` and `BACKUPDIR` with the actual directories on your system. The `FILENAME` variable creates a unique filename for the backup file using the current date and time. The script uses `pg_dump` to back up the Teslamate database and `rclone move` to move the compressed file to the remote directory.
This script assumes the following:
- You have Teslamate running in a docker container, and the database is named `teslamate`.
- You have `rclone` configured with OneDrive and mounted the remote directory to a local directory.
- You have the necessary permissions to access the source and destination directories.
Make the script executable:
chmod +x backup.sh
Run the script manually to test it:
./backup.sh
You should see the backup file in the remote directory if the script runs successfully. You can check the remote directory using the `ls` command or by navigating to it.
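For example, assuming the mount and remote names used earlier in this post:
# Check via the local mount
ls -lh /home/amit/onedrive
# Or check the remote directly
rclone ls car_backup:car_data2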
Step 3 - Adding Logging 🗒️
To make the script more informative, let’s add logging. This will help us keep track of the backup process and identify any issues. We will use the same script as above as our starting point and add logging to it. We will log messages to a log file and display them on the console. Here’s the updated script with logging:
#!/bin/bash
# Backup script with logging
SOURCEDIR=/home/amit/teslamate # Source directory where the data is located
BACKUPDIR=/home/amit/onedrive # Remote directory mounted via rclone
FILENAME=teslamate-$(date +%Y-%m-%d-%H%M%S).bak.gz # Date-time stamp filename
LOGFILE=/home/amit/backup.log
# Function to log messages
log() {
echo "$(date '+%Y-%m-%d %H:%M:%S') - $1" | tee -a $LOGFILE
}
# Start of the script
log "Starting backup script."
# Change to the source directory so docker compose can find the project
cd $SOURCEDIR
if [ $? -eq 0 ]; then
log "Changed directory to $SOURCEDIR"
else
log "Failed to change directory to $SOURCEDIR"
exit 1
fi
# Perform the database dump and compress it
docker compose exec -T database pg_dump -U teslamate teslamate | gzip > $FILENAME
if [ $? -eq 0 ]; then
log "Database dump and compression successful: $FILENAME"
else
log "Database dump and compression failed"
exit 1
fi
# Move the compressed file to the remote backup directory
rclone move $FILENAME $BACKUPDIR
if [ $? -eq 0 ]; then
log "Successfully moved backup to $BACKUPDIR"
else
log "Failed to move backup to $BACKUPDIR"
exit 1
fi
log "Backup script completed successfully."
As in the earlier step, run the script manually to test it and check the log file for messages.
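For example, with the log path defined in the script:
# Run the script
./backup.sh
# Review the messages it logged
tail /home/amit/backup.log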
Step 4 - Adding Email Notifications 📧
Finally, we want to be notified about the backup status, so we will add email notifications to the script. We'll use `ssmtp` for this purpose. This simple mail transfer agent can send email notifications from the command line. We'll update the script to send an email notification after the backup is completed.
To send email notifications, you need to have an SMTP server configured. You can use your email provider's SMTP server for this purpose. Our first step is to set up `ssmtp` for email notifications.
Step 4.1 - Installing and Configuring ssmtp
Install `ssmtp` on your system:
sudo apt-get install ssmtp
Edit the `/etc/ssmtp/ssmtp.conf` file to add your SMTP server details:
sudo nano /etc/ssmtp/ssmtp.conf
Add the following lines (a typical ssmtp setup), replacing the placeholders with your actual details:
root=your-email@example.com
mailhub=smtp.your-provider.com:587
AuthUser=your-email@example.com
AuthPass=your-password
UseSTARTTLS=YES
FromLineOverride=YES
Edit or create the `/etc/ssmtp/revaliases` file:
sudo nano /etc/ssmtp/revaliases
Add the following line, which maps the local user to the outgoing email address and mail hub, replacing the placeholders with your actual details:
amit:your-email@example.com:smtp.your-provider.com:587
To test the email configuration, we can create a simple script called `test_email.sh` that sends a test email:
#!/bin/bash
# Test email script
FROM_NAME="Teslamate Server"
FROM_EMAIL="the-from-email-address"
EMAIL="email-address-where-you-want-to-send-the-email"
SUBJECT="Test Email"
BODY="This is a test email to verify the SMTP configuration."
echo -e "To: $EMAIL\nFrom: \"$FROM_NAME\" <$FROM_EMAIL>\nSubject: $SUBJECT\n\n$BODY" | ssmtp $EMAIL
You should receive a test email at the specified address when you run the script. If you encounter any issues, review the configuration and ensure the SMTP server details are correct.
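For completeness, making the test script executable and running it looks like this:
chmod +x test_email.sh
./test_email.sh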
Step 4.2 - Timeout for Email Notifications
We can add a timeout to the email command to prevent the script from hanging indefinitely if the email-sending process takes too long. We’ll use the timeout
command to set the duration of the email sending. First, we must check if the timeout
command is available on your system. You can check this by running the following command:
timeout --version
If the `timeout` command is not available, you can install it using the following command:
sudo apt-get install coreutils
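As a quick illustration of how `timeout` behaves (this is just a demo, not part of the backup script): it kills the command once the limit is hit and exits with code 124, which the updated backup script checks for below:
# Kill the command if it runs longer than 2 seconds
timeout 2 sleep 10
# Prints 124, the exit code timeout returns when the limit is hit
echo $?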
Step 4.3 - Updating the Backup Script with Email Notifications
Now that we have `ssmtp` set up for email notifications, we can update the backup script to send an email notification after the backup is completed. We'll add a function to send email notifications and call it at the end of the script. Here's the updated script that includes email notifications:
#!/bin/bash
# Backup script with logging and email notifications
SOURCEDIR=/home/amit/teslamate # Source directory where the data is located
BACKUPDIR=/home/amit/onedrive # Remote directory mounted via rclone
FILENAME=teslamate-$(date +%Y-%m-%d-%H%M%S).bak.gz # Date-time stamp filename
LOGFILE=/home/amit/backup.log
EMAIL="email-address-you-want-to-send-the-email-to" # Replace with your email address
FROM_NAME="Teslamate Server"
FROM_EMAIL="[email protected]" # Replace with the authorized sender email address
TIMEOUT_DURATION=10 # Set timeout duration (in seconds) for sending email
RCLONE_CONFIG="/home/amit/.config/rclone/rclone.conf" # Path to the rclone config file
# Initialize a variable to capture the entire log
FULL_LOG=""
# Function to log messages
log() {
echo "$(date '+%Y-%m-%d %H:%M:%S') - $1" | tee -a $LOGFILE
FULL_LOG+="$1\n"
}
# Function to send email notification
send_email() {
local STATUS="$1"
local SUBJECT="Backup ${STATUS} - $(date '+%Y-%m-%d %H:%M:%S')"
local BODY="The backup script ${STATUS} on $(date).\n\nLog details:\n$FULL_LOG"
echo -e "To: $EMAIL\nFrom: \"$FROM_NAME\" <$FROM_EMAIL>\nSubject: $SUBJECT\n\n$BODY" | timeout $TIMEOUT_DURATION ssmtp $EMAIL
EMAIL_EXIT_CODE=$?
log "Email command exit code: $EMAIL_EXIT_CODE"
if [ $EMAIL_EXIT_CODE -eq 124 ]; then
log "Email sending timed out after $TIMEOUT_DURATION seconds"
elif [ $EMAIL_EXIT_CODE -ne 0 ]; then
log "Failed to send email"
else
log "Email sent successfully"
fi
}
# Start of the script
log "Starting backup script."
# Change to the source directory
cd $SOURCEDIR
if [ $? -eq 0 ]; then
log "Changed directory to $SOURCEDIR"
else
log "Failed to change directory to $SOURCEDIR"
send_email "Failed"
exit 1
fi
# Perform the database dump and compress it
docker compose exec -T database pg_dump -U teslamate teslamate | gzip > $FILENAME
if [ $? -eq 0 ]; then
log "Database dump and compression successful: $FILENAME"
else
log "Database dump and compression failed"
send_email "Failed"
exit 1
fi
# Move the compressed file to the remote backup directory
rclone --config $RCLONE_CONFIG move $FILENAME $BACKUPDIR
if [ $? -eq 0 ]; then
log "Successfully moved backup to $BACKUPDIR"
else
log "Failed to move backup to $BACKUPDIR"
send_email "Failed"
exit 1
fi
log "Backup script completed successfully."
# Send email notification
send_email "Completed Successfully"
If you run the script, you should receive an email notification after the backup completes. The email will contain the log details of the backup process. You can customize the email content and format as needed.
Step 4.4 - Scheduling the Backup Script
You can schedule the script to run regularly using a `cron` job to automate the backup process. You can create a `cron` job to run the script daily, weekly, or at any interval you prefer. Here's an example of a cron job to run the script daily:
0 0 * * * /home/amit/backup.sh
This cron job runs the `backup.sh` script every day at midnight. You can customize the schedule based on your requirements. To add the cron job, open the crontab file using the following command:
crontab -e
Add the cron job to the file and save it. The cron job will now run the backup script at the specified interval.
I have the script running at 5 AM, 1 PM, and 9 PM daily. In addition, I log the output of the cron job to a file (`backup_cron.log`) so I can review it later if there is an issue. Here are the crontab entries to set this up:
0 5 * * * /home/amit/backup.sh >> /home/amit/backup_cron.log 2>&1
0 13 * * * /home/amit/backup.sh >> /home/amit/backup_cron.log 2>&1
0 21 * * * /home/amit/backup.sh >> /home/amit/backup_cron.log 2>&1
Be sure to replace `/home/amit/backup.sh` with the actual path to your backup script. The `>> /home/amit/backup_cron.log 2>&1` part of the cron job redirects the script output to the specified log file. This allows you to capture the script output and any errors during execution.
Finally, you can verify the cron jobs are registered using the following command:
crontab -l
This will list the cron jobs that are currently scheduled. You can also check the cron job logs to verify that the script is running as expected.
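On Ubuntu, you can also confirm that cron actually fired the jobs by checking the system log (the log location may vary by distribution):
# Show recent cron activity from the system log
grep CRON /var/log/syslog | tail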
Here is a snippet of the `backup_cron.log` file from my system:
2024-07-05 13:00:01 - Starting backup script.
2024-07-05 13:00:01 - Changed directory to /home/amit/teslamate
time="2024-07-05T13:00:01-07:00" level=warning msg="/home/amit/teslamate/docker-compose.yml: `version` is obsolete"
2024-07-05 13:00:30 - Database dump successful: teslamate-2024-07-05-130001.bak
2024-07-05 13:00:55 - Successfully compressed teslamate-2024-07-05-130001.bak.gz
2024-07-05 13:00:57 - Successfully moved backup to /home/amit/onedrive
2024-07-05 13:00:57 - Backup script completed successfully.
2024-07-05 13:00:59 - Email command exit code: 0
2024-07-05 13:00:59 - Email sent successfully
Step 4.5 - Timezone Configuration
Note that in most cloud-based VMs, the timezone is set to UTC. Since cron uses the system timezone, if you specify schedule times in your local timezone, you might want to set the VM's timezone to match. You can do this by running the following command:
sudo timedatectl set-timezone your-timezone
Replace `your-timezone` with the appropriate timezone (e.g., `America/Los_Angeles`).
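If you are unsure of the exact timezone name, you can search the list that `timedatectl` knows about:
# List available timezones and filter by region
timedatectl list-timezones | grep America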
Conclusion
And that’s it! You now have a backup script that backs up your Teslamate data to OneDrive and sends email notifications about the backup status. The script runs automatically at the scheduled intervals, ensuring your data is backed up regularly. You can customize the script further based on your requirements and preferences. 🫰