We have been working with a client whose website is based on SocialEngine, with fairly heavy user traffic and lots of data. Since the site is popular and its data changes daily, we have to take a backup every day in case something happens and we need to restore it. Storing the backups on a local server requires extra disk space, which is a costly approach in terms of hardware. So we explored other options for storing the data securely, efficiently, and relatively cheaply.

We researched and found that Amazon S3 best fits our requirements. S3 storage costs much less than storing the data on a local SSD and requires no maintenance. We could not find a third-party plugin that could handle taking backups of our website; there were a few backup plugins, but they failed to take backups properly.

Amazon S3 stores data as secure objects. It lets you preserve, retrieve, and restore every version of every object in an Amazon S3 bucket, so you can easily recover if something is accidentally deleted by users or lost to an application failure. With Amazon S3 you pay only for the storage you actually use; there is no minimum fee and no setup cost. So we decided to use this option to store data for the website.

The best thing about using S3 is that AWS provides command-line tooling that takes care of storing the data in the relevant bucket. You just run the command in a shell script and the data is stored in the S3 bucket without any hassle. We are sharing the script on GitHub so that you can use it directly for your SocialEngine website.

Steps to back up a website based on SocialEngine

We are going to go through the steps of taking a database backup in detail. If you prefer, you can skip the next steps and download the script for your website directly, though we would encourage you to read the full article.

Step 1. Configure s3cmd:

During configuration you will be asked for two keys (the Access key and Secret key are your identifiers for Amazon S3). Copy and paste them from your confirmation email or from your Amazon account page.

They are case sensitive and must be entered accurately, or you will keep getting errors about invalid signatures or similar.

You can optionally enter a GPG encryption key that will be used for encrypting your files before sending them to Amazon. Using GPG encryption protects your data against being read by Amazon staff or by anyone who gains access to your files while they are stored at Amazon S3.
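To illustrate what that encryption step amounts to, here is a hedged sketch of symmetrically encrypting a dump file with GPG by hand (s3cmd does the equivalent for you when a GPG passphrase is configured). The file name and passphrase below are placeholders, not values from this article:

```shell
# Hypothetical example: symmetrically encrypting a dump file with GPG
# before upload. The file name and passphrase are placeholders.
DUMP=/tmp/example-dump.sql
echo "-- demo dump contents" > "$DUMP"

if command -v gpg >/dev/null 2>&1; then
    # --symmetric encrypts with a passphrase; --batch/--yes avoid prompts
    gpg --batch --yes --symmetric --passphrase 'changeme' -o "$DUMP.gpg" "$DUMP"
    echo "encrypted to $DUMP.gpg"
else
    echo "gpg not installed; skipping encryption"
fi
```

Anyone restoring such a backup would need the same passphrase to decrypt it, so store it somewhere safe.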

Other advanced settings can be changed, if needed, by editing the config file (~/.s3cfg) manually; it also stores the default values s3cmd uses.
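The configuration step above can be sketched as follows; `s3cmd --configure` is interactive, so it is shown commented out, and the bucket name is a placeholder assumption:

```shell
# Typical s3cmd setup flow (a sketch; s3cmd must be installed first,
# e.g. via your distribution's package manager). The bucket name is a
# placeholder assumption.
BUCKET="s3://my-site-backups"

# --configure is interactive, so it is shown commented out here:
#   s3cmd --configure    # prompts for the Access Key, Secret Key, and an
#                        # optional GPG passphrase, then saves ~/.s3cfg
#
# Afterwards, a quick check that the keys work:
#   s3cmd ls "$BUCKET"
echo "configure s3cmd, then verify access to $BUCKET"
```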

Step 2. MySQL Database Backup Script for S3:

This is the shell script used to back up your database and upload it to S3.

The idea is to create the following script and run it with the appropriate environment variables. What it does is actually pretty simple: it uses mysqldump to dump the database to a temporary file, then uploads that file to S3 using the AWS CLI.
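As a hedged sketch of that idea, the following stands in for dbbackup.sh. The database name, credentials, and bucket are placeholder assumptions (override them via environment variables), and the actual dump and upload are gated behind RUN_BACKUP=1 so the sketch is safe to run as-is:

```shell
#!/bin/bash
# Sketch of a dbbackup.sh-style script: mysqldump to a temporary file,
# then upload to S3 with the AWS CLI. All names below are placeholders.
set -euo pipefail

DB_NAME="${DB_NAME:-socialengine}"               # database to dump (assumption)
DB_USER="${DB_USER:-backup}"                     # MySQL user (assumption)
DB_PASS="${DB_PASS:-changeme}"                   # password (assumption)
S3_BUCKET="${S3_BUCKET:-s3://my-site-backups}"   # destination bucket (assumption)

STAMP="$(date +%F)"
DUMP_FILE="/tmp/${DB_NAME}-${STAMP}.sql.gz"

# Set RUN_BACKUP=1 to actually perform the dump and upload; off by
# default so the sketch runs without MySQL or AWS access.
if [ "${RUN_BACKUP:-0}" = "1" ]; then
    mysqldump -u "$DB_USER" -p"$DB_PASS" "$DB_NAME" | gzip > "$DUMP_FILE"
    aws s3 cp "$DUMP_FILE" "${S3_BUCKET}/"
    rm -f "$DUMP_FILE"
fi

echo "dump target: $DUMP_FILE"
```

Once the AWS CLI is configured with your keys, a real run would look like `RUN_BACKUP=1 DB_PASS=secret bash dbbackup.sh`.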

Step 3. Let’s run the script now:

# chmod +x dbbackup.sh
# Run the script to make sure it's all good
# bash dbbackup.sh

Backup script output:

Step 4. Schedule It With Crontab:

Assuming the backup script is stored in the /opt/scripts directory, we need to add a crontab entry to run it automatically on a weekly basis:

So we have to edit the crontab file:

# vim /etc/crontab

Add the following line to run the database backup script every Sunday at 00:00 (note that entries in /etc/crontab require a user field, here root):

0 0 * * 0 root bash /opt/scripts/mysqlbackupS3.sh >/dev/null 2>&1
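As an alternative to editing /etc/crontab, the same job can be installed in the invoking user's personal crontab, which takes no user column. This is an assumption-based sketch using the paths from this article:

```shell
# Alternative: install the job in the current user's crontab instead of
# /etc/crontab (per-user entries have no user column).
JOB='0 0 * * 0 bash /opt/scripts/mysqlbackupS3.sh >/dev/null 2>&1'

if command -v crontab >/dev/null 2>&1; then
    # Append to any existing entries rather than overwriting them.
    ( crontab -l 2>/dev/null; echo "$JOB" ) | crontab -
    crontab -l | grep mysqlbackupS3
else
    echo "crontab not available; add the line by hand"
fi
```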

Conclusion

In this article, we have explained how to automate the MySQL backup process directly to an AWS S3 bucket. Just configure and schedule the script on your MySQL server and you are done; the cron job then runs the backup regularly according to your pre-configured schedule. Feel free to ask any questions; we would be glad to hear from you.
