
For simple web-scraping jobs I often prefer a PHP + MySQL stack, putting the project straight on the web and working online. But working online raises a problem: how do you back up your results? Previously, I would periodically copy all the files via FTP and export the database via phpMyAdmin by hand. That method was time-consuming and gave me no version control.
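That manual routine can be partly scripted. Below is a minimal sketch of the bookkeeping side: generating timestamped dump filenames and pruning old ones so the history does not grow forever. The names and retention policy are my own illustration, and the actual export step (piping mysqldump into the file) is only noted in a comment, since it assumes shell access to the database, which is exactly what the hosted service discussed below does not require.

```python
from datetime import datetime
from pathlib import Path

def backup_filename(db_name: str, when: datetime) -> str:
    """Timestamped dump name, e.g. mydb-20240102-030405.sql.
    Timestamps in the name sort chronologically as plain strings."""
    return f"{db_name}-{when:%Y%m%d-%H%M%S}.sql"

def prune_backups(backup_dir: Path, db_name: str, keep: int = 7) -> list[str]:
    """Delete all but the `keep` newest dumps; return the deleted names."""
    dumps = sorted(backup_dir.glob(f"{db_name}-*.sql"))
    stale = dumps[:-keep] if keep else dumps
    for f in stale:
        f.unlink()
    return [f.name for f in stale]

# The export itself would be something like (hypothetical credentials omitted):
#   with open(backup_dir / backup_filename("mydb", datetime.now()), "w") as fh:
#       subprocess.run(["mysqldump", "mydb"], stdout=fh, check=True)
```

Even with such a script you still have to run it somewhere and store the results off the server, which is the gap the service below fills.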

Now I use a nice web service called MySql Backup Online to back up all my small projects. You may ask how this service differs from other website-backup services. Two points stand out:

It can back up MySql databases via phpMyAdmin. This matters if you don't have direct access to your MySql databases over TCP/IP or SSH (or simply don't know how to set that up).

It has a free plan. This helps when your project is still too small to justify paying for backups. Interestingly, MySqlBackupOnline is the only such service offering a free plan (at least for now).

Despite its name, MySql Backup Online can back up not only MySql databases but also all the files on your web server over FTP. Once you configure it for your website, it backs up your databases and files once a day (you can still trigger a backup manually) and keeps a history of all changes, letting you restore to any point in the past.
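The "restore to any point in the past" idea boils down to keeping dated snapshots and picking the newest one taken at or before the moment you want to roll back to. Here is a minimal sketch of that selection logic; the snapshot layout is my own assumption for illustration, not the service's actual scheme:

```python
from datetime import datetime
from typing import Optional

def pick_snapshot(snapshots: list[datetime], target: datetime) -> Optional[datetime]:
    """Return the newest snapshot taken at or before `target`, if any."""
    eligible = [s for s in snapshots if s <= target]
    return max(eligible) if eligible else None

# With daily snapshots from Jan 1-3, asking for noon on Jan 2
# yields the Jan 2 snapshot; asking for before the first snapshot yields None.
snaps = [datetime(2024, 1, d) for d in (1, 2, 3)]
print(pick_snapshot(snaps, datetime(2024, 1, 2, 12)))
```

The same rule applies whether a snapshot is a database dump, a file archive, or both bundled together.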

To give you a taste, here is a screenshot of the website dashboard on MySql Backup Online: