Setup for automatic Magento database backup in Git

Well, any version control system is great, at least compared with none at all. But what's happening with your database? Today I'll show you how to create an automated solution for dumping a MySQL database and committing it into a Git repository.

Explanation and Prerequisites

Explanation

What I did was clone an existing Magento Git repository onto our development server, then create a cron task that dumps the current development database into the local repository, commits the changes, and pushes them back to the remote repository.
Prerequisites

First of all, I'll explain this using our development server, which runs on Linux. If you wish to follow this article, you'll need Git, MySQL, SSH, a remote Git repository (ours is at Beanstalk) and crontab access on it.

Setup

First, you need to set up your server to gain access to Git, so if you're not sure how, just follow this link.

Notes:

– Since our repository works with SSH keys, that's the way I've explained it here. If yours works with password authentication, your cron commands will differ from these.

– I've set up SSH keys without a passphrase, for simplicity.
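If you don't have keys yet, a minimal sketch with OpenSSH looks like this (the key path is the OpenSSH default; adjust it to your setup):

ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa
# generate an RSA key pair; -N "" sets an empty passphrase
cat ~/.ssh/id_rsa.pub
# print the public key, then paste it into your repository host's SSH key settings
ssh git@you.beanstalkapp.com
# optional: check that the key is accepted (the server may just print a message and disconnect)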

Depending on your repository, your Git URL might differ, but it should look like this:

git@you.beanstalkapp.com:/your_repository.git

Next, you need to clone your remote repository, like this:

cd path/to/your/local/repository/
# first, navigate to the folder in which you wish to clone the repository
git clone git@you.beanstalkapp.com:/your_repository.git
# the clone itself

The next thing you need to do is create your database dump and add it to the repository. You'll do that with these commands:

cd path/to/your/local/repository/your_repository/
# first, navigate to the folder with the repository clone if you haven't already
mysqldump -u [mysql_username] -p[mysql_password] --skip-extended-insert [mysql_database] > [name_of_the_dump].sql
# initial database dump
git add [name_of_the_dump].sql
# add the dump to the repository
git commit -m 'Initial database dump'
# commit the dump locally
git push origin master
# push the initial database dump to the remote repository, on the master branch
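If you want to double-check that the dump really landed in the repository, a couple of quick Git commands will do (using the same placeholder file name):

git log --oneline -- [name_of_the_dump].sql
# list the commits that touched the dump file
git show --stat HEAD
# show which files the latest commit changed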

The next thing is the creation of the cron task that will dump the database, and that is the final step of this HOW-TO post.

crontab -e
# start the editor that will allow you to edit the cron list for your user on the server
# and then enter the following cron job:
*/30 * * * * cd path/to/your/local/repository/your_repository/ && mysqldump -u [mysql_username] -p[mysql_password] --skip-extended-insert [mysql_database] > [name_of_the_dump].sql && git pull origin master && git commit -am 'Updating DB backup' && git push origin master > crontab-[name_of_the_dump].log 2>&1
# note that the cron job above must be a single line, without line break characters, and that it chains more than one command

A small explanation of the cron job itself is in order, so to break it apart:
*/30 * * * * – the cron schedule (here: every 30 minutes); you can read more about it here.
cd path/to/your/local/repository/your_repository/ – relative path from your SSH login directory (usually the user's "home") to your local Git repository on the development server
mysqldump -u [mysql_username] -p[mysql_password] --skip-extended-insert [mysql_database] > [name_of_the_dump].sql – dump the database with the --skip-extended-insert parameter, which writes one INSERT statement per row and improves the readability of each commit / push
git pull origin master – pull changes from the remote Git repository first, so the push below won't run into conflicts
git commit -am 'Updating DB backup' – make a commit with the new database dump
git push origin master > crontab-[name_of_the_dump].log 2>&1 – push the changes to the remote Git repository, redirecting the output into a log file
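One caveat with the one-liner: git commit exits with an error when the new dump is identical to the old one, which breaks the && chain. If that bothers you, an alternative is to move the work into a small wrapper script and point cron at it. This is only a sketch, using the same placeholder names as above:

#!/bin/bash
# backup-db.sh – a wrapper around the cron one-liner above

# go to the local repository clone; bail out if the path is wrong
cd path/to/your/local/repository/your_repository/ || exit 1

# dump the database, one INSERT per row for readable diffs
mysqldump -u [mysql_username] -p[mysql_password] --skip-extended-insert [mysql_database] > [name_of_the_dump].sql

# sync with the remote first so the push below won't conflict
git pull origin master

# commit and push only when the dump actually changed
if ! git diff --quiet -- [name_of_the_dump].sql; then
    git commit -am 'Updating DB backup'
    git push origin master
fi

The crontab entry then shrinks to:

*/30 * * * * path/to/backup-db.sh > crontab-[name_of_the_dump].log 2>&1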

And that’s all about the setup.

Conclusion

All of this shouldn't take more than 10-15 minutes once you get the hang of it. And in return, you have an automated file / database backup of your Magento project on the remote development server. I think it's worth the time, don't you?


7 comments

  1. @Jared – it's used to see differences clearly, nothing more. I'm aware it takes a bit more time than usual, but it's not intended to sync the local and remote databases; it's intended as a database backup, with different versions.

    So to conclude, it does the same thing as a regular mysqldump set as a cron task, except this way it allows you to see the diff, and is stored locally with the rest of your Git project, automatically.

    Hope you’ve got it!

    Mladen

  2. Since you are using "--skip-extended-insert" for the mysqldump so that each row has its own INSERT line (better for Git), do you then extract and run only the changes when syncing changes from the remote repository to your local/staging/production servers?

    --skip-extended-insert makes the restore very slow. I guess I'm wondering if you can use it to get only the changed rows and then turn them into UPDATE statements, perhaps, to avoid the slow restore process every time there is a small database change?

  3. Thanks for the recent posts on using Git, it's completely new to me and they're helping ease me into it.

    I would love a post detailing your workflow with Git and Magento. What do you include in your .gitignore? And how do you keep an up-to-date version of the production environment (including the database, extensions etc.) locally for development?

  4. That could explain it, yes. I've never understood the "each developer has their stuff locally" part in my life; I've always had replicated environments on a server and did automation on a different level.

  5. Hi,

    nice idea. I’m using something similar to create database dumps:

    I'm creating two files for each table: one containing the structure only (--no-data) and one containing the data. With some extra parameters (--no-create-db --no-create-info --skip-extended-insert --order-by-primary) you'll get files that are easily diffable. That enables you to see what actually changed between two dumps and will save a lot of space, because Git won't create copies of the files that didn't change.
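    For a single table that could look something like this ([table_name] is a placeholder, the other placeholders as in the post):

    mysqldump -u [mysql_username] -p[mysql_password] --no-data [mysql_database] [table_name] > [table_name].structure.sql
    # structure only
    mysqldump -u [mysql_username] -p[mysql_password] --no-create-db --no-create-info --skip-extended-insert --order-by-primary [mysql_database] [table_name] > [table_name].data.sql
    # data only, ordered by primary key with one INSERT per row, so the dumps diff cleanly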

  6. Since our projects are on Git, why wouldn't each of the developers on a project have an up-to-date version of the database locally (pulled with Git)? Together with the fact that there's an additional copy of everything on the remote server, and that it's all automated, that doesn't leave much space for the "it's unnecessary" scenario.

    That would be my point. 😀

    Cheers!

  7. Cool Git experiments, but can you explain the point of this procedure, or is it a "just because it's possible" thing?

    If you have set up your server for certificate access, you could use scp to do the same:

    scp SourceFile user@host:directory/TargetFile

    or rsync if you'd like to copy only the changed parts of the file:

    rsync -ave ssh source.server:/SourceFile /destination/dir

    video response is overkill 😀
