I've been working away at a medium-size WordPress site. So far I've just been hosting the site on my local machine and showing it to internal consultants over our local network. Things are working well, and now it's time to show it off to the client. I've been using git all along, so pushing it to the dev server was a breeze. I duplicated the local DB and pushed it to the dev server manually, which was fairly easy except that I had to manually change a few URL entries.
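For reference, the URL entries in question live in the wp_options table. A hedged sketch of that manual fix, assuming the default `wp_` table prefix, a database named `wp_dev`, credentials in `~/.my.cnf`, and a made-up dev URL:

```shell
# WordPress keeps the site's base URLs in two wp_options rows:
# option_name = 'siteurl' and option_name = 'home'
mysql wp_dev -e "UPDATE wp_options \
  SET option_value = 'https://dev.example.com' \
  WHERE option_name IN ('siteurl', 'home');"
```

URLs embedded in post content and serialized settings are not covered by this and need a separate pass.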

Now my question is: what's the best way to keep two instances of WordPress synced? I still have more work to do locally and the DB is going to need to get pushed up again. How do other people manage this in an automated way?

One thought I had was to write a git hook to pull the MySQL data down when I push from my local machine and have a hook on the other end to import the data when the dev server pulls. However, if I do this I'll have to worry about changes to the wp_options table.
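A minimal sketch of that hook idea. The hook paths are the standard git ones, but the database name `wp_dev`, the dump location `db/wp_dev.sql`, and credentials stored in `~/.my.cnf` are all assumptions for illustration:

```shell
# local machine, .git/hooks/pre-push (must be executable):
# dump the local DB into the repo so it travels with the code
mysqldump wp_dev > db/wp_dev.sql

# dev server, .git/hooks/post-merge (runs after a successful git pull):
# re-import whatever dump the pull brought down
mysql wp_dev < db/wp_dev.sql
```

One caveat: a pre-push hook fires after the push's commits are already fixed, so in practice you would dump and commit the SQL file before pushing (e.g. from a pre-commit hook or a small deploy script).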

I guess this is probably my best option, then. I was hoping to avoid doing something like this, but it's what I'm reading everywhere else as well. The only thing I would change is step 3, where I would use mysqlimport so that I could tie all the steps into a git hook and have them happen automatically on a pull. Thank you for your help!
– Gavin Anderegg, Mar 16 '11 at 23:49

As mentioned in the Codex, simply performing a search-and-replace on the database dump can lead to problems with serialized data. Much better to use a script, such as the one by interconnect/it, or a plugin based on it such as "Better Search Replace".
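To see why, PHP's serialize() records each string's byte length in the data itself, so a plain text replacement leaves stale length prefixes behind. A quick illustration (both URLs are made up):

```shell
# wp_options often stores serialized values like s:23:"http://localhost/mysite";
# where 23 is the byte length of the string that follows.
# A naive sed swap changes the URL but not the s:23 prefix:
echo 's:23:"http://localhost/mysite";' \
  | sed 's|http://localhost/mysite|https://dev.example.com/mysite|'
# prints: s:23:"https://dev.example.com/mysite";
# the new URL is 30 bytes, so PHP's unserialize() rejects the result
```

Serialization-aware tools re-count those lengths after replacing, which is why they are safe where sed on a dump is not.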
– Yosi Mor, Jan 15 at 19:50

Ultimately, keeping the database in sync is the issue most people struggle with. I simply don't worry much about it. I'll import some posts/CPTs if we have a new content type, but that's about it; the rest I do manually.