

I am not a PostgreSQL expert, still learning myself. I removed my answer. However, there is another question about using mysqldump with PostgreSQL posted by someone else: dba.stackexchange.com/q/9792/877. Hope it helps! – RolandoMySQLDBA Jan 3 '12 at 20:14


What's wrong with tools written in Java? And it might help if you described what exactly was "not working" with all of them. – a_horse_with_no_name Jan 3 '12 at 21:44

3 Answers

Migrating between two very different DBMS requires a lot more than just migrating the data. But migration of the data is usually the easiest part.

The way I have done it, which is free and which I can confirm works:

1. Create a MySQL schema-only dump.

2. Adjust the SQL using a text editor and a lot of search and replace.

3. Run the converted SQL in Postgres.

4. Create a plain-text data dump from MySQL (e.g. CSV, or some other delimited format).

5. Import the data using PostgreSQL's COPY command.
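The steps above can be sketched as a small shell session. This is only an illustration, not the original poster's procedure: the database and table names are made up, the server-dependent commands are left commented out, and a real dump will need far more search-and-replace rules than the three shown here.

```shell
# Step 1 (commented out, needs a running MySQL server):
#   mysqldump --no-data mydb > schema.sql

# Step 2: a few typical search-and-replace fixes, applied here to an
# inline sample instead of a real dump file.
sample='CREATE TABLE `users` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `name` varchar(100),
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;'

# Backticks -> double quotes, AUTO_INCREMENT int -> serial,
# drop the MySQL-only ENGINE clause.
printf '%s\n' "$sample" \
  | sed -e 's/`/"/g' \
        -e 's/int([0-9]*) NOT NULL AUTO_INCREMENT/serial/' \
        -e 's/ ENGINE=[^;]*//'

# Steps 3-5 (commented out, need a running PostgreSQL server):
#   psql -d mydb -f schema_converted.sql
#   psql -d mydb -c "\copy users FROM 'users.tsv'"
```

mysqldump's --no-data flag and psql's \copy are real options; the sed rules cover only the most common incompatibilities and will need extending for your own schema.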

Importing the data might actually be difficult if you relied on MySQL's behaviour of accepting illegal data (such as the 31st of February).
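One concrete case: MySQL's "zero dates" (0000-00-00) are rejected by PostgreSQL's COPY. A minimal pre-processing sketch, with a made-up file name and sample data, is to rewrite them as \N, which COPY's text format reads as NULL:

```shell
# Sample exported data (tab-separated); the real file would come
# from the MySQL dump.
printf '1\t0000-00-00\n2\t2011-12-31\n' > users.tsv

# Rewrite zero dates (and zero timestamps) as \N before \copy.
sed 's/0000-00-00\( 00:00:00\)\{0,1\}/\\N/g' users.tsv > users_clean.tsv
cat users_clean.tsv
```

Row 1's date becomes \N while the valid date in row 2 is left untouched.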

My guess is that this will be quicker than searching for a tool, evaluating a bunch of them, and then trying to understand the one you chose. But it depends on what kind of "big" you are referring to: if big means several hundred tables, this might not be feasible; if big only refers to the number of rows, this is probably the quickest way to do it.

There are some tools out there that can dump a database schema in a DBMS-independent (XML) format, such as Liquibase, SchemaSpy or WbSchemaReport, with Liquibase probably being the easiest to use. The others require some manual work writing or extending an XSLT to transform the generated XML.
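To give a feel for the DBMS-independent format, here is a rough sketch of what a Liquibase changelog for the earlier example table might look like (the table, column names and changeSet attributes are made-up illustrations); Liquibase translates the generic column types into each target DBMS's own types when the changelog is applied:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog">
  <changeSet id="1" author="migration">
    <createTable tableName="users">
      <column name="id" type="int" autoIncrement="true"/>
      <column name="name" type="varchar(100)"/>
    </createTable>
  </changeSet>
</databaseChangeLog>
```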

If you are using triggers and stored procedures in MySQL, I don't believe any automated tool can translate them without requiring major manual fixing afterwards; and even then, the generated procedures would probably not use any advanced features of the target DBMS.

I exported the schema and the data separately. Make sure that you use INSERTs instead of COPY. If there is no fancy stuff going on, the data should require (almost) no clean-up. Make sure everything is in UTF-8. With two separate files, the schema is more manageable and you don't have a 1 GB file to edit.
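Getting everything into UTF-8 can be done with iconv before loading the dump into PostgreSQL. A hedged sketch, with made-up file names and assuming the MySQL tables were in latin1 (use whatever character set your tables were actually created with):

```shell
# Simulate a latin1-encoded dump fragment ("café" with \351 = é in latin1).
printf 'caf\351\n' > data_latin1.sql

# Convert it to UTF-8 so PostgreSQL (with a UTF-8 database) accepts it.
iconv -f LATIN1 -t UTF-8 data_latin1.sql > data_utf8.sql
cat data_utf8.sql
```

If iconv reports an "illegal input sequence", the source encoding guess was wrong and you need to identify the real one first.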

When loading the schema, you get fairly detailed information about where an error is, so debugging should be pretty simple.

I am throwing this out there, having never tried it, but Tungsten Replicator may do what you want. It's primarily a replication tool, but I think it has a bootstrapping process that may help you at least with the data piece.