Transfer data from PostgreSQL to MySQL

Hello, is there any method to transfer the table layout and data from a Postgres database to MySQL automatically?
I have to migrate the schema and the data to MySQL.

The easiest would probably be to export the database (schema & data) as SQL using Postgres' pg_dump utility, then import the resulting SQL file into MySQL.
It's possible that there will be some incompatibilities in the intermediate SQL, but you can almost assuredly take care of these with a find/replace in your favorite text editor.
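A minimal sketch of that approach (the database name is a placeholder, and the dump will likely need hand-editing before MySQL accepts it):
shell> pg_dump --no-owner --no-privileges mydb > mydb.sql
shell> # fix type/syntax incompatibilities in mydb.sql, then:
shell> mysql -u root -p mydb < mydb.sql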

It is possible using the "Data Transfer" feature of Navicat Premium. It will not preserve foreign keys, but the data transfers correctly once the incompatibilities between the two databases are resolved.

Related

How to import a MySQLdump into a PostgreSQL database?

I have a MySQL dump generated by phpMyAdmin, and I need to import it into a PostgreSQL database, but I don't know if it's even possible. I've seen people recommending pgloader, but it seems a little confusing how to use it. Also, I'm on Windows, if that's relevant at all.
I only need the tables, so I'm not concerned about the data in the old or in the new database.
It's not that big either, only 84 tables, but big enough that I don't want to rewrite it by hand.
Thank you!
You can use pgloader.
You need to install it, and then run it against a simple command file (script.lisp; the .lisp extension is just a convention) containing the following three lines:
/* content of script.lisp */
LOAD DATABASE
FROM mysql://dbuser@localhost/dbname
INTO postgresql://dbuser@localhost/dbname;
Then run this in the terminal:
pgloader script.lisp
And after that, your PostgreSQL DB will have all of the information that you had in your MySQL DB.
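Newer pgloader versions also accept the source and target connection strings directly on the command line, so the command file is optional:
pgloader mysql://dbuser@localhost/dbname postgresql://dbuser@localhost/dbname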
There are a lot of resources to do this. None of them are simple.
pgloader which you mentioned is among the many tools listed on this page: https://wiki.postgresql.org/wiki/Converting_from_other_Databases_to_PostgreSQL#MySQL
mysqldump is supposed to have a --compatible=postgres option, but don't rely on it. It makes some changes to its output, but not enough to be fully compatible with PostgreSQL syntax.
Another option is to dump each table to a delimited file with mysqldump --tab (which writes one data file per table) instead of dumping to SQL format. Then you can bulk-load the files one by one with the COPY statement in PostgreSQL.
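A rough sketch of that approach (paths and names are placeholders; --tab requires that the MySQL server can write to the target directory, and the matching tables must already exist in PostgreSQL):
shell> mysqldump --tab=/tmp/dump --fields-terminated-by=',' dbname
shell> psql dbname -c "\copy tablename FROM '/tmp/dump/tablename.txt' WITH (FORMAT csv)"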
If your MySQL database contains views or stored routines (procedures, functions, triggers, or events), those generally can't be converted by any tool. The PostgreSQL language for stored routines is too different from MySQL's. You'll just have to start over and write routines in PostgreSQL that implement equivalent logic in a way that's idiomatic for PostgreSQL.

How to combine 2 databases (db files) into one database file?

I have 2 db files, la.db and lb.db. I want a single db file, like final.db, which would combine both the la.db and lb.db databases. I am using SQLite.
Can anybody help me?
Yes. You can use a native client in Ubuntu (or the equivalent on other OSes) to export both databases as SQL dumps and reimport them into a new one, or you can simply write a small routine/program in C# that uses SQLite directly, does the SELECTs and INSERTs, and handles exceptions.
It depends on what data types you have in the SQLite databases.
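If the tables in the two files don't conflict, the simplest route is SQLite's own ATTACH mechanism: copy la.db to final.db, attach lb.db to it, and copy the rows over table by table. A minimal sketch, assuming a table named mytable (a placeholder) exists in both files:
shell> cp la.db final.db
shell> sqlite3 final.db
sqlite> ATTACH 'lb.db' AS lb;
sqlite> INSERT INTO mytable SELECT * FROM lb.mytable;
sqlite> DETACH lb;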

Combine several MSSQL databases into one MySQL database with PHP

We are handling a data aggregation project that combines several Microsoft SQL Server databases into one MySQL database. All the MSSQL databases have the same schema.
The requirements are:
each MSSQL database can be imported into MySQL independently
before a record can be imported into MySQL, we need to validate it against specific criteria via PHP
each imported MSSQL database can be rolled back; that is, even after it has been imported, each MSSQL database's data can be removed from MySQL
we still need to know which MSSQL database each record imported into MySQL came from
All import processing will be done with PHP.
We are having difficulty in many respects and don't know the best approach to solve our problem.
Your help will be highly appreciated.
PS: each MSSQL database has around 60 tables, and each table can have a few hundred thousand records.
Don't use PHP as a database administration utility. Any time you build a quick PHP script to transfer records directly from one database to another, you're going to cause yourself a world of hurt when that script becomes required for production operation.
You have a number of problems that you need solved:
You have multiple MSSQL databases with similar if not identical tables.
You have a single MySQL database that you want to merge the data into.
The imported data must be altered in a specific way before being merged.
You want to prevent all duplicate records in your import.
You want to know what database each record originally came from.
The solution?
Analyze the source MSSQL databases and create a merge strategy for them.
Create a database structure on the MySQL database that fits the merge strategy in #1, including all the new key constraints (like unique and foreign keys) required for the consolidation.
At this point you have two options left:
Dump the data from each of the source databases into raw data using your RDBMS administration utility of choice. Alter that data to fit your merge strategy and constraints. Document this, and then merge all of the data into your new database structure.
Use a tool like opendbcopy to map columns from one database to another and run a mass import.
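For the provenance and rollback requirements, one common pattern (a sketch only; merged_table and source_db are assumed names, not your actual schema) is to tag every imported row with its origin:
ALTER TABLE merged_table ADD COLUMN source_db VARCHAR(64);
CREATE INDEX idx_source_db ON merged_table (source_db);
-- rolling back one MSSQL database's import then becomes a single statement:
DELETE FROM merged_table WHERE source_db = 'mssql_db_1';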
Hope this helps.

SQL Server to MySQL data transfer

I am trying to transfer bulk data on a constant and continuous basis from a SQL Server database to a MySQL database. I wanted to use SQL Server's SSMS replication, but this apparently only supports SQL Server to Oracle or IBM DB2 connections. Currently we are using SSIS to transform the data and push it to a temporary location in the MySQL database, where it is copied over. I would like the fastest way to transfer data and am comparing several methods.
I have a new way I plan on transforming the data which I am sure will solve most time issues, but I want to make sure we do not run into time problems in the future. I have set up a linked server that uses a MySQL ODBC driver to talk between SQL Server and MySQL. This seems VERY slow. I have some code that also uses Microsoft's ODBC driver, but it is used so little that I cannot gauge the performance. Does anyone know of lightning-fast ways to communicate between these two databases? I have been researching MySQL's data providers that seem to communicate with an OLE DB layer. I'm not too sure what to believe and which way to steer; any ideas?
I used the JDBC-ODBC bridge in Java to do just this in the past, but performance through ODBC is not great. I would suggest looking at something like http://jtds.sourceforge.net/ which is a pure Java driver that you can drop into a simple Groovy script like the following:
import groovy.sql.Sql

sql = Sql.newInstance(
    'jdbc:jtds:sqlserver://serverName/dbName-CLASS;domain=domainName',
    'username', 'password', 'net.sourceforge.jtds.jdbc.Driver')

sql.eachRow('select * from tableName') {
    println "${it.id} -- ${it.firstName} --"
    // probably write to a MySQL connection here, or write to a file, compress, transfer, load
}
The following performance numbers give you a feel for how it might perform:
http://jtds.sourceforge.net/benchTest.html
You may find some performance advantages in dumping the data to a MySQL dumpfile format and using LOAD DATA INFILE instead of writing row by row. MySQL has some significant performance improvements for large data sets if you use LOAD DATA INFILE and do things like atomic table swaps.
We use something like the following to quickly load large data files into MySQL from one system to another. It is the fastest mechanism we've found to load data into MySQL. But real-time row-by-row might be a simple loop to write in Groovy, plus some table to keep track of which rows have been moved.
mysql> select * from tablename into outfile 'tablename.dat';  -- dump the table to a flat file
shell> myisamchk --keys-used=0 -rq '/data/mysql/schema_name/tablename'  # disable the table's indexes before the bulk load
mysql> load data infile 'tablename.dat' into table tablename;
shell> myisamchk -rq /data/mysql/schema_name/tablename  # rebuild the indexes
mysql> flush tables;
mysql> exit;
shell> rm tablename.dat
The best way I have found to transfer SQL data (if you have the space) is a SQL dump in one dialect, followed by a converter tool (or Perl script; both are prevalent) to convert the dump from MSSQL to MySQL. See my answer to this question about which converter you may be interested in :) .
We've used the ADO.NET driver for MySQL in SSIS with quite a bit of success. Basically, install the driver on the machine with Integration Services installed, restart BIDS, and it should show up in the driver list when you create an ADO.NET connection manager.
As for replication, what exactly are you trying to accomplish?
If you are monitoring changes, treat it as a Type 1 slowly changing dimension (data warehouse terminology, but the same principle applies): insert new records, update changed records.
If you are only interested in new records and have no plans to update previously loaded data, try an incremental load strategy: insert records where source.id > max(destination.id).
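On the MySQL side, assuming the SQL Server rows have already landed in a staging table (staging_source, destination, and the column names here are placeholders, not a real schema), the incremental load boils down to something like:
SET @max_id = (SELECT COALESCE(MAX(id), 0) FROM destination);
INSERT INTO destination (id, col_a, col_b)
SELECT id, col_a, col_b
FROM staging_source
WHERE id > @max_id;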
After you've tested the package, schedule a job in SQL Server Agent to run the package every x minutes.
You could also try the following:
http://kofler.info/english/mssql2mysql/
I tried this a while back and it worked for me, but I wouldn't recommend it to you.
What is the real problem? What are you trying to do?
Can't you get an MSSQL DB connection, for example from Linux?

Importing records from PostgreSQL to MySQL

Was wondering if anyone had any insight or recommended tools for exporting the records from a PostgreSQL database and importing them into a MySQL database. I believe the table structure is 100% identical.
Thoughts? Thanks!
The command
pg_dump --data-only --column-inserts <database_name>
will generate SQL-standard-compliant INSERT statements with all column names listed and one VALUES clause per INSERT. This is the most portable way of moving data from PostgreSQL to any other SQL database.
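For example (database names are placeholders, and the output usually still needs minor editing for type differences before MySQL will accept it):
pg_dump --data-only --column-inserts source_db > data.sql
mysql -u root -p target_db < data.sql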
Check out SquirrelSQL; it can pump data from one database brand into another via the DBCopy plugin. When the table structures are really identical, it works quite well.
There is a ruby app called Taps that will do it. I've used it before with great success:
http://adam.heroku.com/past/2009/2/11/taps_for_easy_database_transfers/
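If I remember the taps gem correctly, usage looks roughly like the following; treat the exact commands, URLs, and port as assumptions and check the linked post:
shell> taps server postgres://localhost/source_db serveruser serverpass
shell> taps pull mysql://root@localhost/target_db http://serveruser:serverpass@server_host:5000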