Migrate PostgreSQL data to MySQL [duplicate]

This question already has answers here:
Migrate database from Postgres to MySQL [closed]
(3 answers)
Closed 8 years ago.
I'm looking to grab a few bits of data from the MusicBrainz database to use in a MySQL-based app.
I don't need the entire database, and I've been looking at 'migrating' PostgreSQL to MySQL, which many people seem to have difficulty with.
Wouldn't it be simplest to just dump the PostgreSQL data into a comma-delimited text file and then import that into MySQL?
I'm just getting started with this and don't even have PostgreSQL installed yet, but I'm trying to look ahead at how I'm going to do it.

You can use COPY (in the psql client) to dump a single table.
Or you can use pg_dump with the --inserts option (older releases spelled this -d; in modern pg_dump, -d means the database name). This makes pg_dump emit INSERT statements, which you can execute against your MySQL server. You will obviously need to port the schema first, assuming the data types used have MySQL equivalents.
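Both approaches can be sketched from the command line. The database, table, and file names below are placeholders, not from the thread:

```shell
# Dump one table to CSV from psql (\copy writes the file client-side):
psql -d musicbrainz -c "\copy artist TO 'artist.csv' CSV HEADER"

# Or dump the whole database as INSERT statements
# (--inserts is the modern spelling of the old -d flag):
pg_dump --inserts --no-owner musicbrainz > musicbrainz_inserts.sql

# After porting the schema, load the CSV into MySQL:
mysql -u user -p musicbrainz_mysql \
  -e "LOAD DATA LOCAL INFILE 'artist.csv' INTO TABLE artist
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' IGNORE 1 LINES"
```

The INSERT-statement dump will still need editing (quoting, types, sequences), which is why the CSV route is usually less painful for a handful of tables.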

Perhaps you want to dump your database to an SQL script.

Related

Copy all data from one database to the other (MySQL/MariaDB) [duplicate]

This question already has answers here:
Mysql Copy Database from server to server in single command
(3 answers)
Closed 1 year ago.
I would like to make a test playground for my website where I don't alter the original production data, so I want to make a copy of all my data and put it into another database. How can I do this the right way?
I have a database with a lot of tables called testreporting4, and I want to copy all of its data and structure into the database called testreportingdebug.
From testreporting4 to testreportingdebug.
My database is around 3.1 GB at the moment (I don't know if that changes anything).
I think the easiest way is to export the database and then import it. More information on exporting/importing in MariaDB: https://www.digitalocean.com/community/tutorials/how-to-import-and-export-databases-in-mysql-or-mariadb
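The export/import round trip can be sketched like this (the root credentials are a placeholder; run with whatever account has rights on both databases):

```shell
# Create the empty target database:
mysql -u root -p -e "CREATE DATABASE testreportingdebug"

# Dump structure and data from the source (triggers are included by
# default; --routines adds stored procedures/functions):
mysqldump -u root -p --routines testreporting4 > dump.sql

# Load the dump into the copy:
mysql -u root -p testreportingdebug < dump.sql
```

For a 3.1 GB database this is still practical; you can also pipe mysqldump straight into mysql to skip the intermediate file.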

Bind MySQL, Cassandra through Postgres

I have a question about this.
I'm using Postgres to connect two different databases. I can already write to both: to MySQL using mysql_fdw, and to Cassandra using the Multicorn FDW. Now I want to write to both databases from Postgres at the same time.
When I run \du in Postgres I see that the two extensions are loaded (mysql_fdw and multicorn for Cassandra), and when I run select * from information_schema._pg_foreign_servers; I get the names of both. I have already created the database and table in MySQL, and the keyspace and column family in Cassandra.
My question now is: how can I connect both to replicate the data?
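The thread has no accepted answer, but one common pattern (a sketch, not from the thread; the database, table, and column names below are hypothetical) is to insert into a local table and fan each row out to both foreign tables with a trigger:

```shell
psql -d mydb <<'SQL'
-- Hypothetical foreign tables, assumed to be mapped already through
-- mysql_fdw and multicorn respectively:
--   mysql_events     (id int, payload text)  -> MySQL
--   cassandra_events (id int, payload text)  -> Cassandra

CREATE TABLE local_events (id int, payload text);

CREATE OR REPLACE FUNCTION fan_out_event() RETURNS trigger AS $$
BEGIN
  INSERT INTO mysql_events     VALUES (NEW.id, NEW.payload);
  INSERT INTO cassandra_events VALUES (NEW.id, NEW.payload);
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER replicate_event
  AFTER INSERT ON local_events
  FOR EACH ROW EXECUTE PROCEDURE fan_out_event();

-- A single insert now reaches both databases:
INSERT INTO local_events VALUES (1, 'hello');
SQL
```

Note there is no distributed transaction here: if the Cassandra write fails after the MySQL write succeeded, the two targets can diverge.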

MySQL for Excel: how to prevent changing MySQL data when using MySQL for Excel

I am a bit of a novice at MySQL and have a relatively basic question about MySQL for Excel:
If I alter the data that I imported from MySQL into Excel within the spreadsheet, will it change the original data in the tables in MySQL?
I am currently planning to dump large data sets from my job's MySQL database into Excel to run some analyses, but I want to make sure it doesn't affect the data within MySQL.
Thanks!

Move data from PostgreSQL to MySQL database [duplicate]

This question already has answers here:
Transfer data from PostgreSQL to MySQL
(2 answers)
Closed 9 years ago.
I have a PostgreSQL database with 50 tables which I need to export and then import into a MySQL database. I need to maintain the same table fields and the data inside the tables.
I am using pgAdmin III and tried the Backup (Plain) feature, but I couldn't then import the data and tables into MySQL.
I have also tried searching the web for how to do this but failed to locate any useful info, so any help is highly appreciated.
Thanks
Take a look at http://www.mysql.com/products/workbench/
It has a Migration Wizard.

Import data to MySQL from Oracle [duplicate]

This question already has answers here:
Migrate from Oracle to MySQL
(7 answers)
Closed 8 years ago.
I have two databases. The first one is a MySQL db used for a website. The second is an Oracle db that has data I want to show on the website, and that data must be fresh; that is, I need to run a process every 30 minutes that migrates data from the Oracle db to MySQL.
Because I am talking about 60,000 rows to migrate every 30 minutes, I think the optimal way to do it (performance-wise) would be something like:
insert into mysql_db.table (field1, field2, field3) select field1, field2, field3 from oracle_db.table
The Oracle db is on Windows and MySQL is on Linux (Ubuntu).
Is that possible? How? Otherwise, please suggest a different way.
If you have an ODBC driver for the MySQL database, you could try the Data Export tool (which has command-line support) in dbForge Studio for Oracle.
You could also use something like GoldenGate to capture changes in Oracle and apply them to the MySQL database, although if this is your only use case, the cost of the product may not be justifiable; in that case, a simple Perl/PHP/Python/etc. script would probably do the trick.
I've used HSODBC links between MySQL 4 and Oracle 9i and found the performance less than fantastic. Hopefully things have improved; if so, this may be a viable solution. However, you would still need to run it as a job (inside or outside of Oracle), as I'm not aware of any way to make calls from MySQL to Oracle.
Oracle supports connections between Oracle databases through a DBLink, but I don't believe there is a tool for connecting MySQL and Oracle that lets you execute your proposed query directly.
I suggest that you write a script (for example in Python or Groovy) and schedule it with cron on Linux (if that is your environment). Given the size of the data, you will need to implement batch updates (the specifics depend on the language in which you implement the script).
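The batch-update core of such a script might look like this in Python. This is a sketch: the table and column names are hypothetical, and the cursor is assumed to follow the DB-API executemany interface (for example from mysql-connector-python), which is an assumption, not something named in the thread.

```python
# Batch-transfer loop for the periodic Oracle -> MySQL job. The connection
# objects are assumed to come from driver libraries (e.g. cx_Oracle on the
# read side, mysql-connector-python on the write side); the chunking logic
# below is pure Python and driver-independent.

def chunked(rows, size):
    """Yield successive batches of at most `size` rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def transfer(oracle_rows, mysql_cursor, batch_size=1000):
    """Insert all fetched rows into MySQL in batches via executemany."""
    sql = "INSERT INTO mysql_table (field1, field2, field3) VALUES (%s, %s, %s)"
    for batch in chunked(oracle_rows, batch_size):
        mysql_cursor.executemany(sql, batch)  # one round trip per batch

# Scheduled every 30 minutes with a crontab entry such as:
#   */30 * * * * /usr/bin/python /opt/scripts/oracle_to_mysql.py
```

At 60,000 rows per run, 1,000-row batches mean about 60 round trips instead of 60,000 single-row inserts, which is where most of the performance gain comes from.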