Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
I have dumped a PostgreSQL database and I need to get it into MySQL as we migrate our project from Heroku to Amazon EC2, where we run MySQL.
It's been quite a while since I last did anything like this with raw dumps and SQL queries, so I'd like to ask for your help in making this go as smoothly as possible.
I dumped a PostgreSQL database. When I open the file, there are some PostgreSQL commands, table structures, and the data belonging to those tables. On the EC2 side, I have a MySQL database with the tables already created (the same tables as in the PostgreSQL dump, but all of them empty). My goal is to populate the MySQL tables with the data from the PostgreSQL tables.
How to do that?
As of 6 days ago, Amazon started offering a hosted PostgreSQL service as part of their Relational Database Service. I would seriously consider importing straight back into PostgreSQL and skipping MySQL altogether. Changing to a different DBMS is always more pain than it's worth.
You can import your dump by running it with psql, like this: psql -f <backup_filename> <target_db>. See Amazon's docs for more details.
If you're really set on migrating to MySQL you can try a simple tool like openDBcopy or something more comprehensive (and complicated) like Talend Studio.
If you have a text file created by pg_dump, the data from the source database appears in that file either as COPY blocks or as a series of INSERT commands. Both should be easy to import into any other database. You may have to change the string representation of dates. The COPY format is essentially CSV (tab-separated with \N for NULL by default), and for that you will have to write an import program, for example in Jython, a Python implementation that can use JDBC drivers. With Jython you can easily parse the CSV data and import it into the destination database using a JDBC PreparedStatement with the proper INSERT.
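For example, here is a minimal plain-Python sketch of that idea, turning a CSV/COPY extract into INSERT statements; the table name, column list and NULL marker are placeholder assumptions, and in Jython you would bind the values through a JDBC PreparedStatement instead of printing SQL:

import csv

def csv_to_inserts(csv_path, table, columns):
    # Assumes the extract was saved as comma-separated CSV with \N for NULL.
    with open(csv_path, newline='') as f:
        for row in csv.reader(f):
            values = ", ".join(
                "NULL" if v == "\\N" else "'" + v.replace("'", "''") + "'"
                for v in row
            )
            yield "INSERT INTO %s (%s) VALUES (%s);" % (table, ", ".join(columns), values)

for stmt in csv_to_inserts("users.csv", "users", ["id", "email", "created_at"]):
    print(stmt)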
If you have a connection to both databases at once, you can "pump" the destination database using various tools. As a Jython fan, I simply use JDBC with code like:
insert_stmt.setObject(i, rs_in.getObject(i))
where insert_stmt is a prepareStatement() with the INSERT..., and rs_in is the result set returned by executeQuery().
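Spelled out a little more, a hedged Jython sketch of that pump loop might look like this; the JDBC URLs, credentials, table and columns are placeholders, and both JDBC drivers must be on the classpath:

from java.sql import DriverManager

# Source (PostgreSQL) and destination (MySQL) connections.
src = DriverManager.getConnection("jdbc:postgresql://localhost/sourcedb", "user", "pass")
dst = DriverManager.getConnection("jdbc:mysql://localhost/targetdb", "user", "pass")

rs_in = src.createStatement().executeQuery("SELECT id, email, created_at FROM users")
insert_stmt = dst.prepareStatement("INSERT INTO users (id, email, created_at) VALUES (?, ?, ?)")

col_count = rs_in.getMetaData().getColumnCount()
while rs_in.next():
    for i in range(1, col_count + 1):   # JDBC columns are 1-based
        insert_stmt.setObject(i, rs_in.getObject(i))
    insert_stmt.executeUpdate()

dst.close()
src.close()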
Related
We have a fairly large MySQL database with more than a million rows of data with every possible data type.
It is a part of a custom MVC application built more than 5 years ago. We have to migrate it now.
There are a large number of queries and insert statements which we want to replace with JSON based web services so that it can be used with every kind of app/device etc.
A large number of PHP functions have been fused with display logic, which makes it tricky. There are also a few MySQL functions in the bundle.
Please share tips/suggestions/tools that would be useful for this migration.
There are some tools I can suggest for converting your MySQL database to a PostgreSQL one; here are some of them:
pgloader
PostgreSQL Data Wizard
dataPro
Based on my experience, I suggest pgloader: it can load data from MySQL, SQLite, MS SQL Server, dBase files, CSV files, fixed-width data files, and more, and it is released under The PostgreSQL Licence.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 6 years ago.
I've seen a lot of questions here about how to export an SQL database to Excel, but I have a ton of information in an Excel file that I would like to export to an SQL database. The file is extensive and it could take me years to transfer the info manually... years! Is there a faster, easier way to do this?
You don't actually need to write code to do this.
Save the Excel file to CSV format and then see one of the many related questions:
How to import CSV file to MySQL table
How I can Import data from CSV to MySQL?
Import CSV file directly into MySQL
The friendliest way is to use a graphical tool like HeidiSQL to match up the Excel columns to the columns in your database.
You can use the Apache POI library to read the Excel file row by row, and JDBC to create a connection to the MySQL database and execute INSERT statements.
In my project I used JRuby with the nurettin-jruby-poi fork of the jruby-poi gem to read the Excel file, and ActiveRecord to insert the data into MySQL.
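If neither Java nor JRuby is a requirement, the same idea is easy to sketch in plain Python with openpyxl and mysql-connector-python; the file name, table, columns and credentials below are placeholder assumptions, not part of the original answers:

import openpyxl
import mysql.connector

# Read the workbook and take the first/active sheet.
wb = openpyxl.load_workbook("data.xlsx", read_only=True)
ws = wb.active

conn = mysql.connector.connect(host="localhost", user="user",
                               password="pass", database="mydb")
cur = conn.cursor()

rows = ws.iter_rows(min_row=2, values_only=True)   # skip the header row
cur.executemany("INSERT INTO people (name, email, age) VALUES (%s, %s, %s)",
                list(rows))
conn.commit()
cur.close()
conn.close()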
Write a .NET Program (Console is simpler)
Access the Excel Worksheet/Workbook
Copy the Attributes and create appropriate Tables for the sheets in your Excel
Now copy the contents to SQL Tables appropriately
Since it is automated it will be faster
As Colin Pickard said, you can use the CSV format, which can be imported directly.
If you're using PHP, you can use the PHPExcel library, which can do the job!
By the way, you should tell us which language you're using.
You can use a GUI data import tool; it supports direct import from *.XLS and *.XLSX (Excel 2007) files.
Use Talend Open Studio to visually design and then automate any kind of data integration job.
http://www.talend.com/index.php
My favorite tool for converting Excel sheets into a MySQL database...
Full disclosure: I am the author.
http://excel2mysql.net
Creating a CSV is better.
If we use the CSV method we can also export any character set (Unicode).
Once we create a CSV file, we can just use the LOAD DATA statement available in MySQL.
LOAD DATA is used to import data into MySQL directly (using an SQL query).
LOAD DATA LOCAL permits you to import data from any location on your local machine.
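Here is a hedged sketch of that LOAD DATA approach, driven from Python with mysql-connector-python; the file path, table and CSV layout are placeholders, and local_infile must be enabled on the server:

import mysql.connector

# allow_local_infile is needed so the client may send a local file to the server.
conn = mysql.connector.connect(host="localhost", user="user", password="pass",
                               database="mydb", allow_local_infile=True)
cur = conn.cursor()
cur.execute("""
    LOAD DATA LOCAL INFILE '/path/to/export.csv'
    INTO TABLE people
    CHARACTER SET utf8mb4
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
""")
conn.commit()
cur.close()
conn.close()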
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
I've just got a lovely Access database, so the first thing I want to do is to move it over to a normal database management system (sqlexpress), but the only solution I've found sounds like craziness.
Isn't there an "export database to .sql" button somewhere? I have around 50 tables and this export might run more than once so it would be great if I didn't have to export all the tables manually. Generating a .sql file (with tables creation and inserts) would also be great since it would allow me to keep that under version control.
I guess if it's not possible to do something simple like this I'd appreciate any pointers to do something similar.
Is there a reason you don't want to use Management Studio and specify Microsoft Access as the data source for your Import Data operation? (Database->Tasks->Import, Microsoft Access as data source, mdb file as parameter). Or is there a reason it must be done from within Microsoft Access?
There is a tool from the SQL Server group: SQL Server Migration Assistant for Access (SSMA Access). There have been comments stating it's a better tool than the Upsizing Wizard included in Access.
A quick-and-dirty way to upsize Jet/ACE tables to any ODBC-accessible database engine:
create an ODBC DSN for your database.
in Access, select a table, and choose EXPORT from the file menu. Choose ODBC as the type and then select your DSN.
This will export the table and its data with data types that your ODBC driver indicates are most compatible with Jet/ACE's data types. It won't necessarily guess right, and that's why you likely wouldn't do this with SQL Server (for which there are tools that do better translating). But with non-SQL Server databases, this can be an excellent starting place.
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
The university I work at uses Oracle for the database system. We currently have programs we run at night to download what we need into some local Access tables for our testing needs. Access is getting too small for this now and we need something bigger. Also, the nightly jobs require constant maintenance to keep working (because of network issues, table changes, bad code :) ) and I would like to eliminate them to free us up for more important things.
I am most familiar with MySQL so I setup a test MySQL server. What is the best way to automate copying the needed tables from Oracle to MySQL?
Edit: I accepted the answer. I don't like the answer but it seems to be correct based on further research and the lack of other answers provided. Thanks to all for pondering my question and answering it.
I don't think there is really anything that is going to do this. If you could set up a local Oracle database, then most likely you could manage it, as Oracle has various means of keeping two databases "in sync", provided they are both Oracle.
If you must use MySQL, then you are likely going to have to write something to sync the data, and this is of course going to run into the same problems you currently have with the Access "database".
You could set up something with HSODBC and triggers, but
I've found HSODBC to be very memory hungry
This is only going to add more load to your DB, which you say is already heavily loaded during the day.
If the main thing you want is a local test copy of your Oracle database, you would be best off setting up syncing with a local version of Oracle. As far as I can tell from the licenses, Oracle is free for development copies (I have seen some posts to the contrary, but if you find that is the case, you could always use something like Oracle XE).
Could you just copy the Oracle tables and then set them up as linked tables in MS Access? This way the front-end stays the same plus you keep everything in Oracle (less moving parts than exporting and importing).
As Kellyn said, there are lots of free tools. One of them is SQLWorkbench http://www.sql-workbench.net/, which works with any JDBC database, so MySQL and Oracle should work.
It can create the tables if needed, or just copy over the (updated) data.
There are many tools available to migrate data from Oracle to MySQL if your database is not very complicated.
You can use open-source tools like the Pentaho Kettle ETL tool, or paid enterprise tools like DBConvert: https://dbconvert.com/oracle/mysql/
Lastly you can write a script or program that migrates the data.
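For the script/program route, one possible shape (hedged) is to pull rows from Oracle with cx_Oracle and push them into MySQL with mysql-connector-python in batches; the connection details, table and columns below are placeholders:

import cx_Oracle
import mysql.connector

ora = cx_Oracle.connect("user", "pass", "orahost/ORCL")
my = mysql.connector.connect(host="localhost", user="user",
                             password="pass", database="testdb")

src = ora.cursor()
dst = my.cursor()

src.execute("SELECT id, name, updated_at FROM students")
while True:
    batch = src.fetchmany(1000)        # copy in batches of 1000 rows
    if not batch:
        break
    dst.executemany("INSERT INTO students (id, name, updated_at) VALUES (%s, %s, %s)",
                    batch)
    my.commit()

dst.close()
src.close()
my.close()
ora.close()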
Please find links related to your question:
https://dba.stackexchange.com/questions/150343/how-to-sync-a-mysql-db-with-a-oracle-db
Migrate from Oracle to MySQL
Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I've been banging my head against SQL Server 2005 trying to get a lot of data out. I've been given a database with nearly 300 tables in it and I need to turn this into a MySQL database. My first call was to use bcp but unfortunately it doesn't produce valid CSV - strings aren't encapsulated, so you can't deal with any row that has a string with a comma in it (or whatever you use as a delimiter) and I would still have to hand write all of the create table statements, as obviously CSV doesn't tell you anything about the data types.
What would be better is if there was some tool that could connect to both SQL Server and MySQL, then do a copy. You lose views, stored procedures, triggers, etc., but it isn't hard to copy a table that only uses base types from one DB to another... is it?
Does anybody know of such a tool? I don't mind how many assumptions it makes or what simplifications occur, as long as it supports integer, float, datetime and string. I have to do a lot of pruning, normalising, etc. anyway so I don't care about keeping keys, relationships or anything like that, but I need the initial set of data in fast!
The best way that I have found is the MySQL Migration Toolkit provided by MySQL. I have used it successfully for some large migration projects.
Using MSSQL Management Studio I've transitioned tables with the MySQL OLE DB. Right-click on your database and go to "Tasks->Export Data"; from there you can specify an MSSQL OLE DB source and the MySQL OLE DB destination, and create the column mappings between the two data sources.
You'll most likely want to set up the database and tables in advance on the MySQL destination (the export will want to create the tables automatically, but this often results in failure). You can quickly create the tables in MySQL using "Tasks->Generate Scripts" by right-clicking on the database. Once your creation scripts are generated, you'll need to step through them and search/replace the keywords and types that differ between MSSQL and MySQL.
Of course you could also backup the database like normal and find a utility which will restore the MSSQL backup on MYSQL. I'm not sure if one exists however.
SQL Server 2005 "Standard", "Developer" and "Enterprise" editions have SSIS, which replaced DTS from SQL server 2000. SSIS has a built-in connection to its own DB, and you can find a connection that someone else has written for MySQL. Here is one example. Once you have your connections, you should be able to create an SSIS package that moves data between the two.
I didn't have to move data from SQL Server to MySQL, but I imagine that once the MySQL connection is installed, it works the same as moving data between two SQL Server DBs, which is pretty straightforward.
Rolling your own PHP solution will certainly work though I'm not sure if there is a good way to automatically duplicate the schema from one DB to the other (maybe this was your question).
If you are just copying data, and/or you need custom code anyway to convert between modified schemas between the two DB's, I would recommend using PHP 5.2+ and the PDO libraries. You'll be able to connect using PDO ODBC (and use MSSQL drivers). I had a lot of problems getting large text fields and multi-byte characters from MSSQL into PHP using other libraries.
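If PHP isn't a hard requirement, the same ODBC idea can be sketched in Python instead, with pyodbc on the SQL Server side and mysql-connector-python on the MySQL side; the driver name, credentials and table below are placeholder assumptions:

import pyodbc
import mysql.connector

mssql = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                       "SERVER=sqlhost;DATABASE=sourcedb;UID=user;PWD=pass")
my = mysql.connector.connect(host="localhost", user="user",
                             password="pass", database="targetdb")

src = mssql.cursor()
dst = my.cursor()

src.execute("SELECT id, title, price, created_at FROM products")
rows = [tuple(r) for r in src.fetchall()]   # pyodbc rows -> plain tuples
dst.executemany("INSERT INTO products (id, title, price, created_at) "
                "VALUES (%s, %s, %s, %s)", rows)
my.commit()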
Another tool to try would be the SQLMaestro suite. It is a little tricky nailing down the precise tool, but they have a variety of tools, both free and for purchase that handle a wide variety of tasks for multiple database platforms. I'd suggest trying the Data Wizard tool first for MySQL, since I believe that will have the proper "import" tool you need.