Been tasked with moving a code-first database from MSSQL to MySQL. After a few hours of kung fu, I was able to get the ASP.NET Core project to properly deploy all migrations to MySQL. Now I need to migrate the data inside the existing MSSQL tables. I saw posts mentioning the MySQL Migration Toolkit, but that appears to be old. I also attempted it with MySQL Workbench and DBLoad's Data Loader but haven't had any luck.
The table structure is pretty simple: incremental integer keys plus the usual crap from the ASP.NET Core Identity framework (GUIDs). I just need to keep that consistent during the migration. What is the best way to migrate the data now that the table structure is set up in MySQL? Any recommendations would be greatly appreciated!
Update: Some more details...
I attempted a direct migration of the database from MSSQL to MySQL using MySQL Workbench and DBLoad, but it failed big time on the ASP.NET Identity tables, plus other issues. The .NET Core API took a huge dump in multiple places, so that idea is out.
From that point, I migrated the API controller over to MySQL and then had to fix a myriad of issues related to MSSQL foreign key names being too long, plus a few other things.
So at this point, the controller works on MySQL. I just need to dump all of the data into MySQL and keep the FKs consistent.
I have had a few thoughts, such as CSV export/import, and/or trying a few other things. Any recommendations?
You can use the Data Export wizard built into SQL Server Management Studio (Tasks -> Export Data),
or use an SSIS package to migrate the data.
I tried MySQL Workbench, DBLoad from DBLoad.com, etc. to migrate the data directly, and they all failed.
So I ended up finding a "solution"...
First off, I modified the ASP.NET Core project with:
// Swapped the SQL Server provider for the MySQL provider:
//services.AddDbContext<ApplicationDbContext>(options =>
//    options.UseSqlServer(
//        Configuration.GetConnectionString("DefaultConnection")));
services.AddDbContext<ApplicationDbContext>(options =>
    options.UseMySql(
        Configuration.GetConnectionString("DefaultConnection")));
Then went to Package Manager Console and ran: update-database
This created all of the tables in MySQL.
Then opened up Microsoft SQL Server Management Studio, right-clicked on the database and chose Tasks > Generate Scripts. Saved everything to one file, with the Advanced option set to script both Schema and Data.
Then opened the db script in Notepad++ and applied the following edits using Replace with Extended Search Mode enabled:
GO -> blank
[dbo]. -> blank
[ -> blank
] -> blank
)\r\n -> );\r\n
\r\n' -> '
DateTime2 -> DATETIME
After the edits were made in Notepad++, I removed all of the SET, ALTER and CREATE related stuff from the text file and then copied the INSERT lines into MySQL Workbench table by table, in an order that ensured each table's foreign key targets were already populated before that table's data was inserted. Thank goodness there were no stored procedures to deal with! What a pain in the tucus!
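For what it's worth, the same Notepad++ edits could also be scripted. Here is a rough C# sketch, not part of the process I actually ran: the file names are placeholders, and it assumes every INSERT ends up on a single line once the replacements are applied, so treat it as a starting point only.
// Rough sketch only: mirrors the manual Notepad++ edits described above.
// Assumes the SSMS "Generate Scripts" output lives in script.sql and that
// every INSERT fits on one line once the replacements are applied.
using System;
using System.IO;
using System.Linq;

class ScriptCleaner
{
    static void Main()
    {
        var text = File.ReadAllText("script.sql");

        // Same substitutions that were done by hand in Notepad++.
        text = text
            .Replace("[dbo].", "")
            .Replace("[", "")
            .Replace("]", "")
            .Replace(")\r\n", ");\r\n")   // terminate each statement with a semicolon
            .Replace("\r\n'", "'")        // re-join values that were split across lines
            .Replace("DateTime2", "DATETIME");

        // Keep only the INSERT statements; the GO, SET, ALTER and CREATE lines
        // are dropped because the schema already exists via EF Core migrations.
        var inserts = text
            .Split(new[] { "\r\n" }, StringSplitOptions.RemoveEmptyEntries)
            .Where(line => line.TrimStart().StartsWith("INSERT", StringComparison.OrdinalIgnoreCase));

        File.WriteAllLines("inserts-only.sql", inserts);
    }
}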
Also, on a side note, the app is hosted on Azure. I spent a couple of hours fighting the API not connecting to the database. This was not apparent at first because the app was surfacing a misleading 404 error to Postman. When I first attempted to wire up the API controllers to the MySQL DB, I had entered the database connection string into Azure's App Service Configuration. It didn't work at all, even though running the app locally worked fine. I ended up finding another post on this here site that suggested removing the database connection string from the App Service > Configuration window. Worked like a champ after that. All of the data with its auto-incremented keys linked up without issue.
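My best guess as to why removing the Azure value helped: configuration precedence. App Service exposes the connection strings you enter in the portal to the app as environment variables, and with the usual ASP.NET Core setup those are loaded after appsettings.json, so a stale or wrongly typed portal value silently wins over the one deployed with the app. A minimal sketch of that layering follows; the MYSQLCONNSTR_ prefix and package names are the standard ones as far as I know, but treat the details as assumptions rather than gospel.
// Minimal illustration of ASP.NET Core configuration layering; sources added
// later override earlier ones. Requires the Microsoft.Extensions.Configuration,
// .Json and .EnvironmentVariables packages.
using System;
using Microsoft.Extensions.Configuration;

class ConfigDemo
{
    static void Main()
    {
        var config = new ConfigurationBuilder()
            .AddJsonFile("appsettings.json", optional: true) // local connection string lives here
            .AddEnvironmentVariables() // App Service values (e.g. MYSQLCONNSTR_DefaultConnection) land here and win
            .Build();

        // Whichever source was added last and defines the key is what you get back.
        Console.WriteLine(config.GetConnectionString("DefaultConnection"));
    }
}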
I am very pleased with the results and hope I never have to go through this process again! It is always a nice feeling to know an app now runs on a completely open source infrastructure. Hope this helps someone. Good luck.
Related
We would like to be able to publish FileMaker data on our WordPress website. The website is up and running and the FileMaker database is set up. We do not need a live connection between the two systems, so we chose to export the FM data to .csv so we can import it into the MySQL database on the server, and from there we would like to display it on the website.
Now for my questions, since this kind of development is new to us:
Can I set up an automated import into the MySQL database from a source like Dropbox or something? For example, can we make the MySQL database import and overwrite the existing data every 24 hours from a .csv file located somewhere? (We need this automated overwrite option because the FM data changes often and we need up-to-date info on the website.)
How can we display the data from the MySQL database on the WordPress frontend?
I've been looking into this myself and couldn't find any clear answers or guides. Can you guys point me in the right direction?
(Btw, I know there are table plugins I can use for WP, but they do not fulfill our needs, and I think it's exciting to do it all by ourselves with help from this great community.)
Update 01
I've successfully connected FM with my MySQL db using ODBC and can now select tables from the MySQL db in FM's relational graph.
I was wondering how I can write the data from my existing FM file to the MySQL db using ODBC. Can anybody help me with this?
I would like to get the data into some MySQL tables so I can fetch it using PHP on my website.
Thanks!
It is possible to write directly into (and read from) a remote MySQL database from FileMaker via ODBC.
You need a MySQL account that allows remote access; there are providers where this is not allowed.
On the local box, the ODBC driver needs to be installed. On Windows you can use the open-source version (http://dev.mysql.com/downloads/connector/odbc/); on Mac it works better with the Actual Technologies drivers (http://www.actualtech.com/de/product_opensourcedatabases.php).
Then an ODBC system DSN (not a user DSN) is set up. Be sure to use the 32-bit ODBC manager on Windows.
Now you can create the external data source within FileMaker and read from and write to the MySQL tables.
Once you have made the connection to the MySQL database and you can see the shadow tables, you can write to the fields directly via FileMaker layouts. It's as simple as that.
Once the layout contains the fields from the MySQL database, you can move through records and find records, all as if the data were native to your FM database. Of course, for more automated processing, you can create scripts, relationships, etc. and manipulate/synchronise data. Be warned, though: the connection speed can limit complex relationships and large databases. I would advise 'baby steps'.
I have an .sql script that contains INSERT statements and CREATE TABLE statements. I used the "Create EER Model From Script" option.
It created the tables but I can't see the data inside these tables.
I went to the query menu and tried to make a query but it gives me an error about not being able to connect to localhost.
Am I doing it right?
As documented under Create EER Model from SQL Script:
Clicking this action item launches the Reverse Engineer SQL Script wizard. This is a multi-stage wizard that enables you to select the script you want to create your model from.
For further information, see Section 7.7.9.1, “Reverse Engineering Using a Create Script”.
Following that link:
However, if you are working with a script that also contains DML statements you need not remove them; they will be ignored.
Instead, you want the Manage Data Import/Export option under Server Administration (within the Workspace section of the Home window).
You are confusing things here. Creating a model from a script is a process where metadata is examined and a model is created that you can then use to modify your schema structure, further design your db objects and all that. Modeling is a design process for the structure of your schema/db, so it only deals with metadata. It's also used for documentation (e.g. in teams).
On the other hand, there's normal SQL work with existing db objects and/or actually creating/deleting/modifying db objects. In order to do the latter you must have an understanding of the design of the schema (which you could get by using the modeling part of MySQL Workbench, but not only by that). This is also the place to load a script and run it to insert data and such.
The error you mentioned regarding the connection is yet another problem, and you need to solve this first to be able to even access your server. And yes, you have to install a server first somewhere. MySQL Workbench is a tool to visually work with your server(s), as opposed to the MySQL command line client, which is a pure text interface (but still also a client application for your MySQL servers).
If you are on Windows and want a MySQL server installed locally (e.g. for testing) your best option is to download the MySQL Installer which greatly simplifies installing any of the tools from the MySQL family (server, client tools, connectors, documentation and more).
Please, could you help me with the following issues:
I would like to migrate from MySQL to Oracle. I used Oracle SQL Developer and imported the MySQL third-party driver, but when I started the migration I got this error:
ORA-04098: trigger 'SYSTEM.MD_PROJECTS_TRG' is invalid and failed re-validation
and then the migration stops.
Have you seen this post and tried these solutions?
A constraint or foreign key problem, surely?
Just so you know, the Data Move operation actually performs three different actions, in the following order:
1. Disable all constraints (FKs, PKs).
2. Move data using multiple streams so that it can migrate more than one table at a time.
3. Enable all constraints (FKs, PKs).
It appears that for some reason SQL Developer is failing on step 1, where it is trying to disable the constraints. Have you tried generating the data move scripts for an offline data migration? You can look at the disable-constraint scripts and run them manually before attempting the online data migration again. Hope that works.
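For reference, the disable-constraint statements can also be generated straight from the data dictionary so they can be reviewed before retrying the data move. A rough sketch, assuming the Oracle.ManagedDataAccess.Client provider and a connection as the target schema owner; the connection string and the foreign-keys-only filter are illustrative, not taken from SQL Developer itself.
// Rough sketch: generate "disable constraint" statements for review.
// Only foreign keys ('R') are handled here; primary/unique keys that are
// still referenced by enabled FKs would need DISABLE ... CASCADE instead.
using System;
using Oracle.ManagedDataAccess.Client;

class DisableConstraints
{
    static void Main()
    {
        var connStr = "User Id=target_user;Password=changeme;Data Source=localhost/XE"; // placeholder
        using (var conn = new OracleConnection(connStr))
        {
            conn.Open();
            var cmd = new OracleCommand(
                "SELECT 'ALTER TABLE ' || table_name || ' DISABLE CONSTRAINT ' || constraint_name " +
                "FROM user_constraints WHERE constraint_type = 'R'", conn);

            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(0) + ";"); // paste into SQL Developer and run
            }
        }
    }
}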
Yes, you can specify a different schema (you need to use the connection for it), but you will have to use the same repository that you used for the migration so that it can pick up the mappings for the source database.
For the data move issues, you can try the following:
a. Move the failed tables one by one, i.e. right-click on the table in the source database, choose 'Copy to Oracle', and then choose data migration only (append mode).
b. You can also try changing the default DATE format for the source database in SQL Developer to the actual one used in the source database. The default it uses is mm/dd/yyyy, which may not be the right one in your environment (Preferences -> Migration -> Data Move Options). Sometimes this also causes problems when migrating data.
From my personal experience, SQL Developer creates a log file in an XML format under the \localuser\applications... directory, so you will have to dig a bit more to find the log file, which will contain the actual error.
More information in this link -> https://forums.oracle.com/forums/thread.jspa?threadID=2357687&tstart=90
I'm considering a MySQL to PostgreSQL migration for my web application, but I'm having a really hard time converting my existing MySQL database to PostgreSQL.
I tried:
mysqldump with --compatible=postgresql
migration wizard from EnterpriseDB
Postgresql Data Wizard from EMS
DBConvert from DMSoft
and NONE of the above programs does a good job of converting my database!
I saw some Perl and Python scripts for converting MySQL to PostgreSQL, but I can't figure out how to use them... (I installed ActivePerl and don't understand what I'm supposed to do next to run that script!)
I use auto-increment fields (as primary keys) all the time, and these are just ignored... I understand that PostgreSQL does auto-increments in another way (with sequences), but it can't be THAT hard for MIGRATION software to handle that, can it?
Did anybody have better luck converting a MySQL database with auto-increments as primary keys?
I know this is probably not the answer you are looking for, but: I don't believe in "automated" migration tools.
Take your existing SQL scripts that create your database schema, do a search and replace for the necessary data types (auto-increment maps to serial, which does all the sequence handling automagically for you), remove all the "engine=" stuff and then run the new script against Postgres.
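For the schema part, that search and replace can be scripted too. A very rough sketch, assuming a schema-only mysqldump saved as schema.sql; the regexes only cover the common cases (AUTO_INCREMENT integer keys and the ENGINE/CHARSET table options) and will need extending for anything else.
// Very rough sketch, not a complete MySQL-to-PostgreSQL converter.
using System.IO;
using System.Text.RegularExpressions;

class SchemaConverter
{
    static void Main()
    {
        var ddl = File.ReadAllText("schema.sql");

        // Backticks become standard double-quoted identifiers.
        ddl = ddl.Replace("`", "\"");

        // AUTO_INCREMENT integer keys map to SERIAL/BIGSERIAL, which
        // create and wire up the sequences automatically.
        ddl = Regex.Replace(ddl, @"\bbigint(\(\d+\))?\s+NOT NULL\s+AUTO_INCREMENT",
                            "BIGSERIAL", RegexOptions.IgnoreCase);
        ddl = Regex.Replace(ddl, @"\bint(\(\d+\))?\s+NOT NULL\s+AUTO_INCREMENT",
                            "SERIAL", RegexOptions.IgnoreCase);

        // Drop the MySQL-specific "ENGINE=... DEFAULT CHARSET=..." table options.
        ddl = Regex.Replace(ddl, @"\)\s*ENGINE=[^;]*;", ");", RegexOptions.IgnoreCase);

        File.WriteAllText("schema-postgres.sql", ddl);
    }
}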
Dump the old database into flat files and import them into the target.
I have done this several times with sample databases that were intended for MySQL and it really doesn't take that long.
Probably just as long as trying all the different "automated" tools.
Why not use an ETL tool? You don't have to worry about dumps or stuff like that.
I have migrated to PostgreSQL and MySQL and have had no problems with the auto-increment fields.
You just need to know the connection credentials and that's it. I personally use Pentaho (it's open source).
Download Pentaho ETL from http://kettle.pentaho.org/
Unzip and run Pentaho (using .bat file spoon.bat)
Create a new Job:
Create a DB connection for the source database (PostgreSQL) using the menu: Tools → Wizard → Create Database Connection (F3).
Create a DB connection for the destination database (MySQL) using the technique described above.
Run the Wizard: Tools → Wizard → Copy Tables (Ctrl-F10).
Select the source (left dialog panel) and the destination (right dialog panel). Click Finish.
The job will be generated - run the job.
If you need any help let me know.
Even when you are familiar with all the "PostgreSQL gotchas", doing every step by hand may take a lot of time, especially when your db is "big".
Try some other scripts/tools.
I know this is an old question, but I just ran into the same problem migrating from MySQL to Postgres. After trying out several migration tools, the very best one I could find, which will migrate your database structure as cleanly as possible, was pgloader (https://github.com/dimitri/pgloader/). It will take care of changing the auto-increment columns to Postgres sequences, no problem, and it's super fast.
I'm creating a new ASP.NET MVC 3 application. Visual Studio does a lot of the job of creating the database and initial layout. Very nice! I will upload those initial files to my server, but I want it to run using the MySQL database on the server.
Is there some quick/easy way to do it? I'm not worried about the data, just the structure of the tables and the connection/configuration changes.
Thank you very much!
You can export any MS SQL database as a script (SQL Server Management Studio).
Fix it up to make it compatible.
But you will also need a Membership provider; look around to see whether one exists for MySQL, otherwise you'll have to create one (movie).
There are a number of tools listed in "Migrating from Microsoft SQL Server and Access to MySQL".
Or (assuming that you're using column types that exist on both platforms) you can write a script to convert a schema dump from SQL Server into MySQL (or do the conversion by hand in a text editor). Better yet, you can write a program to read the INFORMATION_SCHEMA tables from SQL Server and produce the necessary CREATE TABLE... statements for MySQL. Lots of options.
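For what that last option might look like, here is a bare-bones sketch. It assumes System.Data.SqlClient, a trivial one-to-one type mapping, and no handling of keys, defaults or column lengths, all of which a real converter would need; the connection string is a placeholder.
// Bare-bones sketch: read column metadata from SQL Server and emit
// naive CREATE TABLE statements for MySQL. The type mapping is deliberately
// minimal; extend it for nvarchar lengths, uniqueidentifier, keys, etc.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

class SchemaDump
{
    static void Main()
    {
        var tables = new Dictionary<string, List<string>>();
        var connStr = "Server=.;Database=MyDb;Integrated Security=true"; // placeholder

        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();
            var cmd = new SqlCommand(
                @"SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE
                  FROM INFORMATION_SCHEMA.COLUMNS
                  ORDER BY TABLE_NAME, ORDINAL_POSITION", conn);

            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    var table = reader.GetString(0);
                    var column = reader.GetString(1);
                    // Naive type mapping; datetime2 is the one that bit the poster above.
                    var type = reader.GetString(2) == "datetime2" ? "datetime" : reader.GetString(2);
                    var nullable = reader.GetString(3) == "YES" ? "NULL" : "NOT NULL";

                    if (!tables.ContainsKey(table)) tables[table] = new List<string>();
                    tables[table].Add("  `" + column + "` " + type + " " + nullable);
                }
            }
        }

        foreach (var t in tables)
            Console.WriteLine("CREATE TABLE `" + t.Key + "` (\n" + string.Join(",\n", t.Value) + "\n);\n");
    }
}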