Migrate big tables from MySQL to MSSQL

I have a big database, around 46 GB, in MySQL format, and I managed to convert the whole database to MSSQL except for two tables, the biggest ones. When I try to migrate those two tables one by one, after a while I get the error message "The connection has been disabled".
I increased the timeout in the SSMA options from 15 to 1440 and decreased the batch size from 1000 to 500, with the same result. The tables have 52 million and 110 million rows, at 1.5 GB and 6.5 GB respectively.
I tried the incremental option, but I don't have a unique id to use.
What can I do to migrate them?
Thank you.

You should be able to use SQL Server Integration Services (SSIS). You can create a data flow that pulls from MySQL and dumps the data into MSSQL.
You'll need to create a Data Flow Task that includes an OLE DB Source connected to an OLE DB Destination.
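If SSIS isn't available to you, the same pull-and-load pattern can also be scripted. Below is a minimal sketch (not SSMA or SSIS itself), assuming the mysql-connector-python and pyodbc packages; the host names, credentials, table, and column names are placeholders you'd replace with your own.

```python
# Sketch: stream rows out of MySQL and bulk-insert them into SQL Server in
# small batches. All connection details, table, and column names are placeholders.
import mysql.connector
import pyodbc

BATCH_SIZE = 500  # small batches, similar to the reduced SSMA batch size

src = mysql.connector.connect(host="mysql-host", user="user",
                              password="secret", database="sourcedb")
dst = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                     "SERVER=mssql-host;DATABASE=targetdb;UID=user;PWD=secret")
dst.autocommit = False

src_cur = src.cursor()
dst_cur = dst.cursor()
dst_cur.fast_executemany = True  # send each parameter batch in one round trip

src_cur.execute("SELECT col1, col2, col3 FROM big_table")
insert_sql = "INSERT INTO dbo.big_table (col1, col2, col3) VALUES (?, ?, ?)"

while True:
    rows = src_cur.fetchmany(BATCH_SIZE)
    if not rows:
        break
    dst_cur.executemany(insert_sql, rows)
    dst.commit()  # commit per batch so one dropped connection doesn't undo everything

src.close()
dst.close()
```

Committing per batch keeps each transaction short, which also makes it easier to resume after the kind of connection drop described in the question.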

I had the same problem using SSMA. I managed to migrate 100+ million rows from a table with 40+ columns.
I assume you've already got the configuration right.
You need to ensure that there is no activity in the MSSQL database: no SELECTs against those tables and no other activity.
Check the two tables' structures to ensure they are fine. You can run the SSMA project for just those two tables.
Hope this helps.

I had the same issue and fixed it by using the configuration described in detail here: MySql 5.6 to MSSql server 2014 migration : ExecuteReader requires an open and available Connection
Also, I have documented the whole MySQL to MSSQL migration process here: Migrate Data and Schema from MySQL to SQL Server

Related

Database sync between SQL Server and MySQL

I have two different applications which run on different databases, SQL Server and MySQL. Now I want real-time data sync between SQL Server and MySQL - can anybody please advise? I have already tried data migration, but that only copies the data once; it doesn't keep it in sync.
Take a look at StreamSets. It's open source and can read SQL Server's change tracking (CT) tables and turn them into inserts, updates, and deletes against any database with a JDBC driver, including MySQL. It can also use the MySQL binlog to build data pipelines in the other direction.

Copying over part of a table from SQL Server to Aurora DB (Based on MySQL by AWS)

I have a legacy SQL Server DB and I need to copy part of a very, very big table from it over to a new Aurora DB cluster on AWS (RDS).
The old table in SQL Server has 1.8 billion records and 43 columns; in the new DB I will only carry over 13 of those columns, but almost all of the rows.
I was wondering if anyone has any ideas on the best way that I can move this data across?
I wrote a simple Python script that queries SQL Server and then executes insert statements on the new DB, but based on tests with smaller sets of data I estimate this would take about 30 hours to run.
Any ideas?
P.S. Aurora is based on MySQL, so I would imagine that if it works for MySQL it would work for Aurora.
Assuming you can get the data you want into something like a CSV file, LOAD DATA LOCAL INFILE should be pretty performant.
I did wonder whether it would be allowed on RDS and found an AWS article on importing data into MySQL on RDS. I couldn't find an equivalent one for Aurora, only one about migrating from an RDS-based MySQL instance. There is, however, an Amazon RDS for Aurora Export/Import Performance Best Practices document that has one reference to LOAD DATA LOCAL INFILE.
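As a rough illustration of that approach, here is a minimal sketch assuming the 13 needed columns are exported to a CSV first and that LOCAL INFILE is permitted on the Aurora cluster. The connection strings, file path, and column list are placeholders, not the asker's real schema.

```python
# Sketch: export the needed columns from SQL Server to CSV, then bulk load the
# file into Aurora/MySQL with LOAD DATA LOCAL INFILE. Names below are placeholders.
import csv
import pyodbc
import mysql.connector

COLUMNS = ["col1", "col2", "col3"]   # the subset of columns carried over
CSV_PATH = "/data/export_part.csv"

# 1. Export from the legacy SQL Server table.
src = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                     "SERVER=legacy-host;DATABASE=legacydb;UID=user;PWD=secret")
cur = src.cursor()
cur.execute(f"SELECT {', '.join(COLUMNS)} FROM dbo.very_big_table")
with open(CSV_PATH, "w", newline="") as f:
    writer = csv.writer(f)
    while True:
        rows = cur.fetchmany(10000)
        if not rows:
            break
        writer.writerows(rows)
src.close()

# 2. Bulk load the file into Aurora.
dst = mysql.connector.connect(host="aurora-cluster-endpoint", user="user",
                              password="secret", database="targetdb",
                              allow_local_infile=True)  # required for LOCAL INFILE
load = dst.cursor()
load.execute(f"""
    LOAD DATA LOCAL INFILE '{CSV_PATH}'
    INTO TABLE new_table
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    ({', '.join(COLUMNS)})
""")
dst.commit()
dst.close()
```

For a table with 1.8 billion rows you would most likely split the export into many smaller files and load them one after another (or in parallel) rather than producing a single enormous CSV.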

How to load data from MySQL to a MS SQL Server database?

I have a similar schema in both the MySQL and MSSQL Server databases. How can I migrate just the data from MySQL to an empty (no data) MSSQL Server database? The MSSQL Server DB is empty, with just the schema. I could not configure the MySQL DB as a linked server (through ODBC) since I don't have DB admin rights on the MSSQL Server; I just have privileges to add data. I explored SQL Server Migration Assistant for MySQL, but I just want to migrate the data without touching the schema at the target.
I also noticed that there is a SqlBulkCopy class which helps to programmatically migrate data in .NET.
But then I would need to write code for each table (there are more than 100 tables and 20 GB of data).
What is the most elegant way to do it?
SSMA might be the easiest. Since you don't want to use that, you can try mysqldump. You can use it to essentially dump a MySQL DB out for SQL Server.
Here is a link to the SSMA team's blog; they update these tools regularly, so check for a later version: SQL Server Migration Assistant (SSMA) Team's Blog
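If you do end up scripting it, you don't have to hand-write code for each of the 100+ tables: the table list can be read from information_schema and copied in a generic loop. Here is a minimal sketch along those lines, assuming identical table and column names on both sides; the connection details are placeholders and there is no special handling for identity columns or foreign-key ordering.

```python
# Sketch: copy every MySQL base table into the identically named SQL Server table.
# Assumes the target schema already exists and matches; connection details are placeholders.
import mysql.connector
import pyodbc

src = mysql.connector.connect(host="mysql-host", user="user",
                              password="secret", database="sourcedb")
dst = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                     "SERVER=mssql-host;DATABASE=targetdb;UID=user;PWD=secret")
dst_cur = dst.cursor()
dst_cur.fast_executemany = True

# Discover the tables to copy instead of hard-coding 100+ table names.
meta = src.cursor()
meta.execute("SELECT table_name FROM information_schema.tables "
             "WHERE table_schema = DATABASE() AND table_type = 'BASE TABLE'")
tables = [row[0] for row in meta.fetchall()]
meta.close()

for table in tables:
    reader = src.cursor()
    reader.execute(f"SELECT * FROM `{table}`")
    columns = [desc[0] for desc in reader.description]
    insert_sql = (f"INSERT INTO dbo.[{table}] "
                  f"({', '.join('[' + c + ']' for c in columns)}) "
                  f"VALUES ({', '.join('?' for _ in columns)})")
    while True:
        rows = reader.fetchmany(1000)
        if not rows:
            break
        dst_cur.executemany(insert_sql, rows)
        dst.commit()
    reader.close()

src.close()
dst.close()
```

In effect this is the same idea as SqlBulkCopy in a loop, just driven from the schema metadata instead of per-table code.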

How can I sync my MSSQL database to MySQL?

I am using MSSQL as my transactional database. This database has 200+ tables, with about 25 tables that hold 1M+ records. I want to replicate the data, with the same structure, into a MySQL database for reporting. Is there a tool or method that can do this in a fairly real-time manner?
Use SQL Server's Linked Server to access MySQL from SQL Server.
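Once the linked server is in place, rows can be pushed from SQL Server into MySQL with INSERT ... OPENQUERY, and that push can be scheduled (for example from a SQL Server Agent job) to approximate near-real-time sync. Below is a minimal sketch driven from Python via pyodbc; the linked server name MYSQL_LINK, the table names, and the modified_at change-tracking column are all placeholder assumptions, not part of the original answer.

```python
# Sketch: push recently changed rows from SQL Server to MySQL through a linked
# server named MYSQL_LINK. Server, table, and column names are placeholders.
import pyodbc

conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                      "SERVER=mssql-host;DATABASE=transactional;UID=user;PWD=secret")
cur = conn.cursor()

# INSERT ... OPENQUERY writes through the linked server into the MySQL table.
cur.execute("""
    INSERT INTO OPENQUERY(MYSQL_LINK, 'SELECT id, name, amount FROM reporting.orders')
    SELECT id, name, amount
    FROM dbo.orders
    WHERE modified_at >= DATEADD(MINUTE, -5, SYSUTCDATETIME());
""")
conn.commit()
conn.close()
```

Note this only covers pushing new or recently changed rows; updates and deletes would need their own handling, which is why change-data-capture tools are usually a better fit for true replication.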

MySQL to SQL Server 2008 migration

We get a MySQL 5.0 dataset each month (1.7 GB) and I need to create a process to migrate it to SQL Server 2008.
This seems a little harder than I first thought...
I've tried a couple of import methods:
1. Using the Import wizard
2. Setting up a linked server
and a couple of different providers:
3. The .NET Framework Data Provider for MySQL
4. The MySQL ODBC 5.1 driver
If I try options 1 + 4 (wizard, using ODBC), I get "unable to retrieve column information".
With options 2 + 4, I get the message: "Cannot get the column information from OLE DB provider "MSDASQL" for linked server "server name"."
This feels like a caching or size issue, because if I limit the rows returned to fewer than 300,000 it works. This is all the more annoying because the main table has over 1.2 million rows.
So my question has two parts: am I doing this the right way or the wrong way, and have I missed something obvious?
You can use SQL Server Integration Services to connect to the MySQL database and pull the data you need over. The SSIS team blog has a walk-through for connecting to MySQL at Connecting to MySQL from SSIS. Once you build your SSIS package, you can re-use it each time you get a new data dump.