We have a MySQL 5.1 server and we are trying to import data into it from another server using SAP Data Services. It works, but it takes 3 hours!
We tried to set up an Oracle XE server to test the same operation, and it takes only 10 minutes.
I think the problem is that with Oracle the "Rows per commit" setting (1000) works fine, but with MySQL every single row is committed. Is that possible? How can we solve it?
Screenshot of the exporting settings
SAP BusinessObjects Data Services (BODS) allows you to control the number of rows per commit if the data flow ends with a datastore on your MySQL server.
BODS includes an optimisation PDF guide as part of the help documentation - check that for other recommendations.
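For context on what a per-row commit costs on the MySQL side, here is a minimal sketch, outside of Data Services, of the batched-commit behaviour the "Rows per commit" setting is meant to give you: one transaction per 1,000 rows instead of an implicit commit after every insert. It uses MySQL Connector/NET, and the connection string, table, and columns are placeholders.

```csharp
// Minimal sketch (not Data Services itself) of committing every 1,000 rows
// instead of letting autocommit fire after each insert. Connection string,
// table, and columns are placeholders.
using MySql.Data.MySqlClient;

class BatchedLoad
{
    static void Main()
    {
        using (var conn = new MySqlConnection(
            "Server=mysqlhost;Database=target_db;Uid=etl;Pwd=secret;"))
        {
            conn.Open();
            var tx = conn.BeginTransaction();   // suspends autocommit for the batch
            int rowsInBatch = 0;

            for (int i = 0; i < 100000; i++)
            {
                using (var cmd = new MySqlCommand(
                    "INSERT INTO target_table (id, payload) VALUES (@id, @payload)",
                    conn, tx))
                {
                    cmd.Parameters.AddWithValue("@id", i);
                    cmd.Parameters.AddWithValue("@payload", "row " + i);
                    cmd.ExecuteNonQuery();
                }

                if (++rowsInBatch == 1000)      // "Rows per commit" = 1000
                {
                    tx.Commit();
                    tx = conn.BeginTransaction();
                    rowsInBatch = 0;
                }
            }

            tx.Commit();                        // flush the final partial batch
        }
    }
}
```

If the commit size is genuinely being ignored for MySQL, the symptom is the same as running this loop with autocommit left on: every insert becomes its own transaction and its own flush to disk, which is consistent with a 3-hour load.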
Related
How can I check how long it took to extract and load data in SSIS?
I am using the Attunity connector for extracting data from Oracle and then an OLE DB destination to load it into SQL Server. Using Attunity is not making any difference in speed: it took approximately 5 minutes to extract and load 5,600,000 rows from Oracle to SQL Server with the Attunity connector, which is the same as with the OLE DB source.
In the Progress tab, you can see the time taken to complete the data load.
As others have pointed out, the package duration is included in the GUI and you can also write the start/end times to a logging table within the package to capture the execution time. If you're looking for more detail on specific components, enable logging, choose the PipelineComponent event, and look for this event on the tasks that you're looking to monitor. The PrimeOutput step, which is when the data is sent downstream, will be of the most interest.
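If you go the logging-table route, the statements are simple and usually live in two Execute SQL Tasks at the start and end of the control flow. Here is a sketch of the equivalent calls, assuming a hypothetical dbo.PackageRunLog table (RunId int identity, PackageName, StartTime, EndTime) and a local connection string.

```csharp
// Sketch of the start/end logging described above; normally these statements
// run from two Execute SQL Tasks. dbo.PackageRunLog and the connection
// string are assumptions.
using System.Data.SqlClient;

static class RunLogger
{
    const string ConnStr = "Data Source=.;Initial Catalog=ETL;Integrated Security=SSPI;";

    public static int LogStart(string packageName)
    {
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.PackageRunLog (PackageName, StartTime) " +
            "OUTPUT INSERTED.RunId VALUES (@name, GETDATE());", conn))
        {
            cmd.Parameters.AddWithValue("@name", packageName);
            conn.Open();
            return (int)cmd.ExecuteScalar();   // keep the id for the matching end-of-run update
        }
    }

    public static void LogEnd(int runId)
    {
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand(
            "UPDATE dbo.PackageRunLog SET EndTime = GETDATE() WHERE RunId = @id;", conn))
        {
            cmd.Parameters.AddWithValue("@id", runId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

Subtracting StartTime from EndTime then gives you the package duration per run, which you can trend over time.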
I am new to SSIS. Is there any component to load data from MySQL to SQL Server using SSIS? Currently I am loading data over an ODBC connection and it is really slow, around 30,000 rows per minute. Is there any way to make the load run faster?
Thanks in Advance...
You can install the .NET Connector for MySQL: http://dev.mysql.com/downloads/connector/net/
Then you can create a script task to act as a data source, import MySql.Data.MySqlClient, and query MySQL directly in C#. The data will then enter your Data Flow and you can map it to a SQL Server destination the same as normal.
I find that when using the SSIS connection manager with .Net Providers I get malformed SQL errors, but this way you write all the SQL yourself.
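In current SSIS terminology this is a Script Component configured as a Source inside the data flow. A trimmed sketch of what its generated ScriptMain class might contain follows; the query, connection string, and the Output0Buffer column names are illustrative, since those columns only exist after you define them on the component's Inputs and Outputs page.

```csharp
// Illustrative body of an SSIS Script Component configured as a Source; this
// method goes inside the component's auto-generated ScriptMain class. The
// query, connection string, and Output0Buffer columns are examples only.
using MySql.Data.MySqlClient;

public override void CreateNewOutputRows()
{
    const string connStr = "Server=mysqlhost;Database=legacy;Uid=etl;Pwd=secret;";

    using (var conn = new MySqlConnection(connStr))
    using (var cmd = new MySqlCommand("SELECT id, name, created_at FROM customers", conn))
    {
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                Output0Buffer.AddRow();                      // push one row into the data flow
                Output0Buffer.Id = reader.GetInt32(0);
                Output0Buffer.Name = reader.GetString(1);
                Output0Buffer.CreatedAt = reader.GetDateTime(2);
            }
        }
    }
}
```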
To improve performance, you can add a Conditional Split transformation to build some parallelism into the common data flow that loads data directly from the ODBC source to the OLE DB destination.
For more information about speeding up SSIS Bulk Inserts into SQL Server, please see the following blog:
http://henkvandervalk.com/speeding-up-ssis-bulk-inserts-into-sql-server
In the Data Flow Task properties, increase the buffer size and the number of rows per commit on the destination.
We are in the process of moving our backend from MS SQL Server to MySQL. Actually, we currently use a couple of MySQL servers, but mostly MS SQL Server. I mention this because we are not totally new to MySQL. Each day we do a lot of ETL to keep our backend in sync with a legacy system. We move a lot of data, and working with SQL Server has been so much easier than working with MySQL for ETL. I know SSIS is MS, but still, it has been a headache.
We are using SQL Server 2012 and BIDS 2010. It has been a struggle to move MySQL data at the same rate as MS SQL data. We are mainly dealing with InnoDB tables in MySQL. To summarize, I have been using the MySQL ODBC connector and the ODBC destination in SSIS. The first step is to turn autocommit off on imports. Even with that setting off, I can see during package execution that the data source ends up waiting on the destination. It gets about 40,000 rows ahead and waits.
Next I export the data to a text file and then import it using a SQL task and the LOAD DATA INFILE command. This gives pretty good performance, but at the expense of more moving parts. I've had a couple of issues with this approach, but it does work and performs well.
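For reference, a rough sketch of that second step, issuing LOAD DATA LOCAL INFILE from code rather than from the SQL task; the file path, table name, delimiters, and the AllowLoadLocalInfile connection option (required by newer Connector/NET versions) are assumptions to adjust.

```csharp
// Sketch of the bulk-load step above, issuing LOAD DATA LOCAL INFILE from
// code. File path, table name, delimiters, and the AllowLoadLocalInfile
// option are assumptions.
using MySql.Data.MySqlClient;

class InfileLoad
{
    static void Main()
    {
        const string dataFile = @"C:\etl\customers.txt";   // tab-delimited export from the data flow

        using (var conn = new MySqlConnection(
            "Server=mysqlhost;Database=target_db;Uid=etl;Pwd=secret;AllowLoadLocalInfile=true;"))
        {
            conn.Open();
            string sql =
                "LOAD DATA LOCAL INFILE '" + dataFile.Replace(@"\", @"\\") + "' " +
                "INTO TABLE customers " +
                "FIELDS TERMINATED BY '\\t' " +
                "LINES TERMINATED BY '\\r\\n';";

            using (var cmd = new MySqlCommand(sql, conn))
            {
                cmd.ExecuteNonQuery();          // one round trip for the whole file
            }
        }
    }
}
```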
Lastly I tried a third-party SSIS component from Devart. It creates custom MySQL source and destination components. The performance isn't as good as INFILE, but it's not bad, and it makes the package simple, like when dealing with SQL Server: a data source and a data destination. No messing with autocommit, exports, INFILE, etc. However, I can't use those connections to do other tasks like truncating tables and such, so I still have my ODBC connection for those tasks. I'm going to ask Devart about this.
Right now it looks like Devart is going to be a nice balance. If I absolutely need the performance I have the INFILE method.
I also tried the MySQL .NET connector and could not get that to work at all. I'm running on Windows 7 64-bit with SQL Server 2012 64-bit. Basically everything I need in BIDS runs in 32-bit, so I'm guessing that is part of the issue.
My question is: what are others doing when it comes to moving MySQL data with SSIS? It has been such a hassle. It would be nice to get some input on what others are doing. What methods are you using? Are you using third-party components? Is there a better or dedicated place to discuss SSIS and MySQL?
We have a database in production running on SQL Server 2012. The database was copied, and then changes were made to the schema, structure, and relationships during further development of our application that uses the database.
We are now ready to migrate the data from the old production database to our newly developed database. We spent a late night last night trying to figure out how to get that done using Red Gate's SQL Schema Compare and SQL Data Compare. We were not very successful, as we kept getting errors during the SQL Schema Compare deployment.
The options and features of Red Gate's tools are very extensive, and I don't have much experience using them. I tried creating migration scripts within SQL Compare, but as the databases weren't linked to source control beforehand, I don't see where or how to write these scripts.
The question I have is: what is the proper sequence to migrate data using Red Gate's tools, or would it just be easier to create SSIS packages to handle it all? The number of records we are migrating is relatively low, less than 10,000. We would like to preserve the identity values, as we have some scripts that may depend on specific identity values (see the IDENTITY_INSERT sketch below).
Any tips, suggestions, and comments are much appreciated.
Thanks
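On the identity-value requirement specifically: both SQL Data Compare deployment scripts and the SSIS OLE DB destination's "Keep identity" option can carry source identity values across, and under the covers that amounts to wrapping the inserts in SET IDENTITY_INSERT. A minimal sketch, assuming a hypothetical dbo.Customer table and an already-open destination connection:

```csharp
// Hypothetical fragment showing how explicit identity values are inserted on
// SQL Server. dbo.Customer and its columns are placeholders; "dest" is an
// already-open SqlConnection to the new database.
using System.Data.SqlClient;

static class IdentityCopy
{
    public static void CopyCustomer(SqlConnection dest, int id, string name)
    {
        const string sql =
            "SET IDENTITY_INSERT dbo.Customer ON; " +
            "INSERT INTO dbo.Customer (CustomerId, Name) VALUES (@id, @name); " +
            "SET IDENTITY_INSERT dbo.Customer OFF;";

        using (var cmd = new SqlCommand(sql, dest))
        {
            cmd.Parameters.AddWithValue("@id", id);      // identity value carried over from the old database
            cmd.Parameters.AddWithValue("@name", name);
            cmd.ExecuteNonQuery();
        }
    }
}
```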
I'm trying to create a completely new database from an existing MySQL database, bringing over both data and schema, but so far the only way I've been able to do this is to first import the MySQL database into MS Access, and then into SQL Server 2005. Crazy, right? Surely there is a way that doesn't involve tedious, custom, time-consuming programming, right (perhaps using SSIS)?
A few additions to my original description above:
It's a pretty good-sized database (easily a few gigs).
I'm working in an MS environment (ASP.NET, C#).
I'm under a tight deadline so I'm looking for an automated process that requires little to no effort in the conversion process.
SSIS would be the preferred way via BIDS (VS 2005)
Thanks for all the great input!
I believe that using the phpMyAdmin tool you can script the MySQL database structure and data into a sql script. Then you simply run those two scripts on your SQL Server 2005 database and it should, in most cases, create the database and fill it with data. It's been a couple years since I had to do it myself, but as I recall that was the process I used to transfer a MySQL database to SQL Server in the past. You will probably have to alter the structure script to change some of the data types to their SQL Server equivalents, but the data should load just fine once you've got the data types all sorted.
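When massaging the structure script, the edits are mostly mechanical type substitutions. A hypothetical helper showing the usual kind of mapping; the list is deliberately incomplete and the matching is naive (case-sensitive string replacement), so treat it as a starting point only.

```csharp
// Hypothetical helper for massaging a MySQL structure script into T-SQL: a
// few common type substitutions. Incomplete list, naive case-sensitive
// replacement - illustration only.
using System.Collections.Generic;

static class TypeMap
{
    static readonly Dictionary<string, string> MySqlToSqlServer =
        new Dictionary<string, string>
        {
            { "TINYINT(1)",     "BIT" },
            { "INT UNSIGNED",   "BIGINT" },
            { "LONGTEXT",       "VARCHAR(MAX)" },
            { "TEXT",           "VARCHAR(MAX)" },
            { "BLOB",           "VARBINARY(MAX)" },
            { "DOUBLE",         "FLOAT" },
            { "AUTO_INCREMENT", "IDENTITY(1,1)" },
        };

    public static string Translate(string createTableScript)
    {
        foreach (var pair in MySqlToSqlServer)
            createTableScript = createTableScript.Replace(pair.Key, pair.Value);
        return createTableScript;
    }
}
```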
I think you can use SQLYog to generate some fairly standard SQL which will dump out and recreate your db, with data. You may have to massage its output for SQL Server's dialect of SQL a bit, though...
The responses I received were certainly helpful, but the solution, it would seem, is to do a mysqldump and then run that script from SSIS, massaging the output as needed; however, AFAIK it is not possible to use VS 2005 BIDS to create an SSIS package that completely transfers a MySQL database to a SQL Server 2005 database (data and schema) on Windows Vista 64. I said AFAIK, but who knows, the interwebs have much to reveal :)
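For completeness, a sketch of driving mysqldump from a script task before the output is massaged and replayed against SQL Server; the paths, credentials, and the usefulness of mysqldump's --compatible=mssql switch (it only smooths some syntax, it does not convert data types) are assumptions.

```csharp
// Sketch of running mysqldump from code before the output is hand-edited
// and replayed against SQL Server. Paths, credentials, and the value of
// --compatible=mssql are assumptions.
using System;
using System.Diagnostics;

class DumpMySql
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            FileName = @"C:\Program Files\MySQL\MySQL Server 5.1\bin\mysqldump.exe",
            Arguments = "--user=etl --password=secret --compatible=mssql " +
                        @"--result-file=C:\etl\legacy_dump.sql legacy",
            UseShellExecute = false,
            RedirectStandardError = true
        };

        using (var proc = Process.Start(psi))
        {
            Console.Error.Write(proc.StandardError.ReadToEnd());   // surface any mysqldump warnings
            proc.WaitForExit();
            // legacy_dump.sql still needs hand edits (types, backticks,
            // AUTO_INCREMENT) before it will run on SQL Server 2005.
        }
    }
}
```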