SSIS: SQL 2008 R2 to MySQL data loss

I have an SSIS package set up to export data from a SQL Server 2008 R2 table to a MySQL version of that table. The package executes, but about 1% of the rows fail to be exported.
My source connection uses the SQL statement
SELECT * FROM Table1
All of the columns are integers. An example of a row which is exported successfully is
2169, 2680, 3532, NULL, 2169
compared to a row which fails:
2168, 2679, 3532, NULL, 2168
There is virtually no difference between them that I can ascertain.
Notably, if I change the source query to attempt the transfer of only a single failing row, i.e.
SELECT * FROM Table1 WHERE ID = 2168
then the record is exported fine; it is only when it is part of a SELECT which returns multiple rows that it fails. The same rows fail the export each time. I have redirected error rows to a text file, which shows a -1071610801 error for the failing rows. This apparently translates to:
DTS_E_ADODESTERRORUPDATEROW: "An error has occurred while sending this row to destination data source."
which doesn't really add a great deal to my understanding of the issue!
I am wondering if there is a locking issue or something similar preventing given rows from being fetched or inserted correctly. If anyone has any ideas or suggestions on what might be causing this, or better still how to resolve it, they would be greatly appreciated. I am currently at a total loss...
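In case it helps anyone quantify the loss: assuming a linked server from SQL Server to the MySQL database has been set up (MYSQL_LINKED below is an invented name, not something from the original package), something along these lines can list exactly which rows went missing:

-- Row counts on both sides (MYSQL_LINKED is an assumed linked server name)
SELECT COUNT(*) AS source_rows FROM Table1;
SELECT n AS mysql_rows
FROM OPENQUERY(MYSQL_LINKED, 'SELECT COUNT(*) AS n FROM Table1');

-- IDs present in the source but absent from the MySQL copy
SELECT s.ID
FROM Table1 AS s
LEFT JOIN OPENQUERY(MYSQL_LINKED, 'SELECT ID FROM Table1') AS d
    ON d.ID = s.ID
WHERE d.ID IS NULL;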

Try setting a longer timeout (e.g. one day) on the MySQL (ADO.NET) destination.
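If the destination uses MySQL Connector/NET, the command timeout can be raised in the connection string; a sketch, with placeholder server name and credentials (86400 seconds = 1 day):

Server=myserver;Database=mydb;Uid=myuser;Pwd=mypassword;Default Command Timeout=86400;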

Well, after much head scratching and attempting every workaround I could come up with, I have finally found a solution.
In the end I switched out the MySQL connector for a different driver produced by Devart, dotConnect for MySQL, and, with a few minor exceptions (which I think I can resolve), all of my data is now exporting without error.
The driver is a paid-for product, unfortunately, but in the end I'd have taken out a new mortgage to see all those tasks go green!

Related

Migrating Microsoft SQL Server to MySQL: Odd errors I can't find documentation or fixes for, including "text is too long"

I have a database in SQL Server that I am trying to convert into a MySQL database, so I can host it on AWS and move everything off-premises. From this link, it seems like normally this is no big deal, although that link doesn't seem to migrate from a .bak file so much as from your local instance of SQL Server that is running and contains the database in question. No big deal, I can work with that.
However, when I actually use MySQL Workbench to migrate using these steps, it gets to the Bulk Data Transfer step and then comes up with odd errors.
I get errors like the following:
ERROR: OptionalyticsCoreDB-Prod.UserTokens:Inserting Data: Data too long for column 'token' at row 1
ERROR: OptionalyticsCoreDB-Prod.UserTokens:Failed copying 6 rows
ERROR: OptionalyticsCoreDB-Prod.UserLogs:Inserting Data: Data too long for column 'ActionTaken' at row 1
ERROR: OptionalyticsCoreDB-Prod.UserLogs:Failed copying 244 rows
However, the data should not be "too long." These columns are nvarchar(MAX) in SQL Server, and the data in them is often very short in the specified rows, nothing that approaches the maximum length of an nvarchar.
Links like this and this show that there used to be bugs with nvarchar formats almost a decade ago, but they have been fixed for years now. I have checked and even updated and restarted my software and then my computer; I have up-to-date versions of MySQL and MySQL Workbench. So what's going on?
What is the problem here, and how do I get my database successfully migrated? Surely it's possible to migrate from SQL Server to MySQL, right?
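For reference, a quick length check on the SQL Server side, using the table and column names from the error messages above, confirms whether anything is actually near a limit:

SELECT MAX(LEN(token)) AS max_token_len FROM UserTokens;
SELECT MAX(LEN(ActionTaken)) AS max_action_len FROM UserLogs;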
I have answered my own question... Apparently there IS some sort of bug in Workbench when translating SQL Server nvarchar(MAX) columns. I output the schema migration to a script and examined it: Workbench was translating those columns as varchar(0). After replacing all of them with TEXT columns, the migration completed successfully.
Frustrating lesson.
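For anyone hitting the same bug, the fix amounts to editing the generated schema script before running it. A hypothetical excerpt, using the UserTokens table named in the errors above (the real table will have more columns):

-- What Workbench generated (broken):
CREATE TABLE `UserTokens` (
  `token` VARCHAR(0)
);

-- Manually corrected:
CREATE TABLE `UserTokens` (
  `token` TEXT
);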

SSIS - OLE DB connection not updating data inserted through SqlBulkCopy with a SQL connection

I have created an OLE DB connection to execute different SQL tasks across an SSIS package. It's working fine too.
In one task where I need to insert data into a table, I used SqlBulkCopy, as I have dynamic tables and columns based on files arriving from different sources.
SqlBulkCopy only works with SqlConnection, so I opened a SqlConnection and executed the SqlBulkCopy. This also works fine.
After the SqlBulkCopy is done, I have an Execute SQL Task which updates metadata about the inserted rows (e.g. count, min and max date, and so on) in a different table.
This table is not getting updated, yet if I execute the stored procedure from SQL Server Management Studio, it works as expected.
So my assumption is that the OLE DB connection is not able to see the latest data inserted through the SqlConnection.
I may be wrong, but I'm not sure why the Execute SQL Task reports success while the table is still not updated.
Am I missing anything here?
My bad.
Instead of passing the data type as long (int in SQL), I was passing it as varchar.
I had been looking at this for the last few hours, and as soon as I posted the question here it struck me to check the data type.
Hope it helps somebody.
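To make that concrete: if the metadata table's columns are numeric, the Execute SQL Task's parameter mapping must use a matching numeric type (LONG for an int column), not VARCHAR. A minimal sketch; the procedure, parameter, and table names below are invented for illustration, not from the original package:

-- Hypothetical metadata procedure; every name here is an assumption.
CREATE PROCEDURE dbo.UpdateLoadMetadata
    @LoadId   INT,       -- map as LONG in the Execute SQL Task, not VARCHAR
    @RowCount INT,       -- likewise numeric, not VARCHAR
    @MinDate  DATETIME,
    @MaxDate  DATETIME
AS
BEGIN
    UPDATE dbo.LoadMetadata
    SET RowCount = @RowCount,
        MinDate  = @MinDate,
        MaxDate  = @MaxDate
    WHERE LoadId = @LoadId;
END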

SSIS - Deployed package SQL Command validation error

I have created an SSIS package to transfer data from Access to SQL Server.
Source > SQL command from an "mdb" file joining two tables
Destination > Flat table in SQL Server
I'm performing the join in the source SQL command because of the number of records in the Access tables (~500k).
I tried using an SSIS join, but it takes ages doing the ordering before the join.
While running the package in VS2010, it works great.
But after deploying and executing the package on my SQL Server 2014, the following error occurs:
No column information was returned by the SQL command.
Returned validation status "VS_NEEDSNEWMETADATA".
I'm pretty sure my SQL command is correct (it works in VS, and the preview button in the editor shows me records).
I tried disabling ValidateMetadata, but the same error still occurs, just at execution time instead.
On the SQL Server 2014 instance I have other packages pulling Access data (but without joins) and they work properly.
Thanks for your help,
Q.
ValidateMetadata is (generally) a good thing.
This error is caused by the metadata on your source or destination (it's unclear from your question which) being different.
At a guess, at least one of the columns in your SQL 2014 database is of a different data type (or length, or nullability, etc.); there is a difference either way.
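Following that guess, one way to pin down the difference is to compare the destination table's column metadata between the environment where the package works and the SQL 2014 server; any mismatch in type, length, or nullability will trigger VS_NEEDSNEWMETADATA. A sketch (MyDestinationTable is a placeholder for the real table name):

SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH, IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'MyDestinationTable'
ORDER BY ORDINAL_POSITION;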

SSIS missing data from SQL table using Fast Load

I have a bit of a problem. When I set up an SSIS package and fire it off, it shows me the number of rows going into the SQL table, but when I query the table, almost 40,000 rows are missing compared to the last count after the Conditional Split in the package.
What causes this problem? Even if I target a normal table or a view it still does the same thing, but here I have to use the fast load option as there are a lot of source files being loaded. This is only testing before sending it to production, and I am stuck at the moment. Is there a way I can work around this problem and get all the data that is supposed to be pumped into the table? Please also note that the Conditional Split removes any NULL values, as seen in the first picture.
Check the Error Output (under Connection Manager and Mappings) within the Destination component. If the error setting is set to Ignore Failure or Redirect Row, the component will succeed, but only the successful rows will be inserted.
What is the data source? Try checking your data to make sure you don't have any terminator characters stored in one of the rows.
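For example, a check along these lines against a staging copy of the loaded data can surface rows with embedded CR/LF characters (the table and column names are placeholders):

SELECT *
FROM StagingTable
WHERE SomeColumn LIKE '%' + CHAR(13) + '%'
   OR SomeColumn LIKE '%' + CHAR(10) + '%';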

How do you coerce float values in MySQL for classic ASP scripts?

I have been charged with maintaining a legacy classic ASP application. The application uses an ODBC system DSN to connect to a MySQL database.
We recently had to update the servers to satisfy some licensing requirements. We were on Windows, with MySQL 4.x and the 3.51 ODBC driver. We moved to a Linux machine running MySQL 5.1.43 and are running the 5.1.6 ODBC driver on the new IIS server.
Users almost instantly started reporting errors such as:
Row cannot be located for updating.
Some values may have been changed
since it was last read.
This is a ghost error: the same data change, on the same record, at a different time won't always produce the error. It is also intermittent between records; for some records, no matter what values I plug in, I haven't been able to reproduce the defect at all.
It is happening across 70 of about 120 scripts, many over 1,000 lines long.
The only consistency I can find is that all of the scripts that fail are reading/writing floats to the DB. Fields that have a null value don't seem to crash, but a value like '19' in the database (note the lack of decimal places) seems to fail, whereas '19.00' does not. Most floats are defined as 11,2.
The scripts are using ADODB and recordsets. Updates are done with the following pattern:
1. SELECT * FROM table WHERE ID = <the updated record's ID>
2. Update the properties of the record from the form.
3. Call RecordSet.Update and RecordSet.Close.
The error is generated by the RecordSet.Update call.
I have created a workaround: rather than select/copy/update, I generate a SQL statement that I execute directly. This works flawlessly (obviously, an UPDATE statement with a WHERE clause is more focused and doesn't consider fields that weren't updated), so I have a pretty good feeling that a rounding issue with the floats is causing a mismatch when the record is re-retrieved on the update call.
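To illustrate the difference with a hypothetical table and column (mytable, amount, and the ID value below are invented): the recordset re-locates the row by matching the originally read values, so an inexactly round-tripped float breaks the match, while the generated statement keys on the ID alone.

-- Roughly what the ADODB recordset issues under optimistic concurrency;
-- the float comparison is what can fail to match:
UPDATE mytable
SET amount = 19.00
WHERE ID = 123
  AND amount = 19;  -- plus the other originally read column values

-- The workaround: a focused UPDATE keyed on the primary key always finds the row.
UPDATE mytable
SET amount = 19.00
WHERE ID = 123;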
I really would prefer NOT to rewrite hundreds of these instances (a grep across the source directory finds 280+ update calls).
Can anyone confirm that the issue here is related to floats/rounding?
And if so, is there a global fix I can apply?
Thanks in advance,
-jc
Have a look at MySQL Forums :: ODBC :: "Row cannot be located for updating."
They seem to have found some workarounds and some explanations as well.
I ran into a similar issue with a VBA macro utilizing MySQL 4.1. When we upgraded to 5, errors started popping up.
For me the issue was that values being returned to VBA from MySQL were in a decimal format unhandled by VBA.
A CAST on the numbers when querying helped to fix the issue.
So for your issue, perhaps the ODBC/ASP combination is recording/reading values differently than what you might expect them to be.
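Given that the columns here are defined as 11,2, a cast in the SELECT along these lines (the table and column names are placeholders) may normalize what the ODBC driver hands back:

SELECT ID, CAST(amount AS DECIMAL(11,2)) AS amount
FROM mytable
WHERE ID = 123;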