Conversion error between "DT_DBTIME2" and "DT_WSTR" for "ETAHour" - sql-server-2008

I have a SQL query that returns a "time(0)". I load that into SSIS, and it gets automatically converted to a "DT_DBTIME2", which is okay. I can transform it to any other type without error using a Data Conversion data flow component.
My problem is that when I try to insert that value into a "time(0)" field of a table, it gives me the following error:
The OLE DB provider used by the OLE DB adapter cannot convert between
types "DT_DBTIME2" and "DT_WSTR" for "ETAHour".
When I mouse over the fields in the OLE DB Destination component, it clearly says that the source field is a DT_DBTIME2 and the destination field is a DT_DBTIME2. I really wonder where this conversion error comes from.

Make sure that you are specifying the provider in the connection string.
In my case I'm using MSSQL 2012 Enterprise. It works on the local machine, but after updating the connection string in the dtsConfig in the installer, it fails with the error above.
Setting the OLE DB provider fixed the issue; in my case:
Provider=SQLNCLI11.1
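For reference, a full connection string with the provider set explicitly looks roughly like this (the server and database names are placeholders, not from the original post):
Provider=SQLNCLI11.1;Data Source=MyServer;Initial Catalog=MyDatabase;Integrated Security=SSPI;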

I had a similar issue after importing an existing SSIS project: I created a new connection string, and after switching to it I got multiple errors.
Changing the OLE DB provider to SQL Server Native Client 11.0 solved the issue, which is the equivalent of setting SQLNCLI11.1.

Related

SSIS ODBC Simba - Error when access table list on ODBC Source \ Destination

I'm using Simba ODBC to create a connection with Google BigQuery and using SSIS (Visual Studio 2019) to read and write information on BigQuery. The connection works fine, and when I use the ODBC Source with the query option, I'm able to get data from BigQuery and use it inside SSIS. But when I use the list of tables, I get an error as below:
Exception of HRESULT: 0xC0014020
Error in Data Flow Task[ODBC Source [100]]: SQLSTATE: 42000, Message: [Simba][BigQuery] (70) Invalid query: Invalid dataset ID ""TEST"". Dataset IDs must be alphanumeric (plus underscores and dashes) and must be at most 1024 characters long.
I believe that this happens because the list of tables appears between double quotes (") instead of backticks (`).
[Screenshot: table list]
The same happens when I use the ODBC Destination. Is there a way to change the format in which the table list appears?
Note: in Visual Studio 2015 this table list comes with backticks (`) and I can connect to BigQuery just fine.
I can see that the tool is sending "TEST" as the dataset; however, depending on whether Visual Studio is using StandardSQL or LegacySQL, the dataset should be specified as:
# LegacySQL
FROM [myproject:TEST.TABLE_TEST]
# StandardSQL
FROM `myproject:TEST.TABLE_TEST`
I was wondering if Visual Studio accepts a custom query or can be parameterized to remove the quotes. If this doesn't help, could you please share the query that causes the error? I understand that there is a query option (I'm not familiar with Visual Studio), and it is not clear to me at which exact moment the tool returns the error; a screenshot without sensitive information would be appreciated.
UPDATE:
You can review the following checkpoints, which could help to verify that the Simba driver is correctly set up and is not the cause of the reported error:
Installation. Check that you are using the latest version of the driver; the latest version usually contains improvements to the driver.
ODBC Configuration. For example, at Step 13 of the linked guide you will see a drop-down list with the available datasets and can select one as the default. If you don't have issues at this step, then the issue is likely in the tool that uses the ODBC connection.
Language Dialect. Here you can change between StandardSQL and LegacySQL as needed; for example, you can force your tool to use LegacySQL and use the [ and ] characters explained above.
Connection String. If your tool allows using a connection string, you might want to use one and explicitly indicate the default dataset (among other driver options), as sketched below.
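A rough sketch of such a connection string; the key names (Catalog, DefaultDataset, SQLDialect) are the ones typically used by the Simba BigQuery ODBC driver and should be checked against your driver's documentation, and the project name is a placeholder:
Driver=Simba ODBC Driver for Google BigQuery;Catalog=myproject;DefaultDataset=TEST;SQLDialect=1;
(plus whatever authentication options your setup requires)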

How to fix conversion errors after importing an SSIS project

I'm importing a perfectly working SSIS project from TFS.
Actually, I have a problem with all the packages that contain a data flow importing a date.
I get dozens of this error:
Validation error. DFT Get Date ODBC Source CodeDate2 [63]: The OLE DB provider used by the OLE DB adapter cannot convert between types "DT_BYTES" and "DT_DBDATE" for "Date".
And when I click on the ODBC Source editor, I get the following message:
The metadata of the following output columns does not match the metadata of the external columns with which the output columns are associated:
Output "ODBC Source Output": "Date"
Do you want to replace the metadata of the output columns with the metadata of the external columns?
The fact is that it works everywhere but on my computer.
Is there an OLE DB provider component I'm lacking, or something like that?
Downgrading will work, but if that's not possible for you, then rewriting your queries may also solve your problem.
In my case I had a Postgres query returning columns of type date. I just converted them all to timestamptz using ::timestamptz. At that point the columns changed from DT_BYTES to DT_DBTIMESTAMP, which was just fine for my purposes.
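For illustration, a minimal sketch of that kind of rewrite, with placeholder table and column names:
-- Cast the date column so the ODBC layer exposes it as a timestamp,
-- which SSIS then maps to DT_DBTIMESTAMP instead of DT_BYTES
SELECT order_id,
       order_date::timestamptz AS order_date
FROM orders;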
It might be related to the version of Visual Studio or SSDT.
Try installing SSDT 15.8.0 (see SSDT previous releases) and run the package in it.
I once saw similar posts on MSDN after the release of Visual Studio 15.9.2:
Import from Teradata using ODBC gives VS_NEEDSNEWMETADATA error
ODBC Progress datatype problems after updating to VS 2017 15.9
Same here; I forced the type by casting it in the SELECT and it works:
SELECT
[...]
cast(release_date as datetime) as release_date,
[...]
FROM cm_wo

I can't import a CSV into Microsoft SQL Server Management Studio 2014 due to Pre Execute error 0xc020802e

I'm trying to import a CSV file into SQL Server Management Studio 2014 but keep hitting errors every time I try. Specifically, I get a pre-execute error:
Messages:
Error 0xc020802e: Data Flow Task 1: The data type for "Source download_fresh_filename_com_06_Apr_17_4EB41F5D720E569B7AD1D854B1EC3142_csv.Outputs[Flat File Source Output].Columns[Target URL]" is DT_NTEXT, which is not supported with ANSI files. Use DT_TEXT instead and convert the data to DT_NTEXT using the data conversion component. (SQL Server Import and Export Wizard)
Error 0xc0202094: Data Flow Task 1: Unable to retrieve column information from the flat file connection manager. (SQL Server Import and Export Wizard)
Error 0xc004701a: Data Flow Task 1: Source - download_fresh_filename_com_06_Apr_17_4EB41F5D720E569B7AD1D854B1EC3142_csv failed the pre-execute phase and returned error code 0xC0202094. (SQL Server Import and Export Wizard)
Information 0x4004300b: Data Flow Task 1: "Destination - download_fresh_filename_com_06_Apr_17_4EB41F5D720E569B7AD1D854B1EC3142" wrote 0 rows. (SQL Server Import and Export Wizard)
The CSV is UTF-8 encoded, ~114,900 rows and 20 columns. Here's what I've tried so far with no success:
Under Choose a Data Source > Advanced, I set the data type to [DT_TEXT], which didn't work; I then tried [DT_NTEXT], but that still didn't work.
Under Review Data Type Mapping, I set On Error (global) to Ignore; still didn't work.
Any help would be appreciated.
Thanks.
You can try with a limited number of rows to narrow down the row having the rogue values.
Also, funny though it may sound, in such cases, I have had better success by importing it first into Excel, and then onward to SQL Server. In other cases, to Excel, onward to Access and further on to SQL Server. Strange world.
I noticed a box I was missing before in the Import Wizard: under Choose a Data Source > Flat File Source there is a "Text qualifier" setting, which is set to <none> by default; I changed it to ".
Also, in the same section under Advanced, I changed the column width from the default 50 to 1000 for all columns.
Worked perfectly.
The error clearly points at the ANSI file setting; the solution is to use Unicode. There is a "Unicode" tick box on the data source page of the wizard; when you check this box, it will work fine.

SSIS truncation error

First, I have searched and searched and searched and not found anything that helps me with this.
I have an SSIS project that will fetch a lot of data from an iSeries AS400 and it does this in two very different steps.
Step 1 works perfectly so I manage to fetch tons of info from the AS400, so the connection itself is not the issue.
Step two fails horribly with the following three error codes:
[OLE DB Source [41]] Error: There was an error with OLE DB Source.Outputs[OLE DB Source Output].Columns[NAME] on OLE DB Source.Outputs[OLE DB Source Output]. The column status returned was: "Text was truncated or one or more characters had no match in the target code page.".
[OLE DB Source [41]] Error: The "OLE DB Source.Outputs[OLE DB Source Output].Columns[NAME]" failed because truncation occurred, and the truncation row disposition on "OLE DB Source.Outputs[OLE DB Source Output].Columns[NAME]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on OLE DB Source returned error code 0xC020902A. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
I have desperately tried to find the solution to this problem, and this is what I have done (none of which has helped):
1 - Advanced Editor on the source -> tab Input and Output Properties -> OLE DB Source Output -> Output Column, changed
a) the length to 40 (from 28) - no change
b) the data type to text (from string) - complete crash
c) the code page to UTF-8 (from 1251) - no change
2 - Fetched the information with OPENQUERY in SSMS; it works perfectly (see the sketch after this list).
3 - Screamed in frustration at the screen (didn't help).
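For reference, the OPENQUERY check from item 2 looked roughly like this (the linked server, library and table names are placeholders, not the real ones):
SELECT NAME
FROM OPENQUERY(AS400_LINKED_SERVER, 'SELECT NAME FROM MYLIB.MYTABLE');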
I am at roads end. I don't know what to do anymore. Help...?
Yes, this is completely maddening.
There are two sets of columns under OLE DB Source Output: "External Columns" and "Output Columns".
Have you tried changing the lengths of both - the "NAME" column under "External Columns" and under "Output Columns"?
This kind of error often comes from a mismatch between the External Column definition and its corresponding Output Column.
In an OLE DB Source, External Columns are supposed to be auto-typed according to the source data types: the external provider is supposed to talk metadata to SSIS, saying "well, this column is typed String(40)", for example. But either the provider or SSIS is often, let's say, "less than entirely competent" at getting the types and lengths right.
UPDATE: Have you tried checking the length of the data in the source, independently of SSIS? Something like:
SELECT MAX(Len(TheReallyAnnoyingColumn)) FROM TheTable
You may find that setting the error output for truncation in the Source editor dialog to "Ignore Failure" gets you around the issue.
Update - truncation redirect: I forced a truncation on the surname column with the error output set to Redirect, enabled a Data Viewer on the error output, and then copied the row from the data viewer to Notepad to show the error. I then ran the same dtsx with truncation set to Fail.
Everybody else is focused on the truncation. I'm curious about the "one or more characters had no match in the target code page" part of the error message.
How is the column actually defined on the IBM i? I'm particularly interested in the Coded Character Set Identifier (aka CCSID).
In a green screen you can use the Display File Field Description (DSPFFD) command.
You could also use the iNav GUI.
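For example, from a green-screen session (the library and file names are placeholders):
DSPFFD FILE(MYLIB/MYFILE)
The CCSID appears in the per-field details of the output.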

SSIS (2008R2) import from mssql to mysql failing due to a date column

I have an OLE DB connection to MSSQL and an ADO.NET destination (with an ODBC driver) to MySQL. The tables are exactly the same, and all the columns are working bar one.
The error message received is:
[ADO NET Destination [325]] Error: An exception has occurred during data insertion, the message returned from the provider is: Unable to cast object of type 'System.DateTime' to type 'System.Char[]'.
I've seen similar questions on other data types, but the resolution of changing to string does not work here. If I convert to string (it has to be length 29, otherwise the conversion step fails), I get the following error message:
[ADO NET Destination [325]] Error: An exception has occurred during data insertion, the message returned from the provider is: ERROR [HY000] [MySQL][ODBC 5.1 Driver][mysqld-5.5.15]Incorrect datetime value: '2011-03-21 11:23:48.573000000' for column 'LastModificationDate' at row 1
Other potentially relevant details:
connection driver- {MySQL ODBC 5.1 Driver}
script run before dataflow - set sql_mode='STRICT_TRANS_TABLES,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION,ANSI_QUOTES'
Other datetime columns are working
This column has a reasonably high proportion of nulls
mssql spec: [LastModificationDate] [datetime] NULL
mysql spec: LastModificationDate datetime NULL
Has anyone had experience with this issue and could provide some advice on resolving it?
Can you try converting it to a string on the SQL Server side in your query, using:
convert(char(10),LastModificationDate,111)+' '+convert(char(8),LastModificationDate,108)
This works for me all the time.
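For instance, in the OLE DB source query this might look like the following (the table name is a placeholder):
SELECT
    convert(char(10), LastModificationDate, 111) + ' ' + convert(char(8), LastModificationDate, 108) AS LastModificationDate
FROM dbo.SourceTable;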
I got the same big headache this week. I tried many ways and, thank God, finally one of them worked. Hope it helps you a little bit.
It applies to columns with data types such as int, datetime, decimal, and so on; here I'll call the column ColumnA and treat it as a datetime.
1. In the Data Flow source, use a SQL command to retrieve the data, something like: SELECT ISNULL(ColumnA, '1800-01-01') AS ColumnA, C1, C2, ..., Cn FROM Table.
Make sure to use the ISNULL function for all columns with the data types mentioned above.
2. Execute the SSIS package. It should work.
3. Go back to the Control Flow and, after the Data Flow task, add an Execute SQL Task to put the data back, i.e. update ColumnA from '1800-01-01' back to NULL.
That works for me. In my situation, I cannot use the Ignore Failure option, because if I did, I would lose thousands of rows of data.
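A minimal sketch of steps 1 and 3, with placeholder table and column names:
-- Step 1: source query - replace NULLs with a placeholder date
SELECT ISNULL(ColumnA, '1800-01-01') AS ColumnA, C1, C2
FROM dbo.SourceTable;
-- Step 3: Execute SQL Task against the destination - restore the NULLs afterwards
UPDATE destination_table
SET ColumnA = NULL
WHERE ColumnA = '1800-01-01';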