SSIS Data Conversion task - ssis

Hi, I am trying to use the Data Conversion transformation in SSIS to get Excel data into a text file, but the conversion failed with the following error:
" [Data Conversion [77]] Error: The "output column "Copy of Description" (93)" failed because truncation occurred, and the truncation row disposition on "output column "Copy of Description" (93)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component."
Can anyone help me find a solution?

Check this link:
http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/8c9c1d45-82d5-43bf-961b-a8e22dab221b/
Besides, be aware that Excel uses the first 8 rows to determine the data type of each column. This might cause some problems.
My advice is to save the Excel file as a CSV document and then use that instead.
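If you do need to read the workbook directly, one way to work around the type guessing is the IMEX=1 setting, which makes the provider treat mixed-type columns as text. A minimal T-SQL sketch using OPENROWSET (the file path and sheet name are placeholders, and it assumes the ACE OLE DB provider and ad hoc distributed queries are enabled on the server):
SELECT *
FROM OPENROWSET(
    'Microsoft.ACE.OLEDB.12.0',
    'Excel 12.0;Database=C:\Data\Workbook.xlsx;HDR=YES;IMEX=1',  -- placeholder path
    'SELECT * FROM [Sheet1$]');                                  -- placeholder sheet name
The same Extended Properties (HDR=YES;IMEX=1) can also be added to an SSIS Excel connection manager's connection string.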

Related

BigQuery error: "Could not parse '41.66666667' as INT64"

I am attempting to create a table using a .tsv file in BigQuery, but keep getting the following error:
"Failed to create table: Error while reading data, error message: Could not parse '41.66666667' as INT64 for field Team_Percentage (position 8) starting at location 14419658 with message 'Unable to parse'"
I am not sure what to do as I am completely new to this.
Here is a file with the first 100 lines of the full data:
https://wetransfer.com/downloads/25c18d56eb863bafcfdb5956a46449c920220502031838/f5ed2f
Here are the steps I am currently taking to create the table:
https://i.gyazo.com/07815cec446b5c0869d7c9323a7fdee4.mp4
Appreciate any help I can get!
As confirmed with the OP (#dan), the error encountered is caused by selecting Auto detect when creating a table using a .tsv file as the source.
The fix for this is to manually create a schema and define the data type for each column properly. For more information on specifying a schema in BigQuery, see the documentation.
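A sketch of the idea in BigQuery DDL; the dataset, table, and other column names here are placeholders, and only Team_Percentage comes from the error message. The same column list can be typed into the console's schema editor instead of using Auto detect:
CREATE TABLE my_dataset.team_stats (
  Team STRING,
  Team_Percentage FLOAT64  -- values like 41.66666667 need FLOAT64 (or NUMERIC), not INT64
);
Once the table exists with explicit types, load the .tsv into it with a load job set to append.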

Pentaho Data Integration - Connection timeout

I am developing a PDI transformation which takes data from a MySQL database and outputs it into an MSSQL table. Before the output step, I added a deletion step to delete records in the destination table with the same key field values. I do not know why, but with this setup the transformation always fails, throwing a connection timeout exception from the data source.
However, after I added a "Block" step between "Table input" and "Delete", the issue went away and the transformation finished successfully.
My configuration and exception message are as below:
Transformation setting and system exception message
Data Input SQL, and Delete condition
From the error I see in the screenshot you attached, note the recommendation in the 4th error line from the top: "consider raising value of 'net_write_timeout' on the server".
The default value is 60; kindly increase it.
See the document below for more reference.
https://wiki.pentaho.com/display/EAI/MySQL
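A minimal MySQL sketch of that change (assuming you have privileges to set global variables; 600 seconds is just an example value, and the setting can also be made permanent in my.cnf):
SHOW VARIABLES LIKE 'net_write_timeout';  -- check the current value (default 60)
SET GLOBAL net_write_timeout = 600;       -- raise it for new connections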

SSIS truncation error

First, I have searched and searched and searched and not found anything that helps me with this.
I have an SSIS project that will fetch a lot of data from an iSeries AS400 and it does this in two very different steps.
Step 1 works perfectly so I manage to fetch tons of info from the AS400, so the connection itself is not the issue.
Step two fails horribly with the following three error codes:
[OLE DB Source [41]] Error: There was an error with OLE DB
Source.Outputs[OLE DB Source Output].Columns[NAME] on OLE DB
Source.Outputs[OLE DB Source Output]. The column status returned was: "Text
was truncated or one or more characters had no match in the target code
page.".
[OLE DB Source [41]] Error: The "OLE DB Source.Outputs[OLE DB Source
Output].Columns[NAME]" failed because truncation occurred, and the
truncation row disposition on "OLE DB Source.Outputs[OLE DB Source
Output].Columns[NAME]" specifies failure on truncation. A truncation error
occurred on the specified object of the specified component.
[SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The
PrimeOutput method on OLE DB Source returned error code 0xC020902A. The
component returned a failure code when the pipeline engine called
PrimeOutput(). The meaning of the failure code is defined by the component,
but the error is fatal and the pipeline stopped executing. There may be
error messages posted before this with more information about the failure.
I have desperately tried to find the solution to this problem and this is what I have done (none of which has helped at all):
1 - Advanced Editor on SOURCE -> tab: Input and Output Properties -> OLE DB Source Output -> Output Column changed to
a) 40 (from 28) in length - no change
b) data text (from string) - complete crash
c) changed codepage from 1251 to UTF-8 - no change
2 - Fetched the information with OPENQUERY in SSMS, and it works perfectly.
3 - Screamed in frustration at the screen (didn't help).
I am at the end of the road. I don't know what to do anymore. Help...?
Yes, this is completely maddening.
There are two sets of columns under OLE DB Source Output: "External Columns" and "Output Columns".
Have you tried changing the lengths of both columns - column "Name" under "External Columns" and under "Output Columns"?
This kind of error often happens from a mismatch between the External Column definition and its corresponding Output Column.
In an OLE DB Source, External Columns are supposed to be auto-typed according to the source data types: the external provider is supposed to talk metadata to SSIS, saying "well, this column is typed String(40)", for example. But either the provider or SSIS is often, let's say, "less than entirely competent" at getting the types and lengths right.
UPDATE: Have you tried checking the length of the data in the source, independently of SSIS? Something like:
SELECT MAX(Len(TheReallyAnnoyingColumn)) FROM TheTable
You may find setting the Error Output for Truncation on the Source editor dialog to "Ignore Failure" gets you around the issue.
Update - Truncation Redirect:
Forced truncation on surname - error output set to redirect,
enabled the Data Viewer on the error output,
then copied the row from the Data Viewer to Notepad to show the error.
Running the same dtsx with truncation set to fail:
Everybody else is focused on the truncation. I'm curious about the "one or more characters had no match in the target code page" part of the error message.
How is the column actually defined on the IBM i? I'm particularly interested in the Coded Character Set Identifier (aka CCSID)
In a green screen you can use the Display File Field Description (DSPFFD) command.
You could also use the iNav GUI.
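If SQL access to the IBM i is easier, a sketch of the same check via the Db2 for i catalog (the library and table names here are placeholders):
SELECT COLUMN_NAME, DATA_TYPE, LENGTH, CCSID
FROM QSYS2.SYSCOLUMNS
WHERE TABLE_SCHEMA = 'MYLIB'
  AND TABLE_NAME = 'MYTABLE';
A CCSID of 65535 (binary, no conversion) on a character column is a classic source of code-page conversion trouble.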

SSIS Import a date and time from a txt to a table datetime

So I want to import a datetime from a txt:
2015-01-22 09:19:59
into a table using a data flow. I have my Flat File Source and my destination DB set up fine. In the advanced settings and the input and output properties, I changed the data type of that column in the txt input to:
database timestamp [DT_DBTIMESTAMP]
This is the same data type as the DB used for the table so this should work.
However, when I execute the package I get an error saying the data conversion failed... How do I make this work?
[Import txt data [1743]] Error: Data conversion failed. The data conversion for column "statdate" returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
[Import txt data [1743]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "output column "statdate" (2098)" failed because error code 0xC0209084 occurred, and the error row disposition on "output column "statdate" (2098)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
[Import txt data [1743]] Error: An error occurred while processing file "C:\Program Files\Microsoft SQL Server\MON_Datamart\Sourcefiles\tbl_L30T1.txt" on data row 14939.
On the row where it gives the error, the datetime is filled with spaces. That is why "allow nulls" is checked on the table, but my SSIS package still gives the error for some reason... Can I somewhere tell the package to allow nulls as well?
I suggest you import the data into a character field and then parse it after entry.
The following function should help you:
SELECT IsDate('2015-01-22 09:19:59')
, IsDate(Current_Timestamp)
, IsDate(' ')
, IsDate('')
The IsDate() function returns a 1 when it thinks the value is a date and a 0 when it is not.
This would allow you to do something like:
SELECT value_as_string
     , CASE WHEN IsDate(value_as_string) = 1
            THEN Cast(value_as_string As datetime)
            ELSE NULL
       END As value_as_datetime
FROM ...
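On SQL Server 2012 or later, a more compact variant of the same parse-after-entry idea is TRY_CONVERT, which returns NULL instead of failing. A small self-contained sketch against inline sample values (value_as_string is just the placeholder column name used above):
SELECT v.value_as_string
     , TRY_CONVERT(datetime, NULLIF(LTRIM(RTRIM(v.value_as_string)), '')) AS value_as_datetime
FROM (VALUES ('2015-01-22 09:19:59'), ('          '), ('')) AS v(value_as_string);
The NULLIF over the trimmed value turns all-space strings into NULL before any conversion is attempted.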
I solved it myself. Thank you for your suggestion gvee, but the way I did it is much easier.
In the Flat File Source, when making a new connection, in the Advanced tab I fixed all the data types according to the table in the database EXCEPT the column with the timestamp (in my case it was called "statdate")! I changed this data type to a STRING, because otherwise my Flat File Source would give me a conversion error before any script even had a chance to run, and the only way around that was setting the error output to ignore failure, which I don't want. (You still have to change the data type after you set it to a string in the advanced settings, by right-clicking the Flat File Source -> Show Advanced Editor -> going to the output columns and changing the data type there from Date to string.)
After the timestamp was set to a string, I added a Derived Column with this expression to trim all the spaces and, if nothing is left, give it a NULL value:
TRIM(<YourColumnName>) == "" ? (DT_STR,4,1252)NULL(DT_STR,4,1252) : <YourColumnName>
Next I added a Data Conversion to convert the string back to a timestamp. The Data Conversion is finally connected to the OLE DB Destination.
I hope this helps anyone with the same problem in the future.
End result: Picture of data flow

SSIS - Use Derived Column to Cast String to Float

I'm having a problem getting data from a .CSV into a column of datatype FLOAT. I've tried to link it directly and also use the Data Conversion Task, but (in both cases) it kept telling me that it couldn't convert:
Error: 0xC02020C5 at DC_Weekly_Cost_Target csv to FatzWklyCst_Target, Data Conversion [156]: Data conversion failed while converting column "Target" (22) to column "Copy of Target" (163). The conversion returned status value 2 and status text "The value could not be converted because of a potential loss of data.".
My research led me to using the Derived Column Transformation Editor. I found a few websites that walked me through how to properly use the "Expression" portion:
Above is how I'm attempting to transform the strings (Target and Waste) into datatype Float. I'm not receiving an error message when using the Editor (i.e. it will let me click OK without an error); however, I am receiving an error when I attempt to run the package:
Error: 0xC0049064 at DC_Weekly_Cost_Target csv to FatzWklyCst_Target, Map Target in correct datatype 1 1 [222]: An error occurred while attempting to perform a type cast.
Error: 0xC0209029 at DC_Weekly_Cost_Target csv to FatzWklyCst_Target, Map Target in correct datatype 1 1 [222]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "component "Map Target in correct datatype 1 1" (222)" failed because error code 0xC0049064 occurred, and the error row disposition on "output column "Target_Float" (227)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at DC_Weekly_Cost_Target csv to FatzWklyCst_Target, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "Map Target in correct datatype 1 1" (222) failed with error code 0xC0209029 while processing input "Derived Column Input" (223). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
This is my first time using the Derived Column Transformation Editor. Does anyone see what I'm doing incorrectly? Or, do you have any suggestions as to what may be the best approach to getting data from a .csv file into a column of datatype float? I appreciate any help that anyone can give me.
You have tried a reasonable approach but something in the data is blowing it up - possibly "invalid" characters e.g. $ or ,
I would replace the Derived Column transformation with a Script Task. There you can leverage the .NET Framework e.g. Try ... Catch, TryParse, Regex. You can debug your code line-by-line to inspect the rows with errors. You can also use Reflection to factor your conversion code as a function that you call for each column passed into the Script Task.
PS: your destination is irrelevant.
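If you do stage the raw CSV column into a varchar column first, here is a T-SQL sketch (SQL Server 2012+) of finding the values that won't cast, shown against inline sample values rather than a real staging table; REPLACE strips $ and thousands separators, and TRY_CAST returns NULL for anything still unparseable:
SELECT s.Target
     , TRY_CAST(REPLACE(REPLACE(s.Target, '$', ''), ',', '') AS float) AS Target_Float
FROM (VALUES ('12.5'), ('$1,234.56'), ('N/A')) AS s(Target)
WHERE TRY_CAST(REPLACE(REPLACE(s.Target, '$', ''), ',', '') AS float) IS NULL;
Dropping the WHERE clause returns the cleaned float values instead of just the offending rows.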