I have an ADO NET Source that queries data from an IBM AS/400. The column in question is of type CHAR(94). However, the metadata in SSIS shows it as a byte stream, DT_BYTES. If I run the package, I get an error: the data type of "output column "COLUMN_NAME" (38)" does not match the data type "System.String" of the source column "COLUMN_NAME". It's odd because in SSIS they are both set to DT_BYTES as the data type.
If I go to the Advanced Editor for the ADO NET Source, it lets me change the data type of the external column to Unicode string, but not the output column.
I'm not sure why it's coming in as a byte stream when a CHAR column should just show up as a string (it does in other packages for other columns on the AS/400). How do I change this metadata so that the external and output columns are Unicode strings instead of byte streams?
Thank you.
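One possible cause worth checking: on DB2 for i, columns tagged with CCSID 65535 are exposed to clients as binary data, which would explain a CHAR column arriving as DT_BYTES. A workaround is to force a character CCSID with a cast in the source query. A sketch, assuming the column is CCSID 65535 and using hypothetical library/table names:

```sql
-- Hypothetical fix: cast the binary-tagged CHAR column to a character
-- CCSID (37 = EBCDIC US English) so the provider returns it as a string.
SELECT CAST(COLUMN_NAME AS CHAR(94) CCSID 37) AS COLUMN_NAME
FROM MYLIB.MYTABLE;
```

After changing the query, refresh the source's metadata so SSIS picks up the new external column type.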
I've created a table in SQL Server 2016 with this definition:
CREATE TABLE Stage.Test (f1 DATE NULL);
INSERT INTO Stage.Test (f1) VALUES ('1/1/2000');
Notice the f1 column uses the DATE data type.
Then I created a data flow task in SQL Server Data Tools (VS 2019) to load data from that table. I created an OLE DB Source component and set the source table to Stage.Test. But when I examine the data type of the "f1" column (in both the 'External Column' and 'Output Column' lists), it says it's a Unicode string.
Why is it choosing a Unicode string instead of DT_DATE?
I haven't seen this exact example with date fields, but SSIS converts data automatically when it detects a field to be of a certain type. Perhaps it's the '/' in your date format that does it; we don't use that date format in my part of the world, so I've never had the problem. You can especially see this when you import Excel files with SSIS. I usually have this problem with strings, where Unicode strings can sometimes become non-Unicode strings.
A few ways to fix it:
1. Edit the SQL query to explicitly cast the field as a date.
2. Add a conversion step in the data flow after the source (e.g. a derived column that puts the parts of the string in the right order).
3. Try to change the output data type: right-click the source in the data flow, open the Advanced Editor, and edit the output column's data type:
(screenshot: SSIS source Advanced Editor, output column data types)
I'm not sure whether the third option works for this date-format issue, as I don't have experience with the format myself, but it works fine for my Unicode/non-Unicode problem.
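The explicit-cast suggestion could look like the query below for the Stage.Test example from the question. SSIS derives its column types from the query's result metadata, so a cast in the SELECT may change what the source reports (a sketch; whether the metadata actually updates can depend on the provider, so refresh the source after editing):

```sql
-- Explicit cast in the OLE DB Source query so the result-set metadata
-- reports the column as DATE rather than a string.
SELECT CAST(f1 AS DATE) AS f1
FROM Stage.Test;
```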
I have the following JSON stored in S3:
{"data":"this is a test for firehose"}
I have created the table test_firehose with a varchar column data, and a file format called JSON with TYPE = JSON and the rest left at default values. I want to copy the content from S3 to Snowflake, and I have tried the following statement:
COPY INTO test_firehose
FROM 's3://s3_bucket/firehose/2020/12/30/09/tracking-1-2020-12-30-09-38-46'
FILE_FORMAT = 'JSON';
And I receive the error:
SQL compilation error: JSON file format can produce one and only one column of type
variant or object or array. Use CSV file format if you want to load more than one column.
How could I solve this? Thanks
If you want to keep your data as JSON (rather than just as text), then you need to load it into a column with a data type of VARIANT, not VARCHAR.
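A sketch of that approach, assuming the table can be recreated (table name, file format name, and stage path are taken from the question):

```sql
-- Recreate the table with a VARIANT column: a JSON file format can load
-- into exactly one column of type VARIANT, OBJECT, or ARRAY.
CREATE OR REPLACE TABLE test_firehose (data VARIANT);

COPY INTO test_firehose
FROM 's3://s3_bucket/firehose/2020/12/30/09/tracking-1-2020-12-30-09-38-46'
FILE_FORMAT = (FORMAT_NAME = 'JSON');

-- Individual JSON fields can then be read with colon notation:
SELECT t.data:data::varchar FROM test_firehose t;
```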
In SSIS, I'm starting with a SQL Source (a table). It has 3 columns, including a varbinary column ("FileBlob") that comes from a filestream (this shows up as type DT_IMAGE in SSIS).
In the first data flow component, I convert the varbinary column to DT_TEXT, and output the result to a flat file. This works.
In the next step, I read in the flat file I just created, attempting to convert the DT_TEXT column back to DT_IMAGE.
I get this error:
The conversion returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page."
I'm assuming there's a way to do a round-trip conversion of the binary data to text, and then back to binary. Just not sure what I'm missing. Thanks.
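One way to make the round trip lossless is to serialize the binary as a hexadecimal string on the way out and parse it back on the way in, which sidesteps code-page translation entirely. A hedged sketch in T-SQL (rather than SSIS components), using CONVERT styles 1/2, which handle the hex form:

```sql
-- Round-trip varbinary -> hex text -> varbinary using CONVERT style 1.
-- Style 1 keeps the '0x' prefix; style 2 omits it.
DECLARE @blob varbinary(max) = 0x48656C6C6F;                     -- 'Hello'
DECLARE @txt  varchar(max)   = CONVERT(varchar(max), @blob, 1);  -- '0x48656C6C6F'
SELECT CONVERT(varbinary(max), @txt, 1) AS roundtrip;            -- original bytes
```

In a data flow you would apply the equivalent conversions in the source query (before the flat file) and in a query or transformation when reading the file back.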
I have an XML Source column "Comments" with data type Unicode string (DT_WSTR), but the source contains more than 4,000 characters, so the SSIS ETL fails to load into the destination table with a "truncation of data length" error. The destination column's data type is nvarchar(max).
How can I load this data into the destination?
You could declare this column as NTEXT. However, beware of two downsides of such a step:
You have to check this definition every time you update the XML data source. Yes, every time, because SSIS likes to reset string XML elements to its default of nvarchar(50).
Using an NTEXT column has a negative performance impact; for details see https://stackoverflow.com/a/28507399
I have an ODBC DB2 query to get data from a database; the column data types are only varchar(x), date, and int.
I'd like to write them to a Flat File Destination, but I'm not able to use UTF-8 encoding. SSIS keeps reporting the error below:
[Flat File Destination si_ce_f_hotel_capacity_snapshot_weekly [2]]
Error: Data conversion failed. The data conversion for column
"SOURCE_MARKET_CODE" returned status value 2 and status text "The
value could not be converted because of a potential loss of data.".
This column has the varchar(2) data type at the source, and I specified the data type in the SSIS flat file structure as DT_WSTR(2).
However, when I change file format to Unicode, everything works just fine.
How do I get this to work with UTF-8?
Thanks a lot for your answer.
varchar -> DT_STR
nvarchar -> DT_WSTR
You should specify the data type in the SSIS flat file structure as DT_STR, or use the SSIS Data Conversion transformation.
You need to do a data conversion in the middle of your data flow. This can be done with a Derived Column or a Data Conversion data flow transformation.
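If you take the Derived Column route, the cast expression could look like the snippet below. The column name is taken from the question; 65001 is the UTF-8 code page, which is an assumption about the file's intended encoding:

```
(DT_STR, 2, 65001)SOURCE_MARKET_CODE
```

The SSIS cast syntax for DT_STR is `(DT_STR, length, code_page)expression`; the code page here must match the one configured on the Flat File Connection Manager.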