SSIS Issue Pulling Data From Snowflake - [CData Snowflake Source] Error: Get data error: Received metadata with an incompatible version number

I'm trying to run a simple Data Flow task in SSIS pulling data from Snowflake to SQL Server using a component from CData called Snowflake Source.
The connection works and I can also see a preview of the data, but when actually running the package I get the following error message:
[CData Snowflake Source [2]] Error: System.Exception: Get data error: Received metadata with an incompatible version number
at CData.SSIS.Snowflake.SSISSourceComponent.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)
Has anyone experienced this before or know what to do in order to fix it?

Hi, I have faced this error many times when using a custom script component or a third-party component.
Issue: we remove or delete a component, but some of its references are not removed from the package (this is very tough to debug).
Create a fresh package and it will work.

Related

SSIS Package failing in SIT environment

I have an SSIS package that uses the ZappySys HTML Table Source to connect to the web. When I run the package pointing to an environment it runs successfully, but when I execute the package in SSISDB it throws the error "OnError View Context df_WeatherData:Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on HTML Table Source returned error code 0x80131500. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
OnError View Context df_WeatherData:Error: System.Exception: Table not found for specified criteria
at ZappySys.PowerPack.Adapter.HtmlTableSource.HtmlTableHelper. (String , WebClient , TableFetchSettings , Boolean , Boolean , List`1& )
at ZappySys.PowerPack.Adapter.HtmlTableSource.HtmlTableHelper.GetTableData(String url, WebClient webClient, TableFetchSettings settings, Boolean fetchColInfo, Boolean fetchData, List`1& colInfoList, List`1& linksTable, List`1& imageTable, Int32& rowsAdded)
at ZappySys.PowerPack.Adapter.HtmlTableSource.HtmlDataExtractor.Process(HtmlDataExtractArgs args)
at ZappySys.PowerPack.Adapter.HtmlTableSource.HtmlTableSource.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)"
Please help, I have been trying to debug this issue for days now. Thanks.
I figured out that the environment variable values referenced by my package do not match the values in the affected environment, which was causing the job to fail after deployment.
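For anyone hitting the same thing, one way to spot such a mismatch quickly is to query the SSISDB catalog views directly. This is only a minimal sketch assuming a standard SSISDB catalog (SQL Server 2012 or later); the environment name in the filter is a placeholder to replace with your own.

-- List environment variable values next to any project/package parameters that
-- reference them, so mismatched or missing references stand out.
USE SSISDB;
SELECT  e.name                      AS environment_name,
        ev.name                     AS variable_name,
        ev.value                    AS environment_value,
        op.object_name              AS project_or_package,
        op.parameter_name,
        op.referenced_variable_name
FROM    catalog.environments          AS e
JOIN    catalog.environment_variables AS ev ON ev.environment_id = e.environment_id
LEFT JOIN catalog.object_parameters   AS op ON op.referenced_variable_name = ev.name
WHERE   e.name = 'SIT';  -- placeholder: the affected environment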

SSIS Chunk Timeout Errors exporting data using CData Snowflake Source

Has anyone run into this error? How do I resolve it? I'm not getting much guidance from CData or Snowflake.
Error: 0x0 at ADI_PRODUCT, CData Snowflake Source: Get data error: Retrieved Chunk#0 Timeout.
Error: 0xC0047062 at ADI_PRODUCT, CData Snowflake Source [279]: System.Exception: Get data error: Retrieved Chunk#0 Timeout.
at CData.SSIS.Snowflake.SSISSourceComponent.PrimeOutput(Int32 outputs, Int32[] outputIDs, PipelineBuffer[] buffers)
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPrimeOutput(IDTSManagedComponentWrapper100 wrapper, Int32 outputs, Int32[] outputIDs, IDTSBuffer100[] buffers, IntPtr ppBufferWirePacket)
Error: 0xC0047038 at ADI_PRODUCT, SSIS.Pipeline: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on CData Snowflake Source returned error code 0x80131500. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
I am not sure about this, but given that there appears to be a timeout retrieving data (presumably from Azure's cloud storage), it might be worthwhile to verify that you are not hitting a networking issue (e.g. a firewall) by checking connectivity with Snowflake's SnowCD utility. Just a thought.
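If you want to try that, the usual flow is to export the connectivity allowlist from Snowflake and feed the saved JSON to SnowCD; a minimal sketch (the file name is just a placeholder):

-- In a Snowflake worksheet: list the hostnames/ports the client drivers need to reach.
-- Save the JSON result as allowlist.json, then run:  snowcd allowlist.json
SELECT SYSTEM$ALLOWLIST();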

SSRS Migration from 2008R2 to 2017 Error RSPortal

I was migrating our Reporting Services from 2008R2 to 2017 by restoring the database, and everything seems to work fine except for some reports for which I cannot open the subscriptions page.
For those reports, every time I enter the report's subscriptions page from the web portal I get this error:
"Something went wrong. Please try again later."
If I look in the RSPortal log file I see this error:
ERROR: OData exception occurred: System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.FormatException: The string was not recognized as a valid DateTime. There is an unknown word starting at index 18.
at System.DateTimeParse.Parse(String s, DateTimeFormatInfo dtfi, DateTimeStyles styles)
at Microsoft.ReportingServices.Portal.Services.ODataExtensions.ParameterValueExtensions.FormatAsISO8601Date(String date, String culture)
at Microsoft.ReportingServices.Portal.Services.ODataExtensions.ParameterValueExtensions.ToWebApiReportParameterValue(ParameterValue parameterValue, ReportParameterType reportParameterType, String culture)
at Microsoft.ReportingServices.Portal.Services.ODataExtensions.SubscriptionExtensions.ToReportPameterList(SubscriptionImpl librarySubscription, Dictionary`2 parameterTypes, String culture)
at Microsoft.ReportingServices.Portal.Services.ODataExtensions.SubscriptionExtensions.ToWebApiModel(SubscriptionImpl librarySubscription, Dictionary`2 parameterTypes, SubscriptionProperties properties)
at System.Linq.Enumerable.WhereSelectArrayIterator`2.MoveNext()
Does anyone have a suggestion on how to solve this? I thought it was something related to the CultureInfo value, but I am not sure.
Thanks
Since this will probably be helpful for someone in the future, here are the steps I took to solve the issue:
The problem was a data type mismatch on one parameter of a few subscriptions.
It seems the error handling has changed over time at Microsoft: after I found the subscriptions that were having problems, I tried to open them in the old environment and got this error:
"The value provided for the report parameter 'YourDate' is not valid for its type. (rsReportParameterTypeMismatch)"
So what I did was delete those faulty subscriptions in the new Reporting Services environment, and after that the web portal worked fine.
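If it helps anyone track down the offending subscriptions, the ReportServer catalog database can be queried directly. This is a hedged sketch against the standard dbo.Subscriptions and dbo.Catalog tables (take a backup first, your database name may differ, and the parameter name in the filter is a placeholder):

-- List subscriptions together with their stored parameter values so malformed
-- date parameters can be spotted before deleting anything.
USE ReportServer;
SELECT  c.[Path]        AS ReportPath,
        s.SubscriptionID,
        s.Description,
        s.Parameters    -- XML blob with the saved parameter values
FROM    dbo.Subscriptions AS s
JOIN    dbo.[Catalog]     AS c ON c.ItemID = s.Report_OID
WHERE   s.Parameters LIKE '%YourDate%';  -- placeholder: the suspect parameter name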

SSIS Kingswaysoft error for JSON source

I am using the JSON Source component in a package. It works fine on my local machine, but when I deploy it to the server I receive the following error:
System.ArgumentException: Value does not fall within the expected range.
at Microsoft.SqlServer.Dts.Pipeline.Wrapper.IDTSBufferManager100.FindColumnByLineageID(Int32 hBufferType, Int32 nLineageID)
at KingswaySoft.IntegrationToolkit.ProductivityPack.JsonSourceComponent.yhs.cgd(IDTSOutputColumn100 kbk, IDTSExternalMetadataColumn100 kbl)
at System.Linq.Enumerable.d__614.MoveNext()
at System.Linq.Buffer`1..ctor(IEnumerable`1 source)
at System.Linq.Enumerable.ToArray[TSource](IEnumerable`1 source)
at KingswaySoft.IntegrationToolkit.ProductivityPack.JsonSourceComponent.bdc()
at KingswaySoft.IntegrationToolkit.ProductivityPack.JsonSourceComponent.PreExecute()
at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostPreExecute(IDTSManagedComponentWrapper100 wrapper)
The error "Value does not fall within the expected range" appears to be caused by SSIS's optimization design; it happens when not every output of the component is attached to a destination.
In this case, either of the following two options should solve the issue:
Set the RunInOptimizedMode property to False in the Properties window at the Data Flow level (see Figure 1). This setting appears after you click a blank area of the data flow designer.
Alternatively, direct the other outputs of the JSON Source component to an SSIS destination component, which takes care of this situation as well.
Can you please give it a try and see if it helps?
[Figure 1]

SSIS: copy tables from MySQL to SQL Server 2008

I'm getting an error while trying to copy 4 tables from a MySQL source to SQL Server 2008.
Here's a photo of the Data Flow; as you can see, 2 of them are OK (the smaller ones).
With an OnError event handler I'm able to see the errors. Here they are:
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error
code: 0x80040E21. An OLE DB record is available. Source: "Microsoft
SQL Server Native Client 10.0" Hresult: 0x80040E21 Description:
"Multiple-step OLE DB operation generated errors. Check each OLE DB
status value, if available. No work was done.".
There was an error with input column "FechaHoraCorteAgente" (884) on
input "OLE DB Destination Input" (510). The column status returned
was: "Conversion failed because the data value overflowed the
specified type.".
SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "OLE
DB Destination Input" (510)" failed because error code 0xC020907A
occurred, and the error row disposition on "input "OLE DB Destination
Input" (510)" specifies failure on error. An error occurred on the
specified object of the specified component. There may be error
messages posted before this with more information about the failure.
SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on
component "OLE DB Destination 2" (497) failed with error code
0xC0209029 while processing input "OLE DB Destination Input" (510).
The identified component returned an error from the ProcessInput
method. The error is specific to the component, but the error is
fatal and will cause the Data Flow task to stop running. There may
be error messages posted before this with more information about the
failure.
The component "ado net conptacto" (1) was unable to process the data.
Exception from HRESULT: 0xC0047020
The component "ADO NET logllamados" (482) was unable to process the
data. Exception from HRESULT: 0xC0047020
SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on
component "ado net conptacto" (1) returned error code 0xC02090F5.
The component returned a failure code when the pipeline engine called
PrimeOutput(). The meaning of the failure code is defined by the
component, but the error is fatal and the pipeline stopped executing.
There may be error messages posted before this with more information
about the failure.
SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on
component "ADO NET logllamados" (482) returned error code 0xC02090F5.
The component returned a failure code when the pipeline engine called
PrimeOutput(). The meaning of the failure code is defined by the
component, but the error is fatal and the pipeline stopped executing.
There may be error messages posted before this with more information
about the failure.
Any idea what's going on here?
"Conversion failed because the data value overflowed the specified type." seems pretty obvious, you are trying to insert something where it doesn't fit. I suggest you compare all your source columns with destination columns and make sure that:
lengths are enough
data types are compatible
you can post your tables' structures if you would like a hand on that
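As a starting point for that comparison, here is a hedged T-SQL sketch against the SQL Server destination (the table name is a placeholder; run the equivalent query against MySQL's information_schema.columns on the source side):

-- Dump column names, types, and lengths for the destination table so they can
-- be compared side by side with the MySQL source definition.
SELECT  COLUMN_NAME,
        DATA_TYPE,
        CHARACTER_MAXIMUM_LENGTH,
        NUMERIC_PRECISION,
        NUMERIC_SCALE,
        IS_NULLABLE
FROM    INFORMATION_SCHEMA.COLUMNS
WHERE   TABLE_NAME = 'conptacto'   -- placeholder: your destination table
ORDER BY ORDINAL_POSITION;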
The source column had 0000-00-00 in a datetime field, so that was the error.
I created a Derived Column with the expression:
(DT_DBTIMESTAMP)(DAY([FechaHoraCorteAgente]) == 0 ? NULL(DT_DBTIMESTAMP) : [FechaHoraCorteAgente])
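An alternative, if you can edit the source query, is to map MySQL's zero date to NULL before it ever reaches SSIS; a minimal sketch (the table name is assumed, and it only works if your sql_mode still allows zero dates to be read):

-- Replace MySQL's '0000-00-00 00:00:00' sentinel with NULL so the SSIS pipeline
-- hands SQL Server a value its datetime type can accept.
SELECT  NULLIF(FechaHoraCorteAgente, '0000-00-00 00:00:00') AS FechaHoraCorteAgente
FROM    conptacto;  -- placeholder table name; add the other columns you need

Depending on the connector, a connection string option such as Convert Zero Datetime=true (MySQL Connector/NET) may achieve the same effect without changing the query.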
It's a failure at the source; if the package failed while inserting at the destination, that would be easily solvable. I have come across many situations where the source data is larger than what the SSIS source expects to see.
I think that when you create the source, SSIS samples the input data to determine the maximum length. But what if that maximum length is later exceeded? That's where I see most of the problems relating to overflow.
Also, when dealing with poorly handled source data, you will often see character data in a datetime field. That scenario will also break the package.