SSIS - OLE DB connection not updating data inserted through SqlBulkCopy with SqlConnection

I have created an OLE DB connection to execute various SQL tasks across the SSIS package. It's working fine.
In one of the tasks, where I need to insert data into a table, I used SqlBulkCopy, as I have dynamic tables and columns depending on the files I receive from different sources.
SqlBulkCopy only works with SqlConnection, so I opened a SqlConnection and executed the SqlBulkCopy. This also works fine.
After the SqlBulkCopy is done, I have an Execute SQL Task which updates metadata about the inserted rows (e.g. count, min and max date, and so on) in a different table.
That table is not getting updated, yet if I execute the stored procedure from SQL Server Management Studio, it works as expected.
So my assumption is that the OLE DB connection is not able to see the latest data inserted through the SqlConnection.
I may be wrong, but I'm not sure why the Execute SQL Task shows success while the table is still not updated.
Am I missing anything here?

My bad.
Instead of passing the data type as long (int in SQL), I was passing it as varchar.
I had been looking at this for the last few hours, and as soon as I posted the question here it struck me to check the data type.
Hope it helps somebody.
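For anyone building dynamic tables the same way, one way to avoid this class of mismatch is to read the destination's column types from the catalog before constructing the in-memory columns handed to SqlBulkCopy. A minimal sketch, where dbo.TargetTable is a hypothetical name:

-- Pull the destination's column types so dynamically built source columns
-- can be typed to match (e.g. int in SQL -> long in .NET, not string).
SELECT COLUMN_NAME,
       DATA_TYPE,
       CHARACTER_MAXIMUM_LENGTH,
       IS_NULLABLE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo'
  AND TABLE_NAME   = 'TargetTable'
ORDER BY ORDINAL_POSITION;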

Related

SSIS: How to get the number of updated and deleted rows in an audit?

Imagine that you want to save in a variable the number of rows that were updated or deleted in a table.
These are the steps I took:
First, in the Control Flow, I created a Data Flow Task.
Then, in the Data Flow, I created a source (in my case an Excel file), created two variables to count those rows (countDeleted and countUpdated), connected the variables to two Row Count transformations, and then connected my destination (OLE DB).
Now, in the Control Flow, what do I do?
Create an Execute SQL Task? Or a Script Task? What is the best way to do it? What is the piece of code to use?
Thanks for your help.
PS: I've only had 4 weeks of SSIS, sorry for my noobieness :)
An OLE DB destination only inserts. It can't UPDATE or DELETE.
What's your logic for updating or deleting?
If you're just starting out and reading about doing things in SSIS, you will eventually find advice to use the OLE DB Command to perform row-by-row updates and deletes.
In my opinion this is to be avoided. It does not scale (it works fine for small recordsets, then fails for large recordsets), and it is difficult to maintain the parameter mappings in the OLE DB Command. Although you should try it anyway to familiarise yourself with it.
My advice is to load the Excel data into a staging table, perform batch DELETE and UPDATE statements to load the data, and use @@ROWCOUNT to capture the records updated.
For example:
The data flow you described can be used to load into a table called StagingTable.
Before your data flow, you should run an Execute SQL Task (this is in the Control Flow pane, not the Data Flow pane) that clears the staging table:
TRUNCATE TABLE StagingTable;
So first get that working: repeatedly running your package should clear the staging table and then load the Excel data into it without creating duplicates.
This in itself is a challenge, as Excel is a terrible data interchange format.
Once you have that working, add an Execute SQL Task at the end that runs SQL to delete the records you want and capture the count. For example:
DELETE FROM MyFinalTable WHERE PrimaryKey IN (SELECT PrimaryKey FROM StagingTable);
SELECT @@ROWCOUNT;
Then you follow the instructions here to load that back to your SSIS variable
http://microsoft-ssis.blogspot.com/2011/03/rowcount-for-execute-sql-statement.html
What are you doing with this row count? Are you writing it to a logging table? Save yourself the bother of pulling it back into an SSIS variable and just write it directly:
DELETE FROM MyFinalTable WHERE PrimaryKey IN (SELECT PrimaryKey FROM StagingTable);
INSERT INTO LogTable([Table], Operation, Type)
SELECT 'MyFinalTable', 'Delete', @@ROWCOUNT;
In my experience it is not a good idea to build convoluted logic into SSIS packages if you can instead do it in the database, although it does depend on the person who eventually has to maintain it. Hopefully you can appreciate that this T-SQL approach is a more straightforward, code-based approach, as opposed to having to dig around in property pages, events, and other places inside SSIS packages.
I assume that you're using an Execute SQL Task for the updates and deletes? As @Nick.McDermaid mentioned, using an OLE DB Command within a Data Flow presents various issues when performing DML. You can find the number of rows updated, inserted, or deleted in a table through an Execute SQL Task by using the ExecValueVariable property of the task. Set the variable that will hold the row count in this property, and it will return the number of affected rows. Note that it will only return the number of rows impacted by the last statement in the Execute SQL Task, regardless of how many batches (i.e. GO separators) are in the component.
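If you need more than the last statement's count, one hedged workaround is to capture each count as you go and return them together in the final statement. A sketch with hypothetical table and column names:

-- Capture each statement's @@ROWCOUNT immediately; only the final statement's
-- result set is surfaced, so return both counts in one SELECT at the end.
DECLARE @updatedRows INT, @deletedRows INT;

UPDATE t
SET    t.SomeColumn = s.SomeColumn
FROM   dbo.MyFinalTable t
JOIN   dbo.StagingTable s ON s.PrimaryKey = t.PrimaryKey;
SET @updatedRows = @@ROWCOUNT;

DELETE t
FROM   dbo.MyFinalTable t
WHERE  NOT EXISTS (SELECT 1 FROM dbo.StagingTable s WHERE s.PrimaryKey = t.PrimaryKey);
SET @deletedRows = @@ROWCOUNT;

-- Single-row result set: map it to SSIS variables, or INSERT it into a log table.
SELECT @updatedRows AS UpdatedRows, @deletedRows AS DeletedRows;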

No columns returned SSIS

I am implementing an SSIS package and currently trying to do the following.
Truncate the destination table
Fetch the data by executing the stored procedure and insert it into the destination table
I have created an Execute SQL Task to address step 1, and a data flow with an OLE DB source and OLE DB destination to address step 2. It has been working successfully so far, but isn't working for one of my stored procedures that uses temp tables.
When I edit the OLE DB source and click the preview button, I get the error "no column returned".
I know that SSIS has an issue with generating columns while executing stored procedures that depend on temp tables. I have converted the stored proc to use table variables, and it's now able to return columns in SSIS when I do a preview. The only downside is that the stored procedure takes much longer to execute: 1 hour 15 minutes, compared to 15 minutes when using temp tables.
I did see a suggestion to use SET FMTONLY before executing the stored procedure as an alternative to switching to table variables, but that didn't seem to work; I get a syntax or permission-denied error.
Could somebody suggest a solution to my problem that does not compromise performance?
Sounds like you've already read all the approaches to using temp tables in SSIS, including the IF 1=0... trick? If you haven't seen that one yet, google it; a minimal sketch follows below.
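For anyone who hasn't seen it, a minimal sketch of the trick, assuming a hypothetical proc dbo.MyTempTableProc: the IF 1=0 branch never executes, but it hands metadata discovery a typed result-set shape matching the proc's real output.

CREATE PROCEDURE dbo.MyTempTableProc
AS
BEGIN
    SET NOCOUNT ON;

    -- Never runs, but exposes the result-set shape to metadata discovery.
    IF 1 = 0
    BEGIN
        SELECT CAST(NULL AS INT)         AS OrderID,
               CAST(NULL AS VARCHAR(50)) AS CustomerName;
    END;

    CREATE TABLE #Work (OrderID INT NOT NULL, CustomerName VARCHAR(50));
    -- ... heavy processing populating #Work ...
    SELECT OrderID, CustomerName FROM #Work;
END;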
You say that using table variables causes your stored procedure to take about 5 times longer than using temp tables. The most likely reason for that is that you are indexing your temp tables but not your table variables. If you didn't know that table variables can be indexed: they can. You might try that; see the sketch below.
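For illustration, a hedged sketch of an indexed table variable (hypothetical names). A PRIMARY KEY or UNIQUE constraint works on any version; the inline INDEX syntax needs SQL Server 2014 or later:

DECLARE @Work TABLE
(
    OrderID      INT NOT NULL PRIMARY KEY,     -- indexed via the constraint
    CustomerName VARCHAR(50),
    INDEX IX_Work_CustomerName (CustomerName)  -- inline index, 2014+ only
);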
Finally, a solution that you haven't mentioned: you can replace your temporary table with a real table that gets truncated when you're done using it, as sketched below.
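A minimal sketch of that swap, with hypothetical names; note that a shared permanent table is unsafe if two runs of the package overlap:

-- One-time setup: a permanent table standing in for #Work.
CREATE TABLE dbo.MyProc_Work (OrderID INT NOT NULL, CustomerName VARCHAR(50));
GO
-- Inside the proc: clear it instead of CREATE TABLE #Work. SSIS can now read
-- the metadata because the table exists in the catalog.
TRUNCATE TABLE dbo.MyProc_Work;
-- ... heavy processing populating dbo.MyProc_Work ...
SELECT OrderID, CustomerName FROM dbo.MyProc_Work;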
Short comment:
Try EXEC WITH RESULT SETS and specify the metadata yourself for a proc with temp tables; or use the Script Component as a source and specify the Output columns yourself.
Long comment:
Technically speaking, it is the driver/database you are using in SSIS that decides the behavior when working with temp tables.
Metadata is an important factor when using SSIS's pipeline components. By metadata, I mean the names of the columns, their data types, etc., that a pipeline component uses. When designing a data flow, someone or something should provide this metadata to the components that require it.
In most cases, SSIS automatically retrieves the metadata. Components that do not connect to an external data source, like Conditional Split, get their metadata from the other components they are connected to. For the pipeline components that connect to an external data source (like OLE DB Source, OLE DB Destination, Lookup, etc.), SSIS provides a mechanism to get this metadata without human involvement. This mechanism involves the driver connecting to the database and retrieving the metadata of the output. If the driver/database is capable of returning the metadata, then that metadata is used. If the driver/database is incapable, then you get the errors you are seeing. The rest of my comments are based on the assumption that you are using a SQL Server database in your question.
When working with a SQL Server database in SSIS, we typically use the native client drivers provided by Microsoft. When trying to get the metadata, these drivers try to get it without actually executing the SQL statement (actual execution can have side effects, and might also take more than a few seconds/minutes/hours; you don't want side effects and long wait times at package design time). So to get the metadata, the driver relies on the metadata of the actual objects used in the SQL command. If the command uses a physical table or view, SQL Server already has the metadata available and can supply it to the driver. If it is a temp table, SQL Server does not have the metadata until it can create the temp table. If using the FMTONLY option, you can use it in such a way as to create the temp tables but avoid any heavy processing/side effects, and thus be able to retrieve metadata without penalties. Post-2012, the native client drivers rely on newer functionality to retrieve metadata than the drivers before 2012: in 2012 and later, the driver uses the sp_describe_first_result_set proc to retrieve metadata. So whether you can get metadata or not is determined by the ability of the sp_describe_first_result_set proc.
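To make that FMTONLY remark concrete, here is a hedged sketch of the classic pre-2012 workaround, reusing the hypothetical proc from above; note the caveat in the comments:

-- Pre-2012 drivers wrap the call in SET FMTONLY ON to sniff metadata.
-- Turning FMTONLY back off inside the proc lets the temp table actually be
-- created, so real metadata comes back. Caveat: the whole proc then executes
-- during design-time validation, including any heavy processing.
ALTER PROCEDURE dbo.MyTempTableProc
AS
BEGIN
    SET FMTONLY OFF;
    CREATE TABLE #Work (OrderID INT NOT NULL, CustomerName VARCHAR(50));
    -- ... populate #Work ...
    SELECT OrderID, CustomerName FROM #Work;
END;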
So while SSIS can automatically get the metadata (because of the driver/database), it does not automatically get the metadata in some cases (again because of the driver/database). In cases involving the second scenario, some other process (typically a human) can help the driver infer metadata or provide the metadata to the component directly.
To help the driver, in the case of SQL Server 2012 and later, you can use the WITH RESULT SETS clause to specify the output metadata. When this clause is present, the driver uses it and doesn't try to query the metadata from system objects, thus avoiding the error you would otherwise get. If you are using the drivers that came with SQL Server 2008, you can use FMTONLY. This option is at the driver/database level.
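A minimal sketch of that clause, again with the hypothetical proc; the declared column list must match what the proc actually returns:

EXEC dbo.MyTempTableProc
WITH RESULT SETS
((
    OrderID      INT,
    CustomerName VARCHAR(50)
));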
Another option could be to use a Script Component as the source; in its Output columns, you can specify the columns/metadata yourself. SSIS would not try to retrieve metadata from the data source in this case, but would rely on the definitions you provided in the Output section of the Script Component.
As you can see, both options involve a human (or some other process) specifying the metadata instead of SSIS trying to retrieve it in an automated fashion. I would prefer the first option when working with SQL Server, and the second option when working with databases like MySQL.

SSIS - Deployed package SQL Command validation error

I have created an SSIS package in order to transfer data from ACCESS to SQL SERVER.
Source > SQL command on the "mdb" file joining two tables
Destination > flat table in SQL Server
I'm performing the JOIN in the source SQL command because of the number of records in the ACCESS tables (~500k).
I tried to use an SSIS join, but it takes ages doing the ORDERING before the JOIN.
While running the package in VS2010, it works great.
But after deploying and executing the package on my SQL Server 2014, the following error occurs:
"No column information was returned by the SQL command. Returned validation status 'VS_NEEDSNEWMETADATA'."
I'm pretty sure my SQL command is correct (it works in VS, and the preview button in the editor shows me records).
I tried to disable ValidateMetadata, but the same error still occurs, this time at execution.
On the same SQL Server 2014 I have other packages calling ACCESS data (but without a join), and they work properly.
Thanks for your help,
Q.
ValidateMetadata is (generally) a good thing.
This error is caused by the metadata on your source or destination (it's unclear from your question) being different.
At a guess, at least one of the columns in your SQL 2014 database is of a different data type (or length, or nullability, etc.); there is a difference either way.

SSIS: SQL 2008 R2 to MySQL data loss

I have an SSIS package set up to export data from a SQL Server 2008 R2 table to a MySQL version of that table. The package executes; however, about 1% of the rows fail to be exported.
My source connection uses the SQL statement
SELECT * FROM Table1
all of the columns are integers. An example of a row which is exported successfully is
2169, 2680, 3532, NULL, 2169
compared to a row which fails
2168, 2679, 3532, NULL, 2168
with virtually no difference that I can ascertain.
Notably, if I change the source query to attempt the transfer of only a single failing row, i.e.
SELECT * FROM Table1 WHERE ID = 2168
then the record is exported fine; it only fails when it is part of a SELECT which returns multiple rows. The same rows fail the export each time. I have redirected error rows to a text file, which shows a -1071610801 error for the failing rows. This would apparently translate to:
DTS_E_ADODESTERRORUPDATEROW: "An error has occurred while sending this row to destination data source."
which doesn't really add a great deal to my understanding of the issue!
I am wondering if there is a locking issue or something preventing given rows from being fetched or inserted correctly. If anyone has any ideas or suggestions on what might be causing this, or better still how to resolve it, they would be greatly appreciated. I am currently at a total loss...
Try setting a longer timeout (1 day) on the MySQL (ADO.NET) destination.
Well after much head scratching and attempting every work around that I could come up with I have finally found a solution for this.
In the end I switched out the MySQL connector for a different driver produced by devArt (dotConnect for MySQL) and, with a few minor exceptions (which I think I can resolve), all of my data is now exporting without error.
The driver is a paid for product unfortunately but in the end I'd have taken out a new mortgage to see all those tasks go green!

What is the best tool to use to transfer data from a reporting database to another server?

I have a reporting database, and I have to transfer data from it to another server, where we run some other reports or functions on the data. What is the best way to transfer data periodically, say monthly or bi-weekly? I can use SSIS, but is there any way I can put a WHERE clause on which rows should be extracted from the source database? For example, I only want to extract data for the current month. Please do let me know.
Thanks,
Vivek
For scheduling periodic extractions, I'd leave that to SQL Agent.
As for restricting the results by some condition, that's an easy thing. Instead of selecting a table directly (and you should always use SQL Command or SQL Command From Variable over Table Name/Table Name From Variable, as they are faster), write a SQL command and add a parameter. If you're using an OLE DB connection manager, your indicator for a variable is ?; with ADO.NET it will be @parameterName.
Now wire the filter up by clicking the Parameters... button. With OLE DB, parameters are mapped by ordinal position starting at 0. If you want to use the same parameter twice, you will have to list it each time, or use the ADO.NET connection manager. Both styles are sketched below.
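A hedged sketch of the two parameter styles, using hypothetical table and column names (each query goes in its respective source component, not both together):

-- OLE DB connection manager: positional ? markers, mapped by ordinal (0, 1, ...)
SELECT *
FROM dbo.SourceTable
WHERE ModifiedDate >= ?
  AND ModifiedDate <  ?;

-- ADO.NET connection manager: named parameters instead
SELECT *
FROM dbo.SourceTable
WHERE ModifiedDate >= @StartDate
  AND ModifiedDate <  @EndDate;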
The biggest question you will have to answer is: how do I identify which row(s) need to go? Possibilities are endless: query the target database and find the most recent modified date or highest key value for a table; create a local table that tracks what's been sent and query that; perform an incremental load / ETL instrumentation to identify new/updated/unchanged rows; etc. A minimal high-water-mark sketch follows.
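As one illustration, here is a minimal high-water-mark sketch, assuming a hypothetical tracking table dbo.EtlWatermark that remembers the newest ModifiedDate already shipped for each table:

-- Extract only rows changed since the last successful load (hypothetical names).
DECLARE @lastLoaded DATETIME2 =
    (SELECT LastModifiedDate FROM dbo.EtlWatermark WHERE TableName = 'MyTable');

SELECT *
FROM dbo.MyTable
WHERE ModifiedDate > @lastLoaded;

-- After a successful load, advance the watermark.
UPDATE dbo.EtlWatermark
SET    LastModifiedDate = (SELECT MAX(ModifiedDate) FROM dbo.MyTable)
WHERE  TableName = 'MyTable';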