Package is stuck at "Execute phase is beginning" at Lookup task - SSIS

I have used a Lookup in my Data Flow Task. When I use Full Cache mode, the data flow task runs fine, but when I use Partial Cache or No Cache in my Lookup, the records do not go past the Lookup task and it keeps running for hours. I have checked for errors, but none are displayed. Could anyone please help me with this?

A Lookup is not appropriate for your task. Instead:
Add an OLE DB Source to pull in the data.
Sort the records from the incoming source and the OLE DB Source.
Perform a Merge Join (full outer).
Add a Derived Column transformation that checks ISNULL on the two joining columns and creates a new output column called Action. Where the target-side column is NULL, tag the row as an INSERT record.
Add a Conditional Split to send the INSERT records to an OLE DB Destination, which inserts the new rows.
You can also check whether there are matches between the two populations and perform updates, or look for NULLs in the source and DELETE the corresponding rows in the destination; the equivalent set-based logic is sketched below.
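For reference, here is a minimal T-SQL sketch of the same full-outer-join logic (the table and column names are hypothetical):

-- Hypothetical tables SourceTable and TargetTable, joined on BusinessKey.
-- A NULL on the target side means the row is new (INSERT);
-- a NULL on the source side means the row was removed upstream (DELETE).
SELECT
    COALESCE(s.BusinessKey, t.BusinessKey) AS BusinessKey,
    CASE
        WHEN t.BusinessKey IS NULL THEN 'INSERT'
        WHEN s.BusinessKey IS NULL THEN 'DELETE'
        ELSE 'UPDATE'
    END AS Action
FROM dbo.SourceTable AS s
FULL OUTER JOIN dbo.TargetTable AS t
    ON s.BusinessKey = t.BusinessKey;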

Related

SSIS: How to get the number of updated and deleted rows in an audit?

Imagine that you want to save in a variable the number of rows that were updated or deleted in a table.
These are the steps that I did:
First, in the Control Flow I created a Data Flow Task.
Then, in the Data Flow, I created a source (in my case an Excel file), created two variables to count those rows, countDeleted and countUpdated, connected them to two Row Count transformations, and then connected my destination (OLE DB).
Now, in the Control Flow, what do I do? Create an Execute SQL Task? Or a Script Task? What is the best way to do it? What is the piece of code to use?
Thanks for your help.
PS: I only have 4 weeks of SSIS experience, sorry for my noobieness :)
An OLE DB Destination only inserts. It can't UPDATE or DELETE.
What's your logic for updating or deleting?
If you're just starting out and reading about doing things in SSIS, you will eventually find advice to use the OLE DB Command to perform row-by-row deletes and inserts.
In my opinion this is to be avoided. It does not scale (it works fine for small recordsets, then fails for large ones), and it is difficult to maintain parameter mappings in the OLE DB Command. Although you should try it anyway to familiarise yourself with it.
My advice is to load the Excel data into a staging table, perform batch DELETE and UPDATE statements to load the data, and use @@ROWCOUNT to capture the number of records affected.
For example:
The data flow you described can be used to load into a table called StagingTable.
Before your dataflow you should run an Execute SQL Task (This is in the Control Flow pane, not the Data Flow pane) that clears the staging table:
TRUNCATE TABLE StagingTable;
So first get that working: repeatedly running your package should clear the staging table, then load Excel into it without creating duplicates.
This in itself is a challenge as Excel is a terrible data interchange format.
Once you have that working, add an Execute SQL Task at the end that runs some SQL that deletes the records you want and captures the count. For example:
DELETE FROM MyFinalTable WHERE PrimaryKey IN (SELECT PrimaryKey FROM StagingTable);
SELECT @@ROWCOUNT;
Then you follow the instructions here to load that back to your SSIS variable
http://microsoft-ssis.blogspot.com/2011/03/rowcount-for-execute-sql-statement.html
What are you doing with this row count? Are you writing it to a logging table? Save yourself the bother of pulling it back into an SSIS variable and just write it directly:
DELETE FROM MyFinalTable WHERE PrimaryKey IN (SELECT PrimaryKey FROM StagingTable);
INSERT INTO LogTable([Table], Operation, [RowCount])
SELECT 'MyFinalTable', 'Delete', @@ROWCOUNT;
In my experience it is not a good idea to build convoluted logic into SSIS packages if you can instead do it in the database. Although it does depend on the person who has to eventually maintain it. Hopefully you can appreciate that this T-SQL approach is more straightforward and code-based, as opposed to having to dig around in property pages, events, and other places inside SSIS packages.
I assume that you're using an Execute SQL Task for the updates and deletes? As @Nick.McDermaid mentioned, using an OLE DB Command within a Data Flow presents various issues when performing DML. You can find the number of rows updated, inserted, or deleted in a table through an Execute SQL Task by using the ExecValueVariable property of this task. Set the variable that will hold the row count to this property and it will return the number of affected rows. Note that it will only return the number of rows impacted by the last statement in the Execute SQL Task, regardless of how many batches (i.e. GO separators) are in the component.
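For example, a sketch with hypothetical table names: if the task runs two statements, only the final statement's count comes back.

UPDATE dbo.MyFinalTable SET IsProcessed = 1 WHERE IsProcessed = 0;  -- say this affects 50 rows
DELETE FROM dbo.MyFinalTable WHERE IsObsolete = 1;                  -- say this affects 10 rows
-- The Execute SQL Task reports 10: the row count of the last statement only.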

Which SSIS transformation can perform 'NOT IN' constraint used in SQL query?

I have two OLEDB Data Sources that have similar columns:
TMP_CRUZTRANS
-------------
CUENTA_CTE numeric(20,0)
TMP_CTACTE_S_USD
----------------
CON_OPE numeric(20,0)
I need to subtract all the matching values between these two tables and keep the rows that are different. Is there a transformation/task within SSIS that can perform the NOT IN constraint normally used in a SQL query?
Currently, I am performing this operation using an Execute SQL Task in the Control Flow.
The top Data Flow creates the first table, TMP_CRUZTRANS (a merge join between 2 other tables, but I guess that's not important for my question), from which I need to keep the values that differ from the second table.
In the Execute SQL Task, I have the following statement:
INSERT INTO [dbo].[TMP_CYA]
SELECT RUT_CLIE, CUENTA_CTE, MONTO_TRANSAC
FROM [dbo].[TMP_CRUZTRANS]
WHERE CUENTA_CTE NOT IN (SELECT CON_OPE FROM TMP_CTACTE_S_USD)
Finally, with the new table TMP_CYA I can continue with my work.
The problem with this approach is that TMP_CRUZTRANS has about 5 million rows, so it's VERY slow to insert all this data into a table using the Execute SQL Task. It takes about 5 hours to perform this operation. That's why I need to do this inside the Data Flow task.
You can use the Lookup transformation available within the Data Flow task to achieve your requirement.
Here is a sample that illustrates what you are trying to achieve.
Create a package with a data flow task. Inside the data flow task, use an OLE DB Source to read data from your source table TMP_CRUZTRANS. Use a Lookup transformation to validate the existence of the values against the table dbo.TMP_CTACTE_S_USD between the given columns. Then redirect the non-matching output to an OLE DB Destination to insert rows into table dbo.TMP_CYA.
Here is how the data flow task would look in place of the Execute SQL Task that you are currently using.
Configure the Lookup transformation as shown below:
On the General tab page, select Redirect rows to no match output from Specify how to handle rows with no matching entries, because you are interested only in non-matching rows.
On the Connection tab page, select the appropriate OLE DB Connection manager and select the table dbo.TMP_CTACTE_S_USD. That is the table against which you would like to validate the data.
On the Columns tab page, drag the column CUENTA_CTE and drop it on CON_OPE to establish the mapping between source and lookup tables. Click OK.
When you connect the Lookup transformation to the OLE DB Destination, the Input Output Selection dialog will appear. Please make sure to select Lookup No Match Output.
Here is the sample before executing the package.
You can see that only the 2 non-matching rows have been transferred to the OLE DB destination.
You can see that the destination table now contains the two non-matching rows after package execution.
Hope that helps.
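As an aside, the Lookup's no-match path implements the same filter as this set-based rewrite of the question's query (a sketch; NOT EXISTS also tends to optimize better than NOT IN when the lookup column is nullable):

INSERT INTO [dbo].[TMP_CYA]
SELECT t.RUT_CLIE, t.CUENTA_CTE, t.MONTO_TRANSAC
FROM [dbo].[TMP_CRUZTRANS] AS t
WHERE NOT EXISTS (SELECT 1 FROM dbo.TMP_CTACTE_S_USD AS s
                  WHERE s.CON_OPE = t.CUENTA_CTE);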

SSIS two staging tables

I would like to bring in an XML source and do data conversion and update it in a table. Data from this table will be used to update another table. How to accomplish this in SSIS?
I understand the first two steps, but I'm lost after that.
XML Source (under dataflow task)
Data Conversion
OLE DB Destination? (If I use an OLE DB Destination, then I cannot use it as a source again to update another table.) What component should I be using to accomplish this?
TIA
Within a data flow you can split the records to go to multiple tables using either a Conditional Split (if you want some records to go one way and some another) or a Multicast task (if you want all records to go to both destinations). We use a Multicast to create two staging tables: one where the raw data from the file stays, and one where the data is cleaned and transformed before going into our prod tables. This enables us to easily research whether problem data was due to our transformation process (a bug) or to bad data being sent (a problem at the client end, which might require more steps to handle if they can't fix it).
You can also have multiple data flows that all have the same source. Or you can insert to one staging table and then have a second data flow or exec SQL task to move that data to where you want it.
Use the OLE DB Destination to inject your XML source data into your staging table. Then, in your control flow use an Execute SQL task after your data flow task to execute a stored procedure or T-SQL script to move your data from the staging table into the production table(s) and truncate the staging table if required.
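A minimal sketch of that staging-to-production step (all object names are hypothetical):

-- Move the converted XML rows from staging into production, then reset staging.
INSERT INTO dbo.ProductionTable (Col1, Col2)
SELECT Col1, Col2
FROM dbo.StagingTable;
TRUNCATE TABLE dbo.StagingTable;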
I've found that SSIS is great for ETL work, but moving data around inside a DB or doing aggregation work is best carried out using T-SQL in stored procs. It's easier to write and control, and you know you're not going to hit any of the RBAR (row-by-agonizing-row) shenanigans you can happen upon in a DFT.
YMMV

Combining two tables with SSIS into one destination table

I am new to SSIS, so please bear with me.
I created an Integration Services Project for SQL Server 2008 to import data from an old db to a new one. One of the things I need to do is import data from two old source tables into one new destination table.
What is the best way to do this?
I can easily see the results I want with a simple inner join query using T-SQL, but am not having any luck with the SSIS package. My current approach is a three-step process:
Add an OLE DB Source component that pulls all columns from my first source table.
Add a Lookup component as the next step after the OLE DB Source. In this I query the second source table 'using the results of a SQL query' that returns no NULLs, then drag the foreign key id from the 'available input columns' onto the primary key in the 'available lookup columns'. I also check the checkboxes in 'available input columns' to add 2 more columns.
Add an OLE DB Destination, pointed at my destination table.
This process fails at the first step, not at the lookup step, and fails with the error "Row yielded no match during lookup". The foreign key cannot be null, and obviously the primary key can't be either. I used a SQL statement in step two so I could make sure I don't get any null date values in the columns (there were a few), but I am still getting the error. If I output the first step's failure path to a Flat File Destination, I get an empty CSV (watching in debug mode says ~600k records go into the flat file).
I am pretty stumped at this point, and this seems like it should be a super easy task. I have scoured the web for answers and found this link that sounds like the same exact problem I am having, but changing the cache setting didn't help.
Help appreciated!
It sounds like you have a mismatch in the lookup. I'd hand-run the queries and verify both that the OLE DB Source has no null foreign keys and that each foreign key matches something in the lookup table.
There is a simpler approach here. Use the inner join query you mentioned in the OLE DB Source: don't use the table select, provide your SQL query with the join. This lets SQL Server do all of the heavy lifting of the join, and then SSIS can do the transferring.
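A sketch of what that source query might look like (the table and column names are hypothetical):

-- Used as the SQL command text of the OLE DB Source instead of a table select.
SELECT a.Id, a.ColumnA, b.ColumnB
FROM dbo.OldTableA AS a
INNER JOIN dbo.OldTableB AS b
    ON b.Id = a.ForeignKeyId;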

SSIS SELECT VALUE from a table without a lookup

I'm fairly new to SSIS.
I'm importing from an XLS spreadsheet into a database table. Along the way I want to select a record from a table, but it is NOT a lookup, i.e. a straight SELECT with no join to the input source. Then I want to merge this with the other rows from the XLS.
What is the best way to do this? Variables? OLE DB commands?
Thanks
You could use an OLE DB Command, but the important thing to remember about this is that it fires on a per-row basis and could potentially be slow. You can still use a lookup for this purpose, but make sure that you set the error output to ignore lookup errors for the cases when the lookup transformation does not contain a value for the match you are looking for.
You could also use a Merge Join transformation with an outer join rather than an inner join.
If the record that you are retrieving from the database table is not dependent on the data within the row from the spreadsheet then it will probably be the same for each row - is that what you are hoping for?
In this case, I would consider using an Execute SQL Task in the Control Flow to retrieve the record and save it to a variable. You can use a Script Component in the Data Flow to copy the values in the record from the variable to the appropriate fields in each row. This means the lookup data is retrieved only once, not once per row (which is slow, as jn29098 said above).
If the target for your Data Flow is the same database as the one from which you are extracting the 'lookup' record then you could also consider using an Execute SQL Task (in the Control Flow) to add the lookup values once the spreadsheet data has arrived in the database (once the Data Flow has completed). This would be much more efficient.
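For that last approach, a minimal sketch (all object names are hypothetical):

-- Run in an Execute SQL Task after the Data Flow: stamp the single 'lookup'
-- value onto the rows just loaded from the spreadsheet.
UPDATE dbo.ImportedSpreadsheet
SET LookupValue = (SELECT TOP (1) SomeValue FROM dbo.ReferenceTable)
WHERE LookupValue IS NULL;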