I frequently encounter a situation in SSIS packages where I run a SQL command to return a set of rows from an ADO connection. There are cases where I want to branch based on the number of rows returned. The ADO result set is stored in an SSIS 'Object' data type variable. Is there a way, in an SSIS expression or a Script component, to get that count of rows?
Instead of using the Execute SQL Task, use a Data Flow Task like this:
Use a source component to retrieve your data.
Use a Row Count component to store your row count in a variable.
Use a Recordset Destination component and store the rows in your original variable (System.Object type).
Then return to the Control Flow and continue as you planned, using the row count variable to branch your control flow.
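If you would rather keep the Execute SQL Task and count the rows from the Object variable in a Script Task (as the question hints at), here is a minimal sketch. It assumes an OLE DB connection manager, so the Object variable holds an ADO Recordset, and it assumes variables named User::ResultSet (ReadOnly) and User::RowCount (ReadWrite); the method goes in the Script Task's generated ScriptMain class.

```csharp
// Minimal sketch, not the approach above: count the rows held in an SSIS Object
// variable inside a Script Task. Variable names are assumptions.
using System.Data;
using System.Data.OleDb;

public void Main()
{
    var table = new DataTable();

    // Fill(DataTable, object) accepts a COM ADO Recordset and copies it into the DataTable
    new OleDbDataAdapter().Fill(table, Dts.Variables["User::ResultSet"].Value);

    Dts.Variables["User::RowCount"].Value = table.Rows.Count;
    Dts.TaskResult = (int)ScriptResults.Success;
}
```

The User::RowCount variable can then drive precedence constraint expressions in the Control Flow.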
You can also put an expression on the precedence constraint after the Execute SQL Task, for example @[User::objectvariable] > 0, and branch on that; the Execute SQL Task itself still returns the result set into the Object-type variable.
In the SSIS Execute SQL Task, when I use a stored procedure to return data to a full result set, the resulting C# DataTable object contains only one row and one column, with all values crammed into that single cell: the rows are separated by commas and the columns by dashes. However, when the same code executes via a SQL statement, I get multiple rows and columns, so I can access them via row[0].ToString() etc.
This behavior occurs with both ADO.NET and OLE DB connections. Is this by design?
This link started me in the initial direction, but I haven't found a definitive answer:
https://learn.microsoft.com/en-us/sql/integration-services/result-sets-in-the-execute-sql-task?view=sql-server-2014
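For what it's worth, how the Object variable is unpacked depends on the connection manager: with an ADO.NET connection the variable holds a System.Data.DataSet, while with an OLE DB connection it holds a COM ADO Recordset that has to be copied into a DataTable first. A minimal Script Task sketch of the usual unpacking, assuming the variable is named User::ResultSet:

```csharp
// Minimal sketch: read a "Full result set" Object variable in a Script Task.
// User::ResultSet is an assumed variable name.
using System.Data;
using System.Data.OleDb;

object raw = Dts.Variables["User::ResultSet"].Value;
DataTable table = new DataTable();

DataSet ds = raw as DataSet;
if (ds != null)
{
    table = ds.Tables[0];                    // ADO.NET connection manager
}
else
{
    new OleDbDataAdapter().Fill(table, raw); // OLE DB connection manager (ADO Recordset)
}

foreach (DataRow row in table.Rows)
{
    string firstColumn = row[0].ToString();  // individual columns, not one delimited string
}
```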
I need to push the filtered data into a Data Flow Task...
In the Control Flow I have two Execute SQL Tasks and one Data Flow Task connected one after the other. How can I use the output result set of the Execute SQL Tasks in the Data Flow?
The two Execute SQL Tasks perform filter operations and run fine while debugging.
Inside the Data Flow Task do I use an OLE DB Source? What should I use as a source to get the filtered output data from the SQL tasks in the Control Flow?
Adding to this: since you have two ESTs (Execute SQL Tasks) that generate a filtered data set which needs to be passed to a DFT (Data Flow Task), you can use a variable-substitution method.
Here, you replace the direct SQL with a variable: build the dynamic SQL in a Script Task and assign the final SQL statement to an SSIS variable. Then, in the DFT, use the 'SQL command from variable' option in your OLE DB Source. This lets you replace the two ESTs with a single variable that holds the T-SQL statement.
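A minimal sketch of that pattern, with all names assumed (User::FilterValue holds the filter criterion and User::SourceQuery is the string variable the OLE DB Source reads):

```csharp
// Minimal sketch: build the filter SQL in a Script Task and hand it to the
// OLE DB Source through a string variable. Variable, table, and column names
// are assumptions.
public void Main()
{
    // Escape single quotes so the concatenated literal stays valid T-SQL
    string filter = Dts.Variables["User::FilterValue"].Value.ToString().Replace("'", "''");

    Dts.Variables["User::SourceQuery"].Value =
        "SELECT Name, Age, Address FROM dbo.SourceTable WHERE Region = N'" + filter + "'";

    Dts.TaskResult = (int)ScriptResults.Success;
}
```

In the OLE DB Source, set the data access mode to 'SQL command from variable' and point it at User::SourceQuery; DelayValidation may need to be set to True on the Data Flow Task because the query is only built at run time.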
The output data of the Execute SQL Task must be written either to some storage or into an Object-type variable, which can then be used as a source in your Data Flow Task.
You can also filter the data in the source of the Data Flow Task itself.
You can store the output of the Execute SQL Task in a #Temp table (other properties such as DelayValidation and RetainSameConnection will need to be set to True) or in a permanent table, and access that from the Data Flow.
I have an SSIS package with a For Each Loop that imports multiple txt files into a SQL Server table. That runs fine.
What I am trying to accomplish is to store the distinct file name and the date it was imported into a separate table. I created a separate For Each Loop for this and then archive the txt file after it's complete with a File System Task.
The issue I am having is that I put an event handler on it to invoke a SQL Task and a Send Email Task if there is a warning (I was hoping for a warning only if there were no files in the directory the package imports from).
However, I found a warning that a column in the Data Flow Task was not being used and should be removed if not needed. But the Data Flow Task requires at least one field for me to attach a Derived Column task to.
Derived Column Field1 pulls @[User::CurrentFile] from the For Each Loop container.
Field2 pulls the current date.
Is there a way to perform this without the warning?
It sounds like you're over-complicating things.
You have a For Each loop and you're therefore assigning a value to some variable containing the file name, @[User::CurrentFile]. You can get the date it was loaded through either a call to GETDATE() or a reference to the system-scoped variable StartTime, @[System::StartTime].
The most straightforward option would be to add an Execute SQL Task wired up to the OnSuccess precedence constraint from your Data Flow Task. The Execute SQL Task would then have a statement like INSERT INTO dbo.MyLog(FileName, InsertDate) SELECT ?, ?, assuming an OLE DB connection manager, and then you map in your two variables.
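With an OLE DB connection manager the ? placeholders are positional, so on the Parameter Mapping tab you would map @[User::CurrentFile] to parameter name 0 and @[System::StartTime] to parameter name 1.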
Easy, clean, no warnings fired about unused columns in your data flow.
What I think you have is something like this, based on your statement:
"I created a separate For Each Loop for this"
I am using SSIS 2008 and put a simple query (not a proc) in an Execute SQL Task (Control Flow). The query generates one column with a single value, and what I am trying to do is decide, based on this value, whether to run the following tasks. I tried mapping the value to a variable in the parameter mapping; I tried the Output and Return Value directions, etc., but all failed. The query takes no parameters. I know I could probably create a proc with an output parameter mapped to a variable, but I'm wondering whether there are other options (i.e., not creating a proc, since it's a very simple query)?
As mentioned, you need to change the Execute SQL Task to return a 'Single row' result set; you can then map that result set to a variable.
From here you can use precedence constraints within the Control Flow to execute different tasks based on what the outcome variable is; for example:
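With the result mapped to an assumed variable User::ResultValue, a precedence constraint whose evaluation operation is set to Expression and whose expression is @[User::ResultValue] == 1 lets the next task run only when the query returned 1, while a second constraint with @[User::ResultValue] != 1 can route execution down an alternative branch.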
I have to extract data using a SQL query, passing values in the WHERE clause.
The values to be passed reside in Excel. I don't want to hard-code those values in the query.
Can anyone guide me on how to pass those values?
E.g.: SELECT Name, Age, Address FROM XYZ WHERE ID IN ()
Now I have to pass the IDs that I have in my Excel sheet.
Assuming the Excel file is in a location that's always known, you can use the Excel Data Source to get to the information. Take a look at:
http://msdn.microsoft.com/en-us/library/ms141683.aspx
It does support SQL-like syntax, but it will be much easier if the worksheet has two rows - a column header and a data row. You pull the named parameters out of the data row.
Also, my recollection is that you do need to have Microsoft Office installed on the machine executing the SSIS job, so keep that in mind as you deploy.
Once you have the values from the Excel Data Source, store them in variables where you can pass them as value parameters to your SQL query. See:
how to store sql query result in a variable and messegeBox
Using a Script Task, access the Excel sheet and populate an SSIS variable (of type Object). Then build a SQL query in a Foreach Loop Container by iterating over all the values. Store that SQL in another (string) variable and finally execute it using an Execute SQL Task.
You can use a Script Task to iterate over all the values in the Excel worksheet and append them to an SSIS user variable, with each value separated by a comma.
Then you can use an Execute SQL Task to execute the SQL with the parameter created in the Script Task.
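A minimal sketch of that Script Task, with the file path, sheet name, column name, and variable name all assumed:

```csharp
// Minimal sketch: read the ID column from an Excel worksheet and build a
// comma-separated list for the IN (...) clause. Path, sheet, column, and
// variable names are assumptions; the code goes in the Script Task's Main().
using System.Collections.Generic;
using System.Data.OleDb;

public void Main()
{
    string connStr = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\\Data\\Ids.xlsx;" +
                     "Extended Properties=\"Excel 12.0 Xml;HDR=YES\"";

    var ids = new List<string>();
    using (var conn = new OleDbConnection(connStr))
    using (var cmd = new OleDbCommand("SELECT ID FROM [Sheet1$]", conn))
    {
        conn.Open();
        using (OleDbDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                ids.Add(reader["ID"].ToString());
        }
    }

    // e.g. "101,102,103", to be spliced into ... WHERE ID IN (101,102,103)
    Dts.Variables["User::IdList"].Value = string.Join(",", ids);
    Dts.TaskResult = (int)ScriptResults.Success;
}
```

The Execute SQL Task (or the dynamic-SQL variable approach described earlier) can then splice @[User::IdList] into the IN (...) clause, for example via an expression on its SqlStatementSource property.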