SSIS: Always run a specific step last without connecting to every parent

I have an SSIS package with about 50 different steps. Many of the steps will run in parallel.
I would like to put a ScriptTask at the end of the package such that, regardless of success or failure, I send out an email which reports on the status of the package run (the package logs things to a database as it runs, so the ScriptTask involves querying the status tables and formatting them into HTML).
Is there a way to have a task that executes when all other tasks have either completed running, or been skipped due to dependencies?
I have tried adding my task to the OnError event handler, but that causes it to fire up to n times when tasks fail in parallel. I don't want to connect every single step to the ScriptTask with an OnFailure condition, because the sheer number of connections would make the package practically unusable. I also believe that wiring it up that way would not guarantee it runs, since some steps would be skipped due to dependencies and therefore never report success or failure.

You can do this in two different ways.
Option a
Add a Sequence Container to your Control Flow.
Move all existing tasks inside the Sequence Container.
Add your Script Task outside the Sequence Container and change its precedence constraint from Success to Completion by double-clicking the line.
Option b
Create a package that contains your Script Task.
Create a master package that runs the original package and then calls the package with the Script Task, passing any variables it needs to generate the email.
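For the reporting task itself, here is a minimal C# sketch of what the Script Task's Main() might look like: it queries a hypothetical dbo.PackageRunLog status table and mails an HTML summary. The table name, the User::LogDbConnectionString and User::RunId variables, and the e-mail/SMTP addresses are all placeholders, not anything taken from the original package.

// Sketch of the final Script Task (C#). The using directives go at the top of the
// SSIS-generated ScriptMain.cs; Main() replaces the generated stub.
// dbo.PackageRunLog, User::LogDbConnectionString, User::RunId and the addresses
// below are assumed placeholders for your own logging table and mail settings.
using System;
using System.Data.SqlClient;
using System.Net.Mail;
using System.Text;

public void Main()
{
    var html = new StringBuilder("<h3>Package run status</h3><table>");

    // Read the per-task status rows the package logged while it ran.
    using (var conn = new SqlConnection(Dts.Variables["User::LogDbConnectionString"].Value.ToString()))
    using (var cmd = new SqlCommand(
        "SELECT TaskName, Status, StartTime, EndTime FROM dbo.PackageRunLog WHERE RunId = @RunId", conn))
    {
        cmd.Parameters.AddWithValue("@RunId", Dts.Variables["User::RunId"].Value);
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                html.AppendFormat("<tr><td>{0}</td><td>{1}</td><td>{2}</td><td>{3}</td></tr>",
                    reader["TaskName"], reader["Status"], reader["StartTime"], reader["EndTime"]);
            }
        }
    }
    html.Append("</table>");

    // Mail the HTML summary; the SMTP relay is assumed.
    using (var mail = new MailMessage("etl@example.com", "team@example.com"))
    using (var smtp = new SmtpClient("smtp.example.com"))
    {
        mail.Subject = "SSIS package run status";
        mail.Body = html.ToString();
        mail.IsBodyHtml = true;
        smtp.Send(mail);
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}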

Related

system::containerStartTime is not working

I have an SSIS package with a few tasks (Execute SQL Tasks, a Data Flow Task). I have two event handlers (OnPostExecute, OnError) for custom logging at the package level.
I am using System::ContainerStartTime for the TaskStartTime (datetime) column in the OnPostExecute event handler, but it is not inserting the correct start time: all the tasks get the same time, differing only by milliseconds, even though the tasks actually start minutes apart.
It looks to me like a static variable that is set at the start of the package. If that is the case, what is the difference between System::StartTime and System::ContainerStartTime?
I would expect this variable to show the same start time that appears against each task in the Execution Results tab in Visual Studio.
Please let me know how I can get the correct start time of each task in the package.
Thanks,
Zaim Raza.
ContainerStartTime is the start time of the container that holds the task, not of the tasks themselves.
I'm afraid there is no system variable for time at task level as per Microsoft documentation here:
https://learn.microsoft.com/en-us/sql/integration-services/system-variables
One thing you can do is save the current time to a user variable at the beginning of the package, or immediately before the task being monitored, and use that variable in your event handler.
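For example (a minimal sketch, assuming a DateTime package variable named User::TaskStartTime exposed in the task's ReadWriteVariables), a tiny Script Task placed immediately before the monitored task could capture the time that your OnPostExecute handler then logs:

// Main() of a small Script Task (C#) that runs immediately before the monitored task.
// User::TaskStartTime is an assumed DateTime variable listed under ReadWriteVariables.
public void Main()
{
    Dts.Variables["User::TaskStartTime"].Value = DateTime.Now;   // capture the real start time
    Dts.TaskResult = (int)ScriptResults.Success;
}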

SSIS Passing Parameters from Master Package to Child to Data Flow Task

I have been working with a co-worker in an attempt to set up master and sub SSIS package templates. These templates look into an audit table to identify the correct batch or create a new one. The start and end times of the batch are passed into the CDC (change data capture) table-valued functions, which return the start and end LSNs of the table.
This works by identifying the start and end dates of the batch in the master package. These dates then get set to user variables in the master package. The master package then calls a sub package and passes the user-variable start time and end time to the sub package's package-level parameters. At this point I can print or write the dates to a random table from the sub package and they show the correct values.
Next I want to add a Data Flow Task that uses the sub package's package-level parameters (passed from the master package) to run the table-valued function for the correct time period. This should return the start and end LSNs that can be used in the source query. Unfortunately, what I am seeing is that the Data Flow Task never executes any of the components inside the Data Flow. The Data Flow Task gets a nice green checkmark, but if you open the Data Flow, none of the components have been executed.
We are scratching our heads on this one and have even created a simple proof of concept that passes a value from a master package to the sub package and then attempts to print it from the sub package's Data Flow Task, but it never executes.
Any ideas what could be causing the Data Flow to be ignored yet show a success status?
After a lot of headache and frustration I finally found the answer, or rather my co-worker did. In the past I have always created stand-alone packages that get called from a job in SSMS. When developing individual packages I typically add either a Data Conversion or Derived Column component at the end of the last step completed in the Data Flow. I then set up a data viewer on the path to the Data Conversion/Derived Column component. This allows me to check the data before moving on to the next piece of the Data Flow.
I was using this tactic while working with a master package that calls the child package I was working on, and that was the source of the problem. When the master package calls the child package, it identifies that those components in the Data Flow are not used later in the package and skips them.
To summarize: if you are using a master package to call a child package, make sure all pieces of the Data Flow are used later on in the package or they will be skipped during processing.

Task Execution Working Correctly. Package Execution Not Working Correctly

I have created an SSIS solution (using SQL Server 2012) to extract data from ServiceNow. All the tasks are wrapped in a Sequence Container. When I right-click the Sequence Container and click Execute Task, the package behaves as expected: it extracts the data from ServiceNow and performs the rest of the ETL. However, when I click Execute Package from Solution Explorer, the package completes successfully but does not extract any data from ServiceNow.
I have played around with settings and different package designs, but with no change in behavior. I can execute with true success at the task level but not at the package level. This behavior persists even after deployment: I can execute with success from the SSISDB but with no data extraction, and when I hook the deployed package to a job I still get no data extraction.
My thinking is that it has to be some kind of bug or hidden proxy setting, because I only get true success (with data extraction) when I manually execute at the task level, i.e. the Sequence Container.
I have been using SSIS for a couple of years now and have not come across this issue. Hopefully someone can help or has some ideas.
Got it. I needed a break and to think about this clearly. I needed to change the package property DelayValidation from False to True. That solved the issue. I guess the SSIS engine needed the extra time to validate the variables and expressions set up within the package.

Check Points in SSIS

I got this question in an interview:
How do you achieve checkpoint functionality in SSIS without using checkpoints?
If the package fails, it has to be re-run from the point where it failed instead of rerunning the entire package, without using checkpoints.
One way is to create a task dispatcher as the entry point of the control flow. It only keeps track of the last task that executed successfully and calls the next task based on that persisted variable. The first time the package runs, the variable is 0 and so it starts with task 1; if task 1 completes successfully the variable is set to 1. If task 2 then fails, the next time the control flow runs it will restart at task 2, and so on. A sketch of the dispatcher follows below.
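As a rough sketch of that dispatcher (the dbo.PackageProgress table and the User::LastCompletedStep and User::LogDbConnectionString variables are hypothetical names, not from any real package), the entry-point Script Task could read the persisted step number and let expression-based precedence constraints route execution to the right task:

// Dispatcher Script Task sketch (C#). Reads the last successfully completed step from an
// assumed dbo.PackageProgress table into User::LastCompletedStep. The precedence constraints
// leaving the dispatcher are then expression-based, e.g.
//   to Task 1:  @[User::LastCompletedStep] == 0
//   to Task 2:  @[User::LastCompletedStep] == 1
// Each task updates the table with its own step number when it succeeds.
using System;
using System.Data.SqlClient;

public void Main()
{
    using (var conn = new SqlConnection(Dts.Variables["User::LogDbConnectionString"].Value.ToString()))
    using (var cmd = new SqlCommand(
        "SELECT ISNULL(MAX(LastCompletedStep), 0) FROM dbo.PackageProgress WHERE PackageName = @Name", conn))
    {
        cmd.Parameters.AddWithValue("@Name", Dts.Variables["System::PackageName"].Value);
        conn.Open();
        Dts.Variables["User::LastCompletedStep"].Value = (int)cmd.ExecuteScalar();
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}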

ssis message box (job runs a package)

In my SSIS package there is a Script Task that pops up a message box for each item in a Foreach Loop. This is useful for me when debugging, but what happens when I run this package from a job? I know I won't see any message boxes, but do they stay somewhere in memory? Do they have any influence?
Try using either the event log or a database audit table to record the activity. You will not have a handle on any message box raised by a deployed package, and you do not want a self-contained scheduled package to require human intervention to keep it processing.
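As a minimal sketch of the audit-table approach (the dbo.LoopAudit table and the User::AuditDbConnectionString and User::CurrentItem variables are placeholders, not names from your package), the MessageBox.Show(...) line inside the Script Task's Main() could be replaced with something like:

// Inside Main() of the Script Task (C#): instead of MessageBox.Show(...), write the current
// loop item to an assumed dbo.LoopAudit table so the activity is visible when the job runs
// unattended. Connection string and variable names below are placeholders.
using (var conn = new System.Data.SqlClient.SqlConnection(
    Dts.Variables["User::AuditDbConnectionString"].Value.ToString()))
using (var cmd = new System.Data.SqlClient.SqlCommand(
    "INSERT INTO dbo.LoopAudit (LoggedAt, Item) VALUES (SYSDATETIME(), @Item)", conn))
{
    cmd.Parameters.AddWithValue("@Item", Dts.Variables["User::CurrentItem"].Value.ToString());
    conn.Open();
    cmd.ExecuteNonQuery();
}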