SSIS message box (job runs a package)

In my SSIS package there is a Script Task that pops up a message box for each item in a Foreach Loop. This is useful for me when debugging. But what happens when I run this package from a Job? I know I won't see any message boxes, but do they stay somewhere in memory? Do they have any influence?

Try using either the event log or a database audit table to record the activity. You will not have a handle on any message box from a deployed package, and you do not want a self-contained scheduled package to require human intervention to keep it processing.
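If you still want the visual aid while debugging, one option is to guard the message box so it only appears when the package runs interactively and otherwise raises an information event to the SSIS log. A minimal C# Script Task sketch, assuming a hypothetical loop variable User::CurrentItem and that both variables are listed in the task's ReadOnlyVariables:

    public void Main()
    {
        // User::CurrentItem is a hypothetical Foreach Loop variable; substitute your own.
        string item = Dts.Variables["User::CurrentItem"].Value.ToString();

        // System::InteractiveMode is True in the designer and False under a SQL Agent job,
        // so the dialog can never block an unattended run.
        bool interactive = (bool)Dts.Variables["System::InteractiveMode"].Value;

        if (interactive)
        {
            System.Windows.Forms.MessageBox.Show("Processing: " + item);
        }
        else
        {
            // Goes to the SSIS log / job history instead of a dialog nobody can dismiss.
            bool fireAgain = true;
            Dts.Events.FireInformation(0, "ForEach debug", "Processing: " + item,
                                       string.Empty, 0, ref fireAgain);
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }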

Related

Validating SSIS package

If I create an SSIS package in SSDT that has an error of some sort, save the package, then close and reopen it, it will validate upon opening and alert me to any errors. However, this is the only way I have found to check whether I have any errors. If I simply click Build, it will build without any errors, so this is not the same process as the initial validation. Do I really have to close the package and reopen it in SSDT to get this functionality? The only other way I have seen is at runtime, after executing the package. I'm looking for a way before execution.
Task validation occurs when you open the package if you are working online. This is useful because it finds errors in a package, most often in its connections.
If a package contains many connections (SQL, file, or others), you have to wait several minutes for the validation to complete every time you open it. If you want to skip this, switch SSIS to work offline (SSIS -> Work Offline). This has no effect on running the SSIS package as a SQL Server Job.
Otherwise, this validation is ALWAYS done automatically before the package executes. If the tasks have validation errors, the execution fails.
The only way I know to explicitly validate an SSIS package is from the Integration Services Catalog in SSMS. There you can navigate to a particular SSIS package and right-click the package you need to validate. Of course, the package needs to be deployed for this.
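If deploying just to validate is not an option, a rough alternative is to validate the package programmatically with the SSIS runtime object model. This is only a sketch, assuming a file-based .dtsx and a reference to Microsoft.SqlServer.ManagedDTS; the path is a placeholder:

    using System;
    using Microsoft.SqlServer.Dts.Runtime;

    class ValidatePackage
    {
        static void Main()
        {
            // Placeholder path; point it at the .dtsx you want to check.
            Application app = new Application();
            Package pkg = app.LoadPackage(@"C:\SSIS\MyPackage.dtsx", null);

            // Runs the same validation the designer performs before execution.
            DTSExecResult result = pkg.Validate(pkg.Connections, pkg.Variables, null, null);

            Console.WriteLine("Validation result: " + result);
            foreach (DtsError error in pkg.Errors)
            {
                Console.WriteLine(error.Description);
            }
        }
    }

This surfaces the same connection and metadata errors you would otherwise only see by reopening the package or executing it.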

system::containerStartTime is not working

I have an SSIS package which has a few tasks (Execute SQL Tasks, Data Flow Task). I have two event handlers (OnPostExecute, OnError) for custom logging at the package level.
I am using System::ContainerStartTime for the TaskStartTime (datetime) in the OnPostExecute event handler, but it is not inserting the correct start time. All the tasks always get the same time, differing only by milliseconds, even though the tasks actually start minutes apart.
It looks to me like it is a static variable which is set at the start of the package. If that is the case, then what is the difference between System::StartTime and System::ContainerStartTime?
This variable should show the same time that appears against each task as its start time in the Execution Results tab in Visual Studio.
Please let me know how I can get the correct start time of each task in the package.
Thanks,
Zaim Raza.
ContainerStartTime is the start time of the container of the task, not of the tasks themselves.
I'm afraid there is no system variable for the start time at task level, as per the Microsoft documentation here:
https://learn.microsoft.com/en-us/sql/integration-services/system-variables
One thing you can do is save the time to a user variable at the beginning of, or immediately before, the tasks being monitored and use this in your event handler.
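A minimal sketch of that idea: a C# Script Task placed in the package's OnPreExecute event handler that stamps a user variable just before each executable starts, so the OnPostExecute handler can log it. User::TaskStartTime is an assumed DateTime variable listed in the task's ReadWriteVariables:

    public void Main()
    {
        // User::TaskStartTime is an assumed package-scoped DateTime variable.
        // OnPreExecute fires just before each executable starts, so stamping the
        // variable here gives the OnPostExecute handler a per-task start time.
        // (With tasks running in parallel you would need one variable per task,
        // or a lookup keyed on System::SourceName.)
        Dts.Variables["User::TaskStartTime"].Value = DateTime.Now;
        Dts.TaskResult = (int)ScriptResults.Success;
    }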

SSIS Always run a specific step last without connecting to every parent

I have an SSIS package with about 50 different steps. Many of the steps will run in parallel.
I would like to put a ScriptTask at the end of the package such that, regardless of success or failure, I send out an email which reports on the status of the package run (the package logs things to a database as it runs, so the ScriptTask involves querying the status tables and formatting them into HTML).
Is there a way to have a task that executes when all other tasks have either completed running, or been skipped due to dependencies?
I have tried adding my task to the OnError event handler, but this causes it to fire up to n times when tasks fail in parallel. I don't want every single step to flow to the ScriptTask with the OnFailure condition, because the volume of connections would make the package functionally impossible to work with. I also believe that connecting it that way would not ensure it ran, as some steps would be skipped and thus have neither a success nor a failure status.
You can do this in two different ways.
Option A
Add a Sequence Container to your Control Flow.
Move all existing tasks inside the Sequence Container.
Add your Script Task outside the Sequence Container, connect the Sequence Container to it, and change the precedence constraint from Success to Completion by double-clicking the line.
Option B
Create a package that has your Script Task in it.
Create a master package that runs the original package and then calls the package with the script task, passing any variables as needed for it to generate the email.
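As for the final Script Task itself (the one that queries the status tables and formats them into HTML, as described in the question), here is a rough C# sketch of the body of Main(). The connection-string variable, table, and column names are all assumptions, and User::EmailBody is an assumed variable that a Send Mail Task would use as the message body:

    // Build an HTML summary of the run from an assumed audit table.
    var html = new System.Text.StringBuilder(
        "<table><tr><th>Task</th><th>Status</th><th>Rows</th></tr>");

    string connStr = Dts.Variables["User::AuditConnectionString"].Value.ToString();

    using (var conn = new System.Data.SqlClient.SqlConnection(connStr))
    using (var cmd = new System.Data.SqlClient.SqlCommand(
        "SELECT TaskName, Status, RowsProcessed FROM dbo.PackageRunLog WHERE ExecutionId = @execId",
        conn))
    {
        cmd.Parameters.AddWithValue("@execId",
            Dts.Variables["System::ExecutionInstanceGUID"].Value.ToString());
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                html.AppendFormat("<tr><td>{0}</td><td>{1}</td><td>{2}</td></tr>",
                                  reader["TaskName"], reader["Status"], reader["RowsProcessed"]);
            }
        }
    }

    html.Append("</table>");
    Dts.Variables["User::EmailBody"].Value = html.ToString();
    Dts.TaskResult = (int)ScriptResults.Success;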

SSIS package data flow tasks report

We have many SSIS packages that move, import, and export large amounts of data. What is the best way to get alerts or notifications if the expected amount of data is not processed? Or how can we get a daily report on how the different SSIS packages are functioning? Is there a way to write/use a custom component and simply plug it into SSIS packages instead of writing a custom component for each package?
For your first question, we use user variables in SSIS to log the number of rows processed by each step along with the package name and execution id. You can then run reports on the history table, and if any of the executions have a large variance in the rowcounts processed, you can trigger an event.
Yes. See here, or in the alternative, google "custom ssis component tutorial".
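To make the first suggestion concrete, a rough C# sketch of a Script Task that writes the row count captured by a Row Count transformation into a history table after a Data Flow finishes. The table name and the User::RowsProcessed / User::AuditConnectionString variables are assumptions:

    // Record how many rows this execution processed, keyed by package and execution id.
    // dbo.PackageRowHistory is a hypothetical audit table.
    string connStr = Dts.Variables["User::AuditConnectionString"].Value.ToString();

    using (var conn = new System.Data.SqlClient.SqlConnection(connStr))
    using (var cmd = new System.Data.SqlClient.SqlCommand(
        "INSERT INTO dbo.PackageRowHistory (PackageName, ExecutionId, RowsProcessed, LoggedAt) " +
        "VALUES (@pkg, @execId, @rows, GETDATE())", conn))
    {
        cmd.Parameters.AddWithValue("@pkg", Dts.Variables["System::PackageName"].Value.ToString());
        cmd.Parameters.AddWithValue("@execId", Dts.Variables["System::ExecutionInstanceGUID"].Value.ToString());
        cmd.Parameters.AddWithValue("@rows", Dts.Variables["User::RowsProcessed"].Value);
        conn.Open();
        cmd.ExecuteNonQuery();
    }

    Dts.TaskResult = (int)ScriptResults.Success;

Because each package reuses the same script and table, the reporting or alerting logic only has to be written once against the history table.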

Send Failure Email and Fail Package or Job

I have several SSIS packages that I inherited that have been scheduled as Jobs in SSMS that send email notifications inside of the SSIS package. So, if a particular piece of one of the SSIS packages fail, certain users receive an email notification with the failure and the details of the failure. This works fine for individual packages or SSMS jobs that do not depend on the failure or success of a package ahead of it.
My problem, and my question, centers on how I can allow the failure email notifications to complete in the package but still fail the package, in such a way that the step in the SSMS job fails so that subsequent steps do not kick off. Is there a way to do this without having to undo all of the failure notifications inside the SSIS packages and move them somewhere else?
I'm using SQL Server 2008 R2.
EDIT: If I simply have the task fail the package, the Failure Send Mail task will not kick off.
Instead, I want it to do this, but capture that the package was actually a failure. Can I do this with the package as it is, or will I have to let the package fail and redo all of the packages so the failure notifications are sent a different way? Again, this is important for SSMS jobs that contain multiple steps, not so much for the individual packages themselves.
I don't know whether SSIS in SQL Server 2008 already had this property or whether it was released later, but you can set it on the mail task to indicate that it is just for error handling. Also, you can set FailPackageOnFailure on the task afterwards.
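If that property route is not available in your version, one common pattern (sketched here with assumptions, not taken from the answer above) is to end the failure path with a small C# Script Task that deliberately reports failure after the notification email has gone out, so the package, and therefore the Agent job step, still fails:

    public void Main()
    {
        // Placed after the failure-notification Send Mail Task on the failure path.
        // Returning Failure here makes the package report failure to the SQL Agent
        // job step even though the email task itself succeeded, so later job steps
        // do not run.
        Dts.TaskResult = (int)ScriptResults.Failure;
    }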