Send Failure Email and Fail Package or Job - SSIS

I have several SSIS packages that I inherited, scheduled as jobs in SSMS, that send email notifications from inside the SSIS package. So, if a particular piece of one of the SSIS packages fails, certain users receive an email notification with the failure and its details. This works fine for individual packages, or for SSMS jobs whose steps do not depend on the failure or success of a package ahead of them.
My problem, and my question, centers on how to let the failure email notifications complete inside the package while still failing the package, so that the corresponding step in the SSMS job fails and subsequent steps do not kick off. Is there a way to do this without having to undo all of the failure notifications inside the SSIS packages and move them out somewhere else?
I'm using SQL Server 2008 R2.
EDIT: If I simply have the task fail the package, the Failure Send Mail task will not kick off.
Instead, I want the failure Send Mail task to run to completion, but still capture that the package was actually a failure. Can I do this with the package as it is, or will I have to redo all of the packages so the failure notifications are sent a different way? Again, this is important for SSMS jobs that contain multiple steps, not so much for the individual packages themselves.

I don't know whether SSIS in SQL Server 2008 already had this property or whether it came later, but you can set a property on the mail task to indicate that it exists only for error handling. You can also set FailPackageOnFailure afterwards on the tasks whose failure should fail the package.
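On the Agent side, the job's step flow can also be made explicit. Below is a minimal sketch, assuming a hypothetical job named 'Nightly Load' and an illustrative package path; it shows how a step's on-fail action of 2 (quit the job reporting failure) keeps later steps from running once the package itself reports failure:

-- Hedged sketch: add an SSIS step to an existing Agent job.
-- The job name and package path are placeholders, not from the question.
EXEC msdb.dbo.sp_add_jobstep
    @job_name          = N'Nightly Load',                       -- hypothetical job
    @step_name         = N'Run Package1',
    @subsystem         = N'SSIS',
    @command           = N'/FILE "D:\Packages\Package1.dtsx"',  -- illustrative path
    @on_success_action = 3,   -- 3 = go to the next step
    @on_fail_action    = 2;   -- 2 = quit the job reporting failure

As long as the package still returns failure after the Send Mail task finishes (e.g. via FailPackageOnFailure on the failing task), the step fails and the job stops there.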

Related

Validating SSIS package

If I create an SSIS package in SSDT that has an error of some sort, save the package, then close and reopen it, it will do validation upon opening the project and alert me of any errors. However, this is the only way I have found to find if I have any errors. If I simply click Build, it will do so without any errors, so this is not the same process as the initial validation. Do I really have to close out of the package and reopen in SSDT to get this functionality? The only other way I have seen is actually during runtime after executing the package. I'm looking for a way before execution.
Task validation occurs when you open the package while working online. This is useful because it finds errors in a package, in most cases in its connections.
If a package contains many connections (SQL, file, or others), you have to wait several minutes for the validation to complete every time you open the package. If you want to skip this, switch SSIS to work offline (SSIS -> Work Offline). This has no effect on running the SSIS package as a SQL Server job.
Otherwise, this validation is ALWAYS done automatically before the package executes. If the tasks have validation errors, the execution fails.
The only way I know to explicitly validate an SSIS package is from the SSMS Integration Services Catalog. There you can navigate to a particular SSIS package and right-click the package you need to validate. Of course, the package needs to be deployed for this. (dtexec also has a /Validate switch if you want to validate from the command line; a T-SQL route is sketched below.)
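For a deployed package, that same validation can be kicked off in T-SQL through the SSISDB catalog. A hedged sketch, with the folder, project, and package names as placeholders (check catalog.validate_package's parameters against your version's documentation):

DECLARE @validation_id bigint;
EXEC SSISDB.catalog.validate_package
    @folder_name       = N'MyFolder',        -- placeholder folder
    @project_name      = N'MyProject',       -- placeholder project
    @package_name      = N'MyPackage.dtsx',  -- placeholder package
    @validation_id     = @validation_id OUTPUT,
    @use32bitruntime   = 0,
    @environment_scope = N'D',               -- assumption: 'D' = validate without environment references
    @reference_id      = NULL;

-- Inspect the outcome (column names per the catalog.validations view):
SELECT status, start_time, end_time
FROM SSISDB.catalog.validations
WHERE validation_id = @validation_id;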

SSIS package not deleting AS400 records

I have an SSIS package which downloads data from AS400 to SQL Server.
This is working fine but:
The pre-final task (Execute SQL task) is to delete the downloaded records from AS400.
The query is simple:
DELETE FROM (TABLE_NAME)
I am pretty sure that this task is being hit because a Send Email task after this is working.
The issue is occurring on one server only. And I am unable to figure out why.
The entire setup is the same for all servers.
Try to track down the job log of the IBM i job servicing the request. If there are errors being passed along by SSIS, you'll see them there.
One issue I've seen, especially with people that still call the system AS/400, is that the tables aren't being journaled, which means that when SSIS attempts to delete the rows under commitment control, the delete fails with an error.
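Two usual ways out, sketched under assumptions (MYLIB.MYTABLE is a placeholder name): either journal the physical file on the IBM i side (CRTJRNRCV/CRTJRN/STRJRNPF), or suppress commitment control for the statement with DB2 for i's NC isolation clause:

-- Hedged sketch for DB2 for i: delete without commitment control,
-- so an unjournaled table does not make the statement fail.
DELETE FROM MYLIB.MYTABLE WITH NC;

Since only one server misbehaves, comparing the journaling state of the target table (and the commit/isolation setting in the driver's connection string) between servers is a quick differential check.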

'[180] Job was deleted while it was executing: the outcome was (Unknown)'

I have a simple package that produces a CSV file from a query and updates a 'lastrun' time on a table. It was written in 2014, running on a test server with SQL Server 2014. The Agent job that runs it simply executes it via an SSIS Package step. No other steps are involved.
However, I get the above error message in the Agent log file. The job will execute successfully and produce a file, but ONLY after either restarting the Agent service or changing the properties on the job (after refreshing the job list in SSMS). And because the job seemingly deletes itself during execution, there is no job history to view, and the schedule then stops repeating.
I can't find anything like this on here, and wondered if anyone has ever seen this, or has any ideas?
Thanks.
Note (update): All other Agent jobs run OK on the same server. The only difference with this one is that it's the only one calling an SSIS package.
It could be that when you restore the same database (subscriber/distribution) under another database name, the restore clears the job automatically.
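To confirm what is left behind after one of these disappearances, here is a hedged diagnostic against the msdb job tables ('MyCsvExport' is a placeholder job name):

-- Check whether the job still exists and whether any history rows survive.
SELECT j.name, j.date_modified, h.run_date, h.run_time, h.run_status
FROM msdb.dbo.sysjobs AS j
LEFT JOIN msdb.dbo.sysjobhistory AS h
    ON h.job_id = j.job_id
WHERE j.name = N'MyCsvExport';   -- placeholder job name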

SSIS message box (job runs a package)

In my SSIS package there is a Script Task that pops up a message box for each item in a Foreach Loop. This is useful when debugging. But what happens when I run this package from a job? I know I won't see any message boxes, but do they stay somewhere in memory? Do they have any influence?
Try using either the event log or a database audit table to record the activity. You will not have a handle on any message box in a deployed package, and you do not want a self-contained scheduled package to require human intervention to keep it processing.
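A minimal sketch of the audit-table route, assuming a hypothetical table dbo.PackageAudit; an Execute SQL Task inside the loop (or the Script Task itself) inserts a row per iteration instead of raising a message box:

-- Hypothetical audit table; names are illustrative.
CREATE TABLE dbo.PackageAudit (
    AuditId     int IDENTITY(1,1) PRIMARY KEY,
    PackageName sysname        NOT NULL,
    Message     nvarchar(4000) NOT NULL,
    LoggedAt    datetime2(0)   NOT NULL DEFAULT (SYSUTCDATETIME())
);

-- One insert per loop iteration; with an OLE DB connection, map the two
-- ? placeholders to package variables in the Execute SQL Task.
INSERT INTO dbo.PackageAudit (PackageName, Message)
VALUES (?, ?);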

Deadlock on logging variable value changes using a SQL task

Morning
I've been reading "SQL Server 2008 Integration Services Problem - Design - Solution". It outlines a way of logging variable changes which I'm trying to replicate in SQL 2005.
Create variables, e.g. PackageId, RecordsAffected. - Set RaiseChangedEvent to true.
Create a string variable, e.g. strVariableValue. - Set RaiseChangedEvent to false.
On the package event handler OnVariableValueChanged, add a script task "SCR Convert value to string".
Add ReadOnlyVariables: System::VariableValue
Add ReadWriteVariables: User::strVariableValue
In the script, set a local variable to System::VariableValue.Value.ToString
Set the variable User::strVariableValue to the local variable
Add an "Execute SQL Task" component "SQL Log Variable Value Changed" calling a stored procedure with no result sets (a sketch of such a procedure follows below).
Set the parameter mapping to User::PackageId, System::VariableName, User::strVariableValue
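For reference, a minimal sketch of such a logging procedure, assuming a hypothetical table dbo.VariableValueLog; the names and types are illustrative (not taken from the book) and kept SQL 2005-compatible:

CREATE TABLE dbo.VariableValueLog (
    LogId         int IDENTITY(1,1) PRIMARY KEY,
    PackageId     nvarchar(50)   NOT NULL,  -- assumes the package ID is passed as a GUID string
    VariableName  nvarchar(256)  NOT NULL,
    VariableValue nvarchar(4000) NULL,
    LoggedAt      datetime       NOT NULL DEFAULT (GETUTCDATE())
);
GO
CREATE PROCEDURE dbo.LogVariableValueChanged
    @PackageId     nvarchar(50),
    @VariableName  nvarchar(256),
    @VariableValue nvarchar(4000)
AS
BEGIN
    SET NOCOUNT ON;  -- no result sets back to the Execute SQL Task
    INSERT INTO dbo.VariableValueLog (PackageId, VariableName, VariableValue)
    VALUES (@PackageId, @VariableName, @VariableValue);
END;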
When this is run, I get a deadlock on User::PackageID
Error: 0xC001405B at SQL Log Variable Value Changed: A deadlock was detected while trying to lock variable "User::_PackageID" for read access. A lock could not be acquired after 16 attempts and timed out.
The script step succeeds but the Execute SQL task fails. I'm using Visual Studio 2005 Version 8.0.50727.42, Microsoft SQL Server Integration Services Designer Version 9.00.4035.00 and BIDSHelper Version 1.4.3.0.
Any ideas?
Eureka!
I had the same problem, and it led to a few dead-end posts; then I discovered the root cause.
I had the framework working just fine and wanted to force some info to be logged.
So I changed the value of the framework variable "strVariableValue", and this caused the deadlock with the change-event task.
I fixed it by creating my own variable "strLogMe" and putting whatever I wanted to log in it.
Moral: don't touch the framework variables
Did you use the code sample from the book? All the files are available on the Wiley website for free. The code sample includes an SSIS package, SQL scripts, and VB code for the script task. If this doesn't work for you, then let me know, since one of my team members found a way to log variable changes that was different from this methodology.
I was getting this error ("a deadlock was detected", etc.) suddenly, which seemed to coincide with I.T. having applied a Microsoft Windows patch on the server. The packages used script tasks with read-only and/or read-write variables declared in the SSIS UI. It seemed to be an environmental issue, because the packages had worked for months and then suddenly stopped working even though I hadn't changed any code. Various blog posts from years gone by described companies applying server patches and then having their SSIS packages break, and the advice seemed to be: change the way you're locking the variables; don't reference them in the UI, lock them explicitly in code instead. So I tried the same thing. It didn't fix it.
It turns out someone had removed the account under whose identity the packages run from the AD group; those permissions were required because the package was trying to copy a file from a directory that required read permissions. These packages are typically called by a SQL Agent job using a proxy identity. When the package was executed manually from SSMS, it worked; but when it was run by calling the SQL Agent job, it failed.
The bottom line is that it was just coincidence that the packages started failing around the time of the Windows update. The other (main) point is: if your package is trying to access a file on the network, and the identity (or proxy identity) under which the package runs does not have permissions on the source or target directory, then your package can fail in this cryptic way, where it looks like a variable deadlock issue but is actually a file-share permissions issue. I only wasted a day on this, but... maybe this will be useful to somebody in the future.