'[180] Job was deleted while it was executing: the outcome was (Unknown)' - ssis

I have a simple package that produces a CSV file from a query and updates a 'lastrun' time on a table. It was written in 2014 and runs on a test server with SQL Server 2014. The agent job that runs it simply executes it via an SSIS Package step; no other steps are involved.
However, I get the above error message in the agent log file. The job will execute successfully and produce a file, but ONLY after either restarting the agent service or changing the properties on the job (after refreshing the job list in SSMS). And because it seemingly deletes itself during execution, there is no job history to view, and the schedule stops repeating.
I can't find anything like this on here, and wondered if anyone has ever seen this, or has any ideas?
Thanks.
Note (update): all other agent jobs run fine on the same server. The only difference is that this is the only job calling an SSIS package.
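For diagnosis, queries along these lines against msdb (the job name is a placeholder) should confirm whether the job and its schedule really have been removed, and whether any history survived:

-- Does the job still exist, and is a schedule still attached? (job name is a placeholder)
SELECT j.name, j.enabled, s.name AS schedule_name, js.next_run_date, js.next_run_time
FROM msdb.dbo.sysjobs AS j
LEFT JOIN msdb.dbo.sysjobschedules AS js ON js.job_id = j.job_id
LEFT JOIN msdb.dbo.sysschedules AS s ON s.schedule_id = js.schedule_id
WHERE j.name = N'MyCsvExportJob';

-- Any surviving history rows for the job
SELECT TOP (20) h.run_date, h.run_time, h.run_status, h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
WHERE j.name = N'MyCsvExportJob'
ORDER BY h.instance_id DESC;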

It could be that when you restore the same database (a subscriber/distribution database) under another database name, the restore clears the job automatically.

Related

SSIS package not deleting AS400 records

I have an SSIS package which downloads data from AS400 to SQL Server.
This is working fine but:
The pre-final task (Execute SQL task) is to delete the downloaded records from AS400.
The query is simple:
DELETE FROM (TABLE_NAME)
I am pretty sure that this task is being hit because a Send Email task after this is working.
The issue is occurring on one server only. And I am unable to figure out why.
The entire setup is the same for all servers.
Try to track down the job log of the job servicing the request. If there are errors being passed along by SSIS, you'll see them there.
One issue I've seen, especially with people who still call the system AS/400, is that the tables aren't being journaled, which means that when SSIS attempts to delete the rows under commitment control, the delete fails with an error.
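If journaling the table is not an option, one possible workaround (a sketch only; TABLE_NAME is the same placeholder used in the question) is to run the delete without commitment control by adding DB2 for i's isolation clause to the statement:

-- Delete without commitment control, so the statement does not require the table to be journaled
DELETE FROM TABLE_NAME WITH NC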

Executing multiple SSRS reports from SSIS package

I have developed an SSIS package to run 3 reports from Reporting Services that are data driven subscriptions.
When I run the SSIS job, it executes all three reports at once. What I need is to run the reports sequentially, in other words one by one. How can I do this?
This is expected behavior. When you trigger a data-driven subscription job, SQL Server Agent starts the job, and that completes the whole transaction. The SSIS package then goes on to trigger the next data-driven subscription job, and the next (assuming you have put the job-triggering steps in sequence).
Now, if you want to create a dependency in the way the jobs should run, i.e. Job1 followed by Job2 followed by Job3, you need to write an additional piece of code. The way to go about it is to monitor the status of the subscription.
In the ReportServer database there is a table called dbo.Subscriptions containing a column 'LastStatus'. Currently I don't have any subscriptions in my local database and am not able to find any documentation for the table, but I am pretty sure this is either a boolean or a status flag such as 'Success' or 'Failure'. Upon triggering the first job, you would write some .NET code to monitor this status on a polling interval. Once you get the desired outcome, move on to triggering the next job.
Hope this is clear. I will edit this answer with a working example.
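In the meantime, a rough T-SQL sketch of the polling idea, which could sit in an Execute SQL Task between the subscription-triggering steps (this is not the .NET poller described above; the status strings and the subscription GUID are assumptions you would need to verify against your own ReportServer database):

DECLARE @SubscriptionID uniqueidentifier;
DECLARE @status nvarchar(260);
SET @SubscriptionID = '00000000-0000-0000-0000-000000000000';   -- replace with the subscription's GUID

WHILE 1 = 1
BEGIN
    SELECT @status = LastStatus
    FROM ReportServer.dbo.Subscriptions
    WHERE SubscriptionID = @SubscriptionID;

    IF @status IS NULL BREAK;   -- no such subscription; bail out

    -- LastStatus is free text, e.g. 'Pending' or 'Done: 3 processed of 3 total; 0 errors.'
    -- Inspect the real values on your server before relying on these patterns.
    IF @status NOT LIKE 'Pending%' AND @status NOT LIKE 'Processing%'
        BREAK;

    WAITFOR DELAY '00:00:30';   -- polling interval
END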

Cannot deploy SSIS via Integration Services to SQL Server 2012: Current transaction cannot be committed. Error: 3930

I have an SSIS project that has been working for over a year, and in just a couple of days I made a bunch of changes. When I try to deploy, it returns an error saying
The current transaction cannot be committed and cannot support operations that write to the log file. Roll back the transaction.
It doesn't tell me what or which package is causing this issue.
Is there any way to troubleshoot this?
I appreciate any recommendation!!
In my case, the reason for the error was a trigger on the SSISDB database for DDL_DATABASE_LEVEL_EVENTS, which tried to write info into another database.
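To check whether this is what is happening on your instance, you can list the database-scoped DDL triggers on SSISDB and, if you own one, disable it while you deploy (the trigger name below is a placeholder):

USE SSISDB;
SELECT name, is_disabled
FROM sys.triggers
WHERE parent_class_desc = 'DATABASE';

-- DISABLE TRIGGER [MyDdlAuditTrigger] ON DATABASE;   -- placeholder name; re-enable after deployment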
I just solved it: the partition was running out of free space. I cleaned the log, and it's back to normal!
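If it is a SQL Server transaction log that filled the drive, something along these lines (a sketch only; back up first, and check sys.database_files for the real logical file name) shows which logs are full and shrinks one back down:

-- How full is each database's log?
DBCC SQLPERF(LOGSPACE);

-- After backing up (or, on a test box, truncating) the log, shrink the file
USE SSISDB;
DBCC SHRINKFILE (N'log', 256);   -- logical log file name and target size in MB are placeholders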
I changed the name of my Solution and Project but wanted to deploy to the same location in the SSISDB Catalog. I was getting this specific error because of the name change (it was failing on the deployment step when it called the internal.preparedeploy procedure). I ended up deleting the Project that existed on the Target Server and re-deploying the new Project to the same location. It deployed successfully.
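For reference, the same cleanup can also be scripted with the catalog's delete_project procedure before redeploying (the folder and project names below are placeholders):

EXEC SSISDB.catalog.delete_project
     @folder_name = N'MyFolder',
     @project_name = N'OldProjectName';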

Inconsistent "cannot access file" error when processing flat file

We have an SSIS 2008R2 process that includes a step where an external application is launched in a task that executes a remote procedure call. The external application produces output as a flat file, which SSIS is then supposed to pick up and process. When the external process has finished, the task that launched it completes successfully.
The executive summary of our problem is that we always seem to have to run the package twice before the step that processes the flat file succeeds. Does anyone have an idea why, and what we might try to resolve this?
Here are the gory details:
The SSIS package pauses while the external process is running, as it is supposed to do, and waits for the "all clear" from the external application before attempting to read the file that has been produced. (FWIW, the external application creates the file when it starts, then populates it over the course of its run.)
Our problem is that, both during development when the package is run from BIDS and during testing when the package is run as a scheduled SQL Server job, the package would sometimes (but not always) fail and report that it was unable to open the text file. BUT, it is not consistent.
The file in question is written to a network share. We have verified that the share is accessible to the network account under which the job runs, as well as to the developers.
We have tried adding a script task that does the following:
Verify that the file exists
If the file exists, try to open it as a stream for read/write access, in exclusive mode.
If the file is not available, wait for a specified time and try again. Keep trying until either it is successfully opened (and then closed again), or until the limit of number of tries is reached.
Once the file has been successfully opened, close the stream and wait for a few more seconds in case there is a latency problem of some sort.
Although the script is designed to report a failure if it is never able to open the file, we have never seen that branch of the code actually execute (i.e. we are ALWAYS successful in an attempt to open the file).
We know that networks are busy places and that, microseconds after we close the file, something else could come along behind our backs and open it again, but there is absolutely no reason to expect this to be the case in our environment.
Finally, when the package is run from the SQL Server job on a schedule it always fails. When we do nothing more than re-execute the job manually, it seems to "always" succeed. (It was not always so; before we upped the wait time after our successful attempt to open the file even this was not enough.)
The code that we use to test for whether the flat file could be opened came from a thread right here on StackOverflow. I'm happy to post it if anyone thinks our test might itself be contributing to the problem, but it's hard to understand how that could be since the package works sometimes.
Let's give the following a try:
Create a proxy under the SSIS Package Execution subsystem.
Try to execute this step with that proxy (via the Run As drop-down when you edit the step).
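If it helps, the same setup in T-SQL would look roughly like this (the credential, proxy, and Windows account names are placeholders, and the password obviously needs replacing):

USE msdb;

-- 1) A credential holding the Windows account that has rights to the network share
CREATE CREDENTIAL SsisFileShareCredential
    WITH IDENTITY = N'DOMAIN\SsisFileUser', SECRET = N'password-goes-here';

-- 2) A proxy bound to that credential
EXEC dbo.sp_add_proxy
     @proxy_name = N'SsisFileShareProxy',
     @credential_name = N'SsisFileShareCredential',
     @enabled = 1;

-- 3) Let the proxy run SSIS package steps (subsystem 11 = SSIS)
EXEC dbo.sp_grant_proxy_to_subsystem
     @proxy_name = N'SsisFileShareProxy',
     @subsystem_id = 11;

Then pick that proxy in the job step's Run As drop-down.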

Deadlock on logging variable value changes using a SQL task

Morning
I've been reading "SQL Server 2008 Integration Services Problem - Design - Solution". It outlines a way of logging variable changes which I'm trying to replicate in SQL 2005.
Create variables e.g. PackageId, RecordsAffected. - Set Raise ChangeEvent to true.
Create a string variable, e.g. strVariableValue. - Set Raise ChangeEvent to false.
On the package event handler: OnVariableValueChanged add a script task "SCR Convert value to string".
Add ReadOnlyVariables: System::VariableValue
Add ReadWriteVariables: User::strVariableValue
In the script, set a local variable to System::VariableValue.value.tostring
Set the variable User::strVariableValue to the local variable
Add an "Execute SQL Task" component "SQL Log Variable Value Changed" calling a SP with no resultsets.
Set parameter mapping to User::PackageId, System::VariableName, User::strVariableValue
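For concreteness, a minimal sketch of what such a logging procedure might look like; this is not the book's actual code, and the table name, procedure name, and parameter types are assumptions:

-- Hypothetical log table and procedure; adjust names and types to match your framework
CREATE TABLE dbo.VariableValueLog
(
    LogId         int IDENTITY(1,1) PRIMARY KEY,
    PackageId     nvarchar(50)   NOT NULL,   -- adjust to match the framework's PackageId type
    VariableName  nvarchar(256)  NOT NULL,
    VariableValue nvarchar(4000) NULL,
    LoggedAt      datetime       NOT NULL DEFAULT (GETDATE())
);
GO

CREATE PROCEDURE dbo.LogVariableValueChanged
    @PackageId     nvarchar(50),
    @VariableName  nvarchar(256),
    @VariableValue nvarchar(4000)
AS
BEGIN
    SET NOCOUNT ON;   -- no result sets back to the Execute SQL Task
    INSERT INTO dbo.VariableValueLog (PackageId, VariableName, VariableValue)
    VALUES (@PackageId, @VariableName, @VariableValue);
END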
When this is run, I get a deadlock on User::PackageID
Error: 0xC001405B at SQL Log Variable Value Changed: A deadlock was detected while trying to lock variable "User::_PackageID" for read access. A lock could not be acquired after 16 attempts and timed out.
The script step succeeds but the Execute SQL task fails. I'm using Visual Studio 2005 Version 8.0.50727.42, Microsoft SQL Server Integration Services Designer Version 9.00.4035.00 and BIDSHelper Version 1.4.3.0.
Any ideas?
Eureka!
I had the same problem, and it led to a few dead-end posts before I discovered the root cause.
I had the framework working just fine and wanted to force some info to be logged.
So I changed the value of the framework variable "strVariableValue" and this caused the deadlock with the change event task.
I fixed it by creating my own variable "strLogMe" and putting whatever I wanted to log into it.
Moral: don't touch the framework variables
Did you use the code sample from the book? All the files are available on the Wiley website for free. The code sample includes an SSIS package, SQL scripts, and VB code for the script. If this doesn't work for you, then let me know, since one of my team members found a way to log variable changes that was different from this methodology.
I was getting this error ("a deadlock was detected", etc.) suddenly, which seemed to coincide with I.T. having applied a Microsoft Windows patch on the server. The packages involved used script tasks, with read-only and/or read-write variables declared in the SSIS UI. It looked like an environmental issue, because the packages had worked for months and then suddenly stopped, even though I hadn't changed any code. Various older blog posts described companies applying server patches and then having their SSIS packages break, and the advice was to change the way you lock the variables: don't reference them in the UI, lock them explicitly in code instead. So I tried the same thing. It didn't fix it.
It turned out someone had removed the account under whose identity the packages run from the AD group; that membership was required because the package was trying to copy a file from a directory that required read permissions. These packages are typically called by a SQL Agent job using a proxy identity. When the package was executed manually from SSMS, it worked. But when it was run via the SQL Agent job, it failed.
The bottom line is, it was just coincidence that the packages started failing around the time of the Windows update. But the other (main) point is, if your package is trying to access a file on the network, and the identity (or proxy identity) under which that package runs does not have permissions to the source or target directory, then your package could fail and the problem could manifest itself in this cryptic way, where it looks like a variable deadlock issue, but it's actually a file share permissions issue. I only wasted a day on this, but... maybe this will be useful to somebody in the future.
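As a quick check for anyone in the same spot, a query like this shows which proxy, and therefore which Windows identity, each SSIS job step actually runs under, so you can verify that account's permissions on the share:

-- Steps with a NULL proxy run under the SQL Server Agent service account
SELECT j.name AS job_name,
       s.step_name,
       p.name AS proxy_name,
       c.credential_identity
FROM msdb.dbo.sysjobsteps AS s
JOIN msdb.dbo.sysjobs AS j ON j.job_id = s.job_id
LEFT JOIN msdb.dbo.sysproxies AS p ON p.proxy_id = s.proxy_id
LEFT JOIN sys.credentials AS c ON c.credential_id = p.credential_id
WHERE s.subsystem = N'SSIS';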