We are using TFS 2017 Update 2 on-premises for CI and CD. In my release definition I have multiple "agent phases". Is there any way to skip an entire "agent phase" based on some condition?
An agent phase is a way of defining a sequence of tasks that will run on one or more agents. At run time, one or more jobs are created to be run on agents that match the demands specified in the phase properties.
Unlike build tasks, you cannot simply disable/skip an agent phase by right-clicking it and selecting "Disable Selected Task(s)". Instead, you configure the "Run this phase" property of an agent phase to control whether it runs when specific conditions are met.
For "Custom" you need to enter an expression that evaluates to true or false, which controls when the phase should run. This is for a single agent phase at a time; it is not able to skip the entire "agent phase" based on some conditions.
No, that capability doesn't exist.
I want to use the Phonograph writeback dataset for downstream analysis in Foundry. When I make an edit to Phonograph, will the writeback dataset be automatically updated too?
No, the writeback dataset only gets built automatically when registration with Phonograph is updated. Typically, users:
Build the writeback dataset on a recurring schedule appropriate for the use-case (e.g. every day), or
Build on-demand when analysis is needed.
I have an SSIS package which has a few tasks (Execute SQL Tasks, Data Flow Tasks). I have two event handlers (OnPostExecute, OnError) for custom logging at the package level.
I am using System::ContainerStartTime for the TaskStartTime (datetime) column in the OnPostExecute event handler, but it's not inserting the correct start time: all the tasks get the same time, differing only by milliseconds, even though the tasks actually start minutes apart.
It looks to me like it's a static variable that is set at the start of the package. If that is the case, then what is the difference between System::StartTime and System::ContainerStartTime?
This variable should show the same time that appears against each task as the start time in the Execution Results tab in Visual Studio.
Please let me know how I can get the correct start time of each task in the package.
Thanks,
Zaim Raza.
ContainerStartTime is the start time of the container holding the task, not of the tasks themselves.
I'm afraid there is no system variable for the start time at the task level, as per the Microsoft documentation here:
https://learn.microsoft.com/en-us/sql/integration-services/system-variables
One thing you can do is save the time to a user variable at the beginning of, or immediately before, each task being monitored, and use that variable in your event handler, as sketched below.
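A minimal sketch of such a Script Task, assuming a package-level DateTime variable named User::TaskStartTime (an illustrative name) that is listed in the Script Task's ReadWriteVariables:

    // Script Task placed immediately before the task being monitored.
    // It records "now" into a user variable that the OnPostExecute
    // event handler can then log instead of System::ContainerStartTime.
    public void Main()
    {
        // User::TaskStartTime is an assumed variable name; create it first.
        Dts.Variables["User::TaskStartTime"].Value = DateTime.Now;
        Dts.TaskResult = (int)ScriptResults.Success;
    }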
I have developed an SSIS package to run 3 reports from Reporting Services that are data-driven subscriptions.
When I run the SSIS job it executes all 3 reports at once; what I need is to run the reports sequentially, in other words one by one. How can I do this?
This is expected behavior. When you trigger a data-driven subscription job, the SQL Server Agent starts the job, and that completes the whole transaction. The SSIS package then goes on to trigger the next data-driven subscription job and the next (assuming you have put the job triggering in sequence).
Now if you want to create a dependency in the way the jobs should run, i.e. Job1 followed by Job2 followed by Job3, you need to write an additional piece of code manually. The way to go about it would be to monitor the status of the subscription.
In the ReportServer database there is a table called dbo.Subscriptions containing a column 'LastStatus'. Currently in my local db I don't have any subscriptions, and I am also not able to find any documentation for the table, but I am pretty sure this would be either a boolean or a status flag such as 'Success' or 'Failure'. Upon triggering the first job, you would need to write .NET code to monitor this status with a polling interval. Once you get the desired outcome, move on to triggering the next job.
Hope this is clear. I would edit this answer with a working example.
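In the meantime, a hedged sketch of what that polling code could look like in C# (the SubscriptionID value and the 'Done' status prefix are assumptions; verify both against your own ReportServer database):

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    class SubscriptionMonitor
    {
        // Blocks until the subscription's LastStatus indicates completion.
        static void WaitForSubscription(string connectionString, Guid subscriptionId)
        {
            const string sql =
                "SELECT LastStatus FROM dbo.Subscriptions WHERE SubscriptionID = @id";

            while (true)
            {
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(sql, conn))
                {
                    cmd.Parameters.AddWithValue("@id", subscriptionId);
                    conn.Open();
                    var status = cmd.ExecuteScalar() as string;

                    // The 'Done' prefix is an assumption; inspect the column to confirm.
                    if (status != null && status.StartsWith("Done"))
                        return;
                }
                Thread.Sleep(TimeSpan.FromSeconds(30)); // polling interval
            }
        }
    }

Once WaitForSubscription returns, trigger the next subscription job, and repeat for Job2 and Job3.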
I have an SSIS package with about 50 different steps. Many of the steps will run in parallel.
I would like to put a ScriptTask at the end of the package such that, regardless of success or failure, I send out an email which reports on the status of the package run (the package logs things to a database as it runs, so the ScriptTask involves querying the status tables and formatting them into HTML).
Is there a way to have a task that executes when all other tasks have either completed running, or been skipped due to dependencies?
I have tried adding my task to the OnError event handler, but this causes it to fire up to n times when tasks fail in parallel. I don't want to have every single step flow to the ScriptTask with the OnFailure condition, because the volume of connections would make the package functionally impossible to work with. I also believe that connecting it that way would not ensure it ran, as some steps would be skipped and thus have neither a success nor a failure status.
You can do this in two different ways.
Option a
Add a Sequence Container to your Control Flow.
Move all existing tasks inside the Sequence Container.
Add your Script Task outside the Sequence Container, and change the precedence constraint from Success to Completion by double-clicking on the line.
Option b
Create a package that has your Script Task in it.
Create a master package that runs the original package and then calls the package with the script task, passing any variables as needed for it to generate the email.
It is possible to run only a specific Data Flow Task via Visual Studio (right-click on the Data Flow and execute), e.g. when you have multiple data flows but only want to execute one of them.
I am now trying to implement this ability via a job, e.g. I have one Control Flow with two Data Flows (DataFlow1 and DataFlow2), and SQLJob1 should fire DataFlow1 while SQLJob2 should fire DataFlow2 of the same SSISPackage.
The aforementioned link states "You can build a special control flow logic using expressions on precedence constraints to define optional execution paths."
I don't want to create special control flow logic or have 2 separate SSIS packages installed. What would the SQL command in the job be to fire only DataFlow1, please?
I see that in VS I can right-click and disable a specific data flow, and then run the package. I tried to find the command for disabling/enabling a specific data flow, but there is no SQL query for it. Or is it possible to run a SQL query to disable/enable a specific data flow?
This is the SSIS-way to do this, and I'm sorry, but it involves "special control flow logic" because there is no other way:
Add a package-level variable in your SSIS package.
Add a script task to the control flow of your SSIS package. The script doesn't have to do anything. Think of it as a "starting anchor". It will be the first thing that executes in your package.
Add separate precedence constraints (the little arrows that link tasks) from the script task to your two dataflows.
Double-click each precedence constraint and set it to use an expression in its evaluation. Use the variable you created: make one path evaluate to true when the variable is set to "DataFlowA" and the other when it is set to "DataFlowB" (see the sketch after this list).
In the jobs, set the value of the variable to the appropriate value for the dataflow you want to have execute.
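As a concrete hedged sketch (the variable name User::DataFlowToRun and the package path are assumptions): the expression on the first precedence constraint would be

    @[User::DataFlowToRun] == "DataFlowA"

and the second constraint would compare against "DataFlowB" instead. Each SQL Agent job step then overrides the variable when it runs the package, e.g. with dtexec:

    dtexec /FILE "C:\SSIS\SSISPackage.dtsx" /SET "\Package.Variables[User::DataFlowToRun].Properties[Value]";DataFlowA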
This is the answer. I'm sorry it's not what you were hoping for, but the answers to these questions:
what would the SQL command be in the Job to fire only DataFlow1 please?
is it possible to run SQL query to disable/enable specific data flow?
Are, respectively,
There is no such command.
No, it is not possible to programmatically enable and disable tasks.