How to disable a task if the process already ran today - SSIS

I'm trying to automate and streamline a daily process that I've been babysitting for several months. Because our upstream pipeline can be delayed by several hours, I've created a package that checks the pipeline every 30 minutes. Once the pipeline completes, I manually perform steps 2 - 6.
I have stored procedures that will execute steps 3 - 6, but each step can only run once per day and only after step 2 completes.
Step 6 can only run after step 5, as shown in the attached image.
Can someone show me how to disable a task based on the step 1 value?
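One possible approach (a minimal sketch, not from the original thread; the table and step names are assumptions) is to have each step's stored procedure log its completion, and to have step 1 read that log so the package knows which steps already ran today:

    -- Assumed log table dbo.DailyStepLog: each step's stored procedure inserts a row
    -- (StepName, RunDate) when it finishes. An Execute SQL Task in step 1 can run this
    -- query, map AlreadyRanToday to an SSIS variable such as User::Step3AlreadyRan, and
    -- that variable can then drive a precedence-constraint expression or the task's
    -- Disable property expression.
    SELECT CASE
               WHEN EXISTS (SELECT 1
                            FROM dbo.DailyStepLog
                            WHERE StepName = 'Step3'
                              AND RunDate  = CAST(GETDATE() AS date))
               THEN 1
               ELSE 0
           END AS AlreadyRanToday;

The same pattern repeats for steps 4 - 6, with precedence constraints enforcing the "only after step 2" and "step 6 only after step 5" ordering described above.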

Related

Cloud Scheduler with multiple cron schedules?

I have a Scheduler Job within Cloud Scheduler running to call a Cloud Function. It's working as expected; however, I need to create multiple cron schedules for the same job. Is this possible without recreating each one from scratch? There doesn't seem to be a copy function.
Use Case:
Schedule 1 - Every 15 minutes Monday - Thursday
Schedule 2 - Every 15 minutes from Friday start of day until Friday 22:00
Schedule 3 - Every 15 minutes from Sunday 22:00 until Sunday end of day
How can I achieve this, or do I just need to recreate all the schedules from scratch?
Posting this as a Community Wiki as it's based on @GuillaumeBlaquiere's and @Al-dann's comments.
At this point you are going to need to recreate all of those schedules from scratch. However, you can automate the creation of new jobs with a script, using either gcloud or Terraform. This will make duplicating, creating, and deleting jobs easier and quicker, as well as letting you version them.

SSIS run multiple steps side by side

I have a job in SSIS that has 5 steps. The way it currently works is that it goes through the steps in order, waiting for the previous one to complete before starting the next. However, with this job, steps 1-4 could all run at the same time without impacting each other's results. So I was curious whether it is possible to have steps 1-4 all run at the same time and, once all are complete, then start step 5.
I am open to the idea of doing this in other ways, such as having several different jobs, using triggers, or anything else that will get the end result.
The main goal here is to have step 5 start as soon as possible, but step 5 cannot start until all 4 steps are done.
All of these steps are merely running a stored procedure to update a table.
I am using SQL 2012. I am very new to SSIS.
This is what the Sequence Container tool is for.
You can put steps 1-4 in a Sequence Container and let them run in parallel in the container, and then have a Precedence Constraint from the container to Step 5.
In your package, set MaxConcurrentExecutables to, say, 6, and make sure there are no precedence constraints between your tasks.
Then they should run in parallel.
See here for more detail: https://blogs.msdn.microsoft.com/sqlperf/2007/05/11/implement-parallel-execution-in-ssis.
I'm curious - did you try googling this?

Make periodic task occur every 2 seconds

I need to check regularly whether a new message has been received, because the API service I am integrating with does not have a push notification service. How do I set how often a periodic task runs?
I have the boilerplate code (e.g. http://www.c-sharpcorner.com/uploadfile/54f4b6/periodic-and-resourceintensive-tasks-in-windows-phone-mango/) found in any example on the internet, but it seems it can only run roughly every 30 minutes?
Unfortunately, periodic tasks run no more often than every 30 minutes, and they are not even guaranteed to run. If you want to run more often than that, your only bet is setting up a push notification service...

Running multiple instances of an SSIS package via SQLAgent concurrently

I have an SSIS package scheduled to run every X minutes in SQL Agent that subsequently executes a plethora of child packages if certain conditions are met. The problem I am having is that sometimes some of the child packages take a lot longer than X minutes to run, which in turn means that nothing else can run until all of the child packages complete.
This also means that during that run time the conditions to run a child package may have come and gone, meaning it does not run even once the original package completes.
Is there a way to allow concurrent instances of a parent package to run even if it is previously running?
ParentA is scheduled to run every 10 minutes; at 10:00 it kicks off and ChildA's criteria are met. ChildB needs to run at 10:20 am, so at 10:00 its criteria are not met and it does not run. ChildA takes 3 hours to complete.
I need to have a new instance of ParentA kick off at 10:10 and then again at 10:20.
How can I go about doing this without scheduling 2+ copies of ParentA and having to do some fancy coding so that multiple instances of the child packages aren't kicked off?
Thanks
I would change the ParentA package so that it starts once and does not terminate, but has a series of loops that launch the child packages.
Within ParentA I would add a For Loop Container, and within that I would add an Execute SQL Task that uses the SQL WAITFOR command to pause for 10 minutes, followed by running the ChildA package.
I would repeat that pattern for ChildB etc. As each For Loop Container is independent, they will loop independently. Each individual loop container will wait for its child package to complete before looping again.
You need to add some mechanism to stop the looping, e.g. checking the system time, counting the number of loops, or similar. This test would go in each For Loop's EvalExpression property.
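As a rough sketch of the pause step described above (the 10-minute delay comes from the answer; everything else is illustrative), the Execute SQL Task inside each For Loop Container only needs a WAITFOR statement:

    -- Pause this loop iteration for 10 minutes before launching the child package.
    -- WAITFOR DELAY takes an hh:mm:ss duration; use a different value per loop if needed.
    WAITFOR DELAY '00:10:00';

The stop test in EvalExpression could then be, for example, a comparison of GETDATE() against a package variable holding a cut-off time, or a loop counter compared against a maximum.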

Trigger periodic job in Hudson only if last execution of another job is successful

I have a job NIGHTLY that runs once each night on a periodic timer.
Now I want to change it so that the NIGHTLY job only runs if the last execution of another Hudson job, FOO, was successful.
Note:
Job FOO is run many times each day and is triggered by SCM.
NIGHTLY should only be run one time per night and at a specific time.
Currently I have another job, NIGHTLY_TRIGGER, that runs a bash script which accesses the remote API of job FOO to check whether FOO was successful and, if so, triggers the NIGHTLY job.
Is there a nicer/cleaner way to do this? (preferably by using some Hudson plugins)
You could check out the Hudson Join Plugin which is made for this kind of scenario (wait for the conclusion of a job before executing another one).
The end result wouldn't be much different from what you are already doing, but this would be neatly parameterized: you would still have to check the status of the FOO job, but at least you would check it right after the FOO job completes.