SSIS Passing Parameters from Master Package to Child to Data Flow Task

I have been working with a co-worker to set up master and child SSIS package templates. These templates look into an audit table to identify the correct batch or create a new one. The start and end times of the batch are passed into the CDC (change data capture) table-valued functions, which return the start and end LSNs for the table.
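For reference, a minimal sketch of that LSN mapping (the capture instance name dbo_MyTable is hypothetical; in the child package the batch dates arrive through the package-level parameters rather than literals):

    -- Batch window; in the child package these values come from the
    -- package-level parameters (e.g. mapped to ? placeholders in the source).
    DECLARE @BatchStart datetime = '2024-01-01 00:00:00',
            @BatchEnd   datetime = '2024-01-02 00:00:00';
    DECLARE @FromLsn binary(10), @ToLsn binary(10);

    -- Map the batch start/end times to LSNs.
    SELECT @FromLsn = sys.fn_cdc_map_time_to_lsn('smallest greater than or equal', @BatchStart),
           @ToLsn   = sys.fn_cdc_map_time_to_lsn('largest less than or equal', @BatchEnd);

    -- Source query for the data flow: all changes captured during the batch window.
    SELECT *
    FROM cdc.fn_cdc_get_all_changes_dbo_MyTable(@FromLsn, @ToLsn, N'all');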
This works by identifying the start and end dates of the batch in the master package. These dates are then set into user variables in the master package. The master package then calls a child package and passes the start-time and end-time user variables to the child package's package-level parameters. At this point I can print the dates, or write them to a scratch table, from the child package, and they show the correct values.
Next I want to add a Data Flow Task that uses the child package's package-level parameters (passed from the master package) to run the table-valued function for the correct time period. This should return the start and end LSNs to be used in the source query. Unfortunately, what I am seeing is that the Data Flow Task never executes anything inside the data flow. The Data Flow Task gets a nice green check mark, but if you open the data flow, none of its components have executed.
We are scratching our heads on this one and have even created a simple proof of concept that passes a value from a master package to the child and then attempts to print it from the child package's Data Flow Task, but it never executes.
Any ideas what could be causing the data flow to be skipped while still reporting success?

After a lot of headache and frustration I finally found the answer, or rather my co-worker did. In the past I have always created standalone packages that get called from a job in SSMS. When developing individual packages I typically add either a Data Conversion or a Derived Column component at the end of the last step completed in the data flow. I then set up a data viewer on the path to the Data Conversion/Derived Column. This lets me check the data before moving on to the next piece of the data flow.
I was using this tactic while working with a master package that calls the child package I was working on, and it was the source of the problem. When the master package calls the child package, the engine identifies that these trailing components in the data flow are not used later in the package and skips them. (This matches the Data Flow Task's RunInOptimizedMode behavior, which removes unused components, outputs, and columns at run time.)
To summarize: if you are using a master package to call a child package, make sure all pieces of the data flow are used later in the package, or they will be skipped during processing.

Related

System::ContainerStartTime is not working

I have an SSIS package with a few tasks (Execute SQL Tasks, a Data Flow Task). I have two event handlers (OnPostExecute, OnError) for custom logging at the package level.
I am using System::ContainerStartTime for the TaskStartTime (datetime) column in the OnPostExecute event handler, but it is not inserting the correct start time: all the tasks get nearly the same time, differing only by milliseconds, even though the tasks actually start minutes apart.
It looks to me like a static variable that is set at the start of the package. If that is the case, what is the difference between System::StartTime and System::ContainerStartTime?
This variable should show the same time that appears against each task as its start time in the Execution Results tab in Visual Studio.
Please let me know how I can get the correct start time of each task in the package.
Thanks,
Zaim Raza.
ContainerStartTime is the start time of the container holding the task, not of the tasks themselves.
I'm afraid there is no system variable for start time at the task level, per the Microsoft documentation here:
https://learn.microsoft.com/en-us/sql/integration-services/system-variables
One thing you can do is save the current time to a user variable at the beginning, or immediately before each task being monitored, and use that variable in your event handler.
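A hedged sketch of that approach (the table and variable names are hypothetical): set a User::TaskStartTime variable with an Expression Task (or a Script Task) placed immediately before the monitored task, evaluating @[User::TaskStartTime] = GETDATE(). Then let the OnPostExecute event handler run an Execute SQL Task such as the one below, with System::SourceName and User::TaskStartTime mapped to the two placeholders on the Parameter Mapping tab:

    -- Hypothetical logging table. Map System::SourceName (the task that just
    -- finished) and User::TaskStartTime to the two ? placeholders.
    INSERT INTO dbo.TaskExecutionLog (TaskName, TaskStartTime, TaskEndTime)
    VALUES (?, ?, GETDATE());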

Executing multiple SSRS reports from SSIS package

I have developed an SSIS package to run 3 reports from Reporting Services that are data-driven subscriptions.
When I run the SSIS job it executes all 3 reports at once; what I need is to run the reports sequentially, in other words one by one. How can I do this?
This is expected behavior. When you trigger a data-driven subscription job, SQL Server Agent starts the job, and as far as the caller is concerned that completes the whole transaction. The SSIS package then goes on to trigger the next data-driven subscription job, and the next (assuming you have put the job-triggering steps in sequence).
Now if you want to create a dependency in the way the jobs run, i.e. Job1 followed by Job2 followed by Job3, you need to write an additional piece of code yourself. The way to go about it is to monitor the status of each subscription.
In the ReportServer database there is a table called dbo.Subscriptions containing a column LastStatus. Currently in my local db I don't have any subscriptions, and I am not able to find any documentation for the table, but I am fairly sure this is either a boolean or a status string such as 'Success' or 'Failure'. Upon triggering the first job, you would need to write .NET (or T-SQL) code to monitor this status at a polling interval. Once you get the desired outcome, move on to triggering the next job.
Hope this is clear; a rough example follows.
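Hedged sketch of the trigger-and-poll approach (the GUIDs below are placeholders; data-driven subscriptions run as GUID-named SQL Server Agent jobs, and the exact LastStatus text should be verified against your own ReportServer):

    -- Start the Agent job behind the first subscription (GUID-named; placeholder here).
    EXEC msdb.dbo.sp_start_job @job_name = N'00000000-0000-0000-0000-000000000000';

    -- Poll the subscription status until it reports completion.
    DECLARE @Status nvarchar(260);
    WHILE 1 = 1
    BEGIN
        WAITFOR DELAY '00:00:30';  -- polling interval

        SELECT @Status = LastStatus
        FROM ReportServer.dbo.Subscriptions
        WHERE SubscriptionID = '00000000-0000-0000-0000-000000000000';  -- placeholder

        -- Completed runs typically read like 'Done: 1 processed of 1 total; 0 errors.'
        IF @Status LIKE N'Done:%' BREAK;
    END

    -- Repeat for the second and third subscriptions.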

SSIS Always run a specific step last without connecting to every parent

I have an SSIS package with about 50 different steps. Many of the steps will run in parallel.
I would like to put a Script Task at the end of the package such that, regardless of success or failure, I send out an email reporting the status of the package run (the package logs to a database as it runs, so the Script Task involves querying the status tables and formatting the results into HTML).
Is there a way to have a task that executes when all other tasks have either completed running, or been skipped due to dependencies?
I have tried adding my task to the OnError event handler, but it then fires up to n times when tasks fail in parallel. I don't want to connect every single step to the Script Task with an OnFailure constraint, because the volume of connector lines would make the package functionally impossible to work with. I also believe that connecting it that way would not ensure it ran, as some steps would be skipped and thus have neither a success nor a failure status.
You can do this in two different ways.
Option a
Add a Sequence Container to your Control Flow.
Move all existing tasks inside the Sequence Container.
Add your Script Task outside the Sequence Container and connect the container to it. Change the precedence constraint from Success to Completion by double-clicking the line.
Option b
Create a package that has your Script Task in it.
Create a master package that runs the original package and then calls the package with the Script Task, passing any variables it needs to generate the email.

File Watcher Job Using SSIS

I am using SSIS for ETL and I need to monitor a source folder for a source file to arrive. Whenever a file arrives I need to move it to another location, rename it, and start executing another SSIS package. We don't have the option to use any other tool to automate the execution; our only choice is SQL Server and SSIS.
I need the mechanism and the logic to implement this.
I'm assuming that by "File Watcher" you don't mean the FileSystemWatcher class in .NET, as there wouldn't be any point in using that class if you're limited to SQL Server and SSIS (you'd need a job with a forever-running SSIS package containing a Script Task that hosts the FileSystemWatcher).
The only solution, then, is to create a two-step job. The first step would contain an SSIS package that reads the directory contents and compares them to a file-history log. The second step would contain your main package and would execute only if the first step succeeds or returns a value indicating that there are new files to process.
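As a hedged sketch (the table names are hypothetical): if the first package's Foreach Loop stages the current directory listing into a table, the "new files?" check reduces to an anti-join against the history log:

    -- dbo.FileStaging is loaded by the first package's Foreach Loop;
    -- dbo.FileHistory records files that have already been processed.
    -- Rows returned here are new files; the job's second step should run
    -- only when at least one row comes back.
    SELECT s.FileName
    FROM dbo.FileStaging AS s
    WHERE NOT EXISTS (
        SELECT 1
        FROM dbo.FileHistory AS h
        WHERE h.FileName = s.FileName
    );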
Your answer is here and here. My personal favorite way of doing it is to have an infinite-loop package. Yet another way would be to encapsulate the entire logic in an SSIS package and fire it every X minutes, varying X depending on the urgency.

Task Execution Working Correctly. Package Execution Not Working Correctly

I have created an SSIS solution (using SQL Server 2012) to extract data from ServiceNow. All the tasks are wrapped in a Sequence Container. When I right-click the Sequence Container and click Execute Task, the package behaves as expected: it extracts the data from ServiceNow and performs the rest of the ETL. However, when I click Execute Package from the Solution Explorer, the package completes successfully but does not extract any data from ServiceNow.
I have played around with settings and different package designs with no change in behavior. I can execute with true success at the task level but not at the package level. This behavior persists even after deployment: I can execute with success from the SSISDB, but with no data extraction, and when I hook the deployed package to a job I still get no data extraction.
My thinking is that it has to be some kind of bug or hidden proxy setting, because I only get true success (with data extraction) when I manually execute at the task level, i.e. the Sequence Container.
I have been using SSIS for a couple of years now and have not come across this issue before. Hopefully someone can help or has some ideas.
Got it. I needed a break and to think about this clearly. I needed to update the package property DelayValidation from False to True. That solved the issue. DelayValidation postpones validation until the package actually executes, so the variables and expressions set up within the package are resolved at run time rather than being validated with their design-time values.