Azure Data Factory SSIS Execution Activity - Connection Properties not correct

I am using Azure DevOps CI/CD to deploy pipelines with SSIS Execution Activities. Deployments succeed, and the values in the Settings tab look correct. In particular, the Environment Path is set as expected (example: ./Uat).
The values in the Parameters tab (which contains the values for the Activity's Connection Managers) should display the values specific to the environment to which they are deployed.
However, it seems that the values are only pulled from the database once I click the 'Refresh' button in the Settings tab. At that point, the values match those stored in the SSIS Environment.
Is this just a UI thing? In other words, if I don't refresh each Activity in the ADF tool, can I expect the environment-specific parameter values to be pulled from the SSISDB database and used at runtime, rather than the ones appearing in the Connection Managers?
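For what it's worth, I've been sanity-checking against what's actually stored in SSISDB with something like the query below (the folder and environment names are just my examples):

SELECT f.name  AS folder_name
,      e.name  AS environment_name
,      ev.name AS variable_name
,      ev.value
FROM SSISDB.catalog.environment_variables AS ev
INNER JOIN SSISDB.catalog.environments AS e
    ON e.environment_id = ev.environment_id
INNER JOIN SSISDB.catalog.folders AS f
    ON f.folder_id = e.folder_id
WHERE e.name = N'Uat';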

Related

How to re-establish the link between SSIS parameters configured at the project level and at the job level?

SSIS package parameters can be configured in two places:
Under the Integration Services catalog, on the project (right-click, Configure).
Under the SQL Agent job step's package configuration.
Parameters configured at the project level automatically show up at the job level, and the job level also picks up any parameter changes made at the project level.
Whereas if a change is made to a parameter at the job level, it doesn't update at the project level.
Suppose you change the value of the parameter at the job level. Even if you then reset it to the original value (to match the project-level value), any subsequent changes to this parameter at the project level don't update the value at the job level. Is there any way to re-establish the link between the project-level and job-level parameter?
I think I understand the question but let me know if I've missed the mark.
I have a project parameter, MagicNumber.
In the SSIS project in VS, it's -1.
I deploy to the SSISDB to a folder called Demo and Configure the deployed project to have a value of 0. If I right-click and run that package, or create a job to run the package, it's going to use a value of 0.
In a SQL Agent job, I then re-define MagicNumber to be 1 for all executions of that job. Even if I then go back into Configure the project to have 2 as my MagicNumber, all instances of this job will use 1 because we've specified that we are overriding the configured value.
If I created Job2, which runs the same package, it will use 0 and then 2 as MagicNumber because it's picking up the override from the Project configuration.
Were you to deploy the same original .ispac file to a different folder, Pristine, and run that package, it would use -1 because that's the default configured value.
We put the first layer of configuration on all runs of the package from our project in the Demo folder by specifying MagicNumber as 0 and then 2. But the job allows us to add yet another layer of configuration by using a job-specific level of configuration, 1.
If you decide 1 should be the default value, you update the Configuration for the project in the Demo folder. If you do nothing, the SQL Agent job will still use the locally provided value, so you then need to edit the job to remove the locally configured value and use the project-scoped configuration value.
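If you want to see which job steps are carrying their own override (rather than clicking through each job), the override is embedded as a /Par argument in the step's command text. A quick sketch against msdb, using the MagicNumber parameter from above:

SELECT j.name AS job_name
,      js.step_name
,      js.command
FROM msdb.dbo.sysjobs AS j
INNER JOIN msdb.dbo.sysjobsteps AS js
    ON js.job_id = j.job_id
WHERE js.subsystem = N'SSIS'
  AND js.command LIKE N'%MagicNumber%';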

How does DontSaveSensitive mode and sensitive parameter in case of a deployment work?

Suppose I have a parameter marked as sensitive, and the package's protection level is DontSaveSensitive.
When I save this package, anyone who then opens it will have to re-enter the sensitive info. However, when such a package is deployed, how does the sensitive info get stored on the server when the package's protection level is DontSaveSensitive?
The DontSaveSensitive flag is not carried into the catalog.
You can set/save the parameter's values once it's deployed.
Within the catalog there is a "sensitive" flag which will influence how the UI displays the data and alters how the parameter value is stored.
You can take a peek at the catalog tables to get a closer look at the internals.
SELECT * FROM SSISDB.internal.object_parameters;
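And if you'd rather set a value post-deployment from T-SQL than through the UI, catalog.set_object_parameter_value handles sensitive and non-sensitive parameters alike; the folder, project, and parameter names below are placeholders:

-- 20 = project-level parameter; the catalog encrypts the value when the parameter is flagged sensitive
EXEC SSISDB.catalog.set_object_parameter_value
    @object_type = 20
,   @folder_name = N'Demo'
,   @project_name = N'MyProject'
,   @parameter_name = N'ApiKey'
,   @parameter_value = N'not-a-real-secret';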

Is there any way to create an Excel file and save it or email it?

Is there any way, using SSIS (or any MS SQL Server features), to automatically run a stored procedure, have the output saved as an Excel file (or even a flat file), and then have the newly created file sent to people via email?
Sorry, I'm a complete newbie to SSIS.
In broad strokes, you'll have an SSIS package with two tasks and three connection managers.
The first task is a Data Flow Task. Much as the name implies, the data is going to flow here - in your case, from SQL Server to Excel.
In the Data Flow Task, add an OLE DB Source to the data flow. It will ask what Connection Manager to use, and you'll create a new one pointed at your source system. Change the source from the Table Selector to a Query and then reference your stored procedure: EXECUTE dbo.ExportDaily
Hopefully, the procedure is nothing more than select col1, col2, colN from table where colDate = cast(getdate() as date). Otherwise, you might run into challenges with the component determining the source metadata. Metadata is the name of the game in an SSIS data flow. If you have trouble, the resolution is version dependent - pre-2012 you'd have a null-operation select as your starting point; 2012+ you use the WITH RESULT SETS clause to describe the output shape, as sketched below.
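If you do hit the metadata wall on 2012+, the fix looks something like this - the column names and types are illustrative, matching the hypothetical dbo.ExportDaily above:

-- WITH RESULT SETS declares the output shape up front so the data flow can derive its metadata
EXECUTE dbo.ExportDaily
WITH RESULT SETS
(
    (
        col1 int
    ,   col2 varchar(50)
    ,   colN varchar(50)
    )
);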
With our source settled, we need to land that data somewhere and you've indicated Excel. Drag an Excel destination onto the canvas and again, this is going to need a connection manager so let it create one after you define where the data should land. Where you land the data is important. On your machine, C:\user\pmaj\Documents is a valid path, but when this runs on a server as ServerUser1... Not so much. I have a pattern of C:\ssisdata\SubjectArea\Input & Output & Archive folders.
Click into the Columns tab; there's usually nothing to do here, as it auto-maps source columns to the destination. Sort the target column names by clicking on the header. A good practice is to scroll through the listing and look for anything that is unmapped.
Run the package and confirm that we have a new file generated and it has data. Close Excel and run it again. It should have clobbered the file we made. If it errors (and you don't have your "finger" on the file by having it open in Excel), then you need to find the setting in the Excel destination that says to overwrite the existing file.
You've now solved the exporting data to Excel task. Now you want to share your newfound wisdom with someone else and you want to use email to do so.
There are two ways of sending email. The most common will be the Send Mail Task. You'll need to establish a connection to your SMTP server, and I find this tends to be more difficult in the cloud-based world - especially with authentication and this thing running as an unattended job.
At this point, I'm assuming you've got a valid SMTP connection manager established. The Send Mail Task is straightforward. Define who is receiving the email, the subject, and the body, and then add your attachment.
An alternative to the Send Mail Task, is to use an Execute SQL Task. The DBAs likely already have sp_send_dbmail configured on your server as they want the server to alert them when bad things happen. Sending your files through that process is easier as someone else has already solved the hard problems of smtp connections, permissions, etc.
EXECUTE msdb.dbo.sp_send_dbmail
    @profile_name = 'TheDbasToldMeWhatThisIs'
,   @recipients = 'pmaj@a.com;billinkc@b.com'
,   @subject = 'Daily excel'
,   @body = 'Read this and do something'
,   @file_attachments = 'C:\ssisdata\daily\daily.xlsx';
Besides using an existing and maintained mechanism for mailing the files, the Execute SQL Task is easily parameterized with the ? placeholder, so if you need to change the profile as the package is deployed through dev/uat/prod, you can create SSIS Variables and Parameters, map their values into the procedure's parameters, and configure those values post-deployment.
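A parameterized version might look like the sketch below; each ? is bound, in order, to an SSIS variable or parameter on the Execute SQL Task's Parameter Mapping tab (this assumes an OLE DB connection, which is what uses the ? marker):

EXECUTE msdb.dbo.sp_send_dbmail
    @profile_name = ?
,   @recipients = ?
,   @subject = ?
,   @body = ?
,   @file_attachments = ?;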

How can I dynamically set the location of an Execute Package Task in SSIS

I'm trying to set up a 'master' SSIS package in SQL Server 2008 to run other 'child' packages. When developing the 'child' packages, we have all the packages sitting on disk so we can easily debug them; we use file connectors during development and can monitor progress nicely.
When we deploy, we deploy the child packages to SSIS on SQL Server and then go through and change all the Execute Package Tasks to use a location value of 'SQL Server' and set the PackageName. Once done, we deploy the 'master'.
What I'd like to do is use an Expression on the Execute Package Task to set the connection properties, so we can configure this depending on the environment. We have already set up a SQL Server configuration database, using a view that checks the host name issuing the query and returns different values accordingly.
You have options. You're in the right frame of mind using expressions, but you might benefit from using configurations as well.
To use expressions, you would need to use a Script Task or Execute SQL Task to return the list of files you want to work through.
You would either have to assign each returned value to its own variable that is passed into the expression, or use a Foreach Loop and work through the list, assigning the location of the child package each time.
The other option is to use configurations. My preference is to use a configuration table inside SSIS. If you have the same list of packages in each environment, you could either pass in the root directory and have an expression use that:
@[User::RootPackagePath] + "\\PackageName.dtsx"
Or, you could simply have one record for each child package in the configuration table and that would be passed into the package.
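For reference, the table the SQL Server configuration wizard generates has this shape, and the root path would be one row in it pointed at the variable used in the expression above (the filter value and path below are examples):

CREATE TABLE dbo.[SSIS Configurations]
(
    ConfigurationFilter nvarchar(255) NOT NULL
,   ConfiguredValue     nvarchar(255) NULL
,   PackagePath         nvarchar(255) NOT NULL
,   ConfiguredValueType nvarchar(20)  NOT NULL
);

INSERT INTO dbo.[SSIS Configurations]
VALUES (N'MasterPackage'
,       N'\\fileserver\ssis\packages'
,       N'\Package.Variables[User::RootPackagePath].Properties[Value]'
,       N'String');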
Edit based on comments:
I was successfully able to configure a package to change via configurations to call a package from the file system then SQL.
I only needed to pass the Connection and PackageName for each. With SQL Server, it wants a name from the connection manager (.\SQL2008R2 in my case) and the package name (\Package1). For the file system, PackageName is blanked out and the connection is a File Connection in the connection manager.
You will have to keep both in the package, but you switch between the two.

Reporting Services Expression-based Connection Strings can't have any data driven parameters

I need my reports to have dynamic connection strings. We have multiple database servers and catalogs and only want to maintain a single report file. The only solution I could find that would let me do this programmatically was "Expression-based Connection Strings". Basically, I programmatically pass parameter values in to the report for the ServerName and InitialCatalog.
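For context, the expression-based connection string looks something like this, built from the two report parameters mentioned:

="Data Source=" & Parameters!ServerName.Value & ";Initial Catalog=" & Parameters!InitialCatalog.Value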
This works for simple reports, although it's not ideal, as modifying the report requires changing the connection to a hardcoded connection string and then switching back to the expression-based one when I want to save and publish.
HOWEVER, this does not work for reports that have data-driven parameters. For example, I have a report that filters data based on a "City" parameter that the user selects when they first open the report. The City parameter is fed data from a query. It seems that I can't just set the connection parameters and let Reporting Services query for the City parameter.
I'm open to ideas here other than "Expression-based connection strings".
Thanks.
I had the same problem. The solution was surprisingly simple - just move your "ServerName" and "InitialCatalog" parameters to the top of the parameter list.
A possible option is to create a deployment script (which uses rs.exe) and deploy multiple versions of the report. In the deployment script you can update the data source of the report. Your source control would still only have one report, and each time you release it you run the script to update the multiple copies you have in production.