Overriding SSIS Environment variable - ssis

I have set up a Package Configuration where the Configuration Type is "Indirect XML Configuration File", with the Environment Variable pointing to the C:\SSIS\MasterConfig\MasterConfig.dtsConfig file.
This works great and allows easy migration from Test to UAT to Production.
The problem is in our Development environment, where each developer has their own DB. I am trying to set up an Agent Job for each developer that would override the Master Configuration File. The Agent Job command line is:
/FILE "C:\SSIS\Packages\Developer1\LoadPackage.dtsx" /CONFIGFILE
"C:\SSIS\Packages\Developer1\Developer1_Config.dtsConfig" /X86
/CHECKPOINTING OFF /REPORTING E
Using SQL Server 2008 R2.
I was expecting that the /CONFIGFILE
"C:\SSIS\Packages\Developer1\Developer1_Config.dtsConfig" would be
used instead of the C:\SSIS\MasterConfig\MasterConfig.dtsConfig file.
Only the Master Config file is being used. Any ideas?

The problem you are running into is covered under Defining a Configuration Approach for Integration Services Packages. Basically, the command-line configuration is being applied, but it is then overwritten by the design-time value.
I'm not sure what your solution would be, though. We had a semi-similar situation here, except we had to deal with multi-instanced boxes (dev and test shared a physical host). To get around that, we skipped the environment variable bit and decided that the design-time value would always point to dev. Test, UAT and PROD would only be running packages via SQL Agent, so our jobs explicitly define the connection string to the configuration resource. Yours is a backwards scenario, though: the design-time value is fine everywhere but dev.
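For reference, the way our Agent jobs pin the connection explicitly looks roughly like the sketch below, which just assembles the dtexec arguments. The connection manager name MasterDB, the server name, and the paths are all hypothetical; verify the /SET precedence behavior against the Books Online article above for your version.

```shell
# Assemble a dtexec invocation that forces the configuration connection
# string at run time via /SET (all names and paths here are examples).
PKG='C:\SSIS\Packages\Developer1\LoadPackage.dtsx'
PROP='\Package.Connections[MasterDB].Properties[ConnectionString]'
VALUE='Data Source=DEV1;Initial Catalog=Developer1DB;Integrated Security=SSPI;'
CMD="dtexec /FILE \"$PKG\" /SET \"$PROP\";\"$VALUE\""
echo "$CMD"
```

Because /SET targets a specific property path rather than loading a whole configuration file, it sidesteps the "configuration applied, then overwritten" problem described above.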

Here's what we do.
All variables are named the same whether it points to Production, QA or Dev (or even to local). What we change are the variable values pointing to the config files.
We create config files that have all of the appropriate connection information for each box, so we'll have at least 3 separate config files for each database. We name them DB_Prod.config, DB_QA.config, DB_Dev.config, and then DB_Joe_local.config (if you want to have one that points to your local db).
We then create .bat files that set our variables to the right environment. I have 3 different ones: one for QA, one for dev, and one for my local environment.
The environment variables are all named things like DB1_adonet, DB1_ole, AS400, etc., with no indication of QA, prod, etc.
It's a slight pain in the ass to set up, but once you start using it, the only issue is remembering what environment you've set yourself to. Also, remember that you need to close and reopen dev studio between environment changes, as the values are cached. And if you happen to run a package locally, it will use your environment variable values, not those of the box you are running it from.
Final note, we have all of the config files checked into TFS. Makes for easy distribution of files.
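To make the pattern concrete, here is a sketch of one of those environment-switching scripts. The real ones are Windows .bat files; this is a POSIX stand-in, the config path is hypothetical, and the variable names match the ones above.

```shell
# POSIX sketch of a "switch to QA" script (real ones are .bat files;
# the config directory is a hypothetical example).
CFG_DIR='C:/SSIS/Config'
DB1_adonet="${CFG_DIR}/DB_QA.config"
DB1_ole="${CFG_DIR}/DB_QA.config"
AS400="${CFG_DIR}/AS400_QA.config"
export DB1_adonet DB1_ole AS400
echo "Environment now points at: $DB1_adonet"
```

Switching to dev or your local DB is the same script with DB_Dev.config or DB_Joe_local.config substituted; remember to restart dev studio afterwards, since the values are cached.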

Related

What techniques exist for ensuring production environment variables are persisted in some form within a project?

Apologies for title phrasing; I'm sure it could be clearer.
In the Twelve-Factor App methodology, we are encouraged to store web app configuration using environment variables. When using a managed platform such as Heroku, this configuration is safely persisted as a feature of the platform, automatically made available to each deployment, and readily inspectable by developers. This feature is assumed to be stable and, as far as I know, no separate copy of production config need be maintained elsewhere.
When using a simpler unmanaged deployment process, e.g. git push-ing non-containerised code to a VPS, environment variables can still be used (e.g. a non-source-controlled .env file) but they are now effectively ephemeral, and if the VPS is destroyed through some error or incident, the project can be redeployed elsewhere but the configuration variables will need to be reconstructed from something.
My question is, in such a scenario, what is considered best practice around what that "something" should be? When joining a new project I can often cp .env.example .env to set up a typical local configuration. The values in the example file are usually safe to save in source control. However, I don't know where (if anywhere) I should be saving production configuration in order that I could configure a new production deployment of the kind described above. In the Heroku example, the configuration can always be inspected. But in the VPS example, if that running VPS is the only location where the complete production configuration exists, its unexpected disappearance presents a problem.
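For concreteness, the kind of local setup I mean is sketched below; the variable names are made-up examples.

```shell
# Sketch of the usual local bootstrap (variable names are examples only).
cat > .env.example <<'EOF'
DATABASE_URL=postgres://app:password@localhost:5432/app_dev
SECRET_KEY=changeme
SMTP_HOST=localhost
EOF
cp .env.example .env   # each developer then edits .env; .env itself is gitignored
```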
Obviously any credentials in the config could be regenerated, but that could quickly turn into a non-trivial exercise. I'm wondering how more experienced folks deal with this issue. Thanks!

SSIS 2012 Passing Parameters / Passwords with Dont Save Sensitive

I have got a package and I want to pass the username, password and server name via Project Parameters. I managed to set it up, deployed it to the SSIS server, and it ran successfully there.
However, as soon as I set the Protection Level to 'DontSaveSensitive', I couldn't run the package on my development PC anymore.
After changing that, the package cannot access the Database anymore, and the Project Parameters are no longer tied to the package.
In SSIS 2008, we used Package Configuration XML files, and with that XML file we could run in both the Development and Live environments at the same time.
Is there any way to achieve the same in SSIS 2012?
Your package needs to have a Parameter for each of the Project Parameters you are trying to pass.
Then your Connection Managers need to use those parameters, usually as Expressions that form a ConnectionString property.
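For example, the ConnectionString expression on a connection manager might look like this (a sketch; the parameter names are hypothetical):

```
"Data Source=" + @[$Project::ServerName]
  + ";Initial Catalog=" + @[$Project::DatabaseName]
  + ";User ID=" + @[$Project::UserName]
  + ";Password=" + @[$Project::Password] + ";"
```

One caveat: if the password parameter is marked sensitive, SSIS will not let you use it in an ordinary expression; in that case you may need to assign it to the connection manager's Password property (a sensitive property) instead.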
You need to either:
Store the password in a config file (not the best idea, but it does work), or
Mark the project parameter as sensitive. Then you have to use things like the GetSensitiveValue method to decrypt that data.

Can someone suggest a better way to use configuration files in multiple environments

I want to standardize and parameterize values across multiple environments and not have to change the dtsx files at any point.
The pattern I have decided on is to run all packages via the DTEXEC program, specify the configuration file on the command line, and put all of that in a batch file, with a different batch file for each environment.
One requirement is that the location of the configuration files cannot be the same physical drive location on every machine (i.e. all config files in D:\SSIS\config files). The main reason is that the production machine has an E drive mapped, and this is where the SSIS packages live and operate from, while the staging machine does not, and cannot, have a drive mapped to E.
Also, we want all files to reside in same pattern across all environments. config files in one place, archive files in another, etc. And, to try to use one medium, meaning the file system is where we store the packages, config files and batch files, as opposed to having data and artifacts in the registry and environment variables.
Does anyone see a more direct approach that satisfies all the conditions?
There may not be one and I thank you for your time...
That's how we're doing it - all config files on the file system, running packages using batch files that call dtexec, and passing config file locations to dtexec via parameters.
Watch out for a possible nasty gotcha, though. As this Books Online article points out, dtexec's behavior regarding command-line configurations changed between SSIS 2005 and SSIS 2008.
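A sketch of one of those per-environment wrappers is below (shown in POSIX form for brevity; the real ones are .bat files, and all paths are hypothetical). The only line that differs per box is the drive letter, which also sidesteps the E-drive mapping problem:

```shell
# run_etl_staging.sh: wrapper for the staging box.
DRIVE='D:'                      # 'E:' on production; only this line differs
PKG_DIR="${DRIVE}/SSIS/Packages"
CFG_DIR="${DRIVE}/SSIS/Config"
CMD="dtexec /FILE \"${PKG_DIR}/LoadPackage.dtsx\" /CONFIGFILE \"${CFG_DIR}/LoadPackage_Staging.dtsConfig\""
echo "$CMD"
```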

Can a TeamCity build agent be configured to only run builds with a particular parameter dependency?

I have a TeamCity build agent installed on a machine which in theory is dedicated to running dynamic security scans, and I don't want it doing anything else (e.g. running the duplicates finder).
Short of either creating custom agent configuration properties and then customising each build's agent dependencies (which perhaps strictly speaking I should be doing anyway), or configuring the agent to only run selected configurations, is there any way to avoid this? Both of these approaches require additional configuration on every single build.
In a perfect world, I'd like to be able to tell the agent to only ever run builds which match a particular agent dependency. Is this possible or am I coming at it from the wrong direction?
I'm afraid TeamCity doesn't provide a way to specify that an agent can run only configurations with a specific property (and not run other configurations).
So, there are only two ways to restrict agents: either with agent requirements, or by configuring the agent to only run selected configurations.
You could probably try to make some batch change in your build configuration properties, because all build configuration settings/properties are stored in XML files on disk.
In current versions of TeamCity (e.g. 8.1) you can create a pool just for your security machine, and only assign the one machine to that pool, remembering to remove it from other pools.
Then you can assign the security project to that pool. That should solve your problem.

deployment to another server

How do I deploy an SSIS package to another server? What are the steps to be done for deployment to another server?
We use configurations (ours are in the database, but it's simpler to start with file configurations). In the SSIS menu, choose Configurations and set them up for dev, creating a file. Test to make sure the configurations work properly. Then open the file and save a QA version after editing it for the QA locations, and a prod version after editing for the prod locations.
Then copy the config file to its designated location and the SSIS package to its designated location (again, we use files for this, not SSMS directly, although I think you can use SSMS if it is set up for this).
Then schedule a job telling it to run the package.
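The copy steps can be scripted. Below is a sketch; local temp directories stand in for the real target shares, and the file names are hypothetical examples.

```shell
# Sketch of the deploy copy step; mktemp dirs stand in for the real shares.
SRC=$(mktemp -d)      # build output (edited config + package)
DEST=$(mktemp -d)     # target server's designated locations
mkdir -p "$DEST/Packages" "$DEST/Config"
touch "$SRC/LoadPackage.dtsx" "$SRC/LoadPackage_QA.dtsConfig"
cp "$SRC/LoadPackage.dtsx"         "$DEST/Packages/"
cp "$SRC/LoadPackage_QA.dtsConfig" "$DEST/Config/"
ls "$DEST/Packages" "$DEST/Config"
```

Keeping packages in one designated folder and configs in another, as above, matches the file-based layout described in the earlier answers.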