Can someone suggest a better way to use configuration files in multiple environments - SSIS

I want to standardize and parameterize values across multiple environments and not have to change the dtsx files at any point.
The pattern I have decided on is to run all packages through the DTEXEC program, specify the configuration file on the command line, and put all of that in a batch file, with a different batch file for each environment.
One requirement is that the configuration files cannot live at the same physical drive location on every machine, i.e. all config files in D:\SSIS\config files. The main reason is that the production machine has an E drive mapped, and that is where the SSIS packages live and operate from, while the staging machine does not, and cannot, have a drive mapped to E.
Also, we want all files to follow the same layout across all environments: config files in one place, archive files in another, etc. And we want to use a single medium, meaning the file system is where we store the packages, config files, and batch files, as opposed to keeping data and artifacts in the registry or in environment variables.
Does anyone see a more direct approach that satisfies all the conditions?
There may not be one and I thank you for your time...

That's how we're doing it - all config files on the file system, running packages using batch files that call dtexec, and passing config file locations to dtexec via parameters.
Watch out for a possible nasty gotcha, though. As this Books Online article points out, dtexec's behavior regarding command-line configurations changed between SSIS 2005 and SSIS 2008.
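A minimal sketch of that pattern, with hypothetical paths and package names (only /FILE, /CONFIGFILE, and /REPORTING are dtexec switches taken from this thread; adjust everything else to your own layout): one batch file per environment, each pointing dtexec at that environment's config file.

```bat
rem run_load_staging.bat -- staging environment only
rem the D:\SSIS\... paths are hypothetical; keep the same layout on every box
dtexec /FILE "D:\SSIS\Packages\LoadPackage.dtsx" ^
       /CONFIGFILE "D:\SSIS\Config\Staging.dtsConfig" ^
       /REPORTING E
if errorlevel 1 echo LoadPackage failed & exit /b 1
```

A production copy of this file would be identical except for the /CONFIGFILE path, which is what lets the .dtsx files stay untouched across environments.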

Related

Using Configuration File instead of System Registry

The Portal UI React application makes use of Registry settings instead of a local settings.json file in order to run the application in the local environment. This is a pain for the developer, because every time a Registry setting is updated the system needs a restart, which is not an advisable approach in this fast-moving development world. There is less flexibility and more dependency when using Registry settings instead of a local JSON-based configuration file.
I propose moving all of the configuration values into a local JSON file and checking that file into the application's repository.
If there is any other approach that would make this scenario easier to work with, please share your thoughts.
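For example, a minimal settings.json checked into the repository might look like this (the keys are hypothetical); the app would read it at startup instead of the Registry, so a change only needs a file save rather than a system restart:

```json
{
  "apiBaseUrl": "https://localhost:5001/api",
  "enableDebugLogging": true,
  "featureFlags": { "newPortalNav": false }
}
```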
Thanks
Iftekhar

Difference Between File System Task & FTP Task in SSIS

I want to know the exact difference between the FTP Task and the File System Task. I have gone through some tutorials but didn't find any major differences.
Can anyone help me out with this?
They provide similar functionality but use completely different mechanics:
File System Task is used to handle local or network files (accessible, mapped drives). You can also change file properties, for example to hide a file or make it read-only.
FTP Task uses the File Transfer Protocol to interact with an FTP server to move, create, or delete files. This protocol involves a lot of communication procedures and handling to perform these operations.
For FTP communication you need a client and a server (that is, software running all the time, listening for connections on one end), while with the File System Task, SSIS just interacts with the OS file system directly.

Windows Universal Apps: storing configuration

I come from web development, where apps can have multiple config files for storing things like DB connection strings, remote server endpoints, passwords, and so on.
So you have files like base.config, development.config, production.config, local.config, and so on.
The correct config file is loaded according to the environment the app is running in.
Is there any such system for Windows Phone and Windows Store apps?
If so, how can I define different configs for different runtimes such as debug and production?
I would really like to avoid storing runtime config in code and then using crazy ifs.
There isn't a built-in system for this, but it's pretty easy to roll your own. Create and read a file with your config information, then create different files for the different configurations, and add a pre-build step that copies the appropriate file for the desired configuration.
I'd probably name the files all the same but put them in different directories named for the $(Configuration), then copy from the $(Configuration) directory in my pre-build step.
See Pre-build Event/Post-build Event Command Line Dialog Box on MSDN
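A pre-build event along these lines would do the copy (the paths are hypothetical; $(Configuration) expands to Debug, Release, etc.):

```bat
rem copy the per-configuration settings file into the project before packaging
xcopy /y "$(ProjectDir)Config\$(Configuration)\settings.json" "$(ProjectDir)Assets\"
```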
There isn't an easy way to switch this at runtime since you can't write to the appx package after it's signed and deployed.

Overriding SSIS Environment variable

I have set up a Package Configuration where the Configuration Type is (Indirect XML Configuration File), and the environment variable points to the C:\SSIS\MasterConfig\MasterConfig.dtsConfig file.
This works great and allows easy migration from Test to UAT to Production.
The problem is in our Development environment, where each developer has their own DB. I am trying to set up an Agent Job for each developer that would override the master configuration file. The Agent Job command line is:
/FILE "C:\SSIS\Packages\Developer1\LoadPackage.dtsx" /CONFIGFILE
"C:\SSIS\Packages\Developer1\Developer1_Config.dtsConfig" /X86
/CHECKPOINTING OFF /REPORTING E
Using 2008 R2.
I was expecting that the /CONFIGFILE
"C:\SSIS\Packages\Developer1\Developer1_Config.dtsConfig" would be
used instead of the C:\SSIS\MasterConfig\MasterConfig.dtsConfig file.
Only the master config file is being used. Any ideas?
The problem you are running into is covered under Defining a Configuration Approach for Integration Services Packages. Basically, the command-line configuration is applied, but then overwritten by the design-time value.
I'm not sure what your solution would be, though. We were in a semi-similar situation, except we had to deal with multi-instanced boxes (dev and test shared a physical host). To get around that, we skipped the environment variable bit and decided that the design-time value would always point to dev. Test, UAT, and PROD would only run packages via SQL Agent, so our jobs explicitly define the connection string to the configuration resource. Yours is the reverse scenario, though: the design-time value is fine everywhere but dev.
Here's what we do.
All variables are named the same whether it points to Production, QA or Dev (or even to local). What we change are the variable values pointing to the config files.
We create config files that contain all of the appropriate connection information for each box, so we'll have at least 3 separate config files for each database. We name them DB_Prod.config, DB_QA.config, DB_Dev.config, and then DB_Joe_local.config (if you want one that points to your local DB).
We then create .bat files that set our variables to the right environment. I have 3 different ones: one for QA, one for dev, and one for my local environment.
The environment variables are all named things like DB1_adonet, DB1_ole, AS400, etc., with no indication of QA, prod, etc.
It's a slight pain in the ass to set up, but once you start using it, the only issue is remembering what environment you've set yourself to. Also, remember that you need to close and reopen dev studio between environment changes, as the values are cached. And if you happen to run a package locally, it will use your environment variable values, not those of the box you are running it from.
Final note, we have all of the config files checked into TFS. Makes for easy distribution of files.
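As a sketch, one of those per-environment .bat files might look like this (the variable names follow the DB1_adonet/DB1_ole convention mentioned above; the config file paths are hypothetical):

```bat
rem set_env_QA.bat -- point the indirect-configuration variables at the QA files
rem setx writes user environment variables, so reopen dev studio afterwards
setx DB1_adonet "C:\SSIS\Config\DB1_adonet_QA.config"
setx DB1_ole "C:\SSIS\Config\DB1_ole_QA.config"
setx AS400 "C:\SSIS\Config\AS400_QA.config"
```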

How to export WAS 6.1 server Configuration

Is there a way in which I can export my server settings from WAS (running under RAD 6) such that other developers will be able to use the same script to set up their environment?
To do this manually in RAD 6.x, simply right-click the server name in the "Server" view and choose one of:
Export server configuration to server
Import server configuration from server
The choice of wording here is potentially confusing. An import takes a configuration from the already-configured server and imports it into your workspace as a Configuration Archive (.car) file. An export asks for the location of a Configuration Archive (which must be in your workspace) and exports the settings it contains onto your server.
Yes, I agree that this sounds completely backwards.
Fortunately, the names are much more sensible in RAD 7.x. The options are:
Server configuration -> Backup...
Server configuration -> Restore...
These behave just as you'd imagine (Backup creates an archive file and Restore imports settings from an existing archive file.)
Important note: This process will not export service integration buses. However, I have had success including buses with the following steps:
Export a CAR file
Rename to .zip file for easy viewing
Manually copy the following files from your server profile into the archive:
cells/<cell_name>/buses/*
cells/<cell_name>/nodes/<node-name>/servers/server1/sib-engines.xml
Rename the archive back to .car
Note that this process is probably highly dependent on my specific configuration, but seems worth mentioning, since it has saved me a lot of trouble.
Another tip: Any files and folders you place inside the CAR will be dumbly copied into your profile directory whenever restoring a server configuration from that archive. This is convenient, because you can include necessary third-party libraries in the CAR file and reference them via WAS variables relative to your profile directory, resulting in one less thing for developers to download or configure.
You can export and import a profile with all of its configuration using the AdminTask export and import commands with the wsadmin scripting tool. If you are also serious about how you release applications to production environments, you should probably create wsadmin scripts for deploying all of your required settings in any case.
Alternatively, you might consider distributing virtual machines, or simply copying the server installation from a reference installation.
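As a sketch of the wsadmin route (the archive path is hypothetical; exportWasprofile/importWasprofile are the configuration-archive AdminTask commands, but verify they are available at your WAS level):

```python
# run via: wsadmin -lang jython -f export_profile.py
# exports the current profile's configuration to a Configuration Archive (.car)
AdminTask.exportWasprofile('[-archive C:/archives/devProfile.car]')

# and on another developer's machine:
# AdminTask.importWasprofile('[-archive C:/archives/devProfile.car]')
```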