SSIS Deployment: Dev Stage Live AppSettings - sql-server-2008

The main problem is: how do I incorporate an appSettings.Config file with a particular build (dev, stage, live)? My appSettings.Config changes the connection strings for data sources based on which server the package is being deployed to. I am able to go through Package Configurations and add my appSettings.Config; however, I can only add one specific file: dev, stage, or live. What I need is to build the solution and, based on the build type, incorporate the dev/stage/live appSettings. How can I do this?

You could include all of the configuration files in the install and then just point to the correct one through an environment variable. I know you want to switch the configuration file based on the solution build configuration, but you'd be looking at a complex solution when a simpler alternative exists.
It's quite straightforward to add registry information during the package install that sets the machine's environment variable under the key:
HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment\MyVariable
...to the path of the .dtsConfig for the current environment.
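For example, a hedged sketch of setting such an environment variable from an install script using reg.exe (the .dtsConfig path is a placeholder; MyVariable is the variable name from the key above, which your SSIS package would then reference as an indirect configuration):

rem Point the machine-level environment variable at the per-environment .dtsConfig
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Session Manager\Environment" ^
    /v MyVariable /t REG_SZ /d "D:\Configs\Dev\appSettings.dtsConfig" /f

Note that already-running processes may not see the new value until they are restarted or the environment change is broadcast.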

Related

Azkaban job configuration per environment

I plan to use Azkaban (https://azkaban.github.io/) for running batch jobs. Following our CI setup, we have a few environments such as dev, test, stage, and production, and of course a job should have a different configuration for each environment.
According to the Azkaban documentation (http://azkaban.github.io/azkaban/docs/latest/#job-configuration), Azkaban allows replacing parameters wherever a ${parameter} is found. The solution looks like this:
system.properties
myFlow/
    dev.properties
    ....
    prod.properties
    foo.job

# system.properties
env=dev

# dev.properties
dev.database=localhost:2181

# prod.properties
prod.database=aws:port

# foo.job
some command --db ${${env}.database}
Later, on each environment, we can override the env variable. From my point of view this solution looks strange. Can I just tell Azkaban which property file should be used on each environment?
What is the best approach to do this?
Another approach can be found here: https://github.com/azkaban/azkaban/issues/1545#issuecomment-339750417
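If you prefer to keep a single set of property files and only flip env per environment, one option is to override it at execution time rather than in the project itself. A hedged sketch using Azkaban's executeFlow AJAX API (host, project, flow name, and session id are placeholders; flowOverride is the documented mechanism for passing flow-level property overrides):

curl -k -G "https://azkaban-host:8443/executor" \
    --data-urlencode "ajax=executeFlow" \
    --data-urlencode "session.id=<session-id>" \
    --data-urlencode "project=myProject" \
    --data-urlencode "flow=myFlow" \
    --data-urlencode "flowOverride[env]=prod"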

How to pass directives to snappy_ec2 created clusters

We need to set some directives in the snappy config files for the various components (servers, locators, etc.).
The snappy_ec2 scripts do a good job of creating all of the configs and keeping them in sync across the cluster, but I need to find a serviceable method to add directives to the auto-generated scripts.
What is the preferred method using this script?
Example: Add the following to the 'servers' file:
-gemfirexd.disable-getall-local-index=true
Or perhaps I should add these strings to an environment file such as
snappy-env.sh
TIA
-doug
Have you tried adding the directives directly in the servers (or locators or leads) file and placing this file under (SNAPPY_DIR)/ec2/deploy/home/ec2-user/snappydata/? The script would read the conf files under this dir at the time of launching the cluster.
You'll need to specify it for each server you want to launch, with the name of the server as shown below. See the 'Specifying properties' section in the README, if you have not already done so, e.g.
{{SERVER_0}} -heap-size=4096m -locators={{LOCATOR_0}}:9999,{{LOCATOR_1}}:9888 -J-Dgemfirexd.disable-getall-local-index=true
{{SERVER_1}} -heap-size=4096m -locators={{LOCATOR_0}}:9999,{{LOCATOR_1}}:9888 -J-Dgemfirexd.disable-getall-local-index=true
If you want it to be applied to all the servers, simply put it in snappy-env.sh as you mentioned (as SERVER_STARTUP_OPTIONS) and place the file under the directory mentioned above.
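For example, a minimal sketch of that snappy-env.sh entry (the option value is taken from the question; SERVER_STARTUP_OPTIONS is the variable mentioned above):

# snappy-env.sh: applied to every server at startup
SERVER_STARTUP_OPTIONS="-J-Dgemfirexd.disable-getall-local-index=true"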
We could have read the conf files directly from (SNAPPY_DIR)/conf/ instead of making users copy them to the above location, but we may release the ec2 scripts as a separate package in the future, so that users do not have to download the entire distribution.

How to get the checkout directory of project dependencies in TeamCity?

I am using TeamCity as build server and have a little trouble when configuring projects and their dependencies.
Eventually I want to get the checkout directory of project dependencies to configure certain build steps. For that I have the variable %teamcity.build.checkoutDir% for the checkout directory of the project itself.
However, I did not find something like %dep.<dependencyID>.teamcity.build.checkoutDir%.
Is there a way to get the checkout directory of a dependency?
You can add a parameter (say checkoutDir) in the first build whose value is equal to %teamcity.build.checkoutDir%. You can then fetch this value in the dependent build (through either a snapshot or artefact dependency).
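A hedged sketch of how that looks (bt123 stands in for the dependency's build configuration ID):

In the first build configuration, define a parameter:   checkoutDir = %teamcity.build.checkoutDir%
In the dependent build configuration, reference it as:  %dep.bt123.checkoutDir%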
I am using this myself and I can access my dependent build's checkout directory with:
%dep.<dependencyID>.teamcity.build.default.checkoutDir%
I believe this will only work with a snapshot dependency, though.

Make: Redo some targets if configuration changes

I want to re-execute some targets when the configuration changes.
Consider this example:
I have a configuration variable (that is either read from environment variables or a config.local file):
CONF:=...
Based on this variable CONF, I assemble a header file conf.hpp like this:
conf.hpp:
    buildConfHeader $(CONF)
Now, of course, I want to rebuild this header if the configuration variable changes, because otherwise the header would not reflect the new configuration. But how can I track this with make? The configuration variable is not tied to a file, as it may be read from environment variables.
Is there any way to achieve this?
I have figured it out. Hopefully this will help anyone having the same problem:
I build a file name from the configuration itself, so if we have
CONF:=a b c d e
then I create a configuration identifier by replacing the spaces with underscores, i.e.,
null:=
space:= $(null) #
CONFID:= $(subst $(space),_,$(strip $(CONF))).conf
which will result in CONFID=a_b_c_d_e.conf
Now, I use this $(CONFID) as dependency for the conf.hpp target. In addition, I add a rule for $(CONFID) to delete old .conf files and create a new one:
$(CONFID):
    rm -f *.conf    # remove old .conf files; -f so there is no error when no .conf files are found
    touch $(CONFID) # create a new file with the right name

conf.hpp: $(CONFID)
    buildConfHeader $(CONF)
Now everything works fine. The file named $(CONFID) tracks the configuration used to build the current conf.hpp. If the configuration changes, then $(CONFID) will point to a non-existent .conf file. Thus, the first rule will be executed, the old conf file will be deleted, and a new one will be created. The header will be updated. Exactly what I want :)
There is no way for make to know what to rebuild if the configuration changed via a macro or environment variable.
You can, however, use a target that simply updates the timestamp of conf.hpp, which will force it to always be rebuilt:
conf.hpp: confupdate
    buildConfHeader $(CONF)

confupdate:
    @touch conf.hpp
However, as I said, conf.hpp will always be built, meaning any targets that depend upon it will need to be rebuilt as well. A much more friendly solution is to generate the makefile itself. CMake or the GNU Autotools are good for this, though you sacrifice a lot of control over the makefile. You could also use a build script that creates the makefile, but I'd advise against this since there are tools that will let you build one much more easily.

How do I package a log4j configuration file in a NetBeans Platform application?

Packaging a log4j configuration file in a NetBeans Platform application apparently requires some thinking through. This is what I tried...
I put log4j.xml in src/main/resources/my/package/log4j.xml of some_netbeans_module. The package is a public module package (i.e. classes from this package are used from other packages). I rebuilt the module and confirmed that the file does, in fact, get packaged into the module.
In my classes I get an instance of the logger the way I always do:
static final Logger log = Logger.getLogger(ThisClass.class);
Every NetBeans Platform application has a my_app.conf file which makes it possible to set certain properties. This is where I set log4j.configuration:
log4j.configuration="/my/package/log4j.xml"
Now, when I run the application, I see the following output:
[INFO] /home/me/my_app/application/target/my_app/bin/../etc/my_app.conf: 5:
log4j.configuration=/my/package/log4j.xml: not found
What is wrong with the above configuration?
In the my_app.conf file, if you append the log4j.configuration property to the default_options property, like so:
default_options="...<other options> -J-Dlog4j.configuration=my/package/log4j.xml"
then this option will get passed to the JVM. Notice that the log4j property is prefixed with -J-D: the -J is used by NetBeans to mark JVM options and the -D is used by the JVM to mark a system property.
Also, you can/should drop the quotes and the initial /; the quotes are not necessary, and NetBeans will complain if you keep the initial /.
The other way to do this, and the way that I prefer since it doesn't require editing the .conf file, is to put the log4j.xml file into the default package. If you have other requirements that prevent you from doing this, then remember that you must put the log4j.configuration property in the app's platform.properties file while you're in dev mode and running the app inside the IDE. Like so:
run.args.extra=-J-Dlog4j.configuration=my/package/log4j.xml
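For the default-package approach mentioned above, a hedged sketch of the module layout (log4j 1.x looks for log4j.xml at the root of the classpath, so no system property is needed; the module name is a placeholder):

some_netbeans_module/
    src/main/resources/log4j.xml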
Edit: For questions regarding the NetBeans Platform you might have better luck posting to the NetBeans Platform Users forum.