What is meant by "Configuration Information" in the book "Continuous Delivery" by Jez Humble?
Any parameter to any application that can vary across environments: usernames, passwords, URLs, options, flags... anything like that.
Humble and Farley recommend (require, even!) that all configuration information be versioned in your SCM and deployed with the application it configures. This avoids problems like default configurations that don't work, regressions caused by overwriting configuration files during deployment, or accidentally accessing production databases when deploying to a test environment.
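As a rough sketch of what this can look like in practice (the file names and paths here are illustrative, not prescribed by the book), you might keep one configuration file per environment in the repository and let the deployment script select the right one:

    # Illustrative repository layout, versioned alongside the application:
    #   config/dev.properties
    #   config/test.properties
    #   config/prod.properties
    # deploy.sh copies the file matching the target environment into place.
    ENV="${1:?usage: deploy.sh <dev|test|prod>}"
    cp "config/$ENV.properties" /opt/myapp/application.properties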
While it is recommended to keep configuration information in an SCM, what I have found is that it is not always done. In such cases, you should monitor that information so that any change triggers a validation pipeline as well, ensuring you are managing change both in your SCM and in configuration that lives outside of it.
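For configuration that lives outside the SCM, a minimal detection approach (the paths and webhook URL below are placeholders) is to checksum the files periodically and kick off a validation pipeline when the checksum changes:

    CONF=/etc/myapp/settings.conf          # out-of-SCM config file (placeholder)
    STATE=/var/lib/confwatch/settings.sha  # last known checksum
    NEW=$(sha256sum "$CONF" | awk '{print $1}')
    if [ "$NEW" != "$(cat "$STATE" 2>/dev/null)" ]; then
        echo "$NEW" > "$STATE"
        # Notify your CI system, e.g. via a webhook (placeholder URL).
        curl -X POST https://ci.example.com/hooks/config-changed
    fi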
We have ServiceNow, OpenText, and SAP systems which are integrated across their DTAP environments (e.g. ServiceNow DEV <-> SAP DEV; OpenText UAT <-> SAP UAT; etc.).
Periodically we clone over sub-production instances to keep them the same as production.
During every clone we have problems with integrations breaking on the SAP or OpenText side.
These breaks are caused by wrong integration credentials or wrong endpoints that were overwritten with production information.
What is the best practice to avoid this problem, or at least to help with detection?
You can exclude specified tables from a clone by adding each table to the list of exclusions in the Clone Profile. This creates new records on the 'clone_data_exclude' table and associates them to the appropriate Profile.
Reference: https://docs.servicenow.com/bundle/rome-platform-administration/page/administer/managing-data/task/t_ExcludeATableFromCloning.html?cshalt=yes
You could also use post-clone cleanup scripts to set the appropriate integration-related values after a clone.
Reference: https://docs.servicenow.com/bundle/rome-platform-administration/page/administer/managing-data/concept/post-clone-cleanup-scripts.html?cshalt=yes
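On the detection side, one option is a post-clone smoke test that reads an integration endpoint back through the ServiceNow Table API and fails loudly if it still points at production. This is a sketch only; the property name, instance URL, and expected value are assumptions about your setup:

    INSTANCE="https://mycompany-dev.service-now.com"   # cloned sub-prod instance
    EXPECTED="sap-dev"                                 # fragment expected in the endpoint
    # Read a hypothetical system property holding the SAP endpoint.
    ACTUAL=$(curl -s -u "$SN_USER:$SN_PASS" \
      "$INSTANCE/api/now/table/sys_properties?sysparm_query=name=sap.integration.endpoint&sysparm_fields=value" |
      python3 -c 'import sys, json; print(json.load(sys.stdin)["result"][0]["value"])')
    case "$ACTUAL" in
      *"$EXPECTED"*) echo "OK: endpoint is $ACTUAL" ;;
      *) echo "WARNING: endpoint still points at $ACTUAL" >&2; exit 1 ;;
    esac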
I am currently looking for a version control tool for a MySQL DB which is cloud hosted. The new application/tool needs to be cloud hosted as well.
I am interested in something which is free or has nominal charges. I looked over the internet but could not find a satisfactory solution. My requirement for the tool is very simple: it should capture all changes to the DB, and if required I can switch back to a previous state of the DB.
Can you please advise some options which I can explore?
Thanks
There are a couple of tasks/tools you need to implement for minimal DB version control support.
Source Control Repository - a tool to physically keep track of changes made to files (e.g. Git, SVN, TFS). Git is free.
Extract a representation of the DB - the DB needs to be represented in some kind of file format in order to be version controlled. It is not recommended to version control the database files themselves, nor to take backups of the DB and store those. You'll want to store the DB in a semantic representation, such as .sql file(s). There are a couple of free tools that can be used (e.g. mysqldump, HeidiSQL). I like using HeidiSQL as it has a feature that allows you to selectively create dump scripts. These scripts can be stored in the version control system.
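A minimal sketch of that workflow with mysqldump (the host, user, and database names are placeholders): dump the schema to a .sql file and commit it.

    # Dump schema only; --skip-dump-date keeps diffs free of timestamp noise.
    mysqldump --host=mydb.example.com --user=deploy -p \
      --no-data --routines --skip-dump-date mydatabase > schema.sql
    git add schema.sql
    git commit -m "DB schema snapshot $(date +%F)"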
If you are looking at automation, you'll want to consider implementing a build server tool like Jenkins. Jenkins can be configured to poll a git repository and then execute build scripts (created by you) if it notices a change.
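The build script Jenkins runs can be as simple as loading the committed dump into a throwaway database to prove it still applies cleanly. A sketch, with a placeholder CI host and credentials:

    #!/usr/bin/env bash
    set -euo pipefail
    # Recreate a scratch database and apply the committed dump to it.
    mysql --host=ci-db.example.com --user=ci -p"$CI_DB_PASS" \
      -e "DROP DATABASE IF EXISTS schema_check; CREATE DATABASE schema_check;"
    mysql --host=ci-db.example.com --user=ci -p"$CI_DB_PASS" schema_check < schema.sql
    echo "schema.sql applied cleanly"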
Some of my answer is subjective and I have glossed over several intricacies of DB version control. There are other questions about DB version control with some pretty good answers and pointers to well-written blog posts. There is also a plethora of paid tools out there that give significant DB version control functionality: Red Gate, DBMaestro, Liquibase, RoundhousE, MySQL Workbench, and others.
I hope this helps.
I know this question has been asked before and I've been doing some research on using a single package with multiple configurations.
An example of this might be: A package that has an FTP control in it.
We might have two or three different FTP sites to connect to, but we only need one package. What I'm looking for is to run that package with three different configurations.
Example configurations:
- Username
- Password
After doing some research on the use of a configuration table, I'm still confused as to whether the standard configuration table would work for this scenario in SSIS (2008 R2). The article I found that I'm confused by is:
SSIS TABLE DRIVEN PACKAGE CONFIGURATIONS WITH ROW LEVEL FILTERING
Having read the article, some of it makes sense, but I'm still not entirely certain whether I would need a custom solution to do this.
What I'm after is this: I am designing a three-tier architecture (three layers of packages).
The first layer will always do file acquisition, file preparation, and translations to a common XML format based on a provider's proprietary format, whatever it may be.
The second layer would take the common format and transform that into a "destination format" and then the third layer would take the destination XML format and insert it into the database.
My thoughts are to possibly use a table to configure the second and third layers and have the top layer use an XML configuration file, since it will change based on the provider. Any thoughts on this would be appreciated.
I highly recommend this book and the package configuration framework contained therein:
Microsoft SQL Server 2008 Integration Services: Problem, Design, Solution
ISBN: 978-0-470-52576-0
For me, this book took all the mystery out of package configuration with easy-to-understand, step-by-step setup and examples. It presents a robust beginning-to-end solution that I was able to implement in less than a day. Code download link here.
The authors do an excellent job of walking you through package configuration theory and practical setup as part of their larger package management framework.
I would like to second their recommendation to make a "Template Package" from which you build all other packages. The template has all the SSIS PDS goodies already in it, so you only need to build those once.
One of the most useful features is their approach to package configuration, which stores all your configurations in one table in SQL Server and shows you how to access them at the system, application, and package levels for a layered configuration approach. I find that most useful for situations like the one I think you are describing.
Setting up a general configuration as part of the template package once and then tweaking it from there as needed has saved me hundreds of hours in development time.
Not to gush on about this solution, but I found it particularly useful for username and password configurations, as these are two of the core examples in the book. It can be a bit tricky to configure the package with a username and password while simultaneously giving these two key pieces of data an adequate level of security.
For example: if you are interested in keeping the username and password secure, you would almost never want to just store them in package-scoped variables. This book shows you how to add the username and password as configurations, keep them secure, and easily keep them up to date.
Hopefully, when your packages run in production (at least), you are using a service account whose password never expires (all the more reason to keep those credentials secret and safe) rather than your own user credentials.
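As a side note, if you ever want to run a single package against several configurations without a full framework, dtexec's /Conf switch gives a lightweight route. A sketch only; the package and config file names below are placeholders:

    # Run the same package once per FTP site, each with its own .dtsConfig file.
    dtexec /F "LoadFtpFiles.dtsx" /Conf "ftp_siteA.dtsConfig"
    dtexec /F "LoadFtpFiles.dtsx" /Conf "ftp_siteB.dtsConfig"
    dtexec /F "LoadFtpFiles.dtsx" /Conf "ftp_siteC.dtsConfig"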
Hope this is a good spot for my question, as it is software related but not code related.
In our company we are using Trac for issue tracking and management of code links. I am very satisfied with it and like how it works.
I have several environments (one per project), and every time we change a setting in the configuration (e.g. users & permissions, severities, ticket types, etc.) we need to change all of them.
I use
[inherit]
file=../../../sharedTrac.ini
and delete the shared parts from the file.
for the preferences, but I didn't find a way to share the configurations.
This is bad for several reasons, the main one being that it bugs me! :p
Can Trac read its configuration from a central definition, and its data from a local DB?
EDIT:
I noticed all these configurations are in the .db file (SQLite file)...
Is there a ready-made tool to copy the configuration from DB to DB?
Or should I go ahead and analyse what should be copied, and how?
You're almost there. Note, though, that local settings will always override inherited ones, so you must delete them in your <env>/conf/trac.ini files to make central configuration effective.
Specifically regarding the part of the configuration inside the Trac db: no, there is no sync tool yet. Given that the one for user accounts is still a beta after years, there's not much interest. You should use the trac-admin command-line tool (as already advised here) or start to directly sync parts of the db by means of your own (Python) scripts or custom db synchronisation. For a start, have a look at the Trac db schema.
You can try to do this through the command line. Just call the appropriate "trac-admin" command for each instance. Example one-liner to add a user profile:
for D in */; do trac-admin "$D" session add username "Full Name" user@email.com; done
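The same pattern works for other trac-admin subcommands. For instance, to grant a permission in every environment under the current directory (the user and action names here are just examples):

    # Grant TICKET_MODIFY to "developer" in each Trac environment.
    for D in */; do trac-admin "$D" permission add developer TICKET_MODIFY; done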
I have a Drupal site running in production. After some time I had made changes in code and through the admin UI as well: some configuration, changed content types, changed the body of some pages, etc. Meanwhile the production database was growing. Now I want my changes in production without losing the data which is already in the production DB. One way is to repeat the same steps from dev on production, but that does not look good to me. Is there any automated procedure to migrate the changes?
Thanks
The Features and Strongarm modules will do the trick for you.
Features can help you save and migrate content types, for example, while Strongarm will help you migrate site settings and configuration information that is stored in variables.
After installing the two modules, go to Admin --> Structure --> Features --> Manage on your dev site and create features for the changes you want to transfer from dev to production. If you have both Features and Strongarm installed, it will let you create features that capture both site-building components (content types, views you created, roles and permissions you have changed, etc.) and site settings (settings stored in variables; you'll see the long list of settings you can export once you install the Strongarm module).
When you create your feature, it is exported as code (as a module), and you can then add that module to any additional sites in which you want the components you selected when creating your feature(s).
You will have to install the two modules on your production environment too. Then add the features you just created in your dev environment to your production site. Once set up though, you can transfer changes between dev and production environments more easily going forward!
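If you use Drush, the round trip can be scripted as well. A sketch, where "my_site_config" is a placeholder for your feature's machine name:

    # On dev: enable the modules and re-export changed components into the feature.
    drush en -y features strongarm
    drush features-update my_site_config
    # On production, after deploying the exported code:
    drush en -y my_site_config
    drush features-revert my_site_config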
Here is the features documentation: http://drupal.org/node/580026.
Hope this doesn't sound too confusing!