I am wondering what considerations can lead to choosing package configurations over project configurations (or vice versa) on SQL Server 2012.
For example, I was told that package configurations make it easier to migrate ETLs from one machine to another.
Edit: the question is about package configurations vs. project configurations, not about package parameters vs. project parameters.
If multiple packages in the same project are going to share the same parameter, and you want to be able to change it once and have that change affect all packages, you would make it a project parameter.
If only one package should use it, and you want to be able to change it without affecting the other packages in the same project, then you should make it a package parameter.
This is the only factor I consider when deciding.
It seems there is nothing a package configuration can do that a project configuration cannot; the reverse, however, is not true.
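For what it's worth, the same scoping carries over once a project is deployed to the SSIS catalog. Here is a minimal T-SQL sketch of setting a project-scoped versus a package-scoped parameter with catalog.set_object_parameter_value; the folder, project, package, and parameter names are placeholders:

-- Project parameter (object_type = 20): one change applies to every package in the project.
EXEC SSISDB.catalog.set_object_parameter_value
    @object_type     = 20,
    @folder_name     = N'ETL',               -- placeholder folder
    @project_name    = N'LoadWarehouse',     -- placeholder project
    @parameter_name  = N'SourceServer',
    @parameter_value = N'PRODDB01';

-- Package parameter (object_type = 30): scoped to a single package.
EXEC SSISDB.catalog.set_object_parameter_value
    @object_type     = 30,
    @folder_name     = N'ETL',
    @project_name    = N'LoadWarehouse',
    @parameter_name  = N'BatchSize',
    @parameter_value = 5000,
    @object_name     = N'LoadCustomers.dtsx'; -- the specific package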
My company is migrating from MySQL to Greenplum Database.
We have many jobs running through Talend, and I have to manually change each component from MySQL to Greenplum.
Is there an easier way to go about this, so that all components are converted directly instead of having to convert each component individually?
With Talend Studio I am not sure there is a way to automatically convert a component, but you can store credentials in the repository to make reconfiguring the components a fairly easy process. You can even create generic schemas that can be used by any DB component.
https://help.talend.com/r/en-US/7.3/repository-manager-user-guide/how-to-add-repository-connection
https://help.talend.com/r/en-US/7.3/studio-user-guide-data-fabric/setting-up-generic-schema-from-scratch
Also, to add a bit of extra value here: there is some free training available on Talend Academy at https://academy.talend.com/learn/register; all you have to do is sign up for an account.
Theoretically one could create a Migration Task similar to existing ones:
https://github.com/Talend/tdi-studio-se/tree/maintenance/7.3/main/plugins/org.talend.repository/
There are many existing ones to get ideas from:
https://github.com/Talend/tdi-studio-se/blob/maintenance/7.3/main/plugins/org.talend.repository/src/main/java/org/talend/repository/model/migration/RenametDBInputToPostgresqlMigrationTask.java
I myself have never tried this, but I think it would be the most flexible way to do it. Use system properties to enable/disable it. You'd need to compile the org.talend.repository plugin, then replace the one in your Studio.
Don't forget to remove the configuration/org.eclipse.osgi folder, as it caches the plugins and your changes wouldn't be picked up otherwise!
If you're stuck you can also try https://community.talend.com/
Our Change Approval team is trying to get us to use Octopus to deploy SSIS packages to our Production environment. The problem is that the tool (Azure DevOps) we use to generate the build for Octopus doesn't create or populate Project parameters when it deploys a project. It does, however, create and populate Environment parameters.
Up to now we have not used Environment parameters at all because we don't use multiple environments on any one server.
The CA team is suggesting that we work around the inability to deploy Project parameters by converting them to Environment parameters and creating one Environment per project.
This feels wrong to me, but I haven't been able to come up with a reason we can't do this, nor have I found one anywhere on the internet so far. On the other hand, I also haven't found anybody saying that they are doing this and it's working fine, either.
So, does anybody have any arguments for or against using Environment parameters to replace Project parameters per se?
Assume that I don't have to worry about any possible future need for multiple environments on a server.
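For reference, this is roughly what that workaround would involve in SSISDB T-SQL. It is only a sketch to show the moving parts (environment, environment variable, reference, parameter binding), not an endorsement; the folder, project, and variable names are placeholders:

DECLARE @ref_id BIGINT;

-- One environment per project, as the CA team suggests.
EXEC SSISDB.catalog.create_environment
    @folder_name      = N'ETL',                -- placeholder folder
    @environment_name = N'LoadWarehouse_Env';

-- Re-create each former project parameter as an environment variable.
EXEC SSISDB.catalog.create_environment_variable
    @folder_name      = N'ETL',
    @environment_name = N'LoadWarehouse_Env',
    @variable_name    = N'SourceServer',
    @data_type        = N'String',
    @sensitive        = 0,
    @value            = N'PRODDB01',
    @description      = N'';

-- Attach the environment to the project...
EXEC SSISDB.catalog.create_environment_reference
    @folder_name      = N'ETL',
    @project_name     = N'LoadWarehouse',
    @environment_name = N'LoadWarehouse_Env',
    @reference_type   = 'R',                   -- relative reference (same folder)
    @reference_id     = @ref_id OUTPUT;

-- ...and bind the project parameter to the environment variable.
EXEC SSISDB.catalog.set_object_parameter_value
    @object_type      = 20,                    -- project-level parameter
    @folder_name      = N'ETL',
    @project_name     = N'LoadWarehouse',
    @parameter_name   = N'SourceServer',
    @parameter_value  = N'SourceServer',       -- name of the environment variable
    @value_type       = 'R';                   -- 'R' = referenced from the environment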
I know this question has been asked before and I've been doing some research on using a single package with multiple configurations.
An example of this might be: A package that has an FTP control in it.
We might have two or three different FTP sites to connect to, but we only need one package. What I'm looking for is to run that package with three different configurations.
Example configurations:
- Username
- Password
After doing some research on the use of a configuration table, I'm still confused about whether the standard configuration table would work for this scenario in SSIS (2008 R2). The article I found that I'm confused by is:
SSIS TABLE DRIVEN PACKAGE CONFIGURATIONS WITH ROW LEVEL FILTERING
Some of the article makes sense to me, but I'm still not entirely certain whether I would need a custom solution to do this or not.
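For context, the standard SQL Server configuration type generates a table like the one below, and in principle you could keep one ConfigurationFilter per FTP site (the filter names, connection-manager name, and values here are made up). The catch is that the filter a package applies is the one saved into it at design time, which is why choosing a different set of rows per execution tends to require the kind of indirect or custom row-level approach the article describes.

-- Table generated by the built-in SQL Server configuration type (default name shown).
CREATE TABLE [dbo].[SSIS Configurations]
(
    ConfigurationFilter NVARCHAR(255) NOT NULL,
    ConfiguredValue     NVARCHAR(255) NULL,
    PackagePath         NVARCHAR(255) NOT NULL,
    ConfiguredValueType NVARCHAR(20)  NOT NULL
);

-- One filter per FTP site; names, paths, and values are illustrative only.
INSERT INTO [dbo].[SSIS Configurations] VALUES
(N'FTP_SiteA', N'userA', N'\Package.Connections[FTP].Properties[ServerUserName]', N'String'),
(N'FTP_SiteA', N'passA', N'\Package.Connections[FTP].Properties[ServerPassword]', N'String'),
(N'FTP_SiteB', N'userB', N'\Package.Connections[FTP].Properties[ServerUserName]', N'String'),
(N'FTP_SiteB', N'passB', N'\Package.Connections[FTP].Properties[ServerPassword]', N'String');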
What I'm after is this: I am designing a three tier architecture (three layers of packages).
The first layer will always do file acquisition, file preparation, and translations to a common XML format based on a provider's proprietary format, whatever it may be.
The second layer would take the common format and transform that into a "destination format" and then the third layer would take the destination XML format and insert it into the database.
My thoughts are to possibly use a table to configure the second and third layers and have the top layer use an XML configuration file, since it will change based on the provider. Any thoughts on this would be appreciated.
I highly recommend this book and the package configuration framework contained therein:
Microsoft SQL Server 2008 Integration Services: Problem, Design, Solution
ISBN: 978-0-470-52576-0
For me, this book took all the mystery out of package configuration with easy-to-understand, step-by-step setup and examples. It presents a robust beginning-to-end solution that I was able to implement in less than a day. Code download link here.
The authors do an excellent job of walking you through package configuration theory and practical setup as part of their larger package management framework.
I would like to second their recommendation to make a "Template Package" from which you build all other packages. The template has all the SSIS PDS goodies already in it, so you only need to build those once.
One of the most useful features is their approach to package configuration, which stores all your configurations in one table in SQL Server and shows you how to access these at the system, application, and package levels for a layered configuration approach that I find most useful for situations like the one I think you are describing.
Setting up a general configuration as part of the template package once and then tweaking it from there as needed has saved me hundreds of hours in development time.
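To give a rough idea of the layering (this is purely illustrative and not the book's actual schema), the idea is a single table with a scope column, where package-level rows override application-level rows, which in turn override system-level rows:

-- Illustrative sketch only, not the schema from the book.
CREATE TABLE dbo.ETLConfiguration
(
    ConfigurationScope NVARCHAR(20)   NOT NULL,  -- 'System', 'Application', or 'Package'
    ApplicationName    NVARCHAR(128)  NULL,      -- NULL for system-wide settings
    PackageName        NVARCHAR(128)  NULL,      -- NULL for system- or application-wide settings
    PropertyName       NVARCHAR(128)  NOT NULL,
    PropertyValue      NVARCHAR(4000) NULL
);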
Not to gush on about this solution, but I found it particularly useful for username and password configurations, as these are two of the core examples in the book. It can be a bit tricky to configure the package with a username and password while also giving these two key pieces of data an adequate level of security.
For example: If you are interested in keeping Username and password secure, you would almost never want to just store these in package scoped variables. This book shows you how to add Username and password as configurations, keep them secure, and how to easily keep them up to date.
Hopefully, when your packages run in production (at least), you are using a service account whose password never expires (all the more reason to keep those credentials secret and safe), and not your own user credentials.
How are you supposed to correctly use a Visual Studio Team System database project to implement version control on a sql server database?
This might seem overly generic, but nothing I've found online so far has helped me achieve anything useful. I have managed to find functionality that appears similar to features in Redgate's SQL Compare tool, but it definitely didn't seem as intuitive as their product.
My understanding of how these DB projects are supposed to work is that you have a version of the database that lives either in Team Foundation Server or inside the SQL Server itself, which you can check out to your local machine, work on, and then check the new changes back in, allowing simultaneous development to work the way it normally does for code. Was I misinformed? Or is it just a complicated process to get set up?
Related to that: how do you then use it to deploy changes to the staging/production servers?
We don't use that; we simply script everything and put it in source control like any other file, and ALL deployments to prod go only through scripts pulled down from source control. I think the real key is that nothing gets put on prod except through a source-controlled script. Once a developer can't get his change to prod any other way (devs should not have prod rights), there is no incentive not to put the change in source control.
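For example, every change lands in source control as a re-runnable script that can be applied to any environment; the file name, table, and column below are made up:

-- 0042_add_customer_email.sql  (naming convention is illustrative)
IF NOT EXISTS (SELECT 1
               FROM sys.columns
               WHERE object_id = OBJECT_ID(N'dbo.Customer')
                 AND name = N'Email')
BEGIN
    ALTER TABLE dbo.Customer ADD Email NVARCHAR(256) NULL;
END
GO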
Funny you should ask. I am the one responsible for getting our production databases under version control, and we're using Visual Studio Database Edition to do it. It is a fantastic tool. The very nice thing about it is that not only will it keep your schema under version control, it will also validate your database schema and permit you to run code analysis against it. It also allows refactoring operations and many other things.
Typically we work against a local development database, sync the changes back to VSDE, build the database to make sure there are no warnings or errors, and then create a deployment script for deployment to our production databases.
This is a simplified explanation of what we do and how we do it, but I think it gives you a general idea of how the tool can be used. I'd be glad to answer any more specific questions you have.
I am about to use LINQ to SQL in my first ASP.NET MVC application.
I have come up with a database schema, but the problem is that I may change a few of the tables in the future, so keeping the model classes in sync with the database will be an issue.
I found this link, which describes a similar situation:
keep LinqToSQL sync with the database
My question is: has anybody used the third-party tools given in the above post, and do they work properly?
www.huagati.com/dbmltools/
www.perpetuumsoft.com/Product.aspx?lang=en&pid=55&tid=linqtosqlsynchronization
Or is there a better approach to this problem?
The "official" approach is to simply delete any out of date tables from the designer then drag the updated table from your Server Navigator back on again. I've been using this method for well over a year now and so long as you make your data context changes at the same time you're updating the database you should be OK. It also gives you extra incentive to make sure you have your database structure in order before continuing.
There is also SQLMetal.
http://msdn.microsoft.com/en-us/library/bb386987.aspx
This is what's in our CreateDBML.bat file:
call "C:\Program Files\Microsoft Visual Studio 9.0\VC\vcvarsall.bat" x86
sqlmetal /server:{server-name} /user:{username} /password:{password} /database:{databasename}
/dbml:..\..\Codebase\Domain\CompanyName.ProjectName.Domain\Entities\ProjectName.dbml
/namespace:CompanyName.ProjectName.Domain.Entities /pluralize /views
sqlmetal /code:..\..\Codebase\Domain\CompanyName.ProjectName.Domain\Entities\ProjectName.designer.cs ..\..\Codebase\Domain\CompanyName.ProjectName.Domain\Entities\ProjectName.dbml
pause