I'm using SQL Server 2016, and we're converting a bunch of SSIS packages from way back in 2005. In our old architecture we had development and production. We're now moving to source control in VSO and staging our deployments: local development on developer machines, then we post to Dev, then QA, then Staging, and finally Production.
We've figured out how to use SSIS environment variables (AWESOME!) and we're able to run the packages on local dev machines from inside Visual Studio using SSDT. Then we deploy the project to an .ispac file, which we copy to the Dev server and import into our SSIS Catalog in SSMS. Then in SSMS we try to change the variables for each environment.
The problem is the data connection. I was passing the connection string and the authentication password as parameters into a shared connection, so the connection read those values from the project parameters when executing. Then we were going to change those values for each environment. It turns out that on the server we need to execute using Integrated Security, but since we're testing remotely we can't use Integrated Security on our local machines. So basically local dev uses SQL Authentication, while the Dev, QA, Staging, and Production environments will all run on servers using Integrated Security.
I can't seem to get this to work right. I have two project parameters, DB_ConnectionString and DB_Password, and a shared connection (OLE DB, SQL) that is parameterized in the package. The connection uses the project parameters, so at execution time it plugs in the string and password from them.
When I post to live I need Integrated Security. So I tried putting an Integrated Security connection string into the environment variable for DB_ConnectionString, but then it still requires a password, and that isn't really working right. I'm getting a connection error:
SSIS Error Code DTS_OLEDBERROR
"Invalid authorization specification"
If you can avoid using parameters, you will be better off in my opinion. When publishing to the catalog you can set your connection strings based on your environment. That way you can use SQL auth on development machines and integrated auth when running in any of your server environments.
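As a rough sketch, the per-environment wiring can also be scripted against the SSISDB catalog in T-SQL. All folder, project, environment, and parameter names below are placeholders; adapt them to your catalog:

```sql
-- Sketch: one catalog environment per stage, with its connection string
-- mapped onto the project's parameter. Names/values are placeholders.
DECLARE @ref_id BIGINT;

EXEC SSISDB.catalog.create_environment
     @folder_name = N'MyFolder', @environment_name = N'QA';

EXEC SSISDB.catalog.create_environment_variable
     @folder_name = N'MyFolder', @environment_name = N'QA',
     @variable_name = N'DB_ConnectionString', @data_type = N'String',
     @sensitive = 0,
     @value = N'Data Source=QASERVER;Initial Catalog=MyDb;Provider=SQLNCLI11.1;Integrated Security=SSPI;';

-- Attach the environment to the project ('R' = relative reference,
-- i.e. the environment lives in the same folder as the project).
EXEC SSISDB.catalog.create_environment_reference
     @folder_name = N'MyFolder', @project_name = N'MyProject',
     @environment_name = N'QA', @reference_type = 'R',
     @reference_id = @ref_id OUTPUT;

-- @value_type = 'R' means the project parameter now reads its value
-- from the environment variable of that name at execution time.
EXEC SSISDB.catalog.set_object_parameter_value
     @object_type = 20, @folder_name = N'MyFolder',
     @project_name = N'MyProject', @object_name = N'MyProject',
     @parameter_name = N'DB_ConnectionString',
     @parameter_value = N'DB_ConnectionString', @value_type = 'R';
```

Repeat the environment creation for Dev, Staging, and Production, each with its own connection string; the local dev machines keep their SQL-auth string in the project defaults.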
Related
I am trying to run my SSIS package via SQL Server Agent jobs. In the SSIS project I have connections to my SQL database defined as PROJECT connections; I set them up this way because all packages use this connection. However, when running it from SQL Server Agent I get an error saying:
The connection "{}" is not found. This error is thrown by Connections collection when the specific connection element is not found.
It obviously can't locate the connection, so how do I fix this? The package executes successfully when executed from Visual Studio.
Many Thanks In Advance !
Bal
First, you need to create an Integration Services Catalog on your server instance. You will deploy your packages to the catalog.
A best practice for specifying the server in your connection managers is to use a single period (.), which references the local machine. That way, when you deploy your packages, the connection will always point to the local machine.
After you've created your package(s) with project-level connections, you need to deploy the project. Right-click the project folder in SQL Server Data Tools/BIDS, then click Deploy. In the deployment wizard, specify the destination server (or just use "." again to deploy to the local instance) and the Integration Services (IS) Catalog folder.
Once deployed to your SQL Server instance's IS Catalog, you can set the Package Source in the Job Step Properties to "SSIS Catalog" and select the package that you deployed.
If you're exporting/importing to/from files, you'll want to ensure that the SQL Server Agent Service Account has appropriate rights to the folder where files are imported/exported. The easiest way to do that is to create a credential (usually a Windows user account), then create a SQL Server Agent Proxy that uses the credential, and then specify that proxy in the job step's Run as field.
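A sketch of the credential/proxy setup in T-SQL, in case you prefer scripting it over the SSMS dialogs (the account name, password, and object names are placeholders):

```sql
-- Sketch: a credential plus an Agent proxy so the SSIS job step can
-- reach the import/export folder. Names and secret are placeholders.
USE master;
CREATE CREDENTIAL SSISFileCredential
    WITH IDENTITY = N'DOMAIN\ssis_runner', SECRET = N'<password>';

USE msdb;
EXEC dbo.sp_add_proxy
     @proxy_name      = N'SSISFileProxy',
     @credential_name = N'SSISFileCredential',
     @enabled         = 1;

-- Allow the proxy to run SQL Server Integration Services job steps.
EXEC dbo.sp_grant_proxy_to_subsystem
     @proxy_name = N'SSISFileProxy', @subsystem_name = N'SSIS';
```

After this, SSISFileProxy appears in the job step's Run as drop-down, and the step executes under DOMAIN\ssis_runner rather than the Agent service account.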
With all of the steps above in place, the job should run successfully.
FYI, you can also execute the package directly from the SSIS Catalog: drill down into the Integration Services Catalogs node of your server, right-click the package, and click Execute....
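The same execution can be kicked off from T-SQL, which is handy for ad hoc testing. A minimal sketch, with placeholder folder/project/package names:

```sql
-- Sketch: start a catalog execution from T-SQL instead of the SSMS
-- context menu. Folder/project/package names are placeholders.
DECLARE @execution_id BIGINT;

EXEC SSISDB.catalog.create_execution
     @folder_name      = N'MyFolder',
     @project_name     = N'MyProject',
     @package_name     = N'MyPackage.dtsx',
     @use32bitruntime  = 0,
     @reference_id     = NULL,   -- or an environment reference id
     @execution_id     = @execution_id OUTPUT;

EXEC SSISDB.catalog.start_execution @execution_id;
```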
Helpful Links
Schedule a Package by using SQL Server Agent
SSIS Tutorial: Deploying Packages
If you set up your configurations on the Configuration tab in the step properties of the SQL Server Agent job, that should allow you to accomplish what you asked.
I have an SSIS package that I am attempting to set up as a SQL Server Agent job. This package takes XML files and inserts/updates records in Dynamics CRM 2011. In my development environment, the execution works correctly. However, when it is deployed to the server (which is on the same domain as the database concerned), a System.MissingMethodException is thrown at execution.
The machine definitely has the DynamicsCRM2011.dll in the GAC; this machine runs both the database and the instance of CRM 2011. On my test machine, I was connecting to this server (which is on a different domain) without incident. I get the same issue for all users, whether integrated security is used or not.
The files are correctly read; the error occurs when declaring the CRM service and assigning a new Helper from the service (CRM2011.Proxy.Helper).
Any ideas?
Are you sure you added the DLL to the GAC? Alternatively, you can copy the DLL into the same folder as the SSIS package.
I've run into this twice now: I can run my SSIS package in BIDS and Integration Services, but it fails when it's run through SQL Server Agent. Both of these packages transfer a file as a final step to a folder on our network.
The error I receive is "Could not find a part of the path", followed by the path and file name. When I schedule the packages in Windows Task Scheduler they execute fine, but I'd rather have them run through SQL Server Agent.
Has anyone run into this issue and found a workaround? Is there a setting I'm missing in SQL Server?
Any help would be appreciated.
You need to run the SQL Server Agent job using a proxy account that is configured to run job steps of type SQL Server Integration Services Package. Jobs usually run under the SQL Server Agent service account, which does not have access to network folders. To access network folders, you need to set up a proxy with domain account credentials (preferably) so it can reach the network path.
The answer to the SO question below has detailed steps on how to set up a proxy account:
How do I create a step in my SQL Server Agent Job which will run my SSIS package?
When I run VS2008 locally, open a package that points to a remote database, and run it, I believe the data from the input file flows through my PC to the database server, even if the data file is on the database server.
However, if the SSIS package is stored in SQL Server and I start the job through SQL Agent, my PC is out of the picture: the data does not flow through my PC, so I should see a significant performance boost.
Is this the case? I just want to confirm. Currently, I do not have permission to save a package on our development server, and I am considering requesting rights to do so for the above reason, provided that it is a valid one.
What kind of access does one need to be able to save SSIS packages on a SQL Server? Might there be a reason to deny me those rights, perhaps because granting them would require an elevated access level that would also allow me to do other things the DBA might not want me to do? As a developer, I think I should be able to shuffle data from UAT, or other non-production environments, into a DEV database without having to ask a DBA to do it whenever he gets around to it.
Your understanding of where the package executes is correct, and performance will certainly be improved by moving execution to the server. At least, it will be if the server has more system resources than your workstation, especially RAM. And avoiding using the network unnecessarily is helpful too, of course.
There are specific roles created in the msdb database for managing SSIS packages so your DBA can let you deploy and run them without making you a sysadmin. However, as the documentation says, there is a possible privilege escalation issue if you run the packages from jobs so the recommended solution is to create a proxy account.
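For the package-deployment (msdb) model, granting those rights is a short script. A sketch, with a placeholder login name; db_ssisltduser lets a developer import and run their own packages, db_ssisoperator can run/export any package, and db_ssisadmin has full control:

```sql
-- Sketch: give a developer deploy/run rights on msdb-stored packages
-- without sysadmin. The login name is a placeholder.
USE msdb;
CREATE USER [DOMAIN\developer] FOR LOGIN [DOMAIN\developer];

-- Least-privilege choice: import (deploy) and run own packages.
ALTER ROLE db_ssisltduser ADD MEMBER [DOMAIN\developer];
```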
I am using IntelliJ IDEA to develop my applications, and I use GlassFish as the application server.
When I want to run/debug my application I can configure it from Glassfish Server -> Local and define arguments there. However, there is also a Remote section; with it I can configure and debug my application just by defining host and port variables.
So my question is: why is the GlassFish Server Local configuration needed (except for defining extra parameters), and what is the difference between them (in performance, etc.)?
There are a number of development workflow optimizations and automations that an IDE can perform when it is working with a local server. I don't have a strong background in IDEA, so I am not sure which of the following they may have implemented:
1. Using in-place (exploded directory) deployment can eliminate jar/war/ear creation in the IDE and deconstruction in the server. This can be a significant time saver.
2. Linked to 1 is smarter redeployment. In some cases, a file change (like changing a JSP or an HTML file) does not need to trigger redeployment.
3. JDBC driver integration allows users to configure their IDE to access a DB and then propagates that configuration (which usually includes driver jars, etc.) into the server's classpath as part of deploying an app.
4. Access to server log files during deployment and execution.
5. The ability to start and stop the server... even today, you do need to restart GlassFish sometimes.
6. Viewing the generated Java sources of a JSP.
Most of these features are not available with a remote server and that has a negative effect on iterative development since the break between edit and validate can be fairly long.
This answer is based on my familiarity with the work that we have done for the NetBeans/GlassFish integration. The guys at IntelliJ are smart, so I would not be surprised if they have other features that are available when you are working with a local server.
Local starts GlassFish for you and performs the deployment; with Remote you start GlassFish manually. Remote can be used to debug apps running on other machines, while Local is useful for development and testing.