I'm using Visual Studio to build my SSIS packages. Project connections - they seem like a good idea, so I'll use those instead of repeating my code....
The problem: when you try to run the packages individually outside Visual Studio, say with dtexec, they don't have any connection associated with them, so they fail.
Do I really have to duplicate package connections for all 20 packages? Is there not a way to share a connection?
I want to be able to run each package separately or in groups, as they will have differing schedules.
I'm a developer, so Visual Studio is my default go-to - would SQL Server provide a better home for this?
Thanks
For non-project-based connections, when you run the package, the call takes a form like:
dtexec /file MyPackage.dtsx
However, for project-based connections, you need to include the project in the dtexec call; otherwise, you'll end up with missing-connection errors:
dtexec /package MyPackage.dtsx /project MyProject.ispac
Note: the current documentation is inaccurate, as it specifies /file MyPackage.dtsx /project MyProject.ispac. The /File parameter is invalid for Project deployment model solutions; use /package as shown above.
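Since the connections live at the project level, every package in the project runs against the same .ispac this way. A minimal sketch, with hypothetical package and project names:
dtexec /package LoadCustomers.dtsx /project ETL.ispac
dtexec /package LoadOrders.dtsx /project ETL.ispac
Each call can then sit in its own SQL Agent job step or scheduled task, which covers the differing-schedules requirement without duplicating the connection in all 20 packages.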
Related
I have an SSIS package that contains a connection manager used for logging. The logging service has been deprecated, and I have been tasked with removing references to it from the package. Before I delete the connection manager, I need to remove all references to it, but I haven't been able to figure out how to do that. Can anyone give me any advice? The target server of the package is SQL Server 2012, and I am editing the package with Visual Studio 2017.
You would likely spend more time in analysis than you would by just deleting the connection manager and using the resulting error list as your clean-up punchlist.
1. Have the package in source control.
2. Delete the connection manager.
3. Close and reopen the package, then find and address all instances that show as broken.
4. Run the package, as there may be places (Script Tasks/Components come to mind) that might not know they're broken.
5. Commit to source control.
6. Lather, rinse, repeat for all packages.
If you need to provide a list beforehand, you could reverse engineer the package to Biml and then you'd have a much smaller domain of XML to sift through looking for references to your connection manager.
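Alternatively, since a .dtsx file is just XML, a plain text search can surface the references before you delete anything. A minimal sketch from a command prompt, assuming a hypothetical connection manager named LoggingDB:
findstr /S /I /M /C:"LoggingDB" *.dtsx
/S recurses subfolders, /I ignores case, and /M prints only the names of the matching packages, which gives you the punchlist up front.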
I'm having issues running a .dtsx package made in Visual Studio 2019.
In Visual Studio, using "Start without debugging", it runs perfectly with no errors; however, when I try to launch the file from the command line, it gives me a nondescript error and terminates the package.
It runs for some time and does execute a few of the tasks at the start of the package, however it does not get very far.
This is the "error" showing in the command prompt:
Message: <DTS:ProcessMemoryUsage xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:PeakWorkingSetSize>177860608</DTS:PeakWorkingSetSize></DTS:ProcessMemoryUsage>
When I look at the error dump, it doesn't tell me much more.
This is my execution command:
"C:\Program Files\Microsoft SQL Server\130\DTS\Binn\DTExec.exe" /f "C:\**PATH**\Visual Studio 2019\Projects\DataDropPrep\DataDropPrep\MSX DataDrop.dtsx" /conf "C:\**PATH**\Visual Studio 2019\Projects\DataDropPrep\Test.dtsConfig" /DECRYPT "****" /CONSOLELOG /DumpOnError /REPORTING EW
I don't understand why it works perfectly when run within Visual Studio 2019 but not from the command line.
The package is set with target server version 2016 (to match our SQL Server).
I'm running the package from a local VM on my own PC, but the SQL server is deployed in Azure (cloudapp.azure.com).
I am NOT a DB admin, so unfortunately I can't add the package to be run from the SQL Server itself; I wish I could, as it would make my life a lot easier.
Alas, I am stuck manually running this package every day from my own PC, and I'm trying to automate it with a .bat file, so I need this command working.
This is the package layout: it first truncates a staging table, then loads all items modified or created in the past 3 days, then deletes matching rows in the production table, then imports the new rows from staging. Repeat for all the different objects.
When using "Start without debugging" from Visual Studio 2019, it executes without issues.
Turns out the issue was that I was using a project parameter, which works when running from Visual Studio, but not when executing the dtsx file directly.
After converting the parameters from project to package, it executes successfully!
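For anyone scripting the same thing in a .bat file, a minimal sketch of the working shape. All paths and the parameter name are hypothetical stand-ins, and the /SET line assumes the property-path form for package-scoped parameters:
@echo off
REM Hypothetical wrapper - runs the package directly from the .dtsx file.
"C:\Program Files\Microsoft SQL Server\130\DTS\Binn\DTExec.exe" /f "C:\Projects\DataDropPrep\MSX DataDrop.dtsx" /conf "C:\Projects\DataDropPrep\Test.dtsConfig" /SET "\Package.Variables[$Package::Environment].Value;Test" /CONSOLELOG /REPORTING EW
if errorlevel 1 echo Package failed with exit code %errorlevel%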
Without the logging and the insights it provided, I wouldn't have known where to start.
Success
In the same folder as the package, SSIS generated a file SQLDUMPER_ERRORLOG.log. This pointed to a file in C:\Program Files\Microsoft SQL Server\150\Shared\ErrorDumps. That folder contains dump files such as SQLDmpr0085.mdmp. When I open one, it launches Visual Studio and gives the error message "The thread tried to read from or write to a virtual address for which it does not have the appropriate access."
After hours of experimenting and doing web searches, I discovered that I was not passing the correct package name to dtexec.exe. Once I corrected this, I was able to execute packages with no issues.
I have some script code in an SSIS package that needs to behave differently when run in SSDT/Visual Studio than on the server.
Development tests are done in Visual Studio, and production runs on the SQL Server (version 2017).
What's the best approach to find out where the package is running?
I could use a project/package parameter, but that's less automatic and more error-prone, e.g. when forgetting to re-set the parameter to the correct value for the target environment.
While I've not tested all the ways it can be abused, my first thought would be to check the value of System::ServerExecutionID. This is a system variable that is zero during design time and run time in Visual Studio.
When executed on the server, this variable will be populated with the Operation ID that is stored in SSISDB.catalog.operations(operation_id). My packages use an audit table that explicitly logs that ServerExecutionID to the insert/update/delete metrics associated to that load.
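As a minimal sketch, that check can be surfaced as an SSIS boolean expression (e.g. on a precedence constraint, or mapped into a variable the script reads); this assumes only the zero-versus-nonzero behavior described above:
@[System::ServerExecutionID] != 0
It evaluates to True on the server and False in Visual Studio.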
I have a SQL Server Agent job executing a file-system-deployed SSIS package via CmdExec and the 32-bit DTExec. However, the most recent version of the package is not executing.
The change I've made is within a Script Task. I've made sure to Build within the VSTA editor before closing, etc.
I've verified this is the problem by putting the expected version in the command line parameters, so the task fails: /VERIFYVERSIONID "\"{1F870663-1898-4A30-B4A8-226817D1755F}\""
Is there a cache that can be cleared?
One approach would be to change the execution type for the SQL job step to Operating System (CmdExec) and provide the path to the dtexec executable for SSIS to use.
C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtexec.exe /FILE "c:\path\to\my\ssis\package\myssispackage.dtsx" /CHECKPOINTING OFF /REPORTING E
By calling dtexec explicitly you can point to the 32- or 64-bit version. You can also revert to a previous version of SSIS (dtexec) if you have those installed on your executing server.
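For instance, the bitness comes from which binary you call. A sketch assuming a SQL Server 2012 (version 110) install, where the 32-bit dtexec lives under Program Files (x86):
REM 64-bit
"C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtexec.exe" /FILE "c:\path\to\my\ssis\package\myssispackage.dtsx"
REM 32-bit
"C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\dtexec.exe" /FILE "c:\path\to\my\ssis\package\myssispackage.dtsx"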
I recently upgraded some of my packages to SSIS 2012 and had to use the new version of dtexec for the packages to execute. That resolved my issue. Hope that helps.
Nothing to see here, move along..
I'm doing new child-package testing on a server hosting multiple environments shared by my dev team, and we're in the process of shuffling things around. The job in question is one of a few identical ones, and the path to the package got changed.
This is a 'master' package that dynamically executes child packages based on values read from a queue, which is why it didn't occur to me to check the path.
Noob oversight.
On this link, the 2nd workaround method suggests trying to run the SSIS package on a computer that doesn't have a SQL instance. How is that possible? Do I need to have VS 2008 Business Intelligence installed, or is it possible even without it?
I found a link that may help you. Read this MSDN article.
This article answers your questions:
"You can install SQL Server Integration Services on a computer that has no previous instances of SQL Server."
"The Business Intelligence Development Tools (BIDS) option installs the Integration Services components required to design a package, but the Integration Services service is not installed and you cannot run packages outside of BI Development Studio."
Of course, installing only the latter is not enough to run arbitrary packages.
The following may help you:
Install the dtexec utility on the web server or wherever your SSIS package resides. For the minimum requirements, please check this page: http://msdn.microsoft.com/en-us/library/ms162810.aspx
Create a .cmd/.bat file that calls dtexec, including all the needed parameters. Your web page's button or a scheduled job will then trigger this .cmd or .bat file (see the sketch after these steps).
Now, to stop a package in this setup, you can use a precedence constraint based on, for example, a file: check whether the file exists before executing each next step in your SSIS package. This approach is discussed here: http://www.dotnetspider.com/forum/250123-Stop-SSIS-package-at-runtime.aspx. To stop the package, I thus suggest creating a second button.
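A minimal sketch of the two scripts, assuming hypothetical paths and a stop-flag file named stop.flag that the package's precedence constraints check for:
run_package.cmd:
@echo off
REM Clear any leftover stop flag, then run the package.
if exist "C:\packages\stop.flag" del "C:\packages\stop.flag"
"C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtexec.exe" /FILE "C:\packages\MyPackage.dtsx" /REPORTING EW
exit /b %errorlevel%
stop_package.cmd:
@echo off
REM Create the flag the package checks between steps.
echo stop> "C:\packages\stop.flag"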
You can install SQL Server Data Tools (SSDT) without installing SQL Server itself, and then you can use Integration Services. Try this link:
https://learn.microsoft.com/en-us/sql/ssdt/download-sql-server-data-tools-ssdt?view=sql-server-2017