I have a SQL Server Agent Job executing a file-system-deployed SSIS package via CmdExec and the 32-bit DTExec. However, the most recent version of the package is not executing.
The change I've made is within a Script Task. I've made sure to Build within the VSTA editor before closing, etc.
I've verified this is the problem by putting the expected version in the command line parameters, so the task fails: /VERIFYVERSIONID "\"{1F870663-1898-4A30-B4A8-226817D1755F}\""
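For reference, the full verification call takes roughly this shape (the dtexec path and package path below are placeholders; /VERIFYVERSIONID makes dtexec fail immediately when the deployed file's VersionGUID doesn't match, which is what confirms a stale copy is being picked up):

REM Hypothetical paths; adjust for your installation and package location.
"C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\DTExec.exe" /FILE "C:\SSIS\MyPackage.dtsx" /VERIFYVERSIONID "\"{1F870663-1898-4A30-B4A8-226817D1755F}\""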
Is there a cache that can be cleared?
One approach would be to change the execution type for the SQL job to Operating System (CmdExec) and provide the path to the dtexec executable.
C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtexec.exe /FILE "c:\path\to\my\ssis\package\myssispackage.dtsx" /CHECKPOINTING OFF /REPORTING E
By calling dtexec explicitly you can point to either the 32-bit or the 64-bit version. You can also revert to a previous version of SSIS (dtexec) if you have those installed on your executing server.
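For example, on a default SQL Server 2012 installation the two runtimes live in separate directories (these paths assume a standard install on 64-bit Windows; adjust the version number, 110 here, for other releases):

REM 64-bit dtexec:
"C:\Program Files\Microsoft SQL Server\110\DTS\Binn\dtexec.exe" /FILE "c:\path\to\my\ssis\package\myssispackage.dtsx"
REM 32-bit dtexec:
"C:\Program Files (x86)\Microsoft SQL Server\110\DTS\Binn\dtexec.exe" /FILE "c:\path\to\my\ssis\package\myssispackage.dtsx"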
I recently upgraded some of my packages to SSIS 2012 and had to use the new version of dtexec for the packages to execute. That resolved my issue. Hope that helps.
Nothing to see here, move along..
I'm doing new child package testing on a server hosting multiple environments shared by my dev team, and we're in the process of shuffling things around. The job in question is one of a few identical ones and the path to the package got changed.
This is a 'master' package that dynamically executes child packages based on values read from a queue which is why it didn't occur to me to check the path.
Noob oversight.
I'm having issues running a .dtsx package made in Visual Studio 2019.
In Visual Studio, using "Start without debugging", it runs perfectly with no errors; however, when I try to launch the file from the command line, it gives me a nondescript error and terminates the package.
It runs for some time and does execute a few of the tasks at the start of the package, but it does not get very far.
This is the "error" showing in the command prompt:
Message: <DTS:ProcessMemoryUsage xmlns:DTS="www.microsoft.com/SqlServer/Dts"><DTS:PeakWorkingSetSize>177860608</DTS:PeakWorkingSetSize></DTS:ProcessMemoryUsage>
When I look at the error dump, this is what I see:
This is my execution command:
"C:\Program Files\Microsoft SQL Server\130\DTS\Binn\DTExec.exe" /f "C:\**PATH**\Visual Studio 2019\Projects\DataDropPrep\DataDropPrep\MSX DataDrop.dtsx" /conf "C:\**PATH**\Visual Studio 2019\Projects\DataDropPrep\Test.dtsConfig" /DECRYPT "****" /CONSOLELOG /DumpOnError /REPORTING EW
I don't understand why it works perfectly when run within Visual Studio 2019 but not from the command line.
The package is set with target server version 2016 (to match our SQL Server).
I'm running the package from a local VM on my own PC, but the SQL server is deployed in Azure (cloudapp.azure.com).
I am NOT a DB admin, so unfortunately I can't deploy the package to run from the SQL Server itself. I wish I could; it would make my life a lot easier.
Alas, I am stuck manually running this package every day from my own PC. I'm trying to automate it with a .bat file, so I need this command working.
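As an aside, since the goal is a .bat file: dtexec returns a nonzero exit code when a package fails, so a thin wrapper can detect and log the failure. A minimal sketch, assuming a hypothetical log path and abbreviating the full DTExec command from above:

REM run_datadrop.bat - sketch only; substitute the full DTExec command shown above.
"C:\Program Files\Microsoft SQL Server\130\DTS\Binn\DTExec.exe" /f "C:\path\to\MSX DataDrop.dtsx" /conf "C:\path\to\Test.dtsConfig" /REPORTING EW
if %ERRORLEVEL% NEQ 0 echo %DATE% %TIME% dtexec failed with exit code %ERRORLEVEL% >> C:\logs\datadrop.log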
This is the package layout: it first truncates a staging table, then loads all items modified or created in the past 3 days, then deletes matching rows in the production table, then imports the new rows from staging. This repeats for each of the different objects.
When using "Start without debugging" from Visual Studio 2019, it executes without issues:
Turns out the issue was that I was using a project parameter, which works when running from Visual Studio, but not when executing the dtsx file directly.
After converting the parameters from project scope to package scope, it executes successfully!
Without the logging and the insights it provided, I wouldn't have known where to start. Success!
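A side note on those package-scoped parameters: if you ever need to override one when invoking the .dtsx directly, dtexec's /SET option can target a parameter via a property path. A sketch, where the parameter name is made up for illustration:

DTExec.exe /f "MSX DataDrop.dtsx" /SET "\Package.Variables[$Package::DaysBack].Value;3"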
In the same folder as the package, SSIS generated a file called SQLDUMPER_ERRORLOG.log. This pointed to a file in C:\Program Files\Microsoft SQL Server\150\Shared\ErrorDumps. That folder contains dump files such as SQLDmpr0085.mdmp. When I open one of these, it launches Visual Studio and gives the error message "The thread tried to read from or write to a virtual address for which it does not have the appropriate access."
After hours of experimenting and doing web searches, I discovered that I was not passing the correct package name to dtexec.exe. Once I corrected this, I was able to execute packages with no issues.
I have an SSIS solution with about 150 packages. Two of the packages consistently fail with an "Unexpected Termination" error when running in 64-bit. They succeed when running in 32-bit runtime.
The packages that fail write some rows to the destination before failing.
The source and destination for all data flows are local SQL Server databases. I use the SQL Server Native Client 11.0 OLE DB driver.
Both packages contain DT_NTEXT data types, but so do some of the other packages that succeed.
There is nothing in the ErrorDumps folder (C:\Program Files\Microsoft SQL Server\140\Shared\ErrorDumps). Also nothing in the Windows event log.
Below is an image of the data flow task. The LKP (lookup) component only caches 5 rows.
Any help or ideas would be much appreciated.
I had the same problem with my SSIS package. After upgrading from 2010 32-bit to 2016 64-bit, the package stopped working. The SQL Server Agent job doesn't show errors, and even verbose error logging does not provide useful information.
My packages were importing a few different spreadsheets in parallel. After changing the tasks to serial processing (import the next spreadsheet after the first one is done), the package works again.
Before updating your SSIS package, I suggest you disable tasks in your package one at a time to find the source of the error. But hopefully this helps!
I had a similar problem, or hopefully the same. The package could not be run in the 64-bit environment, and there were no logs either.
Just check your .NET Framework installation. In my case there was a problem with file permissions on the machine.config file within the "Framework64" folder...
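For anyone checking the same thing, permissions on the 64-bit machine.config can be inspected from a command prompt (the path below is for .NET Framework 4.x; adjust the version folder for your installation):

icacls "C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Config\machine.config"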
I'm using Visual Studio to build my SSIS packages. Project connections seem like a good idea, so I'll use those instead of repeating my code...
The problem: when you try to run the packages individually outside Visual Studio, say with dtexec, they don't have any connection associated with them, so they fail.
Do I really have to duplicate package connections for all 20 packages? Is there not a way to share a connection?
I want to be able to run each package separately or in groups, as they will have differing schedules.
I'm a developer, so Visual Studio is my default go-to - would SQL Server provide a better home for this?
Thanks
For non-project-based connections, when you run the package, it'd take a form like:
dtexec /file MyPackage.dtsx
However, for project-based connections, you need to include the project in the dtexec call. Otherwise, you'll end up with missing-connection errors:
dtexec /package MyPackage.dtsx /project MyProject.ispac
Note: the current documentation is inaccurate, as it specifies /file MyPackage.dtsx /project MyProject.ispac. The /File parameter is invalid for Project Deployment Model solutions.
I have some custom control flow and data flow tasks that are not showing up in the SSIS Toolbox. When I open a package in SSDT containing the custom tasks I get several errors loading the package.
Error loading MyModuleTemplate1.dtsx: Cannot create a task from XML for task "__ECT Fetch Configuration", type "MyEncryptedConfiguration, MyCustomTasks, Version=1.0.0.0, Culture=neutral, PublicKeyToken=2bdffe791d57af79" due to error 0x80070057 "The parameter is incorrect.".
I've followed Russ' and Matt's replies from the following post http://social.msdn.microsoft.com/Forums/sqlserver/en-US/5fdf9243-7e38-401e-bf4a-9251738877f9/how-to-browse-the-installed-custom-task-in-sql-server-2012-ssis-ie-custom-ssis-control-flow-task, however, the tasks still don't show.
I'm using SQL Server 2012 x64 with SSDT on Windows Server 2008 R2 x64. I have verified that the tasks are in the GAC and that the assemblies are in the appropriate SQL Server\110\DTS directories. The assemblies have been compiled as 32-bit since I kept getting a warning about targeting Any CPU while referencing 32-bit assemblies (the SQL reference assemblies).
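As a sanity check, GAC registration can also be confirmed from a Visual Studio or Windows SDK command prompt (gacutil ships with the SDK; the assembly name below matches the one in the error message above):

gacutil /l MyCustomTasks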
I'm able to execute the packages with the custom tasks successfully on the target server through the command line. Additionally, these tasks work fine on my development machine: I can see the tasks in the SSIS Toolbox and can open packages containing them without issue. My dev machine is nearly identical to the target machine except that I have a full installation of Visual Studio 2010.
Am I missing something?
This problem is the result of the WiX installer not properly installing the custom tasks. Below is what I did to verify this.
For test purposes I created an empty custom task and manually deployed it to the target server by copying the files to the GAC and the DTS\Tasks directory. This worked, so I added the .cs file of one of the tasks that was throwing errors to the test custom task and re-deployed. I was able to see both tasks. After cleaning the solution with the original custom tasks, I manually deployed the original custom tasks to the target server, like the test custom tasks, and was able to see the tasks in the SSIS Toolbox. I then uninstalled the custom tasks and deployed them to the target server using the WiX installer -- the problem resurfaced. So the problem is with the installer. My previous attempts at manually installing the custom tasks had failed because I was using the assemblies installed by the WiX installer, although I don't know why.
Ok - this is probably a silly question but what the hell.
I've created an SSIS package with a configuration and a deployment manifest. When I copied this over to the server and double-clicked it, the Package Installation Wizard returned an error - that Integration Services wasn't installed - bad.
But when I did an export data task via SSMS, I noticed I could save this task as a package. And when I set up a job, I noticed I could actually select an SSIS package to schedule - even though SSIS is not installed (!?!). Does this mean I could just point to my earlier package and schedule that?
If so, do you only need Integration Services installed in order to install packages within SQL Server - is it fine to run them from the file system?
Cheers
Here is a good article on different ways to execute a package programmatically:
http://blogs.msdn.com/b/michen/archive/2007/03/22/running-ssis-package-programmatically.aspx
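If you do end up scheduling a file-system package, the CmdExec job step command generally looks something like the following (paths are placeholders; note the server still needs the SSIS runtime, i.e. dtexec, installed, even for file-system packages):

dtexec /FILE "D:\Packages\MyPackage.dtsx" /CONFIGFILE "D:\Packages\MyPackage.dtsConfig"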