We are using SQL Server 2005 and the bundled SSIS.
An Execute Process task is running a standard Windows .BAT batch file.
Inside that batch file, a Java process may be started with something like:
%javapath%\java.exe -cp %classpath% com.mycompany.ToDo
We put a TimeOut value in the task, expecting it to kill the entire task if the job ran too long.
It does appear to terminate the batch file, but not the child Java program.
Options, or ways to kill the entire process tree?
If you are willing to write some code, these may be of use:
process tree
or
Kill process tree
If you find a solution via the first link, please upvote both the question and the answer you used.
Note that the code can be used from a Script Task, or you can build an executable program and start it from an Execute Process Task.
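If the server runs nothing else in Java, a lighter-weight workaround than the linked code is a cleanup step that kills the orphaned tree with taskkill. This is only a hedged sketch; the event-handler placement and the blanket image-name match are assumptions, not something from the question:
rem Hypothetical cleanup step -- e.g. a second Execute Process Task wired into
rem an OnTaskFailed/OnError event handler -- run after the timeout fires.
rem /T kills the named process and all of its children, /F forces termination.
rem Caveat: /IM java.exe matches every java.exe on the machine, so only use
rem this if the ToDo job is the only Java process running on that server.
taskkill /F /T /IM java.exe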
I have an SSIS package that executes WinSCP.exe via an Execute Process Task.
All works well.
However, I am working on a logging routine for the files that were successfully/unsuccessfully downloaded, and I have a header and a detail logging table to capture the data.
Is there a way in the Execute Process Task to have the exit code returned into a variable?
Thanks
I need to create a VM instance in Google Compute Engine with a startup script that takes 30 minutes, but it never finishes; it stops around 10 minutes after the instance boots. Is there a timeout? Is there another way to accomplish what I need to do? Thanks!
Given the additional clarification in the comments:
My script downloads another script and then executes it, and what that script does is download some big files, and then compute some values based on latitude/longitude. Then, when the process is finished, the VM is destroyed.
My recommendation would be to run the large download and processing asynchronously rather than synchronously. The reason is that if it's synchronous, it's part of the VM startup (in the critical path), and the VM monitoring infrastructure notices that the VM is not completing its startup phase within a reasonable amount of time and terminates it.
Instead, take the heavy-duty processing out of the critical path and do it in the background, i.e., asynchronously.
In other words, the startup script currently probably looks like:
# Download the external script
curl [...] -o /tmp/script.sh
# Run the file download, computation, etc. and shut down the VM.
/tmp/script.sh
I would suggest converting this to:
# Download the external script
curl [...] -o /tmp/script.sh
# Run the file download, computation, etc. and shut down the VM.
nohup /tmp/script.sh &
What this does is start the heavy processing in the background, but also disconnect it from the parent process such that it is not automatically terminated when the parent process (the actual startup script) is terminated. We want the main startup script to terminate so that the entire VM startup phase is marked completed.
For more info, see the Wikipedia page on nohup.
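Put together, a hedged version of that background startup script (the URL, file names, and log path are placeholders, not from the question) might look like this; the chmod and output redirection are additions that make the detached run easier to debug:
#!/bin/bash
# Download the external script (placeholder URL) and make it executable.
curl -sf https://example.com/heavy-job.sh -o /tmp/script.sh
chmod +x /tmp/script.sh
# Run it detached from the startup script and keep its output for debugging.
# The startup script then exits right away, so the VM's startup phase completes.
nohup /tmp/script.sh > /var/log/heavy-job.log 2>&1 &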
I am running a script from Jenkins that generates a junit.xml report file and other files. However, all of those files are zipped by the script, so Jenkins cannot find the report.
Is there a way to make Jenkins unzip the .zip file, find my particular JUnit file, and generate the run results?
All this is in Linux.
Thanks
Jenkins has the ability to execute arbitrary shell commands as a build step, just add the 'Execute Shell' step to your build and put in the commands you want (presumably 'unzip' would be among them).
Once you've extracted the XML, provided your internal tool generates it using this schema, the JUnit plugin just needs the path you extracted to, and it will show the results on the build page.
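For example (the archive and report file names here are assumptions, not from the question), the Execute Shell step might be:
# Unzip the archive produced by the test script into a known directory,
# overwriting leftovers from previous builds.
unzip -o build-artifacts.zip -d results/
# Fail fast if the expected report is missing, so the cause is obvious.
test -f results/junit.xml || { echo "junit.xml not found after unzip"; exit 1; }
The JUnit publisher would then be pointed at a pattern such as results/**/junit*.xml.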
If you have the option, I would really suggest executing your tests via Gradle or Maven, as those tools produce a report that Jenkins (and other tools) can already read, which can streamline the job setup process for your users. But if you can't go that route, the above should work for you.
We have a package that reads text files on an FTP server every day; however, we do not know exactly when the files will be added to the folder on the server, so we have to wait for the files and then fire the package manually. Is there any way to automate the process so that the job starts whenever files are found?
Why don't you call the SSIS package when the file arrives? In one of my projects, we had a directory-listener service that waited for the file to arrive. Once the file arrived, it would call the SSIS package.
Vijay
Use the WMI Event Watcher task; that could be of some help. Or you can simply execute the package every 10 or 30 minutes: a Script Task can check whether the file is available; if it is not, the package exits immediately, and if it is, it executes the DFT.
I have a .dtsx file (an SSIS package) that downloads files from an FTP server and imports data. It runs fine whenever I run it manually. However, when I schedule calling the package as a step in a SQL server agent job, it fails. The step it fails at is the one where I call a .bat file. The error in the job history viewer says this:
Error: 2009-05-26 12:52:25.64
Code: 0xC0029151
Source: Execute batch file Execute Process Task
Description: In Executing "D:\xxx\import.bat" "" at "", The process exit code was "1" while the expected was "0".
End Error
DTExec: The package execution returned DTSER_FAILURE (1).
I think it's a permissions issue, but I'm not sure how to resolve this. The job owner is an admin user, so I've verified they have permissions to the directory where the .bat file is located. I've tried going into Services and changing the "Log On As" option for SQL Server Agent, and neither option works (Local System Account and This Account). Does anyone have ideas as to what other permissions need to be adjusted in order to get this to work?
I tried executing just the batch file as a SQL Server Agent job step, and it gave more specifics. It showed that the failure happened when the batch file tried to call an executable that was in the same directory as the .bat file, but not in the windows/system32 directory, which is where the step was executing from.
I moved the executable to the system32 directory, but then I had no clue where my files were being downloaded to. Then I found that there's a property for the Execute Process Task (the one that executes the .bat) called WorkingDirectory. I set this to be the directory where the bat is located, moved the executable back into the same one as the .bat file, and it's now working as expected.
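A related batch-side trick (not what the answer above did, just a hedged alternative, and the executable name here is hypothetical) is to make the .bat file resolve its own folder so it no longer depends on the caller's working directory:
@echo off
rem %~dp0 expands to this batch file's own drive and folder, so the script works
rem no matter which directory SQL Server Agent / SSIS launches it from.
cd /d "%~dp0"
mydownloader.exe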
For me it was a permissions issue. Go to Environment --> Directories, then change Local directory to something the SQLAgentUser can access. I used C:\temp. Click the dropdown for Save, and choose "Set defaults".
Are you executing the SSIS job in the batch file, or is the batch file a step in the SSIS control flow?
I'm assuming the latter for this answer. What task are you using to execute the batch file (e.g. a plain Execute Process task or a Script Task)? Either way, it looks like your batch file is actually failing on some step, not the SSIS package itself. I'd check the permissions on whatever your batch file is trying to access.
In fact, it might be a better idea to rewrite the batch file as a script task in SSIS, because you'll get much better error reporting (it'll tell you which step in the script fails).
You could try executing the batch file using the runas command in a command window. If you try executing it under the Local System or Network Service account, it should give you a better error. If it does error, you can check the exit code by running echo %ERRORLEVEL%.
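A hedged example of that check (the account name is made up; the path reuses the one from the error message above):
rem Re-run the batch file under another account to reproduce the service's view
rem of permissions (you'll be prompted for that account's password).
rem cmd /k keeps the new window open so the exit code can be inspected there.
runas /user:MYDOMAIN\svc_sqlagent "cmd /k D:\xxx\import.bat"
rem Then, in that new window, check the batch file's exit code:
rem     echo %ERRORLEVEL%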
If it wasn't the latter, and you're executing the SSIS package via a batch file, why?
Are you possibly accessing a mapped drive in your .bat file? If so, you can't rely on the mapped drive from within the service, so you'd have to use a UNC path instead.
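For instance (the drive letter, server, share, and file names here are made up for illustration):
rem Mapped drives exist only in an interactive logon session, so this tends to
rem fail when run under the SQL Server Agent service:
copy Z:\inbox\data.txt D:\work\
rem Use the full UNC path instead (and grant the service account rights on the share):
copy \\fileserver\inbox\data.txt D:\work\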
I had the same error and resolved it by logging on as the user account that runs the job, opening the CoreFTP site in question there, testing the site access, and making the change (in my case, I had to re-enter the new password); now it works.
So yes, it is an access issue; in this case, access to the CoreFTP site in question.