SSIS package only executes the last part of the package in a SQL Agent job but executes all of the package in Visual Studio

I created an SSIS package with 3 script tasks, each of which checks whether a different file exists. If any of the script tasks finds its file, control passes to a Foreach Loop that imports the data into a SQL Server staging table, then copies and renames the file.
It works successfully within Visual Studio; however, when I set it up as a SQL Agent job, the package only processes the last file and ignores the first two.
Any ideas why this may be occurring?

Probably either a permissions issue with the SQL Agent service account, or file paths in the package that are invalid when it runs on the SQL Server.
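If it is a permissions problem, a first step is to confirm which account the Agent service actually runs under, since that account, not your own, is what needs access to the file paths. A minimal check (this DMV exists in SQL Server 2008 R2 SP1 and later):

-- Show the service accounts the SQL Server and SQL Agent services run under
SELECT servicename, service_account
FROM sys.dm_server_services;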

Related

SQL Server - Run a job containing a dtsx that creates files in a remote folder on another server

I am building an SSIS dtsx package in which I access data in Dynamics CRM. This is the procedure:
* Someone copies files into a folder (.txt files from OCR; that's an external process).
* The SSIS package grabs those files and copies them to another folder, "Processed". Then it looks for data in those .txt files and creates records for entities in Dynamics CRM.
* When some record can't be created in CRM, it is an error, and the SSIS package creates a temporary table in SQL Server with the error. Then it creates some .txt files which show all the errors, the row number, and the name of the original file.
I can run it from Visual Studio, but I am not able to run it from a job. I've followed these steps: https://www.codeproject.com/Articles/14401/How-to-Schedule-and-Run-a-SSIS-package-DTS-Job-in , but it is still not running.
What can I do to handle this?
The dtsx in the job is on a SQL Server machine, and the files I want to work with are on a CRM server.
Thanks very much for your help!
Finally it worked!
I followed the steps again and was able to run the job successfully.
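For reference, a common cause when a job has to touch files on another server is that the SQL Agent service account has no rights there. One fix is to run the SSIS job step under a proxy tied to a domain account that does. A minimal sketch, where the account name, password, and proxy name are hypothetical placeholders:

-- Server-level credential for a domain account with rights on the CRM server's share
CREATE CREDENTIAL SsisFileAccess
    WITH IDENTITY = N'DOMAIN\svc_ssis', SECRET = N'<password>';
-- Agent proxy backed by that credential, allowed to run SSIS job steps
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'SsisFileProxy',
    @credential_name = N'SsisFileAccess';
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SsisFileProxy',
    @subsystem_id = 11;   -- 11 = SSIS package execution
-- Then pick SsisFileProxy in the "Run as" drop-down of the job step.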

In how many ways can we call or execute one package from another package in SSIS

In recent interviews I have come across this question: in what ways can you call or execute one package from another package?
Assuming you are using SSIS 2012+ with projects deployed to the SSIS Catalog in Project mode:
1. Call another package from the same project with an Execute Package Task.
2. Start the package with the stored procedures in SSISDB, from an Execute SQL Task.
3. Create a SQL Agent job that executes the package, then start that job from an Execute SQL Task.
4. Create an Execute Process Task, which starts the package with dtexec.
5. Create a Script Task which starts the package.
Approaches 2-5 basically do the same thing: they start an out-of-process execution of some package, either by calling the SSISDB stored procedures or the DLL directly, or via the dtexec wrapper. A sketch of approach 2 follows below.
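As a sketch of approach 2, using the documented SSISDB stored procedures (the folder, project, and package names below are placeholders for your own):

DECLARE @execution_id BIGINT;
-- Create an execution for the child package in the catalog
EXEC SSISDB.catalog.create_execution
    @folder_name  = N'MyFolder',
    @project_name = N'MyProject',
    @package_name = N'Child.dtsx',
    @execution_id = @execution_id OUTPUT;
-- Make the call synchronous so the parent waits for the child to finish
EXEC SSISDB.catalog.set_execution_parameter_value
    @execution_id,
    @object_type     = 50,   -- system parameter
    @parameter_name  = N'SYNCHRONIZED',
    @parameter_value = 1;
EXEC SSISDB.catalog.start_execution @execution_id;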
Below are the approaches AFAIK:
Using an Execute Process Task.
Using SQL Agent and calling the agent job from the SSIS package.
Using the command line (dtexec).
Using scripting (a .NET script to execute the dtsx file).
Using a stored procedure.
Using a batch file and calling the batch file from a package.
Execute Package Task, stored procedure (SQL), the dtexec.exe utility, and dtexecui.exe.
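To illustrate the command-line (dtexec) option when launched from an Execute SQL Task in the parent package, one hedged possibility, assuming xp_cmdshell is enabled and the child package is deployed to the catalog under the hypothetical path shown:

-- Launch a catalog-deployed package out of process via dtexec
EXEC master.dbo.xp_cmdshell
    'dtexec /ISServer "\SSISDB\MyFolder\MyProject\Child.dtsx" /Server "."';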

Transfer output file of T-SQL statement in SSIS package to another server

I have an existing SSIS package on my DB server, and another server, the application server, which uses this DB server for its operations. The SSIS package is used by a job running on the DB server. I would like to add an additional step: a basic select count(*) query to get the row count from one table. I implemented that on my test server, set the output path in the Advanced tab of the job step, and got the output in a simple text file.
My question now is: how do I send this output file to my application server instead? The output path of a T-SQL job step seems to only accept local drives. I tried giving the path on my application server (which was already being used by the SSIS package), but nothing appears in the text file even though it says the job executed successfully. I don't see the need to create another SSIS package just to get a simple count.
Use the UNC path to your application server file system.
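The same UNC path goes into the Output file box of the Advanced tab. For example, if the job step were scripted in T-SQL instead of through the GUI, it could point straight at a share on the application server (the job, table, and share names below are hypothetical; the Agent service account needs write access to the share):

EXEC msdb.dbo.sp_add_jobstep
    @job_name         = N'MyExistingJob',
    @step_name        = N'Row count',
    @subsystem        = N'TSQL',
    @command          = N'SELECT COUNT(*) FROM dbo.MyTable;',
    @output_file_name = N'\\AppServer\Reports\rowcount.txt';   -- UNC path, not a local drive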

Create a task in SQL or Windows Server to clean a directory nightly

I have a temp directory on my website where users export data in .csv files.
The newer intranet apps delete the file after it's sent to the client but the legacy apps just leave the files in this directory.
I'd like to create a task to clean this directory nightly. There can be .csv files and directories with files in them.
Basically, from inside that directory, I want to run:
del /q /s *.*
for /d %%d in (*) do rd /s /q "%%d"
...every night at midnight. (The /q switches suppress the confirmation prompts; %%d becomes %d if typed at a prompt rather than in a batch file.)
Would love to do it with a SQL maintenance task but that only runs on the actual SQL server and doesn't work with mapped drives (unless I'm missing something).
How does one go about performing this task?
Can it be done through SQL Server somehow?
You have the option of creating "Jobs" that run stored procedures or bits of code. These jobs can be scheduled to run daily, weekly, etc. Check out this thread: How can i create SQL Agent job in SQL Server 2008 standard?
Sounds like you may be using SQL Server Express Edition, from which Microsoft removed the SQL Agent in 2008 and above. In that case your best choice is using the Windows Task Scheduler to run your commands via a batch file.
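Putting the pieces together, if you do have SQL Agent available, a nightly job along these lines would work (the job name, schedule name, and script path are hypothetical; the Agent service account needs delete rights on the share):

USE msdb;
EXEC dbo.sp_add_job @job_name = N'NightlyTempCleanup';
-- CmdExec step pointing at a batch file containing the del/rd commands above
EXEC dbo.sp_add_jobstep
    @job_name  = N'NightlyTempCleanup',
    @step_name = N'Clean temp directory',
    @subsystem = N'CmdExec',
    @command   = N'\\WebServer\Scripts\clean_temp.cmd';
-- Daily schedule at midnight
EXEC dbo.sp_add_schedule
    @schedule_name     = N'NightlyAtMidnight',
    @freq_type         = 4,        -- daily
    @freq_interval     = 1,
    @active_start_time = 000000;   -- 00:00:00
EXEC dbo.sp_attach_schedule
    @job_name      = N'NightlyTempCleanup',
    @schedule_name = N'NightlyAtMidnight';
EXEC dbo.sp_add_jobserver @job_name = N'NightlyTempCleanup';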

SSIS package does nothing when invoked by agent

The SSIS package loops through input files. For each file, a flat-file parse adds records to a DB table, then the file is renamed/moved for archiving. After all files are processed, the package calls a stored procedure to delete all year-old records.
The package runs OK from Visual Studio. Put it in the SSIS package store, run it from there: no problem.
Create a SQL Agent job to run the package: the job does something for about five minutes, announces it was successful, but there are no new records in the DB and no renaming of input files.
The package uses a dedicated login for SQL Server privileges. The job runs as HOSTNAME-SVC, which has read/write privileges on the input directory and the archive directory.
Have you set up logging for the package? You could add a Script Task to the Foreach Loop container that runs a Dts.Events.FireInformation command during each loop. This could help you track the file name it finds, the number of loops it does, how long each loop takes, etc. You could also add a logging step at the end so that you know it is at least exiting the Foreach Loop container successfully.
If you find that the package is running successfully but not looping through any files at all, then you may want to test with a simpler package that reads one file only and loads it into a staging table. If that works, then go to the next step of looping over all the files in the directory while importing only that one file over and over again. If that works, then go to the next step of changing the file connection to match the file that the Foreach Loop container's file enumerator finds.
If the package isn't looping over any files and you can't get it to see even the one file you tested loading from the job, then try creating a proxy account with your credentials and running the job as the proxy account. If that works, then you probably have a permissions issue with your service account.
If the package doesn't import anything even with the proxy account, then you may want to log into the server as the service account and try to run the SSIS package in BIDS. If that works, then you may want to deploy it to the server and run the package from the server (which will really run it on your machine, but at least it uses the SSIS definition from the server). If this works, then try running the package from the agent.
I'm not sure I fully understand. The package has already been thoroughly tested under several Windows accounts, and it does find all the files and rename all the files.
Under the Agent, it does absolutely nothing visible, but takes five minutes to do it. No permissions errors or any other errors. I didn't mention that an earlier attempt DID get permissions errors, because we had failed to give the service account access to the input and output directories.
I cannot log in as the service account to try that because I do not have a password for it. But sa is the job owner, so it should be able to switch to the service account, and the access errors we got ten days ago show that it can. The package itself has not changed in those ten days. We just deleted the job in order to do a complete "dress rehearsal" of the deployment procedure.
So what has changed, I presume, is some detail in the deployment procedure, which unfortunately was not in source control at the time it succeeded.
It seems to be something different about the permissions. We made the problem go away by allowing "everyone" to read the directory on the production server. For some unknown reason, we did not have to do that on the test server.
When the job tried to fetch the file list, instead of getting an error (which would be logged) it got an empty list. Why looping through an empty list took five minutes is still a mystery, as is the lack of permissions. But at least what happened has been identified.
I had a similar problem. I was able to figure out what was happening by setting the logging option of the SQL Server Agent job.
Edit the step in the job that runs the package, go to the Logging tab, and pick "SSIS log provider for SQL Server"; in the configuration string, I picked (using the drop-down) the OLE DB connection that was in the package, which happens to connect to the SQL Server in question.
I was then able to view more details in the history of that job, and confirmed that it was not finding files. By changing permissions on the directory to match the SQL Server Agent account, the package finally executed properly.
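For reference, with that provider selected, the events land in the dbo.sysssislog table of the database the chosen connection points at (sysdtslog90 on SQL Server 2005). A quick way to inspect them:

-- Most recent SSIS log entries written by the "SSIS log provider for SQL Server"
SELECT TOP (50) event, source, starttime, message
FROM dbo.sysssislog
ORDER BY starttime DESC;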
Hope this helps.
You may want to turn logging off after you resolve your issue, depending on how often your package will run and how much information logging provides in your case.
Regards,
Bertin