I am trying to import bulk text files (more than 100,000) into a SQL database. I have created an SSIS package to do that. The package has a File System Task which is supposed to move the files that have been imported into the database to another location (an archive folder). This helps in case there is an error during the import, so I am able to identify where the job stopped.
When I execute the package in SSIS (Visual Studio), the files are imported and moved.
However, when I execute the package from CMD, it only imports the files and does not move them.
I have tried running CMD as administrator, since I thought it was a permissions issue, but that has not worked.
I created an SSIS package with three script tasks, each of which checks whether a different file exists. If any of the script tasks finds its file, control passes to a Foreach Loop that imports the data into a SQL Server staging table, then copies and renames the file.
It works successfully within Visual Studio; however, when I set it up as a SQL Agent job, the package only processes the last file and ignores the first two.
Any ideas why this may be occurring?
Probably either the permissions of the SQL Agent service account, or invalid file paths on the SQL Server.
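As a quick first check, you can confirm which Windows account the Agent service actually runs under; on SQL Server 2008 R2 SP1 and later, this query shows it:

    -- Show which Windows accounts the SQL Server and SQL Agent services run as
    -- (sys.dm_server_services is available on SQL Server 2008 R2 SP1 and later).
    SELECT servicename, service_account
    FROM sys.dm_server_services;

Then verify that account has read/write access to every path the package touches.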
I am building an SSIS .dtsx package in which I access data in Dynamics CRM. This is the procedure:
* Someone copies files into a folder (.txt files from OCR; that's an external process).
* The SSIS package grabs those files and copies them to another folder, "Processed". Then the package looks for data in those .txt files and creates records for entities in Dynamics CRM.
* When a record can't be created in CRM, that is an error, and the package writes it to a temporary table in SQL Server. The package then creates .txt files listing all the errors, with the row number and the name of the original file.
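For illustration, such a temporary error table might look roughly like this (all names here are hypothetical, not from the actual package):

    -- Hypothetical staging table for rejected CRM rows; names are illustrative only.
    CREATE TABLE dbo.CrmImportErrors (
        ErrorId      int IDENTITY(1,1) PRIMARY KEY,
        SourceFile   nvarchar(260)  NOT NULL,  -- name of the original .txt file
        RowNumber    int            NOT NULL,  -- row within that file
        ErrorMessage nvarchar(4000) NULL,      -- why CRM rejected the record
        LoggedAt     datetime       NOT NULL DEFAULT GETDATE()
    );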
I can run it from Visual Studio, but I am not able to run it from a job. I've followed these steps: https://www.codeproject.com/Articles/14401/How-to-Schedule-and-Run-a-SSIS-package-DTS-Job-in , but it is still not running.
What can I do to handle this?
The .dtsx in the job is on a SQL Server machine, and the files I want to work with are on a CRM server.
Thanks very much for your help!
Finally it worked!!
I followed the steps again and I could run the job successfully.
When I try to copy an .abf file from one location to another using a file transfer task in an SSIS package, I am getting:
The process cannot access the file .abf because it is being used by another process
Steps:
Once the cube is processed, we restore the whole cube to another server (from Server A to Server B) – success.
Copying the .abf file from Server A to a backup location – fails with the above error.
Can anyone please tell me what might be causing this issue? I am facing it every day.
The SSIS package loops through input files. For each file, a flat-file parse adds records to a DB table, then the file is renamed/moved for archiving. After all files are processed, the package calls a sproc to delete all year-old records (roughly as sketched below).
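A minimal sketch of such a purge sproc, assuming the table has a load-date column (table and column names are hypothetical):

    -- Hypothetical purge procedure; table and column names are assumptions.
    CREATE PROCEDURE dbo.PurgeYearOldRecords
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Remove rows loaded more than a year ago.
        DELETE FROM dbo.ImportedRecords
        WHERE LoadDate < DATEADD(year, -1, GETDATE());
    END;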
The package runs OK from Visual Studio. Put it in the SSIS package store and run it from there: no problem.
Created a SQL Agent job to run the package. The job does something for about five minutes, announces it was successful, but there are no new records in the DB and no renaming of the input files.
The package uses a dedicated login for its SQL Server privileges. The job runs as HOSTNAME-SVC, which has read/write privileges on the input directory and the archive directory.
Have you set up logging for the package? You could add a script task to the For-Each Loop container that runs a Dts.Events.FireInformation command during each loop. This could help you track the file name it finds, the number of loops it does, how long each loop takes, etc. You could also add a logging step at the end so that you know it is at least exiting the For-Each Loop container successfully.
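If you enable the SSIS log provider for SQL Server, those events land in the sysssislog table (sysdtslog90 on SQL Server 2005) of whatever database the logging connection points at, and a query along these lines will read them back:

    -- Read back logged package events; assumes the SQL Server log provider
    -- writes to dbo.sysssislog in the connected database.
    SELECT starttime, event, source, message
    FROM dbo.sysssislog
    WHERE event IN (N'OnInformation', N'OnError', N'OnWarning')
    ORDER BY starttime;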
If you find that the package is running successfully but not looping through any files at all, then you may want to test with a simpler package that reads one file only and loads it into a staging table. If that works, then go to the next step of looping over all the files in the directory while importing only that one file over and over again. If that works, then go to the next step of changing the file connection to match the file that the For-Each Loop container's file enumerator finds.
If the package isn't looping over any files and you can't get it to see even the one file you tested loading from the job, then try creating a proxy account with your credentials and running the job as the proxy account. If that works, then you probably have a permissions issue with your service account.
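Roughly, setting up such a proxy looks like this (every name and the password below are placeholders):

    -- Wrap your Windows credentials in an Agent proxy, grant it the SSIS
    -- subsystem, and point the job step at it. All names are placeholders.
    USE msdb;

    CREATE CREDENTIAL SsisTestCredential
        WITH IDENTITY = 'DOMAIN\YourUser', SECRET = 'YourPassword';

    EXEC dbo.sp_add_proxy
        @proxy_name = N'SsisTestProxy',
        @credential_name = N'SsisTestCredential',
        @enabled = 1;

    EXEC dbo.sp_grant_proxy_to_subsystem
        @proxy_name = N'SsisTestProxy',
        @subsystem_name = N'SSIS';

    EXEC dbo.sp_update_jobstep
        @job_name = N'YourJobName',
        @step_id = 1,
        @proxy_name = N'SsisTestProxy';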
If the package doesn't import anything even with the proxy account, then you may want to log into the server as the service account and try to run the SSIS package in BIDS. If that works, then you may want to deploy it to the server and run the package from the server (which will really run on your machine, but at least it uses the SSIS definition from the server). If this works, then try running the package from the Agent.
I'm not sure I fully understand. The package has already been thoroughly tested under several Windows accounts, and it does find all the files and rename all the files.
Under the Agent, it does absolutely nothing visible, but takes five minutes to do it. NO permissions errors or any other errors. I didn't mention that an earlier attempt DID get permissions errors, because we had failed to give the service account access to the input and output directories.
I cannot log in as the service account to try that because I do not have a password for it. But sa is the job owner, so it should be able to switch to the service account; the access errors we got ten days ago show that it can. The package itself has not changed in those ten days. We just deleted the job in order to do a complete "dress rehearsal" of the deployment procedure.
So what has changed, I presume, is some detail in the deployment procedure, which unfortunately was not in source control at the time it succeeded.
It seems to be something different about the permissions. We made the problem go away by allowing "everyone" to read the directory on the production server. For some unknown reason, we did not have to do that on the test server.
When the job tried to fetch the file list, instead of getting an error (which would be logged) it got an empty list. Why looping through an empty list took five minutes is still a mystery, as is the lack of permissions. But at least what happened has been identified.
I had a similar problem. I was able to figure out what was happening by setting the logging option of the SQL Server Agent job.
Edit the step in the job that runs the package and go to the Logging tab. I picked "SSIS log provider for SQL Server" and, in the configuration string, picked (using the drop-down) the OLE DB connection that was in the package; it happens to connect to the SQL Server in question.
I was then able to view more details in the history of that job, and confirmed that it was not finding files. By changing permissions on the directory to match the SQL Server Agent account, the package finally executed properly.
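If you prefer a query to clicking through the UI, the same job history can be read from msdb (the job name is a placeholder):

    -- Step-level history for a job; the job name is a placeholder.
    SELECT h.run_date, h.run_time, h.step_name, h.run_status, h.message
    FROM msdb.dbo.sysjobhistory AS h
    JOIN msdb.dbo.sysjobs AS j ON j.job_id = h.job_id
    WHERE j.name = N'YourJobName'
    ORDER BY h.instance_id DESC;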
Hope this helps.
You may want to turn logging off after you resolve your issue, depending on how often your package will run and how much information logging provides in your case.
Regards,
Bertin