SSIS For Each Loop Container / File System Task

I have a File System task that moves a folder if a process job fails. The process job bulk inserts from a set of files. If one file fails, the subsequent job to move the folder fails. I'm presuming that the process job is locking a file in the folder, as I am getting an "Access to the path is denied" error. Any ideas would be great.

Instead of moving the folder contents as a single task, I split it into a copy followed by a delete of the directory contents. The problem has not reoccurred.
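For reference, a minimal cmd sketch of that copy-then-delete idea (the folder names are invented placeholders; in the package this corresponds to a copy task followed by a delete, or to an Execute Process Task running a batch file like this):

REM Hypothetical paths: copy everything out first, then delete the source files.
xcopy "D:\Feeds\Incoming\*" "D:\Feeds\Failed\" /E /I /Y
if %ERRORLEVEL% EQU 0 del /Q "D:\Feeds\Incoming\*"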

Related

SSIS: How to move failure files to fail folder step by step

My package loads all the CSV files from a source folder (1 CSV file = 1 row of data) into a database, then moves each successfully processed file to an archive folder; if a file fails to load into the database, it should be moved to a fail folder. Say I have 100 files and 5 of them fail and need to go into the fail folder, but my package only moves 1 of the 5 and the other 4 stay in the source folder. How can this happen? I'm new to SSIS and have no experience with C# or any other language.
When the package is executed manually inside Visual Studio by pressing F5, it always moves just one of the failed files. So you need to configure the job in SQL Server Agent to retry the package multiple times so that all of the failed files end up in the fail folder.

SSIS File System Task - Move Directory

I am extremely confused.
I have a destination directory: \\Client\D$\Data Feed\Archive. I set this as my Destination Connection in File System Task Editor.
I have a source directory: \\Client\D$\Data Feed\Plan 24-01-2020. I set this as my Source Connection in File System Task Editor. It also contains one CSV file.
For the Operation in File System Task Editor I choose Move Directory as the Operation.
Everything is set up, so I just click Run. When I execute, I get the following error message: "Cannot create a file when that file already exists."
Curiously enough the CSV file inside the source folder is copied to the Archive folder.
I was expecting only that the folder Plan 24-01-2020 would be moved to the folder Archive.
What am I doing wrong?
Because according to this tutorial the folder should be moved without any issues: https://www.tutorialgateway.org/move-directory-using-file-system-task-in-ssis/
I think there are two things you'll want to do here.
First, create an expression in the File System Task that sets OverwriteDestinationFile to TRUE.
Next, you'll want to slightly modify your Destination path. Instead of just \\Client\D$\Data Feed\Archive\, you'll probably want to specify your destination as \\Client\D$\Data Feed\Archive\Plan 24-01-2020\. Otherwise, it will just copy the contents of \\Client\D$\Data Feed\Plan 24-01-2020\ into \\Client\D$\Data Feed\Archive\ without creating the Plan 24-01-2020 sub-folder.
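If it helps to see the same rule outside SSIS, here is a rough robocopy equivalent using the question's paths (treat it as an illustration rather than part of the package):

REM The destination explicitly names the Plan 24-01-2020 sub-folder; robocopy
REM creates it, and /E /MOVE copies the contents and deletes them from the source.
robocopy "\\Client\D$\Data Feed\Plan 24-01-2020" "\\Client\D$\Data Feed\Archive\Plan 24-01-2020" /E /MOVE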

Need to pass an SSIS variable to the Execute Process Task to make a dynamic command-line argument for WinSCP

I have a folder 'DATA' on an SFTP server from which I need to download a set of files to a common location and then copy the respective files to different folder locations.
File Names are:
Test1.csv
Test2.csv
Test3.csv
Test4.csv
Test5.csv
I want the files to first be downloaded to the location below:
G:\USER_DATA\USER_USER_SYNC\Download
These files belong to different schemas and have to be processed separately by different SSIS packages for further transformation and loading. For various reasons we have to keep them at a common location first and then move or copy them afterwards.
Here are my command-line arguments:
/log=G:\USER_DATA\USER_USER_SYNC\SFTP_LOG\user_sync_winscp.log /command "open sftp://username:password#stransfer.host.com/ -hostkey=""ssh-rsa 2048 9b:63:5e:c4:26:bb:35:0d:49:e6:74:5e:5a:48:c0:8a""" "get /DATA/Test1.csv G:\USER_DATA\USER_USER_SYNC\Download\" "exit"
Using the above, I am able to download one file at a time. Since I need the files at a common folder location first, I am planning to add another Execute Process Task to copy the files.
/C copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv G:\USER_DATA\USER_USER_SYNC\Testing1
/C copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv G:\USER_DATA\USER_USER_SYNC\Testing2
and so on...
I am looking for a way to download all the available files to a common folder location and then move or copy them to different folder locations.
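As an aside, WinSCP's get accepts wildcards, so a single argument along these lines (credentials and host key reduced to placeholders here) can pull every CSV in the DATA folder in one call:

/log=G:\USER_DATA\USER_USER_SYNC\SFTP_LOG\user_sync_winscp.log /command "open sftp://username:password@host/ -hostkey=""ssh-rsa 2048 ...""" "get /DATA/*.csv G:\USER_DATA\USER_USER_SYNC\Download\" "exit"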
I have changed the design and followed a new approach. Thanks to Martin for fixing the SFTP-related issues and for the continuous support.
The new SSIS package has the tasks below:
Step 1. Look for the latest updated files on the SFTP server and download the given files Test1.csv and Test2.csv to G:\USER_DATA\USER_USER_SYNC\Download\
Here are my command-line arguments:
/log=G:\USER_DATA\USER_USER_SYNC\SFTP_LOG\user_sync_winscp.log /command "open sftp://bisftp:*UFVy2u6jnJ]#hU0Zer5AjvDU4#K3m#stransfer.host.com/ -hostkey=""ssh-rsa 2048 9b:63:5e:c4:26:bb:35:0d:49:e6:74:5e:5a:48:c0:8a""" "cd /DATA" "get -filemask=">=today" Test1.csv Test2.csv G:\USER_DATA\USER_USER_SYNC\Download\" "exit"
Step 2. My requirement was to then copy each file to a different folder location, so that the respective process can pick up the corresponding file and start transforming and loading it into SQL Server.
This step executes a Windows cmd process and copies Test1.csv to the new location:
G:\USER_DATA\USER_USER_SYNC\Testing1
Command-line arguments:
/C copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv G:\USER_DATA\USER_USER_SYNC\Testing1
Likewise, I have another Execute Process Task to copy Test2.csv to the new location:
G:\USER_DATA\USER_USER_SYNC\Testing2
Command-line arguments:
/C copy /b G:\USER_DATA\USER_USER_SYNC\Download\Test2.csv G:\USER_DATA\USER_USER_SYNC\Testing2
The given solution is working fine; however, there are a couple of things that still need to be handled.
Since I am downloading only the latest files using -filemask=">=today", everything runs fine if the Execute Process Task finds the latest files on the SFTP server. If they are not there, the next Execute Process Task fails with the error message below.
The process exit code was "1" while the expected was "0"
What I understand is that it fails because it has nothing to copy or move.
Is there any way to capture the exit code returned from the first Execute Process Task and store it in a variable, so that we can use an expression to decide whether or not to start the next task?
Second, as you can see, I am using two Execute Process Tasks to copy files from one location to another. Can we combine these two commands into one Execute Process Task?
Any suggestions are most welcome, and I also think this issue needs to be addressed as a separate question.
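One way to address both points is to have a single Execute Process Task run a small batch file instead of two separate copy commands; below is a sketch (the file name copy_downloads.bat is invented, the paths are the ones from the question):

@echo off
REM copy_downloads.bat - hypothetical helper: copy whichever files arrived and
REM always exit with 0, so the task does not fail when nothing was downloaded.
if exist "G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv" copy /b "G:\USER_DATA\USER_USER_SYNC\Download\Test1.csv" "G:\USER_DATA\USER_USER_SYNC\Testing1"
if exist "G:\USER_DATA\USER_USER_SYNC\Download\Test2.csv" copy /b "G:\USER_DATA\USER_USER_SYNC\Download\Test2.csv" "G:\USER_DATA\USER_USER_SYNC\Testing2"
exit /b 0

cmd.exe can also chain both copies in one /C argument using &, but a batch file keeps the quoting simpler. As for the exit code, the Execute Process Task's ExecValueVariable property can store the process exit code in an SSIS variable, which a precedence-constraint expression can then test before starting the next task.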

Wait for file to arrive - SSIS package

We have a package that reads text files on an FTP server every day; however, we do not know exactly when the files will be added to the folder on the server, so we have to wait for the files and then fire the package manually. Is there any way to automate the process, so that the job starts whenever files are found?
Why don't you call the SSIS package when the file arrives? In one of my projects, we had a directory listener service that waited for the file to arrive; once it did, it called the SSIS package.
Vijay
Use the WMI Event Watcher Task; that could be of some help. Or you can simply execute the package every 10 or 30 minutes: a Script Task can check whether the file is available; if it is not, the package exits immediately, and if it is, the DFT executes.
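If the WMI route feels heavy, a crude sketch of the poll-and-check idea as a guard outside the package is below (the share, file mask and package path are placeholders; inside the package, a Script Task doing the equivalent existence check plays the same role):

@echo off
REM wait_for_file.bat - hypothetical guard run by the scheduler every few minutes.
REM Exit 1 (nothing to do) if no text file has arrived yet; otherwise run the package.
if not exist "\\ftpserver\inbox\*.txt" exit /b 1
dtexec /f "D:\Packages\LoadTextFiles.dtsx"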

SSIS package not running when called as step in SQL Job

I have a .dtsx file (an SSIS package) that downloads files from an FTP server and imports data. It runs fine whenever I run it manually. However, when I schedule calling the package as a step in a SQL server agent job, it fails. The step it fails at is the one where I call a .bat file. The error in the job history viewer says this:
Error: 2009-05-26 12:52:25.64
Code: 0xC0029151
Source: Execute batch file Execute Process Task
Description: In Executing "D:\xxx\import.bat" "" at "", The process exit code was "1" while the expected was "0".
End Error
DTExec: The package execution returned DTSER_FAILURE (1).
I think it's a permissions issue, but I'm not sure how to resolve this. The job owner is an admin user, so I've verified they have permissions to the directory where the .bat file is located. I've tried going into Services and changing the "Log On As" option for SQL Server Agent, and neither option works (Local System Account and This Account). Does anyone have ideas as to what other permissions need to be adjusted in order to get this to work?
I tried executing just the batch file as a SQL Job step, and it gave more specifics. It showed that it failed when trying to call an executable that was in the same directory as my .bat file, but not in the windows\system32 directory, which is where the step was executing from.
I moved the executable to the system32 directory, but then I had no clue where my files were being downloaded to. Then I found that there's a property for the Execute Process Task (the one that executes the .bat) called WorkingDirectory. I set this to be the directory where the bat is located, moved the executable back into the same one as the .bat file, and it's now working as expected.
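A related trick, if setting WorkingDirectory alone is not enough, is to make the batch file itself independent of where it is started from by anchoring paths to %~dp0 (the executable and config file names below are invented for illustration):

@echo off
REM %~dp0 expands to the folder this .bat lives in, whatever the caller's working directory is.
pushd "%~dp0"
import_tool.exe downloads.cfg
popd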
For me it was a permissions issue. Go to Environment --> Directories, then change Local directory to something the SQLAgentUser can access. I used C:\temp. Click the dropdown for Save, and choose "Set defaults".
Are you executing the SSIS job in the batch file, or is the batch file a step in the SSIS control flow?
I'm assuming the latter for this answer. What task are you using to execute the batch file (e.g. a simple Execute Process Task or a Script Task)? If the latter, it looks like your batch file is actually failing on some step, not the SSIS script. I'd check the permissions on whatever your batch file is trying to access.
In fact, it might be a better idea to rewrite the batch file as a script task in SSIS, because you'll get much better error reporting (it'll tell you which step in the script fails).
You could try executing the batch file using the runas command in a command window. If you try to execute it under the Local System or Network Service account, it should give you a better error. If it does error, you can check the error level by running echo %ERRORLEVEL%.
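A rough sketch of that runas check (the account name is a placeholder; runas prompts for the password and opens a new console, where you then run the .bat and inspect its exit code):

runas /user:DOMAIN\sqlagent_account cmd
REM then, in the new console:
D:\xxx\import.bat
echo %ERRORLEVEL%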
If it wasn't the latter, and you're executing the SSIS package via a batch file, why?
Are you possibly accessing a mapped drive in your .bat file? If so, you can't rely on the mapped drive from within the service, so you'd have to use a UNC path.
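For example (the drive letter, share and staging folder are invented), the first copy below fails under the service account, while the UNC form works, given permissions:

REM mapped drive - resolves only for the interactive user who mapped it:
copy Z:\exports\data.csv D:\xxx\staging\
REM UNC equivalent - usable from the service:
copy \\fileserver\exports\data.csv D:\xxx\staging\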
I had the same error and resolved it by logging on to the user account that runs the job, opening the CoreFTP site in question there, testing the site access, and making the change there (in my case, I had to re-enter the new password); now it works.
So yes, it is an issue of file access: in this case, access to the CoreFTP site in question.