SSIS Package copy/move based on condition: check archive folder before move/copy to process - sql-server-2008

I'm working on an SSIS package and would like to know how I can achieve the following:
I want to move files from a drop folder into a process folder, and I want to implement the following rule:
If the file does not exist in the archive, move the file to process and to archive.
If the file exists in the archive, drop the file (don't archive it and don't move it to process).
The existence check must be based on file name and time stamp (when the raw file was created).
Any ideas?

You can do this in a simple way; I did something similar a few days back.
1) Create two variables: FileName (string) and FileExists (boolean).
2) Drag in a File System Task; based on your condition you can copy, move, or delete a file or folder.
3) In my case, based on the time frame, I archive the file (i.e., move it from one folder to another) by adding one more variable named DestinationFolder (string).
4) The condition is applied in the precedence constraint (double-click it to open the Precedence Constraint Editor, set the evaluation operation to Expression and Constraint, and give the expression as @FileExists == TRUE or @FileExists == FALSE).
This should work just fine.
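A minimal Script Task sketch (C#) of the existence check by file name and creation time. The variable names User::FileName (full path of the dropped file), User::ArchiveFolder and User::FileExists are assumptions here; add the first two as ReadOnlyVariables and the last as a ReadWriteVariable on the task:

public void Main()
{
    // Full path of the file in the drop folder and the archive folder to check against
    // (User::FileName, User::ArchiveFolder and User::FileExists are assumed variable names).
    string sourcePath = Dts.Variables["User::FileName"].Value.ToString();
    string archiveFolder = Dts.Variables["User::ArchiveFolder"].Value.ToString();
    string archivePath = System.IO.Path.Combine(archiveFolder, System.IO.Path.GetFileName(sourcePath));

    bool exists = false;
    if (System.IO.File.Exists(archivePath))
    {
        // Same name alone is not enough: only treat it as a duplicate when the
        // creation time stamps also match.
        exists = System.IO.File.GetCreationTime(archivePath) == System.IO.File.GetCreationTime(sourcePath);
    }

    Dts.Variables["User::FileExists"].Value = exists;
    Dts.TaskResult = (int)ScriptResults.Success;
}

After the Script Task, branch with two precedence constraints: @FileExists == FALSE leads to the File System Tasks that copy the file to process and to archive, and @FileExists == TRUE leads to a File System Task that deletes the dropped file.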

Related

SSIS File System Task to copy multiple input files from different source paths to different destination paths

I have to copy files from source to destination. The tricky part is that I have many source files located in different shared paths, and I need to copy those input files to particular destination folders.
For Example:
Spath1/a.txt --> Dpath1/, Spath2/b.txt --> Dpath2/, Spath3/c.txt --> Dpath3/, etc.
Can I use a table to map the input files/source paths to destination folders and use a foreach loop to achieve this?
Kindly post your suggestions.
Welcome to SO, Harun.
Yes, you can use a table in this case. First add an object-type variable to the package, let's call it @folders, then a second string-type variable, let's call it @currentSrc, and lastly a third string variable we call @currentDst. Then add an Execute SQL Task to get the list of folders. Set the result set to "Full result set" and add a reference to the @folders variable in the Result Set tab.
Add a Foreach Loop, set the enumerator to "Foreach ADO Enumerator", @folders as the ADO object source variable, and "Rows in the first table" as the enumeration mode. Add references to the @currentSrc and @currentDst variables in the "Variable Mappings" tab.
Now you can use the File System Task to copy files from sources to destinations based on the data you have in a table in the database (make sure to reference the @currentSrc and @currentDst variables in the File System Task).
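If you prefer to do the whole thing in code rather than with an Execute SQL Task, Foreach ADO enumerator and File System Task, a single Script Task can read the mapping table and copy each file. This is only a sketch under assumptions: the table dbo.FileMapping with SourcePath and DestinationFolder columns, and a User::DbConnectionString variable holding a SQL Server connection string, are hypothetical names:

// At the top of ScriptMain.cs:
// using System.Data.SqlClient;
// using System.IO;

public void Main()
{
    string connectionString = Dts.Variables["User::DbConnectionString"].Value.ToString();  // assumed variable

    using (SqlConnection connection = new SqlConnection(connectionString))
    using (SqlCommand command = new SqlCommand("SELECT SourcePath, DestinationFolder FROM dbo.FileMapping", connection))  // assumed table
    {
        connection.Open();
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                string source = reader.GetString(0);
                string destination = Path.Combine(reader.GetString(1), Path.GetFileName(source));
                File.Copy(source, destination, true);   // true = overwrite if the file already exists
            }
        }
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}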

Source File Connection (Flat File) - Not reading column metadata

When I create the SSIS package it requires a file to be referenced to pick up the file's metadata. For example, the column headers will be ColumnA, ColumnB.
I have always assumed that these column names need to be present in the file for it to be loaded. Recently the business, for whatever reason, changed one of the column names in the file to something else, so the file contains ColumnA, NotColumnB. When the SSIS package runs it ignores this and loads the file. I assumed that it would fail. Is my assumption correct and something weird is going on, or is my assumption incorrect? If so, please let me know why.
I have changed the column names in a few other packages that load data from a file, and they also don't care what the column names are.
Click on the flat file source and press F4 to show the Properties window. There is a property called ValidateExternalMetadata; change it to True.
For more information check the following answer:
Detect new column in source not mapped to destination and fail in SSIS
Update 1
It looks like the flat file connection manager has no validation engine; the metadata defined is used at configuration time to configure the mappings between the data file and the database.
Why Doesn't SSIS Flat File Data Check If Column Names or Order Have Changed? What is the best way to check?
Flat file destination columns data types validation
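Since the flat file connection manager won't do this validation for you, one workaround is a small Script Task ahead of the data flow that reads the header row and compares it to the column names the package was built against. A minimal C# sketch, assuming a comma-delimited file whose path sits in a hypothetical User::FilePath variable:

public void Main()
{
    string filePath = Dts.Variables["User::FilePath"].Value.ToString();   // assumed variable
    string expectedHeader = "ColumnA,ColumnB";                            // the header the package was built against

    // Read only the first line of the flat file.
    string actualHeader;
    using (System.IO.StreamReader reader = new System.IO.StreamReader(filePath))
    {
        actualHeader = reader.ReadLine();
    }

    if (expectedHeader == actualHeader)
    {
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    else
    {
        Dts.Events.FireError(0, "Header check", "Flat file header does not match the expected column names.", string.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}

With this task in front of the data flow, the package fails (or branches) when the business renames a column instead of silently loading the file.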

How to write the file and file path to a table

I have an SSIS package in which, within a FOR LOOP CONTAINER, I look in a particular location for a particular file format and import the files into a database.
This is working fine; when I have two files, the contents of both files are imported.
So I have a Variable Mapping under my ForLoop which records the fully qualified name. What I want is that when I import a file, I also record the file path it came from.
I'm unsure where in my data flow task I would put that. Under the data flow I have my source file and a destination.
I tried to have a SQL task after the data flow that updated the field in the database with the variable (via Parameter Mapping), but that set the field to the same value for everything (the last file path found), which is not what I'm after.
Any advice would be welcome
In your data flow task, in between your source and destination, add a Derived Column transformation. This will add columns to your dataset with a name and value that you specify. If you reference the variable in which you are storing the file name for your loop container, the name of the file being accessed will be appended as an additional column in your dataset. Obviously you need to make sure that this column is present in your destination table.
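The Derived Column expression is just the loop variable itself, e.g. @[User::FilePath] (the variable name here is assumed), added as a new column. If you would rather do it in code, a Script Component (transformation) between the source and destination can do the same thing; a minimal C# sketch, assuming the loop variable User::FilePath is listed under ReadOnlyVariables and an output column named SourceFilePath has been added to the component:

// Runs once per row: copy the current file path into the new output column.
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
    Row.SourceFilePath = Variables.FilePath;
}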

SSIS - How to loop through files in folder and get path+file names and finally execute stored Procedure with parameter as Path + Filename

Any help is much appreciated. I am trying to create an SSIS package to loop through the files in a folder, get the path + file name, and finally execute a stored proc with the path + file name as a parameter. I am not sure how to get the path + file name and pass it into the stored proc as a parameter. I have attached a screenshot for reference.
Looks like you have the right idea in general, and the link @Speedbirt186 provided has some good details, but it sounds like there are a couple of nuances that I thought I might point out with regard to flow and variables.
The foreach loop can assign the entire path, the file name, or the file name and extension to a variable. The last of these will be the most help in your case if you don't want to add a Script Task to split the file name from the path. If you start by adding 5 variables to your project it will make things a little easier: one for the source directory path, one for the destination (archive) directory path, one to hold the file name and extension assigned by the foreach loop, and then 2 additional dynamic (expression-based) variables that simply combine the source directory and file name to get the source full path, and the destination directory and file name to get the destination full path.
Next make sure you set up your database and Excel file connections. In your Excel file connection, after setting it up, go to Expressions in the Properties window and set the "Connection String" property to your SourceFullPath variable. This will tell the connection to change the file path at every iteration of your loop.
Now you just need to set up your loop. Add the foreach loop container, setting a directory and filter, and choose "File name and extension".
Now, in the expressions box on the Collection page, set the Directory property to your source directory variable.
The last part of the foreach loop is to set your variable mappings to store the file name in your variable, so go to that tab, choose your file name variable, and set the index to 0.
At this point you can add your data flow and set up your import just like you would with a normal file (note that the default value for your file name variable should point to an actual file with the structure you want to import).
After your data flow, drop in your Execute SQL Task and set it up how you need. With direct input, an easy way to reference a parameter is simply a question mark (?).
Next, in your SQL task, set up your parameter mapping by adding the details you need, i.e. mapping the variable that holds the full path to that parameter.
Now you are on to your file task. Drop in your File System Task and set it up as you desire, choosing your destination and source full path variables to tell the task which file to move.
That's it, you are done. There is one more thing to note, though. The way you have your precedence set in the image you posted, you go from your data flow to your SQL task and to your file task simultaneously. If your stored procedure relies on the file, you may want to put the file task after your SQL task. You can always change the constraint option to "Completion" if you want to move the file even if your stored proc fails.
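For reference, the direct-input SQL in the Execute SQL Task can be as simple as EXEC dbo.usp_ProcessFile ? (the procedure name is made up here), with the full-path variable mapped to parameter 0 on the Parameter Mapping page. If you would rather call the procedure from a Script Task instead, a minimal C# sketch, assuming hypothetical User::SourceFullPath and User::DbConnectionString variables:

public void Main()
{
    // Full path of the file that was just loaded, and a SQL Server connection string
    // (both variable names are assumptions).
    string fullPath = Dts.Variables["User::SourceFullPath"].Value.ToString();
    string connectionString = Dts.Variables["User::DbConnectionString"].Value.ToString();

    using (System.Data.SqlClient.SqlConnection connection = new System.Data.SqlClient.SqlConnection(connectionString))
    using (System.Data.SqlClient.SqlCommand command = new System.Data.SqlClient.SqlCommand("dbo.usp_ProcessFile", connection))
    {
        command.CommandType = System.Data.CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@FilePath", fullPath);   // parameter name is an assumption
        connection.Open();
        command.ExecuteNonQuery();
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}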
What you want to do is create a variable in your package; call it something like Filename. In the Edit window of the Foreach loop you can configure that variable to be set (on the Variable Mappings page, set the index to 0).
To create a variable, you will need to have the Variables window showing. Use the View menu to show it if it's not currently open.
Then, when calling your stored procedure, you can pass the current value of the variable as a parameter.
This link might help: https://www.simple-talk.com/sql/ssis/ssis-basics-introducing-the-foreach-loop-container/

SSIS 2005 - How to check to see if a file does not exist

How do you check to see if a file does not exist in SQL Server Integration Services 2005?
Is there a native SSIS component which will just do this for you?
I have checked for file existence using a Script Task and then branched accordingly.
You can do something like:
If System.IO.File.Exists("\\Server\Share\Folder\File.Ext") Then
    Dts.TaskResult = Dts.Results.Success
Else
    Dts.TaskResult = Dts.Results.Failure
End If
Although there are no native components for this, there are several third-party components for SSIS that you can use for this purpose.
The File System Task in SSIS is basically for move, copy, delete, etc., but does not support file existence checks.
@Raj More gave a good solution. Another way that I have used before is to create a Foreach Loop Container that loops over the file system for a file spec. If you know the name of the file you want, then you can set the name in a variable and set the spec equal to the variable in the Expressions tab of the Foreach Loop Container. You could also just specify a directory or a partial file name if you don't know the exact name but know the naming convention or know there will be no other files in the folder.
If you want to take a specific action based on whether or not there is a file, then you could create a variable with a default value of 0 and create a Script Task in the Foreach Loop Container that increments the variable. You could also just put the commands you want to execute in the Foreach Loop Container if you want them to run for each individual file that exists. If you want to take an action based on the absence of the file, then you could restrict your precedence constraint after the Foreach Loop Container so that it is evaluated on constraint and expression, and make the expression check whether the counter variable is > 0.
@Raj's solution could also be used to increment the variable. Instead of using an If/Else to raise a success or failure result, you could do this:
C#
if (System.IO.File.Exists(@"\\Server\Share\Folder\File.Ext"))
{
    // Cast is needed because Value is typed as object.
    Dts.Variables["my_case_sensitive_variable_name"].Value = (int)Dts.Variables["my_case_sensitive_variable_name"].Value + 1;
}
VB.NET
If System.IO.File.Exists("\\Server\Share\Folder\File.Ext") Then
    ' CInt because Value is returned as Object.
    Dts.Variables("my_case_sensitive_variable_name").Value = CInt(Dts.Variables("my_case_sensitive_variable_name").Value) + 1
End If
The advantage of this approach is that the package does not need to fail in the absence of a file. If the file name changes, you could also put it in a variable, defined either in the package or just within the script task. The only shortcoming of @Raj's approach is that you have to know the name of the file you want to check.
Another possibility is to execute a File System Task to rename the file to its existing name or copy the file to its existing location. If the file doesn't exist, then you can route the error to an action. I don't recommend this solution, but I remember using it years ago in one instance where it actually made sense. But in that particular instance, I was actually copying it to a real location.
Good luck!