Create a txt file with SSIS showing File Movements - ssis

I need to create a .txt file every time an SSIS job runs, and in that file I need to put the names of the files that I am transmitting. I have a Foreach container, but I am not sure how to create the file and then write to it as each file is being moved. Can someone please steer me in the right direction? Thanks in advance.

You can follow the steps below:
Create an empty dummy text file.
Copy the dummy file to a new name, e.g. dummyfile_10252016 (using a File System Task; drive the name through a variable to make it dynamic).
Create a variable to hold the dummy file path (e.g. User::dummyFilePath).
Process your files in the Foreach Loop and store each file name in a variable.
Once each file is processed, write its name to dummyfile_10252016.
Script task:
string filename = Dts.Variables["User::filename"].Value.ToString();
string path = Dts.Variables["User::dummyFilePath"].Value.ToString();
// Append the file name on its own line so the log stays readable
System.IO.File.AppendAllText(path, filename + Environment.NewLine);
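If you would rather skip the dummy-file copy, the Script Task can also build the dated log file directly. A minimal sketch, assuming a hypothetical User::logFolder variable that holds the target folder (the FileMovements_ name is likewise just illustrative):
string folder = Dts.Variables["User::logFolder"].Value.ToString();
// Build a date-stamped log file name, e.g. FileMovements_10252016.txt
string logPath = System.IO.Path.Combine(folder, "FileMovements_" + DateTime.Now.ToString("MMddyyyy") + ".txt");
// AppendAllText creates the file on first use, so no separate dummy file is needed
string filename = Dts.Variables["User::filename"].Value.ToString();
System.IO.File.AppendAllText(logPath, filename + Environment.NewLine);
// Optionally store the path so later tasks can reference the same log file
Dts.Variables["User::dummyFilePath"].Value = logPath;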

Related

Copy file found in Script task to folder SSIS

I am new to coding and know a bit about SSIS. I am usually able to help myself, but I am stuck: I want to copy the latest .gz file from a folder to my own folder daily with an SSIS Script Task. I was able to identify the latest file with the script, but I am stuck at the point where that file needs to be copied.
Here is my script:
public void Main()
{
    // Find the most recently modified DailyReport-*.gz file in the source folder
    var directory = new System.IO.DirectoryInfo(Dts.Variables["User::VarFolderPath"].Value.ToString());
    System.IO.FileInfo[] files = directory.GetFiles("DailyReport-*.gz");
    DateTime lastModified = DateTime.MinValue;
    foreach (System.IO.FileInfo file in files)
    {
        if (file.LastWriteTime > lastModified)
        {
            lastModified = file.LastWriteTime;
            Dts.Variables["User::VarFileName"].Value = file.ToString();
        }
    }
    // MessageBox.Show(Dts.Variables["User::VarFileName"].Value.ToString());
}
This correctly identifies the file (I run it within a Foreach Loop container), but when I add a File System Task, it does not copy the right file.
I have variables set up for the copy:
User::DestinationFolder as the Destination Connection
User::SourceFolder as the Source Connection
with Copy File as the operation.
In my User::SourceFolder variable the value is the exact file name I originally tested with, but the name changed the next time the source folder was updated, and now the copy task keeps copying the same old file.
The file has basically the same name every time it updates, except for the timestamp at the end, for example
DailyReport-20180725050247.gz at 6:00 in the morning
and then
DailyReport-20180725080801.gz at 8:00 the same morning
I want to copy the second one, or whichever one updates later in the day.
My variables:
SourceFolder - X:\FTP\In\DailyReport-20180725050247.gz
DestinationFolder - X:\MIS_AUTO\Lizl\Test
VarFolderPath - X:\FTP\In
VarFileName -
Hope this makes sense.
I tried to reproduce your issue with a Script Task and a File System Task.
The only thing that needs to change here is what is actually being passed through User::VarFileName. For a File System Task, if you choose a variable for the Source Connection (IsSourcePathVariable = True), then you need to pass the full file path.
So, in the Script Task you would need this:
Dts.Variables["User::VarFileName"].Value = file.FullName.ToString();

SSIS - How to loop through files in folder and get path+file names and finally execute stored Procedure with parameter as Path + Filename

Any help is much appreciated. I am trying to create an SSIS package to loop through the files in a folder, get the path + file name, and finally execute a stored proc with the path + file name as a parameter. I am not sure how to get the path + file name and pass it into the stored proc as a parameter. I have attached a screenshot for your reference:
Looks like you have the right idea in general, and the link @Speedbirt186 provided has some good details, but it sounds like there are a couple of nuances worth pointing out regarding flow and variables.
The Foreach Loop can assign the entire path, the file name, or the file name & extension to a variable. The last option will be the most helpful in your case if you don't want to add a Script Task to split the file name from the path. If you start by adding 5 variables to your project it will make things a little easier: one for the Source Directory path, one for the Destination (Archive) Directory path, one to hold the File Name and Extension assigned by the Foreach Loop, and then 2 additional expression-based variables that simply combine the source directory and file name to get the source full path, and the destination directory and file name to get the destination full path.
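For example, the two expression-based variables could use expressions along these lines (the variable names here are just placeholders):
SourceFullPath:      @[User::SourceDirectory] + "\\" + @[User::FileNameExt]
DestinationFullPath: @[User::DestinationDirectory] + "\\" + @[User::FileNameExt]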
Next, make sure you set up your database and Excel file connections. In your Excel file connection, after setting it up, go to Expressions in the Properties window and set the ConnectionString property to the SourceFullPath variable. This tells the connection to change the file path at every iteration of your loop.
Now you just need to set up your loop. Add the Foreach Loop container, set a directory and file filter, and choose File Name and Extension.
Then, in the expressions box on the Collection page, set the Directory property to your Source Directory variable.
The last part of the Foreach Loop is to set the variable mappings to store the file name in your variable: go to that tab, choose your file name variable, and set the index to 0.
At this point you can add your data flow and set up your import just like you would with a normal file (note that the default value of your file name variable should point to an actual file with the structure you want to import).
After your data flow, drop in your Execute SQL Task and set it up as you need. For direct input, an easy way to reference a parameter is simply a question mark (?), as in the example below.
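For instance, the direct-input SQLStatement might look like the following; the procedure name here is just a placeholder:
EXEC dbo.usp_ImportFile ?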
Next, in your SQL task, set up the parameter mapping by adding the details you need, for example mapping your source full path variable as an Input parameter (for an OLE DB connection the Parameter Name is 0).
Now you are on to your file task. Drop in your File System Task and set it up as you like, but choose your destination and source full path variables to tell the task which file to move.
That's it, you are done. There is one more thing to note, though. The way you have your precedence constraints set in the image you posted, you go from your data flow to your SQL task and to your file task simultaneously. If your stored procedure relies on the file, you may want to put the file task after your SQL task. You can always change the constraint option to "Completion" if you want to move the file even when your stored proc fails.
What you want to do is create a variable in your package; call it something like Filename. In the Edit window of the Foreach Loop you can configure that variable to be set (on the Variable Mappings page, set the index to 0).
To create a variable, you will need to have the Variables window showing. Use the View menu to show it if it's not currently open.
Then when calling your stored procedure you can pass the then current value of the variable as a parameter.
This link might help: https://www.simple-talk.com/sql/ssis/ssis-basics-introducing-the-foreach-loop-container/

SSIS - Load flat files, save file names to SQL Table

I have a complex task that I need to complete. It worked well before, since there was only one file, but this is now changing. Each file has one long row that is first bulk inserted into a staging table. From there I'm supposed to save the file name into another table and then insert the broken-up parts of the staging table data. This is not the problem. We might have just one file or multiple files to load at once. What needs to happen is this:
The first SSIS task is a script task that does some checks. The second task prepares the file list.
The staging table is truncated.
The third task is currently a Foreach loop container task that uses the files from the file list and processes it:
File is loaded into table using Bulk Insert task.
The file name needs to be passed as a variable to the next process. This was done with a C# task before but it is now a bit more complex since there could be more than one file and each file name needs to be saved separately.
The last task is a SQL task that executes a stored procedure with the file name as input variable.
My problem is that before there was only one file, which was easy enough to handle. What would be the best way to go about it now?
In the Data Flow Task which imports your file, create a derived column. Populate it with the file name value (from your file name variable). Load the file name into the same table.
Use an Execute SQL Task to retrieve the distinct list of file names into a recordset (an Object-type variable); see the example query after these steps.
Use a Foreach Loop container to loop through the recordset. Place your code inside the container. The code will receive the file name from the loop as the value of a variable and process the file.
Use an Execute SQL Task in the Foreach Loop container to call the SP. Pass the file name as a parameter like:
Exec sp_MyCode param1, param2, ?
where ? will pass the file name as a string INPUT parameter.
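A minimal example of the query behind that recordset, assuming the staging table and its file name column are called dbo.StagingTable and FileName (both placeholders):
SELECT DISTINCT FileName FROM dbo.StagingTable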
EDIT
To make the Flat File Connection pick up the file specified by a variable, use the ConnectionString property of the Flat File Connection:
Select the Flat File Connection, right-click and select Properties.
Click the empty field next to Expressions and then click the ellipsis that appears. With Expressions you can define every listed property of the object using variables; many objects in SSIS can have Expressions specified.
Add an Expression, select the ConnectionString property, and define an expression with the absolute path to the file (just to be on the safe side; it can be a UNC path too), for example:
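A possible expression, assuming a folder variable plus the file name variable populated by the loop (the names are placeholders):
@[User::SourceFolder] + "\\" + @[User::FileName]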
All of the above can also be accomplished with C# code in the Script Task itself. You can loop through all the files one by one and, for each file:
1. Bulk copy the data to the staging table
2. Insert the file name into the other table
You can modify the logic as per your requirements and desired execution flow; a rough sketch follows.
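A minimal sketch of that all-in-one Script Task approach, assuming a single-column staging table dbo.Staging(RawLine), a log table dbo.FileLog(FileName), a User::SourceFolder variable and the connection string shown (all of these are placeholders to replace with your own):
public void Main()
{
    string folder = Dts.Variables["User::SourceFolder"].Value.ToString();
    string connStr = @"Data Source=.;Initial Catalog=MyDb;Integrated Security=SSPI;";

    using (var conn = new System.Data.SqlClient.SqlConnection(connStr))
    {
        conn.Open();
        // Adjust the file mask to whatever your flat files are named
        foreach (string path in System.IO.Directory.GetFiles(folder, "*.txt"))
        {
            // 1. Bulk copy the file contents into the staging table
            var table = new System.Data.DataTable();
            table.Columns.Add("RawLine", typeof(string));
            foreach (string line in System.IO.File.ReadAllLines(path))
                table.Rows.Add(line);

            using (var bulk = new System.Data.SqlClient.SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "dbo.Staging";
                bulk.WriteToServer(table);
            }

            // 2. Insert the file name into the other table
            using (var cmd = new System.Data.SqlClient.SqlCommand(
                "INSERT INTO dbo.FileLog (FileName) VALUES (@f)", conn))
            {
                cmd.Parameters.AddWithValue("@f", System.IO.Path.GetFileName(path));
                cmd.ExecuteNonQuery();
            }
        }
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}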
Add a column to your staging table: FileName.
Capture the file name in an SSIS variable (using expressions), then run something like this on each loop iteration:
UPDATE StagingTable SET FileName=? WHERE FileName IS NULL
Why are you messing about with C#? From your description it's totally unnecessary.

SSIS Package - Looping through folder to check if files exist

Can anyone help:
Required: an SSIS package to loop through a folder (containing 100 files) and check whether the required files (5 or 6 of them) are present in that folder.
Does anyone already have code for this, where we check for the existence of multiple files in the destination folder?
Regards
Add a Foreach Loop container to your Control Flow.
Double click it and select Collection. For the Enumerator, select Foreach File Enumerator.
Select your folder and the type of file.
Select the return type when a file is found. The options are the whole file name including extension and path, the name and extension, or simply the name of the file found.
Select the checkbox if you want to include subfolders.
Click on the Variable Mappings option on the left and then create a new variable or select an existing one.
At this point you have each file name in the folder. To prove it, add a Script Task, double click it, add your variable to ReadOnlyVariables, and click Edit Script. Make your Main look like this:
public void Main()
{
    // Show the file name captured by the Foreach Loop for this iteration
    System.Windows.Forms.MessageBox.Show(Dts.Variables["FileName"].Value.ToString());
    Dts.TaskResult = (int)ScriptResults.Success;
}
Now, the comparison you can do in several ways. I don't know where you keep the "required files" list, but assuming it is in a database, you can add a Data Flow Task and, inside of it, send the file name to the DB to do the comparison.
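Alternatively, if the list of required files is known up front, a Script Task could do the whole check directly. A minimal sketch, where the file names and the User::SourceFolder and User::AllFilesPresent variables are only assumptions for illustration:
public void Main()
{
    string folder = Dts.Variables["User::SourceFolder"].Value.ToString();
    // Hypothetical list of the required files
    string[] required = { "FileA.csv", "FileB.csv", "FileC.csv" };

    var missing = new System.Collections.Generic.List<string>();
    foreach (string name in required)
    {
        if (!System.IO.File.Exists(System.IO.Path.Combine(folder, name)))
            missing.Add(name);
    }

    // Flag the result in a boolean package variable so precedence constraints can use it
    Dts.Variables["User::AllFilesPresent"].Value = (missing.Count == 0);
    Dts.TaskResult = (int)ScriptResults.Success;
}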

SSIS: Moving .csv files from share folder to OLE DB destination on daily basis and sending email notification

I am new to SSIS and I'm working against a critical deadline; it would be great if someone could help me with this.
I have a share folder at server location //share/source/files where one file gets loaded on a daily basis and one more file gets loaded on a monthly basis; both files have the same extension, .csv.
Can anyone help me move the files, say A.csv and B.csv, to their corresponding tables? More importantly, the file name on day 1 will be A 2011-09-10.csv and on day 2 the source file will be A 2011-09-11.csv. This file has to be loaded into table A, and file B.csv has to be loaded into its corresponding destination table B. After loading, the files have to be moved to an archive folder, and we also need to notify users that table A was loaded successfully with 1000 rows, and similarly that the table B load was successful, along with the date and time.
Note: source files are automatically dropped in the folder every day at 5 AM.
First, create a variable that will hold the file path name.
Second, create a Script Task that checks whether the file is available.
The script task will be as follows:
public void Main()
{
    // Build today's expected file name, e.g. "A 2011-09-10.csv"
    string FileName = String.Format("A {0}.csv", DateTime.Now.ToString("yyyy-MM-dd"));
    if (System.IO.File.Exists(@"\\Shared\Path\" + FileName))
    {
        Dts.Variables["FileName"].Value = FileName;
        Dts.TaskResult = (int)ScriptResults.Success;
    }
    else
    {
        Dts.TaskResult = (int)ScriptResults.Failure;
    }
}
After the script task, create a Data Flow Task.
Create a connection in the Connection Manager at the bottom of the page which points to the flat file. In the Flat File Connection Manager Editor popup, set the file name to a file you'd like to upload (this will be updated dynamically, so its actual value isn't relevant). In the Properties of the new connection, open the Expressions popup, select the ConnectionString property, and set the expression to combine the path and the FileName variable: "\\Shared\\Path\\" + @[User::FileName].
Create a Flat File Source, and use the connection we just created as the Connection for the flat file.
Create a destination data flow item, and point it to the database you'd like to insert data into.
From here, create a SQL Server job that runs at the time you'd like it to run. This job should execute the package you have just created.