I am producing two files from an SSIS package.
One is the main content and the other is the header.
After I have output both files - I am then merging them using an Execute Process Task.
So I have a content.txt and a header.txt.
/C copy /B \filepath\header.txt + \filepath\content.txt \filepath\result.txt
What I want to do at this stage is append the date to result.txt so it becomes result_09102019.txt.
How do I achieve that within the snippet of code I have above?
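For illustration, here is a minimal sketch of the merge with a dated result name, written in a POSIX shell rather than cmd (the file contents below are stand-ins for the real SSIS outputs):

```shell
# Stand-ins for the two files the SSIS package writes out.
printf 'HEADER\n' > header.txt
printf 'line1\n'  > content.txt

# Build the dated name (ddMMyyyy, e.g. result_09102019.txt) and merge.
datestamp=$(date +%d%m%Y)
result="result_${datestamp}.txt"
cat header.txt content.txt > "$result"
```

In the Execute Process Task itself, the same effect can be had by building the Arguments property from an SSIS expression that concatenates the date parts into the result file name.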
I am no longer using an Execute Process Task to build the filename.
Instead I use an Execute SQL Task to simply write the result set to a variable, which I then point at my Flat File Connection.
select 'filename_' + format(getdate(), 'yyyyMMddHHmm') + '.csv'
The single row result set gets written to a variable called OutputFileName.
I also have an OutputFolder variable, and I combine OutputFolder and OutputFileName into another variable called OutputPath.
OutputPath is then applied to the file connection via an Expression.
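For illustration, the same timestamped name the SQL above produces (yyyyMMddHHmm) can be sketched in shell; the OutputFolder value here is a stand-in:

```shell
# Equivalent of: select 'filename_' + format(getdate(), 'yyyyMMddHHmm') + '.csv'
OutputFileName="filename_$(date +%Y%m%d%H%M).csv"
OutputFolder="/tmp/out/"                       # stand-in for the SSIS variable
OutputPath="${OutputFolder}${OutputFileName}"  # what the Expression combines
echo "$OutputPath"
```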
Suggestion: explore the use of variables in building up file names.
Here is the use case:
I have different CSV files in my data lake and want to copy them to my Azure SQL DB. A typical file name looks like this: Sale-Internet-Header.csv.
In the sink dataset of the Azure SQL DB I used the expression #replace(item().name, '-', '_').
After executing the copy pipeline, the SQL table has the following name: dbo.sales_internet_header.csv
I would like to change my expression in the sink dataset to remove the ".csv" so that the SQL table name becomes: dbo.sales_internet_header
Any suggestions?
Many thanks
You can use replace() with dynamic content.
#replace(variables('cc'), '.csv', '') removes the ".csv".
Use this dynamic content in the SQL dataset table name, or inside the ForEach as above. Here is a sample demonstration where I have used a Set Variable activity to replace the .csv.
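For illustration, this is what the two transformations (replace '-' with '_', then strip '.csv') do, sketched in shell:

```shell
# Same string logic as #replace(item().name,'-','_') followed by
# #replace(..., '.csv', '') in the ADF expression language.
name='Sale-Internet-Header.csv'
table=$(printf '%s' "$name" | sed -e 's/-/_/g' -e 's/\.csv$//')
printf '%s\n' "$table"
```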
I am loading data from a CSV Data Set Config. What I would like to do is write the value of a variable back into that same file, on the same row, but in the next column.
I check the "url" response for some specific JSON, extract values into the variables "status" and "action", and want to add them to the row.
Is it even possible to write back to the source CSV file? Maybe with some post-processor script? Searching here is like looking for a needle in a haystack sometimes.
It is possible, but I wouldn't recommend it: if you implement this post-processor logic and run your test with more than one user, you will most probably run into a race condition where several threads write to the same file concurrently.
Alternatives are:
Adding your status and action variables' values to JMeter's .jtl result file: just declare the following Sample Variables:
sample_variables=url,status,action
in the user.properties file, and the next time you run JMeter in command-line non-GUI mode you will see 3 extra columns in the .jtl results file holding the values of these 3 JMeter Variables.
If you want a separate file: first execute step 1, then add a Flexible File Writer to your Test Plan and configure it to write the variables into a file. The relevant configuration would be something like:
variable#0|,|variable#1|,|variable#2|\r\n
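For illustration, once the run finishes the extra .jtl columns can be pulled out with standard tools; the column positions below are an assumption and depend on your jmeter.save.saveservice settings:

```shell
# A stand-in .jtl with the three Sample Variables appended as columns.
printf 'ts,elapsed,label,url,status,action\n'  >  results.jtl
printf '1,10,HTTP Request,http://x,OK,retry\n' >> results.jtl

# Extract just the url/status/action columns.
cut -d, -f4-6 results.jtl
```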
I have a package where I have an input file that has a header line
TI,2
and detail line(s) that look like this
YP,302,,0000000000000061.00,20170714,CHK #9999,R04,9999
I have to do some processing on the detail lines. The file name is in a variable called User::FileName
In my Data Flow I have a conditional split where I shoot the Header Record to a path where I create a file with just the header record (it doesn't change).
I process all of the detail records. I have to go into SQL to do this and write out the results into a comma delimited flat file with the same name as the input file (using the variable).
So now I have a header file with a fixed name and a detail file with its name in a variable. I need to combine these. I am trying to create a .BAT that says
copy /y /d /b header.txt + User::FileName User::FileName (with the proper values substituted for the variable) and then execute it with an Execute Process Task.
I am doing this with a Data Flow Task. The source is a flat file (copy.bat) that contains 2 columns. Column 0 has copy /y /d /b. I have a derived column called Rest_of_Copy that contains header.txt + User::FileName + " " + User::FileName.
On the output file destination I also have 2 columns. I map Column 0 (the copy /y /d /b) to Column 0 on the output file, and the derived column Rest_of_Copy (which should contain the result of header.txt + User::FileName + " " + User::FileName) to the second. The Connection Manager for the destination file is Copyout.bat.
When I run the package Copyout.bat is empty.
I can't figure out why. Can anybody help?
I found a different way to do it. I wrote out the 2 files to fixed names, so the COPY command was fixed, and then renamed the output file to the SSIS variable using a File System task.
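For illustration, the workaround described above (fixed names for the copy, then a rename to the variable's value) looks like this in shell terms; FileName here is a stand-in for User::FileName:

```shell
# Fixed-name inputs, matching the header/detail files from the package.
printf 'TI,2\n' > header.txt
printf 'YP,302,,0000000000000061.00,20170714,CHK #9999,R04,9999\n' > detail.txt

FileName='out_20170714.txt'          # stand-in for the SSIS User::FileName value

# Fixed COPY step, then the File System task's rename step.
cat header.txt detail.txt > combined.txt
mv combined.txt "$FileName"
```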
Thanks,
Dick
I have multiple CSV files in a directory. They may have different column combinations, but I would like to COPY them all with a single command, as there are a lot of them and they all go into the same table. But the FDelimitedParser only evaluates the header row of the first file, then rejects all rows that do not fit, i.e. all rows from most of the other files. I've been using FDelimitedParser, but anything else is fine.
1 - Is this expected behavior, and if so, why?
2 - I want it to evaluate the headers for each file; is there a way?
Thanks
(Vertica 7.2)
Looks like you need a flex table for that; see http://vertica-howto.info/2014/07/how-to-load-csv-files-into-flex-tables/
Here's a small workaround that I use when I need to load a bunch of files in at once. This assumes all your files have the same column order.
Download and run Cygwin
Navigate to folder with csv files
cd your_folder_name_with_csv_files
Combine all csv files into a new file
cat *.csv >> new_file_name.csv
Run a COPY statement in Vertica from the new file. If the file headers are an issue, you can follow the instructions on this link and run them through Cygwin to remove the first line from every file.
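If the headers are the problem, a variation on the cat step keeps one header and drops the rest; a sketch, assuming all files share the same column order:

```shell
# Two stand-in CSV files with identical headers.
printf 'a,b\n1,2\n' > f1.csv
printf 'a,b\n3,4\n' > f2.csv

head -n1 f1.csv > new_file_name.csv            # keep a single header row
for f in f1.csv f2.csv; do
    tail -n +2 "$f" >> new_file_name.csv       # append each file minus its header
done
```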
In my Foreach Loop Container, I would like to delete the file currently being processed.
I tried as follows, but no file is deleted at the end. Any idea?
Here are the properties of my loop; the currently processed file comes from the FileNameSimu variable.
I would like to delete the current file.
Make sure that the value in the variable User::FileNameSimu contains the full file path, like C:\Folder1\SubFolder2\File.txt, and not just the file name File.txt.
Please note the description of the SourceVariable property on the File System Task: it expects a path.
On the Variables window, select the variable FilePath and press F4 to view its properties. Change the property EvaluateAsExpression to True and set the property Expression to @[User::Directory] + @[User::FileName], assuming that your variable Directory contains the folder path and the variable FileName contains the file name. Make sure that the variable Directory ends with a backslash, like C:\temp\ and not C:\temp. If it doesn't have a trailing backslash, change your expression to @[User::Directory] + "\\" + @[User::FileName]
Or include the backslash in the expression itself.
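For illustration, the join-with-or-without-trailing-backslash logic of that expression can be sketched in shell:

```shell
Directory='C:\temp'        # no trailing backslash, as in the example above
FileName='File.txt'

# Append the separator only when the folder doesn't already end with one.
case "$Directory" in
    *\\) FilePath="${Directory}${FileName}" ;;
    *)   FilePath="${Directory}\\${FileName}" ;;
esac
printf '%s\n' "$FilePath"
```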