How to create a file with an incremental sequence number dynamically in SSIS?

I am trying to create a file dynamically with a name like abc_id(sequencenumber)_t(timestamp). I have already sorted out generating the timestamp along with its prefix, but I don't know how to generate a sequence number dynamically while exporting the data from the source table to a .DAT file.
I tried incrementing a variable in a task, but I didn't get the correct output.
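One approach that often works for this kind of naming (a minimal sketch only; the control table dbo.FileSequence and the variable name User::SeqNo are illustrative, not anything that already exists in your package) is to keep the counter in a small SQL Server table, increment it with an Execute SQL Task at the start of the package, capture the new value in an SSIS variable, and then build the flat file connection string from that variable together with the timestamp expression you already have:

-- One-time setup: a control table that remembers the last sequence number used.
CREATE TABLE dbo.FileSequence (LastSeq INT NOT NULL);
INSERT INTO dbo.FileSequence (LastSeq) VALUES (0);

-- Run this in an Execute SQL Task with a single-row result set and map the
-- SeqNo column to an SSIS variable such as User::SeqNo; that variable can then
-- be referenced in the expression that builds the .DAT file name.
UPDATE dbo.FileSequence
SET    LastSeq = LastSeq + 1
OUTPUT inserted.LastSeq AS SeqNo;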

Related

Dynamically adding a derived column in SSIS

I have a scenario where my source can be on different versions of our database; as a result, the source file could have a different number of columns, while my destination has a defined number of columns.
What we are trying to do now is: load data from the source into flat files, move them to a central server, and then load that data into the central database. But if any column is missing in the flat file, I need to add a derived column.
What is the best way to do this? How can I dynamically add derived columns?
You can either do this with BiMLScript as others have suggested in the comments, or you can write a Script Task that reads the file, analyzes the contents, and imports it. Yet another option would be to bulk import the file as-is into a staging table (which would have to be dropped and re-created every time) and write a stored procedure that analyzes the DDL and contents and imports the data into the destination table.
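To give a feel for the staging-table route, here is a rough sketch (the names dbo.Staging, dbo.Destination and NewColumn are placeholders, not your schema): after bulk importing the file into the staging table, the stored procedure can check INFORMATION_SCHEMA.COLUMNS to see which columns actually arrived and supply a default for any that are missing:

IF NOT EXISTS (SELECT 1
               FROM   INFORMATION_SCHEMA.COLUMNS
               WHERE  TABLE_SCHEMA = 'dbo'
                 AND  TABLE_NAME   = 'Staging'
                 AND  COLUMN_NAME  = 'NewColumn')
BEGIN
    -- Older source version: the column never made it into the file, so default it.
    INSERT INTO dbo.Destination (Id, Name, NewColumn)
    SELECT Id, Name, NULL
    FROM   dbo.Staging;
END
ELSE
BEGIN
    -- Newer source version: the column exists in the staging table.
    -- Dynamic SQL so this branch still compiles when the column is absent.
    EXEC sys.sp_executesql
        N'INSERT INTO dbo.Destination (Id, Name, NewColumn)
          SELECT Id, Name, NewColumn FROM dbo.Staging;';
END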

Load and replace file path string with the content from that file in a MySQL database

I have a database of entries consisting of a 'name', 'id' and a 'description', but currently the 'description' field is set to the file path of a .txt file that actually contains the description content. Each .txt file's name is the row's 'id' plus the .txt extension, and they all reside in the same directory.
Can I load and replace each 'description' field with the content from the relevant text file (using MySQL)?
You can't write a MySQL query directly that will read the description values from your file system. That would require the MySQL server to be able to read raw text from files in your file system. You Can't Do That™.
You certainly can write a program in your favorite host language (PHP, Java, Perl, you name it) to read the rows from your database and update your description rows.
You could maybe contrive to issue a LOAD DATA INFILE command to read each text file. But the text files would have to be very carefully formatted to resemble CSV or TSV files.
Purely using MySQL, this would be a difficult, if not impossible, exercise because MySQL does not really offer any means to open files.
The only way to open an external text file from MySQL is to use the LOAD DATA INFILE command, which imports the text file into a MySQL table. What you can do is write a stored procedure in MySQL that:
1. Creates a temporary table with a description field large enough to accommodate all descriptions.
2. Reads all id and description field contents into a cursor from your base table.
3. Loops through the cursor and uses LOAD DATA INFILE to load the given text file's data into your temporary table (see the sketch after these steps). This is where things can go wrong. The account under which the MySQL daemon / service runs needs to have access to the directories and files where the description files are stored. You also need to be able to parametrise the LOAD DATA INFILE command to read the full contents of the text file into a single field, so you need to set the field and line terminator parameters to values that cannot be found in any of the description files. But even for this you need a native UDF (user-defined function) that can execute command-line programs, because running LOAD DATA INFILE directly from stored procedures is not allowed. See Using LOAD DATA INFILE with Stored Procedure Workaround-MySQL for a full description of how to do this.
4. Issues an update statement using the id from the cursor to update the description field in your base table from the temporary table.
5. Deletes the record from the temp table.
6. Goes back to step 3.
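To make step 3 more concrete, here is a rough illustration of the statement involved (the path, table names and terminator strings are placeholders; as noted above, the file path cannot be parametrised inside a stored procedure, which is why the command-line / UDF workaround is needed to build and run it):

-- Pull the whole text file into a single row and field of the temporary table
-- by choosing terminator strings that never occur in the description files.
LOAD DATA INFILE '/path/to/descriptions/123.txt'
INTO TABLE tmp_description
FIELDS TERMINATED BY '~~field~~'
LINES TERMINATED BY '~~line~~'
(description);

-- Copy the loaded text back into the base table for the matching id.
UPDATE base_table b, tmp_description t
SET    b.description = t.description
WHERE  b.id = 123;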
It may be a lot easier to achieve this from an external programming language that has better file-manipulation functions and can update each record in your base table accordingly.

Add a parameter value to a table at import in SSIS

I am importing an Excel table from an FTP site to SQL using SSIS. The destination table is going to be used to calculate good and bad records based on another SQL database. Here is my problem: the Excel file is named RTW_032613_ABC_123.xls. This file name is a concatenation of a number of fields. I cannot recreate it based on the fields in the table, so I need to retain it and pass it to the new table in SQL. I have a parameter #FileName that I am using to loop through the files in the FTP folder. What I would like to do is either combine the import of data from the Excel file with the file name, or insert the file name into each record after the import. I am calling the SSIS procedure from another stored procedure in SQL. I tried adding a SQL data flow task, but I am not seeing where I can add the insert statement on either the SQL Server Compact Destination or the SQL Server Destination.
I am in over my head with SSIS on this one. The key is that the parameter I need is available in SSIS, but I really need to get it passed on to my SQL table.
TIA
If I'm reading your question right, you have an SSIS package with a variable containing the filename and you want to save the filename with each row that you are sending to your SQL table? If so:
Add a Derived Column transformation to the data flow, creating a new column whose expression references the variable (for example, if the variable is named FileName, the expression is simply @[User::FileName]).
Include that new column in the mapping for the destination of your data flow, sending the filename to whichever column you would like to save that data in.
No need for a separate SQL task.

How to use the loop and bulk load tasks to insert the name of the CSV files being looped?

Description
I have created an SSIS package that imports data from hundreds of CSV files on a daily basis.
I have used the bulk load task and a Foreach Loop container.
Problem
I have created a column on a database table and want to know if it is possible to add the source file name to each row of data.
If you have the filename in a variable (which you could do in the Foreach Loop), then you just use the variable as the data source for the column. Or there may be a system variable that contains the file name; poke around a bit in the system variables available to you and see.

Possible to open a text file in a MySQL stored procedure?

Is it possible to open and read from a text file in a MySQL stored procedure? I have a text file with a list of about 50k telephone numbers, and want to write a stored procedure that will open the file, read the 50k lines and store them as rows in a table. I cannot load the file directly using LOAD DATA INFILE as the table has additional columns that I have to set.
Thanks!
In the end I used LOAD DATA INFILE. Apparently you can set which columns get populated by using the SET clause:
LOAD DATA LOCAL INFILE '/temp/input_file.txt'
IGNORE INTO TABLE TEST.TEST_INSERT
(INSERT_FIELD)
SET VERSION = 1, LIST_ID = ID_GEN(), CREATE_DATE = NOW(), CREATE_BY = 'TESTUSER';
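Here the column list (INSERT_FIELD) receives the values read from the file, while the SET clause fills in the remaining columns with fixed or computed values for every row that is loaded.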