I am importing an Excel table from an FTP site into SQL Server using SSIS. The destination table is going to be used to calculate good and bad records based on another SQL database. Here is my problem: the Excel file is named RTW_032613_ABC_123.xls. This file name is a concatenation of a number of fields, and I cannot recreate it from the fields in the table, so I need to retain it and pass it to the new table in SQL. I have a parameter, #FileName, that I am using to loop through the files in the FTP folder. What I would like to do is either combine the file name with the data imported from the Excel file, or insert the file name into each record after the import. I am calling the SSIS package from another stored procedure in SQL. I tried adding a SQL data flow task, but I am not seeing where to add the insert statement on either the SQL Server Compact Destination or the SQL Server Destination.
I am in over my head with SSIS on this one. The key point is that the parameter I need is available in SSIS, but I really need to get it passed on to my SQL table.
TIA
If I'm reading your question right, you have an SSIS package with a variable containing the file name, and you want to save that file name with each row you are sending to your SQL table? If so:
Add a Derived Column transformation to the data flow, creating a new column whose expression references the variable (see the example below).
Include that new column in the mapping for the destination of your data flow, sending the filename to whichever column you would like to save that data in.
No need for a separate SQL task.
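For example, assuming the #FileName value from the question is held in an SSIS user variable named FileName (a hypothetical name), the entire Derived Column expression would simply be:

@[User::FileName]

Every row passing through the transformation then carries the file name, and the new column can be mapped in the destination like any other.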
I am creating an SSIS package which has a flat file source and a destination database.
The mappings between the columns are based on the following:
There is a table containing records that define the mappings, i.e. source column name and destination column name. Which mappings apply will be based on the name of the flat file.
The reason this has been done is so that the destination column names can be changed in the database rather than needing to recreate or edit the package.
Could you please advise how I could do this "lookup" and create the mappings dynamically?
There is no way to create mappings dynamically in a data flow. You'll need to either generate the SSIS package programmatically on the fly, or use another method (OPENROWSET, bcp, BULK INSERT...).
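If you go the dynamic-SQL route, a minimal sketch might look like the following, assuming a hypothetical mapping table dbo.ColumnMap (SourceColumn, DestColumn, FileName) and a staging table dbo.Staging that the flat file has already been bulk-loaded into:

-- Build the column lists from the mapping table, then run the INSERT dynamically
DECLARE @cols nvarchar(max), @vals nvarchar(max), @sql nvarchar(max);

SELECT @cols = COALESCE(@cols + ', ', '') + QUOTENAME(DestColumn),
       @vals = COALESCE(@vals + ', ', '') + QUOTENAME(SourceColumn)
FROM dbo.ColumnMap
WHERE FileName = 'MyFile.txt';   -- hypothetical file name

SET @sql = N'INSERT INTO dbo.Destination (' + @cols + N') '
         + N'SELECT ' + @vals + N' FROM dbo.Staging;';

EXEC sp_executesql @sql;

Renaming a destination column then only requires updating dbo.ColumnMap, not recreating or editing the package.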
I am using SSIS to import an Excel file into a table in my SQL Server 2008 database.
Currently I am able to import data into the table by using a data flow with the Excel file as the source and the data table as the destination. My current import is based on the column mapping between the source and the destination, but now I want to add an extra column to the table. This column will hold the id assigned to the Excel file the rows come from, so the value will be the same for every row belonging to the file currently being imported.
This column is not present in the source Excel sheet, and its value is held in an SSIS user variable. I want the insertion of this value to be part of the import process, but I cannot figure out how.
How can I achieve this?
The connection manager for the destination doesn't allow me to map user variables to columns...
Put a Derived Column transformation in between the Excel Source and the database destination.
Create a column there and use the SSIS User Variable as the value expression for the column.
Alternatively, add an Execute SQL Task after the Data Flow Task and update the extra column from the variable with an UPDATE statement (see the sketch below).
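A minimal sketch of that Execute SQL Task statement, assuming the extra column is called FileId and the destination table is dbo.ImportTable (both hypothetical names), with the SSIS variable bound via the task's Parameter Mapping tab (? is the OLE DB parameter marker):

-- Stamp the just-imported rows with the file id held in the SSIS variable
UPDATE dbo.ImportTable
SET FileId = ?              -- ? maps to the SSIS user variable
WHERE FileId IS NULL;       -- assumes only the newly imported rows still have a NULL FileId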
I'm new to SSIS and am writing a package that includes moving data to a table that is created in a previous Execute SQL Task object.
The issue that I'm encountering is that I am unable to create a data flow destination task that uses a dynamic destination table name.
The intended process is:
Execute SQL Task object creates new table based on today's date (i.e. Table1_20111014)
Data Flow task moves data from table "Table1" to "Table1_20111014".
The column metadata for Table1 and Table1_20111014 are the same, and does not change. However, the name of the table the data needs to be moved to will change depending on the date at time of execution.
Is it possible to dynamically specify the destination table in a destination data flow object?
If not, are there known workarounds or is using SSIS for this task a bad idea?
As long as the metadata remains the same, there is no drawback to using a dynamic destination table name.
To accomplish this, in the OLE DB Destination, instead of the "Table or view" or "Table or view - fast load" data access mode, use the equivalent "Table name or view name variable" option. This obviously assumes you have a variable defined that contains the name of the table created in the Execute SQL Task (see the sketch below).
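A minimal sketch of the Execute SQL Task step that creates the dated table, assuming the package also stores the generated name in a variable (e.g. a hypothetical User::DestTable) for the destination to use:

-- Create today's copy of Table1 with identical metadata but no rows
DECLARE @table sysname = 'Table1_' + CONVERT(char(8), GETDATE(), 112);  -- e.g. Table1_20111014
DECLARE @sql nvarchar(max) =
    N'SELECT * INTO dbo.' + QUOTENAME(@table) + N' FROM dbo.Table1 WHERE 1 = 0;';
EXEC sp_executesql @sql;

Because SELECT ... INTO with WHERE 1 = 0 copies the column definitions without copying any data, the data flow's metadata stays valid for every dated table.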
I want to develop an automation in SSIS.
Problem statement:
I have an excel sheet which has a single column.
Based on the values in that column (each value will be used as a search parameter in the SQL query), I need to fetch two or more columns from a SQL Server database.
The results are to be stored in the same Excel sheet, alongside the value they were looked up for.
I already have an Excel macro that does this, but now I want to develop a package for the same task.
Please guide me through the necessary steps.
I will also keep trying to find the solution myself.
Create an Excel Source and link it to your file.
Use a Lookup component to perform a SQL select that retrieves the missing columns (see the query sketch after these steps).
Create an Excel Destination to save your target data.
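A minimal sketch of the Lookup component's reference query, assuming a hypothetical dbo.Reference table whose RefKey column matches the values in the Excel column:

-- Reference query for the Lookup component (all names hypothetical)
SELECT RefKey, Column1, Column2
FROM dbo.Reference;

In the Lookup editor, map the Excel column to RefKey and tick Column1 and Column2 as lookup output columns; they will then flow through to the Excel Destination.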
Rather new to SSIS, so not sure how to handle this.
I have a flat file which I managed to read from successfully, so right now my data flow consists of just a flat file source.
What I want to do is something like this:
UPDATE s
SET s.columnA = f.columnA
FROM SqlTable s
JOIN FlatFile f ON s.columnID = f.columnID
Right now the only way I can see of doing this would be to insert the contents of the flat file into a SQL table and then do my update. This seems wasteful considering I don't need to keep the flat file's data; I just need to update an existing SQL table based on it. So is there some way to run the query directly in the SSIS package, instead of having to insert a bunch of data into a SQL table that I will just wind up dropping?
thanks
UPDATE s SET s.columnA = f.columnA FROM SqlTable s JOIN FlatFile f ON s.columnID = f.columnID
That statement above is a SQL statement, and you cannot join a SQL table directly to a flat file. The update has to happen in SQL Server, since that is where the table lives.
You have two choices:
Use an OLE DB Command component within the data flow. The downside is that it executes the statement once per record, so if you have thousands of records it is very inefficient.
Push the records to a table using an OLE DB Destination and then call your update from an Execute SQL Task. You can truncate the table afterwards if you like.
A possible third option is to roll your own OLE DB destination that performs set-based updates rather than row-by-row ones.
While creating a work table in the database just to hold the update records might sound wasteful, it is done very often. You simply drop or truncate the work table when complete (a sketch follows).
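A minimal sketch of that staging pattern, assuming the data flow has loaded the flat file rows into a hypothetical work table dbo.FlatFileStaging (the OLE DB Command alternative would run the same UPDATE, parameterised as SET columnA = ? WHERE columnID = ?, once per row):

-- Set-based update from the staged flat file rows
UPDATE s
SET s.columnA = f.columnA
FROM SqlTable s
JOIN dbo.FlatFileStaging f ON s.columnID = f.columnID;

TRUNCATE TABLE dbo.FlatFileStaging;  -- clear the work table for the next run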
You could add an OLE DB Command component to the data flow that reads the flat file. The OLE DB Command would then do a single-row update for each record retrieved from the flat file. This might be okay if there are few rows in the flat file, but you can imagine how bad performance will be if there are many.
I think you'll find that sending the flat file rows to a database table and running a single UPDATE is going to be the best performer for lots of data.
I haven't tried this, but have you tried sending the rows to a Recordset Destination and then running the update from that?
The bulk load into a temporary table is the way to go, followed by your updates from the temp table. As a previous poster says, it is quite a common approach to load data into a staging area, do some more work with it, and then drop or truncate the table.