I am creating an SSIS package which has a flat file source and a destination database.
The mappings between the columns are based on the following:
There is a table which contains records defining the mappings, i.e. source column name and destination column name; which set of mappings to use is determined by the name of the flat file.
The reason this has been done is so that the destination column names can be changed in the database rather than needing to recreate or edit the package.
Please could you advise how I could do this "lookup" and create the mappings dynamically?
There is no way to create the mappings dynamically. You'll need to either generate the SSIS package programmatically on the fly, or use another method (OPENROWSET, bcp, BULK INSERT...).
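If you go down the "other method" route, one variant in the same spirit (not listed above) is the .NET SqlBulkCopy class driven from a Script Task or a small console app: the column mappings are read from your metadata table at run time, so renaming a destination column only means updating that table. A minimal sketch; dbo.ColumnMappings, the connection string, the pipe delimiter and all table/column names are assumptions to adapt:

    // Sketch: load a flat file using column mappings held in a metadata table.
    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class MappedFlatFileLoad
    {
        static void Main()
        {
            string connStr   = "Data Source=.;Initial Catalog=TargetDb;Integrated Security=SSPI;";
            string filePath  = @"C:\Imports\SomeFile.txt";
            string destTable = "dbo.DestinationTable";
            string fileName  = Path.GetFileNameWithoutExtension(filePath);

            // 1. Read the flat file into a DataTable, using the header row for source column names.
            var data  = new DataTable();
            var lines = File.ReadAllLines(filePath);
            foreach (var header in lines[0].Split('|')) data.Columns.Add(header.Trim());
            for (int i = 1; i < lines.Length; i++) data.Rows.Add(lines[i].Split('|'));

            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();

                // 2. Look up the source -> destination column names for this file.
                var mappings = new List<KeyValuePair<string, string>>();
                var cmd = new SqlCommand(
                    "SELECT SourceColumn, DestinationColumn FROM dbo.ColumnMappings WHERE FileName = @f", conn);
                cmd.Parameters.AddWithValue("@f", fileName);
                using (var rdr = cmd.ExecuteReader())
                    while (rdr.Read())
                        mappings.Add(new KeyValuePair<string, string>(rdr.GetString(0), rdr.GetString(1)));

                // 3. Bulk copy with those mappings applied.
                using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = destTable })
                {
                    foreach (var m in mappings) bulk.ColumnMappings.Add(m.Key, m.Value);
                    bulk.WriteToServer(data);
                }
            }
        }
    }

With this approach the destination column names live entirely in dbo.ColumnMappings, so the package (or console app) never has to be edited when they change.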
I have a scenario where my source can be on different versions of our database; as a result, the source file could have a different number of columns, while my destination has a defined number of columns.
What we are trying to do is load data from the source to flat files, move them to a central server, and then load that data into the central database. But if any column is missing from a flat file, I need to add a derived column.
What is the best way to do this? How can I dynamically add derived columns?
You can either do this with BiMLScript as others have suggested in the comments, or you can write a Script Task that reads the file, analyzes the contents, and imports it. Yet another option would be to bulk import the file as-is into a staging table (which would have to be dropped and re-created every time) and write a stored procedure that analyzes the DDL and contents and imports the data into the destination table.
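If you go the Script Task route, the heart of it is comparing the file's header row with the columns the central destination expects and padding anything missing with a default, which plays the part of the derived column. A rough sketch, assuming a delimited file with a header row; the expected column list, delimiter and default value are placeholders:

    // Sketch: read a flat file and add any columns the destination expects but the file lacks.
    using System;
    using System.Data;
    using System.IO;
    using System.Linq;

    static class MissingColumnPadder
    {
        // Columns the central destination table always expects (hypothetical list).
        static readonly string[] ExpectedColumns = { "Id", "Name", "CreatedDate", "Region" };

        public static DataTable LoadWithDefaults(string filePath, char delimiter = '|')
        {
            var table = new DataTable();
            var lines = File.ReadAllLines(filePath);

            // The header row tells us which columns the file actually supplies.
            var fileColumns = lines[0].Split(delimiter).Select(c => c.Trim()).ToArray();
            foreach (var col in fileColumns) table.Columns.Add(col);

            foreach (var line in lines.Skip(1))
                table.Rows.Add(line.Split(delimiter));

            // The "derived columns": anything expected but absent is added with a default value.
            foreach (var missing in ExpectedColumns.Except(fileColumns, StringComparer.OrdinalIgnoreCase))
            {
                var col = table.Columns.Add(missing);
                foreach (DataRow row in table.Rows) row[col] = DBNull.Value; // or a sensible default
            }
            return table;
        }
    }

The resulting DataTable can then be bulk copied or otherwise written to the central table.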
I have a DataFlow task which has been created programmatically.
The data source is connected to the DBMS via an OLE DB provider.
I can get the output columns from the source and map them to the input columns of the destination component.
But I can't invoke ReinitializeMetaData() on the destination component because the destination table doesn't exist yet.
Therefore, I want to derive DDL from the output columns so I can create the table.
Does anyone know what functionality SSIS provides for this purpose?
Thanks in advance.
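As far as I know there is no built-in SSIS call that emits DDL for a set of pipeline columns, but IDTSOutputColumn100 exposes Name, DataType, Length, Precision and Scale, so you can assemble a CREATE TABLE statement yourself, execute it, and then call ReinitializeMetaData(). A rough sketch; the type map is deliberately partial and the fallback is a guess, so extend it for the DT_* types your sources actually produce:

    // Sketch: build CREATE TABLE DDL from a source component's output columns.
    using System.Text;
    using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
    using Microsoft.SqlServer.Dts.Runtime.Wrapper;

    static class DdlBuilder
    {
        public static string CreateTableFromOutput(IDTSOutput100 output, string tableName)
        {
            var sql = new StringBuilder("CREATE TABLE " + tableName + " (");
            bool first = true;
            foreach (IDTSOutputColumn100 col in output.OutputColumnCollection)
            {
                if (!first) sql.Append(", ");
                first = false;
                sql.Append("[" + col.Name + "] " + ToSqlType(col));
            }
            return sql.Append(")").ToString();
        }

        static string ToSqlType(IDTSOutputColumn100 col)
        {
            switch (col.DataType)
            {
                case DataType.DT_I4:          return "INT";
                case DataType.DT_I8:          return "BIGINT";
                case DataType.DT_R8:          return "FLOAT";
                case DataType.DT_BOOL:        return "BIT";
                case DataType.DT_GUID:        return "UNIQUEIDENTIFIER";
                case DataType.DT_DBTIMESTAMP: return "DATETIME";
                case DataType.DT_NUMERIC:     return "DECIMAL(" + col.Precision + "," + col.Scale + ")";
                case DataType.DT_STR:         return "VARCHAR(" + col.Length + ")";
                case DataType.DT_WSTR:        return "NVARCHAR(" + col.Length + ")";
                default:                      return "NVARCHAR(MAX)"; // fallback for unmapped types
            }
        }
    }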
I have 30+ tables from which I need to import data. I could create data flow components and do this table by table, but then I would need to create a source, a destination, and a record-count check for all 30+ tables.
How can I, using one component (and iterating through it) or using a Script Component, write dynamic code that takes source and destination connection parameters and then imports the data?
I am looking to create a package which iterates through the table names, creates the appropriate mappings, and imports the data.
1. Create a mapping table for the source and target tables.
2. Create a Script Task and use the bulk copy function (SqlBulkCopy) to map the source and target columns (a sketch follows below).
3. Run the bulk copy.
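Here is a sketch of what steps 2 and 3 might look like inside the Script Task; dbo.TableMappings, both connection strings and the column names are hypothetical:

    // Sketch: loop over (source, target) table pairs from a mapping table and bulk copy each one.
    using System.Data;
    using System.Data.SqlClient;

    class CopyAllMappedTables
    {
        static void Main()
        {
            string srcConnStr = "Data Source=SourceServer;Initial Catalog=SourceDb;Integrated Security=SSPI;";
            string dstConnStr = "Data Source=TargetServer;Initial Catalog=TargetDb;Integrated Security=SSPI;";

            // Step 1's output: the list of (source, target) table pairs.
            var pairs = new DataTable();
            using (var da = new SqlDataAdapter(
                "SELECT SourceTable, TargetTable FROM dbo.TableMappings", dstConnStr))
            {
                da.Fill(pairs);
            }

            // Steps 2 and 3: stream each source table into its target with SqlBulkCopy.
            foreach (DataRow pair in pairs.Rows)
            {
                using (var src = new SqlConnection(srcConnStr))
                using (var dst = new SqlConnection(dstConnStr))
                {
                    src.Open();
                    dst.Open();
                    // Table names come from our own metadata table; validate them if that table is user-editable.
                    var cmd = new SqlCommand("SELECT * FROM " + pair["SourceTable"], src);
                    using (var reader = cmd.ExecuteReader())
                    using (var bulk = new SqlBulkCopy(dst) { DestinationTableName = (string)pair["TargetTable"] })
                    {
                        bulk.WriteToServer(reader);
                        // For the record-count check, compare COUNT(*) on source and target after the copy.
                    }
                }
            }
        }
    }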
I am importing an Excel table from an FTP site to SQL Server using SSIS. The destination table is going to be used to calculate good and bad records based on another SQL database. Here is my problem: the Excel file is named RTW_032613_ABC_123.xls. This file name is a concatenation of a number of fields; I cannot recreate it from the fields in the table, so I need to retain it and pass it to the new table in SQL.
I have a parameter #FileName that I am using to loop through the files in the FTP folder. What I would like to do is either combine the import of data from the Excel file with the file name, or insert the file name into each record after the import. I am calling the SSIS procedure from another stored procedure in SQL. I tried adding a SQL data flow task, but I don't see where to add the INSERT statement on either the SQL Server Compact Destination or the SQL Server Destination.
I am in over my head with SSIS on this one. The key is that the parameter I need is available in SSIS, but I really need to get it passed on to my SQL table.
TIA
If I'm reading your question right, you have an SSIS package with a variable containing the filename and you want to save the filename with each row that you are sending to your SQL table? If so:
Add a Derived Column transformation to the data flow, creating a new column and referencing the variable in its expression.
Include that new column in the mapping for the destination of your data flow, sending the filename to whichever column you would like to save that data in.
No need for a separate SQL task.
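For reference, the Derived Column expression is just the variable reference, assuming your variable is named User::FileName (adjust to your actual variable name):

    @[User::FileName]

The new column then appears on the destination's Mappings page like any other input column.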
I am using SSIS to import an Excel file into a table in my SQL Server 2008 database.
Currently I am able to import data into the table using a data flow, with the Excel file as the source and the database table as the destination. My current import is based on the column mapping between the source and the destination, but now I want to add an extra column to the table. Basically, this column will hold the id given to the Excel file that the rows belong to, so the value will be the same for every row that comes from the file whose data we are currently importing.
This column is not present in the source Excel sheet, and its value is held in an SSIS user variable. I want inserting this value to be part of the import process, but I cannot figure out how.
How can I achieve this?
The connection manager for the destination doesn't allow me to map user variables to columns...
Put a Derived Column transformation between the Excel source and the database destination.
Create a column there and use the SSIS User Variable as the value expression for the column.
Alternatively, add an Execute SQL Task after the Data Flow Task and update the extra column from the variable with an UPDATE statement.
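If you take the Execute SQL Task route with an OLE DB connection, the statement uses ? placeholders, something along these lines (table and column names here are only placeholders), with the user variable mapped to parameter 0 on the task's Parameter Mapping page:

    UPDATE dbo.ImportTable SET FileId = ? WHERE FileId IS NULL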