I have 30+ tables from which I need to import data. I could create a data flow per table, but then I would need to set up a source, a destination, and a record-count check for all 30+ tables.
How can I, while using one component (and iterating through it) or using a script component, create dynamic code that takes source and destination connection parameters and then imports the data?
I am looking to create a package which iterates through table names and then creates appropriate mappings and imports the data.
1. Create a mapping table listing each source table and its target table.
2. Create a Script Task and use the SqlBulkCopy class to map the source and target columns.
3. Run the bulk copy.
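A minimal sketch of the first two steps in Python (the SSIS Script Task itself would be C# using SqlBulkCopy; the table and column names below are made-up examples). It takes rows as they might come back from the mapping table and builds the copy with the mapped columns:

```python
# Sketch: derive a per-table copy from a mapping table.
# The mapping rows and table names below are hypothetical examples.

def build_copy_statement(source_table, target_table, column_map):
    """Build an INSERT ... SELECT mirroring what a SqlBulkCopy
    ColumnMappings setup would do, given {source_col: target_col}."""
    src_cols = ", ".join(column_map.keys())
    tgt_cols = ", ".join(column_map.values())
    return (f"INSERT INTO {target_table} ({tgt_cols}) "
            f"SELECT {src_cols} FROM {source_table}")

# Rows as they might come back from the mapping table (step 1):
# (source table, target table, source column, target column)
mapping_rows = [
    ("src.Customers", "dst.Customers", "CustId",   "CustomerId"),
    ("src.Customers", "dst.Customers", "FullName", "Name"),
]

source, target = mapping_rows[0][0], mapping_rows[0][1]
column_map = {src: tgt for _, _, src, tgt in mapping_rows}
print(build_copy_statement(source, target, column_map))
```

In the real Script Task the same mapping rows would populate `SqlBulkCopy.ColumnMappings` instead of a generated statement, but the lookup logic is identical.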
Related
I have a Data Flow task which was created programmatically.
The data source is connected to the DBMS via an OLE DB provider.
I can get the output columns from the source and map them to the input columns of the destination component.
But I can't invoke ReinitializeMetaData() on the destination component because the destination table doesn't exist yet.
Therefore, I want to generate DDL from the output columns so I can create the table.
Does anyone know what functionality SSIS provides for this purpose?
Thanks in advance
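As far as I know SSIS doesn't emit DDL for you, so one option is to generate it yourself from the output-column metadata. A rough Python sketch (the type map covers only a few common SSIS data types, and the table and column names are made-up examples):

```python
# Sketch: build CREATE TABLE DDL from output-column metadata.
# Only a handful of SSIS data types are mapped here; extend as needed.

SSIS_TO_SQL = {
    "DT_I4":          "int",
    "DT_I8":          "bigint",
    "DT_WSTR":        "nvarchar({length})",
    "DT_STR":         "varchar({length})",
    "DT_DBTIMESTAMP": "datetime",
    "DT_NUMERIC":     "numeric({precision},{scale})",
}

def column_ddl(name, dtype, length=0, precision=0, scale=0):
    sql_type = SSIS_TO_SQL[dtype].format(
        length=length, precision=precision, scale=scale)
    return f"[{name}] {sql_type}"

def create_table_ddl(table, columns):
    cols = ",\n  ".join(column_ddl(*c) for c in columns)
    return f"CREATE TABLE {table} (\n  {cols}\n)"

# Hypothetical output columns: (name, SSIS type, length, precision, scale)
cols = [("Id", "DT_I4", 0, 0, 0), ("Name", "DT_WSTR", 50, 0, 0)]
print(create_table_ddl("dbo.Staging", cols))
```

Run the generated statement with an Execute SQL Task (or plain ADO.NET) before calling ReinitializeMetaData().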
I am importing an Excel table from an FTP site to SQL Server using SSIS. The destination table is going to be used to calculate good and bad records based on another SQL database.

Here is my problem. The Excel file is named RTW_032613_ABC_123.xls. This file name is a concatenation of a number of fields. I cannot recreate it from the fields in the table, so I need to retain it and pass it to the new table in SQL Server. I have a parameter #FileName that I am using to loop through the files in the FTP folder.

What I would like to do is either combine the file name with the import of data from the Excel file, or insert the file name into each record after the import. I am calling the SSIS procedure from another stored procedure in SQL. I tried adding a SQL data flow task, but I am not seeing where to add the insert statement on either the SQL Server Compact Destination or the SQL Server Destination.
I am in over my head with SSIS on this one. The key is that the parameter I need is available in SSIS, but I really need to get it passed on to my SQL table.
TIA
If I'm reading your question right, you have an SSIS package with a variable containing the filename and you want to save the filename with each row that you are sending to your SQL table? If so:
Add a Derived Column transformation to the data flow, creating a new column whose expression references the variable (e.g. @[User::FileName])
Include that new column in the mapping for the destination of your data flow, sending the filename to whichever column you would like to save that data in.
No need for a separate SQL task.
I am creating an SSIS package which has a flat file source and a destination database.
The mappings between the columns are based on the following:
There is a table which contains records defining the mappings, i.e. source column name and destination column name. Which mapping rows apply is determined by the name of the flat file.
The reason this has been done is so that the destination column names can be changed in the database rather than needing to recreate or edit the package.
Please could you advise as to how I could do this "lookup" and create the mappings dynamically.
There is no way to create the mappings dynamically inside a single data flow. You'll need to either generate the SSIS package programmatically on the fly, or use another method (OPENROWSET, bcp, BULK INSERT, ...).
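For the non-package route, a BULK INSERT statement can be generated per file from the same mapping metadata. A rough Python sketch (the table name, file path, and delimiter options are hypothetical examples, not taken from the question):

```python
# Sketch: generate a BULK INSERT statement per flat file.
# Table name, file path, and format-file path are hypothetical.

def bulk_insert_sql(table, data_file, format_file=None):
    if format_file:
        # A format file can carry the column mappings.
        options = [f"FORMATFILE = '{format_file}'"]
    else:
        options = ["FIELDTERMINATOR = ','",
                   "ROWTERMINATOR = '\\n'",
                   "FIRSTROW = 2"]
    return (f"BULK INSERT {table} FROM '{data_file}' "
            f"WITH ({', '.join(options)})")

print(bulk_insert_sql("dbo.Customers", r"C:\feeds\customers.csv"))
```

Because the statement is just a string built at run time, renaming a destination column only requires updating the mapping table (and regenerating the format file), not editing the package.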
I have some requirements as explained below:
I need an SSIS data flow task which will accept the following input parameters:
--SourceServer Connection String
--DestinationServer Connection String
--Source Table Name
--Destination Table Name
Then it should map the columns of the source and destination tables dynamically at run time.
The source and destination tables will always have the same schema.
I want to call this package from C#/.NET, passing the above parameters.
The key point is that I will have to pass different sets of source and destination tables.
I just answered this on a previous question: you can't loop through tables and dynamically map columns on your source and destination components. You would need one Source -> Destination pair per table.
If that's not feasible, you may want to look at the Transfer SQL Server Objects Task.
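If you do end up with one package per table set, the parameters can still be supplied from outside at execution time via dtexec's /SET option. A sketch in Python (the package path and variable names are hypothetical):

```python
# Sketch: build a dtexec invocation that passes table names in as
# package variables via /SET. Paths and variable names are hypothetical.
import subprocess

def dtexec_args(package_path, variables):
    args = ["dtexec", "/F", package_path]
    for name, value in variables.items():
        args += ["/SET",
                 rf"\Package.Variables[User::{name}].Properties[Value];{value}"]
    return args

args = dtexec_args(r"C:\packages\CopyTable.dtsx",
                   {"SourceTable": "dbo.Src", "DestinationTable": "dbo.Dst"})
print(args)
# To actually run it (requires SSIS installed on the machine):
# subprocess.run(args, check=True)
```

From C# the equivalent would be loading the package with the Microsoft.SqlServer.Dts.Runtime API and setting the variables before Execute(), but the command-line form is the easiest to test.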
Create SSIS package parameters, and use a web.config file to pass those parameters. Then:
1. Deploy the package to SQL Server.
2. Create a job to execute the package.
3. Create a stored procedure in SQL Server that starts the job using sp_start_job.
4. Execute the job by calling that stored procedure.
I think that solves your problem.
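The stored procedure in step 3 would call msdb's sp_start_job. A small sketch that builds that call (the job name is a made-up example):

```python
# Sketch: build the T-SQL that starts a SQL Agent job.
# The job name is a hypothetical example.

def start_job_sql(job_name):
    # sp_start_job lives in msdb; escape embedded quotes in the job name.
    safe = job_name.replace("'", "''")
    return f"EXEC msdb.dbo.sp_start_job @job_name = N'{safe}'"

print(start_job_sql("Import30Tables"))
```

Note that sp_start_job returns as soon as the job is queued; it does not wait for the package to finish.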
I'm new to SSIS and am writing a package that includes moving data to a table that is created in a previous Execute SQL Task object.
The issue that I'm encountering is that I am unable to create a data flow destination task that uses a dynamic destination table name.
The intended process is:
Execute SQL Task object creates new table based on today's date (i.e. Table1_20111014)
Data Flow task moves data from table "Table1" to "Table1_20111014".
The column metadata for Table1 and Table1_20111014 are the same, and does not change. However, the name of the table the data needs to be moved to will change depending on the date at time of execution.
Is it possible to dynamically specify the destination table in a destination data flow object?
If not, are there known workarounds or is using SSIS for this task a bad idea?
As long as the metadata remains the same, there is no drawback to using a dynamic destination table name.
To accomplish this, on the OLE DB destination, instead of the "Table or view" or "Table or view - fast load" data access mode, use the equivalent "Table name or view name variable" option. This obviously assumes you have a variable defined that contains the name of the table created in the Execute SQL Task.
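Computing the dated name to put in that variable is straightforward; a sketch (assuming the Table1_YYYYMMDD convention from the question):

```python
# Sketch: compute the dated table name the Execute SQL Task creates,
# which is then assigned to the package variable that the OLE DB
# destination reads its table name from.
from datetime import date

def dated_table_name(base, on=None):
    on = on or date.today()
    return f"{base}_{on.strftime('%Y%m%d')}"

print(dated_table_name("Table1", date(2011, 10, 14)))  # Table1_20111014
```

In the package itself the same value can be produced with an SSIS expression on the variable, so the Execute SQL Task and the destination stay in sync automatically.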