SSIS: import multiple files to separate destination tables

I'm new to SSIS. I'm running BIDS under SQL Server 2008 R2. I have several text files that I need to import into separate SQL Server destination tables. The tables already exist in the DB. Each file will map to only 1 table. (For example, file_A maps to table_A, and file_B maps to table_B.) My general data flow is as follows:
Flat File Source
Data Conversion (to handle the issue of unicode vs non-unicode strings)
OLE DB Destination (to handle the issue of local server to remote server)
Do I need to create a separate data flow task for each of my text files? If so, my package may be very large.

You can create a single Data Flow Task that has more than one "flow" in it. It's hard to describe in words, but you can put a Source1 that flows to DataConversion1, which flows to Destination1, and then alongside it (with no connections between them) Source2 flows to DataConversion2, which flows to Destination2, and so on.
However, I do agree with @billinkc that using a separate data flow for each file is the better way to go. It will make debugging easier, in addition to the other benefits he mentioned.

Related

Dynamically decide columns in SSIS connectors for salesforce

I want to migrate data from Salesforce to SQL Server, and I am using SSIS connectors for Salesforce. I am creating a single SSIS package that fetches data for all objects and inserts it into SQL Server. I tried the following Salesforce connectors.
Connector 1: Kingswaysoft
https://www.kingswaysoft.com/
Connector 2: CData
https://www.cdata.com/kb/articles/ado-ssistask-sf.rst
Connector 3: SSIS PowerPack
https://zappysys.com/onlinehelp/ssis-powerpack/index.htm
https://zappysys.com/products/ssis-powerpack/ssis-salesforce-source-connector/
With all of these connectors, I am unable to provide different columns (Salesforce fields) dynamically in the SOQL query using SSIS variables.
I agree with the comments that SSIS is built for static ETL, and that you can work around dynamic metadata with a C# Script Task.
As an alternative, you can try conditional branches and run different tasks based on an expression. Read "Add expressions to precedence constraints".
I'm not sure how many dynamic columns we are talking about here, but for discussion's sake, suppose two different columns have to be filled in the Salesforce destination based on the source column; in that case, have two branches.

Insert data to a non-linked server using SSIS

I have two SQL Server 2008 R2 servers, which are not linked. I need to read data from server A and write it to server B. The best way is to use SSIS.
But the insert has to be generic, meaning I do not know the table structure. I have SQL queries ready (when I run them against a linked server, they work), but SSIS with an OLE DB source and an OLE DB destination needs the table structure to do the column mapping.
How can I run a dynamic SQL task using SSIS so that I can read data from server A and insert it into server B (and then perform a rollback on error)?
Your best bet is probably BIML, which dynamically creates SSIS packages based on metadata.

Copying multiple tables using SSIS Package [duplicate]

This question already has answers here: ssis best practice to load N tables from Source to Target Server (3 answers). Closed 8 years ago.
I am trying to design an SSIS package that copies about 50+ tables from an ODBC data source (a QuickBooks DB) to a SQL Server DB.
Should I create 50 Data Flow Tasks to do this?
What is the best way to do this?
Putting a DFT inside a loop and reading the tables, or 50+ Data Flow Tasks?
You can create 50 Data Flow Tasks, but you don't have to.
It is possible to have multiple independent source-to-destination flows in the same DFT.
This is not as flexible, because you can run a single DFT separately from the rest of the package (while debugging), but you cannot run just a piece of a DFT without modifying it (as far as I know).
Depending on which option you choose, I see a couple of ways to save yourself from the mundane work of wiring up 50+ tables:
a) Let SQL Server Import and Export Wizard do the boring work for you.
The best thing about this tool is that it can create a .dtsx package for you.
So, with the wizard, you can:
select all 50+ tables from the ODBC data source for import
instead of running the wizard to the end, save the result as a .dtsx package
open the package in Visual Studio with SQL Server Data Tools
modify the package to your needs (for example, logically regrouping the tables into different DFTs, or adding any additional transformations).
b) Manually edit the package code (some BIML knowledge might be needed):
In Visual Studio with SQL Server Data Tools, create one DFT that will serve as your sample.
In Solution Explorer, right-click on your package and select View Code.
Either copy/paste the DFT 50+ times, changing the table names, or maybe you will even manage to automate the BIML somehow to avoid the copy/paste.

SSIS ‘Data Flow Task’ No records in flat file destination

Please forgive my initial posting being a question instead of a solution.
I’ve got two SSIS packages that basically do the same thing. The last step of both is a ‘Data Flow Task’ that queries the database and attempts to write the results to a flat file. One of the packages builds the flat file correctly; the other builds the file but doesn’t populate it with any records. I’m running SQL Server 2008 R2.
This is in a university setting involving transferring degree_codes and demographics between two systems. The degree_code package is functioning; the demographics one isn’t. Both ‘Data Flow Tasks’ consist of an OLE DB Source linked to a Flat File Destination (tab-delimited text). Both packages display the correct data set when previewing the OLE DB Source.
In the Flat File Destination, the mappings are correct in both packages. However, when previewing the data, the degree details display correctly, but there are no records in the demographics preview. That’s also true when looking at the connection managers. And when the packages run, the degree_codes file is correct while the demographics file contains only a header. It seems there is a problem with the link between the OLE DB Source and the Flat File Destination.
Both packages run with only a warning about shared global memory impacting performance. I’ve deleted and rebuilt the non-functioning Data Flow Task and connection managers without fixing the problem. At this point I am at a loss as to which direction to go and don’t know how to diagnose the problem. Have any of you folks run into a similar situation, or do you have any suggestions on how to chase it down? I’d be grateful for any solutions.
Try to export the data to a temp table in your DB. If the data is saved there, the issue is in the file connection; if not, your query needs to be rewritten.
Verify that the query columns you are executing against the tables match and that the output data types are as expected. Try putting all columns out as string types initially and check whether it works; once it executes successfully, you can apply the correct data types as needed.
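To combine both suggestions, here is a rough T-SQL sketch of the staging-table test; the table and column names are only placeholders standing in for the real demographics query:

-- Hypothetical test: run the same query the OLE DB Source uses,
-- but dump it into a staging table with every column cast to a string.
IF OBJECT_ID('dbo.demographics_stage') IS NOT NULL
    DROP TABLE dbo.demographics_stage;

SELECT  CAST(student_id    AS varchar(50)) AS student_id,   -- placeholder columns
        CAST(birth_date    AS varchar(50)) AS birth_date,
        CAST(home_postcode AS varchar(50)) AS home_postcode
INTO    dbo.demographics_stage
FROM    dbo.demographics;                                    -- placeholder source

SELECT COUNT(*) AS rows_loaded FROM dbo.demographics_stage;

If the staging table ends up with rows, the source query is fine and the problem lies in the Flat File connection; if it is empty, the query itself returns nothing at package run time.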

Load Excel data into SQL Server using SSIS without Creating tables in the target

I am new to SSIS and am not sure whether it is possible to upload Excel data into SQL Server without creating the table schema first, i.e. having the job automatically create the table schema according to the source file. I used to do this in SAP Data Integrator with the Template Table component, and I am not sure whether there is similar functionality in SSIS.
There is no capability in SSIS to infer the structure from the source file and create the table schema for you. You can create tables as part of the SSIS control flow, but the data flows that load the data are very fussy about knowing the source and target structure at design time. In other words, SSIS doesn't handle dynamic structures very well.
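If you do go the create-tables-in-the-control-flow route, a minimal sketch of the kind of statement you could put in an Execute SQL Task ahead of the data flow is shown below; the table and column definitions are assumptions and would have to mirror whatever structure your Excel file actually has:

-- Hypothetical pre-created target; adjust the columns to match your spreadsheet.
IF OBJECT_ID('dbo.ExcelImport', 'U') IS NULL
BEGIN
    CREATE TABLE dbo.ExcelImport
    (
        [Id]       int            NULL,
        [Name]     nvarchar(255)  NULL,
        [Amount]   decimal(18, 2) NULL,
        [LoadDate] datetime       NULL
    );
END;

The data flow that follows still has to be mapped to those columns at design time, which is why SSIS cannot simply mimic the Template Table behaviour of SAP Data Integrator.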