I am trying to design an SSIS package that copies 50+ tables from an ODBC data source (QuickBooks DB) to a SQL Server DB.
Should I create 50 Data Flow Tasks to do this?
What is the best way to do this?
Should I put a single DFT inside a loop and read the tables one at a time, or create 50+ Data Flow Tasks?
You can create 50 Data Flow Tasks, but you don't have to.
It is possible to have multiple independent sources-destinations in the same DFT.
This is less flexible, because you can run a single DFT separately from the rest of the package (while debugging), but you cannot run just a piece of a DFT without modifying it (as far as I know).
Depending on which option you choose, I see a couple of ways to save yourself from mundane work with 50+ tables:
a) Let the SQL Server Import and Export Wizard do the boring work for you.
The best thing about this tool is that it can create a .dtsx package.
So, with the wizard, you can:
select all 50+ tables from the ODBC data source for import
instead of running the wizard to the end, save the result as a .dtsx package.
open the package in Visual Studio with SQL Server Data Tools
modify the package to suit your needs (for example, logically regrouping the tables into different DFTs, or adding any additional transformations).
b) Manually edit the package code (some BIML knowledge might be needed):
In Visual Studio with SQL Server Data Tools, create 1 DFT which will be your sample.
In Solution Explorer, right-click on your package and select View Code.
Either copy/paste the DFT 50+ times, changing the table names each time, or automate the BIML generation to avoid the copy/paste entirely (see the sketch below).
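A minimal sketch of that automation: a small C# console program that stamps out one Data Flow per table as BIML text, which you would then compile with BIDS Helper/BimlExpress. The table list, connection names, and exact BIML element names here are illustrative assumptions, so check them against your BIML version before relying on them:

    // Sketch only: generates a .biml file with one Data Flow per table.
    // Table names and connection names ("QuickBooks", "TargetDb") are placeholders.
    using System.IO;
    using System.Text;

    class BimlGenerator
    {
        static void Main()
        {
            // Placeholder table list: extend to all 50+ QuickBooks tables.
            string[] tables = { "Customers", "Invoices", "Items" };

            var biml = new StringBuilder();
            biml.AppendLine("<Biml xmlns=\"http://schemas.varigence.com/biml.xsd\">");
            biml.AppendLine("  <Packages>");
            biml.AppendLine("    <Package Name=\"LoadQuickBooks\" ConstraintMode=\"Parallel\">");
            biml.AppendLine("      <Tasks>");
            foreach (string t in tables)
            {
                biml.AppendLine("        <Dataflow Name=\"Copy " + t + "\">");
                biml.AppendLine("          <Transformations>");
                biml.AppendLine("            <OdbcSource Name=\"Src\" ConnectionName=\"QuickBooks\">");
                biml.AppendLine("              <DirectInput>SELECT * FROM " + t + "</DirectInput>");
                biml.AppendLine("            </OdbcSource>");
                biml.AppendLine("            <OleDbDestination Name=\"Dst\" ConnectionName=\"TargetDb\">");
                biml.AppendLine("              <ExternalTableOutput Table=\"[dbo].[" + t + "]\" />");
                biml.AppendLine("            </OleDbDestination>");
                biml.AppendLine("          </Transformations>");
                biml.AppendLine("        </Dataflow>");
            }
            biml.AppendLine("      </Tasks>");
            biml.AppendLine("    </Package>");
            biml.AppendLine("  </Packages>");
            biml.AppendLine("</Biml>");

            File.WriteAllText("LoadQuickBooks.biml", biml.ToString());
        }
    }

The C# part is just string assembly; all the SSIS-specific knowledge lives in the BIML markup, which is why generating it beats hand-editing 50 copies of the same XML.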
I want to migrate data from Salesforce to SQL Server, and I am using SSIS connectors for Salesforce. I am creating a single SSIS package that fetches data for all objects and inserts it into SQL Server. I tried the following connectors:
Connector 1: Kingswaysoft
https://www.kingswaysoft.com/
Connector 2: CData
https://www.cdata.com/kb/articles/ado-ssistask-sf.rst
Connector 3: SSIS PowerPack
https://zappysys.com/onlinehelp/ssis-powerpack/index.htm
https://zappysys.com/products/ssis-powerpack/ssis-salesforce-source-connector/
With all of these connectors, I am unable to provide different columns (Salesforce fields) dynamically in the SOQL query using SSIS variables.
I agree with the comments that SSIS is designed for static metadata ETL; you can work around that with a C# Script Task for dynamic metadata.
As an alternative, you can try conditional branches and run two different tasks based on an expression. Read Add expressions to precedence constraints.
I am not sure how many dynamic columns we are talking about here, but for discussion's sake, suppose 2 different columns have to be filled in the Salesforce destination depending on a source column; then have 2 branches.
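To illustrate the Script Task workaround, here is a minimal sketch of a Main method for a C# Script Task that assembles the SOQL string from a package variable. The variable names (User::ColumnList, User::Soql) and the Contact object are assumptions, not from the original question, and the data flow consuming the query still needs stable output metadata:

    // Sketch only: goes inside the generated ScriptMain class of a Script Task.
    // Assumed variables: User::ColumnList (e.g. "Id, Name, Email") and User::Soql,
    // which the Salesforce source component is configured to read as its query.
    public void Main()
    {
        string columns = Dts.Variables["User::ColumnList"].Value.ToString();

        // Build the SOQL text at runtime; only the column list varies here.
        Dts.Variables["User::Soql"].Value = "SELECT " + columns + " FROM Contact";

        Dts.TaskResult = (int)ScriptResults.Success;
    }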
I have a folder with around 15 reports in it; these are Report Server reports. Running each report individually will take a while, so I want them to run together. What I want to be able to do is somehow run all the reports in this folder at once. Is this possible?
This is somewhat of an ambiguous question. Let me explain. What are you asking specifically?
Q: Can you run multiple reports at the same time?
A: Yes, and there are several ways to accomplish this.
1. You can use SQL Server Agent jobs
2. Use batch files with Task Scheduler
3. Use an SSIS package and use an agent to run them at specific times, etc.
Hopefully none of the reports depends on another. Another thing to take into consideration is how hard you will be hitting the SSRS or SQL server: running them all at one time may take longer than running them one at a time, depending on the bandwidth of the SQL Server and which tables will be locked up during each of these processes.
You might want to give a little more detail in your question...
I would recommend an SSIS package, especially as it is also one of the options presented by @Michael, and it can email the Excel workbook too, which you mentioned in an earlier comment.
The following resource covers quite well the execution and export of an SSRS report using SSIS, including code you will need as a starting point: Executing an SSRS Report from an SSIS Package.
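As a rough idea of what that starting-point code looks like, here is a hedged sketch against the SSRS ReportExecution2005 web service. The report path, output path, and proxy class names are assumptions (the proxy names come from whatever service reference you generate):

    // Sketch only: render one report to Excel via the ReportExecution2005 endpoint.
    // Prerequisite: a web reference generated from
    // http://yourserver/ReportServer/ReportExecution2005.asmx
    class RenderReport
    {
        static void Main()
        {
            var rs = new ReportExecutionService();
            rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

            rs.LoadReport("/Sales/DailySales", null);   // placeholder report path

            string extension, mimeType, encoding;
            Warning[] warnings;
            string[] streamIds;
            byte[] bytes = rs.Render("EXCEL", null,
                out extension, out mimeType, out encoding,
                out warnings, out streamIds);

            System.IO.File.WriteAllBytes(@"C:\Reports\DailySales.xls", bytes);
        }
    }

Loop that over your 15 report paths to run them back to back, or start them on separate tasks if you truly want them concurrent.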
You could save some time in coding the solution by using the following custom Task that can be integrated into SSIS: SSIS ReportGenerator Task.
There is one problem in your requirements though, which is merging the reports into one Excel workbook, where I assume you want a separate sheet for each report within the same workbook?
Reporting Services can use multiple worksheets (to divide a report up into pages, a.k.a. pagination), but only for a single report; it can't merge reports into one Excel file. This can be accomplished with custom code, however. There's a somewhat basic example here: Merging workbooks into a master workbook with a separate sheet for each file.
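In the same spirit as that example, a minimal sketch of the merge using Excel COM interop. Assumptions: the exported reports sit in C:\Reports as .xlsx files, each contributes its first sheet, and proper COM object cleanup is omitted for brevity:

    // Sketch only: requires a reference to Microsoft.Office.Interop.Excel
    // (Excel must be installed on the machine running this).
    using System.IO;
    using Excel = Microsoft.Office.Interop.Excel;

    class MergeWorkbooks
    {
        static void Main()
        {
            var excel = new Excel.Application { DisplayAlerts = false };
            Excel.Workbook master = excel.Workbooks.Add();

            foreach (string path in Directory.GetFiles(@"C:\Reports", "*.xlsx"))
            {
                Excel.Workbook source = excel.Workbooks.Open(path);
                var sheet = (Excel.Worksheet)source.Sheets[1];
                // Copy the report's sheet to the end of the master workbook.
                sheet.Copy(After: master.Sheets[master.Sheets.Count]);
                source.Close(false);
            }

            master.SaveAs(@"C:\Reports\Master.xlsx");
            excel.Quit();
        }
    }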
One way to run all the reports at once is to add a subscription to each of them and set the same subscription start time for all of the reports. Once the start time arrives, all the reports will run simultaneously and will each generate an Excel/PDF (any specified format) file at the shared location.
I'm new to SSIS. I'm running BIDS under SQL Server 2008 R2. I have several text files that I need to import into separate SQL Server destination tables. The tables already exist in the DB. Each file will map to only 1 table. (For example, file_A maps to table_A, and file_B maps to table_B.) My general data flow is as follows:
Flat File Source
Data Conversion (to handle the issue of unicode vs non-unicode strings)
OLE DB Destination (to handle the issue of local server to remote server)
Do I need to create a separate data flow task for each of my text files? If so, my package may be very large.
You can create a single data flow task that has more than one "flow" in it. It's hard to describe in words, but you can put a Source1 that flows to DataConversion1 and on to Destination1, and then alongside it (with no connections between them) Source2 flows to DataConversion2 and on to Destination2, and so on.
However, I do agree with @billinkc that using a separate data flow for each is the better way to go. It will make debugging easier, in addition to the other benefits he mentioned.
I have an Oracle query and I want to export the query results to an excel file daily. I've looked into both SSRS and SSIS and am not sure which would be better to use.
The query is a normal select that returns 10-20 fields. It is pretty straightforward, with a couple of joins and WHERE clauses. It uses SELECT DISTINCT to get rid of duplicate rows.
It's a straight mapping from the query to the excel file.
Does SSIS have performance advantages over SSRS?
I was leaning toward SSRS because it's very simple to set up and there are added benefits of being able to easily run our extract/report with different dates through the SSRS web UI.
SSIS seems like it will be more complex to set up, but still simple. However, it seems I would have to handle renaming the extracts without touching the main Excel "template", so there are more steps involved. I am also having issues getting parameters to work with Oracle queries.
Even though I am a big fan of SSIS, I would go with SSRS in this scenario.
Your requirement is simply to get data into an Excel file, and though both SSIS and SSRS can do this task, SSRS has a slight advantage for what you are trying to achieve.
You can format the Excel file in the SSRS report however you would like.
Like an SSIS package, which is easy to configure, SSRS also has an easy development process: you can design and populate the output however you would like.
SSIS requires a SQL Agent job to schedule and run the package and then send you the Excel file or save it to some location. In SSRS, however, you can simply create a subscription that exports the Excel file to a particular folder or sends it to you in an email.
If you ever want to change the file export format, SSRS already handles that for you.
These are some of the points I could think of.
This isn't a report, so don't use Reporting Services.
The SSIS package necessary for this is a single Data Flow task with two components: an "OLE DB Source" for your Oracle query, and an "Excel Destination". Draw a connection between the two components, configure them, press F5 and you're done.
Almost any property in SSIS can be set to the value of an expression. This includes the "ExcelFilePath" property of the Excel Connection Manager. Simply set that to an expression that appends the date to the file path, and you'll be set as long as you only run the package daily.
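As an equivalent alternative to the pure expression approach, you can compute the dated path in a Script Task placed before the Data Flow. This is a minimal sketch; the folder and the variable name User::ExcelPath are assumptions, and the Excel Connection Manager's ExcelFilePath expression would simply reference @[User::ExcelPath]:

    // Sketch only: goes inside the generated ScriptMain class of a Script Task.
    public void Main()
    {
        string folder = @"C:\Exports";  // placeholder output folder
        string file = "OracleExtract_" + DateTime.Now.ToString("yyyyMMdd") + ".xlsx";

        Dts.Variables["User::ExcelPath"].Value = System.IO.Path.Combine(folder, file);
        Dts.TaskResult = (int)ScriptResults.Success;
    }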
If you need to run it more than once a day, then simply precede the Data Flow task with a File System task to delete any previous version of the file.
Just tried this quickly myself and found one small issue. The data source I used included VARCHAR columns. The Excel Destination wanted Unicode, so I had to place a Data Conversion component between source and destination.
This link has a nice evaluation of the case you are presenting:
http://theruntime.com/blogs/gscarfone/archive/2009/07/15/data-dump-to-excel-through-ssis-and-ssrs.aspx
Basically, it depends on your particular scenario.
Hope it helps...
I've been playing around with hosting on discountasp.net and am in the process of hosting my second web app. Since discountasp charges you per database, not per SQL Server instance or total space used by all of your databases, both apps need to share one database.
I need to create all of the tables from the database used by the second application in the live database. I can't just import the mdf file because that would drop all of the data already stored by the first application. Is there a way to automatically generate the scripts or simple sql commands to create the tables in the mdf file from within visual studio?
Also, since multiple applications will ultimately use the same database, I'd like to add a prefix to each table name, like App1_Table1. Is there a simple way to rename tables inside VS 2010? Further, is there a way to rename the tables but have Entity Framework ignore the prefix when generating its classes?
Thanks for your help, your answers will save me a ton of time I could be programming with :).
There are many options available to you.
In Visual Studio there's database schema compare functionality.
ScriptDb is a simple console app written in C# that uses SQL Server Management Objects (SMO) to script all the objects in a database. It will work against any SQL Server 2000 or 2005 database. It creates a directory tree structure with a hierarchy similar to that of Object Explorer in SSMS, with a separate file for each object.
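For a feel of what SMO scripting involves, here is a minimal sketch that prints CREATE scripts for every user table. The server and database names are placeholders, and ScriptDb itself does considerably more (directory layout, all object types):

    // Sketch only: requires references to Microsoft.SqlServer.Smo and
    // Microsoft.SqlServer.ConnectionInfo.
    using System;
    using Microsoft.SqlServer.Management.Smo;

    class ScriptTables
    {
        static void Main()
        {
            var server = new Server("localhost");          // placeholder server
            Database db = server.Databases["App2Db"];      // placeholder database

            var scripter = new Scripter(server);
            scripter.Options.DriAll = true;   // include keys and constraints
            scripter.Options.Indexes = true;  // include indexes

            foreach (Table table in db.Tables)
            {
                if (table.IsSystemObject) continue;
                foreach (string line in scripter.Script(new SqlSmoObject[] { table }))
                    Console.WriteLine(line + Environment.NewLine + "GO");
            }
        }
    }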
There's also an option to script database objects from SSMS: right-click on a database -> Tasks -> Generate Scripts.