How to use format files in an SSIS Data Flow task?

I am able to migrate data between two SQL Server tables easily using a SSIS data flow task. Can I use format files to specify the columns to choose from the source and destination? If so, can you give me an example?
In our current system, our source and destination tables are not always the same. We were using SQL-DMO with format files so far and are now upgrading to SSIS.
Thanks in advance for your suggestions.

You can look up information on how to create a format file here: http://msdn.microsoft.com/en-us/library/ms191516.aspx
Search for "SSIS Bulk Insert Task" to find more on that.
I would recommend using a data flow if you can, because it can eliminate columns from the source that do not exist in the destination, and it can outperform bulk inserts. It's worth consideration.
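If it helps to see the moving parts, here is a minimal sketch of the same idea outside the designer: a BULK INSERT that uses a format file to map only the fields you want. The connection string, table name and file paths below are placeholders, and it assumes the pyodbc package.

    import pyodbc  # assumes the pyodbc package and a SQL Server ODBC driver

    # Placeholder connection string, table name and file paths.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
        "DATABASE=Staging;Trusted_Connection=yes;",
        autocommit=True,
    )
    cursor = conn.cursor()

    # The format file tells BULK INSERT which fields in the data file map to
    # which destination columns, so source and destination need not match.
    cursor.execute(r"""
        BULK INSERT dbo.Customers
        FROM 'C:\data\customers.dat'
        WITH (FORMATFILE = 'C:\data\customers.xml')
    """)
    conn.close()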
Mark

Here is a post in which I just finished answering my own question; I thought I would link the two posts together.
SSIS - Export multiple SQL Server tables to multiple text files

Related

How to add CSV file data into multiple tables in an SSIS package

I have data in a .CSV file and I want to insert that data into multiple tables in SQL Server using a single SSIS package.
I tried the Multicast option but could not come up with a solution.
If you have any idea how to do this, please share the solution.
Multicast is the way this is done in SSIS. It will certainly work, as I am using it in my project while reading from files. It seems you missed something or made a mistake while developing. It would help us if you posted the error you got while using Multicast.
Please check this article for more info on Multicast.
Alternatively, as stated by Brad, you can go for a direct insert into a SQL table that acts as a landing layer and write custom code, or a Multicast in a separate data flow task suffices here.
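For what it's worth, the idea a Multicast implements is just "read the source once and send every row to more than one destination". A rough Python sketch of that idea outside SSIS, with made-up table and column names and an assumed pyodbc connection:

    import csv
    import pyodbc  # assumes pyodbc; connection string and names are placeholders

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
        "DATABASE=Staging;Trusted_Connection=yes;"
    )
    cursor = conn.cursor()

    with open("input.csv", newline="") as f:
        for row in csv.DictReader(f):
            # The same row is fanned out to two destinations, like a Multicast.
            cursor.execute(
                "INSERT INTO dbo.Customers (Name, Email) VALUES (?, ?)",
                row["Name"], row["Email"])
            cursor.execute(
                "INSERT INTO dbo.Orders (CustomerName, Amount) VALUES (?, ?)",
                row["Name"], row["Amount"])

    conn.commit()
    conn.close()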
Thanks,
Sree

How to do dynamic column mapping in SSIS

I have a large number of flat files to migrate to SQL Server. Each file has a different column layout based on the metadata defined. Please tell me how I can create a single package to migrate all the files.
If this is a one-time import, you can let the import/export wizard create the package for you.
If you need a more lasting solution, you can use BiML which dynamically creates packages at execution time based on available metadata.
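This is not BiML, but as a sketch of the same metadata-driven idea in plain Python: read each file's header at run time and build the column mapping from it. The file names, table names and connection string below are assumptions.

    import csv
    import pyodbc  # assumes pyodbc; everything below is a placeholder

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};SERVER=localhost;"
        "DATABASE=Staging;Trusted_Connection=yes;"
    )
    cursor = conn.cursor()

    files = {"customers.csv": "dbo.Customers", "orders.csv": "dbo.Orders"}

    for path, table in files.items():
        with open(path, newline="") as f:
            reader = csv.reader(f)
            columns = next(reader)              # the metadata is the header row
            placeholders = ", ".join("?" for _ in columns)
            sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"
            for row in reader:
                cursor.execute(sql, row)

    conn.commit()
    conn.close()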

Extract Transform Load into MySql

I am basically from a Microsoft background, working mostly with SSIS for ETL sorts of projects.
Now I have another project on hand that deals with loading .csv files into a MySQL database. In the process of loading these tables, the data has to go through some transformations and then into the destination tables. It is very much an ETL project.
The client doesn't have SSIS (BIDS), and I am compelled to use open source tools.
I did a bit of research and found that the Talend Data Integration tool best fits my situation.
As I am new to this environment and am sure there are experts in this area, I need some advice on the best tools and best practices for ETL of this type.
If you need any further information, please let me know.
If I remember correctly, PhpMyAdmin can import CSV into MySQL, and this question is about a similar topic too, but these don't come close to what SSIS can offer...
Yes, you are right: Talend Open Studio is a pretty good tool with hundreds of connectors.
In your case, just create a job that takes the CSV as your source and MySQL as the destination, apply any transformations if required, and load it.
You can get more information on CSV-to-MySQL loads, with examples, on the Talend forum.
If you have a base plan, share it with me and I can guide you on how to transfer the CSV to a MySQL table.
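If you end up scripting part of it instead of (or alongside) Talend, a minimal Python sketch of the CSV -> transform -> MySQL flow looks like this. The connection details, file name, table and column names are made up, and it assumes the mysql-connector-python package.

    import csv
    import mysql.connector  # assumes the mysql-connector-python package

    # Placeholder connection details, file, table and column names.
    conn = mysql.connector.connect(host="localhost", user="etl",
                                   password="secret", database="staging")
    cursor = conn.cursor()

    with open("sales.csv", newline="") as f:
        for row in csv.DictReader(f):
            # Example transformations: tidy the name and normalise the amount.
            name = row["customer_name"].strip().title()
            amount = round(float(row["amount"]), 2)
            cursor.execute(
                "INSERT INTO sales (customer_name, amount) VALUES (%s, %s)",
                (name, amount))

    conn.commit()
    conn.close()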

How to load Excel or CSV file into Firebird?

I'm using Firebird database and I need to load Excel file into a database table. I need a tool that does this well. I tried some I found on Google, but all of them have some bugs.
Since Excel data is not created by me, it would be good if it could scan the file and discover what kind of data is inside and suggest a table to be created in the database.
Also, it would be nice if I could compare the file against the data that is already in the database table, and I can pick which data to load and which not.
Tools that load CSV files are also fine, I can "Save as" CSV from Excel before loading.
Well, if you can use CSV, then I guess XMLWizard is the right tool for you. It can load a CSV file and compare it with the database data, and you can select the changes you wish to make to the table.
Don't let the name fool you: it does work with XML, but it also works very well with CSV files. It can also estimate the column datatypes and offer a CREATE TABLE statement for your file.
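This is not XMLWizard, but as a rough illustration of what that kind of datatype estimation amounts to, here is a small Python sketch that guesses column types from a CSV sample and prints a CREATE TABLE statement. The type rules, file name and table name are made up.

    import csv
    from itertools import islice

    def guess_type(values):
        # Very rough guessing: integer, then numeric, otherwise a string column.
        try:
            for v in values:
                int(v)
            return "INTEGER"
        except ValueError:
            pass
        try:
            for v in values:
                float(v)
            return "NUMERIC(18,4)"
        except ValueError:
            return "VARCHAR(255)"

    with open("data.csv", newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        sample = list(islice(reader, 100))      # inspect the first 100 rows only

    columns = [
        f"{name} {guess_type([row[i] for row in sample])}"
        for i, name in enumerate(header)
    ]
    print("CREATE TABLE imported_data (\n  " + ",\n  ".join(columns) + "\n);")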
Have you tried FSQL?
It's freeware, very similar to Firebird's standard ISQL, but with some extra features, like importing data from CSV files.
I've used it with DBF files and it worked fine.
There is also the EMS Data Import tool for Firebird and InterBase:
http://www.sqlmanager.net/en/products/ibfb/dataimport
It is not free, but it accepts a wide variety of formats, including CSV and Excel.
EDIT
Another similar payware tool is Firebird Data Wizard http://www.sqlmaestro.com/products/firebird/datawizard/
There are some online tools that can help you generate DDL/DML scripts from a CSV header/sample dump file; check out: http://www.convertcsv.com/csv-to-sql.htm
You can then use sql-workbench's Data Pumper or WbImport tool from the command line.
Orbada also has a GUI that supports importing CSV files.
DBeaver Free Edition also supports importing CSV out of the box.
BULK INSERT
Another way is to build formulas in new cells in Excel containing the data you want to export. The formulas format the values as strings whose lengths match the lengths of the corresponding fields in Firebird. You can then copy all of these cells from Excel and paste them into a text editor, which makes it possible to use the BULK INSERT strategy in Firebird.
See more details at http://www.firebirdfaq.org/faq209/
The catch is if you have BLOB or NULL data to import, so check whether you have those kinds of values and whether this approach suits you.
If you have formatted data in a txt file, BULK INSERT will be a quick way.
Hint: you can also disable the triggers and indexes associated with your table to accelerate the BULK INSERT, and re-enable them afterwards.
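As a sketch of that hint, assuming the fdb Python driver and made-up DSN, trigger and index names (the bulk load itself is whatever script the steps above produce):

    import fdb  # assumes the fdb package; DSN, user and names are placeholders

    conn = fdb.connect(dsn="localhost:/data/mydb.fdb",
                       user="SYSDBA", password="masterkey")
    cur = conn.cursor()

    # Disable the trigger and index before the bulk load...
    cur.execute("ALTER TRIGGER trg_mytable_bi INACTIVE")
    cur.execute("ALTER INDEX idx_mytable_name INACTIVE")
    conn.commit()

    # ...run the bulk load here (for example the INSERT script built in Excel)...

    # ...and re-enable them afterwards.
    cur.execute("ALTER TRIGGER trg_mytable_bi ACTIVE")
    cur.execute("ALTER INDEX idx_mytable_name ACTIVE")
    conn.commit()
    conn.close()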
Roberto Novakosky
I load the Excel file into a Lazarus spreadsheet and then export it to the Firebird DB. Everything is fine; the only problem is that fpspreadsheet will treat a string field containing only numbers as a number field. I can check the titles in the first row to see whether the Excel file is valid or not.
As far as I can see, all replies so far focus on tools that essentially read the Excel (or CSV) file and use SQL inserts to insert the records into the Firebird database. While this works, I have always found this approach painstakingly slow.
That's why I created a tool that reads an Excel file and writes one file in a (text) format suitable for a Firebird external table (including support for UTF8 char columns), and one DDL file to create the external table in Firebird.
I then use regular SQL to select from the external table, cast as needed, and insert into whatever normal Firebird table I want. The performance with this approach is orders of magnitude faster than SQL inserts from a client app in my experience.
I would be willing to publish the tool. It's written in C#. Let me know if there's any interest.
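Not the tool mentioned above, but a rough Python sketch of the same external-table idea, with made-up field widths, names and paths (note the server's firebird.conf must allow ExternalFileAccess to the directory used):

    import csv

    NAME_W, AMOUNT_W = 30, 12   # fixed field widths assumed for this example

    # Write a fixed-record-length text file that Firebird can read as an
    # external table. The trailing newline pads a CHAR(1) column; a Windows
    # CRLF line ending would need CHAR(2) instead.
    with open("data.csv", newline="") as src, open("/data/import.txt", "w") as dst:
        for row in csv.DictReader(src):
            dst.write(row["name"].ljust(NAME_W)[:NAME_W]
                      + row["amount"].rjust(AMOUNT_W)[:AMOUNT_W] + "\n")

    ddl = """
    CREATE TABLE ext_import EXTERNAL FILE '/data/import.txt' (
      name   CHAR(30),
      amount CHAR(12),
      eol    CHAR(1)
    );
    """

    load = """
    INSERT INTO real_table (name, amount)
    SELECT TRIM(name), CAST(amount AS NUMERIC(12,2)) FROM ext_import;
    """

    print(ddl)    # run these against the database with isql or your driver of choice
    print(load)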

Best way to gather, then import data into drupal?

I am building my first database driven website with Drupal and I have a few questions.
I am currently populating a Google Docs spreadsheet with all of the data I want to eventually be able to query from the website (after it's imported). Is this the best way to start?
If this is not the best way to start what would you recommend?
My plan is to populate the spreadsheet, then import it as a CSV into the MySQL DB via CCK nodes.
I've seen two ways to do this.
http://drupal.org/node/133705 (importing data into CCK nodes)
http://drupal.org/node/237574 (Inserting data using spreadsheet/csv instead of SQL insert statements)
Basically, my question is: what is the best way to gather and then import data into Drupal?
Thanks in advance for any help, suggestions.
There's a comparison of the available modules at http://groups.drupal.org/node/21338
In the past, when I've done this, I simply wrote code to do it on cron runs (see http://drupal.org/project/phorum for an example framework that you could strip down and build back up to do what you need).
If I were to do this now, I would probably use the http://drupal.org/project/migrate module, where the philosophy is "get it into MySQL, View the data, Import via GUI."
There is a very good module for this: Node import. It allows you to take your Google Docs spreadsheet and import it as a .csv file.
It's really easy to use: the module allows you to map your .csv columns to the node fields you want them to go to, so you don't have to worry about putting your columns in a particular order. Also, if there is an error on some records, it will spit out a .csv with the failed records and what caused the errors, but it will import all of the good records.
I have imported up to 3000 nodes with this method.