Reading Excel from DB (varbinary) in SSIS

First I'd like to say that I'm brand new to SSIS, so bear with me if this is a very basic question. I've searched and cannot find an answer.
I need to read data from SQL Server that is stored in a varbinary column containing an Excel document. I then need to store this data in another table with the appropriate columns (a pre-defined format).
My question is essentially: how do I read this varbinary data into something I can work with and then insert into another table?

You could use the Export Column transformation, available within the Data Flow Task, to read the varbinary data and save it as a file on the local disk where the SSIS package is running.
MSDN documentation about the Export Column transformation.
Sample: The Export Column Transformation on BI Monkey
Using another Data Flow Task, you can then read the saved file and import the data into the table of your choice.
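A minimal sketch of the source query for that first Data Flow Task, assuming a hypothetical table dbo.Documents with an Id key and a FileData varbinary(max) column (all names here are illustrative):

-- Source query for the OLE DB Source feeding the Export Column transformation.
-- FileData is mapped as the Extract Column; FilePath as the File Path Column.
SELECT
    Id,
    FileData,  -- the varbinary column holding the Excel document
    N'C:\Temp\Doc_' + CAST(Id AS nvarchar(20)) + N'.xlsx' AS FilePath  -- one output file per row
FROM dbo.Documents;

The Export Column transformation then writes the bytes of FileData out to the path supplied in FilePath for every row.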

Related

How to handle a mid-load data failure in SSIS

Hi, I have a question about SSIS.
I want to load data from a source Excel file into a SQL Server database table.
The source Excel file holds billions of rows (a huge amount of data).
During the load, half of the records were loaded into the destination table, and then the package failed because some data arrived in an incorrect format.
In this situation, how should the package be built so that all of the data gets loaded into the destination using SSIS?
Source: Excel (employee information data)
Destination: table emp
I tried using checkpoint configuration to rerun from the point of failure, but it is not useful for handling failures at the row level, and duplicate data gets loaded.
I also tried another way: truncate the data in the destination table, and then use Redirect Row for error handling. But that is not a good implementation, because it truncates the destination table.
Please tell me the ways to achieve this task (a complete load) at the SSIS package level.
Load your data from Excel into a staging table, which you truncate before every load.
Make all of the columns of the staging table nvarchar(max), so they can accept incoming character data in any format.
Then run a stored procedure that de-dupes, formats, and transfers the data to the final destination table.
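A minimal T-SQL sketch of the pattern, with a hypothetical staging table stg_emp and illustrative column names:

-- Staging table: all nvarchar(max), so no incoming row can fail on format.
CREATE TABLE dbo.stg_emp (
    emp_id    nvarchar(max),
    emp_name  nvarchar(max),
    hire_date nvarchar(max),
    salary    nvarchar(max)
);
GO
CREATE PROCEDURE dbo.usp_load_emp
AS
BEGIN
    SET NOCOUNT ON;
    -- De-dupe on emp_id and convert while transferring.
    -- TRY_CONVERT returns NULL instead of failing on bad formats.
    WITH deduped AS (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY emp_id ORDER BY emp_id) AS rn
        FROM dbo.stg_emp
    )
    INSERT INTO dbo.emp (emp_id, emp_name, hire_date, salary)
    SELECT TRY_CONVERT(int, emp_id),
           emp_name,
           TRY_CONVERT(date, hire_date),
           TRY_CONVERT(decimal(12, 2), salary)
    FROM deduped
    WHERE rn = 1;
END;

The package then becomes: Execute SQL Task (TRUNCATE TABLE dbo.stg_emp), Data Flow Task (Excel to staging), Execute SQL Task (EXEC dbo.usp_load_emp). A rerun after a failure simply starts over without duplicating rows.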

SSIS importing ragged right flat file

I am very new to SSIS development. Please help me.
I have a ragged right flat file with no column names, which I need to import into a SQL Server database.
The problem is that when importing this file using the Flat File connection manager, all the columns are stored as varchar. I have various columns, such as Amount (which should be a decimal data type), Percentage (which should be a decimal data type), and Date (which should be a date data type in mm/dd/yyyy format).
For example, the Amount field is loaded as 000012347834; I need to change it to 123478.34.
The Percentage field is loaded as 03246; I need it as 03.246.
How can I do these conversions using SSIS?
Thank you in advance for your valuable time and help.
You can use the advanced editor on your connection manager to change the data types.
You can also use a Data Conversion transformation, which is more along the lines of what SSIS is designed for... Transform is the T in ETL.
However, most often, if I have a lot of conversions to do on a text file import, I will load into a staging table with all varchar columns and use a stored procedure to do the conversions.
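As a sketch, the conversion query inside such a stored procedure could look like this, assuming a hypothetical staging table stg_import with varchar columns amount_raw, pct_raw and date_raw; the implied-decimal rules match the formats in the question (two implied decimals for the amount, three for the percentage):

-- '000012347834' with two implied decimal places -> 123478.34
-- '03246' with three implied decimal places -> 3.246
-- The date arrives as mm/dd/yyyy text, so convert with style 101.
SELECT
    CAST(TRY_CAST(amount_raw AS bigint) / 100.0  AS decimal(18, 2)) AS amount,
    CAST(TRY_CAST(pct_raw    AS int)    / 1000.0 AS decimal(8, 3))  AS percentage,
    TRY_CONVERT(date, date_raw, 101) AS [date]
FROM dbo.stg_import;

TRY_CAST and TRY_CONVERT return NULL rather than raising an error on malformed values, which keeps a single bad row from failing the whole load.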

dynamically adding derived column in SSIS

I have a scenario where my source can be on different versions of our database; as a result, the source file could have a different number of columns, while my destination has a defined number of columns.
Now, what we are trying to do is:
load data from the source to flat files, move them to a central server, and
then load that data into the central database. But if any column is
missing in the flat file, I need to add a derived column.
What is the best way to do this? How can I dynamically add derived columns?
You can either do this with BimlScript, as others have suggested in the comments, or you can write a Script Task that reads the file, analyzes the contents, and imports it. Yet another option would be to bulk import the file as-is into a staging table (which would have to be dropped and re-created every time) and write a stored procedure that analyzes the DDL and contents and imports the data into the destination table.
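As a sketch of that last option in T-SQL: assume a hypothetical staging table dbo.stg_load (re-created per file) and a destination dbo.dest with a fixed column list. INFORMATION_SCHEMA tells us which columns this file version actually delivered, and a default is substituted for any missing one (all names illustrative):

-- Does this file version include the optional 'region' column?
DECLARE @has_region bit =
    CASE WHEN EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.COLUMNS
                      WHERE TABLE_NAME = 'stg_load' AND COLUMN_NAME = 'region')
         THEN 1 ELSE 0 END;

-- Build the insert dynamically: copy the column if present, else a default.
DECLARE @sql nvarchar(max) =
    N'INSERT INTO dbo.dest (id, name, region) '
  + N'SELECT id, name, '
  + CASE WHEN @has_region = 1 THEN N'region' ELSE N'''UNKNOWN''' END
  + N' FROM dbo.stg_load;';

EXEC sys.sp_executesql @sql;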

Creating a table from a .csv file using PDI Kettle

I'm new to PDI and I'm working with PDI Kettle. I have 40 .csv files with different numbers of columns, and I want to create tables out of those files in a single transformation. I have used a "CSV file input" step to select a file and a "Table output" step to create the table, but to create 40 tables out of those 40 files I would need to add these two steps again and again. Is there any way to create all 40 tables in one go, in a single transformation? Please help me with this.
Thanks in advance.
Doing this in Pentaho with the standard steps is a bit involved. To read the CSV and get the headings, and then read the data, you need to use ETL metadata injection.
First read the header with the column names, then use ETL metadata injection to read the data in another transformation.
Auto-creating the tables is not straightforward, as this is something the main developer of Pentaho discourages.
Here is an answer with an example of how to auto-create a table: Perform an auto CREATE TABLE to store the output of a transformation.
So you would run a job that passes the filename and table name to a transformation. The transformation would use ETL metadata injection to read the CSV into the correct fields, and meta.getSQLStatementsString() to get the DDL for the table that will store the data.
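For illustration only: assuming one of the CSV files has columns id, name and amount, the auto-generated DDL returned for the Table output step would look something like this (the exact types depend on what the metadata scan detects):

CREATE TABLE csv_file_01
(
  id BIGINT
, name VARCHAR(50)
, amount DOUBLE
);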

Importing data from MySQL to Excel

I have data in an Excel sheet (CSV format), and I have imported this data into MySQL and filtered it based on dates (only the years 2014 and 2015 have been selected).
The client wants the data back in Excel, so I have to export the data I extracted by date back into Excel. I believe this would involve a temporary table. So, how do we export a temporary table to Excel?
I don't know how to use a MySQL-to-Excel converter in this case, as a temporary table is being used.
Here is a tool from MySQL that handles this:
https://www.mysql.com/why-mysql/windows/excel/
MySQL for Excel makes the task of getting MySQL data into Excel a very easy one; there are no intermediate CSV files required, only a couple of clicks, and data will be imported into Excel. MySQL for Excel supports importing data from tables, views and stored procedures.
The easiest way of doing this could be as follows:
select <fieldnames>
from <temporary table>
Then copy and paste the results into Excel, save in Excel format, and voila!
Edit: if you are using a tool like phpMyAdmin, you can just export the results of your query into the desired format. If you want the headers, choose "include column names" in the "advanced" export options.
If you just need a document that opens in Excel, you could export the table as a CSV file from MySQL.
http://www.mysqltutorial.org/mysql-export-table-to-csv/
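For example, a minimal sketch (the table name and path are illustrative, and the MySQL server process needs write access to the target directory):

SELECT order_id, customer, order_date, amount
FROM orders_tmp
INTO OUTFILE '/tmp/export_2014_2015.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

The resulting file opens directly in Excel.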
If you have to create an actual Excel worksheet, then you would have to use a library, chosen based on the programming language you are using, to convert the data into an Excel spreadsheet.