Transformation to change column alignment in SSIS - ssis

I have a package that picks up an Excel file from a location and loads it into a table in SQL Server. (The Excel file is produced by another package.)
The Excel file has columns named A, B, C, D.
I want the columns to be loaded in the order A, B, D, C (the C and D sequence swapped). Is there a way I can achieve this in SSIS? The person dropping the file does not want to change it manually.
Currently I have data flow as:
Excel Source Plus --> Row Count --> OLEDB Destination
Thanks :)

You can map fields in your destination in any order you wish. The columns can be in different positions or even have different names.
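The same idea can be sketched outside SSIS: it is the destination's mapping, not the source order, that decides where each value lands. A minimal Python sketch (column names A–D taken from the question, data values hypothetical):

```python
import csv
import io

# Source rows arrive in order A, B, C, D (as in the Excel file).
source = io.StringIO("A,B,C,D\n1,2,3,4\n5,6,7,8\n")
reader = csv.DictReader(source)

# The destination mapping names the target order explicitly: A, B, D, C.
target_order = ["A", "B", "D", "C"]

out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=target_order)
writer.writeheader()
for row in reader:
    # Each value is written under its own name, regardless of source position.
    writer.writerow(row)

print(out.getvalue())
```

Because each source column is matched by name, no transformation is needed in the data flow itself; the reordering happens entirely in the destination mapping.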

Related

Remove Column from CSV file using SSIS

I have a CSV file that I am using as a source in SSIS. The file contains additional blank columns.
There are additional blank columns at positions S, U, and V; is there a way I can remove these columns through an SSIS Script Task before using the file as a source?
Perhaps I misunderstand the problem, but you have a CSV with blank columns that you'd like to import into SQL Server but you do not want to import the blank columns into your table.
So, don't create the target columns in your table. That's it. There's no problem with having "extra" data in your data pipeline, just don't land it.
The Flat File Connection Manager must* have a definition for all the columns. Since the blank columns have no headers and a column name is required, set up the connection manager as if the file has no header row, but then skip the first row to avoid importing the header as data (it may be the "data starts on line" setting; I'm doing this from memory). Specifying no header row means you must provide the column names manually. I favor naming them something obvious like IgnoreBlankColumn_S, IgnoreBlankColumn_U, IgnoreBlankColumn_V so that future maintainers will know this is an intentional design decision and that those source columns carry no data.
*You can write a query against a text file which would allow you to only pull in specific columns but this is not going to be worth the effort.
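The "name everything, land only what you want" approach can be sketched in Python (the layout, column names, and data below are hypothetical, stand-ins for the real file):

```python
import csv
import io

# A headerless file with blank columns in the middle (hypothetical layout).
raw = io.StringIO("alice,,10,,x\nbob,,20,,y\n")

# As in the Flat File Connection Manager, every column needs a name;
# blanks get an obviously-intentional name so future maintainers know.
names = ["Name", "IgnoreBlankColumn_B", "Amount", "IgnoreBlankColumn_D", "Code"]
landed = ["Name", "Amount", "Code"]  # only these are mapped to the table

rows = []
for rec in csv.DictReader(raw, fieldnames=names):
    # The blank columns flow through the pipeline but are never landed.
    rows.append({k: rec[k] for k in landed})

print(rows)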

SSIS package to Insert flat file data to two different tables

I want to insert flat file data into two different SQL tables. Some additional fields coming from the flat file should be inserted into a second table on the basis of an indicator field, while the usual fields should be inserted into the regular table.
The other issue: the additional field cannot be inserted directly, because there is no column mapping for it.
eg:
1234 056 Y Tushar
5678 065 N
So 1234 and 056 should be inserted into the regular table, but the indicator Y tells us that Tushar should be inserted into the other table.
But that insert cannot be done directly, because the other table has no column corresponding to 1234.
For indicator N, the row should simply be inserted into the base table as normal.
What I did was use a Conditional Split and then an OLE DB Command, but it is inserting multiple records into the table.
If you put a Multicast task right after your flat file source, you can create extra copies of your data set. Then you can use one copy to insert into Regular Table, and then you can put your Conditional Split on the second copy.
Your data flow would then be: Flat File Source --> Multicast, with one output going straight to the Regular Table destination and the other going through the Conditional Split to the Other Table destination.
In my Flat File Source I defined four columns (in this example: ID, Seq, Indicator, and Name).
The Multicast doesn't need any configuration, and I assume the Regular Table destination isn't giving you the trouble. So next, you'd create the indicator check with a Conditional Split, using a condition such as Indicator == "Y".
Then just map whichever available columns you want to insert into Other Table. I chose the second column (I called mine Seq) and the Name column. You may have these named differently.
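The Multicast-plus-Conditional-Split flow can be sketched in Python (column names Seq and Name come from the answer; the tuple layout is assumed from the sample data):

```python
# Each input row: (id, seq, indicator, name) — layout assumed from the example data.
rows = [
    ("1234", "056", "Y", "Tushar"),
    ("5678", "065", "N", ""),
]

# Multicast: every row goes to the regular table unchanged.
regular_table = [(r[0], r[1]) for r in rows]

# Conditional Split on the second copy: only Indicator == "Y" rows feed the
# other table, mapping only the columns that exist there (Seq and Name).
other_table = [(r[1], r[3]) for r in rows if r[2] == "Y"]

print(regular_table, other_table)
```

Because the split operates on a copy, every row lands in the regular table exactly once, which avoids the duplicate inserts seen with the OLE DB Command approach.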

Flat File Destination has multiple Inputs from custom data flow task

I have an SSIS Package setup with the following Data Flow Tasks in order:
Flat File Source
Derived Column
Custom Task
Flat File Destination
The Flat File source contains fixed-width rows of data (282 characters per row).
The Derived Column splits each row into columns using the SUBSTRING() function.
The Custom Task performs some Regular Expression validation and creates two new output columns: RowIsValid (a DT_BOOL) and InvalidReason (a DT_WSTR of 200). There is no Custom UI for this Task.
The Flat File Destination is the validated data in delimited column format. Eventually, this would be a
database destination.
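Outside SSIS, the combined Derived Column split and the custom task's regex validation can be sketched in Python (the 12-character layout, column names, and patterns below are hypothetical; the real file is 282 characters per row):

```python
import re

# Hypothetical fixed-width layout for illustration:
# chars 0-3 = id (digits), 4-6 = seq (digits), 7-11 = code (letters).
def split_and_validate(row: str) -> dict:
    # Derived Column step: SUBSTRING-style slicing into columns.
    cols = {"id": row[0:4], "seq": row[4:7], "code": row[7:12].rstrip()}
    # Custom task step: regex checks add RowIsValid / InvalidReason columns.
    checks = [("id", r"\d{4}"), ("seq", r"\d{3}"), ("code", r"[A-Za-z]+")]
    for name, pattern in checks:
        if not re.fullmatch(pattern, cols[name]):
            return {**cols, "RowIsValid": False,
                    "InvalidReason": f"{name} failed {pattern}"}
    return {**cols, "RowIsValid": True, "InvalidReason": ""}

print(split_and_validate("1234056ABCDE"))
print(split_and_validate("12X4056ABCDE"))
```

Centralizing this logic in one place is exactly the motivation given for the Custom Task over multiple Script Tasks.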
I know that this can be done using a Script Task. In fact, I am currently doing so in my solution. However, what I am trying to accomplish is building a Custom Task so that code changes are made in a single spot instead of across multiple Script Tasks.
I have a couple of issues I'm trying to overcome and am hoping for some help/guidance:
(Major) Currently, when I review the mappings of the Flat File Destination, the Available Input columns are coming from the Flat File Source, the Derived Column Task, and the Custom Task. Only one column is coming from the Flat File Source (because there is only one column), while the Derived Column and Custom Task each have all of the columns created in the Derived Column.
My expectation is that the Available Input Columns would/should only display the Custom Validator.[column name] columns (with only the column name) from the Custom Validator. Debugging, I don't see where I can manipulate and suppress the Derived Column.[column name] columns.
(Minor) Getting the input columns from the Derived Column Task to automatically be selected or used when the Input is attached.
Currently, after hooking up the input and output of the Custom Validator, I have to go to the Inputs tab on the Advanced Edit and select the columns I want. I'm selecting all, because I want all columns to go through the task, even though only some will be validated by the task.

Split the Table into multiple excel files using ssis

There is a table with 5000 records; I need to split it into 10 Excel files with names like Jan_Dept_Records.xlsx, Feb_Dept_Records.xlsx, etc. How can I achieve this with SSIS?
Here the "Dept" part of the Excel file name comes from the dept column of the source table.
I understand the use of a Foreach Loop and a Data Flow Task inside the Foreach Loop.
You could use a Conditional Split: write one case per group of records and pass each output to its own Excel destination. Insert an identity column first; on the basis of that you can differentiate the groups.
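The grouping logic can be sketched in Python (the month/dept values and the file-name pattern come from the question; dicts stand in for the Excel destinations):

```python
# Sketch: group rows and derive one output "file" per group, named
# <Month>_<Dept>_Records.xlsx as in the question.
rows = [
    {"month": "Jan", "dept": "HR",    "value": 1},
    {"month": "Jan", "dept": "HR",    "value": 2},
    {"month": "Feb", "dept": "Sales", "value": 3},
]

files = {}
for row in rows:
    # Each distinct (month, dept) pair becomes its own Excel file.
    name = f"{row['month']}_{row['dept']}_Records.xlsx"
    files.setdefault(name, []).append(row)

print(sorted(files))
```

In SSIS the same effect comes from a Foreach Loop over the distinct month/dept values, with an expression-driven connection string building each file name.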

Importing excel file to access and set up columns field name

I have an Access tool into which I'm importing an Excel file with table information. The import creates a new table with generic column fields (F1, F2, F3, etc.); under them there are 10 lines of data and after that a table. I need the information from this table to be appended to another table in Access. I have the code and the append query, but sometimes some of the columns in the Excel file change places, and this is a problem for my second table. Is it possible to automatically rename the column fields in the first table when importing the info from the Excel sheet?
Thank you in advance! Here is a screenshot: the yellow table should go into the grey one.
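One way to make the append robust is to map by header text rather than by position, so swapped columns still land correctly. A hedged Python sketch (the F-names mirror Access's generic fields; the header values and data are hypothetical):

```python
# The imported table has generic field names F1, F2, ... and the real
# header text appears in one of the data rows. Mapping by header text
# instead of position survives columns swapping places.
imported = [
    {"F1": "Amount", "F2": "Name"},   # the row holding the real headers
    {"F1": "10",     "F2": "alice"},
    {"F1": "20",     "F2": "bob"},
]

header = imported[0]                  # F-name -> real column name
renamed = [{header[k]: v for k, v in row.items()} for row in imported[1:]]

print(renamed)
```

The append query can then reference the real column names regardless of which F-column they arrived in.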