I want to create an SSIS package which takes values from a flat file and inserts them into a database table, matching on the company name.
for example:
I have table fields:
Date SecurityId SecurityType EntryPrice Price CompanyName
2011-08-31 5033048 Bond 1.05 NULL ABC Corp
Now I want to insert Price into this table, but I need to match on CompanyName.
The catch is that in the file the CompanyName is abbreviated, e.g. just 'ABC', so how can I check for a match and insert only that company's data?
My file has about 20 records like this, each with a different company name.
What I tried: I set up a Lookup transformation.
My problem is that I need to match the company name from the flat file and insert that company's price into the table, but in the flat file the company name is given like 'AK STL' while in the table it is 'AK STEEL CORPORATION'. I used a column transformation for this, but what expression can I write to find a match? The same applies to the other company names; only the first 2-3 characters appear in the flat file. Please help.
Basically, you are looking to "upsert" your data into the database. With as few records in your dataset as you describe, a simple Lookup-based upsert will suffice. With larger datasets, you will probably want to look into using temp tables and SQL logic similar to this:
--Insert Portion
INSERT INTO FinalTable
( Columns )
SELECT T.TempColumns
FROM TempTable T
WHERE
(
SELECT 'Bam'
FROM FinalTable F
WHERE F.Key(s) = T.Key(s)
) IS NULL
--Update Portion
UPDATE FinalTable
SET NonKeyColumn(s) = T.TempNonKeyColumn(s)
FROM TempTable T
WHERE FinalTable.Key(s) = T.Key(s)
AND CHECKSUM(FinalTable.NonKeyColumn(s)) <> CHECKSUM(T.NonKeyColumn(s))
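For a concrete, runnable version of the same insert-then-update pattern, here is a minimal sketch using Python's standard-library sqlite3 module; the table and column names (SecurityId, Price) are hypothetical stand-ins for the Key(s)/NonKeyColumn(s) placeholders above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE FinalTable (SecurityId INTEGER PRIMARY KEY, Price REAL);
CREATE TABLE TempTable  (SecurityId INTEGER PRIMARY KEY, Price REAL);
INSERT INTO FinalTable VALUES (5033048, 1.05);
INSERT INTO TempTable  VALUES (5033048, 1.10), (5099001, 2.00);
""")

# Insert portion: rows whose key is missing from the final table
conn.execute("""
INSERT INTO FinalTable (SecurityId, Price)
SELECT T.SecurityId, T.Price
FROM TempTable T
WHERE NOT EXISTS (SELECT 1 FROM FinalTable F WHERE F.SecurityId = T.SecurityId)
""")

# Update portion: rows whose key exists but whose non-key value changed
conn.execute("""
UPDATE FinalTable
SET Price = (SELECT T.Price FROM TempTable T
             WHERE T.SecurityId = FinalTable.SecurityId)
WHERE EXISTS (SELECT 1 FROM TempTable T
              WHERE T.SecurityId = FinalTable.SecurityId
                AND T.Price <> FinalTable.Price)
""")

print(sorted(conn.execute("SELECT SecurityId, Price FROM FinalTable")))
# → [(5033048, 1.1), (5099001, 2.0)]
```

SQLite has no CHECKSUM(), so the sketch compares the changed column directly; the shape of the two statements is otherwise the same.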
Currently, I find it time-consuming to copy specific columns and values from Table 1 and paste them into Table 2 manually.
I have two tables ( In the same database ) like this:
How can I save time by doing the following? -
Grab specific ID from Table 1, in this case 1 and 2.
and copy just the Pcode & Desc values onto Table 2?
This is the end result I want to achieve (screenshot below)
The ID will be new, because it's a new record. So technically I am updating Table 2 with new values that I have copied from Table 1.
Every column is a varchar column except the IDs.
Also, I am using MySql Workbench.
This should do the trick;
INSERT INTO table2 (PCode, `Desc`)
SELECT PCode, `Desc`
FROM table1
WHERE table1.id IN (1, 2);
Note that Desc is a reserved word in MySQL, so it has to be backtick-quoted when used as a column name.
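As a quick sanity check of the logic, here is a sketch using Python's sqlite3 with made-up sample rows (SQLite also treats Desc as a keyword, so the column is quoted there too):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE table1 (id INTEGER PRIMARY KEY, PCode TEXT, "Desc" TEXT);
CREATE TABLE table2 (id INTEGER PRIMARY KEY AUTOINCREMENT, PCode TEXT, "Desc" TEXT);
INSERT INTO table1 VALUES (1, 'P1', 'first'), (2, 'P2', 'second'), (3, 'P3', 'third');
""")

# Copy only the PCode/Desc values for ids 1 and 2; table2 assigns fresh ids
conn.execute("""
INSERT INTO table2 (PCode, "Desc")
SELECT PCode, "Desc" FROM table1 WHERE id IN (1, 2)
""")

print(conn.execute('SELECT id, PCode, "Desc" FROM table2').fetchall())
# → [(1, 'P1', 'first'), (2, 'P2', 'second')]
```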
I'm rather new to SQL queries and not sure how to approach this:
I have a CSV file that contains 5 columns; 2 of those columns are Value1 and Value2. I need to run over an existing SQL table (for this question's purposes I'll call it the "target table") and iterate over all its rows, checking their Value1 column: if a row's Value1 matches the one in the CSV, I need to insert the CSV's Value2 into that row's Value2 column; if the Value1 is not contained in the table, I need to create a new row for it.
Just in case I wasn't clear, here's an example -
assuming the CSV looks like the following:
Name, Age, Location, Height, Weight
David, 12, Macedonia, 1.87, 96
Kim, 15, Denmark, 1.95, 67
I want to go over the existing SQL table and work according to Name and Weight only: if the name David is in the table, insert 96 into its Weight column; if the name Kim is in the table, insert 67 into its Weight column, etc.
If the table only contained Kim and not David, then the David row would be created.
I'm assuming the wise way would be to first fill in the Value1 rows that don't yet exist in the table and only then run an update on Value2, but I might be wrong.
Any help would be much appreciated, thanks!
Theoretically, I think this should work for you.
--Part 1: Clear/Create temporary table and Load CSV into SQL. Credit to mr_eclair for describing this process here
if object_id('tempdb..#temp') is not null drop table #temp
create table #temp (
tName nvarchar(25),
tAge int,
tLocation nvarchar(25),
tHeight decimal(3,2), -- float(m,d) isn't valid T-SQL; alternatively, store cm instead of m and just use int
tWeight int
)
BULK INSERT #temp
FROM 'C:\CSVData\updates.csv'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = ',', --CSV field delimiter
ROWTERMINATOR = '\n', --Use to shift the control to next row
TABLOCK
)
--Part 2: Setting a Unique Key; as suggested by @Yuri_Lachin
Alter table target
Add Unique (Name) -- Sets Name column as a Unique Key for the table target
--Part 3: Adding rows and Updating values from temp table to permanent table. Credit to MySQL 5.7 Reference Manual 13.2.5.2
Insert into target(Name, Age, Location, Height, Weight)
Select tName, tAge, tLocation, tHeight, tWeight from #temp
On DUPLICATE KEY Update Weight = tWeight
I was going to suggest using a Merge statement like the following, but it looks like MySQL doesn't deal with those.
Merge Into target
using #temp
on target.name = #temp.tname
when matched then Update
set target.weight = #temp.tweight
when not matched then
Insert (name, age, location, height, weight)
values (#temp.tname, #temp.tage, #temp.tlocation, #temp.theight, #temp.tweight);
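For what it's worth, the ON DUPLICATE KEY idea can be sketched end-to-end with Python's standard-library sqlite3, whose INSERT ... ON CONFLICT clause (SQLite 3.24+) plays the same role; the table and sample rows below are hypothetical, based on the CSV in the question:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE target (Name TEXT PRIMARY KEY, Age INT, Location TEXT,
                     Height REAL, Weight INT);
INSERT INTO target VALUES ('Kim', 15, 'Denmark', 1.95, NULL);
""")

csv_rows = [("David", 12, "Macedonia", 1.87, 96),
            ("Kim", 15, "Denmark", 1.95, 67)]

# Upsert each CSV row: insert if the Name is new, otherwise update Weight only
conn.executemany("""
INSERT INTO target (Name, Age, Location, Height, Weight)
VALUES (?, ?, ?, ?, ?)
ON CONFLICT(Name) DO UPDATE SET Weight = excluded.Weight
""", csv_rows)

print(conn.execute("SELECT Name, Weight FROM target ORDER BY Name").fetchall())
# → [('David', 96), ('Kim', 67)]
```

The existing Kim row gets only its Weight filled in, while David is created from scratch, which matches the behavior the question asks for.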
I would like to have the results of my query (that returns one row) to be displayed in text like this:
columnA: value
columnB: value
columnC: value
as happens in mysql when using
select * from tablename \G
Is there a way to do this? The reason is that it is helpful to be able to print out one record, with columns and values, as example data, or to share a record from a table that has many columns and would be hard to view across the screen.
It's not quite so simple as your MySQL example, but you can do an unpivot to get what you want.
---------------
-- TEST SCHEMA
---------------
declare @tablename as Table(keyvalue varchar(2), dataColA varchar(2), dataColB varchar(2), dataColC varchar(2))
insert into @tablename select '01', '02', '03', '04'
---------------
-- UNPIVOT
---------------
select dataColumns, dataValues
from @tablename
unpivot
(
dataValues
for dataColumns in (keyvalue, dataColA, dataColB, dataColC)
) u;
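A client-side alternative, sketched here in Python with sqlite3 (any DB-API driver exposing cursor.description would work the same way), is to pair each column name with its value after fetching the single row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (columnA TEXT, columnB TEXT, columnC TEXT)")
conn.execute("INSERT INTO t VALUES ('a', 'b', 'c')")

cur = conn.execute("SELECT * FROM t LIMIT 1")
row = cur.fetchone()
# Pair each column name (from cursor.description) with its value, \G style
lines = [f"{col[0]}: {val}" for col, val in zip(cur.description, row)]
print("\n".join(lines))
# → columnA: a
#   columnB: b
#   columnC: c
```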
The easiest way to accomplish what I want is to:
1. Execute the query to a results grid, limiting to top 1 if necessary to ensure only one row is returned.
2. Right-click in the top-left corner and Copy with Headers.
3. Open Excel and paste.
4. Select what was just pasted and copy again within Excel.
5. Go to a blank area of the workbook or a new worksheet and Paste Special > Transpose.
This will create one row per database query column with column name in column A and value in column B.
I have a transactions table in a flat file like
ItemID ,ItemName ,CustomerID ,CustomerName ,Qty ,Price ,TotalValue
and target transaction table will have
ItemID,CustomerID,Qty,Price,TotalValue
Now I have to import it into the transactions table using an SSIS package.
But before importing, I should look up ItemID and CustomerID in the lookup tables ItemMaster and CustomerMaster; if they are not there, I have to insert new tuples into those tables, take the new ItemID or CustomerID, and import the transaction into the transactions table. This can be done using Lookup transformations in SSIS.
Or is it better to import the transactions into a temporary table using an SSIS package, update the new ItemIDs and CustomerIDs in the temporary table, and then insert the transactions from the temp table into the main transactions table?
Which option is better performance-wise?
There are several ways of doing it:
1. Using a staging table
2. Using Lookup
3. Transforming the stored procedure logic in SSIS
1. Using a Staging Table
Dump all the flat file data into a staging table; let's name it StgTransaction. Then create a procedure to perform the tasks:
Merge ItemMaster target
using StgTransaction src
on target.ItemName = src.ItemName
WHEN NOT MATCHED THEN
INSERT (ItemName)
values (src.ItemName);

Merge CustomerMaster target
using StgTransaction src
on target.CustomerName = src.CustomerName
WHEN NOT MATCHED THEN
INSERT (CustomerName)
values (src.CustomerName);
with cte(ItemID, CustomerID, Qty, Price, TotalValue) as
(
Select I.ItemID,
C.CustomerID,
f.Qty, f.Price, f.TotalValue
from ItemMaster I inner join StgTransaction f
on I.ItemName = f.ItemName
inner join CustomerMaster c
on c.CustomerName = f.CustomerName
)
Insert into Transactions
Select ItemID, CustomerID, Qty, Price, TotalValue
from cte
Basically, I'm inserting all the missing values into the 2 master tables using the MERGE syntax. Instead of MERGE you can use NOT EXISTS:
Insert into ItemMaster
Select ItemName from stgTransaction s
where not exists
(Select 1 from ItemMaster im
where im.ItemName = s.ItemName
);
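That NOT EXISTS pattern can be demonstrated with a small runnable sketch (Python's sqlite3, hypothetical sample data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ItemMaster (ItemID INTEGER PRIMARY KEY AUTOINCREMENT, ItemName TEXT);
CREATE TABLE StgTransaction (ItemName TEXT, Qty INT);
INSERT INTO ItemMaster (ItemName) VALUES ('Widget');
INSERT INTO StgTransaction VALUES ('Widget', 5), ('Gadget', 2);
""")

# Insert only the staged item names that are missing from the master table
conn.execute("""
INSERT INTO ItemMaster (ItemName)
SELECT DISTINCT s.ItemName FROM StgTransaction s
WHERE NOT EXISTS (SELECT 1 FROM ItemMaster im WHERE im.ItemName = s.ItemName)
""")

print(conn.execute("SELECT ItemName FROM ItemMaster ORDER BY ItemID").fetchall())
# → [('Widget',), ('Gadget',)]
```

'Widget' already exists in the master table, so only 'Gadget' gets a new row (and a new ItemID).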
Once the missing values are inserted, just join the staging table with the 2 master tables and insert the result into the target.
Wrap the above query in a procedure and call the procedure after the Data Flow Task (which loads the data from the flat file into the staging table).
2. Using Lookup
The package design will look like this:
You should go with this approach if you are not allowed to create a staging table in your database. It will be slower because of blocking components (Union All) and the OLE DB Command (which suffers from the RBAR, row-by-agonizing-row, problem).
Steps:
1. Use a Lookup against the ItemMaster table.
2. Create an ItemID column (name it NewItemID) using a Derived Column transformation; it will store the new ItemID generated from the ItemMaster table when the data is loaded. Connect the Lookup to the Derived Column transformation using the No Match Output.
3. The no-match values should be inserted into the ItemMaster table. For this, create a procedure which inserts the data and returns the ItemID value as an output:
ALTER PROCEDURE usp_InsertMaster
@ItemName AS varchar(20),
@id AS INT OUTPUT
AS
INSERT INTO ItemMaster (ItemName)
VALUES (@ItemName)
SET @id = SCOPE_IDENTITY()
-- This assumes ItemID is an identity column; otherwise use the OUTPUT clause to retrieve the ID
4. Call this procedure in an OLE DB Command and map the output parameter to the column created in the Derived Column transformation.
5. After the OLE DB Command, use Union All to combine the rows from the Match and No Match outputs, and then repeat the same steps for the CustomerMaster table.
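The insert-and-return-the-new-ID step above can be sketched with Python's sqlite3, where cursor.lastrowid plays the role of SCOPE_IDENTITY() (the table and item names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ItemMaster (ItemID INTEGER PRIMARY KEY AUTOINCREMENT, ItemName TEXT)")

def insert_master(item_name):
    # Insert the new item and return its freshly generated identity value
    cur = conn.execute("INSERT INTO ItemMaster (ItemName) VALUES (?)", (item_name,))
    return cur.lastrowid

first_id = insert_master("AK STEEL CORPORATION")
second_id = insert_master("ABC Corp")
print(first_id, second_id)
# → 1 2
```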
3. The last option is transforming the procedure logic in SSIS.
The package design is:
1. Load the data into staging.
2. Use MERGE or NOT EXISTS to load the missing values into the 2 master tables, using an Execute SQL Task.
3. Use a Data Flow Task with the staging table as the source and 2 Lookups against the master tables. Since all the missing values have already been inserted into the master tables, there won't be any Lookup no-match output. Just connect the Lookup Match Output to an OLE DB Destination (the transactions table).
IMHO the 1st approach will be fastest. The difficulty arises only because there are 2 master tables which need to be updated, and along with that you must retrieve the inserted IDs and load them into the target table, so doing it all synchronously is awkward.
I have multiple tables containing similar records. I want to merge them into one table.
Therefore I use an update query and map the fields from the various tables to the ones in my target table. But I need to keep track of which table a record comes from, so I'd like to add a literal "TABLE_XY" in the ORIGINALTABLE field of the resulting table for each record. But the query designer always wants a source field; I can't just put a literal anywhere and select ORIGINALTABLE in "Append To"...
What should I do? Do I really have to add a NAMEOFTHISTABLE field to the original tables?
Thanks for your help!
Make a backup copy of your database. Create a new query and switch to SQL View. Then paste in this statement, and modify the table and field names to match yours:
INSERT INTO master_table (
ORIGINALTABLE
, field1
, field2
)
SELECT
"TABLE_XY" AS ORIGINALTABLE
, field_a
, field_b
FROM
TABLE_XY;
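The same idea in a runnable sketch (Python's sqlite3, with made-up sample rows): the source-table name is just a literal in the SELECT list, so no extra field is needed in the original tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE TABLE_XY (field_a TEXT, field_b TEXT);
CREATE TABLE master_table (ORIGINALTABLE TEXT, field1 TEXT, field2 TEXT);
INSERT INTO TABLE_XY VALUES ('a1', 'b1'), ('a2', 'b2');
""")

# Tag each copied row with a literal marking its source table
conn.execute("""
INSERT INTO master_table (ORIGINALTABLE, field1, field2)
SELECT 'TABLE_XY', field_a, field_b FROM TABLE_XY
""")

print(conn.execute("SELECT * FROM master_table").fetchall())
# → [('TABLE_XY', 'a1', 'b1'), ('TABLE_XY', 'a2', 'b2')]
```

Repeat the statement once per source table, changing only the literal and the FROM clause.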
Using the Query Designer in Design View for an Update Query:
Field: ORIGINALTABLE
Table: <tableName>, where tableName is the name of the table you are updating.
Update To: "TABLE_XY", make sure to include the quotes.
Using the Query Designer in Design View for an Append Query:
Field: Expr1: "TABLE_XY", where Expr1 is an alias name.
Table: <leaveBlank>
Append To: ORIGINALTABLE