I am transferring data from MySQL to SQL Server using SSIS, and there are around 200 tables.
So I wrote a dynamic ETL that only takes the name of a table and handles the rest.
But since I had to work with fixed table metadata, I used JSON_ARRAY in MySQL to combine all of the columns except the ID into a single column, something like this:
select id
,JSON_ARRAY(name,cellphone) as JSON
from table
Because I know the schema of the data, I wanted to reduce the JSON size, so I left the schema (the column names) out of the JSON.
The created JSON_ARRAY looks like this:
["hooman", "12345"]
After moving the data to SQL Server, I know I can read it with CROSS APPLY OPENJSON(t.json), but then I have to pivot the result, and that's not efficient at all!
I can see how to open normal JSON so that you don't need to pivot your data, but I can't find anything for the array type.
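(For reference, without a WITH clause OPENJSON just returns one key/value/type row per array element, which is what forces the pivot. A minimal illustration against the sample array:)
declare @json nvarchar(max) = N'["hooman", "12345"]';
select [key], [value], [type]   -- key is the array index, type 1 = string
from openjson(@json);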
In an ideal world I want something like this:
CROSS APPLY OPENJSON(t.json) WITH (
    name      varchar(255) '$[0]',
    cellphone int          '$[1]'
)
so that the result has two columns and I don't need to pivot my table anymore.
You can read the array elements directly with JSON_VALUE, or wrap the array so that OPENJSON ... WITH can address the elements by index:
declare @json nvarchar(max) = N'["hooman", "12345"]';

select json_value(@json, '$[0]') as name, json_value(@json, '$[1]') as cellphone;

select *
from openjson(concat('{"x":', @json, '}'))
with
(
    name      varchar(255) '$.x[0]',
    cellphone int          '$.x[1]'
);

select *
from openjson(concat('[', @json, ']'))
with
(
    name      varchar(255) '$[0]',
    cellphone int          '$[1]'
);
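Applied at table level, the array-wrapping variant looks something like this (the staging table and column names are assumptions to illustrate the pattern):
select t.id,
       j.name,
       j.cellphone
from dbo.StagingTable as t
cross apply openjson(concat('[', t.json, ']'))
with
(
    name      varchar(255) '$[0]',
    cellphone int          '$[1]'
) as j;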
I've just started playing with JSON_VALUE in SQL Server. I am able to pull values from name/value pairs of JSON, but I happen to have an object that looks like this:
["first.last@domain.com"]
When I attempt what works for name/value pairs:
SELECT TOP 1
jsonemail,
JSON_VALUE(jsonemail, '$') as pleaseWorky
FROM MyTable
I get back the full input, not first.last@domain.com. Am I out of luck? I don't control the upstream source of the data. I think it's a string collection being converted into a JSON payload. If it were name: first.last@domain.com, I would be able to get it with $.name.
Thanks in advance.
It is a JSON array, so you just need to specify the element's index, i.e. 0.
Please try the following solution.
SQL
-- DDL and sample data population, start
DECLARE @tbl TABLE (ID INT IDENTITY PRIMARY KEY, jsonemail NVARCHAR(MAX));
INSERT INTO @tbl (jsonemail) VALUES
('["first.last@domain.com"]');
-- DDL and sample data population, end
SELECT ID
     , jsonemail AS [Before]
     , JSON_VALUE(jsonemail, '$[0]') AS [After]
FROM @tbl;
Output
+----+---------------------------+-----------------------+
| ID | Before | After |
+----+---------------------------+-----------------------+
| 1  | ["first.last@domain.com"] | first.last@domain.com |
+----+---------------------------+-----------------------+
From the docs:
Array elements. For example, $.product[3]. Arrays are zero-based.
So you need JSON_VALUE(..., '$[0]') when the root is an array and you want the first value.
To break it out into rows, you would need OPENJSON:
SELECT TOP 1
jsonemail
,j.[value] as pleaseWorky
FROM MyTable
CROSS APPLY OPENJSON(jsonemail) j
While working with the Oracle JSON datatype and trying to extract data from it, I am not able to extract the name and value elements. I have tried all the notations I know, but I keep getting NULL:
select json_query(po_document, '$.actions.parameters[0]') from j_purchaseorder where ID='2';
You can use the JSON_VALUE function as follows:
SQL> select JSON_VALUE('{"_class":"123", "name":"tejash","value":"so"}', '$.name') as name,
2 JSON_VALUE('{"_class":"123", "name":"tejash","value":"so"}', '$.value') as value
3 from dual;
NAME VALUE
---------- ----------
tejash so
SQL>
Thanks for your help. I got the required output using the query below:
select json_value(json_query(po_document, '$.actions.parameters[0]'), '$.value')
from j_purchaseorder
where ID = '2'
  and json_value(json_query(po_document, '$.actions.parameters[0]'), '$.name') = 'SERVERUSER';
As explained, for example, in the Oracle documentation, multiple calls to JSON_VALUE() on the same JSON document may result in very poor performance. When we need to extract multiple values from a single document, it is often best (for performance) to make a single call to JSON_TABLE().
Here is how that would work on the provided document. First I create and populate the table, then I show the query and the output. Note the handling of column (attribute) "_class", both in the JSON document and in the SQL SELECT statement. In both cases the name must be enclosed in double-quotes, because it begins with an underscore.
create table j_purchaseorder (
id number primary key,
po_document clob check (po_document is json)
);
insert into j_purchaseorder (id, po_document) values (
2, '{"_class":"hudson.model.StringParameterValue","name":"SERVERUSER","value":"avlipwcnp04"}'
);
commit;
select "_CLASS", name, value
from j_purchaseorder
cross apply
json_table(po_document, '$'
columns (
"_CLASS" varchar2(40) path '$."_class"',
name varchar2(20) path '$.name',
value varchar2(20) path '$.value'
)
)
where id = 2
;
_CLASS NAME VALUE
---------------------------------------- ------------------ ------------------
hudson.model.StringParameterValue SERVERUSER avlipwcnp04
I've got a SQL 2008 R2 table defined like this:
CREATE TABLE [dbo].[Search_Name](
    [Id] [bigint] IDENTITY(1,1) NOT NULL,
    [Name] [nvarchar](300) NULL,
    CONSTRAINT [PK_Search_Name] PRIMARY KEY CLUSTERED ([Id] ASC)
)
Performance querying the Name field using CONTAINS and FREETEXT works well.
However, I'm trying to keep the values of my Name column unique. Searching for an existing entry in the Name column is unbelievably slow for a large number of names (usually batches of 1,000), even with an index on the Name field. Query plans indicate I'm using the index as expected.
To search for an existing value, my query looks like this:
SELECT TOP 1 Id, Name from Search_Name where Name = 'My Name Value'
I've tried duplicating the Name column to another column and searching on the new column, but the net effect was the same.
At this point, I'm thinking I must be mis-using this feature.
Should I just stop trying to prevent duplication? I'm using a linking table to join these search name values to the underlying data. It seems somehow 'dirty' to just store a whole bunch of duplicate values...
...or is there a faster way to take a list of 1,000 names and see which ones are already stored in the database?
The first change to make is to get the entire list to SQL Server at one time. Regardless of how you add the names to the existing table, doing it as a set operation will make a big difference in performance.
Passing the list as a table-valued parameter (TVP) is a clean way to handle it; a sketch of that pattern follows the example below. You can still use an OUTPUT clause to track which rows did or didn't make the cut, for example:
-- Some sample existing names.
declare @Search_Name as Table ( Id Int Identity, Name VarChar(32) );
insert into @Search_Name ( Name ) values ( 'Bob' ), ( 'Carol' ), ( 'Ted' ), ( 'Alice' );
select * from @Search_Name;
-- Some (prospective) new names.
declare @New_Names as Table ( Name VarChar(32) );
insert into @New_Names ( Name ) values ( 'Ralph' ), ( 'Alice' ), ( 'Ed' ), ( 'Trixie' );
select * from @New_Names;
-- Add the unique new names.
declare @Inserted as Table ( Id Int, Name VarChar(32) );
insert into @Search_Name
  output inserted.Id, inserted.Name into @Inserted
  select New.Name
  from @New_Names as New left outer join
    @Search_Name as Old on Old.Name = New.Name
  where Old.Id is NULL;
-- Results.
select * from @Search_Name;
-- The names that were added and their id's.
select * from @Inserted;
-- The names that were not added.
select New.Name
from @New_Names as New left outer join
  @Inserted as I on I.Name = New.Name
where I.Id is NULL;
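If you do go the TVP route, a minimal sketch might look like the following (the type name dbo.NameList and the procedure name dbo.Add_Search_Names are made up for illustration):
CREATE TYPE dbo.NameList AS TABLE ( Name NVarChar(300) NOT NULL );
GO
CREATE PROCEDURE dbo.Add_Search_Names
    @Names dbo.NameList READONLY
AS
BEGIN
    -- Insert only the names that aren't already present and
    -- return the rows that made the cut.
    INSERT INTO dbo.Search_Name ( Name )
    OUTPUT Inserted.Id, Inserted.Name
    SELECT N.Name
    FROM @Names AS N
        LEFT OUTER JOIN dbo.Search_Name AS Old ON Old.Name = N.Name
    WHERE Old.Id IS NULL;
END;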
Alternatively, you could use a MERGE statement and OUTPUT the names that were added, those that weren't, or both, as sketched below.
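Here is a self-contained sketch of that, using the same sample data as above; the no-op UPDATE is only there so that names that already existed also show up in the OUTPUT (with $action = 'UPDATE'):
declare @Search_Name as Table ( Id Int Identity, Name VarChar(32) );
insert into @Search_Name ( Name ) values ( 'Bob' ), ( 'Carol' ), ( 'Ted' ), ( 'Alice' );
declare @New_Names as Table ( Name VarChar(32) );
insert into @New_Names ( Name ) values ( 'Ralph' ), ( 'Alice' ), ( 'Ed' ), ( 'Trixie' );
declare @Results as Table ( MergeAction NVarChar(10), Id Int, Name VarChar(32) );

merge @Search_Name as Old
using @New_Names as New
  on Old.Name = New.Name
when not matched by target then
  insert ( Name ) values ( New.Name )
when matched then
  update set Name = New.Name   -- no-op, only to surface existing names in OUTPUT
output $action, inserted.Id, New.Name into @Results ( MergeAction, Id, Name );

-- 'INSERT' rows were added (with their new id's); 'UPDATE' rows already existed.
select * from @Results;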
How do I use the SET datatype in MySQL? I have a table Train with these fields:
trainno (int)
Weekdays (SET)
Stops (SET)
train name
How do I write a SELECT query that compares the Stops set with a particular value like 'Mumbai'?
Create a table like:
CREATE TABLE cl_db.Train
(
trainno INT PRIMARY KEY AUTO_INCREMENT,
Stops set('aaa','bbb','ccc') NOT NULL
)
and you can query it like this:
select * from cl_db.Train where Stops like '%bbb%'
or, more reliably (LIKE can also match partial member names), like this:
select * from cl_db.Train where FIND_IN_SET('bbb', Stops) > 0;
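So for the question's table, assuming 'Mumbai' was declared as one of the SET members of Stops, the lookup would be:
select trainno
from cl_db.Train
where FIND_IN_SET('Mumbai', Stops) > 0;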
I need to do this: for each inserted record I need to store the inserted item's identity and the selected (source) item's identity (example below).
I'm using an AFTER INSERT trigger (basically I copy one row from one table into another and do some more modifications).
I have a table variable like this:
DECLARE @Tempequipment TABLE
    (Equipment_Id int,
     DefaultEquipment_Id INT)
Then I insert into the table like this:
INSERT INTO dbo.tblEquipmentType
    ( Name, EquipmentType_Id )
SELECT name, (SELECT Equipment_Id FROM INSERTED)
FROM dbo.tblDefaultEquipmentType
This works fine!
What I need to do is insert into @TempEquipment the EquipmentTypeIds that were just inserted (there can be more than one) and the DefaultEquipmentTypeIds that were just copied.
I was thinking about doing something like:
INSERT INTO dbo.tblEquipmentType
    ( Name, EquipmentType_Id )
OUTPUT EquipmentTypeId, DefaultEquipmentTypeId INTO @TempEquipment
SELECT name, (SELECT Equipment_Id FROM INSERTED)
FROM dbo.tblDefaultEquipmentType
but of course this is not going to work, since the OUTPUT clause cannot reference values from the SELECT statement, and it isn't written correctly anyway.
Any help is appreciated!
UPDATE:
I have an Item. An Item can be built on different Equipment. Equipment has types (foreign key), and EquipmentType has attributes (foreign key).
So we have four tables: Item -> Equipment -> EquipmentType -> EquipmentAttribute.
I need to store default EquipmentTypes and default EquipmentAttributes for that type.
So I also have these relationships: Equipment -> DefaultEquipmentType -> DefaultEquipmentAttribute.
Now, When I insert new Item and select an equipment I want to copy defaults over to real tables (EquipmentType, EquipmentAttribute).
Is it clear at least a little?
Aside from how you're trying to do this (which isn't working), what specifically are you trying to do?
It may be that this can be resolved by changing / normalizing your paradigm, instead of some kind of exotic code. For example, it looks odd to have a customers table with an orderID field in it. Unless your customers only ever order one thing... I would have expected to see a customers table, an items table, and then an orders table that joined customers with items.
Hope that makes sense -- but anyway, if not, can you post your table structure, and maybe be a little more clear on what you know ahead of time (e.g., I imagine you know who your customers are, and what they ordered...before you do the insert...yes?)
In the OUTPUT clause of an INSERT statement you can only reference columns of the target table (the inserted virtual table), not columns from the source query. The solution is to rewrite the statement as a MERGE statement, whose OUTPUT clause can reference the source columns as well as the target's columns, including IDENTITY columns.
In the demo I've used dbo.INSERTED to emulate the virtual table INSERTED from the trigger.
USE master
GO
IF DB_ID('MergeOutputExample') IS NOT NULL
DROP DATABASE MergeOutputExample
GO
CREATE DATABASE MergeOutputExample
GO
USE MergeOutputExample
GO
DECLARE @Tempequipment TABLE
    (EquipmentId int,
     DefaultEquipmentId INT,
     ID int);
CREATE TABLE dbo.INSERTED
(
EquipmentTypeId int PRIMARY KEY
);
CREATE TABLE dbo.tblEquipmentType
(
ID int IDENTITY(1,1),
Name varchar(50),
EquipmentTypeId int PRIMARY KEY
);
CREATE TABLE dbo.tblDefaultEquipmentType
(
EquipmentTypeId int,
DefaultEquipmentTypeId int IDENTITY(1,1) PRIMARY KEY,
Name varchar(50)
);
INSERT dbo.inserted
(
EquipmentTypeId
)
VALUES (1);
INSERT dbo.tblDefaultEquipmentType
(
EquipmentTypeId,
Name
)
VALUES (
1,
'Hammer'
);
MERGE dbo.tblEquipmentType AS ET
USING (
SELECT DE.EquipmentTypeId,
DE.DefaultEquipmentTypeId,
DE.Name
FROM dbo.tblDefaultEquipmentType DE
INNER JOIN dbo.INSERTED I
ON DE.EquipmentTypeId = I.EquipmentTypeId
) AS DET
ON ET.EquipmentTypeId = DET.EquipmentTypeId
WHEN NOT MATCHED BY TARGET
THEN INSERT
(
Name,
EquipmentTypeID
)
VALUES
(
DET.Name,
DET.EquipmentTypeID
)
OUTPUT DET.EquipmentTypeId,
DET.DefaultEquipmentTypeId,
INSERTED.ID
INTO @Tempequipment;
SELECT *
FROM @Tempequipment;
SELECT *
FROM dbo.tblEquipmentType;