Update JSON value from SQL column value

Using SQL Server, I want to take column data and copy it into a JSON object column.
I am querying a column and some JSON data. What I want to do is copy the data in the ename column into the FieldValue property of the JSON in the query below. If I can do it in SQL, that would be great.
SELECT
a.id, a.ssn, a.ename, p.CaptionName, p.FieldName, p.FieldType, p.FieldValue
FROM
tablename as a
CROSS APPLY
OPENJSON (details)
WITH (CaptionName NVARCHAR(100),
FieldName NVARCHAR(100),
FieldType NVARCHAR(15),
FieldValue NVARCHAR(50)) AS P
WHERE
p.captionname = 'txtEname'
AND a.ssn = '000-00-0000'
My JSON string in the details column:
[{"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":""}]
I'm really not that good with SQL, which is what I want to use. After copying the data into the JSON object, I will remove the ename column.

UPDATE 2019-07-11
Here's an amended solution which works for scenarios where there are multiple elements in the JSON array: https://dbfiddle.uk/?rdbms=sqlserver_2017&fiddle=1fde45dfb604b2d5540c56f6c17a822d
update a
set details = JSON_MODIFY(details, '$[' + x.[key] + '].FieldValue', ename)
from dbo.tblUissAssignments a
CROSS APPLY OPENJSON (details, '$') x
CROSS APPLY OPENJSON (x.Value)
WITH (CaptionName NVARCHAR(100),
FieldName NVARCHAR(100),
FieldType NVARCHAR(15),
FieldValue NVARCHAR(50)) AS P
WHERE a.ssn = '000-00-0000'
and p.CaptionName = 'txtEname'
This is similar to my original answer (see below). However:
We now have two CROSS APPLY statements. The first is used to split the JSON array into its elements, so we get a key (the index) and a value (the JSON object as a string), as documented here: https://learn.microsoft.com/en-us/sql/t-sql/functions/openjson-transact-sql?view=sql-server-2017#path (see the sketch below).
The second does what your original CROSS APPLY did, only acting on the single array element.
We use the [key] returned by the first CROSS APPLY to target the item in the array that we wish to update in our JSON_MODIFY statement.
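To make the splitting step concrete, here's a small sketch of what the first OPENJSON (with no WITH clause) returns for the sample JSON from the question; @details is just a local variable standing in for your details column:
declare @details nvarchar(max) = N'[{"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":""}]'

select x.[key], x.[value], x.[type]
from OPENJSON (@details, '$') x

-- key | value                                                                         | type
-- 0   | {"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":""} | 5 (object)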
NB: If it's possible for your JSON array to contain multiple objects that need updating, the best solution I can think of is to put the above statement into a loop, since one UPDATE will only update one index of a given JSON value. Here's an example: https://dbfiddle.uk/?rdbms=sqlserver_2017&fiddle=120d2ac7dd3a024e5e503a5f64b0089e
declare @doWhileTrueFlag bit = 1
while (@doWhileTrueFlag = 1)
begin
update a
set details = JSON_MODIFY(details, '$[' + x.[key] + '].FieldValue', ename)
from dbo.tblUissAssignments a
CROSS APPLY OPENJSON (details, '$') x
CROSS APPLY OPENJSON (x.Value)
WITH (CaptionName NVARCHAR(100),
FieldName NVARCHAR(100),
FieldType NVARCHAR(15),
FieldValue NVARCHAR(50)) AS P
WHERE a.ssn = '000-00-0000'
and p.CaptionName = 'txtEname'
and p.FieldValue != ename --if it's already got the correct value, don't update it again
set @doWhileTrueFlag = case when @@ROWCOUNT > 0 then 1 else 0 end
end
ORIGINAL ANSWER
Try this: https://dbfiddle.uk/?rdbms=sqlserver_2017&fiddle=b7b4d075cac6cd46239561ddb992ac90
update a
set details = JSON_MODIFY(details, '$[0].FieldValue', ename)
from dbo.tblUissAssignments a
cross apply
OPENJSON (details)
WITH (CaptionName NVARCHAR(100),
FieldName NVARCHAR(100),
FieldType NVARCHAR(15),
FieldValue NVARCHAR(50)) AS P
where a.ssn = '000-00-0000'
and p.captionname = 'txtEname'
More info on the JSON_MODIFY method here: https://learn.microsoft.com/en-us/sql/t-sql/functions/json-modify-transact-sql?view=sql-server-2017
The subtle bit is that you're updating a JSON array containing a JSON object, not a single object. For that you have to include the index on the root element. See this post for some useful info on JSONPath if you're unfamiliar: https://support.smartbear.com/alertsite/docs/monitors/api/endpoint/jsonpath.html
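A quick sketch of why that index matters (values assumed from the sample JSON; @j is just a local variable):
declare @j nvarchar(max) = N'[{"FieldValue":""}]'

select JSON_VALUE(@j, '$.FieldValue')            -- NULL: the root is an array, not an object
select JSON_VALUE(@j, '$[0].FieldValue')         -- '' (the value inside the first element)
select JSON_MODIFY(@j, '$[0].FieldValue', 'x')   -- [{"FieldValue":"x"}]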
Regarding scenarios where there are multiple items in the array, ideally we'd use a filter expression, such as this:
update a
set details = JSON_MODIFY(details, '$[?(@.CaptionName == ''txtEname'')].FieldValue', ename)
from dbo.tblUissAssignments a
where a.ssn = '000-00-0000'
Sadly MS SQL doesn't yet support these (see this excellent post: https://modern-sql.com/blog/2017-06/whats-new-in-sql-2016)
As such, I think we need to apply a nasty hack. Two such approaches spring to mind:
Implement a loop to iterate through all matches
Convert from JSON to some other type, then convert back to JSON afterwards
I'll think on these / whether there's something cleaner, since neither sits comfortably at present...

If I understand your question, then one possible approach (if you use SQL Server 2017+) is to use OPENJSON() and string manipulations with STRING_AGG():
Table:
CREATE TABLE #Data (
id int,
ssn varchar(12),
ename varchar(40),
details nvarchar(max)
)
INSERT INTO #Data
(id, ssn, ename, details)
VALUES
(1, '000-00-0000', 'stackoverflow1', N'[{"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":""}, {"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":""}]'),
(2, '000-00-0000', 'stackoverflow2', N'[{"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":""}, {"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":""}]')
Statement:
SELECT
d.id, d.ssn, d.ename,
CONCAT(N'[', STRING_AGG(JSON_MODIFY(j.[value], '$.FieldValue', ename), ','), N']') AS details
FROM #Data d
CROSS APPLY OPENJSON (d.details) j
WHERE JSON_VALUE(j.[value], '$.CaptionName') = N'txtEname' AND (d.ssn = '000-00-0000')
GROUP BY d.id, d.ssn, d.ename
Output:
id ssn ename details
1 000-00-0000 stackoverflow1 [{"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":"stackoverflow1"},{"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":"stackoverflow1"}]
2 000-00-0000 stackoverflow2 [{"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":"stackoverflow2"},{"CaptionName":"txtEname","FieldName":null,"FieldType":null,"FieldValue":"stackoverflow2"}]
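If you need to persist the rebuilt JSON rather than just select it, the same aggregation can feed an UPDATE. A minimal sketch, assuming (as in the sample data) that every array element matches the CaptionName filter, since elements removed by the WHERE clause would be dropped from the rebuilt array:
WITH rebuilt AS (
SELECT
d.id,
CONCAT(N'[', STRING_AGG(JSON_MODIFY(j.[value], '$.FieldValue', d.ename), ','), N']') AS details
FROM #Data d
CROSS APPLY OPENJSON (d.details) j
WHERE JSON_VALUE(j.[value], '$.CaptionName') = N'txtEname' AND (d.ssn = '000-00-0000')
GROUP BY d.id, d.ename
)
UPDATE d
SET d.details = r.details
FROM #Data d
INNER JOIN rebuilt r ON r.id = d.id;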
For SQL Server 2016 you may use FOR XML PATH for string aggregation:
SELECT
d.id, d.ssn, d.ename,
CONCAT(N'[', STUFF(s.details, 1, 1, N''), N']') AS details
FROM #Data d
CROSS APPLY (
SELECT CONCAT(N',', JSON_MODIFY(j.[value], '$.FieldValue', ename))
FROM #Data
CROSS APPLY OPENJSON (details) j
WHERE
(JSON_VALUE(j.[value], '$.CaptionName') = N'txtEname') AND
(ssn = '000-00-0000') AND
(id = d.id) AND (d.ssn = ssn) AND (d.ename = ename)
FOR XML PATH('')
) s(details)
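One caveat worth hedging on: FOR XML PATH('') will XML-escape characters such as & and < if they appear in the concatenated JSON. If your data can contain them, the usual workaround is to add the TYPE directive and read the result back with .value(). A sketch of just the CROSS APPLY (everything else as above):
CROSS APPLY (
SELECT (
SELECT CONCAT(N',', JSON_MODIFY(j.[value], '$.FieldValue', ename))
FROM #Data
CROSS APPLY OPENJSON (details) j
WHERE
(JSON_VALUE(j.[value], '$.CaptionName') = N'txtEname') AND
(ssn = '000-00-0000') AND
(id = d.id) AND (d.ssn = ssn) AND (d.ename = ename)
FOR XML PATH(''), TYPE
).value('.', 'nvarchar(max)')
) s(details)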

Related

FOR JSON PATH in CASE statement

When I use a variable (or a field from a table) in a case statement with "FOR JSON PATH", the JSON provided is not well formed.
Ex:
declare @MyValue nvarchar(50)
set @MyValue='1'
select CASE WHEN @MyValue='1' THEN (select 'ROLE_CLIENT_READONLY' as id FOR JSON PATH) end as [Role] FOR JSON PATH
Return
[{"Role":"[{\"id\":\"ROLE_CLIENT_READONLY\"}]"}]
But if I use this, it works:
select CASE WHEN '1'='1' THEN (select 'ROLE_CLIENT_READONLY' as id FOR JSON PATH) end as [Role] FOR JSON PATH
Return
[{"Role":[{"id":"ROLE_CLIENT_READONLY"}]}]
Any idea on the reason for this behavior?
How can I fix this in the first scenario?
Not sure why it treats one differently than the other. I certainly would not expect the JSON to differ between using a variable in the query and using a string literal.
Interestingly
CASE WHEN CAST(N'1' AS NVARCHAR(MAX)) = CAST(N'1' AS NVARCHAR(MAX))
also produces the problematic JSON, however
CASE WHEN CAST(N'1' AS NVARCHAR(50)) = CAST(N'1' AS NVARCHAR(50))
does not.
This seems to work as a workaround for using the variable in the query:
WITH ids AS (
SELECT CASE WHEN @MyValue = '1' THEN 'ROLE_CLIENT_READONLY' END id
)
SELECT (SELECT id FROM ids FOR JSON PATH) AS [Role] FOR JSON PATH;
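Another workaround worth mentioning (not from the answer above, but the documented way to stop FOR JSON from escaping text that is already valid JSON) is to wrap the CASE expression in JSON_QUERY(), so the outer FOR JSON PATH treats the nested result as JSON rather than as a string:
declare @MyValue nvarchar(50)
set @MyValue='1'
select JSON_QUERY(CASE WHEN @MyValue='1' THEN (select 'ROLE_CLIENT_READONLY' as id FOR JSON PATH) end) as [Role] FOR JSON PATH
-- [{"Role":[{"id":"ROLE_CLIENT_READONLY"}]}]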

How can I filter array values in JSON in SQL Server

How can I filter an array value without specifying the index in the array
This is my array
{
"level1":{
"level2":[
{
"level3":"test",
}
]
}
}
I want to retrieve all rows that contain a level3 with the value test.
Something like this:
select * from Documents
where Json_Value(DocumentInfo, '$.level1.level2[X].level3') = 'test'
Is this not possible in SQL Server?
Guessing a bit here, but use an EXISTS? This also assumes the JSON is valid, as the JSON in your question is not:
SELECT * --Should be a distinct column list
FROM dbo.Documents D
WHERE EXISTS (SELECT 1
FROM OPENJSON(D.DocumentInfo)
WITH (level1 nvarchar(MAX) AS JSON) L1
CROSS APPLY OPENJSON(L1.Level1)
WITH(level2 nvarchar(MAX) AS JSON) L2
CROSS APPLY OPENJSON(L2.Level2)
WITH (level3 varchar(10)) L3
WHERE L3.level3 = 'test');
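If you want to try this out, here's a minimal hypothetical setup (table and column names assumed from the question, with the JSON corrected so it is valid):
CREATE TABLE dbo.Documents (Id int IDENTITY(1,1) PRIMARY KEY, DocumentInfo nvarchar(max));

INSERT INTO dbo.Documents (DocumentInfo)
VALUES (N'{"level1":{"level2":[{"level3":"test"}]}}'),   -- returned: contains level3 = 'test'
(N'{"level1":{"level2":[{"level3":"other"}]}}');  -- not returned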

Query SQL database with JSON Value

Here is my JSON:
[{"Key":"schedulerItemType","Value":"schedule"},{"Key":"scheduleId","Value":"82"},{"Key":"scheduleEventId","Value":"-1"},{"Key":"scheduleTypeId","Value":"2"},{"Key":"scheduleName","Value":"Fixed Schedule"},{"Key":"moduleId","Value":"5"}]
I want to query the database by the FileMetadata column.
I've tried this:
SELECT * FROM FileSystemItems WHERE JSON_VALUE(FileMetadata, '$.Key') = 'scheduleId' and JSON_VALUE(FileMetadata, '$.Value') = '82'
but it doesn't work!
I had it working with just a dictionary key/value pair, but I needed to return the data differently, so I am adding it with key and value into the json now.
What am I doing wrong?
With the sample data given you'd have to supply an array index to query the second element (index 1, since JSON arrays are 0-based), e.g.:
select *
from dbo.FileSystemItems
where json_value(FileMetadata, '$[1].Key') = 'scheduleId'
and json_value(FileMetadata, '$[1].Value') = '82'
If the scheduleId key can appear at arbitrary positions in the array then you can restructure the query to use OPENJSON instead, e.g.:
select *
from dbo.FileSystemItems
cross apply openjson(FileMetadata) with (
[Key] nvarchar(50) N'$.Key',
Value nvarchar(50) N'$.Value'
) j
where j.[Key] = N'scheduleId'
and j.Value = N'82'
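If the same key could in principle appear more than once in the array, an EXISTS variant (just a sketch along the same lines) avoids returning duplicate rows:
select *
from dbo.FileSystemItems f
where exists (
select 1
from openjson(f.FileMetadata) j
where json_value(j.[value], '$.Key') = N'scheduleId'
and json_value(j.[value], '$.Value') = N'82'
)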

TSQL -- Extract JSON values for unknown/dynamic key in sub-object (not an array)

I'm using DataTables, DataTables Editor, JavaScript, and MSSQL 2016.
I'd like to parse this string in SQL Server:
{
"action":"edit",
"data": {
"2019-08-03":{
"Description":"sdfsafasdfasdf",
"FirstFrozenStep":"333"
}
}
}
I don't know how to access the key "2019-08-03". This represents the primary key, or the DT_RowId in DataTables Editor. It's dynamic... It could change.
Historically, I have just manipulated the data in JavaScript to a FLAT object, which is WAY easier to parse in SQL Server:
{
"action":"edit",
"DT_RowId":"2019-08-03",
"Description":"sdfsafasdfasdf",
"FirstFrozenStep":"333"
}
HOWEVER, I would like to know how to use json_query, json_value, and openjson() to drill down to the "dynamic" key mentioned above, and then access its values.
Here are all my FAILED attempts:
declare
@jsonRequest nvarchar(max) = '{"action":"edit","data":{"2019-08-03":{"Description":"sdfsafasdfasdf","FirstFrozenStep":"333"}}}'
,@json2 nvarchar(max) = '{"2019-08-03":{"Description":"sdfsafasdfasdf","FirstFrozenStep":"333"}}'
,@jsonEASY nvarchar(max) = '{"action":"edit","DT_RowId":"2019-08-03","Description":"sdfsafasdfasdf","FirstFrozenStep":"333"}'
select
json_value(@jsonRequest, '$.action') as [action]
--,json_value(@jsonRequest, '$.data.[0]') as [action]
--,json_query(@jsonRequest, '$.data[0]')
--,json_query(@jsonRequest, '$.data.[0]')
--,json_query(@jsonRequest, '$.data[0].Description')
--,json_query(@jsonRequest, '$.data.Description')
--,json_query(@jsonRequest, '$.data.[0].Description')
select
[Key]
,Value
,Type
--,json_query(value, '$')
from
openjson(@jsonRequest)
SELECT x.[Key], x.[Value]
FROM OPENJSON(@jsonRequest, '$') AS x;
select
x.[Key]
,x.[Value]
--,json_query(x.value, '$')
--,(select * from openjson(x.value))
FROM OPENJSON(@jsonRequest, '$') AS x;
SELECT x.[Key], x.[Value]
FROM OPENJSON(@json2, '$') AS x;
select
json_value(@jsonEASY, '$.action') as [action]
,json_value(@jsonEASY, '$.DT_RowId') as [DT_RowId]
,json_value(@jsonEASY, '$.Description') as [Description]
The most explicit and type-safe approach might be this:
First, I define your JSON with two dynamic keys:
DECLARE @json NVARCHAR(MAX)=
N'{
"action":"edit",
"data": {
"2019-08-03":{
"Description":"sdfsafasdfasdf",
"FirstFrozenStep":"333"
},
"2019-08-04":{
"Description":"blah4",
"FirstFrozenStep":"444"
}
}
}';
--the query
SELECT A.[action]
,B.[key]
,C.*
FROM OPENJSON(@json) WITH([action] NVARCHAR(100)
,[data] NVARCHAR(MAX) AS JSON) A
OUTER APPLY OPENJSON(A.[data]) B
OUTER APPLY OPENJSON(B.[value]) WITH([Description] NVARCHAR(100)
,FirstFrozenStep INT) C;
The result
action key Description FirstFrozenStep
edit 2019-08-03 sdfsafasdfasdf 333
edit 2019-08-04 blah4 444
The idea in short:
The first OPENJSON() will return the two first-level keys under the alias A. The data element is returned AS JSON, which allows us to work with it in the next step.
The second OPENJSON() gets A.[data] as input and is needed to get our hands on the key, which is your date.
The third OPENJSON() now gets B.[value] as input.
The WITH clause allows us to read the inner elements, implicitly pivoted and typed.
In general, it is not a good idea to use descriptive data (such as your date) as keys in a generic data container. It is possible and might look clever, but it would be much better to place your date as content under a dedicated key.
You can use OUTER APPLY to get to the next level in JSON:
SELECT L1.[key], L2.[key], L2.[value]
FROM openjson(@json,'$.data') AS L1
OUTER APPLY openjson(L1.[value]) AS L2
It will return:
key key value
2019-08-03 Description sdfsafasdfasdf
2019-08-03 FirstFrozenStep 333
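As an aside (not part of either answer): if the dynamic key happens to be known at query time, you can still address it directly, because keys containing special characters such as dashes can be double-quoted inside the JSON path. Using the @json variable declared above:
SELECT JSON_VALUE(@json, '$.data."2019-08-03".Description')     AS [Description]
,JSON_VALUE(@json, '$.data."2019-08-03".FirstFrozenStep') AS FirstFrozenStep;
-- Description      FirstFrozenStep
-- sdfsafasdfasdf   333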

SQL Server: How to query all rows as a JSON object next to the other columns?

I have data like this:
I want a query result like this:
Here is my code
SELECT
PML_CODE
,PML_NAME_ENG
,(
SELECT
PML_ID
,PML_NO
,PML_CODE
,PML_NAME_ENG
,PML_FORMULA
FROM DSP.PARAMET_LIST AS A WITH(NOLOCK)
WHERE A.PML_ID = B.PML_ID
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER
) AS BR_OBJECT
FROM DSP.PARAMET_LIST AS B WITH(NOLOCK)
My code works for what I want, but I want to know if there is a better, faster way to write this query?
Next time, please do not post pictures; instead, try to create some DDL, fill it with sample data, and state your own attempts and the expected output. This makes it easier for us to understand and answer your question.
You can try it like this:
DECLARE @tbl TABLE(PML_ID BIGINT, PML_NO INT, PML_CODE VARCHAR(10), PML_NAME_ENG VARCHAR(10), PML_FORMULA VARCHAR(10));
INSERT INTO @tbl VALUES
(2017102600050,1,'KHR','Riel','01')
,(2017102600051,2,'USD','Dollar','02')
,(2017102600052,3,'THB','Bath','05')
SELECT
PML_CODE
,PML_NAME_ENG
,BR_OBJECT
FROM @tbl
CROSS APPLY(
SELECT
(
SELECT
PML_ID
,PML_NO
,PML_CODE
,PML_NAME_ENG
,PML_FORMULA
FOR JSON PATH, WITHOUT_ARRAY_WRAPPER
)) AS A(BR_OBJECT);
The big difference from your own approach is that I use a CROSS APPLY over the columns we already have, instead of calling a correlated sub-query.
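For reference, against the three sample rows this should produce something along these lines (sketched output, not copied from a run; FOR JSON PATH, WITHOUT_ARRAY_WRAPPER yields one object per row):
PML_CODE PML_NAME_ENG BR_OBJECT
KHR Riel {"PML_ID":2017102600050,"PML_NO":1,"PML_CODE":"KHR","PML_NAME_ENG":"Riel","PML_FORMULA":"01"}
USD Dollar {"PML_ID":2017102600051,"PML_NO":2,"PML_CODE":"USD","PML_NAME_ENG":"Dollar","PML_FORMULA":"02"}
THB Bath {"PML_ID":2017102600052,"PML_NO":3,"PML_CODE":"THB","PML_NAME_ENG":"Bath","PML_FORMULA":"05"}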
You can just concatenate the values. Be sure to cast the integers and to handle NULL values. For example, if there is a NULL value for a column, there are two options: ignore the property, or add the property with a null value, right?
For SQL Server 2016 SP1 and later you can use FOR JSON. Basically, you should end up with something like this:
DECLARE @DataSource TABLE
(
[PML_ID] VARCHAR(64)
,[PML_NO] INT
,[PML_CODE] VARCHAR(3)
,[PML_NAME_ENG] NVARCHAR(32)
,[PML_FORMULA] VARCHAR(2)
);
INSERT INTO @DataSource ([PML_ID], [PML_NO], [PML_CODE], [PML_NAME_ENG], [PML_FORMULA])
VALUES ('201710260000000050', 1, 'KHR', 'Riel', '01')
,('201710260000000051', 2, 'USD', 'Dollar', '02')
,('201710260000000052', 3, 'THB', 'Bath', '05');
SELECT [PML_CODE]
,[PML_NAME_ENG]
,'{"PML_ID":'+ [PML_ID] +',"PML_NO":'+ CAST([PML_NO] AS VARCHAR(12)) +',"PML_CODE":"'+ [PML_CODE] +'","PML_NAME_ENG":"'+ [PML_NAME_ENG] +'","PML_FORMULA":"'+ [PML_FORMULA] +'"}' AS [BR_OBJECT]
FROM @DataSource;
-- SQL Server 2016 SP1 and later
SELECT DS1.[PML_CODE]
,DS1.[PML_NAME_ENG]
,DS.[BR_OBJECT]
FROM @DataSource DS1
CROSS APPLY
(
SELECT *
FROM @DataSource DS2
WHERE DS1.[PML_CODE] = DS2.[PML_CODE]
AND DS1.[PML_NAME_ENG] = DS2.[PML_NAME_ENG]
FOR JSON AUTO
) DS ([BR_OBJECT]);