Update/Delete JSON array value in SQL Server

I have a JSON array in my table. Inside my stored procedure I can create the array, append to it, or set it to NULL, but I don't see any way to pop a value from the array. Apparently JSON_MODIFY may have a solution, since it can update a key as well as a single value, but how can I use it to modify my array?
-- My array
DECLARE @json NVARCHAR(MAX) = '{"array":[123,456]}'
Desired results after update:
'{"array":[123]}'
Please note that the array contains int values, which are my sub-department IDs. All values are (supposed to be) unique.

You could use:
DECLARE @json NVARCHAR(MAX) = '{"array":[123,456]}';

WITH cte AS (
    -- m_key = the highest [key], i.e. the index of the last array element
    SELECT *, MAX([key]) OVER() AS m_key
    FROM OPENJSON(@json, '$.array') s
)
-- drop the last element; the IIF returns an empty array when only one element exists
SELECT JSON_QUERY('[' + IIF(MAX(m_key) = 0, '', STRING_AGG(value, ',')
           WITHIN GROUP (ORDER BY [key])) + ']', '$') AS array
FROM cte
WHERE [key] != m_key OR m_key = 0
FOR JSON AUTO, WITHOUT_ARRAY_WRAPPER;
Output:
{"array":[123]}
DBFiddle Demo SQL Server 2017
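Since the question specifically asks about JSON_MODIFY, here is a minimal sketch of an alternative (my own assumption, not part of the answer above): rebuild the filtered array text with STRING_AGG, removing the value 456, and overwrite the whole array with JSON_MODIFY. It also requires SQL Server 2017+.
DECLARE @json NVARCHAR(MAX) = '{"array":[123,456]}';

-- Rebuild the array without the value to drop (456 here)
DECLARE @newArray NVARCHAR(MAX) =
    (SELECT '[' + ISNULL(STRING_AGG([value], ',') WITHIN GROUP (ORDER BY CONVERT(int, [key])), '') + ']'
     FROM OPENJSON(@json, '$.array')
     WHERE [value] <> '456');

-- JSON_QUERY stops JSON_MODIFY from escaping the new array as a plain string
SELECT JSON_MODIFY(@json, '$.array', JSON_QUERY(@newArray)) AS result;  -- {"array":[123]}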

As I was in a hurry, I solved my problem the following way, but I would really recommend not using it. Please see the answer above by @lad2025.
DECLARE @json VARCHAR(MAX) =
    (SELECT jsonDept
     FROM tblEmployee
     WHERE tblEmployeeID = @empid)

-- Rebuild the array as a comma-separated list, skipping the sub-department to remove
DECLARE @newjson VARCHAR(MAX) = (
    SELECT LEFT(subdept, LEN(subdept) - 1)
    FROM (
        SELECT DISTINCT value + ', '
        FROM OPENJSON(@json, '$.array')
        WHERE value <> @subdeptid
        FOR XML PATH ('')
    ) t (subdept))

UPDATE tblEmployee
SET jsonDept = '{"array":[' + @newjson + ']}'
WHERE tblEmployeeID = @empid
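For reference, a hedged sketch of the same UPDATE written with STRING_AGG and JSON_MODIFY (SQL Server 2017+), reusing the table and variable names from the snippet above; treat it as an untested alternative rather than the accepted approach:
UPDATE e
SET jsonDept = JSON_MODIFY(e.jsonDept, '$.array', JSON_QUERY(n.newArray))
FROM tblEmployee AS e
-- CROSS APPLY rebuilds the array text per row without the removed sub-department
CROSS APPLY (SELECT '[' + ISNULL(STRING_AGG(j.[value], ',') WITHIN GROUP (ORDER BY CONVERT(int, j.[key])), '') + ']' AS newArray
             FROM OPENJSON(e.jsonDept, '$.array') AS j
             WHERE j.[value] <> CONVERT(VARCHAR(20), @subdeptid)) AS n
WHERE e.tblEmployeeID = @empid;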

SQL Server: Select string array of JSON

Given the following test data:
declare @mg nvarchar(max);
set @mg = '{"fiskepind":["ko","hest","gris"]}';
select @mg, JSON_VALUE(@mg, '$.fiskepind')
How do I get a column returned with:
ko,hest,gris
The example returns NULL, and I don't want to use [index], which only returns one element.
Starting from SQL Server 2017, a possible solution is a combination of OPENJSON() and STRING_AGG().
SELECT STRING_AGG([value], ',') WITHIN GROUP (ORDER BY CONVERT(int, [key])) AS Result
FROM OPENJSON(@mg, '$.fiskepind')
Note that JSON_VALUE() returns a scalar value, so NULL is the expected result when you try to extract a JSON array ('$.fiskepind') from the input JSON text.
If you just want a combined list, you can use OPENJSON to get a table and then use FOR XML PATH or STRING_AGG to combine it into a single string.
declare @mg nvarchar(max);
set @mg = '{"fiskepind":["ko","hest","gris"]}';

select @mg, JSON_VALUE(@mg, '$.fiskepind')
    , STUFF((
          SELECT ',' + value
          FROM OPENJSON(@mg, '$.fiskepind')
          FOR XML PATH('')
      ), 1, 1, '') as combined_list
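The question uses a variable, but the same aggregation can be applied per row when the JSON sits in a table column. A small sketch below, assuming a hypothetical table dbo.Animals(Id, JsonCol) and SQL Server 2017+:
-- One combined list per row; OUTER APPLY keeps rows whose JSON yields no elements
SELECT a.Id, x.combined_list
FROM dbo.Animals AS a
OUTER APPLY (SELECT STRING_AGG([value], ',') WITHIN GROUP (ORDER BY CONVERT(int, [key])) AS combined_list
             FROM OPENJSON(a.JsonCol, '$.fiskepind')) AS x;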

Remove params from JSON in SQL Server

There is a JSON column in SQL Server tables with data like:
["1","2","3","4"]
and I want to delete "3" or ("2","4") (for example) from it.
Can I do it with JSON_MODIFY or anything else?
JSON_MODIFY can modify by path. If you don't have any key to modify and just a simple list like that, you can do this:
DECLARE @JsonList NVARCHAR(1000) = N'["1","2","3","4"]';
DECLARE @NewList NVARCHAR(1000);

SET @NewList =
(
    SELECT CONCAT('[', STRING_AGG(CONCAT('"', oj.[value], '"'), ','), ']')
    FROM OPENJSON(@JsonList) AS oj
    WHERE oj.[value] NOT IN ('2', '4')  -- compare as strings to avoid implicit conversion
);

PRINT @NewList
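Since the question mentions a JSON column in a table, the same rewrite can be pushed into an UPDATE. A hedged sketch, assuming a hypothetical table dbo.Items(Id, JsonList) and SQL Server 2017+ for STRING_AGG:
-- Remove "2" and "4" from every row's list; when nothing is left, CONCAT yields '[]'
UPDATE i
SET JsonList = n.NewList
FROM dbo.Items AS i
CROSS APPLY (SELECT CONCAT('[', STRING_AGG(CONCAT('"', oj.[value], '"'), ','), ']') AS NewList
             FROM OPENJSON(i.JsonList) AS oj
             WHERE oj.[value] NOT IN ('2', '4')) AS n;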

MSSQL select JSON file with multirows and insert into table

I read the docs on handling a JSON file here. So far I am able to read the file and get a result:
QRY: SELECT * FROM OPENROWSET (BULK 'c:\ne.db', SINGLE_CLOB) as import
Result: {"res":{"number":"123", "info":"c-PM6900"},"_id":"aHMIeu6ZwB9lIBZk"} {"res":{"number":"456", "info":"a-PMs900"},"_id":"aHaIeu6ZwB9sIBZ1"}....
If I query this, I only get the first row, with res nested:
Declare @JSON varchar(max)

SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\ne.db', SINGLE_CLOB) import

SELECT *
FROM OPENJSON (@JSON)
What I want to achieve is to read every entry of the JSON file and insert "res" from the JSON query into a row of a table in the database containing the columns "number", "info" and "id". If anyone could help me finish this, I would appreciate it.
The JSON file contains about 400000 lines and comes from a NodeJS script which uses nedb.
Here is the example file: LINK
The JSON in the file is not valid JSON; it contains multiple root elements, one per row. It's strange, but OPENJSON() reads only the first element of this input without generating an error.
But you may try to transform the input JSON into a valid JSON array ({...} {...} into [{...}, {...}]) and parse this array with OPENJSON() and an explicit schema. If the input file has a single row for each JSON object, you need to know the newline separator (it's usually CHAR(10)):
DECLARE @json nvarchar(MAX)

-- Read the file's content
-- SELECT @json = BulkColumn
-- FROM OPENROWSET (BULK 'C:\ne.db', SINGLE_CLOB) AS [Insert]

-- Only for test
SELECT @json =
    N'{"res":{"number":"123", "info":"c-PM6900"},"_id":"aHMIeu6ZwB9lIBZk"}' +
    CHAR(10) +
    N'{"res":{"number":"456", "info":"a-PMs900"},"_id":"aHaIeu6ZwB9sIBZ1"}'

SELECT [number], [info], [_id]
FROM OPENJSON(CONCAT('[', REPLACE(@json, CONCAT('}', CHAR(10), '{'), '},{'), ']')) WITH (
    [number] varchar(3) '$.res.number',
    [info] varchar(10) '$.res.info',
    _id varchar(50) '$._id'
)
Result:
number  info      _id
123     c-PM6900  aHMIeu6ZwB9lIBZk
456     a-PMs900  aHaIeu6ZwB9sIBZ1
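To cover the insert-into-a-table part of the question, a minimal sketch reusing the same OPENJSON call; the target table dbo.ImportedRes([number], [info], [_id]) is a hypothetical name, adjust it to your schema:
-- Load the parsed rows straight into the target table
INSERT INTO dbo.ImportedRes ([number], [info], [_id])
SELECT [number], [info], [_id]
FROM OPENJSON(CONCAT('[', REPLACE(@json, CONCAT('}', CHAR(10), '{'), '},{'), ']')) WITH (
    [number] varchar(3) '$.res.number',
    [info] varchar(10) '$.res.info',
    _id varchar(50) '$._id'
)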
You need a couple of calls to OPENJSON to achieve this, each with a WITH clause:
DECLARE @JSON nvarchar(MAX) = N'{"res":{"number":"123", "info":"c-PM6900"},"_id":"aHMIeu6ZwB9lIBZk"} {"res":{"number":"456", "info":"a-PMs900"},"_id":"aHaIeu6ZwB9sIBZ1"}'

SELECT J._id,
       r.number,
       r.info
FROM OPENJSON(@JSON)
     WITH (_id varchar(30),
           res nvarchar(MAX) AS JSON) J
     CROSS APPLY OPENJSON(J.res)
                 WITH (number int,
                       info varchar(10)) r;
Because the OP appears to think I am telling them to change their DECLARE and assignment statement... to confirm how you get the value into @JSON, from the OP's own question:
DECLARE @JSON varchar(max);

SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\ne.db', SINGLE_CLOB);
Final edit: it also appears that the OP's JSON is malformed, as I would expect a comma, or something, before the second res definition. I'm guessing we need to split it into rows as well, which means some string splitting:
SELECT J._id,
       r.number,
       r.info
FROM STRING_SPLIT(REPLACE(@JSON, N'}} {"res"', N'}}|{"res"'), '|') SS -- I assume a pipe (|) won't appear in the data
     CROSS APPLY OPENJSON(SS.[value])
                 WITH (_id varchar(30),
                       res nvarchar(MAX) AS JSON) J
     CROSS APPLY OPENJSON(J.res)
                 WITH (number int,
                       info varchar(10)) r;
db<>fiddle

Stored procedure in MSSQL to MySQL

I have a stored procedure in MSSQL and I would like to write it in MySQL.
Any help or suggestions please; I cannot get the XML functions to work in MySQL.
Stored proc:
ALTER PROCEDURE uspGetProductDetailsCSV (
    @sku NVARCHAR(MAX)
)
AS
BEGIN
    -- Split the comma-separated @sku list into rows via XML
    SELECT T.C.value('.', 'NVARCHAR(100)') AS [SKU]
    INTO #tblPersons
    FROM (SELECT CAST ('<Name>' + REPLACE (@sku, ',', '</Name><Name>')
                       + '</Name>' AS XML) AS [Products]) AS A
    CROSS APPLY Products.nodes('/Name') as T(C)

    SELECT *
    FROM ProductInformation Pr
    WHERE EXISTS (SELECT Name FROM #tblPersons tmp WHERE tmp.SKU
        = case when len(tmp.SKU) = 11 then Product_No + Colour_Code + Size_Code
               when len(tmp.SKU) = 8  then Product_No + Colour_Code
               when len(tmp.sku) = 6  then Product_No end)

    DROP TABLE #tblPersons
END
Edit: I could not write the XML part of the stored proc; when I paste the same code into MySQL, it doesn't create the stored procedure.
Error: >can not cast as XML<
I don't believe XML is a valid type in MySQL. Try just leaving it as a VARCHAR.
So, just remove the cast... I also think you will have to use CONCAT instead of + and change the [] around columns to backticks.
So instead of:
FROM (SELECT CAST ('<Name>' + REPLACE (@sku, ',', '</Name><Name>')
             + '</Name>' AS XML) AS [Products]) AS A
Try:
FROM (SELECT CONCAT('<Name>', REPLACE(@sku, ',', '</Name><Name>'),
             '</Name>') AS `Products`) AS A
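As a side note, MySQL has no direct equivalent of CROSS APPLY ... nodes(), so here is a hedged sketch of a different approach that skips the XML splitting entirely and uses FIND_IN_SET instead. It assumes the SKU list is comma-separated with no spaces and the same ProductInformation columns as above:
-- MySQL sketch: match each candidate SKU form against the comma-separated list.
-- Inside the procedure, the sku parameter would be used instead of the @sku session variable.
SELECT Pr.*
FROM ProductInformation AS Pr
WHERE FIND_IN_SET(CONCAT(Pr.Product_No, Pr.Colour_Code, Pr.Size_Code), @sku) > 0
   OR FIND_IN_SET(CONCAT(Pr.Product_No, Pr.Colour_Code), @sku) > 0
   OR FIND_IN_SET(Pr.Product_No, @sku) > 0;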

Specifying SQL variable to xquery exist method to check if value is in given set of values

I am trying to query an XML column to return all rows where an attribute is in a list of possible values.
XQuery allows something like
SELECT COUNT(*)
FROM table
WHERE xml_col.exist('//Field[@name=("value1","value2","value3")]') = 1
which would return the number of records that have a Field element with attribute @name set to either "value1", "value2" or "value3".
What I'd like to do is write a concise query that could handle the set "value1", "value2", "value3" as an input parameter, e.g.
DECLARE @list NVARCHAR(100)
SET @list = '("value1","value2","value3")'
SELECT COUNT(*)
FROM table
WHERE xml_col.exist('//Field[@name=sql:variable("@list")]') = 1
which, of course, is invalid. Any suggestions would be appreciated!
The simplest way to do it (if your name values cannot contain a comma) is:
declare @list nvarchar(max) = ',value1,value2,value3,'
select count(*)
from test
where xml_col.exist('//Field[contains(sql:variable("@list"), concat(",", @name, ","))]') = 1;
or the SQL way:
select count(*)
from test
where
exists
(
select 1 from xml_col.nodes('//Field') as T(C)
where T.C.value('@name', 'nvarchar(max)') in ('value1', 'value2', 'value3')
)
sql fiddle demo
Maybe in this case it would be easier to check on the SQL side:
SELECT COUNT(*)
FROM table
WHERE xml_col.value('(//Field/@name)[1]', 'nvarchar(255)') in ('value1', 'value2', 'value3')
At least this would work if there is only one Field element in your XML, otherwise it would get a bit more complex.
You may try the following construct:
select count(1)
from [table] t
where exists (
select 1 from (values ('value1'),('value2'),('value3')) l(v)
where t.xml_col.exist('//Field[@name=sql:column("l.v")]') = 1);
It can also be used with a table variable or a table-valued parameter in the following way:
declare @list table (value varchar(100))
insert into @list values ('value1'),('value2'),('value3')
or
create type ListOfValues as table (value varchar(100))
GO
declare @list ListOfValues
insert into @list values ('value1'),('value2'),('value3')
and then
select count(1)
from [table] t
where exists(select 1 from @list l
where t.xml_col.exist('//Field[@name=sql:column("l.value")]') = 1);
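For completeness, a small self-contained test of the sql:column approach above, with throwaway data; the XML shape with Field/@name is an assumption based on the question:
DECLARE @t TABLE (xml_col xml);
INSERT INTO @t VALUES
    (N'<Root><Field name="value1"/></Root>'),
    (N'<Root><Field name="other"/></Root>');

DECLARE @list TABLE (value varchar(100));
INSERT INTO @list VALUES ('value1'), ('value2'), ('value3');

SELECT COUNT(1)
FROM @t AS t
WHERE EXISTS (SELECT 1 FROM @list AS l
              WHERE t.xml_col.exist('//Field[@name=sql:column("l.value")]') = 1);
-- returns 1: only the first row has a matching @name attribute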