OPENJSON Convert Value Column to Multiple Rows does not work

I have a JSON file with a simple structure, and I am trying to extract rows of data from it.
The JSON file starts with:
[{"result":
[{"country":"Germany",
"parent":"xxxx",
"city":"Reitbrook",
"latitude":"",
I tried the code below, and everything runs successfully.
Look at the last three statements and their results.
I would expect multiple records from the last SELECT statement.
What am I doing wrong?
DECLARE @details VARCHAR(MAX)
SELECT @details = BulkColumn FROM OPENROWSET
(BULK 'folder/cmn_location', DATA_SOURCE='blogstorage', SINGLE_CLOB) as JSON;
IF (ISJSON(@details) = 1)
BEGIN PRINT 'Imported JSON is Valid' END
ELSE
BEGIN PRINT 'Invalid JSON Imported' END
SELECT @details as SingleRow_Column
--delivers one row, where
--SingleRow_Column=[{"result":[{"country":"Germany","parent":.....
SELECT * FROM OPENJSON(@details, '$')
--delivers one row, where
--Key=0, value={"result":[{"country":"Germany","parent":"xxx".....
SELECT * FROM OPENJSON(@details, '$.result')
--delivers no rows at all
No error messages, just no data.

Try it like this (hint: I had to add some closing brackets to complete your JSON fragment):
DECLARE @YourJSON NVARCHAR(MAX)=
N'[{"result":
[{"country":"Germany",
"parent":"xxxx",
"city":"Reitbrook",
"latitude":""}]}]';
SELECT B.*
FROM OPENJSON(@YourJSON) WITH(result NVARCHAR(MAX) AS JSON) A
CROSS APPLY OPENJSON(A.result) WITH(country NVARCHAR(1000)
,parent NVARCHAR(1000)
,city NVARCHAR(1000) ) B;
The idea in short:
Your JSON is an array containing at least one object with a result property (there might be more objects, but you did not show enough of the file).
This result property is itself an array. Therefore we use WITH in combination with AS JSON, and a second OPENJSON in a CROSS APPLY against the nested array returned as A.result.
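Putting this together with the original OPENROWSET load, the whole statement might look like the sketch below (the blob path and data source name are taken from the question; the column list and lengths are illustrative guesses, since only part of the file was shown):
-- Sketch: load the blob and shred the nested "result" array in one pass
DECLARE @details NVARCHAR(MAX);
SELECT @details = BulkColumn
FROM OPENROWSET (BULK 'folder/cmn_location', DATA_SOURCE='blogstorage', SINGLE_CLOB) AS j;
SELECT B.country, B.parent, B.city, B.latitude
FROM OPENJSON(@details) WITH(result NVARCHAR(MAX) AS JSON) A
CROSS APPLY OPENJSON(A.result)
     WITH(country  NVARCHAR(1000)
         ,parent   NVARCHAR(1000)
         ,city     NVARCHAR(1000)
         ,latitude NVARCHAR(50)) B;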


SQL Server: Select String array of JSON

Given the following test data:
declare @mg nvarchar(max);
set @mg = '{"fiskepind":["ko","hest","gris"]}';
select @mg, JSON_VALUE(@mg,'$.fiskepind')
How do I get back a column containing:
ko,hest,gris
The example returns NULL, and I don't want to use an [index], which would return only one element.
Starting from SQL Server 2017, a possible solution is a combination of OPENJSON() and STRING_AGG().
SELECT STRING_AGG([value], ',') WITHIN GROUP (ORDER BY CONVERT(int, [key])) AS Result
FROM OPENJSON(@mg, '$.fiskepind')
Note that JSON_VALUE() returns a scalar value, so NULL is the expected result when you try to extract a JSON array ('$.fiskepind') from the input JSON text.
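As a side note (not part of the original answer), if you only need the raw array text rather than a delimited list, JSON_QUERY() extracts an object or array where JSON_VALUE() would return NULL:
-- JSON_QUERY returns the array itself as JSON text
SELECT JSON_QUERY(@mg, '$.fiskepind') AS raw_array;  -- ["ko","hest","gris"]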
If you just want a combined list, you can use OPENJSON to get a table and then use FOR XML PATH or STRING_AGG to combine it into a single string.
declare @mg nvarchar(max);
set @mg = '{"fiskepind":["ko","hest","gris"]}';
select @mg, JSON_VALUE(@mg,'$.fiskepind')
, STUFF((
SELECT
',' + value
FROM OPENJSON(@mg, '$.fiskepind')
FOR XML PATH('')
),1,1,'') as combined_list

MSSQL select JSON file with multirows and insert into table

I read the docs on handling a JSON file here. So far I am able to read the file and get a result:
QRY: SELECT * FROM OPENROWSET (BULK 'c:\ne.db', SINGLE_CLOB) as import
Result: {"res":{"number":"123", "info":"c-PM6900"},"_id":"aHMIeu6ZwB9lIBZk"} {"res":{"number":"456", "info":"a-PMs900"},"_id":"aHaIeu6ZwB9sIBZ1"}....
If I query this, I only get the first row, with res nested:
Declare @JSON varchar(max)
SELECT @JSON=BulkColumn
FROM OPENROWSET (BULK 'C:\ne.db', SINGLE_CLOB) import
SELECT *
FROM OPENJSON (@JSON)
What I want to achieve is to read every entry of the JSON file and insert "res" from the JSON query into a row of a database table with the columns "number", "info", "id". If anyone could help me finish this, I would appreciate it.
The JSON file contains about 400,000 lines and comes from a Node.js script which uses nedb.
Here is the example file: LINK
The JSON in the file is not valid JSON; it contains multiple root elements (a single row for each JSON object). Strangely, OPENJSON() reads only the first element of this input without generating an error.
But you may try to transform the input into a valid JSON array ({...} {...} into [{...}, {...}]) and parse that array with OPENJSON() and an explicit schema. If the input file has a single row for each JSON object, you need to know the newline separator (it's usually CHAR(10)):
DECLARE @json nvarchar(MAX)
-- Read the file's content
-- SELECT @json = BulkColumn
-- FROM OPENROWSET (BULK 'C:\ne.db', SINGLE_CLOB) AS [Insert]
-- Only for test
SELECT @json =
N'{"res":{"number":"123", "info":"c-PM6900"},"_id":"aHMIeu6ZwB9lIBZk"}' +
CHAR(10) +
N'{"res":{"number":"456", "info":"a-PMs900"},"_id":"aHaIeu6ZwB9sIBZ1"}'
SELECT [number], [info], [_id]
FROM OPENJSON(CONCAT('[', REPLACE(@json, CONCAT('}', CHAR(10), '{'), '},{'), ']')) WITH (
[number] varchar(3) '$.res.number',
[info] varchar(10) '$.res.info',
_id varchar(50) '$._id'
)
Result:
number  info      _id
123     c-PM6900  aHMIeu6ZwB9lIBZk
456     a-PMs900  aHaIeu6ZwB9sIBZ1
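To load the parsed rows into a table, as the question asks, the same SELECT can feed an INSERT. A sketch (the target table dbo.ResImport and its column sizes are hypothetical, not from the original answer):
-- Hypothetical target table; adjust names and types to your schema
CREATE TABLE dbo.ResImport ([number] varchar(10), [info] varchar(50), [_id] varchar(50))
INSERT INTO dbo.ResImport ([number], [info], [_id])
SELECT [number], [info], [_id]
FROM OPENJSON(CONCAT('[', REPLACE(@json, CONCAT('}', CHAR(10), '{'), '},{'), ']')) WITH (
[number] varchar(3) '$.res.number',
[info] varchar(10) '$.res.info',
_id varchar(50) '$._id'
)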
You need to use a couple of calls to OPENJSON to achieve this, with a WITH:
DECLARE @JSON nvarchar(MAX) = N'{"res":{"number":"123", "info":"c-PM6900"},"_id":"aHMIeu6ZwB9lIBZk"} {"res":{"number":"456", "info":"a-PMs900"},"_id":"aHaIeu6ZwB9sIBZ1"}'
SELECT J._id,
r.number,
r.info
FROM OPENJSON(@JSON)
WITH (_id varchar(30),
res nvarchar(MAX) AS JSON) J
CROSS APPLY OPENJSON(J.res)
WITH(number int,
info varchar(10)) r;
Because the OP appears to think I am telling them to change their DECLARE and assignment statement... to confirm how you get the value into @JSON, from the OP's own question:
DECLARE @JSON varchar(max);
SELECT @JSON=BulkColumn
FROM OPENROWSET (BULK 'C:\ne.db', SINGLE_CLOB) AS import;
Final edit: it also appears that the OP's JSON is malformed, as I would expect a comma, or something, before the second res definition. I'm guessing we need to split it into rows as well, which means some string splitting:
SELECT J._id,
r.number,
r.info
FROM STRING_SPLIT(REPLACE(@JSON,N'}} {"res"',N'}}|{"res"'),'|') SS --I assume a pipe (|) won't appear in the data
CROSS APPLY OPENJSON(SS.[value])
WITH (_id varchar(30),
res nvarchar(MAX) AS JSON) J
CROSS APPLY OPENJSON(J.res)
WITH(number int,
info varchar(10)) r;
db<>fiddle

Update/Delete JSON array value in SQL Server

I have a JSON array in a column of my table. Inside my stored procedure I can create the array, append to it, or set it to NULL, but I don't see any way to pop a value from the array. Apparently JSON_MODIFY may have a solution, as you can update a key as well as a single value, but how can I use it to modify my array?
--My Array
DECLARE @json NVARCHAR(MAX) = '{"array":[123,456]}'
Desired results after update:
'{"array":[123]}'
Please note that the array contains int values, which are my sub-department ids. All values are (supposed to be) unique.
You could use:
DECLARE @json NVARCHAR(MAX) = '{"array":[123,456]}';
WITH cte AS (
SELECT *, MAX([key]) OVER() AS m_key
FROM OPENJSON(@json, '$.array') s
)
SELECT JSON_QUERY('[' + IIF(MAX(m_key) = 0, '', STRING_AGG(value,',')
WITHIN GROUP (ORDER BY [key])) + ']','$') AS array
FROM cte
WHERE [key] != m_key OR m_key = 0
FOR JSON AUTO, WITHOUT_ARRAY_WRAPPER;
Output:
{"array":[123]}
DBFiddle Demo SQL Server 2017
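If the goal is to remove one specific sub-department id rather than the last element, the same idea can be combined with JSON_MODIFY(). A sketch for SQL Server 2017+ (@subdeptid is a hypothetical variable for the value to remove):
DECLARE @json NVACHAR_PLACEHOLDER
DECLARE @json NVARCHAR(MAX) = N'{"array":[123,456]}';
DECLARE @subdeptid INT = 456; -- hypothetical: the sub-department id to remove
SELECT JSON_MODIFY(@json, '$.array',
       JSON_QUERY('[' + ISNULL(STRING_AGG([value], ',')
                  WITHIN GROUP (ORDER BY CONVERT(int, [key])), '') + ']')) AS result
FROM OPENJSON(@json, '$.array')
WHERE [value] <> CAST(@subdeptid AS NVARCHAR(20));
-- result: {"array":[123]}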
As I was in a hurry, I solved my problem the following way, but I would really recommend not using it. Please see the answer above by @lad2025.
DECLARE @json VARCHAR(MAX)
=(SELECT jsonDept
FROM tblEmployee
WHERE tblEmployeeID = @empid)
DECLARE @newjson VARCHAR(MAX)= (
SELECT LEFT(subdept, LEN(subdept)-1)
FROM (
SELECT DISTINCT value + ', ' FROM OPENJSON(@json,'$.array') WHERE value <> @subdeptid
FOR XML PATH ('')
) t (subdept))
UPDATE tblEmployee SET jsonDept = '{"array":['+ @newjson +']}' WHERE tblEmployeeID = @empid

JSON in SQL Server

I have JSON values stored in a column in a SQL Server database:
'[{"attribute":"Name","Age":50,"sort":true,"visible":true},
{"attribute":"Address","Street":"Wilson street","Country":"United states"},
{"attribute":"Work","Designation":"Developer","Experience":15}]'
We want to remove the entire Work attribute and save the result in the same column. The number of items in that attribute varies: here there are only two (Designation and Experience), but the number of items differs for each row.
I want to change the above JSON into the format below.
'[{"attribute":"Name","Age":50,"sort":true,"visible":true},
{"attribute":"Address","Street":"Wilson street","Country":"United states"}]'
Please suggest the best way to do that.
If you are using SQL Server 2016 or higher, you can use the OPENJSON() function.
Example:
DECLARE @json NVARCHAR(MAX)
SET @json='{"Name":"Anurag","age":25,"skills":["C#","As.Net","MVC","Linq"]}';
SELECT *
FROM OPENJSON(@json);
You could try the string manipulation below to achieve your desired output:
DECLARE @info NVARCHAR(1000) = '[{"attribute":"Name","Age":50,"sort":true,"visible":true},
{"attribute":"Address","Street":"Wilson street","Country":"United states"},
{"attribute":"Work","Designation":"Developer","Experience":15}]'
SELECT
REVERSE(SUBSTRING(REVERSE(SUBSTRING(@info, 0, CHARINDEX('"attribute":"Work"',@info)-1)),CHARINDEX(',',REVERSE(SUBSTRING(@info, 0, CHARINDEX('"attribute":"Work"',@info)-1)))+1,
LEN(REVERSE(SUBSTRING(@info, 0, CHARINDEX('"attribute":"Work"',@info)-1)))))+']'
We also have JSON_MODIFY to update or remove values in a JSON string.
Try the approach below; it is a bit complex, but it does the trick. Read the inline comments to understand how it works.
CREATE TABLE #temp(ID INT, JSON varchar(1000))
INSERT INTO #temp VALUES(1,'[{"attribute":"Name","Age":40,"sort":true,"visible":true},
{"attribute":"Address","Street":"Wilson street","Country":"United states"}]')
INSERT INTO #temp VALUES(2,'[{"attribute":"Name","Age":50,"sort":true,"visible":true},
{"attribute":"Address","Street":"Wilson street","Country":"United states"},
{"attribute":"Work","Designation":"Developer","Experience":15}]')
INSERT INTO #temp VALUES(3,'[{"attribute":"Name","Age":30,"sort":true,"visible":true},
{"attribute":"Work","Designation":"Developer","Experience":15},
{"attribute":"Address","Street":"New Wilson street","Country":"United states"}]')
INSERT INTO #temp VALUES(4,'[{"attribute":"Work","Designation":"Developer","Experience":15},
{"attribute":"Name","Age":30,"sort":true,"visible":true},
{"attribute":"Address","Street":"New Wilson street","Country":"United states"}]')
;WITH CTE AS
(
SELECT ID
,JSON
,SUBSTRING(JSON,0,CHARINDEX('"attribute":"Work"',JSON)-1) AS JSON_P1 -- Get string before the "Work" attribute.
,SUBSTRING(JSON,CHARINDEX('}',JSON,CHARINDEX('"attribute":"Work"',JSON))+1,LEN(JSON)) AS JSON_P2 -- Get string after the "Work" attribute.
FROM #temp
WHERE JSON LIKE '%"attribute":"Work"%'
)
SELECT ID
,JSON
-- Remove the Comma(',') character used with "Work" attribute.
,CASE WHEN CHARINDEX(',',REVERSE(JSON_P1)) = 0 -- In reverse order, when there is no comma(',') in the first part of the string.
THEN JSON_P1
WHEN CHARINDEX(',',REVERSE(JSON_P1)) < CHARINDEX('}',REVERSE(JSON_P1)) -- In reverse order, when the comma(',') appears before a closing brace('}'), remove it.
THEN REVERSE(STUFF(REVERSE(JSON_P1),CHARINDEX(',',REVERSE(JSON_P1)),1,''))
ELSE JSON_P1 -- none of the above
END +
CASE WHEN CHARINDEX(',',REVERSE(JSON_P1)) = 0 -- Check only if no comma(',') was found in the first part of the string.
AND CHARINDEX(',',JSON_P2) < CHARINDEX('{',JSON_P2) -- When the comma(',') appears before an opening brace('{') in the second part of the string, remove it.
THEN STUFF(JSON_P2,CHARINDEX(',',JSON_P2),1,'')
ELSE JSON_P2 -- none of the above
END AS JSON_Final
FROM CTE
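On SQL Server 2017 and later, a shorter alternative to the string manipulation above is to shred each array with OPENJSON(), filter out the Work object, and rebuild the array with STRING_AGG(). This is only a sketch against the #temp table from the example, not part of the original answers:
-- Rebuild the JSON column without the "Work" element (SQL Server 2017+)
UPDATE t
SET t.JSON = ca.new_json
FROM #temp AS t
CROSS APPLY
(
    SELECT '[' + STRING_AGG(j.[value], ',')
                 WITHIN GROUP (ORDER BY CONVERT(int, j.[key])) + ']' AS new_json
    FROM OPENJSON(t.JSON) AS j
    -- keep every element whose "attribute" is not "Work" (or that has no "attribute" at all)
    WHERE ISNULL(JSON_VALUE(j.[value], '$.attribute'), '') <> 'Work'
) AS ca
WHERE t.JSON LIKE '%"attribute":"Work"%';
SELECT * FROM #temp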

How can I pass multiple values to an array parameter of a function

I need your help: how can I pass multiple values into a single parameter of a function?
The values 'AAA 1', 'BBB 2', 'CCC 3', 'DDD 4' are to be passed to the same parameter "v_type". The values will be sent based on the selection from the drop-down in the front-end screen. The user can select one or more values from the list, and those values should be passed to the procedure, which in turn will pass them to the WHERE clause of the SELECT statement inside the procedure.
My function is something like this:
Example
CREATE OR REPLACE FUNCTION FN_GET_ROWS
(v_date_ini IN DATE,
v_date_end IN DATE,
v_type IN VARCHAR2
)
RETURN TEST_TABTYPE
AS
V_Test_Tabtype Test_TabType;
BEGIN
SELECT TEST_OBJ_TYPE(DATE, NAME, ALERT)
BULK COLLECT INTO V_Test_TabType
FROM (select date, name, alert
from Table
where DATE BETWEEN v_date_ini AND v_date_end
AND Alert in (select REGEXP_SUBSTR (v_type, '[^,]+', 1, level)
from dual
connect by level <= length(regexp_replace(v_type,'[^,]*'))+1)
);
RETURN V_Test_TabType;
END;
Searching the internet, I found that maybe a VARRAY would work, but I don't know how to assign the values the user selects on the screen to the :type variable.
I created these types in the database; how can I use them? I'm kind of new to PL/SQL.
CREATE TYPE alert_obj AS OBJECT (type_alert VARCHAR2(60));
CREATE TYPE alert_varray_typ AS VARRAY(100) OF alert_obj;
Thanks for your help
Emanuel.
I don't know if I really understand your problem, but I think there is more than one solution.
You can use a VARCHAR2 string as the parameter and then parse it with a procedure like this:
PROCEDURE p_parse_into_array (
lv_str IN VARCHAR2,
lt_table IN OUT sys.dbms_debug_vc2coll,
lv_splitter IN VARCHAR2)
IS
ln_position NUMBER := 0;
ln_position_2 NUMBER;
ln_i NUMBER := 1;
BEGIN
ln_position_2 := INSTR(lv_str,lv_splitter,1,1);
WHILE ln_position_2 != 0
LOOP
lt_table.extend(1);
lt_table(ln_i) := SUBSTR(lv_str,ln_position+1,ln_position_2-ln_position-1);
ln_position := INSTR(lv_str,lv_splitter,1,ln_i);
ln_position_2 := INSTR(lv_str,lv_splitter,1,ln_i+1);
ln_i := ln_i + 1;
END LOOP;
END;
where lv_str is the string to parse, lt_table is a table of VARCHAR2 (here sys.dbms_debug_vc2coll), and lv_splitter is the character to split on (',', '.', ';', '-', etc.). The procedure returns the values in lt_table, which you can then use in your SELECT.
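A usage sketch for the procedure above (assuming it has been compiled as a standalone procedure; note that with this parsing logic the input string should end with the splitter, otherwise the last token is not captured):
DECLARE
   lt_vals sys.dbms_debug_vc2coll := sys.dbms_debug_vc2coll();
BEGIN
   -- trailing comma so the last value is captured by the parser
   p_parse_into_array('AAA 1,BBB 2,CCC 3,', lt_vals, ',');
   FOR i IN 1 .. lt_vals.COUNT LOOP
      dbms_output.put_line(lt_vals(i));
   END LOOP;
END;
/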
The second solution is to use a VARRAY, as you say, but there you need dynamic SQL, along the lines of:
execute immediate 'select * from dual where some_value in (select * from table('||my_varray_table||'))';
And another solution is to use a nested table. It's your choice which of these solutions you prefer :)
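For the VARRAY route, dynamic SQL is not strictly required: a schema-level collection type can be queried with TABLE() in static SQL. Below is a sketch using the OP's alert_obj/alert_varray_typ types; the table and column names are placeholders, since the real ones were not shown:
CREATE OR REPLACE FUNCTION fn_get_rows (
   v_date_ini IN DATE,
   v_date_end IN DATE,
   v_types    IN alert_varray_typ   -- collection built by the caller from the drop-down selection
) RETURN test_tabtype
AS
   v_test_tabtype test_tabtype;
BEGIN
   SELECT test_obj_type(t.event_date, t.name, t.alert)   -- placeholder column names
     BULK COLLECT INTO v_test_tabtype
     FROM some_table t                                    -- placeholder table name
    WHERE t.event_date BETWEEN v_date_ini AND v_date_end
      AND t.alert IN (SELECT a.type_alert FROM TABLE(v_types) a);
   RETURN v_test_tabtype;
END;
/
The caller would build the collection from the selected values, e.g. alert_varray_typ(alert_obj('AAA 1'), alert_obj('BBB 2')).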