SQL Server OPENROWSET not pulling JSON Data

I am trying to save the JSON data 'whiteListedOrigins' into a SQL table, however I keep receiving NULL entries.
I have used the same method before to import attributes into a table and it worked, although that JSON was formatted differently.
JSON
"Client": {
"whiteListedOrigins": [
"file://",
"https://mobile.gtent.eu",
"https://mobile.assists.co.uk",
"https://valueadds3.active.eu",
"https://flash3.active.eu",
"https://valueadds3.assists.co.uk"
]
}
SQL
DECLARE @JSON VARCHAR(MAX)
SELECT @JSON = BulkColumn
FROM OPENROWSET
(BULK 'C:\config.json', SINGLE_CLOB)
AS A
UPDATE dbo.CommonBaseline
SET CommonBaseline.whiteListedOrigins = whiteListedOrigins
FROM OPENJSON (@JSON, '$.Client')
WITH (
    whiteListedOrigins VARCHAR(MAX))
RESULT: the whiteListedOrigins column comes back NULL.

You need to use OPENJSON() with an explicit schema and the AS JSON option in the column definition.
If you want to return a nested JSON fragment from a JSON property, you
have to provide the AS JSON flag. Without this option, if the property
can't be found, OPENJSON returns a NULL value instead of the
referenced JSON object or array, or it returns a run-time error in
strict mode.
Statement:
DECLARE @Json nvarchar(max) = N'{
    "Client": {
        "whiteListedOrigins": [
            "file://",
            "https://mobile.gtent.eu",
            "https://mobile.assists.co.uk",
            "https://valueadds3.active.eu",
            "https://flash3.active.eu",
            "https://valueadds3.assists.co.uk"
        ]
    }
}'
SELECT *
FROM OPENJSON (@Json, '$.Client') WITH (
    whiteListedOrigins nvarchar(max) AS JSON
)
Output:
--------------------
whiteListedOrigins
--------------------
[
"file://",
"https://mobile.gtent.eu",
"https://mobile.assists.co.uk",
"https://valueadds3.active.eu",
"https://flash3.active.eu",
"https://valueadds3.assists.co.uk"
]
Notes:
Your JSON (without surrounding { and }) is not valid.
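For the original UPDATE, a minimal sketch of the corrected statement (assuming C:\config.json contains the JSON with its surrounding braces, that dbo.CommonBaseline holds a single configuration row as in the question, and that whiteListedOrigins is nvarchar(max), since AS JSON requires that type):
DECLARE @JSON NVARCHAR(MAX);
SELECT @JSON = BulkColumn
FROM OPENROWSET (BULK 'C:\config.json', SINGLE_CLOB) AS A;

-- AS JSON makes OPENJSON return the array as a JSON fragment instead of NULL
UPDATE dbo.CommonBaseline
SET whiteListedOrigins = j.whiteListedOrigins
FROM OPENJSON (@JSON, '$.Client')
     WITH (whiteListedOrigins NVARCHAR(MAX) AS JSON) AS j;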

Related

Using OPENJSON in SQL Server to parse a Non-Array Object

I'm using SQL Server v15, called from a .NET application.
A website I'm using (not mine - I don't control the data) has a JSON dataset formatted strangely. Instead of being an array like:
[{"id":"1","Name":"Charlie"},{"id":"2","Name"="Sally"}]
It's an object with each element named as its ID:
{"1":{"id":"1","Name":"Charlie"}, "2":{"id":"2","Name"="Sally"}}
I know how to use the OPENJSON to read data from an array, but is it possible to have it parse this format? Or is my best bet to have a script loop through the objects one at a time?
Please try the following solution.
SQL
DECLARE @json NVARCHAR(MAX) =
N'{
    "1": {
        "id": "1",
        "Name": "Charlie"
    },
    "2": {
        "id": "2",
        "Name": "Sally"
    }
}';
SELECT rs.*
FROM OPENJSON (@json) AS seq
CROSS APPLY OPENJSON(seq.value)
WITH
(
    [id] INT '$.id'
    , [Name] VARCHAR(20) '$.Name'
) AS rs;
Output
id   Name
1    Charlie
2    Sally
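If the outer key is also needed as a column (here it just duplicates the nested id, but it would be the only source of the value if the objects did not carry an id), the default-schema column seq.[key] can simply be added to the select list:
SELECT seq.[key] AS ObjectKey, rs.*
FROM OPENJSON (@json) AS seq
CROSS APPLY OPENJSON(seq.value)
WITH
(
    [id] INT '$.id'
    , [Name] VARCHAR(20) '$.Name'
) AS rs;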

Multiple SELECT statements into a single JSON

I'm convinced this must be answered somewhere but for the life of me I just can't seem to find anything no matter how much I change my search phrases.
I need to select data from two completely independent tables and export the information to JSON. In this case, there is just 1 record in each table.
If I select just one at a time and export to JSON, I get a single record, but when I join the two single records in SQL and then export to JSON, each part becomes a one-record array.
Just 1 record SQL Input:
DECLARE @Json nvarchar(max) =
(
    SELECT 'Data1' AS [Data1], 'Data2' AS [Data2]
    FOR JSON PATH
        , INCLUDE_NULL_VALUES
        , WITHOUT_ARRAY_WRAPPER
);
SELECT @Json;
GO
Just 1 record JSON Output (note there's no array):
{
"Data1": "Data1",
"Data2": "Data2"
}
2 record SQL Input:
DECLARE @Json nvarchar(max) =
(
    SELECT
    (
        SELECT 'Data1' AS [Data1], 'Data2' AS [Data2]
        FOR JSON PATH
            , INCLUDE_NULL_VALUES
    ) AS [Part1]
    ,
    (
        SELECT 'Text1' AS [Text1], 'Text2' AS [Text2]
        FOR JSON PATH
            , INCLUDE_NULL_VALUES
    ) AS [Part2]
    FOR JSON PATH
        , WITHOUT_ARRAY_WRAPPER
);
SELECT @Json;
GO
2 record JSON Output (note the inclusion of arrays):
{
"Part1": [
{
"Data1": "Data1",
"Data2": "Data2"
}
],
"Part2": [
{
"Text1": "Text1",
"Text2": "Text2"
}
]
}
I "think" that WITHOUT_ARRAY_WRAPPER is the correct attribute to add which will resolve this but as soon as I add that, I get the entire record as a string:
{
"Part1": "{\"Data1\":\"Data1\",\"Data2\":\"Data2\"}",
"Part2": "{\"Text1\":\"Text1\",\"Text2\":\"Text2\"}"
}
I understand that there are text manipulation methods I can use to get this to work, but I'm hoping for a clean SQL > JSON statement.
I'm currently working on SQL Server 2016, but I can get a 2017 or 2019 server if necessary. Not sure whether later SQL handles this better or it's just my query that needs optimisation.
Edit: My desired output is:
{
"Part1": {
"Data1": "Data1",
"Data2": "Data2"
},
"Part2": {
"Text1": "Text1",
"Text2": "Text2"
}
}
According to the accepted answer to "FOR JSON PATH. how to not use escape characters" on the SQL Server forum on MSDN:
FOR JSON will escape any text unless it is generated as a JSON result by some JSON function/query. In your example, FOR JSON cannot know whether you really want raw JSON or you are just sending some free text that looks like JSON.
Properly defined JSON is generated with FOR JSON (unless it has the WITHOUT_ARRAY_WRAPPER option) or JSON_QUERY. If you wrap your JSON literal with JSON_QUERY it will not be escaped.
This answer got me to try the following code:
DECLARE @Json nvarchar(max) =
(
    SELECT
    JSON_QUERY((
        SELECT 'Data1' AS [Data1], 'Data2' AS [Data2]
        FOR JSON PATH
            , INCLUDE_NULL_VALUES
            , WITHOUT_ARRAY_WRAPPER
    )) AS [Part1]
    ,
    JSON_QUERY((
        SELECT 'Text1' AS [Text1], 'Text2' AS [Text2]
        FOR JSON PATH
            , INCLUDE_NULL_VALUES
            , WITHOUT_ARRAY_WRAPPER
    )) AS [Part2]
    FOR JSON PATH
        , WITHOUT_ARRAY_WRAPPER
);
SELECT @Json;
As it turns out, this works like a charm. Results:
{
"Part1": {
"Data1": "Data1",
"Data2": "Data2"
},
"Part2": {
"Text1": "Text1",
"Text2": "Text2"
}
}
DB<>Fiddle
Update
Look what I found buried in the official documentation:
To avoid automatic escaping, provide newValue by using the JSON_QUERY function. JSON_MODIFY knows that the value returned by JSON_QUERY is properly formatted JSON, so it doesn't escape the value.
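The same rule can be seen with JSON_MODIFY directly; a minimal sketch (the variable names are just for illustration) showing the escaped and the unescaped case:
DECLARE @doc NVARCHAR(MAX) = N'{"name":"test"}';
DECLARE @fragment NVARCHAR(MAX) = N'{"Data1":"Data1","Data2":"Data2"}';

-- Without JSON_QUERY the fragment is treated as plain text and escaped
SELECT JSON_MODIFY(@doc, '$.Part1', @fragment);
-- {"name":"test","Part1":"{\"Data1\":\"Data1\",\"Data2\":\"Data2\"}"}

-- Wrapped in JSON_QUERY it is inserted as a nested JSON object
SELECT JSON_MODIFY(@doc, '$.Part1', JSON_QUERY(@fragment));
-- {"name":"test","Part1":{"Data1":"Data1","Data2":"Data2"}}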

Parse unknown JSON path in TSQL with openjson and/or json_value

I have an incoming data structure that looks like this:
declare @json nvarchar(max) = '{
    "action": "edit",
    "data": {
        "2077-09-02": {
            "Description": "some stuff",
            "EffectDate": "2077-1-1"
        }
    }
}';
To make a long story short, I think T-SQL hates this JSON structure, because no matter what I have tried, I can't get to any values other than "action".
The {data} object contains another object, {2077-09-02}. "2077-09-02" will always be different. I can't rely on what that date will be.
This works:
select json_value(@json, '$.action');
None of this works when trying to get to the other values.
select json_value(@json, '$.data'); --returns null
select json_value(@json, '$.data[0]'); --returns null
select json_value(@json, 'lax $.data.[2077-09-02].Description');
--JSON path is not properly formatted. Unexpected character '[' is found at position 11.
select json_value(@json, 'lax $.data.2077-09-02.Description');
--JSON path is not properly formatted. Unexpected character '2' is found at position 11.
How do I get to the other values? Is the JSON not perfect enough for TSQL?
It is never a good idea to use the declarative part of a text-based container as data. The "2077-09-02" is a valid JSON key, but hard to query.
You can try this:
declare @json nvarchar(max) = '{
    "action": "edit",
    "data": {
        "2077-09-02": {
            "Description": "some stuff",
            "EffectDate": "2077-1-1"
        }
    }
}';
SELECT A.[action]
      ,B.[key] AS DateValue
      ,C.*
FROM OPENJSON(@json)
WITH([action] NVARCHAR(100)
    ,[data] NVARCHAR(MAX) AS JSON) A
CROSS APPLY OPENJSON(A.[data]) B
CROSS APPLY OPENJSON(B.[value])
WITH (Description NVARCHAR(100)
     ,EffectDate DATE) C;
The result
action DateValue Description EffectDate
edit 2077-09-02 some stuff 2077-01-01
The idea:
The first OPENJSON will return the action and the data.
I use a WITH clause to tell the engine that action is a simple value, while data is nested JSON.
The next OPENJSON dives into data.
We can now use B.[key] to get the JSON key's value.
Now we need another OPENJSON to dive into the columns within data.
However: if this JSON is under your control, I'd suggest changing its structure.
Use double quotes instead of []. JSON path follows JavaScript conventions, where a string is surrounded by double quotes. The documentation's example contains the path $."first name".
In this case:
select json_value(@json, '$.data."2077-09-02".Description');
Returns:
some stuff
As for the other calls, JSON_VALUE can only return scalar values, not objects. You need to use JSON_QUERY to extract JSON objects, e.g.:
select json_query(@json, '$.data."2077-09-02"');
Returns:
{
"Description": "some stuff",
"EffectDate": "2077-1-1"
}
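If the date key is only discovered at runtime (for example via the OPENJSON [key] approach above), SQL Server 2017 and later also accept a variable as the path argument, so the quoted path can be built dynamically; on 2016 the path must still be a literal. A sketch, assuming @key holds the discovered date:
DECLARE @key NVARCHAR(30) = N'2077-09-02';   -- discovered at runtime, e.g. from OPENJSON's [key] column
DECLARE @path NVARCHAR(200) = N'$.data."' + @key + N'".Description';

-- Variable paths require SQL Server 2017 or later
SELECT JSON_VALUE(@json, @path);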

How to insert JSON text into one of the columns in a table in SQL Server

I have JSON from which I am trying to get output. One of the columns in my code, "RETURN_MESSAGE", has a value that is itself JSON. When I run a SELECT query I am trying to get the value in the RETURN_MESSAGE column, but it is giving me NULL. How can I get the JSON text which is inside "RETURN_MESSAGE"? Can anyone help me with this please?
declare @json varchar(max) ='[
{
"SP_NAME":"test"
,"KEY":"a39a"
,"EXEC_ID":4857
,"RETURN_MESSAGE":{
"d":{
"Key":"e77d83af-2827-447c-8b98-46c9e9d0a39a"
,"reqCountLimit":0
,"reqFormat":"JSON"
,"reqExecutionMode":"B"
,"srcExtid":284
,"tgtExtId":4857
,"srcTableName":"KDP_STG_SAP_CSKS"
,"SourceSchema":"QCDC1"
,"serviceId":"1001"
,"reqBatchRecCount":15000
,"compressedFlag":"Y"
,"mDataFlag":"Y"
,"reqDeltaFromUTC":"0000-00-00T00: 00: 00"
,"reqExtMode":"F"
,"reqStatusType":"S"
,"reqStatus":"Successfully Batch Job Started !!!"
,"OBJECT_MDATA":{
"results":[
{
"Fieldname":"KDP_TABKEY"
,"Datatype":"CHAR"
,"Length":78
}
,{
"Fieldname":"KDP_CHNGIND"
,"Datatype":"CHAR"
,"Length":1
}
]
}
,"DELTA_CONFIG":[
]
}
}
}]'
SELECT * FROM OPENJSON(@json)
WITH (SP_NAME varchar(50), [KEY] varchar(255), EXEC_ID int, RETURN_MESSAGE varchar(max))
output
sp_name key exec_id RETURN_MESSAGE
test a39a 4857 NULL
You can try the following query.
SELECT * FROM OPENJSON(@json)
WITH (SP_NAME VARCHAR(50), [KEY] VARCHAR(255), EXEC_ID INT, RETURN_MESSAGE NVARCHAR(MAX) AS JSON)
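Since RETURN_MESSAGE now comes back as a JSON fragment, it can be queried further in the same statement; a sketch (column names taken from the sample payload) that pulls one nested scalar plus the OBJECT_MDATA rows:
SELECT j.SP_NAME,
       j.EXEC_ID,
       JSON_VALUE(j.RETURN_MESSAGE, '$.d.reqStatus') AS reqStatus,
       m.Fieldname,
       m.Datatype,
       m.Length
FROM OPENJSON(@json)
WITH (SP_NAME VARCHAR(50), [KEY] VARCHAR(255), EXEC_ID INT, RETURN_MESSAGE NVARCHAR(MAX) AS JSON) AS j
CROSS APPLY OPENJSON(j.RETURN_MESSAGE, '$.d.OBJECT_MDATA.results')
WITH (Fieldname VARCHAR(50), Datatype VARCHAR(20), Length INT) AS m;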

Parsing a JSON to meet minimum requirements inside a stored procedure

I have been looking at what I believe to be every single page on SQL Server and half of Stack Overflow, and I can't find a proper solution to this.
Our challenge is to deal with an existing application that sends/receives JSON from SQL Server. So we have to build a STRONG JSON architecture on SQL Server.
We need to validate the format of the JSON (the legacy system has its own standard) so messages are in the exact expected format.
The thing is, the JSON functions are not as advanced as the XML ones, and there seems to be no way to validate a schema in SQL Server.
We tried with sp_prepare and sp_execute, but that does not seem to work.
We tested something like this:
Declare @ptSQL1 int;
Exec sp_prepare @ptSQL1 output,
    N'@P1 nvarchar(128), @json NVARCHAR(1000)',
    N'SELECT *
      INTO temp_tblPersons
      FROM OPENJSON (@json, ''$.root'')
      WITH (
          Cname NVARCHAR(100) ''strict$.FirstName'',
          Csurname NVARCHAR(100) ''lax$.surname''
      ) as J
      where Csurname like @P1';
DECLARE @json7 NVARCHAR(1000)
SET @json7 = N'{
    "root": [
        { "FirstName": "Charles" , "surname":"perez" },
        { "FirstName": "Jade" , "surname":"pelaz" },
        { "FirstName": "Jim" , "surname":"alvarez" },
        { "FirstName": "Luke" , "surname":"alonso" },
        { "FirstName": "Ken"}
    ]
}'
IF (@ptSQL1 = 0) PRINT 'THE SUPPLY JSON IS NOT VALID'
ELSE Exec sp_execute @ptSQL1, N'a%', @json7;
but this does not match the sp_prepare/sp_execute behavior.
Our intention is to validate a minimum schema before proceeding to process the data, and if the schema doesn't meet the standard, return an ERROR.
How can this be accomplished?
(not sure where we read the @ptSQL1 = 0 check, but I believe we read it somewhere)
Our intention is to validate a minimum schema before proceeding to
process the data, and if the schema doesn't meet the standard, return
an ERROR.
The JSON must be parsed in order to validate the schema. A prepare doesn't actually execute the query in order to parse the JSON document, plus sp_prepare and sp_execute are internal API system stored procedures not intended to be called directly in T-SQL.
Although one can't currently validate JSON schema in T-SQL (without writing a custom SQLCLR assembly), you could just use TRY/CATCH and handle errors. The example below handles JSON errors differently but I would personally just THROW all errors and handle specific ones in the app code.
DECLARE @json NVARCHAR(1000);
DECLARE @P1 NVARCHAR(128) = 'a%';
SET @json = N'{
    "root": [
        { "FirstName": "Charles" , "surname":"perez" },
        { "FirstName": "Jade" , "surname":"pelaz" },
        { "FirstName": "Jim" , "surname":"alvarez" },
        { "FirstName": "Luke" , "surname":"alonso" },
        { "FirstName": "Ken"}
    ]
}';
BEGIN TRY
    SELECT *
    INTO temp_tblPersons
    FROM OPENJSON (@json, '$.root')
    WITH (
        Cname NVARCHAR(100) 'strict$.FirstName',
        Csurname NVARCHAR(100) 'lax$.surname'
    ) as J
    where Csurname like @P1;
END TRY
BEGIN CATCH
    DROP TABLE IF EXISTS temp_tblPersons;
    IF ERROR_MESSAGE() LIKE N'%JSON%'
    BEGIN
        PRINT 'THE SUPPLY JSON IS NOT VALID';
    END
    ELSE
    BEGIN
        THROW;
    END;
END CATCH;
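If a cheap up-front check is acceptable, ISJSON() (available since SQL Server 2016) can reject malformed text before the query runs, and a lax JSON_QUERY probe can confirm that a mandatory section exists. A minimal sketch (the error numbers and messages are placeholders) that could sit just before the TRY block above:
-- Reject text that is not well-formed JSON at all
IF ISJSON(@json) = 0
BEGIN
    THROW 50001, N'THE SUPPLY JSON IS NOT VALID', 1;
END;

-- Reject JSON that is well formed but lacks the mandatory $.root section
IF JSON_QUERY(@json, '$.root') IS NULL
BEGIN
    THROW 50002, N'THE SUPPLY JSON DOES NOT CONTAIN $.root', 1;
END;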