I'm using SQL Server 2014 and am aware that out of the box it does not support JSON.
We are receiving data from a 3rd party supplier that will look like the below:
{
"PersonID": "1",
"MarketingPreference": "Allow",
"AllowPhone": "No",
"AllowEmail": "Yes",
"AllowTxt": "Yes",
"AllowMob": "Yes"
}
However, we may sometimes also receive the below:
{
"PersonID": "2",
"MarketingPreference": "DoNotAllow"
}
I need to insert these values into a table - what is the best way to do this if SQL Server 2014 does not support JSON?
If I convert the JSON to XML it looks like the below:
<PersonID>1</PersonID>
<MarketingPreference>Allow</MarketingPreference>
<AllowPhone>No</AllowPhone>
<AllowEmail>Yes</AllowEmail>
<AllowTxt>Yes</AllowTxt>
<AllowMob>Yes</AllowMob>
How do I then extract the values from the XML?
DECLARE @xml XML
SET @xml = N'
<PersonID>1</PersonID>
<MarketingPreference>Allow</MarketingPreference>
<AllowPhone>No</AllowPhone>
<AllowEmail>Yes</AllowEmail>
<AllowTxt>Yes</AllowTxt>
<AllowMob>Yes</AllowMob>'

SELECT
Tab.Col.value('@PersonID','int') AS ContactID,
Tab.Col.value('@MarketingPreference','varchar(20)') AS Pref,
Tab.Col.value('@AllowPhone','varchar(20)') AS Phone,
Tab.Col.value('@AllowEmail','varchar(20)') AS Email,
Tab.Col.value('@AllowTxt','varchar(20)') AS Txt,
Tab.Col.value('@AllowMob','varchar(20)') AS Mob
FROM
@xml.nodes('/root/') Tab(Col)
GO;
But now I get this error:
Incorrect syntax near 'GO'.
Is there an easier way to select the values from JSON?
You don't need a GO there (never mind GO;, which is not valid), and your XML syntax seems to have been plucked from your first search result. Try:
SELECT PersonID = x.p.value('(PersonID)[1]', 'int'),
MarkPref = x.p.value('(MarketingPreference)[1]', 'varchar(20)'),
AllowPhone = x.p.value('(AllowPhone)[1]','varchar(20)'),
AllowEmail = x.p.value('(AllowEmail)[1]','varchar(20)'),
AllowTxt = x.p.value('(AllowTxt)[1]', 'varchar(20)'),
AllowMob = x.p.value('(AllowMob)[1]', 'varchar(20)')
FROM @xml.nodes('.') AS x(p);
Output:

PersonID  MarkPref  AllowPhone  AllowEmail  AllowTxt  AllowMob
--------  --------  ----------  ----------  --------  --------
1         Allow     No          Yes         Yes       Yes
Example db<>fiddle
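Since the end goal is inserting these values into a table, the same query can feed an INSERT. A minimal sketch, assuming a hypothetical dbo.MarketingPreference target table; the optional elements simply come back NULL for the shorter payload:

CREATE TABLE dbo.MarketingPreference (
    ContactID int,
    Pref      varchar(20),
    Phone     varchar(20),
    Email     varchar(20),
    Txt       varchar(20),
    Mob       varchar(20)
);

-- @xml is the variable declared earlier; absent elements yield NULL.
INSERT INTO dbo.MarketingPreference (ContactID, Pref, Phone, Email, Txt, Mob)
SELECT x.p.value('(PersonID)[1]', 'int'),
       x.p.value('(MarketingPreference)[1]', 'varchar(20)'),
       x.p.value('(AllowPhone)[1]', 'varchar(20)'),
       x.p.value('(AllowEmail)[1]', 'varchar(20)'),
       x.p.value('(AllowTxt)[1]', 'varchar(20)'),
       x.p.value('(AllowMob)[1]', 'varchar(20)')
FROM @xml.nodes('.') AS x(p);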
I am passing this JSON to a stored procedure in SQL Server
{
"individual": [
{
"app_id": 1057029,
"size": 2
},
{
"app_id": 1057053,
"size": 3
},
{
"app_id": 1057048,
"size": 1
}
]
}
In the stored procedure I am extracting the values of app_id and size as under:
SET @len = JSON_VALUE(@json, CONCAT('$.individual[', @i, '].size'));
SET @appId = JSON_VALUE(@json, CONCAT('$.individual[', @i, '].app_id'));
(Here @i is an index variable incremented in a loop.)
This works perfectly on Microsoft SQL Server 2017 (version 14.0.1000.169),
but on Microsoft SQL Server 2016 (version 13.0.4604.0) I am getting this error:
JSON_Value error: The argument 2 of the "JSON_VALUE or JSON_QUERY"
must be a string literal
Please note this is not a duplicate: I have already referred to the questions below on SO but still didn't find a solution.
JSON_Value error: The argument 2 of the "JSON_VALUE or JSON_QUERY" must be a string literal
SQL Sever 2016 - Inconsistent Behavior - The argument 2 of the "JSON_VALUE or JSON_QUERY" must be a string literal
Update
Why is this not a duplicate question?
None of the other questions discusses the issue in a clear, precise way; they mention specifics instead. In this question I have called out the use of the variable @i, which is precisely what causes this error. JSON_VALUE works in SQL Server 2016, but there it does not support a variable as the second parameter, and that is exactly the problem here. As a workaround we have to use OPENJSON, but writing an OPENJSON query to get the record at a particular index of the JSON array is tricky. None of the answers to the other, similar questions discusses this clearly. I am going to write an answer in some time demonstrating it all; a first sketch of the workaround follows below.
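A minimal sketch of that OPENJSON workaround, assuming the JSON above: without a WITH clause, OPENJSON returns key/value/type columns, and for an array the key column holds the zero-based element index, so a single index can be picked out using literal paths only:

DECLARE @json NVARCHAR(MAX) = N'{"individual": [
    {"app_id": 1057029, "size": 2},
    {"app_id": 1057053, "size": 3},
    {"app_id": 1057048, "size": 1}
]}';
DECLARE @i INT = 1;  -- index of the array element we want

-- Both JSON_VALUE paths are string literals, so this also runs on SQL Server 2016.
SELECT appId = JSON_VALUE(j.[value], '$.app_id'),
       size  = JSON_VALUE(j.[value], '$.size')
FROM OPENJSON(@json, '$.individual') AS j
WHERE j.[key] = CAST(@i AS NVARCHAR(10));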
This example demonstrates the use of OPENJSON for your JSON and desired query. I've never used OPENJSON before, but I found the Microsoft documentation more than adequate. In the last SELECT of this example, app_id and size are columns of a table, with each value pair as a row. Now you don't have to loop through an array; you have a standard table to work with.
DECLARE @json nvarchar(max) = '{
"individual": [
{
"app_id": 1057029,
"size": 2
},
{
"app_id": 1057053,
"size": 3
},
{
"app_id": 1057048,
"size": 1
}
]
}';
SELECT * FROM OPENJSON(@json);

SELECT * FROM OPENJSON(@json, '$.individual');

SELECT * FROM OPENJSON(@json, '$.individual')
WITH (
    app_id int,
    size int
) AS apps;
The output of the last SELECT:

app_id   size
-------  ----
1057029  2
1057053  3
1057048  1
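From there, replacing the original index loop with a set-based insert is one statement. A hedged sketch; the table variable name is illustrative:

DECLARE @apps TABLE (app_id int, size int);

INSERT INTO @apps (app_id, size)
SELECT app_id, size
FROM OPENJSON(@json, '$.individual')
WITH (
    app_id int,
    size int
);

SELECT * FROM @apps;  -- work with the rows set-based instead of via @i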
Given the table:
C1 C2 C3
----------------
1 'v1' 1.1
2 'v2' 2.2
3 'v3' 3.3
Is there any "easy" way to return JSON in this format:
{
"columns": [ "C1", "C2", "C3" ],
"rows": [
[ 1, "v1", 1.1 ],
[ 2, "v2", 2.2 ],
[ 3, "v3", 3.3 ]
]
}
To generate an array with single values from a table there is a neat trick like this:
SELECT JSON_QUERY(REPLACE(REPLACE(
(
SELECT id
FROM table a
WHERE pk in (1,2)
FOR JSON PATH
), '{"id":',''),'}','')) 'ids'
Which generates
"ids": [1,2]
But to construct the nested array above, the replacing gets really tedious. Does anyone know a good way to achieve this?
Well, you ask for an easy way, but the following will not be easy :-)
The tricky part is to know which values need to be quoted and which can remain naked.
This needs generic type analysis to find out which values are strings.
The only way I know to get at the metadata (besides building dynamic SQL using meta views like INFORMATION_SCHEMA.COLUMNS) is XML together with an AUTO-schema.
This XML is actually very near to your needs. There is a list of columns at the beginning, followed by a list of rows. But it is not JSON, of course...
Try this out:
--This is a mockup table with the values you provided.
DECLARE @mockup TABLE(C1 INT,C2 VARCHAR(100),C3 DECIMAL(4,2));
INSERT INTO @mockup VALUES
 (1,'v1',1.1)
,(2,'v2',2.2)
,(3,'v3',3.3);

--Now we create an XML out of this
DECLARE @xml XML =
(
    SELECT *
    FROM @mockup t
    FOR XML RAW,XMLSCHEMA,TYPE
);

--Check the XML's content with SELECT @xml to see how it looks internally
--Now the real query can start:
SELECT '{"columns":[' +
       STUFF(@xml.query('declare namespace xsd="http://www.w3.org/2001/XMLSchema";
                         for $col in /xsd:schema/xsd:element//xsd:attribute
                         return
                         <x>,{concat("""",xs:string($col/@name),"""")}</x>
                        ').value('.','nvarchar(max)'),1,1,'') +
       '],"rows":[' +
       STUFF(
       (
           SELECT
               ',[' + STUFF(b.query('declare namespace xsd="http://www.w3.org/2001/XMLSchema";
                                     for $attr in ./@*
                                     return
                                     <x>,{if(/xsd:schema/xsd:element//xsd:attribute[@name=local-name($attr)]//xsd:restriction/@base="sqltypes:varchar") then
                                             concat("""",$attr,"""")
                                          else
                                             xs:string($attr)
                                         }
                                     </x>
                                    ').value('.','nvarchar(max)'),1,1,'') + ']'
           FROM @xml.nodes('/*:row') B(b)
           FOR XML PATH(''),TYPE
       ).value('.','nvarchar(max)'),1,1,'') +
       ']}';
The result
{"columns":["C1","C2","C3"],"rows":[[3,"v3",3.30],[1,"v1",1.10],[2,"v2",2.20]]}
Some explanation:
The first part uses XQuery to find all columns (xsd:attribute nodes within the XML schema) and creates the array of column names.
The second part again uses XQuery to run through all rows and write their column values into a concatenated string. Each value can refer to its type within the schema; whenever this type is sqltypes:varchar, the value is quoted. All other values remain naked.
This will not solve each and every case generically...
To be honest, this was more for my own curiosity :-) I wanted to find out how one can solve this.
Quite probably the best answer is: use another tool. SQL Server is not the best choice here ;-)
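That said, if the column list and types are fixed and known up front, a much simpler sketch works with no schema introspection at all (assuming the @mockup table from above; only C2 is a string, so only it gets quoted):

SELECT '{"columns":["C1","C2","C3"],"rows":[' +
       STUFF((
           SELECT ',[' + CONCAT(C1, ',"', C2, '",', C3) + ']'
           FROM @mockup
           FOR XML PATH('')  -- caveat: characters like & or < in values would be XML-entitized here
       ), 1, 1, '') +
       ']}';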
I have been looking at what I believe to be every single page on SQL Server and half of Stack Overflow, and I can't find a proper solution to this.
Our challenge is to deal with an existing application that sends/receives JSON to/from SQL Server, so we have to build a STRONG JSON architecture on SQL Server.
We need to validate the format of the JSON (the legacy system has its own standard) so messages arrive in the exact expected format.
The thing is, the JSON functions are not as advanced as the XML ones, and it seems there is no way to validate a JSON schema in SQL Server.
We tried with sp_prepare and sp_execute, but that does not seem to work.
We tested something like this:
Declare @ptSQL1 int;
Exec sp_prepare @ptSQL1 output,
    N'@P1 nvarchar(128), @json NVARCHAR(1000)',
    N'SELECT *
      INTO temp_tblPersons
      FROM OPENJSON (@json, ''$.root'')
      WITH (
          Cname NVARCHAR(100) ''strict$.FirstName'',
          Csurname NVARCHAR(100) ''lax$.surname''
      ) as J
      where Csurname like @P1';

DECLARE @json7 NVARCHAR(1000)
SET @json7 = N'{
"root": [
{ "FirstName": "Charles" , "surname":"perez" },
{ "FirstName": "Jade" , "surname":"pelaz" },
{ "FirstName": "Jim" , "surname":"alvarez" },
{ "FirstName": "Luke" , "surname":"alonso" },
{ "FirstName": "Ken"}
]
}'

IF (@ptSQL1 = 0) PRINT 'THE SUPPLY JSON IS NOT VALID'
ELSE Exec sp_execute @ptSQL1, N'a%', @json7;
but it does not match the sp_prepare/sp_execute behavior.
Our intention is to validate a minimum schema before proceeding to process the data, and if the schema doesn't meet the standard, return an ERROR.
How can this be accomplished?
(Not sure where we read the @ptSQL1 = 0 check, but I believe we read it somewhere.)
Our intention is to validate a minimum schema before proceeding to process the data, and if the schema doesn't meet the standard, return an ERROR.
The JSON must be parsed in order to validate the schema, and a prepare doesn't actually execute the query, so it never parses the JSON document. Also, sp_prepare and sp_execute are internal API system stored procedures not intended to be called directly in T-SQL.
Although one can't currently validate JSON schema in T-SQL (without writing a custom SQLCLR assembly), you could just use TRY/CATCH and handle errors. The example below handles JSON errors differently but I would personally just THROW all errors and handle specific ones in the app code.
DECLARE @json NVARCHAR(1000);
DECLARE @P1 NVARCHAR(128) = 'a%';
SET @json = N'{
"root": [
{ "FirstName": "Charles" , "surname":"perez" },
{ "FirstName": "Jade" , "surname":"pelaz" },
{ "FirstName": "Jim" , "surname":"alvarez" },
{ "FirstName": "Luke" , "surname":"alonso" },
{ "FirstName": "Ken"}
]
}';
BEGIN TRY
SELECT *
INTO temp_tblPersons
FROM OPENJSON (@json, '$.root')
WITH (
Cname NVARCHAR(100) 'strict$.FirstName',
Csurname NVARCHAR(100) 'lax$.surname'
) as J
where Csurname like @P1;
END TRY
BEGIN CATCH
DROP TABLE IF EXISTS temp_tblPersons;
IF ERROR_MESSAGE() LIKE N'%JSON%'
BEGIN
PRINT 'THE SUPPLY JSON IS NOT VALID';
END
ELSE
BEGIN
THROW;
END;
END CATCH;
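If a cheap pre-check before OPENJSON runs is wanted, SQL Server 2016+ also offers ISJSON() for well-formedness, and OPENJSON's default key/value/type output can assert a minimum shape. A hedged sketch; the error numbers are illustrative:

-- Reject text that isn't well-formed JSON at all.
IF ISJSON(@json) = 0
    THROW 50001, N'THE SUPPLIED JSON IS NOT VALID', 1;

-- Minimal shape check: $.root must exist and be an array (type 4 in OPENJSON output).
IF NOT EXISTS (
    SELECT 1
    FROM OPENJSON(@json)
    WHERE [key] = N'root' AND [type] = 4
)
    THROW 50002, N'THE SUPPLIED JSON DOES NOT MATCH THE MINIMUM SCHEMA', 1;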
I'm trying to import the entirety of a JSON file into a table of mine in SQL Server.
The JSON data looks like this:
{
"category": "General Knowledge",
"type": "multiple",
"difficulty": "hard",
"question": "Electronic music producer Kygo's popularity skyrocketed after a certain remix. Which song did he remix?",
"correct_answer": "Ed Sheeran - I See Fire",
"incorrect_answers": [
"Marvin Gaye - Sexual Healing",
"Coldplay - Midnight",
"a-ha - Take On Me"
]
},
With multiple entries like this.
I'm attempting to use OPENROWSET and OPENJSON to accomplish this using the following query:
SELECT value
FROM OPENROWSET (BULK 'C:\Users\USERNAME\Desktop\general_questions.json', SINGLE_CLOB) as j
CROSS APPLY OPENJSON(BulkColumn)
However, the output I'm getting only shows the first question object in the file. I have a two part question:
How can I get my query to select ALL of the objects in the file and then insert all of those objects into a table in my SQL Server db?
If I understood you right, is it this you want:
SELECT value
FROM OPENROWSET (BULK 'C:\Users\USERNAME\Desktop\general_questions.json', SINGLE_CLOB) as j
CROSS APPLY OPENJSON(BulkColumn)
WITH
(
category VARCHAR(MAX),  -- note: OPENJSON matches column names to JSON keys case-sensitively
...
) AS JSON_TABLE
Also, I am not sure what you mean by "However, the output I'm getting only shows the first question object in the file." Do you mean the question object has multiple attributes?
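For the insert itself, a hedged sketch, assuming the objects in the file are wrapped in a single top-level JSON array and a hypothetical dbo.Questions target table:

INSERT INTO dbo.Questions (category, [type], difficulty, question, correct_answer, incorrect_answers)
SELECT q.category, q.[type], q.difficulty, q.question, q.correct_answer, q.incorrect_answers
FROM OPENROWSET (BULK 'C:\Users\USERNAME\Desktop\general_questions.json', SINGLE_CLOB) AS j
CROSS APPLY OPENJSON(j.BulkColumn)
WITH (
    category          VARCHAR(100),
    [type]            VARCHAR(50),
    difficulty        VARCHAR(50),
    question          NVARCHAR(MAX),
    correct_answer    NVARCHAR(400),
    incorrect_answers NVARCHAR(MAX) AS JSON  -- keep the nested array as raw JSON text
) AS q;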
So I have a lot of json files structured like this:
{
"Id": "2551faee-20e5-41e4-a7e6-57bd20b02a22",
"Timestamp": "2016-12-06T08:09:57.5541438+01:00",
"EventEntry": {
"EventId": 1,
"Payload": [
"1a3e0c9e-ef69-4c6a-ac8c-9b2de2fbc701",
"DHS.PlanCare.Business.BusinessLogic.VisionModels.VisionModelServiceWithoutUnitOfWork.FetchVisionModelsForClientOnReferenceDateAsync(System.Int64 clientId, System.DateTime referenceDate, System.Threading.CancellationToken cancellationToken)",
25,
"DHS.PlanCare.Business.BusinessLogic.VisionModels.VisionModelServiceWithoutUnitOfWork+<FetchVisionModelsForClientOnReferenceDateAsync>d__11.MoveNext\r\nDHS.PlanCare.Core.Extensions.IQueryableExtensions+<ExecuteAndThrowTaskCancelledWhenRequestedAsync>d__16`1.MoveNext\r\n",
false,
"2197, 6-12-2016 0:00:00, System.Threading.CancellationToken"
],
"EventName": "Duration",
"KeyWordsDescription": "Duration",
"PayloadSchema": [
"instanceSessionId",
"member",
"durationInMilliseconds",
"minimalStacktrace",
"hasFailed",
"parameters"
]
},
"Session": {
"SessionId": "0016e54b-6c4a-48bd-9813-39bb040f7736",
"EnvironmentId": "C15E535B8D0BD9EF63E39045F1859C98FEDD47F2",
"OrganisationId": "AC6752D4-883D-42EE-9FEA-F9AE26978E54"
}
}
How can I create a U-SQL query that outputs the
Id,
Timestamp,
EventEntry.EventId and
EventEntry.Payload[2] (the value 25 in the example above)?
I can't figure out how to extend my query
@extract =
    EXTRACT
        Timestamp DateTime
    FROM @"wasb://xxx/2016/12/06/0016e54b-6c4a-48bd-9813-39bb040f7736/yyy/{*}/{*}.json"
    USING new Microsoft.Analytics.Samples.Formats.Json.JsonExtractor();

@res =
    SELECT Timestamp
    FROM @extract;

OUTPUT @res TO "/output/result.csv" USING Outputters.Csv();
I have seen some examples like:
U-SQL Unable to extract data from JSON file => this only queries one level of the document; I need data from multiple levels.
U-SQL - Extract data from json-array => this only queries one level of the document; I need data from multiple levels.
JSONTuple supports multiple JSONPaths in one go.
@extract =
    EXTRACT
        Id String,
        Timestamp DateTime,
        EventEntry String
    FROM @"..."
    USING new Microsoft.Analytics.Samples.Formats.Json.JsonExtractor();

@res =
    SELECT Id, Timestamp, EventEntry,
           Microsoft.Analytics.Samples.Formats.Json.JsonFunctions.JsonTuple(EventEntry,
               "EventId", "Payload[2]") AS Event
    FROM @extract;

@res =
    SELECT Id,
           Timestamp,
           Event["EventId"] AS EventId,
           Event["Payload[2]"] AS Something
    FROM @res;
You may want to look at this GitHub example: https://github.com/Azure/usql/blob/master/Examples/JsonSample/JsonSample/NestedJsonParsing.usql
It takes two disparate data elements and combines them, like you have with Payload and PayloadSchema. If you create key-value pairs using the "Donut" or "Cake and Batter" examples, you may be able to match the schema up to the payload and use CROSS APPLY and the explode function.