As a simplified example, consider this table with two fields. One is a string and the other is XML.
SELECT TOP (1) [Source]
, OrderParameter
FROM [Rops].[dbo].[PreOrder]
Source="MediaConversions"
OrderParameter="<?xml version="1.0" encoding="utf-16"?>"
Now I want to query the table and get the results as JSON, but also have the XML converted to JSON in one go.
SELECT TOP (1) [Source]
, OrderParameter
FROM [Rops].[dbo].[PreOrder]
for json path;
This results in
[{"Source":"MediaConversions","OrderParameter":"<ParameterList
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:xsd="http://www.w3.org/2001/XMLSchema" />"}]
But I want it to be converted into:
[{"Source":"MediaConversions","OrderParameter":{ "ParameterList": [
"x": 1, "y": 10] }
]
How to add "for json" to have the xml converted?
SELECT TOP (1) [Source]
, select OrderParameter for json path????
FROM [Rops].[dbo].[PreOrder]
for json path;
It looks like you want to pull out the inner text of the ParameterList node inside the XML. You can use .value() with an XQuery expression for this:
SELECT TOP (1) [Source]
, OrderParameter = (
SELECT
x = x.PL.value('(x/text())[1]','int'),
y = x.PL.value('(y/text())[1]','int')
FROM (VALUES( CAST(OrderParameter AS xml) )) v(OrderParameter)
CROSS APPLY v.OrderParameter.nodes('ParameterList') x(PL)
FOR JSON PATH, ROOT('ParameterList')
)
FROM [Rops].[dbo].[PreOrder]
FOR JSON PATH;
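For reference, assuming the stored XML really does contain <x>1</x> and <y>10</y> elements inside ParameterList (as implied by the desired output above), this should return something along the lines of:
[{"Source":"MediaConversions","OrderParameter":{"ParameterList":[{"x":1,"y":10}]}}]
The inner FOR JSON subquery is embedded as real JSON rather than an escaped string, because FOR JSON does not escape values that were themselves produced by a FOR JSON query.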
Related
This is driving me nuts, and I don't understand what's wrong with my approach.
I generate a JSON object in SQL like this:
select @output = (
select distinct lngEmpNo, txtFullName
from tblSecret
for json path, root('result'), include_null_values
)
I get a result like this:
{"result":[{"lngEmpNo":696969,"txtFullName":"Clinton, Bill"}]}
ISJSON() confirms that it's valid JSON, and JSON_QUERY(@OUTPUT, '$.result') will return the array [] portion of the JSON object... cool!
BUT, I'm trying to use JSON_QUERY to extract a specific value:
SELECT JSON_QUERY(@jsonResponse, '$.result[0].txtFullName');
This gets me a NULL value. Why? I've tried it with the [0], without the [0], and of course txtFullName[0].
When I prefix the path with strict, SELECT JSON_QUERY(@jsonResponse, 'strict $.result[0].txtFullName');, it tells me this:
Msg 13607, Level 16, State 4, Line 29
JSON path is not properly formatted. Unexpected character 't' is found at
position 18.
What am I doing wrong? What is wrong with my structure?
JSON_QUERY will only extract an object or an array. You are trying to extract a single value, so you need to use JSON_VALUE. For example:
SELECT JSON_VALUE(@jsonResponse, '$.result[0].txtFullName');
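As a quick side-by-side sketch (using a hypothetical local variable loaded with the JSON from the question), JSON_VALUE returns the scalar, while JSON_QUERY only ever returns an object or an array:
DECLARE @jsonResponse nvarchar(max) =
    N'{"result":[{"lngEmpNo":696969,"txtFullName":"Clinton, Bill"}]}';
-- Scalar value: use JSON_VALUE
SELECT JSON_VALUE(@jsonResponse, '$.result[0].txtFullName');  -- Clinton, Bill
-- Object or array: use JSON_QUERY
SELECT JSON_QUERY(@jsonResponse, '$.result[0]');              -- {"lngEmpNo":696969,"txtFullName":"Clinton, Bill"}
SELECT JSON_QUERY(@jsonResponse, '$.result[0].txtFullName');  -- NULL, because the path points at a scalar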
I'm convinced this must be answered somewhere, but for the life of me I can't seem to find anything no matter how much I change my search phrases.
I need to select data from two completely independent tables and export the information to JSON. In this case, there is just one record in each table.
If I select one table at a time and export to JSON, I get a single record, but when I combine the two single records in SQL and then export to JSON, each part becomes a one-record array.
Just 1 record SQL Input:
DECLARE #Json nvarchar(max) =
(
SELECT 'Data1' AS [Data1], 'Data2' AS [Data2]
FOR JSON PATH
, INCLUDE_NULL_VALUES
, WITHOUT_ARRAY_WRAPPER
);
SELECT #Json;
GO
Just 1 record JSON Output (note there's no array):
{
"Data1": "Data1",
"Data2": "Data2"
}
2 record SQL Input:
DECLARE #Json nvarchar(max) =
(
SELECT
(
SELECT 'Data1' AS [Data1], 'Data2' AS [Data2]
FOR JSON PATH
, INCLUDE_NULL_VALUES
) AS [Part1]
,
(
SELECT 'Text1' AS [Text1], 'Text2' AS [Text2]
FOR JSON PATH
, INCLUDE_NULL_VALUES
) AS [Part2]
FOR JSON PATH
, WITHOUT_ARRAY_WRAPPER
);
SELECT #Json;
GO
2 record JSON Output (note the inclusion of arrays):
{
"Part1": [
{
"Data1": "Data1",
"Data2": "Data2"
}
],
"Part2": [
{
"Text1": "Text1",
"Text2": "Text2"
}
]
}
I "think" that WITHOUT_ARRAY_WRAPPER is the correct attribute to add which will resolve this but as soon as I add that, I get the entire record as a string:
{
"Part1": "{\"Data1\":\"Data1\",\"Data2\":\"Data2\"}",
"Part2": "{\"Text1\":\"Text1\",\"Text2\":\"Text2\"}"
}
I understand that there are text manipulation methods I can use to get this to work, but I'm hoping for a clean SQL > JSON statement.
I'm currently working on SQL Server 2016, but I can get a 2017 or 2019 server if necessary. I'm not sure whether later versions handle this better or whether it's just my query that needs adjusting.
Edit: My desired output is:
{
"Part1": {
"Data1": "Data1",
"Data2": "Data2"
},
"Part2": {
"Text1": "Text1",
"Text2": "Text2"
}
}
According to the accepted answer to "FOR JSON PATH. how to not use escape characters" on the SQL Server forum on MSDN:
FOR JSON will escape any text unless it is generated as a JSON result by some JSON function/query. In your example, FOR JSON cannot know whether you really want raw JSON or you are just sending some free text that looks like JSON.
Properly defined JSON is generated with FOR JSON (unless it has the WITHOUT_ARRAY_WRAPPER option) or JSON_QUERY. If you wrap your JSON literal with JSON_QUERY, it will not be escaped.
This answer got me to try the following code:
DECLARE #Json nvarchar(max) =
(
SELECT
JSON_QUERY((
SELECT 'Data1' AS [Data1], 'Data2' AS [Data2]
FOR JSON PATH
, INCLUDE_NULL_VALUES
, WITHOUT_ARRAY_WRAPPER
)) AS [Part1]
,
JSON_QUERY((
SELECT 'Text1' AS [Text1], 'Text2' AS [Text2]
FOR JSON PATH
, INCLUDE_NULL_VALUES
, WITHOUT_ARRAY_WRAPPER
)) AS [Part2]
FOR JSON PATH
, WITHOUT_ARRAY_WRAPPER
);
SELECT #Json;
As it turns out, this works like a charm. Results:
{
"Part1": {
"Data1": "Data1",
"Data2": "Data2"
},
"Part2": {
"Text1": "Text1",
"Text2": "Text2"
}
}
DB<>Fiddle
Update
Look what I found buried in the official documentation:
To avoid automatic escaping, provide newValue by using the JSON_QUERY function. JSON_MODIFY knows that the value returned by JSON_QUERY is properly formatted JSON, so it doesn't escape the value.
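As a minimal sketch of that JSON_MODIFY behaviour (hypothetical sample values, not taken from the question):
DECLARE @j nvarchar(max) = N'{"Part1":null}';
-- Passed as plain text, the new value gets escaped into a string:
SELECT JSON_MODIFY(@j, '$.Part1', N'{"Data1":"Data1"}');
-- {"Part1":"{\"Data1\":\"Data1\"}"}
-- Wrapped in JSON_QUERY, it is inserted as a real JSON object:
SELECT JSON_MODIFY(@j, '$.Part1', JSON_QUERY(N'{"Data1":"Data1"}'));
-- {"Part1":{"Data1":"Data1"}}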
I create JSON like this to extract any table (the table name is decided "randomly" at runtime and held in the variable iv_table_name):
FIELD-SYMBOLS <itab> TYPE STANDARD TABLE.
DATA ref_itab TYPE REF TO data.
DATA(iv_table_name) = 'SCARR'.
CREATE DATA ref_itab TYPE STANDARD TABLE OF (iv_table_name).
ASSIGN ref_itab->* TO <itab>.
SELECT *
INTO TABLE <itab>
FROM (iv_table_name).
DATA results_json TYPE TABLE OF string.
DATA sub_json TYPE string.
DATA(lo_json_writer) = cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ).
CALL TRANSFORMATION id
SOURCE result = <itab>
RESULT XML lo_json_writer.
cl_abap_conv_in_ce=>create( )->convert(
EXPORTING
input = lo_json_writer->get_output( )
IMPORTING
data = sub_json ).
The result variable sub_json looks like this:
{"RESULT":
[
{"MANDT":"220","AUFNR":"0000012", ...},
{"MANDT":"220","AUFNR":"0000013", ...},
...
]
}
Is there a way to avoid the surrounding dictionary and get the result like this?
[
{"MANDT":"220","AUFNR":"0000012", ...},
{"MANDT":"220","AUFNR":"0000013", ...},
...
]
Background:
I used this:
sub_json = /ui2/cl_json=>serialize( data = <lt_result> pretty_name = /ui2/cl_json=>pretty_mode-low_case ).
But the performance of /ui2/cl_json=>serialize( ) is not good.
If you really want to use it just as a tool for extracting table records, then you could write your own transformation in transaction STRANS. It could look like this; let's name it Z_JSON_TABLE_CONTENTS (create it with type XSLT):
<xsl:transform version="1.0"
xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:sap="http://www.sap.com/sapxsl"
>
<xsl:output method="text" encoding="UTF-8" />
<xsl:strip-space elements="*"/>
<xsl:template match="RESULT">
[
<xsl:for-each select="*">
{
<xsl:for-each select="*">
"<xsl:value-of select="local-name()" />": "<xsl:value-of select="text()" />"<xsl:if test="position() != last()">,</xsl:if>
</xsl:for-each>
}<xsl:if test="position() != last()">,</xsl:if>
</xsl:for-each>
]
</xsl:template>
</xsl:transform>
Then you could use it like this:
REPORT ZZZ.
FIELD-SYMBOLS <itab> TYPE STANDARD TABLE.
DATA ref_itab TYPE REF TO data.
DATA(iv_table_name) = 'SCARR'.
CREATE DATA ref_itab TYPE STANDARD TABLE OF (iv_table_name).
ASSIGN ref_itab->* TO <itab>.
SELECT *
INTO TABLE <itab>
FROM (iv_table_name).
DATA results_json TYPE TABLE OF string.
DATA sub_json TYPE string.
DATA g_string TYPE string.
DATA(g_document) = cl_ixml=>create( )->create_document( ).
DATA(g_ref_stream_factory) = cl_ixml=>create( )->create_stream_factory( ).
DATA(g_ostream) = g_ref_stream_factory->create_ostream_cstring( g_string ).
CALL TRANSFORMATION Z_JSON_TABLE_CONTENTS
SOURCE result = <itab>
RESULT XML g_ostream.
DATA(g_json_parser) = new /ui5/cl_json_parser( ).
g_json_parser->parse( g_string ).
I've got no answer as to whether it's possible to omit the initial "RESULT" tag with full sXML, but my opinion is no.
Now, there's the solution following the KISS principle:
REPLACE ALL OCCURRENCES OF REGEX '^\{"RESULT":|\}$' IN sub_json WITH ``.
There's also this alternative form (slightly slower):
sub_json = replace( val = sub_json regex = '^\{"RESULT":|\}$' with = `` occ = 0 ).
ADDENDUM about performance:
I measured that for a string of 880K characters, the following code with the exact number of positions to remove (10 leading characters and 1 trailing character) is 6 times faster than regex (could vary based on version of ABAP kernel), but maybe it won't be noticeable compared to the rest of the program:
SHIFT sub_json LEFT BY 10 PLACES CIRCULAR.
REPLACE SECTION OFFSET strlen( sub_json ) - 11 OF sub_json WITH ``.
Just a bit of manual work and voila!
DATA(writer) = CAST if_sxml_writer( cl_sxml_string_writer=>create( type = if_sxml=>co_xt_json ) ).
DATA(components) =
CAST cl_abap_structdescr( cl_abap_typedescr=>describe_by_name( iv_table_name ) )->components.
writer->open_element( name = 'object' ).
LOOP AT <itab> ASSIGNING FIELD-SYMBOL(<line>).
LOOP AT components ASSIGNING FIELD-SYMBOL(<fs_comp>).
ASSIGN COMPONENT <fs_comp>-name OF STRUCTURE <line> TO FIELD-SYMBOL(<fs_val>).
writer->open_element( name = 'str' ).
writer->write_attribute( name = 'name' value = CONV string( <fs_comp>-name ) ).
writer->write_value( CONV string( <fs_val> ) ).
writer->close_element( ).
ENDLOOP.
ENDLOOP.
writer->close_element( ).
DATA(xml_json) = CAST cl_sxml_string_writer( writer )->get_output( ).
sub_json = cl_abap_codepage=>convert_from( source = xml_json codepage = `UTF-8` ).
No surrounding list and no dictionary. If you want each line in a separate dictionary, it is easily adjustable.
If you use the ID CALL TRANSFORMATION, then whatever node you give to the transformation will be added by default. You cannot skip this, but you can remove it in the following ways.
REPLACE: use a regex or the literal word with a REPLACE FIRST OCCURRENCE statement, then remove the last closing brace }. That is the way you did it.
FIND: You can simply use the statement below:
FIND REGEX '(\[.*\])' IN sub_json SUBMATCHES sub_json.
I have a PostgreSQL table called datasource with a jsonb column called config. It has the following structure:
{
"url":"some_url",
"password":"some_password",
"username":"some_username",
"projectNames":[
"project_name_1",
...
"project_name_N"
]
}
I would like to transform nested json array projectNames into a map and add a default value for each element from the array, so it would look like:
{
"url":"some_url",
"password":"some_password",
"username":"some_username",
"projectNames":{
"project_name_1": "value",
...
"project_name_N": "value"
}
}
I have selected projectNames from the table using the PostgreSQL jsonb operator config #> '{projectNames}', but I have no idea how to perform the transformation.
I think I should use something like jsonb_object_agg, but it converts all the data into a single row.
I'm using PostgreSQL 9.6.
You need to first unnest the array, then build a new JSON document from that. Then you can put that back into the column.
update datasource
set config = jsonb_set(config, '{projectNames}', t.map)
from (
select id, jsonb_object_agg(pn.n, 'value') as map
from datasource, jsonb_array_elements_text(config -> 'projectNames') as pn (n)
group by id
) t
where t.id = datasource.id;
The above assumes that there is a primary (or at least unique) column named id. The inner select transforms the array into a map.
Online example: http://rextester.com/GPP85654
Are you looking for something like this?
t=# with c(j) as (values('{
"url":"some_url",
"password":"some_password",
"username":"some_username",
"projectNames":[
"project_name_1",
"project_name_N"
]
}
'::jsonb))
, n as (select j,jsonb_array_elements_text(j->'projectNames') a from c)
select jsonb_pretty(jsonb_set(j,'{projectNames}',jsonb_object_agg(a,'value'))) from n group by j
;
jsonb_pretty
------------------------------------
{ +
"url": "some_url", +
"password": "some_password", +
"username": "some_username", +
"projectNames": { +
"project_name_1": "value",+
"project_name_N": "value" +
} +
}
(1 row)
Time: 19.756 ms
If so, look at:
https://www.postgresql.org/docs/current/static/functions-aggregate.html
https://www.postgresql.org/docs/current/static/functions-json.html
I am trying to use jsx to convert a list of tuples to a JSON object.
The list items are based on a record definition:
-record(player, {index, name, description}).
and looks like this:
[
{player,1,"John Doe","Hey there"},
{player,2,"Max Payne","I am here"}
]
The query function looks like this:
select_all() ->
SelectAllFunction =
fun() ->
qlc:eval(qlc:q(
[Player ||
Player <- mnesia:table(player)
]
))
end,
mnesia:transaction(SelectAllFunction).
What's the proper way to make it convertible to JSON, given that I know the schema of the record used and the structure of the tuples?
You'll have to convert the record into a term that jsx can encode to JSON correctly. Assuming you want an array of objects in the JSON for the list of player records, you'll have to convert each player to either a map or a list of tuples. You'll also have to convert the strings to binaries, or else jsx will encode them as lists of integers. Here's some sample code:
-record(player, {index, name, description}).
player_to_json_encodable(#player{index = Index, name = Name, description = Description}) ->
[{index, Index}, {name, list_to_binary(Name)}, {description, list_to_binary(Description)}].
go() ->
Players = [
{player, 1, "John Doe", "Hey there"},
% the following is just some sugar for a tuple like above
#player{index = 2, name = "Max Payne", description = "I am here"}
],
JSON = jsx:encode(lists:map(fun player_to_json_encodable/1, Players)),
io:format("~s~n", [JSON]).
Test:
1> r:go().
[{"index":1,"name":"John Doe","description":"Hey there"},{"index":2,"name":"Max Payne","description":"I am here"}]