Subquery for element of JSON column - json

I have a big JSON data in one column called response_return in a Postgres DB, with a response like:
{
  "customer_payment": {
    "OrderId": "123456789",
    "Customer": {
      "Full_name": "Francis"
    },
    "Payment": {
      "AuthorizationCode": "9874565",
      "Recurrent": false,
      "Authenticate": false,
      ...
    }
  }
}
I tried to use the Postgres operators ->, ->>, #> and #>> to walk through the keys and reach AuthorizationCode in a query.
When I use -> on customer_payment in a SELECT, it returns everything nested below it. If I try it with OrderId, NULL is returned.
The alternatives and sources:
Using The JSON Datatype In PostgreSQL
Operator ->
Allows you to select an element based on its name.
Allows you to select an element within an array based on its index.
Can be used sequentially: ::json->'elementL'->'subelementM'->…->'subsubsubelementN'.
Return type is json and the result cannot be used with functions and operators that require a string-based datatype. But the result can be used with operators and functions that require a json datatype.
Query for element of array in JSON column
This is not helpful because I don't want to filter, and I don't believe I need to transform the data into an array.

If you just want to get a single attribute, you can use:
select response_return -> 'customer_payment' -> 'Payment' ->> 'AuthorizationCode'
from the_table;
You need to use -> for the intermediate access to the keys (to keep the JSON type) and ->> for the last key to return the value as a string.
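A minimal sketch of the difference (reusing the table and column from the query above): ending the chain with -> keeps the JSON type, while ->> yields plain text.
select response_return -> 'customer_payment' -> 'Payment' ->  'AuthorizationCode' as as_json, -- json value: "9874565" (with quotes)
       response_return -> 'customer_payment' -> 'Payment' ->> 'AuthorizationCode' as as_text  -- text value: 9874565
from the_table;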
Alternatively you can provide the path to the element as an array and use #>>
select response_return #>> array['customer_payment', 'Payment', 'AuthorizationCode']
from the_table;
Online example
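As a usage sketch, the text form can also be compared directly in a WHERE clause (the value '9874565' is taken from the sample JSON above):
select *
from the_table
where response_return #>> array['customer_payment', 'Payment', 'AuthorizationCode'] = '9874565';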

Related

Unpacking JSON Into Flat Format

I have JSON that resembles the following:
{
  "ANNOTATIONS": [
    {
      "Label": "CommingledProduct",
      "Text": "NBP"
    },
    {
      "Label": "CommingledVenue",
      "Text": "OTC"
    }
  ]
}
I need to unpack this into a flat table with columns matching the annotation labels. So columns based on the above json become:
comingled_product
comingled_venue
The JSON is coming from a json field in a source table and being unpacked into another table.
I know that I could code as follows:
INSERT INTO my_target_table (comingled_product, comingled_venue)
SELECT
payload->'ANNOTATIONS'->0->>'Text',
payload->'ANNOTATIONS'->1->>'Text'
FROM my_source_table;
However, I would rather not use the ordinals of the annotations. I would prefer to use some syntax mirroring the pseudo-code below:
INSERT INTO my_target_table (comingled_product, comingled_venue)
SELECT
payload->'ANNOTATIONS'->'label="ComingledProduct"'->>'Text',
payload->'ANNOTATIONS'->'label="ComingledVenueID"'->>'Text'
FROM my_source_table;
Can anyone tell me if what I'm trying to achieve is possible and how to do it? There are more than the two annotations I have included in the sample, so anything that involves multiple joins is probably a no go.
Using Postgres 10.7
demo:db<>fiddle
WITH cte AS (
SELECT
elems.value
FROM
my_source_table,
json_array_elements(payload -> 'ANNOTATIONS') elems
)
SELECT
(SELECT value ->> 'Text' FROM cte WHERE value ->> 'Label' = 'CommingledProduct'),
(SELECT value ->> 'Text' FROM cte WHERE value ->> 'Label' = 'CommingledVenue')
This expands the array into one row per element and stores the result in a CTE for further use. The CTE can then be queried for the expected values without doing the expansion twice.
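To feed this into the target table from the question, the scalar subqueries can simply form the SELECT list of the INSERT (a sketch, reusing the question's table and column names):
INSERT INTO my_target_table (comingled_product, comingled_venue)
WITH cte AS (
    SELECT
        elems.value
    FROM
        my_source_table,
        json_array_elements(payload -> 'ANNOTATIONS') elems
)
SELECT
    (SELECT value ->> 'Text' FROM cte WHERE value ->> 'Label' = 'CommingledProduct'),
    (SELECT value ->> 'Text' FROM cte WHERE value ->> 'Label' = 'CommingledVenue');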
Could be a little bit faster:
demo:db<>fiddle
SELECT
payload,
MIN(the_text) FILTER (WHERE label = 'CommingledProduct'),
MIN(the_text) FILTER (WHERE label = 'CommingledVenue')
FROM (
SELECT
payload::text AS payload,
elems ->> 'Label' AS label,
elems ->> 'Text' AS the_text
FROM
my_source_table,
json_array_elements(payload -> 'ANNOTATIONS') elems
) s
GROUP BY payload
The answer from @S-Man is great and you should use that for your Postgres 10.7. jsonpath will be added in Postgres 12, which will allow you to do something a little bit closer to your pseudo-code, but only with jsonb (not json):
INSERT INTO my_target_table (comingled_product, comingled_venue)
SELECT jsonb_path_query(payload,
'$.ANNOTATIONS[*] ? (@.Label == "CommingledProduct")')->>'Text',
jsonb_path_query(payload,
'$.ANNOTATIONS[*] ? (@.Label == "CommingledVenue")')->>'Text'
FROM my_source_table;
The jsonb_path_query syntax takes a bit to figure out, but it is basically returning elements of the ANNOTATIONS array for which the Label equals either CommingledProduct or CommingledVenue. jsonb_path_query returns a jsonb object, so we can use the ->> operator to grab the value of 'Text' from the object.
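Since the payload column in the question is json rather than jsonb, a cast is needed before the path function can be applied; a sketch under that assumption:
SELECT jsonb_path_query(payload::jsonb,
    '$.ANNOTATIONS[*] ? (@.Label == "CommingledProduct")') ->> 'Text'
FROM my_source_table;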

Combine multiple JSON rows into one JSON object in PostgreSQL

I want to combine the following JSON from multiple rows into one single JSON object as a row.
{"Salary": ""}
{"what is your name?": ""}
{"what is your lastname": ""}
Expected output
{
"Salary": "",
"what is your name?": "",
"what is your lastname": ""
}
With only built-in functions, you need to expand the rows into key/value pairs and aggregate that back into a single JSON value:
select jsonb_object_agg(t.k, t.v)
from the_table, jsonb_each(ob) as t(k,v);
If your column is of type json rather than jsonb you need to cast it:
select jsonb_object_agg(t.k, t.v)
from the_table, jsonb_each(ob::jsonb) as t(k,v);
A slightly more elegant solution is to define a new aggregate that does that:
CREATE AGGREGATE jsonb_combine(jsonb)
(
SFUNC = jsonb_concat(jsonb, jsonb),
STYPE = jsonb
);
Then you can aggregate the values directly:
select jsonb_combine(ob)
from the_table;
(Again you need to cast your column if it's json rather than jsonb)
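For example, a sketch assuming ob is of type json:
select jsonb_combine(ob::jsonb)
from the_table;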
Online example

How do I search for a specific string in a JSON Postgres data type column?

I have a column named params in a table named reports which contains JSON.
I need to find which rows contain the text 'authVar' anywhere in the JSON array. I don't know the path or level in which the text could appear.
I want to just search through the JSON with a standard like operator.
Something like:
SELECT * FROM reports
WHERE params LIKE '%authVar%'
I have searched and googled and read the Postgres docs. I don't understand the JSON data type very well, and figure I am missing something easy.
The JSON looks something like this.
[
  {
    "tileId": 18811,
    "Params": {
      "data": [
        {
          "name": "Week Ending",
          "color": "#27B5E1",
          "report": "report1",
          "locations": {
            "c1": 0,
            "c2": 0,
            "r1": "authVar",
            "r2": 66
          }
        }
      ]
    }
  }
]
In Postgres 11 or earlier it is possible to recursively walk through an unknown json structure, but it would be rather complex and costly. I would propose the brute force method which should work well:
select *
from reports
where params::text like '%authVar%';
-- or
-- where params::text like '%"authVar"%';
-- if you are looking for the exact value
The query is very fast but may return unexpected extra rows in cases when the searched string is a part of one of the keys.
In Postgres 12+, recursive searching in jsonb is quite comfortable thanks to the new jsonpath feature.
Find a string value containing authVar:
select *
from reports
where jsonb_path_exists(params, '$.** ? (@.type() == "string" && @ like_regex "authVar")')
The jsonpath:
$.** find any value at any level (recursive processing)
? where
@.type() == "string" value is string
&& and
@ like_regex "authVar" value contains 'authVar'
Or find the exact value:
select *
from reports
where jsonb_path_exists(params, '$.** ? (@ == "authVar")')
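To see which values a path actually matches, rather than only testing for existence, jsonb_path_exists can be swapped for jsonb_path_query; a sketch (the ::jsonb cast is an assumption in case params is stored as json):
select jsonb_path_query(params::jsonb, '$.** ? (@.type() == "string" && @ like_regex "authVar")')
from reports;
-- returns "authVar" for the sample row above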
Read in the documentation:
The SQL/JSON Path Language
jsonpath Type

Regular expression to remove the key:value from hash / json

Suppose I have the following JSON and I want to remove the "data_type" entry from it.
{
"marketing_type":"FIT",
"controllable":"true",
"plannable":"true",
"sbm_qualified":"true",
"marginal_cost":"{:type=>\"float\", :label=>\"Marginal Cost to steer\",:unit=>\"$/MWh\", :default=>100} must be float.",
"data_type": "any_value",
"start_cost":"{:type=>\"float\", :label=>\"Start Cost\", :unit=>\"$\",:default=>0} must be float."
}
The expected output is the JSON above with the "data_type" entry removed.
Instead of using regex and string manipulation, if you're running at least MySQL 5.7 you can use one of the built-in JSON functions, json_remove:
update table_name set column_name = json_remove(column_name, "$.data_type")
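To preview the result before updating anything, the same function can be used in a plain SELECT (a sketch reusing the table and column names from the statement above):
select json_remove(column_name, '$.data_type') from table_name;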

How to get elements from Json array in PostgreSQL

I have searched quite a lot on this and still haven't found an answer. I'm using PostgreSQL. The column name is "sections" and the column type is json[] in the example below.
My column looks like this in the database:
sections
[{"name" : "section1",
"attributes": [{"attrkey1": "value1",
"attrkey2": "value2"},
{"attrkey3": "value3",
"attrkey4": "value4"}]
},
{"name" : "section2",
"attributes": [{"attrkey3": "value5",
"attrkey6": "value6"},
{"attrkey1": "value7",
"attrkey8": "value8"}]
}]
It's a JSON array and I want to get "attrkey3" in my result. For getting a particular key from JSON, I can use json_extract_path_text(json_column, 'json_property'), which works perfectly fine. But I have no idea how to get some property out of json[].
If I take the above example, I want the value of the property "attrkey2" to be shown in my result. I know it's an array, so it might work differently than usual, e.g. all the values of my array might act as different rows and I might have to write a subquery, but I have no idea how to do it.
Also, I can't hard-code an index and read the property of the JSON element at that particular position. My query will be generated dynamically, so I would never know how many elements are inside the json array.
I saw some static examples but don't know how to implement this in my case. Can someone tell me how to do this in a query?
I'm not sure whether you have a json[] (PostgreSQL array of json values) typed column, or a json typed column that holds a JSON array (as in your example).
In either case, you need to expand the array before querying it. For json[], use unnest(anyarray); for a JSON array inside a json typed column, use json_array_elements(json) (both rely on LATERAL joins -- they are implicit in my examples):
select t.id,
each_section ->> 'name' section_name,
each_attribute ->> 'attrkey3' attrkey3
from t
cross join unnest(array_of_json) each_section
cross join json_array_elements(each_section -> 'attributes') each_attribute
where (each_attribute -> 'attrkey3') is not null;
-- use "where each_attribute ? 'attrkey3'" in case of jsonb
select t.id,
each_section ->> 'name' section_name,
each_attribute ->> 'attrkey3' attrkey3
from t
cross join json_array_elements(json_array) each_section
cross join json_array_elements(each_section -> 'attributes') each_attribute
where (each_attribute -> 'attrkey3') is not null;
SQLFiddle
Unfortunately, you cannot use any index with your data. You need to fix your schema first, in order to do that.
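If fixing the schema is an option, one common direction (a sketch with assumed names, not part of the original answer) is a single jsonb column with a GIN index, which can support containment queries:
create table sections_fixed (
    id       integer primary key,
    sections jsonb   -- one JSON document per row instead of json[]
);
create index on sections_fixed using gin (sections jsonb_path_ops);
-- containment queries like this can then use the index:
select *
from sections_fixed
where sections @> '[{"attributes": [{"attrkey3": "value3"}]}]';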
If you wish to access a single element, use json_array -> index.
For example, if json_array is [1,2,3], then json_array -> 0 will return 1.
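A quick sketch of that:
select '[1,2,3]'::json -> 0;   -- returns 1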
Also, if the array contains key/value map data:
select each_attribute -> 'value' as value3
from t cross join jsonb_array_elements(t.sections -> 'attributes') each_attribute
where each_attribute -> 'key' = '"attrkey3"'
I am mentioning this because the great answer above also provided a perfect solution for my case. By the way, also be aware of the jsonb_array_elements() variant for jsonb-typed attributes.