Oracle json_transform replace particular value on any level - json

I need to replace some values in an Oracle 19 column that is a CLOB with an IS JSON constraint, but I cannot find the right way to do it. The JSON can be big, and I suspect a regex function on the string would not work due to length restrictions.
My JSON may have a 'locationId' property at any level. Let's say I have the bad value 000 and want to replace it with 111, so that everywhere "locationId":"000" appears it becomes "locationId":"111".
I'm trying to use the following json_transform command:
SELECT json_transform('
{
"a": {
"locationId":"000"
},
"b": {
"locationId":"111",
"x": {
"locationId":"000"
}
}
}', REPLACE '$..locationId?(# == "000")' = '111' IGNORE ON MISSING)
FROM dual
But it returns the JSON unchanged, while the following json_query against the same JSON fetches the values to replace correctly:
SELECT json_query('
{
"a": {
"locationId":"000"
},
"b": {
"locationId":"111",
"x": {
"locationId":"000"
}
}
}', '$..locationId?(# == "000")' WITH WRAPPER)
FROM dual
Result:
["000","000"]
There is no documentation or example of how to use filters within json_transform, so I'm not sure it's even possible.
Does anybody know how to do it? It doesn't matter whether it's done with the JSON functions or not.
Links I use:
Oracle Json Path Expressions
Oracle json_query
Oracle json_transform

Have you seen that even
REPLACE '$..locationId' = '111' IGNORE ON MISSING
doesn't change anything? (At least on Cloud 21c.)

It's more complicated: you have to spell out each nesting level explicitly:
SELECT json_transform('
{
"locationId" : "000",
"a": {
"locationId":"000"
},
"b": {
"locationId":"000",
"x": {
"locationId":"000",
"y": {
"locationId":"000"
}
}
}
}'
, REPLACE '$.locationId?(# == "000")' = '111' IGNORE ON MISSING
, REPLACE '$.*.locationId?(# == "000")' = '111' IGNORE ON MISSING
, REPLACE '$.*.*.locationId?(# == "000")' = '111' IGNORE ON MISSING
, REPLACE '$.*.*.*.locationId?(# == "000")' = '111' IGNORE ON MISSING
)
FROM dual;
Gives:
{"locationId":"111","a":{"locationId":"111"},"b":{"locationId":"111","x":{"locationId":"111","y":{"locationId":"111"}}}}

Related

How can I use the oracle REGEXP_SUBSTR to extract specific json values?

I have some columns in my Oracle database that contain JSON, and to extract their data in a query I use REGEXP_SUBSTR.
In the following example, value is a column in the table DOSSIER that contains JSON. The regex extracts the value of the property client.reference in that JSON:
SELECT REGEXP_SUBSTR(value, '"client"(.*?)"reference":"([^"]+)"', 1, 1, NULL, 2) FROM DOSSIER;
So if the JSON looks like this:
[...],
"client": {
"someproperty":"123",
"someobject": {
[...]
},
"reference":"ABCD",
"someotherproperty":"456"
},
[...]
The SQL query will return ABCD.
My problem is that some JSON documents have multiple instances of "client", for example:
[...],
"contract": {
"client":"Name of the client",
"supplier": {
"reference":"EFGH"
}
},
[...],
"client": {
"someproperty":"123",
"someobject": {
[...]
},
"reference":"ABCD",
"someotherproperty":"456"
},
[...]
You see the issue: now the SQL query will return EFGH, which is the supplier's reference.
How can I make sure that "reference" is contained in the JSON object "client"?
EDIT: I'm on Oracle 11g, so I can't use the JSON API, and I would like to avoid using a third-party package.
If you are using Oracle 12c or later, you should NOT use regular expressions; you should use Oracle's JSON functions.
If you have the table and data:
CREATE TABLE table_name ( value CLOB CHECK ( value IS JSON ) );
INSERT INTO table_name (
value
) VALUES (
'{
"contract": {
"client":"Name of the client",
"supplier": {
"reference":"EFGH"
}
},
"client": {
"someproperty":"123",
"someobject": {},
"reference":"ABCD",
"someotherproperty":"456"
}
}'
);
Then you can use the query:
SELECT JSON_VALUE( value, '$.client.reference' ) AS reference
FROM table_name;
Which outputs:
REFERENCE
ABCD
db<>fiddle here
If you are using Oracle 11 or earlier, you could use the third-party PLJSON package to parse JSON in PL/SQL (see, for example, this question).
Or enable Java within the database, use CREATE JAVA (or the loadjava utility) to load a Java class that can parse JSON, and then wrap it in an Oracle function and call that.
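For the 11g case, a rough, untested sketch using PLJSON's object API (the pljson and pljson_value types shown in the project's examples) might look like this; verify the constructor and method names against the PLJSON version you actually install:
DECLARE
  doc    pljson;
  client pljson;
  ref    VARCHAR2(100);
BEGIN
  -- parse the document, then walk down to client.reference
  doc    := pljson('{"client": {"someproperty": "123", "reference": "ABCD"}}');
  client := pljson(doc.get('client'));
  ref    := client.get('reference').get_string();
  DBMS_OUTPUT.PUT_LINE(ref);  -- ABCD
END;
/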
I faced a similar issue recently. If "reference" is a property that is only present inside the "client" object, this will solve it:
SELECT reference FROM (
SELECT DISTINCT
REGEXP_SUBSTR(
DBMS_LOB.SUBSTR(
value,
4000
),
'"reference":"(.+?)"',
1, 1, 'c', 1) reference
FROM DOSSIER
) WHERE reference IS NOT null;
You can also adapt the regex to your needs.
Edit:
In my case the column type is CLOB, which is why I use the DBMS_LOB.SUBSTR function there. You can remove it and pass the column directly to REGEXP_SUBSTR.
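For what it's worth, a sketch of that simplification for a plain VARCHAR2 column (same DOSSIER and value names as in the question) would be:
SELECT reference FROM (
  SELECT DISTINCT
         REGEXP_SUBSTR(value, '"reference":"(.+?)"', 1, 1, 'c', 1) reference
  FROM DOSSIER
) WHERE reference IS NOT NULL;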

Is it possible to do nested search in jsonpath-ng?

Source "mapping.json":
{
"result": {
"src_color": "test_rule_2"
},
"rules": {
"color_degree": {
"test_rule_1": {
"color": 1
},
"test_rule_2": {
"color": 2
}
}
}
}
So it works perfectly:
with open("mapping.json", 'r') as json_file:
mapping = json.load(json_file)
expression = parse('$.rules.color_degree.test_rule_2.color')
match = expression.find(mapping)
if match:
pprint(match[0].value)
but in the path "test_rule_2" I need to replace with the value from result->src_color
How to properly describe something like this:
expression = parse('$.rules.color_degree.($.result.src_color.value).color')
If I understand your question correctly, it can be done, but you need two steps:
# first: get the variable
src_expression = parse('$.result.src_color')
src_match = src_expression.find(mapping)

# second: get the target
expression = parse(f'$.rules.color_degree.{src_match[0].value}.color')
match = expression.find(mapping)
if match:
    print(match[0].value)
Output is 2.
EDIT:
I don't know why someone would want to do it in one step, but it's possible:
parse(f'$.rules.color_degree.{parse("$.result.src_color").find(mapping)[0].value}.color').find(mapping)[0].value
Same output.

How can Postgres extract parts of json, including arrays, into another JSON field?

I'm trying to convince PostgreSQL 13 to pull out parts of a JSON field into another field, including a subset of properties within an array based on a discriminator (type) property. For example, given a data field containing:
{
"id": 1,
"type": "a",
"items": [
{ "size": "small", "color": "green" },
{ "size": "large", "color": "white" }
]
}
I'm trying to generate new_data like this:
{
"items": [
{ "size": "small" },
{ "size": "large"}
]
}
items can contain any number of entries. I've tried variations of SQL like:
UPDATE my_table
SET new_data = (
CASE data->>'type'
WHEN 'a' THEN
json_build_object(
'items', json_agg(json_array_elements(data->'items') - 'color')
)
ELSE
null
END
);
but I can't seem to get it working. In this case, I get:
ERROR: set-returning functions are not allowed in UPDATE
LINE 6: 'items', json_agg(json_array_elements(data->'items')...
I can get a set of items using json_array_elements(data->'items') and thought I could roll this up into a JSON array using json_agg and remove unwanted keys using the - operator. But now I'm not sure if what I'm trying to do is possible. I'm guessing it's a case of PEBCAK. I've got about a dozen different types each with slightly different rules for how new_data should look, which is why I'm trying to fit the value for new_data into a type-based CASE statement.
Any tips, hints, or suggestions would be greatly appreciated.
One way is to handle the set json_array_elements() returns in a subquery.
UPDATE my_table
SET new_data = CASE
WHEN data->>'type' = 'a' THEN
(SELECT json_build_object('items',
json_agg(jae.item::jsonb - 'color'))
FROM json_array_elements(data->'items') jae(item))
END;
db<>fiddle
Also note that - isn't defined for json, only for jsonb, so unless your columns are actually jsonb you need a cast. And you don't need an explicit ELSE NULL in a CASE expression; NULL is already the result when no ELSE branch is given.
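To illustrate that last point, a standalone probe (assuming PostgreSQL 13) shows the difference:
-- the key-deletion operator is defined for jsonb only
SELECT '{"size":"small","color":"green"}'::jsonb - 'color';   -- {"size": "small"}
-- the same expression with ::json fails with something like:
--   ERROR: operator does not exist: json - unknown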

Mysql JSON_EXTRACT with double quotes in path not working

I have the following table def:
CREATE TABLE `TestInfo` (
  `Info` json DEFAULT NULL
);
I am inserting two rows with JSON values.
INSERT INTO `TestInfo` (`Info`)
VALUES
('{
"statusCode": 200,
"result": {
"summary": {
"area": 0.0009904206008286565
}
}
} '
);
INSERT INTO `TestInfo` (`Info`)
VALUES
(
'{
"statusCode": 200,
"result": {
"summary": {
"area": 0.0009904206008286565,
"realty-society": {
"price-min": {
"property": "price-min",
"min": 110000.00000000001,
"max": 150000000,
"average": 31184468.085106384,
"sum": 1465670000
}
}
}
}
} '
);
When I run the query:
SELECT JSON_EXTRACT(Info, '$.result.summary')
FROM TestInfo ;
it returns the 2 rows. This is fine.
But when I run the same query with double quotes around the path like this:
SELECT JSON_EXTRACT(Info, '$."result.summary"')
FROM TestInfo;
it returns the 2 rows (with the single column) as NULLs.
Ultimately I need to use double quotes for keys that have a hyphen (dash) in them.
I am using MySQL 5.7 on AWS.
Please help.
Don't put double quotes around the whole path, just around a specific property name that contains special characters, e.g.
SELECT JSON_EXTRACT(Info, '$.result.summary."realty-society"."price-min"')
FROM TestInfo
Your code makes . part of the literal property name, rather than a separator between properties. You would use that form if you had:
"result.summary": ...
in the object.
This should work if your MySQL version is >= 5.7:
SELECT Info->>"$.result.summary"
FROM TestInfo ;

How to get id from text

I have this value as text:
{
"4384": {
"idRoomSv": 4384,
"NumRoom": 2,
"RoomId": 269
}
}
I want to get RoomId. It should return: 269.
Can you help me? Thank you very much!
If you have a recent version of MariaDB or MySQL, you can use the JSON_EXTRACT function.
Edit: try the code below in your SQL client:
SET @json = '{
"4384": {
"idRoomSv": 4384,
"NumRoom": 2,
"RoomId": 269
}
}';
SELECT JSON_EXTRACT(@json, '$.*.RoomId');
And the result is:
JSON_EXTRACT(@json, '$.*.RoomId')
[269]
The JSON_EXTRACT function accepts a JSON document as the first parameter. The second parameter is a JSONPath expression (see the example after this list):
$ : the root element
* : wildcard = all elements regardless of their names
RoomId : the value of that field
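If you know the outer key and want the bare scalar instead of the [269] array wrapper, a small variant (still assuming MySQL 5.7+ or a recent MariaDB) would be:
-- address the known key directly; it is quoted because it is not a plain identifier
SELECT JSON_EXTRACT(@json, '$."4384".RoomId');
-- returns: 269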