How to store a JSON value in a Postgres column

I have a table with two columns: one should store an int and the other should store JSON.
Here is the data I want to store in the table:
id,polygon
1,"{""type"": ""Feature"",
""properties"": {
""stroke"": ""#555555"",
""stroke-width"": 2,
""stroke-opacity"": 1,
""fill"": ""#00aa22"",
""fill-opacity"": 0.5
},
""geometry"": {
""type"": ""Polygon"",
""coordinates"": [
[
[-76.97021484375,
40.17887331434696
],
[-74.02587890625,
39.842286020743394
],
[-73.4326171875,
41.713930073371294
],
[-76.79443359375,
41.94314874732696
],
[-76.97021484375,
40.17887331434696
]
]
]
}
}"
I tried storing it in Postgres as follows:
insert into gjl_polygon values(1,'"{""type"": ""Feature"",
""properties"": {""stroke"": ""#555555"", ""stroke-width"": 2,
""stroke-opacity"": 1, ""fill"": ""#00aa22"", ""fill-opacity"": 0.5},
""geometry"": {""type"": ""Polygon"", ""coordinates"":
[[[-76.97021484375,40.17887331434696],[-74.02587890625,39.842286020743394],
[-73.4326171875,41.713930073371294],[-76.79443359375,41.94314874732696],
[-76.97021484375,40.17887331434696]]]}}"');
I got the following error:
Expecting ':' delimiter: line 1 column 4 (char 3)

The problem with your code is the doubled double quotes (note also the stray " wrapping the whole object). Try editing the JSON like this:
{
"type": "Feature",
"properties": {
"stroke": "#555555",
"stroke-width": 2,
"stroke-opacity": 1,
"fill": "#00aa22",
"fill-opacity": 0.5
},
"geometry": {
"type": "Polygon",
"coordinates": [
[
[-76.97021484375,
40.17887331434696
],
[-74.02587890625,
39.842286020743394
],
[-73.4326171875,
41.713930073371294
],
[-76.79443359375,
41.94314874732696
],
[-76.97021484375,
40.17887331434696
]
]
]
}
}
The JSON above is a valid JSON string and it should work as expected.
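As a minimal sketch of the corrected statement (assuming a table definition along the lines of the hypothetical create table below, which the question does not show), the cleaned-up JSON can then be passed as an ordinary single-quoted string literal:

-- hypothetical table definition; the question does not show the real one
create table gjl_polygon (id int, polygon json);

-- the cleaned-up JSON goes in a single-quoted string; its double quotes
-- are plain JSON quotes and need no doubling inside a SQL string literal
insert into gjl_polygon values (1, '{"type": "Feature",
  "properties": {"stroke": "#555555", "stroke-width": 2,
    "stroke-opacity": 1, "fill": "#00aa22", "fill-opacity": 0.5},
  "geometry": {"type": "Polygon", "coordinates":
    [[[-76.97021484375, 40.17887331434696],
      [-74.02587890625, 39.842286020743394],
      [-73.4326171875, 41.713930073371294],
      [-76.79443359375, 41.94314874732696],
      [-76.97021484375, 40.17887331434696]]]}}');

Only single quotes would need doubling inside a standard SQL string literal; the JSON's double quotes pass through as-is.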

Related

Using Recursive feature while Flattening in Snowflake

I have a JSON string which needs to be parsed in order to retrieve particular values. Here is an example I am working with:
{
"assignable_type": "SHIPMENT",
"rule": {
"rules": [
{
"meta_data": {},
"rules": [
{
"op": "IN",
"target": "CLIENT_FID",
"type": "ARRAY_VALUE_ASSERTION",
"values": [
"flx::core:client:dbid/64171",
"flx::core:client:dbid/76049",
"flx::core:client:dbid/34040",
"flx::core:client:dbid/61806"
]
}
],
"type": "AND"
}
],
"type": "OR"
},
"type": "USER_DEFINED"
}
The goal is to get the values where "target" is "CLIENT_FID".
The expected output for this JSON file should be:
["flx::core:client:dbid/64171",
"flx::core:client:dbid/76049",
"flx::core:client:dbid/34040",
"flx::core:client:dbid/61806"]
Here, as we can see rules is a list of dictionaries, and we can have nested lists as seen in the example.
Similarly, we have another JSON file of the following type:
{
"assignable_type": "SHIPMENT",
"rule": {
"rules": [
{
"meta_data": {},
"rules": [
{
"op": "IN",
"target": "PORT_OF_ENTRY_FID",
"type": "ARRAY_VALUE_ASSERTION",
"values": [
"flx::core:port:dbid/566788",
"flx::core:port:dbid/566931",
"flx::core:port:dbid/561482"
]
}
],
"type": "AND"
},
{
"meta_data": {},
"rules": [
{
"op": "IN",
"target": "PORT_OF_LOADING_FID",
"type": "ARRAY_VALUE_ASSERTION",
"values": [
"flx::core:port:dbid/561465"
]
},
{
"op": "IN",
"target": "SHIPMENT_MODE",
"type": "ARRAY_VALUE_ASSERTION",
"values": [
0
]
},
{
"op": "IN",
"target": "CLIENT_FID",
"type": "ARRAY_VALUE_ASSERTION",
"values": [
"flx::core:client:dbid/28169"
]
}
],
"type": "AND"
}
],
"type": "OR"
},
"type": "USER_DEFINED"
}
For the second example, the expected output should be:
["flx::core:client:dbid/28169"]
As seen, we may need to read the values at different depths in the file. To address this, I used the following code:
/* first convert the string to a JSON object in cte1 */
with cte1 as (
    select to_json(json_string) as json_rep,
           parse_json(json_extract_path_text(json_rep, 'rule.rules')) as list_elem
    from table1),
cte2 as (
    select split_array,
           json_extract_path_text(split_array, 'target') as target_client
    from (
        select json_rep,
               list_elem,
               t.value as split_array,
               typeof(split_array) as obj_type,
               index
        from cte1,
             table(flatten(cte1.list_elem, recursive=>true)) as t) temp /* use the recursive feature */
    where split_array ilike '%"target":"client_fid"%' /* keep only rows containing this string */
      and obj_type = 'OBJECT')
select split_array,
       json_extract_path_text(split_array, 'values') as client_values
from cte2
where target_client = 'CLIENT_FID'; /* filter to the dictionary containing the client fid */
To handle the varying depth at which CLIENT_FID is found, we recurse while flattening the string into rows. The output obtained for both of the above inputs is provided below.
For the first string we get the actual output in client_values as:
["flx::core:client:dbid/64171",
"flx::core:client:dbid/76049",
"flx::core:client:dbid/34040",
"flx::core:client:dbid/61806"]
Similarly, for the second string we get the actual output as:
["flx::core:client:dbid/28169"]
As seen, the code seems to work and produces the correct output, but the way I filter in the final query on target_client = 'CLIENT_FID' seems very hacky. Is there a better approach to retrieving the client fid values when they can occur at varying depths in the input?
Help is appreciated.
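One possible way to avoid the string-based filter, sketched here against the same hypothetical table1/json_string names used above (and assuming json_string holds the raw JSON text), is to keep the recursive flatten but compare the target field of each flattened VARIANT directly:

select t.value:"values" as client_values
from table1,
     lateral flatten(input => parse_json(json_string):rule:rules,
                     recursive => true) t
where typeof(t.value) = 'OBJECT'             /* keep only the rule dictionaries */
  and t.value:target::string = 'CLIENT_FID'; /* structured predicate instead of ilike on serialized text */

This collapses the ilike pre-filter and the final equality check into one predicate on the flattened value, while recursive => true still handles the varying depth.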

Issue with cts.jsonPropertyScopeQuery and cts.jsonPropertyValueQuery with data types and field order

I have MarkLogic 9 as my database.
I have created the following documents in my database:
test1.json
{
"users": [
{
"userId": "A",
"value": 0
}
]
}
test2.json
{
"users": [
{
"userId": "A",
"value": "0"
}
]
}
test3.json
{
"users": [
{
"value": 0,
"userId": "A"
}
]
}
test4.json
{
"users": [
{
"value": "0",
"userId": "A"
}
]
}
I ran the following queries and recorded the results:
cts.uris("", null, cts.jsonPropertyScopeQuery(
"users",
cts.andQuery(
[
cts.jsonPropertyValueQuery('userId', "A"),
cts.jsonPropertyValueQuery('value', "0"),
]
)
))
Result: test2.json, test4.json
cts.uris("", null, cts.jsonPropertyScopeQuery(
"users",
cts.andQuery(
[
cts.jsonPropertyValueQuery('userId', "A"),
cts.jsonPropertyValueQuery('value', 0),
]
)
))
Result: test3.json
I was wondering why test1.json was not returned by the 2nd query while test3.json was. They both have the same values for the fields, just in a different order. The field order also differs between test2.json and test4.json, yet the first query returned both documents. The only difference between the two pairs that I can think of is that the field "value" appears with two data types, integer and string.
How would I go about resolving this issue?
https://docs.marklogic.com/cts.jsonPropertyValueQuery shows that the value to match can be given as an array.
If you want to keep the variants in the data, you could try something on the query side like cts.jsonPropertyValueQuery('value', ["0", 0])

Change subelement with jq

I have a structure that looks like this:
[
[
{
"ID": "grp1-001",
},
{
"ID": "grp1-002",
},
{
"ID": "grp1-003",
},
{
"ID": "grp1-004",
},
{
"ID": "grp1-005",
},
{
"ID": "grp1-006",
}
],
[
{
"ID": "grp2-001",
},
{
"ID": "grp2-002",
},
{
"ID": "grp2-003",
},
{
"ID": "grp2-004",
},
{
"ID": "grp2-005",
},
{
"ID": "grp2-006",
}
.......
What I need to get as a result of the modification is this:
[
[
["1", "grp1-001"],
["2", "grp1-002"],
["3", "grp1-003"],
["4", "grp1-004"],
["5", "grp1-005"],
["6", "grp1-006"],
],
[
["1", "grp2-001"],
["2", "grp2-002"],
["3", "grp2-003"],
["4", "grp2-004"],
["5", "grp2-005"],
["6", "grp2-006"],
],
Which means I need to keep the external structure (an outer array with an inner grouping) but convert the inner dicts to arrays and replace the "ID" key with a value (which will come from an external source like --argjson). I am not even sure how to start; any ideas/resources are highly appreciated.
Assuming you're just taking the objects and transforming them to pairs of the index in the array and the ID value, you could do this:
map([to_entries[] | [.key + 1, .value.ID | tostring]])
https://jqplay.org/s/RBac7SPfdG
Using to_entries/0 on an array gives you an array of key/value (index/value) pairs. You could then shift the indices by 1 and convert to strings.

How to remove and update one object of a child object in JSON using MySQL

Given the following JSON stored inside a MySQL json data type:
var data = ' [
{
"key": 1,
"step": 6,
"param": [
{"key_1": "test1"},
{"key_2": "test2"},
{"key_3": "test3"}
]
},
{
"key": 4,
"step": 8,
"param": [
{"key_4": "test4"},
{"key_5": "test5"}
]
}
]';
I need to remove one key/value pair in the param object and also update the stored data in MySQL using one query.
Note: I don't know the value the key is equal to; I only have the key name key_1 and want to remove {"key_1":"test1"}.
OUTPUT
[
{
"key": 1,
"step": 6,
"param": [
{"key_2": "test2"},
{"key_3": "test3"}
]
},
{
"key": 4,
"step": 8,
"param": [
{"key_4": "test4"},
{"key_5": "test5"}
]
}
]
Have you tried the function JSON_REMOVE in your attempts to achieve what you want?
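For instance, a minimal sketch assuming a hypothetical table mytable with the JSON stored in a column named data, and assuming {"key_1": "test1"} sits at the first position of the first object's param array:

-- remove the first element of the first object's param array,
-- but only where that element actually contains the key "key_1"
UPDATE mytable
SET data = JSON_REMOVE(data, '$[0].param[0]')
WHERE JSON_CONTAINS_PATH(data, 'one', '$[0].param[0].key_1');

If the element's position inside param is not fixed, its path would have to be located first (for example with JSON_TABLE in MySQL 8), since JSON_SEARCH matches values rather than key names.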

Read JSON File for Records using Linq

Following is my JSON file. I have to get the fields mentioned for each page and for each type as a comma-separated string. Please help with how to proceed using Linq.
Example: if I want "Type = customFields" defined for "page1", I have to get the output as comma-separated ProjectID,EmployeeID,EmployeeName,HasExpiration etc.
{
"Fields": {
"Pages": {
"Page": {
"-Name": "page1",
"Type": [
{
"-TypeID": "CUSTOMIZEDFIELDS",
"Field": [
"ProjectID",
"EmployeeID",
"EmployeeName",
"HasExpiration",
"EndDate",
"OTStrategy",
"Division",
"AddTimesheets",
"SubmitTimesheets",
"ManagerTimesheetApprovalRequired",
"OTAllowed",
"AddExpenses",
"SubmitExpenses",
"ManagerExpenseApprovalRequired",
"SendApprovalEmails"
]
},
{
"-TypeID": "CFDATASET",
"Field": [
"ProjectID",
"EmployeeID",
"EmployeeName",
"HasExpiration",
"EndDate",
"OTStrategy",
"Division",
"AddTimesheets",
"SubmitTimesheets",
"ManagerTimesheetApprovalRequired",
"OTAllowed",
"AddExpenses",
"SubmitExpenses",
"ManagerExpenseApprovalRequired",
"SendApprovalEmails"
]
},
{
"-TypeID": "CFDATASETCAPTION",
"Field": [
"ProjectID",
"EmployeeID",
"EmployeeName",
"HasExpiration",
"EndDate",
"OTStrategy",
"Division",
"AddTimesheets",
"SubmitTimesheets",
"ManagerTimesheetApprovalRequired",
"OTAllowed",
"AddExpenses",
"SubmitExpenses",
"ManagerExpenseApprovalRequired",
"SendApprovalEmails"
]
}
]
}
}
}
}