MySQL JSON with arbitrary keys to table - mysql

There is a map nested in a large JSON payload, like:
{
  "map": {
    "key1": "value1",
    "key2": "value2",
    "key3": "value3"
  },
  // more stuff
}
I would like to generate a table like this:
+------+--------+
| Key  | Value  |
+------+--------+
| key1 | value1 |
| key2 | value2 |
| key3 | value3 |
+------+--------+
The only thing I can think of is writing a stored function that loops over JSON_KEYS to convert all key/value pairs into
[{"key":"key1", "value":"value1"}, {"key":"key2", "value":"value2"}, ...]
which makes the task trivial with JSON_TABLE.
Is there a faster and more elegant way?

Here's a solution:
-- json_keys() returns the keys of $.map as a JSON array, which json_table() expands into one row per key
select j.key,
       json_unquote(json_extract(m.data, concat('$.map.', j.key))) as value
from mytable as m
cross join json_table(
  json_keys(m.data, '$.map'),
  '$[*]' columns (`key` varchar(10) path '$')
) as j
Output with your sample data:
+------+--------+
| key | value |
+------+--------+
| key1 | value1 |
| key2 | value2 |
| key3 | value3 |
+------+--------+
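For completeness, a minimal setup that reproduces this output; the table name mytable and the column name data come from the query above, while the DDL itself is an assumption (JSON_TABLE requires MySQL 8.0 or later):
create table mytable (data json);
-- one row holding the sample payload from the question
insert into mytable (data) values
  ('{"map": {"key1": "value1", "key2": "value2", "key3": "value3"}}');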
If that query seems inelegant or hard to maintain, you're probably right. You shouldn't store data in JSON if you want simple or elegant queries.

You're doing well; even for enterprise projects, this is the way it's done.

Related

JSON_CONTAINS() with an array of JSON objects in MySQL

This is a sample table 'test' with a JSON column 'arr' containing an array of JSON objects:
+----+----------------------------------------------------------+
| id | arr |
+----+----------------------------------------------------------+
| 1 | [{"name": "aman"}, {"name": "jay"}] |
| 2 | [{"name": "yash"}, {"name": "aman"}, {"name": "jay"}] |
+----+----------------------------------------------------------+
I want to use JSON_CONTAINS to know if a value exists in a specific key of an object in the array.
Here's my query:
SELECT JSON_CONTAINS(arr, '"jay"', '$[*].name') from test WHERE id=1;
I get the following error:
ERROR 3149 (42000): In this situation, path expressions may not contain the * and ** tokens or an array range.
I know that I can try using JSON_EXTRACT() for this, but what am I doing wrong here?
Is there any way to use JSON_CONTAINS with an array of JSON objects in MySQL?
Yes, it is possible using the following syntax:
SELECT JSON_CONTAINS(arr, '{"name": "jay"}') from test WHERE id=1;
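The example below was presumably produced by a query along these lines (the r column alias and the absence of a WHERE clause are assumptions):
-- flag each row according to whether its array contains an object with name = "jay"
SELECT id, arr, JSON_CONTAINS(arr, '{"name": "jay"}') AS r FROM test;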
Example:
+-----+--------------------------------------------------------+---+
| id | arr | r |
+-----+--------------------------------------------------------+---+
| 1 | [{"name": "aman"}, {"name": "jay"}] | 1 |
| 2 | [{"name": "yash"}, {"name": "aman"}, {"name": "jay"}] | 1 |
| 3 | [{"name": "yash"}, {"name": "aman"}] | 0 |
+-----+--------------------------------------------------------+---+
Alternatively, you can use JSON_SEARCH:
SELECT JSON_SEARCH(arr, 'one', 'jay', NULL, '$[*].name') IS NOT NULL
FROM test
WHERE id=1;
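Either predicate can also be used in a WHERE clause to filter the rows themselves rather than just flag them; a sketch against the same table:
-- keep only the rows whose array contains an object with name = "jay"
SELECT id, arr FROM test WHERE JSON_CONTAINS(arr, '{"name": "jay"}');
-- the JSON_SEARCH variant works the same way
SELECT id, arr FROM test WHERE JSON_SEARCH(arr, 'one', 'jay', NULL, '$[*].name') IS NOT NULL;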

Append data to JSON array in Redshift

I have a Redshift table in which one of the columns holds JSON containing an array. I would like to append some data to that array. E.g.:
id | col1 | col2
1 | A | {"key": []}
2 | B | {"key": []}
3 | B | {"key": ['A']}
4 | B | {"key": ['A', 'B']}
I would like to create a statement like UPDATE <table> SET col2 = <something> where col1 = 'B' so that I get:
id | col1 | col2
1 | A | {"key": []}
2 | B | {"key": ['C']}
3 | B | {"key": ['A', 'C']}
4 | B | {"key": ['A', 'B', 'C']}
You'd have to write your own User Defined Function (UDF), passing in the current value of the column and the element you would like to add, then passing back the result.
However, you really should avoid JSON columns in Amazon Redshift if at all possible. They cannot take advantage of all the features that make Redshift great (columnar storage, SORTKEY, etc.). Plus, you'll run into problems like this that are not in the normal realm of a relational database.
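A minimal sketch of such a UDF, assuming a scalar Python UDF is acceptable and the stored value is valid JSON (double-quoted strings, unlike the single quotes shown above); the function and table names are made up for illustration:
CREATE OR REPLACE FUNCTION f_append_to_key(doc varchar(65535), elem varchar(256))
RETURNS varchar(65535)
STABLE
AS $$
    import json
    d = json.loads(doc)     # parse the stored JSON document
    d['key'].append(elem)   # append the new element to the "key" array
    return json.dumps(d)
$$ LANGUAGE plpythonu;
UPDATE my_table SET col2 = f_append_to_key(col2, 'C') WHERE col1 = 'B';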

Postgres update JSON field

I've got several Postgres 9.4 tables that contain data like this:
| id | data |
|----|-------------------------------------------|
| 1 | {"user": "joe", "updated-time": 123} |
| 2 | {"message": "hi", "updated-time": 321} |
I need to transform the JSON column into something like this:
| id | data |
|----|--------------------------------------------------------------|
| 1 | {"user": "joe", "updated-time": {123, "unit":"millis"}} |
| 2 | {"message": "hi", "updated-time": {321, "unit":"millis"}} |
Ideally it would be easy to apply the transformation to multiple tables. Tables that contain the JSON key data->'updated-time' should be updated, and ones that do not should be skipped. Thanks!
You can use the || operator to merge two jsonb objects together.
select '{"foo":"bar"}'::jsonb || '{"baz":"bar"}'::jsonb;
= {"baz": "bar", "foo": "bar"}

Clean way to query complex JSON in Postgresql

I have JSON data stored in a JSONB field in my postgresql 9.5 DB.
Is there a way of turning sub-objects into columns without knowing in advance which key holds a sub-object?
JSON example in question:
{
  "a": 1,
  "b": [1, 2, 3],
  "c": "bar",
  "d": {
    "key1": "value1",
    "key2": "value2"
  }
}
I can use the following to get all of the keys from a JSON object.
SELECT * FROM json_object_keys('{"a":1,"b":[1,2,3],"c":"bar", "d":{"key1":"value1", "key2":"value2"}}')
At that point I can then use json_to_record(), but I would like to split that column out into its own separate fields.
select * from json_to_record('{"a":1,"b":[1,2,3],"c":"bar", "d":{"key1":"value1", "key2":"value2"}}') as x(a int, b text, c text, d text)
gets me
 a |    b    |  c  |                  d
---+---------+-----+------------------------------------
 1 | [1,2,3] | bar | {"key1":"value1", "key2":"value2"}
Is there a way to get something like this back, preferably in a single query?
 a |    b    |  c  |                  d                  |  key1  |  key2
---+---------+-----+-------------------------------------+--------+--------
 1 | [1,2,3] | bar | {"key1":"value1", "key2":"value2"}  | value1 | value2
WITH t(v) AS ( VALUES
  ('{
      "a": 1,
      "b": [1, 2, 3],
      "c": "bar",
      "d": {
        "key1": "value1",
        "key2": "value2"
      }
    }'::JSONB)
)
-- expand the top-level keys and the nested "d" object side by side
SELECT x1.*, x2.*
FROM t,
     jsonb_to_record(v) AS x1(a int, b text, c text, d jsonb),
     jsonb_to_record(v->'d') AS x2(key1 text, key2 text);
Result:
a | b | c | d | key1 | key2
---+-----------+-----+--------------------------------------+--------+--------
1 | [1, 2, 3] | bar | {"key1": "value1", "key2": "value2"} | value1 | value2
(1 row)
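If the keys under "d" are not known in advance (the original ask), one option is to expand them into rows instead of columns with jsonb_each(); a sketch reusing the same sample value:
WITH t(v) AS ( VALUES
  ('{"a":1, "b":[1,2,3], "c":"bar", "d":{"key1":"value1", "key2":"value2"}}'::JSONB)
)
-- one output row per key under "d" (key1/value1, key2/value2), rather than one column per key
SELECT x1.*, e.key, e.value
FROM t,
     jsonb_to_record(v) AS x1(a int, b text, c text, d jsonb),
     jsonb_each(v->'d') AS e(key, value);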

MySQL dynamic key/pairs converted to single record sets

I'm sure my question is misleading, so allow me to demonstrate my challenge. I have a table that holds dynamically built data (the user sets up field names and then enters data against them):
Table
KEY | VALUE | PERSON
key1 | value1 | personA
key2 | value2 | personA
key3 | value3 | personA
key1 | value1 | personB
key2 | value2 | personB
key3 | value3 | personB
I need this to be changed to the following as a query so I can filter a search on these records:
Dynamically created table for querying
PERSON | Key1 | Key2 | Key3
personA | value1 | value2 | value3
personB | value1 | value2 | value3
Please provide me with a MySQL query to produce the result above. NB: the keys are dynamically created by the user, so there may be many more or fewer of them.
As pointed out above, the term I was looking for was a pivot, which MySQL does not support natively. I will therefore pull the single-row records into my script and build an object from them before returning the result to the user. That seems to be the easiest way.
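For reference, when the key names are known (or are generated into the SQL by that script), the pivot can be emulated with conditional aggregation; a sketch, with the table name kv assumed:
-- one row per person, one column per known key
SELECT person,
       MAX(CASE WHEN `key` = 'key1' THEN `value` END) AS key1,
       MAX(CASE WHEN `key` = 'key2' THEN `value` END) AS key2,
       MAX(CASE WHEN `key` = 'key3' THEN `value` END) AS key3
FROM kv
GROUP BY person;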