IN statement with mysql JSON column not working?

I am trying to query a MySQL table on a JSON column, but it is not working as expected:
JSON_EXTRACT(data, "$.STUDENT_NAME") IN ('JASMINE','SAM')
whereas JSON_EXTRACT(data, "$.STUDENT_NAME") IN ('JASMINE') works with only one element, it does not work with multiple elements in the array. Any ideas?
Complete query:
SELECT `data_uploads`.* FROM `data_uploads` WHERE `data_uploads`.`product_id` = 96 AND (dlname = 'STUDENT' AND JSON_EXTRACT(data, "$.STUDENT_NAME") IN ('JASMINE', 'SAM'))
Expected results:
product_id: 96, dlname: "STUDENT", data: {"CLASS"=>"GRADE-I", "GRAD_IMAGE"=>"jasmine_grad.jpg", "SECTION"=>"A", "STUDENT_IMAGE"=>"jasmine.jpg", "STUDENT_NAME"=>"JASMINE"}, created_at: "2021-06-18 10:16:56", updated_at: "2021-06-18 10:16:56"
product_id: 96, dlname: "STUDENT", data: {"CLASS"=>"GRADE-I", "GRAD_IMAGE"=>"sam_grad.jpg", "SECTION"=>"A", "STUDENT_IMAGE"=>"sam.jpg", "STUDENT_NAME"=>"SAM"}, created_at: "2021-06-18 10:16:56", updated_at: "2021-06-18 10:16:56"
But the above query returns an empty array.
Rails ActiveRecord query:
DataUpload.where(product_id: 96).where("dlname = 'STUDENT' AND JSON_EXTRACT(data, '$.STUDENT_NAME') IN ('JASMINE','SAM')")

Use JSON_UNQUOTE() to remove the quotes around the value before testing it.
SELECT *
FROM data_uploads
WHERE JSON_UNQUOTE(JSON_EXTRACT(data, "$.STUDENT_NAME")) IN ('JASMINE','SAM');
You can also use the ->> shorthand to extract and unquote at once.
SELECT *
FROM data_uploads
WHERE data->>"$.STUDENT_NAME" IN ('JASMINE','SAM');
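Applied to the complete query from the question, the fix would look something like this (a sketch reusing the question's table and column names):
SELECT `data_uploads`.* FROM `data_uploads`
WHERE `data_uploads`.`product_id` = 96
  AND dlname = 'STUDENT'
  AND data->>"$.STUDENT_NAME" IN ('JASMINE', 'SAM');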

Related

Count the number of arrays in json with a MySQL select statement

How can I count the number of arrays in json with a MySQL select statement?
For example, in the following case, I want 2 to be returned.
sample
+----+------------------------------------------------------+
| id | json                                                 |
+----+------------------------------------------------------+
|  1 | { items: [{name: a, age: 20}, {name: b, age: 30}] } |
...
I was able to get the contents with json_extract, but I want to count the number of elements:
select json_extract(json, '$.items')
from sample
where id = 1
select json_array_length(json_extract(json, '$.items')) as size
from sample
where id = 1
json_array_length() is used to count the size of a JSON array.
You can use the JSON_LENGTH function, which is available as of MySQL 5.7:
SELECT JSON_EXTRACT(json, '$.items'),
       JSON_LENGTH(json, '$.items')
FROM sample
WHERE id = 1
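As a quick sanity check, running JSON_LENGTH against a literal document with a two-element array returns 2 (assuming MySQL 5.7+, where JSON_LENGTH accepts a path argument):
SELECT JSON_LENGTH('{"items": [{"name": "a", "age": 20}, {"name": "b", "age": 30}]}', '$.items') AS size;
-- returns 2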
Here is a trick to count: use a combination of the LENGTH() and REPLACE() functions.
SELECT id, json,
       ROUND((LENGTH(json) - LENGTH(REPLACE(json, 'name', ''))) / 4, 0) AS array_count
FROM (
  SELECT 1 AS id, '{ items: [{name: a, age: 20}, {name: b, age: 30}] }' AS json
) tmp
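The hard-coded divisor 4 only works because 'name' is four characters long; dividing by LENGTH('name') instead keeps the formula in sync if the key changes (a sketch over the same sample row):
SELECT id, json,
       (LENGTH(json) - LENGTH(REPLACE(json, 'name', ''))) / LENGTH('name') AS array_count
FROM (
  SELECT 1 AS id, '{ items: [{name: a, age: 20}, {name: b, age: 30}] }' AS json
) tmp;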

Conditionally select values from an array in a nested JSON string in a Mysql database

I am struggling to conditionally extract values from a nested JSON string in a MySQL table.
{"users": [{"userId": "10000001", "userToken": "11000000000001", "userTokenValidity": 1}, {"userId": "10000002", "userToken": "12000000000001", "userTokenValidity": 1}, {"userId": "10000003", "userToken": "13000000000001", "userTokenValidity": 0}]}
I want to select a userToken but only if the userTokenValidity is 1. So in this example only "11000000000001" and "12000000000001" should get selected.
This will extract the whole array ... how should I filter the result?
SELECT t.my_column->>"$.users" FROM my_table t;
SELECT CAST(value AS CHAR) output
FROM test
CROSS JOIN JSON_TABLE(test.data, '$.users[*]' COLUMNS (value JSON PATH '$')) jsontable
WHERE value->>'$.userTokenValidity' = 1
https://dbfiddle.uk/?rdbms=mysql_8.0&fiddle=4876ec22a9df4f6d2e75a476a02a2615
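To return just the tokens, the relevant fields can also be mapped as typed columns inside JSON_TABLE, so the WHERE clause filters on plain values (a sketch assuming the same test table and data column used in the fiddle):
SELECT jt.userToken
FROM test
CROSS JOIN JSON_TABLE(
  test.data,
  '$.users[*]' COLUMNS (
    userToken VARCHAR(32) PATH '$.userToken',
    userTokenValidity INT PATH '$.userTokenValidity'
  )
) jt
WHERE jt.userTokenValidity = 1;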

Convert pandas columns to comma separated lists to be used in sql statements

I have a dataframe and I am trying to turn the column into a comma separated list. The end goal is to pass this comma separated list as a list of filtered items in a SQL query.
How do I go about doing this?
import pandas as pd

mydata = [{'id': 'jack', 'b': 87, 'c': 1000},
          {'id': 'jill', 'b': 55, 'c': 2000},
          {'id': 'july', 'b': 5555, 'c': 22000}]
df = pd.DataFrame(mydata)
df
Expected solution - note the quotes around the ids since they are strings, and no quotes around the items in column 'b' since that is a numerical field, which is how SQL expects them. I would then eventually send a query like:
select * from mytable where ids in (my_ids) or values in (my_values)
my_ids = 'jack', 'jill','july'
my_values = 87,55,5555
I encountered a similar issue and solved it in one line using .values and .tolist():
df['col_name'].values.tolist()
So in your case, it will be
my_ids = df['id'].values.tolist()    # ['jack', 'jill', 'july']
my_values = df['b'].values.tolist()  # [87, 55, 5555]
Let's use apply with the argument reduce=False, then check the dtype of each series and apply the proper join:
df.apply(lambda x: ', '.join(x.astype(str)) if x.dtype == 'int64' else ', '.join("'" + x.astype(str) + "'"), reduce=False)
Output:
b 87, 55, 5555
c 1000, 2000, 22000
id 'jack', 'jill', 'july'
dtype: object
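Interpolating those lists back into the placeholder query from the question would produce something like the following (illustrative only; mytable, ids and values are the question's placeholder names, and `values` is quoted since it clashes with an SQL keyword):
SELECT * FROM mytable
WHERE ids IN ('jack', 'jill', 'july')
   OR `values` IN (87, 55, 5555);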

How to get the number of elements in a JSON array stored as CLOB with Oracle 12c?

I'm storing a Java class A as A_DOC in a CLOB column in my database.
The structure of A is like:
{
  id: 123,
  var1: abc,
  subvalues: [
    {
      id: 1,
      value: a
    },
    {
      id: 1,
      value: b
    },
    ...
  ]
}
I know I can do things like
select json_query(a.A_DOC, '$.subvalues.value') from table_name a;
and so on, but I'm looking for a way to count the number of elements in the subvalues array through an SQL query. Is this possible?
The size() item method is only available as of Oracle 18:
SELECT json_query('[19, 15, [16,2,3]]','$[*].size()' WITH ARRAY WRAPPER) FROM dual;
SELECT json_value('[19, 15, [16,2,3]]','$.size()') FROM dual;
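Applied to the column from the question, that would be something along these lines (a sketch assuming Oracle 18c+ and the A_DOC / table_name names from the question):
SELECT json_value(a.A_DOC, '$.subvalues.size()') AS subvalue_count
FROM table_name a;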
You can use JSON_TABLE:
SELECT
  id, var1, count(sub_id) subvalues
FROM
  JSON_TABLE (
    to_clob('{ "id": 123, "var1": "abc", "subvalues": [{ "id": 1, "value": "a" }, { "id": 2, "value": "b" } ]}'),
    '$'
    COLUMNS (
      id NUMBER PATH '$.id',
      var1 VARCHAR2 PATH '$.var1',
      NESTED PATH '$.subvalues[*]'
        COLUMNS (
          sub_id NUMBER PATH '$.id'
        )
    )
  )
GROUP BY id, var1

Sort by length of nested JSON array

Let's assume I have a PostgreSQL table with the following schema:
id: 1,
attribute_a: 'value'
attribute_b: 'value'
data: { attribute_c: 'value', array_of_values: [1,2,3] }
where data is stored in a JSON structure. Is it possible to order the rows in the table by the length of array_of_values?
Use json_array_length() or jsonb_array_length(), as Eggplant commented. Assuming jsonb:
SELECT *
FROM tbl
ORDER BY jsonb_array_length(data -> 'array_of_values')
BTW, the syntax for your JSON value should be:
{"attribute_c": "value", "array_of_values": [1, 2, 3]}