My CSV text file has column data of the following kind; it is JSON with the quotes escaped:
{\"code\":\"SOURCE\",\"display\":\"NPPES\"}]}
I am not able to query this column in Drill as JSON using dot notation. Is there a JSON_UNQUOTE() function in Apache Drill that will solve this problem?
apache drill (dfs.tmp)> select hl7s.hl7_bundle.code from hl7s limit 1;
+--------+
| EXPR$0 |
+--------+
| null |
+--------+
First you need to convert this column to JSON; consider using the convert_fromJSON function:
select convert_fromJSON('{"num": 55, "nan": NaN, "inf": -Infinity}');
https://drill.apache.org/docs/data-type-conversion/#convert_to-and-convert_from
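Applied to the column from the question, that would look roughly like the sketch below (names taken from the question; depending on how the CSV reader presents the value, the backslash escapes may need to be stripped first, e.g. with regexp_replace):
select t.j.code
from (select convert_fromJSON(hl7_bundle) j from hl7s) t
limit 1;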
How can I access the status and date values stored in a JSON column?
Please have a look at an example row below.
{"1":{"status":true,"date":"2022-03-30"},"3":{"status":true,"date":"2022-03-30"}}
Demo:
set @j = '{"1":{"status":true,"date":"2022-03-30"},"3":{"status":true,"date":"2022-03-30"}}';
select json_extract(@j, '$."1".status') as status;
+--------+
| status |
+--------+
| true |
+--------+
In this case, it may be unexpected that you need to put double quotes around "1" to use it as a key in the JSON path.
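The same pattern works for the other key and field in the example row, e.g. (a quick sketch against the demo variable above):
select json_extract(@j, '$."1".date') as date_1,
       json_extract(@j, '$."3".status') as status_3;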
I have a table that contains a JSON column, and in it a JSON array:
mysql> SELECT profile->'$.countriesVisited' from users;
+-------------------------------+
| profile->'$.countriesVisited' |
+-------------------------------+
| ["us", "il"] |
| ["co", "ph"] |
+-------------------------------+
2 rows in set (0.00 sec)
I want to convert the values inside the array to upper case. (I am assuming this answer would also help with lower case, string replacements, etc.)
I've been trying to use UPPER, JSON_ARRAY, JSON_QUOTE, JSON_UNQUOTE, etc., but at best I end up with a string representation of what I want.
How can I do this? I'm running MySQL 5.7.19.
You need to use JSON casting. Try the following:
UPDATE users
SET profile = JSON_SET(
    profile,
    '$.countriesVisited',
    CAST(
        UPPER(profile->'$.countriesVisited')
        AS JSON
    )
);
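To verify, re-running the original query should now return the uppercased arrays (e.g. ["US", "IL"] and ["CO", "PH"]):
SELECT profile->'$.countriesVisited' FROM users;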
Given a table in PostgreSQL, defined approximately as follows:
Column | Type | Modifiers | Storage | Stats target | Description
-------------+-----------------------------+-----------+----------+--------------+-------------
id | character varying | not null | extended | |
answers | json | | extended | |
We accidentally did a number of inserts into this table of doubly-encoded JSON objects, i.e. the JSON value is a string that is itself a JSON-encoded object -- for example:
"{\"a\": 1}"
We'd like to find a query that would convert these values to the JSON objects they represent, for example:
{"a": 1}
We can easily select the bad values by doing:
SELECT * FROM table WHERE json_typeof(answers) = 'string'
but we are having trouble coming up with a way to parse the JSON in PSQL.
Unfortunately, there is no string-extraction function for the json[b] types directly, but you can work around this by embedding the value inside a JSON array and using the ->> operator for string extraction at array index 0:
UPDATE table
SET answers = (CONCAT('[', answers::text, ']')::json ->> 0)::json
WHERE json_typeof(answers) = 'string'
This should work with older PostgreSQL versions too (9.3). On newer versions (9.4+), you could use the json_build_array() function instead.
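For reference, a sketch of that 9.4+ variant, using the same placeholder table and column names as above:
UPDATE table
SET answers = (json_build_array(answers) ->> 0)::json
WHERE json_typeof(answers) = 'string'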
I have an SQL database field that stores JSON-type data:
-----------------------------
id | tags |
-----------------------------
1 | ['cat','dog'] |
2 | ['lion','cat','dog'] |
I want to select from this table by passing 'cat' in the WHERE condition and get all the matching rows with their JSON fields. How would I do this?
Use the JSON_EXTRACT function, available as of MySQL 5.7.8. Excerpt from
https://dev.mysql.com/doc/refman/5.7/en/json.html#json-paths
A JSON path expression selects a value within a JSON document.
Path expressions are useful with functions that extract parts of or modify a JSON document, to specify where within that document to operate. For example, the following query extracts from a JSON document the value of the member with the name key:
mysql> SELECT JSON_EXTRACT('{"id": 14, "name": "Aztalan"}', '$.name');
+---------------------------------------------------------+
| JSON_EXTRACT('{"id": 14, "name": "Aztalan"}', '$.name') |
+---------------------------------------------------------+
| "Aztalan" |
+---------------------------------------------------------+
Note that in the example they are creating the JSON object inline, so in your case '{"id": 14, "name": "Aztalan"}' would actually be your column value.
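A minimal sketch with a column in place of the literal (it assumes a table named animals_tbl with a JSON column tags, as in the answer below, and that tags holds valid JSON arrays such as ["cat","dog"]):
SELECT id, JSON_EXTRACT(tags, '$[0]') AS first_tag
FROM animals_tbl;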
JSON manipulation in MySQL is slow. I suggest that you use REGEXP.
-- get all rows having the "cat" value
select * from animals_tbl
where tags regexp 'cat';
-- get all rows based on multiple filters
select * from animals_tbl
where tags regexp 'lion|cat';
I am using Apache Drill to run SQL queries on an HBase table. The value in one of the columns is:
0: jdbc:drill:schema:hbase:zk=localhost> select cast(address['street'] as varchar(20)) from hbase.students;
+------------+
| EXPR$0 |
+------------+
| {"id": 123} |
+------------+
1 row selected (0.507 seconds)
I would like to access the id field using a query. Something like:
0: jdbc:drill:schema:hbase:zk=localhost> select tbl.address['street']['id'] from hbase.students as tbl;
+------------+
| EXPR$0 |
+------------+
| null |
+------------+
As you can see, this does not work. I am able to run similar queries on JSON data in a file. My question is: can I query JSON data in HBase?
OK. I found the answer to this question, in case someone else has the same requirement.
The first step is to convert the HBase data to JSON using the built-in convert_from() function. A view can be created against which the queries can be run.
> create or replace view Street as select convert_from(Students.address.street, 'JSON') json from hbase.students as Students;
Then, run query against the view
> select * from Street;
> select Street.json.id from Street;
You can also use a subquery to convert the data in your HBase column into JSON:
select t.json.id
from (select convert_from(Students.address.street, 'JSON') json
      from hbase.students as Students) t;