I have the following values in a Postgres table which need to be converted to JSON:
"[{""name"": ""xtz "S" "", ""strength"": ""25 x/1""}]"
"[{""name"": ""xtz "", ""strength"": ""25 x/1""}]"
I tried:
::json - it gives an error because of these double quotes.
to_json() - it converts to JSON successfully, but the output is of no use: it has a lot of \" in it, so when I do json_col -> key_name I get all NULL values.
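Roughly what I ran (the table and column names here are placeholders):
SELECT raw_val::json FROM my_table;               -- errors because of the doubled quotes
SELECT to_json(raw_val) FROM my_table;            -- succeeds, but yields a JSON string full of \" escapes
SELECT to_json(raw_val) -> 'name' FROM my_table;  -- so key lookups come back NULL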
I have data of type JSON in MySQL, and the column containing the JSON data is IMEI_DATA.
IMEI_DATA: {"SUBSCRIBER_HISTORY": [{"IMEI": "12345678", "COUNTER": "1", "Service Flag": "7", "UPDATE_DATE_UNIX_TIME": "65667"}]}
I need to extract the IMEI field from it and tried a JSON_EXTRACT query like this:
SELECT json_extract(IMEI_DATA,"$.SUBSCRIBER_HISTORY.IMEI") FROM SUBSCRIBER_HISTORY_JSON
and it is giving the result as null.
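A self-contained reproduction of the attempt (the JSON literal stands in for my IMEI_DATA column):
SELECT JSON_EXTRACT(
  '{"SUBSCRIBER_HISTORY": [{"IMEI": "12345678", "COUNTER": "1", "Service Flag": "7", "UPDATE_DATE_UNIX_TIME": "65667"}]}',
  '$.SUBSCRIBER_HISTORY.IMEI'
) AS imei;
-- returns NULL, just like against the real column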
Hi all.
I have a query in an ExecuteSQLRecord processor which produces a JSON column in the result:
SELECT t.id, json_agg(json_build_object('value1', t.column1, 'value2', t.column2)) AS "test" FROM table t GROUP BY t.id
The processor uses a JSONRecordSetWriter to write the results. However, the processor returns a String from the database, which causes issues in further processing that expects a JSON value:
{
...
"test": "[{\"value1\": \"column1value\", \"value2\": \"column2value\"}, ...]"
}
Is there a way of turning it into a proper JSON value, i.e., into this format:
{
...
"test": [{"value1": "column1value", "value2": "column2value"}, ...]
}
without resorting to ReplaceText processors?
I have tried replacing the quotes in the SQL query, using jsonb_agg() and jsonb_build_object() (sketched below), and using a QueryDatabaseTableRecord processor instead, but none of these worked.
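The jsonb variant looked roughly like this (same placeholder table name as above):
SELECT t.id,
       jsonb_agg(jsonb_build_object('value1', t.column1, 'value2', t.column2)) AS "test"
FROM table t
GROUP BY t.id
and the "test" column still arrived as a String in the flow file.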
Thanks in advance.
In MySQL 5.7 I can compare two JSON objects with the order of the keys ignored, e.g. the following two JSON strings are equal:
SELECT CAST('{"num": "27,28", "date": "2019-11-01"}' AS JSON) = CAST('{"date": "2019-11-01", "num": "27,28"}' AS JSON);
In MariaDB there is no such JSON data type, so the above returns false.
Is there any way to accomplish the same in MariaDB?
Note: I've seen the following post, and its solution is not ideal: Compare JSON values in MariaDB
Hi, I am using phpMyAdmin with MySQL.
In one of my database tables I have a column contact_email, and the value is stored as below:
contact_email
["ass#sss.ib"]
This is JSON-encoded data, and I am trying to convert that encoded data into a string using a query.
I tried with JSON_EXTRACT but it gave an error.
The output I need is:
From ["ass#sss.ib"] to ass#sss.ib
Any help appreciated.
Simply use something like this:
var obj = jQuery.parseJSON( '{ "name": "John" }' );
alert( obj.name === "John" );
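If the decoding has to happen in MySQL itself rather than in JavaScript, JSON_EXTRACT plus JSON_UNQUOTE does the same job (a sketch; the table name is assumed):
SELECT JSON_UNQUOTE(JSON_EXTRACT(contact_email, '$[0]')) AS contact_email
FROM my_table;
-- ["ass#sss.ib"]  ->  ass#sss.ib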
I have a column of text type that contains a JSON value.
{
  "customer": [
    {
      "details": {
        "customer1": {
          "name": "john",
          "addresses": {
            "address1": {
              "line1": "xyz",
              "line2": "pqr"
            },
            "address2": {
              "line1": "abc",
              "line2": "efg"
            }
          }
        },
        "customer2": {
          "name": "robin",
          "addresses": {
            "address1": null
          }
        }
      }
    }
  ]
}
How can I extract the 'address1' JSON field of the column with a query?
First I am trying to fetch the JSON value; then I will go on to parsing.
SELECT JSON customer from text_column;
With my query, I get the following error:
com.datastax.driver.core.exceptions.SyntaxError: line 1:12 no viable
alternative at input 'customer' (SELECT [JSON] customer...)
Cassandra version 2.1.13
You can't use SELECT JSON in Cassandra v2.1.x / CQL v3.2.x.
For Cassandra v2.1.x / CQL v3.2.x, the only supported operations after SELECT are:
DISTINCT
COUNT (*)
COUNT (1)
column_name AS new_name
WRITETIME (column_name)
TTL (column_name)
dateOf(), now(), minTimeuuid(), maxTimeuuid(), unixTimestampOf(), typeAsBlob() and blobAsType()
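So on 2.1.x the closest you can get is selecting the raw text column and parsing it client-side (names from the question):
SELECT customer FROM text_column;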
Cassandra v2.2.x / CQL v3.3.x introduces SELECT JSON:
With SELECT statements, the new JSON keyword can be used to return each row as a single JSON-encoded map. The remainder of the SELECT statement behavior is the same.
The result map keys are the same as the column names in a normal result set. For example, a statement like “SELECT JSON a, ttl(b) FROM ...” would result in a map with keys "a" and "ttl(b)". However, there is one notable exception: for symmetry with INSERT JSON behavior, case-sensitive column names with upper-case letters will be surrounded with double quotes. For example, “SELECT JSON myColumn FROM ...” would result in a map key "\"myColumn\"" (note the escaped quotes).
The map values will be JSON-encoded representations of the result set values.
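For instance, on a 2.2+ / 3.3+ cluster the exact query from the question is accepted; a rough sketch of what it returns (values illustrative, names from the question):
SELECT JSON customer FROM text_column;
-- each row comes back as a single map, e.g. {"customer": "{\"customer\": [ ... ]}"}
-- note: a text column holding JSON is still returned as a JSON-encoded string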
If your Cassandra version is 2.1.x or below, you can use a Python-based approach.
Write a Python script using the Python driver for Cassandra.
Here you have to fetch your row first and then use Python's json.loads, which converts the JSON text column value into a JSON object (a dict in Python). Then you can work with the Python dictionary and extract the nested keys you need. See the code snippet below.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider
import json

if __name__ == '__main__':
    auth_provider = PlainTextAuthProvider(username='xxxx', password='xxxx')
    cluster = Cluster(['0.0.0.0'], port=9042, auth_provider=auth_provider)
    session = cluster.connect("keyspace_name")
    print("session created successfully")

    rows = session.execute('select * from user limit 10')
    for user_row in rows:
        # the text column holds JSON, so parse it into a dict
        customer_dict = json.loads(user_row.customer)
        # top-level keys of the parsed document; nested keys can be reached
        # the same way, e.g. customer_dict["customer"][0]["details"]
        print(customer_dict.keys())