What form should JSON data take in a MySQL field?
I need user data (user_id is the key) with three values (three pieces of information: name, age, sex):
145:"name,age,sex",
148:"name,age,sex",
200:"name,age,sex"
I am using MySQL version 5.6 and the data will be inserted with SQL code.
Is it correct to store it this way in MySQL, to retrieve it with PHP and json_decode?
[{236:"paul,26,1"},{2515:"fred,42,1"},{2515:"jane,21,0"}]
Thank you.
Do you ever want to be able to write a query like select * from users where sex=1? If so, don't store the JSON as text. Store each value (id, name, age, sex) in a column of its own.
Even if you do want to store the JSON as a string, it would probably be better organised like this:
[
{"id":236,"name":"paul","age":26,"sex":1},
{"id":2515,"name":"fred","age":42,"sex":1},
{"id":2516:"jane","age":21,"sex":0}
]
You would need to manipulate it a bit after querying, but you would have more meaningful data.
But if all you want is to store that text, so you can retrieve it as text later, then what you have is fine.
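For reference, here is a minimal sketch of that column-per-value approach (table name, column names, and types are illustrative assumptions, not part of the question):

-- Hypothetical schema: one column per value instead of a JSON blob
CREATE TABLE users (
  user_id INT UNSIGNED NOT NULL PRIMARY KEY,
  name VARCHAR(100) NOT NULL,
  age TINYINT UNSIGNED NOT NULL,
  sex TINYINT(1) NOT NULL
);

INSERT INTO users (user_id, name, age, sex) VALUES
  (236, 'paul', 26, 1),
  (2515, 'fred', 42, 1),
  (2516, 'jane', 21, 0);

-- Queries like the one mentioned above now work directly:
SELECT * FROM users WHERE sex = 1;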
With data in this form
[{236:"paul,26,1"},{2515:"fred,42,1"},{2515:"jane,21,0"}]
and json_decode($myfield);
I get NULL
If I echo $myfield I get exactly what is stored in the database.
What is wrong with this:
[{236:"paul,26,1"},{2515:"fred,42,1"},{2515:"jane,21,0"}]
If I parse it at http://json.parser.online.fr/ I get a syntax error.
I apologize in advance if this is very simple and I am just missing it.
Would any of you know how to put custom attributes as column headers? I currently have a simple opt-in survey on Connect, and I would like to have each of the four items as column headers with the score in the table results. I pull the data using an ODBC connection to Excel, so ideally I would like to just add this onto the end of my current table, if I can figure out how to do it.
This is how it currently looks in the output:
{"effortscore":"5","promoterscore":"5","satisfactionscore":"5","survey_opt_in":"True"}
If you have any links or resources I can follow to improve my knowledge, that would be appreciated.
Thanks in advance
There are multiple options to query data in JSON format in Athena, and based on your use case (data source, query frequency, query destination, etc.) you can choose what makes more sense.
String Column + JSON functions
This is usually the most straightforward option and a good starting point. You define the survey_output as a string column, and when you need to extract the specific attributes from the JSON string, you can apply the JSON functions in Trino/Athena: https://trino.io/docs/current/functions/json.html. For example:
SELECT
id,
json_query(
survey_output,
'lax $.satisfactionscore'
) AS satisfactionscore
FROM customers
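Note that json_query returns the matched value as JSON text. If you want the bare scalar value (to cast or compare it, for example), Trino/Athena also provide json_extract_scalar; a sketch against the same assumed table:

SELECT
  id,
  json_extract_scalar(survey_output, '$.satisfactionscore') AS satisfactionscore
FROM customers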
String Column + JSON functions + View
Another way to simplify access to the data, without json_query calls in every query, is to define a VIEW on that table using the json_query syntax in the view creation. The view is defined once by a DBA, and when users query the data, they see the columns they care about. For example:
CREATE VIEW survey_results AS
SELECT
id,
json_query(
survey_output,
'lax $.satisfactionscore'
) AS satisfactionscore
FROM customers;
With such dynamic view creation, you have more flexibility in what data will be easily exposed to the users.
Create a Table with STRUCT
Another option is to create the external table from the data source (files in S3, for example) with the STRUCT definition.
CREATE EXTERNAL TABLE survey (
id string,
survey_results struct<
effortscore:string,
promoterscore:string,
satisfactionscore:string,
survey_opt_in:string
>
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://<YOUR BUCKET HERE>/<FILES>'
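Once the table is defined this way, the nested attributes can be addressed with plain dot notation. A sketch against the table above:

SELECT
  id,
  survey_results.satisfactionscore,
  survey_results.survey_opt_in
FROM survey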
I have a table in AWS Glue, and the crawler has defined one field as array.
The content is in S3 files in JSON format.
The table is TableA, and the field is members.
There are a lot of other fields such as strings, booleans, doubles, and even structs.
I am able to query them all using a simple query such as:
SELECT
content.my_boolean,
content.my_string,
content.my_struct.value
FROM schema.tableA;
The issue is when I add content.members into the query.
The error I get is: [Amazon](500310) Invalid operation: schema "content" does not exist.
Content exists, because I am able to select other fields from the main key in the JSON (content).
It is probably something related to how to query an array field in Spectrum.
Any idea?
You have to alias the table to extract the fields from the external schema:
SELECT
a.content.my_boolean,
a.content.my_string,
a.content.my_struct.value
FROM schema.tableA a;
I had the same issue with my data. I really don't know why it needs this alias, but it works. If you need to access the elements of an array, you have to explode it, like:
SELECT member.<your-field>
FROM schema.tableA a, a.content.members AS member;
You need to create a Glue Classifier.
Select JSON as Classifier type
and for the JSON Path input the following:
$[*]
then run your crawler. It will infer your schema and populate your table with the correct fields instead of just one big array. Not sure if this was what you were looking for but figured I'd drop this here just in case others had the same problem I had.
I want to save a Ruby hash in the database, like this (the metrics table has name and values fields):
metrics.create("Registered", '{"Gender": "Male", "Age": 21}')
I want the query should run like this.
select count(*) from metrics where name = 'Registered' and values.age > 20
To the best of my knowledge this will not work, but is there any possibility to achieve it?
If the database used is PostgreSQL, you can have a json column named values and then use json_extract_path_text(values, 'age') to extract 'age'.
Reference for json_extract_path_text can be seen here
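Putting that together, a sketch of what the desired query could look like in PostgreSQL (note that values is a reserved word, so the column must be quoted as an identifier, and the extracted text is cast to an integer; table and column names are taken from the question):

SELECT count(*)
FROM metrics
WHERE name = 'Registered'
  AND json_extract_path_text("values", 'age')::int > 20;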
So I need to order by a percentage computed from two values of one field where the data is stored as JSON.
The field where the values are stored is named program_invested_details, and an example value is:
{"invested":"120.00","received":"1.08"}
I need $query to be (received * 100 / invested) from that field:
SELECT *, ($query) AS PERCENT_TOTAL
FROM programs_list
WHERE program_add_status = 4 AND program_status = 1
ORDER BY PERCENT_TOTAL DESC
How is it possible to do this?
By default MySQL does not have any ability to parse a JSON string.
One option would be to use an extension such as common_schema which would add the ability to parse JSON and extract fields (see get_option). I am not sure of the performance hit you would take with this extension.
Another option would be to query all the data and parse the JSON in your client program. Once again, there would be significant performance impact if there is a lot of data.
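For completeness: as of MySQL 5.7.8 there is a native JSON type and built-in JSON functions (see the later answer below), so on a modern server a sketch like this would work without any extension (assuming program_invested_details holds JSON like the example above):

SELECT *,
  (JSON_UNQUOTE(JSON_EXTRACT(program_invested_details, '$.received')) * 100
   / JSON_UNQUOTE(JSON_EXTRACT(program_invested_details, '$.invested'))) AS percent_total
FROM programs_list
WHERE program_add_status = 4 AND program_status = 1
ORDER BY percent_total DESC;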
Say I have a text field with JSON data like this:
{
"id": {
"name": "value",
"votes": 0
}
}
Is there a way to write a query which would find id and then increment the votes value?
I know I could just retrieve the JSON data, update what I need, and reinsert the updated version, but I wonder: is there a way to do this without running two queries?
UPDATE `sometable`
SET `somefield` = JSON_REPLACE(`somefield`, '$.id.votes', JSON_EXTRACT(`somefield` , '$.id.votes')+1)
WHERE ...
Edit
As of MySQL 5.7.8, MySQL supports a native JSON data type that enables efficient access to data in JSON documents.
JSON_EXTRACT will allow you to access a particular JSON element in a JSON field, while JSON_REPLACE will allow you to update it.
To specify the JSON element you wish to access, use a string with the format
'$.[top element].[sub element].[...]'
So in your case, to access id.votes, use the string '$.id.votes'.
The SQL code above demonstrates putting all this together to increment the value of a JSON field by 1.
I think for a task like this you're stuck using a plain old SELECT followed by an UPDATE (after you parse the JSON, increment the value you want, and then serialize the JSON back).
You should wrap these operations in a single transaction, and if you're using InnoDB then you might also consider using SELECT ... FOR UPDATE: http://dev.mysql.com/doc/refman/5.0/en/innodb-locking-reads.html
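A sketch of that read-modify-write pattern, reusing the hypothetical table and column names from the snippet above (the row id and the new JSON value are illustrative; the decode/increment/encode step happens client-side between the two statements):

START TRANSACTION;

-- Lock the row so no concurrent session can change it between the read and the write
SELECT somefield FROM sometable WHERE id = 1 FOR UPDATE;

-- (client side: parse the JSON, increment votes, serialize it back)

UPDATE sometable
SET somefield = '{"id": {"name": "value", "votes": 1}}'
WHERE id = 1;

COMMIT;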
This is sort of a tangent, but I thought I'd also mention that this is the type of operation that a NoSQL database like MongoDB is quite good at.