MySQL UDF for working with JSON?

Are there any good UDFs in MySQL for working with JSON data that support retrieving a particular value by a dot-notation key (e.g. json_get('foo.bar.baz')) as well as setting the value of a particular key (e.g. json_set('foo.bar.baz', 'value'))?
I found http://www.mysqludf.org/lib_mysqludf_json/ - but it only seems to provide the ability to build JSON structures from non-JSON column values, as opposed to reading and writing JSON column values.

This UDF is able to parse JSON and return the value of an attribute:
https://github.com/kazuho/mysql_json
This other one too: https://github.com/webaroo/mysql-json-udf
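For illustration, here is a hedged sketch of how the kazuho/mysql_json UDF could be used once the shared library is built and installed. The function name json_get comes from that project; the library file name, the table/column names, and the exact signature are assumptions, so check the project's README before relying on this:

-- Register the UDF (the .so name is an assumption; use whatever the build produces)
CREATE FUNCTION json_get RETURNS STRING SONAME 'libmysql_json.so';

-- Extract a nested value; path elements appear to be passed as separate
-- arguments rather than as a single dot-notation string
SELECT json_get(doc, 'foo', 'bar', 'baz') AS baz
FROM documents        -- hypothetical table with a TEXT column `doc` holding JSON
WHERE id = 1;

Whether either project also provides a json_set-style writer is something to verify against their documentation.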

Related

Postgres jsonb column: how to maintain JSON object parameter order

I have a use case where I initially generate a hash of a JSON object created in Java, then insert the hash and the JSON object into a Postgres table as a String and jsonb respectively.
At regular intervals I need to validate the JSON object that was saved: I fetch the JSON object from the jsonb column, generate a hash from it, and compare it with the hash generated initially. The two hashes are now different.
The reason is that the order of parameters in the JSON object at retrieval differs from the order at insertion, so I end up with two different hashes for the same data.
Please suggest a solution.
Use the json type instead of jsonb. From the PostgreSQL documentation on the JSON types:
Because the json type stores an exact copy of the input text, it will preserve semantically-insignificant white space between tokens, as well as the order of keys within JSON objects. Also, if a JSON object within the value contains the same key more than once, all the key/value pairs are kept. (The processing functions consider the last value as the operative one.) By contrast, jsonb does not preserve white space, does not preserve the order of object keys, and does not keep duplicate object keys. If duplicate keys are specified in the input, only the last value is kept.
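A minimal sketch of the difference, using made-up table and column names:

CREATE TABLE t (as_jsonb jsonb, as_json json);
INSERT INTO t VALUES ('{"b": 1, "a": 2}', '{"b": 1, "a": 2}');

SELECT as_jsonb::text, as_json::text FROM t;
-- as_jsonb comes back normalized:        {"a": 2, "b": 1}
-- as_json comes back exactly as stored:  {"b": 1, "a": 2}

Since the hash was computed over the original text, hashing the json column's text should match again, whereas jsonb may reorder keys and strip whitespace.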

Sending raw JSON to GraphQL mutation

I need to upload through a GraphQL mutation an object whose schema is not known at design time. Basically it is a raw JSON blob that may or may not come from other, possibly non-GraphQL services. I am using JSON scalar from https://github.com/graphql-java/graphql-java-extended-scalars:
scalar JSON
input MyInput {
  jsonField: JSON
}
It works nicely except when the input blob contains keys like "$ref", which is the standard way to describe references in JSON. However, field name $ref does not seem to be accepted as it does not conform to the GraphQL naming conventions.
Obvious solutions appear to be to encode-decode field names in some way, or send the whole JSON as a String, perhaps with a custom scalar type. But is there perhaps a more elegant way to achieve this?
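If sending the blob as a plain string is acceptable, a minimal sketch of that workaround looks like this (the resolver then parses the string back into JSON itself, which is not shown here):

input MyInput {
  # raw JSON passed as an opaque string; keys such as "$ref" never go
  # through GraphQL field-name validation
  jsonField: String
}

The trade-off is that you lose whatever server-side validation the JSON scalar gives you, and clients have to JSON-stringify the blob before sending it.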

Spark from_avro's 2nd argument is a constant string; is there any way to obtain the schema string from a column of each record?

Suppose we are developing an application that pulls Avro records from a source stream (e.g. Kafka/Kinesis/etc.), parses them into JSON, and then further processes that JSON with additional transformations. Further assume these records can have a varying schema (which we can look up and fetch from a registry).
We would like to use Spark's built-in from_avro function, but it is pretty clear that from_avro wants you to hard-code a fixed schema into your code. It doesn't seem to allow the schema to vary row by incoming row.
That sort of makes sense if you are parsing the Avro into Spark's internal row format, since one would need a consistent structure for the DataFrame. But what if we wanted something like from_avro that grabbed the bytes from one column in the row, grabbed the string representation of the Avro schema from another column in the row, and then parsed that Avro into a JSON string?
Does such a built-in method exist? Or is such functionality available in a 3rd-party library?
Thanks!
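There does not seem to be a built-in from_avro variant that reads the schema from a column, but the behaviour described above can be sketched with a plain Scala UDF that parses each row using Avro's GenericDatumReader and returns the record's JSON rendering. This is only a rough sketch; the column names avro_bytes and schema_str and the DataFrame df are assumptions, and it expects plain Avro binary (no Confluent wire-format header):

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericRecord}
import org.apache.avro.io.DecoderFactory
import org.apache.spark.sql.functions.{col, udf}

// Decode one Avro payload using the schema string carried in the same row,
// then render the record as a JSON string (GenericRecord.toString emits JSON).
// In practice you would memoize parsed schemas per distinct schema string.
val avroToJson = udf { (payload: Array[Byte], schemaStr: String) =>
  val schema  = new Schema.Parser().parse(schemaStr)
  val reader  = new GenericDatumReader[GenericRecord](schema)
  val decoder = DecoderFactory.get().binaryDecoder(payload, null)
  reader.read(null, decoder).toString
}

// Hypothetical usage: df has a binary column avro_bytes and a string column schema_str
val parsed = df.withColumn("json", avroToJson(col("avro_bytes"), col("schema_str")))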

What is the best way to store JSON in Postgres

What is the best way to store JSON data in Postgres? Is it possible to query a stored json string without using like/regex?
The best option is to use a column defined as jsonb.
And yes, Postgres has a lot of functions that let you query the content of a JSON value.
e.g. to check whether a key/value pair is contained in the value: json_column @> '{"key": "value"}'
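A small sketch with made-up names showing a jsonb column and a couple of the operators:

CREATE TABLE docs (id serial PRIMARY KEY, body jsonb);
INSERT INTO docs (body) VALUES ('{"key": "value", "tags": ["a", "b"]}');

-- containment: rows whose body contains this key/value pair
SELECT * FROM docs WHERE body @> '{"key": "value"}';

-- extraction: pull a field out as text
SELECT body ->> 'key' FROM docs;

-- a GIN index speeds up containment queries on larger tables
CREATE INDEX docs_body_idx ON docs USING gin (body);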

Is {"0":{"id":1,...},"1":{"id":2,...}} another representation of a JSON list like [{"id":1,...},{"id":2,...}]?

I have a little dilemma. I have a backend/frontend application that communicates with a JSON-based REST API.
The backend is written in PHP (Symfony/jmsserializer) and the frontend in Dart.
The communication between these two has a little problem.
For most list data the backend responds with JSON like
[{"id":1,...},{"id":2,...}]
But for some it responds with
{"0":{"id":1,...}, "1":{"id":2,...}}
Now my question is: should the backend respond with the latter at all, or only with the first?
Problem
You usually have a list of objects. You sometimes get an object with sub-objects as properties.
Underlying issue
JS/JSON lists are indexed from 0 upwards, which means that if you have a PHP array which does not respect this rule, json_encode will output a JS/JSON object instead, using the numeric indices as keys.
PHP arrays are ordered maps, which have more features than JSON lists. Whenever you use those extra features you won't be able to translate directly into a JSON list without losing some information (ordering, keys, skipped indices, etc.).
PHP arrays and JSON objects, on the other hand, are more or less equivalent in terms of features and can be translated between each other without any loss of information.
Occurrence
This happens if you have an initial PHP array of values which respects the JS/JSON-list rules, but the keys in the list of objects are modified somehow, for example if you have a custom index order {"3":{}, "0":{}, "1":{}, "2":{}} or if you have (any) keys that are strings (i.e. not numeric).
This always happens if you use the numeric id of the object as the numeric index of the list, {"123":{"id": 123, "name": "obj"}}: even if the numeric ids are in ascending order, as long as they do not start from 0 and count upwards it is not a JSON list, it is a JSON object.
Specific case
So my guess is that the PHP code in the backend is fetching a list of objects but modifying something about it, like inserting the objects by (string) keys into the array, inserting them in a specific order, or removing some of them.
Resolution
The backend can easily fix this by using array_values($listOfObjects) before calling json_encode, which will reindex the entire list with numeric indices in ascending order starting from 0.
Arrays and dictionaries are two separate types in JSON ("array" and "object" respectively), but PHP combines this functionality in a single array type.
PHP's json_encode deals with this as follows: an array that only contains numeric keys ($array = ['cat', 'dog']) is serialized as JSON array, an associative array that contains non-numeric keys ($array = ['cat' => 'meow', 'dog' => 'woof']) is serialized as JSON object, including the array's keys in the output.
If you end up with an associative array in PHP, but want to serialize it as a plain array in JSON, just use this to convert it to a numerical array before JSON encoding it: $array = array_values($array);
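A short PHP sketch of the behaviour described above (the keys and values are made up):

<?php
$list  = ['cat', 'dog'];                          // sequential keys 0, 1
$assoc = ['cat' => 'meow', 'dog' => 'woof'];      // string keys
$byId  = [123 => ['id' => 123, 'name' => 'obj']]; // numeric but not 0-based

echo json_encode($list);   // ["cat","dog"]                   -> JSON array
echo json_encode($assoc);  // {"cat":"meow","dog":"woof"}     -> JSON object
echo json_encode($byId);   // {"123":{"id":123,"name":"obj"}} -> JSON object

// Reindex before encoding to get a JSON array back
echo json_encode(array_values($byId)); // [{"id":123,"name":"obj"}]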