Convert Doctrine array to JSON - MySQL

Is there a way to read a column of Doctrine type "simple_array" or "array" as JSON?
My Doctrine database is accessed from another API, and I want to read data through that API. However, there is a column of Doctrine array type that I want to convert into JSON.
I am unsure whether there is a preferred way of doing this or whether I need to hack my way around it.
Here is an example of what is stored in the database as a doctrine array:
"a:1:{i:0;a:3:{s:3:\u0022day\u0022;i:5;s:4:\u0022time\u0022;s:7:\u0022morning\u0022;s:12:\u0022availability\u0022;N;}}"

That looks like the output of PHP's serialize() function, and the literal double quotes in the string have been converted to Unicode escape sequences (\u0022).
You could do the following:
Fetch the serialized string
Fix the \u0022 sequences (replace them with ")
unserialize() it to reproduce the array
Convert the array to JSON with json_encode().
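A minimal PHP sketch of those steps, assuming the serialized value has already been fetched into a variable (the variable names are illustrative):

<?php
// $raw is the value read from the Doctrine "array" column (illustrative).
$raw = 'a:1:{i:0;a:3:{s:3:\u0022day\u0022;i:5;s:4:\u0022time\u0022;s:7:\u0022morning\u0022;s:12:\u0022availability\u0022;N;}}';

// 1. Turn the \u0022 sequences back into plain double quotes.
//    (If the stored value already uses plain quotes, this is a harmless no-op.)
$fixed = str_replace('\u0022', '"', $raw);

// 2. Rebuild the PHP array from the serialize() payload.
$array = unserialize($fixed);

// 3. Re-encode the array as JSON.
echo json_encode($array);
// [{"day":5,"time":"morning","availability":null}]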

Related

How to prevent adding backslash to JSON string

I would like to read events from Event Hubs using Databricks. The events are in JSON format, but they can have different schemas (this matters because the solutions I find pass a fixed schema to the from_json(jsonStr, schema) function, which I cannot do in my use case). When I use
.withColumn('Value', col('value').cast(StringType()))
the dataframe returns JSON output with backslashes: "{\"time\": 1432826855000,\"host\":...... .
I found a solution, How to prevent spark sql with kafka from adding backslash to JSON string in dataframe, but in the Delta Live Tables framework we create streaming tables by returning a dataframe, so I can't use that solution.
Should I use non-PySpark functions in the ETL process, such as the one in How to remove backslash from decoded JSON string?
Will that be efficient while streaming from Event Hubs to bronze?
You shouldn't worry about those backslashes - they are just part of the visual representation of your string when you display data that has the " character embedded in it. Internally, the data is stored without backslashes, like: {"time": 1432826855000,"host":.......
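A plain-Python illustration of the point (the value and field names are made up): the stored string contains ordinary quotes, and any \" you see is only the renderer escaping them, so the string parses as-is.

import json

# Illustrative value, as it is actually stored: plain double quotes, no backslashes.
value = '{"time": 1432826855000, "host": "server-1"}'

# A UI or logger may render the embedded quotes as \" , but that escaping lives
# only in the rendering, not in the stored bytes, so parsing works directly:
parsed = json.loads(value)
print(parsed["time"])  # 1432826855000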

Custom Formatting of JSON output using Spark

I have a dataset with a bunch of BigDecimal values. I would like to output these records to a JSON file, but when I do, the BigDecimal values are often written with trailing zeros (123.4000000000000), and the spec we must conform to does not allow this (for reasons I don't understand).
I am trying to see if there is a way to override how the data is printed to JSON.
Currently, my best idea is to convert each record to a string using Jackson and then write the data using df.write().text(..) rather than JSON.
I suggest converting the Decimal type to String before writing to JSON.
The code below is in Scala, but you can adapt it to Java easily.
import org.apache.spark.sql.types.StringType

// COLUMN_NAME is your DataFrame column name.
val new_df = df
  .withColumn("COLUMN_NAME_TMP", df("COLUMN_NAME").cast(StringType))
  .drop("COLUMN_NAME")
  .withColumnRenamed("COLUMN_NAME_TMP", "COLUMN_NAME")

JSON: getting an unwanted result

I'm using the JSON plugin to get the response in JSON, but I'm getting an unwanted result.
Here is what I get:
{"data":"[[\"service\",\"webservices\",\"document\"],[\"validation\",\"adapters\",\"server\"]]","records":25,"recordsTotal":75}
Originally, the data variable in my action class looks like this:
[["service","webservices","document"],["validation","adapters","server"]]
but the JSON plugin adds the backslashes.
The desired result is this:
{"data":[["service","webservices","document"],["validation","adapters","server"]],"records":25,"recordsTotal":75}
Is there a way to get the latter result?
Thanks
You're representing the data as a string. " is a reserved character in JSON, so your serialization library is dutifully escaping the embedded quotes with \.
If you set up the variable so it's an actual array of arrays, instead of a string representing an array of arrays, then your JSON serialization will work fine.
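A hypothetical sketch of what that could look like in a Struts2-style action class (the class and getter names are made up): exposing a nested list instead of a pre-serialized String lets the plugin emit a real JSON array.

import java.util.Arrays;
import java.util.List;

public class SearchAction {
    // Expose the value as a nested list, not a String containing JSON.
    private List<List<String>> data = Arrays.asList(
            Arrays.asList("service", "webservices", "document"),
            Arrays.asList("validation", "adapters", "server"));
    private int records = 25;
    private int recordsTotal = 75;

    // The JSON plugin serializes whatever the getters return.
    public List<List<String>> getData() { return data; }
    public int getRecords() { return records; }
    public int getRecordsTotal() { return recordsTotal; }
}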

Escape special characters inside JSON from a data attribute

I have JSON stored in data attributes.
<div data-dataarray="[["Shipper","Ship No","Weight"],["1WWWQUICK\PARTSCOM",1,1]]">
data = $('#'+moduleId).data('dataarray')
So data is now a string.
I then need to parse it to get it back to JSON:
jsondata = JSON.parse(data);
This JSON can contain special characters (notice the backslash), which cause an error. How can I escape them before/while parsing?
Firstly, I think the HTML5 data attributes need to have a form like data-xyzUserVariable; you then retrieve them with $('#xyz_id').data('xyzUserVariable').
Secondly, be wary that jQuery cleverly attempts to convert the data to a suitable type (booleans, numbers, objects, arrays or null) and avoids touching the DOM.
Thirdly, your JSON seems to be an array of arrays... is it missing an ending bracket ']'?
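A minimal sketch of one way to make the attribute parse cleanly (element id and markup are illustrative): quote the attribute with single quotes so the embedded double quotes survive, and double the backslash so the value is valid JSON.

<div id="shipments"
     data-dataarray='[["Shipper","Ship No","Weight"],["1WWWQUICK\\PARTSCOM",1,1]]'>
</div>

<script>
  // jQuery may already have converted valid JSON in a data-* attribute to an array,
  // so only call JSON.parse when the value is still a string.
  var raw = $('#shipments').data('dataarray');
  var rows = (typeof raw === 'string') ? JSON.parse(raw) : raw;
  console.log(rows[1][0]); // 1WWWQUICK\PARTSCOM
</script>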

How to convert string to BSON using MongoDB C++ driver?

Similar to toString(), is there a way we can convert a string to a BSON object? I need to remove a document using the C++ driver, and the remove function expects the query to be a BSON object.
Use the fromjson method found here:
http://api.mongodb.org/cplusplus/1.5.4/namespacemongo.html#a4f542be0d0f9bad2d8cb32c3436026c2
BSONObj mongo::fromjson ( const string & str )
Create a BSONObj from a JSON <http://www.json.org> string.
In addition to the JSON extensions described here
http://mongodb.onconfluence.com/display/DOCS/Mongo+Extended+JSON, this function accepts
certain unquoted field names and allows single quotes to optionally be used when
specifying field names and string values instead of double quotes. JSON unicode escape
sequences (of the form \uXXXX) are converted to UTF-8.
Exceptions:
MsgAssertionException if parsing fails. The message included with this assertion includes
a rough indication of where parsing failed.
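
A minimal sketch of using it for a remove, assuming the legacy C++ driver (the connection string, namespace, and filter are made up):

#include "mongo/client/dbclient.h"

int main() {
    mongo::DBClientConnection conn;
    conn.connect("localhost");

    // Build the query object from a JSON string; throws MsgAssertionException if parsing fails.
    mongo::BSONObj query = mongo::fromjson("{ \"status\": \"expired\" }");

    // remove() accepts the parsed BSONObj as its query.
    conn.remove("mydb.mycollection", query);
    return 0;
}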