Rails serialized JSON to string gives error on MySQL JSON field

I have an old table that saved JSON to a string SQL field using the Rails ActiveRecord serializer. I created a new table with the field's datatype set to JSON, which was introduced in MySQL 5.7. But directly copying the data from the old field to the new one gives me a JSON error saying the JSON structure is wrong.
More specifically, the problem is with Unicode characters, which my database does not support yet, and the database is too large to just migrate everything to support them.
I am looking for a way to migrate the data from the old field to the new JSON field.
I have seen that replacing \u for the Unicode character with \\u in the JSON string solves the issue, but I am not able to just do this:
update table_name set column_name=REPLACE(column_name, '\u', '\\u');
since it gives an error again
ERROR 3140 (22032): Invalid JSON text: "Invalid escape character in string." at position 564 in value for column 'table_name.column_name'.

To update the table values from \u to \\u in SQL, the backslashes themselves have to be escaped:
update table_name set column_name=REPLACE(column_name, '\\u', '\\\\u') [where condition];
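Once the escapes are repaired in the old string column, the copy into the new JSON column might look like the sketch below. This is only an illustration: new_table, old_table, json_column and string_column are placeholder names, and it assumes both tables share an id.
-- copy the repaired JSON strings into the JSON column; MySQL validates the JSON on assignment
UPDATE new_table nt
JOIN old_table ot ON ot.id = nt.id
SET nt.json_column = ot.string_column;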

Related

java.lang.String cannot be cast to java.lang.Byte in NiFi

{"edit_history_tweet_ids":["1607615574473080832"],"id":"1607615574473080832","text":"Twitter bms with this dumb ass update 😭"}
This is my flowfile, and I want to INSERT it into a MySQL DB using PutDatabaseRecord.
(screenshot: table description; MySQL 8.0)
The PutDatabaseRecord processor's bulletin says java.lang.String cannot be cast to java.lang.Byte.
I think the 'edit_history_tweet_ids' column is the problem.
What should I do?
I also tried sending the JSON flowfile through ConvertJSONToSQL and PutSQL, but after the ConvertJSONToSQL processor the edit_history_tweet_ids column's data disappears.
(screenshot: flowfile after ConvertJSONToSQL)
When I INSERT the generated flowfile, it succeeds.
(screenshot: updated attributes)
It was solved by escaping the text as JSON with escapeJson in UpdateRecord and converting the flowfile to CSV format.
It seems the fix was treating the value as a string instead of an array.
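For reference, a hypothetical sketch of a target table where the array is simply kept as a string; the table screenshot isn't reproduced here, so these column names and types are assumptions, not the actual schema.
-- store the tweet-ID array as an escaped JSON string rather than mapping it to a numeric/byte column
CREATE TABLE tweets (
  id VARCHAR(32) PRIMARY KEY,
  edit_history_tweet_ids TEXT,
  text VARCHAR(280)
);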

Athena (Trino SQL) parsing JSON document using fields (dot notation)

If the underlying json (table column called document 1 in Athena) is in the form of {a={b ...
I can parse it in Athena (Trino SQL) using
document1.a.b
However, if the JSON contains {a={"text": value1 ...
the quote marks will not parse correctly.
Is there a way to do JSON parsing of a 'field' with quotes?
If not, is there an elegant way of parsing the "text" and obtaining the string in value1? [Please see my comment below.]
I cannot change the quotes in the JSON or its Athena "table", so I would need something that works in Trino SQL syntax.
The error message is in the form of: SQL Error [100071] [HY000]: [Simba][AthenaJDBC](100071) An error has been thrown from the AWS Athena client. SYNTAX_ERROR: Expression [redacted] is not of type ROW
NOTE: This is not a duplicate of Oracle Dot Notation Question
Dot notation works only for columns typed as struct<…>. You can do that for JSON data, but judging from the error and your description this does not seem to be the case here; I assume your column is of type string.
If you have JSON data in a string column, you can use the JSON functions to parse it and extract parts of it with JSONPath.
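A minimal sketch, assuming document1 is a string column holding JSON like {"a": {"text": "value1"}} and my_table is a placeholder table name:
-- json_extract_scalar takes a JSONPath expression and returns the matched value as varchar
SELECT json_extract_scalar(document1, '$.a.text') AS text_value
FROM my_table;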

Postgres JSONB column insert error: invalid input syntax for type json the input string ended unexpectedly

I'm trying to create a row manually via DBeaver and I am entering the following in a jsonb column:
{"US":"0.880","PA":"0.028","KY":"0.025"}
I've checked that this is valid JSON on https://jsonformatter.curiousconcept.com/#
However, I get the error from the title: invalid input syntax for type json, "the input string ended unexpectedly".
Any insight would be appreciated...
I even tried surrounding the object with single quotes like:
'{"US":"0.880","PA":"0.028","KY":"0.025"}'
But I got an error about the ' being an invalid token...
I was originally writing a Node.js script to insert a JSON-stringified object into the column, but I was getting the same error, so I decided to try it manually, and I can't even insert the above data...
I can insert this data into a jsonb column in DBeaver via the UI without a problem.
But there is a known issue with entering data into a json (not jsonb) column in a table that has no primary key. Maybe this is your case: https://github.com/dbeaver/dbeaver/issues/11704 ?
Or can you show the table DDL?
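For comparison, a minimal sketch of doing the same insert in plain SQL rather than the grid editor (table and column names are placeholders); in SQL the JSON value is passed as a single-quoted string literal, which is why the quotes that fail in the cell editor are required here:
-- placeholder table with a jsonb column
CREATE TABLE rates (id serial PRIMARY KEY, data jsonb);
INSERT INTO rates (data) VALUES ('{"US":"0.880","PA":"0.028","KY":"0.025"}');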

Scala jdbc mysql json field encoding issue

I have a MySQL table with fields of json type. The table and the db have utf8 charset.
The problem I'm having is that when I query data, the values of these fields are strings in the wrong charset and Unicode characters are mangled: for example, instead of é I get Ã©.
There's no problems with other text fields.
I tried using Quill and Slick. The connection config is like this:
characterEncoding="UTF-8"
useUnicode=true
I also tried jdbc:mysql://"${DB_HOST}/${DB_NAME}&useUnicode=yes&characterEncoding=UTF-8
The only workaround I found is to manually convert strings to another charset like
new String(string.getBytes("Windows-1252"), "UTF-8")
Is there a better way?
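One thing worth checking (a diagnostic sketch, not a fix; my_table is a placeholder name) is what character sets the server and the connection actually report. MySQL handles JSON values internally as utf8mb4, so the mangling usually happens on the connection or driver side:
-- a mismatch in the connection-related variables is the usual culprit
SHOW VARIABLES LIKE 'character_set%';
-- confirm the table and column definitions
SHOW CREATE TABLE my_table;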

How to save an array to a mysql database in laravel

So I wanted to save some tags data to the database from the UI form,
Tags: ["male","female","kids"]
I tried everything, but it saves as a string. I also tried checking if I can alter the data type to array or JSON in MySQL, but I got this:
You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'JSON(255) NOT NULL' at line 1
I also tried json_encode() and json_decode() but still no luck, so what can I do?
There isn't a data type like JSON or Array in the DB.
What you can do is use the tags field as TEXT, as told by Rick James, and then encode your input as JSON with json_encode() before inserting, and decode it after retrieving the data from the DB with json_decode().
JSON is basically a minimal, readable format for structuring data, which means it can be treated as a string.
Here is a good post about JSON, in case you need it:
What is JSON?
There is no datatype called JSON. Instead use TEXT without the (255).
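A minimal sketch of that approach, with placeholder table and column names: the tags column is plain TEXT and the application stores the json_encode()d string in it.
-- 'products' and its columns are placeholders for illustration
CREATE TABLE products (id INT AUTO_INCREMENT PRIMARY KEY, tags TEXT NOT NULL);
-- the value is what json_encode(["male","female","kids"]) returns
INSERT INTO products (tags) VALUES ('["male","female","kids"]');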