JOOQ: How to serialize POJO column to JSON - mysql

I'm trying to write a JSON column to MySQL using jOOQ and Jackson, but the value is written as a toString representation rather than as JSON, and I'm not sure why.
Here is a table for which I generate JOOQ definitions:
create table JsonPayload
(
    name         varchar(127) primary key,
    rules        JSON not null,
    defaultValue tinyint(1) default 0 not null
);
These are the classes I'd like to bind my model to:
data class RuleTest(val name: String, val test: Boolean)
data class Rule(val name: String, val test: Boolean, val rule: RuleTest)
data class JsonPayload(val name: String, val rules: List<Rule>, val defaultValue: Boolean)
Insertion code:
dsl.insertInto(JSONPAYLOAD)
    .set(dsl.newRecord(
        JSONPAYLOAD,
        JsonPayload(
            "Test",
            listOf(
                Rule("rule1", false, RuleTest("rule1", false)),
                Rule("rule2", true, RuleTest("rule1", false))
            ),
            true
        )
    ))
    .execute()
It serializes and deserializes fine; however, it doesn't write correct JSON to MySQL:
mysql> select * from JsonPayload;
+------+-----------------------------------------------------------------------------------------------------------------------------------------------+--------------+
| name | rules | defaultValue |
+------+-----------------------------------------------------------------------------------------------------------------------------------------------+--------------+
| Test | ["Rule(name=rule1, test=false, rule=RuleTest(name=rule1, test=false))", "Rule(name=rule2, test=true, rule=RuleTest(name=rule1, test=false))"] | 1 |
+------+-----------------------------------------------------------------------------------------------------------------------------------------------+--------------+
This is the demo project I created to demonstrate this behaviour: https://github.com/v1ctor/jooq-json-demo
Can you please help me understand how to write correct JSON to MySQL?

That's an interesting feature idea, which isn't supported yet by the DefaultRecordUnmapper implementation in jOOQ. I've created feature requests for this:
#13604 The possibility to make reflective use of Jackson from the DefaultRecordUnmapper
#13605 The possibility to use out of the box Jackson converters in the code generator
Currently, Jackson can only be used for mapping JSON and JSONB to your own data structures when reading from the database. Not when writing to the database. But there isn't any reason why the inverse logic shouldn't be available as well.
In the meantime, you have to implement a data type Converter<JSON, List<Rule>> (or a Binding, if you need more power) and attach that to your generated code, see:
https://www.jooq.org/doc/latest/manual/code-generation/codegen-advanced/codegen-config-database/codegen-database-forced-types/
The benefit of using a Converter is that you now get type safety whenever you read/write to this column.
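A minimal sketch of such a converter, assuming jackson-module-kotlin is on the classpath. The class name RuleListConverter is illustrative, and the Rule/RuleTest data classes are repeated here so the sketch is self-contained:

```kotlin
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import com.fasterxml.jackson.module.kotlin.readValue
import org.jooq.Converter
import org.jooq.JSON

data class RuleTest(val name: String, val test: Boolean)
data class Rule(val name: String, val test: Boolean, val rule: RuleTest)

class RuleListConverter : Converter<JSON, List<Rule>> {
    private val mapper = jacksonObjectMapper()

    // Database -> user type: parse the JSON document with Jackson
    override fun from(databaseObject: JSON?): List<Rule>? =
        databaseObject?.let { mapper.readValue<List<Rule>>(it.data()) }

    // User type -> database: serialize with Jackson instead of relying on toString()
    override fun to(userObject: List<Rule>?): JSON? =
        userObject?.let { JSON.valueOf(mapper.writeValueAsString(it)) }

    override fun fromType(): Class<JSON> = JSON::class.java

    @Suppress("UNCHECKED_CAST")
    override fun toType(): Class<List<Rule>> = List::class.java as Class<List<Rule>>
}
```

You would then attach the converter to the RULES column with a forced type in the code generator configuration, so that generated records expose List<Rule> directly.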

Related

How to type Date types with Prisma and JSON blobs?

There seems to be an issue with Prisma's serialization of JSON blobs with respect to Date types.
I wonder if anyone else has seen this and has some guidance/workaround.
This is for a JSON field with PlanetScale, which is basically a MySQL driver.
I have an object with a Date field, that I want to encode. This is a blob of data coming back from an external API
e.g. my object tp has a field typed as a Date:
trained_at: Date;
in my prisma schema for tunePrompt I have a JSON field:
model TunePrompt {
  apiData Json? // from external API
}
But when I try to write to that apiData JSON field:
const data = {
  apiData: tp,
}
const newPrompt = await prisma.tunePrompt.create({ data })
Property 'trained_at' is incompatible with index signature.
Type 'Date' is not assignable to type 'InputJsonValue | null | undefined'.
If I do JSON.stringify(tp), it works without error, but then I get double-escaped JSON.
The generated types are something like:
export type TunePromptCreateInput = {
apiData?: NullableJsonNullValueInput | InputJsonValue
}
The only workaround I've found is to type the Date fields as strings, but I'm sure this is going to lead to other parsing problems later.
Maybe I could look into typing the JSON blob, but I don't think that would solve the issue, as it's the serialization that I think is the problem.

Handle JSON With null Inside Array With Kotlin

I am trying to handle a JSON field whose value can be either an array containing null or an empty array. I receive both of these JSON cases:
{
  "field": [
    null
  ]
}
and
{
  "field": []
}
The empty-array case works fine for me: if I get an object with an array of size 0, it means it is empty. In the other case, when the field's value is [null], I get an array of size 1 whose elements are all null. That is why I check for the null case with the following approach:
val deserializedJson = jacksonObjectMapper().readValue<DataSimpleClass>(theJsonAsText)
if (deserializedJson.field.size == 1 && deserializedJson.field[0] == null) {
    throw Exception()
}
Is there any better or more elegant approach to check such a [null] case?
I deserialize the JSON using a jacksonObjectMapper() object, version 2.9.8. Also, I have created two data classes that look like this:
@JsonInclude(JsonInclude.Include.NON_NULL)
data class DataSimpleClass(
    @JsonInclude(JsonInclude.Include.NON_NULL)
    val field: List<SpecificClass>
)

@JsonInclude(JsonInclude.Include.NON_NULL)
data class SpecificClass(
    @JsonInclude(JsonInclude.Include.NON_NULL)
    @JsonProperty("field1") val field1: String,
    @JsonInclude(JsonInclude.Include.NON_NULL)
    @JsonProperty("field2") val field2: String,
    @JsonProperty("field3") val field3: String? = null,
    val updateTime: Instant = Instant.now(Clock.systemUTC())
)
Also, I don't understand how Kotlin, a null-safe language, can let me create this field when all its elements are null. How is it possible that Kotlin doesn't catch the null when the JSON key's value is null and it is deserialized into the non-nullable field?
I was expecting that the null wouldn't be deserialized, because DataSimpleClass declares the field's elements as non-nullable.
In addition, IntelliJ shows me that, because of the null-safe field types, the null check condition "is always false", while in fact it is actually true during the run. How is it possible that the value can't be null according to the type system, yet ends up null at runtime? (screenshot: the IntelliJ warning about the condition)
Kotlin is a "null-safe" language because it enforces non-nullability by default.
You can still have nulls in Kotlin - e.g. val foo: String? = null
The IDE just says that, based on your definition, it shouldn't be null, and it will not allow you (at compile time) to put null there. Runtime is where you don't have control over who or what puts null there.
If there is no guarantee that you will not receive null there, you should sanitize the list before assuming there are no nulls:
deserializedJson.field.filterNotNull()
If you would rather have it crash the whole app, I think you can set
.addModule(KotlinModule(strictNullChecks = true))
when configuring Jackson.
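A small sketch of the sanitizing approach, assuming jackson-module-kotlin is available. DataSimpleClass is simplified here to a single field of nullable strings (a stand-in for SpecificClass) so the [null] case deserializes without surprises:

```kotlin
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import com.fasterxml.jackson.module.kotlin.readValue

// Element type declared nullable, so [null] is representable in the model
data class DataSimpleClass(val field: List<String?>)

fun main() {
    val mapper = jacksonObjectMapper()

    // The [null] case: one element, and it is null
    val withNull = mapper.readValue<DataSimpleClass>("""{"field":[null]}""")
    val sanitized = withNull.field.filterNotNull()   // drops the nulls

    // The empty-array case parses to an empty list directly
    val empty = mapper.readValue<DataSimpleClass>("""{"field":[]}""")

    println(sanitized.isEmpty() && empty.field.isEmpty())  // true
}
```

After filterNotNull() both cases collapse to the same thing, an empty list, so a single isEmpty() check covers them.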

how can I convert an object of type Any to a specific class type?

I am using Kotlin, and I have a service that receives an object of type Any (this cannot be changed).
The problem is that the Any is an object with 20+ fields, and I just need one of them to use as a filter, so I cannot do a simple cast.
So, my object is like: (when I print it)
{messageId=123, userId=32323, address=Some city, phone=111605,type=TYPE1.....
I want to convert it using kotlinx.serialization or Jackson, but I cannot first convert it to the expected String format; doing a parseFromString(myObject) results in an exception about wrong JSON format as well.
I want to convert it to a class like this
#Serializable
private data class UserType(val type: String)
type is the only field I care about.
My conversion is via kotlinx.serialization:
val format = Json { ignoreUnknownKeys = true }
format.decodeFromString<UserType>(myObject)
I even tried this to see if I could get it into the proper JSON format:
format.encodeToString(original)
Any idea what I could do here that would be a lightweight solution?
This is my Any type https://kotlinlang.org/api/latest/jvm/stdlib/kotlin/-any/
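For what it's worth, if the Any is really just a Map under the hood (which the {key=value, ...} printout suggests), Jackson can convert it to a target class directly, with no JSON string round trip. A sketch, where the map contents are an illustrative stand-in for the real object:

```kotlin
import com.fasterxml.jackson.databind.DeserializationFeature
import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper

data class UserType(val type: String)

fun main() {
    // Illustrative stand-in for the incoming Any (a Map prints as {messageId=123, ...})
    val myObject: Any = mapOf("messageId" to 123, "userId" to 32323, "type" to "TYPE1")

    // Ignore the 20+ fields we don't care about
    val mapper = jacksonObjectMapper()
        .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)

    // convertValue maps the object tree directly, without encoding to a String first
    val user = mapper.convertValue(myObject, UserType::class.java)
    println(user.type)  // TYPE1
}
```

This only works if the runtime type behind Any is something Jackson can treat as a property container (a Map or a bean); if it is an arbitrary class with a useless toString(), no string-based parser will help either.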

how to apply custom validation on json value

I have JSON data coming in via an API, and a set of policies that I need to validate against the incoming JSON data.
For example, I have JSON like:
{
  "users_id": "x",
  "is_user_logged_in": "true",
  "checkin_date": "2018-12-12",
  "checkout_date": "2019-12-13"
}
Now I want to apply validations like: checkin_date should be earlier than checkout_date; or, say, if is_user_logged_in is true, then user_id should not be null.
I can't deserialize the JSON, as I need to pass it on to a different application to consume.
I am using Scala; any idea how I can implement this? The catch is that there can be multiple policies or rules to validate, and I only get the rules at runtime.
Thanks
The easiest way is to add the validation to the default constructor and just use the JSON parser as a validator (no need to use the parsed data):
import java.time.LocalDate

case class UserData(
  user_id: Option[String],
  is_user_logged_in: Boolean,
  checkin_date: LocalDate,
  checkout_date: LocalDate
) {
  require(!is_user_logged_in || user_id.isDefined, "missing `user_id` for logged in user")
  require(checkout_date.isAfter(checkin_date), "`checkout_date` should be after `checkin_date`")
}
For more complicated cases, please consider using a handy validation library, like:
https://github.com/jto/validation

Angular 5 not parsing timestamps correctly from json

I have a working (in production) web app (Material + Angular 5 (5.2.11)). I also have an API written in .NET Core 2 (C#) using NancyFX and Newtonsoft.Json.
Problem:
DB (MariaDB running on Ubuntu Server): I have the value 2018-05-16 20:42:36 on a record.
Calling the endpoint yields the correct JSON:
{"timestamp":"2018-05-16T20:42:36Z"}
(the other fields were removed for sanity)
On Angular app I use:
... return this._http.get<T>(this.endpoint + '/' + uuid, { headers: this._getHeaders })
    .catch(this.handleError);
Where <T> represents a model that includes timedate: Date; as a property.
Using the service:
this._dataService.getByUuid(uuid).subscribe(result => {
console.log(result);
});
gives:
Object { timedate: "2018-05-16 08:05:36" }
So the time lacks AM/PM information and I can't display it correctly. {{element.timedate | date: 'dd/MM/yyyy HH:mm' }} does nothing, since timedate is just that: a bare string.
What have I tried:
Implementing a different format in the JSON output (in the NancyFX API)
Adding an HTTP interceptor
Reading this
Declaring the properties as Date, String
The problem occurs with any datetime field. The JSON is always on point, and so is the database.
Any help is appreciated.
JSON doesn't have a Date type (only objects, arrays, numbers, strings, booleans and null), so the converter from JSON to TypeScript cannot know whether a value is a date or a plain string.
You need to parse the date property (Date.parse(yourString) or new Date(yourString)) every time your object is deserialized.
* Date.parse and the Date constructor can take in a Date as well as a string, so you don't really have to type-check the value before using them.