Efficient way to have Jackson serialize Java 8 Instant as epoch milliseconds? - json

Using Spring RestControllers with Jackson JSON parsing backend, with AngularJS on front end. I'm looking for an efficient way to have Jackson serialize an Instant as the epoch milliseconds for subsequent convenient usage with JavaScript code. (On the browser side I wish to feed the epoch ms through Angular's Date Filter: {{myInstantVal | date:'short' }} for my desired date format.)
On the Java side, the getter that Jackson would use is simply:
public Instant getMyInstantVal() { return myInstantVal; }
Serialization wouldn't work as-is, because jackson-datatype-jsr310 doesn't return epoch milliseconds by default for an Instant. I looked at adding @JsonFormat to the above getter to morph the Instant into something the front-end can use, but it suffers from two problems: (1) the pattern I can supply it is apparently limited to SimpleDateFormat, which doesn't provide an "epoch milliseconds" option, and (2) when I tried to send the Instant as a formatted date to the browser instead, Jackson throws an exception because the @JsonFormat annotation requires a timezone attribute for Instants, something I don't wish to hardcode as it would vary from user to user.
My solution so far (and it's working fine) is to create a replacement getter using @JsonGetter, which causes Jackson to use this method instead to serialize myInstantVal:
@JsonGetter("myInstantVal")
public long getMyInstantValEpoch() {
return myInstantVal.toEpochMilli();
}
Is this the proper way of doing this? Or is there a nice annotation I'm missing that I can put on getMyInstantVal() so I won't have to create these additional methods?

You just need to read the README that you linked to. Emphasis mine:
Most JSR-310 types are serialized as numbers (integers or decimals as appropriate) if the SerializationFeature#WRITE_DATES_AS_TIMESTAMPS feature is enabled, and otherwise are serialized in standard ISO-8601 string representation.
[...]
Granularity of timestamps is controlled through the companion features SerializationFeature#WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS and DeserializationFeature#READ_DATE_TIMESTAMPS_AS_NANOSECONDS. For serialization, timestamps are written as fractional numbers (decimals), where the number is seconds and the decimal is fractional seconds, if WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS is enabled (it is by default), with resolution as fine as nanoseconds depending on the underlying JDK implementation. If WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS is disabled, timestamps are written as a whole number of milliseconds.
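Concretely, a minimal sketch of the mapper setup the quoted passage describes (not from the original answer; it assumes the jackson-datatype-jsr310 module is on the classpath):
ObjectMapper mapper = new ObjectMapper();
// JavaTimeModule comes from jackson-datatype-jsr310
mapper.registerModule(new JavaTimeModule());
// write dates as numbers (this is the default) ...
mapper.enable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
// ... and as whole milliseconds rather than seconds with a nanosecond fraction
mapper.disable(SerializationFeature.WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS);
String json = mapper.writeValueAsString(Instant.now()); // e.g. 1534923249123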

This is what worked for me in Kotlin (it should be the same for Java). It lets you serialize as epoch milliseconds without changing the ObjectMapper's configuration:
data class MyPojo(
@JsonFormat(without = [JsonFormat.Feature.WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS])
val timestamp: Instant
)
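Since the rest of this thread is Java, the equivalent per-property annotation in Java would look roughly like this (class and field names are illustrative):
public class MyPojo {
    @JsonFormat(without = JsonFormat.Feature.WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS)
    private Instant timestamp;
    // getter/setter omitted
}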

Adding on to JB's answer, to override Spring MVC's default JSON parser to strip away the nanoseconds from Instant (and other Java 8 date objects that have them):
In the mvc:annotation-driven element, specify that you will be overriding the default JSON message converter:
<mvc:annotation-driven validator="beanValidator">
<mvc:message-converters register-defaults="true">
<beans:ref bean="jsonConverter"/>
</mvc:message-converters>
</mvc:annotation-driven>
(register-defaults above is true by default, which is most likely what you want so that the other converters Spring configures are kept as-is).
Override MappingJackson2HttpMessageConverter as follows:
<beans:bean id="jsonConverter" class="org.springframework.http.converter.json.MappingJackson2HttpMessageConverter">
<beans:property name="objectMapper">
<beans:bean class="org.springframework.http.converter.json.Jackson2ObjectMapperFactoryBean">
<beans:property name="featuresToDisable">
<beans:array>
<util:constant static-field="com.fasterxml.jackson.databind.SerializationFeature.WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS"/>
</beans:array>
</beans:property>
</beans:bean>
</beans:property>
</beans:bean>
Step #1 is important, as Spring MVC will otherwise ignore the configured MappingJackson2HttpMessageConverter bean in favor of its own default one.
partial H/T this SO post.
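If you are on Java config rather than XML, a roughly equivalent setup would be the following sketch (not from the original answer; it assumes Spring 5+, where WebMvcConfigurer has default methods):
@Configuration
@EnableWebMvc
public class WebConfig implements WebMvcConfigurer {

    @Override
    public void extendMessageConverters(List<HttpMessageConverter<?>> converters) {
        // build an ObjectMapper with nanosecond timestamps disabled
        ObjectMapper mapper = Jackson2ObjectMapperBuilder.json()
                .featuresToDisable(SerializationFeature.WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS)
                .build();
        // put our converter first so it wins over the default Jackson converter,
        // while keeping the other defaults (mirrors register-defaults="true")
        converters.add(0, new MappingJackson2HttpMessageConverter(mapper));
    }
}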

A simple way to return an epoch timestamp in the JSON response for an Instant property is the following:
@JsonFormat(shape = JsonFormat.Shape.NUMBER, timezone = "UTC")
private Instant createdAt;
This will result in a response like the following (the sample value is in epoch seconds; whether you get seconds or milliseconds is governed by the WRITE_DATE_TIMESTAMPS_AS_NANOSECONDS feature discussed above):
{
...
"createdAt": 1534923249,
...
}

Related

How do you control how the Couchbase Java client serializes Dates?

Here's my code (Couchbase Java SDK 3)
Cluster cluster = Cluster.connect("localhost", "Administrator", "password");
Collection c = cluster.bucket("default").defaultCollection();
c.upsert("myDocumentId", new Date());
When I look at the resulting document in Couchbase, I see the java.util.Date has been converted to epoch milliseconds:
1602791214674
What I want instead is for the date to be formatted like yyyy-mm-dd.
How can I make that happen?
By default, the Couchbase Java client uses Jackson to serialize and deserialize JSON. Unless you tell it otherwise, it will use an ObjectMapper with default settings. By default, Jackson serializes java.util.Date objects by converting them to milliseconds since the epoch.
You have a couple of choices. If you're using a POJO to represent your document content, you can apply Jackson annotations to the date fields to control how they are [de]serialized. Here's an article that shows how to use the @JsonFormat annotation to control how a date field is serialized.
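As a hedged sketch of that annotation approach (class, field, and pattern are illustrative, not taken from the linked article):
public class MyDocument {
    // serialize the java.util.Date as e.g. "2020-10-15" instead of epoch milliseconds
    @JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd", timezone = "UTC")
    private Date created;
    // getters/setters omitted
}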
Alternatively, you can configure an ObjectMapper to change the default way dates are serialized. Here's how you tell the Couchbase Java SDK to use your custom ObjectMapper:
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
ClusterEnvironment env = ClusterEnvironment.builder()
.jsonSerializer(JacksonJsonSerializer.create(objectMapper))
.build();
Cluster cluster = Cluster.connect("localhost",
ClusterOptions.clusterOptions("Administrator", "password")
.environment(env));
Collection c = cluster.bucket("default").defaultCollection();
c.upsert("myDocumentId", new Date());
cluster.disconnect();
// since we created a custom environment, we're responsible for shutting it down
env.shutdown();
This will give you a document that looks like:
"2020-10-15T19:59:45.685+0000"
If you want a different format, you can configure Jackson to serialize dates however you want.
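For example, to get the yyyy-MM-dd form asked about, one option (a sketch using the same custom-ObjectMapper setup shown above; not part of the original answer) is to set a date format on that mapper before passing it to JacksonJsonSerializer:
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS);
// e.g. "2020-10-15" instead of the ISO-8601 timestamp shown above;
// note that SimpleDateFormat uses the JVM default time zone unless you set one explicitly
objectMapper.setDateFormat(new SimpleDateFormat("yyyy-MM-dd"));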

Exception during deserialize java.time.Instant from redis cache

I keep getting following exception while reading data from cache.
org.springframework.data.redis.serializer.SerializationException: Could not read JSON: Cannot construct instance of `java.time.Instant` (no Creators, like default construct, exist): cannot deserialize from Object value (no delegate- or property-based Creator)
It started as soon as I introduced a new field of type java.time.Instant.
You can use a custom JsonSerializer and JsonDeserializer to serialize the Instant either as milliseconds or in a custom text format.
For an example implementation, see the answers to "How to set format of string for java.time.Instant using objectMapper?"
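A minimal sketch of such a pair, writing the Instant as epoch milliseconds (class names are illustrative; attach them with @JsonSerialize/@JsonDeserialize on the field, or register them via a SimpleModule):
public class InstantMillisSerializer extends JsonSerializer<Instant> {
    @Override
    public void serialize(Instant value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
        gen.writeNumber(value.toEpochMilli()); // e.g. 1602791214674
    }
}

public class InstantMillisDeserializer extends JsonDeserializer<Instant> {
    @Override
    public Instant deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        return Instant.ofEpochMilli(p.getLongValue());
    }
}
On the field (field name is illustrative):
@JsonSerialize(using = InstantMillisSerializer.class)
@JsonDeserialize(using = InstantMillisDeserializer.class)
private Instant lastUpdated;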

Json .net settings equivalent to JavaScriptSerializer default for dates

I am using the jQuery Gantt plugin and it needs dates formatted in the Unix epoch format. Using Newtonsoft's Json.NET with these settings
JsonSerializerSettings microsoftDateFormatSettings = new JsonSerializerSettings
{
DateFormatHandling = DateFormatHandling.MicrosoftDateFormat
};
return JsonConvert.SerializeObject(headers, microsoftDateFormatSettings);
I get json that looks like the following
[{"desc":"STAT","name":"Status","values":[{"to":"/Date(1357483427000-0500)/","from":"/Date(1354891427000-0500)/","desc":"","label":"Implement","customClass":"ganttBlue","dataObj":{"id":35,"projectId":18705,"updatedById":437996,"updatedByName":"Linda","updated":"/Date(1354891427000-0500)/","statusId":160,"statusDescription":"","status":"Implement"}}]},{"desc":"ASGNTO","name":"Assigned To","values":[{"to":"/Date(1357762454000-0500)/","from":"/Date(1355170454000-0500)/","desc":"Suzy","label":"Suzy","customClass":"ganttRed","dataObj":{"id":55,"projectId":18705,"updatedById":719816,"updatedByName":"Joe","updated":"/Date(1355170454000-0500)/","assignedToId":561260,"assignedToName":"Suzy"}}]}]
The Gantt plugin does not like the dates with the -0500 offset. It wants this, which is generated by the JavaScriptSerializer:
"[{\"desc\":\"STAT\",\"name\":\"Status\",\"values\":[{\"to\":\"\/Date(1357483427000)\/\",\"from\":\"\/Date(1354891427000)\/\",\"description\":\"\",\"label\":\"Implement\",\"customClass\":\"ganttBlue\",\"dataObj\":{\"Id\":35,\"ProjectId\":18705,\"UpdatedById\":437996,\"UpdatedByName\":\"Linda\",\"Updated\":\"\/Date(1354891427000)\/\",\"StatusId\":160,\"StatusDescription\":\"\",\"Status\":\"Implement\"}}]},{\"desc\":\"ASGNTO\",\"name\":\"Assigned To\",\"values\":[{\"to\":\"\/Date(1357762454000)\/\",\"from\":\"\/Date(1355170454000)\/\",\"description\":\"Suzy\",\"label\":\"Suzy\",\"customClass\":\"ganttRed\",\"dataObj\":{\"Id\":55,\"ProjectId\":18705,\"UpdatedById\":719816,\"UpdatedByName\":\"Joe\",\"Updated\":\"\/Date(1355170454000)\/\",\"AssignedToId\":561260,\"AssignedToName\":\"Suzy\"}}]}]"
What would be the proper setting for the Json.Net converter? I want to use Json.net when we move to .net 4.5.
To make it produce dates like the ones generated by JavaScriptSerializer, you have to give it two settings:
JsonSerializerSettings serializerSettings = new JsonSerializerSettings()
{
DateFormatHandling = DateFormatHandling.MicrosoftDateFormat,
DateTimeZoneHandling = DateTimeZoneHandling.Utc
};
Using any other type of DateTimeZoneHandling will cause the timezone offset to be put in. (Seems like a bug that Unspecified still puts the offset in.)
However, if you are using local time throughout the system, doing this will shift the dates by your timezone offset when serializing them. Your dates will be off.
The easiest fix for me was to use the default ISO date, set DateTimeZoneHandling to Local, and change the client to parse the ISO date. Otherwise you would need to adjust the dates before serializing or play with your own custom serializer. Neither of those last two seemed worth it to me.

Jackson custom serialization under Spring 3 MVC

I have a couple of POJOs which looks like this:
class Items {
List<Item> items;
public List<Item> getItems() {
return items;
}
...
}
class Item {
String name;
Date insertionDate;
...
}
I want to be able to serialize the Date field in Item using a custom format (add a prefix to the date, something like "Date:xxx"), but I don't want to do that always (as it's used by other consumers which don't require this prefix), only in specific cases.
If I annotate Item's getInsertionDate() with @JsonSerialize(using = CustomDateSerializer.class) I can probably make this work; however, I don't want to do that since I don't always want to serialize this field using this method, only in a specific case.
So ideally, I would do this in my controller which does want to customize the serialization:
@JsonSerialize(using = CustomDateSerializer.class)
public List<Item> getItems() {
....
}
where CustomDateSerializer extends SerializerBase<Date>, and Jackson would figure out that it should serialize each item in the List using the default serializer and switch to my custom serializer when it hits a Date object. Of course this does not work, since that's not how @JsonSerialize is used, but is there a way to make this work other than wrapping Item and using that wrapper when I want the custom serialization? Am I thinking about this the wrong way, or is there another way to do this?
Note that I'm using Spring MVC so I'm not calling the serialization directly.
Any help would be much appreciated :)
The problem is that Jackson does not see the annotations on getItems() if it is a service endpoint method; it is typically only passed the type List<Item> that Spring determines. With JAX-RS (like Jersey), annotations associated with that method are passed along, however (and perhaps Spring has some way as well), although it then requires a bit more support from the integration code (for JAX-RS, the Jackson JAX-RS JSON provider module) to pass that along.
It might be easier to actually create a separate POJO (and not pass List types) so that you can add the necessary annotations.
If you were using Jackson directly, you could also use ObjectWriter and specify the default date format to use. But I don't know if Spring allows you to do that (most frameworks do not, and only expose configurability of the ObjectMapper).
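For reference, when you do call Jackson yourself, the ObjectWriter approach looks roughly like this (a sketch, not from the original answer; "items" stands for the List<Item> from the question, and the pattern mirrors the "Date:xxx" prefix asked about):
ObjectMapper mapper = new ObjectMapper();
// apply a custom date format only for this particular write, leaving the shared mapper untouched
ObjectWriter writer = mapper.writer(new SimpleDateFormat("'Date:'yyyy-MM-dd"));
String json = writer.writeValueAsString(items);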
One more note -- instead of a custom serializer (and/or deserializer), you can also use simple annotations with Dates (on Jackson 2.x):
public class DateStuff {
@JsonFormat(shape=JsonFormat.Shape.STRING, pattern="'Date:'yyyy'-'MM'-'dd")
public Date date;
}
to specify a per-property format override.

Mapping Java byte[] to MySQL binary(64) in Hibernate

I'm having some trouble mapping a byte array to a MySQL database in Hibernate and was wondering if I'm missing something obvious. My class looks roughly like this:
public class Foo {
private byte[] bar;
// Getter and setter for 'bar'
}
The table is defined like this in MySQL 5.5:
CREATE TABLE foo (
bar BINARY(64) NOT NULL)
And the Hibernate 3.6.2 mapping looks similar to this:
<hibernate-mapping>
<class name="example.Foo" table="foo">
<property name="bar" column="bar" type="binary" />
</class>
</hibernate-mapping>
I am using hbm2ddl for validation only and it gives me this error when I deploy the application:
Wrong column type in foo for column bar. Found: binary, expected: tinyblob
If using type="binary" in the mapping wouldn't cause Hibernate to expect the column's type to be binary (instead of tinyblob,) I don't know what would. I spent some time Googling this but couldn't find the exact error. The solutions for similar errors were to...
Specify "length" on the <property>. That changes what type Hibernate expects but it's always some variety of blob instead of the "binary" type it's finding.
Instead of declaring a "type" on the property element, nest a column element and give it a sql-type attribute. That work but that would also make the binding specific to MySQL so I would like to avoid it if possible.
Does anything stand out about this setup that would cause this mismatch? If I specify type="binary" instead of "blob", why is Hibernate expecting a blob instead of a binary?
I believe the problem is type="binary".
That type is a generic Hibernate type. It does not map directly to DB-engine-specific types; it is translated to a concrete SQL type based on the dialect/driver you are using. Apparently the MySQL dialect maps the Hibernate type "binary" to tinyblob.
The full list of hibernate types is available here.
You have two options. You can change your CREATE TABLE script to store that column with a tinyblob data type. Then your Hibernate validation would not fail and your application would work. This would be the suggested solution.
The second option should be used only if you HAVE to use the BINARY data type in the DB. What you can do is specify a sql-type in the Hibernate mapping so that you force Hibernate to use the type you want. The mapping would look like this:
<property name="bar">
<column name="bar" sql-type="binary" />
</property>
The main downside to this is that you lose DB-engine independence, which is why most people use Hibernate in the first place. This code will only work on DB engines which have the BINARY data type.
What we ended up doing to solve a problem similar to this is write our own custom UserType.
UserTypes are relatively easy to implement. Just create a class that implements org.hibernate.usertype.UserType and implement the required methods.
In your Hibernate mappings, using a user type is pretty easy:
<property name="data" type="com.yourpackage.hibernate.CustomBinaryStreamUserType" column="binary_data" />
Simply put, this class will be used for reading and writing the data from the database; specifically, the methods nullSafeGet and nullSafeSet are called.
In our case, we used this to gzip-compress binary data before writing it to the database and uncompress it as it's read out. This hides the fact that the data is compressed from the application using this data.
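A hypothetical, minimal sketch of such a UserType against the Hibernate 3.6 interface, mapping a byte[] straight to a BINARY column (the class name is illustrative; the gzip logic mentioned above would go into nullSafeGet/nullSafeSet):
import java.io.Serializable;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Types;
import java.util.Arrays;
import org.hibernate.usertype.UserType;

public class BinaryUserType implements UserType {
    public int[] sqlTypes() { return new int[] { Types.BINARY }; }
    public Class returnedClass() { return byte[].class; }
    public boolean equals(Object x, Object y) { return Arrays.equals((byte[]) x, (byte[]) y); }
    public int hashCode(Object x) { return Arrays.hashCode((byte[]) x); }
    public Object nullSafeGet(ResultSet rs, String[] names, Object owner) throws SQLException {
        return rs.getBytes(names[0]); // decompress here if the column holds gzipped data
    }
    public void nullSafeSet(PreparedStatement st, Object value, int index) throws SQLException {
        if (value == null) {
            st.setNull(index, Types.BINARY);
        } else {
            st.setBytes(index, (byte[]) value); // compress here before writing, if desired
        }
    }
    public Object deepCopy(Object value) { return value == null ? null : ((byte[]) value).clone(); }
    public boolean isMutable() { return true; }
    public Serializable disassemble(Object value) { return (Serializable) deepCopy(value); }
    public Object assemble(Serializable cached, Object owner) { return deepCopy(cached); }
    public Object replace(Object original, Object target, Object owner) { return deepCopy(original); }
}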
I think there is an easy solution for mapping binary columns in Hibernate.
BINARY(16) columns can easily be mapped to java.util.UUID in Hibernate entity classes.
For example, the column definition will look like:
`tokenValue` BINARY(16) NOT NULL
The Hibernate entity will have the code below to support the BINARY column:
private UUID tokenValue;
@Column(columnDefinition = "BINARY(16)", length = 16)
public UUID getTokenValue() {
return this.tokenValue;
}
public void setTokenValue(UUID sessionTokenValue) {
this.tokenValue = sessionTokenValue;
}