Using JSON with Play 2

I'm trying to create a simple application that allows me to create, read, update and delete various users. I have a basic UI-based view, controller and model that work, but wanted to be more advanced than this and provide a RESTful json interface.
However, despite reading everything I can find in the Play 2 documentation, the Play 2 Google groups and the stackoverflow website, I still can't get this to work.
I've updated my controller based on previous feedback and I now believe it is based on the documentation.
Here is my updated controller:
package controllers;
import models.Member;
import play.*;
import play.mvc.*;
import play.libs.Json;
import play.data.Form;
import org.codehaus.jackson.JsonNode;
import org.codehaus.jackson.node.ObjectNode;
public class Api extends Controller {
/* Return member info - version to serve Json response */
public static Result member(Long id){
ObjectNode result = Json.newObject();
Member member = Member.byid(id);
result.put("id", member.id);
result.put("email", member.email);
result.put("name", member.name);
return ok(result);
}
// Create a new body parser of class Json based on the values sent in the POST
@BodyParser.Of(Json.class)
public static Result createMember() {
JsonNode json = request().body().asJson();
// Check that we have a valid email address (that's all we need!)
String email = json.findPath("email").getTextValue();
if(email == null) {
return badRequest("Missing parameter [email]");
} else {
// Use the model's createMember class now
Member.createMember(json);
return ok("Hello " + name);
}
}
....
But when I run this, I get the following error:
incompatible types [found: java.lang.Class<play.libs.Json>] [required: java.lang.Class<?extends play.mvc.BodyParser>]
In /Users/Mark/Development/EclipseWorkspace/ms-loyally/loyally/app/controllers/Api.java at line 42.
41 // Create a new body parser of class Json based on the values sent in the POST
42 @BodyParser.Of(Json.class)
43 public static Result createMember() {
44 JsonNode json = request().body().asJson();
45 // Check that we have a valid email address (that's all we need!)
46 String email = json.findPath("email").getTextValue();
As far as I can tell, I've copied from the documentation so I would appreciate any help in getting this working.

There appear to be conflicts in the use of the Json class in the Play 2 documentation. To get the example above working correctly, the following imports are used:
import play.mvc.Controller;
import play.mvc.Result;
import play.mvc.BodyParser;
import play.libs.Json;
import play.libs.Json.*;
import static play.libs.Json.toJson;
import org.codehaus.jackson.JsonNode;
import org.codehaus.jackson.node.ObjectNode;
@BodyParser.Of(play.mvc.BodyParser.Json.class)
public static Result sayHello() {
JsonNode json = request().body().asJson();
ObjectNode result = Json.newObject();
String name = json.findPath("name").getTextValue();
if(name == null) {
result.put("status", "KO");
result.put("message", "Missing parameter [name]");
return badRequest(result);
} else {
result.put("status", "OK");
result.put("message", "Hello " + name);
return ok(result);
}
}
Note the explicit, fully qualified reference to the right Json class in @BodyParser.Of.
I'm not sure whether this is a bug, but it is the only way I could get the example to work.

Import these two:
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
According to this documentation: http://fasterxml.github.io/jackson-databind/javadoc/2.0.0/com/fasterxml/jackson/databind/node/ObjectNode.html
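Note that these are the Jackson 2.x classes bundled with newer Play versions; with them some accessor names change, e.g. getTextValue() from the old org.codehaus API becomes textValue() (or asText()). A minimal standalone sketch, just to illustrate the renamed accessors (the payload and field names are made up):
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

public class JacksonSketch {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // parse an incoming payload (the field name is just an example)
        JsonNode parsed = mapper.readTree("{\"name\":\"Mark\"}");
        String name = parsed.findPath("name").textValue(); // was getTextValue() in Jackson 1.x
        // build a reply
        ObjectNode reply = mapper.createObjectNode();
        reply.put("message", "Hello " + name);
        System.out.println(reply);
    }
}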

Try this:
import play.*;
import play.mvc.*;
import org.codehaus.jackson.JsonNode; //Fixing "error: cannot find symbol" for JsonNode
// Testing JSON
@BodyParser.Of(BodyParser.Json.class) //Or you can import play.mvc.BodyParser.Json
public static Result sayHello() {
JsonNode json = request().body().asJson();
String name = json.findPath("name").getTextValue();
if(name==null) {
return badRequest("Missing parameter [name]");
} else {
return ok("Hello " + name);
}
}

AFAIK, the code you are using has not reached any official Play version (neither 2.0 nor 2.0.1) according to this: https://github.com/playframework/Play20/pull/212
Instead, you can do this (not tested):
if(request().getHeader(play.mvc.Http.HeaderNames.ACCEPT).equalsIgnoreCase("application/json")) {
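Fleshed out into a complete (still untested) action, that check might look something like the sketch below; the error messages and the fallback responses are illustrative, not from the original answer:
package controllers;

import org.codehaus.jackson.JsonNode;
import play.mvc.Controller;
import play.mvc.Http;
import play.mvc.Result;

public class Api extends Controller {
    public static Result createMember() {
        String accept = request().getHeader(Http.HeaderNames.ACCEPT);
        if (accept == null || !accept.equalsIgnoreCase("application/json")) {
            return badRequest("Expecting Accept: application/json");
        }
        JsonNode json = request().body().asJson();
        if (json == null) {
            return badRequest("Expecting a JSON body");
        }
        return ok("Received: " + json.toString());
    }
}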

Did you try checking out the documentation for it?
Serving a JSON response looks like:
@BodyParser.Of(Json.class)
public static Result sayHello() {
JsonNode json = request().body().asJson();
ObjectNode result = Json.newObject();
String name = json.findPath("name").getTextValue();
if(name == null) {
result.put("status", "KO");
result.put("message", "Missing parameter [name]");
return badRequest(result);
} else {
result.put("status", "OK");
result.put("message", "Hello " + name);
return ok(result);
}
}

You have imported play.libs.Json and then used that Json.class in the BodyParser.Of annotation.
The annotation expects a class that extends play.mvc.BodyParser, so simply replace @BodyParser.Of(Json.class) with @BodyParser.Of(BodyParser.Json.class).
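Applied to the controller from the question, the fixed action would look roughly like this (Member.createMember is the question's own model method, shown here only for completeness):
package controllers;

import models.Member;
import org.codehaus.jackson.JsonNode;
import play.mvc.BodyParser;
import play.mvc.Controller;
import play.mvc.Result;

public class Api extends Controller {
    // play.mvc.BodyParser.Json, not play.libs.Json
    @BodyParser.Of(BodyParser.Json.class)
    public static Result createMember() {
        JsonNode json = request().body().asJson();
        String email = json.findPath("email").getTextValue();
        if (email == null) {
            return badRequest("Missing parameter [email]");
        }
        Member.createMember(json);
        return ok("Hello " + email);
    }
}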

Related

Camel Bindy Streaming Payload and Writing to File

I have a route which is supposed to read a huge XML file and then write a CSV file with a header. Each XML record needs to be transformed first, so I map it to a Java POJO and then marshal it again to write it into a CSV file.
I can't load all of the records in memory as the file contains more than 200k records.
Issue: I am only seeing the last record in the CSV file. I'm not sure why it's not appending the data to the existing file.
Any idea how to make this work? The header is required in the CSV. I am not seeing any other option to transform the stream directly and write headers to the CSV without unmarshalling it to a POJO first. I tried BeanIO as well, which requires me to add a header record, and I'm not sure how that can be injected into a stream.
from("{{xml.files.route}}")
.split(body().tokenizeXML("EMPLOYEE", null))
.streaming()
.unmarshal().jacksonXml(Employee.class)
.marshal(bindyDataFormat)
.to("file://C:/Files/Test/emp/csv/?fileName=test.csv")
.end();
If I try to append to the existing file, the CSV file gets headers appended on each iteration of records.
.to("file://C:/Files/Test/emp/csv/?fileName=test.csv&fileExist=append")
Your problem here is related to camel-bindy and not the file-component. It kinda expects you to marshal collections of objects instead of individual objects, hence if you marshal each object individually and have @CsvRecord(generateHeaderColumns = true) on your Employee class, then you'll get headers every time you marshal an individual Employee object.
You could set generateHeaderColumns to false and start the file with the header string manually. One way to obtain headers for a Bindy-annotated class is to get the fields annotated with DataField using org.apache.commons.lang3.reflect.FieldUtils from apache-commons and construct the header string based on position, columnName and fieldName.
I usually prefer camel-stream over file-component when I need to stream something to a file but using file-component with appends probably works just as well.
Example:
package com.example;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;
import org.apache.camel.RoutesBuilder;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.dataformat.bindy.annotation.DataField;
import org.apache.camel.dataformat.bindy.csv.BindyCsvDataFormat;
import org.apache.camel.test.junit4.CamelTestSupport;
import org.apache.commons.lang3.reflect.FieldUtils;
import org.junit.Test;
public class ExampleTest extends CamelTestSupport {
@Test
public void testStreamEmployeesToCsvFile(){
List<Employee> body = new ArrayList<>();
body.add(new Employee("John", "Doe", 1965));
body.add(new Employee("Mary", "Sue", 1987));
body.add(new Employee("Gary", "Sue", 1991));
template.sendBody("direct:streamEmployeesToCSV", body);
}
@Override
protected RoutesBuilder createRouteBuilder() throws Exception {
return new RouteBuilder(){
@Override
public void configure() throws Exception {
BindyCsvDataFormat csvDataFormat = new BindyCsvDataFormat(Employee.class);
System.out.println(getCSVHeadersForClass(Employee.class, ","));
from("direct:streamEmployeesToCSV")
.setProperty("Employees", body())
// a bit hacky due to camel writing first entry and headers
// on the same line for some reason with (camel 2.25.2)
.setBody().constant("")
.to("file:target/testoutput?fileName=test.csv&fileExist=Override")
.setBody().constant(getCSVHeadersForClass(Employee.class, ","))
.to("stream:file?fileName=./target/testoutput/test.csv")
.split(exchangeProperty("Employees"))
.marshal(csvDataFormat)
.to("stream:file?fileName=./target/testoutput/test.csv")
.end()
.log("Done");
}
private String getCSVHeadersForClass(Class clazz, String separator ) {
Field[] fieldsArray = FieldUtils.getFieldsWithAnnotation(clazz, DataField.class);
List<Field> fields = new ArrayList<>(Arrays.asList(fieldsArray));
fields.sort(new Comparator<Field>(){
@Override
public int compare(Field lhsField, Field rhsField) {
DataField lhs = lhsField.getAnnotation(DataField.class);
DataField rhs = rhsField.getAnnotation(DataField.class);
return lhs.pos() < rhs.pos() ? -1 : (lhs.pos() > rhs.pos()) ? 1 : 0;
}
});
String[] fieldHeaders = new String[fields.size()];
for (int i = 0; i < fields.size(); i++) {
DataField dataField = fields.get(i).getAnnotation(DataField.class);
if(dataField.columnName().equals(""))
fieldHeaders[i] = fields.get(i).getName();
else
fieldHeaders[i] = dataField.columnName();
}
String csvHeaders = "";
for (int i = 0; i < fieldHeaders.length; i++) {
csvHeaders += fieldHeaders[i];
csvHeaders += i < fieldHeaders.length - 1 ? separator : "";
}
return csvHeaders;
}
};
}
}
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-lang3</artifactId>
<version>${apache-commons.version}</version>
</dependency>

Non-blocking parsing of a JSON String to a JsonNode

I'm exploring reactive programming with Spring Webflux and therefore, I'm trying to make my code completely nonblocking to get all the benefits of a reactive application.
Currently my code for the method to parse a Json String to a JsonNode to get specific values (in this case the elementId) looks like this:
public Mono<String> readElementIdFromJsonString(String jsonString){
final JsonNode jsonNode;
try {
jsonNode = MAPPER.readTree(jsonString);
} catch (IOException e) {
return Mono.error(e);
}
final String elementId = jsonNode.get("elementId").asText();
return Mono.just(elementId);
}
However, IntelliJ notifies me that I'm using an inappropriate blocking method call with this code:
MAPPER.readTree(jsonString);
How can I implement this code in a nonblocking way? I have seen that since Jackson 2.9+, it is possible to parse a Json String in a nonblocking async way, but I don't know how to use that API and I couldn't find an example how to do it correctly.
I am not sure why it says this is a blocking call, since Jackson is non-blocking as far as I know. Anyway, one way to resolve this issue, if you do not want to use any other library, is to use schedulers. Like this:
public Mono<String> readElementIdFromJsonString(String input) {
    // fromCallable defers the (throwing) parse so subscribeOn can move it off the event loop
    return Mono.fromCallable(() -> MAPPER.readTree(input))
        .map(node -> node.get("elementId").asText())
        .subscribeOn(Schedulers.boundedElastic());
}
Something along those lines.
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;
import java.nio.charset.StandardCharsets;
import java.util.Map;
import org.springframework.core.ResolvableType;
import org.springframework.core.env.Environment; // assuming Spring's Environment for the constructor parameter
import org.springframework.core.io.buffer.DataBufferUtils;
import org.springframework.core.io.buffer.DefaultDataBuffer;
import org.springframework.core.io.buffer.DefaultDataBufferFactory;
import org.springframework.http.codec.json.AbstractJackson2Decoder;
import org.springframework.util.MimeType;
import org.springframework.util.MimeTypeUtils;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
@FunctionalInterface
public interface MessageParser<T> {
Mono<T> parse(String message);
}
public class JsonNodeParser extends AbstractJackson2Decoder implements MessageParser<JsonNode> {
private static final MimeType MIME_TYPE = MimeTypeUtils.APPLICATION_JSON;
private static final ObjectMapper OBJECT_MAPPER = allocateDefaultObjectMapper();
private final DefaultDataBufferFactory factory;
private final ResolvableType resolvableType;
public JsonNodeParser(final Environment env) {
super(OBJECT_MAPPER, MIME_TYPE);
this.factory = new DefaultDataBufferFactory();
this.resolvableType = ResolvableType.forClass(JsonNode.class);
this.setMaxInMemorySize(100000); // ~100 KB
canDecodeJsonNode();
}
@Override
public Mono<JsonNode> parse(final String message) {
final byte[] bytes = message.getBytes(StandardCharsets.UTF_8);
return decode(bytes);
}
private Mono<JsonNode> decode(final byte[] bytes) {
final DefaultDataBuffer defaultDataBuffer = this.factory.wrap(bytes);
return this.decodeToMono(Mono.just(defaultDataBuffer), this.resolvableType, MIME_TYPE, Map.of())
.ofType(JsonNode.class)
.subscribeOn(Schedulers.boundedElastic())
.doFinally((t) -> DataBufferUtils.release(defaultDataBuffer));
}
private void canDecodeJsonNode() {
if (!canDecode(this.resolvableType, MIME_TYPE)) {
throw new IllegalStateException(String.format("JsonNodeParser doesn't support the given target " +
"element type [%s] and the MIME type [%s]", this.resolvableType, MIME_TYPE));
}
}
}
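For reference, the non-blocking parser the question mentions (Jackson 2.9+) is driven by feeding it bytes and pulling tokens until it reports NOT_AVAILABLE or runs out. A minimal token-level sketch, assuming the single-field payload from the question; assembling a full JsonNode from the tokens would need extra buffering logic:
import java.io.IOException;
import java.nio.charset.StandardCharsets;

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.core.async.ByteArrayFeeder;

public class NonBlockingParseSketch {
    public static String readElementId(String json) throws IOException {
        JsonParser parser = new JsonFactory().createNonBlockingByteArrayParser();
        ByteArrayFeeder feeder = (ByteArrayFeeder) parser.getNonBlockingInputFeeder();
        byte[] bytes = json.getBytes(StandardCharsets.UTF_8);
        feeder.feedInput(bytes, 0, bytes.length); // could also be fed chunk by chunk
        feeder.endOfInput();
        JsonToken token;
        while ((token = parser.nextToken()) != null && token != JsonToken.NOT_AVAILABLE) {
            if (token == JsonToken.FIELD_NAME && "elementId".equals(parser.getCurrentName())) {
                parser.nextToken();
                return parser.getText();
            }
        }
        return null;
    }
}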

Issue in Date handling for JSON Object

I am facing an issue in Drools. I want to pass a date as a date type, but currently we don't have any method in JSONObject to handle dates. My JSONObject looks like this:
{"id":600,"city":"Gotham","age":25,"startDate":"29-DEC-2017","endDate":"2014-08-31"}
My Drools rule looks like this:
package com.rules
import org.drools.core.spi.KnowledgeHelper;
import org.json.JSONObject;
rule "ComplexRule1"
salience 100
dialect "mvel"
date-effective "16-Jan-2018 00:00"
no-loop
when
$cdr : JSONObject( $cdr.optString("startDate") >= '28-Dec-2017')
then
$cdr.put("Action_1" , new JSONObject().put("actionName","Complex_Rule1_Action1").put("actionTypeName","SEND OFFER").put("channelName","SMS").put("messageTemplateName","SMSTemplate").put("#timestamp",(new java.text.SimpleDateFormat("yyyy/MM/dd HH:mm:ss")).format(new java.util.Date())).put("ruleFileName","ComplexRule1.drl").put("ruleName","ComplexRule1"));
end
I am currently using .optString because JSONObject has methods like optString/optInt/optBoolean but nothing equivalent for dates. So how can I handle dates in Drools?
Any help will be appreciated.
Regards, Puneet
My new DRL looks like this:
package com.rules
import com.aravind.drools.SuperJSONObject;
import org.drools.core.spi.KnowledgeHelper;
import org.json.JSONObject;
rule "Convert to SuperJSONObject"
when
$cdr: JSONObject()
then
insert(new SuperJSONObject($cdr));
end
rule "ComplexRule1"
salience 100
dialect "mvel"
date-effective "16-Jan-2018 00:00"
no-loop
when
$cdr : SuperJSONObject( $cdr.getAsDate("startDate") == '28-Dec-2017')
then
$cdr.getObject().put("Action_1" , new JSONObject().put("actionName","Complex_Rule1_Action1").put("actionTypeName","SEND OFFER").put("channelName","SMS").put("messageTemplateName","SMSTemplate").put("#timestamp",(new java.text.SimpleDateFormat("yyyy/MM/dd HH:mm:ss")).format(new java.util.Date())).put("ruleFileName","ComplexRule1.drl").put("ruleName","ComplexRule1"));
end
The class looks like this:
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import org.json.*;
public class SuperJSONObject {
public final JSONObject obj;
SimpleDateFormat sdfmt2= new SimpleDateFormat("yyyy/MM/dd");
public SuperJSONObject(JSONObject obj){
this.obj = obj;
}
public Date getAsDate(String field) throws ParseException{
return sdfmt2.parse(this.obj.optString(field));
}
public JSONObject getObject(){
return this.obj;
}
}
The other class looks like this:
import java.io.File
import java.io.FileReader
import org.drools.KnowledgeBase
import org.drools.KnowledgeBaseFactory
import org.drools.builder.KnowledgeBuilder
import org.drools.builder.KnowledgeBuilderFactory
import org.drools.builder.ResourceType
import org.drools.io.ResourceFactory
import org.drools.runtime.StatefulKnowledgeSession
import org.json.JSONObject
object RunStandAloneDrools {
def main(args: Array[String]): Unit = {
var jsonObjectArray: Array[JSONObject] = new Array(1)
jsonObjectArray(0) = new JSONObject("{\"id\":600,\"city\":\"Gotham\",\"age\":25,\"startDate\":\"28-Dec-2017\",\"endDate\":\"2014-08-01\"}")
var file: String = "/home/puneet/Downloads/ComplexRule1.drl"
var kbuilder: KnowledgeBuilder = KnowledgeBuilderFactory.newKnowledgeBuilder()
kbuilder.add(ResourceFactory.newReaderResource(new FileReader(new File(file))), ResourceType.DRL)
println("Errors? " + kbuilder.getErrors.size())
var iter = kbuilder.getErrors.iterator()
while(iter.hasNext()){
println(iter.next().getMessage)
}
var kbase: KnowledgeBase = KnowledgeBaseFactory.newKnowledgeBase()
kbase.addKnowledgePackages(kbuilder.getKnowledgePackages)
var session: StatefulKnowledgeSession = kbase.newStatefulKnowledgeSession()
callRulesEngine(jsonObjectArray,session)
println("Done")
}
def callRulesEngine(data: Array[JSONObject], knowledgeSession: StatefulKnowledgeSession): Unit = {
data.map ( x => callRulesEngine(x,knowledgeSession) )
}
def callRulesEngine(data: JSONObject, knowledgeSession: StatefulKnowledgeSession): Unit = {
try {
println("Input data " + data.toString())
knowledgeSession.insert(data)
knowledgeSession.fireAllRules()
println("Facts details " + knowledgeSession.getFactCount)
println("Enriched data " + data.toString())
} catch {
case (e: Exception) => println("Exception", e);
}
}
}
The output is not coming out as expected.
There are multiple ways to deal with this, but the fundamental thing is for you to understand that this is NOT a Drools issue at all. Your question is really about how to get a Date from a JSONObject.
One way this could be achieved is by using a function in Drools to make the conversion.
But I don't like functions, so I'll give you another, more elaborate way to deal with this situation (and many others where a type conversion is required).
The idea is to create a wrapper class for your JSONObject (a SuperJSONObject) that will expose all the functionality you need. For the implementation of this class I will use composition, but you can use inheritance (or a proxy) if you want.
public class SuperJSONObject {
public final JSONObject obj;
public SuperJSONObject(JSONObject obj){
this.obj = obj;
}
//expose all the methods from JSONObject you want/need
public Date getAsDate(String field){
return someDateParser.parse(this.obj.optString(field));
}
public JSONObject getObject(){
return this.obj;
}
}
So now we have a getAsDate() method that we can use in our rules. But we first need to convert a JSONObject into a SuperJSONObject before we can even use that method. You can do this in multiple ways and places. I'll be showing how to do it in DRL.
rule "Convert to SuperJSONObject"
when
$jo: JSONObject() //you may want to filter which objects are converted by adding constraints to this pattern
then
insert(new SuperJSONObject($jo));
end
And now we are good to go. We can now write a rule using this new class as follows:
rule "ComplexRule1"
salience 100
dialect "mvel"
date-effective "16-Jan-2018 00:00"
no-loop
when
$cdr : SuperJSONObject( getAsDate("startDate") >= '28-Dec-2017')
then
$cdr.getObject().put("Action_1" , ...);
end
After I have written all this code, I might reconsider the option of a simple function in DRL... :P
Hope it helps,

Camel bindy marshal to file creates multiple header row

I have the following camel route:
from(inputDirectory)
.unmarshal(jaxb)
.process(jaxb2CSVDataProcessor)
.split(body()) //because there is a list of CSVRecords
.marshal(bindyCsvDataFormat)
.to(outputDirectory); //appending to existing file using "?autoCreate=true&fileExist=Append"
for my CSV model class I am using annotations:
@CsvRecord(separator = ",", generateHeaderColumns = true)
...
and for properties
@DataField(pos = 0)
...
My problem is that the headers are appended every time a new csv record is appended.
Is there a non-dirty way to control this? Am I missing anything here?
I made a workaround which is working quite nicely, creating the header by querying the column names of the @DataField annotation. This happens once, the first time the file is written. I wrote down the whole solution here:
How to generate a Flat file with header and footer using Camel Bindy
I ended up adding a processor, just before the "to" clause, that checks whether the CSV file exists. In there I manipulate the byte array and remove the headers.
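A rough sketch of that idea; the class name and the way the header line is located are illustrative, not the original code:
import java.io.File;
import java.nio.charset.StandardCharsets;

import org.apache.camel.Exchange;
import org.apache.camel.Processor;

// Strips the first (header) line from the marshalled CSV body when the target file already exists,
// so that appending does not repeat the header.
public class StripHeaderIfFileExists implements Processor {
    private final String targetFile;

    public StripHeaderIfFileExists(String targetFile) {
        this.targetFile = targetFile;
    }

    @Override
    public void process(Exchange exchange) throws Exception {
        if (new File(targetFile).exists()) {
            String csv = exchange.getIn().getBody(String.class);
            int firstNewline = csv.indexOf('\n');
            if (firstNewline >= 0) {
                exchange.getIn().setBody(csv.substring(firstNewline + 1).getBytes(StandardCharsets.UTF_8));
            }
        }
    }
}
It would be registered with .process(new StripHeaderIfFileExists(...)) right before the .to(...) file endpoint that uses fileExist=Append.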
Hope this helps anyone else. I needed to do something similar where, after my first split message, I wanted to suppress the header output. Here is a complete class (the 'FieldUtils' is part of the apache commons lib):
package com.routes;
import java.io.OutputStream;
import org.apache.camel.Exchange;
import org.apache.camel.dataformat.bindy.BindyAbstractFactory;
import org.apache.camel.dataformat.bindy.BindyCsvFactory;
import org.apache.camel.dataformat.bindy.BindyFactory;
import org.apache.camel.dataformat.bindy.FormatFactory;
import org.apache.camel.dataformat.bindy.csv.BindyCsvDataFormat;
import org.apache.commons.lang3.reflect.FieldUtils;
public class StreamingBindyCsvDataFormat extends BindyCsvDataFormat {
public StreamingBindyCsvDataFormat(Class<?> type) {
super(type);
}
@Override
public void marshal(Exchange exchange, Object body, OutputStream outputStream) throws Exception {
final StreamingBindyModelFactory factory = (StreamingBindyModelFactory) super.getFactory();
final int splitIndex = exchange.getProperty(Exchange.SPLIT_INDEX, -1, int.class);
final boolean splitComplete = exchange.getProperty(Exchange.SPLIT_COMPLETE, false, boolean.class);
super.marshal(exchange, body, outputStream);
if (splitIndex == 0) {
factory.setGenerateHeaderColumnNames(false); // turn off header generate after first exchange
} else if(splitComplete) {
factory.setGenerateHeaderColumnNames(true); // turn on header generate when split complete
}
}
@Override
protected BindyAbstractFactory createModelFactory(FormatFactory formatFactory) throws Exception {
BindyCsvFactory bindyCsvFactory = new StreamingBindyModelFactory(getClassType());
bindyCsvFactory.setFormatFactory(formatFactory);
return bindyCsvFactory;
}
public class StreamingBindyModelFactory extends BindyCsvFactory implements BindyFactory {
public StreamingBindyModelFactory(Class<?> type) throws Exception {
super(type);
}
public void setGenerateHeaderColumnNames(boolean generateHeaderColumnNames) throws IllegalAccessException {
FieldUtils.writeField(this, "generateHeaderColumnNames", generateHeaderColumnNames, true);
}
}
}

Deserializing from JSON back to joda DateTime in Play 2.0

I can't figure out the magic words to allow posting JSON for a DateTime field in my app. When queried, DateTimes are returned as milliseconds since the epoch. When I try to post in that format though ({"started":"1341006642000","task":{"id":1}}), I get "Invalid value: started".
I also tried adding @play.data.format.Formats.DateTime(pattern="yyyy-MM-dd HH:mm:ss") to the started field and posting {"started":"2012-07-02 09:24:45","task":{"id":1}}, which had the same result.
The controller method is:
@BodyParser.Of(play.mvc.BodyParser.Json.class)
public static Result create(Long task_id) {
Form<Run> runForm = form(Run.class).bindFromRequest();
for (String key : runForm.data().keySet()) {
System.err.println(key + " => " + runForm.apply(key).value() + "\n");
}
if (runForm.hasErrors())
return badRequest(runForm.errorsAsJson());
Run run = runForm.get();
run.task = Task.find.byId(task_id);
run.save();
ObjectNode result = Json.newObject();
result.put("id", run.id);
return ok(result);
}
I can also see from the output that the values are being received correctly. Anyone know how to make this work?
After reading the "Register a custom DataBinder" section of the Handling form submission page along with the Application global settings page and comparing with this question I came up with the following solution:
I created a custom annotation with an optional format attribute:
package models;
import java.lang.annotation.*;
@Target({ ElementType.FIELD })
@Retention(RetentionPolicy.RUNTIME)
@play.data.Form.Display(name = "format.joda.datetime", attributes = { "format" })
public @interface JodaDateTime {
String format() default "";
}
and registered a custom formatter from onStart:
import java.text.ParseException;
import java.util.Locale;
import org.joda.time.DateTime;
import org.joda.time.format.DateTimeFormat;
import play.*;
import play.data.format.Formatters;
public class Global extends GlobalSettings {
@Override
public void onStart(Application app) {
Formatters.register(DateTime.class, new Formatters.AnnotationFormatter<models.JodaDateTime,DateTime>() {
@Override
public DateTime parse(models.JodaDateTime annotation, String input, Locale locale) throws ParseException {
if (input == null || input.trim().isEmpty())
return null;
if (annotation.format().isEmpty())
return new DateTime(Long.parseLong(input));
else
return DateTimeFormat.forPattern(annotation.format()).withLocale(locale).parseDateTime(input);
}
@Override
public String print(models.JodaDateTime annotation, DateTime time, Locale locale) {
if (time == null)
return null;
if (annotation.format().isEmpty())
return time.getMillis() + "";
else
return time.toString(annotation.format(), locale);
}
});
}
}
You can specify a format if you want, or it will use milliseconds since the epoch by default. I was hoping there would be a simpler way since Joda is included with the Play distribution, but this got things working.
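For illustration, the annotation would then go on the model field; the Run class and field name follow the question's controller, but the exact model isn't shown in the original, so treat this as a hypothetical sketch:
package models;

import org.joda.time.DateTime;

public class Run {
    // falls back to milliseconds since the epoch when no format attribute is given
    @JodaDateTime(format = "yyyy-MM-dd HH:mm:ss")
    public DateTime started;
}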
Note: you'll need to restart your Play app as it doesn't seem to detect changes to the Global class.