ZonedDateTime Custom JSON Converter Grails 3.3.0 - json

I am in the process of converting a really old Grails app to the latest version (3.3.0). Things have been a bit frustrating, but I'm pretty close to migrating everything except the JSON and XML marshallers which were previously registered in my BootStrap init.
The previous marshaller registering looked like this:
// register JSON marshallers at startup in all environments
MarshallerUtils.registerMarshallers()
This was defined like this:
class MarshallerUtils {
    // Registers marshaller logic for various types that
    // aren't supported out of the box or that we want to customize.
    // These are used whenever the JSON or XML converters are called,
    // e.g. return model as JSON
    static registerMarshallers() {
        final dateTimeFormatter = ISODateTimeFormat.dateTimeNoMillis()
        final isoDateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ")

        // register marshalling logic for both XML and JSON converters
        [XML, JSON].each { converter ->
            // This overrides the marshaller from the joda time plugin to
            // force all DateTime instances to use the UTC time zone
            // and the ISO standard "yyyy-mm-ddThh:mm:ssZ" format
            converter.registerObjectMarshaller(DateTime, 10) { DateTime it ->
                return it == null ? null : it.toString(dateTimeFormatter.withZone(org.joda.time.DateTimeZone.UTC))
            }
            converter.registerObjectMarshaller(Date, 10) { Date it ->
                return it == null ? null : isoDateFormat.format(it)
            }
            converter.registerObjectMarshaller(TIMESTAMP, 10) { TIMESTAMP it ->
                return it == null ? null : isoDateFormat.format(it.dateValue())
            }
        }
    }
}
During the migration, I ended up converting all instances of org.joda.time.DateTime to java.time.ZonedDateTime:
class MarshallerUtils {
    // Registers marshaller logic for various types that
    // aren't supported out of the box or that we want to customize.
    // These are used whenever the JSON or XML converters are called,
    // e.g. return model as JSON
    static registerMarshallers() {
        final dateTimeFormatter = DateTimeFormatter.ISO_ZONED_DATE_TIME
        final isoDateFormat = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ")

        // register marshalling logic for both XML and JSON converters
        [XML, JSON].each { converter ->
            // This overrides the java.time marshaller to force all
            // ZonedDateTime instances to use the UTC time zone
            // and the ISO standard "yyyy-mm-ddThh:mm:ssZ" format
            converter.registerObjectMarshaller(ZonedDateTime, 10) { ZonedDateTime it ->
                // note: ZonedDateTime uses format(formatter), not toString(formatter)
                return it == null ? null : it.format(dateTimeFormatter.withZone(ZoneId.of("UTC")))
            }
            converter.registerObjectMarshaller(Date, 10) { Date it ->
                return it == null ? null : isoDateFormat.format(it)
            }
            converter.registerObjectMarshaller(TIMESTAMP, 10) { TIMESTAMP it ->
                return it == null ? null : isoDateFormat.format(it.dateValue())
            }
        }
    }
}
Unfortunately, after the upgrade to Grails 3.3.0, this marshaller registering doesn't seem to be used at all, no matter what I try to do.
I do know that there is a new "JSON Views" way of doing things, but this particular service has many endpoints, and I don't want to write custom converters and ".gson" templates for all of them if everything is already in the format I need. I just need the responses to be in JSON and the dates to behave properly (be formatted strings).
Instead, what I am finding (compared to the previous behavior) is that the properties which use ZonedDateTime are "exploded" in my JSON output. There is an insane amount of unneeded date-object information, and it is not formatted as a simple string as I expect.
I have tried a few things (mostly per recommendations in the official latest Grails documentation):
Custom Converters
Default Date Format
Adding configurations for grails views in application.yml:
views:
    json:
        generator:
            dateFormat: "yyyy-MM-dd'T'HH:mm:ss.SSSZ"
            locale: "en/US"
            timeZone: "GMT"
Creating this path under "src":
src/main/resources/META-INF/services/grails.plugin.json.builder.JsonGenerator$Converter
And adding a Converter for my domain class, which is named in the file above:
class MultibeamFileConverter implements JsonGenerator.Converter {
    final DateTimeFormatter isoDateFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSZ").withZone(ZoneId.of("UTC"))

    @Override
    boolean handles(Class<?> type) {
        MultibeamFile.isAssignableFrom(type)
    }

    @Override
    Object convert(Object value, String key) {
        MultibeamFile multibeamFile = (MultibeamFile) value
        multibeamFile.startTime.format(isoDateFormat)
        multibeamFile.endTime.format(isoDateFormat)
        return multibeamFile
    }
}
In my controller, I have changed:
return multibeamCatalogService.findFiles(cmd, params)
To this (in order to get JSON output in the browser as before):
respond multibeamCatalogService.findFiles(cmd, params), formats: ['json', 'xml']
Unfortunately, most permutations of the above that I could think to try have resulted in errors such as "Could not resolve view". Otherwise, when I do get a response, the major issue is that the date is not formatted as a string. This function was previously performed by the Marshaller.
I am getting pretty frustrated. Can someone please tell me how to format ZonedDateTime as a simple string (e.g. "2009-06-21T00:00:00Z") in my JSON output instead of a giant object like this? Simply converting to java.util.Date causes the "Could not resolve view" error to show up again, which then expects me to make a ".gson" view that never ends up in the format I expect, or is empty.
"startTime": {
    "dayOfMonth": 26,
    "dayOfWeek": {
        "enumType": "java.time.DayOfWeek",
        "name": "FRIDAY"
    },
    "dayOfYear": 207,
    "hour": 0,
    "minute": 0,
    "month": {
        "enumType": "java.time.Month",
        "name": "JULY"
    },
    "monthValue": 7,
    "nano": 0,
    "offset": {
        "id": "-06:00",
        "rules": {
            "fixedOffset": true,
            "transitionRules": [],
            "transitions": []
        },
        "totalSeconds": -21600
    }, ... // AND SO ON FOR_EVAH

The simple answer: to format a ZonedDateTime object, you call .format(DateTimeFormatter). It depends on what format you want; you can specify your own pattern or use one of the predefined formatters in DateTimeFormatter.
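For instance, a minimal self-contained java.time sketch (the example value, the helper name toIsoUtc, and the choice of DateTimeFormatter.ISO_INSTANT are illustrative):

```java
import java.time.ZonedDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class ZdtFormatDemo {
    // Formats any ZonedDateTime as an ISO-8601 instant in UTC,
    // e.g. "2009-06-21T00:00:00Z" (ISO_INSTANT converts to UTC for you).
    public static String toIsoUtc(ZonedDateTime zdt) {
        return zdt.format(DateTimeFormatter.ISO_INSTANT);
    }

    public static void main(String[] args) {
        ZonedDateTime zdt = ZonedDateTime.of(2009, 6, 21, 0, 0, 0, 0, ZoneOffset.UTC);
        System.out.println(toIsoUtc(zdt)); // 2009-06-21T00:00:00Z
    }
}
```

ISO_INSTANT normalizes the zone, so a value in any offset renders as the equivalent UTC instant, which matches the "always UTC" behavior the old marshaller enforced.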
I too, though, would love to know if there's an easy way to say "for every endpoint, render it as JSON". The only way I've found so far is to have this in every controller class, which isn't too bad but seems silly. I'm using respond followed by a return in my controller methods.
static responseFormats = ['json'] // This is needed for grails to indicate what format to use for respond.
I still see the "Could not resolve view" error logged for any endpoint I hit, though the REST API still appears to work.

Related

Avro Json.ObjectWriter - "Not the Json schema" error

I'm writing a tool to convert data from a homegrown format to Avro, JSON and Parquet, using Avro 1.8.0. Conversion to Avro and Parquet is working okay, but JSON conversion throws the following error:
Exception in thread "main" java.lang.RuntimeException: Not the Json schema:
{"type":"record","name":"Torperf","namespace":"converTor.torperf",
"fields":[{"name":"descriptor_type","type":"string","
[... rest of the schema omitted for brevity]
Irritatingly this is the schema that I passed along and which indeed I want the converter to use. I have no idea what Avro is complaining about.
This is the relevant snippet of my code:
// parse the schema file
Schema.Parser parser = new Schema.Parser();
Schema mySchema;
// tried two ways to load the schema
// like this
File schemaFile = new File("myJsonSchema.avsc");
mySchema = parser.parse(schemaFile) ;
// and also like Json.class loads its schema
mySchema = parser.parse(Json.class.getResourceAsStream("myJsonSchema.avsc"));
// initialize the writer
Json.ObjectWriter jsonDatumWriter = new Json.ObjectWriter();
jsonDatumWriter.setSchema(mySchema);
OutputStream out = new FileOutputStream(new File("output.avro"));
Encoder encoder = EncoderFactory.get().jsonEncoder(mySchema, out);
// append a record created by way of a specific mapping
jsonDatumWriter.write(specificRecord, encoder);
I replaced myJsonSchema.avsc with the one returned from the exception without success (and except whitespace and linefeeds they are the same). Initializing the jsonEncoder with org.apache.avro.data.Json.SCHEMA instead of mySchema didn't change anything either. Replacing the schema passed to Json.ObjectWriter with org.apache.avro.data.Json.SCHEMA leads to a NullPointerException at org.apache.avro.data.Json.write(Json.java:183) (which is a deprecated method).
From staring at org.apache.avro.data.Json.java it seems to me like Avro is checking my record schema against its own schema of a Json record (line 58) for equality (line 73).
58 SCHEMA = Schema.parse(Json.class.getResourceAsStream("/org/apache/avro/data/Json.avsc"));
72 public void setSchema(Schema schema) {
73 if(!Json.SCHEMA.equals(schema))
74 throw new RuntimeException("Not the Json schema: " + schema);
75 }
The referenced Json.avsc defines the field types of a record:
{"type": "record", "name": "Json", "namespace":"org.apache.avro.data",
"fields": [
{"name": "value",
"type": [
"long",
"double",
"string",
"boolean",
"null",
{"type": "array", "items": "Json"},
{"type": "map", "values": "Json"}
]
}
]
}
equals is implemented in org.apache.avro.Schema, line 346:
public boolean equals(Object o) {
    if (o == this) {
        return true;
    } else if (!(o instanceof Schema)) {
        return false;
    } else {
        Schema that = (Schema)o;
        return this.type != that.type ? false : this.equalCachedHash(that) && this.props.equals(that.props);
    }
}
I don't fully understand what's going on in the third check (especially equalCachedHash()), but I only see trivial equality checks, which doesn't make sense to me.
Also I can't find any examples or notes about usage of Avro's Json.ObjectWriter on the InterWebs. I wonder if I should go with the deprecated Json.Writer instead because there are at least a few code snippets online to learn and glean from.
The full source is available at https://github.com/tomlurge/converTor
Thanks,
Thomas
A little more debugging proved that passing org.apache.avro.data.Json.SCHEMA to Json.ObjectWriter is indeed the right thing to do. The object written to System.out prints as the JSON I expect. The NullPointerException, though, did not go away.
Probably I would not have had to call setSchema() on Json.ObjectWriter at all, since omitting the call altogether leads to the same NullPointerException.
I finally filed a bug with Avro and it turned out that in my code I was handing an object of type "specific" to ObjectWriter which it couldn't handle. It did return silently though and an error was thrown only at a later stage. That was fixed in Avro 1.8.1 - see https://issues.apache.org/jira/browse/AVRO-1807 for details.

How to intercept map getProperty and list getAt?

I'm scraping external sources, mostly JSON. I'm using new JsonSlurper().parse(body) to parse them and I operate on them using constructs like def name = json.user[0].name. These being external, can change without notice, so I want to be able to detect this and log it somehow.
After reading a bit about the MOP, I thought I could change the appropriate methods of the maps and lists to log when a property is missing. I only want to do that to the json object and its properties, recursively. The thing is, I don't know how to do that.
Or, is there a better way to accomplish all this?
[EDIT] For example if I get this JSON:
def json = '''{
    "owners": [
        {
            "firstName": "John",
            "lastName": "Smith"
        },
        {
            "firstName": "Jane",
            "lastName": "Smith"
        }
    ]
}'''
def data = new groovy.json.JsonSlurper().parse(json.bytes)
assert data.owners[0].firstName == 'John'
However, if they change "owners" to "ownerInfo" the above access would throw NPE. What I want is intercept the access and do something (like log it in a special log, or whatever). I can also decide to throw a more specialized exception.
I don't want to catch NullPointerException, because it may be caused by some bug in my code instead of the changed data format. Besides, if they changed "firstName" to "givenName", but kept the "owners" name, I'd just get a null value, not NPE. Ideally I want to detect this case as well.
I also don't want to put a lot of if checks or elvis operators everywhere, if possible.
I actually managed to intercept that for maps:
data.getMetaClass().getProperty = {name -> println ">>> $name"; delegate.get(name)}
assert data.owners // this prints ">>> owners"
I still can't find out how to do that for the list:
def owners = data.owners
owners.getMetaClass().getAt = { o -> println "]]] $o"; delegate.getAt(o) }
assert owners[0] // this doesn't print anything
Try this
owners.getMetaClass().getAt = { Integer o -> println "]]] $o"; delegate.get(o)}
I'm only guessing that it got lost because of the multiple overloaded getAt() methods, so you have to declare the parameter type. I also delegated to ArrayList's Java get() method, since calling getAt() resulted in recursive calls.
If you want to more control over all methods calls, you could always do this
owners.getMetaClass().invokeMethod = { String methodName, Object[] args ->
    if (methodName == "getAt") {
        println "]]] $args"
    }
    return ArrayList.getMetaClass().getMetaMethod(methodName, args).invoke(delegate, args)
}
The short answer is that you can't do this with the given example. The reason is that the owners object is a java.util.ArrayList, and you are calling the get(int index) method on it. The metaClass object is specific to Groovy, and if you have a Java object making a method call to a Java method, it will have no knowledge of the metaClass. Here's a somewhat related question.
The good news is that there is an option, although I'm not sure if it works for your use case. You can create a Groovy wrapper object for this list, so that you can capture method calls.
For example, you could change your code from this
def owners = data.owners
to this
def owners = new GroovyList(data.owners)
and then create this new class
class GroovyList {
    private List list

    public GroovyList(List list) {
        this.list = list
    }

    public Object getAt(int index) {
        println "]]] $index"
        list.getAt(index)
    }
}
Now when you call
owners[0]
you'll get the output
]]] 0
[firstName:John, lastName:Smith]
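The same decorator idea can be sketched in plain Java, independent of Groovy's metaclass machinery (the class name LoggingList is made up for illustration): wrap the backing list and intercept every index-based read before delegating.

```java
import java.util.AbstractList;
import java.util.Arrays;
import java.util.List;

// Decorator that logs every index-based read on the backing list.
public class LoggingList<E> extends AbstractList<E> {
    private final List<E> backing;

    public LoggingList(List<E> backing) {
        this.backing = backing;
    }

    @Override
    public E get(int index) {
        // hook point: log the access, count it, or throw a custom exception
        System.out.println("]]] " + index);
        return backing.get(index);
    }

    @Override
    public int size() {
        return backing.size();
    }

    public static void main(String[] args) {
        List<String> owners = Arrays.asList("John", "Jane");
        List<String> wrapped = new LoggingList<>(owners);
        System.out.println(wrapped.get(0)); // logs "]]] 0", then prints "John"
    }
}
```

The trade-off is the same as with the GroovyList wrapper above: you must remember to wrap every list you hand out, since the interception lives in the wrapper rather than in the list class itself.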

Getting SerializableException while saving data into isolated storage

I am working on a Windows Phone 8 app in which I need to call an API to get a response, which I serialize and save into local settings.
While saving this data I am getting an exception of type 'System.Runtime.Serialization.SerializationException'.
My Code is,
string someData = responseDataDict["SomeData"].ToString();
if (someData != null)
{
    Dictionary<string, Object> someDict = JsonConvert.DeserializeObject<Dictionary<string, object>>(someData);
    Datastore = someDict;
}

public static Dictionary<string, Object> Datastore
{
    get
    {
        if (localSettings.Contains("SaveData"))
            return (Dictionary<string, Object>)localSettings["SaveData"];
        else
            return null;
    }
    set
    {
        localSettings["SaveData"] = value;
        localSettings.Save();
    }
}
The response from api is,
{
    "MESSAGE": "Some Message",
    "UI_MESSAGE": {
        "LEFT": "OK",
        "RIGHT": "CANCEL"
    }
}
I think the problem is in "UI_MESSAGE",
The Exception is,
System.Runtime.Serialization.SerializationException: Type 'Newtonsoft.Json.Linq.JObject' with data contract name 'ArrayOfKeyValueOfstringJTokeneJCYCtcq:http://schemas.microsoft.com/2003/10/Serialization/Arrays' is not expected. Add any types not known statically to the list of known types - for example, by using the KnownTypeAttribute attribute or by adding them to the list of known types passed to DataContractSerializer.
Please help me to resolve this issue. Thanks in advance.
You can't (easily) serialize the data because the serializer doesn't know how to serialize Newtonsoft.Json.Linq.JObject types.
We could try to figure out how to get the serializer working with this type, which may or may not take lots of time, or we could stop and think for a bit...
You grab the response data in the line
string someData = responseDataDict["SomeData"].ToString();
What is someData? It's
{
    "MESSAGE": "Some Message",
    "UI_MESSAGE": {
        "LEFT": "OK",
        "RIGHT": "CANCEL"
    }
}
That's a json string. A serialized javascript object. It's already bloody serialized.
Save that in localSettings.

persisting a json to mongodb how to specify a date attribute

I am persisting a JSON object to MongoDB; here is the relevant snippet:
Viewed: [
    {
        ViewedID: "8992ade400a",
        Dockey: "3323aba3233",
        PID: "32399a",
        actionsTaken: "email|direct message|report seeker",
        viewDate: "01-APR-2014",
        MessageSent: "true",
        Message: [
            {
                MessageID: "123aca323",
                Delivered: "True",
                Opened: "True",
                ClickThroughRate: "NotBad",
                MessageDate: "02-APR-2014",
                Response: [
                    {
                        ResponseId: "a323a9da",
                        ResponseDate: "23-APR-2014"
                    }
                ]
            }
        ]
    }
]
Here is how I am setting the JSON on my persisting object:
trackingData.setEventdata((DBObject)JSON.parse(tracker.getEventData().toString()));
where tracker.getEventData() returns ObjectNode. When I persist the DBObject to Mongo, I see dates such as "viewDate" : "01-APR-2014" and "ResponseDate" : "23-APR-2014" stored as Strings.
I need these attributes to be converted to type date so I can query against them. Is there any way to specify that these be handled as date objects, either before or after parsing that JSON to DBObject?
Cheers,
Will
The com.mongodb.util.JSON.parse method supports Extended JSON format, so if the JSON was actually formatted that way then this would automatically be cast as a date.
But since it isn't then you would either need to implement a custom parser that handles the "date" elements in your JSON or otherwise just manipulate the BSON before sending it on to MongoDB.
The simplest way is just to manipulate, with something like as shown in this example:
try {
    DBObject content = (DBObject)com.mongodb.util.JSON.parse("{ \"date\": \"01-APR-2014\" }");
    System.out.println(content);
    SimpleDateFormat sdf = new SimpleDateFormat("dd-MMM-yyyy");
    sdf.setTimeZone(TimeZone.getTimeZone("GMT"));
    content.put("date", sdf.parse((String) content.get("date")));
    System.out.println(content);
} catch (ParseException e) {
    e.printStackTrace();
}
So after processing the .parse() you can basically do that date conversion for each relevant date field in your source. Also keep in mind that the data is going to be stored with dates as "UTC/GMT", so make sure that the Timezone you are getting from the source is correctly represented.
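The parse-then-render round trip can be sketched standalone with just the JDK (the helper name toIsoUtc is made up; it pins Locale.ENGLISH so the "APR" month token parses regardless of the default locale, which the snippet above leaves implicit):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

public class MongoDateDemo {
    // Parses a "01-APR-2014"-style string as a GMT date and
    // renders it back as an ISO-8601 UTC timestamp string.
    public static String toIsoUtc(String raw) throws ParseException {
        SimpleDateFormat in = new SimpleDateFormat("dd-MMM-yyyy", Locale.ENGLISH);
        in.setTimeZone(TimeZone.getTimeZone("GMT"));
        Date parsed = in.parse(raw);

        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss'Z'", Locale.ENGLISH);
        out.setTimeZone(TimeZone.getTimeZone("GMT"));
        return out.format(parsed);
    }

    public static void main(String[] args) throws ParseException {
        System.out.println(toIsoUtc("01-APR-2014")); // 2014-04-01T00:00:00Z
    }
}
```

Setting the GMT time zone on the input format matters: without it, "01-APR-2014" would be interpreted as local midnight and the stored instant would shift by your UTC offset.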

CSV Media Type Formatter for ASP.NET Web API

I tried to implement a CSV MediaTypeFormatter for my Web API as described here:
http://www.tugberkugurlu.com/archive/creating-custom-csvmediatypeformatter-in-asp-net-web-api-for-comma-separated-values-csv-format
(I don't want to paste in all the code from there)
But I don't get it to work using the Web API Controller below.
I used Fiddler to call the Web API with: http://myhostname.com/api/csvexport?format=csv
public dynamic Get()
{
    var ret = new[] { "CarId", "Make", "Model", "Name" };
    return ret;
}
For "type" in the CsvFormatter I get a:
DeclaringMethod = 'type.DeclaringMethod' threw an exception of type 'System.InvalidOperationException'
with
Method may only be called on a Type for which Type.IsGenericParameter is true.
So I might not be getting the concept of the Formatter right, and have a problem with the type?
You are getting this error because Tugberk's formatter only works for models which implement the generic IEnumerable<T> interface. That makes sense, since people generally want CSV-formatted data when they are getting a set of results. If you only want one data entity, why would you want it in CSV format?
Your method return type is dynamic, not IEnumerable<T>. You might be able to get it to work by doing something more like this:
public IEnumerable<string> Get()
{
    var ret = new[] { "CarId", "Make", "Model", "Name" };
    return ret;
}