I have been dabbling with JSON, and from the (Java) documentation it looks like JSONObject's put() and accumulate() do pretty much the same thing?
What is that about?
I looked at the Java source code for JSONObject, and the difference is that with accumulate(String key, Object value), if there already is some value for "key", the existing object is checked for being an array: if it is an array, "value" is added to it; otherwise a new array is created for this key, holding both the old and new values.
In put, however, if the key exists, its value is simply replaced by "value".
Here is the source of JSONObject's accumulate(String name, Object value):
/**
 * Appends {@code value} to the array already mapped to {@code name}. If
 * this object has no mapping for {@code name}, this inserts a new mapping.
 * If the mapping exists but its value is not an array, the existing
 * and new values are inserted in order into a new array which is itself
 * mapped to {@code name}. In aggregate, this allows values to be added to a
 * mapping one at a time.
 *
 * <p> Note that {@code append(String, Object)} provides better semantics.
 * In particular, the mapping for {@code name} will <b>always</b> be a
 * {@link JSONArray}. Using {@code accumulate} will result in either a
 * {@link JSONArray} or a mapping whose type is the type of {@code value}
 * depending on the number of calls to it.
 *
 * @param value a {@link JSONObject}, {@link JSONArray}, String, Boolean,
 *     Integer, Long, Double, {@link #NULL} or null. May not be {@link
 *     Double#isNaN() NaNs} or {@link Double#isInfinite() infinities}.
 */
public JSONObject accumulate(String name, Object value) throws JSONException {
    Object current = nameValuePairs.get(checkName(name));
    if (current == null) {
        return put(name, value);
    }

    if (current instanceof JSONArray) {
        JSONArray array = (JSONArray) current;
        array.checkedPut(value);
    } else {
        JSONArray array = new JSONArray();
        array.checkedPut(current);
        array.checkedPut(value);
        nameValuePairs.put(name, array);
    }
    return this;
}
And here is the code for JSONObject's put (the boolean overload is shown; the other overloads clobber the same way):
/**
 * Maps {@code name} to {@code value}, clobbering any existing name/value
 * mapping with the same name.
 *
 * @return this object.
 */
public JSONObject put(String name, boolean value) throws JSONException {
    nameValuePairs.put(checkName(name), value);
    return this;
}
Using the MongoDB Java driver, calling the toJson() method on a Document yields a JSON representation of the document with JsonMode set to STRICT.
The following epoch format is used for dates: { "$date" : "dateAsMilliseconds" }
Using mongoexport, we get an ISO-8601 format instead.
As seen in the official docs ( https://docs.mongodb.com/manual/reference/mongodb-extended-json/ ):
In Strict mode, date is an ISO-8601 date format with a mandatory time zone field following the template YYYY-MM-DDTHH:mm:ss.mmm<+/-Offset>.
The MongoDB JSON parser currently does not support loading ISO-8601 strings representing dates prior to the Unix epoch. When formatting pre-epoch dates and dates past what your system’s time_t type can hold, the following format is used:
{ "$date" : { "$numberLong" : "dateAsMilliseconds" } }
I would appreciate it if someone could explain why there is no common format shared by the MongoDB Java driver, the mongoexport tool, and the official docs.
Thanks.
Obviously there is NO good reason for the Java driver to deviate from the official specification. The only exceptions are those dates which cannot be expressed in the ISO-8601 format (like B.C. dates...).
As a workaround I have extended the JsonWriter class and provided two static toJson methods as an example of how to use it:
package whatever.package.you.like;
import java.io.IOException;
import java.io.StringWriter;
import java.io.Writer;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;
import org.bson.BSONException;
import org.bson.BsonContextType;
import org.bson.BsonDocument;
import org.bson.codecs.BsonDocumentCodec;
import org.bson.codecs.EncoderContext;
import org.bson.conversions.Bson;
import org.bson.json.JsonMode;
import org.bson.json.JsonWriter;
import org.bson.json.JsonWriterSettings;
import com.mongodb.MongoClient;
/**
 * A {@link JsonWriter} extension that conforms to the "strict" JSON format
 * specified by MongoDB for date/time values.
 *
 * The {@link JsonWriter} class provided in the MongoDB Java driver (version
 * 3.2.2) does not conform to the official MongoDB specification for strict mode
 * JSON (see https://docs.mongodb.com/manual/reference/mongodb-extended-json/).
 * This is specifically a problem with date/time values, which get emitted
 * as a milliseconds value (i.e. {$date: 309249234098}) instead of the ISO-8601
 * date/time value (i.e. {$date: "2016-07-14T08:44:23.234Z"}) which the
 * specification calls for. This extension of {@link JsonWriter} conforms to the
 * MongoDB specification in this regard.
 */
public class ConformingJsonWriter extends JsonWriter {

    private final JsonWriterSettings settings;

    private final Writer writer;

    private boolean writingIndentedDateTime = false;

    /**
     * Creates a new instance which uses {@code writer} to write JSON to.
     *
     * @param writer
     *            the writer to write JSON to.
     */
    public ConformingJsonWriter(final Writer writer) {
        this(writer, new JsonWriterSettings());
    }

    /**
     * Creates a new instance which uses {@code writer} to write JSON to and
     * uses the given settings.
     *
     * @param writer
     *            the writer to write JSON to.
     * @param settings
     *            the settings to apply to this writer.
     */
    public ConformingJsonWriter(final Writer writer,
            final JsonWriterSettings settings) {
        super(writer, settings);
        this.writer = writer;
        this.settings = settings;
        setContext(new Context(null, BsonContextType.TOP_LEVEL, ""));
    }
    private void writeIndentation(int skip) throws IOException {
        for (Context context = getContext().getParentContext();
                context != null;
                context = context.getParentContext()) {
            if (skip-- <= 0) {
                writer.write(settings.getIndentCharacters());
            }
        }
    }

    private static String millisToIso8601(long millis) throws IOException {
        SimpleDateFormat dateFormat = new SimpleDateFormat(
                "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
        dateFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
        return dateFormat.format(new Date(millis));
    }
    @Override
    protected void doWriteDateTime(final long value) {
        if ((settings.getOutputMode() == JsonMode.STRICT)
                && (value >= -59014396800000L && value <= 253399536000000L)) {
            try {
                writeStartDocument();
                if (settings.isIndent()) {
                    writingIndentedDateTime = true;
                    writer.write(settings.getNewLineCharacters());
                    writeIndentation(0);
                } else {
                    writer.write(" ");
                }
                writer.write("\"$date\" : ");
                writer.write("\"");
                writer.write(millisToIso8601(value));
                writer.write("\"");
                writeEndDocument();
                writingIndentedDateTime = false;
            } catch (IOException e) {
                throw new BSONException("Wrapping IOException", e);
            }
        } else {
            super.doWriteDateTime(value);
        }
    }
    @Override
    protected void doWriteEndDocument() {
        if (writingIndentedDateTime) {
            try {
                writer.write(settings.getNewLineCharacters());
                writeIndentation(1);
                writer.write("}");
                if (getContext()
                        .getContextType() == BsonContextType.SCOPE_DOCUMENT) {
                    setContext(getContext().getParentContext());
                    writeEndDocument();
                } else {
                    setContext(getContext().getParentContext());
                }
            } catch (IOException e) {
                throw new BSONException("Wrapping IOException", e);
            }
        } else {
            super.doWriteEndDocument();
        }
    }
    /**
     * Take a {@link Bson} instance and convert it to "strict" JSON
     * representation with no indentation (read, "NOT pretty printed").
     *
     * @param bson
     *            The {@link Bson} instance to convert
     * @return The JSON representation.
     */
    public static String toJson(Bson bson) {
        return toJson(bson, new JsonWriterSettings());
    }

    /**
     * Take a {@link Bson} instance and convert it to JSON representation.
     *
     * @param bson
     *            The {@link Bson} instance to convert
     * @param writerSettings
     *            {@link JsonWriterSettings} that specify details about how
     *            the JSON output should look.
     * @return The JSON representation.
     */
    public static String toJson(Bson bson,
            final JsonWriterSettings writerSettings) {
        BsonDocumentCodec encoder = new BsonDocumentCodec();
        ConformingJsonWriter writer = new ConformingJsonWriter(
                new StringWriter(), writerSettings);
        encoder.encode(writer,
                bson.toBsonDocument(BsonDocument.class,
                        MongoClient.getDefaultCodecRegistry()),
                EncoderContext.builder().isEncodingCollectibleDocument(true)
                        .build());
        return writer.getWriter().toString();
    }
}
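Usage is then a drop-in replacement for Document.toJson(); a minimal sketch (myDocument is a placeholder for any org.bson.Document, which implements Bson):

Document myDocument = new Document("created", new java.util.Date());

// Compact strict-mode output with ISO-8601 dates:
String json = ConformingJsonWriter.toJson(myDocument);

// Pretty-printed variant:
String pretty = ConformingJsonWriter.toJson(myDocument,
        new JsonWriterSettings(JsonMode.STRICT, true));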
I'm building a Rest API and I receive a json_encoded string from the clients.
I want this string to be decoded before saving my entity, because it's going into a jsonb field in PostgreSQL.
The behavior I want is:
Validate that the string is valid json; if not, add a violation in the form via a custom validator
Automatically decode the string and set the json object in the entity property
I've tried two different strategies:
In the entity's setMetadata($value) method, if $value is a string, I decode it
I created a DataTransformer that json_decodes the value received in the form
But neither solution works, because the custom validator I created is called afterwards, and it calls $lesson->getMetadata() directly. Since the value has already been decoded (either in the setMetadata() method or in the DataTransformer), the validator receives either a json object or null. So I can't add a violation to the form, since I have no way to know whether the value received was actually null or the string was malformed.
Here is the lesson entity:
class Lesson extends BaseContent
{
    […]

    /**
     * @var jsonb
     *
     * @ORM\Column(name="metadata", type="jsonb", nullable=true)
     * @KreactiveAssert\Json
     */
    private $metadata;

    […]
}
Here is the custom validator:
class JsonValidator extends ConstraintValidator
{
    public function validate($value, Constraint $constraint)
    {
        if ($value && !json_decode($value)) {
            $this->context->addViolation($constraint->message, array('%string%' => $value));
        }
    }
}
And here is the DataTransformer:
class StringToJsonTransformer implements DataTransformerInterface
{
    /**
     * Transform a json object to a string
     * @param Json|null $json
     * @return String
     */
    public function transform($json)
    {
        if (null === $json) {
            return "";
        }

        return json_encode($json);
    }

    /**
     * Transform a string to a json object
     * @param String $string
     * @return Object
     */
    public function reverseTransform($string)
    {
        if (!$string) {
            return null;
        }

        // Note: an unconditional throw here would make the return unreachable:
        // throw new TransformationFailedException('error transforming');
        return json_decode($string);
    }
}
Is there any way I can validate the input data in the form, and then set the metadata as a json object?
I've found this (I don't know how I missed it earlier):
Combine constraints and data transformers
I'm going to use the ugly workaround suggested there, even though I don't like that solution.
<?php

class StringToJsonTransformer implements DataTransformerInterface
{
    /**
     * Transform a string to a json object
     * @param String $string
     * @return Object
     */
    public function reverseTransform($string)
    {
        if (!$string) {
            return null;
        }

        /*
         * UGLY WORKAROUND
         * We return -1 if json_decode fails, so the validator can add
         * a violation to the form telling the json string was not valid.
         * If we don't do this, the validator will receive either
         * null or a json object. In case of null, there is no way to
         * tell if the client sent null, or if the decoding failed.
         */
        $value = json_decode($string);
        return $value ? $value : -1;
    }
}
I'm still not sure whether I'm going to return -1 or something else. In the custom validator, I get an error if I try to compare a json object with -1 (which is normal).
I'm facing some issues when trying to set an enum decoder in Smooks, in order to decode values from a CSV file.
I need an EnumDecoder, and although I saw that you can instantiate it and set its configuration, I could not find where to register it in order to make it active.
So far I managed to write something like this:
EnumDecoder decoder = (EnumDecoder)DataDecoder.Factory.create(SomeEnumTypeClass.class);
Properties properties = new Properties();
properties.put("enumType", SomeEnumTypeClass.class.getName());
properties.put("John", SomeEnumTypeClass.JOHN);
decoder.setConfiguration(properties);
I also had an issue with setting the decoder from the configuration file.
smooks-config.xml:
<?xml version="1.0" encoding="UTF-8"?>
<smooks-resource-list xmlns="http://www.milyn.org/xsd/smooks-1.1.xsd"
xmlns:jb="http://www.milyn.org/xsd/smooks/javabean-1.2.xsd"
xmlns:csv="http://www.milyn.org/xsd/smooks/csv-1.2.xsd">
<csv:reader fields="field1,field2EnumType" separator=";">
<csv:listBinding beanId="beanList" class="com.myApp.TestingBean"/>
</csv:reader>
<jb:bean beanId="beanMockUp"
class="com.myApp.TestingBean">
<jb:value property="field1" data="header/field1" />
<jb:value property="field2EnumType"
data="header/field2EnumType"
decoder="Enum">
<jb:decodeParam name="enumType">com.myApp.SomeEnumTypeClass</jb:decodeParam>
<jb:decodeParam name="John">JOHN</jb:decodeParam>
<jb:decodeParam name="Jack">JACK</jb:decodeParam>
</jb:value>
</jb:bean>
</smooks-resource-list>
Main.java:
Smooks smooks = null;
try {
    smooks = new Smooks("smooks-config.xml");
} catch (IOException | SAXException ex) {
    Logger.getLogger(Main.class.getName()).log(Level.SEVERE, null, ex);
}

try {
    ExecutionContext executionContext = smooks.createExecutionContext();
    JavaResult result = new JavaResult();
    smooks.filterSource(executionContext, new StringSource(messageIn), result);
    return (List) result.getBean("beanList");
} finally {
    smooks.close();
}
TestingBean.java:
public class TestingBean {

    private String field1;
    private SomeEnumTypeClass field2EnumType;

    /**
     * Get the value of field2EnumType
     *
     * @return the value of field2EnumType
     */
    public SomeEnumTypeClass getField2EnumType() {
        return field2EnumType;
    }

    /**
     * Set the value of field2EnumType
     *
     * @param field2EnumType new value of field2EnumType
     */
    public void setField2EnumType(SomeEnumTypeClass field2EnumType) {
        this.field2EnumType = field2EnumType;
    }

    /**
     * Get the value of field1
     *
     * @return the value of field1
     */
    public String getField1() {
        return field1;
    }

    /**
     * Set the value of field1
     *
     * @param field1 new value of field1
     */
    public void setField1(String field1) {
        this.field1 = field1;
    }
}
CSV file:
abc;john
I always end up with the error:
Exception in thread "main" org.milyn.SmooksException: Failed to filter source.
at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:97)
at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:64)
at org.milyn.Smooks._filter(Smooks.java:526)
at org.milyn.Smooks.filterSource(Smooks.java:482)
..........
Caused by: org.milyn.javabean.DataDecodeException: Failed to decode binding value 'John' for property 'field2EnumType' on bean 'fewijf-3243-fijewoi'
at org.milyn.javabean.BeanInstancePopulator.decodeDataString(BeanInstancePopulator.java:624)
at org.milyn.javabean.BeanInstancePopulator.decodeAndSetPropertyValue(BeanInstancePopulator.java:513)
at org.milyn.javabean.BeanInstancePopulator.bindSaxDataValue(BeanInstancePopulator.java:449)
at org.milyn.javabean.BeanInstancePopulator.visitAfter(BeanInstancePopulator.java:379)
at org.milyn.delivery.sax.SAXHandler.visitAfter(SAXHandler.java:389)
at org.milyn.delivery.sax.SAXHandler.endElement(SAXHandler.java:204)
at org.milyn.delivery.SmooksContentHandler.endElement(SmooksContentHandler.java:96)
at org.milyn.flatfile.FlatFileReader.parse(FlatFileReader.java:165)
at org.milyn.delivery.sax.SAXParser.parse(SAXParser.java:76)
at org.milyn.delivery.sax.SmooksSAXFilter.doFilter(SmooksSAXFilter.java:86)
... 5 more
Thanks!
In your CSV data "john" is lowercase, but your Smooks config decodeParam for "JOHN" is tied to the capitalized "John".
Your smooks-config for the Enum should be something like this:
<jb:value property="field2EnumType"
data="header/field2EnumType"
decoder="Enum">
<jb:decodeParam name="enumType">com.myApp.SomeEnumTypeClass</jb:decodeParam>
<jb:decodeParam name="john">JOHN</jb:decodeParam>
<jb:decodeParam name="jack">JACK</jb:decodeParam>
</jb:value>
Notice the name attribute of the decodeParam. It needs to exactly match the data you are decoding (case sensitive).
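For reference, the enum assumed by this config would look something like the following (hypothetical; reconstructed from the class name used in the question):

package com.myApp;

public enum SomeEnumTypeClass {
    JOHN,
    JACK
}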
How should one deal with Gson and required versus optional fields?
Since all fields are optional, I can't really fail my network request based on whether the response json contains some key; Gson will simply parse it to null.
The method I am using: gson.fromJson(json, mClassOfT);
For example, if I have the following json:
{"user_id":128591, "user_name":"TestUser"}
And my class:
public class User {

    @SerializedName("user_id")
    private String mId;

    @SerializedName("user_name")
    private String mName;

    public String getId() {
        return mId;
    }

    public void setId(String id) {
        mId = id;
    }

    public String getName() {
        return mName;
    }

    public void setName(String name) {
        mName = name;
    }
}
Is there any option to get Gson to fail if the json does not contain the user_id or user_name key?
There can be many cases where you need at least some values to be parsed while others could be optional.
Is there any pattern or library that handles this case globally?
Thanks.
As you note, Gson has no facility to define a "required field" and you'll just get null in your deserialized object if something is missing in the JSON.
Here's a re-usable deserializer and annotation that will do this. The limitation is that if the POJO required a custom deserializer as-is, you'd have to go a little further: either pass a Gson object into the constructor to deserialize the object itself, or move the annotation checking out into a separate method and use it in your deserializer. You could also improve on the exception handling by creating your own exception and passing it to the JsonParseException, so it can be detected via getCause() in the caller.
That all said, in the vast majority of cases, this will work:
public class App
{
    public static void main(String[] args)
    {
        Gson gson =
            new GsonBuilder()
                .registerTypeAdapter(TestAnnotationBean.class, new AnnotatedDeserializer<TestAnnotationBean>())
                .create();

        String json = "{\"foo\":\"This is foo\",\"bar\":\"this is bar\"}";
        TestAnnotationBean tab = gson.fromJson(json, TestAnnotationBean.class);
        System.out.println(tab.foo);
        System.out.println(tab.bar);

        json = "{\"foo\":\"This is foo\"}";
        tab = gson.fromJson(json, TestAnnotationBean.class);
        System.out.println(tab.foo);
        System.out.println(tab.bar);

        json = "{\"bar\":\"This is bar\"}";
        tab = gson.fromJson(json, TestAnnotationBean.class);
        System.out.println(tab.foo);
        System.out.println(tab.bar);
    }
}

@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface JsonRequired
{
}

class TestAnnotationBean
{
    @JsonRequired public String foo;
    public String bar;
}
class AnnotatedDeserializer<T> implements JsonDeserializer<T>
{
    public T deserialize(JsonElement je, Type type, JsonDeserializationContext jdc) throws JsonParseException
    {
        T pojo = new Gson().fromJson(je, type);

        Field[] fields = pojo.getClass().getDeclaredFields();
        for (Field f : fields)
        {
            if (f.getAnnotation(JsonRequired.class) != null)
            {
                try
                {
                    f.setAccessible(true);
                    if (f.get(pojo) == null)
                    {
                        throw new JsonParseException("Missing field in JSON: " + f.getName());
                    }
                }
                catch (IllegalArgumentException ex)
                {
                    Logger.getLogger(AnnotatedDeserializer.class.getName()).log(Level.SEVERE, null, ex);
                }
                catch (IllegalAccessException ex)
                {
                    Logger.getLogger(AnnotatedDeserializer.class.getName()).log(Level.SEVERE, null, ex);
                }
            }
        }
        return pojo;
    }
}
Output:
This is foo
this is bar
This is foo
null
Exception in thread "main" com.google.gson.JsonParseException: Missing field in JSON: foo
Brian Roach's answer is very good, but sometimes it's also necessary to handle:
properties of the model's super class
properties inside of arrays
For these purposes the following class can be used:
/**
 * Adds the feature to use required fields in models.
 *
 * @param <T> Model to parse to.
 */
public class JsonDeserializerWithOptions<T> implements JsonDeserializer<T> {

    /**
     * To mark required fields of the model:
     * json parsing will fail if these fields are not provided.
     * */
    @Retention(RetentionPolicy.RUNTIME) // so the annotation can be read at runtime
    @Target(ElementType.FIELD)          // the annotation can only be applied to fields
    public @interface FieldRequired {}
    /**
     * Called when the model is being parsed.
     *
     * @param je   Source json string.
     * @param type Object's model.
     * @param jdc  Unused in this case.
     *
     * @return Parsed object.
     *
     * @throws JsonParseException When parsing is impossible.
     * */
    @Override
    public T deserialize(JsonElement je, Type type, JsonDeserializationContext jdc)
            throws JsonParseException {
        // Parsing object as usual.
        T pojo = new Gson().fromJson(je, type);

        // Getting all fields of the class and checking if all required ones were provided.
        checkRequiredFields(pojo.getClass().getDeclaredFields(), pojo);

        // Checking if all required fields of parent classes were provided.
        checkSuperClasses(pojo);

        // All checks are ok.
        return pojo;
    }
    /**
     * Checks whether all required fields were provided in the class.
     *
     * @param fields Fields to be checked.
     * @param pojo   Instance to check fields in.
     *
     * @throws JsonParseException When some required field was not met.
     * */
    private void checkRequiredFields(@NonNull Field[] fields, @NonNull Object pojo)
            throws JsonParseException {
        // Checking nested list items too.
        if (pojo instanceof List) {
            final List pojoList = (List) pojo;
            for (final Object pojoListPojo : pojoList) {
                checkRequiredFields(pojoListPojo.getClass().getDeclaredFields(), pojoListPojo);
                checkSuperClasses(pojoListPojo);
            }
        }

        for (Field f : fields) {
            // If some field has the required annotation.
            if (f.getAnnotation(FieldRequired.class) != null) {
                try {
                    // Trying to read this field's value and check that it truly has a value.
                    f.setAccessible(true);
                    Object fieldObject = f.get(pojo);
                    if (fieldObject == null) {
                        // Required value is null - throwing error.
                        throw new JsonParseException(String.format("%1$s -> %2$s",
                                pojo.getClass().getSimpleName(),
                                f.getName()));
                    } else {
                        checkRequiredFields(fieldObject.getClass().getDeclaredFields(), fieldObject);
                        checkSuperClasses(fieldObject);
                    }
                }
                // Exceptions while using reflection.
                catch (IllegalArgumentException | IllegalAccessException e) {
                    throw new JsonParseException(e);
                }
            }
        }
    }
    /**
     * Checks whether all super classes have all required fields.
     *
     * @param pojo Object to check required fields in its superclasses.
     *
     * @throws JsonParseException When some required field was not met.
     * */
    private void checkSuperClasses(@NonNull Object pojo) throws JsonParseException {
        Class<?> superclass = pojo.getClass();
        while ((superclass = superclass.getSuperclass()) != null) {
            checkRequiredFields(superclass.getDeclaredFields(), pojo);
        }
    }
}
First of all, the annotation used to mark required fields is described; we'll see an example of its usage later:
/**
 * To mark required fields of the model:
 * json parsing will fail if these fields are not provided.
 * */
@Retention(RetentionPolicy.RUNTIME) // so the annotation can be read at runtime
@Target(ElementType.FIELD)          // the annotation can only be applied to fields
public @interface FieldRequired {}
Then the deserialize method is implemented. It parses json strings as usual: missing properties in the resulting pojo will have null values:
T pojo = new Gson().fromJson(je, type);
Then a recursive check of all fields of the parsed pojo is launched:
checkRequiredFields(pojo.getClass().getDeclaredFields(), pojo);
Then we also check all fields of the pojo's super classes:
checkSuperClasses(pojo);
This is required when some SimpleModel extends SimpleParentModel and we want to make sure that the required properties of SimpleParentModel are provided along with SimpleModel's own.
Let's take a look at the checkRequiredFields method. First of all, it checks whether the property is an instance of List (a json array) - in that case all objects of the list should also be checked to make sure they have all required fields provided too:
if (pojo instanceof List) {
    final List pojoList = (List) pojo;
    for (final Object pojoListPojo : pojoList) {
        checkRequiredFields(pojoListPojo.getClass().getDeclaredFields(), pojoListPojo);
        checkSuperClasses(pojoListPojo);
    }
}
Then we iterate through all fields of the pojo, checking that all fields with the FieldRequired annotation are provided (which means these fields are not null). If we encounter some required property that is null, an exception is thrown. Otherwise another recursive validation step is launched for the current field, and properties of the field's parent classes are checked too:
for (Field f : fields) {
    // If some field has the required annotation.
    if (f.getAnnotation(FieldRequired.class) != null) {
        try {
            // Trying to read this field's value and check that it truly has a value.
            f.setAccessible(true);
            Object fieldObject = f.get(pojo);
            if (fieldObject == null) {
                // Required value is null - throwing error.
                throw new JsonParseException(String.format("%1$s -> %2$s",
                        pojo.getClass().getSimpleName(),
                        f.getName()));
            } else {
                checkRequiredFields(fieldObject.getClass().getDeclaredFields(), fieldObject);
                checkSuperClasses(fieldObject);
            }
        }
        // Exceptions while using reflection.
        catch (IllegalArgumentException | IllegalAccessException e) {
            throw new JsonParseException(e);
        }
    }
}
The last method to review is checkSuperClasses: it just runs a similar required-fields validation on the properties of the pojo's super classes:
Class<?> superclass = pojo.getClass();
while ((superclass = superclass.getSuperclass()) != null) {
    checkRequiredFields(superclass.getDeclaredFields(), pojo);
}
And finally, let's review an example of this JsonDeserializerWithOptions's usage. Assume we have the following models:
private class SimpleModel extends SimpleParentModel {
    @JsonDeserializerWithOptions.FieldRequired Long id;
    @JsonDeserializerWithOptions.FieldRequired NestedModel nested;
    @JsonDeserializerWithOptions.FieldRequired ArrayList<ListModel> list;
}

private class SimpleParentModel {
    @JsonDeserializerWithOptions.FieldRequired Integer rev;
}

private class NestedModel extends NestedParentModel {
    @JsonDeserializerWithOptions.FieldRequired Long id;
}

private class NestedParentModel {
    @JsonDeserializerWithOptions.FieldRequired Integer rev;
}

private class ListModel {
    @JsonDeserializerWithOptions.FieldRequired Long id;
}
We can be sure that SimpleModel will be parsed correctly without exceptions in this way:
final Gson gson = new GsonBuilder()
.registerTypeAdapter(SimpleModel.class, new JsonDeserializerWithOptions<SimpleModel>())
.create();
gson.fromJson("{\"list\":[ { \"id\":1 } ], \"id\":1, \"rev\":22, \"nested\": { \"id\":2, \"rev\":2 }}", SimpleModel.class);
Of course, the provided solution can be improved to accept more features: for example, validation of nested objects which are not marked with the FieldRequired annotation. Currently that's out of the answer's scope, but it can be added later.
(Inspired by Brian Roach's answer.)
It seems that Brian's answer doesn't work for primitives, because those values can be initialized to something other than null (e.g. 0).
Moreover, it seems the deserializer would have to be registered for every type. A more scalable solution uses a TypeAdapterFactory (as below).
In certain circumstances it is safer to whitelist exemptions from required fields (i.e. mark them as JsonOptional fields) rather than annotating all fields as required.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface JsonOptional {
}
Though this approach can easily be adapted for required fields instead (see the note after the factory below).
import com.google.gson.Gson;
import com.google.gson.JsonElement;
import com.google.gson.JsonParseException;
import com.google.gson.TypeAdapter;
import com.google.gson.TypeAdapterFactory;
import com.google.gson.internal.Streams;
import com.google.gson.reflect.TypeToken;
import com.google.gson.stream.JsonReader;
import com.google.gson.stream.JsonWriter;
import java.io.IOException;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;
public class AnnotatedTypeAdapterFactory implements TypeAdapterFactory {

    @Override
    public <T> TypeAdapter<T> create(Gson gson, TypeToken<T> typeToken) {
        Class<? super T> rawType = typeToken.getRawType();
        Set<Field> requiredFields = Stream.of(rawType.getDeclaredFields())
                .filter(f -> f.getAnnotation(JsonOptional.class) == null)
                .collect(Collectors.toSet());

        if (requiredFields.isEmpty()) {
            return null;
        }

        final TypeAdapter<T> baseAdapter = (TypeAdapter<T>) gson.getAdapter(rawType);

        return new TypeAdapter<T>() {
            @Override
            public void write(JsonWriter jsonWriter, T o) throws IOException {
                baseAdapter.write(jsonWriter, o);
            }

            @Override
            public T read(JsonReader in) throws IOException {
                JsonElement jsonElement = Streams.parse(in);
                if (jsonElement.isJsonObject()) {
                    ArrayList<String> missingFields = new ArrayList<>();
                    for (Field field : requiredFields) {
                        if (!jsonElement.getAsJsonObject().has(field.getName())) {
                            missingFields.add(field.getName());
                        }
                    }
                    if (!missingFields.isEmpty()) {
                        throw new JsonParseException(
                                String.format("Missing required fields %s for %s",
                                        missingFields, rawType.getName()));
                    }
                }

                TypeAdapter<T> delegate = gson.getDelegateAdapter(AnnotatedTypeAdapterFactory.this, typeToken);
                return delegate.fromJsonTree(jsonElement);
            }
        };
    }
}
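Registering the factory is then a one-liner via the standard GsonBuilder API:

Gson gson = new GsonBuilder()
        .registerTypeAdapterFactory(new AnnotatedTypeAdapterFactory())
        .create();

And to adapt the factory for required fields, as noted earlier, it should be enough to invert the filter in create() so that only annotated fields are collected (a sketch, assuming a @JsonRequired annotation like the one in Brian Roach's answer):

Set<Field> requiredFields = Stream.of(rawType.getDeclaredFields())
        .filter(f -> f.getAnnotation(JsonRequired.class) != null) // keep only annotated fields
        .collect(Collectors.toSet());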
This is my simple approach: it creates a generic solution with minimal coding.
Create an @Optional annotation
Mark the first optional field. That field and all later ones are assumed optional; all earlier fields are assumed required.
Create a generic 'loader' method that checks that the source Json object has a value for each field. The loop stops once an @Optional field is encountered.
I am using subclassing, so the grunt work is done in the superclass.
Here is the superclass code.
import com.google.gson.Gson;
import java.lang.reflect.Field;
import java.lang.annotation.Annotation;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
...
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface Optional {
    public boolean enabled() default true;
}
and the grunt work method
@SuppressWarnings("unchecked")
public <T> T payload(JsonObject oJR, Class<T> T) throws Exception {
    StringBuilder oSB = new StringBuilder();
    String sSep = "";
    Object o = gson.fromJson(oJR, T);
    // Ensure all fields are populated until we reach @Optional
    Field[] oFlds = T.getDeclaredFields();
    for (Field oFld : oFlds) {
        Annotation oAnno = oFld.getAnnotation(Optional.class);
        if (oAnno != null) break;
        if (!oJR.has(oFld.getName())) {
            oSB.append(sSep + oFld.getName());
            sSep = ",";
        }
    }
    if (oSB.length() > 0) throw CVT.e("Required fields " + oSB + " missing");
    return (T) o;
}
and an example of usage
public static class Payload {
    String sUserType;
    String sUserID;
    String sSecpw;
    @Optional
    String sUserDev;
    String sUserMark;
}
and the populating code
Payload oPL = payload(oJR,Payload.class);
In this case sUserDev and sUserMark are optional and the rest are required. The solution relies on the fact that the class stores the Field definitions in the declared order.
I searched a lot and found no good answer. The solution I chose is as follows:
Every field that I need to set from JSON is an object, i.e. a boxed Integer, Boolean, etc. Then, using reflection, I can check that the field is not null:
public class CJSONSerializable {
    public void checkDeserialization() throws IllegalAccessException, JsonParseException {
        for (Field f : getClass().getDeclaredFields()) {
            if (f.get(this) == null) {
                throw new JsonParseException("Field " + f.getName() + " was not initialized.");
            }
        }
    }
}
From this class, I can derive my JSON object:
public class CJSONResp extends CJSONSerializable {
    @SerializedName("Status")
    public String status;

    @SerializedName("Content-Type")
    public String contentType;
}
and then, after parsing with GSON, I can call checkDeserialization and it will report if any of the fields is null.
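A minimal usage sketch (the JSON string is made up; checkDeserialization declares checked exceptions, so the caller must handle or propagate them):

Gson gson = new Gson();
CJSONResp resp = gson.fromJson("{\"Status\":\"OK\"}", CJSONResp.class);
// "Content-Type" is absent from the JSON, so this throws:
// JsonParseException: Field contentType was not initialized.
resp.checkDeserialization();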
With the introduction of Zend_Rest_Route in Zend Framework 1.9 (and its update in 1.9.2) we now have a standardized RESTful solution for routing requests. As of August 2009 there are no examples of its usage, only the basic documentation found in the reference guide.
While it is perhaps far more simple than I assume, I was hoping those more competent than I might provide some examples illustrating the use of the Zend_Rest_Controller in a scenario where:
Some controllers (such as indexController.php) operate normally
Others operate as rest-based services (returning json)
It appears the JSON Action Helper now fully automates and optimizes the json response to a request, making its use along with Zend_Rest_Route an ideal combination.
It appears it was rather simple. I've put together a Restful Controller template using the Zend_Rest_Controller abstract class. Simply replace the no_results return values with a native php object containing the data you want returned. Comments welcome.
<?php
/**
 * Restful Controller
 *
 * @copyright Copyright (c) 2009 ? (http://www.?.com)
 */
class RestfulController extends Zend_Rest_Controller
{
    public function init()
    {
        $config = Zend_Registry::get('config');
        $this->db = Zend_Db::factory($config->resources->db);
        $this->no_results = array('status' => 'NO_RESULTS');
    }
    /**
     * List
     *
     * The index action handles index/list requests; it responds with a
     * list of the requested resources.
     *
     * @return json
     */
    public function indexAction()
    {
        // do some processing...
        // Send the JSON response:
        $this->_helper->json($this->no_results);
    }

    // 1.9.2 fix
    public function listAction() { return $this->_forward('index'); }
    /**
     * View
     *
     * The get action handles GET requests and receives an 'id' parameter; it
     * responds with the server resource state of the resource identified
     * by the 'id' value.
     *
     * @param integer $id
     * @return json
     */
    public function getAction()
    {
        $id = $this->_getParam('id', 0);
        // do some processing...
        // Send the JSON response:
        $this->_helper->json($this->no_results);
    }
    /**
     * Create
     *
     * The post action handles POST requests; it accepts and digests a
     * POSTed resource representation and persists the resource state.
     *
     * @param integer $id
     * @return json
     */
    public function postAction()
    {
        $id = $this->_getParam('id', 0);
        $my = $this->_getAllParams();
        // do some processing...
        // Send the JSON response:
        $this->_helper->json($this->no_results);
    }
    /**
     * Update
     *
     * The put action handles PUT requests and receives an 'id' parameter; it
     * updates the server resource state of the resource identified by
     * the 'id' value.
     *
     * @param integer $id
     * @return json
     */
    public function putAction()
    {
        $id = $this->_getParam('id', 0);
        $my = $this->_getAllParams();
        // do some processing...
        // Send the JSON response:
        $this->_helper->json($this->no_results);
    }
    /**
     * Delete
     *
     * The delete action handles DELETE requests and receives an 'id'
     * parameter; it deletes the server resource state of the resource
     * identified by the 'id' value.
     *
     * @param integer $id
     * @return json
     */
    public function deleteAction()
    {
        $id = $this->_getParam('id', 0);
        // do some processing...
        // Send the JSON response:
        $this->_helper->json($this->no_results);
    }
}
Great post, but I would have thought Zend_Rest_Controller would route the request to the right action with respect to the HTTP method used. It would be neat if a POST request to http://<app URL>/Restful automatically forwarded to postAction, for example.
I'll go ahead and provide another strategy below, but maybe I'm missing the point behind Zend_Rest_Controller... please comment.
My strategy:
class RestfulController extends Zend_Rest_Controller
{
    public function init()
    {
        $this->_helper->viewRenderer->setNoRender();
        $this->_helper->layout->disableLayout();
    }

    public function indexAction()
    {
        if ($this->getRequest()->getMethod() === 'POST')
            {return $this->_forward('post');}
        if ($this->getRequest()->getMethod() === 'GET')
            {return $this->_forward('get');}
        if ($this->getRequest()->getMethod() === 'PUT')
            {return $this->_forward('put');}
        if ($this->getRequest()->getMethod() === 'DELETE')
            {return $this->_forward('delete');}

        $this->_helper->json($listMyCustomObjects);
    }

    // 1.9.2 fix
    public function listAction() { return $this->_forward('index'); }

    [the rest of the code with action functions]