Is . char allowed in JSON field name?
java.lang.IllegalArgumentException: instance.id is not a valid JSON field name.
at com.google.gson.JsonFieldNameValidator.validate(JsonFieldNameValidator.java:52)
At least the Gson library seems to be complaining, but I couldn't find anything about this in the JSON spec.
Note that I have a @SerializedName annotation to avoid the issue with the Java field name:
@SerializedName("instance.id")
private String instanceId;
Update:
It is a bug in @SerializedName, and this is the fix I applied:
@SdeSerializedName("instance.id")
private String instanceId;
and
new GsonBuilder().setFieldNamingStrategy(new FieldNamingStrategy() {
    public String translateName(final Field field) {
        final SdeSerializedName annotation = field.getAnnotation(SdeSerializedName.class);
        return ((null != annotation) && null != annotation.value()) ? annotation.value() : field.getName();
    }
})
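For reference, a custom annotation like the one used above could be declared along the following lines. This is only a sketch, since the original post does not show the definition; the essential detail is runtime retention so the FieldNamingStrategy can read it reflectively.

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Retained at runtime so the FieldNamingStrategy above can read it via reflection.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface SdeSerializedName {
    String value();
}

With the builder above, calling create() and then toJson(...) should emit "instance.id" as the member name in the output JSON.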
It is allowed in JSON itself, but (if I understand the GSON documentation correctly) the error message is because it can't map instance.id to a Java class member of the same name.
Have a look at the following thread about a similar problem mapping field names:
http://groups.google.com/group/google-gson/tree/browse_frm/month/2010-05/e575bb65cdd30410?rnum=31&_done=/group/google-gson/browse_frm/month/2010-05?&pli=1
Since the dot "." is already the separator between an object and a member name in JavaScript (which is where JSON originates), many libraries avoid or reject it in field names, even though the JSON grammar itself allows any string as a member name.
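As a quick sanity check of the point that JSON itself allows the dot, Gson's tree API parses a member name containing "." without complaint. This snippet is purely illustrative and not from the original answers (JsonParser.parseString requires Gson 2.8.6 or newer):

import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

public class DottedKeyDemo {
    public static void main(String[] args) {
        // A dotted member name is legal JSON; only the binding to a Java field needs help.
        JsonObject o = JsonParser.parseString("{\"instance.id\":\"i-123\"}").getAsJsonObject();
        System.out.println(o.get("instance.id").getAsString()); // prints i-123
    }
}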
Related
I'm developing a REST client using Spring Boot and the Spring Framework (spring-boot-starter-parent 2.1.6.RELEASE).
I have a class representing a response object as shown below:
public class ValidateResponse {
    private String ResponseCode;
    private String ResponseDesc;

    // getters and setters
    // constructors using fields
    // empty constructor
}
I'm creating a webhook for an external API and I need to return a JSON object for a specific endpoint (the JSON object's properties must start with an uppercase letter). I'm returning the object from a @PostMapping method nested under a @RequestMapping root path:
@PostMapping("hooks/validate")
public ValidateResponse responseObj(@RequestHeader Map<String, String> headersObj) {
    ValidateResponse response = new ValidateResponse("000000", "Success");
    logger.info("Endpoint = hooks/validate | Request Headers = {}", headersObj);
    return response;
}
However, when I hit the endpoint from Postman I'm getting duplicate properties:
{
    "ResponseCode": "000000",
    "ResponseDesc": "Success",
    "responseCode": "000000",
    "responseDesc": "Success"
}
I understand that the POJO-to-JSON conversion is handled by Spring, but I don't understand why it is yielding duplicate properties.
Note: I know ResponseDesc and ResponseCode are not declared using standard variable-naming conventions (camelCase).
I've done some digging, and according to the Java Language Specification:
An identifier is an unlimited-length sequence of Java letters and Java digits, the first of which must be a Java letter.
and
The "Java letters" include uppercase and lowercase ASCII Latin letters A-Z (\u0041-\u005a), and a-z (\u0061-\u007a), and, for historical reasons, the ASCII underscore (_, or \u005f) and dollar sign ($, or \u0024). The $ character should be used only in mechanically generated source code or, rarely, to access pre-existing names on legacy systems.
So I'm assuming it's syntactically correct to define a variable with an uppercase first letter [I need clarification on this].
I'm considering having to create the JSON object manually but I'd like to know the cause of this behaviour first. Any pointers are appreciated.
Jackson serializes all the public fields that it comes across. However, if you want Jackson to return the response with your expected property names (in your case, names starting with capital letters), make the fields private and annotate them with @JsonProperty("ExpectedNameHere"). Your class will then typically look as shown below:
public class ValidateResponse {

    @JsonProperty("ResponseCode")
    private String responseCode;

    @JsonProperty("ResponseDesc")
    private String responseDesc;

    // getters and setters
    // constructors using fields
    // empty constructor
}
Note: The getters and setters for these fields should be public; otherwise Jackson won't see anything to serialize in the class. Alternatively, you can keep the fields public and annotate them directly, in which case getters are not strictly required:
public class ValidateResponse {

    @JsonProperty("ResponseCode")
    public String responseCode;

    @JsonProperty("ResponseDesc")
    public String responseDesc;

    // getters and setters
    // constructors using fields
    // empty constructor
}
This should fix your problem, although at first I did not know the reason, as it required deeper Jackson investigation.
EDIT
I found out the reason.
The fields got duplicated because in your case you had:
two public fields named in upper case, which Jackson processes as the properties ResponseCode and ResponseDesc, and
two getters, getResponseCode and getResponseDesc, which Jackson resolves as accessors for the properties responseCode and responseDesc.
Summing this up, you have four properties resolved by Jackson. Simply making your fields private will resolve your issue, but I still advise using the @JsonProperty approach.
I added a com.google.code.gson dependency in the project's pom.xml file to configure Spring Boot to use Gson (instead of the default Jackson).
The JSON object returned from the hooks/validate endpoint must have property names starting with a capital letter. Using a Java class to generate the response object was resulting in camelCase property names, so I resorted to creating the JSON response object manually. Here's the code for creating the custom JSON object:
public ResponseEntity<String> responseObj(@RequestHeader Map<String, String> headersObj) {
    HttpHeaders responseHeaders = new HttpHeaders();
    responseHeaders.setContentType(MediaType.APPLICATION_JSON);

    JsonObject response = new JsonObject();
    response.addProperty("ResponseCode", "00000000");
    response.addProperty("ResponseDesc", "Success");

    logger.info("Endpoint = hooks/validate | Request Headers = {}", headersObj);
    return ResponseEntity.ok().headers(responseHeaders).body(response.toString());
}
Note: The JSON object is returned as a String, so the response from the endpoint must set an additional header defining the MediaType, to inform the calling system that the response is in JSON format:
responseHeaders.setContentType(MediaType.APPLICATION_JSON);
then add the header to the response:
return ResponseEntity.ok().headers(responseHeaders).body(response.toString());
I'm using XStream to parse some JSON. I've used XStream quite extensively over the years; however, this issue has me stumped.
I'm getting the following ConversionException...
com.thoughtworks.xstream.converters.ConversionException: For input string: ".232017E.232017E44"
---- Debugging information ----
message : For input string: ".232017E.232017E44"
cause-exception : java.lang.NumberFormatException
cause-message : For input string: ".232017E.232017E44"
class : java.sql.Timestamp
required-type : java.sql.Timestamp
converter-type : com.etepstudios.xstream.XStreamTimestampConverter
line number : -1
class[1] : com.pbp.bookacall.dataobjects.AppleReceipt
converter-type[1] : com.thoughtworks.xstream.converters.reflection.ReflectionConverter
class[2] : com.pbp.bookacall.dataobjects.AppleReceiptCollection
version : 1.4.10
-------------------------------
at com.etepstudios.xstream.XStreamTimestampConverter.unmarshal(XStreamTimestampConverter.java:87)
In my XStreamTimestampConverter class I print out the value that it is attempting to convert, which turns out to be the following:
XStreamTimestampConverter value = 2017-08-05 23:44:23.GMT
Here is the unmarshal function in my converter...
public Object unmarshal(HierarchicalStreamReader reader, UnmarshallingContext context)
{
    Timestamp theTimestamp;
    Date theDate;
    String value = reader.getValue();
    try
    {
        SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.Z");
        theDate = formatter.parse(value);
        theTimestamp = new Timestamp(theDate.getTime());
    }
    catch (Exception e)
    {
        System.out.println("XStreamTimestampConverter value = " + value);
        throw new ConversionException(e.getMessage(), e);
    }
    return theTimestamp;
}
Any idea where this odd string is coming from? It does not exist anywhere in my JSON. Does XStream have some odd .[num]E.[num]E[num] notation for something? The numbers change each time I run this. I also get a For input string: "" on occasion, yet the value is similar to the one above. It's like it's randomly picking up odd values from somewhere.
The data source is Apple's In-App Purchase /VerifyReceipt web call. The system works just fine sometimes, but other times it does not. It's also important to note that in this very run it parsed hundreds of other Date/Timestamp strings using this converter. It just gets confused, perhaps due to the size of the data?
So I figured out what was going on here. The unmarshal function above is not exactly as I have it in code...
The SimpleDateFormat formatter is actually a field of the class rather than being created inside the unmarshal method. Therefore, if XStream holds on to an instance of my converter and unmarshal is called from multiple threads, the formatter can get confused, since it is the same object.
That's my only guess at this point, as moving the formatter initialization into the method solved the issue. SimpleDateFormat is indeed not thread-safe; its Javadoc says it must be synchronized externally when accessed concurrently.
It was just the sheer amount of data and the number of concurrent calls that exposed the issue. Just a tip for anyone else in case this happens to them.
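For anyone hitting the same problem, a common alternative to constructing a new formatter on every call is a ThreadLocal, so each thread keeps its own instance. This is only a sketch of the idea, reusing the pattern string from the converter above; it is not the poster's actual code.

import java.sql.Timestamp;
import java.text.ParseException;
import java.text.SimpleDateFormat;

public class ThreadSafeTimestampParsing {

    // SimpleDateFormat is not thread-safe, so keep one instance per thread.
    private static final ThreadLocal<SimpleDateFormat> FORMATTER =
            ThreadLocal.withInitial(() -> new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.Z"));

    public static Timestamp parseTimestamp(String value) throws ParseException {
        return new Timestamp(FORMATTER.get().parse(value).getTime());
    }
}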
I'm trying to use Redis to store some cache data for my entity, which has different types of fields inside, for example,
public class Job {
    private String id;
    private Date createTime; // Long
    private String submitterName;
    private JobDefinition jobDef; // another class
}
There are more fields, and because several of them are updated more frequently than others, I decided to save this job as a hash in Redis, with each field as a hash key. The nested objects such as jobDef are not important here, so I used Jackson2JsonRedisSerializer as the hashValueSerializer for RedisTemplate, and the jobDef object is simply serialized into a long JSON string, which is totally fine in my case.
But I don't know how I can effectively deserialize the whole job object back from Redis. The type I pass to the deserializer is Jackson2JsonRedisSerializer(Map.class), but it complains when deserializing the String keys and values.
So is this an invalid usage of RedisTemplate, or how should I configure my serializer for it?
EDIT:
Adding more code details:
@Autowired
private StringRedisTemplate redisTemplate; // I'm using a String template because I need the same redisTemplate for some key-value/list operations too

Map jobHash = new ObjectMapper().convertValue(job, Map.class);
redisTemplate.setHashValueSerializer(new Jackson2JsonRedisSerializer(Map.class));
redisTemplate.opsForHash().putAll("job:" + job.getId(), jobHash); // after this the job hash shows up in Redis as expected, and the jobDef member is serialized and saved as a JSON string

Map jobMap = redisTemplate.opsForHash().entries("job:" + job.getId()); // but this won't work: it throws an exception complaining that it cannot deserialize a String value to a Map; and when I set Jackson2JsonRedisSerializer(String.class) it throws an exception that it cannot resolve the byte code
2nd EDIT:
If I use JdkSerializationRedisSerializer as the HashValueSerializer in the RedisTemplate, the deserialization works fine; however, the downside is that the values stored in Redis are no longer the human-readable strings you get with Jackson2JsonRedisSerializer.
The Jackson2JsonRedisSerializer does not include type mapping information in the actual hash structure.
The resulting Redis HASH looks something like this:
127.0.0.1:6379> hgetall job:1
1) "id"
2) "\"1\""
3) "createTime"
4) "1455778716799"
5) "submitterName"
6) "\"Jon Snow\""
7) "jobDef"
8) "{\"def\":\"nightwatch\"}"
The ObjectMapper produces a LinkedHashMap for the JobDefinition entry, which fails to deserialize because the target type is unknown.
Using the GenericJackson2JsonRedisSerializer includes type information so the resulting Redis HASH looks like this:
127.0.0.1:6379> hgetall job:1
1) "id"
2) "\"1\""
...
7) "jobDef"
8) "{\"#class\":\"java.util.LinkedHashMap\",\"def\":\"nightwatch\"}"
This allows the values to be deserialized correctly.
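A minimal sketch of wiring this up against the template and names from the question (GenericJackson2JsonRedisSerializer lives in org.springframework.data.redis.serializer; the variable names are illustrative):

// Register the type-aware serializer before writing the hash.
redisTemplate.setHashValueSerializer(new GenericJackson2JsonRedisSerializer());

Map jobHash = new ObjectMapper().convertValue(job, Map.class);
redisTemplate.opsForHash().putAll("job:" + job.getId(), jobHash);

// Reading the hash back now reconstructs the nested jobDef entry instead of
// failing, because the stored JSON carries an @class hint.
Map jobMap = redisTemplate.opsForHash().entries("job:" + job.getId());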
Another approach would be to NOT use a specific HashValueSerializer but instead use a DecoratingStringHashMapper along with the StringRedisTemplate.
DecoratingStringHashMapper<Job> mapper = new DecoratingStringHashMapper<Job>(
        new JacksonHashMapper<Job>(Job.class));

template.opsForHash().putAll("job:" + job.id, mapper.toHash(job));
Map jobMap = template.opsForHash().entries("job:" + job.id);
The DecoratingStringHashMapper will produce a Redis Hash as follows:
127.0.0.1:6379> hgetall job:1
1) "id"
2) "1"
3) "createTime"
4) "1455780810643"
5) "submitterName"
6) "Jon Snow"
7) "jobDef"
8) "{def=nightwatch}"
Unfortunately there is no Jackson2HashMapper. Please vote for DATAREDIS-423 and help us prioritize.
I am using FlexJSON within my Play Framework application, but at the point where I try to deserialize the JSON string it throws java.lang.ClassCastException: java.lang.String cannot be cast to java.lang.Boolean:
User user = new JSONDeserializer<User>()
        .use(null, User.class)
        .deserialize(body);
body is the JSON string passed into the controller using standard jQuery/AJAX, and
User has the following Boolean field declared:
public Boolean isCurrentUser;
Any ideas as to what I am doing wrong?
Thanks
In JSON, boolean is a type. Your JSON is:
{"user_id":"18","isCurrentUser":"true","title":"mr","description":"description"}
when it should be:
{"user_id":"18","isCurrentUser":true,"title":"mr","description":"description"}
Note that true is not a String but a boolean. The parser fails because it finds a String where it expects a boolean. Fix the JSON generation so it emits a boolean, not a String.
I'm trying to parse some JSON in Grails using the grails.converters.JSON library. I have a field which will contain either a string or a null value. When I parse the JSON and get the field, the null values come back as JSONObject.NULL. This is a problem when checking != null, because JSONObject.NULL evaluates as non-null, which breaks null checks:
def obj = JSON.parse('{"date1":null,"date2":"2011-06-26T05:00:00Z"}')
def date1 = obj.date1
if (date1)
    parse(date1) // parse error occurs here because date1 evaluates true in the if, since it is JSONObject.NULL
Is there an easy way to get the parser to produce a real null value so that I don't have to check whether the object is JSONObject.NULL?
I tried the suggestion here to use .toString(), but it ended up returning the string value 'null' instead of actual null for a JSONObject.NULL value.
You may find this more useful and natural
JSONObject.NULL.equals(jsonObj.get("key_name"))
Have a look at: http://grails.1312388.n4.nabble.com/The-groovy-truth-of-JSONObject-Null-td3661040.html
Ian Roberts mentions a nice trick to make a null check possible:
JSONObject.NULL.metaClass.asBoolean = {-> false}
I think I found a better solution, which consists of overriding the toString() implementation of the JSONObject.NULL inner class: copy the JSONObject.java file into your Grails project's src/java folder and then change the implementation to this:
/**
 * Get the "" string value.
 * @return An empty String "".
 */
@Override
public String toString() {
    return "";
}
Once you restart with this new class in your classpath, the classloader will use your JSONObject class instead of the one packaged in the Grails dependencies.
Make sure you keep it in the same package as the original.
For more details you can go here: https://github.com/grails/grails-core/issues/9129
Hope it helps :-)