I have a data structure that I serialize to a JSON file using Jackson Databind. As time progresses, the data model changes, but my application still needs to be able to read old versions of the JSON. My intention is that when an old version of the JSON is encountered, it is implicitly converted to the new version of the data structure in memory, and the next time it is serialized, it is stored in the new format version.
For newly added properties this is simple: I specify a default value in Kotlin, and Jackson uses that default if the property is missing from the JSON. However, this case is more complicated. Previously, I had the following data structure:
data class Options(
    var applyClahePerColorChannel: Boolean = false
)
Now, I want to make this more general and change the data structure to the following:
data class Options(
    var multichannelMode: MultichannelMode = MultichannelMode.ApplyToLuminance
)

enum class MultichannelMode {
    ApplyToLuminance, ApplyToAllColorsSeparately
}
Now, when reading an old version of the JSON, applyClahePerColorChannel == false should implicitly be translated to multichannelMode == ApplyToLuminance and applyClahePerColorChannel == true to multichannelMode == ApplyToAllColorsSeparately.
How can I achieve that in Jackson in a concise way?
Here's a solution I found, but I'm still open to better suggestions.
A simple renamed property can be handled with @JsonAlias like so:
If this data model...
data class Options(
    var applyClahePerColorChannel: Boolean = false
)
...changes to...
data class Options(
    var applyPerColorChannel: Boolean = false
)
...the old name can be added as an alias like so:
data class Options(
    @get:JsonAlias("applyClahePerColorChannel")
    var applyPerColorChannel: Boolean = false
)
Jackson will then accept either name when deserializing, but it will always write the new name when serializing.
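For illustration, here is a minimal sketch of reading the old key through the alias (the JSON literal is made up, and jackson-module-kotlin is assumed to be on the classpath):

import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import com.fasterxml.jackson.module.kotlin.readValue

fun main() {
    val mapper = jacksonObjectMapper()
    // Old JSON still uses the previous property name; the alias routes it to the new one.
    val options = mapper.readValue<Options>("""{"applyClahePerColorChannel": true}""")
    println(options.applyPerColorChannel)          // true
    // Serializing again writes only the new property name.
    println(mapper.writeValueAsString(options))    // {"applyPerColorChannel":true}
}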
However, in my case, I also changed the type of the variable, requiring a custom converter like so:
data class Options(
    @get:JsonAlias("applyClahePerColorChannel")
    @get:JsonDeserialize(converter = BooleanToMultichannelModeConverter::class)
    var multichannelMode: MultichannelMode = MultichannelMode.ApplyToLuminance
)

class BooleanToMultichannelModeConverter : Converter<String, MultichannelMode> {

    override fun convert(value: String): MultichannelMode {
        return when (value) {
            in listOf("true", "True", "TRUE") -> MultichannelMode.ApplyToAllColorsSeparately
            in listOf("false", "False", "FALSE") -> MultichannelMode.ApplyToLuminance
            else -> MultichannelMode.valueOf(value)
        }
    }

    @OptIn(ExperimentalStdlibApi::class)
    override fun getInputType(typeFactory: TypeFactory): JavaType = typeFactory
        .constructType(String::class.starProjectedType.javaType)

    @OptIn(ExperimentalStdlibApi::class)
    override fun getOutputType(typeFactory: TypeFactory): JavaType = typeFactory
        .constructType(MultichannelMode::class.starProjectedType.javaType)
}
The converter tells Jackson not to map the value to the target type itself, but to read it as a plain string and hand it to the converter class, which first tries to interpret the string as a boolean and, if that fails, as an enum constant name.
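For illustration, a minimal sketch of how both format versions would then be read (the JSON literals are made up, and jackson-module-kotlin is assumed):

import com.fasterxml.jackson.module.kotlin.jacksonObjectMapper
import com.fasterxml.jackson.module.kotlin.readValue

fun main() {
    val mapper = jacksonObjectMapper()
    // Old format: the boolean under the old name goes through the alias and the converter.
    val oldFormat = mapper.readValue<Options>("""{"applyClahePerColorChannel": true}""")
    println(oldFormat.multichannelMode)            // ApplyToAllColorsSeparately
    // New format: the enum constant name falls through to the valueOf branch of the converter.
    val newFormat = mapper.readValue<Options>("""{"multichannelMode": "ApplyToLuminance"}""")
    println(newFormat.multichannelMode)            // ApplyToLuminance
    // Writing back out always produces the new format.
    println(mapper.writeValueAsString(oldFormat))  // {"multichannelMode":"ApplyToAllColorsSeparately"}
}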
Related
I'm calling different APIs that use the same key name in the JSON response. Depending on the response, one field may be of different types.
To be clear:
The key "results" when calling the API nº1 is a JSON object
The key "results" when calling the API nº2 is a JSON array
My code looks like this when using the second API:
data class Result(
    @SerializedName("results") var persons: ArrayList<Person> = ArrayList()
)
The question is whether there's any way to use the same class without caring whether it's a JSON array or a JSON object.
I believe you can define results as an instance of com.fasterxml.jackson.databind.JsonNode.
data class Result(
    val results: JsonNode
)
Then you can process results based on its type—whether it is an ArrayNode or an ObjectNode (as both extend JsonNode):
fun processResults(results: JsonNode) = when {
    results.isArray -> processArrayNode(results)
    else -> processObjectNode(results)
}

private fun processArrayNode(list: JsonNode) /* : return whatever you need */ {
    val elements = list
        .elements()
        .asSequence()
        .toList()
    val mappedElements = elements.map {
        processObjectNode(it)
    }
    // do whatever you need with the array
}

private fun processObjectNode(person: JsonNode) /* : return whatever you need */ {
    // This transforms the JsonNode into a LinkedHashMap where the keys are the JSON keys
    // and the values are the values (here interpreted as JsonNodes).
    val fieldsMap = person
        .fields()
        .asSequence()
        .associateBy({ it.key }, { it.value })
    // process whatever you need
}
This is one way to use the same DTO for both API calls. In my opinion, it is not worth the extra work. I would create two DTOs containing the results field, where in one it is an instance of Person, and in the other it is an instance of List<Person>.
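For instance, a minimal sketch of that two-DTO alternative (the class names are illustrative, and it assumes API nº1 returns a single Person object under results):

// Hypothetical DTOs, one per API shape; each call site deserializes into the matching one.
data class ObjectResult(val results: Person)      // API nº1: "results" is a single object
data class ListResult(val results: List<Person>)  // API nº2: "results" is an array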
Edit: One little upgrade to the JsonNode-processing snippet above would be to add extension methods to JsonNode:
fun JsonNode.elementsToList(): List<JsonNode> = this
    .elements()
    .asSequence()
    .toList()

fun JsonNode.fieldsToMap(): Map<String, JsonNode> = this
    .fields()
    .asSequence()
    .associateBy({ it.key }, { it.value })
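A quick usage sketch of those helpers, rewriting the earlier dispatch (the return shape here is only an illustration, not part of the original answer):

import com.fasterxml.jackson.databind.JsonNode

// Hypothetical rewrite of processResults using the two extension helpers above.
fun processResults(results: JsonNode): List<Map<String, JsonNode>> = when {
    results.isArray -> results.elementsToList().map { it.fieldsToMap() }  // one map per array element
    else -> listOf(results.fieldsToMap())                                 // wrap the single object
}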
You can use ObjectMapper.typeFactory.constructParametricType to handle generic types:
data class Result<T>(
    var x: T
)

val om = ObjectMapper()
om.registerModule(KotlinModule())

val parsedList = om.readValue<Result<List<String>>>(
    """{"x":["x1", "x2"]}""",
    om.typeFactory.constructParametricType(Result::class.java, List::class.java)
)
println(parsedList)

val parsedMap = om.readValue<Result<Map<String, String>>>(
    """{"x":{"k1": "v1", "k2": "v2"}}""",
    om.typeFactory.constructParametricType(Result::class.java, Map::class.java)
)
println(parsedMap)
Gives output:
Result(x=[x1, x2])
Result(x={k1=v1, k2=v2})
There is a MongoDB instance on my computer with a database. A couple of documents, which I inserted manually, are present in one of the collections. There is a Scala application to manipulate the database, and it has a case class called Location.
case class Location(_id: Option[ObjectId] = None, name: String) {
  var visible: Boolean = false
}
This is the MongoDB configuration in the application.
private val customCodecs = fromProviders(
  classOf[Location]
)
private val javaCodecs =
  fromCodecs(new LocalDateTimeDateCodec(), new LocalDateDateCodec())
private val codecRegistry =
  fromRegistries(customCodecs, javaCodecs, DEFAULT_CODEC_REGISTRY)

val dbConnection = MongoClient(dbURI)
val database: MongoDatabase = dbConnection.getDatabase(dbName).withCodecRegistry(codecRegistry)
There are more classOf definitions in customCodecs; I just removed them here. The dbURI string is retrieved from a config file.
There is a controller endpoint, which returns all Locations from the database. The result is this:
[{"_id":{},"name":"Hungary","visible":false},{"_id":{},"name":"Germany","visible":false},{"_id":{},"name":"France","visible":false},{"_id":{},"name":"Switzerland","visible":false},{"_id":{},"name":"Poland","visible":false}]
The documents in the database have an ObjectId, since I entered them manually, and some documents should have the visible property set to true. I suspect there is something wrong with the JSON serialization, but I cannot figure out what.
This is the code which queries the collection.
val query = collection.find().toFuture()
Await.result(query, 10.seconds).toList
The service method calls this code and passes the result to the controller.
import org.json4s.native.Serialization.write
val languages = enrollmentService.getAllLanguages
logger.info("GET all languages")
Ok(Json.parse(write[List[Language]](languages)))
I use json4s for JSON serialization / deserialization.
What could be the issue here?
Perhaps you need to include org.json4s.mongo.ObjectIdSerializer?
I want to create a JSON file for use as part of a simple web prototyping exercise. LINQPad is perfect for accessing the data from my DB in just the shape I need; however, I cannot get it out as JSON very easily.
I don't really care what the schema is, because I can adapt my JavaScript to work with whatever is returned.
Is this possible?
A more fluent solution is to add the following methods to the "My Extensions" file in LINQPad:
public static String DumpJson<T>(this T obj)
{
    return
        obj
            .ToJson()
            .Dump();
}

public static String ToJson<T>(this T obj)
{
    return
        new System.Web.Script.Serialization.JavaScriptSerializer()
            .Serialize(obj);
}
Then you can use them like this in any query you like:
Enumerable.Range(1, 10)
    .Select(i =>
        new
        {
            Index = i,
            IndexTimesTen = i * 10,
        })
    .DumpJson();
I added "ToJson" separately so it can be used in with "Expessions".
This is not directly supported, and I have opened a feature request here. Vote for it if you would also find this useful.
A workaround for now is to do the following:
Set the language to C# Statement(s)
Add an assembly reference (press F4) to System.Web.Extensions.dll
In the same dialog, add a namespace import to System.Web.Script.Serialization
Use code like the following to dump out your query as JSON
new JavaScriptSerializer().Serialize(query).Dump();
There's a solution with Json.NET, since it does indented formatting and renders JSON dates properly. Add Json.NET via NuGet, reference Newtonsoft.Json.dll in your "My Extensions" query, and add the following code:
public static object DumpJson(this object value, string description = null)
{
    return GetJson(value).Dump(description);
}

private static object GetJson(object value)
{
    object dump = value;
    var strValue = value as string;
    if (strValue != null)
    {
        var obj = JsonConvert.DeserializeObject(strValue);
        dump = JsonConvert.SerializeObject(obj, Newtonsoft.Json.Formatting.Indented);
    }
    else
    {
        dump = JsonConvert.SerializeObject(value, Newtonsoft.Json.Formatting.Indented);
    }
    return dump;
}
Use .DumpJson() instead of .Dump() to render the result. It's possible to add more .DumpJson() overloads with different signatures if necessary.
As of version 4.47, LINQPad has the ability to export JSON built in. Combined with the new lprun.exe utility, it can also satisfy your needs.
http://www.linqpad.net/lprun.aspx
I need to serialize/deserialize a Scala class with structure something like the following:
@JsonIgnoreProperties(ignoreUnknown = true, value = Array("body"))
case class Example(body: Array[Byte]) {
  lazy val isNativeText = bodyIsNativeText
  lazy val textEncodedBody = (if (isNativeText) new String(body, "UTF-8") else Base64.encode(body))

  def this(isNativeText: Boolean, textEncodedBody: String) =
    this(if (isNativeText) textEncodedBody.getBytes("UTF-8") else Base64.decode(textEncodedBody))

  def bodyIsNativeText: Boolean = // determine if the body was natively a string or not
}
Its main member is an array of bytes, which MIGHT represent a UTF-8 encoded textual string, but might not. The primary constructor accepts an array of bytes, but there is an alternate constructor which accepts a string together with a flag indicating whether this string is base64-encoded binary data or the actual native text we want to store.
For serializing to a JSON object, I want to store the body as a native string rather than a base64-encoded string if it is native text. That's why I use @JsonIgnoreProperties to not include the body property, and instead have a textEncodedBody that gets echoed out in the JSON.
The problem comes when I try to deserialize it like so:
val e = Json.parse[Example]("""{'isNativeText': true, 'textEncodedBody': 'hello'}""")
I receive the following error:
com.codahale.jerkson.ParsingException: Invalid JSON. Needed [body],
but found [isNativeText, textEncodedBody].
Clearly, I have a constructor that will work...it just is not the default one. How can I force Jerkson to use this non-default constructor?
EDIT: I've attempted to use both the @JsonProperty and @JsonCreator annotations, but Jerkson appears to disregard both of them.
EDIT2: Looking over the Jerkson case class serialization source code, it looks like a case class method with the same name as one of its fields will be used the way a @JsonProperty would function - that is, as a JSON getter. If I could do that, it would solve my problem. Not being super familiar with Scala, I have no idea how to do that; is it possible for a case class to have a user-defined method with the same name as one of its fields?
For reference, here is the code below that leads me to this conclusion...
private val methods = klass.getDeclaredMethods
  .filter { _.getParameterTypes.isEmpty }
  .map { m => m.getName -> m }.toMap

def serialize(value: A, json: JsonGenerator, provider: SerializerProvider) {
  json.writeStartObject()
  for (field <- nonIgnoredFields) {
    val methodOpt = methods.get(field.getName)
    val fieldValue: Object = methodOpt.map { _.invoke(value) }.getOrElse(field.get(value))
    if (fieldValue != None) {
      val fieldName = methodOpt.map { _.getName }.getOrElse(field.getName)
      provider.defaultSerializeField(if (isSnakeCase) snakeCase(fieldName) else fieldName, fieldValue, json)
    }
  }
  json.writeEndObject()
}
Correct me if I'm wrong, but it looks like Jackson/Jerkson will not support arbitrarily nested JSON. There's an example on the wiki that uses nesting, but it looks like the target class must have nested classes corresponding to the nested JSON.
Anyway, if you're not using nesting with your case classes then simply declaring a second case class and a couple implicit conversions should work just fine:
case class Example(body: Array[Byte]) {
  // Note that you can just inline the body of bodyIsNativeText here
  lazy val isNativeText: Boolean = // determine if the body was natively a string or not
}

case class ExampleRaw(isNativeText: Boolean, textEncodedBody: String)

implicit def exampleToExampleRaw(ex: Example) = ExampleRaw(
  ex.isNativeText,
  if (ex.isNativeText) new String(ex.body, "UTF-8")
  else Base64.encode(ex.body)
)

implicit def exampleRawToExample(raw: ExampleRaw) = Example(
  if (raw.isNativeText) raw.textEncodedBody.getBytes("UTF-8")
  else Base64.decode(raw.textEncodedBody)
)
Now you should be able to do this:
val e: Example = Json.parse[ExampleRaw](
"""{'isNativeText': true, 'textEncodedBody': 'hello'}"""
)
You could leave the original methods and annotations you added to make the JSON generation continue to work with the Example type, or you could just convert it with a cast:
generate(Example(data): ExampleRaw)
Update:
To help catch errors you might want to do something like this too:
case class Example(body: Array[Byte]) {
  // Note that you can just inline the body of bodyIsNativeText here
  lazy val isNativeText: Boolean = // determine if the body was natively a string or not
  lazy val doNotSerialize: String = throw new Exception("Need to convert Example to ExampleRaw before serializing!")
}
That should cause an exception to be thrown if you accidentally pass an instance of Example instead of ExampleRaw to a generate call.
I need to encode and decode AS3 objects in a typed manner. http://code.google.com/p/as3corelib/ only supports untyped encoding and decoding.
http://code.google.com/p/ason/ supports some kind of typed objects but is not very robust, e.g. it fails on Date objects. Any recommendations?
To make it clear: It MUST be JSON and it MUST be strong typed and robust.
JSON support is built into AS3. The preferred method to transmit data over the wire is AMF, which does provide you typed objects.
If you have to use JSON, then I guess you might have to come up with some sort of custom protocol to be able to encode/decode with types.
You would actually need a reflection utility that read beans in JSON format and then produce your object. It really depends on how deep you want to go.
as3Commons has a reflect package that could help. They also have a JSONTypeProvider, which is not exactly what you need but can put you on the right track.
You could modify any of the IOC frameworks to produce the context by parsing JSON instead of the regular XML most of them use.
You could modify ASON and add a custom type parser. You would have to send a variable in your JSON object containing the type of the object, and use that with flash.utils.getDefinitionByName.
Another approach would be to just parse the objects with a regular JSON parser and then, if it has a defined type, create an instance of that object and initialize the properties.
Something like this, to get you started:
var beanInfo:Object = JSON.decode( jsonString );
beanInfo = _parseBean( beanInfo );

private function _parseBean(beanInfo:Object):Object {
    if ( beanInfo.hasOwnProperty("_type") ) {
        var clazz:Class = getDefinitionByName( beanInfo._type ) as Class;
        beanInfo.__clazz = clazz;
        var instance:Object = new clazz;
        for( var prop:String in beanInfo ) {
            if( instance.hasOwnProperty(prop) ) instance[prop] = _getPropertyFrom(beanInfo[prop]);
        }
        return instance;
    }
    return beanInfo;
}

private function _getPropertyFrom(property:String):* {
    var xml:XML = describeType( beanInfo.__clazz );
    //find the type of the current property.
    var type:String = xml...
    //if it is a simple object then do something like
    switch( type ) {
        case "number":
            return parseFloat( property ) as Number;
        case "int":
        case "uint":
            return parseInt( property );
        case "string":
            return property as String;
        ...
        default:
            //As it is, it does not support complex objects.
            //You would use reflection. But then you could save the whole switch...
            break;
    }
}
Flash has its own serialization system.
var serializer:ByteArray = new ByteArray();
serializer.writeObject(new Sprite());
serializer.position = 0;
var data:String = serializer.readUTFBytes(serializer.bytesAvailable);
trace(data); //Will show you the binary jibberish
You can use registerClassAlias to add support for custom classes.
JSON doesn't really define a means to convey type information. It's just strings and ints and arrays and so on. So basically you need some sort of "pickle" for AS3 that's based on JSON. I would suggest you look into Flex/Flash remoting solutions and see how they package objects to be transmitted for RPC; you might be able to modify that solution to use JSON. I'm actually doubtful you'll find a library like this. Does it have to be JSON? I'm pretty sure there are XML-based libraries that do this.
JSON is not implemented in the Flash virtual machine, and therefore there is no typed object "JSON" as there is "Xml." So basically you can decode JSON just fine, but the type you're going to get is Object. You can then access data using the keys in the object as an associative array.
http://blog.alien109.com/2009/02/11/php5-json-as3corelib-a-beautiful-thing/
JSON lib/utils official from adobe:
http://code.google.com/p/as3corelib/source/browse/#svn%2Ftrunk%2Fsrc%2Fcom%2Fadobe%2Fserialization%2Fjson
As good as it gets. :)
There are two operations you need to consider: 1) serializing an object of a particular type into JSON and 2) deserializing a JSON string into an object of a particular type.
The serialization part is easy - just use the built-in JSON.stringify(). Deserializing a JSON string into an object of a particular type in ActionScript is where it gets interesting (and where the answer to your question is). You need to write your own deserialization function for the class(es) you will need to deserialize. In that function, you need to provide a reviver function to JSON.parse(), which allows you to customize how the JSON gets deserialized.
For example:
public static function deserializeComplexObject(json:String):ComplexObject
{
    if (null == json || "null" == json)
    {
        return null;
    }
    var complexObject:ComplexObject = new ComplexObject();
    var parsedObject:Object = JSON.parse(
        json,
        function (key:String, value:Object):*
        {
            switch (key)
            {
                case "keyForNumber":
                    return value;
                case "keyForComplexObject2":
                    return deserializeComplexObject2(JSON.stringify(value));
                case "keyForComplexObject3":
                    return deserializeComplexObject3(JSON.stringify(value));
                case "keyForString":
                    return value;
                case "keyForBoolean":
                    return value;
                default:
                    return value;
            }
        }
    );
    complexObject.keyForNumber = parsedObject.keyForNumber;
    complexObject.keyForComplexObject2 = parsedObject.keyForComplexObject2;
    // continue setting fields
    // …
    return complexObject;
}
Each case statement corresponds to a top-level key in the JSON string. You don't actually need separate case statements for every key - you can use the default case to handle all keys that map to values that are one of the simple types (Object, Array, String, Number, Boolean, null) by returning the value as-is.
I have now forked the json part of http://code.google.com/p/as3corelib/ and added typed object support...