ServiceStack.Text CSV serialization of IEnumerable<object> ignores custom serialization functions

Firstly, please forgive any rookie mistakes here - I'm not a regular poster I'm afraid.
Now on to the nitty gritty...
I am trying to use ServiceStack.Text to serialize objects to CSV. If I keep it simple, everything works as expected when serializing objects of a known type.
However, I want to serialize many objects whose types I don't know until runtime, so I am writing a reusable component where all data is treated as a System.Object. We already do this same routine for JSON serialization without problems, but CsvSerializer appears to handle objects differently during serialization.
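For comparison, the JSON path we already use looks roughly like this (a simplified sketch of our approach, not the exact production code) - the custom JsConfig<DateTime>.SerializeFn is applied even though the data is only typed as object[]:

// Sketch only: the same late-bound data serialized as JSON honours the custom DateTime function.
JsConfig<DateTime>.SerializeFn =
    time => new DateTime(time.Ticks, DateTimeKind.Utc).ToString("yyyy-MM-dd HH:mm:ss");

object[] data = GenerateSampleData();              // same helper as in the sample below
var json = JsonSerializer.SerializeToString(data); // "2017-06-14 00:00:00" appears in the output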
Sample code
public void TestIEnumerableObjectSerialization()
{
    var data = GenerateSampleData();

    JsConfig<DateTime>.SerializeFn =
        time => new DateTime(time.Ticks, DateTimeKind.Utc).ToString("yyyy-MM-dd HH:mm:ss");

    var csv = CsvSerializer.SerializeToCsv(data);
    Console.WriteLine(csv);

    Assert.Equal("DateTime\r\n"
        + "2017-06-14 00:00:00\r\n"
        + "2017-01-31 01:23:45\r\n",
        csv);
}

object[] GenerateSampleData()
{
    return new object[] {
        new POCO
        {
            DateTime = new DateTime(2017, 6, 14)
        },
        new POCO
        {
            DateTime = new DateTime(2017, 1, 31, 01, 23, 45)
        }
    };
}

public class POCO
{
    public DateTime DateTime { get; set; }
}
The result of this code is that the custom serialization function is not invoked, and the DateTime is written out using the standard ToString() method.
The cause?
The CsvWriter.Write method inspects the type of the records; if the type is Object, the records are treated as Dictionary<string, object> and CsvDictionaryWriter generates the output.
In turn, CsvDictionaryWriter uses the ToCsvField() extension method to write each property of a record.
The problem is that ToCsvField() converts the value of each property to a string using ToString(), meaning no custom serialization is performed.
JsonSerializer uses TypeSerializer.SerializeToString(text) to serialize the properties of an Object, applying any configured custom serialization functions, but this doesn't happen with CsvSerializer.
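To illustrate the difference (a rough sketch based on my reading of the code, not an excerpt from ServiceStack.Text):

// With JsConfig<DateTime>.SerializeFn configured as in the test above:
var value = new DateTime(2017, 6, 14);

var viaToString = value.ToString();
// culture-dependent, e.g. "14/06/2017 00:00:00" - this is what ToCsvField() emits today

var viaTypeSerializer = TypeSerializer.SerializeToString(value);
// applies the configured SerializeFn ("2017-06-14 00:00:00"), though possibly wrapped in
// JSV quotes - hence the StripQuotes() call in the proposed fix below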
A possible solution?
Without complicating CsvSerializer, the ToCsvField() extension method could be updated to use TypeSerializer to handle the serialization to a string. Here is what I've been testing with so far:
public static object ToCsvField(this object text)
{
    var textSerialized = TypeSerializer.SerializeToString(text).StripQuotes();
    return textSerialized == null || !CsvWriter.HasAnyEscapeChars(textSerialized)
        ? textSerialized
        : string.Concat
        (
            CsvConfig.ItemDelimiterString,
            textSerialized.Replace(CsvConfig.ItemDelimiterString, CsvConfig.EscapedItemDelimiterString),
            CsvConfig.ItemDelimiterString
        );
}
So far I haven't come across an issue with this change, although someone may prefer not to allocate a new intermediate variable before the return statement.
Hopefully that is enough information, so on to my questions...
Has anyone else experienced this issue?
Am I doing something wrong and should I be serializing Objects a different way?
If this is a suitable fix/implementation of TypeSerializer, what are the chances of this being addressed in an update to ServiceStack.Text? I would raise an issue on GitHub but the ServiceStack.Text repo doesn't let me raise issues.
Thanks in advance.

Related

How to use ServiceStack to store POCOs to MariaDB having complex types (objects and structs) blobbed as JSON?

I've got the following setup: C#, ServiceStack, MariaDB, POCOs with objects and structs, JSON.
The main question is: how to use ServiceStack to store POCOs to MariaDB with complex types (objects and structs) blobbed as JSON, and still have working de/serialization of the same POCOs? Each of these tasks is supported on its own, but I had problems when putting them all together, mainly because of structs.
... While writing this I finally found a solution, so it may look like I have answered my own question, but I would still like to hear from more experienced people, because the solution I found seems a little complicated to me. Details and two subquestions come up later in the context.
Sorry for the length and for possible misinformation caused by my limited knowledge.
Simple example. This is the final working version I ended up with. At the beginning there were no SomeStruct.ToString()/Parse() methods and no JsConfig settings.
using Newtonsoft.Json;
using ServiceStack;
using ServiceStack.DataAnnotations;
using ServiceStack.OrmLite;
using ServiceStack.Text;
using System.Diagnostics;

namespace Test
{
    public class MainObject
    {
        public int Id { get; set; }
        public string StringProp { get; set; }
        public SomeObject ObjectProp { get; set; }
        public SomeStruct StructProp { get; set; }
    }

    public class SomeObject
    {
        public string StringProp { get; set; }
    }

    public struct SomeStruct
    {
        public string StringProp { get; set; }

        public override string ToString()
        {
            // Unable to use .ToJson() here (ServiceStack does not serialize structs).
            // Unable to use ServiceStack's JSON.stringify here because it just takes ToString() => stack overflow.
            // => Therefore Newtonsoft.Json used.
            var serializedStruct = JsonConvert.SerializeObject(this);
            return serializedStruct;
        }

        public static SomeStruct Parse(string json)
        {
            // This method behaves differently for just deserialization or when part of Save().
            // Details in the text.
            // After playing with different options of altering the json input I ended with just taking what comes.
            // After all it is not necessary, but maybe useful in other situations.
            var structItem = JsonConvert.DeserializeObject<SomeStruct>(json);
            return structItem;
        }
    }

    internal class ServiceStackMariaDbStructTest
    {
        private readonly MainObject _mainObject = new MainObject
        {
            ObjectProp = new SomeObject { StringProp = "SomeObject's String" },
            StringProp = "MainObject's String",
            StructProp = new SomeStruct { StringProp = "SomeStruct's String" }
        };

        public ServiceStackMariaDbStructTest()
        {
            // This one line is needed to store complex types as blobbed JSON in MariaDB.
            MySqlDialect.Provider.StringSerializer = new JsonStringSerializer();

            JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
            JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
        }

        public void Test_Serialization()
        {
            try
            {
                var json = _mainObject.ToJson();
                if (!string.IsNullOrEmpty(json))
                {
                    var objBack = json.FromJson<MainObject>();
                }
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }

        public void Test_Save()
        {
            var cs = "ConnectionStringToMariaDB";
            var dbf = new OrmLiteConnectionFactory(cs, MySqlDialect.Provider);
            using var db = dbf.OpenDbConnection();
            db.DropAndCreateTable<MainObject>();

            try
            {
                db.Save(_mainObject);
                var dbObject = db.SingleById<MainObject>(_mainObject.Id);
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }
    }
}
What (I think) I know / have tried but at first didn't help to solve it myself:
ServiceStack stores complex types in the DB as blobbed JSV by default (last paragraph of the first section: https://github.com/ServiceStack/ServiceStack.OrmLite), so it is necessary to set it the way the docs propose: MySqlDialect.Provider.StringSerializer = new JsonStringSerializer(); (https://github.com/ServiceStack/ServiceStack.OrmLite#pluggable-complex-type-serializers) => the default JSV is changed to JSON.
ServiceStack's serialization does not work with structs out of the box; they need to be treated in a special way:
a) according to https://github.com/ServiceStack/ServiceStack.Text#c-structs-and-value-types and the example https://github.com/ServiceStack/ServiceStack.Text/#using-structs-to-customize-json it is necessary to implement TStruct.ToString() and static TStruct.ParseJson()/ParseJsv() methods.
b) according to https://github.com/ServiceStack/ServiceStack.Text/#typeserializer-details-jsv-format and the unit tests https://github.com/ServiceStack/ServiceStack.Text/blob/master/tests/ServiceStack.Text.Tests/CustomStructTests.cs it should be TStruct.ToString() (the same as in a) and a static TStruct.Parse() method.
Subquestion #1: which one is the right one? For me, ParseJson() was never called, but Parse() was. Is this a documentation issue, or is it used in some other situation?
I implemented option b). Results:
IDbConnection.Save(_mainObject) saved the item to MariaDB. Success.
Through the saving process ToString() and Parse() were called. In Parse, incoming JSON looked this way:
"{\"StringProp\":\"SomeStruct's String\"}". Fine.
Serialization worked. Success.
Deserialization failed. I don't know the reason, but JSON incoming to Parse() was "double-escaped":
"{\\\"StringProp\\\":\\\"SomeStruct's String\\\"}"
Subquestion #2: Why the "double-escaping" in Parse on deserialization?
I tried to solve structs with JsConfig (and Newtonsoft.Json to get proper JSON):
JsConfig<SomeStruct>.SerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.DeSerializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results:
Save failed: the JSON input passed to JsonConvert.DeserializeObject(json) during Save was just the type name "WinAmbPrototype.SomeStruct".
De/serialization worked.
b) then I implemented ToString(), also using Newtonsoft.Json. During Save, ToString() was used instead of JsConfig.SerializeFn, even though JsConfig.SerializeFn was still set (maybe by design, I do not judge). Results:
Save failed: the JSON input of DeserializeFn called during Save changed; now it was JSV-like "{StringProp:SomeStruct's String}", but still not deserializable as JSON.
De/serialization worked.
Then (at this point in writing I still had no solution) I found the JsConfig.Raw* "overrides" and tried them:
JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results are the same as in 2a.
b) then I implemented ToString(). Results:
BOTH WORKED. No Parse() method needed for this task.
But it is a very fragile setup:
if I removed ToString(), it failed (now I understand why: the default ToString() produced just the type name, as in 2a and 3a).
if I removed the RawSerializeFn setting, it failed in RawDeserializeFn ("double-escaped" JSON).
Is there some simpler solution? I would be very glad if someone could point me in a better direction.
Two outcomes would be acceptable (both relevant, depending on circumstances):
if I am the owner of TStruct: just plain TStruct.ToString() and a static TStruct.Parse() supporting de/serialization and DB storage by ServiceStack out of the box (without different input arriving in Parse()).
if I am a consumer of a TStruct with no JSON support implemented and no access to its code: so far I have not found a way; if ToString() is not implemented, Save to the DB does not work. It would be nice if the JsConfig serialize functions alone were enough both for de/serialization and when saving to the DB.
And the best option would avoid pulling in another dependency (e.g. Newtonsoft.Json) to serialize structs. Maybe something like JsConfig.ShallProcessStructs = true; (WARNING: just an idea, not working as of 2021-04-02) would be fine for such situations.
ServiceStack treats structs like a single scalar value type, just like most of the core BCL value types (e.g. TimeSpan, DateTime, etc.). Overloading the Parse() and ToString() methods and the struct's constructor lets you control the serialization/deserialization of custom structs.
Docs have been corrected. Structs use Parse whilst classes use ParseJson/ParseJsv.
If you want to serialize a model's properties I'd suggest you use a class instead, as the behavior you're looking for is that of a POCO DTO.
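A minimal sketch of that struct convention (my own illustration; the Money type and its pipe-delimited format are made up for the example):

// Hypothetical example type, not part of ServiceStack.
public struct Money
{
    public string Currency { get; set; }
    public decimal Amount { get; set; }

    // ServiceStack treats the struct as a single scalar value,
    // so ToString() defines its serialized form...
    public override string ToString() => Currency + "|" + Amount;

    // ...and the static Parse() method defines how it is read back.
    public static Money Parse(string value)
    {
        var parts = value.Split('|');
        return new Money { Currency = parts[0], Amount = decimal.Parse(parts[1]) };
    }
}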
If you want to have structs serialized as DTOs in your RDBMS, an alternative you can try is to just use JSON.NET for the complex type serialization, e.g.:
public class JsonNetStringSerializer : IStringSerializer
{
    public To DeserializeFromString<To>(string serializedText) =>
        JsonConvert.DeserializeObject<To>(serializedText);

    public object DeserializeFromString(string serializedText, Type type) =>
        JsonConvert.DeserializeObject(serializedText, type);

    public string SerializeToString<TFrom>(TFrom from) =>
        JsonConvert.SerializeObject(from);
}

MySqlDialect.Provider.StringSerializer = new JsonNetStringSerializer();
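For example, with the serializer registered as above, OrmLite blobs the complex properties as JSON.NET output (a sketch reusing the types from the question; connectionString and mainObject are placeholders):

// connectionString and mainObject are placeholders for your own values.
var dbFactory = new OrmLiteConnectionFactory(connectionString, MySqlDialect.Provider);
using var db = dbFactory.OpenDbConnection();
db.DropAndCreateTable<MainObject>();

db.Save(mainObject);                                   // ObjectProp and StructProp stored as JSON text
var fromDb = db.SingleById<MainObject>(mainObject.Id); // read back through the same IStringSerializer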

Spring MVC Test, MockMVC: Conveniently convert objects to/from JSON

I am used to JAX-RS and would like to have similar comfort when sending requests using Spring MVC and working with the responses, i.e. on the client side inside my tests.
On the server (controller) side I'm quite happy with the automatic conversion, i.e. it suffices to just return an object instance and have JSON in the resulting HTTP response sent to the client.
Could you tell me how to work around the manual process of converting objectInstance to jsonString or vice versa in these snippets? If possible, I'd also like to skip configuring the content type manually.
String jsonStringRequest = objectMapper.writeValueAsString(objectInstance);
ResultActions resultActions = mockMvc.perform(post(PATH)
        .contentType(MediaType.APPLICATION_JSON)
        .content(jsonStringRequest)
);
String jsonStringResponse = resultActions.andReturn().getResponse().getContentAsString();
Some objectInstanceResponse = objectMapper.readValue(jsonStringResponse, Some.class);
For comparison, with JAX-RS client API I can easily send an object using request.post(Entity.entity(objectInstance, MediaType.APPLICATION_JSON_TYPE) and read the response using response.readEntity(Some.class);
If you have lots of response objects, you could create some generic JsonToObject mapper factory. It could then be used to detect the object type from a generic response (all response objects inherit from the same generic class) and respond/log properly on a bad mapping attempt.
I do not have a code example at hand, but as a pseudocode:
public abstract class GenericResponse {
    public String responseClassName = null;
    // get/set
}
In the server code, add the name of the actual response object to this class.
The JsonToObject factory
public class ConverterFactory<T> {
    private final Class<T> objectType;
    private final ObjectMapper mapper = new ObjectMapper();

    public ConverterFactory(Class<T> type) {
        objectType = type;
    }

    public T convert(String jsonString) throws IOException {
        // Type check: read the generic envelope first
        // (GenericResponse must be deserializable, i.e. concrete or configured with Jackson type info)
        GenericResponse genResp = mapper.readValue(jsonString, GenericResponse.class);
        if (objectType.getSimpleName().equals(genResp.getResponseClassName())) {
            // ObjectMapper code: map the full payload onto the expected type
            return mapper.readValue(jsonString, objectType);
        } else {
            // Error handling
            throw new IllegalStateException("Unexpected response type: " + genResp.getResponseClassName());
        }
    }
}
I think this could be extended with annotations to do more automation magic with the response (start by looking at BeanPostProcessor).
@Component
public class AnnotationWorker implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(final Object bean, String name) throws BeansException {
        ReflectionUtils.doWithFields(bean.getClass(), field -> {
            // make the field accessible if defined private
            ReflectionUtils.makeAccessible(field);
            if (field.getAnnotation(MyAnnotation.class) != null) {
                field.set(bean, log);
            }
        });
        return bean;
    }
}
The above code snippet is copied from my current project and it injects into fields; you would need to change it so that it works for methods, e.g. ... wherever you may need it.
Having all of this implemented may be tricky, and I can't say it will necessarily work, but it's something to try if you don't mind a bit of educational work.

Jersey/Genson: Unmarshalling single object array

Similar to Jersey: Json array with 1 element is serialized as object BUT on the client side. E.g. I receive a JSON object where a field is regularly an array, but in case there is only one element, it is a single object.
{"fileInfo":[{"fileName":"weather.arff","id":"10"},{"fileName":"supermarket.arff","id":"11"}]}
versus
{"fileInfo":{"fileName":"weather.arff","id":"10"}}
I'm parsing/unmarshalling the JSON using Jersey/Genson. Of course, if the JSON doesn't match the target class I receive an error (such as expected [ but read '{').
I've read a lot about this bug and how to avoid it when creating JSON objects on the SERVER side, but I found nothing about how to handle this issue when dealing with it on the CLIENT side.
As always, I prefer the most codeless possibility if there are several solutions...
BTW: MOXy works, but it does not marshal native Object-type objects, which is another requirement...
Update
Starting with the Genson 1.3 release, you can achieve this by enabling permissiveParsing:
Genson genson = new GensonBuilder().usePermissiveParsing(true).create();
Answer
Uh, do you know what library produces this on server side? I am curious to see who is responsible for all those badly structured jsons out there...
It is not yet supported in Genson. Originally because IMO people should not produce such dynamic json. Anyway, I opened an issue - this can be easily done, you can expect it to be present in the release coming next week.
Otherwise, here is a way to achieve it without breaking the existing mechanisms.
You need to register a Factory that will use Genson's collections factory to create an instance of its standard collection converter. Then you wrap this converter in another one that handles the object-to-array logic. Here is the code (not codeless..., but if you wait a bit you won't have to code :)).
import com.owlike.genson.convert.DefaultConverters.CollectionConverterFactory;

class SingleObjectAsCollectionFactory implements Factory<Converter<Collection>> {
    // get the default factory
    Factory<Converter<Collection<?>>> defaultFactory = CollectionConverterFactory.instance;

    @Override
    public Converter<Collection> create(Type type, Genson genson) {
        // obtain an instance of the correct default converter for this type
        final CollectionConverter defaultConverter = (CollectionConverter) defaultFactory.create(type, genson);
        // wrap it in your own converter
        return new Converter<Collection>() {
            @Override
            public void serialize(Collection object, ObjectWriter writer, Context ctx) throws Exception {
                defaultConverter.serialize(object, writer, ctx);
            }

            @Override
            public Collection deserialize(ObjectReader reader, Context ctx) throws Exception {
                if (reader.getValueType() == ValueType.OBJECT) {
                    Object object = defaultConverter.getElementConverter().deserialize(reader, ctx);
                    Collection result = defaultConverter.create();
                    result.add(object);
                    return result;
                } else {
                    return defaultConverter.deserialize(reader, ctx);
                }
            }
        };
    }
}
And then register it
Genson genson = new GensonBuilder()
        .withConverterFactory(new SingleObjectAsCollectionFactory())
        .create();

How to convert an object containing DateTime fields to JSON in Dart?

I'm trying to convert an object to JSON.
var obj = { "dt": new DateTime.now() };
var s = stringify(obj);
The runtime throws an exception: "Calling toJson method on object failed."
That's expected since DateTime class doesn't have toJson method.
But what should I do in this case?
Javascript's JSON.stringify function has an optional argument replacer which allows me to provide my own way of serialization of any object even if the object has no toJson method. Is there any similar facility in Dart or maybe I can extend somehow DateTime class with my own toJson method?
JSON conversion only works with maps, lists, strings, numbers, booleans, or null. So what if your object contains another type like DateTime?
DateTime → JSON
Let's start with the following object:
class Person {
Person(this.name, this.birthdate);
String name;
DateTime birthdate;
}
You can convert it to a map like this:
final person = Person('Bob', DateTime(2020, 2, 25));
Map<String, dynamic> map = {
'name': person.name,
'birthdate': person.birthdate,
};
If you tried to encode this right now with jsonEncode (or json.encode), you would get an error because the DateTime is not directly serializable. There are two solutions.
Solution 1
You could serialize it yourself first like this:
Map<String, dynamic> map = {
'name': person.name,
'birthdate': person.birthdate.toIso8601String(),
};
final jsonString = json.encode(map);
Note:
Here is the difference between toString and toIso8601String:
2020-02-25 14:44:28.534 // toString()
2020-02-25T14:44:28.534 // toIso8601String()
The toIso8601String doesn't have any spaces so that makes it nicer for conversions and sending over APIs that might not deal with spaces well.
Solution 2
You could use the optional toEncodable function parameter on jsonEncode.
import 'dart:convert';
void main() {
final person = Person('Bob', DateTime(2020, 2, 25));
Map<String, dynamic> map = {
'name': person.name,
'birthdate': person.birthdate,
};
final toJson = json.encode(map, toEncodable: myDateSerializer);
}
dynamic myDateSerializer(dynamic object) {
if (object is DateTime) {
return object.toIso8601String();
}
return object;
}
The toEncodable function just converts the input to a string or to something that jsonEncode can convert to a string.
JSON → DateTime
There is nothing special here. You just have to parse the string into the type that you need. In the case of DateTime you can use its parse or tryParse methods.
final myMap= json.decode(jsonString);
final name = myMap['name'];
final birthdateString = myMap['birthdate'];
final birthdate = DateTime.parse(birthdateString);
final decodedPerson = Person(name, birthdate);
Note that parse will throw an exception if the format of the string cannot be parsed into a DateTime object.
As a model class
class Person {
  Person(this.name, this.birthdate);

  String name;
  DateTime birthdate;

  Person.fromJson(Map<String, dynamic> json)
      : name = json['name'],
        birthdate = DateTime.tryParse(json['birthdate']);

  Map<String, dynamic> toJson() {
    return {
      'name': name,
      'birthdate': birthdate.toIso8601String(),
    };
  }
}
This will not throw an exception if the date is malformed, but birthdate will be null.
Notes
See my fuller answer here.
Thanks to this answer for pointing me in the right direction.
Zdeslav Vojkovic's answer is outdated now.
The JSON.encode() method in dart:convert has an optional toEncodable function that is invoked for objects that are not natively serializable to JSON. It's then up to the user to provide a closure that returns an appropriate serialization of the DateTime.
IMO, it's a flaw in the dart:json library that stringify doesn't support an additional callback to serialize types lacking a toJson implementation. Strangely enough, the parse function does support a reviver argument to customize the deserialization process. Probably the reasoning was along the lines that users can add toJson for their own types and the core library will take care of 'native' types, but DateTime was omitted (maybe because date serialization in JSON is not really a straightforward story).
Maybe the goal of the team was to use custom types instead of Maps (which is nice), but the drawback is that if your custom type contains 10 other properties and 1 of DateTime type, you need to implement a toJson which serializes all of them, even the integers and strings. Hopefully, once the Mirror API is ready, there will be libraries that implement serialization 'out of band' by reading the reflected type information; see below for an example. The promise of Dart is that I would be able to use my types both on the server and the client, but if I have to manually implement serialization/deserialization for all my models then it is too awkward to be usable.
It also doesn't help that DateTime itself is not very flexible with formatting, as there are no methods besides toString, which returns a format that is useless for serialization.
So here are some options:
wrap (or derive from) DateTime in your own type which provides toJson
patch json.stringifyJsonValue and submit the change or at least submit an issue :)
use some 3rd-party library like json-object (just an example; it also doesn't support DateTime serialization, AFAIK)
I am not really happy with any of them :)
I've added a new package to Dart pub.dev that allows JSON serialization of objects within a structure. This package, Jsonize, serializes and deserializes custom classes, and handles DateTime out of the box:
List<dynamic> myList = [1, "Hello!", DateTime.now()];
var jsonRep = Jsonize.toJson(myList);
var myDeserializedList = Jsonize.fromJson(jsonRep);
So it will work with your example:
var obj = { "dt": new DateTime.now() };
var s = Jsonize.toJson(obj);
var obj2 = Jsonize.fromJson(s);
but it can also do this:
var obj = DateTime.now();
var s = Jsonize.toJson(obj);
var dt = Jsonize.fromJson(s);

Typed AS3 JSON Encoder and Decoder?

I need to encode and decode AS3 objects in a typed manner. http://code.google.com/p/as3corelib/ only supports untyped encoding and decoding.
http://code.google.com/p/ason/ supports some kind of typed objects but is not very robust, e.g. it fails on Date objects. Any recommendations?
To make it clear: it MUST be JSON and it MUST be strongly typed and robust.
JSON is built into AS3. The preferred method to transmit data over the wire is AMF, which does provide you with typed objects.
If you have to use JSON, then I guess you might have to come up with some sort of custom protocol to be able to encode/decode with types.
You would actually need a reflection utility that reads beans in JSON format and then produces your object. It really depends on how deep you want to go.
AS3Commons has a reflect package that could help. They also have a JSONTypeProvider, which is not exactly what you need but can put you on the right track.
You could modify any of the IoC frameworks to produce the context by parsing JSON instead of the regular XML most of them use.
You could modify ASON and add a custom type parser. You would have to send a variable in your JSON object containing the type of the object, and use that with flash.utils.getDefinitionByName.
Another approach would be to just parse the objects with a regular JSON parser and then, if it has a defined type, create an instance of that object and initialize the properties.
Something like this, to get you started:
var beanInfo:Object = JSON.decode( jsonString );
beanInfo = _parseBean( beanInfo );

private function _parseBean( beanInfo:Object ):Object {
    if ( beanInfo.hasOwnProperty( "_type" ) ) {
        var clazz:Class = getDefinitionByName( beanInfo._type ) as Class;
        beanInfo.__clazz = clazz;
        var instance:Object = new clazz();
        for ( var prop:String in beanInfo ) {
            if ( instance.hasOwnProperty( prop ) ) instance[prop] = _getPropertyFrom( beanInfo, prop );
        }
        return instance;
    }
    return beanInfo;
}

private function _getPropertyFrom( beanInfo:Object, prop:String ):* {
    var property:* = beanInfo[prop];
    var xml:XML = describeType( beanInfo.__clazz );
    // find the type of the current property.
    var type:String = xml...
    // if it is a simple type then do something like
    switch ( type ) {
        case "number":
            return parseFloat( property ) as Number;
        case "int":
        case "uint":
            return parseInt( property );
        case "string":
            return property as String;
        ...
        default:
            // As it is, it does not support complex objects.
            // You would use reflection. But then you could save yourself the whole switch...
            break;
    }
}
Flash has its own serialization system.
var serializer:ByteArray = new ByteArray();
serializer.writeObject(new Sprite());
serializer.position = 0;
var data:String = serializer.readUTFBytes(serializer.bytesAvailable);
trace(data); //Will show you the binary jibberish
You can use registerClassAlias to add support for custom classes.
JSON doesn't really define a means to convey type information. It's just strings and ints and arrays and so on. So basically you need some sort of "pickle" for AS3 that's based on JSON. I would suggest you look into Flex/Flash remoting solutions and see how they package objects to be transmitted for RPC; you might be able to modify that solution to use JSON. I'm actually doubtful you'll find a library like this. Does it have to be JSON? I'm pretty sure there are XML-based libraries that do this.
JSON is not implemented in the Flash virtual machine, and therefore there is no typed object "JSON" as there is "XML". So basically you can decode JSON just fine, but the type you're going to get is Object. You can then access the data using the keys in the object as an associative array.
http://blog.alien109.com/2009/02/11/php5-json-as3corelib-a-beautiful-thing/
Official JSON lib/utils from Adobe:
http://code.google.com/p/as3corelib/source/browse/#svn%2Ftrunk%2Fsrc%2Fcom%2Fadobe%2Fserialization%2Fjson
As good as it gets. :)
There are two operations you need to consider: 1) serializing an object of a particular type into JSON and 2) deserializing a JSON string into an object of a particular type.
The serialization part is easy - just use the built-in JSON.stringify(). Deserializing a JSON string into an object of a particular type in ActionScript is where it gets interesting (and where the answer to your question lies). You need to write your own deserialization function for the class(es) you will need to deserialize. In that function, you provide a reviver function to JSON.parse(), which allows you to customize how the JSON gets deserialized.
For example:
public static function deserializeComplexObject(json:String):ComplexObject
{
    if (null == json || "null" == json)
    {
        return null;
    }

    var complexObject:ComplexObject = new ComplexObject();

    var parsedObject:Object = JSON.parse(
        json,
        function (key:String, value:Object):*
        {
            switch (key)
            {
                case "keyForNumber":
                    return value;
                case "keyForComplexObject2":
                    return deserializeComplexObject2(JSON.stringify(value));
                case "keyForComplexObject3":
                    return deserializeComplexObject3(JSON.stringify(value));
                case "keyForString":
                    return value;
                case "keyForBoolean":
                    return value;
                default:
                    return value;
            }
        }
    );

    complexObject.keyForNumber = parsedObject.keyForNumber;
    complexObject.keyForComplexObject2 = parsedObject.keyForComplexObject2;
    // continue setting fields
    // …
    return complexObject;
}
Each case statement corresponds to a top-level key in the JSON string. You don't actually need separate case statements for every key - you can use the default case to handle all keys that map to values that are one of the simple types (Object, Array, String, Number, Boolean, null) by returning the value as-is.
I have now forked the json part of http://code.google.com/p/as3corelib/ and added typed object support...