Using arbitrary JSON objects in OpenRasta - json

I can't seem to find anything in the OpenRasta docs or tutorials that shows how to use arbitrary JSON objects (i.e. objects not predefined using C# classes) for both receiving from and responding back to the client.
One way to do it would be to use JsonValue and write a custom codec that would just use the (de)serialization features provided by JsonValue. That should be pretty straightforward and less than 50 lines of code, but I wondered if there isn't anything built into OpenRasta?
(One downside of JsonValue is that MS has not yet released it, so you can't yet deploy it to customers (see 1. "Additional Use Rights"). But in cases where that matters, any other JSON library, such as Json.NET, can be used.)

I have written, like most people, a very simple codec that supports dynamics as inputs and outputs to handlers using json.net. You can also register that codec with an anonymous type and it works brilliantly. You end up with this:
public object Post(dynamic myCustomer)
{
    return new { response = myCustomer.Id };
}

I just implemented a JSON codec using JsonFx. It goes like this:
using System.IO;
using System.Text;
using JsonFx.Json;

namespace Example
{
    [global::OpenRasta.Codecs.MediaType("application/json")]
    public class JsonFXCodec : global::OpenRasta.Codecs.IMediaTypeWriter, global::OpenRasta.Codecs.IMediaTypeReader
    {
        public void WriteTo(object entity, global::OpenRasta.Web.IHttpEntity response, string[] codecParameters)
        {
            JsonWriter json = new JsonWriter();
            using (TextWriter w = new StreamWriter(response.Stream, Encoding.UTF8))
            {
                json.Write(entity, w);
            }
        }

        public object ReadFrom(global::OpenRasta.Web.IHttpEntity request, global::OpenRasta.TypeSystem.IType destinationType, string destinationName)
        {
            JsonReader json = new JsonReader();
            using (TextReader r = new StreamReader(request.Stream, Encoding.UTF8))
            {
                return json.Read(r, destinationType.StaticType);
            }
        }

        public object Configuration { get; set; }
    }
}
If it is registered for "object" then it seems to work for any class:
ResourceSpace.Has.ResourcesOfType<object>()
    .WithoutUri
    .TranscodedBy<JsonFXCodec>();

Related

How to use ServiceStack to store POCOs to MariaDB having complex types (objects and structs) blobbed as JSON?

I've got following setup: C#, ServiceStack, MariaDB, POCOs with objects and structs, JSON.
The main question is: how to use ServiceStack to store POCOs to MariaDB with complex types (objects and structs) blobbed as JSON and still have working de/serialization of the same POCOs? Each of these individual tasks is supported, but I had problems when putting them all together, mainly because of the structs.
... finally, while writing this I found a solution, so it may look like I answered my own question, but I would still like to hear from more experienced people, because the solution I found seems a little complicated to me. Details and two subquestions arise later in the text.
Sorry for the length and for possible misinformation caused by my limited knowledge.
Simple example. This is the final working one I ended with. At the beginning there were no SomeStruct.ToString()/Parse() methods and no JsConfig settings.
using Newtonsoft.Json;
using ServiceStack;
using ServiceStack.DataAnnotations;
using ServiceStack.OrmLite;
using ServiceStack.Text;
using System.Diagnostics;

namespace Test
{
    public class MainObject
    {
        public int Id { get; set; }
        public string StringProp { get; set; }
        public SomeObject ObjectProp { get; set; }
        public SomeStruct StructProp { get; set; }
    }

    public class SomeObject
    {
        public string StringProp { get; set; }
    }

    public struct SomeStruct
    {
        public string StringProp { get; set; }

        public override string ToString()
        {
            // Unable to use .ToJson() here (ServiceStack does not serialize structs).
            // Unable to use ServiceStack's JSON.stringify here because it just takes ToString() => stack overflow.
            // => Therefore Newtonsoft.Json used.
            var serializedStruct = JsonConvert.SerializeObject(this);
            return serializedStruct;
        }

        public static SomeStruct Parse(string json)
        {
            // This method behaves differently for just deserialization or when part of Save().
            // Details in the text.
            // After playing with different options of altering the json input I ended with just taking what comes.
            // After all it is not necessary, but maybe useful in other situations.
            var structItem = JsonConvert.DeserializeObject<SomeStruct>(json);
            return structItem;
        }
    }

    internal class ServiceStackMariaDbStructTest
    {
        private readonly MainObject _mainObject = new MainObject
        {
            ObjectProp = new SomeObject { StringProp = "SomeObject's String" },
            StringProp = "MainObject's String",
            StructProp = new SomeStruct { StringProp = "SomeStruct's String" }
        };

        public ServiceStackMariaDbStructTest()
        {
            // This one line is needed to store complex types as blobbed JSON in MariaDB.
            MySqlDialect.Provider.StringSerializer = new JsonStringSerializer();
            JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
            JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
        }

        public void Test_Serialization()
        {
            try
            {
                var json = _mainObject.ToJson();
                if (!string.IsNullOrEmpty(json))
                {
                    var objBack = json.FromJson<MainObject>();
                }
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }

        public void Test_Save()
        {
            var cs = "ConnectionStringToMariaDB";
            var dbf = new OrmLiteConnectionFactory(cs, MySqlDialect.Provider);
            using var db = dbf.OpenDbConnection();
            db.DropAndCreateTable<MainObject>();
            try
            {
                db.Save(_mainObject);
                var dbObject = db.SingleById<MainObject>(_mainObject.Id);
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }
    }
}
What (I think) I know / have tried but at first didn't help to solve it myself:
ServiceStack stores complex types in the DB as blobbed JSV by default (see the last paragraph of the first section of https://github.com/ServiceStack/ServiceStack.OrmLite), so it is necessary to set MySqlDialect.Provider.StringSerializer = new JsonStringSerializer(); as proposed in https://github.com/ServiceStack/ServiceStack.OrmLite#pluggable-complex-type-serializers => the default JSV is changed to JSON.
ServiceStack's serialization does not work with structs out of the box; they need to be treated in a special way:
a) according to https://github.com/ServiceStack/ServiceStack.Text#c-structs-and-value-types and example https://github.com/ServiceStack/ServiceStack.Text/#using-structs-to-customize-json it is necessary to implement TStruct.ToString() and static TStruct.ParseJson()/ParseJsv() methods.
b) according to https://github.com/ServiceStack/ServiceStack.Text/#typeserializer-details-jsv-format and unit tests https://github.com/ServiceStack/ServiceStack.Text/blob/master/tests/ServiceStack.Text.Tests/CustomStructTests.cs it shall be TStruct.ToString() (the same as in a) and static TStruct.Parse().
Subquestion #1: which one is the right one? For me, ParseJson() was never called, but Parse() was. Is this a documentation issue, or is ParseJson() used in some other situation?
I implemented option b). Results:
IDbConnection.Save(_mainObject) saved the item to MariaDB. Success.
Through the saving process ToString() and Parse() were called. In Parse, incoming JSON looked this way:
"{\"StringProp\":\"SomeStruct's String\"}". Fine.
Serialization worked. Success.
Deserialization failed. I don't know the reason, but JSON incoming to Parse() was "double-escaped":
"{\\\"StringProp\\\":\\\"SomeStruct's String\\\"}"
Subquestion #2: Why the "double-escaping" in Parse on deserialization?
I tried to solve structs with JsConfig (and Newtonsoft.Json to get proper JSON):
JsConfig<SomeStruct>.SerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.DeSerializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results:
Save failed: the json input to JsonConvert.DeserializeObject(json) during Save was just the type name "WinAmbPrototype.SomeStruct".
De/serialization worked.
b) then I implemented ToString(), also using Newtonsoft.Json. During Save, ToString() was used instead of JsConfig.SerializeFn, even though JsConfig.SerializeFn was still set (maybe by design, I do not judge). Results:
Save failed: the json input to DeserializeFn called during Save changed; now it was JSV-like "{StringProp:SomeStruct's String}", but still not deserializable as JSON.
De/serialization worked.
Then (during writing this I was still without any solution) I found JsConfig.Raw* "overrides" and tried them:
JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results are the same as in 2a.
b) then I implemented ToString(). Results:
BOTH WORKED. No Parse() method needed for this task.
But it is a very fragile setup:
if I removed ToString(), it failed (now I understand why: the default ToString() produced just the type name, as in 2a and 3a).
if I removed the RawSerializeFn setting, it failed in RawDeserializeFn ("double-escaped" JSON).
Is there some simpler solution? I would be very glad if someone pointed me in a better direction.
Two kinds of solution would be acceptable (both of them, because they apply in different circumstances):
if I am the owner of TStruct: just a pure TStruct.ToString() and static TStruct.Parse() supporting out-of-the-box de/serialization and DB storage by ServiceStack (without different input arriving in Parse()).
if I am a consumer of TStruct with no JSON support implemented and no access to its code: so far I have not found a way; if ToString() is not implemented, Save to DB does not work. It would be fine if the JsConfig serialize functions were enough for both de/serialization and for saving to the DB.
And the best solution would not employ another dependency (e.g. Newtonsoft.Json) to serialize structs. Maybe some JsConfig.ShallProcessStructs = true; (WARNING: just a tip, not working as of 2021-04-02) would be fine for such situations.
ServiceStack treats structs like a single scalar value type, just like most of the core BCL value types (e.g. TimeSpan, DateTime, etc). Overloading the Parse() and ToString() methods and the struct's constructor lets you control the serialization/deserialization of custom structs.
Docs have been corrected. Structs use Parse whilst classes use ParseJson/ParseJsv
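For illustration only, a scalar-style struct following that contract might look like this (Temperature is a hypothetical type, not taken from the question):

using System.Globalization;

public struct Temperature
{
    public double Celsius { get; set; }

    // ServiceStack persists the struct as this single string value...
    public override string ToString() =>
        Celsius.ToString(CultureInfo.InvariantCulture);

    // ...and rehydrates it through the static Parse method.
    public static Temperature Parse(string value) =>
        new Temperature { Celsius = double.Parse(value, CultureInfo.InvariantCulture) };
}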
If you want to serialize a model's properties, I'd suggest you use a class instead, as the behavior you're looking for is that of a POCO DTO.
If you want to have structs serialized as DTOs in your RDBMS, an alternative you can try is to just use JSON.NET for the complex type serialization, e.g:
public class JsonNetStringSerializer : IStringSerializer
{
    public To DeserializeFromString<To>(string serializedText) =>
        JsonConvert.DeserializeObject<To>(serializedText);

    public object DeserializeFromString(string serializedText, Type type) =>
        JsonConvert.DeserializeObject(serializedText, type);

    public string SerializeToString<TFrom>(TFrom from) =>
        JsonConvert.SerializeObject(from);
}
MySqlDialect.Provider.StringSerializer = new JsonNetStringSerializer();

ServiceStack.Text CSV serialization of IEnumerable<object> ignores custom serialization functions

Firstly, please forgive any rookie mistakes here - I'm not a regular poster I'm afraid.
Now on to the nitty gritty...
I am trying to use ServiceStack.Text to serialize objects to CSV. If I keep it simple, everything works as expected when serializing objects of a known type.
However I want to serialize many objects and I don't know the type at runtime so I am writing a reusable component where all data is treated as a System.Object. We already do this same routine for Json serialization without problems. But CsvSerializer appears to handle objects differently during serialization.
Sample code
public void TestIEnumerableObjectSerialization()
{
    var data = GenerateSampleData();

    JsConfig<DateTime>.SerializeFn =
        time => new DateTime(time.Ticks, DateTimeKind.Utc).ToString("yyyy-MM-dd HH:mm:ss");

    var csv = CsvSerializer.SerializeToCsv(data);
    Console.WriteLine(csv);

    Assert.Equal("DateTime\r\n"
        + "2017-06-14 00:00:00\r\n"
        + "2017-01-31 01:23:45\r\n",
        csv);
}

object[] GenerateSampleData()
{
    return new object[] {
        new POCO
        {
            DateTime = new DateTime(2017, 6, 14)
        },
        new POCO
        {
            DateTime = new DateTime(2017, 1, 31, 01, 23, 45)
        }
    };
}

public class POCO
{
    public DateTime DateTime { get; set; }
}
The result of this code is that the custom serialization function is not invoked, and the DateTime is written out using the standard ToString() method.
The cause?
The CsvWriter.Write method is inspecting the type of the records and if the type is Object it is treated as a Dictionary<string, object> and CsvDictionaryWriter generates the output.
In turn, CsvDictionaryWriter uses the ToCsvField() extension method to write each property of a record.
The problem is that ToCsvField() converts the value of each property to a string using ToString() meaning no custom serialization is performed.
JsonSerializer uses TypeSerializer.SerializeToString(text) to serialize the properties of an Object using any configured custom serialization functions; but this doesn't happen with CsvSerializer.
A possible solution?
Without complicating CsvSerializer, the ToCsvField() extension method could be updated to use TypeSerializer to handle the serialization to a string. Here is what I've been testing with so far:
public static object ToCsvField(this object text)
{
    var textSerialized = TypeSerializer.SerializeToString(text).StripQuotes();

    return textSerialized == null || !CsvWriter.HasAnyEscapeChars(textSerialized)
        ? textSerialized
        : string.Concat
          (
              CsvConfig.ItemDelimiterString,
              textSerialized.Replace(CsvConfig.ItemDelimiterString, CsvConfig.EscapedItemDelimiterString),
              CsvConfig.ItemDelimiterString
          );
}
So far I haven't come across an issue with this change, although someone may prefer not to allocate a new intermediate variable before the return statement.
Hopefully that is enough information, so on to my questions...
Has anyone else experienced this issue?
Am I doing something wrong and should I be serializing Objects a different way?
If this is a suitable fix/implementation of TypeSerializer, what are the chances of this being addressed in an update to ServiceStack.Text? I would raise an issue on GitHub but the ServiceStack.Text repo doesn't let me raise issues.
Thanks in advance.

Don't understand how to use JSON.NET with ASP.NET Core WebAPI

I've completed my first ASP.NET Core Web API and I'd like to try my hand at manually serializing/deserializing JSON via the JSON.NET library. In the JSON.NET documentation they give the following simple manual serialization example:
public static string ToJson(this Person p)
{
    StringWriter sw = new StringWriter();
    JsonTextWriter writer = new JsonTextWriter(sw);

    writer.WriteStartObject();

    // "name" : "Jerry"
    writer.WritePropertyName("name");
    writer.WriteValue(p.Name);

    // "likes": ["Comedy", "Superman"]
    writer.WritePropertyName("likes");
    writer.WriteStartArray();
    foreach (string like in p.Likes)
    {
        writer.WriteValue(like);
    }
    writer.WriteEndArray();

    writer.WriteEndObject();

    return sw.ToString();
}
What's lacking for a beginner such as myself is how to use this string. For example, consider the following:
[HttpGet("/api/data")
[Produces("application/json")]
public IActionResult GetData()
{
return Ok(new Byte[SomeBigInt]);
}
In the above code I don't really know where ASP.NET Core serializes the array to JSON... I'm assuming it happens somewhere under the hood. If I were to manually serialize (using the JSON.NET example) some big Byte array, what do I do with the resultant string? Is it just "return Ok(myJsonString);"? Won't the built-in serializer (not knowing that it is already the result of a serialization operation) serialize it again?
Since ASP.NET Core is quite flexible, there are several ways to return JSON. If you want to return JSON from a controller, one of the most straightforward ways to do it is like this:
[HttpGet("/api/data")]
public JsonResult GetData()
{
    return Json(new
    {
        fieldOneString = "some value",
        fieldTwoInt = 2
    });
}
Under the hood the Json() helper method on the Controller is using JSON.NET to do the JSON serialization and then sending that as the response body.
You could do the same thing like this:
string jsonText = JsonConvert.SerializeObject(new
{
    fieldOneString = "some value",
    fieldTwoInt = 2
});

Response.WriteAsync(jsonText);
Note: to use Response.WriteAsync(jsonText) you need to add using Microsoft.AspNetCore.Http to your file and have a project reference to Microsoft.AspNetCore.Http.Abstractions.
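If you already have a JSON string (for example one produced by the manual JsonTextWriter code from the question), one way to avoid it being serialized a second time is to return it as raw content with an explicit content type. A minimal sketch (BuildJsonManually is a hypothetical placeholder for your own serialization code):

[HttpGet("/api/data")]
public IActionResult GetData()
{
    // jsonText is assumed to already be valid JSON text.
    string jsonText = BuildJsonManually();

    // Content() writes the string to the response as-is, so it is not serialized again.
    return Content(jsonText, "application/json");
}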

Spring MVC Test, MockMVC: Conveniently convert objects to/from JSON

I am used to JAX-RS and would like to have similar comfort when sending requests using Spring MVC and working with the responses, i.e. on the client side inside my tests.
On the server (controller) side I'm quite happy with the automatic conversion, i.e. it suffices to just return an object instance and have JSON in the resulting HTTP response sent to the client.
Could you tell me how to work around the manual process of converting objectInstance to jsonString or vice versa in these snippets? If possible, I'd also like to skip configuring the content type manually.
String jsonStringRequest = objectMapper.writeValueAsString(objectInstance);

ResultActions resultActions = mockMvc.perform(post(PATH)
        .contentType(MediaType.APPLICATION_JSON)
        .content(jsonStringRequest)
)

String jsonStringResponse = resultActions.andReturn().getResponse().getContentAsString();
Some objectInstanceResponse = objectMapper.readValue(jsonStringResponse, Some.class);
For comparison, with the JAX-RS client API I can easily send an object using request.post(Entity.entity(objectInstance, MediaType.APPLICATION_JSON_TYPE)) and read the response using response.readEntity(Some.class);
If you have lots of response objects, you could create some generic JsonToObject mapper factory. It could then be used to detect the object type from a generic response (all response objects inherit from the same generic class) and respond/log properly on a bad mapping attempt.
I do not have a code example at hand, but as a pseudocode:
public abstract class GenericResponse {
    public String responseClassName = null;
    // get/set
}
In the server code, add the name of the actual response object to this class.
The JsonToObject factory
public class ConverterFactory<T> {
    private T objectType;

    public ConverterFactory(T type) {
        objectType = type;
    }

    public T convert(String jsonString) {
        // Type check
        GenericResponse genResp = mapper.readValue(result.getResponse().getContentAsString(),
                GenericResponse.class);
        if (objectType.getClass().getSimpleName().equals(genResp.getResponseClassName())) {
            // ObjectMapper code
            return mapper.readValue(result.getResponse().getContentAsString(),
                    objectType.class);
        } else {
            // Error handling
        }
    }
}
I think this could be extended with annotations to do more automation magic with the response (start by checking BeanPostProcessor).
@Component
public class AnnotationWorker implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(final Object bean, String name) throws BeansException {
        ReflectionUtils.doWithFields(bean.getClass(), field -> {
            // make the field accessible if defined private
            ReflectionUtils.makeAccessible(field);
            if (field.getAnnotation(MyAnnotation.class) != null) {
                field.set(bean, log);
            }
        });
        return bean;
    }
}
The above code snippet is copied from my current project and it injects into fields; you would need to change it so that it works for methods, e.g. wherever you may need it.
Having all this implemented may be tricky, and I can't even say it will necessarily work, but it's something to try if you don't mind a bit of educational work.

Jersey/Genson: Unmarshalling single object array

Similar to Jersey: Json array with 1 element is serialized as object, BUT on the client side. E.g. I receive a JSON object where a field is regularly an array, but in case there is only one element, it is a single object.
{"fileInfo":[{"fileName":"weather.arff","id":"10"},{"fileName":"supermarket.arff","id":"11"}]}
versus
{"fileInfo":{"fileName":"weather.arff","id":"10"}}
I'm parsing/unmarshalling the JSON using Jersey/Genson. Of course, if the JSON doesn't match the target class I receive an error (such as expected [ but read '{' ).
I've read a lot about this bug and how to avoid it when creating JSON objects on the SERVER side, but I found nothing about how to handle this issue when dealing with it on the CLIENT side.
As always, I prefer the most codeless possibility if there are several solutions...
BTW: MOXy works, but it does not marshal native Object-type objects, which is another requirement...
Update
Starting with Genson 1.3 release you can achieve it by enabling permissiveParsing:
Genson genson = new GensonBuilder().usePermissiveParsing(true).create();
Answer
Uh, do you know what library produces this on server side? I am curious to see who is responsible for all those badly structured jsons out there...
It is not yet supported in Genson. Originally because IMO people should not produce such dynamic json. Anyway, I opened an issue - this can be easily done, you can expect it to be present in the release coming next week.
Otherwise here is a way to achieve it without breaking the existing mechanisms.
You need to register a Factory that will use Gensons collections factory to create an instance of its standard collection converter. Then you will wrap this converter in another one that will handle the object to array logic. Here is the code (not codeless..., but if you wait a bit you won't have to code :)).
import com.owlike.genson.convert.DefaultConverters.CollectionConverterFactory;
class SingleObjectAsCollectionFactory implements Factory<Converter<Collection>> {
    // get the default factory
    Factory<Converter<Collection<?>>> defaultFactory = CollectionConverterFactory.instance;

    @Override
    public Converter<Collection> create(Type type, Genson genson) {
        // obtain an instance of the correct default converter for this type
        final CollectionConverter defaultConverter = (CollectionConverter) defaultFactory.create(type, genson);
        // wrap it in your own converter
        return new Converter<Collection>() {
            @Override
            public void serialize(Collection object, ObjectWriter writer, Context ctx) throws Exception {
                defaultConverter.serialize(object, writer, ctx);
            }

            @Override
            public Collection deserialize(ObjectReader reader, Context ctx) throws Exception {
                if (reader.getValueType() == ValueType.OBJECT) {
                    Object object = defaultConverter.getElementConverter().deserialize(reader, ctx);
                    Collection result = defaultConverter.create();
                    result.add(object);
                    return result;
                } else {
                    return defaultConverter.deserialize(reader, ctx);
                }
            }
        };
    }
}
And then register it
Genson genson = new GensonBuilder()
        .withConverterFactory(new SingleObjectAsCollectionFactory())
        .create();