When I run this code in LINQPad using JSON.NET:
var x = JObject.Parse(
    @"{
        ""data"" : [ {
            ""id"" : ""bbab529ecefe58569c2b301a"",
            ""name"" : ""Sample Name"",
            ""group"" : ""8b618be8dc064e653daf62f9"",
            ""description"" : ""Sample Name"",
            ""payloadType"" : ""Geolocation"",
            ""contract"" : ""a9da09a7f4a7e7becf961865"",
            ""keepAlive"" : 0
        } ]
    }");
x.Dump();
An AmbiguousMatchException is thrown when trying to dump the parsed JSON to LINQPad's output window. Why? As far as I can tell this is perfectly legitimate JSON. http://jsonlint.com/ says it's valid, too.
This is most likely a problem with how .Dump() is implemented.
If you check the stack trace:
at System.RuntimeType.GetInterface(String fullname, Boolean ignoreCase)
at System.Type.GetInterface(String name)
at UserQuery.Main()
...
We can see that the method throwing the exception is System.RuntimeType.GetInterface.
System.RuntimeType is one of the concrete classes used to represent Type objects when reflection is used at runtime, so let's check Type.GetInterface(String, Boolean), which has this to say:
AmbiguousMatchException
The current Type represents a type that implements the same generic interface with different type arguments.
So it looks like GetInterface is being called for an interface that the type implements more than once, with different type arguments.
To provoke the same error, simply replace x.Dump(); with this:
var type = x.GetType().GetInterface("System.Collections.Generic.IEnumerable`1", true);
This will throw the same exception.
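In the JObject case, the two conflicting implementations come from Json.NET's own type hierarchy: JObject implements IEnumerable<KeyValuePair<string, JToken>> (via IDictionary<string, JToken>) and IEnumerable<JToken> (via JContainer, which implements IList<JToken>). A quick way to confirm this in LINQPad (just a sketch):
// List every closed IEnumerable<T> that JObject implements; two show up,
// which is exactly what makes the GetInterface call above ambiguous.
typeof(JObject).GetInterfaces()
    .Where(i => i.IsGenericType && i.GetGenericTypeDefinition() == typeof(IEnumerable<>))
    .Dump();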
Here's a simpler LINQPad example that shows the underlying problem:
void Main()
{
    var type = typeof(Problem).GetInterface("System.Collections.Generic.IEnumerable`1", true);
}

public class Problem : IEnumerable<string>, IEnumerable<int>
{
    IEnumerator IEnumerable.GetEnumerator() => ((IEnumerable<string>)this).GetEnumerator();
    IEnumerator<string> IEnumerable<string>.GetEnumerator() => Enumerable.Empty<string>().GetEnumerator();
    IEnumerator<int> IEnumerable<int>.GetEnumerator() => Enumerable.Empty<int>().GetEnumerator();
}
This example will throw the exact same exception.
Conclusion: there is nothing wrong with the JSON, nor with Json.NET; this is a problem with how LINQPad tries to figure out the best way to dump the object to the output window.
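If you just want to see the JSON in the output window, one simple workaround (a sketch, not the only option) is to dump the rendered text instead of the JObject itself, so LINQPad never has to reflect over its interfaces:
x.ToString(Newtonsoft.Json.Formatting.Indented).Dump();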
Related
I've got the following setup: C#, ServiceStack, MariaDB, POCOs with objects and structs, JSON.
The main question is: how do I use ServiceStack to store POCOs in MariaDB with complex types (objects and structs) blobbed as JSON, and still have working de/serialization of the same POCOs? Each of these tasks is supported on its own, but I had problems when putting them all together, mainly because of the structs.
... while writing this I finally found a solution, so it may look like I am answering my own question, but I would still like to hear from more experienced people, because the solution I found seems a little complicated to me. Details and two subquestions come up below.
Sorry for the length and for possible misinformation caused by my limited knowledge.
Simple example. This is the final working version I ended up with. At the beginning there were no SomeStruct.ToString()/Parse() methods and no JsConfig settings.
using Newtonsoft.Json;
using ServiceStack;
using ServiceStack.DataAnnotations;
using ServiceStack.OrmLite;
using ServiceStack.Text;
using System.Diagnostics;

namespace Test
{
    public class MainObject
    {
        public int Id { get; set; }
        public string StringProp { get; set; }
        public SomeObject ObjectProp { get; set; }
        public SomeStruct StructProp { get; set; }
    }

    public class SomeObject
    {
        public string StringProp { get; set; }
    }

    public struct SomeStruct
    {
        public string StringProp { get; set; }

        public override string ToString()
        {
            // Unable to use .ToJson() here (ServiceStack does not serialize structs).
            // Unable to use ServiceStack's JSON.stringify here because it just calls ToString() => stack overflow.
            // => Therefore Newtonsoft.Json is used.
            var serializedStruct = JsonConvert.SerializeObject(this);
            return serializedStruct;
        }

        public static SomeStruct Parse(string json)
        {
            // This method behaves differently for plain deserialization and when called as part of Save().
            // Details in the text.
            // After playing with different ways of altering the json input I ended up just taking what comes in.
            // After all it is not necessary, but maybe useful in other situations.
            var structItem = JsonConvert.DeserializeObject<SomeStruct>(json);
            return structItem;
        }
    }

    internal class ServiceStackMariaDbStructTest
    {
        private readonly MainObject _mainObject = new MainObject
        {
            ObjectProp = new SomeObject { StringProp = "SomeObject's String" },
            StringProp = "MainObject's String",
            StructProp = new SomeStruct { StringProp = "SomeStruct's String" }
        };

        public ServiceStackMariaDbStructTest()
        {
            // This one line is needed to store complex types as blobbed JSON in MariaDB.
            MySqlDialect.Provider.StringSerializer = new JsonStringSerializer();
            JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
            JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
        }

        public void Test_Serialization()
        {
            try
            {
                var json = _mainObject.ToJson();
                if (!string.IsNullOrEmpty(json))
                {
                    var objBack = json.FromJson<MainObject>();
                }
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }

        public void Test_Save()
        {
            var cs = "ConnectionStringToMariaDB";
            var dbf = new OrmLiteConnectionFactory(cs, MySqlDialect.Provider);
            using var db = dbf.OpenDbConnection();
            db.DropAndCreateTable<MainObject>();

            try
            {
                db.Save(_mainObject);
                var dbObject = db.SingleById<MainObject>(_mainObject.Id);
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }
    }
}
What I (think I) know / have tried, which at first didn't help me solve it myself:
ServiceStack stores complex types in the DB as blobbed JSV by default (last paragraph of the first section: https://github.com/ServiceStack/ServiceStack.OrmLite), so it is necessary to set it up as proposed: MySqlDialect.Provider.StringSerializer = new JsonStringSerializer(); (https://github.com/ServiceStack/ServiceStack.OrmLite#pluggable-complex-type-serializers) => the default JSV is changed to JSON.
ServiceStack's serialization does not work with structs; it is necessary to treat them in a special way:
a) according to https://github.com/ServiceStack/ServiceStack.Text#c-structs-and-value-types and the example https://github.com/ServiceStack/ServiceStack.Text/#using-structs-to-customize-json, it is necessary to implement TStruct.ToString() and static TStruct.ParseJson()/ParseJsv() methods.
b) according to https://github.com/ServiceStack/ServiceStack.Text/#typeserializer-details-jsv-format and the unit tests https://github.com/ServiceStack/ServiceStack.Text/blob/master/tests/ServiceStack.Text.Tests/CustomStructTests.cs, it should be TStruct.ToString() (the same as in a) and a static TStruct.Parse().
Subquestion #1: which one is right? For me, ParseJson() was never called, but Parse() was. Is this a documentation issue, or is ParseJson() used in another situation?
I implemented option b). Results:
IDbConnection.Save(_mainObject) saved the item to MariaDB. Success.
During the save process, ToString() and Parse() were called. In Parse(), the incoming JSON looked like this:
"{\"StringProp\":\"SomeStruct's String\"}". Fine.
Serialization worked. Success.
Deserialization failed. I don't know the reason, but the JSON coming into Parse() was "double-escaped":
"{\\\"StringProp\\\":\\\"SomeStruct's String\\\"}"
Subquestion #2: why the "double-escaping" in Parse() on deserialization?
I tried to handle the structs with JsConfig (and Newtonsoft.Json to get proper JSON):
JsConfig<SomeStruct>.SerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.DeSerializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results:
Save failed: the json input to the JsonConvert.DeserializeObject(json) call used during Save was just the type name "WinAmbPrototype.SomeStruct".
De/serialization worked.
b) then I implemented ToString(), also using Newtonsoft.Json. During Save, ToString() was used instead of JsConfig.SerializeFn even though JsConfig.SerializeFn was still set (maybe by design, I do not judge). Results:
Save failed: the json input to DeSerializeFn called during Save changed; now it was JSV-like "{StringProp:SomeStruct's String}", but still not deserializable as JSON.
De/serialization worked.
Then (while writing this I was still without any solution) I found the JsConfig.Raw* "overrides" and tried them:
JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results are the same as in 2a.
b) then I implemented ToString(). Results:
BOTH WORKED. No Parse() method needed for this task.
But it is a very fragile setup:
if I removed ToString(), it failed (now I understand why: the default ToString() produced just the type name, as in 2a and 3a).
if I removed the RawSerializeFn setting, it failed in RawDeserializeFn ("double-escaped" JSON).
Is there some simpler solution? I would be very glad if someone points me in a better direction.
Two kinds of solution would be acceptable (both of them, since they apply in different circumstances):
if I am the TStruct owner: just a pure TStruct.ToString() and a static TStruct.Parse() should support out-of-the-box de/serialization and DB storage by ServiceStack (without Parse() receiving different input in different situations).
if I am a consumer of a TStruct that has no JSON support implemented and I have no access to its code: so far I have not found a way; if ToString() is not implemented, Save to the DB does not work. It would be fine if the JsConfig serialize functions alone were enough both for de/serialization and when used during saving to the DB.
And the best solution would not employ another dependency (e.g. Newtonsoft.Json) to serialize structs. Maybe something like JsConfig.ShallProcessStructs = true; (WARNING: just an idea, not working as of 2021-04-02) would be fine for such situations.
ServiceStack treats structs as a single scalar value type, just like most of the core BCL value types (e.g. TimeSpan, DateTime, etc). Overloading the Parse() and ToString() methods and the struct's constructor lets you control the serialization/deserialization of custom structs.
The docs have been corrected: structs use Parse() whilst classes use ParseJson()/ParseJsv().
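For reference, the minimal shape that contract takes for a custom struct looks something like this (a sketch with a hypothetical Dimensions struct, not code from the question): ServiceStack serializes the struct as the single string produced by ToString() and feeds that string back into the static Parse() method.
public struct Dimensions
{
    public int Width { get; set; }
    public int Height { get; set; }

    // Serialized as this single scalar string value.
    public override string ToString() => $"{Width}x{Height}";

    // Called to turn that string back into the struct.
    public static Dimensions Parse(string value)
    {
        var parts = value.Split('x');
        return new Dimensions { Width = int.Parse(parts[0]), Height = int.Parse(parts[1]) };
    }
}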
If you want to serialize a model's properties, I'd suggest you use a class instead, as the behavior you're looking for is that of a POCO DTO.
If you want to have structs serialized as DTOs in your RDBMS, an alternative you can try is to just use JSON.NET for the complex type serialization, e.g.:
public class JsonNetStringSerializer : IStringSerializer
{
    public To DeserializeFromString<To>(string serializedText) =>
        JsonConvert.DeserializeObject<To>(serializedText);

    public object DeserializeFromString(string serializedText, Type type) =>
        JsonConvert.DeserializeObject(serializedText, type);

    public string SerializeToString<TFrom>(TFrom from) =>
        JsonConvert.SerializeObject(from);
}
MySqlDialect.Provider.StringSerializer = new JsonNetStringSerializer();
I'm using XStream to deserialize some JSON. I've used XStream quite extensively over the years, but this issue has me stumped.
I'm getting the following ConversionException...
com.thoughtworks.xstream.converters.ConversionException: For input string: ".232017E.232017E44"
---- Debugging information ----
message : For input string: ".232017E.232017E44"
cause-exception : java.lang.NumberFormatException
cause-message : For input string: ".232017E.232017E44"
class : java.sql.Timestamp
required-type : java.sql.Timestamp
converter-type : com.etepstudios.xstream.XStreamTimestampConverter
line number : -1
class[1] : com.pbp.bookacall.dataobjects.AppleReceipt
converter-type[1] : com.thoughtworks.xstream.converters.reflection.ReflectionConverter
class[2] : com.pbp.bookacall.dataobjects.AppleReceiptCollection
version : 1.4.10
-------------------------------
at com.etepstudios.xstream.XStreamTimestampConverter.unmarshal(XStreamTimestampConverter.java:87)
In my XStreamTimestampConverter class I print out the value that is being converted, which turns out to be the following...
XStreamTimestampConverter value = 2017-08-05 23:44:23.GMT
Here is the unmarshal function in my converter...
public Object unmarshal(HierarchicalStreamReader reader, UnmarshallingContext context)
{
    Timestamp theTimestamp;
    Date theDate;
    String value = reader.getValue();

    try
    {
        SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.Z");
        theDate = formatter.parse(value);
        theTimestamp = new Timestamp(theDate.getTime());
    }
    catch (Exception e)
    {
        System.out.println("XStreamTimestampConverter value = " + value);
        throw new ConversionException(e.getMessage(), e);
    }

    return theTimestamp;
}
Any idea where this odd string is coming from? It does not exist anywhere in my JSON. Does XStream have some odd .[num]E.[num]E[num] notation for something? The numbers change each time I run this. I also get a For input string: "" error on occasion, yet the value is similar to the one above. It's like it's randomly getting odd values from somewhere.
The data source is Apple's In-App Purchase /VerifyReceipt web call. The system works just fine sometimes, but at other times it does not. It's also important to note that in this very case it parsed hundreds of other Date/Timestamp strings using this converter. It just gets confused. Perhaps due to the size of the data?
So I figured out what was going on here. The unmarshal function above is not exactly as I have it in code...
The SimpleDateFormat formatter is actually a field of the class rather than a local variable in the unmarshal method. So if XStream holds on to an instance of my converter and unmarshal is called from multiple threads, the shared formatter can get confused, since it is the same object.
That's my only guess at this point, as moving the formatter initialization into the method solved the issue. I would say SimpleDateFormat is not thread-safe (its documentation does in fact warn that it must be synchronized externally when accessed from multiple threads).
It was just the sheer amount of data and the number of concurrent calls that exposed this issue. Just a tip for anyone else in case this happens to them.
Firstly, please forgive any rookie mistakes here - I'm not a regular poster I'm afraid.
Now on to the nitty gritty...
I am trying to use ServiceStack.Text to serialize objects to CSV. If I keep it simple, everything works as expected when serializing objects of a known type.
However, I want to serialize many objects and I don't know their types ahead of time, so I am writing a reusable component where all data is treated as a System.Object. We already do the same thing for JSON serialization without problems, but CsvSerializer appears to handle objects differently during serialization.
Sample code
public void TestIEnumerableObjectSerialization()
{
    var data = GenerateSampleData();

    JsConfig<DateTime>.SerializeFn =
        time => new DateTime(time.Ticks, DateTimeKind.Utc).ToString("yyyy-MM-dd HH:mm:ss");

    var csv = CsvSerializer.SerializeToCsv(data);

    Console.WriteLine(csv);

    Assert.Equal("DateTime\r\n"
        + "2017-06-14 00:00:00\r\n"
        + "2017-01-31 01:23:45\r\n",
        csv);
}

object[] GenerateSampleData()
{
    return new object[] {
        new POCO
        {
            DateTime = new DateTime(2017, 6, 14)
        },
        new POCO
        {
            DateTime = new DateTime(2017, 1, 31, 01, 23, 45)
        }
    };
}

public class POCO
{
    public DateTime DateTime { get; set; }
}
The result of this code is that the custom serialization function is not invoked, and the DateTime is written out using the standard ToString() method.
The cause?
The CsvWriter.Write method inspects the type of the records; if the type is Object, each record is treated as a Dictionary<string, object> and CsvDictionaryWriter generates the output.
In turn, CsvDictionaryWriter uses the ToCsvField() extension method to write each property of a record.
The problem is that ToCsvField() converts the value of each property to a string using ToString() meaning no custom serialization is performed.
JsonSerializer uses TypeSerializer.SerializeToString(text) to serialize the properties of an Object using any configured custom serialization functions; but this doesn't happen with CsvSerializer.
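To see the difference in isolation (a sketch, assuming the JsConfig<DateTime>.SerializeFn from the test above has been registered), compare the two calls below; this is what the proposed change that follows relies on:
var dt = new DateTime(2017, 6, 14);
var viaToString = dt.ToString();                              // default formatting, ignores JsConfig
var viaTypeSerializer = TypeSerializer.SerializeToString(dt); // expected to go through the custom SerializeFn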
A possible solution?
Without complicating CsvSerializer, the ToCsvField() extension method could be updated to use TypeSerializer to handle the serialization to a string. Here is what I've been testing with so far:
public static object ToCsvField(this object text)
{
    var textSerialized = TypeSerializer.SerializeToString(text).StripQuotes();

    return textSerialized == null || !CsvWriter.HasAnyEscapeChars(textSerialized)
        ? textSerialized
        : string.Concat
          (
              CsvConfig.ItemDelimiterString,
              textSerialized.Replace(CsvConfig.ItemDelimiterString, CsvConfig.EscapedItemDelimiterString),
              CsvConfig.ItemDelimiterString
          );
}
So far I haven't come across an issue with this change, although someone may prefer not to allocate a new intermediate variable before the return statement.
Hopefully that is enough information, so on to my questions...
Has anyone else experienced this issue?
Am I doing something wrong and should I be serializing Objects a different way?
If this is a suitable fix/implementation of TypeSerializer, what are the chances of this being addressed in an update to ServiceStack.Text? I would raise an issue on GitHub but the ServiceStack.Text repo doesn't let me raise issues.
Thanks in advance.
I'm using the Newtonsoft json.NET parser for JSON parsing. In my deserialization, I have the following code so that errors when converting from String to Int will not force me to throw away the entire object:
var param2 = new JsonSerializerSettings
{
    Error = delegate(object sender, Newtonsoft.Json.Serialization.ErrorEventArgs args)
    {
        args.ErrorContext.Handled = true;
    }
};
bcontent = JsonConvert.DeserializeObject<BContent>(json, param2);
I do not have control over the input data, and parsing errors are very common, so I need to be versatile enough to handle them. Unfortunately, marking all errors as handled causes the deserialization not to terminate when it runs into a different kind of error in a constrained environment.
What I want to do is to mark the errors as handled when they are of a type with a similar message as:
Could not convert string to integer....
But not when they are something different, such as this error which causes the hang:
Unterminated string. Expected delimiter...
What I can do is something like this:
Error = delegate(object sender, Newtonsoft.Json.Serialization.ErrorEventArgs args)
{
    if (args.ErrorContext.Error.Message.Contains("convert string to integer"))
        args.ErrorContext.Handled = true;
}
But it seems like there's no other way to determine a more specific error than JsonReaderException. Has anyone encountered this issue before and found a better workaround than a String.Contains()?
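I don't know of a cleaner type-based check either, since both errors apparently surface as JsonReaderException; one modest improvement (a sketch, using System.Linq for Any()) is to keep the "recoverable" message fragments in a single explicit list, so structural errors like the unterminated string still stop deserialization:
// Only the fragments listed here are treated as recoverable; extend as needed.
var recoverableFragments = new[] { "convert string to integer" };

var settings = new JsonSerializerSettings
{
    Error = (sender, args) =>
    {
        var message = args.ErrorContext.Error.Message;
        args.ErrorContext.Handled = recoverableFragments.Any(f => message.Contains(f));
    }
};
bcontent = JsonConvert.DeserializeObject<BContent>(json, settings);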
I wrote the following '.thrift' file:
service retriveService {
    binary getImageContent(1: string strImgName);
}
I generated two clients, one in Java and the other in AS3 (using the latest thrift-0.9.0.exe).
My generated 'retriveService.java' file has the following method, with 'ByteBuffer' as the return type:
public ByteBuffer getImageContent(String strImgName)
whereas my 'retriveService.as' file has the following method, with 'void' as the return type:
function getImageContent(strImgName:String, onError:Function, onSuccess:Function):void;
I am not able to get the file content in the AS3 implementation since the return type is void.
Any ideas what I'm missing here?
The method result will be received as a parameter of the onSuccess function, so the client code might look like this:
var s : retriveService = ....
s.getImageContent("imageName",
    function(e : Error) : void {
        // Handle error
    },
    function(r : ByteArray) : void {
        // Handle result
    });