MVC binding issue from JSON to enum (type converter exception from int to enum)

I have this problem: I use JSON to send data to the server.
Everything works fine except in a situation like this:
public enum SexType
{
    Male = 0,
    Female = 1
}

class People
{
    public SexType Sex { get; set; }
}
That produces the JSON:
{"Sex" : 0}
When I send it back to the server, the ModelState is filled with this error:
The parameter conversion from type 'System.Int32' to type 'SexType' failed because no type converter can convert between these types.
But if I wrap the value in quotes, everything works:
{"Sex" : '0'}
Has anyone had the same problem?
Thanks, all!

Yes, I got the same problem. The weird part is that if you sent back:
{"Sex" : 'Male'}
it would deserialize with no problem.
To solve the problem, I implemented a custom model binder for enums, leveraging the example found here (slightly modified, as it had some errors):
http://eliasbland.wordpress.com/2009/08/08/enumeration-model-binder-for-asp-net-mvc/
namespace yournamespace
{
    using System;
    using System.Linq;
    using System.Web.Mvc;

    /// <summary>
    /// Generic custom model binder used to properly interpret the int representation
    /// of enum types coming from JSON deserialization, including default values.
    /// </summary>
    /// <typeparam name="T">The enum type to apply this custom model binder to.</typeparam>
    public class EnumBinder<T> : IModelBinder
    {
        private T DefaultValue { get; set; }

        public EnumBinder(T defaultValue)
        {
            DefaultValue = defaultValue;
        }

        #region IModelBinder Members

        public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
        {
            ValueProviderResult value = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
            return value == null ? DefaultValue : GetEnumValue(DefaultValue, value.AttemptedValue);
        }

        #endregion

        public static T GetEnumValue(T defaultValue, string value)
        {
            T enumValue = defaultValue;
            if (!String.IsNullOrEmpty(value) && Contains(typeof(T), value))
                enumValue = (T)Enum.Parse(typeof(T), value, true);
            return enumValue;
        }

        public static bool Contains(Type enumType, string value)
        {
            return Enum.GetNames(enumType).Contains(value, StringComparer.OrdinalIgnoreCase);
        }
    }
}
and then registered the model binder in Global.asax.cs.
In your case it would be something like:
ModelBinders.Binders.Add(typeof(SexType), new EnumBinder<SexType>(SexType.Male));
I am not sure if there is a quicker way, but this works great.

Model binding uses the Enum.Parse() method, which is fairly smart about interpreting strings but does NOT cast or convert other types to strings, even when system-level facilities exist to do so and even when the other type is the enum's underlying storage type.
Is this the right behavior? Arguably so: if you don't know enough to convert your enum values to strings, you may also not be aware that the numeric values on the right-hand side of an enum definition are not necessarily unique within the enum.
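To illustrate with standard BCL calls (not MVC-specific): Enum.Parse only accepts strings, so both a name and a numeric string parse, while a raw Int32 needs a different API entirely:

// Both succeed, because Enum.Parse works on strings:
var byName   = (SexType)Enum.Parse(typeof(SexType), "Male", true); // SexType.Male
var byNumber = (SexType)Enum.Parse(typeof(SexType), "0");          // SexType.Male

// A boxed Int32 never reaches Enum.Parse; converting it takes e.g.:
var fromInt  = (SexType)Enum.ToObject(typeof(SexType), 0);         // SexType.Male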
As a matter of personal taste (and probably because I do far too much statistical analysis programming), for sex I generally prefer a clear boolean value: instead of differentiating between arbitrary values for 'Male' and 'Female', I use a variable called e.g. IsFemale and set it to true or false. This plays more nicely with JSON, since it relies on primitive types common to both languages, and it requires less typing when you want to use it.
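For example, a sketch of that modeling choice using the question's class:

class People
{
    // Round-trips through JSON as a plain primitive, no custom binder needed:
    // {"IsFemale": false}
    public bool IsFemale { get; set; }
}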


EF Core 7 can't deserialize dynamic-members in JSON column

I am trying to map my Name column to a dynamic object. This is how the raw JSON data looks (note that this is SQL-morphed from our old relational data and I am not able to generate or interact with this column via EF Core):
{ "en": "Water", "fa": "آب", "ja": "水", ... }
Just to note, available languages are stored in a separate table and thus are dynamically defined.
Through T-SQL I can interact with these objects perfectly well, e.g.:
SELECT *
FROM [MyObjects]
WHERE JSON_VALUE(Name, '$.' + @languageCode) = @searchQuery
But it seems EF Core doesn't want to even deserialize these objects as a whole, let alone query them.
What I get in a simple GetAll query is an empty Name. Other columns are not affected, though.
I have tried so far:
1. Using an empty class with a [JsonExtensionData] dictionary inside.
2. Using a : DynamicObject inheritance and implementing GetDynamicMembers, TryGetMember, TrySetMember and TryCreateInstance.
3. Directly mapping to a string dictionary.
4. Combining 1 & 2 and adding an indexer operator on top.
All yield the same result: an empty Name.
I have other options, like going back to a relational junction table (which I have many issues with), hardcoding the languages (not really intuitive, and it might cause problems in the future), or using HasJsonConversion (which basically destroys the performance of any search action)... so I'm basically stuck here.
I think this is currently not fully supported:
- You cannot use dynamic operations in an expression tree, such as a Select statement, because it needs to be translated.
- JsonValue and JsonQuery require a path to be resolved.
- If you specify OwnsOne(entity => entity.Owned, owned => owned.ToJson()) and the JSON could not be parsed, you will get an error.
I suggest this workaround while the EF team improves the functionality.
Create a static class with static methods to be used as decoys in the expression tree. These will be mapped to the server's built-in functions.
public static class DBF
{
    public static string JsonValue(this string column, [NotParameterized] string path)
        => throw new NotSupportedException();

    public static string JsonQuery(this string column, [NotParameterized] string path)
        => throw new NotSupportedException();
}
Include the database functions in your OnModelCreating method.
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    base.OnModelCreating(modelBuilder);

    modelBuilder.HasDbFunction(
        typeof(DBF).GetMethod(nameof(DBF.JsonValue))!
    ).HasName("JSON_VALUE").IsBuiltIn();

    modelBuilder.HasDbFunction(
        typeof(DBF).GetMethod(nameof(DBF.JsonQuery))!
    ).HasName("JSON_QUERY").IsBuiltIn();

    // ...

    modelBuilder.Entity<FileInformation>(entity =>
    {
        // Treat the JSON column as plain text.
        entity.Property(x => x.Metadata)
              .HasColumnType("varchar")
              .HasMaxLength(8000);
    });
}
Call them dynamically with LINQ.
var a = await _context.FileInformation
    .AsNoTracking()
    .Where(x => x.Metadata!.JsonValue("$.Property1") == "some value")
    .Select(x => x.Metadata!.JsonValue("$.Property2"))
    .ToListAsync();
You can add casts or even build anonymous types with this method.
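For instance, a projection along those lines (the JSON paths here are hypothetical):

var rows = await _context.FileInformation
    .AsNoTracking()
    .Select(x => new
    {
        // Each decoy call is translated to the corresponding built-in function.
        Title   = x.Metadata!.JsonValue("$.title"),
        Details = x.Metadata!.JsonQuery("$.details")
    })
    .ToListAsync();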
My solution was to add a new class with Key and Value properties, representing the dictionary entries I needed:
public class DictionaryObject
{
    public string Key { set; get; }
    public string Value { set; get; }
}
and instead of having this line in the JSON class :
public Dictionary<string, string> Name { get; set; }
I changed it to:
public List<DictionaryObject> Name { get; set; }
Hope it helps.
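A note on this approach (my assumption, not part of the original answer): with EF Core 7 you would presumably map the list as an owned collection stored as JSON, which also means the column has to hold an array of Key/Value objects rather than the original {"en": "Water"} object shape, e.g.:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // MyObject is a hypothetical entity name for the [MyObjects] table.
    modelBuilder.Entity<MyObject>(b =>
    {
        // Stored as: [{"Key":"en","Value":"Water"},{"Key":"fa","Value":"آب"}]
        b.OwnsMany(x => x.Name, nb => nb.ToJson());
    });
}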

JsonConverter - WebApi - Case Sensitivity - Polymorphic

I'm using a JsonConverter to deal with a polymorphic collection:
class ItemBatch
{
    public List<ItemBase> Items { get; set; }
}

// For type discrimination of ItemBase.
class ItemTypes
{
    public int Value { get; set; }
}

[JsonConverter(typeof(ItemConverter))]
abstract class ItemBase
{
    public abstract ItemTypes Type { get; }
}

class WideItem : ItemBase
{
    public override ItemTypes Type => new ItemTypes { Value = 1 };
    public decimal Width { get; set; }
}

class HighItem : ItemBase
{
    public override ItemTypes Type => new ItemTypes { Value = 2 };
    public decimal Height { get; set; }
}
class ItemConverter : JsonConverter<ItemBase>
{
    public override ItemBase? Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        using (JsonDocument jsonDoc = JsonDocument.ParseValue(ref reader))
        {
            int type = jsonDoc.RootElement.GetProperty("Type").GetProperty("Value").GetInt32();
            // ... deserialize depending on type.
        }
    }
}
I'm using Blazor and store the ItemBatch in IndexedDb before retrieving it again later and sending it to a WebAPI.
Serializing to and deserializing from IndexedDb works fine.
But when I try to send the ItemBatch to the WebAPI, I get the error:
Exception thrown: 'System.Collections.Generic.KeyNotFoundException' in System.Text.Json.dll. An exception of type 'System.Collections.Generic.KeyNotFoundException' occurred in System.Text.Json.dll but was not handled in user code. The given key was not present in the dictionary.
From peeking at various values, I suspected an issue with case sensitivity. Indeed, if I change:
int type = jsonDoc.RootElement.GetProperty("Type").GetProperty("Value").GetInt32();
to
int type;
try
{
    type = jsonDoc.RootElement.GetProperty("Type").GetProperty("Value").GetInt32();
}
catch (Exception)
{
    type = jsonDoc.RootElement.GetProperty("type").GetProperty("value").GetInt32();
}
then I get past this error and my WebAPI gets called.
What am I missing? Why can I serialize and deserialize to and from IndexedDb, while the WebAPI JSON conversion has issues with case sensitivity?
Newtonsoft was case insensitive.
With System.Text.Json you have to pull some more levers.
https://learn.microsoft.com/en-us/dotnet/standard/serialization/system-text-json-migrate-from-newtonsoft-how-to?pivots=dotnet-6-0#case-insensitive-deserialization
Case-insensitive deserialization: During deserialization, Newtonsoft.Json does case-insensitive property name matching by default. The System.Text.Json default is case-sensitive, which gives better performance since it's doing an exact match. For information about how to do case-insensitive matching, see Case-insensitive property matching.
See this URL as well:
https://makolyte.com/csharp-case-sensitivity-in-json-deserialization/
And here is a possible way to deal with it ("How to enable case-insensitive property name matching with System.Text.Json"):
https://learn.microsoft.com/en-us/dotnet/standard/serialization/system-text-json-character-casing
https://learn.microsoft.com/en-us/dotnet/api/system.text.json.jsonelement.getproperty?view=net-6.0#system-text-json-jsonelement-getproperty(system-string)
Remarks
Property name matching is performed as an ordinal, case-sensitive comparison.
I don't think you can overcome that behavior in your "roll your own" converter.
But maybe you can chase this:
https://learn.microsoft.com/en-us/dotnet/api/system.text.json.jsonserializeroptions.propertynamecaseinsensitive?view=net-6.0#system-text-json-jsonserializeroptions-propertynamecaseinsensitive
though that seems to apply outside of "roll your own" code.
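For illustration, a minimal sketch of that option (standard System.Text.Json API). Note it only affects the serializer's property binding; JsonElement.GetProperty inside a custom converter stays case-sensitive, as the Remarks above state:

using System.Text.Json;

static ItemBatch? DeserializeBatch(string json)
{
    var options = new JsonSerializerOptions
    {
        // Match "type"/"Type" and "value"/"Value" during property binding.
        PropertyNameCaseInsensitive = true
    };
    return JsonSerializer.Deserialize<ItemBatch>(json, options);
}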
My conclusion is that with default serialization, JsonSerializerOptions does not set "CamelCase" as the PropertyNamingPolicy, but the HttpClient and WebApi request pipeline do:
PropertyNamingPolicy = JsonNamingPolicy.CamelCase
This means that when I serialize and deserialize to IndexedDb, the default serialization leaves the JSON in Pascal case.
My custom JsonConverter, which uses the options parameter passed into the Read and Write methods, therefore used Pascal case on the client when working with IndexedDb.
However, when the same JsonConverter is called by the HttpClient and the WebApi request pipeline, the options are set to
PropertyNamingPolicy = JsonNamingPolicy.CamelCase
and when parsing to a JsonDocument, the content is now camel-cased, so my reading of the JsonDocument under Pascal-case assumptions fell over.
The answer to my problem was to update the write method as follows:
public override void Write(Utf8JsonWriter writer, SaleCommandBase value, JsonSerializerOptions options)
{
    JsonSerializerOptions newOptions = new JsonSerializerOptions(options) { PropertyNamingPolicy = null };
    JsonSerializer.Serialize(writer, (object)value, newOptions);
}
This forces the serialization to use Pascal case in all situations, whether serializing locally on the client (when writing to IndexedDb) or within the HttpClient when sending to a WebAPI.
Similarly, in the Read method:
using (JsonDocument jsonDoc = JsonDocument.ParseValue(ref reader))
{
    int type = jsonDoc.RootElement.GetProperty("Type").GetProperty("Value").GetInt32();
    var newOptions = new JsonSerializerOptions(options) { PropertyNamingPolicy = null };
    return type switch
    {
        1 => jsonDoc.RootElement.Deserialize<WideItem>(newOptions),
        2 => jsonDoc.RootElement.Deserialize<HighItem>(newOptions),
        _ => throw new InvalidOperationException($"Cannot convert type '{type}'."),
    };
}
By copying whatever options are provided but overriding the naming policy to Pascal case (PropertyNamingPolicy = null), I can be assured that the parsed JSON document will always be in Pascal case, regardless of the options provided by the framework.
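An alternative worth noting (my assumption, not part of the original answer): instead of forcing Pascal case inside the converter, the Web API pipeline's naming policy could be set to match, since ASP.NET Core's web defaults are what inject the camel-case policy in the first place:

// In Program.cs: make the request pipeline serialize with Pascal case too.
builder.Services.AddControllers()
    .AddJsonOptions(o => o.JsonSerializerOptions.PropertyNamingPolicy = null);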

How to use ServiceStack to store POCOs to MariaDB having complex types (objects and structs) blobbed as JSON?

I've got the following setup: C#, ServiceStack, MariaDB, POCOs with objects and structs, JSON.
The main question is: how do I use ServiceStack to store POCOs in MariaDB with complex types (objects and structs) blobbed as JSON, while still having working de/serialization of the same POCOs? Each of these tasks is supported on its own, but I ran into problems when putting them all together, mainly because of the structs.
... While writing this I finally found a solution, so it may look like I am answering my own question, but I would still like to hear from more skilled people, because the solution I found seems a little complicated to me. Details and two subquestions come up later in context.
Sorry for the length and for any misinformation caused by my limited knowledge.
A simple example follows. This is the final working version I ended up with; at the beginning there were no SomeStruct.ToString()/Parse() methods and no JsConfig settings.
using Newtonsoft.Json;
using ServiceStack;
using ServiceStack.DataAnnotations;
using ServiceStack.OrmLite;
using ServiceStack.Text;
using System.Diagnostics;

namespace Test
{
    public class MainObject
    {
        public int Id { get; set; }
        public string StringProp { get; set; }
        public SomeObject ObjectProp { get; set; }
        public SomeStruct StructProp { get; set; }
    }

    public class SomeObject
    {
        public string StringProp { get; set; }
    }

    public struct SomeStruct
    {
        public string StringProp { get; set; }

        public override string ToString()
        {
            // Unable to use .ToJson() here (ServiceStack does not serialize structs).
            // Unable to use ServiceStack's JSON.stringify here because it just calls ToString() => stack overflow.
            // => Therefore Newtonsoft.Json is used.
            var serializedStruct = JsonConvert.SerializeObject(this);
            return serializedStruct;
        }

        public static SomeStruct Parse(string json)
        {
            // This method behaves differently for plain deserialization and when called as part of Save().
            // Details in the text.
            // After playing with different options of altering the json input, I ended up just taking what comes.
            // After all it is not necessary, but maybe useful in other situations.
            var structItem = JsonConvert.DeserializeObject<SomeStruct>(json);
            return structItem;
        }
    }

    internal class ServiceStackMariaDbStructTest
    {
        private readonly MainObject _mainObject = new MainObject
        {
            ObjectProp = new SomeObject { StringProp = "SomeObject's String" },
            StringProp = "MainObject's String",
            StructProp = new SomeStruct { StringProp = "SomeStruct's String" }
        };

        public ServiceStackMariaDbStructTest()
        {
            // This one line is needed to store complex types as blobbed JSON in MariaDB.
            MySqlDialect.Provider.StringSerializer = new JsonStringSerializer();
            JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
            JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
        }

        public void Test_Serialization()
        {
            try
            {
                var json = _mainObject.ToJson();
                if (!string.IsNullOrEmpty(json))
                {
                    var objBack = json.FromJson<MainObject>();
                }
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }

        public void Test_Save()
        {
            var cs = "ConnectionStringToMariaDB";
            var dbf = new OrmLiteConnectionFactory(cs, MySqlDialect.Provider);
            using var db = dbf.OpenDbConnection();
            db.DropAndCreateTable<MainObject>();
            try
            {
                db.Save(_mainObject);
                var dbObject = db.SingleById<MainObject>(_mainObject.Id);
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }
    }
}
What (I think) I know / have tried, which at first didn't help me solve it:
ServiceStack stores complex types in the DB as blobbed JSV by default (last paragraph of the first section: https://github.com/ServiceStack/ServiceStack.OrmLite), so it is necessary to set it up as proposed in https://github.com/ServiceStack/ServiceStack.OrmLite#pluggable-complex-type-serializers: MySqlDialect.Provider.StringSerializer = new JsonStringSerializer(); => the default JSV is changed to JSON.
ServiceStack's serialization does not work with structs; it is necessary to treat them in a special way:
a) according to https://github.com/ServiceStack/ServiceStack.Text#c-structs-and-value-types and the example at https://github.com/ServiceStack/ServiceStack.Text/#using-structs-to-customize-json, it is necessary to implement TStruct.ToString() and static TStruct.ParseJson()/ParseJsv() methods.
b) according to https://github.com/ServiceStack/ServiceStack.Text/#typeserializer-details-jsv-format and the unit tests at https://github.com/ServiceStack/ServiceStack.Text/blob/master/tests/ServiceStack.Text.Tests/CustomStructTests.cs, it shall be TStruct.ToString() (the same as in a) plus a static TStruct.Parse().
Subquestion #1: which one is right? For me, ParseJson() was never called; Parse() was. Is this a documentation issue, or is ParseJson() used in some other situation?
I implemented option b). Results:
IDbConnection.Save(_mainObject) saved the item to MariaDB. Success.
During the saving process, ToString() and Parse() were called. In Parse(), the incoming JSON looked like this:
"{\"StringProp\":\"SomeStruct's String\"}". Fine.
Serialization worked. Success.
Deserialization failed. I don't know the reason, but the JSON coming into Parse() was "double-escaped":
"{\\\"StringProp\\\":\\\"SomeStruct's String\\\"}"
Subquestion #2: Why the "double-escaping" in Parse on deserialization?
I tried to solve the structs with JsConfig (and Newtonsoft.Json to get proper JSON):

JsConfig<SomeStruct>.SerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.DeSerializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);

a) at first without ToString() and Parse() defined in the TStruct. Results:
Save failed: the json input to JsonConvert.DeserializeObject(json), as used during Save, was just the type name "WinAmbPrototype.SomeStruct".
De/serialization worked.
b) then I implemented ToString(), also using Newtonsoft.Json. During Save, ToString() was used instead of JsConfig.SerializeFn even though JsConfig.SerializeFn was still set (maybe by design, I do not judge). Results:
Save failed: the json input to DeSerializeFn called during Save changed; now it was JSV-like "{StringProp:SomeStruct's String}", but still not deserializable as JSON.
De/serialization worked.
Then (while writing this, still without any solution) I found the JsConfig.Raw* "overrides" and tried them:

JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);

a) at first without ToString() and Parse() defined in the TStruct. The results are the same as in 2a.
b) then I implemented ToString(). Results:
BOTH WORKED. No Parse() method is needed for this task.
But it is a very fragile setup:
if I removed ToString(), it failed (now I understand why: the default ToString() produced JSON with just the type name, as in 2a and 3a).
if I removed the RawSerializeFn setting, it failed in RawDeserializeFn ("double-escaped" JSON).
Is there some simpler solution? I would be very glad if someone pointed me in a better direction.
Two solutions would be acceptable (both, because they cover different circumstances):
if I am the owner of TStruct: just pure TStruct.ToString() and static TStruct.Parse() supporting out-of-the-box de/serialization and DB storage by ServiceStack (without different input in Parse()).
if I am a consumer of TStruct with no JSON support implemented and no access to its code: so far I have not found a way; if ToString() is not implemented, Save to DB does not work. It would be fine if the JsConfig serialize functions were enough for both de/serialization and saving to the DB.
And the best solution would avoid employing another dependency (e.g. Newtonsoft.Json) to serialize structs. Maybe some JsConfig.ShallProcessStructs = true; (WARNING: just an idea, not working as of 2021-04-02) would be fine for such situations.
ServiceStack treats structs like a single scalar value type, just like most of the core BCL value types (e.g. TimeSpan, DateTime, etc.). Overloading the Parse() and ToString() methods and the struct's constructor lets you control the serialization/deserialization of custom structs.
The docs have been corrected: structs use Parse(), whilst classes use ParseJson()/ParseJsv().
If you want to serialize a model's properties, I'd suggest you use a class instead, as the behavior you're looking for is that of a POCO DTO.
If you want to have structs serialized as DTOs in your RDBMS, an alternative you can try is to just use JSON.NET for the complex type serialization, e.g.:
public class JsonNetStringSerializer : IStringSerializer
{
    public To DeserializeFromString<To>(string serializedText) =>
        JsonConvert.DeserializeObject<To>(serializedText);

    public object DeserializeFromString(string serializedText, Type type) =>
        JsonConvert.DeserializeObject(serializedText, type);

    public string SerializeToString<TFrom>(TFrom from) =>
        JsonConvert.SerializeObject(from);
}

MySqlDialect.Provider.StringSerializer = new JsonNetStringSerializer();

How do I omit the assembly name from the type name while serializing and deserializing in JSON.Net?

We have a single contract assembly which contains all our data contracts. We are using JSON.NET to serialize our data contracts to JSON.
JSON.NET adds both the type name and the assembly name to the $type property on serialization. Since all our data contracts are in the same assembly, which is always loaded in the current app domain, we should be able to omit the assembly name.
How can we achieve this?
Thanks
You can use the Binder property in your JsonSerializerSettings.
This blog post (by the library author) describes the steps: http://james.newtonking.com/archive/2011/11/19/json-net-4-0-release-4-bug-fixes.aspx
In short, you create a custom class deriving from SerializationBinder and override two methods:
BindToName(Type serializedType, out string assemblyName, out string typeName)
BindToType(string assemblyName, string typeName)
The logic you place in those methods will give you direct control over how type names are converted to string representation in the $type field, and how types are located at run-time given values from $type.
In your case, wanting to omit the Assembly name, you can probably do:
public override void BindToName(
    Type serializedType, out string assemblyName, out string typeName)
{
    assemblyName = null;
    typeName = serializedType.FullName;
}

public override Type BindToType(string assemblyName, string typeName)
{
    return Type.GetType(typeName);
}
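For completeness, here is a minimal sketch of how this might be wired up. The class name ShortNameBinder is hypothetical, and note that in later Json.NET versions the Binder property was superseded by SerializationBinder/ISerializationBinder:

using System;
using Newtonsoft.Json;

// Hypothetical class wrapping the two overrides from the answer above.
public class ShortNameBinder : System.Runtime.Serialization.SerializationBinder
{
    public override void BindToName(
        Type serializedType, out string assemblyName, out string typeName)
    {
        assemblyName = null;                 // omit the assembly name from $type
        typeName = serializedType.FullName;
    }

    public override Type BindToType(string assemblyName, string typeName)
    {
        return Type.GetType(typeName);
    }
}

public static class BinderUsage
{
    public static string RoundTrip(object contract)
    {
        var settings = new JsonSerializerSettings
        {
            TypeNameHandling = TypeNameHandling.Objects, // emit $type
            Binder = new ShortNameBinder()
        };

        string json = JsonConvert.SerializeObject(contract, settings); // $type without assembly
        var back = JsonConvert.DeserializeObject(json, settings);      // resolved via BindToType
        return json;
    }
}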
I think you could maybe tag the class with JsonObjectAttribute:
[DataContract]
[JsonObject("")]
public class MyContractClass { ... }
This should override the fact that it is also a DataContract.

Dependency Injection and JavaScriptConverter.Deserialize

My application needs to combine extensive use of dependency injection with the use of JSON as a public API. This apparently leads to the need for a custom JavaScriptConverter.
Right now, my JavaScriptConverter's Deserialize method looks like this:
public override object Deserialize(IDictionary<string, object> dictionary, Type type, JavaScriptSerializer serializer)
{
    var result = IocHelper.GetForType(type);
    return result;
}
This hands back the appropriate class. Unfortunately, it fails to populate the class members with the applicable values. What I'm missing is a way to tell the Serializer, "Here's the type you asked for. Now fill it in."
The solution I used was to switch from JavaScriptSerializer to Newtonsoft's JSON converter.
I was able to get a working round trip by writing a single CustomCreationConverter:
public class JsonDomainConverter : CustomCreationConverter<object>
{
    public override bool CanConvert(Type objectType)
    {
        return objectType.IsInterface;
    }

    public override object Create(Type objectType)
    {
        return IocHelper.GetForType(objectType);
    }
}
No doubt the same approach is possible with JavaScriptSerializer; I just couldn't figure out how to make it work. With the Newtonsoft stuff, it took a couple of hours at most and just a couple of lines of code.
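For illustration, a minimal sketch of registering the converter (IMyService and the JSON payload are hypothetical; IocHelper is the container wrapper from the question):

using Newtonsoft.Json;

var settings = new JsonSerializerSettings();
settings.Converters.Add(new JsonDomainConverter());

// Json.NET asks JsonDomainConverter to create the instance (resolved from the
// IoC container, since IMyService is an interface), then populates its members
// from the JSON payload.
var service = JsonConvert.DeserializeObject<IMyService>(
    "{\"Endpoint\":\"https://example.org\"}", settings);

// Hypothetical contract interface for this sketch.
public interface IMyService { string Endpoint { get; set; } }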