How to validate JSON using System.Text.Json before deserialization

In .NET 5.0, using System.Text.Json's JsonSerializer.Deserialize(someJsonFile), I get:
System.Text.Json.JsonException: 'The JSON value could not be converted to System.Guid. Path: $.someGuid | ..
which is expected, since the someGuid property is of type System.Guid and the value of someGuid in the JSON file/string is:
{
    "someGuid": ""
}
which cannot be deserialized properly (an empty string is not a valid Guid, not even Guid.Empty).
So, to my question: what's a good, generic way to validate the JSON before deserializing it, something like TryParse or JsonDocument.Parse? Sure, try-catch works, but that feels dirty (IMHO).
By the way: I don't want to use Newtonsoft.
Thanks for your suggestions (and criticism, of course).
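System.Text.Json has no built-in TryParse-style API, so one common pattern is to hide the exception behind a small generic helper. A minimal sketch (the helper name and shape are my own, not part of the library):
public static bool TryDeserialize<T>(string json, out T result, JsonSerializerOptions options = null)
{
    try
    {
        result = JsonSerializer.Deserialize<T>(json, options);
        return true;
    }
    catch (JsonException) // malformed JSON, or values that don't fit the target type (like "" for a Guid)
    {
        result = default;
        return false;
    }
}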

I created a custom converter based on the example in this answer: The JSON value could not be converted to System.Int32
public class StringToGuidConverter : JsonConverter<Guid>
{
    public override Guid Read(ref Utf8JsonReader reader, Type type, JsonSerializerOptions options)
    {
        if (reader.TokenType == JsonTokenType.String)
        {
            // Fast path: parse the raw UTF-8 bytes (Utf8Parser lives in System.Buffers.Text).
            ReadOnlySpan<byte> span = reader.ValueSpan;
            if (Utf8Parser.TryParse(span, out Guid guid, out int bytesConsumed) && span.Length == bytesConsumed)
            {
                return guid;
            }
            // Fallback: parse the transcoded string.
            if (Guid.TryParse(reader.GetString(), out guid))
            {
                return guid;
            }
        }
        // Anything else (including "") becomes Guid.Empty instead of throwing.
        return Guid.Empty;
    }

    public override void Write(Utf8JsonWriter writer, Guid value, JsonSerializerOptions options)
    {
        writer.WriteStringValue(value.ToString());
    }
}
In my case the model I deserialize to can't take a nullable Guid, so I return an empty Guid and then validate it in my logic.
Because I'm creating a Web API targeting .NET Standard, I can't register this in the services in the Startup class. But you can register the custom converter through JsonSerializerOptions when calling the Deserialize method, like this:
var options = new JsonSerializerOptions
{
    PropertyNameCaseInsensitive = true,
    Converters = { new NERDS.API.Helpers.StringToGuidConverter() }
};
StreamReader reader = new StreamReader(HttpContext.Current.Request.InputStream);
string json = reader.ReadToEnd();
return JsonSerializer.Deserialize<T>(json, options);
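Alternatively, if you control the model type, the converter can be attached directly to the property with the JsonConverter attribute, so no JsonSerializerOptions wiring is needed at each call site (SomeModel and SomeGuid are placeholder names):
public class SomeModel
{
    // The attribute makes the serializer use StringToGuidConverter for this property.
    [JsonConverter(typeof(StringToGuidConverter))]
    public Guid SomeGuid { get; set; }
}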

Related

JsonConstructorAttribute in System.Text.Json (but not Newtonsoft.Json) results in exception when property and constructor argument types differ

Given a Base64 string, the following sample class will deserialize properly using Newtonsoft.Json, but not with System.Text.Json:
using System;
using System.Text.Json.Serialization;

public class AvatarImage
{
    public Byte[] Data { get; set; } = null;

    public AvatarImage()
    {
    }

    [JsonConstructor]
    public AvatarImage(String Data)
    {
        // Remove the Base64 header info, leaving only the data block, and convert it to a Byte array.
        this.Data = Convert.FromBase64String(Data.Remove(0, Data.IndexOf(',') + 1));
    }
}
With System.Text.Json, the following exception is thrown:
must bind to an object property or field on deserialization. Each parameter name must match with a property or field on the object. The match can be case-insensitive.
Apparently System.Text.Json doesn't like the fact that the property is a Byte[] while the constructor parameter is a String, which shouldn't really matter, because the whole point is that the constructor takes care of the assignment.
Is there any way to get this working with System.Text.Json?
In my particular case Base64 images are being sent to a WebAPI controller, but the final object only needs the Byte[]. In Newtonsoft this was a quick and clean solution.
This is apparently a known restriction of System.Text.Json. See the issues:
[JsonSerializer] Relax restrictions on ctor param type to immutable property type matching where reasonable #44428, which is currently labeled with the 6.0.0 Future milestone.
System.Text.Json incorrectly requires constructor parameter types to match immutable property types. #47422
JsonConstructor different behavior between Newtonsoft.Json and System.Text.Json #46480.
Thus (in .NET 5 at least) you will need to refactor your class to avoid the limitation.
One solution would be to add a surrogate Base64 encoded string property:
public class AvatarImage
{
    [JsonIgnore]
    public Byte[] Data { get; set; } = null;

    [JsonInclude]
    [JsonPropertyName("Data")]
    public string Base64Data
    {
        private get => Data == null ? null : Convert.ToBase64String(Data);
        set
        {
            var index = value.IndexOf(',');
            this.Data = Convert.FromBase64String(index < 0 ? value : value.Remove(0, index + 1));
        }
    }
}
Note that, ordinarily, JsonSerializer will only serialize public properties. However, if you mark a property with [JsonInclude], then either the setter or the getter, but not both, can be nonpublic. (I have no idea why Microsoft doesn't allow both to be private; the data contract serializers certainly support private members marked with [DataMember].) In this case I chose to make the getter private to reduce the chance that the surrogate property is serialized by some other serializer or displayed via some property browser.
Demo fiddle #1 here.
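For illustration, a quick deserialization check of the surrogate-property version (the Base64 payload here is just a placeholder encoding the bytes 0-3):
var json = "{\"Data\":\"data:image/png;base64,AAECAw==\"}";
var avatar = JsonSerializer.Deserialize<AvatarImage>(json);
Console.WriteLine(string.Join(",", avatar.Data)); // 0,1,2,3 -- everything up to the comma was stripped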
Alternatively, you could introduce a custom JsonConverter<T> for AvatarImage
[JsonConverter(typeof(AvatarConverter))]
public class AvatarImage
{
    public Byte[] Data { get; set; } = null;
}

class AvatarConverter : JsonConverter<AvatarImage>
{
    class AvatarDTO { public string Data { get; set; } }

    public override AvatarImage Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        var dto = JsonSerializer.Deserialize<AvatarDTO>(ref reader, options);
        var index = dto.Data?.IndexOf(',') ?? -1;
        return new AvatarImage { Data = dto.Data == null ? null : Convert.FromBase64String(index < 0 ? dto.Data : dto.Data.Remove(0, index + 1)) };
    }

    public override void Write(Utf8JsonWriter writer, AvatarImage value, JsonSerializerOptions options) =>
        JsonSerializer.Serialize(writer, new { Data = value.Data }, options);
}
This seems to be the easier solution for simple models, but it can become a nuisance for complex models or models to which properties are frequently added.
Demo fiddle #2 here.
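And a quick round trip through the converter-based version (same placeholder payload as above):
var avatar = JsonSerializer.Deserialize<AvatarImage>("{\"Data\":\"data:image/png;base64,AAECAw==\"}");
var roundTripped = JsonSerializer.Serialize(avatar);
// roundTripped is {"Data":"AAECAw=="} -- re-serialization does not re-add the header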
Finally, it seems a bit unfortunate that the incoming "Data" string carries an extra header during deserialization that will not be present when the object is serialized back out. Rather than fixing this during deserialization, consider modifying your architecture to avoid mangling the Data string in the first place.
Implementing the custom converter deserialization using an ExpandoObject can avoid the nested DTO class if desired:
using System.Dynamic;
// ...

public override FileEntity Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
{
    // Each member of the ExpandoObject is a boxed JsonElement.
    dynamic obj = JsonSerializer.Deserialize<ExpandoObject>(ref reader, options);
    return new FileEntity
    {
        Data = (obj.data?.GetString() == null) ? null : Convert.FromBase64String(obj.data.GetString().Remove(0, obj.data.GetString().IndexOf(',') + 1))
    };
}
It makes the custom converter a little more flexible during development, since the DTO doesn't continually need to grow with the class being deserialized into. It also makes handling potentially nullable properties a bit easier (compared with standard JsonElement deserialization, i.e. JsonSerializer.Deserialize<JsonElement>), like so:
public override FileEntity Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
{
    dynamic obj = JsonSerializer.Deserialize<ExpandoObject>(ref reader, options);
    return new FileEntity
    {
        SomeNullableInt32Property = obj.id?.GetInt32(),
        Data = (obj.data?.GetString() == null) ? null : Convert.FromBase64String(obj.data.GetString().Remove(0, obj.data.GetString().IndexOf(',') + 1))
    };
}

Serializing Manatee.Json in .NET Core 3

Background
I want to provide some JsonSchema from my .NET Core 3 application, as well as other objects serialized into JSON. Since Manatee.Json is frequently updated and provides good support for JsonSchema, it is my preferred choice. At the same time, .NET Core 3 provides excellent support for returning objects with magic that transforms them into JSON documents.
Example:
public ActionResult<MyFancyClass> MyFancyAction()
{
    return new MyFancyClass {
        Property1 = "property 1 content",
        Property2 = "property 2 content",
    };
}
Output:
{
    "Property1": "property 1 content",
    "Property2": "property 2 content"
}
Problem
.NET Core 3 has built-in support for JSON with System.Text.Json, which is used in the previous example. If I try to serialize Manatee.Json.Schema.JsonSchema, its internal structure is serialized, not the JSON schema itself.
Example:
public ActionResult<MyFancyClass2> MyFancyAction2()
{
    return new MyFancyClass2 {
        Property1 = "property 1 content",
        Property1Schema = new JsonSchema()
            .Type(JsonSchemaType.String)
    };
}
Output:
{
    "Property1": "property 1 content",
    "Property1Schema": [{
        "name": "type",
        "supportedVersions": 15,
        "validationSequence": 1,
        "vocabulary": {
            "id": "https://json-schema.org/draft/2019-09/vocab/validation",
            "metaSchemaId": "https://json-schema.org/draft/2019-09/meta/validation"
        }
    }]
}
I expect this:
{
    "Property1": "property 1 content",
    "Property1Schema": {
        "type": "string"
    }
}
Manatee.Json.JsonValue also has a conflicting inner structure, where System.Text.Json.JsonSerializer fails to access internal get methods correctly, and I get, for instance, this exception message:
Cannot access value of type Object as type Boolean.
Discovery
Manatee.Json.Schema.JsonSchema has a .ToJson() method that can be used to get the correct JSON schema as a JsonValue, but then I run into the problem just mentioned with serializing Manatee.Json.JsonValue.
Question
Does anyone know a way to enable System.Text.Json to serialize Manatee.Json structures?
Side note
Another way forward is to replace System.Text.Json altogether (take a look at this question).
.NET Core 3 JSON serialization comes with a lot of configuration options. One of them is adding converters that specify how different types should be serialized, so one way forward is to create a JsonConverter for JsonSchema and another for JsonValue.
For JsonSchema we can implement a JsonSchemaConverter that, when serializing/writing, extracts the JSON schema as a JsonValue and asks the serializer to serialize that JsonValue instead, like this:
public class JsonSchemaConverter : JsonConverter<JsonSchema>
{
    private ManateeSerializer _manateeSerializer { get; set; }

    public JsonSchemaConverter()
    {
        _manateeSerializer = new ManateeSerializer();
    }

    public override JsonSchema Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        var jsonText = reader.GetString();
        var jsonValue = JsonValue.Parse(jsonText);
        return _manateeSerializer.Deserialize<JsonSchema>(jsonValue);
    }

    public override void Write(Utf8JsonWriter writer, JsonSchema value, JsonSerializerOptions options)
    {
        var schemaAsJson = value.ToJson(_manateeSerializer);
        try
        {
            System.Text.Json.JsonSerializer.Serialize<JsonValue>(writer, schemaAsJson, options);
        }
        catch (Exception e)
        {
            Log.Information($"Failed to serialize JsonSchema ({e.Message});");
            writer.WriteNullValue();
        }
    }
}
For JsonValue we can change it into something System.Text.Json understands, since it is JSON after all. One (admittedly roundabout) approach is to serialize the JsonValue to a string, parse it with, for instance, JsonDocument.Parse(string), and serialize its RootElement property. It feels unnecessary to go via JsonDocument, so if anyone finds a better solution, that would be great!
A possible implementation can look like this:
public class JsonValueConverter : JsonConverter<JsonValue>
{
    public override JsonValue Read(ref Utf8JsonReader reader, Type typeToConvert, JsonSerializerOptions options)
    {
        var json = reader.GetString();
        return JsonValue.Parse(json);
    }

    public override void Write(Utf8JsonWriter writer, JsonValue value, JsonSerializerOptions options)
    {
        string content = value.ToString();
        try
        {
            var jsonDocument = JsonDocument.Parse(content);
            JsonSerializer.Serialize<JsonElement>(writer, jsonDocument.RootElement, options);
        }
        catch (Exception e)
        {
            Log.Warning($"JsonDocument.Parse(JsonValue) failed in JsonValueConverter.Write(,,).\n{e.Message}");
            writer.WriteNullValue();
        }
    }
}
They must be registered in Startup.cs like this:
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers()
        .AddJsonOptions(options =>
        {
            options.JsonSerializerOptions.Converters.Add(new JsonValueConverter());
            options.JsonSerializerOptions.Converters.Add(new JsonSchemaConverter());
        });
}

Web API with Entity Framework - custom JSON converter to serialize DbGeography not working

I am using Web API with DbGeography spatial data and want to serialize it to JSON.
By default, DbGeography serializes to null, so I implemented my own converter for it.
Here is what I have so far, but it doesn't seem to work.
Basically, with the following code, my DbGeographyConverter.WriteJson method is never hit under the debugger and the Location property is serialized as null.
Custom converter:
public class DbGeographyConverter : JsonConverter
{
    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        DbGeography contextObj = value as DbGeography;
        writer.WriteStartObject();
        writer.WritePropertyName("Lat");
        serializer.Serialize(writer, contextObj.Latitude);
        writer.WritePropertyName("Long");
        serializer.Serialize(writer, contextObj.Longitude);
        writer.WriteEndObject();
    }

    public override bool CanConvert(Type objectType)
    {
        if (objectType == typeof(DbGeography))
        {
            return true;
        }
        return false;
    }

    public override bool CanRead
    {
        get { return true; }
    }

    public override bool CanWrite
    {
        get { return true; }
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        throw new NotImplementedException();
    }
}
Add the converter in Global.asax.cs:
protected void Application_Start()
{
    GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.Converters.Add(
        new DbGeographyConverter());
}
Finally, apply the converter to the data model class property:
public class DataModelClass1
{
    [JsonConverter(typeof(DbGeographyConverter))]
    public DbGeography Location { get; set; }
}
First, since you're adding your custom converter to the SerializerSettings.Converters collection, you don't need to decorate DataModelClass1's Location property with the JsonConverterAttribute: the JsonFormatter will run through that collection until it finds the derived JsonConverter you added, no attribute required.
Now back to your question: which browser are you testing this in, and how? If I were to speculate, I'd say you're using either Chrome or Firefox with GET requests, both of which prioritize application/xml over application/json in the Accept header they send to the server. For that reason Web API will see that the browser prefers XML over JSON, and the JsonFormatter will never be touched, let alone your custom JsonConverter.
There are a few workarounds to this. On the browser side, the easiest way is to make Ajax GET requests with jQuery and specify that you want JSON back. On the server side, you can remove application/xml from the SupportedMediaTypes, as sketched below.
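For example, the server-side workaround could look something like this in WebApiConfig.Register (a sketch assuming the standard Web API project template; FirstOrDefault needs using System.Linq):
public static void Register(HttpConfiguration config)
{
    // Stop the XML formatter from claiming application/xml so browsers that
    // prefer XML in their Accept header still get JSON back.
    var xmlFormatter = config.Formatters.XmlFormatter;
    var appXml = xmlFormatter.SupportedMediaTypes.FirstOrDefault(t => t.MediaType == "application/xml");
    if (appXml != null)
    {
        xmlFormatter.SupportedMediaTypes.Remove(appXml);
    }

    // ...the rest of your existing registration...
}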
I spent quite a while on this. You'd only use the WriteJson method if you wanted to change the default output format of the JSON for DbGeography from this:
"geography": {
    "coordinateSystemId": 4326,
    "wellKnownText": "POINT (77.6599502563474 12.9602302518557)"
}
to something else, like "77.22, 12.8" - just a single string (a sketch of that follows below).
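A minimal WriteJson for that single-string format might look like this (a sketch, assuming Latitude and Longitude are populated on the DbGeography instance):
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
    var geography = (DbGeography)value;
    // Emit "lat, long" as a single string instead of the default object shape.
    writer.WriteValue(string.Format("{0}, {1}", geography.Latitude, geography.Longitude));
}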
If you're looking to convert a string like that to DbGeography when reading JSON from the request, the code below is what you're after:
public class DbGeographyConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
    {
        return objectType.IsAssignableFrom(typeof(string));
    }

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
    {
        if (reader.Value == null)
        {
            return null;
        }
        return Parser.ToDbGeography(reader.Value.ToString());
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
    {
        // Base serialization is fine
        serializer.Serialize(writer, value);
    }
}
This is the code for the converter if you're passing in a string value such as "12,99", which will be your lat and lng.
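The Parser.ToDbGeography helper isn't shown in that answer; a minimal sketch, assuming a "lat,long" input and SRID 4326 (WGS 84), could look like this (Parser is a hypothetical class name; DbGeography.FromText comes from System.Data.Entity.Spatial, CultureInfo from System.Globalization):
public static class Parser
{
    public static DbGeography ToDbGeography(string latLong)
    {
        var parts = latLong.Split(',');
        var lat = double.Parse(parts[0], CultureInfo.InvariantCulture);
        var lng = double.Parse(parts[1], CultureInfo.InvariantCulture);
        // Well-known text expects "POINT (longitude latitude)" order.
        var wkt = string.Format(CultureInfo.InvariantCulture, "POINT ({0} {1})", lng, lat);
        return DbGeography.FromText(wkt, 4326);
    }
}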
This sample application has the answer you're looking for.
https://code.msdn.microsoft.com/windowsazure/HTML-ASPNET-Web-API-Bing-58c97f9f
The Converter code from this can be found here
https://github.com/Azure-Samples/SQLDatabase-Spatial-WebAPI-BingMaps/blob/master/SpatialTypesWithWebAPI/Models/DbGeographyConverter.cs

Emit odata.type field with DataContractJsonSerializer?

Is there a way to make DataContractJsonSerializer emit the "odata.type" field required when posting an OData entity into a collection that supports multiple entity types (hierarchy per table)?
If I construct DataContractJsonSerializer with a settings object with EmitTypeInformation set to Always, it emits a "__type" field in the output, but that's not the field name needed for OData and the format of the value is wrong as well.
Is there any way to hook into the DataContractJsonSerializer pipeline to inject the desired "odata.type" field into the serialization output?
It would be such a hack to have to parse the serialization output in order to inject the field. How does WCF Data Services do it? Not using DataContractJsonSerializer is my guess.
Have you considered using Json.NET? Json.NET is much more extensible, and your scenario can be handled with a custom contract resolver. Sample code:
class Program
{
    static void Main(string[] args)
    {
        Console.WriteLine(
            JsonConvert.SerializeObject(new Customer { Name = "Raghu" }, new JsonSerializerSettings
            {
                ContractResolver = new CustomContractResolver()
            }));
    }
}

public class CustomContractResolver : DefaultContractResolver
{
    protected override JsonObjectContract CreateObjectContract(Type objectType)
    {
        JsonObjectContract objectContract = base.CreateObjectContract(objectType);
        objectContract.Properties.Add(new JsonProperty
        {
            PropertyName = "odata.type",
            PropertyType = typeof(string),
            ValueProvider = new StaticValueProvider(objectType.FullName),
            Readable = true
        });
        return objectContract;
    }

    private class StaticValueProvider : IValueProvider
    {
        private readonly object _value;

        public StaticValueProvider(object value)
        {
            _value = value;
        }

        public object GetValue(object target)
        {
            return _value;
        }

        public void SetValue(object target, object value)
        {
            throw new NotSupportedException();
        }
    }
}

public class Customer
{
    public string Name { get; set; }
}
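For reference, assuming Customer is declared in, say, a ConsoleApp namespace, the program above prints something like:
{"Name":"Raghu","odata.type":"ConsoleApp.Customer"}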
I can't answer your first two questions, but for the third question, I found on the OData Team blog a link to the OData WCF Data Services V4 library open source code. Downloading that code, you will see that they perform all serialization and deserialization manually. They have 68 files in their two Json folders! And looking through the code they have comments such as:
// This is a work around, needTypeOnWire always = true for client side:
// ClientEdmModel's reflection can't know a property is open type even if it is, so here
// make client side always write 'odata.type' for enum.
So that to me kind of implies there is no easy, clean, simple, elegant way to do it.
I tried using a JavaScriptConverter, a dynamic type, and other things, but most of them ended up resorting to reflection, which just made for a much more complicated solution than a simple string-manipulation approach.

ASP.Net MVC3 - why does the default support for JSON model binding fail to decode to enum types?

I am having an issue with ASP.Net MVC3 (RC2). I'm finding that the new JSON model binding functionality, which is implicit in MVC3, does not want to deserialize to a property that has an enum type.
Here's a sample class and enum type:
public enum MyEnum { Nothing = 0, SomeValue = 5 }

public class MyClass
{
    public MyEnum Value { get; set; }
    public string OtherValue { get; set; }
}
Consider the following code, which successfully passes the unit test:
[TestMethod]
public void Test()
{
    var jss = new JavaScriptSerializer();
    var obj1 = new MyClass { Value = MyEnum.SomeValue };
    var json = jss.Serialize(obj1);
    var obj2 = jss.Deserialize<MyClass>(json);
    Assert.AreEqual(obj1.Value, obj2.Value);
}
If I serialize obj1 above, but then post that data to an MVC3 controller (example below) with a single parameter of type MyClass, any other properties of the object deserialize properly, but any property that is an enum type deserializes to the default (zero) value.
[HttpPost]
public ActionResult TestAction(MyClass data)
{
    return Content(data.Value.ToString()); // displays "Nothing"
}
I've downloaded the MVC source code from CodePlex, but I'm stumped as to where the deserialization actually occurs, which means I can't work out what the folks at Microsoft used to perform it, and thus whether I'm doing something wrong or whether there is a workaround.
Any suggestions would be appreciated.
I've found the answer. I hope this is fixed in MVC3 RTM, but essentially what happens is the object deserializes correctly internally via JsonValueProviderFactory, which uses JavaScriptSerializer to do the work. It uses DeserializeObject() so that it can pass the values back to the default model binder. The problem is that the default model binder won't convert/assign an int value when the property type is an enum.
There is a discussion of this at the ASP.Net forums here:
http://forums.asp.net/p/1622895/4180989.aspx
The solution discussed there is to override the default model binder like so:
public class EnumConverterModelBinder : DefaultModelBinder
{
    protected override object GetPropertyValue(ControllerContext controllerContext, ModelBindingContext bindingContext, PropertyDescriptor propertyDescriptor, IModelBinder propertyBinder)
    {
        var propertyType = propertyDescriptor.PropertyType;
        if (propertyType.IsEnum)
        {
            var providerValue = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
            if (null != providerValue)
            {
                var value = providerValue.RawValue;
                if (null != value)
                {
                    var valueType = value.GetType();
                    if (!valueType.IsEnum)
                    {
                        // The JSON value provider hands back the raw numeric value; convert it to the enum type.
                        return Enum.ToObject(propertyType, value);
                    }
                }
            }
        }
        return base.GetPropertyValue(controllerContext, bindingContext, propertyDescriptor, propertyBinder);
    }
}
Then in Application_Start, add the following line:
ModelBinders.Binders.DefaultBinder = new EnumConverterModelBinder();
How are you calling this action? Have you tried:
$.post(
    '/TestAction',
    JSON.stringify({ OtherValue: 'foo', Value: 5 }),
    function (result) {
        alert('ok');
    }
);