How can I remove indentation from Newtonsoft JSON output in ASP.NET Core, and how do I add those settings?

I'm supposed to convert our JSON output into canonical JSON.
My 2 questions are:
How do I remove all indentation and newlines?
How do I add those settings to Startup.cs?
My colleague wrote the methods that create the JSON files with the JsonWriter and JsonReader classes from Newtonsoft.
I already overrode the DefaultContractResolver in a new class to sort the keys alphabetically, but failed to find a proper place in the startup to add those settings. I'm also missing the option to remove all indentation, newlines, etc.
Here is my CanonicalContractResolver:
public class CanonicalContractResolver : DefaultContractResolver
{
    public override JsonContract ResolveContract(Type type)
    {
        var contract = base.ResolveContract(type);
        // remove indentation here
        return contract;
    }

    protected override IList<JsonProperty> CreateProperties(Type type, MemberSerialization memberSerialization)
    {
        return base.CreateProperties(type, memberSerialization).OrderBy(p => p.PropertyName).ToList();
    }
}
The aforementioned JsonReader and JsonWriter classes (that need the canonical JSON output) are wired up like this in the ConfigureServices method in Startup.cs, and I don't really understand where I should add the changes I made in my CanonicalContractResolver class.
services.AddControllers()
    .AddNewtonsoftJson(options =>
    {
        options.SerializerSettings.Converters.Add(new SignaturesConverter());
        options.SerializerSettings.Converters.Add(new PolicyConverter());
    });
I'm a beginner in software engineering and this is my first post on Stack Overflow. I've already spent around 6-7 hours researching this topic, but the Newtonsoft documentation is sparse and hasn't helped me much.
Thank you all in advance for helping!

You can set the Formatting property on the Newtonsoft.Json SerializerSettings.
If you set it to Formatting.Indented:
services.AddControllers()
    .AddNewtonsoftJson(options =>
    {
        options.SerializerSettings.Formatting = Formatting.Indented;
    });
Then the output is pretty-printed with newlines and indentation.
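For example, with a hypothetical two-property model (not from the question), the response body would be:
{
  "id": 1,
  "name": "example"
}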
If you set it to Formatting.None:
services.AddControllers()
    .AddNewtonsoftJson(options =>
    {
        options.SerializerSettings.Formatting = Formatting.None;
    });
Then the output is collapsed onto a single line with all insignificant whitespace removed.
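For the same hypothetical model, the body would be:
{"id":1,"name":"example"}
To also plug in the CanonicalContractResolver from the question, it should be possible to assign it on the same SerializerSettings. A minimal sketch, untested, assuming the resolver class shown in the question:
services.AddControllers()
    .AddNewtonsoftJson(options =>
    {
        options.SerializerSettings.Formatting = Formatting.None;
        options.SerializerSettings.ContractResolver = new CanonicalContractResolver();
        options.SerializerSettings.Converters.Add(new SignaturesConverter());
        options.SerializerSettings.Converters.Add(new PolicyConverter());
    });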

Related

How to use ServiceStack to store POCOs to MariaDB having complex types (objects and structs) blobbed as JSON?

I've got the following setup: C#, ServiceStack, MariaDB, POCOs with objects and structs, JSON.
The main question is: how to use ServiceStack to store POCOs to MariaDB having complex types (objects and structs) blobbed as JSON, and still have working de/serialization of the same POCOs? All of these single tasks are supported, but I had problems when putting them all together, mainly because of structs.
... finally, while writing this, I found a solution, so it may look like I answered my own question, but I would still like to hear the answer from more skilled people, because the solution I found is a little complicated, I think. Details and two subquestions arise later in the context.
Sorry for the length and for possible misinformation caused by my limited knowledge.
A simple example. This is the final working one I ended up with. At the beginning there were no SomeStruct.ToString()/Parse() methods and no JsConfig settings.
using Newtonsoft.Json;
using ServiceStack;
using ServiceStack.DataAnnotations;
using ServiceStack.OrmLite;
using ServiceStack.Text;
using System.Diagnostics;

namespace Test
{
    public class MainObject
    {
        public int Id { get; set; }
        public string StringProp { get; set; }
        public SomeObject ObjectProp { get; set; }
        public SomeStruct StructProp { get; set; }
    }

    public class SomeObject
    {
        public string StringProp { get; set; }
    }

    public struct SomeStruct
    {
        public string StringProp { get; set; }

        public override string ToString()
        {
            // Unable to use .ToJson() here (ServiceStack does not serialize structs).
            // Unable to use ServiceStack's JSON.stringify here because it just takes ToString() => stack overflow.
            // => Therefore Newtonsoft.Json used.
            var serializedStruct = JsonConvert.SerializeObject(this);
            return serializedStruct;
        }

        public static SomeStruct Parse(string json)
        {
            // This method behaves differently for just deserialization or when part of Save().
            // Details in the text.
            // After playing with different options of altering the json input I ended with just taking what comes.
            // After all it is not necessary, but maybe useful in other situations.
            var structItem = JsonConvert.DeserializeObject<SomeStruct>(json);
            return structItem;
        }
    }

    internal class ServiceStackMariaDbStructTest
    {
        private readonly MainObject _mainObject = new MainObject
        {
            ObjectProp = new SomeObject { StringProp = "SomeObject's String" },
            StringProp = "MainObject's String",
            StructProp = new SomeStruct { StringProp = "SomeStruct's String" }
        };

        public ServiceStackMariaDbStructTest()
        {
            // This one line is needed to store complex types as blobbed JSON in MariaDB.
            MySqlDialect.Provider.StringSerializer = new JsonStringSerializer();

            JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
            JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
        }

        public void Test_Serialization()
        {
            try
            {
                var json = _mainObject.ToJson();
                if (!string.IsNullOrEmpty(json))
                {
                    var objBack = json.FromJson<MainObject>();
                }
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }

        public void Test_Save()
        {
            var cs = "ConnectionStringToMariaDB";
            var dbf = new OrmLiteConnectionFactory(cs, MySqlDialect.Provider);
            using var db = dbf.OpenDbConnection();
            db.DropAndCreateTable<MainObject>();
            try
            {
                db.Save(_mainObject);
                var dbObject = db.SingleById<MainObject>(_mainObject.Id);
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }
    }
}
What (I think) I know / have tried, but which at first didn't help me solve it myself:
ServiceStack stores complex types in the DB as blobbed JSV by default (last paragraph of the first section: https://github.com/ServiceStack/ServiceStack.OrmLite), so it is necessary to set it the way proposed in https://github.com/ServiceStack/ServiceStack.OrmLite#pluggable-complex-type-serializers: MySqlDialect.Provider.StringSerializer = new JsonStringSerializer(); => the default JSV is changed to JSON.
ServiceStack's serialization does not work with structs; it is necessary to treat them in a special way:
a) according to https://github.com/ServiceStack/ServiceStack.Text#c-structs-and-value-types and the example https://github.com/ServiceStack/ServiceStack.Text/#using-structs-to-customize-json, it is necessary to implement TStruct.ToString() and static TStruct.ParseJson()/ParseJsv() methods.
b) according to https://github.com/ServiceStack/ServiceStack.Text/#typeserializer-details-jsv-format and the unit tests in https://github.com/ServiceStack/ServiceStack.Text/blob/master/tests/ServiceStack.Text.Tests/CustomStructTests.cs, it shall be TStruct.ToString() (the same as in a) and static TStruct.Parse().
Subquestion #1: which one is right? For me, ParseJson() was never called, but Parse() was. Is this a documentation issue, or is ParseJson() used in some other situation?
I implemented option b). Results:
IDbConnection.Save(_mainObject) saved the item to MariaDB. Success.
Through the saving process, ToString() and Parse() were called. In Parse(), the incoming JSON looked like this:
"{\"StringProp\":\"SomeStruct's String\"}". Fine.
Serialization worked. Success.
Deserialization failed. I don't know the reason, but the JSON coming into Parse() was "double-escaped":
"{\\\"StringProp\\\":\\\"SomeStruct's String\\\"}"
Subquestion #2: why the "double-escaping" in Parse() on deserialization?
I tried to solve the structs with JsConfig (and Newtonsoft.Json to get proper JSON):
JsConfig<SomeStruct>.SerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.DeSerializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first, without ToString() and Parse() defined on the TStruct. Results:
Save failed: the json input to JsonConvert.DeserializeObject(json), which is used during Save, was just the type name "WinAmbPrototype.SomeStruct".
De/serialization worked.
b) then I implemented ToString(), also using Newtonsoft.Json. During Save, ToString() was used instead of JsConfig.SerializeFn, even though JsConfig.SerializeFn was still set (maybe by design, I do not judge). Results:
Save failed: but the json input of DeserializeFn called during Save changed; now it was JSV-like "{StringProp:SomeStruct's String}", but still not deserializable as JSON.
De/serialization worked.
Then (while writing this, still without any solution) I found the JsConfig.Raw* "overrides" and tried them:
JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first, without ToString() and Parse() defined on the TStruct. The results are the same as in 2a.
b) then I implemented ToString(). Results:
BOTH WORKED. No Parse() method is needed for this task.
But it is a very fragile setup:
if I removed ToString(), it failed (now I understand why: the default ToString() produced just the type name, as in 2a and 3a).
if I removed the RawSerializeFn setting, it failed in RawDeserializeFn ("double-escaped" JSON).
Is there some simpler solution? I would be very glad if someone pointed me in a better direction.
Two solutions would be acceptable (both of them, because of different circumstances):
if I am the TStruct owner: just pure TStruct.ToString() and static TStruct.Parse() to support out-of-the-box de/serialization and DB storage by ServiceStack (without different input in Parse()).
if I am a consumer of a TStruct with no JSON support implemented and no access to its code: so far I have not found a way; if ToString() is not implemented, Save to DB does not work. It would be fine if the JsConfig serialize functions were enough both for de/serialization and when saving to the DB.
And the best one would be without employing another dependency (e.g. Newtonsoft.Json) to serialize structs. Maybe some JsConfig.ShallProcessStructs = true; (WARNING: just a tip, not working as of 2021-04-02) would be fine for such situations.
ServiceStack treats structs like a single scalar value type, just like most of the core BCL value types (e.g. TimeSpan, DateTime, etc.). Overriding ToString() and providing a static Parse() method (or a struct constructor) lets you control the serialization/deserialization of custom structs.
The docs have been corrected: structs use Parse(), whilst classes use ParseJson()/ParseJsv().
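For illustration, a minimal struct following that convention might look like this (a hypothetical Point type, not from the question, sketching the documented ToString()/Parse() contract):
public struct Point
{
    public int X { get; set; }
    public int Y { get; set; }

    // ServiceStack serializes a struct using its ToString() output...
    public override string ToString() => $"{X},{Y}";

    // ...and deserializes it by calling the static Parse(string) method.
    public static Point Parse(string value)
    {
        var parts = value.Split(',');
        return new Point { X = int.Parse(parts[0]), Y = int.Parse(parts[1]) };
    }
}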
If you want to serialize a model's properties, I'd suggest you use a class instead, as the behavior you're looking for is that of a POCO DTO.
If you want to have structs serialized as DTOs in your RDBMS, an alternative you can try is to just use JSON.NET for the complex-type serialization, e.g.:
public class JsonNetStringSerializer : IStringSerializer
{
    public To DeserializeFromString<To>(string serializedText) =>
        JsonConvert.DeserializeObject<To>(serializedText);

    public object DeserializeFromString(string serializedText, Type type) =>
        JsonConvert.DeserializeObject(serializedText, type);

    public string SerializeToString<TFrom>(TFrom from) =>
        JsonConvert.SerializeObject(from);
}
MySqlDialect.Provider.StringSerializer = new JsonNetStringSerializer();
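With this serializer plugged in, OrmLite should blob all complex types (structs included) with Json.NET on both save and load, which presumably sidesteps the ToString()/Parse() struct handling entirely.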

TypeScript: how to JSON stringify a class definition?

Say we have:
class MyClass {
    myProperty: string
}
Is there any built-in function or easy way to get JSON like this?
{
  "myProperty": "string"
}
EDIT: My end goal is to dynamically print typed class definitions to a web view, in some kind of structured object syntax like JSON. I'm trying to make a server API that will return the schema for various custom classes - for example, http://myserver.com/MyClass should return MyClass's properties and their types as a JSON string or other structured representation.
Evert is correct; however, a workaround can look like this:
class MyClass {
    myProperty: string = 'string'
}
JSON.stringify(new MyClass) // shows what you want
In other words, setting a default property value lets TS compile the properties to JS.
If the above solution is not acceptable, then I would suggest parsing the TS files containing your classes with https://dsherret.github.io/ts-simple-ast/.
TypeScript class properties exist at build time only. They are removed from your source after compiling to .js. As such, there is no run-time way to get to the class properties.
Your code snippet compiles to:
var MyClass = /** @class */ (function () {
    function MyClass() {
    }
    return MyClass;
}());
As you can see, the property disappeared.
Based on your update: I had this exact problem, and this is how I solved it.
My JSON-based API uses json-schema across the board for type validation, and also exposes these schemas for clients to re-use.
I used an npm package to automatically convert json-schema to TypeScript.
This works brilliantly.
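For illustration, a json-schema document describing MyClass from the question might look like this (hand-written here as an assumption, not the output of any particular tool):
{
  "type": "object",
  "properties": {
    "myProperty": { "type": "string" }
  },
  "required": ["myProperty"]
}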

Typescript class with default values, how to parse JSON to this

I have a class of type A.
This class has several properties, let's call them prop1, prop2 and prop3.
When I'm calling an API that returns a JSON string representing the object, some properties might be omitted if they are null. Further down the road, however, this object is used to construct a form dynamically (using Formik, but that's unrelated).
This framework expects all properties to be there, and some will be visible dynamically depending on other properties.
So my question: how can I parse a JSON response into my custom class, keeping the default values in case properties are omitted in the API response?
What I've tried was:
static getCustomer(id) {
    return fetch(process.env.MD_API_URL + 'customers/' + id, { mode: 'cors' })
        .then(response => {
            let cust = new Customer();
            return response.json().then(x => cust = JSON.parse(x));
        }).catch(error => {
            return error;
        });
}
But this returns undefined. Must be doing something wrong...
Since TypeScript is not actually compiled but translated into JavaScript, all the JavaScript rules apply.
Therefore, deserializing JSON won't actually create a new instance of the class in question; it gives you a plain object that you can treat as a Customer at design time.
You could, however, create an object and then assign the JSON values, like this:
export class Customer {
    public id: number;
    public name: string;

    // your stuff here
    public myDefaultProp: string = "default value";

    public constructor(init?: Partial<Customer>) {
        Object.assign(this, init);
    }
}
Your return would then look like this:
return response.json().then(x => new Customer(JSON.parse(x)));
I added an example: https://stackblitz.com/edit/typescript-16wlmg
This is essentially just a matter of determining how to create an instance of a class and map the properties of a JSON response onto your custom class, and there could be many different ways to solve this.
But I think a factory function is an appropriate approach for this kind of task.

How to export data from LINQPad as JSON?

I want to create a JSON file for use as part of a simple web prototyping exercise. LINQPad is perfect for accessing the data from my DB in just the shape I need, but I cannot get it out as JSON very easily.
I don't really care what the schema is, because I can adapt my JavaScript to work with whatever is returned.
Is this possible?
A more fluent solution is to add the following methods to the "My Extensions" file in LINQPad:
public static string DumpJson<T>(this T obj)
{
    return obj
        .ToJson()
        .Dump();
}

public static string ToJson<T>(this T obj)
{
    return new System.Web.Script.Serialization.JavaScriptSerializer()
        .Serialize(obj);
}
Then you can use them like this in any query you like:
Enumerable.Range(1, 10)
    .Select(i =>
        new
        {
            Index = i,
            IndexTimesTen = i * 10,
        })
    .DumpJson();
I added "ToJson" separately so it can be used in with "Expessions".
This is not directly supported, and I have opened a feature request here. Vote for it if you would also find this useful.
A workaround for now is to do the following:
Set the language to C# Statement(s)
Add an assembly reference (press F4) to System.Web.Extensions.dll
In the same dialog, add a namespace import to System.Web.Script.Serialization
Use code like the following to dump out your query as JSON
new JavaScriptSerializer().Serialize(query).Dump();
There's a solution with Json.NET, since it does indented formatting and renders JSON dates properly. Add Json.NET from NuGet, add a reference to Newtonsoft.Json.dll in your "My Extensions" query, and include the following code:
public static object DumpJson(this object value, string description = null)
{
    return GetJson(value).Dump(description);
}

private static object GetJson(object value)
{
    object dump = value;

    var strValue = value as string;
    if (strValue != null)
    {
        var obj = JsonConvert.DeserializeObject(strValue);
        dump = JsonConvert.SerializeObject(obj, Newtonsoft.Json.Formatting.Indented);
    }
    else
    {
        dump = JsonConvert.SerializeObject(value, Newtonsoft.Json.Formatting.Indented);
    }

    return dump;
}
Use .DumpJson() like .Dump() to render the result. It's possible to add more .DumpJson() overloads with different signatures if necessary.
As of version 4.47, LINQPad has the ability to export JSON built in. Combined with the new lprun.exe utility, it can also satisfy your needs.
http://www.linqpad.net/lprun.aspx
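For example, an invocation along the lines of lprun -format=json MyQuery.linq should run a saved query and emit its result as JSON; the exact switch names are documented on the page above, so treat this command as an assumption.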

Fluent NHibernate DuplicateMappingException with AutoMapping

Summary:
I want to save two classes with the same name but different namespaces using the Fluent NHibernate automapper.
Context
I'm having to import a lot of different objects into a database for testing. I'll eventually write mappers to a proper model.
I've been using code gen and Fluent NHibernate to take these DTOs and dump them straight to the DB.
The exception does say to try using auto-import="false".
Code
public class ClassConvention : IClassConvention
{
    public void Apply(IClassInstance instance)
    {
        instance.Table(instance.EntityType.Namespace.Replace(".", "_"));
    }
}
namespace Sample.Models.Test1
{
    public class Test
    {
        public virtual int Id { get; set; }
        public virtual string Something { get; set; }
    }
}

namespace Sample.Models.Test2
{
    public class Test
    {
        public virtual int Id { get; set; }
        public virtual string SomethingElse { get; set; }
    }
}
And here's the actual app code
var model = AutoMap.AssemblyOf<Service1>()
    .Where(t => t.Namespace.StartsWith("Sample.Models"))
    .Conventions.AddFromAssemblyOf<Service1>();

var cfg = Fluently.Configure()
    .Database(
        MySQLConfiguration.Standard.ConnectionString(
            c => c.Is("database=test;server=localhost;user id=root;Password=;")))
    .Mappings(m => m.AutoMappings.Add(model))
    .BuildConfiguration();

new SchemaExport(cfg).Execute(false, true, false);
Thanks, I really appreciate any help.
Update: using Fluent NHibernate RC1.
Solution from the fluent-nhibernate forums, by James Gregory:
Got around to having a proper look at this tonight. Basically, it is down to the AutoImport stuff the exception mentioned; when NHibernate is given the first mapping, it sees that the entity is named with the full assembly qualified name and creates an import for the short name (being helpful!), and when you add the second one it then complains that this import is now going to conflict. So the solution is to turn off the auto importing; unfortunately, we don't have a way to do that in the RC... I've just committed a fix that adds in the ability to change this in a convention. So if you get the latest binaries or source, you should be able to change the Conventions line in your attached project to do this:

.Conventions.Setup(x =>
{
    x.AddFromAssemblyOf<Program>();
    x.Add(AutoImport.Never());
});

Which adds all the conventions you've defined in your assembly, then uses one of the helper conventions to turn off auto importing.
I was not able to get this to work using Conventions for FluentMappings (in contrast to AutoMappings). However, the following works for me, though it must be added to each ClassMap where needed.
public class AMap : ClassMap<A>
{
    public AMap()
    {
        HibernateMapping.Not.AutoImport();
        Map(x => x.Item, "item");
        ...
    }
}
I am having a real problem with this, and the example above (and its variants) do not help.
var cfg = new NotifyFluentNhibernateConfiguration();
return Fluently.Configure()
    .Database(
        FluentNHibernate.Cfg.Db.MsSqlConfiguration.MsSql2005
            .ConnectionString("Server=10.2.65.227\\SOSDBSERVER;Database=NotifyTest;User ID=NHibernateTester;Password=test;Trusted_Connection=False;"))
    .Mappings(m =>
    {
        m.AutoMappings
            .Add(AutoMap.AssemblyOf<SubscriptionManagerRP>(cfg));
        m.FluentMappings.Conventions.Setup(x =>
        {
            x.AddFromAssemblyOf<Program>();
            x.Add(AutoImport.Never());
        });
    })
    .BuildSessionFactory();
I can't find a reference for Program.
In desperation, I've also tried putting down a separate XML file to configure Fluent NHibernate's mapping with auto-import="false", with no success.
Can I please have a more extensive example of how to do this?
Edit: I got the latest trunk just weeks ago.
Edit: solved this by removing all duplicates.
I have had the same problem. I solved it like this:
Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008
        .ConnectionString(...)
        .AdoNetBatchSize(500))
    .Mappings(m => m.FluentMappings
        .Conventions.Setup(x => x.Add(AutoImport.Never()))
        .AddFromAssembly(...)
        .AddFromAssembly(...)
        .AddFromAssembly(...)
        .AddFromAssembly(...));
The important part is: .Conventions.Setup(x => x.Add(AutoImport.Never())). Everything seems to be working fine with this configuration.
Use the BeforeBindMapping event to gain access to the object representation of the .HBM XML files.
This event allows you to modify any properties at runtime before the NHibernate Session Factory is created. This also makes the FluentNHibernate-equivalent convention unnecessary. Unfortunately there is currently no official documentation around this really great feature.
Here's a global solution to duplicate mapping problems (just remember that all HQL queries will now need to use fully qualified type names instead of just the class names):
var configuration = new NHibernate.Cfg.Configuration();
configuration.BeforeBindMapping += (sender, args) => args.Mapping.autoimport = false;
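For instance (a hypothetical query, assuming an open ISession named session), a lookup of the Test entities from the question's first code sample would then need the namespace-qualified name:
// With auto-import disabled, the short entity name "Test" is no longer registered:
var tests = session.CreateQuery("from Sample.Models.Test1.Test").List<Sample.Models.Test1.Test>();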
I had to play around with where to add the AutoImport.Never() convention. I have my persistence mappings separated into different projects, and the models for each application can also be found in different projects, using Fluent NHibernate with auto mapping.
There are occasions when the domains (well, mappings really) have to be combined, namely when I need access to all domains. The POCO classes used will sometimes have the same name in different namespaces, just as in the examples above.
Here is what my combined mapping looks like:
internal static class NHIbernateUtility
{
    public static ISessionFactory CreateSessionFactory(string connectionString)
    {
        return Fluently.Configure()
            .Database(
                MsSqlConfiguration
                    .MsSql2008
                    .ConnectionString(connectionString))
            .Mappings(m => m.AutoMappings
                .Add(ProjectA.NHibernate.PersistenceMapper.CreatePersistenceModel()))
            .Mappings(m => m.AutoMappings
                .Add(ProjectB.NHibernate.PersistenceMapper.CreatePersistenceModel()))
            .Mappings(m => m.AutoMappings
                .Add(ProjectC.NHibernate.PersistenceMapper.CreatePersistenceModel()))
            .BuildSessionFactory();
    }
}
And one of the persistence mappers:
public static class PersistenceMapper
{
    public static AutoPersistenceModel CreatePersistenceModel()
    {
        return AutoMap.AssemblyOf<Credential>(new AutoMapConfiguration())
            .IgnoreBase<BaseEntity>()
            .Conventions.Add(AutoImport.Never())
            .Conventions.Add<TableNameConvention>()
            .Conventions.Add<StandardForeignKeyConvention>()
            .Conventions.Add<CascadeAllConvention>()
            .Conventions.Add<StandardManyToManyTableNameConvention>()
            .Conventions.Add<PropertyConvention>();
    }
}
The persistence mappers are very similar for each POCO namespace; some have overrides. I had to add .Conventions.Add(AutoImport.Never()) to each persistence mapper, and it works like a charm.
Just wanted to share this if anyone else is doing it this way.