System.Text.Json problem with large list serialization

I wanted to remove Newtonsoft from my project and start using the default System.Text.Json. I have a very large list of objects that I want to serialize; this works with Newtonsoft but not with System.Text.Json.
The problem is that the JSON is cut off after roughly 58,000 characters, so the JSON is no longer valid.
This works:
return Ok(JsonConvert.SerializeObject(result, new JsonSerializerSettings
{
    ContractResolver = new CamelCasePropertyNamesContractResolver()
}));
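For comparison, the failing System.Text.Json counterpart of that call would look roughly like this (a sketch for illustration; the original post does not show it):
return Ok(System.Text.Json.JsonSerializer.Serialize(result, new JsonSerializerOptions
{
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase
}));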
I tried to change the max settings in my web.config, but this didn't fix it. Are there any other settings which I could implement?
Thanks in advance!
Evert

I couldn't reproduce the issue: I made up a model with a string property 9,999,999 characters long, and it serialized successfully.
Then I scanned the JsonSerializerSettings definition and found the property below, which can throw an ArgumentException and seems related to your issue, so I'm afraid you could try setting MaxDepth.
/// <summary>
/// Gets or sets the maximum depth allowed when reading JSON. Reading past this depth will throw a <see cref="JsonReaderException"/>.
/// A null value means there is no maximum.
/// The default value is <c>64</c>.
/// </summary>
public int? MaxDepth
{
    get => _maxDepthSet ? _maxDepth : DefaultMaxDepth;
    set
    {
        if (value <= 0)
        {
            throw new ArgumentException("Value must be positive.", nameof(value));
        }

        _maxDepth = value;
        _maxDepthSet = true;
    }
}
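If you want to stay on System.Text.Json, the counterpart setting lives on JsonSerializerOptions. A minimal ASP.NET Core sketch, assuming the truncation really is depth-related (the value 128 is only an illustration, and `services` refers to the usual Startup/Program service collection):
// In Startup.ConfigureServices (or on the builder in Program.cs):
services.AddControllers()
    .AddJsonOptions(options =>
    {
        options.JsonSerializerOptions.MaxDepth = 128; // System.Text.Json defaults to 64
        options.JsonSerializerOptions.PropertyNamingPolicy = JsonNamingPolicy.CamelCase;
    });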

Related

How to use ServiceStack to store POCOs to MariaDB having complex types (objects and structs) blobbed as JSON?

I've got the following setup: C#, ServiceStack, MariaDB, POCOs with objects and structs, JSON.
The main question is: how do I use ServiceStack to store POCOs to MariaDB with complex types (objects and structs) blobbed as JSON, and still have working de/serialization of the same POCOs? Each of these tasks is supported on its own, but I ran into problems when putting them all together, mainly because of the structs.
... finally, while writing this, I found a solution, so it may look like I answered my own question, but I would still like to hear from more experienced people, because the solution I found seems a little complicated to me. Details and two subquestions come up later in the context.
Sorry for the length and for any misinformation caused by my limited knowledge.
A simple example. This is the final working version I ended up with. At the beginning there were no SomeStruct.ToString()/Parse() methods and no JsConfig settings.
using Newtonsoft.Json;
using ServiceStack;
using ServiceStack.DataAnnotations;
using ServiceStack.OrmLite;
using ServiceStack.Text;
using System.Diagnostics;

namespace Test
{
    public class MainObject
    {
        public int Id { get; set; }
        public string StringProp { get; set; }
        public SomeObject ObjectProp { get; set; }
        public SomeStruct StructProp { get; set; }
    }

    public class SomeObject
    {
        public string StringProp { get; set; }
    }

    public struct SomeStruct
    {
        public string StringProp { get; set; }

        public override string ToString()
        {
            // Unable to use .ToJson() here (ServiceStack does not serialize structs).
            // Unable to use ServiceStack's JSON.stringify here because it just takes ToString() => stack overflow.
            // => Therefore Newtonsoft.Json used.
            var serializedStruct = JsonConvert.SerializeObject(this);
            return serializedStruct;
        }

        public static SomeStruct Parse(string json)
        {
            // This method behaves differently for just deserialization or when part of Save().
            // Details in the text.
            // After playing with different options of altering the json input I ended with just taking what comes.
            // After all it is not necessary, but maybe useful in other situations.
            var structItem = JsonConvert.DeserializeObject<SomeStruct>(json);
            return structItem;
        }
    }

    internal class ServiceStackMariaDbStructTest
    {
        private readonly MainObject _mainObject = new MainObject
        {
            ObjectProp = new SomeObject { StringProp = "SomeObject's String" },
            StringProp = "MainObject's String",
            StructProp = new SomeStruct { StringProp = "SomeStruct's String" }
        };

        public ServiceStackMariaDbStructTest()
        {
            // This one line is needed to store complex types as blobbed JSON in MariaDB.
            MySqlDialect.Provider.StringSerializer = new JsonStringSerializer();
            JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
            JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
        }

        public void Test_Serialization()
        {
            try
            {
                var json = _mainObject.ToJson();
                if (!string.IsNullOrEmpty(json))
                {
                    var objBack = json.FromJson<MainObject>();
                }
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }

        public void Test_Save()
        {
            var cs = "ConnectionStringToMariaDB";
            var dbf = new OrmLiteConnectionFactory(cs, MySqlDialect.Provider);
            using var db = dbf.OpenDbConnection();
            db.DropAndCreateTable<MainObject>();

            try
            {
                db.Save(_mainObject);
                var dbObject = db.SingleById<MainObject>(_mainObject.Id);
            }
            catch (System.Exception ex)
            {
                Debug.WriteLine(ex.Message);
            }
        }
    }
}
What (I think) I know / have tried, but which at first didn't help me solve it myself:
ServiceStack stores complex types in the DB as blobbed JSV by default (last paragraph of the first section: https://github.com/ServiceStack/ServiceStack.OrmLite), so it is necessary to set it as proposed: MySqlDialect.Provider.StringSerializer = new JsonStringSerializer(); (https://github.com/ServiceStack/ServiceStack.OrmLite#pluggable-complex-type-serializers) => the default JSV is changed to JSON.
ServiceStack's serialization does not work with structs; they have to be treated in a special way:
a) according to https://github.com/ServiceStack/ServiceStack.Text#c-structs-and-value-types and the example https://github.com/ServiceStack/ServiceStack.Text/#using-structs-to-customize-json, it is necessary to implement TStruct.ToString() and static TStruct.ParseJson()/ParseJsv() methods.
b) according to https://github.com/ServiceStack/ServiceStack.Text/#typeserializer-details-jsv-format and the unit tests https://github.com/ServiceStack/ServiceStack.Text/blob/master/tests/ServiceStack.Text.Tests/CustomStructTests.cs, it should be TStruct.ToString() (the same as in a) and static TStruct.Parse().
Subquestion #1: which one is right? For me, ParseJson() was never called; Parse() was. Is this a documentation issue, or is ParseJson() used in another situation?
I implemented option b). Results:
IDbConnection.Save(_mainObject) saved the item to MariaDB. Success.
During the saving process, ToString() and Parse() were called. In Parse(), the incoming JSON looked like this:
"{\"StringProp\":\"SomeStruct's String\"}". Fine.
Serialization worked. Success.
Deserialization failed. I don't know why, but the JSON coming into Parse() was "double-escaped":
"{\\\"StringProp\\\":\\\"SomeStruct's String\\\"}"
Subquestion #2: why is the JSON "double-escaped" in Parse() during deserialization?
I tried to handle the structs with JsConfig (and Newtonsoft.Json, to get proper JSON):
JsConfig<SomeStruct>.SerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.DeSerializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results:
Save failed: the json input to JsonConvert.DeserializeObject(json), which is used during Save, was just the type name "WinAmbPrototype.SomeStruct".
De/serialization worked.
b) then I implemented ToString(), also using Newtonsoft.Json. During Save, ToString() was used instead of JsConfig.SerializeFn even though JsConfig.SerializeFn was still set (maybe by design, I do not judge). Results:
Save failed: the json input to DeserializeFn called during Save changed; now it was JSV-like "{StringProp:SomeStruct's String}", but still not deserializable as JSON.
De/serialization worked.
Then (while writing this, still without any solution) I found the JsConfig.Raw* "overrides" and tried them:
JsConfig<SomeStruct>.RawSerializeFn = someStruct => JsonConvert.SerializeObject(someStruct);
JsConfig<SomeStruct>.RawDeserializeFn = json => JsonConvert.DeserializeObject<SomeStruct>(json);
a) at first without ToString() and Parse() defined in the TStruct. Results are the same as in 2a.
b) then I implemented ToString(). Results:
BOTH WORKED. No Parse() method needed for this task.
But it is very fragile setup:
if I removed ToString(), it failed (now I understand why: the default ToString() produced JSON with just the type name, as in 2a and 3a).
if I removed the RawSerializeFn setting, it failed in RawDeserializeFn ("double-escaped" JSON).
Is there some simpler solution? I would be very glad if someone pointed me in a better direction.
Two solutions would be acceptable (both of them, because of different circumstances):
if I am the TStruct owner: just a pure TStruct.ToString() and static TStruct.Parse() that support out-of-the-box de/serialization and DB storage by ServiceStack (without the different input to Parse()).
if I am a consumer of a TStruct with no JSON support implemented and no access to its code: so far I have not found a way; if ToString() is not implemented, Save to the DB does not work. It would be nice if the JsConfig serialize functions were enough for both de/serialization and for saving to the DB.
And the best option would be one that doesn't pull in another dependency (e.g. Newtonsoft.Json) to serialize structs. Maybe something like JsConfig.ShallProcessStructs = true; (WARNING: just an idea, not working as of 2021-04-02) would be fine for such situations.
ServiceStack treats structs like a single scalar value type, just like most of the core BCL value types (e.g. TimeSpan, DateTime, etc.). Overloading the Parse() and ToString() methods and the struct's constructor lets you control the serialization/deserialization of custom structs.
The docs have been corrected. Structs use Parse() whilst classes use ParseJson()/ParseJsv().
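A minimal sketch of that convention (the type and its string format are made up for illustration, not taken from the question):
public struct Point
{
    public int X { get; set; }
    public int Y { get; set; }

    // ServiceStack.Text writes the struct out as whatever this returns.
    public override string ToString() => $"{X},{Y}";

    // ...and reads it back by calling this static Parse(string) method.
    public static Point Parse(string value)
    {
        var parts = value.Split(',');
        return new Point { X = int.Parse(parts[0]), Y = int.Parse(parts[1]) };
    }
}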
If you want to serialize a model's properties, I'd suggest you use a class instead, as the behavior you're looking for is that of a POCO DTO.
If you want to have structs serialized as DTOs in your RDBMS, an alternative you can try is to just use JSON.NET for the complex type serialization, e.g.:
public class JsonNetStringSerializer : IStringSerializer
{
    public To DeserializeFromString<To>(string serializedText) =>
        JsonConvert.DeserializeObject<To>(serializedText);

    public object DeserializeFromString(string serializedText, Type type) =>
        JsonConvert.DeserializeObject(serializedText, type);

    public string SerializeToString<TFrom>(TFrom from) =>
        JsonConvert.SerializeObject(from);
}

MySqlDialect.Provider.StringSerializer = new JsonNetStringSerializer();

Xstream ConversionException for Wrong Input String

I'm using XStream to parse some JSON. I've used XStream quite extensively over the years, but this issue has me stumped.
I'm getting the following ConversionException...
com.thoughtworks.xstream.converters.ConversionException: For input string: ".232017E.232017E44"
---- Debugging information ----
message : For input string: ".232017E.232017E44"
cause-exception : java.lang.NumberFormatException
cause-message : For input string: ".232017E.232017E44"
class : java.sql.Timestamp
required-type : java.sql.Timestamp
converter-type : com.etepstudios.xstream.XStreamTimestampConverter
line number : -1
class[1] : com.pbp.bookacall.dataobjects.AppleReceipt
converter-type[1] : com.thoughtworks.xstream.converters.reflection.ReflectionConverter
class[2] : com.pbp.bookacall.dataobjects.AppleReceiptCollection
version : 1.4.10
-------------------------------
at com.etepstudios.xstream.XStreamTimestampConverter.unmarshal(XStreamTimestampConverter.java:87)
In my XStreamTimestampConverter class I print out the value that it is attempting to convert, which turns out to be the following...
XStreamTimestampConverter value = 2017-08-05 23:44:23.GMT
Here is the unmarshal function in my converter...
public Object unmarshal(HierarchicalStreamReader reader, UnmarshallingContext context)
{
    Timestamp theTimestamp;
    Date theDate;
    String value = reader.getValue();
    try
    {
        SimpleDateFormat formatter = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss.Z");
        theDate = formatter.parse(value);
        theTimestamp = new Timestamp(theDate.getTime());
    }
    catch (Exception e)
    {
        System.out.println("XStreamTimestampConverter value = " + value);
        throw new ConversionException(e.getMessage(), e);
    }
    return theTimestamp;
}
Any idea where this odd string is coming from? It does not exist anywhere in my JSON. Does XStream have some odd .[num]E.[num]E[num] notation for something? These numbers change each time I run it. I also occasionally get a For input string: "", yet the value is similar to the one above. It's like it's randomly getting odd values from somewhere.
The data source is Apple's In-App Purchase /VerifyReceipt web call. The system works just fine sometimes, but other times it does not. It's also important to note that in this very case it parsed hundreds of other Date/Timestamp strings using this converter. It just gets confused. Perhaps due to the size of the data?
So I figured out what was going on here. The unmarshal function above is not exactly as I have it in code...
The SimpleDateFormat formatter is actually created at the class level rather than inside the unmarshal method. Therefore, if XStream holds on to an instance of my converter and unmarshal is called from multiple threads, the formatter can get confused, since it is the same object.
That's my only guess at this point, as moving the formatter initialization into the method solved the issue. SimpleDateFormat is indeed not thread-safe.
It was just the sheer amount of data and the number of concurrent calls that exposed this issue. Just a tip for anyone else, in case this happens to them.

ServiceStack.Text CSV serialization of IEnumerable<object> ignores custom serialization functions

Firstly, please forgive any rookie mistakes here - I'm not a regular poster I'm afraid.
Now on to the nitty gritty...
I am trying to use ServiceStack.Text to serialize objects to CSV. If I keep it simple, everything works as expected when serializing objects of a known type.
However, I want to serialize many objects whose types I don't know at runtime, so I am writing a reusable component where all data is treated as System.Object. We already do this same routine for JSON serialization without problems, but CsvSerializer appears to handle objects differently during serialization.
Sample code
public void TestIEnumerableObjectSerialization()
{
    var data = GenerateSampleData();

    JsConfig<DateTime>.SerializeFn =
        time => new DateTime(time.Ticks, DateTimeKind.Utc).ToString("yyyy-MM-dd HH:mm:ss");

    var csv = CsvSerializer.SerializeToCsv(data);

    Console.WriteLine(csv);

    Assert.Equal("DateTime\r\n"
        + "2017-06-14 00:00:00\r\n"
        + "2017-01-31 01:23:45\r\n",
        csv);
}

object[] GenerateSampleData()
{
    return new object[] {
        new POCO
        {
            DateTime = new DateTime(2017, 6, 14)
        },
        new POCO
        {
            DateTime = new DateTime(2017, 1, 31, 01, 23, 45)
        }
    };
}

public class POCO
{
    public DateTime DateTime { get; set; }
}
The result of this code is that the custom serialization function is not invoked, and the DateTime is written out using the standard ToString() method.
The cause?
The CsvWriter.Write method inspects the type of the records, and if the type is Object the records are treated as Dictionary<string, object> and CsvDictionaryWriter generates the output.
In turn, CsvDictionaryWriter uses the ToCsvField() extension method to write each property of a record.
The problem is that ToCsvField() converts the value of each property to a string using ToString(), meaning no custom serialization is performed.
JsonSerializer uses TypeSerializer.SerializeToString(text) to serialize the properties of an Object using any configured custom serialization functions; but this doesn't happen with CsvSerializer.
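A rough sketch of that gap, using the sample types above (illustrative only, not a test from the post):
// Same custom function as in the sample code above.
JsConfig<DateTime>.SerializeFn =
    time => new DateTime(time.Ticks, DateTimeKind.Utc).ToString("yyyy-MM-dd HH:mm:ss");

var poco = new POCO { DateTime = new DateTime(2017, 6, 14) };

// Known element type: the custom function is applied as expected.
var typedCsv = CsvSerializer.SerializeToCsv(new[] { poco });

// object[]: the records take the Dictionary<string, object> path described above,
// so each value falls back to ToString() and the custom function is skipped.
var untypedCsv = CsvSerializer.SerializeToCsv(new object[] { poco });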
A possible solution?
Without complicating CsvSerializer, the ToCsvField() extension method could be updated to use TypeSerializer to handle the serialization to a string. Here is what I've been testing with so far:
public static object ToCsvField(this object text)
{
    var textSerialized = TypeSerializer.SerializeToString(text).StripQuotes();

    return textSerialized == null || !CsvWriter.HasAnyEscapeChars(textSerialized)
        ? textSerialized
        : string.Concat
        (
            CsvConfig.ItemDelimiterString,
            textSerialized.Replace(CsvConfig.ItemDelimiterString, CsvConfig.EscapedItemDelimiterString),
            CsvConfig.ItemDelimiterString
        );
}
So far I haven't come across an issue with this change, although someone may prefer not to allocate a new intermediate variable before the return statement.
Hopefully that is enough information, so on to my questions...
Has anyone else experienced this issue?
Am I doing something wrong and should I be serializing Objects a different way?
If this is a suitable fix/implementation of TypeSerializer, what are the chances of this being addressed in an update to ServiceStack.Text? I would raise an issue on GitHub but the ServiceStack.Text repo doesn't let me raise issues.
Thanks in advance.

<f:selectItems> returns a validation error [duplicate]

I have a problem with a p:selectOneMenu: no matter what I do, I cannot get JSF to call the setter on the JPA entity. JSF validation fails with this message:
form:location: Validation Error: Value is not valid
I have this working on several other classes of the same type (i.e. join table classes) but cannot for the life of me get this one working.
If anyone can throw some troubleshooting/debugging tips for this sort of problem it would be greatly appreciated.
Using log statements I have verified the following:
The Converter is returning correct, non-null values.
I have no Bean Validation in my JPA entities.
The setter setLocation(Location location) is never called.
This is the simplest example I can make, and it simply will not work:
<h:body>
    <h:form id="form">
        <p:messages id="messages" autoUpdate="true" />
        <p:selectOneMenu id="location" value="#{locationStockList.selected.location}" converter="locationConverter">
            <p:ajax event="change" update=":form:lblLocation"/>
            <f:selectItems value="#{locationStockList.locationSelection}"/>
        </p:selectOneMenu>
    </h:form>
</h:body>
Converter:
@FacesConverter(forClass=Location.class, value="locationConverter")
public class LocationConverter implements Converter, Serializable {

    private static final Logger logger = Logger.getLogger(LocationConverter.class.getName());

    @Override
    public Object getAsObject(FacesContext context, UIComponent component, String value) {
        if (value.isEmpty())
            return null;

        try {
            Long id = Long.parseLong(value);
            Location location = ((LocationManagedBean) context.getApplication().getELResolver().getValue(context.getELContext(), null, "location")).find(id);
            logger.log(Level.SEVERE, "Converted {0} to {1}", new Object[] {value, location});
            return location;
        } catch (NumberFormatException e) {
            return new Location();
        }
    }

    @Override
    public String getAsString(FacesContext context, UIComponent component, Object value) {
        if (value == null || value.toString().isEmpty() || !(value instanceof Location))
            return "";

        return String.valueOf(((Location) value).getId());
    }
}
Console output:
// Getter method
INFO: Current value=ejb.locations.Location[id=null, name=null, latitude=0.0, longitude=0.0]
// Session Bean
INFO: Finding ejb.locations.Location with id=3
// Session Bean
INFO: ### Returning : ejb.locations.Location[id=3, name=mdmd, latitude=4.5, longitude=2.3]
// Converter
SEVERE: Converted 3 to ejb.locations.Location[id=3, name=mdmd, latitude=4.5, longitude=2.3]
// Getter method -> Where did my selected Location go ??
INFO: Current value=ejb.locations.Location[id=null, name=null, latitude=0.0, longitude=0.0]
Validation fails with the message "form:location: Validation Error: Value is not valid"
This error boils down to the fact that the selected item does not match any of the available select item values specified by any nested <f:selectItem(s)> tag during the processing of the form submit request.
As part of the safeguard against tampered/hacked requests, JSF will reiterate over all available select item values and test whether selectedItem.equals(availableItem) returns true for at least one available item value. If no item value matches, then you'll get exactly this validation error.
Under the covers, this process is basically as below, whereby bean.getAvailableItems() fictionally represents the entire list of available select items as defined by <f:selectItem(s)>:
String submittedValue = request.getParameter(component.getClientId());
Converter converter = component.getConverter();
Object selectedItem = (converter != null) ? converter.getAsObject(context, component, submittedValue) : submittedValue;

boolean valid = false;

for (Object availableItem : bean.getAvailableItems()) {
    if (selectedItem.equals(availableItem)) {
        valid = true;
        break;
    }
}

if (!valid) {
    throw new ValidatorException("Validation Error: Value is not valid");
}
So, based on the above logic, this problem can logically have at least the following causes:
The selected item is missing in the list of available items.
The equals() method of the class representing the selected item is missing or broken.
If a custom Converter is involved, then it has returned the wrong object in getAsObject(). Perhaps it's even null.
To solve it:
Ensure that exactly the same list is being preserved during the subsequent request, particularly in the case of multiple cascading menus. Making the bean @ViewScoped instead of @RequestScoped should fix it in most cases. Also make sure that you don't perform the business logic in the getter method of <f:selectItem(s)>, but instead in @PostConstruct or an action event (listener) method. If you're relying on specific request parameters, then you'd need to explicitly store them in the @ViewScoped bean, or re-pass them on subsequent requests by e.g. <f:param>. See also How to choose the right bean scope?
Ensure that the equals() method is implemented correctly. This is already done right on standard Java types such as java.lang.String, java.lang.Number, etc., but not necessarily on custom objects/beans/entities. See also Right way to implement equals contract. In case you're already using String, make sure that the request character encoding is configured right. If it contains special characters and JSF is configured to render the output as UTF-8 but interpret the input as e.g. ISO-8859-1, then it will fail. See also a.o. Unicode input retrieved via PrimeFaces input components become corrupted.
Debug/log the actions of your custom Converter and fix it accordingly. For guidelines, see also Conversion Error setting value for 'null Converter'. In case you're using java.util.Date as available items with <f:convertDateTime>, make sure that you don't forget the full time part in the pattern. See also "Validation Error: Value is not valid" error from f:datetimeConverter.
See also:
Our selectOneMenu wiki page
How to populate options of h:selectOneMenu from database?
Make multiple dependent / cascading selectOneMenu dropdown lists in JSF
If anyone can throw some troubleshooting/debugging tips for this sort of problem it would be greatly appreciated.
Just ask a clear and concrete question here. Do not ask too broad questions ;)
In my case I forgot to implement correct get/set methods. It happened because I changed a lot of attributes during development.
Without a proper get method, JSF can't recover your selected item, and what BalusC said in item 1 of his answer happens:
1. The selected item is missing in the list of available items. This can happen if the list of available items is served by a request scoped bean which is not properly reinitialized on a subsequent request, or is incorrectly doing the business job inside a getter method, which causes it to return a different list in some way.
This can be a Converter issue or a DTO issue.
Try to solve it by adding hashCode() and equals() methods to your DTO; in the above scenario you can generate these methods within the Location class, which acts as the 'DTO' here.
Example:
@Override
public int hashCode() {
    final int prime = 31;
    int result = 1;
    result = prime * result + (int) (id ^ (id >>> 32));
    return result;
}

@Override
public boolean equals(Object obj) {
    if (this == obj)
        return true;
    if (obj == null)
        return false;
    if (getClass() != obj.getClass())
        return false;
    Location other = (Location) obj;
    if (id != other.id)
        return false;
    return true;
}
Please note that the above example is for an 'id' of type 'long'.

MVC binding issue from JSON to enum (custom exception from int to enum)

I have this problem: I use JSON to send data to the server.
Everything works fine, but there is a problem with a situation like this:
public enum SexType
{
    Male = 0,
    Female = 1
}

class People
{
    public SexType Sex { get; set; }
}
That produces the JSON:
{"Sex" : 0}
When I send it back to the server, it fills the ModelState errors with this issue:
The parameter conversion from type 'System.Int32' to type 'SexType' failed because no type converter can convert between these types.
But if I wrap the value in quotes, it all works well:
{"Sex" : '0'}
Does anyone have the same problem?
Thanks, all!
Yes, I got the same problem. The weird problem is that if you sent back:
{"Sex" : 'Male'}
it would deserialize no problem.
To solve the problem, I implemented a custom model binder for enums, leveraging the example found here (slightly modified, as there were some errors):
http://eliasbland.wordpress.com/2009/08/08/enumeration-model-binder-for-asp-net-mvc/
namespace yournamespace
{
    /// <summary>
    /// Generic Custom Model Binder used to properly interpret int representation of enum types from JSON deserialization, including default values
    /// </summary>
    /// <typeparam name="T">The enum type to apply this Custom Model Binder to</typeparam>
    public class EnumBinder<T> : IModelBinder
    {
        private T DefaultValue { get; set; }

        public EnumBinder(T defaultValue)
        {
            DefaultValue = defaultValue;
        }

        #region IModelBinder Members

        public object BindModel(ControllerContext controllerContext, ModelBindingContext bindingContext)
        {
            return bindingContext.ValueProvider.GetValue(bindingContext.ModelName) == null
                ? DefaultValue
                : GetEnumValue(DefaultValue, bindingContext.ValueProvider.GetValue(bindingContext.ModelName).AttemptedValue);
        }

        #endregion

        public static T GetEnumValue<T>(T defaultValue, string value)
        {
            T enumType = defaultValue;

            if ((!String.IsNullOrEmpty(value)) && (Contains(typeof(T), value)))
                enumType = (T)Enum.Parse(typeof(T), value, true);

            return enumType;
        }

        public static bool Contains(Type enumType, string value)
        {
            return Enum.GetNames(enumType).Contains(value, StringComparer.OrdinalIgnoreCase);
        }
    }
}
and then registering the model binder in global.asax.cs.
In your case it would be something like:
ModelBinders.Binders.Add(typeof(SexType), new EnumBinder<SexType>(SexType.Male));
I am not sure if there is a quicker way, but this works great.
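For completeness, in global.asax.cs that registration typically sits in Application_Start, roughly like this (a sketch):
protected void Application_Start()
{
    // Register the custom binder for the SexType enum (the default value here is illustrative).
    ModelBinders.Binders.Add(typeof(SexType), new EnumBinder<SexType>(SexType.Male));

    // ... other startup registrations (areas, routes, filters, etc.)
}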
The model binding uses the Enum.Parse() method, which is fairly smart about interpreting strings but does NOT explicitly cast or convert other types into strings, even if system-level facilities exist to do so and even if they're the underlying storage type used within the enum.
Is this the right behavior? Arguably so, since if you don't know enough to convert your enum values to strings, you might not be aware that the right-hand side of the enum values is not necessarily unique within the enum, either.
As a matter of personal taste (and probably also because I do way too much statistical analysis programming), for sex I generally prefer a clear boolean value: instead of differentiating between arbitrary values for 'Male' and 'Female', I use a variable called e.g. IsFemale and set it to true or false. This plays more nicely with JSON, since it relies on primitive types common to both languages, and it requires less typing when you want to use it.
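A minimal sketch of that boolean alternative (the property name is illustrative):
class People
{
    // A plain bool binds from {"IsFemale": true} without any enum converter.
    public bool IsFemale { get; set; }
}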