I am using the autocomplete feature of jQuery. When I try to retrieve a list of more than 17,000 records (none longer than 10 characters), the serialized string exceeds the allowed length and the following error is thrown:
Exception information:
Exception type: InvalidOperationException
Exception message: Error during serialization or deserialization using the JSON JavaScriptSerializer. The length of the string exceeds the value set on the maxJsonLength property.
Can I set an unlimited length for maxJsonLength in web.config? If not, what is the maximum length I can set?
NOTE: this answer applies only to Web services; if you are returning JSON from a Controller method, make sure you read this SO answer as well: https://stackoverflow.com/a/7207539/1246870
The MaxJsonLength property cannot be unlimited; it is an integer property that defaults to 102400 (100k).
You can set the MaxJsonLength property in your web.config:
<configuration>
<system.web.extensions>
<scripting>
<webServices>
<jsonSerialization maxJsonLength="50000000"/>
</webServices>
</scripting>
</system.web.extensions>
</configuration>
If you are using MVC 4, be sure to check out this answer as well.
If you are still receiving the error:
after setting the maxJsonLength property to its maximum value in web.config
and you know that your data's length is less than this value
and you are not utilizing a web service method for the JavaScript serialization
your problem is likely that:
The value of the MaxJsonLength property applies only to the internal JavaScriptSerializer instance that is used by the asynchronous communication layer to invoke Web services methods. (MSDN: ScriptingJsonSerializationSection.MaxJsonLength Property)
Basically, the "internal" JavaScriptSerializer respects the value of maxJsonLength when called from a web method; direct use of a JavaScriptSerializer (or use via an MVC action-method/Controller) does not respect the maxJsonLength property, at least not from the systemWebExtensions.scripting.webServices.jsonSerialization section of web.config. In particular, the Controller.Json() method does not respect the configuration setting!
As a workaround, you can do the following within your Controller (or anywhere really):
var serializer = new JavaScriptSerializer();
// For simplicity just use Int32's max value.
// You could always read the value from the config section mentioned above.
serializer.MaxJsonLength = Int32.MaxValue;
var resultData = new { Value = "foo", Text = "var" };
var result = new ContentResult{
Content = serializer.Serialize(resultData),
ContentType = "application/json"
};
return result;
This answer is my interpretation of this asp.net forum answer.
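As a side note, the comment in the snippet above mentions reading the value from the config section instead of hard-coding it; a minimal sketch of that (my addition, assuming the web.config entry shown earlier) could look like this:
using System.Web.Configuration;
using System.Web.Script.Serialization;

// Builds a serializer whose limit matches the <jsonSerialization> element in web.config.
// Falls back to Int32.MaxValue if the section is not present.
static JavaScriptSerializer CreateConfiguredSerializer()
{
    var section = (ScriptingJsonSerializationSection)WebConfigurationManager
        .GetSection("system.web.extensions/scripting/webServices/jsonSerialization");

    return new JavaScriptSerializer
    {
        MaxJsonLength = section != null ? section.MaxJsonLength : Int32.MaxValue
    };
}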
In MVC 4 you can do:
protected override JsonResult Json(object data, string contentType, System.Text.Encoding contentEncoding, JsonRequestBehavior behavior)
{
return new JsonResult()
{
Data = data,
ContentType = contentType,
ContentEncoding = contentEncoding,
JsonRequestBehavior = behavior,
MaxJsonLength = Int32.MaxValue
};
}
in your controller.
Addition:
For anyone puzzled by the parameters you need to specify, a call could look like this:
Json(
new {
field1 = true,
field2 = "value"
},
"application/json",
Encoding.UTF8,
JsonRequestBehavior.AllowGet
);
You can configure the max length for json requests in your web.config file:
<configuration>
<system.web.extensions>
<scripting>
<webServices>
<jsonSerialization maxJsonLength="....">
</jsonSerialization>
</webServices>
</scripting>
</system.web.extensions>
</configuration>
The default value for maxJsonLength is 102400. For more details, see this MSDN page: http://msdn.microsoft.com/en-us/library/bb763183.aspx
If you are still getting the error after a web.config setting like the following:
<configuration>
<system.web.extensions>
<scripting>
<webServices>
<jsonSerialization maxJsonLength="50000000"/>
</webServices>
</scripting>
</system.web.extensions>
</configuration>
I solved it as follows:
public JsonResult getData()
{
var jsonResult = Json(superlargedata, JsonRequestBehavior.AllowGet);
jsonResult.MaxJsonLength = int.MaxValue;
return jsonResult;
}
I hope this helps.
I was having this problem in ASP.NET Web Forms. It was completely ignoring the web.config file settings so I did this:
JavaScriptSerializer serializer = new JavaScriptSerializer();
serializer.MaxJsonLength = Int32.MaxValue;
return serializer.Serialize(response);
Of course overall this is terrible practice. If you are sending this much data in a web service call you should look at a different approach.
I followed vestigal's answer and got to this solution:
When I needed to post a large json to an action in a controller, I would get the famous "Error during deserialization using the JSON JavaScriptSerializer. The length of the string exceeds the value set on the maxJsonLength property.\r\nParameter name: input value provider".
What I did was create a new ValueProviderFactory, LargeJsonValueProviderFactory, and set MaxJsonLength = Int32.MaxValue in the GetDeserializedObject method:
public sealed class LargeJsonValueProviderFactory : ValueProviderFactory
{
private static void AddToBackingStore(LargeJsonValueProviderFactory.EntryLimitedDictionary backingStore, string prefix, object value)
{
IDictionary<string, object> dictionary = value as IDictionary<string, object>;
if (dictionary != null)
{
foreach (KeyValuePair<string, object> keyValuePair in (IEnumerable<KeyValuePair<string, object>>) dictionary)
LargeJsonValueProviderFactory.AddToBackingStore(backingStore, LargeJsonValueProviderFactory.MakePropertyKey(prefix, keyValuePair.Key), keyValuePair.Value);
}
else
{
IList list = value as IList;
if (list != null)
{
for (int index = 0; index < list.Count; ++index)
LargeJsonValueProviderFactory.AddToBackingStore(backingStore, LargeJsonValueProviderFactory.MakeArrayKey(prefix, index), list[index]);
}
else
backingStore.Add(prefix, value);
}
}
private static object GetDeserializedObject(ControllerContext controllerContext)
{
if (!controllerContext.HttpContext.Request.ContentType.StartsWith("application/json", StringComparison.OrdinalIgnoreCase))
return (object) null;
string end = new StreamReader(controllerContext.HttpContext.Request.InputStream).ReadToEnd();
if (string.IsNullOrEmpty(end))
return (object) null;
var serializer = new JavaScriptSerializer {MaxJsonLength = Int32.MaxValue};
return serializer.DeserializeObject(end);
}
/// <summary>Returns a JSON value-provider object for the specified controller context.</summary>
/// <returns>A JSON value-provider object for the specified controller context.</returns>
/// <param name="controllerContext">The controller context.</param>
public override IValueProvider GetValueProvider(ControllerContext controllerContext)
{
if (controllerContext == null)
throw new ArgumentNullException("controllerContext");
object deserializedObject = LargeJsonValueProviderFactory.GetDeserializedObject(controllerContext);
if (deserializedObject == null)
return (IValueProvider) null;
Dictionary<string, object> dictionary = new Dictionary<string, object>((IEqualityComparer<string>) StringComparer.OrdinalIgnoreCase);
LargeJsonValueProviderFactory.AddToBackingStore(new LargeJsonValueProviderFactory.EntryLimitedDictionary((IDictionary<string, object>) dictionary), string.Empty, deserializedObject);
return (IValueProvider) new DictionaryValueProvider<object>((IDictionary<string, object>) dictionary, CultureInfo.CurrentCulture);
}
private static string MakeArrayKey(string prefix, int index)
{
return prefix + "[" + index.ToString((IFormatProvider) CultureInfo.InvariantCulture) + "]";
}
private static string MakePropertyKey(string prefix, string propertyName)
{
if (!string.IsNullOrEmpty(prefix))
return prefix + "." + propertyName;
return propertyName;
}
private class EntryLimitedDictionary
{
private static int _maximumDepth = LargeJsonValueProviderFactory.EntryLimitedDictionary.GetMaximumDepth();
private readonly IDictionary<string, object> _innerDictionary;
private int _itemCount;
public EntryLimitedDictionary(IDictionary<string, object> innerDictionary)
{
this._innerDictionary = innerDictionary;
}
public void Add(string key, object value)
{
if (++this._itemCount > LargeJsonValueProviderFactory.EntryLimitedDictionary._maximumDepth)
throw new InvalidOperationException("JsonValueProviderFactory_RequestTooLarge");
this._innerDictionary.Add(key, value);
}
private static int GetMaximumDepth()
{
NameValueCollection appSettings = ConfigurationManager.AppSettings;
if (appSettings != null)
{
string[] values = appSettings.GetValues("aspnet:MaxJsonDeserializerMembers");
int result;
if (values != null && values.Length > 0 && int.TryParse(values[0], out result))
return result;
}
return 1000;
}
}
}
Then, in the Application_Start method from Global.asax.cs, replace the ValueProviderFactory with the new one:
protected void Application_Start()
{
...
//Add LargeJsonValueProviderFactory
ValueProviderFactory jsonFactory = null;
foreach (var factory in ValueProviderFactories.Factories)
{
if (factory.GetType().FullName == "System.Web.Mvc.JsonValueProviderFactory")
{
jsonFactory = factory;
break;
}
}
if (jsonFactory != null)
{
ValueProviderFactories.Factories.Remove(jsonFactory);
}
var largeJsonValueProviderFactory = new LargeJsonValueProviderFactory();
ValueProviderFactories.Factories.Add(largeJsonValueProviderFactory);
}
I fixed it.
//your Json data here
string json_object="........";
JavaScriptSerializer jsJson = new JavaScriptSerializer();
jsJson.MaxJsonLength = 2147483644;
MyClass obj = jsJson.Deserialize<MyClass>(json_object);
It works very well.
If, after implementing the above addition in your web.config, you get an “Unrecognized configuration section system.web.extensions.” error, then try adding this to your web.config in the <configSections> section:
<sectionGroup name="system.web.extensions" type="System.Web.Extensions">
<sectionGroup name="scripting" type="System.Web.Extensions">
<sectionGroup name="webServices" type="System.Web.Extensions">
<section name="jsonSerialization" type="System.Web.Extensions"/>
</sectionGroup>
</sectionGroup>
</sectionGroup>
Simply set the MaxJsonLength property in your MVC action method:
JsonResult json = Json(classObject, JsonRequestBehavior.AllowGet);
json.MaxJsonLength = int.MaxValue;
return json;
You can write this line in your controller:
json.MaxJsonLength = 2147483644;
You can also add this to your web.config:
<configuration>
<system.web.extensions>
<scripting>
<webServices>
<jsonSerialization maxJsonLength="2147483647">
</jsonSerialization>
</webServices>
</scripting>
</system.web.extensions>
</configuration>
To be on the safe side, use both.
Fix for ASP.NET MVC: if you want to fix it only for the particular action that is causing the problem, then change this code:
public JsonResult GetBigJson()
{
var someBigObject = GetBigObject();
return Json(someBigObject);
}
to this:
public JsonResult GetBigJson()
{
var someBigObject = GetBigObject();
return new JsonResult()
{
Data = someBigObject,
JsonRequestBehavior = JsonRequestBehavior.DenyGet,
MaxJsonLength = int.MaxValue
};
}
And the functionality should be the same; you can just return a bigger JSON response.
Explanation, based on the ASP.NET MVC source code: you can check what the Controller.Json method does:
protected internal JsonResult Json(object data)
{
return Json(data, null /* contentType */, null /* contentEncoding */, JsonRequestBehavior.DenyGet);
}
It is calling other Controller.Json method:
protected internal virtual JsonResult Json(object data, string contentType, Encoding contentEncoding, JsonRequestBehavior behavior)
{
return new JsonResult
{
Data = data,
ContentType = contentType,
ContentEncoding = contentEncoding,
JsonRequestBehavior = behavior
};
}
where the passed contentType and contentEncoding objects are null. So basically calling return Json(object) in a controller is equivalent to calling return new JsonResult { Data = object, JsonRequestBehavior = JsonRequestBehavior.DenyGet }. You can use the second form and parameterize JsonResult.
So what happens when you set MaxJsonLength property (by default it's null)?
It's passed down to the JavaScriptSerializer.MaxJsonLength property, and then the JavaScriptSerializer.Serialize method is called:
JavaScriptSerializer serializer = new JavaScriptSerializer();
if (MaxJsonLength.HasValue)
{
serializer.MaxJsonLength = MaxJsonLength.Value;
}
if (RecursionLimit.HasValue)
{
serializer.RecursionLimit = RecursionLimit.Value;
}
response.Write(serializer.Serialize(Data));
And when you don't set the MaxJsonLength property of the serializer, it takes the default value, which is 2,097,152 characters (about 4 MB of Unicode string data).
If you are getting this error from the MiniProfiler in MVC then you can increase the value by setting the property MiniProfiler.Settings.MaxJsonResponseSize to the desired value. By default, this tool seems to ignore the value set in config.
MiniProfiler.Settings.MaxJsonResponseSize = 104857600;
Courtesy mvc-mini-profiler.
I suggest setting it to Int32.MaxValue.
JavaScriptSerializer serializer = new JavaScriptSerializer();
serializer.MaxJsonLength = Int32.MaxValue;
How about some attribute magic?
[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method, Inherited = true, AllowMultiple = false)]
public class MaxJsonSizeAttribute : ActionFilterAttribute
{
// Default: 10 MB worth of one byte chars
private int maxLength = 10 * 1024 * 1024;
public int MaxLength
{
set
{
if (value < 0) throw new ArgumentOutOfRangeException("value", "Value must be at least 0.");
maxLength = value;
}
get { return maxLength; }
}
public override void OnActionExecuted(ActionExecutedContext filterContext)
{
JsonResult json = filterContext.Result as JsonResult;
if (json != null)
{
if (maxLength == 0)
{
json.MaxJsonLength = int.MaxValue;
}
else
{
json.MaxJsonLength = maxLength;
}
}
}
}
Then you could either apply it globally using the global filter configuration, or per controller/action.
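A global registration could look like this (just a sketch; it assumes the standard App_Start/FilterConfig.cs file that the MVC project templates generate):
using System.Web.Mvc;

public class FilterConfig
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        // MaxLength = 0 means "use int.MaxValue", per the attribute above.
        filters.Add(new MaxJsonSizeAttribute { MaxLength = 0 });
    }
}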
If you are encountering this sort of issue in a View, you can use the method below to resolve it. Here I used the Newtonsoft package.
@using Newtonsoft.Json
<script type="text/javascript">
var partData = @Html.Raw(JsonConvert.SerializeObject(ViewBag.Part));
</script>
Alternative ASP.NET MVC 5 Fix:
(Mine is similar to MFC's answer above, with a few small changes.)
I wasn't ready to change to Json.NET just yet, and in my case the error was occurring during the request. The best approach in my scenario was modifying the actual JsonValueProviderFactory, which applies the fix to the whole project. This can be done by editing the Global.asax.cs file to add the following line:
JsonValueProviderConfig.Config(ValueProviderFactories.Factories);
add a web.config entry:
<add key="aspnet:MaxJsonLength" value="20971520" />
and then create the two following classes
public class JsonValueProviderConfig
{
public static void Config(ValueProviderFactoryCollection factories)
{
var jsonProviderFactory = factories.OfType<JsonValueProviderFactory>().Single();
factories.Remove(jsonProviderFactory);
factories.Add(new CustomJsonValueProviderFactory());
}
}
This is basically an exact copy of the default implementation found in System.Web.Mvc but with the addition of a configurable web.config appsetting value aspnet:MaxJsonLength.
public class CustomJsonValueProviderFactory : ValueProviderFactory
{
/// <summary>Returns a JSON value-provider object for the specified controller context.</summary>
/// <returns>A JSON value-provider object for the specified controller context.</returns>
/// <param name="controllerContext">The controller context.</param>
public override IValueProvider GetValueProvider(ControllerContext controllerContext)
{
if (controllerContext == null)
throw new ArgumentNullException("controllerContext");
object deserializedObject = CustomJsonValueProviderFactory.GetDeserializedObject(controllerContext);
if (deserializedObject == null)
return null;
Dictionary<string, object> strs = new Dictionary<string, object>(StringComparer.OrdinalIgnoreCase);
CustomJsonValueProviderFactory.AddToBackingStore(new CustomJsonValueProviderFactory.EntryLimitedDictionary(strs), string.Empty, deserializedObject);
return new DictionaryValueProvider<object>(strs, CultureInfo.CurrentCulture);
}
private static object GetDeserializedObject(ControllerContext controllerContext)
{
if (!controllerContext.HttpContext.Request.ContentType.StartsWith("application/json", StringComparison.OrdinalIgnoreCase))
return null;
string fullStreamString = (new StreamReader(controllerContext.HttpContext.Request.InputStream)).ReadToEnd();
if (string.IsNullOrEmpty(fullStreamString))
return null;
var serializer = new JavaScriptSerializer()
{
MaxJsonLength = CustomJsonValueProviderFactory.GetMaxJsonLength()
};
return serializer.DeserializeObject(fullStreamString);
}
private static void AddToBackingStore(EntryLimitedDictionary backingStore, string prefix, object value)
{
IDictionary<string, object> strs = value as IDictionary<string, object>;
if (strs != null)
{
foreach (KeyValuePair<string, object> keyValuePair in strs)
CustomJsonValueProviderFactory.AddToBackingStore(backingStore, CustomJsonValueProviderFactory.MakePropertyKey(prefix, keyValuePair.Key), keyValuePair.Value);
return;
}
IList lists = value as IList;
if (lists == null)
{
backingStore.Add(prefix, value);
return;
}
for (int i = 0; i < lists.Count; i++)
{
CustomJsonValueProviderFactory.AddToBackingStore(backingStore, CustomJsonValueProviderFactory.MakeArrayKey(prefix, i), lists[i]);
}
}
private class EntryLimitedDictionary
{
private static int _maximumDepth;
private readonly IDictionary<string, object> _innerDictionary;
private int _itemCount;
static EntryLimitedDictionary()
{
_maximumDepth = CustomJsonValueProviderFactory.GetMaximumDepth();
}
public EntryLimitedDictionary(IDictionary<string, object> innerDictionary)
{
this._innerDictionary = innerDictionary;
}
public void Add(string key, object value)
{
int num = this._itemCount + 1;
this._itemCount = num;
if (num > _maximumDepth)
{
throw new InvalidOperationException("The length of the string exceeds the value set on the maxJsonLength property.");
}
this._innerDictionary.Add(key, value);
}
}
private static string MakeArrayKey(string prefix, int index)
{
return string.Concat(prefix, "[", index.ToString(CultureInfo.InvariantCulture), "]");
}
private static string MakePropertyKey(string prefix, string propertyName)
{
if (string.IsNullOrEmpty(prefix))
{
return propertyName;
}
return string.Concat(prefix, ".", propertyName);
}
private static int GetMaximumDepth()
{
int num;
NameValueCollection appSettings = ConfigurationManager.AppSettings;
if (appSettings != null)
{
string[] values = appSettings.GetValues("aspnet:MaxJsonDeserializerMembers");
if (values != null && values.Length != 0 && int.TryParse(values[0], out num))
{
return num;
}
}
return 1000;
}
private static int GetMaxJsonLength()
{
int num;
NameValueCollection appSettings = ConfigurationManager.AppSettings;
if (appSettings != null)
{
string[] values = appSettings.GetValues("aspnet:MaxJsonLength");
if (values != null && values.Length != 0 && int.TryParse(values[0], out num))
{
return num;
}
}
return 1000;
}
}
For those who are having issues in MVC3 with JSON that's automatically deserialized by the model binder and is too large, here is a solution.
Copy the code for the JsonValueProviderFactory class from the MVC3 source code into a new class.
Add a line to change the maximum JSON length before the object is deserialized.
Replace the JsonValueProviderFactory class with your new, modified class.
Thanks to http://blog.naver.com/techshare/100145191355 and https://gist.github.com/DalSoft/1588818 for pointing me in the right direction for how to do this. The last link on the first site contains full source code for the solution.
The real question is whether you actually need to return 17,000 records. How are you planning to handle all that data in the browser? Users are not going to scroll through 17,000 rows anyway.
A better approach is to retrieve only a "top few" records and load more as required.
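For example, a rough sketch of that idea for an autocomplete action (the repository and field names here are hypothetical, not from the question):
public JsonResult Autocomplete(string term)
{
    // Return only the first 20 matches for the typed prefix instead of the whole table.
    var matches = _repository.Records
        .Where(r => r.Name.StartsWith(term))
        .OrderBy(r => r.Name)
        .Take(20)
        .Select(r => r.Name)
        .ToList();

    return Json(matches, JsonRequestBehavior.AllowGet);
}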
You can set it in the config as others have said, or you can set it on an individual instance of the serializer like:
var js = new JavaScriptSerializer() { MaxJsonLength = int.MaxValue };
JsonResult result = Json(r);
result.MaxJsonLength = Int32.MaxValue;
result.JsonRequestBehavior = JsonRequestBehavior.AllowGet;
return result;
It appears that there is no "unlimited" value. The default is 2097152 characters, which is equivalent to 4 MB of Unicode string data.
As has already been observed, 17,000 records are hard to use well in the browser. If you are presenting an aggregate view, it may be much more efficient to do the aggregation on the server and transfer only a summary to the browser. For example, consider a file system browser: we only see the top of the tree, then issue further requests as we drill down. The number of records returned in each request is comparatively small. A tree view presentation can work well for large result sets.
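A sketch of the server-side aggregation idea (the repository and property names are illustrative only):
public JsonResult CategorySummary()
{
    // Send a per-category summary instead of thousands of raw rows.
    var summary = _repository.Records
        .GroupBy(r => r.Category)
        .Select(g => new { Category = g.Key, Count = g.Count() })
        .ToList();

    return Json(summary, JsonRequestBehavior.AllowGet);
}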
I just ran into this. I'm getting over 6,000 records, so I decided to do some paging: I accept a page number in my MVC JsonResult endpoint, defaulted to 0 so it's not required, like so:
public JsonResult MyObjects(int pageNumber = 0)
Then instead of saying:
return Json(_repository.MyObjects.ToList(), JsonRequestBehavior.AllowGet);
I say:
return Json(_repository.MyObjects.OrderBy(obj => obj.ID).Skip(1000 * pageNumber).Take(1000).ToList(), JsonRequestBehavior.AllowGet);
It's very simple. Then, in JavaScript, instead of this:
function myAJAXCallback(items) {
// Do stuff here
}
I instead say:
var pageNumber = 0;
function myAJAXCallback(items) {
    if (items.length == 1000) {
        // Call the same endpoint again, appending '?pageNumber=' + ++pageNumber
    }
    // Do stuff here
}
And append your records to whatever you were doing with them in the first place. Or just wait until all the calls finish and cobble the results together.
I solved the problem by adding this code:
String confString = HttpContext.Current.Request.ApplicationPath.ToString();
Configuration conf = WebConfigurationManager.OpenWebConfiguration(confString);
ScriptingJsonSerializationSection section = (ScriptingJsonSerializationSection)conf.GetSection("system.web.extensions/scripting/webServices/jsonSerialization");
section.MaxJsonLength = 6553600;
conf.Save();
Solution for WebForms UpdatePanel:
Add a setting to Web.config:
<configuration>
<appSettings>
<add key="aspnet:UpdatePanelMaxScriptLength" value="2147483647" />
</appSettings>
</configuration>
https://support.microsoft.com/en-us/kb/981884
The ScriptRegistrationManager class contains the following code:
// Serialize the attributes to JSON and write them out
JavaScriptSerializer serializer = new JavaScriptSerializer();
// Dev10# 877767 - Allow configurable UpdatePanel script block length
// The default is JavaScriptSerializer.DefaultMaxJsonLength
if (AppSettings.UpdatePanelMaxScriptLength > 0) {
serializer.MaxJsonLength = AppSettings.UpdatePanelMaxScriptLength;
}
string attrText = serializer.Serialize(attrs);
You don't need any server-side code changes; you can fix this by modifying only the web.config file.
This helped for me; try it out:
<appSettings>
<add key="aspnet:MaxJsonDeserializerMembers" value="2147483647" />
<add key="aspnet:UpdatePanelMaxScriptLength" value="2147483647" />
</appSettings>
and
<system.web.extensions>
<scripting>
<webServices>
<jsonSerialization maxJsonLength="2147483647"/>
</webServices>
</scripting>
</system.web.extensions>
I used this and it worked for a Kendo grid read request:
{
//something
var result = XResult.ToList().ToDataSourceResult(request);
var rs = Json(result, JsonRequestBehavior.AllowGet);
rs.MaxJsonLength = int.MaxValue;
return rs;
}
Use lib\Newtonsoft.Json.dll:
public string serializeObj(dynamic json) {
return JsonConvert.SerializeObject(json);
}
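If you take this route in MVC, one way to return the Json.NET output is via a ContentResult, which bypasses JavaScriptSerializer and its maxJsonLength limit entirely (a sketch; GetBigObject is a hypothetical data source):
using Newtonsoft.Json;

public ContentResult GetBigJson()
{
    var data = GetBigObject();
    return Content(JsonConvert.SerializeObject(data), "application/json");
}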
If this maxJsonLength value is an int, how big is that int: 16-bit, 32-bit, 64-bit? I just want to be sure what the maximum value is that I can set as my maxJsonLength.
<scripting>
<webServices>
<jsonSerialization maxJsonLength="2147483647">
</jsonSerialization>
</webServices>
</scripting>
You do not need to change web.config.
You can use short property names in the model that receives the values of the list you pass, which keeps the serialized JSON smaller.
For example, declare a model like:
public class BookModel
{
public decimal id { get; set; } // 1
public string BN { get; set; } // 2 Book Name
public string BC { get; set; } // 3 Bar Code Number
public string BE { get; set; } // 4 Edition Name
public string BAL { get; set; } // 5 Academic Level
public string BCAT { get; set; } // 6 Category
}
Here I use short property names, like BC = barcode, BE = book edition, and so on.
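A possible projection into that compact model (the entity and context names are assumptions, not from the answer):
public JsonResult GetBooks()
{
    // Project only the short-named fields so the serialized JSON stays small.
    var books = db.Books
        .Select(b => new BookModel
        {
            id = b.Id,
            BN = b.Name,
            BC = b.Barcode,
            BE = b.Edition,
            BAL = b.AcademicLevel,
            BCAT = b.Category
        })
        .ToList();

    return Json(books, JsonRequestBehavior.AllowGet);
}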
I want to put a MySQL result set into a JsonArray using the Gson library. How can I best achieve this? I've read this:
resultset to json using gson
But it uses, for some reason, the simple-json library in addition. I don't want that if possible. Is there any way to achieve this easily with the Gson library?
Thank you very much!
PlayerList.java:
/*
* To change this license header, choose License Headers in Project Properties.
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
package de.freakyonline.ucone;
import de.freakyonline.ucone.Player;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.net.Socket;
import java.util.ArrayList;
import java.util.Iterator;
import javafx.application.Platform;
import javafx.collections.FXCollections;
import javafx.collections.ObservableList;
import javafx.scene.control.TextArea;
/**
*
* @author uwe
*/
public class PlayerList {
ObservableList<Player> playerList;
ObjectInputStream in;
ObjectOutputStream out;
Socket sock;
private Object obj = null;
private Object obj2 = null;
TextArea consoleOneTextArea;
public PlayerList(ObjectInputStream in, ObjectOutputStream out, Socket sock, TextArea consoleOneTextArea) {
this.in = in;
this.out = out;
this.sock = sock;
this.consoleOneTextArea = consoleOneTextArea;
getPlayersFromServer();
}
private void getPlayersFromServer() {
/* try {
out.writeObject("getplayers");
obj=in.readObject();
if(obj == null) {
System.out.println("ERROR! void getPlayersFromServer in PlayerList.java");
Platform.exit();
}
String command = obj.toString().toLowerCase();
String currentFromServer;
if(command.equalsIgnoreCase("getplayers")) {
while((obj2=in.readObject()) != null) {
currentFromServer = obj2.toString().toLowerCase();
for(String cell : currentFromServer.split(" ")) {
System.out.println(cell.toString());
}
if (currentFromServer.equalsIgnoreCase("done")) {
consoleOneTextArea.appendText("This is finished. Have fun!\n");
break;
}
consoleOneTextArea.appendText(currentFromServer + "\n");
}
} { System.out.println("ERROR"); }
} catch (Exception ex) { ex.printStackTrace(); }
*/
this.playerList = FXCollections.observableArrayList(
new Player("freakyy85","Owner","1810",31,"m", "missing..."),
new Player("Ky3ak","Owner","1920",34,"m", "missing...")
);
}
}
(I've commented out some parts, as they are no longer relevant.)
Player.java:
/*
* To change this license header, choose License Headers in Project Properties.
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
package de.freakyonline.ucone;
import com.google.gson.stream.JsonReader;
import java.io.InputStreamReader;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.net.Socket;
import javafx.scene.control.TextArea;
/**
*
* @author uwe
*/
public class Remote implements Runnable {
private Object obj = null;
private Object obj2 = null;
private ObjectInputStream in;
private ObjectOutputStream out;
private Socket sock;
private TextArea consoleOneTextArea;
public Remote (ObjectInputStream in, ObjectOutputStream out, Socket sock) {
this.in = in;
this.out = out;
this.sock = sock;
}
public ObjectInputStream getIn() {
return in;
}
public ObjectOutputStream getOut() {
return out;
}
public Socket getSock() {
return sock;
}
public void setConsoleOneTextArea(TextArea consoleOneTextArea) {
this.consoleOneTextArea = consoleOneTextArea;
}
public void run() {
try {
while((obj=in.readObject()) != null && sock.isConnected()) {
String command = obj.toString().toLowerCase();
String currentFromServer;
switch(command) {
case "getplayers":
/* while((obj2=in.readObject()) != null) {
currentFromServer = obj2.toString().toLowerCase();
if (currentFromServer.equalsIgnoreCase("done")) {
consoleOneTextArea.appendText("This is finished. Have fun!\n");
break;
}
consoleOneTextArea.appendText(currentFromServer + "\n");
*/ }
JsonReader jsonReader = new JsonReader(new InputStreamReader(in, "UTF-8"));
jsonReader.close();
break;
}
} catch (Exception ex) { ex.printStackTrace(); }
}
}
Is there any way to achieve this easily with the gson library?
Not really. Gson and JDBC are two unrelated things, so you have to implement a custom remapping function to "decode" JDBC result set rows/fields and "encode" them back to a JSON array/objects respectively. Accumulating a JsonArray instance may be expensive from the memory-consumption point of view, or may even crash the application with an OutOfMemoryError for huge result sets. Nonetheless it is fine if the result sets are known to be small or LIMITed.
Accumulating JsonArray
static JsonArray resultSetToJsonArray(final ResultSet resultSet)
throws SQLException {
final ResultSetMetaData metaData = resultSet.getMetaData();
// JsonArray is a Gson built-in class to hold JSON arrays
final JsonArray jsonArray = new JsonArray();
while ( resultSet.next() ) {
jsonArray.add(resultSetRowToJsonObject(resultSet, metaData));
}
return jsonArray;
}
private static JsonElement resultSetRowToJsonObject(final ResultSet resultSet, final ResultSetMetaData metaData)
throws SQLException {
final int columnCount = metaData.getColumnCount();
// Every result set row is a JsonObject equivalent
final JsonObject jsonObject = new JsonObject();
// JDBC uses 1-based loops
for ( int i = 1; i <= columnCount; i++ ) {
jsonObject.add(metaData.getColumnName(i), fieldToJsonElement(resultSet, metaData, i));
}
return jsonObject;
}
private static JsonElement fieldToJsonElement(final ResultSet resultSet, final ResultSetMetaData metaData, final int column)
throws SQLException {
final int columnType = metaData.getColumnType(column);
final Optional<JsonElement> jsonElement;
// Process each SQL type mapping a value to a JSON tree equivalent
switch ( columnType ) {
case Types.BIT:
case Types.TINYINT:
case Types.SMALLINT:
throw new UnsupportedOperationException("TODO: " + JDBCType.valueOf(columnType));
case Types.INTEGER:
// resultSet.getInt() returns 0 in case of null, so it must be extracted with getObject and cast, then converted to a JsonPrimitive
jsonElement = Optional.ofNullable((Integer) resultSet.getObject(column)).map(JsonPrimitive::new);
break;
case Types.BIGINT:
case Types.FLOAT:
case Types.REAL:
case Types.DOUBLE:
case Types.NUMERIC:
case Types.DECIMAL:
case Types.CHAR:
throw new UnsupportedOperationException("TODO: " + JDBCType.valueOf(columnType));
case Types.VARCHAR:
jsonElement = Optional.ofNullable(resultSet.getString(column)).map(JsonPrimitive::new);
break;
case Types.LONGVARCHAR:
case Types.DATE:
case Types.TIME:
case Types.TIMESTAMP:
case Types.BINARY:
case Types.VARBINARY:
case Types.LONGVARBINARY:
case Types.NULL:
case Types.OTHER:
case Types.JAVA_OBJECT:
case Types.DISTINCT:
case Types.STRUCT:
case Types.ARRAY:
case Types.BLOB:
case Types.CLOB:
case Types.REF:
case Types.DATALINK:
case Types.BOOLEAN:
case Types.ROWID:
case Types.NCHAR:
case Types.NVARCHAR:
case Types.LONGNVARCHAR:
case Types.NCLOB:
case Types.SQLXML:
case Types.REF_CURSOR:
case Types.TIME_WITH_TIMEZONE:
case Types.TIMESTAMP_WITH_TIMEZONE:
throw new UnsupportedOperationException("TODO: " + JDBCType.valueOf(columnType));
default:
throw new UnsupportedOperationException("Unknown type: " + columnType);
}
// If the optional value is missing, assume it's a null
return jsonElement.orElse(JsonNull.INSTANCE);
}
final JsonArray jsonArray = resultSetToJsonArray(resultSet);
System.out.println(jsonArray);
Don't forget to close the resultSet, of course.
JSON streaming
If the JsonArray is supposed to be written elsewhere, JsonWriter can be a better solution being able to process huge result sets reading row by row and writing JSON element by JSON element.
@SuppressWarnings("resource")
static void resultSetToJsonArrayStream(final ResultSet resultSet, final JsonWriter jsonWriter)
throws SQLException, IOException {
// Write the [ token
jsonWriter.beginArray();
final ResultSetMetaData metaData = resultSet.getMetaData();
while ( resultSet.next() ) {
// Write row by row
writeRow(resultSet, jsonWriter, metaData);
}
// Finish the array with the ] token
jsonWriter.endArray();
}
@SuppressWarnings("resource")
private static void writeRow(final ResultSet resultSet, final JsonWriter jsonWriter, final ResultSetMetaData metaData)
throws SQLException, IOException {
final int columnCount = metaData.getColumnCount();
// Similarly to the outer array: the { token starts a new object representing a row
jsonWriter.beginObject();
for ( int i = 1; i <= columnCount; i++ ) {
// Write the column name and try to resolve a JSON literal to be written
jsonWriter.name(metaData.getColumnName(i));
writeField(resultSet, jsonWriter, metaData, i);
}
// Terminate the object with }
jsonWriter.endObject();
}
@SuppressWarnings("resource")
private static void writeField(final ResultSet resultSet, final JsonWriter jsonWriter, final ResultSetMetaData metaData, final int column)
throws SQLException, IOException {
final int columnType = metaData.getColumnType(column);
switch ( columnType ) {
case Types.BIT:
case Types.TINYINT:
case Types.SMALLINT:
throw new UnsupportedOperationException("TODO: " + JDBCType.valueOf(columnType));
case Types.INTEGER:
jsonWriter.value((Integer) resultSet.getObject(column));
break;
case Types.BIGINT:
case Types.FLOAT:
case Types.REAL:
case Types.DOUBLE:
case Types.NUMERIC:
case Types.DECIMAL:
case Types.CHAR:
throw new UnsupportedOperationException("TODO: " + JDBCType.valueOf(columnType));
case Types.VARCHAR:
jsonWriter.value((String) resultSet.getObject(column));
break;
case Types.LONGVARCHAR:
case Types.DATE:
case Types.TIME:
case Types.TIMESTAMP:
case Types.BINARY:
case Types.VARBINARY:
case Types.LONGVARBINARY:
case Types.NULL:
case Types.OTHER:
case Types.JAVA_OBJECT:
case Types.DISTINCT:
case Types.STRUCT:
case Types.ARRAY:
case Types.BLOB:
case Types.CLOB:
case Types.REF:
case Types.DATALINK:
case Types.BOOLEAN:
case Types.ROWID:
case Types.NCHAR:
case Types.NVARCHAR:
case Types.LONGNVARCHAR:
case Types.NCLOB:
case Types.SQLXML:
case Types.REF_CURSOR:
case Types.TIME_WITH_TIMEZONE:
case Types.TIMESTAMP_WITH_TIMEZONE:
throw new UnsupportedOperationException("TODO: " + JDBCType.valueOf(columnType));
default:
throw new UnsupportedOperationException("Unknown type: " + columnType);
}
}
Example of writing to System.out, but, of course, it can be written anywhere just supplying an appropriate OutputStream instance:
final JsonWriter jsonWriter = new JsonWriter(new OutputStreamWriter(System.out));
resultSetToJsonArrayStream(resultSet, jsonWriter);
Similarly to ResultSet, JsonWriter must be closed as well.
I've written the above code for SQLite, but it should work for MySQL too. For example, a test database created and populated with the following SQL statements:
CREATE TABLE `table` (i NUMBER NOT NULL, s TEXT NOT NULL);
INSERT INTO `table` (i, s) VALUES (1, 'foo');
INSERT INTO `table` (i, s) VALUES (2, 'bar');
INSERT INTO `table` (i, s) VALUES (3, 'baz');
will result in
[{"i":1,"s":"foo"},{"i":2,"s":"bar"},{"i":3,"s":"baz"}]
for both object model and streaming approaches.
How should one deal with Gson and required versus optional fields?
Since all fields are optional, I can't really fail my network request based on whether the response JSON contains some key; Gson will simply parse it to null.
The method I am using is gson.fromJson(json, mClassOfT);
For example if I have following json:
{"user_id":128591, "user_name":"TestUser"}
And my class:
public class User {
#SerializedName("user_id")
private String mId;
#SerializedName("user_name")
private String mName;
public String getId() {
return mId;
}
public void setId(String id) {
mId = id;
}
public String getName() {
return mName;
}
public void setName(String name) {
mName = name;
}
}
Is there any option to get Gson to fail if the JSON does not contain the user_id or user_name key?
There can be many cases where you need at least some values to be parsed while others can be optional.
Is there any pattern or library that can be used to handle this case globally?
Thanks.
As you note, Gson has no facility to define a "required field" and you'll just get null in your deserialized object if something is missing in the JSON.
Here's a re-usable deserializer and annotation that will do this. The limitation is that if the POJO requires a custom deserializer as-is, you'd have to go a little further and either pass a Gson object into the constructor to deserialize the object itself, or move the annotation checking out into a separate method and use it in your deserializer. You could also improve on the exception handling by creating your own exception and passing it to the JsonParseException so it can be detected via getCause() in the caller.
That all said, in the vast majority of cases, this will work:
public class App
{
public static void main(String[] args)
{
Gson gson =
new GsonBuilder()
.registerTypeAdapter(TestAnnotationBean.class, new AnnotatedDeserializer<TestAnnotationBean>())
.create();
String json = "{\"foo\":\"This is foo\",\"bar\":\"this is bar\"}";
TestAnnotationBean tab = gson.fromJson(json, TestAnnotationBean.class);
System.out.println(tab.foo);
System.out.println(tab.bar);
json = "{\"foo\":\"This is foo\"}";
tab = gson.fromJson(json, TestAnnotationBean.class);
System.out.println(tab.foo);
System.out.println(tab.bar);
json = "{\"bar\":\"This is bar\"}";
tab = gson.fromJson(json, TestAnnotationBean.class);
System.out.println(tab.foo);
System.out.println(tab.bar);
}
}
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface JsonRequired
{
}
class TestAnnotationBean
{
@JsonRequired public String foo;
public String bar;
}
class AnnotatedDeserializer<T> implements JsonDeserializer<T>
{
public T deserialize(JsonElement je, Type type, JsonDeserializationContext jdc) throws JsonParseException
{
T pojo = new Gson().fromJson(je, type);
Field[] fields = pojo.getClass().getDeclaredFields();
for (Field f : fields)
{
if (f.getAnnotation(JsonRequired.class) != null)
{
try
{
f.setAccessible(true);
if (f.get(pojo) == null)
{
throw new JsonParseException("Missing field in JSON: " + f.getName());
}
}
catch (IllegalArgumentException ex)
{
Logger.getLogger(AnnotatedDeserializer.class.getName()).log(Level.SEVERE, null, ex);
}
catch (IllegalAccessException ex)
{
Logger.getLogger(AnnotatedDeserializer.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
return pojo;
}
}
Output:
This is foo
this is bar
This is foo
null
Exception in thread "main" com.google.gson.JsonParseException: Missing field in JSON: foo
Brian Roach's answer is very good, but sometimes it's also necessary to handle:
properties of the model's super class
properties inside arrays
For these purposes the following class can be used:
/**
* Adds the feature to use required fields in models.
*
* @param <T> Model to parse to.
*/
public class JsonDeserializerWithOptions<T> implements JsonDeserializer<T> {
/**
* To mark required fields of the model:
* json parsing will be failed if these fields won't be provided.
* */
@Retention(RetentionPolicy.RUNTIME) // to make reading of this field possible at the runtime
@Target(ElementType.FIELD) // to make annotation accessible through reflection
public @interface FieldRequired {}
/**
* Called when the model is being parsed.
*
* @param je Source json string.
* @param type Object's model.
* @param jdc Unused in this case.
*
* @return Parsed object.
*
* @throws JsonParseException When parsing is impossible.
* */
@Override
public T deserialize(JsonElement je, Type type, JsonDeserializationContext jdc)
throws JsonParseException {
// Parsing object as usual.
T pojo = new Gson().fromJson(je, type);
// Getting all fields of the class and checking if all required ones were provided.
checkRequiredFields(pojo.getClass().getDeclaredFields(), pojo);
// Checking if all required fields of parent classes were provided.
checkSuperClasses(pojo);
// All checks are ok.
return pojo;
}
/**
* Checks whether all required fields were provided in the class.
*
* @param fields Fields to be checked.
* @param pojo Instance to check fields in.
*
* @throws JsonParseException When some required field was not met.
* */
private void checkRequiredFields(@NonNull Field[] fields, @NonNull Object pojo)
throws JsonParseException {
// Checking nested list items too.
if (pojo instanceof List) {
final List pojoList = (List) pojo;
for (final Object pojoListPojo : pojoList) {
checkRequiredFields(pojoListPojo.getClass().getDeclaredFields(), pojoListPojo);
checkSuperClasses(pojoListPojo);
}
}
for (Field f : fields) {
// If some field has required annotation.
if (f.getAnnotation(FieldRequired.class) != null) {
try {
// Trying to read this field's value and check that it truly has value.
f.setAccessible(true);
Object fieldObject = f.get(pojo);
if (fieldObject == null) {
// Required value is null - throwing error.
throw new JsonParseException(String.format("%1$s -> %2$s",
pojo.getClass().getSimpleName(),
f.getName()));
} else {
checkRequiredFields(fieldObject.getClass().getDeclaredFields(), fieldObject);
checkSuperClasses(fieldObject);
}
}
// Exceptions while reflection.
catch (IllegalArgumentException | IllegalAccessException e) {
throw new JsonParseException(e);
}
}
}
}
/**
* Checks whether all super classes have all required fields.
*
* @param pojo Object to check required fields in its superclasses.
*
* @throws JsonParseException When some required field was not met.
* */
private void checkSuperClasses(@NonNull Object pojo) throws JsonParseException {
Class<?> superclass = pojo.getClass();
while ((superclass = superclass.getSuperclass()) != null) {
checkRequiredFields(superclass.getDeclaredFields(), pojo);
}
}
}
First of all, the interface (annotation) used to mark required fields is described; we'll see an example of its usage later:
/**
* To mark required fields of the model:
* json parsing will be failed if these fields won't be provided.
* */
@Retention(RetentionPolicy.RUNTIME) // to make reading of this field possible at the runtime
@Target(ElementType.FIELD) // to make annotation accessible through reflection
public @interface FieldRequired {}
Then the deserialize method is implemented. It parses JSON strings as usual: missing properties in the resulting pojo will have null values:
T pojo = new Gson().fromJson(je, type);
Then the recursive check of all fields of the parsed pojo is being launched:
checkRequiredFields(pojo.getClass().getDeclaredFields(), pojo);
Then we also check all fields of pojo's super classes:
checkSuperClasses(pojo);
This is required when some SimpleModel extends SimpleParentModel and we want to make sure that the required properties declared in SimpleParentModel are provided along with SimpleModel's own.
Let's take a look at the checkRequiredFields method. First of all, it checks whether some property is an instance of List (a JSON array); in that case all objects of the list are also checked to make sure that they have all required fields provided too:
if (pojo instanceof List) {
final List pojoList = (List) pojo;
for (final Object pojoListPojo : pojoList) {
checkRequiredFields(pojoListPojo.getClass().getDeclaredFields(), pojoListPojo);
checkSuperClasses(pojoListPojo);
}
}
Then we iterate through all fields of the pojo, checking that all fields with the FieldRequired annotation are provided (which means these fields are not null). If we encounter a required property which is null, an exception is thrown. Otherwise another recursive validation step is launched for the current field, and the properties of the field's parent classes are checked too:
for (Field f : fields) {
// If some field has required annotation.
if (f.getAnnotation(FieldRequired.class) != null) {
try {
// Trying to read this field's value and check that it truly has value.
f.setAccessible(true);
Object fieldObject = f.get(pojo);
if (fieldObject == null) {
// Required value is null - throwing error.
throw new JsonParseException(String.format("%1$s -> %2$s",
pojo.getClass().getSimpleName(),
f.getName()));
} else {
checkRequiredFields(fieldObject.getClass().getDeclaredFields(), fieldObject);
checkSuperClasses(fieldObject);
}
}
// Exceptions while reflection.
catch (IllegalArgumentException | IllegalAccessException e) {
throw new JsonParseException(e);
}
}
}
And the last method to be reviewed is checkSuperClasses: it just runs a similar required-fields validation, checking the properties of the pojo's super classes:
Class<?> superclass = pojo.getClass();
while ((superclass = superclass.getSuperclass()) != null) {
checkRequiredFields(superclass.getDeclaredFields(), pojo);
}
And finally, let's review an example of this JsonDeserializerWithOptions's usage. Assume we have the following models:
private class SimpleModel extends SimpleParentModel {
@JsonDeserializerWithOptions.FieldRequired Long id;
@JsonDeserializerWithOptions.FieldRequired NestedModel nested;
@JsonDeserializerWithOptions.FieldRequired ArrayList<ListModel> list;
}
private class SimpleParentModel {
@JsonDeserializerWithOptions.FieldRequired Integer rev;
}
private class NestedModel extends NestedParentModel {
@JsonDeserializerWithOptions.FieldRequired Long id;
}
private class NestedParentModel {
@JsonDeserializerWithOptions.FieldRequired Integer rev;
}
private class ListModel {
@JsonDeserializerWithOptions.FieldRequired Long id;
}
We can be sure that SimpleModel will be parsed correctly without exceptions in this way:
final Gson gson = new GsonBuilder()
.registerTypeAdapter(SimpleModel.class, new JsonDeserializerWithOptions<SimpleModel>())
.create();
gson.fromJson("{\"list\":[ { \"id\":1 } ], \"id\":1, \"rev\":22, \"nested\": { \"id\":2, \"rev\":2 }}", SimpleModel.class);
Of course, the provided solution can be improved to accept more features: for example, validation of nested objects which are not marked with the FieldRequired annotation. Currently that's out of the answer's scope, but it can be added later.
(Inspired by Brian Roach's answer.)
It seems that Brian's answer doesn't work for primitives because the values can be initialized as something other than null (e.g. 0).
Moreover, it seems like the deserializer would have to be registered for every type. A more scalable solution uses TypeAdapterFactory (as below).
In certain circumstances, it is safer to whitelist exceptions from required fields (i.e. as JsonOptional fields) rather than annotating all fields as required.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface JsonOptional {
}
Though this approach can easily be adapted for required fields instead.
import com.google.gson.Gson;
import com.google.gson.JsonElement;
import com.google.gson.JsonParseException;
import com.google.gson.TypeAdapter;
import com.google.gson.TypeAdapterFactory;
import com.google.gson.internal.Streams;
import com.google.gson.reflect.TypeToken;
import com.google.gson.stream.JsonReader;
import com.google.gson.stream.JsonWriter;
import java.io.IOException;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Set;
import java.util.stream.Collectors;
import java.util.stream.Stream;
public class AnnotatedTypeAdapterFactory implements TypeAdapterFactory {
@Override
public <T> TypeAdapter<T> create(Gson gson, TypeToken<T> typeToken) {
Class<? super T> rawType = typeToken.getRawType();
Set<Field> requiredFields = Stream.of(rawType.getDeclaredFields())
.filter(f -> f.getAnnotation(JsonOptional.class) == null)
.collect(Collectors.toSet());
if (requiredFields.isEmpty()) {
return null;
}
final TypeAdapter<T> baseAdapter = (TypeAdapter<T>) gson.getAdapter(rawType);
return new TypeAdapter<T>() {
@Override
public void write(JsonWriter jsonWriter, T o) throws IOException {
baseAdapter.write(jsonWriter, o);
}
@Override
public T read(JsonReader in) throws IOException {
JsonElement jsonElement = Streams.parse(in);
if (jsonElement.isJsonObject()) {
ArrayList<String> missingFields = new ArrayList<>();
for (Field field : requiredFields) {
if (!jsonElement.getAsJsonObject().has(field.getName())) {
missingFields.add(field.getName());
}
}
if (!missingFields.isEmpty()) {
throw new JsonParseException(
String.format("Missing required fields %s for %s",
missingFields, rawType.getName()));
}
}
TypeAdapter<T> delegate = gson.getDelegateAdapter(AnnotatedTypeAdapterFactory.this, typeToken);
return delegate.fromJsonTree(jsonElement);
}
};
}
}
This is my simple approach that provides a generic solution with minimal coding.
Create an @Optional annotation.
Mark the first optional field with @Optional. The rest are assumed optional; the earlier fields are assumed required.
Create a generic 'loader' method that checks that the source JSON object has a value for each field. The loop stops once an @Optional field is encountered.
I am using subclassing, so the grunt work is done in the superclass.
Here is the superclass code.
import com.google.gson.Gson;
import java.lang.reflect.Field;
import java.lang.annotation.Annotation;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
...
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface Optional {
public boolean enabled() default true;
}
and the grunt work method
@SuppressWarnings("unchecked")
public <T> T payload(JsonObject oJR,Class<T> T) throws Exception {
StringBuilder oSB = new StringBuilder();
String sSep = "";
Object o = gson.fromJson(oJR,T);
// Ensure all fields are populated until we reach #Optional
Field[] oFlds = T.getDeclaredFields();
for(Field oFld:oFlds) {
Annotation oAnno = oFld.getAnnotation(Optional.class);
if (oAnno != null) break;
if (!oJR.has(oFld.getName())) {
oSB.append(sSep+oFld.getName());
sSep = ",";
}
}
if (oSB.length() > 0) throw CVT.e("Required fields "+oSB+" missing");
return (T)o;
}
and an example of usage
public static class Payload {
String sUserType ;
String sUserID ;
String sSecpw ;
@Optional
String sUserDev ;
String sUserMark ;
}
and the populating code
Payload oPL = payload(oJR,Payload.class);
In this case sUserDev and sUserMark are optional and the rest are required. The solution relies on the fact that the class stores the Field definitions in the declared order.
I searched a lot and found no good answer. The solution I chose is as follows:
Every field that I need to set from JSON is an object, i.e. boxed Integer, Boolean, etc. Then, using reflection, I can check that the field is not null:
public class CJSONSerializable {
public void checkDeserialization() throws IllegalAccessException, JsonParseException {
for (Field f : getClass().getDeclaredFields()) {
if (f.get(this) == null) {
throw new JsonParseException("Field " + f.getName() + " was not initialized.");
}
}
}
}
From this class, I can derive my JSON object:
public class CJSONResp extends CJSONSerializable {
#SerializedName("Status")
public String status;
#SerializedName("Content-Type")
public String contentType;
}
and then, after parsing with Gson, I can call checkDeserialization and it will report if any of the fields are null.