How to insert an uploaded image into a varbinary(max) database column - sql-server-2008

I want to process a form submit and save a JPG image into a varbinary SQL column.
I have code, but it does not work properly: it saves only empty bytes like 0x00...0000. No errors are raised and the database row is inserted successfully, but the varbinary column looks corrupted to me.
The code is as follows.
Models
public class FrontendModel
{
    public HttpPostedFileBase Photo1 { get; set; }
}

public class SubmitModel
{
    public byte[] ImageData { get; set; }
    public decimal ImageSizeB { get; set; }

    public SubmitModel(HttpPostedFileBase Photo)
    {
        this.ImageData = new byte[Photo.ContentLength];
        Photo.InputStream.Read(ImageData, 0, Convert.ToInt32(Photo.ContentLength));
        this.ImageSizeB = Photo.ContentLength;
    }
}
Controller
[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult Index(FrontendModel m)
{
    using (var db = new ABC.Models.ABCDBContext())
    {
        using (var scope = new TransactionScope())
        {
            if (m.Photo1 != null && m.Photo1.ContentLength > 0)
                db.InsertSubmit(new SubmitModel(m.Photo1));
            scope.Complete();
        }
    }
    return View(new FrontendModel());
}
DB Insert Function
public void InsertSubmit(SubmitModel m)
{
    Database.ExecuteSqlCommand(
        "spInsertSubmit @p1",
        new SqlParameter("p1", m.ImageData));
}
SQL DB Procedure
CREATE PROCEDURE [dbo].[spInsertSubmit]
    @ImageData VARBINARY(max)
AS
INSERT INTO dbo.Images (Image)
VALUES (@ImageData)
What am I doing wrong? Thank you.
PS:
I also tried something like this, but it behaves the same:
using (var binaryReader = new BinaryReader(Photo.InputStream))
{
    this.ImageData = binaryReader.ReadBytes(Photo.ContentLength);
}
Then I tried:
using (Stream inputStream = Photo.InputStream)
{
    MemoryStream memoryStream = inputStream as MemoryStream;
    if (memoryStream == null)
    {
        memoryStream = new MemoryStream();
        inputStream.CopyTo(memoryStream);
    }
    ImageData = memoryStream.ToArray();
}
but an error occurs when calling the DB function, with the message "Parameter is not valid".
I have the same problem as mentioned here:
File uploading and saving to database incorrectly
I found that when I assign the input stream to a memory stream, the memory stream is empty?!

Your procedure specifies that the parameter is called @ImageData, but your code
Database.ExecuteSqlCommand(
    "spInsertSubmit @p1",
    new SqlParameter("p1", m.ImageData));
seems to be passing a parameter called @p1.
Edit
Also I think that you have to specify the type explicitly when working with a byte array larger than 8k. See this link: Inserting a byte array larger than 8k bytes
Database.ExecuteSqlCommand(
    "spInsertSubmit @p1",
    new SqlParameter("@p1", SqlDbType.VarBinary) { Value = m.ImageData });
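Putting both fixes together (my sketch, not verbatim from the answer; passing -1 as the size is the standard ADO.NET way to say varbinary(max)):
Database.ExecuteSqlCommand(
    "spInsertSubmit @p1",
    // -1 size marks the parameter as varbinary(max) so arrays over 8k are not truncated
    new SqlParameter("@p1", SqlDbType.VarBinary, -1) { Value = m.ImageData });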

OK, I found a solution. This code works!
this.ImageData = new byte[streamLength];
Photo.InputStream.Position = 0;
Photo.InputStream.Read(this.ImageData, 0, this.ImageData.Length);
The line added is Photo.InputStream.Position = 0;
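For what it's worth, here is a slightly more defensive variant of the constructor body (a sketch of mine, not part of the accepted fix): it rewinds only when the stream supports seeking and copies to the end regardless of how many bytes a single Read call happens to return.
// Sketch: defensively copy the uploaded stream into the byte array.
// Rewind first, in case a model binder or validator already consumed the stream.
if (Photo.InputStream.CanSeek)
    Photo.InputStream.Position = 0;
using (var ms = new MemoryStream())
{
    Photo.InputStream.CopyTo(ms); // CopyTo loops until end-of-stream, unlike a single Read call
    this.ImageData = ms.ToArray();
}
this.ImageSizeB = this.ImageData.Length;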

Related

How to send very long json to Asp.NET MVC [duplicate]

I am using the autocomplete feature of jQuery. When I try to retrieve a list of more than 17,000 records (each no more than 10 characters long), the response exceeds the length limit and throws this error:
Exception information:
Exception type: InvalidOperationException
Exception message: Error during serialization or deserialization using the JSON JavaScriptSerializer. The length of the string exceeds the value set on the maxJsonLength property.
Can I set an unlimited length for maxJsonLength in web.config? If not, what is the maximum length I can set?
NOTE: this answer applies only to web services; if you are returning JSON from a controller method, make sure you read this SO answer as well: https://stackoverflow.com/a/7207539/1246870
The MaxJsonLength property cannot be unlimited; it is an integer property that defaults to 102400 (100k).
You can set the MaxJsonLength property on your web.config:
<configuration>
  <system.web.extensions>
    <scripting>
      <webServices>
        <jsonSerialization maxJsonLength="50000000"/>
      </webServices>
    </scripting>
  </system.web.extensions>
</configuration>
If you are using MVC 4, be sure to check out this answer as well.
If you are still receiving the error:
after setting the maxJsonLength property to its maximum value in web.config
and you know that your data's length is less than this value
and you are not utilizing a web service method for the JavaScript serialization
your problem is likely that:
The value of the MaxJsonLength property applies only to the internal JavaScriptSerializer instance that is used by the asynchronous communication layer to invoke Web services methods. (MSDN: ScriptingJsonSerializationSection.MaxJsonLength Property)
Basically, the "internal" JavaScriptSerializer respects the value of maxJsonLength when called from a web method; direct use of a JavaScriptSerializer (or use via an MVC action-method/Controller) does not respect the maxJsonLength property, at least not from the systemWebExtensions.scripting.webServices.jsonSerialization section of web.config. In particular, the Controller.Json() method does not respect the configuration setting!
As a workaround, you can do the following within your Controller (or anywhere really):
var serializer = new JavaScriptSerializer();
// For simplicity just use Int32's max value.
// You could always read the value from the config section mentioned above.
serializer.MaxJsonLength = Int32.MaxValue;
var resultData = new { Value = "foo", Text = "var" };
var result = new ContentResult
{
    Content = serializer.Serialize(resultData),
    ContentType = "application/json"
};
return result;
This answer is my interpretation of this asp.net forum answer.
In MVC 4 you can do:
protected override JsonResult Json(object data, string contentType, System.Text.Encoding contentEncoding, JsonRequestBehavior behavior)
{
    return new JsonResult()
    {
        Data = data,
        ContentType = contentType,
        ContentEncoding = contentEncoding,
        JsonRequestBehavior = behavior,
        MaxJsonLength = Int32.MaxValue
    };
}
in your controller.
Addition:
For anyone puzzled by the parameters you need to specify, a call could look like this:
Json(
    new {
        field1 = true,
        field2 = "value"
    },
    "application/json",
    Encoding.UTF8,
    JsonRequestBehavior.AllowGet
);
You can configure the max length for json requests in your web.config file:
<configuration>
  <system.web.extensions>
    <scripting>
      <webServices>
        <jsonSerialization maxJsonLength="...."></jsonSerialization>
      </webServices>
    </scripting>
  </system.web.extensions>
</configuration>
The default value for maxJsonLength is 102400. For more details, see this MSDN page: http://msdn.microsoft.com/en-us/library/bb763183.aspx
If you are still getting the error after a web.config setting like the following:
<configuration>
  <system.web.extensions>
    <scripting>
      <webServices>
        <jsonSerialization maxJsonLength="50000000"/>
      </webServices>
    </scripting>
  </system.web.extensions>
</configuration>
I solved it as follows:
public JsonResult getData() // or ActionResult
{
    var jsonResult = Json(superlargedata, JsonRequestBehavior.AllowGet);
    jsonResult.MaxJsonLength = int.MaxValue;
    return jsonResult;
}
I hope this helps.
I was having this problem in ASP.NET Web Forms. It was completely ignoring the web.config file settings so I did this:
JavaScriptSerializer serializer = new JavaScriptSerializer();
serializer.MaxJsonLength = Int32.MaxValue;
return serializer.Serialize(response);
Of course overall this is terrible practice. If you are sending this much data in a web service call you should look at a different approach.
I followed vestigal's answer and got to this solution:
When I needed to post a large json to an action in a controller, I would get the famous "Error during deserialization using the JSON JavaScriptSerializer. The length of the string exceeds the value set on the maxJsonLength property.\r\nParameter name: input value provider".
What I did was create a new ValueProviderFactory, LargeJsonValueProviderFactory, and set MaxJsonLength = Int32.MaxValue in the GetDeserializedObject method:
public sealed class LargeJsonValueProviderFactory : ValueProviderFactory
{
    private static void AddToBackingStore(LargeJsonValueProviderFactory.EntryLimitedDictionary backingStore, string prefix, object value)
    {
        IDictionary<string, object> dictionary = value as IDictionary<string, object>;
        if (dictionary != null)
        {
            foreach (KeyValuePair<string, object> keyValuePair in (IEnumerable<KeyValuePair<string, object>>) dictionary)
                LargeJsonValueProviderFactory.AddToBackingStore(backingStore, LargeJsonValueProviderFactory.MakePropertyKey(prefix, keyValuePair.Key), keyValuePair.Value);
        }
        else
        {
            IList list = value as IList;
            if (list != null)
            {
                for (int index = 0; index < list.Count; ++index)
                    LargeJsonValueProviderFactory.AddToBackingStore(backingStore, LargeJsonValueProviderFactory.MakeArrayKey(prefix, index), list[index]);
            }
            else
                backingStore.Add(prefix, value);
        }
    }

    private static object GetDeserializedObject(ControllerContext controllerContext)
    {
        if (!controllerContext.HttpContext.Request.ContentType.StartsWith("application/json", StringComparison.OrdinalIgnoreCase))
            return (object) null;
        string end = new StreamReader(controllerContext.HttpContext.Request.InputStream).ReadToEnd();
        if (string.IsNullOrEmpty(end))
            return (object) null;
        var serializer = new JavaScriptSerializer { MaxJsonLength = Int32.MaxValue };
        return serializer.DeserializeObject(end);
    }

    /// <summary>Returns a JSON value-provider object for the specified controller context.</summary>
    /// <returns>A JSON value-provider object for the specified controller context.</returns>
    /// <param name="controllerContext">The controller context.</param>
    public override IValueProvider GetValueProvider(ControllerContext controllerContext)
    {
        if (controllerContext == null)
            throw new ArgumentNullException("controllerContext");
        object deserializedObject = LargeJsonValueProviderFactory.GetDeserializedObject(controllerContext);
        if (deserializedObject == null)
            return (IValueProvider) null;
        Dictionary<string, object> dictionary = new Dictionary<string, object>((IEqualityComparer<string>) StringComparer.OrdinalIgnoreCase);
        LargeJsonValueProviderFactory.AddToBackingStore(new LargeJsonValueProviderFactory.EntryLimitedDictionary((IDictionary<string, object>) dictionary), string.Empty, deserializedObject);
        return (IValueProvider) new DictionaryValueProvider<object>((IDictionary<string, object>) dictionary, CultureInfo.CurrentCulture);
    }

    private static string MakeArrayKey(string prefix, int index)
    {
        return prefix + "[" + index.ToString((IFormatProvider) CultureInfo.InvariantCulture) + "]";
    }

    private static string MakePropertyKey(string prefix, string propertyName)
    {
        if (!string.IsNullOrEmpty(prefix))
            return prefix + "." + propertyName;
        return propertyName;
    }

    private class EntryLimitedDictionary
    {
        private static int _maximumDepth = LargeJsonValueProviderFactory.EntryLimitedDictionary.GetMaximumDepth();
        private readonly IDictionary<string, object> _innerDictionary;
        private int _itemCount;

        public EntryLimitedDictionary(IDictionary<string, object> innerDictionary)
        {
            this._innerDictionary = innerDictionary;
        }

        public void Add(string key, object value)
        {
            if (++this._itemCount > LargeJsonValueProviderFactory.EntryLimitedDictionary._maximumDepth)
                throw new InvalidOperationException("JsonValueProviderFactory_RequestTooLarge");
            this._innerDictionary.Add(key, value);
        }

        private static int GetMaximumDepth()
        {
            NameValueCollection appSettings = ConfigurationManager.AppSettings;
            if (appSettings != null)
            {
                string[] values = appSettings.GetValues("aspnet:MaxJsonDeserializerMembers");
                int result;
                if (values != null && values.Length > 0 && int.TryParse(values[0], out result))
                    return result;
            }
            return 1000;
        }
    }
}
Then, in the Application_Start method from Global.asax.cs, replace the ValueProviderFactory with the new one:
protected void Application_Start()
{
    ...
    // Add LargeJsonValueProviderFactory
    ValueProviderFactory jsonFactory = null;
    foreach (var factory in ValueProviderFactories.Factories)
    {
        if (factory.GetType().FullName == "System.Web.Mvc.JsonValueProviderFactory")
        {
            jsonFactory = factory;
            break;
        }
    }
    if (jsonFactory != null)
    {
        ValueProviderFactories.Factories.Remove(jsonFactory);
    }
    var largeJsonValueProviderFactory = new LargeJsonValueProviderFactory();
    ValueProviderFactories.Factories.Add(largeJsonValueProviderFactory);
}
I fixed it.
// your JSON data here
string json_object = "........";
JavaScriptSerializer jsJson = new JavaScriptSerializer();
jsJson.MaxJsonLength = 2147483644;
MyClass obj = jsJson.Deserialize<MyClass>(json_object);
It works very well.
If, after implementing the above addition in your web.config, you get an "Unrecognized configuration section system.web.extensions." error, then try adding this to your web.config in the <configSections> section:
<sectionGroup name="system.web.extensions" type="System.Web.Extensions">
  <sectionGroup name="scripting" type="System.Web.Extensions">
    <sectionGroup name="webServices" type="System.Web.Extensions">
      <section name="jsonSerialization" type="System.Web.Extensions"/>
    </sectionGroup>
  </sectionGroup>
</sectionGroup>
Simply set the MaxJsonLength property in your MVC action method:
JsonResult json = Json(classObject, JsonRequestBehavior.AllowGet);
json.MaxJsonLength = int.MaxValue;
return json;
You can set this in the controller:
json.MaxJsonLength = 2147483644;
You can also set it in web.config:
<configuration>
  <system.web.extensions>
    <scripting>
      <webServices>
        <jsonSerialization maxJsonLength="2147483647"></jsonSerialization>
      </webServices>
    </scripting>
  </system.web.extensions>
</configuration>
To be on the safe side, use both.
Fix for ASP.NET MVC: if you want to fix it only for the particular action that is causing the problem, change this code:
public JsonResult GetBigJson()
{
    var someBigObject = GetBigObject();
    return Json(someBigObject);
}
to this:
public JsonResult GetBigJson()
{
    var someBigObject = GetBigObject();
    return new JsonResult()
    {
        Data = someBigObject,
        JsonRequestBehavior = JsonRequestBehavior.DenyGet,
        MaxJsonLength = int.MaxValue
    };
}
And the functionality should be the same; you can just return a bigger JSON response.
Explanation based on the ASP.NET MVC source code: you can check what the Controller.Json method does in the ASP.NET MVC source:
protected internal JsonResult Json(object data)
{
    return Json(data, null /* contentType */, null /* contentEncoding */, JsonRequestBehavior.DenyGet);
}
It is calling other Controller.Json method:
protected internal virtual JsonResult Json(object data, string contentType, Encoding contentEncoding, JsonRequestBehavior behavior)
{
    return new JsonResult
    {
        Data = data,
        ContentType = contentType,
        ContentEncoding = contentEncoding,
        JsonRequestBehavior = behavior
    };
}
where the passed contentType and contentEncoding objects are null. So basically calling return Json(object) in a controller is equivalent to calling return new JsonResult { Data = object, JsonRequestBehavior = JsonRequestBehavior.DenyGet }. You can use the second form and parameterize the JsonResult.
So what happens when you set the MaxJsonLength property (by default it's null)?
It's passed down to the JavaScriptSerializer.MaxJsonLength property, and then the JavaScriptSerializer.Serialize method is called:
JavaScriptSerializer serializer = new JavaScriptSerializer();
if (MaxJsonLength.HasValue)
{
    serializer.MaxJsonLength = MaxJsonLength.Value;
}
if (RecursionLimit.HasValue)
{
    serializer.RecursionLimit = RecursionLimit.Value;
}
response.Write(serializer.Serialize(Data));
And when you don't set the MaxJsonLength property of the serializer, it takes the default value, which is just 2 MB.
If you are getting this error from the MiniProfiler in MVC then you can increase the value by setting the property MiniProfiler.Settings.MaxJsonResponseSize to the desired value. By default, this tool seems to ignore the value set in config.
MiniProfiler.Settings.MaxJsonResponseSize = 104857600;
Courtesy mvc-mini-profiler.
I suggest setting it to Int32.MaxValue.
JavaScriptSerializer serializer = new JavaScriptSerializer();
serializer.MaxJsonLength = Int32.MaxValue;
How about some attribute magic?
[AttributeUsage(AttributeTargets.Class | AttributeTargets.Method, Inherited = true, AllowMultiple = false)]
public class MaxJsonSizeAttribute : ActionFilterAttribute
{
    // Default: 10 MB worth of one byte chars
    private int maxLength = 10 * 1024 * 1024;

    public int MaxLength
    {
        set
        {
            if (value < 0) throw new ArgumentOutOfRangeException("value", "Value must be at least 0.");
            maxLength = value;
        }
        get { return maxLength; }
    }

    public override void OnActionExecuted(ActionExecutedContext filterContext)
    {
        JsonResult json = filterContext.Result as JsonResult;
        if (json != null)
        {
            if (maxLength == 0)
            {
                json.MaxJsonLength = int.MaxValue;
            }
            else
            {
                json.MaxJsonLength = maxLength;
            }
        }
    }
}
Then you could either apply it globally using the global filter configuration or controller/action-wise.
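Global registration could look something like the following sketch (FilterConfig.RegisterGlobalFilters is the stock MVC template hook; the attribute name comes from the answer above):
// Sketch: apply MaxJsonSizeAttribute to every action via the global filter collection.
public static class FilterConfig
{
    public static void RegisterGlobalFilters(GlobalFilterCollection filters)
    {
        // MaxLength = 0 means "use int.MaxValue" per the attribute's OnActionExecuted logic
        filters.Add(new MaxJsonSizeAttribute { MaxLength = 0 });
    }
}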
If you are encountering this sort of issue in a view, you can use the method below to resolve it. Here I used the Newtonsoft package.
@using Newtonsoft.Json
<script type="text/javascript">
    var partData = @Html.Raw(JsonConvert.SerializeObject(ViewBag.Part));
</script>
Alternative ASP.NET MVC 5 Fix:
(Mine is similar to MFC's answer above, with a few small changes)
I wasn't ready to change to Json.NET just yet, and in my case the error was occurring during the request. The best approach in my scenario was modifying the actual JsonValueProviderFactory, which applies the fix to the whole project; this can be done by editing the Global.asax.cs file as such:
JsonValueProviderConfig.Config(ValueProviderFactories.Factories);
add a web.config entry:
<add key="aspnet:MaxJsonLength" value="20971520" />
and then create the two following classes
public class JsonValueProviderConfig
{
    public static void Config(ValueProviderFactoryCollection factories)
    {
        var jsonProviderFactory = factories.OfType<JsonValueProviderFactory>().Single();
        factories.Remove(jsonProviderFactory);
        factories.Add(new CustomJsonValueProviderFactory());
    }
}
This is basically an exact copy of the default implementation found in System.Web.Mvc but with the addition of a configurable web.config appsetting value aspnet:MaxJsonLength.
public class CustomJsonValueProviderFactory : ValueProviderFactory
{
    /// <summary>Returns a JSON value-provider object for the specified controller context.</summary>
    /// <returns>A JSON value-provider object for the specified controller context.</returns>
    /// <param name="controllerContext">The controller context.</param>
    public override IValueProvider GetValueProvider(ControllerContext controllerContext)
    {
        if (controllerContext == null)
            throw new ArgumentNullException("controllerContext");
        object deserializedObject = CustomJsonValueProviderFactory.GetDeserializedObject(controllerContext);
        if (deserializedObject == null)
            return null;
        Dictionary<string, object> strs = new Dictionary<string, object>(StringComparer.OrdinalIgnoreCase);
        CustomJsonValueProviderFactory.AddToBackingStore(new CustomJsonValueProviderFactory.EntryLimitedDictionary(strs), string.Empty, deserializedObject);
        return new DictionaryValueProvider<object>(strs, CultureInfo.CurrentCulture);
    }

    private static object GetDeserializedObject(ControllerContext controllerContext)
    {
        if (!controllerContext.HttpContext.Request.ContentType.StartsWith("application/json", StringComparison.OrdinalIgnoreCase))
            return null;
        string fullStreamString = (new StreamReader(controllerContext.HttpContext.Request.InputStream)).ReadToEnd();
        if (string.IsNullOrEmpty(fullStreamString))
            return null;
        var serializer = new JavaScriptSerializer()
        {
            MaxJsonLength = CustomJsonValueProviderFactory.GetMaxJsonLength()
        };
        return serializer.DeserializeObject(fullStreamString);
    }

    private static void AddToBackingStore(EntryLimitedDictionary backingStore, string prefix, object value)
    {
        IDictionary<string, object> strs = value as IDictionary<string, object>;
        if (strs != null)
        {
            foreach (KeyValuePair<string, object> keyValuePair in strs)
                CustomJsonValueProviderFactory.AddToBackingStore(backingStore, CustomJsonValueProviderFactory.MakePropertyKey(prefix, keyValuePair.Key), keyValuePair.Value);
            return;
        }
        IList lists = value as IList;
        if (lists == null)
        {
            backingStore.Add(prefix, value);
            return;
        }
        for (int i = 0; i < lists.Count; i++)
        {
            CustomJsonValueProviderFactory.AddToBackingStore(backingStore, CustomJsonValueProviderFactory.MakeArrayKey(prefix, i), lists[i]);
        }
    }

    private class EntryLimitedDictionary
    {
        private static int _maximumDepth;
        private readonly IDictionary<string, object> _innerDictionary;
        private int _itemCount;

        static EntryLimitedDictionary()
        {
            _maximumDepth = CustomJsonValueProviderFactory.GetMaximumDepth();
        }

        public EntryLimitedDictionary(IDictionary<string, object> innerDictionary)
        {
            this._innerDictionary = innerDictionary;
        }

        public void Add(string key, object value)
        {
            int num = this._itemCount + 1;
            this._itemCount = num;
            if (num > _maximumDepth)
            {
                throw new InvalidOperationException("The length of the string exceeds the value set on the maxJsonLength property.");
            }
            this._innerDictionary.Add(key, value);
        }
    }

    private static string MakeArrayKey(string prefix, int index)
    {
        return string.Concat(prefix, "[", index.ToString(CultureInfo.InvariantCulture), "]");
    }

    private static string MakePropertyKey(string prefix, string propertyName)
    {
        if (string.IsNullOrEmpty(prefix))
        {
            return propertyName;
        }
        return string.Concat(prefix, ".", propertyName);
    }

    private static int GetMaximumDepth()
    {
        int num;
        NameValueCollection appSettings = ConfigurationManager.AppSettings;
        if (appSettings != null)
        {
            string[] values = appSettings.GetValues("aspnet:MaxJsonDeserializerMembers");
            if (values != null && values.Length != 0 && int.TryParse(values[0], out num))
            {
                return num;
            }
        }
        return 1000;
    }

    private static int GetMaxJsonLength()
    {
        int num;
        NameValueCollection appSettings = ConfigurationManager.AppSettings;
        if (appSettings != null)
        {
            string[] values = appSettings.GetValues("aspnet:MaxJsonLength");
            if (values != null && values.Length != 0 && int.TryParse(values[0], out num))
            {
                return num;
            }
        }
        // Note: if the aspnet:MaxJsonLength appSetting is missing, this falls back to 1000,
        // far below the serializer's 102400 default.
        return 1000;
    }
}
For those who are having issues in MVC3 with JSON that's automatically being deserialized for a model binder and is too large, here is a solution.
Copy the code for the JsonValueProviderFactory class from the MVC3 source code into a new class.
Add a line to change the maximum JSON length before the object is deserialized.
Replace the JsonValueProviderFactory class with your new, modified class.
Thanks to http://blog.naver.com/techshare/100145191355 and https://gist.github.com/DalSoft/1588818 for pointing me in the right direction for how to do this. The last link on the first site contains full source code for the solution.
The real question is whether you really need to return 17,000 records. How are you planning to handle all the data in the browser? The users are not going to scroll through 17,000 rows anyway.
A better approach is to retrieve only a "top few" records and load more as required, as in the sketch below.
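A minimal sketch of that idea for the autocomplete case (the names term and _repository are my assumptions, not from the answer):
// Sketch: filter on the typed prefix and cap the result size server-side.
public JsonResult Autocomplete(string term)
{
    term = term ?? string.Empty;
    var matches = _repository.Records
        .Where(r => r.Name.StartsWith(term))
        .OrderBy(r => r.Name)
        .Take(20) // only a "top few"; nobody scrolls 17,000 suggestions
        .Select(r => r.Name)
        .ToList();
    return Json(matches, JsonRequestBehavior.AllowGet);
}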
You can set it in the config as others have said, or you can set it on an individual instance of the serializer, like:
var js = new JavaScriptSerializer() { MaxJsonLength = int.MaxValue };
JsonResult result = Json(r);
result.MaxJsonLength = Int32.MaxValue;
result.JsonRequestBehavior = JsonRequestBehavior.AllowGet;
return result;
It appears that there is no "unlimited" value. The default is 2097152 characters, which is equivalent to 4 MB of Unicode string data.
As has already been observed, 17,000 records are hard to use well in the browser. If you are presenting an aggregate view, it may be much more efficient to do the aggregation on the server and transfer only a summary to the browser. For example, consider a file-system browser: we only see the top of the tree, then issue further requests as we drill down. The number of records returned in each request is comparatively small. A tree-view presentation can work well for large result sets.
Just ran into this. I'm getting over 6,000 records, so I decided I'd just do some paging. That is, I accept a page number in my MVC JsonResult endpoint, which defaults to 0 so it's not required, like so:
public JsonResult MyObjects(int pageNumber = 0)
Then instead of saying:
return Json(_repository.MyObjects.ToList(), JsonRequestBehavior.AllowGet);
I say:
return Json(_repository.MyObjects.OrderBy(obj => obj.ID).Skip(1000 * pageNumber).Take(1000).ToList(), JsonRequestBehavior.AllowGet);
It's very simple. Then, in JavaScript, instead of this:
function myAJAXCallback(items) {
    // Do stuff here
}
I instead say:
var pageNumber = 0;
function myAJAXCallback(items) {
    if (items.length == 1000) {
        // Call the same endpoint but append: '?pageNumber=' + ++pageNumber
    }
    // Do stuff here
}
And append your records to whatever you were doing with them in the first place. Or just wait until all the calls finish and cobble the results together.
I solved the problem by adding this code:
String confString = HttpContext.Current.Request.ApplicationPath.ToString();
Configuration conf = WebConfigurationManager.OpenWebConfiguration(confString);
ScriptingJsonSerializationSection section = (ScriptingJsonSerializationSection)conf.GetSection("system.web.extensions/scripting/webServices/jsonSerialization");
section.MaxJsonLength = 6553600;
conf.Save();
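One caution (my note, not the original poster's): conf.Save() rewrites web.config on disk, which recycles the application domain, so this suits a one-time change rather than per-request code. Reading the configured value without saving is side-effect free, for example:
// Sketch: read the configured limit instead of rewriting web.config at runtime.
var section = (ScriptingJsonSerializationSection)WebConfigurationManager
    .GetSection("system.web.extensions/scripting/webServices/jsonSerialization");
int configuredLimit = (section != null) ? section.MaxJsonLength : 102400; // 102400 is the documented default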
Solution for WebForms UpdatePanel:
Add a setting to Web.config:
<configuration>
<appSettings>
<add key="aspnet:UpdatePanelMaxScriptLength" value="2147483647" />
</appSettings>
</configuration>
https://support.microsoft.com/en-us/kb/981884
The ScriptRegistrationManager class contains the following code:
// Serialize the attributes to JSON and write them out
JavaScriptSerializer serializer = new JavaScriptSerializer();
// Dev10# 877767 - Allow configurable UpdatePanel script block length
// The default is JavaScriptSerializer.DefaultMaxJsonLength
if (AppSettings.UpdatePanelMaxScriptLength > 0) {
    serializer.MaxJsonLength = AppSettings.UpdatePanelMaxScriptLength;
}
string attrText = serializer.Serialize(attrs);
We don't need any server-side changes; you can fix this by modifying only the web.config file.
This helped for me; try it out:
<appSettings>
  <add key="aspnet:MaxJsonDeserializerMembers" value="2147483647" />
  <add key="aspnet:UpdatePanelMaxScriptLength" value="2147483647" />
</appSettings>
and
<system.web.extensions>
  <scripting>
    <webServices>
      <jsonSerialization maxJsonLength="2147483647"/>
    </webServices>
  </scripting>
</system.web.extensions>
I use this and it worked for a Kendo grid read request.
{
    // something
    var result = XResult.ToList().ToDataSourceResult(request);
    var rs = Json(result, JsonRequestBehavior.AllowGet);
    rs.MaxJsonLength = int.MaxValue;
    return rs;
}
Use lib\Newtonsoft.Json.dll:
public string serializeObj(dynamic json)
{
    return JsonConvert.SerializeObject(json);
}
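A hypothetical call site (the anonymous object is mine, purely for illustration); note that Json.NET does not enforce a MaxJsonLength-style cap:
string payload = serializeObj(new { Name = "Ayush", Age = 24 }); // no 2 MB default limit here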
If this maxJsonLength value is an int, how big is that int: 32-bit, 64-bit, 16-bit? I just want to be sure what the maximum value I can set as my maxJsonLength is.
<scripting>
  <webServices>
    <jsonSerialization maxJsonLength="2147483647"></jsonSerialization>
  </webServices>
</scripting>
You do not need to do anything with web.config.
You can use short property names to reduce the size of the list being passed.
For example,
declare a model like
public class BookModel
{
    public decimal id { get; set; }  // 1
    public string BN { get; set; }   // 2 Book Name
    public string BC { get; set; }   // 3 Bar Code Number
    public string BE { get; set; }   // 4 Edition Name
    public string BAL { get; set; }  // 5 Academic Level
    public string BCAT { get; set; } // 6 Category
}
Here I use short properties like BC = barcode, BE = book edition, and so on.
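A related option, assuming you serialize with Json.NET rather than JavaScriptSerializer (my suggestion, not part of this answer): keep readable C# names and still emit short JSON keys with [JsonProperty]:
public class BookModel
{
    [JsonProperty("BN")] // serialized as "BN", stays BookName in C#
    public string BookName { get; set; }
}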

Newtonsoft.Json SerializeObject string contains fraction value that doesn't seem to escape properly

I have an issue where my object is being serialized using Newtonsoft, and one of the properties is a string that contains values that are fractions such as 1/2", 1/4", etc.
After serializing, I'm passing the variable to a SQL Server stored procedure that uses OPENJSON.
The double quote in the fraction value doesn't seem to be escaping properly, as it fails with an invalid format for JSON. When debugging I see the value below, where the " in the fraction appears not to be escaped properly.
I'm a little new to some of the serialization, so I could use a little help.
string strJson = JsonConvert.SerializeObject(myobject);
DECLARE @json nvarchar(max) = '{"Number":64260,"Notes":"1/2\\" testing"}';
SELECT *
FROM OPENJSON (@json, '$')
WITH (
    [Number] int '$.Number'
    ,[Notes] nvarchar(max) '$.Notes'
) AS myDat
Msg 13609, Level 16, State 4, Line 2 JSON text is not properly formatted. Unexpected character 't' is found at position 32.
If I remove the second "\" from the fraction value it works fine.
public partial class TblEcr
{
    public int Number { get; set; }
    public string Notes { get; set; }
}

public JsonResult OnPostUpdate([DataSourceRequest] DataSourceRequest request, TblEcr ecr)
{
    _context.TblEcr.Where(x => x.Number == ecr.Number).Select(x => ecr);
    try
    {
        if (ModelState.IsValid)
        {
            string ecrJson = JsonConvert.SerializeObject(ecr);
            var param = new SqlParameter[] {
                new SqlParameter() {
                    ParameterName = "@json",
                    SqlDbType = System.Data.SqlDbType.VarChar,
                    Size = 8000,
                    Direction = System.Data.ParameterDirection.Input,
                    Value = ecrJson
                },
                new SqlParameter() {
                    ParameterName = "@Status",
                    SqlDbType = System.Data.SqlDbType.Bit,
                    Direction = System.Data.ParameterDirection.Output
                    //,Value = 10
                },
                new SqlParameter() {
                    ParameterName = "@ErrorDetails",
                    SqlDbType = System.Data.SqlDbType.VarChar,
                    Size = 8000,
                    Direction = System.Data.ParameterDirection.Output,
                }};
            int affectedRows = _context.Database.ExecuteSqlCommand("dbo.usp_UpdateECR @json, @Status, @ErrorDetails out", param);
        }
    }
    catch (Exception ex)
    {
        return new JsonResult(ex.Message);
    }
    return new JsonResult(new[] { ecr }.ToDataSourceResult(request, ModelState));
}
Doesn't repro for me.
using Microsoft.Data.SqlClient;
using Newtonsoft.Json;
using System;

namespace ConsoleApp8
{
    class Program
    {
        public partial class TblEcr
        {
            public int Number { get; set; }
            public string Notes { get; set; }
        }

        static void Main(string[] args)
        {
            var ecr = new TblEcr() { Number = 1, Notes = @"1/2"" testing" };
            string ecrJson = JsonConvert.SerializeObject(ecr);
            Console.WriteLine(ecrJson);
        }
    }
}
outputs
{"Number":1,"Notes":"1/2\" testing"}
Thank you for looking at it.
After putting more thought into it, I simplified a record I was attempting to update and found my issue: it is the stored procedure in SQL Server.
There was a string manipulation in the procedure that was causing the issue, basically a simple REPLACE used for another purpose.
Set @json = REPLACE(@json,'\"','"'); -- BOOM, this caused it.
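To see why that REPLACE breaks things, here is a small illustration of mine (not from the thread) of what it does to validly escaped JSON:
// Sketch: what the procedure's REPLACE does to valid JSON.
string json = "{\"Number\":64260,\"Notes\":\"1/2\\\" testing\"}"; // \" legally escapes the quote inside Notes
string corrupted = json.Replace("\\\"", "\"");                    // mimics REPLACE(@json, '\"', '"')
// corrupted is {"Number":64260,"Notes":"1/2" testing"} - the Notes string now ends early,
// so OPENJSON hits the stray 't' of "testing" and reports "Unexpected character 't'".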

Is there a JSON function which dumps a JSON array of dictionaries into tab-separated text files

I have a JSON array as defined below:
[
{"Name":"Ayush","Age":24,"Job":"Developer"},
{"Name":"Monika","Age":23,"Job":"Developer"},
{"Name":"Chinmay","Age":23,"Job":"Developer"}
]
I want to dump this into a text file in the following format:
Name Age Job
Ayush 24 Developer
Monika 23 Developer
Chinmay 23 Developer
Is there any C# function to accomplish the above? If not, how can I achieve it with minimum memory consumption?
Thanks in advance.
There is no such built-in function. You may achieve this by reading JTokens from the input stream using JsonTextReader and writing their values into another stream. Streaming the input and output ensures a minimal memory footprint.
using (var inputStream = File.OpenRead("input.json"))
using (var streamReader = new StreamReader(inputStream))
using (var jsonTextReader = new JsonTextReader(streamReader))
using (var outputStream = File.OpenWrite("output.csv"))
using (var streamWriter = new StreamWriter(outputStream))
{
    var firstItem = true;
    while (jsonTextReader.Read())
    {
        if (jsonTextReader.TokenType == JsonToken.StartObject)
        {
            var jObject = JObject.ReadFrom(jsonTextReader);
            if (firstItem)
            {
                streamWriter.WriteLine(string.Join("\t",
                    jObject.Children().Select(c => (c as JProperty).Name)));
                firstItem = false;
            }
            streamWriter.WriteLine(string.Join("\t",
                jObject.Values().Select(t => t.ToString())));
        }
    }
}
Demo: https://dotnetfiddle.net/2fCRa6. (I used MemoryStream and Console instead of input and output file streams in this demo since .NET Fiddle does not allow file IO, but the idea is the same.)
You can create a class with Name, Age and Job as properties.
public class Info
{
    public string Name { get; set; }
    public int Age { get; set; }
    public string Job { get; set; }
}
Then, in another function, we can use the System.Web.Script.Serialization namespace (to use it, make sure you have referenced System.Web.Extensions in the project references). Once done, we can use the JavaScriptSerializer class to get a list of objects from the JSON data. Then we can iterate over each item and add it to our file with a tab as a delimiter.
public static void WriteDetailsInFile(string jsonData)
{
    var list = new JavaScriptSerializer().Deserialize<List<Info>>(jsonData);
    using (var streamWriter = File.AppendText(@"D:\MyFile.txt"))
    {
        streamWriter.WriteLine("Name\tAge\tJob");
        foreach (var item in list)
        {
            streamWriter.WriteLine(item.Name + "\t" + item.Age + "\t" + item.Job);
        }
    }
}

// driver
public static void Main()
{
    string data = @"[
        { ""Name"":""Ayush"",""Age"":24,""Job"":""Developer""},
        { ""Name"":""Monika"",""Age"":23,""Job"":""Developer""},
        { ""Name"":""Chinmay"",""Age"":23,""Job"":""Developer""}
    ]";
    WriteDetailsInFile(data);
}
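If this runs as intended, D:\MyFile.txt should end up containing the tab-separated rows from the question:
Name    Age    Job
Ayush   24     Developer
Monika  23     Developer
Chinmay 23     Developer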

C# - Emgu Cv - Face Recognition- Loading training sets of Faces saved to Access database as a binary in to EigenObjectRecognizer for Face recognition

I was having a hard time loading the training set from an MS Access database into the main form that does the face recognition. I saved the training sets with their names and IDs into the database as binary data with an OLE Object format. The method I used to convert the image data is:
private static byte[] ConvertImageToBytes(Image InputImage)
{
    using (Bitmap BmpImage = new Bitmap(InputImage))
    {
        using (MemoryStream MyStream = new MemoryStream())
        {
            BmpImage.Save(MyStream, System.Drawing.Imaging.ImageFormat.Jpeg);
            byte[] ImageAsBytes = MyStream.ToArray();
            return ImageAsBytes;
        }
    }
}
The method that I use to store the converted byte data in the database is the following:
private void StoreData(byte[] ImageAsBytes, String NameStudent, String IDStudent)
{
    if (DBConnection.State.Equals(ConnectionState.Closed))
        DBConnection.Open();
    try
    {
        //MessageBox.Show("Saving image at index : " + rowPosition);
        using (OleDbCommand insert = new OleDbCommand(String.Format("Insert INTO TrainingSet(rowPosition,StudentName,StudentID,StudentFace) values ('{0}','{1}','{2}',@StudentFace)", rowPosition, NameStudent, IDStudent), DBConnection))
        {
            OleDbParameter imageParameter = insert.Parameters.AddWithValue("@StudentFace", SqlDbType.Binary);
            imageParameter.Value = ImageAsBytes;
            imageParameter.Size = ImageAsBytes.Length;
            int rowsAffected = insert.ExecuteNonQuery();
            MessageBox.Show(String.Format("Data stored successfully in {0} Row", rowsAffected));
        }
        rowPosition++;
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
    finally
    {
        RefreshDBConnection();
    }
}
The method that I use to read this binary data is as follows:
private Image ReadImageFromDB()
{
    Image FetchedImg;
    if (rowNumber >= 0)
    {
        byte[] FetchedImgBytes = (byte[])LocalDataTable.Rows[rowNumber]["StudentFace"];
        MemoryStream stream = new MemoryStream(FetchedImgBytes);
        FetchedImg = Image.FromStream(stream);
        return FetchedImg;
    }
    else
    {
        MessageBox.Show("There are no images in the database yet. Please reconnect or add some pictures.");
        return null;
    }
}
I have successfully saved the training sets/images as binary data in the database. The problem is when I load these training sets for recognition.
// Declaring the variables: trainingImages is where the training sets are loaded
// from the database; NameLabels and IDLabels are text in the database where the
// name and ID of the subject are saved.
List<Image<Gray, byte>> trainingImages = new List<Image<Gray, byte>>();
List<string> NameLables = new List<string>();
List<string> IDLables = new List<string>();
int ContTrain, NumNameLabels, NumIDLabels, t;

// The training sets from the database are loaded into the face recognizer code as follows
public FaceRecognizer()
{
    InitializeComponent();
    try
    {
        // Load previously trained faces and labels for each image from the database here
        RefreshDBConnection();
        String[] NameLabels = (String[])LocalDataTable.Rows[rowNumber]["StudentName"];
        NumNameLabels = Convert.ToInt16(NameLabels[0]);
        String[] IDLabels = (String[])LocalDataTable.Rows[rowNumber]["StudentID"];
        NumIDLabels = Convert.ToInt16(IDLabels[0]);
        if (NumNameLabels == NumIDLabels)
        {
            ContTrain = NumNameLabels;
            string LoadFaces;
            // Converting the master image to a bitmap
            Image imageFromDB;
            Bitmap imageChangedToBitmap;
            // Normalizing it to grayscale
            Image<Gray, Byte> normalizedMasterImage;
            for (int tf = 1; tf < NumNameLabels + 1; tf++)
            {
                imageFromDB = ReadImageFromDB();
                // The image loaded from the database is converted into a Bitmap, and then
                // the bitmap is converted into Image<Gray, byte> as input to
                // EigenObjectRecognizer(,,,)
                imageChangedToBitmap = new Bitmap(imageFromDB);
                normalizedMasterImage = new Image<Gray, Byte>(imageChangedToBitmap);
                LoadFaces = String.Format("face{0}.bmp", tf);
                trainingImages.Add(normalizedMasterImage);
                //trainingImages.Add(new Image<Gray, byte>());
                NameLables.Add(NameLabels[tf]);
                IDLables.Add(IDLabels[tf]);
                rowNumber = rowNumber + 1;
            }
        }
        else
            MessageBox.Show("There's a conflict between name labels and ID labels");
    }
    catch (Exception e)
    {
        MessageBox.Show("Nothing in the database, please add at least a face. Train the database", "Trained faces load", MessageBoxButtons.OK, MessageBoxIcon.Exclamation);
    }
}
I am only getting the message in the catch when the form loads, even if there are faces saved in the database. I have used EigenObjectRecognizer and I will post the code if necessary.
In the face-loading part, you did not save the images as face1, face2, face3, etc., so you cannot load them using:
LoadFaces = String.Format("face{0}.bmp", tf);

Google end point returns JSON for long data type in quotes

I am using Google Cloud Endpoints for my REST service. I am consuming this data in a GWT web client using RestyGWT.
I noticed that Cloud Endpoints automatically encloses the long data type in double quotes, which causes an exception in RestyGWT when I try to convert the JSON to a POJO.
Here is my sample code.
@Api(name = "test")
public class EndpointAPI {
    @ApiMethod(httpMethod = HttpMethod.GET, path = "test")
    public Container test() {
        Container container = new Container();
        container.testLong = (long) 3234345;
        container.testDate = new Date();
        container.testString = "sathya";
        container.testDouble = 123.98;
        container.testInt = 123;
        return container;
    }

    public class Container {
        public long testLong;
        public Date testDate;
        public String testString;
        public double testDouble;
        public int testInt;
    }
}
This is what is returned as JSON by Cloud Endpoints. You can see that testLong is serialized as "3234345" rather than 3234345.
I have the following questions.
(1) How can I remove the double quotes around long values?
(2) How can I change the string format to "yyyy-MMM-dd hh:mm:ss"?
Regards,
Sathya
What version of RestyGWT are you using? Did you try the 1.4 snapshot? (Note that Cloud Endpoints quotes long values because JavaScript numbers cannot represent all 64-bit integers exactly.)
I think this is the code (1.4) responsible for parsing a long in RestyGWT; it might help you:
public static final AbstractJsonEncoderDecoder<Long> LONG = new AbstractJsonEncoderDecoder<Long>() {
    public Long decode(JSONValue value) throws DecodingException {
        if (value == null || value.isNull() != null) {
            return null;
        }
        return (long) toDouble(value);
    }

    public JSONValue encode(Long value) throws EncodingException {
        return (value == null) ? getNullType() : new JSONNumber(value);
    }
};

static public double toDouble(JSONValue value) {
    JSONNumber number = value.isNumber();
    if (number == null) {
        JSONString val = value.isString();
        if (val != null) {
            try {
                return Double.parseDouble(val.stringValue());
            }
            catch (NumberFormatException e) {
                // just throw the exception below
            }
        }
        throw new DecodingException("Expected a json number, but was given: " + value);
    }
    return number.doubleValue();
}