Select JSON sub-node

I am querying the Wikipedia API and am getting JSON back that looks like this:
https://en.wikipedia.org/w/api.php?action=query&prop=pageimages&titles=cessna%20172&pithumbsize=500&format=json
{"batchcomplete":"","query":{"normalized":[{"from":"cessna 172","to":"Cessna 172"}],"pages":{"173462":{"pageid":173462,"ns":0,"title":"Cessna 172","thumbnail":{"source":"https://upload.wikimedia.org/wikipedia/commons/thumb/a/ae/Cessna_172S_Skyhawk_SP%2C_Private_JP6817606.jpg/500px-Cessna_172S_Skyhawk_SP%2C_Private_JP6817606.jpg","width":500,"height":333},"pageimage":"Cessna_172S_Skyhawk_SP,_Private_JP6817606.jpg"}}}}
Using .Net Core 2.2, what is the proper way to get the image thumbnail out of this (the source property in this case)?

Parsing JSON is not a built-in feature of .NET Core 2.2, so you will want to add the Newtonsoft.Json package to the project with dotnet add package Newtonsoft.Json --version 12.0.3.
From there, add using Newtonsoft.Json.Linq; to the top of the file to bring in Newtonsoft.Json, and using System.Net; to use WebClient.
The code below retrieves the string from the URL and JObject.Parse parses that string into a JObject. You can then get the property you want by chaining indexers: ["query"]["pages"]["173462"]["thumbnail"]["source"].
Full source:
using System;
using System.Net;
using Newtonsoft.Json.Linq;

class Program
{
    static void Main(string[] args)
    {
        const string url = "https://en.wikipedia.org/w/api.php?action=query&prop=pageimages&titles=cessna%20172&pithumbsize=500&format=json";
        using (WebClient client = new WebClient())
        {
            string rawString = client.DownloadString(url);
            var jsonResult = JObject.Parse(rawString);
            // The JToken must be cast explicitly to string
            string thumbnail = (string)jsonResult["query"]["pages"]["173462"]["thumbnail"]["source"];
            Console.WriteLine(thumbnail);
        }
    }
}
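If you don't want to hard-code the page id ("173462" will differ for other titles), one option is to take the first entry under "pages" instead. A rough sketch, not part of the original answer (it additionally needs using System.Linq;):
var pages = (JObject)jsonResult["query"]["pages"];
// Take the first page object regardless of its numeric key.
var firstPage = pages.Properties().First().Value;
string thumbnail = (string)firstPage["thumbnail"]["source"];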

Ideally, you should define a class and deserialize the JSON. Example:
Account account = JsonConvert.DeserializeObject<Account>(json);
More details here.
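For the Wikipedia response in the question, a strongly-typed version might look like the sketch below (the class and property names are illustrative assumptions; "pages" is keyed by page id, hence the dictionary, and the usage lines need using System.Collections.Generic; and System.Linq;):
public class WikiResponse
{
    public WikiQuery Query { get; set; }
}

public class WikiQuery
{
    public Dictionary<string, WikiPage> Pages { get; set; }
}

public class WikiPage
{
    public WikiThumbnail Thumbnail { get; set; }
}

public class WikiThumbnail
{
    public string Source { get; set; }
}

// Usage (Json.NET matches property names case-insensitively by default):
// var response = JsonConvert.DeserializeObject<WikiResponse>(rawString);
// string source = response.Query.Pages.Values.First().Thumbnail.Source;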
However, when you only need one or two values, an entire class structure can be overkill. In that case, a quick alternative is to parse the JSON dynamically. Example (taken from here):
public void JValueParsingTest()
{
    var jsonString = @"{""Name"":""Rick"",""Company"":""West Wind"",
                        ""Entered"":""2012-03-16T00:03:33.245-10:00""}";

    dynamic json = JValue.Parse(jsonString);

    // values require casting
    string name = json.Name;
    string company = json.Company;
    DateTime entered = json.Entered;

    Assert.AreEqual(name, "Rick");
    Assert.AreEqual(company, "West Wind");
}
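Applied to the Wikipedia payload in the question, the same dynamic approach might look like this rough sketch (rawString is assumed to hold the response body, and the numeric page id still has to be indexed by name):
dynamic wiki = JObject.Parse(rawString);
// Dynamic member access walks the object graph; the page id needs an indexer.
string source = wiki.query.pages["173462"].thumbnail.source;
Console.WriteLine(source);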

Related

JsonDocument incomplete parsing with larger payloads

So basically, I have an HttpClient that attempts to obtain any form of JSON data from an endpoint. I previously used Newtonsoft.Json to achieve this easily, but after migrating all of the functions to System.Text.Json (STJ), I started to notice improper parsing.
Platforms tested: macOS & Linux (Google Kubernetes Engine)
Framework: .NET Core 3.1 LTS
The code below calls an API that returns a JSON array. I simply stream the response, load it into a JsonDocument, and then attempt to peek into it, but nothing comes out as expected. The code is provided along with the step-debug variable results.
using System;
using System.ComponentModel;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using System.Web;
using System.Xml;

namespace HttpCallDemo
{
    class Program
    {
        static async Task Main(string[] args)
        {
            using (var httpClient = new HttpClient())
            {
                // FLUSH
                httpClient.DefaultRequestHeaders.Clear();
                httpClient.MaxResponseContentBufferSize = 4096;
                string body = string.Empty, customMediaType = string.Empty; // For POST/PUT

                // Setup the url
                var uri = new UriBuilder("https://api-pub.bitfinex.com/v2/tickers?symbols=ALL");
                uri.Port = -1;

                // Pull in the payload
                var requestPayload = new HttpRequestMessage(HttpMethod.Get, uri.ToString());
                HttpResponseMessage responsePayload;
                responsePayload = await httpClient.SendAsync(requestPayload,
                    HttpCompletionOption.ResponseHeadersRead);

                var byteArr = await responsePayload.Content.ReadAsByteArrayAsync();
                if (byteArr.LongCount() > 4194304) // 4MB
                    return; // Too big.

                // Pull the content
                var contentFromBytes = Encoding.Default.GetString(byteArr);
                JsonDocument payload;
                switch (responsePayload.StatusCode)
                {
                    case HttpStatusCode.OK:
                        // Return the payload distinctively
                        payload = JsonDocument.Parse(contentFromBytes);
#if DEBUG
                        var testJsonRes = Encoding.UTF8.GetString(
                            Utf8Json.JsonSerializer.Serialize(payload.RootElement));
                        // var testRawRes = contentStream.read
                        var testJsonResEl = payload.RootElement.GetRawText();
#endif
                        break;
                    default:
                        throw new InvalidDataException("Invalid HTTP response.");
                }
            }
        }
    }
}
Simply execute the minimal code above and notice that the payload differs from the original after parsing. I'm sure there's something wrong with the options for STJ; it seems we have to tune or explicitly define its limits to allow it to process this JSON payload.
Diving deeper into the debug content made things even weirder. When the HttpClient obtains the payload and reads it into a string, it gives me the entire JSON string as-is. However, once we parse it into a JsonDocument and then invoke RootElement.Clone(), we end up with a JsonElement containing much less data and carrying an invalid JSON structure (below).
ValueKind = Array : "[["tBTCUSD",11418,70.31212518,11419,161.93475693,258.02141213,0.0231,11418,2980.0289306,11438,11003],["tLTCUSD",58.919,2236.00823543,58.95,2884.6718013699997,1.258,0.0218,58.998,63147.48344762,59.261,56.334],["tLTCBTC",0.0051609,962.80334198,0.005166,1170.07399991,-0.000012,-0.0023,0.0051609,4178.13148459,0.0051852,0.0051],["tETHUSD",396.54,336.52151165,396.55,384.37623341,8.26964946,0.0213,396.50930256,69499.5382821,397.77,380.5],["tETHBTC",0.034731,166.67781664000003,0.034751,356.03450125999996,-0.000054,-0.0016,0.034747,5855.04978836,0.035109,0.0343],["tETCBTC",0.00063087,15536.813429530002,0.00063197,16238.600279749999,-0.00000838,-0.0131,0.00063085,73137.62192801,0.00064135,0.00062819],["tETCUSD",7.2059,9527.40221867,7.2176,8805.54677899,0.0517,0.0072,7.2203,49618.78868196,7.2263,7],["tRRTUSD",0.057476,33577.52064154,0.058614,20946.501210000002,0.023114,0.6511,0.058614,210741.23592011,0.06443,0.0355],["tZECUSD",88.131,821.28048322,88.332,880.37484662,5.925,0.0
And of course, attempting to read its contents would result in:
System.InvalidOperationException: Operation is not valid due to the current state of the object.
at System.Text.Json.JsonElement.get_Item(Int32 index)
at Nozomi.Preprocessing.Abstracts.BaseProcessingService`1.ProcessIdentifier(JsonElement jsonDoc, String identifier) in /Users/nicholaschen/Projects/nozomi/Nozomi.Infra.Preprocessing/Abstracts/BaseProcessingService.cs:line 255
Here's proof that there is a proper 38 KB worth of data coming in from the endpoint.
UPDATE
Further testing with this
if (payload.RootElement.ValueKind.Equals(JsonValueKind.Array))
{
    string testJsonArr;
    testJsonArr = Encoding.UTF8.GetString(
        Utf8Json.JsonSerializer.Serialize(
            payload.RootElement.EnumerateArray()));
}
shows that a larger array of arrays (more than 9 elements, each with 11 elements) results in an incomplete JSON structure, causing the issue I'm facing.
For those working with JsonDocument and JsonElement, take note that the step-debug variables are not accurate. It is not advisable to inspect these variables during runtime, because they do not display their contents in full.
@dbc has shown that re-serializing the deserialized data produces the complete dataset. I strongly suggest wrapping any serialization done purely for debugging in a DEBUG preprocessor directive so that these redundant lines are not executed outside development.
To interact with these entities, call .Clone() whenever you can to prevent disposal issues, and make sure you access the RootElement and then traverse into it before viewing its value in step-debug mode, because large values will not be displayed.
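As a rough illustration of the point about debugger truncation (reusing contentFromBytes from the question's code), re-reading the parsed document through GetRawText() shows that the full payload really is there:
#if DEBUG
using (var doc = JsonDocument.Parse(contentFromBytes))
{
    // GetRawText() returns the complete JSON text of the element,
    // unlike the truncated preview shown by the step debugger.
    string fullJson = doc.RootElement.GetRawText();
    Console.WriteLine($"parsed length: {fullJson.Length}, raw length: {contentFromBytes.Length}");
}
#endif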

Don't understand how to use JSON.NET with ASP.NET Core WebAPI

I've completed my first ASP.NET Core Web API and I'd like to try my hand at manually serializing/deserializing JSON via the JSON.NET library. In the JSON.NET documentation they give the following simple manual serialization example:
public static string ToJson(this Person p)
{
    StringWriter sw = new StringWriter();
    JsonTextWriter writer = new JsonTextWriter(sw);

    writer.WriteStartObject();

    // "name" : "Jerry"
    writer.WritePropertyName("name");
    writer.WriteValue(p.Name);

    // "likes": ["Comedy", "Superman"]
    writer.WritePropertyName("likes");
    writer.WriteStartArray();
    foreach (string like in p.Likes)
    {
        writer.WriteValue(like);
    }
    writer.WriteEndArray();

    writer.WriteEndObject();

    return sw.ToString();
}
What's lacking for a beginner such as myself is how to use this string. For example, consider the following:
[HttpGet("/api/data")
[Produces("application/json")]
public IActionResult GetData()
{
return Ok(new Byte[SomeBigInt]);
}
In the above code I don't really know where ASP.NET Core serializes the array to JSON... I'm assuming it happens somewhere under the hood. If I were to manually serialize some big Byte array (using the JSON.NET example), what do I do with the resultant string? Is it just return Ok(myJsonString);? Won't the built-in serializer, not knowing that it is already the result of a serialization operation, serialize it again?
Since ASP.NET Core is quite flexible, there are several ways to return JSON. If you want to return JSON from a controller, one of the most straightforward ways is this:
[HttpGet("/api/data")]
public JsonResult GetData() {
return Json(new {
fieldOneString = "some value",
fieldTwoInt= 2
});
}
Under the hood, the Json() helper method on the Controller uses JSON.NET to do the JSON serialization and then sends that as the response body.
You could do the same thing like this:
string jsonText = JsonConvert.SerializeObject(new
{
    fieldOneString = "some value",
    fieldTwoInt = 2
});

Response.WriteAsync(jsonText);
Note: to use Response.WriteAsync(jsonText) you need to add using Microsoft.AspNetCore.Http; to your file and have a project reference to Microsoft.AspNetCore.Http.Abstractions.
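If you already have a JSON string (for example one produced with JsonTextWriter as in the question), another common pattern, sketched here, is to return it via Content() with an explicit content type so the framework does not serialize it a second time:
[HttpGet("/api/data")]
public IActionResult GetData()
{
    // jsonText is already serialized JSON; Content() writes it out as-is.
    string jsonText = JsonConvert.SerializeObject(new
    {
        fieldOneString = "some value",
        fieldTwoInt = 2
    });
    return Content(jsonText, "application/json");
}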

WinRT XmlAnyElement and Serialization

We have a Windows Store application that communicates with our server using XML for requests/responses, which are serialized with the XmlSerializer. The issue we are encountering is that one of our types can contain arbitrary XML as one of its properties. In non-WinRT applications, the usage would have been:
public sealed class ItemExtension
{
    [XmlAttribute("source")]
    public string Source { get; set; }

    [XmlAnyElement]
    public XmlElement[] Data { get; set; }
}
This would allow us to have XML in our database like
<extension source="foo"><randomXml><data/></randomXml></extension>
In WinRT, XmlElement is not available: System.Xml.XmlElement does not exist, and Windows.Data.Xml.Dom.XmlElement is not compatible. Documentation mentions XElement, but XElement is not a supported WinRT type, so the WinRT project won't compile if I try using it.
Is this a bug with Windows Store applications or is there a sufficient work around?
Thanks.
So far I've only found a hack to get this working. If we use
[XmlAnyElement]
public object Data { get; set; }
This will properly deserialize existing data. When inspected in the debugger, the value is of type System.Xml.XmlElement, which isn't exposed in WinRT, so there's no way to set it directly. Since we figured out that XmlSerializer can instantiate and access System.Xml.XmlElement, we use it to handle setting the property: take in an object/XML snippet, wrap it in container XML for a wrapper type that has [XmlAnyElement], and call Deserialize on it so the XmlSerializer instantiates an XmlElement, which can then be set on the target object you wish to serialize.
For getting data out, since reading this property throws an exception in the UI layer, and accessing InnerXml/OuterXml throws as well, we are left with using the XmlSerializer to serialize the XmlElement back into a string, which you can then use however you want.
public sealed class XmlAnyElementContainer
{
    [XmlAnyElement]
    public object Data { get; set; }
}

public void SetData(object extensionObject)
{
    var objectSerializer = new XmlSerializer(extensionObject.GetType());
    var settings = new XmlWriterSettings()
    {
        Indent = false,
        OmitXmlDeclaration = true
    };

    var sb = new StringBuilder();
    using (var xmlWriter = XmlWriter.Create(sb, settings))
    {
        objectSerializer.Serialize(xmlWriter, extensionObject);
    }

    string objectXml = sb.ToString();
    string newXml = "<XmlAnyElementContainer>" + objectXml + "</XmlAnyElementContainer>";

    var xmlAnySerializer = new XmlSerializer(typeof(XmlAnyElementContainer));
    using (var sr = new StringReader(newXml))
    {
        // Assign the deserialized XmlElement to whichever [XmlAnyElement] property
        // you are populating on the target object.
        [TargetPropertyToSerialize] = (xmlAnySerializer.Deserialize(sr) as XmlAnyElementContainer).Data;
    }
}
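For the reverse direction described above (getting the arbitrary XML back out as a string), a minimal sketch might look like the following; the method name is an assumption, and the wrapper element can be stripped if only the inner XML is needed:
public string GetDataXml(object anyElementData)
{
    // Wrap the [XmlAnyElement] value and let XmlSerializer turn it back into text,
    // since reading InnerXml/OuterXml directly throws an exception.
    var container = new XmlAnyElementContainer { Data = anyElementData };
    var serializer = new XmlSerializer(typeof(XmlAnyElementContainer));
    var settings = new XmlWriterSettings { Indent = false, OmitXmlDeclaration = true };

    var sb = new StringBuilder();
    using (var xmlWriter = XmlWriter.Create(sb, settings))
    {
        serializer.Serialize(xmlWriter, container);
    }
    return sb.ToString();
}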

How to export data from LinqPAD as JSON?

I want to create a JSON file for use as part of a simple web prototyping exercise. LinqPAD is perfect for accessing the data from my DB in just the shape I need, however I cannot get it out as JSON very easily.
I don't really care what the schema is, because I can adapt my JavaScript to work with whatever is returned.
Is this possible?
A more fluent solution is to add the following methods to the "My Extensions" file in LINQPad:
public static string DumpJson<T>(this T obj)
{
    return obj
        .ToJson()
        .Dump();
}

public static string ToJson<T>(this T obj)
{
    return new System.Web.Script.Serialization.JavaScriptSerializer()
        .Serialize(obj);
}
Then you can use them like this in any query you like:
Enumerable.Range(1, 10)
    .Select(i => new
    {
        Index = i,
        IndexTimesTen = i * 10,
    })
    .DumpJson();
I added "ToJson" separately so it can be used in with "Expessions".
This is not directly supported, and I have opened a feature request here. Vote for it if you would also find this useful.
A workaround for now is to do the following:
Set the language to C# Statement(s)
Add an assembly reference (press F4) to System.Web.Extensions.dll
In the same dialog, add a namespace import to System.Web.Script.Serialization
Use code like the following to dump out your query as JSON
new JavaScriptSerializer().Serialize(query).Dump();
There's also a solution with Json.NET, since it does indented formatting and renders JSON dates properly. Add Json.NET from NuGet, add a reference to Newtonsoft.Json.dll in your "My Extensions" query, and include the following code:
public static object DumpJson(this object value, string description = null)
{
    return GetJson(value).Dump(description);
}

private static object GetJson(object value)
{
    object dump = value;

    var strValue = value as string;
    if (strValue != null)
    {
        var obj = JsonConvert.DeserializeObject(strValue);
        dump = JsonConvert.SerializeObject(obj, Newtonsoft.Json.Formatting.Indented);
    }
    else
    {
        dump = JsonConvert.SerializeObject(value, Newtonsoft.Json.Formatting.Indented);
    }

    return dump;
}
Use .DumpJson() the same way as .Dump() to render the result. It's possible to add more .DumpJson() overloads with different signatures if necessary.
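A quick illustrative usage, assuming the extensions above have been added to "My Extensions":
// Renders the sequence as indented JSON in the results pane, headed "sample".
Enumerable.Range(1, 3)
    .Select(i => new { Index = i, Square = i * i })
    .DumpJson("sample");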
As of version 4.47, LINQPad has the ability to export JSON built in. Combined with the new lprun.exe utility, it can also satisfy your needs.
http://www.linqpad.net/lprun.aspx

Using arbitrary JSON objects in OpenRasta

I can't seem to find anything in the OpenRasta docs or tutorials that shows how to use arbitrary JSON objects (i.e. objects not predefined using C# classes) for both receiving from and responding back to the client.
One way to do it would be to use JsonValue and write a custom codec that would just use the (de)serialization features provided by JsonValue. That should be pretty straightforward and less than 50 lines of code, but I wondered if there isn't anything built into OpenRasta?
(One downside of JsonValue is that MS has not yet released it, so you can't yet deploy it to customers (see 1. "Additional Use Rights"). But in cases where that matters, any other JSON library, like Json.NET, can be used.)
I have written, like most people, a very simple codec that supports dynamics as inputs and outputs to handlers using json.net. You can also register that codec with an anonymous type and it works brilliantly. You end up with this:
public object Post(dynamic myCustomer)
{
    return new { response = myCustomer.Id };
}
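The codec itself isn't shown in that answer; below is a rough sketch of what such a Json.NET-based codec might look like, modelled on the JsonFx codec in the next answer (the class name and details are assumptions, not OpenRasta-provided code):
using System.IO;
using System.Text;
using Newtonsoft.Json;

[global::OpenRasta.Codecs.MediaType("application/json")]
public class NewtonsoftJsonCodec : global::OpenRasta.Codecs.IMediaTypeWriter, global::OpenRasta.Codecs.IMediaTypeReader
{
    public void WriteTo(object entity, global::OpenRasta.Web.IHttpEntity response, string[] codecParameters)
    {
        // Serialize whatever the handler returned (anonymous types included) straight to the response stream.
        using (var w = new StreamWriter(response.Stream, Encoding.UTF8))
        {
            w.Write(JsonConvert.SerializeObject(entity));
        }
    }

    public object ReadFrom(global::OpenRasta.Web.IHttpEntity request, global::OpenRasta.TypeSystem.IType destinationType, string destinationName)
    {
        // For a dynamic handler parameter the static type is object, so Json.NET
        // returns a JObject, which supports dynamic member access.
        using (var r = new StreamReader(request.Stream, Encoding.UTF8))
        {
            return JsonConvert.DeserializeObject(r.ReadToEnd(), destinationType.StaticType);
        }
    }

    public object Configuration { get; set; }
}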
I just implemented a JSON codec using JsonFx. It goes like this:
using System.IO;
using System.Text;
using JsonFx.Json;

namespace Example
{
    [global::OpenRasta.Codecs.MediaType("application/json")]
    public class JsonFXCodec : global::OpenRasta.Codecs.IMediaTypeWriter, global::OpenRasta.Codecs.IMediaTypeReader
    {
        public void WriteTo(object entity, global::OpenRasta.Web.IHttpEntity response, string[] codecParameters)
        {
            JsonWriter json = new JsonWriter();
            using (TextWriter w = new StreamWriter(response.Stream, Encoding.UTF8))
            {
                json.Write(entity, w);
            }
        }

        public object ReadFrom(global::OpenRasta.Web.IHttpEntity request, global::OpenRasta.TypeSystem.IType destinationType, string destinationName)
        {
            JsonReader json = new JsonReader();
            using (TextReader r = new StreamReader(request.Stream, Encoding.UTF8))
            {
                return json.Read(r, destinationType.StaticType);
            }
        }

        public object Configuration { get; set; }
    }
}
If it is registered for "object" then it seems to work for any class:
ResourceSpace.Has.ResourcesOfType<object>()
    .WithoutUri
    .TranscodedBy<JsonFXCodec>();