Intercept JSON prior to RestKit

The company I'm working at is considering using RestKit. However, the JSON that our server returns is surrounded by characters for security reasons. It's a pain. In another iPhone app, one that does not use RestKit and uses JSON only very little, I parse the string returned from the server, removing the characters preceding and trailing the JSON string. Once the string is parsed, I call JSONValue on the string (we're using SBJSON) and get an NSDictionary.
I've heard that RestKit features a pluggable architecture. If that's the case, is there somewhere I can intercept the strings coming back from the server prior to the point where RestKit does its parsing?

I wanted to find a fix that did not require me to change the RestKit codebase in any way, and I found one. The answer was to create and register my own parser.
Parsers need to conform to the RKParser protocol. Basically, all I needed to do was trim the server response; the actual parsing into objects was already handled by an existing parser, RKJSONParserJSONKit. So I subclassed that class and registered my parser at startup:
[[RKParserRegistry sharedRegistry] setParserClass:[MyJSONParser class]
                                      forMIMEType:@"application/json"];
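For reference, here is a minimal sketch of what the subclass itself might look like. The trimming logic (which assumes a single wrapper around a JSON object) is purely illustrative; the override point is the objectFromString:error: method from the RKParser protocol.

#import <RestKit/RestKit.h>

// Hypothetical subclass that strips the security wrapper before JSONKit parses the payload.
@interface MyJSONParser : RKJSONParserJSONKit
@end

@implementation MyJSONParser

- (id)objectFromString:(NSString *)string error:(NSError **)error {
    // Illustrative trimming: keep everything from the first '{' to the last '}'.
    NSRange start = [string rangeOfString:@"{"];
    NSRange end = [string rangeOfString:@"}" options:NSBackwardsSearch];
    NSString *trimmed = string;
    if (start.location != NSNotFound && end.location != NSNotFound && end.location > start.location) {
        trimmed = [string substringWithRange:NSMakeRange(start.location, end.location - start.location + 1)];
    }
    return [super objectFromString:trimmed error:error];
}

@end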

Just wanted to note that nowadays you can implement your own retrieve/map operation by subclassing
RKHTTPRequestOperation (doc) - for retrieving a file from the server
RKObjectRequestOperation (doc) - for mapping
RKManagedObjectRequestOperation (doc) - for mapping to Core Data objects
and registering it with the [RKObjectManager registerRequestOperationClass:] method (doc).
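A minimal sketch of that registration, assuming a hypothetical RKObjectRequestOperation subclass called MyObjectRequestOperation (which methods the subclass overrides depends on what you want to customize):

// Hypothetical subclass; override whatever you need for your custom retrieve/map behaviour.
@interface MyObjectRequestOperation : RKObjectRequestOperation
@end

// Register it so the object manager uses this subclass for matching requests.
[[RKObjectManager sharedManager] registerRequestOperationClass:[MyObjectRequestOperation class]];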

Related

How to handle JSON body using Tornado w/ python3

Tornado 4.5.2 with Python 3 represents the request body as a bytes object instead of a native dictionary. This presents a problem for methods like RequestHandler.get_body_argument(), which will not access the field correctly.
My question is how to correctly have Tornado parse these bodies into more usable dictionaries so the standard library will work. I've looked throughout Tornado's documentation and there's next to nothing on even the existence of this problem.
Am I missing something here or will I need to re-implement those methods myself?
Tornado never automatically parses JSON; it only automatically parses HTML-standard form encoding (the data models of form encoding and JSON are different, so it wouldn't make sense to use the same family of get_argument/get_arguments methods in the less-ambiguous JSON format). If you want to handle JSON requests, it's one line to parse it yourself:
args = tornado.escape.json_decode(self.request.body)
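For example, a minimal handler sketch (the /items route and the "name" field are made up for illustration):

import tornado.escape
import tornado.web

class ItemsHandler(tornado.web.RequestHandler):
    def post(self):
        # Decode the raw bytes body into a regular Python dict
        args = tornado.escape.json_decode(self.request.body)
        name = args.get("name")  # hypothetical field
        self.write({"received": name})  # dicts are written back out as JSON

app = tornado.web.Application([(r"/items", ItemsHandler)])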

Is it necessary to pass request data as JSON string to server

When making an AJAX request to a server (whether Java, PHP, etc.), is it necessary to pass data as a JSON string?
Can we not pass the object directly? Are there issues de-serializing, or can that be handled at the backend? Any examples of handling a JS object (if it is possible to send an object directly) at the backend would be great.
An object literal makes sense only in the JavaScript runtime environment. Since an AJAX body is simply a string, you can pass {a:3} to a server. But what should the server side do with it? It certainly can store it in a database and return it back to you when requested. But what if it wanted to extract some data from it? You'd have to have a JS runtime and evaluate the object using eval, which would be awkward, but possible. However, not all servers have a JS runtime environment, whereas there are libraries for many languages that support parsing JSON into a representation specific to the language running on the server.
An AJAX request passes data to the server in the same way any other HTTP request does. Most commonly, AJAX requests use POST and pass data to the server as POST data, but query strings are often used, and there are other ways to pass data to a server using HTTP and AJAX.
In essence all HTTP data is octets (bytes), and HTTP has no special support for serialization of JavaScript objects, so you or the libraries and/or frameworks you use must handle the serialization.
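For illustration, a minimal sketch that serializes the object with JSON.stringify before sending it (the /api/save endpoint is made up):

// Serialize the JavaScript object into a JSON string for the request body
var payload = JSON.stringify({ a: 3 });

var xhr = new XMLHttpRequest();
xhr.open("POST", "/api/save");
xhr.setRequestHeader("Content-Type", "application/json");
xhr.send(payload);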

JSON parsing without using Java objects

I want to parse JSON data from a RESTful service.
Unlike a SOAP-based service, where a service consumer can create stubs and skeletons from the WSDL, in the case of a RESTful service the service consumer gets a raw JSON string.
Since the service consumer does not have a Java object matching the JSON structure, we are not able to use JSON-to-Java mappers like Gson, Jackson, etc.
Another way is to use parsers like JsonPath, minimal-json, etc., which help traverse the JSON structure and read the data.
Is there any better way of reading JSON data?
The official docs for Jackson mention 3 different ways to parse a JSON doc from Java. The first 2 do not require a "Java object matching the JSON structure". In summary:
Streaming API (aka "Incremental parsing/generation") reads and writes JSON content as discrete events.
Tree Model provides a mutable in-memory tree representation of a JSON document. ObjectMapper can build trees that consist of JsonNode nodes.
Data Binding converts JSON to and from POJOs based either on property accessor conventions or annotations.
With simple data binding you convert to and from Java Maps, Lists, Strings, Numbers, Booleans and nulls
With full data binding you convert to and from any Java bean type (as well as "simple" types mentioned above)
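As an illustration, a minimal sketch of the Tree Model approach (the JSON content here is made up):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TreeModelExample {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // Build a tree of JsonNode objects; no POJO matching the JSON structure is needed
        JsonNode root = mapper.readTree("{\"name\":\"example\",\"tags\":[\"a\",\"b\"]}");
        System.out.println(root.path("name").asText());        // example
        System.out.println(root.path("tags").get(0).asText()); // a
    }
}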
Another option is to generate Java beans from JSON documents. Your mileage may vary, and you will probably have to modify the generated files. There are at least 5 online tools for that purpose that you can try:
http://www.jsonschema2pojo.org/
http://pojo.sodhanalibrary.com/
https://timboudreau.com/blog/json/read
http://jsongen.byingtondesign.com/
http://json2java.azurewebsites.net/
There are also IDE plugins that you can use. For instance, this one for IntelliJ: https://plugins.jetbrains.com/idea/plugin/7678-jackson-generator-plugin
Gson supports working without objects, too. Something like this:
// responseContent is the raw JSON string returned by the service
JsonObject propertiesWrapper = new JsonParser().parse(responseContent).getAsJsonObject();
assertNotNull(propertiesWrapper);
propertiesWrapper = propertiesWrapper.getAsJsonObject("properties");
assertNotNull(propertiesWrapper);
JsonArray propertiesArray = propertiesWrapper.getAsJsonArray("property");
assertNotNull(propertiesArray);
assertTrue(propertiesArray.size() > 0, "The list of properties should not be empty.");
The problem is that working this way is so inconvenient that it is really better to create objects instead.
Jackson has exactly the same problems, and to an even greater extent: it is extremely inconvenient for direct JSON reading and creation. All its tutorials advise using POJOs instead, too.
The only really convenient way is to use Groovy. Groovy works as an envelope around Java: you can simply write Java code and use Groovy operators as needed. For reading and creating JSON or XML, Groovy is far more powerful than Java with any combination of its libraries, and it is even more convenient than a ready-made tree of POJOs prepared by someone else.
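For example, a minimal Groovy sketch with JsonSlurper (the JSON content is made up):

import groovy.json.JsonSlurper

def root = new JsonSlurper().parseText('{"name": "example", "tags": ["a", "b"]}')

// Property access and indexing navigate the parsed structure directly, with no POJOs
println root.name      // example
println root.tags[0]   // a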

Debugging json4s read deserialization errors

I am attempting to consume an API that I do not have control over, which is somewhat poorly documented and somewhat inconsistent. This means that sometimes the API returns a different type than what is documented or what you would normally see. For this example, we'll look at a case where an array was returned in a place where I would normally see a string. That makes for a crappy API, but my real problem is: how can I more easily track those things down? Right now, the errors look something like this:
No usable value for identifier
Do not know how to convert JArray(List(JString(3c8723eceb1a), JString(cba8849e7a2f))) into class java.lang.String
After deciphering the problem (why JValue::toString doesn't emit a JSON string is utterly perplexing to me), I can figure out that the API returned an array where my case class was only able to deal with Strings. Great. My issue is that finding this discrepancy between my object model and the contents of the JSON seems significantly more difficult than it should be.
Currently, this is my workflow for hunting down decoding errors:
Hope the bad data has some sort of identifying marker. If it does not, then there is much more guesswork, and you have to repeat the following steps for each entry that looks like the bad bits.
Go through the trouble of converting the JArray(List(JString(...), ...)) from the error message into valid JSON, hoping that I encode JSON the same way the API endpoint I got the data from does. If that is not true, then I use a JSON formatter (jq) to format all data consistently.
Locate the place in the source data where the decoding error originates from.
Backtrack through arrays and objects to discover how I need to change my object model to more accurately represent what data is coming back to me from the API.
Some background: I'm coming from C++, where I rolled my own JSON deserialization framework for this purpose. The equivalent error when using the library I built is:
Error decoding value at result.taskInstances[914].subtasks[5].identifier: expected std::string but found array value (["3c8723eceb1a","cba8849e7a2f"]) at 1:4084564
This is my process when using my hand-rolled library:
Look at the expected type (std::string) compared with the data that was actually found (["3c8723eceb1a","cba8849e7a2f"]), and alter my data model for the path to the data in the source (result.taskInstances[914].subtasks[5].identifier)
As you can see, I get to jump immediately to the problem that I actually have.
My question is: Is there a way to more quickly debug inconsistencies between my data model and the results I'm getting back from the API?
I'm using json4s-native_2.10 version 3.2.8.
A simplified example:
{ "property": ["3c8723eceb1a", "cba8849e7a2f"] }
Does not mesh with Scala class:
case class Thing(property: String)
The best solution would be to use Try (http://www.scala-lang.org/api/current/#scala.util.Try) in Scala, but unfortunately the json4s API does not use it.
So I think you should use the Scala Option type (http://www.scala-lang.org/api/current/#scala.Option).
In Scala, and more generally in functional languages, Options are used to represent an object that may or may not be there (like a nil value).
To handle parsing failures, you can use parse(str).toOption, which returns an Option[JValue], and you can do pattern matching on the resulting value.
To handle extraction of data into case classes, you can use the extractOpt function and pattern match on the resulting value.
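A minimal, self-contained sketch along those lines (the imports are for json4s-native; here I wrap parse in scala.util.Try, mentioned above, since parse itself throws on malformed input, so treat the exact details as assumptions against json4s 3.2.x):

import org.json4s._
import org.json4s.native.JsonMethods._
import scala.util.Try

implicit val formats = DefaultFormats

case class Thing(property: String)

val raw = """{ "property": ["3c8723eceb1a", "cba8849e7a2f"] }"""

// Wrap parsing in Try so malformed input yields None instead of throwing
val parsed: Option[JValue] = Try(parse(raw)).toOption

// extractOpt yields None when the JSON shape does not match the case class
val thing: Option[Thing] = parsed.flatMap(_.extractOpt[Thing])

thing match {
  case Some(t) => println(s"Extracted: $t")
  case None    => println("JSON did not match the expected shape of Thing")
}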
You can also read this answer: https://stackoverflow.com/a/15944506/2330361

How can I handle JSON strings in UniData?

I am receiving JSON strings from a web service. Are there any JSON parsers for UniData?
I am on version 7.2.
If you upgrade to UniData 7.3.x, you will indeed get access to a JSON (and more) parser.
Check out the manuals for a new feature called 'U2 Dynamic Objects' - UDO for short.
This will allow you to serialize, access, modify, etc., JSON strings.