How to deserialize JSON to object in Swift WITHOUT mapping

Is there a way to deserialize a JSON response to a custom object in Swift WITHOUT having to individually map each element?
Currently I am doing it manually with SwiftyJSON but it still requires me to map every field redundantly:
var myproperty1 = json["myproperty1"].stringValue
However, coming from a C# background, it is as simple as one line of code:
JsonConvert.DeserializeObject<CustomObject>(jsonString); //No mapping needed.
Source - http://www.newtonsoft.com/json/help/html/DeserializeObject.htm
Since I am writing many API endpoints I would like to avoid all the boilerplate mapping code that can be prone to errors. Keep in mind that my JSON responses will be multiple levels deep with arrays of arrays. So any function would need to be recursive.
Similar question: Automatic JSON serialization and deserialization of objects in Swift

You could use EVReflection for that. You can use code like:
var user: User = User(json: jsonString)
or
var jsonString: String = user.toJsonString()
See the GitHub page for more detailed sample code.

You should be able to use key-value coding (KVC) for this. You'd have to make your object a subclass of NSObject for KVC to work, and then you could loop through the elements of the JSON dictionary and use setValue(_:forKey:) to assign a value for each key.
Note that this would be dangerous: if the target object does not have a property matching a given key, setValue(_:forKey:) raises an exception, so your program would crash on JSON data that contains unexpected keys. Not a good thing.
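Here is a rough sketch of what that KVC approach could look like. CustomObject and its properties are invented for illustration, and overriding setValue(_:forUndefinedKey:) is one way to avoid the crash described above:

import Foundation

// The model must subclass NSObject and expose its properties to the
// Objective-C runtime so that key-value coding can set them.
class CustomObject: NSObject {
    @objc dynamic var myproperty1: String = ""
    @objc dynamic var count: Int = 0

    // Without this override, an unexpected key raises NSUndefinedKeyException.
    override func setValue(_ value: Any?, forUndefinedKey key: String) {
        print("Ignoring unknown key: \(key)")
    }
}

func decode(jsonString: String) -> CustomObject? {
    guard let data = jsonString.data(using: .utf8),
          let dict = (try? JSONSerialization.jsonObject(with: data)) as? [String: Any]
    else { return nil }

    let object = CustomObject()
    // Assign every top-level key via KVC instead of mapping each field by hand.
    for (key, value) in dict {
        object.setValue(value, forKey: key)
    }
    return object
}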

Related

How does IPCRenderer.send() serialize data to JSON?

I am trying to send information about an error event using ipcRenderer.send("error", errorObject), but my Error object gets serialized to '{}' in the listener. Now, I know that ipcRenderer serializes objects to JSON internally (more information here: https://electronjs.org/docs/api/ipc-renderer), so I want to find out what method is called internally for the serialization to JSON so that I can try to override it in my code. Can anyone help?
I guess it's using JSON.stringify(), but it's probably serialized that way for security reasons, so maybe it's better not to override it. By the way, I don't think overriding JSON.stringify() is good practice in any case. I hadn't noticed that ipcRenderer.send serializes data; I pass a plain JavaScript object as data and don't parse it on the ipcMain side.
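For what it's worth, the empty object comes from how JSON.stringify treats Error instances rather than from anything IPC-specific, so a common workaround is to send a plain object instead of the Error itself. A quick sketch (the "error" channel name is taken from the question):

// message and stack are non-enumerable properties of Error,
// so JSON.stringify drops them and you end up with '{}'.
const err = new Error("boom");
console.log(JSON.stringify(err)); // "{}"

// Workaround: send a plain object carrying the fields you care about.
const { ipcRenderer } = require("electron");
ipcRenderer.send("error", { message: err.message, stack: err.stack });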

Efficient way of parsing Jax-ws Restful Response

I need to parse the JAX-WS REST response, and I have tried the following two ways of parsing it. Both work well, but I need to know the most efficient way of implementing this. Please give me your view.
First approach (sketched below):
Use getEntity() to get the response as an InputStream.
Use Jackson's ObjectMapper readValue() to convert the InputStream to a Java object.
Use the getters and setters of the nested Java classes to get the response object's member values.
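A minimal sketch of the first approach; ResponseBean and its status field are hypothetical placeholders for whatever the service actually returns:

import java.io.InputStream;
import javax.ws.rs.core.Response;
import com.fasterxml.jackson.databind.ObjectMapper;

public class FirstApproach {

    // Hypothetical bean; in practice this mirrors the real response structure.
    public static class ResponseBean {
        private String status;
        public String getStatus() { return status; }
        public void setStatus(String status) { this.status = status; }
    }

    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static ResponseBean parse(Response response) throws Exception {
        // On an unconsumed client Response, getEntity() is the raw entity stream.
        InputStream body = (InputStream) response.getEntity();
        // Bind the stream straight to the bean -- no intermediate String is created.
        return MAPPER.readValue(body, ResponseBean.class);
    }
}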
Second approach (sketched below):
Use getEntity() to get the response as an InputStream, then convert the InputStream to a String.
Using the Google JSON library, convert the String to a JSON object.
Use the JSON parser to get the nested objects' member values.
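And a rough sketch of the second approach for comparison, assuming the Google JSON library meant here is Gson; the status key is again a hypothetical example field:

import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import javax.ws.rs.core.Response;
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

public class SecondApproach {

    public static String parse(Response response) throws Exception {
        InputStream body = (InputStream) response.getEntity();
        // The extra step the first approach avoids: the whole payload is
        // materialised as a String first (InputStream#readAllBytes needs Java 9+).
        String json = new String(body.readAllBytes(), StandardCharsets.UTF_8);

        // JsonParser.parseString needs Gson 2.8.6+; older versions use
        // new JsonParser().parse(json) instead.
        JsonObject root = JsonParser.parseString(json).getAsJsonObject();
        return root.get("status").getAsString();
    }
}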
I would say the first approach is better for two reasons:
You don't go through the intermediate step of reading the response payload into a String.
The setter methods that are called during Jackson deserialization may perform validation on the input and throw appropriate exceptions, so you get validation during deserialization.
Maybe not a general answer to this question, but here is another variant of what you're describing under the first approach. I would start with a generic data structure and only introduce an extra bean if necessary. I wouldn't use a String to pass structured data around.
Use Jackson to convert the JSON response to a Map<String,Object> or a JsonNode (a sketch follows after the advantages and disadvantages below).
Advantage:
You don't need to implement a specialized bean class. Even a very simple bean can become unwieldy over time (if the format changes, new nested structures are added to the JSON response, etc.). It also introduces a kind of metaphor into your code, which sometimes helps but can also be misleading.
Map<String,Object> is in the JDK and offers a good interface for accessing data. You don't have to change any interfaces even if the JSON format changes.
You can always pass your data around in the form of a Map<String,Object>.
Disadvantage:
Data encapsulation. The map is a very close representation of the input data and therefore does not offer the same level of abstraction as a bean.
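As a sketch of this generic variant (the status key is a hypothetical example field):

import java.io.InputStream;
import java.util.Map;
import javax.ws.rs.core.Response;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class GenericApproach {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Bind the response to a generic Map<String,Object> instead of a bean.
    public static Map<String, Object> asMap(Response response) throws Exception {
        InputStream body = (InputStream) response.getEntity();
        return MAPPER.readValue(body, new TypeReference<Map<String, Object>>() {});
    }

    // Or navigate the tree with JsonNode; path() never returns null,
    // so nested access does not blow up on missing fields.
    public static String statusViaTree(Response response) throws Exception {
        JsonNode root = MAPPER.readTree((InputStream) response.getEntity());
        return root.path("status").asText();
    }
}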

How to get JSON output from Cloudant CouchDB?

I'm new to Cloudant and still figuring out the various APIs that it provides for access and retrieval.
So far, I've used findByIndex with a selector string, which returns the JSON deserialised into the POJO type supplied.
For a certain use case I'm working on, I need the JSON only and not the deserialised object. Is there any way or API by which I can get the actual Cloudant doc as JSON only?
If you know the ID of the document and only want to read the raw JSON, you can use the find method to get the raw InputStream from the request. Currently, using the findByIndex method you can only get a deserialised Java object back, so you could always use a Map object as your POJO. That being said, it might be possible to pass the type String as the class to deserialise to; however, I have not tested this.
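A rough sketch of the two options mentioned above, using the java-cloudant client; the method names follow the answer, but exact signatures may differ between client versions, so treat this as an outline rather than verified code:

import java.io.InputStream;
import java.util.List;
import java.util.Map;
import com.cloudant.client.api.Database;

public class CloudantRawJson {

    // Raw JSON when the document ID is known: the find(String) overload
    // returns the document body as an InputStream without deserialising it.
    public static InputStream rawDocument(Database db, String docId) {
        return db.find(docId);
    }

    // findByIndex with a generic Map "POJO" instead of a specialized class.
    @SuppressWarnings("rawtypes")
    public static List<Map> bySelector(Database db, String selectorJson) {
        return db.findByIndex(selectorJson, Map.class);
    }
}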

Debugging json4s read deserialization errors

I am attempting to consume an API that I do not have control over, which is somewhat poorly documented and somewhat inconsistent. This means that sometimes the API returns a different type than what is documented or what you would normally see. For this example, we'll look at a case where an array was returned in a place where I would normally see a string. That makes for a crappy API, but my real problem is: how can I more easily track those things down? Right now, the errors look something like this:
No usable value for identifier
Do not know how to convert JArray(List(JString(3c8723eceb1a), JString(cba8849e7a2f))) into class java.lang.String
After deciphering the problem (why JValue::toString doesn't emit a JSON string is utterly perplexing to me), I can figure out that the API returned an array where my case class only knows how to deal with a String. Great. My issue is that finding this discrepancy between my object model and the contents of the JSON seems significantly more difficult than it should be.
Currently, this is my workflow for hunting down decoding errors:
Hope the bad data has some sort of identifying marker. If it does not, there is far more guesswork and you have to repeat the following steps for each entry that looks like the bad bits.
Go through the trouble of converting the JArray(List(JString(...), ...)) from the error message into valid JSON, hoping that I encode JSON the same way the API endpoint I got the data from does. If that is not true, I use a JSON formatter (jq) to format all the data consistently.
Locate the place in the source data where the decoding error originates from.
Backtrack through arrays and objects to discover how I need to change my object model to more accurately represent what data is coming back to me from the API.
Some background: I'm coming from C++, where I rolled my own JSON deserialization framework for this purpose. The equivalent error when using the library I built is:
Error decoding value at result.taskInstances[914].subtasks[5].identifier: expected std::string but found array value (["3c8723eceb1a","cba8849e7a2f"]) at 1:4084564
This is my process when using my hand-rolled library:
Look at the expected type (std::string) compared with the data that was actually found (["3c8723eceb1a","cba8849e7a2f"]) and alter my data model for the path for the data in the source (result.taskInstances[914].subtasks[5].identifier)
As you can see, I get to jump immediately to the problem that I actually have.
My question is: Is there a way to more quickly debug inconsistencies between my data model and the results I'm getting back from the API?
I'm using json4s-native_2.10 version 3.2.8.
A simplified example:
{ "property": ["3c8723eceb1a", "cba8849e7a2f"] }
Does not mesh with the Scala case class:
case class Thing(property: String)
The best solution would be to use Try (http://www.scala-lang.org/api/current/#scala.util.Try) in Scala, but unfortunately the json4s API does not use it.
So I think you should use Scala's Option type (http://www.scala-lang.org/api/current/#scala.Option).
In Scala, and more generally in functional languages, Options are used to represent an object that may or may not be there (like a nil value).
To handle parsing failures, you can use parse(str).toOption, which returns an Option[JValue], and you can pattern match on the resulting value.
To handle extraction of the data into case classes, you can use the extractOpt function and pattern match on the result.
You can read this answer: https://stackoverflow.com/a/15944506/2330361
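A small sketch of the Option-based approach against the simplified example above (using parseOpt here, which serves the same purpose as parse(str).toOption):

import org.json4s._
import org.json4s.native.JsonMethods._

// The case class from the simplified example above.
case class Thing(property: String)

object DebugExtraction extends App {
  implicit val formats: Formats = DefaultFormats

  val raw = """{ "property": ["3c8723eceb1a", "cba8849e7a2f"] }"""

  // Parse failures become None instead of an exception.
  val parsed: Option[JValue] = parseOpt(raw)

  // extractOpt yields None when the JSON shape does not fit the case class,
  // so the array-vs-String mismatch surfaces as None rather than a MappingException.
  parsed.flatMap(_.extractOpt[Thing]) match {
    case Some(thing) => println(s"Decoded: $thing")
    case None        => println("Could not decode Thing from: " + raw)
  }
}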

Stream objects from MongoDB cursor into nodejs HTTP response

NOTE: I don't believe this question is a duplicate of this similar question because it is more specific.
I'm attempting to retrieve multiple objects from Mongo with the nodejs-mongodb-driver and write the objects to an HTTP response as JSON. The objects should be in the form of an array, but I don't want to call toArray() on the cursor because of the memory overhead, and I try to avoid large JSON.stringify calls whenever possible.
var response = ... // an http response
collection.find().stream(JSON.stringify).pipe(response); // causes a malformed JSON string
The output in the browser appears as follows.
{"obj", "obj"}{"obj", "obj"} // clearly malformed
Is there an efficient way to do this?
I will explain the code you wrote so that you understand why it returns malformed JSON and why you probably need toArray() or the JSONStream library from the answer you posted.
First, collection.find() returns a Cursor object. At that point no data has been read. Then the .stream(JSON.stringify) call returns a readable Stream with the transformation function JSON.stringify. Still no data has been read.
The .pipe(response) call then reads the entire Stream to the end, and for every object it calls the JSON.stringify function. Note that it really does call it for every single object separately and therefore does not create an array. Instead you get your malformed JSON, object after object.
Now, the answer in the question you posted as a possible duplicate (Stream from a mongodb cursor to Express response in node.js) would work for you, but it requires an additional library with a JSONStream (sketched below). The JSONStream properly handles the cursor stream for JSON output. I don't know if that really reduces the overhead though, but you could try it.
Without an additional library you will have to use toArray().
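A sketch of the JSONStream variant referenced above (it needs the JSONStream package installed; collection and response are the same objects as in the question):

const JSONStream = require('JSONStream');

// JSONStream.stringify() wraps the documents in '[' ... ',' ... ']' as they
// stream through, so the client receives one well-formed JSON array without
// buffering the whole result set in memory.
collection.find().stream()
  .pipe(JSONStream.stringify())
  .pipe(response);

// The no-extra-library alternative buffers everything:
// collection.find().toArray((err, docs) => response.end(JSON.stringify(docs)));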