NOTE: I don't believe this question is a duplicate of this similar question because it is more specific.
I'm attempting to retrieve multiple objects from Mongo with the nodejs-mongodb-driver and write the objects to an HTTP response as JSON. The objects should be in the form of an array, but I don't want to call toArray() on the cursor because of the memory overhead, and I try to avoid large JSON.stringify calls whenever possible.
var response = ... // an http response
collection.find().stream(JSON.stringify).pipe(response); // causes a malformed JSON string
The output in the browser appears as follows:
{"obj", "obj"}{"obj", "obj"} // clearly malformed
Is there an efficient way to do this?
I will explain the code you wrote so that you understand why it returns malformed JSON and why you probably need toArray() or the JSONStream library from the answer you posted.
First, collection.find() returns a Cursor object. At that point no data has been read. Then the .stream(JSON.stringify) call returns a readable Stream with JSON.stringify as its transformation function. Still no data has been read.
The .pipe(response) call then reads the entire Stream to the end and calls the JSON.stringify function for every object. Note that it really does call it for every single object separately and therefore does not create an array. Instead you get your malformed JSON, object after object.
Now the answer in the question you posted as a possible duplicate (Stream from a mongodb cursor to Express response in node.js) would work for you, but it requires an additional library, JSONStream. JSONStream properly handles the CursorStream for JSON output. I don't know if that really reduces the overhead, but you could try it.
Without an additional library you will have to use toArray().
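For reference, here is a minimal sketch of that JSONStream approach, assuming the JSONStream npm package; its stringify() wraps the streamed documents in '[', ',' and ']' so the response body arrives as one well-formed JSON array:

var JSONStream = require('JSONStream');

collection.find().stream()        // stream raw documents, no per-object transform
    .pipe(JSONStream.stringify()) // emits '[', the documents with separators, then ']'
    .pipe(response);              // written to the HTTP response as it goes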
I'm trying to read a sequence of JSON objects from a network stream. This involves finding complete JSON objects and returning them one by one. As soon as a complete JSON object has been received, I need to have it. Anything that follows that JSON object belongs to the next object and must only be used once the next complete object has been received.
I would have thought that the Utf8JsonReader class could do that, but apparently it cannot accept a Stream of any kind. It even seems that offering this possibility is deliberately avoided.
Now I'm wondering whether it's possible at all to use .NET's shiny new JSON parser to read from a stream when I don't know when data arrives or how much of it. Do I need to split the JSON object messages manually, or can the existing reader just stop when it has something and continue when the next piece is available? I mean, if it can do that on a predefined array of bytes, it could surely also do it with some waiting in between until more data is available. That just doesn't seem to be exposed in the public API. Or did I miss something?
The JsonDocument.ParseAsync(Stream) method cannot be used because it would read the stream to the end. That doesn't make sense for a network stream that stays open for a long time and only delivers some data from time to time.
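For what it's worth, Utf8JsonReader is resumable across buffer boundaries through its JsonReaderState, so incremental reading seems possible in principle. A minimal sketch, assuming you manage the network buffering yourself (the method and its surroundings are hypothetical):

using System;
using System.Text.Json;

// Parse whatever bytes have arrived so far, then carry the reader's state
// over to the next chunk. isFinalBlock: false makes an incomplete token end
// Read() with false instead of throwing.
static JsonReaderState ReadAvailable(ReadOnlySpan<byte> chunk, JsonReaderState state)
{
    var reader = new Utf8JsonReader(chunk, isFinalBlock: false, state);
    while (reader.Read())
    {
        // Inspect reader.TokenType here; EndObject at CurrentDepth 0 marks a
        // complete top-level JSON object. The reader expects a single
        // top-level value, so the state presumably has to be reset once a
        // complete object has been consumed.
    }
    // Bytes after reader.BytesConsumed are unconsumed and must be prepended
    // to the next chunk read from the network.
    return reader.CurrentState;
}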
New to Dart. It seems that .transform(JsonDecoder()) will hang until the stream is closed, or throw an error if it starts to see a new JSON object. I could cache the entire strings and parse them that way, but I would like to take advantage of the stream and not store more than is needed in memory.
Is there a way to get the JsonDecoder to push an object to the sink as soon as it has a complete, valid JSON object? I've tried extending some of the internal classes, but only got a private library error.
https://github.com/dart-lang/sdk/blob/1278bd5adb6a857580f137e47bc521976222f7b9/sdk/lib/_internal/vm/lib/convert_patch.dart#L1500 seems to be the relevant code, and it's really a pain in my butt. Would I need to create a dummy stream or something?
If the input is newline-separated, you can do:
import 'dart:convert';

Stream<dynamic> jsonObjects = inputStream
    .transform(utf8.decoder)         // if the incoming stream is bytes
    .transform(const LineSplitter())
    .map(jsonDecode);
The JsonDecoder converter only works on a single JSON value, because the JSON grammar doesn't allow more than one value in a JSON source text.
The LineSplitter will buffer until it has an entire line, then emit one line at a time, so if each JSON message is on a line by itself, each event from the line-split stream is a complete JSON value.
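Consuming the result could then look like this (a minimal sketch, reusing the jsonObjects stream from above):

await for (final obj in jsonObjects) {
  // Each event is one decoded JSON value (typically a Map or a List).
  print(obj);
}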
Is there a way to deserialize a JSON response to a custom object in Swift WITHOUT having to individually map each element?
Currently I am doing it manually with SwiftyJSON, but it still requires me to map every field redundantly:
var myproperty1 = json["myproperty1"].stringValue
However, coming from a C# background, it is as simple as one line of code:
JsonConvert.DeserializeObject<CustomObject>(jsonString); //No mapping needed.
Source - http://www.newtonsoft.com/json/help/html/DeserializeObject.htm
Since I am writing many API endpoints, I would like to avoid all the boilerplate mapping code, which can be prone to errors. Keep in mind that my JSON responses will be multiple levels deep, with arrays of arrays, so any function would need to be recursive.
Similar question: Automatic JSON serialization and deserialization of objects in Swift
You could use EVReflection for that. You can use code like:
var user:User = User(json:jsonString)
or
var jsonString:String = user.toJsonString()
See the GitHub page for more detailed sample code.
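A hedged sketch of what such a class might look like (the property names are hypothetical; EVReflection expects your class to inherit from its EVObject base):

import EVReflection

class User: EVObject {
    var myproperty1: String = ""
    var id: Int = 0
}

let user = User(json: jsonString)     // properties filled in by reflection
let roundTrip = user.toJsonString()   // and back to a JSON string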
You should be able to use key-value coding (KVC) for this. You'd have to make your object a subclass of NSObject for KVC to work, and then you could loop through the elements of the JSON data and use setValue:forKey: to assign a value for each key in the dictionary.
Note that this would be dangerous: if the target object does not contain a value for a specific key, your code would crash, so your program would crash on JSON data that contains unexpected keys. Not a good thing.
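A minimal sketch of that KVC idea (the CustomObject class and its properties are hypothetical, and the JSON keys are assumed to match the property names exactly):

import Foundation

// The target must inherit from NSObject for key-value coding to work, and
// the properties must be visible to Objective-C.
class CustomObject: NSObject {
    @objc var myproperty1: String = ""
    @objc var count: Int = 0
}

func decode(_ jsonData: Data) throws -> CustomObject {
    let dict = try JSONSerialization.jsonObject(with: jsonData) as! [String: Any]
    let obj = CustomObject()
    for (key, value) in dict {
        // Crashes on keys the class doesn't declare (the danger noted above);
        // overriding setValue(_:forUndefinedKey:) would avoid that.
        obj.setValue(value, forKey: key)
    }
    return obj
}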
I am attempting to consume an API that I do not have control over, which is somewhat poorly documented and somewhat inconsistent. This means that sometimes the API returns a different type than what is documented or what you would normally see. For this example, we'll look at a case where an array was returned in a place where I would normally see a string. That makes a crappy API, but my real problem is: how can I more easily track those things down? Right now, the errors look something like this:
No usable value for identifier
Do not know how to convert JArray(List(JString(3c8723eceb1a), JString(cba8849e7a2f))) into class java.lang.String
After deciphering the problem (why JValue::toString doesn't emit a JSON string is utterly perplexing to me), I can figure out that the API returned an array where my case class could only deal with Strings. Great. My issue is that finding this discrepancy between my object model and the contents of the JSON seems significantly more difficult than it should be.
Currently, this is my workflow for hunting down decoding errors:
1. Hope the bad data has some sort of identifying marker. If it doesn't, there is far more guesswork, and you have to repeat the following steps for each entry that looks like the bad bits.
2. Go through the trouble of converting the JArray(List(JString(...), ...)) from the error message into valid JSON, hoping that I encode JSON the same way as the API endpoint I got the data from does. If that's not the case, I use a JSON formatter (jq) to format all the data consistently.
3. Locate the place in the source data where the decoding error originates.
4. Backtrack through arrays and objects to discover how I need to change my object model to more accurately represent the data coming back from the API.
Some background: I'm coming from C++, where I rolled my own JSON deserialization framework for this purpose. The equivalent error when using the library I built is:
Error decoding value at result.taskInstances[914].subtasks[5].identifier: expected std::string but found array value (["3c8723eceb1a","cba8849e7a2f"]) at 1:4084564
This is my process when using my hand-rolled library:
Look at the expected type (std::string) compared with the data that was actually found (["3c8723eceb1a","cba8849e7a2f"]), and alter my data model for the path given in the source data (result.taskInstances[914].subtasks[5].identifier).
As you can see, I get to jump immediately to the problem that I actually have.
My question is: Is there a way to more quickly debug inconsistencies between my data model and the results I'm getting back from the API?
I'm using json4s-native_2.10 version 3.2.8.
A simplified example:
{ "property": ["3c8723eceb1a", "cba8849e7a2f"] }
Does not mesh with the Scala class:
case class Thing(property: String)
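For illustration, a sketch of the extraction that blows up (json4s-native, as in the version above):

import org.json4s._
import org.json4s.native.JsonMethods._

case class Thing(property: String)

implicit val formats: Formats = DefaultFormats

// Throws a MappingException roughly like the errors quoted earlier:
// "No usable value for property / Do not know how to convert JArray(...)"
parse("""{ "property": ["3c8723eceb1a", "cba8849e7a2f"] }""").extract[Thing]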
The best solution would be to use Try (http://www.scala-lang.org/api/current/#scala.util.Try) in Scala, but unfortunately the json4s API does not support it.
So I think you should use Scala's Option type (http://www.scala-lang.org/api/current/#scala.Option).
In Scala, and more generally in functional languages, Options are used to represent an object that may or may not be there (like a nil value).
To handle parsing failures, you can use parse(str).toOption, which returns an Option[JValue], and you can pattern match on the resulting value.
To handle extracting data into case classes, you can use the extractOpt function and pattern match on the value.
You can read this answer: https://stackoverflow.com/a/15944506/2330361
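Put together, a hedged sketch of that approach (reusing Thing and formats from the snippet above; parseOpt plays the role of parse(str).toOption):

val maybeThing: Option[Thing] =
  parseOpt("""{ "property": ["3c8723eceb1a", "cba8849e7a2f"] }""") // None on invalid JSON
    .flatMap(_.extractOpt[Thing])                                  // None when it doesn't fit the model

maybeThing match {
  case Some(thing) => println("decoded: " + thing)
  case None        => println("the JSON did not match the Thing model")
}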
It might be a frequent question, but I cannot figure out how to prevent errors in my parsing when the script can't find a property...
In XML it was easy, because even empty properties appeared as <location/>,
but now, if location is not available, the JSON parser can't find it and it results in errors... OR
it may happen that the JSON has a different property, or a child has lost its parent... so, for instance, the LocalityName you need to extract is no longer under SubAdministrativeArea but under AddressLine...
Do any of you have experience with this? What is the best way to handle it and parse correctly?
While answering one of your other questions, I had written the following JavaScript code to obtain the lat and lng from the JSON returned by the Maps API, without any validation for zero results.
$.getJSON("getjson.php?address=" + address,
  function (data) {
    var lat = data.results[0].geometry.location.lat;
    var lng = data.results[0].geometry.location.lng;
    //.....map initialization code
  }
);
Now, if I were to validate for zero results, I'd modify the code in the following way:
$.getJSON("getjson.php?address=" + address,
  function (data) {
    if (data.results.length > 0) {
      for (var count = 0; count < data.results.length; count++) {
        var lat = data.results[count].geometry.location.lat;
        var lng = data.results[count].geometry.location.lng;
        //.....map initialization code
      }
    }
  }
);
As you can see, parsing JSON comes naturally to JavaScript and to many other languages, since it resolves down to arrays/lists and objects/dictionaries/hashes.
If I get this right and you are using a library like Gson to convert to JSON, try to construct some kind of object array (like an ArrayList in Java) and then convert that to JSON, so every object you retrieve later in JavaScript is distinct and thus clear during debugging. Also, if you don't use Firebug, give it a try, as it shows JSON data clearly. Cheers.
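Back to the original question about missing or moved properties, a hedged sketch of defensive access (the property names follow the geocoder example in the question, but treat the exact structure as hypothetical):

// Check each level before reading it, so a missing property yields a
// fallback instead of a TypeError.
function getLocality(address) {
  var area = address && address.SubAdministrativeArea;
  if (area && area.Locality && area.Locality.LocalityName) {
    return area.Locality.LocalityName;
  }
  // Fallback for responses where LocalityName moved under AddressLine:
  if (address && address.AddressLine) {
    return address.AddressLine;
  }
  return null;
}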