I'm currently working on a project that has front-end components (Jira) written in Groovy, and back-end processes written in PowerShell. We're using JSON to pass information back and forth. One of the biggest problems we've encountered is coming up with a standardized "template" for the JSON that is used on both ends. What we have works, but it is a Frankenstein mess.
We are using JSON libraries on both the Groovy and PowerShell sides -- the JSON being constructed on either end is legit JSON. We are also encoding it to Base64 to get around interpolation issues we've run into.
My main question is this: what is the best practice for passing data between different tools in JSON? I'm relatively new to it. Is there some sort of standard template we should be adhering to? I develop the Groovy side, my friend the PowerShell side -- I was hoping to come up with something that would minimize problems if someone messed up how the JSON was constructed on either side. Something to check against. Something akin to an XSD.
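For illustration, here is the kind of check I'm picturing, using JSON Schema (which seems to be the JSON world's answer to XSD). This is only a sketch: the schema, the message shape, and the everit-org json-schema library on the JVM side are stand-ins, not what we actually run:

```java
import org.everit.json.schema.Schema;
import org.everit.json.schema.ValidationException;
import org.everit.json.schema.loader.SchemaLoader;
import org.json.JSONObject;

public class EnvelopeCheck {
    public static void main(String[] args) {
        // Hypothetical shared "template" for every message, written as a JSON Schema
        String schemaJson = "{"
                + "\"type\": \"object\","
                + "\"required\": [\"action\", \"payload\"],"
                + "\"properties\": {"
                + "  \"action\":  {\"type\": \"string\"},"
                + "  \"payload\": {\"type\": \"object\"}"
                + "}}";
        Schema schema = SchemaLoader.load(new JSONObject(schemaJson));

        try {
            // A message missing "payload" is rejected before it ever crosses sides
            schema.validate(new JSONObject("{\"action\": \"createIssue\"}"));
        } catch (ValidationException e) {
            System.err.println("Bad message: " + e.getMessage());
        }
    }
}
```

From what I've read, newer PowerShell versions also ship a Test-Json cmdlet that can validate against a schema, so both sides could check against one shared schema document.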
Was curious if people have dealt with this type of thing, and what the best approach was. As I mentioned before, we have something that works now -- with error handling and whatnot -- but it grew very organically and isn't standardized at all. I saw mention of JSONP, JSend, etc., but I'm having some difficulty grokking the options.
Tips/guidance appreciated.
My goal is to come up with a set of automated tests that will consume some services of an API and check that compatibility has not been broken (by mistake or on purpose).
My initial idea is to get the JSON results and deserialize them against the DTOs used in the serialization. The reason is to avoid the manual generation of a schema that can get really big and difficult to maintain. The problem is that libraries like GSON are very forgiving and tend not to throw exceptions when something goes wrong in the deserialization (unless we write a custom deserializer, which again will take time and maintenance effort).
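To make the problem concrete: GSON will happily map a drifted payload, while a stricter mapper can be told to fail loudly. A minimal sketch of that second option, swapping GSON for Jackson (UserDto is a made-up stand-in for one of the real DTOs):

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class CompatibilityCheck {
    // Hypothetical DTO standing in for one used in the real serialization
    static class UserDto {
        public long id;
        public String name;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);

        // Simulate the server renaming "name" to "fullName": the strict mapper
        // throws UnrecognizedPropertyException instead of silently leaving
        // UserDto.name null, as GSON would
        String drifted = "{\"id\": 1, \"fullName\": \"Alice\"}";
        mapper.readValue(drifted, UserDto.class);
    }
}
```

(As I understand it, Jackson also has an XML-format module with an XmlMapper, which might cover the XML responses as well.)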
Am I going in the right direction here? Or is there another way to ensure API compatibility?
(Ideally, I would like to test not only JSON, but also XML responses from the same API.)
My application's Vaadin version was upgraded from 7.3.6 to 7.4.6, and my JSON code is starting to break wherever I was using GSON.
I googled around and found a similar issue mentioned in this Stack Overflow post.
So should I conclude that elemental.json is not compatible with GSON, while org.json was?
GSON is not GWT-compatible, which makes sense since it requires reflection to do its work. Likewise, org.json is meant to be used on a normal JVM.
On the other hand, GWT knows it is running in the browser, so it doesn't need to implement its own JSON parser; the browser already has one. There are several ways to use JSON in GWT, and GSON, org.json, and the browser's built-in JSON parser all speak the same JSON.
All are compatible with each other, though you can't simply use elemental on the server or GSON on the client, or reuse the same server types on the client.
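To make the "same JSON text" point concrete, here is a minimal sketch parsing one string with both libraries. It assumes GSON and Vaadin's elemental.json jar (which includes a plain-JVM implementation) are on the classpath:

```java
import com.google.gson.JsonParser;
import elemental.json.Json;
import elemental.json.JsonObject;

public class SameJsonText {
    public static void main(String[] args) {
        String json = "{\"name\": \"example\", \"count\": 3}";

        // GSON's tree model (reflection-based library, JVM only)
        String viaGson = new JsonParser().parse(json)
                .getAsJsonObject().get("name").getAsString();

        // Vaadin's elemental.json
        JsonObject viaElemental = Json.parse(json);

        // Both read the same value from the same text
        System.out.println(viaGson);                        // example
        System.out.println(viaElemental.getString("name")); // example
    }
}
```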
What specifically is 'starting to break'? What errors do you get, and what does the data that you are trying to send look like?
(Also worth noting, some of the 'json' in the linked post uses ' quotes for properties and strings, which is not legal JSON and should not work in the first place with any proper JSON parser.)
There are many possibilities for parsing JSON in the context of a Windows Store app, regardless of the language (C#, JavaScript, or C++).
For example: .NET 4.5's JsonObject and DataContractJsonSerializer, the JavaScript JSON parser, or an external one like Json.NET.
Does anybody know anything about their relative performance?
I only read good things about Json.NET's performance.
But are they true, and do they matter for JSON documents that include datasets of 100k objects? Or will the user not notice a difference?
I only have experience using Json.NET ... it is fast and works great! I have also used the library in enterprise projects and was never disappointed!
If it helps, and FWIW, I've recently been collecting some new JSON parsing / deserialization performance data that can be observed over various JSON payload "shapes" (and sizes), when using four JSON libraries, here:
https://github.com/ysharplanguage/FastJsonParser#Performances
(.NET's out-of-the-box JavaScriptSerializer vs. JSON.NET vs. ServiceStack vs. JsonParser)
Please note:
these figures are for the full .NET only (i.e., the desktop / server tier; not mobile devices)
I was interested in getting new benchmark figures about parsing / deserialization performance only (i.e., not serialization)
finally, I was also especially interested (although not exclusively) in figures regarding strongly typed deserialization use cases (i.e., deserializing into POCOs)
Hope this helps.
Is DOM the only way to parse JSON?
Some JSON parsers do offer incremental ("streaming") parsing; for Java, at least the following parsers from the json.org page offer such an interface:
Jackson (pull interface)
Json-simple (SAX-style push interface)
(in addition to Software Monkey's parser referred to by another answer)
Actually, it is kind of odd that so many JSON parsers do NOT offer this simple low-level interface -- after all, they already need to implement low-level parsing, so why not expose it?
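To show what a pull interface looks like, here is a minimal sketch against Jackson's streaming API (assuming a Jackson 2.x jar; in the old 1.x line the factory method was createJsonParser instead):

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;

public class PullDemo {
    public static void main(String[] args) throws Exception {
        String json = "{\"name\": \"widget\", \"sizes\": [1, 2, 3]}";
        JsonParser p = new JsonFactory().createParser(json);

        // The caller "pulls" one token at a time and reacts to it
        JsonToken t;
        while ((t = p.nextToken()) != null) {
            if (t == JsonToken.FIELD_NAME && "name".equals(p.getCurrentName())) {
                p.nextToken();                   // advance to the field's value
                System.out.println(p.getText()); // prints: widget
            }
        }
        p.close();
    }
}
```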
EDIT (June 2011): Gson, too, has its own streaming API (as of Gson 1.6).
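A similarly minimal sketch with Gson's streaming reader (the array content is just an example):

```java
import com.google.gson.stream.JsonReader;
import java.io.StringReader;

public class GsonStreamDemo {
    public static void main(String[] args) throws Exception {
        JsonReader reader = new JsonReader(new StringReader("[1, 2, 3]"));

        reader.beginArray();              // consume '['
        while (reader.hasNext()) {
            System.out.println(reader.nextInt());
        }
        reader.endArray();                // consume ']'
        reader.close();
    }
}
```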
By DOM, I assume you mean that the parser reads an entire document at once before you can work with it. Note that saying DOM tends to imply XML these days, but IMO that is not really an accurate inference.
So, in answer to your questions -- "Yes", there are streaming APIs, and "No", DOM is not the only way. That said, processing a JSON document as a stream is often problematic in that many objects are not simple field/value pairs, but contain other objects as values, which you need to parse to process, and this tends to end up being recursive. But for simple messages you can do useful things with a stream/event based parser.
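To illustrate that recursive shape, here is a sketch of a streaming walk over arbitrarily nested values, written against Jackson's pull API for concreteness (any pull parser would do; this is not the parser described next):

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import java.io.IOException;

public class RecursiveWalk {
    // Consume one JSON value; descend into nested objects/arrays as needed
    static void walkValue(JsonParser p, JsonToken t) throws IOException {
        if (t == JsonToken.START_OBJECT) {
            while (p.nextToken() != JsonToken.END_OBJECT) {
                String field = p.getCurrentName(); // we are on a FIELD_NAME token
                System.out.println("field: " + field);
                walkValue(p, p.nextToken());       // recurse into the field's value
            }
        } else if (t == JsonToken.START_ARRAY) {
            for (JsonToken e = p.nextToken(); e != JsonToken.END_ARRAY; e = p.nextToken()) {
                walkValue(p, e);                   // recurse into each element
            }
        }
        // scalars (strings, numbers, booleans, null) end the recursion
    }

    public static void main(String[] args) throws IOException {
        JsonParser p = new JsonFactory().createParser("{\"a\": {\"b\": [1, {\"c\": 2}]}}");
        walkValue(p, p.nextToken());
        p.close();
    }
}
```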
I have written a pull-event parser for JSON (it was one class, about 700 lines). But most of the others I have seen are document-oriented. One of the layers I have built on top of my parser is a document reader, which took about 30 LOC. I have only ever used my parser in practice as a document loader (for the above reason).
I am sure that if you search the net you will find pull- and push-based parsers for JSON.
EDIT: I have posted the parser to my site for download. A working compilable class and a complete example are included.
EDIT2: You'll also want to look at the JSON website.
As stefanB mentioned, http://lloyd.github.com/yajl/ is a C library for stream parsing JSON. There are also many wrappers mentioned on that page for other languages:
yajl-ruby - ruby bindings for YAJL
yajl-objc - Objective-C bindings for YAJL
YAJL IO bindings (for the IO language)
Python bindings come in two flavors: py-yajl or yajl-py
yajl-js - node.js bindings (mirrored to github).
lua-yajl - lua bindings
ooc-yajl - ooc bindings
yajl-tcl - tcl bindings
Some of them may not allow streaming, but many of them certainly do.
If you want to use pure JavaScript and a library that runs both in Node.js and in the browser, you can try clarinet:
https://github.com/dscape/clarinet
The parser is event-based, and since it’s streaming it makes dealing with huge files possible. The API is very close to sax and the code is forked from sax-js.
Disclaimer: I'm suggesting my own project.
I maintain a streaming JSON parser in JavaScript which combines some of the features of SAX and DOM:
Oboe.js website
The idea is to allow streaming parsing, but not require the programmer to listen to lots of different events like with raw SAX. I like SAX but it tends to be quite low level for what most people need. You can listen for any interesting node from the JSON stream by registering JSONPath patterns.
The code is on Github here:
Oboe.js Github page
LitJSON supports a streaming-style API. Quoting from the manual:
"An alternative interface to handling JSON data that might be familiar to some developers is through classes that make it possible to read and write data in a stream-like fashion. These classes are JsonReader and JsonWriter.
"These two types are in fact the foundation of this library, and the JsonMapper type is built on top of them, so in a way, the developer can think of the reader and writer classes as the low-level programming interface for LitJSON."
If you are looking specifically for Python, then ijson claims to support it. However, it is only a parser, and I didn't come across anything for Python that can generate JSON as a stream.
For C++ there is rapidjson, which claims to support both parsing and generation in a streaming manner.
Here's a Node.js npm library for parsing and handling streams of JSON:
https://npmjs.org/package/JSONStream
For Python, an alternative (apparently lighter and more efficient) to ijson is jsaone (see that link for rough benchmarks, showing that jsaone is approximately 3x faster).
DISCLAIMER: I'm the author of jsaone, and the tests I made are very basic... I'll be happy to be proven wrong!
Answering the question title: YAJL, a JSON parser library in C:
YAJL remembers all state required to support restarting parsing. This allows parsing to occur incrementally as data is read off a disk or network.
So I guess using YAJL to parse JSON can be considered processing a stream of data.
In reply to your 2nd question: no, many languages have JSON parsers. PHP, Java, C, Ruby, and many others. Just Google for the language of your choice plus "JSON parser".