Can/should I use YAML as the payload in a RESTful web service?

As the title says.
In general I like YAML more than JSON these days. I implemented a RESTful web service PoC using JSON a while back, and I was wondering whether I could use YAML instead.
E.g. are there enough tools/libraries/support for doing that? Or would I end up doing quite a bit of mundane/tedious coding that I would have avoided had I used JSON instead?
Also, as I understand it, REST doesn't restrict one from using YAML as the payload; is that correct?
Thanks!

Yes, if one of your goals is for the data to be especially readable by humans. REST itself isn't focused on protocols/formats so much as on patterns.
There's not a lot to gain here for web services, however, which typically represent app-to-app communication. Computers don't care, and JSON can be pretty-printed to improve legibility somewhat.
YAML is well supported by mainstream languages, though not always included in standard libraries as JSON typically is. So you'll probably be looking at an additional library dependency.
Also, if the client is a browser, parsing will be slower, since you'll have to use a non-native external library such as the one discussed in JavaScript YAML Parser. Make sure the response gets compressed in transit, or the extra indentation spaces will inflate the size of the data.
Also, YAML has a lot of esoteric and potentially dangerous features. Whenever I use it, I use the "safe" parser and deactivate most of its features beyond plain data structures.
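With Python's PyYAML, for example (just one possible stack, shown purely as an illustration), sticking to the safe loader looks roughly like this:
import yaml  # PyYAML; assumed to be the parser in use
untrusted = "user: alice\nroles:\n  - admin\n  - editor"
# safe_load only builds plain data structures (dicts, lists, strings, numbers)
# and refuses the language-level tags an unsafe loader would instantiate.
data = yaml.safe_load(untrusted)
print(data)  # {'user': 'alice', 'roles': ['admin', 'editor']}
# A payload tagged !!python/object/apply:os.system would raise a
# ConstructorError here instead of executing anything.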
I could imagine some utility as a debug option, however, perhaps a .yaml URL suffix or a ?fmt=yaml query parameter to assist during development. Otherwise, there is not much gain for the trouble.
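If you do go the debug-parameter route, the idea might look like this rough sketch (Flask and PyYAML are used purely as an illustration; the /orders endpoint and its payload are invented):
import json
import yaml  # PyYAML
from flask import Flask, Response, request
app = Flask(__name__)
@app.route("/orders/<int:order_id>")
def get_order(order_id):
    # Hypothetical payload; a real service would fetch this from storage.
    payload = {"id": order_id, "status": "shipped", "items": ["book", "pen"]}
    # JSON by default; ?fmt=yaml switches to YAML for eyeballing during development.
    if request.args.get("fmt") == "yaml":
        return Response(yaml.safe_dump(payload), mimetype="application/x-yaml")
    return Response(json.dumps(payload), mimetype="application/json")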

Related

Why does Dart use YAML for its package manager?

I'm asking about the reason for using YAML for package management (pubspec.yaml) in Dart. Why did they choose YAML rather than JSON? What is unique about YAML that makes it the preferred choice for this purpose over the alternatives?
In my opinion:
1. Readability
YAML is much more readable; its indentation-based syntax feels like Python.
2. Comments
In JSON you can't write comments.
Maintaining a Flutter/Dart application of course requires comments in the pubspec file (see the sketch after this list).
3. Speed
JSON files are indeed smaller and faster to parse, but cross-platform developers put more emphasis on readability and speed of development. Besides, modern mobile hardware makes the parsing difference negligible.
4. Complexity
JSON's structure is simpler, so it is less suited to complex configurations.
But with YAML, be aware that whitespace matters (and tabs are not allowed for indentation).
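To make point 2 concrete, here is a small Python sketch (PyYAML assumed; the pubspec-style snippet is invented): the same idea parses fine as commented YAML, while JSON rejects any comment syntax:
import json
import yaml  # PyYAML
commented_yaml = 'environment:\n  # which SDK versions this package supports\n  sdk: ">=2.12.0 <3.0.0"'
print(yaml.safe_load(commented_yaml))
# {'environment': {'sdk': '>=2.12.0 <3.0.0'}}
try:
    json.loads('{ /* no comments allowed */ "sdk": ">=2.12.0" }')
except json.JSONDecodeError as err:
    print("JSON refuses comments:", err)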
What is the difference between YAML and JSON?
Most importantly: support for comments and better readability.
The design goal of JSON is to be as simple as possible and be universally usable. This has reduced the readability of the data, to some extent. In contrast, the design goal of YAML is to provide a good human-readable format and provide support for serializing arbitrary native data structures.
Source: JSON vs. YAML: A Dive Into 2 Popular Data Serialization Languages

JSON and HTML trying to understand

According to a post on Stack Overflow called "What is JSON and why would I use it?": "web services used XML as their primary data format for transmitting data back, but since JSON appeared, it has become the preferred method." Why do most web services use JSON over XML? Is it because JSON is a better method for data interchange?
XML was designed primarily for document formats, e.g. papers in scientific journals. It contains many features that aren't needed for simple data interchange, and those features can get in the way when you are processing XML because they can't be easily represented in JavaScript. So the code for processing the XML ends up a lot more complicated than it could be. By contrast, JSON maps exactly onto the data structures JavaScript can handle natively. Of course, that problem could in principle be solved by using a language with better XML support than JavaScript - XSLT, for example - but unfortunately XSLT in the browser has never had the same level of investment put into it.
Additionally, for reasons I have never understood, the browser security folks decided that reading JSON from alien web sites (i.e. from a different domain from your HTML page) is safe, but reading XML from alien sites isn't. So if you switch from XML to JSON, you get rid of a lot of cross-site-scripting hassle.
JSON is less verbose and it is sufficient for simple data transmission, i.e. if you do not need any transformations (XSLT).
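As a rough illustration of that "exact match" point (shown in Python rather than JavaScript, with made-up data): JSON deserializes straight into native lists and dicts, while XML hands you a tree you still have to walk:
import json
import xml.etree.ElementTree as ET
json_doc = '{"user": "alice", "tags": ["admin", "editor"]}'
xml_doc = '<user name="alice"><tag>admin</tag><tag>editor</tag></user>'
# JSON: one call and you already have native data structures.
data = json.loads(json_doc)
print(data["tags"])  # ['admin', 'editor']
# XML: you get an element tree and must decide how attributes, child
# elements and text map onto your data model.
root = ET.fromstring(xml_doc)
print(root.get("name"), [tag.text for tag in root.findall("tag")])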

IDL for JSON REST/RPC interface

We are designing a fairly complex REST API in which most of the I/O consists of JSON-encoded objects with a specific structure. One challenge we have found is documenting the API in a way that makes it easier for clients to post correct input and process the output. Because both the input and the output involve fairly complex JSON objects, client developers often introduce bugs related to the structure of these I/O objects.
With all of the JSON web APIs around these days, I would have hoped for a general solution, but I am having a hard time finding one. I looked into JSON Schema, a JSON validation vocabulary, but both the IETF draft and the implementations seem fairly immature (even though they have been around for a while, which is not a good sign).
A slightly different approach is offered by Protocol Buffers and Apache Avro, where the schema is not used for validation but is actually required for encoding/decoding the message. Of these two, Avro seems to have rather limited documentation and implementations. Protobuf seems better, but I am not sure whether it is really suitable for use in the browser to call a JSON API?
Now I am starting to doubt if I am looking at this from the right angle. Are there other methods available to make my API a bit more strong-typed'ish? Or is a formal description of a JSON REST/RPC API something that defeats the purpose of using JSON?
Edit: 6 months after this topic we found mongoose, which is very close to what we were looking for.
Below a reply I received by email from Douglas Crockford.
I am not a believer in schemas as an alternative to input validation. There are properties that cannot be verified from the syntax. I think that was one of the ways that XML went wrong.
If your formats are too complex, then I would look at simplifying them.
Such systems exist and I'm the author of one of them. It is called Piqi-RPC and it does IDL-based validation of the input and output parameters for RPC-style APIs over HTTP.
It supports JSON, XML and Google Protocol Buffers as data representation formats for input and output of HTTP POST requests. Clients can choose to use any of the three formats and specify their choice using the standard Accept and Content-Type HTTP headers.
So, yes, in theory, you are looking in the right direction. However, at the moment, Piqi-RPC supports writing servers only in Erlang and it wouldn't be very useful for you if you use a different stack. I heard that Apache Thrift also supports JSON over HTTP transport, but I haven't checked. Another kind of similar system I know of (also for Erlang) is called UBF. I have heard of libraries for Java that can parse and validate JSON based on Protocol Buffers specification (e.g. http://code.google.com/p/protostuff/).
The idea itself is far from being new, but there aren't many systems that approach it in practice. It is a challenging problem.
Historically, IDLs were used for interface definition and binary data serialization, and not so much for validating dynamic data interchange formats (e.g. XML and JSON), which emerged later. Sun-RPC IDL and CORBA IDL fall into the first category. WSDL would be one of the few examples covering both areas, but it is a terrible piece of technology and would be a bad choice for most modern systems. In addition, there are many schema languages (also known as DDLs -- data definition languages), most of which are highly specialized and work with only one representation format, e.g. XML or JSON schemas. Few of those have stable implementations.
The Piqi project, and Piqi-RPC which is based on it, are built around several fairly simple realizations:
A DDL doesn't have to be explicitly tied to any particular data representation format or built around one. Instead, such a language can be fairly universal and cover a wide range of practical use cases (e.g. cross-language data serialization and data validation) and data formats (e.g. JSON, XML, Protocol Buffers).
IDL for RPC-style communication can be implemented as a thin, mostly syntactic layer on top of the universal DDL.
Such IDL and interface specifications can be transport agnostic.
Speaking of REST-style APIs over HTTP compared to RPC-style APIs over HTTP:
With RPC-style APIs, a service developer or an automated system has to validate three things: the function name (according to some service naming scheme), the input and, if you choose so, the output.
In the case of REST-style APIs, people get themselves into trouble for no good reason, because they have a lot more to validate: arbitrarily complex URL syntax, including dynamic parameters encoded in URL segments (for all HTTP methods) and in the URL query string (only for HTTP GET), the HTTP method correspondence (whether it should be GET, POST, PUT, DELETE, etc.), the HTTP body when some parameters go there (sometimes they do it manually twice, for parameters represented in JSON and in XML), custom HTTP headers, and, separately, the service documentation. Imagine an IDL supporting all that!
XML is better for RESTful services in many ways. It has native linking (<link href=, for all those HATEOAS fans), native language support (lang="en") and a great ecosystem.
It is also better for future proofing and future API refactorings. Converting this:
<profile>
    <username>alganet</username>
</profile>
To support more usernames:
<profile>
    <username>alganet</username>
    <username>alexandre</username>
</profile>
is much simpler to do in XML without breaking existing clients. JSON makes that kind of change harder.
If you really need JSON, JSON Schema is the way to go. It's immature, but I don't know of anything better for that case. Maybe your consumers could choose between XML and JSON, so they can pick between a small payload (JSON) or RESTful candies (XML) using content negotiation.
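If JSON Schema is the route taken, the validation side might look like this minimal Python sketch (it uses the jsonschema package; the profile schema mirrors the example above and is otherwise invented):
from jsonschema import ValidationError, validate  # pip install jsonschema
profile_schema = {
    "type": "object",
    "properties": {
        "usernames": {"type": "array", "items": {"type": "string"}, "minItems": 1},
    },
    "required": ["usernames"],
}
try:
    validate({"usernames": ["alganet", "alexandre"]}, profile_schema)  # passes
    validate({"usernames": []}, profile_schema)                        # violates minItems
except ValidationError as err:
    print("Rejected:", err.message)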
I'd say the answer to your last question is yes. If you need a way to constrain and document the JSON "schema", why didn't you go with XML in the first place? It is not that much harder to parse, and being able to enforce a schema for it is a great advantage.

Is there anything wrong with the YAML format that keeps it from joining the web standards?

Well, I think YAML is really fantastic...
It's beautiful, easy to read, clever syntax...compared to any other data serialization format.
As a superset of JSON we could say it's more elaborate, an evolution of the format.
But I see some different opinions out there, such:
YAML is dead,
don't use yaml and so on...
I simply can't understand what this is based on, because YAML seems so nice :)
If we take a few successful examples on the web, such as Ruby on Rails, we know they use YAML for simple configuration, but one thing that makes me curious is why YAML is not among the most used formats on the web, like XML and JSON.
If you take twitter for example...why not offer the data in YAML format from the API as well?
Is there something wrong by doing it?
We can see the evolution of NoSQL databases like CouchDB and Mongo, all JSON-based, and even a neat project called jsondb which looks very lightweight and can definitely do the job.
But when writing data structures in json I really can't understand why YAML is not being used instead.
So one of my concerns is: is there something wrong with YAML?
People can say it's complex, but if you intend to use only the same features you would get in JSON, it's definitely not. You will certainly get a more beautiful file, with no hassle. It would indeed be more complex if you decided to use more of its features, but that's how things are; at least you have the possibility to use them if you want to.
The possibility of choosing whether or not to use double quotes for strings is fantastic; it makes everything cleaner and easier to read... well, you see my point :)
So my question is: why is YAML not widely used in place of JSON?
Why does it seem that it won't be used for data transfer within the online community?
All I can see is people using it for simple configuration files and nothing else...
Please bear with me, since I might be completely wrong; very big projects might be out there and my ignorance of the subject simply kept me from being part of them :)
If there is any big project based on YAML out there, I would be very happy to hear about it.
Thanks in advance
It's not that there's something wrong with YAML — it's just that it doesn't offer any compelling benefits in many cases. YAML is basically a superset of JSON. For most purposes, JSON is quite sufficient — people wouldn't be using advanced YAML features even if they had a full YAML parser — and its close ties to JavaScript make it fit in well with the technologies that Web developers are using anyway.
TLDR: People are already using as much YAML as they need. In most cases, that's JSON.
YAML uses more data than non-prettified JSON. It's great for files that humans might want to edit themselves but when all you're doing is passing data around, you're wasting bandwidth if you're using YAML.
If you need an explanation: YAML relies on indentation and newlines to express nesting, and every one of those indentation spaces and newlines takes up room on the wire (two bytes each in UTF-16).
Take this example:
foo:
    bar:
        - foo
        - bar
This requires 41 characters (newlines included). The equivalent JSON is only 29 characters:
{"foo":{"bar":["foo","bar"]}}
Then just imagine what happens if you URL-encode the YAML. It becomes 95 characters:
foo%3A%0A%20%20%20%20bar%3A%0A%20%20%20%20%20%20%20%20-%20foo%0A%20%20%20%20%20%20%20%20-%20bar
Meanwhile the JSON only grows to 63 characters:
%7B%22foo%22%3A%7B%22bar%22%3A%5B%22foo%22%2C%22bar%22%5D%7D%7D
In the example above, the gap between YAML and JSON more than doubles once the payloads are URL-encoded, and you can imagine that the longer your YAML document gets, the more this difference grows.
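For what it's worth, the counts above are easy to reproduce with a few lines of Python (the YAML string is typed by hand to match the example):
import json
from urllib.parse import quote
yaml_text = "foo:\n    bar:\n        - foo\n        - bar"
json_text = json.dumps({"foo": {"bar": ["foo", "bar"]}}, separators=(",", ":"))
print(len(yaml_text), len(json_text))                # 41 vs 29 characters
print(len(quote(yaml_text)), len(quote(json_text)))  # 95 vs 63 once URL-encoded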
Oh, and one other reason not to use YAML: stackoverflow.com does not support YAML syntax highlighting... ! (Of course, I would argue that YAML is so beautiful that it doesn't need syntax highlighting. That's kind of the point of YAML, I think.)
In Ruby, many people argue that configuration should be Ruby rather than YAML. This saves the parsing stage, means you don't have to learn a new syntax, and avoids ending up with ERB tags everywhere when you are dynamically generating YAML content (Rails fixtures).
Personally I have to agree, and can't see what YAML would offer to network transfers that would make it a worthwhile consideration over JSON.
YAML has a number of problems; there is a good article on that:
YAML: probably not so great after all.
Short summary (in addition to problems already listed in other answers):
Unreadable except for simple and short things
Insecure by default
Has portability issues
Very complex, with a number of surprising behaviors
I considered using YAML a few times and never did. The reason always had to do with whitespace for indentation. While I personally love it, even to me it sounded like asking for trouble, because
Someone will surely make a mistake, not expecting that changing whitespace will break the file (see the sketch below). Sometimes someone who has no idea about the language/format has to go into the file to change one number or string.
You can't guarantee that everybody everywhere will have their comparison/merging/source-control software configured properly to catch whitespace or empty-line differences.
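A quick Python sketch of the first pitfall (PyYAML assumed; the config keys are invented): losing one level of indentation doesn't raise an error, it silently changes the structure:
import yaml  # PyYAML
good = "server:\n  host: example.com\n  port: 8080"
bad = "server:\n  host: example.com\nport: 8080"   # one indent accidentally removed
print(yaml.safe_load(good))  # {'server': {'host': 'example.com', 'port': 8080}}
print(yaml.safe_load(bad))   # {'server': {'host': 'example.com'}, 'port': 8080}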

How do you share configuration information or business rules between languages

I'm looking for best practices for using the same data in different places without repeating yourself - this could include configuration or business rules.
Example 1. Data validation rules where you want to validate on the client using javascript, but you want to make sure by validating on the server.
Example 2. Database access where your web server and your cronjobs use the same password, username.
Ease of processing and a human-readable solution would be a plus.
Encode your data in JSON. There's a JSON library for pretty much any language you'd care to think of, or if not, it's pretty easy to code one up. If JSON is not enough, perhaps look at YAML.
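A minimal sketch of that idea (the file name and keys are invented): keep one JSON file and let every language read it with its own library; here it is read from Python, and the JavaScript front end or a cron job could load the very same file:
import json
# config.json is shared by the web app, the cron jobs and the JS front end, e.g.:
# {"db": {"host": "localhost", "user": "app"}, "max_upload_mb": 10}
with open("config.json") as fh:
    config = json.load(fh)
print(config["db"]["host"], config["max_upload_mb"])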
XML is used pretty much globally; it's easy to read, easy to write, and human-readable. If you're concerned about the space overhead (which you actually aren't if you want human readability), just compress it before you send it out - XML compresses quite well.
See answers to this question. I think they are applicable here, especially the one with a DSL.
As much hate as they get, for sharing data validation rules, I'm going to have to say Regular Expressions.
I know, I know, everyone hates them, but they are (generally) language-agnostic.
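One hedged sketch of that approach (the rule names and patterns are invented): store the patterns in a shared JSON file and compile them in each language; as long as you stick to basic regex features, the same patterns work in Python, JavaScript and most other dialects:
import json
import re
# rules.json could be fetched by the JavaScript client and used with the same patterns.
rules_json = '{"zip_code": "^[0-9]{5}$", "username": "^[a-z0-9_]{3,16}$"}'
rules = {name: re.compile(pattern) for name, pattern in json.loads(rules_json).items()}
print(bool(rules["zip_code"].match("90210")))     # True
print(bool(rules["username"].match("Bad Name")))  # False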
Use OS environment variables (envvars) to store application configuration info (such as DB passwords).
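For instance (the variable names are invented), the web server and the cron job can read the same environment variable instead of each keeping its own copy of the credentials:
import os
# Fail fast if the variable is missing rather than connecting with a blank password.
db_password = os.environ["APP_DB_PASSWORD"]
db_user = os.environ.get("APP_DB_USER", "app")
print("connecting as", db_user)  # never print the password itself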
Validation rules often require logic. You could write your rules in JavaScript, and then run them in the browser, server (using Nashorn), and database (PLV8 with Postgres).