Increase security by creating un-eval-uatable ("unparsable cruft") JSON?

We are looking at using the unparseable cruft approach for our JSON as an extra level of security.
In looking at the approaches, I've come across Google's while(1); and Facebook's for(;;);, and then another mention of {}&&.
I've seen comments surrounding while(1); saying that the 1, being numeric, can get clobbered, so my approach was going to be for(;;);.
Then I came across {}&&, which renders the JSON invalid yet it can still be parsed/eval'ed. See this article for reference: http://www.sitepen.com/blog/2008/09/25/security-in-ajax/
What are your approaches? And what do your functions look like for making the Ajax call with the unparseable cruft?

I just always use a root object. As noted:
It is only possible to hijack JSON data with a root that is an array.
When the root is a primitive, primitive values do not trigger a
constructor. When the root is an object, it is not valid JavaScript
syntax, and therefore can’t be parsed.
Note that having a root primitive (e.g. your response is just 5) is not valid JSON. Section 2 of RFC 4627 says:
A JSON text is a serialized object or array.
JSON-text = object / array
This isn't much of a burden, as I (and many sites) typically use an envelope format. E.g.:
{
  "header": {...},
  "data": {...}
}
or:
{
  "status": {...},
  "data": {...}
}
etc.
In that case, any array would just be the value of data, so you can serve syntactically valid JSON without any hijacking risk.
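For completeness, if you do decide to serve cruft-prefixed responses, the client side is just a matter of stripping the known prefix before parsing. A minimal sketch, assuming a for(;;); prefix and a placeholder endpoint (not tied to any particular framework):

const CRUFT_PREFIX = 'for(;;);';

async function fetchJsonWithCruft(url) {
  const response = await fetch(url);
  let text = await response.text();
  // Strip the unparseable cruft, if present, then parse strictly.
  if (text.startsWith(CRUFT_PREFIX)) {
    text = text.slice(CRUFT_PREFIX.length);
  }
  return JSON.parse(text); // never eval() the raw body
}

// Usage (hypothetical endpoint):
// const payload = await fetchJsonWithCruft('/api/data');
// console.log(payload.data);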


Is it valid for JSON data structure to vary between a list and a boolean

The JSON data structure for jstree is defined at https://github.com/vakata/jstree; here is an example:
[ { "text" : "Root node", "children" : [ "Child node 1", "Child node 2" ] } ]
Notably it says
The children key can be used to add children to the branch, it should
be an array
However, later on, in the section "Populating the tree using AJAX and lazy loading nodes", it shows setting children to a boolean to indicate when a node's children have not yet been loaded:
[{
  "id":1,"text":"Root node","children":[
    {"id":2,"text":"Child node 1","children":true},
    {"id":3,"text":"Child node 2"}
  ]
}]
So here we see children used both as an array and as a boolean.
I am using jstree as an example because this is where I encountered the issue, but my question is really a general JSON question: is it valid JSON for the same element to be two different types (an array and a boolean)?
Structure-wise, both are valid JSON packets. This is okay, as JSON is somewhat less strict than XML (with an XSD or a DTD). As per https://www.w3schools.com/js/js_json_objects.asp:
JSON objects are surrounded by curly braces {}.
JSON objects are written in key/value pairs.
Keys must be strings, and values must be a valid JSON data type (string, number, object, array, boolean or null).
Keys and values are separated by a colon.
Each key/value pair is separated by a comma.
Having said that, if the sender is allowed to send such JSONs, the only caveat is that the receiving side will have to handle this discrepancy when such different packets arrive. This is a bad-looking contract, and hence the receiver might need to do extra work to manage it; handling such incoming JSON packets can become tricky (see the sketch below).
See: How do I create JSON data structure when element can be different types in for use by
You could validate whether a JSON is okay or not at https://jsonlint.com/
See more about JSON in this answer: https://stackoverflow.com/a/4862511/945214
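To illustrate the extra handling mentioned above, here is a minimal sketch (illustrative only, not part of jstree itself) that normalizes a children field which may be an array of nodes, a boolean placeholder, or absent:

function normalizeChildren(node) {
  if (Array.isArray(node.children)) {
    return { loaded: true, children: node.children };   // real child nodes
  }
  if (node.children === true) {
    return { loaded: false, children: [] };             // children exist but are not loaded yet
  }
  return { loaded: true, children: [] };                // leaf node
}

console.log(normalizeChildren({ "id": 2, "text": "Child node 1", "children": true }));
// -> { loaded: false, children: [] }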
It is valid JSON. The JSON RFC (8259) defines a general syntax, but it contains nothing that would allow a tool to identify that two equally named entries are meant to describe the same conceptual thing.
The need for a criterion to check two JSON structures for instance equality has been one motivation to create something like JSON Schema.
I also think it is not too unusual for JavaScript to provide this kind of mixed data. Sometimes it might help to explicitly convert the JavaScript object to JSON, as in JSON.stringify(testObject).
Some tools for JSON validation:
https://www.npmjs.com/package/json-validation
https://davidwalsh.name/json-validation
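If you just need a programmatic yes/no answer in JavaScript, no package is strictly required, since JSON.parse throws on invalid input. A small sketch:

function isValidJson(text) {
  try {
    JSON.parse(text);
    return true;
  } catch (e) {
    return false;
  }
}

console.log(isValidJson('[{"text":"Root node","children":true}]')); // true
console.log(isValidJson('{"a": }'));                                // false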

How does a client know the datatype of JSON RestResponse

While developing a client application using one of our existing REST services, I have the choice for using JSON or XML responses. The XML responses are described by XSD files with schema information.
With these XML Schemas I can determine what datatype a certain result must be, and the client can use that information when presenting the data to the user, or when the client asks the user to change a property. (How is quite another question, btw, as I cannot find any multiplatform Delphi implementation of XML that supports XSD schemas... but like I said: that's another question.)
The alternative is to use a JSON response type, but then the client cannot determine the specific datatype of a property because everything is sent as a string.
How would a client know that one of those properties is an index from an enumerated type, or an integer number, or an amount, or maybe a reference to another object by its ID? (These are just examples.)
I would think that the client should not contain "hardcoded" info on the structure of the response, or am I wrong in assuming that?
JSON doesn't have a rich type system like XML does, and JSON doesn't have a schema system for describing things like enumerations and references like XML does. But JSON has only a few data types, and the general formatting of the JSON is self-describing in terms of what data type any given value is using (see the official JSON spec for more details):
a string is always wrapped in quotation marks:
"fieldname": "fieldvalue"
a numeric value is digit characters without quotations:
"fieldname": 12345
an object is always wrapped in curly braces:
"fieldname": { ... object data ... }
an array is always wrapped in square brackets:
"fieldname": [ ... array data ... ]
a boolean is always a fixed true or false without quotations:
"name": true
"name": false
a null is always a fixed null without quotations:
"name": null
Anything beyond that will require the client to have external knowledge of the data that is being sent (like a schema in XML, since XML itself does not describe data types at all).
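As a concrete illustration of that self-describing formatting, a client can inspect the native types directly after parsing. A small sketch (the field names are made up for the example):

function describeJsonValue(value) {
  if (value === null) return 'null';
  if (Array.isArray(value)) return 'array';
  return typeof value; // 'string', 'number', 'boolean', or 'object'
}

const parsed = JSON.parse('{"id": 12345, "name": "foo", "active": true, "tags": [], "extra": null}');
for (const [key, value] of Object.entries(parsed)) {
  console.log(key, '->', describeJsonValue(value));
}
// id -> number, name -> string, active -> boolean, tags -> array, extra -> null

Whether 12345 is a plain number, an enumeration index, or an object ID is exactly the part that still requires the external knowledge mentioned above.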

Does JSON syntax allow duplicate keys in an object?

Is this valid json?
{
  "a" : "x",
  "a" : "y"
}
http://jsonlint.com/ says yes.
http://www.json.org/ doesn't say anything about it being forbidden.
But obviously it doesn't make much sense, does it?
Most implementations probably use a hashtable, so it is being overridden anyway.
The short answer: yes, but it is not recommended.
The long answer: It depends on what you call valid...
ECMA-404 "The JSON Data Interchange Syntax" doesn't say anything about duplicated names (keys).
However, RFC 8259 "The JavaScript Object Notation (JSON) Data Interchange Format" says:
The names within an object SHOULD be unique.
In this context SHOULD must be understood as specified in BCP 14:
SHOULD This word, or the adjective "RECOMMENDED", mean that there
may exist valid reasons in particular circumstances to ignore a
particular item, but the full implications must be understood and
carefully weighed before choosing a different course.
RFC 8259 explains why unique names (keys) are good:
An object whose names are all unique is interoperable in the sense
that all software implementations receiving that object will agree on
the name-value mappings. When the names within an object are not
unique, the behavior of software that receives such an object is
unpredictable. Many implementations report the last name/value pair
only. Other implementations report an error or fail to parse the
object, and some implementations report all of the name/value pairs,
including duplicates.
Also, as Serguei pointed out in the comments, ECMA-262, "ECMAScript® Language Specification", reads:
In the case where there are duplicate name Strings within an object, lexically preceding values for the same key shall be overwritten.
In other words, last-value-wins.
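A quick illustration in JavaScript (most engines behave this way; other parsers may differ):

const parsed = JSON.parse('{"status": "ok", "status": "error"}');
console.log(parsed); // { status: 'error' } -- the lexically last value wins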
Trying to parse a string with duplicated names with the Java implementation by Douglas Crockford (the creator of JSON) results in an exception:
org.json.JSONException: Duplicate key "status" at
org.json.JSONObject.putOnce(JSONObject.java:1076)
From the standard (p. ii):
It is expected that other standards will refer to this one, strictly adhering to the JSON text format, while
imposing restrictions on various encoding details. Such standards may require specific behaviours. JSON
itself specifies no behaviour.
Further down in the standard (p. 2), the specification for a JSON object:
An object structure is represented as a pair of curly bracket tokens surrounding zero or more name/value pairs.
A name is a string. A single colon token follows each name, separating the name from the value. A single
comma token separates a value from a following name.
It does not make any mention of duplicate keys being invalid or valid, so according to the specification I would safely assume that means they are allowed.
That most implementations of JSON libraries do not accept duplicate keys does not conflict with the standard, because of the first quote.
Here are two examples related to the C++ standard library. When deserializing some JSON object into a std::map it would make sense to refuse duplicate keys. But when deserializing some JSON object into a std::multimap it would make sense to accept duplicate keys as normal.
There are 2 documents specifying the JSON format:
http://json.org/
https://www.rfc-editor.org/rfc/rfc7159
The accepted answer quotes from the 1st document. I think the 1st document is more clear, but the 2nd contains more detail.
The 2nd document says:
Objects
An object structure is represented as a pair of curly brackets
surrounding zero or more name/value pairs (or members). A name is a
string. A single colon comes after each name, separating the name
from the value. A single comma separates a value from a following
name. The names within an object SHOULD be unique.
So it is not forbidden to have a duplicate name, but it is discouraged.
I came across a similar question when dealing with an API that accepts both XML and JSON, but doesn't document how it would handle what you'd expect to be duplicate keys in the JSON accepted.
The following is a valid XML representation of your sample JSON:
<object>
<a>x</a>
<a>y</a>
</object>
When this is converted into JSON, you get the following:
{
  "object": {
    "a": [
      "x",
      "y"
    ]
  }
}
A natural mapping from a language that handles what you might call duplicate keys to another can serve as a potential best-practice reference here.
Hope that helps someone!
The JSON spec says this:
An object is an unordered set of name/value pairs.
The important part here is "unordered": it implies uniqueness of keys, because the only thing you can use to refer to a specific pair is its key.
In addition, most JSON libs will deserialize JSON objects to hash maps/dictionaries, where keys are guaranteed unique. What happens when you deserialize a JSON object with duplicate keys depends on the library: in most cases, you'll either get an error, or only the last value for each duplicate key will be taken into account.
For example, in Python, json.loads('{"a": 1, "a": 2}') returns {"a": 2}.
Posting an answer because there is a lot of outdated information and confusion about the standards. As of December 2017, there are two competing standards:
RFC 8259 - https://www.rfc-editor.org/rfc/rfc8259
ECMA-404 - http://www.ecma-international.org/publications/files/ECMA-ST/ECMA-404.pdf
json.org suggests ECMA-404 is the standard, but this site does not appear to be an authority. While I think it's fair to consider ECMA the authority, what's important here is that the only difference between the standards (regarding unique keys) is that RFC 8259 says the keys should be unique, and ECMA-404 says they are not required to be unique.
RFC-8259:
"The names within an object SHOULD be unique."
The word "should" in all caps like that, has a meaning within the RFC world, that is specifically defined in another standard (BCP 14, RFC 2119 - https://www.rfc-editor.org/rfc/rfc2119) as,
SHOULD This word, or the adjective "RECOMMENDED", mean that
there may exist valid reasons in particular circumstances to ignore
a particular item, but the full implications must be understood and
carefully weighed before choosing a different course.
ECMA-404:
"The JSON syntax does not impose any restrictions on the strings used
as names, does not require that name strings be unique, and does not
assign any significance to the ordering of name/value pairs."
So, no matter how you slice it, it's syntactically valid JSON.
The reason given for the unique key recommendation in RFC 8259 is,
An object whose names are all unique is interoperable in the sense
that all software implementations receiving that object will agree on
the name-value mappings. When the names within an object are not
unique, the behavior of software that receives such an object is
unpredictable. Many implementations report the last name/value pair
only. Other implementations report an error or fail to parse the
object, and some implementations report all of the name/value pairs,
including duplicates.
In other words, from the RFC 8259 viewpoint, it's valid but your parser may barf and there's no promise as to which, if any, value will be paired with that key. From the ECMA-404 viewpoint (which I'd personally take as the authority), it's valid, period. To me this means that any parser that refuses to parse it is broken. It should at least parse according to both of these standards. But how it gets turned into your native object of choice is, in any case, unique keys or not, completely dependent on the environment and the situation, and none of that is in the standard to begin with.
SHOULD be unique does not mean MUST be unique. However, as stated, some parsers would fail and others would just use the last value parsed. However, if the spec was cleaned up a little to allow for duplicates then I could see a use where you may have an event handler which is transforming the JSON to HTML or some other format... In such cases it would be perfectly valid to parse the JSON and create another document format...
{
  "div":
  {
    "p": "hello",
    "p": "universe"
  },
  "div":
  {
    "h1": "Heading 1",
    "p": "another paragraph"
  }
}
could then easily be parsed into HTML, for example:
<body>
  <div>
    <p>hello</p>
    <p>universe</p>
  </div>
  <div>
    <h1>Heading 1</h1>
    <p>another paragraph</p>
  </div>
</body>
I can see the reasoning behind the question but as it stands... I wouldn't trust it.
It's not defined in the ECMA JSON standard. And generally speaking, a lack of definition in a standard means, "Don't count on this working the same way everywhere."
If you're a gambler, "many" JSON engines will allow duplication and simply use the last-specified value. This:
var o = {"a": 1, "b": 2, "a": 3}
Becomes this:
Object {a: 3, b: 2}
But if you're not a gambler, don't count on it!
Depending on the purpose, there are different answers:
Using JSON to serialize objects (JavaScript Object Notation), each dictionary element maps to an individual object property, so different entries defining a value for the same property have no meaning.
However, I came over the same question from a very specific use case:
Writing JSON samples for API testing, I was wondering how to add comments to our JSON files without breaking usability. The JSON spec does not support comments, so I came up with a very simple approach:
To use duplicate keys to comment our JSON samples.
Example:
{
  "property1" : "value1", "REMARK" : "... prop1 controls ...",
  "property2" : "value2", "REMARK" : "... value2 raises an exception ..."
}
The JSON serializers which we are using have no problems with these "REMARK" duplicates and our application code simply ignores this little overhead.
So, even though there is no meaning on the application layer, these duplicates for us provide a valuable workaround to add comments to our testing samples without breaking the usability of the JSON.
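With a last-value-wins parser, the duplicate "REMARK" keys simply collapse, so the application sees the ordinary properties plus a single leftover REMARK. A quick sketch of what that looks like in JavaScript (other parsers may reject the input instead):

const sample =
  '{"property1":"value1","REMARK":"... prop1 controls ...",' +
  ' "property2":"value2","REMARK":"... value2 raises an exception ..."}';
console.log(JSON.parse(sample));
// { property1: 'value1', REMARK: '... value2 raises an exception ...', property2: 'value2' }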
The standard does say this:
Programming languages vary widely on whether they support objects, and
if so, what characteristics and constraints the objects offer. The
models of object systems can be wildly divergent and are continuing to
evolve. JSON instead provides a simple notation for expressing
collections of name/value pairs. Most programming languages will have
some feature for representing such collections, which can go by names
like record, struct, dict, map, hash, or object.
The bug is in node.js at least. This code succeeds in node.js.
try {
  var json = {"name":"n","name":"v"};
  console.log(json); // outputs { name: 'v' }
} catch (e) {
  console.log(e);
}
RFC 7159, the standard for JSON published by the Internet Engineering Task Force (IETF), states: "The names within an object SHOULD be unique". However, according to RFC 2119, which defines the terminology used in IETF documents, the word "should" in fact means "... there may exist valid reasons in particular circumstances to ignore a particular item, but the full implications must be understood and carefully weighed before choosing a different course." What this essentially means is that while having unique keys is recommended, it is not a must. We can have duplicate keys in a JSON object, and it would still be valid.
In practice, I have seen that the value of the last key is used when duplicate keys are found in a JSON document.
In C#, if you deserialise to a Dictionary<string, string>, it takes the last key/value pair:
string json = @"{""a"": ""x"", ""a"": ""y""}";
var d = JsonConvert.DeserializeObject<Dictionary<string, string>>(json);
// { "a" : "y" }
if you try to deserialise to
class Foo
{
    [JsonProperty("a")]
    public string Bar { get; set; }

    [JsonProperty("a")]
    public string Baz { get; set; }
}
var f = JsonConvert.DeserializeObject<Foo>(json);
you get a Newtonsoft.Json.JsonSerializationException exception.

is "HelloWorld" a valid json response

This could be the most basic JSON question ever. I'm creating a WCF REST service and have a HelloWorld test function that just returns a string. I'm testing the service in Fiddler and the response body I'm getting back is:
"HelloWorld"
I also created a function that would just return a number (double) and the response body is:
1.0
Are those valid JSON responses? Are simple return types just returned in plain text like that (no markup with the brackets)?
Valid JSON responses start with either a { for an object or [ for a list of objects.
Native types are not valid JSON if not encapsulated. Try JSONlint to check validity.
RFC 4627 says no. Which doesn't mean it can't work, but it isn't strictly standards compliant. (Of course, neither are all the JSON readers...)
To quote from Section 2, "JSON Grammar":
A JSON text is a sequence of tokens. The set of tokens includes six
structural characters, strings, numbers, and three literal names.
A JSON text is a serialized object or array.
JSON-text = object / array
Only objects / maps, and arrays, at the top level.
According to the official website, you need to declare what you want between {}, like this:
{
  "test": "HelloWorld"
}
No. The following is, for example:
{
  "Foo": "HelloWorld"
}
You can try JSONLint to see what validates and what does not.

Is it valid to define functions in JSON results?

Part of a website's JSON response had this (... added for context):
{..., now:function(){return(new Date).getTime()}, ...}
Is adding anonymous functions to JSON valid? I would expect accessing 'now' to return a different value each time.
No.
JSON is purely meant to be a data description language. As noted on http://www.json.org, it is a "lightweight data-interchange format." - not a programming language.
Per http://en.wikipedia.org/wiki/JSON, the "basic types" supported are:
Number (integer, real, or floating point)
String (double-quoted Unicode with backslash escaping)
Boolean (true and false)
Array (an ordered sequence of values, comma-separated and enclosed in square brackets)
Object (collection of key:value pairs, comma-separated and enclosed in curly braces)
null
The problem is that JSON as a data definition language evolved out of JSON as a JavaScript object notation. Since JavaScript supports eval on JSON, it is legitimate to put JavaScript code inside JSON (in that use-case). If you're using JSON to pass data remotely, then I would say it is bad practice to put methods in the JSON because you may not have modeled your client-server interaction well. And, further, when wishing to use JSON as a data description language, I would say you could get yourself into trouble by embedding methods, because some JSON parsers were written with only data description in mind and may not support method definitions in the structure.
Wikipedia JSON entry makes a good case for not including methods in JSON, citing security concerns:
Unless you absolutely trust the source of the text, and you have a need to parse and accept text that is not strictly JSON compliant, you should avoid eval() and use JSON.parse() or another JSON specific parser instead. A JSON parser will recognize only JSON text and will reject other text, which could contain malevolent JavaScript. In browsers that provide native JSON support, JSON parsers are also much faster than eval. It is expected that native JSON support will be included in the next ECMAScript standard.
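To make the eval-versus-JSON.parse point concrete, here is a small sketch (the payload mirrors the one from the question):

const payload = '{"now": function(){ return (new Date).getTime() }}';

try {
  JSON.parse(payload);            // a strict JSON parser rejects the function
} catch (e) {
  console.log('JSON.parse rejected it:', e.message);
}

// eval('(' + payload + ')') would happily build the object and keep the
// function callable, which is exactly the risk the quote above describes.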
Let's quote one of the specs, https://www.rfc-editor.org/rfc/rfc7159#section-12
The JavaScript Object Notation (JSON) Data Interchange Format specification states:
JSON is a subset of JavaScript but excludes assignment and invocation.
Since JSON's syntax is borrowed from JavaScript, it is possible to
use that language's "eval()" function to parse JSON texts. This
generally constitutes an unacceptable security risk, since the text
could contain executable code along with data declarations. The same
consideration applies to the use of eval()-like functions in any
other programming language in which JSON texts conform to that
language's syntax.
So all answers which state, that functions are not part of the JSON standard are correct.
The official answer is: No, it is not valid to define functions in JSON results!
The answer could be yes, because "code is data" and "data is code".
Even if JSON is used as a language-independent data serialization format, tunneling "code" through other types will work.
A JSON string might be used to pass a JS function to the client-side browser for execution.
[{"data":[["1","2"],["3","4"]],"aFunction":"function(){return \"foo bar\";}"}]
This leads to questions like "How to execute JavaScript code stored as a string" (https://stackoverflow.com/questions/939326/execute-javascript-code-stored-as-a-string).
Be prepared to raise your "eval() is evil" flag and stick your "do not tunnel functions through JSON" flag next to it.
It is not standard as far as I know. A quick look at http://json.org/ confirms this.
Nope, definitely not.
If you use a decent JSON serializer, it won't let you serialize a function like that. It's a valid OBJECT, but not valid JSON. Whatever that website's intent, it's not sending valid JSON.
JSON explicitly excludes functions because it isn't meant to be a JavaScript-only data
structure (despite the JS in the name).
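For what it's worth, JSON.stringify shows the same thing from the producing side: function-valued properties are silently dropped rather than serialized. A tiny sketch:

const obj = { id: 1, now: function () { return Date.now(); } };
console.log(JSON.stringify(obj)); // '{"id":1}' -- the function is omitted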
A short answer is NO...
JSON is a text format that is completely language independent but uses
conventions that are familiar to programmers of the C-family of
languages, including C, C++, C#, Java, JavaScript, Perl, Python, and
many others. These properties make JSON an ideal data-interchange
language.
Look at the reason why:
When exchanging data between a browser and a server, the data can only
be text.
JSON is text, and we can convert any JavaScript object into JSON, and
send JSON to the server.
We can also convert any JSON received from the server into JavaScript
objects.
This way we can work with the data as JavaScript objects, with no
complicated parsing and translations.
But wait...
There are still ways to store your function; it's widely not recommended to do that, but it is still possible:
We said you can save a string... how about converting your function to a string, then?
const data = {func: '()=>"a FUNC"'};
Then you can stringify data using JSON.stringify(data) and later use JSON.parse to parse it (if this step is needed)...
And eval to execute the function stored as a string (before doing that, just so you know, using eval is widely not recommended):
eval(data.func)(); // returns "a FUNC"
Using NodeJS (CommonJS syntax) I was able to get this type of functionality working. I originally had just a JSON structure inside some external JS file, but I wanted that structure to be more of a class, with methods that could be decided at run time.
The declaration of 'Executor' in myJSON is not required.
var myJSON = {
  "Hello": "World",
  "Executor": ""
}

module.exports = {
  init: () => { return { ...myJSON, "Executor": (first, last) => { return first + last } } }
}
Function expressions in the JSON are completely possible, just do not forget to wrap them in double quotes. Here is an example taken from a noSQL database design:
{
  "_id": "_design/testdb",
  "views": {
    "byName": {
      "map": "function(doc){if(doc.name){emit(doc.name,doc.code)}}"
    }
  }
}
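In that setup the database compiles the string itself; if you wanted to revive such a string yourself, it would look something like the sketch below (illustrative only, with a stub emit so the revived function can run):

function emit(key, value) { console.log('emit:', key, value); }

const design = JSON.parse(
  '{"map": "function(doc){if(doc.name){emit(doc.name,doc.code)}}"}'
);
const map = eval('(' + design.map + ')'); // parentheses make eval treat the string as an expression
map({ name: 'Root node', code: 42 });     // emit: Root node 42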
Although eval is not recommended, this works:
<!DOCTYPE html>
<html>
<body>
  <h2>Convert a string written in JSON format into a JavaScript function.</h2>
  <p id="demo"></p>
  <script>
    function test(val){ return val + " it's OK"; }
    var someVar = "yup";
    var myObj = { "func": "test(someVar);" };
    document.getElementById("demo").innerHTML = eval(myObj.func);
  </script>
</body>
</html>
Leave the quotes off...
var a = {"b":function(){alert('hello world');} };
a.b();