Part of a website's JSON response had this (... added for context):
{..., now:function(){return(new Date).getTime()}, ...}
Is adding anonymous functions to JSON valid? I would expect each access to 'now' to return a different value.
No.
JSON is purely meant to be a data description language. As noted on http://www.json.org, it is a "lightweight data-interchange format" - not a programming language.
Per http://en.wikipedia.org/wiki/JSON, the "basic types" supported are:
Number (integer, real, or floating point)
String (double-quoted Unicode with backslash escaping)
Boolean (true and false)
Array (an ordered sequence of values, comma-separated and enclosed in square brackets)
Object (collection of key:value pairs, comma-separated and enclosed in curly braces)
null
The problem is that JSON as a data definition language evolved out of JSON as JavaScript Object Notation. Since JavaScript supports eval on JSON, it is legitimate to put JavaScript code inside JSON (in that use-case). If you're using JSON to pass data remotely, then I would say it is bad practice to put methods in the JSON, because you may not have modeled your client-server interaction well. Further, when using JSON as a data description language, you can get yourself into trouble by embedding methods, because some JSON parsers were written with only data description in mind and may not support method definitions in the structure.
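To make that concrete, here is a minimal sketch (the response text below is a hand-written stand-in for the one in the question) showing that a strict parser such as JSON.parse rejects the function, while eval would happily execute it:
// Roughly the response from the question: valid JavaScript, but not valid JSON.
var text = '{"now": function(){ return (new Date).getTime(); }}';
try {
  JSON.parse(text);      // a strict JSON parser rejects the function definition...
} catch (e) {
  console.log(e.name);   // "SyntaxError"
}
var obj = eval('(' + text + ')'); // ...while eval executes it, which is exactly the risk
console.log(typeof obj.now);      // "function"
console.log(obj.now());           // current timestamp in milliseconds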
The Wikipedia JSON entry makes a good case for not including methods in JSON, citing security concerns:
Unless you absolutely trust the source of the text, and you have a need to parse and accept text that is not strictly JSON compliant, you should avoid eval() and use JSON.parse() or another JSON specific parser instead. A JSON parser will recognize only JSON text and will reject other text, which could contain malevolent JavaScript. In browsers that provide native JSON support, JSON parsers are also much faster than eval. It is expected that native JSON support will be included in the next ECMAScript standard.
Let's quote one of the specs - https://www.rfc-editor.org/rfc/rfc7159#section-12
The JavaScript Object Notation (JSON) Data Interchange Format specification states:
JSON is a subset of JavaScript but excludes assignment and invocation.
Since JSON's syntax is borrowed from JavaScript, it is possible to use that language's "eval()" function to parse JSON texts. This generally constitutes an unacceptable security risk, since the text could contain executable code along with data declarations. The same consideration applies to the use of eval()-like functions in any other programming language in which JSON texts conform to that language's syntax.
So all answers which state that functions are not part of the JSON standard are correct.
The official answer is: No, it is not valid to define functions in JSON results!
The answer could be yes, because "code is data" and "data is code".
Even if JSON is used as a language-independent data serialization format, tunneling "code" through other types will work.
A JSON string might be used to pass a JS function to the client-side browser for execution.
[{"data":[["1","2"],["3","4"]],"aFunction":"function(){return \"foo bar\";}"}]
This leads to questions like: how to execute JavaScript code stored as a string (https://stackoverflow.com/questions/939326/execute-javascript-code-stored-as-a-string).
Be prepared to raise your "eval() is evil" flag and stick your "do not tunnel functions through JSON" flag next to it.
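A minimal sketch of that round trip, reusing the structure from the example above (new Function is used here as one way to turn the string back into something callable; eval('(' + ... + ')') would also work):
var text = '[{"data":[["1","2"],["3","4"]],"aFunction":"function(){return \\"foo bar\\";}"}]';
var parsed = JSON.parse(text);                             // the function arrives as a plain string
var fn = new Function('return ' + parsed[0].aFunction)();  // turn the string back into a function
console.log(fn());                                         // "foo bar"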
It is not standard as far as I know. A quick look at http://json.org/ confirms this.
Nope, definitely not.
If you use a decent JSON serializer, it won't let you serialize a function like that. It's a valid JavaScript object, but not valid JSON. Whatever that website's intent, it's not sending valid JSON.
JSON explicitly excludes functions because it isn't meant to be a JavaScript-only data structure (despite the JS in the name).
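As a quick illustration (a minimal sketch): JSON.stringify in browsers and Node.js simply drops function-valued properties.
var obj = { a: 1, now: function () { return Date.now(); } };
console.log(JSON.stringify(obj)); // '{"a":1}' - the function is silently omitted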
A short answer is NO...
JSON is a text format that is completely language independent but uses conventions that are familiar to programmers of the C-family of languages, including C, C++, C#, Java, JavaScript, Perl, Python, and many others. These properties make JSON an ideal data-interchange language.
Look at the reason why:
When exchanging data between a browser and a server, the data can only be text.
JSON is text, and we can convert any JavaScript object into JSON, and send JSON to the server.
We can also convert any JSON received from the server into JavaScript objects.
This way we can work with the data as JavaScript objects, with no complicated parsing and translations.
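A small sketch of that browser/server round trip; the endpoint URL and field names below are made up for illustration:
// Send a JavaScript object to the server as JSON text (hypothetical endpoint).
var payload = JSON.stringify({ name: "Alice", role: "admin" });
fetch('/api/users', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: payload
})
  .then(function (res) { return res.json(); })          // parse the JSON response back into an object
  .then(function (user) { console.log(user.name); });   // work with it as a plain JavaScript object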
But wait...
There are still ways to store your function. It's widely not recommended, but still possible:
We said you can save a string... so how about converting your function to a string?
const data = {func: '()=>"a FUNC"'};
Then you can stringify data using JSON.stringify(data) and later parse it with JSON.parse (if that step is needed)...
And use eval to execute the stringified function (just so you know, using eval is widely discouraged):
eval(data.func)(); // returns "a FUNC"
Using Node.js (CommonJS syntax), I was able to get this type of functionality working. I originally had just a JSON structure inside an external JS file, but I wanted that structure to be more like a class, with methods that could be decided at run time.
The declaration of 'Executor' in myJSON is not required.
var myJSON = {
  "Hello": "World",
  "Executor": ""
}

module.exports = {
  init: () => { return { ...myJSON, "Executor": (first, last) => { return first + last } } }
}
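A usage sketch, assuming the module above is saved as myModule.js (the file name is made up here):
// Hypothetical consumer of the module above.
const thing = require('./myModule').init();
console.log(thing.Hello);                  // "World"
console.log(thing.Executor('foo', 'bar')); // "foobar"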
Function expressions in the JSON are completely possible, just do not forget to wrap them in double quotes. Here is an example taken from a noSQL database design:
{
  "_id": "_design/testdb",
  "views": {
    "byName": {
      "map": "function(doc){if(doc.name){emit(doc.name,doc.code)}}"
    }
  }
}
Although eval is not recommended, this works:
<!DOCTYPE html>
<html>
<body>
<h2>Convert a string written in JSON format, into a JavaScript function.</h2>
<p id="demo"></p>
<script>
function test(val){return val + " it's OK";}
var someVar = "yup";
var myObj = { "func": "test(someVar);" };
document.getElementById("demo").innerHTML = eval(myObj.func);
</script>
</body>
</html>
Leave the quotes off...
var a = {"b":function(){alert('hello world');} };
a.b();
I'm trying to devise a service to convert a generic JSON data representation into XML data representation.
The first idea that came to my mind (and that I found on the internet) takes advantage of the Go templating utilities.
If I have a JSON data representation like the following:
{
  "user": {
    "name": "Luca",
    "surname": "Rossi"
  }
}
I could devise a template like the following:
<xml>
<user name="{{.user.name}}" surname="{{.user.surname}}" />
</xml>
to generate:
<xml>
<user name="Luca" surname="Rossi" />
</xml>
The problem is: Go requires the definition of a struct which declares how to marshal and unmarshal a JSON data representation; at the same time, however, I'd like to provide the template to generate the XML as a service configuration available at runtime.
The question is: "is it possible"?
I know (thanks to this question) that I can do something like this:
var anyJson map[string]interface{}
json.Unmarshal(bytes, &anyJson)
The problem comes when I have to access values: I'd be asked to do a type assertion, like
anyJson["id"].(string)
Now, I might be able to know the type of anyJson["id"] by means of JSON schema, for example, but I don't know if I can do a parametric type assertion, something like
anyJson["id"].(typeForIDFromJSONSchema)
When you unmarshal into map[string]interface{}, every nested JSON object will also be map[string]interface{}. So type assertions of the contained elements to string may typically work, but not to any struct type - the unmarshaller would always be unaware of your structs.
So the two options I suggest are
to do it 'the hard way' using type switches and type assertions - this is workable and fast, but not all that nice; or
to use a different tool such as jsoniter or gjson - these might be a little slower but they do allow you to traverse arbitrary JSON graphs
I have used both GJson and Jsoniter. GJson works by reading byte by byte through the input, using buffering to keep its speed up, providing an API that allows inspection of the current element and assertions to convert the values.
Jsoniter looks to me like a cleaner implementation along the lines of successful parsers in Java, but I haven't used it for parsing in this manner yet. (It can also be used simply as a fast drop-in replacement for the standard Go encoding/json.) I suggest you focus on using its Iterator and its WhatIsNext method.
I am using the Yason library in Common Lisp. I want to parse a JSON string but would like the parser to keep one of its nodes unparsed.
Typically, with an example like this:
{
  "metadata1" : "mydata1",
  "metadata2" : "mydata2",
  "payload" : {...my long payload object},
  "otherNodesToParse" : {...}
}
How can I set the Yason parser to parse my JSON but skip the payload node and keep it as a string in JSON format?
Use case: let's say I just want the envelope data (everything that's not the payload), and to forward the payload as-is (as a JSON string) to another system.
If I parse the whole JSON (including the payload) and then re-encode the payload to JSON, it is inefficient. The payload size could also be pretty big.
How do you know where the end of the payload object is in the stream? You do so by parsing the stream: if you don't parse the stream you simply can't know where the end of the object is: that's the nature of JSON's syntax (as it is the nature of CL's default syntax). For instance the only way you can know the difference between where to continue after
{x:1}
and after
{x:1.2}
is by parsing the two things.
So you must necessarily parse the whole thing.
So the answer to your question is: you can't do this.
You could (but not, I think, with YASON) decide that you did not want to build an object as a result of the parse. And perhaps, if the stream you are parsing corresponds to something with random access like a string or a file, you could note the start and end positions in the stream to later extract a string from it corresponding to the unparsed data (or you could perhaps build it up as you go).
It looks as if some or all of this might be possible with CL-JSON, but you'd have to work at it.
Unless the objects you are reading are vast the benefit of this seems questionable-to-none. If you really do want to do something like this efficiently you need a serialisation scheme which tells you how long things are.
I've carefully read the JSON description http://json.org/ but I'm not sure I know the answer to the simple question. What strings are the minimum possible valid JSON?
"string" is the string valid JSON?
42 is the simple number valid JSON?
true is the boolean value a valid JSON?
{} is the empty object a valid JSON?
[] is the empty array a valid JSON?
At the time of writing, JSON was solely described in RFC4627. It describes (at the start of "2") a JSON text as being a serialized object or array.
This means that only {} and [] are valid, complete JSON strings in parsers and stringifiers which adhere to that standard.
However, the introduction of ECMA-404 changes that, and the updated advice can be read here. I've also written a blog post on the issue.
To confuse the matter further however, the JSON object (e.g. JSON.parse() and JSON.stringify()) available in web browsers is standardised in ES5, and that clearly defines the acceptable JSON texts like so:
The JSON interchange format used in this specification is exactly that described by RFC 4627 with two exceptions:
The top level JSONText production of the ECMAScript JSON grammar may consist of any JSONValue rather than being restricted to being a JSONObject or a JSONArray as specified by RFC 4627.
snipped
This would mean that all JSON values (including strings, nulls and numbers) are accepted by the JSON object, even though the JSON object technically adheres to RFC 4627.
Note that you could therefore stringify a number in a conformant browser via JSON.stringify(5), which would be rejected by another parser that adheres to RFC4627, but which doesn't have the specific exception listed above. Ruby, for example, would seem to be one such example which only accepts objects and arrays as the root. PHP, on the other hand, specifically adds the exception that "it will also encode and decode scalar types and NULL".
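For instance, a quick sketch in an ES5-conformant environment (a browser console or Node.js):
console.log(JSON.stringify(5));     // "5"  - a bare number is accepted by the built-in JSON object
console.log(JSON.parse("5"));       // 5
console.log(JSON.stringify("hi"));  // '"hi"'
console.log(JSON.parse('"hi"'));    // "hi"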
There are at least four documents which can be considered JSON standards on the Internet. The RFCs referenced all describe the mime type application/json. Here is what each has to say about the top-level values, and whether anything other than an object or array is allowed at the top:
RFC-4627: No.
A JSON text is a sequence of tokens. The set of tokens includes six
structural characters, strings, numbers, and three literal names.
A JSON text is a serialized object or array.
JSON-text = object / array
Note that RFC-4627 was marked "informational" as opposed to "proposed standard", and that it is obsoleted by RFC-7159, which in turn is obsoleted by RFC-8259.
RFC-8259: Yes.
A JSON text is a sequence of tokens. The set of tokens includes six
structural characters, strings, numbers, and three literal names.
A JSON text is a serialized value. Note that certain previous
specifications of JSON constrained a JSON text to be an object or an
array. Implementations that generate only objects or arrays where a
JSON text is called for will be interoperable in the sense that all
implementations will accept these as conforming JSON texts.
JSON-text = ws value ws
RFC-8259 is dated December 2017 and is marked "INTERNET STANDARD".
ECMA-262: Yes.
The JSON Syntactic Grammar defines a valid JSON text in terms of tokens defined by the JSON lexical
grammar. The goal symbol of the grammar is JSONText.
Syntax
JSONText :
    JSONValue

JSONValue :
    JSONNullLiteral
    JSONBooleanLiteral
    JSONObject
    JSONArray
    JSONString
    JSONNumber
ECMA-404: Yes.
A JSON text is a sequence of tokens formed from Unicode code points that conforms to the JSON value
grammar. The set of tokens includes six structural tokens, strings, numbers, and three literal name tokens.
According to the old definition in RFC 4627 (which was obsoleted in March 2014 by RFC 7159), those were all valid "JSON values", but only the last two would constitute a complete "JSON text":
A JSON text is a serialized object or array.
Depending on the parser used, the lone "JSON values" might be accepted anyway. For example (sticking to the "JSON value" vs "JSON text" terminology):
the JSON.parse() function now standardised in modern browsers accepts any "JSON value"
the PHP function json_decode was introduced in version 5.2.0 only accepting a whole "JSON text", but was amended to accept any "JSON value" in version 5.2.1
Python's json.loads accepts any "JSON value" according to examples on this manual page
the validator at http://jsonlint.com expects a full "JSON text"
the Ruby JSON module will only accept a full "JSON text" (at least according to the comments on this manual page)
The distinction is a bit like the distinction between an "XML document" and an "XML fragment", although technically <foo /> is a well-formed XML document (it would be better written as <?xml version="1.0" ?><foo />, but as pointed out in comments, the <?xml declaration is technically optional).
JSON stands for JavaScript Object Notation. Only {} and [] define a Javascript object. The other examples are value literals. There are object types in Javascript for working with those values, but the expression "string" is a source code representation of a literal value and not an object.
Keep in mind that JSON is not Javascript. It is a notation that represents data. It has a very simple and limited structure. JSON data is structured using {},:[] characters. You can only use literal values inside that structure.
It is perfectly valid for a server to respond with either an object description or a literal value. All JSON parsers should be able to handle just a literal value, but only one value. JSON can only represent a single object at a time. So for a server to return more than one value it would have to structure it as an object or an array.
The ecma specification might be useful for reference:
http://www.ecma-international.org/ecma-262/5.1/
The parse function parses a JSON text (a JSON-formatted String) and produces an ECMAScript value. The
JSON format is a restricted form of ECMAScript literal. JSON objects are realized as ECMAScript objects.
JSON arrays are realized as ECMAScript arrays. JSON strings, numbers, booleans, and null are realized as
ECMAScript Strings, Numbers, Booleans, and null. JSON uses a more limited set of white space characters
than WhiteSpace and allows Unicode code points U+2028 and U+2029 to directly appear in JSONString literals
without using an escape sequence. The process of parsing is similar to 11.1.4 and 11.1.5 as constrained by
the JSON grammar.
JSON.parse("string"); // SyntaxError: Unexpected token s
JSON.parse(43); // 43
JSON.parse("43"); // 43
JSON.parse(true); // true
JSON.parse("true"); // true
JSON.parse(false); // false
JSON.parse("false"); // false
JSON.parse("trueee"); // SyntaxError: Unexpected token e
JSON.parse("{}"); // {}
JSON.parse("[]"); // []
Yes, yes, yes, yes, and yes. All of them are valid JSON value literals.
However, the official RFC 4627 states:
A JSON text is a serialized object or array.
So a whole "file" should consist of an object or array as the outermost structure, which of course can be empty. Yet, many JSON parsers accept primitive values as well for input.
Just follow the railroad diagrams given on the json.org page. [] and {} are the minimum possible valid JSON objects. So the answer is [] and {}.
var x = {};
console.log(JSON.stringify(x)); // will output "{}"
So your answer is "{}", which denotes an empty object.
"JSON" could mean the JSON data type or a JSON string.
It starts to confuse me when different libraries use "json" with these two different meanings.
I wonder how other people name those variables to tell them apart.
For all practical purposes, "JSON" has exactly one meaning, which is a string representing a JavaScript object following certain specific syntax.
JSON is parsed into a JavaScript object using JSON.parse, and a JavaScript object is converted into a JSON string using JSON.stringify.
The problem is that all too many people have gotten into the bad habit of referring to plain old JavaScript objects as JSON. That is either confused or sloppy or both. {a: 1} is a JS object. '{"a": 1}' is a JSON string.
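A quick sketch of the distinction:
var obj = { a: 1 };                     // a plain JavaScript object
var json = JSON.stringify(obj);         // '{"a":1}' - a JSON string
var back = JSON.parse(json);            // a JavaScript object again
console.log(typeof json, typeof back);  // "string" "object"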
In the same vein, many people use variable names like json to refer to JavaScript objects derived from JSON. For example:
$.getJSON('foo.php').then(function(json) { ...
In the above case, the variable name json is ill-advised. The actual payload returned from the server is a JSON string, but internally $.getJSON has already transformed that into a plain old JavaScript object, which is what is being passed to the then handler. Therefore, it would be preferable to use the variable name data, for example.
If a library uses the term "json" to refer to things which are not JSON, but actually are JavaScript objects, it is a mark of poor design, and I'd suggest looking around for a different library.
This could be the most basic JSON question ever. I'm creating a WCF REST service and have a HelloWorld test function that just returns a string. I'm testing the service in Fiddler and the response body I'm getting back is:
"HelloWorld"
I also created a function that would just return a number (double) and the response body is:
1.0
Are those valid json responses? Are simple return types just returned in plain text like that (no markup with the brackets)?
Valid JSON responses start with either a { for an object or [ for a list of objects.
Native types are not valid JSON if not encapsulated. Try JSONlint to check validity.
RFC 4627 says no. Which doesn't mean it can't work, but it isn't strictly standards compliant. (Of course, neither are all the JSON readers...)
To quote from Section 2, "JSON Grammar":
A JSON text is a sequence of tokens. The set of tokens includes six
structural characters, strings, numbers, and three literal names.
A JSON text is a serialized object or array.
JSON-text = object / array
Only objects / maps, and arrays, at the top level.
According to the official website, you need to declare what you want between {}, like this:
{
  "test": "HelloWorld"
}
No. The following, for example, is valid:
{
  "Foo": "HelloWorld"
}
You can try JSONLint to see what validates and what does not.