Is it possible to reconstruct a JSON object in XQuery? Using XML, it's possible to use computed constructors to rebuild an element:
element { node-name($some-element) } {
(: Do stuff with $some-element/(@*|node()) :)
}
But using JSON objects, it seems that it's not possible to reconstruct properties. I would like to do something like this, but this throws a syntax error:
object-node {
for $p in $some-json-object/*
return node-name($p) : $p
}
It looks like it's possible to work around that by mutating the JSON object:
let $obj := json:object(document{xdmp:from-json($json)}/*)
let $_put := map:put($obj, 'prop-name', $prop-val)
return xdmp:to-json($obj)/node()
But this has some obvious limitations.
I'm afraid using json:object really is the way to go here. It could be worse, though: you only need a few lines to copy all JSON properties. You also don't need that document{} constructor, nor the extra cast to json:object; xdmp:from-json already returns a json:object:
let $org := xdmp:from-json($json)
let $new := json:object()
let $_ :=
for $key in map:keys($org)
return map:put($new, $key, map:get($org, $key))
return xdmp:to-json($new)/node()
HTH!
This may be helpful for you: http://docs.marklogic.com/guide/app-dev/json
However, I often take a different approach in XQuery (being comfortable with XML). This may get some push-back from people here, but it is my approach:
Construct what you like in XML and then transform it. If you build your XML in the http://marklogic.com/xdmp/json/basic namespace, then you can transform it to whatever complex JSON you desire using json:transform-to-json, since all of the hints about datatypes are in the attributes of the XML. The nice thing about this approach is that it is a nice middle format: I can transform it to JSON, or I can apply an XSLT transformation and get other XML if I desire.
It should be noted that json:transform-to-json has other modes of operation and can get datatype hints from your own schema as well, but I prefer the built-in schema.
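For illustration, here is a rough sketch of that approach. The element names and type attributes below simply mirror what json:transform-from-json itself emits for simple values with the default "basic" strategy, so run a round trip on your own data to confirm the exact shape:

xquery version "1.0-ml";
import module namespace json = "http://marklogic.com/xdmp/json"
  at "/MarkLogic/json/json.xqy";
declare namespace jb = "http://marklogic.com/xdmp/json/basic";

(: Build the middle-format XML with ordinary element constructors... :)
let $xml :=
  <jb:json type="object">
    <jb:name type="string">widget</jb:name>
    <jb:count type="number">3</jb:count>
  </jb:json>
(: ...then let the library serialize it to JSON text. :)
return json:transform-to-json($xml)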
I stumbled across this blog post by @paxstonhare that takes a non-functional approach, rebuilding new JSON objects during the tree walk by mutating them with map:put():
http://developer.marklogic.com/blog/walking-among-the-json-trees
Using Python I can do the following:
r = requests.get(url_base + url)
jsonObj = json.loads(r.content.decode('raw_unicode_escape'))
print(jsonObj["PartDetails"]["ManufacturerPartNumber"])
Is there any way to perform the same thing using Go?
Currently I need the following:
json.Unmarshal(body, &part_number_json)
fmt.Println("\r\nPartDetails: ", part_number_json.(map[string]interface{})["PartDetails"].(map[string]interface{})["ManufacturerPartNumber"])
That is to say, I need a type assertion for each JSON field, which is tiresome and makes the code unreadable.
I tried this using reflection, but that is not comfortable either.
EDIT:
Currently I use the following function:
func jso(json interface{}, fields ...string) interface{} {
	res := json
	for _, v := range fields {
		res = res.(map[string]interface{})[v]
	}
	return res
}
and call it like this:
fmt.Println("PartDetails: ", jso( part_number_json, "PartDetails", "ManufacturerPartNumber") )
There are third-party packages like gjson that can help you do that.
That said, note that Go is Go and Python is Python. Go is statically typed, for better and worse. It takes more code to write simple JSON manipulation, but that code should be easier to maintain later since it's more strictly typed and the compiler helps you check against errors. Types also serve as documentation, whereas simply nesting dicts and arrays is completely arbitrary.
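For example, a quick sketch with gjson (the JSON body below is just a made-up stand-in for the response in the question):

package main

import (
	"fmt"

	"github.com/tidwall/gjson"
)

func main() {
	body := `{"PartDetails":{"ManufacturerPartNumber":"ABC-123"}}`

	// A dotted path walks the nested objects without any type assertions.
	pn := gjson.Get(body, "PartDetails.ManufacturerPartNumber")
	fmt.Println("PartDetails:", pn.String())
}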
I have found the following resource very helpful for creating a struct from JSON. Unmarshaling will only match the fields you have defined in the struct, so take what you need and leave the rest if you like.
https://mholt.github.io/json-to-go/
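For instance, a minimal typed sketch of the lookup from the question could look like this (the struct and field names below are assumptions based on the JSON shown above; any fields not declared are simply ignored by Unmarshal):

package main

import (
	"encoding/json"
	"fmt"
)

// Declare only the fields you actually need.
type PartDetails struct {
	ManufacturerPartNumber string `json:"ManufacturerPartNumber"`
}

type PartResponse struct {
	PartDetails PartDetails `json:"PartDetails"`
}

func main() {
	body := []byte(`{"PartDetails":{"ManufacturerPartNumber":"ABC-123"}}`)

	var resp PartResponse
	if err := json.Unmarshal(body, &resp); err != nil {
		fmt.Println("unmarshal error:", err)
		return
	}
	fmt.Println("PartDetails:", resp.PartDetails.ManufacturerPartNumber)
}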
This may be an odd question as it's specific to the JSON strings themselves, not the objects they represent. Given a 'pretty printed' JSON string (representing any JSON-encodable model), how would one reformat it to the 'compact' format?
My first thought was to not consider it JSON, but rather just a string, then use a regex to remove duplicate spaces, newlines, etc. But that isn't context-aware, so it risks mangling the keys and values of the JSON unless you properly check that you're not inside quotes.
My next thought was to try to construct an object from the JSON, but without a type to convert to, I'm not sure how to do that other than manually parsing the values as 'Any', testing whether they're an array, recursing into them if they are, and repeating the process. Then, once I have the final object, serialize the result in compact form. However, that seems like overkill.
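For illustration, a rough sketch of that type-agnostic round trip, assuming Foundation's JSONSerialization (whose default output, without the .prettyPrinted option, is already the compact form):

import Foundation

// Parse the pretty-printed text into an untyped object graph,
// then re-serialize it; the default writing options emit compact JSON.
func compacted(_ prettyJSON: String) throws -> String {
    let object = try JSONSerialization.jsonObject(with: Data(prettyJSON.utf8))
    let data = try JSONSerialization.data(withJSONObject: object)
    return String(data: data, encoding: .utf8) ?? prettyJSON
}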
Is there an easier way to accomplish this? We're using Swift 4 if it helps.
UPDATE:
as pointed out by @Mark A. Donohoe, this removes ALL whitespace, so even though it looks coooooool, it's a dumb answer. don't fall for it.
i needed the same thing and i ended up creating a String extension:
extension String {
    func toCompactJSON() -> String {
        return self.filter { !$0.isWhitespace && !$0.isNewline }
    }
}
in my case though it was for testing purposes, and it turned out to be useless, as the order in which the JavaScript objects/arrays appear is not the same as when generated through JSONEncoder.
I have in a text database field a json encoded chart configuration in the form of:
{"Name":[[1,1],[1,2],[2,1]],"Name2":[[3,2]]}
The first number in these IDs is a primary key of another table. I'd like to remove those entries with a trigger when the row is deleted; a plperl function would be good, except it does not preserve the order of the hash, and the order is important in this project. What can I do (without changing the format of the JSON-encoded config)? Note: the chart name can contain any characters, so it's hard to do this with a regex.
You need to use a streaming JSON decoder, such as JSON::Streaming::Reader. You could then store your JSON as an array of key/value pairs, instead of a hash.
The actual implementation of how you might do this is highly dependent on the structure of your data, but given the simple example provided... here's a simple implementation.
use strict;
use warnings;
use JSON::Streaming::Reader;
use JSON 'to_json';
my $s = '{"Name":[[1,1],[1,2],[2,1]],"Name2":[[3,2]]}';
my $jsonr = JSON::Streaming::Reader->for_string($s);
my @data;
while (my $token = $jsonr->get_token) {
    my ($key, $value) = @$token;
    if ($key eq 'start_property') {
        push @data, { $value => $jsonr->slurp };
    }
}
print to_json(\@data);
The output for this script is always:
[{"Name":[[1,1],[1,2],[2,1]]},{"Name2":[[3,2]]}]
Well, I managed to solve my problem, but it's not a general solution, so it will probably not help the casual reader. Anyway, I got the order of the keys with the help of the database; I called my function like this:
SELECT remove_from_chart(
chart_config,
array(select * from json_object_keys(chart_config::json)),
id);
then I walked through the keys in the order of the second parameter, put the results in a new tied hash (Tie::IxHash), and JSON-encoded it.
It's pretty sad that there is no Perl JSON decoder that can preserve key order, when everything else I work with, at least on this project, does (PHP, Postgres, Firefox, Chrome).
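For reference, a rough sketch of that approach (the function and variable names below are invented to match the SQL call above; whether the encoder preserves the tied-hash order depends on the JSON backend, but this is the combination that worked here):

use strict;
use warnings;
use JSON qw(decode_json to_json);
use Tie::IxHash;

# $config is the stored JSON text, $ordered_keys the array from
# json_object_keys(), and $id the primary key being deleted.
sub remove_from_chart {
    my ($config, $ordered_keys, $id) = @_;
    my $data = decode_json($config);

    tie my %ordered, 'Tie::IxHash';
    for my $key (@$ordered_keys) {
        # keep only the pairs whose first number is not the deleted id
        $ordered{$key} = [ grep { $_->[0] != $id } @{ $data->{$key} } ];
    }
    return to_json(\%ordered);
}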
JSON objects are unordered. You will have to encode the desired order into your data somehow, for example:
{"Name":[[1,1],[1,2],[2,1]],"Name2":[[3,2]], "__order__":["Name","Name2"]}
or switch to an array of single-key objects, which is ordered:
[{"Name":[[1,1],[1,2],[2,1]]},{"Name2":[[3,2]]}]
Maybe you want a streaming decoder for JSON data, like a SAX parser. If so, see JSON::Streaming::Reader or JSON::SL.
I have a pretty complex JSON object that contains, among other things, some JSON arrays that I need to update, removing and adding elements.
To do that I'm trying to use a JsPath that points directly to the object inside the array that I need to remove, something like:
/priceLists(1)/sections(0)/items(0)
To remove the element I tried to use json.prune, but it doesn't work; I get this error: error.expected.jsobject
What would be the best way to do that?
Your question is lacking a precise context (i.e., the structure of your JSON data), but let's work with what we have.
The error message you get is clear: you can only call prune on a JSON object, to prune one of its values. You can't use it to prune an element of a JSON array.
I can only advise you to use json.update, noting that, like prune, update only works on JSON objects. In the body of the update, work on your arrays as you usually do with Scala/Java data types.
__.json.update(__.reads[JsArray].map { jsArray =>
  // drop the element you want removed, then append a new one
  val removedElement = JsArray(jsArray.value.filterNot(_ == ???))
  val addedElement = removedElement :+ JsBoolean(true)
  addedElement
})
Part of a website's JSON response had this (... added for context):
{..., now:function(){return(new Date).getTime()}, ...}
Is adding anonymous functions to JSON valid? I would expect that each time you access 'now' it would return a different value.
No.
JSON is purely meant to be a data description language. As noted on http://www.json.org, it is a "lightweight data-interchange format." - not a programming language.
Per http://en.wikipedia.org/wiki/JSON, the "basic types" supported are:
Number (integer, real, or floating point)
String (double-quoted Unicode with backslash escaping)
Boolean (true and false)
Array (an ordered sequence of values, comma-separated and enclosed in square brackets)
Object (collection of key:value pairs, comma-separated and enclosed in curly braces)
null
The problem is that JSON as a data-definition language evolved out of JSON as a JavaScript object notation. Since JavaScript supports eval on JSON, it is legitimate to put JavaScript code inside JSON (in that use case). If you're using JSON to pass data remotely, then I would say it is bad practice to put methods in the JSON, because you may not have modeled your client-server interaction well. And, further, when wishing to use JSON as a data-description language, I would say you could get yourself into trouble by embedding methods, because some JSON parsers were written with only data description in mind and may not support method definitions in the structure.
The Wikipedia JSON entry makes a good case for not including methods in JSON, citing security concerns:
Unless you absolutely trust the source of the text, and you have a need to parse and accept text that is not strictly JSON compliant, you should avoid eval() and use JSON.parse() or another JSON specific parser instead. A JSON parser will recognize only JSON text and will reject other text, which could contain malevolent JavaScript. In browsers that provide native JSON support, JSON parsers are also much faster than eval. It is expected that native JSON support will be included in the next ECMAScript standard.
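As a quick illustration of that difference, a strict parser rejects the snippet from the question while eval happily runs it (a small sketch to try in a JS console):

var text = '{"now":function(){return (new Date).getTime()}}';

try {
  JSON.parse(text);                       // a JSON-only parser...
} catch (e) {
  console.log('JSON.parse:', e.message);  // ...throws a SyntaxError here
}

var obj = eval('(' + text + ')');         // eval treats it as an object literal
console.log(typeof obj.now);              // "function"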
Let's quote one of the specs, https://www.rfc-editor.org/rfc/rfc7159#section-12. The JavaScript Object Notation (JSON) Data Interchange Format specification states:
JSON is a subset of JavaScript but excludes assignment and invocation.
Since JSON's syntax is borrowed from JavaScript, it is possible to use that language's "eval()" function to parse JSON texts. This generally constitutes an unacceptable security risk, since the text could contain executable code along with data declarations. The same consideration applies to the use of eval()-like functions in any other programming language in which JSON texts conform to that language's syntax.
So all answers which state that functions are not part of the JSON standard are correct.
The official answer is: no, it is not valid to define functions in JSON results!
The answer could be yes, because "code is data" and "data is code".
Even if JSON is used as a language-independent data serialization format, tunneling "code" through other types will work.
A JSON string might be used to pass a JS function to the client-side browser for execution.
[{"data":[["1","2"],["3","4"]],"aFunction":"function(){return \"foo bar\";}"}]
This leads to questions like: How to execute JavaScript code stored as a string (https://stackoverflow.com/questions/939326/execute-javascript-code-stored-as-a-string).
Be prepared to raise your "eval() is evil" flag and stick your "do not tunnel functions through JSON" flag next to it.
It is not standard as far as I know. A quick look at http://json.org/ confirms this.
Nope, definitely not.
If you use a decent JSON serializer, it won't let you serialize a function like that. It's a valid OBJECT, but not valid JSON. Whatever that website's intent, it's not sending valid JSON.
JSON explicitly excludes functions because it isn't meant to be a JavaScript-only data structure (despite the JS in the name).
A short answer is NO...
JSON is a text format that is completely language independent but uses conventions that are familiar to programmers of the C-family of languages, including C, C++, C#, Java, JavaScript, Perl, Python, and many others. These properties make JSON an ideal data-interchange language.
Look at the reason why:
When exchanging data between a browser and a server, the data can only be text.
JSON is text, and we can convert any JavaScript object into JSON, and send JSON to the server.
We can also convert any JSON received from the server into JavaScript objects.
This way we can work with the data as JavaScript objects, with no complicated parsing and translations.
But wait...
There are still ways to store your function. It's widely not recommended to do that, but it's still possible:
We said you can save a string... so how about converting your function to a string?
const data = {func: '()=>"a FUNC"'};
Then you can stringify data using JSON.stringify(data) and later use JSON.parse to parse it (if this step is needed)...
And eval to execute the string as a function (before doing that, just so you know: using eval is widely not recommended):
eval(data.func)(); //return "a FUNC"
Using NodeJS (CommonJS syntax) I was able to get this type of functionality working. I originally had just a JSON structure inside some external JS file, but I wanted that structure to be more like a class, with methods that could be decided at run time.
The declaration of 'Executor' in myJSON is not required.
var myJSON = {
  "Hello": "World",
  "Executor": ""
}
module.exports = {
  init: () => { return { ...myJSON, "Executor": (first, last) => { return first + last } } }
}
Function expressions in the JSON are entirely possible, just do not forget to wrap them in double quotes. Here is an example taken from a noSQL database design:
{
  "_id": "_design/testdb",
  "views": {
    "byName": {
      "map": "function(doc){if(doc.name){emit(doc.name,doc.code)}}"
    }
  }
}
although eval is not recommended, this works:
<!DOCTYPE html>
<html>
<body>
<h2>Convert a string written in JSON format, into a JavaScript function.</h2>
<p id="demo"></p>
<script>
function test(val){return val + " it's OK";}
var someVar = "yup";
var myObj = { "func": "test(someVar);" };
document.getElementById("demo").innerHTML = eval(myObj.func);
</script>
</body>
</html>
Leave the quotes off...
var a = {"b":function(){alert('hello world');} };
a.b();