DataPower: Map JSON response to XML using XSLT

My SOR application is returning a result set, and I'm performing the binary-encode and decode below to retrieve the JSON:
<xsl:variable name="response_json">
<xsl:copy-of select="dp:decode(dp:binary-encode($resp/result/binary/node()), 'base-64')"/>
</xsl:variable>
response_json now holds: {"code":"00", "description":"Success"}.
Now, how do we parse the above response_json and assign the "code" and "description" values to xsl:variables and context variables using XSLT?

You have two options, IMO one good and one bad...
Let's start with the bad one, which you are already on since you are using XSLT for JSON data.
The even worse option is of course to treat the JSON as text only and check the values of code and description with XSLT string functions; otherwise you need to transform the JSON data into JSONx, which can be handled in XSLT, by using a Convert Query Params to XML action.
To do this you need to pass your JSON to a new Action (Convert QP) and then into a new Transform.
You can then use DP's JSONx to handle any "JSON" data operations you'd like as XML in the XSLT stylesheet, and when done, call store://jsonx2json.xsl in a new Transform to turn it back into "real" JSON.
I assume you are using dp:url-open() since you get your response in a variable, but if your service has its Request/Response type set to JSON, the context variable __JSONASJSONX will automatically hold a JSONx object for you that you can use as the INPUT to any XSLT Transform action.
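For reference, here is a rough sketch of what that JSONx handling could look like in the XSLT. The context-variable names (var://context/api/...) are placeholders I chose, not anything from your configuration:
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:dp="http://www.datapower.com/extensions"
    xmlns:json="http://www.ibm.com/xmlns/prod/2009/jsonx"
    extension-element-prefixes="dp">
  <xsl:template match="/">
    <!-- {"code":"00","description":"Success"} arrives as JSONx, roughly:
         <json:object>
           <json:string name="code">00</json:string>
           <json:string name="description">Success</json:string>
         </json:object> -->
    <xsl:variable name="code" select="string(//json:string[@name = 'code'])"/>
    <xsl:variable name="description" select="string(//json:string[@name = 'description'])"/>
    <!-- copy the values into context variables for use by later actions -->
    <dp:set-variable name="'var://context/api/code'" value="$code"/>
    <dp:set-variable name="'var://context/api/description'" value="$description"/>
  </xsl:template>
</xsl:stylesheet>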
That being said, let's move on to the good solution: use GWS!
GWS (GatewayScript) was added to DataPower to handle JSON (and anything other than XML), and you can call GWS from your XSLT using dp:gatewayscript() as well. See this sample: https://github.com/ibm-datapower/datapower-samples/blob/master/gatewayscript/example-ext-func-dp-gwscript.xsl
If your Transform is not doing anything XML-specific, I'd rewrite it in GWS!

Related

Converting a JSON object with variable attributes into an XML array

The API response is {"Content":{"634331":["Product could not be found"],"634332":["Product could not be found"], etc…
and I am having trouble getting the values under Content into XML:
<Content>
<__634331>Product could not be found</__634331>
<__634332>Product could not be found</__634332>
<__123104398>Product could not be found</__123104398>
The values are being interpreted as field names.
Is there a way to convert the JSON object into an XML array that looks like:
<Content>
<res>
<key>634331</key>
<value>Product could not be found</value>
</res>
<res>
<key>634332</key>
<value>Product could not be found</value>
</res>
It's quite common in JSON for keys in a map (or "object") to represent data values rather than property names. Unfortunately this doesn't map at all well to XML, where the equivalent would usually be a structure like
<data key="__634331" value="Product could not be found"/>
No automatic converter is going to be able to recognise that this kind of conversion is appropriate.
My recommendation would be to do a custom conversion using XSLT 3.0 template rules. I would need to see more detail of your JSON and required XML to advise in more detail.
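Purely as an illustration, here is a minimal sketch of such a conversion built on json-to-xml(), assuming the JSON shown in the question is passed in as a string parameter (the parameter name json is my own choice) and an XSLT 3.0 processor such as Saxon is used:
<xsl:stylesheet version="3.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:xs="http://www.w3.org/2001/XMLSchema"
    xmlns:fn="http://www.w3.org/2005/xpath-functions"
    exclude-result-prefixes="xs fn">
  <xsl:output indent="yes"/>
  <!-- the raw JSON text, e.g. {"Content":{"634331":["Product could not be found"], ... }} -->
  <xsl:param name="json" as="xs:string"/>
  <xsl:template name="xsl:initial-template">
    <!-- json-to-xml() turns each "634331": [...] entry into <array key="634331"> -->
    <xsl:apply-templates select="json-to-xml($json)/fn:map/fn:map[@key = 'Content']"/>
  </xsl:template>
  <!-- each entry of the Content object becomes a <res> with explicit key/value children -->
  <xsl:template match="fn:map[@key = 'Content']">
    <Content>
      <xsl:for-each select="fn:array">
        <res>
          <key><xsl:value-of select="@key"/></key>
          <value><xsl:value-of select="fn:string[1]"/></value>
        </res>
      </xsl:for-each>
    </Content>
  </xsl:template>
</xsl:stylesheet>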

Does JSON to XML lose me anything?

We have a program that accepts as data XML, JSON, SQL, OData, etc. For the XML we use Saxon and its XPath support and that works fantastic.
For JSON we use the jsonPath library which is not as powerful as XPath 3.1. And jsonPath is a little squirrelly in some corner cases.
So... what if we convert the JSON we get to XML and then use Saxon? Are there limitations to that approach? Are there JSON constructs that won't convert to XML, like anonymous arrays?
The headline question: The json-to-xml() function in XPath 3.1 is lossless, except that by default, characters that are invalid in XML (such as NUL, or unpaired surrogates) are replaced by a SUB character -- you can change this behaviour with the option escape=true.
The losslessness has been achieved at some cost in convenience. For example, JSON property names are not translated to XML element or attribute names, but rather to values of the key attribute.
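As a small illustration with made-up data, json-to-xml('{"name":"Widget","sizes":[1,2]}') produces the following (indentation added):
<map xmlns="http://www.w3.org/2005/xpath-functions">
  <string key="name">Widget</string>
  <array key="sizes">
    <number>1</number>
    <number>2</number>
  </array>
</map>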
Lots of different people have come up with lots of different conversions of JSON to XML. As already pointed out, the XPath 3.1 and XSLT 3.0 specs have a lossless, round-tripping conversion with json-to-xml and xml-to-json that can handle any JSON.
There are simpler conversions that handle limited sets of JSON; the main problem is how to represent JSON property names that don't map to XML names. For example, { "prop 1" : "value" } is represented by json-to-xml as <string key="prop 1">value</string>, while conversions that try to map the property name to an element or attribute name either fail to create well-formed XML (e.g. <prop 1>value</prop 1>) or have to escape the space in the element name (e.g. <prop_1>value</prop_1>, or with some hex representation of the Unicode code point of the space inserted).
In the end I guess you want to select the property foo in { "foo" : "value" } as foo, which the simple conversions would give you; in XPath 3.1 you would need ?foo for the XDM map, or fn:string[@key = 'foo'] for the json-to-xml result format.
With { "prop 1" : "value" }, the latter simply remains fn:string[@key = 'prop 1'], while the ? approach needs to be changed to ?('prop 1') or .('prop 1'). Any conversion that has escaped the space in an element name requires you to change the path accordingly, e.g. to prop_1.
There is no ideal way for all kinds of JSON, I think; in the end it depends on the JSON formats you expect and the willingness or time of users to learn a new selection/querying approach.
Of course you can use JSON-to-XML conversions other than json-to-xml and then use XPath 3.1 on any XML format. I think that is what the oXygen folks opted for: they had a JSON-to-XML conversion before XPath 3.1 provided one and have mainly stuck with it, so in oXygen you can write "path" expressions against JSON because, under the hood, the path is evaluated against an XML conversion of the JSON. I am not sure how much effort it takes to indicate which values in the original JSON have been selected by XPath path expressions against the XML format; that is probably not easy or straightforward.

Transformation from XML to JSON

Is there a standard way to transform an input XML document with structure (scheme) of my choice to an output JSON object with structure (scheme) of my choice?
If it were transformation from input XML to output XML, I would use XSLT.
I can imagine the following three approaches:
Direct transformation from XML to JSON, i.e. a way to describe transformation XML -> JSON just like XSLT describes transformation XML -> XML.
I am aware of JSONML. It is a lossless JSON representation of arbitrary XML document. However, the resulting JSON object does not have the structure of my choice. If there were some standard way to describe transformation JSON -> JSON, I would chain XML -> JSONML and JSONML -> JSON.
If there were the opposite to JSONML (let's call it "XMLSON", i.e. a lossless XML notation of arbitrary JSON object), I would chain XML -> XMLSON (via XSLT) and XMLSON -> JSON.
All three options involve some "if there were". I wonder if there really is some technology to achieve the goal.
Thanks.
XSLT 3 supports transforming any XML to the XML representation of JSON defined in https://www.w3.org/TR/xslt-30/#schema-for-json, and then lets you use https://www.w3.org/TR/xslt-30/#func-xml-to-json to convert that particular XML to JSON.
The output of XSLT does not need to be XML, so if you are comfortable with that approach, you can go ahead and use it to output JSON.
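A minimal sketch of that approach, with made-up input element names (person, name, age are assumptions, not from your data):
<xsl:stylesheet version="3.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:fn="http://www.w3.org/2005/xpath-functions">
  <xsl:output method="text"/>
  <!-- build the XML representation defined by the XSLT 3.0 schema-for-json,
       then serialize it with xml-to-json() -->
  <xsl:template match="/person">
    <xsl:variable name="json-xml" as="element()">
      <fn:map>
        <fn:string key="name"><xsl:value-of select="name"/></fn:string>
        <fn:number key="age"><xsl:value-of select="age"/></fn:number>
      </fn:map>
    </xsl:variable>
    <xsl:value-of select="xml-to-json($json-xml)"/>
  </xsl:template>
</xsl:stylesheet>
With an input like <person><name>John</name><age>42</age></person>, this outputs {"name":"John","age":42}.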
A quick search turned up this, which might be a good example for you to start from: https://github.com/bramstein/xsltjson
It defines an XSLT function which takes an XML tree as input and generates a string as output. Looking at the source, the basic approach is to generate an XML tree with nodes for each JSON object, array, and value, and then apply templates to that tree which output the JSON syntax itself.
For instance, to output a JSON array, it first generates an XML node of <json:array>...</json:array>, and then applies this template:
<xsl:template match="json:array" mode="json">
<xsl:variable name="values">
<xsl:apply-templates mode="json"/>
</xsl:variable>
<xsl:text/>[<xsl:text/>
<xsl:value-of select="string-join($values/value,',')"/>
<xsl:text/>]<xsl:text/>
</xsl:template>

Specify type when converting from XML to JSON in MarkLogic

Using MarkLogic 8, I'm using a custom XML to JSON conversion for json:transform-to-json, and I've got it working just about right except the conversion is outputting numbers as strings.
Is there a way to specify that the value of a particular element should be a number value, not a string?
I don't see anything in the doc for json:config, but just in case there's something I've missed, or if you have a neat post-processing trick, I'd love to hear about how to solve this problem.
You can do that by defining an XML Schema for the non-string type elements. Just make sure it is available in the context (by loading it into xdmp:schemas-database()), and that it is recognized (your XML needs to have a namespace that matches the XML Schema, and you might want to use import schema).
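As a minimal sketch (the namespace and element name below are made up, not taken from your data), a schema along these lines marks an element as numeric so the conversion can emit a JSON number instead of a string:
<!-- hypothetical schema: adjust targetNamespace and element names to match
     your documents, then load it into the database returned by
     xdmp:schemas-database() so it is in scope for the conversion -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/order"
           xmlns="http://example.com/order"
           elementFormDefault="qualified">
  <!-- declared as xs:decimal, so its typed value is a number, not a string -->
  <xs:element name="price" type="xs:decimal"/>
</xs:schema>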
HTH!

name json variable and jsonString variable convention?

JSON could mean a JSON type or a JSON string.
It starts to confuse me when different libraries use "json" with two different meanings.
I wonder how other people name these variables to tell them apart.
For all practical purposes, "JSON" has exactly one meaning, which is a string representing a JavaScript object following certain specific syntax.
JSON is parsed into a JavaScript object using JSON.parse, and a JavaScript object is converted into a JSON string using JSON.stringify.
The problem is that all too many people have gotten into the bad habit of referring to plain old JavaScript objects as JSON. That is either confused or sloppy or both. {a: 1} is a JS object. '{"a": 1}' is a JSON string.
In the same vein, many people use variable names like json to refer to JavaScript objects derived from JSON. For example:
$.getJSON('foo.php').then(function(json) { ...
In the above case, the variable name json is ill-advised. The actual payload returned from the server is a JSON string, but internally $.getJSON has already transformed that into a plain old JavaScript object, which is what is being passed to the then handler. Therefore, it would be preferable to use the variable name data, for example.
If a library uses the term "json" to refer to things which are not JSON, but actually are JavaScript objects, it is a mark of poor design, and I'd suggest looking around for a different library.