We are writing a JSONiq query to insert new properties into a JSON object and return the updated object from the query.
Query:
jsoniq version "1.0";
let $users := {
"name" : "Deadbeat Jim",
"address" : "1 E 161st St, Bronx, NY 10451",
"risk tolerance" : "high"
}
insert json {"status" : "credit card declined"} into $users
return $users
$users holds the input JSON; we are trying to add one more property using the JSONiq insert command, as described in the JSONiq documentation here.
We are getting below exception:
java.lang.RuntimeException: (no URI):13,1: static error [err:XPST0003]: invalid expression: syntax error, unexpected expression (missing comma "," between expressions?)
Questions:
Is the query correct? If not, how can it be made correct syntactically and logically?
Are there any good online resources for JSONiq with examples?
Here are some more explanations:
The way JSONiq updates work is identical to the way XQuery updates work. JSONiq updates are declarative: a JSONiq update program returns, in addition to an empty sequence in the data model, what is called a pending update list (PUL), which is a list of updates (deletions, replacements, renamings, insertions, etc) to be applied to some documents.
JSONiq update has snapshot semantics, meaning that no side effects occur during the evaluation of the main expression. Instead, after the PUL has been computed, the engine may propagate the changes specified by the PUL to underlying storage (such as a file on disk, or a document store).
A syntactically correct version of the question's example would be:
jsoniq version "1.0";
let $users := {
"name" : "Deadbeat Jim",
"address" : "1 E 161st St, Bronx, NY 10451",
"risk tolerance" : "high"
}
return insert json {"status" : "credit card declined"} into $users
However, in this case, the PUL returned contains changes against a JSON object created on the fly, in memory. The lifetime of this object is only that of the evaluation of the query, so that this program simply has no visible effect.
If the collection function is mapped in some way to a database in a document store such as Couchbase or MongoDB (that is, if the engine is documented and configured to do this), the following query will semantically apply an update to that document store.
jsoniq version "1.0";
let $users := collection("users")[$$.name eq "Jim"]
return insert json {"status" : "credit card declined"} into $users
A copy-modify-return expression (also called a transform expression, as in XQuery; see the other answer on this page) provides a way to apply changes in memory without losing them, and without any persistent storage. It:
creates a JSON object (as a copy of another), or an XML node, etc
modifies that object by applying the PUL obtained from the modify expression (important: this has no visible side effects, as only a copy is being modified)
returns the modified copy.
For advanced users: in this case, the copy clause contains a constructor that builds a fresh object, so the optimizer can actually skip the copying.
This is the way to make JSON updates work in JSONiq: we need to use a copy-modify-return expression:
jsoniq version "1.0";
copy $users := {
"name" : "Deadbeat Jim",
"address" : "1 E 161st St, Bronx, NY 10451",
"risk tolerance" : "high"
}
modify insert json {"status" : "credit card declined"} into $users
return $users
Hope this is helpful to someone.
I'm using SugarCRM rest API, and according to the documentation, to get a set of records, I have to use /<module> GET endpoint and pass JSON in the body to filter the query.
First, is it even possible to have a body in a GET request?
And if so, how can I build this kind of request?
I'm using Postman and tried to pass parameters as query strings, but that didn't work.
As far as I know you have to put everything in the query string, which might look different from what you'd expect.
Example for a request to /Users:
{
max_num: 100,
fields: ["first_name", "last_name"],
filter: [
{"user_name":"admin"},
{"status":"Active"}
]
}
Written as query string this request will look like this:
/rest/v10/Users?max_num=100&fields=first_name,last_name&filter[0][user_name]=admin&filter[1][status]=Active
Observations regarding the query string format:
There is no { or }, the values of the request object are placed directly in the query string
Key-Value pairs are assigned with =, and separated by & (instead of : and ,)
There are no " or ' quotes at all, strings are written without those
An array of values (here: fields) is just one assignment with all values separated by ,
An array of objects (here: filter) has one Key-Value pair per bottom value and uses [ and ] to indicate the "path" to each value. Using 0-based numerical indices for arrays
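The conversion rules above can be sketched in Python (a rough illustration only, not official SugarCRM client code; the helper name `to_query_string` is made up):

```python
def to_query_string(params):
    """Flatten a SugarCRM-style request object into a query string.

    Scalars become key=value, arrays of scalars become one comma-joined
    value, and arrays of objects become key[index][subkey]=value pairs.
    """
    pairs = []
    for key, value in params.items():
        if isinstance(value, list):
            if value and isinstance(value[0], dict):
                # array of objects -> key[i][subkey]=value, 0-based indices
                for i, obj in enumerate(value):
                    for subkey, subval in obj.items():
                        pairs.append(f"{key}[{i}][{subkey}]={subval}")
            else:
                # array of scalar values -> single assignment, comma-separated
                pairs.append(f"{key}={','.join(str(v) for v in value)}")
        else:
            pairs.append(f"{key}={value}")
    return "&".join(pairs)

request = {
    "max_num": 100,
    "fields": ["first_name", "last_name"],
    "filter": [{"user_name": "admin"}, {"status": "Active"}],
}
print(to_query_string(request))
# -> max_num=100&fields=first_name,last_name&filter[0][user_name]=admin&filter[1][status]=Active
```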
Notes
Keep in mind there are length limits on the URL, including the query string (e.g. 4096 bytes/chars for Apache 2, if I remember correctly). If you have to send very elaborate requests, you might want to use POST /rest/v10/<module>/filter instead.
URL-escaping (usually not necessary for these characters) applies to the keys and values, not to the = and & separators. With the brackets and comma percent-encoded, the example filter would look like this:
/rest/v10/Users?max_num=100&fields=first_name%2Clast_name&filter%5B0%5D%5Buser_name%5D=admin&filter%5B1%5D%5Bstatus%5D=Active
I created a post method to receive the geolocation data of customers:
Post method
When I call the post method with the JSON:
{"customer": 1, "latitude":-21.13179, "longitude":-47.736782 }
my PL/SQL Script works.
Now I'd like to send a group of records but I don't know how to do it.
I created a PUT method to receive a collection of geolocations, and I wrote a script just to parse the parameter:
Put method
When I call the put method with the JSON:
{
"items":[
{
"customer":1,
"latitude":-21.13179,
"longitude":-47.736782
},
{
"customer":1,
"latitude":-21.13179,
"longitude":-47.736782
}
]
}
PL/SQL code:
declare
l_values apex_json.t_values;
begin
apex_json.parse (
p_values => l_values,
p_source => :items );
end;
I received the message:
400 - Bad Request - Expected a value but got: START_ARRAY.
What am I doing wrong?
I want to create a post/put method to receive a collection.
Thanks for your help.
There is an example on Oracle-Base that shows a way to use JSON_TABLE in 12c and the JSON_OBJECT_T PL/SQL type in 12cR2. The JSON data is passed as a BLOB to the stored procedure, which then parses it and applies the updates. I have not tested it yet, but it looks like a good approach for dealing with collections, which apparently cannot be handled by ORDS out of the box. I had experimented with using the bulk-load approach to load a temp table, but it was CSV-only and a bit tedious. Here's Jeff Smith's blog post on that.
I have not tested this yet; I rebuilt my approach to send each entry individually, but eventually I'll need to use it, and I'll update this answer with examples when I do.
I am facing the same issue, and the reason appears to be what is posted at the URL below.
https://community.oracle.com/thread/2182167?start=0&tstart=0
"In APEX Listener 1.1 the PL/SQL Hander will automatically convert JSON properties to implicit parameters. Note this will only work for simple JSON objects, arrays or nested object are not supported."
Basically, one can't pass collections/arrays. I'm not sure whether this has changed since, or whether there are plans to change it on the roadmap.
I am trying to serialize some Clojure data structures into a persistent database, and I currently use Cheshire for that purpose.
Let's say I have a map that contains namespaced keywords, like the following:
{:cemerick.friend/identity {:current friend, :authentications {friend {:identity friend, :roles #{:clojure-cms.handler/user}}}}}
It gets serialized into JSON like this:
{"cemerick.friend/identity":{"current":"friend","authentications":{"friend":{"identity":"friend","roles":["clojure-cms.handler/user"]}}}}
When reading it back and deserializing (with keywordization: (parse-string data true)), I get the following:
{:cemerick.friend/identity {:current friend, :authentications {:friend {:identity friend, :roles [clojure-cms.handler/user]}}}}
How can I parse this JSON and get back the same data as the original?
Note : this question gives some context to what I am trying to achieve.
Looking at the tests in Cheshire, it's quite obvious that the optional keyword parameter to parse-string affects all name attributes in a JSON object; value attributes, like the namespaced keyword in your example, are not affected. Actually, your problem is two-fold: the original set is also not converted back correctly.
For the set problem, what you could do is write a custom decoder as described in the Cheshire documentation.
For the original problem, there is probably no direct way other than to post-process the returned map: find the value under :roles and turn its elements back into a set of keywords, like so (untested):
(defn postprocess-json [authmap]
  (update-in authmap
             [:cemerick.friend/identity :authentications :friend :roles]
             #(into #{} (map keyword %))))
I am working on a (.NET) REST API which returns some JSON data. The consumer of the API is an embedded client. We have been trying to establish the structure of the JSON we will be working with. The format the embedded client wants to use is something I have not seen before when working with JSON. I suggested that it is not "typical" JSON, and was met with the question, "Where is 'typical' JSON format documented?"
As an example of JSON I "typically" see:
{
"item" : {
"users": [ ... list of user objects ... ],
"times": [ ... list of time objects ...]
}
}
An example of the non-typical JSON:
{
"item" : [
{
"users": [ ... list of user objects ... ]
},
{
"times": [ ... list of time objects ...]
}
]
}
In the second example, item contains an array of objects, each of which contains a single property whose value is an array of entities. This is valid JSON. However, I have not encountered another instance of JSON structured this way when it is not an arbitrary array of objects but is in fact a fixed list of properties on the "item" object.
In searching json.org, stackoverflow.com and other places on the interwebs I have not found any guidelines on why the structure of JSON follows the "typical" example above rather than the second example.
Can you provide links to documentation that would provide recommendations for one format or the other above?
Not a link, but a straightforward answer: items are either indexed (0, 1, 2, ...) or keyed (users, times). No matter what software you use, you can get at indexed or keyed data equally easily and quickly. But not with what you call "non-typical" JSON: to get at the users, I have to iterate through the array and find the one dictionary that has the key "users". But there might be two or more dictionaries with that key, so what am I supposed to do then? If you use JSON Schema, the "non-typical" JSON is impossible to check.
In iOS, in the typical case I write
NSArray* users = itemDict[@"users"];
For the non-typical JSON I have to write
NSArray* users = nil;
for (NSDictionary* dict in itemArray)
    if (dict[@"users"] != nil)
        users = dict[@"users"];
but that still has no error checking for multiple dicts with the key "users", an error that in the first case isn't even possible. So just tell them that what they are asking for is rubbish and creates nothing but unnecessary work. With other software, you'll probably have the same problems.
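The same comparison holds in Python (a hypothetical sketch with made-up data, mirroring the iOS code above):

```python
# "Typical" form: one object keyed by name -> direct lookup
item_dict = {"users": ["alice", "bob"], "times": [9, 17]}
users = item_dict["users"]

# "Non-typical" form: an array of single-key objects -> must scan
item_array = [{"users": ["alice", "bob"]}, {"times": [9, 17]}]
users_scanned = None
for d in item_array:
    if "users" in d:
        users_scanned = d["users"]  # silently keeps the last match if the key repeats

print(users == users_scanned)  # True, but only after extra work and with no duplicate check
```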
In JMeter, I need to extract some fields (City, Classification, and Chain) from a JSON response:
{
"StoreCode": "111243",
"StoreName": "Spencer - Sec 14 Gurgaon",
"Address1": "Gurgaon-Sector-14",
"Address2": "NCR",
"Pin": "110000",
"City": "NCR",
"Classification": "Vol 4",
"Chain": "Spencers",
"Version": "20281",
"VisitType": "Weekly"
}
Can it be done using the regular expression extractor? Is there another option?
If this piece of JSON is the whole response, it makes sense to use the Regular Expression Extractor.
If you receive a larger or more complicated structure, it's better to use the special JSON Path Extractor, available through a plugin. JSON Path expressions for your JSON response would be something like $.City, $.Chain, etc.
See the "Parsing JSON" chapter of the Using the XPath Extractor in JMeter guide for more details on the JSON Path language and on how to install the plugin.
It's very easy with the plugin mentioned. See this example; here is the link to the plugin.
My biggest hurdle was understanding the flow. In your JMeter test you need an HTTP Request sampler that returns data (in this case JSON data). When you run the test you'll see the JSON in the Response Data tab if you have a View Results Tree listener attached. If you do, right-click on the HTTP Request you want data from: Add => Post Processors => jp@gc - JSON Path Extractor. Inside that extractor, you can name it anything you want.
The variable name should be one you have already defined in a User Defined Variables config element. The JSON Path starts with a dollar sign, then a dot, then the name of the key whose value you want from your JSON. So for me, $.logId extracts the value from ... "logId": 4, ... in my JSON, and stores the number 4 in my user-defined variable. The default value can be set to something you'd never see, like -1, null, or false.
BTW, you reference your variable in your tests with ${variablename}. If you're putting it into JSON and it's a string, write "${variablename}". Hope it helps.
There are lots of ways to do this with regular expressions. Sharing one of them:
"City": "(.*)",
"Classification": "(.*)",
"Chain": "(.*)",
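To sanity-check those patterns outside JMeter, the same regular expressions can be applied to (a shortened copy of) the sample response in Python; this is an illustration only, as JMeter applies them via the Regular Expression Extractor:

```python
import re

response = '''{
    "StoreCode": "111243",
    "StoreName": "Spencer - Sec 14 Gurgaon",
    "City": "NCR",
    "Classification": "Vol 4",
    "Chain": "Spencers",
    "Version": "20281"
}'''

# The same patterns as in the Regular Expression Extractor;
# greedy (.*) is safe here because there is one value per line
city = re.search(r'"City": "(.*)",', response).group(1)
classification = re.search(r'"Classification": "(.*)",', response).group(1)
chain = re.search(r'"Chain": "(.*)",', response).group(1)
print(city, classification, chain)  # NCR Vol 4 Spencers
```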