I'm trying to write a Python class that takes a properly formatted Elasticsearch JSON query as a parameter. So far, so good. However, as part of that class, I'd also like to take "to" and "from" parameters to limit the date range that the query runs over.
Is there a way to combine the JSON for the DSL query with URI parameters to pass in the date and time constraint?
I know I can limit the time using a range filter like this:
GET /my-awesome-index*/_search
{
"query":
{
"bool":
{
"must": [{"match_all": {}}],
"filter":
[
{"range": {"date_time": {"gte": "now-24h","lte": "now"}}},
{"match_phrase": {"super_cool_field": "foo"}},
{
"bool":
{
"should":
[
{"match_phrase": {"somewhat_cool_field_1": "bar"}},
{"match_phrase": {"somewhat_cool_field_2": "boo-ta"}}
],
"minimum_should_match": 1
}
}
]
}
}
}
And that's all well and good, but I want to craft my class so the timeframe is a variable. I also know I can limit the timeframe by submitting the URL like this:
GET /my-awesome-index*/_search?q=date_time:[1611732033412+TO+1611777796000]
{
"query":
{
"bool":
{
"must": [{"match_all": {}}],
"filter":
[
{"match_phrase": {"super_cool_field": "foo"}},
{
"bool":
{
"should":
[
{"match_phrase": {"somewhat_cool_field_1": "bar"}},
{"match_phrase": {"somewhat_cool_field_2": "boo-ta"}}
],
"minimum_should_match": 1
}
}
]
}
}
}
However, when I submit that, Elasticsearch only seems to consider the timeframe from the URI and ignores the DSL JSON entirely.
Is there a way to get Elasticsearch to consider/combine the two queries into one?
I'm considering programmatically building the range query, something like this:
range_part = '{{"range":{{"{}":{{"gte":"{}","lte":"{}"}}}}}}'.format(field,start,end)
And then dynamically inserting it into whatever JSON the class takes... but that seems cumbersome, since the query can come in so many formats and I'd have to figure out where to put the string, etc.
Your suspicion regarding q is right. Quoting from the docs:
The q parameter overrides the query parameter in the request body. If both parameters are specified, documents matching the query request body parameter are not returned.
So q and query cannot be combined.
As to "programatically making the range query" -- since you don't know in what format you'll receive the queries, the standard approach would be to traverse the query json, find/create the correct bool-must/bool-filter and set the range query there.
But let's take a step back -- maybe your class shouldn't be expecting a pre-baked JSON query in the first place. Why not pass a query config array like
[
{
"type": "match_phrase",
"field": "super_cool_field",
"value": "foo"
},
...
]
and build the query yourself? That way you have full control over what gets passed downstream to ES. Plus, adding a date range query would be a piece of cake -- for example:
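Something along these lines, again a rough sketch in Python (the config entries mirror the array above, and "date_time" is just the field from your example):

def build_query(config, date_field, start, end):
    # Turn each config entry into a clause, then append the date range.
    clauses = [{c["type"]: {c["field"]: c["value"]}} for c in config]
    clauses.append({"range": {date_field: {"gte": start, "lte": end}}})
    return {"query": {"bool": {"must": [{"match_all": {}}], "filter": clauses}}}

body = build_query(
    [{"type": "match_phrase", "field": "super_cool_field", "value": "foo"}],
    "date_time", "now-24h", "now",
)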
I am syncing a data set with about 300k rows to Phonograph2 and need to make those records available via REST (endpoint: /phonograph2/api/search/tables).
My request looks as follows (retrieving records after a certain timestamp):
{
"tableRids": [
"ri.phonograph2.main.table.xxxxxxxxxxxxxxx"
],
"filter": {
"type" : "range",
"range": {
"field": "reco_timestamp",
"gte": "1634408219000"
}
}
}
The response ends with:
"nextPageToken": "xxxxxxx"
This leads me to the following questions:
How do I use the "nextPageToken" to retrieve the next set of results?
Can the consumer get a list/array of pages to consume?
Can the number of hits returned before the nextPageToken is issued be configured?
We discussed this with our Palantir project support and will use Objects Gateway, as suggested. Thanks for pointing us in this direction.
I have some documents in my index and I want to get notified if a specific field is set to false in any document.
Can I do this within Kibana alerts, or am I just better off running an exists/boolean query that checks for what I need and runs every day?
I have this query:
GET my-index/_count
{
"query": {
"bool": {
"must": [
{
"term": {
"product.is_visible": {
"value": "false"
}
}
}
]
}
}
}
I want to get notified if this query ever returns more than 0 hits. What would be a good solution?
Using Kibana is a good solution, especially since you can use the Elasticsearch query rule.
Then, to receive notifications, you can add an action (e.g. send an email) that runs when the rule condition is met (e.g. query hits above your threshold value of 0).
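If you would rather go with the scheduled query you mentioned instead, a minimal sketch against the plain _count endpoint could look like this (the host and the notification step are placeholders):

import requests

ES_URL = "http://localhost:9200"  # placeholder host
QUERY = {
    "query": {
        "bool": {
            "must": [
                {"term": {"product.is_visible": {"value": "false"}}}
            ]
        }
    }
}

response = requests.post(f"{ES_URL}/my-index/_count", json=QUERY)
response.raise_for_status()
count = response.json()["count"]
if count > 0:
    # Replace with your real notification (email, Slack, etc.)
    print(f"Alert: {count} documents have product.is_visible set to false")

Run it from cron (or any other scheduler) once a day and you get roughly the daily check you described, though the Kibana rule keeps everything inside the stack.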
I am trying to build a query filter as an array.
So, to make a GET call with some filters in Postman, I built a query like:
"query": [
{
"key": "type",
"value": 3
},
{
"key": "type",
"value": 4
},
{
"key": "type",
"value": 5
}]
It built the URL with the filters, like:
/api/3/vehicles/?type=3&type=4&type=5
But these filters should come from a previous API call.
So I built a script that builds the query like the one above and saves it in an environment variable.
let query = [];
for (let i = 0; i < data.length; i++) {
    query.push({'key': 'type', 'value': data[i].id});
}
postman.setEnvironmentVariable("query", query);
And, in the JSON file, I used it like:
"query" : {{query}}
But it seems Postman can't recognize it as an environment variable.
I can't even import the JSON file into Postman; I get a format error.
Is this something you've faced before? How can I solve this problem?
So when you check the environment variables, the "query" variable is not there, right?
Also, I'm not sure about the formatting. For setting an environment variable I use pm.environment.set("query", query); note that environment variables are stored as strings, so you may need to pass JSON.stringify(query) instead of the raw array to keep the structure intact.
You can also add console.log(query) after your for loop, open the Postman console (Ctrl+Alt+C), and check what query looks like. Maybe it will give you a hint.
I am getting JSON returned in this format:
{
"status": "success",
"data": {
"debtor": {
"debtor_id": 1301,
"key": value,
"key": value,
"key": value
}
}
}
Somehow, my RESTAdapter needs to provide my debtor model properties from the "debtor" section of the JSON.
Currently, I am getting a successful callback from the server, but a console error saying that Ember cannot find a model for "status". I can't find anything in the Ember model guide about how to deal with JSON that is nested like this.
So far, I have been able to do a few simple things like extending the RESTSerializer to accept "debtor_id" as the primaryKey and remove the pluralization of the GET URL request... but I can't find any clear guidance on reaching a deeply nested JSON property.
Extending the problem detail for clarity:
I need to somehow alter the default behavior of the Adapter/Serializer, because this JSON convention is being used for many purposes other than my Ember app.
My solution thus far:
With a friend, we were able to dissect the "extract API" (thanks #lame_coder for pointing me to it) and came up with a way to extend the serializer on a case-by-case basis, but I'm not sure if it's really an "Ember-approved" solution...
// app/serializers/debtor.js
import DS from 'ember-data';

export default DS.RESTSerializer.extend({
primaryKey: "debtor_id",
extract: function(store, type, payload, id, requestType) {
payload.data.debtor.id = payload.data.debtor.debtor_id;
return payload.data.debtor;
}
});
It seems that even though I was able to change my primaryKey for requesting data, Ember was still trying to use a hard-coded ID to identify the correct record (rather than the debtor_id that I had set). So we just overrode the extract method to force Ember to look for the correct primary key that I wanted.
Again, this works for me currently, but I have yet to see if this change will cause any problems moving forward...
I would still welcome a different solution that might be more stable/reusable/future-proof, etc., if anyone has any insights.
From the description of the problem, it looks like your model definition and JSON structure do not match. You need to make them exactly the same in order for the serializer to map them correctly.
If you decide to change your REST API, the return statement would be something like this (I am using mock data):
//your Get method on service
public object Get()
{
return new {debtor= new { debtor_id=1301,key1=value1,key2=value2}};
}
The JSON that Ember is expecting needs to look like this:
"debtor": {
"id": 1301,
"key": value,
"key": value,
"key": value
}
It sees "status" as a model that it needs to load data for. The next problem is that it needs to have "id" in there and not "debtor_id".
If you need to return several objects you would do this:
"debtors": [{
"id": 1301,
"key": value,
"key": value,
"key": value
},{
"id": 1302,
"key": value,
"key": value,
"key": value
}]
Make sense?
I'm having trouble properly formatting one particular SOAP parameter using the node-soap module for Node.js as a client to a 3rd-party SOAP service.
The client.describe() for this method says this particular input should be in the shape of:
params: { 'param[]': {} }
I have tried a bunch of different JSON notations to try to fit my data to that shape.
Examples of formats that do NOT work:
"params": { "param": [ {"myParameterName": "myParameterValue"} ] }
"params": [ "param": { "name": "myParameterName", "_": "myParameterValue"} ]
"params": { "param" : [ {"name": "myParameterName", "_": "myParameterValue"} ] }
"params": { "param[]": {"myParameterName": "myParameterValue" } }
"params": { "param[myParameterName]": {"_": "myParameterValue" } }
I must be overlooking something, and I suspect I'm going to feel like Captain Obvious when some nice person points out what I'm doing wrong.
Here is what DOES work, using other SOAP clients, and how they handle the "named parameter with a value":
soapUI for this method successfully accepts this particular input via XML in the shape of:
<ns:params>
<ns:param name="myParameterName">myParameterValue</ns:param>
</ns:params>
Also, using PHP, I can successfully make the call by creating a stdClass of arrays like so:
$parms = new stdClass;
$parms->param = array(
array(
"name"=>"myParameterName","_"=>"myParameterValue"
)
);
and then eventually passing
'params' => $parms
to the PHP soap client
Many thanks!
To get a better look at what XML was being generated by node-soap, I added a console.log(message) statement to node_modules/soap/lib/client.js after the object-to-XML encoding. I then began experimenting with various JSON structures to figure out empirically how they mapped to XML structures.
I found a JSON structure that gets node-soap to generate the XML in my 3rd party's required named-parameter-with-value format. I was completely unaware of the "$value" special keyword. It looks like this may have been added in the 0.4.6 release from mid-June 2014; see the change history.
"params": [
{
"param": {
"attributes": {
"name": "myParameterName"
},
$value: "myParameterValue"
}
}
]
(note the outer array, which gives me the luxury of specifying multiple "param" entries, which is sometimes needed by this particular 3rd-party API)
generates this XML:
<tns:params>
<tns:param name="myParameterName">myParameterValue</tns:param>
</tns:params>
which perfectly matches the structure in soapUI (which I already knew worked) of:
<ns:params>
<ns:param name="myParameterName">myParameterValue</ns:param>
</ns:params>