I'm trying to perform an Elasticsearch query as a POST request in order to pull data from an index I created. The data in the index is a table from a MySQL DB, loaded through Logstash.
Here is my request and the JSON body:
http://localhost:9200/response_summary/_search
Body:
{
  "query": {
    "query_string": {
      "query": "transactionoperationstatus:\"charged\" AND api:\"payment\" AND operatorid:\"XL\" AND userid:*test AND time:\"2015-05-27*\" AND responsecode:(200+201)"
    }
  },
  "aggs": {
    "total": {
      "terms": {
        "field": "userid"
      },
      "aggs": {
        "total": {
          "sum": {
            "script": "Double.parseDouble(doc['chargeamount'].value)"
          }
        }
      }
    }
  }
}
In the above JSON body, I need to append a timestamp to the query_string in order to get the data from the index within a date range. I tried adding this at the end of the query:
AND timestamp:[2015-05-27T00:00:00.128Z+TO+2015-05-27T23:59:59.128Z]"
Where am I going wrong? Any help would be appreciated.
You just need to remove the + signs. They are only necessary when sending a query via the URL query string (i.e., to URL-encode the spaces); with a query_string query in the request body, you don't need them:
AND timestamp:[2015-05-27T00:00:00.128Z TO 2015-05-27T23:59:59.128Z]"
(i.e., the two + signs around TO are replaced with plain spaces)
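Putting it together, the full request body would then look like the sketch below (the + in responsecode:(200 201) is replaced with a space for the same reason, the aggregation part is unchanged from the question, and this assumes the timestamp field is mapped as a date so the range syntax applies):

{
  "query": {
    "query_string": {
      "query": "transactionoperationstatus:\"charged\" AND api:\"payment\" AND operatorid:\"XL\" AND userid:*test AND time:\"2015-05-27*\" AND responsecode:(200 201) AND timestamp:[2015-05-27T00:00:00.128Z TO 2015-05-27T23:59:59.128Z]"
    }
  },
  "aggs": {
    "total": {
      "terms": {
        "field": "userid"
      },
      "aggs": {
        "total": {
          "sum": {
            "script": "Double.parseDouble(doc['chargeamount'].value)"
          }
        }
      }
    }
  }
}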
I have a JSON message like the one below. I am using dbt with the BigQuery plugin, and I need to create a table dynamically in BigQuery.
{
  "data": {
    "schema": "dev",
    "payload": {
      "lastmodifieddate": "2022-11-122 00:01:28",
      "changeeventheader": {
        "changetype": "UPDATE",
        "changefields": [
          "lastmodifieddate",
          "product_value"
        ],
        "committimestamp": 18478596845860,
        "recordIds": [
          "568069"
        ]
      },
      "product_value": 20000
    }
  }
}
I need to create a table dynamically with recordIds and the changed fields. This field list changes dynamically whenever the source sends an update.
Expected output:
recordIds | product_value | lastmodifieddate     | changetype
568069    | 20000         | 2022-11-122 00:01:28 | UPDATE
Thanks for your suggestions and help!
JSON objects can be saved in a BigQuery table. There is no need to use dbt here.
with tbl as (
  select 5 as row, JSON '''{
    "data": {
      "schema": "dev",
      "payload": {
        "lastmodifieddate": "2022-11-122 00:01:28",
        "changeeventheader": {
          "changetype": "UPDATE",
          "changefields": [
            "lastmodifieddate",
            "product_value"
          ],
          "committimestamp": 18478596845860,
          "recordIds": [
            "568069"
          ]
        },
        "product_value": 20000
      }
    }
  }''' as JS
)
select *,
  JSON_EXTRACT_STRING_ARRAY(JS.data.payload.changeeventheader.recordIds) as recordIds,
  JSON_EXTRACT_SCALAR(JS.data.payload.product_value) as product_value,
  JSON_VALUE(JS.data.payload.lastmodifieddate) as lastmodifieddate,
  JSON_VALUE(JS.data.payload.changeeventheader.changetype) as changetype
from tbl
If the JSON is saved as string in a BigQuery table, please use PARSE_JSON(column_name) to convert the string to JSON first.
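For example, a minimal sketch, assuming the raw payload sits in a STRING column called raw_json in a table called my_dataset.events (both names are hypothetical):

select
  JSON_VALUE(js.data.payload.changeeventheader.changetype) as changetype,
  JSON_VALUE(js.data.payload.lastmodifieddate) as lastmodifieddate
from (
  -- PARSE_JSON converts the STRING column into a JSON value first
  select PARSE_JSON(raw_json) as js
  from my_dataset.events
)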
I get log files from my firewall which I want to filter for several strings.
However, the string always contains some other information as well, so I want to filter the whole string for specific words that are always present: "User", "authentication", "failed".
I tried this, but I do not get any data back:
"query": {
"bool": {
"must": [
{
"range": {
"#timestamp": {
"gt": "now-15m"
}
}
},
{
"query_string": {
"query": "User AND authentication AND failed"
}
}
]
}
}
}
However, I cannot find the syntax for filtering on specific words within strings. Hopefully some of you can help me.
This is the message log (I want to filter on "event.original"): [screenshot]
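A minimal sketch of one common approach, targeting the event.original field with a match_phrase clause instead of a bare query_string (this assumes event.original is an analyzed text field, and uses the standard Logstash @timestamp field name rather than #timestamp):

{
  "query": {
    "bool": {
      "must": [
        {
          "range": {
            "@timestamp": {
              "gt": "now-15m"
            }
          }
        },
        {
          "match_phrase": {
            "event.original": "User authentication failed"
          }
        }
      ]
    }
  }
}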
I'm trying to perform an Elasticsearch query as a GET request in order to pull data from an index I created. The data in the index is a table from a MySQL DB, loaded through Logstash.
Here is my request without the IN clause:
http://localhost:9200/response_summary/_search?q=api:"location"+AND+transactionoperationstatus:"charged"+AND+operatorid='DIALOG'+AND+userid:test+AND+time:"2015-05-27"
In the above, I should be able to append the equivalent of sum(chargeAmount+0) and a GROUP BY. I tried searching the web, but couldn't find any solutions.
Any help would be appreciated.
Whatever you put after the q=... in your query uses the same syntax as a query_string query, so you can rewrite your query to leverage query_string and use aggregations to compute the desired sum:
curl -XPOST http://localhost:9200/response_summary/_search -d '{
  "query": {
    "query_string": {
      "query": "api:\"location\" AND transactionoperationstatus:\"charged\" AND operatorid:\"DIALOG\" AND userid:test AND time:\"2015-05-27\" AND responseCode:(401 403)"
    }
  },
  "aggs": {
    "total": {
      "terms": {
        "field": "chargeAmount"
      },
      "aggs": {
        "total": {
          "sum": {
            "field": "chargeAmount"
          }
        }
      }
    }
  }
}'
In Postman, you would send the same JSON as the raw body of a POST request to http://localhost:9200/response_summary/_search.
I'm new to Elasticsearch querying, so I'm a little lost on how to convert this SQL query to an Elasticsearch query:
SELECT time_interval, type, sum(count)
FROM test
WHERE (&start_date <= t_date <= &end_date)
GROUP BY time_interval, type
I know I can use the "range" query to set parameters for gte and lte, but if there's a clearer way to do this, that would be even better. Thanks in advance!
Edit:
My elasticsearch is setup to have an index: "test" with type: "summary" and contains JSON documents that have a few fields:
t_datetime
t_date
count
type
t_id
The IDs for these JSON documents are the t_date concatenated with the t_id values
Assuming t_datetime is the same as time_interval, you can use the query below:
POST test/summary/_search?search_type=count
{
  "aggs": {
    "filtered_results": {
      "filter": {
        "range": {
          "t_date": {
            "gte": "2015-05-01",
            "lte": "2015-05-30"
          }
        }
      },
      "aggs": {
        "time_interval_type_groups": {
          "terms": {
            "script": "doc['t_datetime'].value + '_' + doc['type'].value",
            "size": 0
          },
          "aggs": {
            "sum_of_count": {
              "sum": {
                "field": "count"
              }
            }
          }
        }
      }
    }
  }
}
This query makes use of scripts. On newer versions of Elasticsearch, dynamic scripting is disabled by default; to enable it, add the line script.disable_dynamic: false to the elasticsearch.yml file and restart the node, keeping the security implications in mind.
How can I convert the following SQL query into an Elasticsearch query?
SELECT sum(`price_per_unit`*`quantity`) as orders
FROM `order_demormalize`
WHERE date(`order_date`)='2014-04-15'
You need to use scripts to compute the product of the two values. For newer versions of Elasticsearch, enable dynamic scripting by adding the line script.disable_dynamic: false to the elasticsearch.yml file. Note that this may open a security hole in your Elasticsearch cluster, so enable dynamic scripting judiciously. Try the query below:
POST <indexname>/<typename>/_search?search_type=count
{
  "query": {
    "filtered": {
      "filter": {
        "term": {
          "order_date": "2014-04-15"
        }
      }
    }
  },
  "aggs": {
    "orders": {
      "sum": {
        "script": "doc['price_per_unit'].value * doc['quantity'].value"
      }
    }
  }
}