Querying a complete JSON object using Elasticsearch

I want to query for the complete JSON object. I have JSON objects saved in different documents:
_id: 1,
my_json: {
  "field1": "1",
  "field2": "2",
  "field3": "3",
  "field4": "4"
}

_id: 2,
my_json: {
  "field1": "11",
  "field2": "22",
  "field3": "33",
  "field4": "44"
}
In my query I want to match the complete combination, not a single field. So I tried the query below, but it returns an empty response.
{
  "query": {
    "bool": {
      "must": [
        {
          "terms": {
            "my_json": [
              {
                "field1": "11",
                "field2": "22",
                "field3": "33",
                "field4": "44"
              },
              {
                "field1": "1",
                "field2": "2",
                "field3": "3",
                "field4": "4"
              }
            ]
          }
        }
      ]
    }
  }
}
I need to work with more and more JSON objects like the ones above, so I cannot use a regular query. Hence I tried to match the complete JSON object.

If you want to match all properties of a JSON object, you can do so by using one term query per field in your JSON, i.e. each field of your JSON must match:
{
  "query": {
    "bool": {
      "must": [
        { "term": { "my_json.field1": "11" } },
        { "term": { "my_json.field2": "22" } },
        { "term": { "my_json.field3": "33" } },
        { "term": { "my_json.field4": "44" } }
      ]
    }
  }
}
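If you need to match any one of several complete objects (what the terms query above was attempting), you can nest one bool/must per object inside a bool/should. A sketch, assuming the my_json.* fields are mapped as keyword; term queries against analyzed text fields will generally not match:

{
  "query": {
    "bool": {
      "should": [
        {
          "bool": {
            "must": [
              { "term": { "my_json.field1": "1" } },
              { "term": { "my_json.field2": "2" } },
              { "term": { "my_json.field3": "3" } },
              { "term": { "my_json.field4": "4" } }
            ]
          }
        },
        {
          "bool": {
            "must": [
              { "term": { "my_json.field1": "11" } },
              { "term": { "my_json.field2": "22" } },
              { "term": { "my_json.field3": "33" } },
              { "term": { "my_json.field4": "44" } }
            ]
          }
        }
      ],
      "minimum_should_match": 1
    }
  }
}

Each inner bool matches one complete object, and the outer should ORs them together.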


How to JSONify a subdocument in a MongoDB aggregation pipeline

I have a collection containing documents with JSON subdocuments:
{
  "field1": { "some_keys_a": 42 },
  "field2": { "some_keys_b": 42 },
  "field3": { "some_keys_c": 42 },
  "fieldN": "..."
}
In an aggregation pipeline I need to serialize some of the fields into a string; at first I believed there ought to be a $toJson operator I could call like:
{
  "$set": {
    "serialized_subdocument": {
      "$toJson": {
        "field1": "$field1",
        "field2": "$field2",
        "field3": "$field3"
      }
    }
  }
}
My problem is that no such operator seems to exist. Is there another way to convert a subdocument or an array to a JSON string inside a pipeline?
I am using MongoDB 4.4 on Atlas.
For context, I was previously using a pipeline with $group, which does not require this stringification:
[
  // Omitted $match and other stuff
  {
    "$project": {
      "field1": true,
      "field2": true,
      "field3": true
    }
  },
  {
    "$group": {
      "_id": {
        "field1": "$field1",
        "field2": "$field2",
        "field3": "$field3"
      }
    }
  },
  {
    "$replaceRoot": {
      "newRoot": "$_id"
    }
  }
]
but lately the collection has grown significantly in size, so I would like to use $merge:
[
  // Omitted $match and other stuff
  {
    "$project": {
      "field1": true,
      "field2": true,
      "field3": true
    }
  },
  {
    "$set": {
      "_id": {
        "field1": "$field1",
        "field2": "$field2",
        "field3": "$field3"
      },
      "lookup_keys_for_later": "random_string_created_by_the_app_to_fetch_the_results"
    }
  },
  {
    "$merge": {
      "into": "unique_field1_field2_field3",
      "on": "_id",
      "whenMatched": "keepExisting",
      "whenNotMatched": "insert"
    }
  }
]
after which I would simply fetch the result by the "lookup_keys_for_later" key. The issue is that for $merge the _id needs to be a primitive.
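No $toJson (or similar) operator exists as of MongoDB 4.4. One possible workaround, sketched here under the assumption that server-side JavaScript is available ($function is not supported on some Atlas shared tiers), is the $function operator introduced in 4.4, which can call JavaScript's JSON.stringify to turn the compound key into a string primitive that $merge can accept as _id:

{
  "$set": {
    "_id": {
      "$function": {
        "body": "function(f1, f2, f3) { return JSON.stringify({ field1: f1, field2: f2, field3: f3 }); }",
        "args": ["$field1", "$field2", "$field3"],
        "lang": "js"
      }
    },
    "lookup_keys_for_later": "random_string_created_by_the_app_to_fetch_the_results"
  }
}

Because the object literal fixes the field order, the stringified key is deterministic, so identical field combinations still collide in the $merge stage as intended.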

JSONAssert ignore key

There are two JSON strings to compare.
JSON 1:
"values": {
"-1487778947": {
"field1": "xxx",
"field2": "yyy",
"field3": {
"zzz": {
"field4": 21,
"field5": 28
}
}
},
"-1820451085": {
"field1": "fgf",
"field2": "dfd",
"field3": {
"zzz": {
"field4": 56,
"field5": 78
}
}
},
}
JSON 2:
"values": {
"343434-35454-232467498": { // ignore this value
"field1": "xxx", // compare these fields
"field2": "yyy", // compare these fields
"field3": { // compare these fields
"zzz": { // compare these fields
"field4": 21, // compare these fields
"field5": 28 // compare these fields
}
}
},
"486787-4546-787344353": { // ignore this value
"field1": "fgf", // compare these fields
"field2": "dfd", // compare these fields
"field3": { // compare these fields
"zzz": { // compare these fields
"field4": 56, // compare these fields
"field5": 78 // compare these fields
}
}
},
}
I want to ignore the keys of these objects and just match the inner fields. Is this possible with JSONAssert or any other library?
We can ignore fields using a customization, but I have not found a way to ignore only the object key and validate the child values.
With Jayway JsonPath you can extract only the child nodes from both JSON documents, thus ignoring the keys.
Input JSON:
{
  "values": {
    "-1487778947": {
      "field1": "xxx",
      "field2": "yyy",
      "field3": {
        "zzz": {
          "field4": 21,
          "field5": 28
        }
      }
    },
    "-1820451085": {
      "field1": "fgf",
      "field2": "dfd",
      "field3": {
        "zzz": {
          "field4": 56,
          "field5": 78
        }
      }
    }
  }
}
JSONPath:
$.values.[*]
Output:
[
  {
    "field1": "xxx",
    "field2": "yyy",
    "field3": {
      "zzz": {
        "field4": 21,
        "field5": 28
      }
    }
  },
  {
    "field1": "fgf",
    "field2": "dfd",
    "field3": {
      "zzz": {
        "field4": 56,
        "field5": 78
      }
    }
  }
]
Online test tools:
JSONLint - The JSON Validator
Jayway JsonPath Evaluator
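Putting the two pieces together in Java, a minimal sketch (assuming Jayway JsonPath and JSONAssert are on the classpath; the class name is made up, and the two placeholder strings stand in for the documents from the question):

import com.jayway.jsonpath.JsonPath;
import org.skyscreamer.jsonassert.JSONAssert;
import org.skyscreamer.jsonassert.JSONCompareMode;

public class CompareIgnoringKeys {
    public static void main(String[] args) throws Exception {
        String json1 = "{ \"values\": { ... } }"; // first document from the question
        String json2 = "{ \"values\": { ... } }"; // second document from the question

        // Extract only the child objects; the generated keys are discarded.
        Object children1 = JsonPath.read(json1, "$.values.[*]");
        Object children2 = JsonPath.read(json2, "$.values.[*]");

        // LENIENT mode tolerates the child objects arriving in a different order.
        JSONAssert.assertEquals(children1.toString(), children2.toString(), JSONCompareMode.LENIENT);
    }
}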

Update JSON field via NiFi

I have a JSON like this coming from an input port:
{
  "url": "blablabla",
  "keyword": "foo"
}
Then I have to generate a new JSON to pass to a POST call. The new JSON is something like this:
{
  "requests": [
    {
      "source": "blablabla",
      "params": {
        "keywords": [
          "something"
        ],
        "sub-field1": true
      }
    }
  ],
  "field1": "1",
  "field2": "2",
  "field3": false
}
where the keywords array should be replaced with a new array holding the value from the previous JSON ("foo"). The result is:
{
  "requests": [
    {
      "source": "blablabla",
      "params": {
        "keywords": [
          "foo"
        ],
        "sub-field1": true
      }
    }
  ],
  "field1": "1",
  "field2": "2",
  "field3": false
}
Then I have to call a function via a REST POST.
The problem is that I don't know how to perform the replacement of the field value.
Try JoltTransformJSON with the following spec:
[
  {
    "operation": "shift",
    "spec": {
      "url": "requests[0].source",
      "keyword": "requests[0].params.keywords[]",
      "#1": "requests[0].params.sub-field1"
    }
  },
  {
    "operation": "default",
    "spec": {
      "field1": "1",
      "field2": "2",
      "field3": false
    }
  }
]
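For reference, running that spec on the sample input should produce something very close to the desired JSON. One hedge: a left-hand "#1" key in a shift operation writes the literal string "1", so sub-field1 comes out as the string "1" rather than the boolean true shown in the question, and may need to be adjusted in a later stage if a real boolean is required:

{
  "requests": [
    {
      "source": "blablabla",
      "params": {
        "keywords": ["foo"],
        "sub-field1": "1"
      }
    }
  ],
  "field1": "1",
  "field2": "2",
  "field3": false
}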

jq: mapping JSON objects to another schema

I need to transform an array of this kind of elements:
[
  {
    "Field1": "value1",
    "Field2": "value2"
  },
  {
    "Field1": "value3",
    "Field2": "value4"
  },
  ...
]
To:
[
  "PutRequest": {
    "Item": {
      "Field1": { "S": "value1" },
      "Field2": { "S": "value2" }
    }
  },
  "PutRequest": {
    "Item": {
      "Field1": { "S": "value3" },
      "Field2": { "S": "value4" }
    }
  },
  ...
]
I was thinking about using jq, but I can't quite figure out how to do it.
EDIT
So far, I've been able to get this:
[.[] | {"Field1": {"S": .Field1}, "Field2": {"S": .Field2}}]
Is there any way to say: for each field, add an entry like key: {"S": .value}?
EDIT 2
Using the map({PutRequest: {Item: map_values({S: .})}}) approach, it generates:
{
  "S": {
    "Field1": "value1",
    "Field2": "value2"
  }
}
I need:
"Item": {
"Field1": {
"S": "value3"
},
"Field2": {
"S": "value4"
}
}
Any ideas?
It does not exactly match your expected output, but you're probably looking for something like this:
map({PutRequest: {Item: map_values({S: .})}})
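For reference, a quick demo of that filter on the sample input (the input file name is made up):

$ jq 'map({PutRequest: {Item: map_values({S: .})}})' input.json
[
  {
    "PutRequest": {
      "Item": {
        "Field1": { "S": "value1" },
        "Field2": { "S": "value2" }
      }
    }
  },
  {
    "PutRequest": {
      "Item": {
        "Field1": { "S": "value3" },
        "Field2": { "S": "value4" }
      }
    }
  }
]

map transforms each element of the array, and map_values rewrites each field's value in place, so this works for arbitrary field names, which is exactly the "for each field" behaviour asked about.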

Elasticsearch: how to index a nested list

How do I create an index for a nested data structure with a list? There will be a list of otherUserIDs, and I don't know how to index them with Elasticsearch 6.5.
UserID -> OtherUserID -> name: "text", count: "long"
You can use the nested data type to create such a field and index a list of objects.
Refer to the example below and modify it to your needs:
Mapping:
PUT testindex
{
  "mappings": {
    "_doc": {
      "properties": {
        "nestedField": {
          "type": "nested",
          "properties": {
            "field1": {
              "type": "text",
              "fields": {
                "keywords": {
                  "type": "keyword"
                }
              }
            },
            "field2": {
              "type": "integer"
            }
          }
        }
      }
    }
  }
}
Adding documents:
For a single item in the list:
PUT testindex/_doc/1
{
  "nestedField": [
    {
      "field1": "Some text",
      "field2": 10
    }
  ]
}
For multiple items in the list:
PUT testindex/_doc/2
{
  "nestedField": [
    {
      "field1": "Some other text",
      "field2": 11
    },
    {
      "field1": "random value",
      "field2": 15
    }
  ]
}
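To search inside such a list, a nested field has to be queried with a nested query so that all clauses match within the same list element; without the nested wrapper, the clauses could match across different items. A sketch against the example index above:

GET testindex/_search
{
  "query": {
    "nested": {
      "path": "nestedField",
      "query": {
        "bool": {
          "must": [
            { "match": { "nestedField.field1": "random" } },
            { "term": { "nestedField.field2": 15 } }
          ]
        }
      }
    }
  }
}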