jq conditional delete from array - json

I have this JSON I got from AWS (this is just a test I created, not my actual rules):
[
{
"Name": "Fortinet-all_rules",
"Priority": 0,
"Statement": {
"ManagedRuleGroupStatement": {
"VendorName": "Fortinet",
"Name": "all_rules",
"ExcludedRules": [
{
"Name": "Database-Vulnerability-Exploit-01"
},
{
"Name": "Database-Vulnerability-Exploit-02"
},
{
"Name": "Database-Vulnerability-Exploit-03"
},
{
"Name": "Malicious-Robot"
},
{
"Name": "OS-Command-Injection-01"
},
{
"Name": "OS-Command-Injection-02"
},
{
"Name": "SQL-Injection-01"
},
{
"Name": "SQL-Injection-02"
},
{
"Name": "SQL-Injection-03"
},
{
"Name": "Source-Code-Disclosure"
},
{
"Name": "Web-Application-Injection-01"
},
{
"Name": "Web-Application-Injection-02"
},
{
"Name": "Web-Application-Vulnerability-Exploit-01"
},
{
"Name": "Web-Application-Vulnerability-Exploit-02"
},
{
"Name": "Web-Application-Vulnerability-Exploit-03"
},
{
"Name": "Web-Application-Vulnerability-Exploit-04"
},
{
"Name": "Web-Application-Vulnerability-Exploit-05"
},
{
"Name": "Web-Application-Vulnerability-Exploit-06"
},
{
"Name": "Web-Application-Vulnerability-Exploit-07"
},
{
"Name": "Web-Scanner-01"
},
{
"Name": "Web-Scanner-02"
},
{
"Name": "Web-Scanner-03"
},
{
"Name": "Web-Server-Vulnerability-Exploit-01"
},
{
"Name": "Web-Server-Vulnerability-Exploit-02"
},
{
"Name": "Web-Server-Vulnerability-Exploit-03"
},
{
"Name": "Web-Server-Vulnerability-Exploit-04"
}
],
"ScopeDownStatement": {
"RegexPatternSetReferenceStatement": {
"ARN": "",
"FieldToMatch": {
"UriPath": {}
},
"TextTransformations": [
{
"Priority": 0,
"Type": "NONE"
}
]
}
}
}
},
"OverrideAction": {
"None": {}
},
"VisibilityConfig": {
"SampledRequestsEnabled": true,
"CloudWatchMetricsEnabled": true,
"MetricName": "Fortinet-all_rules"
}
},
{
"Name": "DDOS_rate_rule",
"Priority": 1,
"Statement": {
"RateBasedStatement": {
"Limit": 350,
"AggregateKeyType": "FORWARDED_IP",
"ScopeDownStatement": {
"NotStatement": {
"Statement": {
"IPSetReferenceStatement": {
"ARN": "",
"IPSetForwardedIPConfig": {
"HeaderName": "X-Forwarded-For",
"FallbackBehavior": "MATCH",
"Position": "FIRST"
}
}
}
}
},
"ForwardedIPConfig": {
"HeaderName": "X-Forwarded-For",
"FallbackBehavior": "MATCH"
}
}
},
"Action": {
"Block": {}
},
"VisibilityConfig": {
"SampledRequestsEnabled": true,
"CloudWatchMetricsEnabled": true,
"MetricName": "DDOS_rate_rule"
}
}
]
So what I want, for example, is to delete the element { "Name": "OS-Command-Injection-01" }.
I need to do it conditionally.
So I tried using select: jq '. | select([].Statement.ManagedRuleGroupStatement.ExcludedRules[].Name == "Malicious-Robot")'
The problem is it errors: jq: error (at <stdin>:150): Cannot iterate over null (null)
Also, if I try to chain selects, it doesn't work.
I will also need to delete several objects at once, but if I can delete one I can run the query several times, so that's not an issue.
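For reference, the error comes from the second rule (DDOS_rate_rule), which has no ManagedRuleGroupStatement, so the path yields null there; a minimal sketch of the failure and of the `arrays` guard used in the answers below (hypothetical toy input, not the actual rules file):

```shell
# .Statement.ManagedRuleGroupStatement.ExcludedRules is null for the rate-based
# rule, and iterating null with [] is a runtime error in jq:
echo '{"a": null}' | jq '.a[]' 2>&1 || true   # jq: error ... Cannot iterate over null (null)

# Guarding the path with `arrays` (select only array-typed values) skips the null:
echo '{"a": null}' | jq '[.a | arrays | .[]]'  # → []
```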

|= is useful for modifying elements of a data structure.
The left-hand side should return the things to modify. (Use parens if it contains |.)
The right-hand side is evaluated as if | was used instead of |=.
The right-hand side should return the new value. (Use parens if it contains |.)
The whole returns . with the modifications made.
jq '
( .[].Statement.ManagedRuleGroupStatement.ExcludedRules | arrays ) |=
map(select(.Name != "OS-Command-Injection-01"))
'
jqplay
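To remove several rules in one pass, the same `|=` update can filter with `IN`. A sketch against a trimmed-down stand-in for the input above (rule names taken from the question):

```shell
# Trimmed-down stand-in for the AWS rules file (only the parts that matter here)
cat > rules.json <<'EOF'
[{"Statement":{"ManagedRuleGroupStatement":{"ExcludedRules":[
  {"Name":"Malicious-Robot"},{"Name":"SQL-Injection-01"},{"Name":"Web-Scanner-01"}]}}},
 {"Statement":{"RateBasedStatement":{"Limit":350}}}]
EOF

# `arrays` skips rules where ExcludedRules is null (e.g. the rate-based rule),
# and IN(...) drops every listed name in a single pass.
jq '( .[].Statement.ManagedRuleGroupStatement.ExcludedRules | arrays ) |=
    map(select(.Name | IN("Malicious-Robot", "SQL-Injection-01") | not))' \
   rules.json > pruned.json
```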

To delete objects from arrays, you could use the template:
walk(if type == "array"
then map(select(
( type=="object" and
(.Name|IN( ... ) ) ) | not ))
else . end)

You can try this:
jq 'walk(if type=="object" and
(.Name|IN("OS-Command-Injection-01","SQL-Injection-03"))
then empty
else . end)' input-file

Related

find and replace in json file using jq filter

I have the below JSON file env.json, and I want to search for "[\"res\",\"q3\"]" and replace it with the value of a variable var1, "[\"res\"]".
{
"idsb": "marqd",
"data": {
"name": "bcon-dv-alert"
},
"ingress": {
"args": {
"params": [
{
"name": "spt",
"value": "cld"
},
{
"name": "scv",
"value": "sdv"
},
{
"name": "scr",
"value": "ord"
},
{
"name": "srm",
"value": "[\"res\",\"q3\"]"
},
{
"name": "tgo",
"value": "pbc"
}
]
},
"wfr": {
"name": "t-r-e"
},
"snm": "as-r"
}
}
I tried the below way, but it's not working:
var1="[\"res\"]"
jq '.ingress.args.params[] | select(.name=="srm").value |= ["'${var1}'"]' env.json
Where am I making a mistake? What's the right way to do it?
The final result should be:
{
"idsb": "marqd",
"data": {
"name": "bcon-dv-alert"
},
"ingress": {
"args": {
"params": [
{
"name": "spt",
"value": "cld"
},
{
"name": "scv",
"value": "sdv"
},
{
"name": "scr",
"value": "ord"
},
{
"name": "srm",
"value": "[\"res\"]"
},
{
"name": "tgo",
"value": "pbc"
}
]
},
"wfr": {
"name": "t-r-e"
},
"snm": "as-r"
}
}
Since you want to update ingress, and not return only the result of the loops, use:
.ingress.args.params |= map(select(.name=="srm").value |= "new-value")
Try it online
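A note on the variable: splicing `${var1}` into the filter both breaks quoting and wraps the value in an extra array. Passing it with `--arg` is safer; a sketch against a cut-down env.json (only the params that matter are included):

```shell
# Cut-down stand-in for env.json (just enough structure for the update)
cat > env.json <<'EOF'
{"ingress":{"args":{"params":[
  {"name":"srm","value":"[\"res\",\"q3\"]"},
  {"name":"tgo","value":"pbc"}]}}}
EOF

var1='["res"]'
# --arg passes the shell variable in as a jq string, so no quoting gymnastics;
# assigning through the path keeps the whole document in the output.
jq --arg v "$var1" \
   '(.ingress.args.params[] | select(.name=="srm")).value = $v' \
   env.json > updated.json
```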

Generate multiple JQ output documents from a single input, modifying each result

I want to do two operations on my JSON file. I tried to do it with jq and shell.
First one: I want to transform the parent elements into plain text values.
Second one: I want to remove one specific level in the JSON tree.
Input :
{
"template_first": {
"order": 0,
"index_patterns": [
"first"
],
"settings": {
"index": {
"codec": "best_compression",
"refresh_interval": "30s",
"analysis": {
"normalizer": {
"norm_case_insensitive": {
"filter": "lowercase",
"type": "custom"
}
}
},
"number_of_shards": "1",
"number_of_replicas": "1"
}
},
"mappings": {
"_doc": {
"dynamic": true,
"dynamic_templates": [
{
"strings": {
"mapping": {
"type": "keyword"
},
"match_mapping_type": "string"
}
}
],
"properties": {
"log.id": {
"type": "keyword"
},
"host.indexer.hostname": {
"type": "keyword"
},
"ts_indexer": {
"format": "strict_date_optional_time||epoch_millis",
"type": "date"
}
}
}
}
},
"template_second": {
"order": 0,
"index_patterns": [
"second"
],
"settings": {
"index": {
"codec": "best_compression",
"refresh_interval": "30s",
"analysis": {
"normalizer": {
"norm_case_insensitive": {
"filter": "lowercase",
"type": "custom"
}
}
},
"number_of_shards": "1",
"number_of_replicas": "1"
}
},
"mappings": {
"_doc": {
"dynamic": true,
"dynamic_templates": [
{
"strings": {
"mapping": {
"type": "keyword"
},
"match_mapping_type": "string"
}
}
],
"properties": {
"log.id": {
"type": "keyword"
},
"host.indexer.hostname": {
"type": "keyword"
},
"ts_indexer": {
"format": "strict_date_optional_time||epoch_millis",
"type": "date"
}
}
}
}
}
}
As you can see, there are two JSON objects in the file:
{
"template_first" : { ...},
"template_second" : { ... }
}
The first modification is to emit this command
PUT _template/template_number
in place of the key of each JSON object.
So the expected result
PUT _template/template_first
{...}
PUT _template/template_second
{...}
The second change comes with the removal of _doc level
Before :
"mappings": {
"_doc": {
"dynamic": true,
"dynamic_templates": [
{
"strings": {
"mapping": {
"type": "keyword"
},
"match_mapping_type": "string"
}
}
],
"properties": {
"log.id": {
"type": "keyword"
},
"host.indexer.hostname": {
"type": "keyword"
},
"ts_indexer": {
"format": "strict_date_optional_time||epoch_millis",
"type": "date"
}
}
}
}
Expected result
"mappings": {
"dynamic": true,
"dynamic_templates": [
{
"strings": {
"mapping": {
"type": "keyword"
},
"match_mapping_type": "string"
}
}
],
"properties": {
"log.id": {
"type": "keyword"
},
"host.indexer.hostname": {
"type": "keyword"
},
"ts_indexer": {
"format": "strict_date_optional_time||epoch_millis",
"type": "date"
}
}
}
So the full expected result looks like this:
PUT _template/template_first
{
"order": 0,
"index_patterns": [
"first"
],
"settings": {
"index": {
"codec": "best_compression",
"refresh_interval": "30s",
"analysis": {
"normalizer": {
"norm_case_insensitive": {
"filter": "lowercase",
"type": "custom"
}
}
},
"number_of_shards": "1",
"number_of_replicas": "1"
}
},
"mappings": {
"dynamic": true,
"dynamic_templates": [
{
"strings": {
"mapping": {
"type": "keyword"
},
"match_mapping_type": "string"
}
}
],
"properties": {
"log.id": {
"type": "keyword"
},
"host.indexer.hostname": {
"type": "keyword"
},
"ts_indexer": {
"format": "strict_date_optional_time||epoch_millis",
"type": "date"
}
}
}
}
PUT _template/template_second
{
"order": 0,
"index_patterns": [
"second"
],
"settings": {
"index": {
"codec": "best_compression",
"refresh_interval": "30s",
"analysis": {
"normalizer": {
"norm_case_insensitive": {
"filter": "lowercase",
"type": "custom"
}
}
},
"number_of_shards": "1",
"number_of_replicas": "1"
}
},
"mappings": {
"dynamic": true,
"dynamic_templates": [
{
"strings": {
"mapping": {
"type": "keyword"
},
"match_mapping_type": "string"
}
}
],
"properties": {
"log.id": {
"type": "keyword"
},
"host.indexer.hostname": {
"type": "keyword"
},
"ts_indexer": {
"format": "strict_date_optional_time||epoch_millis",
"type": "date"
}
}
}
}
I managed to do the second change (deleting one level of the JSON tree) by using the command:
jq 'keys[] as $k | map( .mappings =.mappings._doc )' template.json
But I don't know how to do the first change and the second change at the same time.
I tried to loop over the keys like this, without success:
for row in $(jq 'keys[] as $k | "\($k)"' template.json); do
_jq() {
echo ${row}
}
echo $(_jq '.name')
done
Calling jq just once, and having it write a NUL-delimited list of template-name / modified-template-content pairs (which a bash while read loop can then iterate over):
while IFS= read -r -d '' template_name && IFS= read -r -d '' template_content; do
echo "We want to do PUT the following to _template/$template_name"
printf '%s\n' "$template_content"
done < <(
jq -j '
to_entries[] |
.key as $template_name |
.value as $template_content |
($template_name, "\u0000",
($template_content | (.mappings = .mappings._doc) | tojson), "\u0000")
' <infile.json
)
I had some trouble with the done < <(...) part, which caused a syntax error in my shell (probably because process substitution is bash-only).
So I modified your script like this:
jq -j 'to_entries[] | .key as $template_name | .value as $template_content | ($template_name, "\u0000", ($template_content | (.mappings = .mappings._doc) | tojson), "\u0000")' < infile.json |
while IFS= read -r -d '' template_name && IFS= read -r -d '' template_content; do
echo "PUT _template/$template_name"
printf '%s\n' "$template_content"
done
Which does the job perfectly!
Thanks Charles

Extract parent node by filtering on child nodes

I need to filter JSON on some child elements and extract the parent node id.
Part of the JSON:
[
{
"data": {
"id": "2da44298-05ec-4bb5-acce-b524ef56328c",
"attributes": {
"units": [
{
"id": "1492de82-2f36-43bf-b077-5cf54a3f38b9",
"show_name": false,
"children": [
{
"id": "a5d76efa-5b21-4874-a8a5-c9c9f8317ee6",
"contents": [
{
"id": "b96c127c-6a4f-4a29-924d-63f0ba55972a",
"link": {
"url": "#",
"target": "_blank",
"data": {
"aspect-ratio": null
}
},
"duration": null
},
{
"id": "dbb7e8fd-aa35-4acc-8ad7-1c7dcd08a6d8",
"link": {
"data": {
"id": "dbb7e8fd-aa35-4acc-8ad7-1c7dcd08a6d8",
"aspect-ratio": null
}
},
"duration": null
}
]
},
{
"id": "8a805cd0-7447-4fac-b4fc-aaa9a2f7e649",
"contents": [
{
"id": "d64138b6-5195-48b4-a0f7-b087c5496587",
"link": {
"data": {
"id": "d64138b6-5195-48b4-a0f7-b087c5496587",
"aspect-ratio": null
}
},
"duration": null
},
{
"id": "392406b1-fa20-413b-a98a-4d1a5b201d8e",
"link": {
"url": "#",
"target": "_blank",
"data": {
"id": "423498d9-8e0f-41ef-891a-34b078862ce7",
"aspect-ratio": null
}
},
"duration": null
}
]
}
],
"contents": []
}
],
"text": []
}
},
"jsonapi": {
"version": "1.0"
}
}
]
For example, I need to extract the unit id, filtering on the contents id b96c127c-6a4f-4a29-924d-63f0ba55972
I tried the following expressions:
$..data..?(#contents.data.id == 'b96c127c-6a4f-4a29-924d-63f0ba55972a')].id
$..data..?(#contents.data.id == 'b96c127c-6a4f-4a29-924d-63f0ba55972a')].children.id
$..data..?(#contents.data.id == 'b96c127c-6a4f-4a29-924d-63f0ba55972a')].unit.id
I need to do it this way because those ids are obtained from different responses.
You could use nested filter operators to get the parent node, something like:
$..data.attributes.units.[?(#.children[?(#.content[?(#.id == 'b96c127c-6a4f-4a29-924d-63f0ba55972')])])].id
More information: JMeter's JSON Path Extractor Plugin - Advanced Usage Scenarios

Merging 2 JSONs into a single JSON with value parsing in bash

I have two JSONs:
{
"name": "paypal_modmon",
"description": "Role For Paypal admin-service box",
"run_list": [
"recipe[djcm_paypal_win::sslVerify]"
]
}
and
{
"name": "paypal_dev",
"default_attributes": {
"7-zip": {
"home": "%SYSTEMDRIVE%\\7-zip"
},
"modmon": {
"env": "dev"
},
"paypal": {
"artifact": "%5BINTEGRATION%5D"
}
},
"override_attributes": {
"default": {
"env": "developmen"
},
"windows": {
"password": "Pib1StheK1N5"
},
"task_sched":{
"credentials": "kX?rLQ4XN$q"
},
"seven_zip": {
"url": "https://djcm:Pib1StheK1N5#artifactory.dowjones.io/artifactory/djcm-zip-local/djcm/chef/paypal/7z1514-x64.msi"
}
},
"chef_type": "environment"
}
I want to read the values of "default_attributes" and "override_attributes" from the second JSON and merge them with the first JSON into an output like:
{
"description": "Role For Paypal admin-service box",
"run_list": [
"recipe[djcm_paypal_win::sslVerify]"
],
"chef_type": "environment",
"seven_zip": {
"url": "https://djcm:Pib1StheK1N5#artifactory.dowjones.io/artifactory/djcm-zip-local/djcm/chef/paypal/7z1514-x64.msi"
},
"task_sched": {
"credentials": "kX?rLQ4XN$q"
},
"windows": {
"password": "Pib1StheK1N5"
},
"paypal": {
"artifact": "%5BINTEGRATION%5D"
},
"modmon": {
"env": "dev"
},
"7-zip": {
"home": "%SYSTEMDRIVE%\\7-zip"
},
"default": {
"env": "developmen"
},
"name": "paypal_modmon"
}
Is there a way to do this in bash, and how would I go about achieving it?
Generally, if you're reading in multiple files, you should use the --argfile option so you can reference the contents of the file by name. And judging by the names of the attributes you wish to merge, you should be wary of the different merging options you have: default_attributes suggests attributes that should be used if omitted, while override_attributes suggests it should force its values in.
$ jq --argfile merge input2.json \
'($merge.default_attributes * .) + $merge.override_attributes' input1.json
By merging the input with the default_attributes using *, it allows you to start with the defaults and add your actual values in place. That way missing values end up being provided by the default object.
Then adding the override_attributes object, the values are completely replaced and not just merged.
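The difference between `*` and `+` is easiest to see on a pair of toy objects (hypothetical files, just to illustrate the two operators):

```shell
echo '{"a":{"x":1,"y":2}}' > defaults.json
echo '{"a":{"y":9}}'       > actual.json

# `*` merges objects recursively: nested keys from both sides survive.
jq -s '.[0] * .[1]' defaults.json actual.json > deep.json     # {"a":{"x":1,"y":9}}

# `+` replaces whole top-level values: the right side's "a" wins outright.
jq -s '.[0] + .[1]' defaults.json actual.json > shallow.json  # {"a":{"y":9}}
```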
Got it. With jq it seems super simple:
jq -s '.[0] + .[1].default_attributes + .[1].override_attributes' a-roles.json a-env.json > manifest.json
manifest.json ->
{
"default": {
"env": "developmen-jq"
},
"7-zip": {
"home": "%SYSTEMDRIVE%\\7-zip"
},
"name": "paypal_modmon",
"description": "Role For Paypal admin-service box",
"run_list": [
"recipe[djcm_paypal_win::sslVerify]"
],
"seven_zip": {
"url": "https://djcm:Pib1StheK1N5#artifactory.dowjones.io/artifactory/djcm-zip-local/djcm/chef/paypal/7z1514-x64.msi"
},
"task_sched": {
"credentials": "kX?rLQ4XN$q"
},
"windows": {
"password": "Pib1StheK1N5"
},
"paypal": {
"artifact": "%5BINTEGRATION%5D"
},
"modmon": {
"env": "dev"
}
}
EDIT 1 :
I also need to parse out the run_list key-value pair from a-roles.json and ignore all the other info, to get something like:
{
"default": {
"env": "developmen-jq"
},
"7-zip": {
"home": "%SYSTEMDRIVE%\\7-zip"
},
"run_list": [
"recipe[djcm_paypal_win::sslVerify]"
],
"seven_zip": {
"url": "https://djcm:Pib1StheK1N5#artifactory.dowjones.io/artifactory/djcm-zip-local/djcm/chef/paypal/7z1514-x64.msi"
},
"task_sched": {
"credentials": "kX?rLQ4XN$q"
},
"windows": {
"password": "Pib1StheK1N5"
},
"paypal": {
"artifact": "%5BINTEGRATION%5D"
},
"modmon": {
"env": "dev"
}
}
Is that possible with jq?
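It should be: `{run_list}` picks just that key from the first input, so the name and description never make it into the merge. A sketch against minimal stand-ins for the two files (untested against the real a-roles.json / a-env.json):

```shell
# Minimal stand-ins for the two files (same shape as in the question)
cat > a-roles.json <<'EOF'
{"name":"paypal_modmon","description":"Role For Paypal admin-service box",
 "run_list":["recipe[djcm_paypal_win::sslVerify]"]}
EOF
cat > a-env.json <<'EOF'
{"default_attributes":{"modmon":{"env":"dev"}},
 "override_attributes":{"default":{"env":"developmen"}}}
EOF

# {run_list} is shorthand for {run_list: .run_list} - it keeps only that key.
jq -s '(.[0] | {run_list}) + .[1].default_attributes + .[1].override_attributes' \
   a-roles.json a-env.json > manifest.json
```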

Mongodb - Finding GeoNear within nested JSON objects

I need to get the closest place near a certain point using this data structure:
[
{
"data_id": "127",
"actual_data": [
{
"id": "220",
"value": "shaul"
},
{
"id": "221",
"value": "3234324"
},
{
"id": "222",
"value": {
"lngalt": [
13.7572225,
-124.0429047
],
"street_number": null,
"political": null,
"country": null,
}
},
{
"id": "223",
"value": "dqqqf1222fs3d7ddd77#Dcc11cS2112s.com"
},
{
"id": "224",
"value": "123123"
},
{
"id": "225",
"value": "lala1"
},
....
},
{
"data_id": "133",
"actual_data": [
{
"id": "260",
"value": {
"lngalt": [
1.7572225,
32.0429047
],
"street_number": null,
"political": null,
"country": null,
}
},
{
"id": "261",
"value": -122.25
}
]
}
]
I used the following query in order to get what I need:
{
"actual_data": {
"$elemMatch": {
"id": "260",
"value.lngalt": {
"$near": {
"$geometry": {
"type": "Point",
"coordinates": [
-73.9667,
40.78
]
},
"$minDistance": 1000,
"$maxDistance": 5000
}
}
}
}
}
But after querying it, I get "Can't canonicalize query: BadValue geoNear must be top-level expr" (error code: 17287). It's strange, since I do get the right results when I query without the $near but with $elemMatch only, in order to get the exact object with a particular value.
Thanks.
SOLVED!
For future reference, I used $geoWithin instead of the $near function. So my query looks like:
{
"actual_data": {
"$elemMatch": {
"id": "260",
"value.lngalt": {
"$geoWithin": {
"$centerSphere": [
[
-71.07651880000003,
42.353068
],
0.001
]
}
}
}
}
}
PEACE!