Check on a list of JSON objects in Ansible - json

I have a list of JSON objects in stdout:
[
    {
        "key1": "value1",
        "status": "connected",
        "key2": "value2"
    },
    {
        "key1": "value1",
        "status": "disconnected",
        "key2": "value2"
    },
    ...
]
I would like to loop over this list and check the value of the key "status".
If it is "connected" for all objects, set a variable to true. If there is at least one object with "status" set to "disconnected", set the variable to false.

Use selectattr to check whether the value is present:
- name: load json
  shell: cat file.json
  register: result

- name: save the Json data to a Variable as a Fact
  set_fact:
    jsondata: "{{ result.stdout | from_json }}"

- name: trap final result
  set_fact:
    status: "{{ value | int == 0 }}"
  vars:
    value: "{{ jsondata | selectattr('status', 'match', 'disconnected') | length }}"

- debug:
    var: status
jsondata | selectattr('status', 'match', 'disconnected') | length gives the length of the list of objects whose status is disconnected, so if the result is 0, all values are connected.
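If you prefer, the same check can be collapsed into a single set_fact without the intermediate counter. A minimal sketch, assuming jsondata has already been set from the file as above:
- name: set status in a single expression
  set_fact:
    status: "{{ jsondata | selectattr('status', 'equalto', 'disconnected') | list | length == 0 }}"

- debug:
    var: status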

Related

How to skip null values in json query?

I'm trying to pull values from a JSON file, but I do not want values that are null. How can I easily skip null values?
I am just trying to find the easiest and simplest way to parse JSON into YAML without formatting errors in the YAML. Maybe I am overcomplicating it.
JSON:
{
    "configuration": {
        "protocols": {
            "bgp": {
                "group": [
                    {
                        "name": "IBGP",
                        "neighbor": [
                            {
                                "name": "192.168.1.2",
                                "description": "router-2"
                            }
                        ]
                    },
                    {
                        "name": "EBGP",
                        "neighbor": [
                            {
                                "name": "192.168.2.2",
                                "description": "router-3",
                                "local-address": "192.168.2.1",
                                "peer-as": "65555"
                            }
                        ]
                    }
                ]
            }
        }
    }
}
Play:
- name: parse bgp
  set_fact:
    bgp: "{{ read | to_json | from_json | json_query(bgp_var) }}"
  vars:
    bgp_var: 'configuration.protocols.bgp.group[].{group_name:name,neighbors:[{peer_ip:neighbor[].name | [0], peer_description:neighbor[].description | [0], peer_as:neighbor[]."peer-as" | [0] }]}'

- copy:
    content: >-
      routing_protocols:
        bgp:
      {{ bgp | to_nice_yaml(indent=2, width=1337, sort_keys=False) | indent(4) }}
    dest: "{{ inventory_hostname }}_routing_protocols.yaml"
Returns:
routing_protocols:
  bgp:
    - group_name: IBGP
      neighbors:
      - peer_as: null # < Is there a way to skip this anytime a value is null?
        peer_description: router-2
        peer_ip: 192.168.1.2
    - group_name: EBGP
      neighbors:
      - peer_as: '65555'
        peer_description: router-3
        peer_ip: 192.168.2.2
You could use this playbook with the parameter omit:
tasks:
  - name: set var
    set_fact:
      bgp: >-
        {{ bgp | d([]) +
           [ {'group_name': _gpname,
              'neighbors': [{'peer_description': _desc, 'peer_ip': _peerip, 'peer_as': _peeras}]
             }
           ] }}
    loop: "{{ configuration.protocols.bgp.group }}"
    vars:
      _gpname: "{{ item.name }}"
      _desc: "{{ item.neighbor.0.description }}"
      _peeras: "{{ item.neighbor[0]['peer-as'] | d(omit) }}"
      _peerip: "{{ item.neighbor.0.name }}"

  - name: displ
    debug:
      msg: "{{ bgp }}"
result:
ok: [localhost] => {
    "msg": [
        {
            "group_name": "IBGP",
            "neighbors": [
                {
                    "peer_description": "router-2",
                    "peer_ip": "192.168.1.2"
                }
            ]
        },
        {
            "group_name": "EBGP",
            "neighbors": [
                {
                    "peer_as": "65555",
                    "peer_description": "router-3",
                    "peer_ip": "192.168.2.2"
                }
            ]
        }
    ]
}
You can do it in one go in JMESPath, but that requires a double query on the neighbor property, since JMESPath cannot omit a property the way Ansible can (for that, see Frenchy's answer).
So you have to build a list of lists, in which the first sub-list contains the neighbors that do have the key peer-as and the second sub-list the ones that do not. You then flatten this list of lists.
So, here is the copy task:
- copy:
    content: >-
      {{
        read
        | to_json
        | from_json
        | json_query(query)
        | to_nice_yaml(indent=2)
      }}
    dest: "{{ inventory_hostname }}_routing_protocols.yaml"
  vars:
    query: >-
      {
        routing_protocols: {
          bgp: configuration.protocols.bgp.group[].{
            group_name: name,
            neighbors: [
              neighbor[?"peer-as"].{
                peer_description: description,
                peer_ip: name,
                peer_as: "peer-as"
              },
              neighbor[?"peer-as" == null].{
                peer_description: description,
                peer_ip: name
              }
            ]|[]
          }
        }
      }
After this task, you end up with a YAML file containing your expected output:
routing_protocols:
  bgp:
  - group_name: IBGP
    neighbors:
    - peer_description: router-2
      peer_ip: 192.168.1.2
  - group_name: EBGP
    neighbors:
    - peer_as: '65555'
      peer_description: router-3
      peer_ip: 192.168.2.2
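For reference, the trailing |[] in the query above is JMESPath's flatten operator; it is what merges the two per-group sub-lists into a single neighbors list. A tiny standalone illustration, using made-up data:
- debug:
    msg: "{{ [[1, 2], [3, 4]] | json_query('[]') }}"
which prints [1, 2, 3, 4].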

Need to replace value in dictionary in destination file with the value in the dictionary in source file

I have been trying to update the values in a dictionary in the destination JSON with the values from the dictionary in the source JSON file. Below is an example of the source and destination JSON files:
Source file:
[
    {
        "key": "MYSQL",
        "value": "456"
    },
    {
        "key": "RDS",
        "value": "123"
    }
]
Destination File:
[
    {
        "key": "MYSQL",
        "value": "100"
    },
    {
        "key": "RDS",
        "value": "111"
    },
    {
        "key": "DB1",
        "value": "TestDB"
    },
    {
        "key": "OS",
        "value": "EX1"
    }
]
Expectation in destination file after running Ansible playbook:
[
    {
        "key": "MYSQL",
        "value": "**456**"
    },
    {
        "key": "RDS",
        "value": "**123**"
    },
    {
        "key": "DB1",
        "value": "TestDB"
    },
    {
        "key": "OS",
        "value": "EX1"
    }
]
Below is the playbook I have tried so far, but this only updates the value if it is hard coded:
- hosts: localhost
  tasks:
    - name: Parse JSON
      shell: cat Source.json
      register: result

    - name: Save json data to a variable
      set_fact:
        jsondata: "{{ result.stdout | from_json }}"

    - name: Get key names
      set_fact:
        json_key: "{{ jsondata | map(attribute='key') | flatten }}"

    - name: Get Values names
      set_fact:
        json_value: "{{ jsondata | map(attribute='value') | flatten }}"

    # Trying to update the destination file with only the values provided in source.json
    - name: Replace values in json
      replace:
        path: Destination.json
        regexp: '"{{ item }}": "100"'
        replace: '"{{ item }}": "456"'
      loop:
        - value
The main goal is to update the value in destination.json with the value provided in source.json.
In Ansible, key/value pairs like these tend to be handled with the filters dict2items and items2dict, and your use case can be solved with them.
Here is the logic:
1. Read both files
2. Convert both lists into dictionaries, with items2dict
3. Combine the two dictionaries, with the combine filter
4. Convert the combined dictionary back into a list, with dict2items
5. Dump the result as JSON back into the file
Given the playbook:
- hosts: localhost
  gather_facts: no

  tasks:
    - shell: cat Source.json
      register: source

    - shell: cat Destination.json
      register: destination

    - copy:
        content: "{{
            destination.stdout | from_json | items2dict |
            combine(
              source.stdout | from_json | items2dict
            ) | dict2items | to_nice_json
          }}"
        dest: Destination.json
We end up with Destination.json containing:
[
    {
        "key": "MYSQL",
        "value": "456"
    },
    {
        "key": "RDS",
        "value": "123"
    },
    {
        "key": "DB1",
        "value": "TestDB"
    },
    {
        "key": "OS",
        "value": "EX1"
    }
]
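If you want to see the intermediate shape that makes this work, a debug task like the one below (a sketch, dropped in before the copy task) shows the combined mapping: items2dict turns each list of {key, value} entries into a plain mapping, and combine lets the source values win on the common keys.
- debug:
    msg: "{{ destination.stdout | from_json | items2dict |
             combine(source.stdout | from_json | items2dict) }}"
With the sample files above this prints {"MYSQL": "456", "RDS": "123", "DB1": "TestDB", "OS": "EX1"}.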
Without knowing the structure of your destination file, it's difficult to use a regex.
I suggest you load your destination file into a variable, make the changes, and save the content of the variable back to a file.
This solution does the job:
- hosts: localhost
  tasks:
    - name: Parse JSON
      set_fact:
        source: "{{ lookup('file', 'source.json') | from_json }}"
        destination: "{{ lookup('file', 'destination.json') | from_json }}"

    - name: create new json
      set_fact:
        json_new: "{{ json_new | d([]) + ([item] if _rec == [] else [_rec]) | flatten }}"
      loop: "{{ destination }}"
      vars:
        _rec: "{{ source | selectattr('key', 'equalto', item.key) }}"

    - name: save new json
      copy:
        content: "{{ json_new | to_nice_json }}"
        dest: dest_new.json
Result -> dest_new.json:
ok: [localhost] => {
    "msg": [
        {
            "key": "MYSQL",
            "value": "456"
        },
        {
            "key": "RDS",
            "value": "123"
        },
        {
            "key": "DB1",
            "value": "TestDB"
        },
        {
            "key": "OS",
            "value": "EX1"
        }
    ]
}

Ansible Dictionary value not set

I'm trying to fill a value in a dictionary with Ansible, but apparently it is not working.
I do it like this:
- name: Fill with zeros
  set_fact:
    item: "{{ item | combine(zero_fill, recursive=true) }}"
  vars:
    zero_fill: { 'json': { 'data': { 'result': [{ 'value': ["0.0", "0.0"] }] } } }
  when: item.json.data.result == []
  with_items:
    - "{{ requests.results }}"
One item from this variable is like this:
{
    ...
    "json": {
        "data": {
            "result": [],
            "resultType": "vector"
        }
    }
    ...
}
The point is that in the output of this task I do see the value added, but when I print it just right after the task, the value is not there.
The loop variable item is re-created on every iteration, and set_fact does not change the underlying requests.results data, which is why the change does not survive the task. Instead, create a new list results_zf and update the dictionary json.data with zero_fill when json.data.result is an empty list:
- set_fact:
    results_zf: "{{ results_zf | default([]) + [_item] }}"
  loop: "{{ requests.results }}"
  vars:
    _item: "{{ (item.json.data.result|length > 0)|
               ternary(item,
                       {'json': {'data': item.json.data|combine(zero_fill)}})
            }}"
Limit the dictionary zero_fill to the attribute result:
zero_fill:
  'result': [{'value': ["0.0","0.0"]}]
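Note that the ternary above rebuilds the item as {'json': {'data': ...}} only, so any other top-level keys of the original item are dropped. If you need to keep them, a variation (a sketch, not tested against your full data, and assuming the reduced zero_fill shown above) combines into the original item instead:
- set_fact:
    results_zf: "{{ results_zf | default([]) + [_item] }}"
  loop: "{{ requests.results }}"
  vars:
    _item: "{{ (item.json.data.result | length > 0) |
               ternary(item,
                       item | combine({'json': {'data': zero_fill}}, recursive=True)) }}"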

Using item in Ansible json_query

I'm trying to loop through a list of keys to grab associated names from some json:
- name: show names
  debug:
    msg: "{{ data.json | json_query(query) }}"
  vars:
    query: "[? key==item].name"
  with_items: "{{ keys.split() }}"
But it never displays properly when I try to run it. The keys are correct, but no data is returned:
TASK [get_help_on_SO : show]
ok: [localhost] => (item=Key1) => {
    "msg": []
}
ok: [localhost] => (item=Key2) => {
    "msg": []
}
Manually putting the key into the query works just fine, so my query syntax seems to be right:
query: "[? key==`Key1`].name"
TASK [get_help_on_SO : show]
ok: [localhost] => (item=Key1) => {
    "msg": [
        "FooBar 1"
    ]
}
ok: [localhost] => (item=Key2) => {
    "msg": [
        "FooBar 1"
    ]
}
How can I properly pass the item into the json_query?
You didn't surround the item variable with any Jinja delimiters, so it is not interpolated.
You end up testing whether the key is equal to the literal string 'item' rather than the string stored in the variable item.
- name: show names
  debug:
    msg: "{{ data.json | json_query(query) }}"
  vars:
    query: "[?key==`{{ item }}`].name"
  with_items: "{{ keys.split() }}"
Given the data
keys: 'key1 key3'
data:
  json: [{
      "key": "key1",
      "name": "name1"
    },
    {
      "key": "key2",
      "name": "name2"
    },
    {
      "key": "key3",
      "name": "name3"
    }
  ]
the expected result is
- name1
- name3
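As a side note, json_query relies on the jmespath Python library being installed on the control node; if it is missing, the task fails with an error asking you to install it:
shell> pip install jmespath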
It's possible to avoid both the loop and json_query and simplify the solution. The task below
- name: show names
  debug:
    msg: "{{ data.json |
             selectattr('key', 'in', my_keys) |
             map(attribute='name') |
             list }}"
  vars:
    my_keys: "{{ keys.split() }}"
gives
msg:
- name1
- name3

Ansible-Merge two nested json files into single file using a key match

I have two JSON documents and want to append one to the other and save them in one file. I have used set_fact to read the values and put them into variables as follows:
- name: Set json combine to add new event
  set_fact:
    event_json_create: "{{ lookup('file', 'event_template.json') }}"

- name: Set json combine to get the existing list of events
  set_fact:
    event_json_existing: "{{ lookup('file', 'notification.json') }}"
Now I want to append the event_json_create to event_json_existing.
The event_json_create looks like this:
"event_json_create": {
"LambdaFunctionConfigurations": [{
"LambdaFunctionArn": "arn:aws:lambda:us-east-1:*******:function:xyz"
}]
}
The event_json_existing looks like this:
"event_json_existing": {
"LambdaFunctionConfigurations": [{
"LambdaFunctionArn": "arn:aws:lambda:us-east-1:******:function:abc"
}],
"TopicConfigurations": [{
"TopicArn": "arn:aws:sns:us-east-1:xxxxxx:crt"
}]
}
How can I append the two JSON documents in Ansible so that both entries end up under the major group LambdaFunctionConfigurations, while retaining the remaining content of TopicConfigurations, so that I can then write the result into a JSON file? The output I expect:
{
    "LambdaFunctionConfigurations": [
        {
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:*******:function:xyz"
        },
        {
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:*******:function:abc"
        }
    ],
    "TopicConfigurations": [
        {
            "TopicArn": "arn:aws:sns:us-east-1:xxxxxx:crt"
        }
    ]
}
Please help!
(As an example of how to create a Minimal, Reproducible Example, let's transform the question.)
Q: Given the dictionaries below
create:
  dict1:
    - key1: value1
existing:
  dict1:
    - key1: value2
  dict2:
    - key2: value3
get output
expected:
  dict1:
    - key1: value1
    - key1: value2
  dict2:
    - key2: value3
Write output to file in JSON
A: The tasks below
- set_fact:
    events: "{{ events|default({})|
                combine({item: existing[item]|default([]) +
                               create[item]|default([])}) }}"
  loop: "{{ (create.keys()|list + existing.keys()|list)|unique }}"

- template:
    src: events.json.j2
    dest: events.json
with the template
shell> cat events.json.j2
{{ events|to_nice_json }}
give
shell> cat events.json
{
    "dict1": [
        {
            "key1": "value2"
        },
        {
            "key1": "value1"
        }
    ],
    "dict2": [
        {
            "key2": "value3"
        }
    ]
}
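Mapped back to the original question, the same loop works with create replaced by event_json_create and existing by event_json_existing (assuming the two lookups above actually yield dictionaries; add | from_json if they come back as strings). A sketch, with a hypothetical output file name:
- set_fact:
    events: "{{ events | default({}) |
                combine({item: event_json_existing[item] | default([]) +
                               event_json_create[item] | default([])}) }}"
  loop: "{{ (event_json_create.keys() | list + event_json_existing.keys() | list) | unique }}"

- copy:
    content: "{{ events | to_nice_json }}"
    dest: notification_merged.json
This should give LambdaFunctionConfigurations containing both Lambda ARNs and leave TopicConfigurations untouched, matching the expected output above.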