Ansible: get specific attribute value from JSON output

I have the following Ansible task:
tasks:
  - name: ensure instances are running
    ec2:
      aws_access_key: "{{aws_access_key}}"
      aws_secret_key: "{{aws_secret_key}}"
      ...
      user_data: "{{ lookup('template', 'userdata.txt.j2') }}"
    register: ec2_result

  - debug:
      msg: "{{ ec2_result }}"

  - set_fact:
      win_instance_id: "{{ ec2_result | json_query('tagged_instances[*].id') }}"
The output:
TASK [debug] ***************
ok: [localhost] => {
"msg": {
"changed": false,
"failed": false,
"instance_ids": null,
"instances": [],
"tagged_instances": [
{
"ami_launch_index": "0",
"architecture": "x86_64",
"block_device_mapping": {
"/dev/sda1": {
"delete_on_termination": true,
"status": "attached",
"volume_id": "vol-01f217e489c681211"
}
},
"dns_name": "",
"ebs_optimized": false,
"groups": {
"sg-c63822ac": "WinRM RDP"
},
"hypervisor": "xen",
"id": "i-019c03c3e3929f76e",
"image_id": "ami-3204995d",
...
"tags": {
"Name": "Student01 _ Jumphost"
},
"tenancy": "default",
"virtualization_type": "hvm"
}
]
}
}
TASK [set_fact] ****************
ok: [localhost]
TASK [debug] ******************
ok: [localhost] => {
"msg": "The Windows Instance ID is: [u'i-019c03c3e3929f76e']"
}
As you can see, the instance ID is correct, but not well formatted. Is there a way to convert this output into a "human readable" format? Or is there a better way to parse the instance ID from the ec2 task output?
Thanks!

It's not an unreadable format; it's a list rendered in Python notation, because your query returns a list.
If you want a string, pass the result through the first filter:
win_instance_id: "{{ ec2_result | json_query('tagged_instances[*].id') | first }}"
You can also access the value directly without json_query ([0] refers to the first element of a list):
win_instance_id: "{{ ec2_result.tagged_instances[0].id }}"

Related

how to deal with missing keys in JSON array when reading in Ansible

Below is my JSON file:
[
{
"?xml": {
"attributes": {
"encoding": "UTF-8",
"version": "1.0"
}
}
},
{
"domain": [
{
"name": "mydom"
},
{
"domain-version": "12.2.1.3.0"
},
{
"server": [
{
"name": "AdminServer"
},
{
"ssl": {
"name": "AdminServer"
}
},
{
"listen-port": "12400"
},
{
"listen-address": "mydom.host1.bank.com"
}
]
},
{
"server": [
{
"name": "myserv1"
},
{
"ssl": [
{
"name": "myserv1"
},
{
"login-timeout-millis": "25000"
}
]
},
{
"log": [
{
"name": "myserv1"
},
{
"file-name": "/web/bea_logs/domains/mydom/myserv1/myserv1.log"
}
]
}
]
},
{
"server": [
{
"name": "myserv2"
},
{
"ssl": {
"name": "myserv2"
}
},
{
"reverse-dns-allowed": "false"
},
{
"log": [
{
"name": "myserv2"
},
{
"file-name": "/web/bea_logs/domains/mydom/myserv2/myserv2.log"
}
]
}
]
}
]
}
]
I need to get the log list's name and file-name, as below, using Ansible:
myserv1_log: "/web/bea_logs/domains/mydom/myserv1/myserv1.log"
myserv2_log: "/web/bea_logs/domains/mydom/myserv2/myserv2.log"
There are two challenges that I'm facing:
server may not always be the 3rd key of the domain array.
The log array may not always be a key of every server array, and in that case nothing should be printed. For example, the server named AdminServer does not have any log list, while myserv1 and myserv2 do.
I need Ansible code to print the desired output for this dynamically changing JSON.
Note: server will always be a key in the domain array
I'm posting with reference to my similar query here: unable to ideally parse a json file in ansible
Kindly suggest.
You just test whether both keys exist:
- hosts: localhost
  gather_facts: no
  vars:
    json: "{{ lookup('file', './file.json') | from_json }}"
  tasks:
    - name: display
      debug:
        msg: "name: {{ servername }} --> filename: {{ filename }}"
      loop: "{{ json[1].domain }}"
      vars:
        servername: "{{ item.server.0.name }}_log"
        filename: "{{ item['server'][2]['log'][1]['file-name'] }}"
      when: item.server is defined and item.server.2.log is defined
result:
TASK [display]
skipping: [localhost] => (item={'name': 'USWL1212MRSHM01'})
skipping: [localhost] => (item={'domain-version': '12.2.1.3.0'})
skipping: [localhost] => (item={'server': [{'name': 'AdminServer'}, {'ssl': {'name': 'AdminServer'}}, {'listen-port': '12400'}, {'listen-address': 'myhost1'}]})
ok: [localhost] => (item={'server': [{'name': 'myserv1'}, {'ssl': {'name': 'myserv1'}}, {'log': [{'name': 'myserv1'}, {'file-name': '/web/bea_logs/domains/mydom/myserv1/myserv1.log'}]}]}) => {
"msg": "name: myserv1_log --> filename: /web/bea_logs/domains/mydom/myserv1/myserv1.log"
}
ok: [localhost] => (item={'server': [{'name': 'myserv2'}, {'ssl': {'name': 'myserv2'}}, {'log': [{'name': 'myserv2'}, {'file-name': '/web/bea_logs/domains/mydom/myserv2/myserv2.log'}]}]}) => {
"msg": "name: myserv2_log --> filename: /web/bea_logs/domains/mydom/myserv2/myserv2.log"
}
As you can see, when the condition is not true, the task is skipped.
You could simplify by testing only the log key, because in your case the log key only ever appears inside a server entry:
when: item.server.2.log is defined
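To also cover the first challenge (log not necessarily being the 3rd element of the server list), a position-independent variant is possible with selectattr and flatten. This is only a sketch built on the same file.json, and it still assumes the server name is the first element of each server list:
- name: display (position-independent sketch)
  debug:
    msg: "name: {{ item.server[0].name }}_log --> filename: {{ (log_entries | first)['file-name'] }}"
  loop: "{{ json[1].domain }}"
  vars:
    # gather every dict carrying a 'file-name' under any 'log' entry of this server, wherever it sits
    log_entries: "{{ item.server | default([])
                     | selectattr('log', 'defined')
                     | map(attribute='log') | flatten
                     | selectattr('file-name', 'defined') | list }}"
  when:
    - item.server is defined
    - log_entries | length > 0
Entries without a server key, or whose server has no log, are skipped exactly as before.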

Unable to read json data in ansible play

I have the below json data file:
[
{
"?xml": {
"attributes": {
"encoding": "UTF-8",
"version": "1.0"
}
}
},
{
"domain": [
{
"name": "mydom"
},
{
"domain-version": "12.2.1.3.0"
},
{
"server": [
{
"name": "AdminServer"
},
{
"ssl": {
"name": "AdminServer"
}
},
{
"listen-port": "12400"
},
{
"listen-address": "mydom.myserver1.mybank.com"
}
]
},
{
"server": [
{
"name": "SERV01"
},
{
"log": [
{
"name": "SERV01"
},
{
"file-name": "/web/bea_logs/domains/mydom/SERV01/SERV01.log"
}
]
},
{
"listen-port": "12401"
},
{
"listen-address": "mydom.myserver1.mybank.com"
},
{
"server-start": [
{
"java-vendor": "Sun"
},
{
"java-home": "/web/bea/platform1221/jdk"
}
]
}
]
},
{
"server": [
{
"name": "SERV02"
},
{
"log": [
{
"name": "SERV02"
},
{
"file-name": "/web/bea_logs/domains/mydom/SERV02/SERV02.log"
}
]
},
{
"listen-port": "12401"
},
{
"listen-address": "mydom.myhost2.mybank.com"
},
{
"server-start": [
{
"java-home": "/web/bea/platform1221/jdk"
} ]
}
]
}
]
}
]
I wish to display all the server names and their respective port numbers.
Below is my failed attempt to display all the server names, viz.:
AdminServer
SERV01
SERV02
My playbook:
tasks:
  - name: Read the JSON file content in a variable
    shell: "cat {{ playbook_dir }}/tmpfiles/{{ Latest_Build_Number }}/testme.json"
    register: result

  - name: Server Names
    set_fact:
      servernames: "{{ jsondata | json_query(jmesquery) }}"
    vars:
      jmesquery: '*.domain.server[*].name'

  - name: Server Names and Ports
    set_fact:
      serverinfo: "{{ jsondata | json_query(jmesquery) }}"
    vars:
      jmesquery: '*.server[*].[name, port]'

  - name: Print all server names
    debug:
      msg: "{{ item }}"
    with_items:
      - "{{ servernames }}"
I also tried the below:
jmesquery: 'domain.server[*].name'
There is no error, but there is no data in the output either. Output below:
TASK [Print all server names] *********************************************************************************
Monday 21 February 2022 03:07:47 -0600 (0:00:00.129) 0:00:03.590 *******
ok: [localhost] => (item=) => {
"msg": ""
}
Can you please suggest how I can get the desired data?
There are lots of possible solutions; here is one.
With JMESPath, you could try this:
tasks:
  - name: Read the JSON file content in a variable
    shell: "cat testme.json"
    register: result

  - name: jsondata
    set_fact:
      jsondata: "{{ result.stdout | from_json }}"

  - name: Server Names
    set_fact:
      servernames: "{{ servernames | default([]) + [dict(name=item[0], port=item[1])] }}"
    loop: "{{ jsondata | json_query(jmesquery0) | zip(jsondata | json_query(jmesquery1)) | list }}"
    vars:
      jmesquery0: '[].domain[].server[].name'
      jmesquery1: '[].domain[].server[]."listen-port"'

  - name: debug result
    debug:
      msg: "{{ servernames }}"
result:
ok: [localhost] => {
"msg": [
{
"name": "AdminServer",
"port": "12400"
},
{
"name": "SERV01",
"port": "12401"
},
{
"name": "SERV02",
"port": "12401"
}
]
}
Due to the nature of your data being in lists, you'll have to resort to conditionals in order to get rid of the empty objects and single item lists that would otherwise pollute your data:
[].domain[?server].server, to get the objects having a property server
[?name].name | [0] to get the name
[?"listen-port"]."listen-port" | [0] to get the port
So, a valid JMESPath query on your data would be
[].domain[?server]
  .server[]
  .{
    name: [?name].name | [0],
    port: [?"listen-port"]."listen-port" | [0]
  }
And in Ansible, with that single JMESPath query, given that the file is on the controller:
- debug:
    var: >-
      lookup(
        'file',
        playbook_dir ~ '/tmpfiles/' ~ Latest_Build_Number ~ '/testme.json'
      )
      | from_json
      | json_query('
        [].domain[?server]
        .server[]
        .{
          name: [?name].name | [0],
          port: [?"listen-port"]."listen-port" | [0]
        }
      ')
  vars:
    Latest_Build_Number: 1
This would yield
TASK [debug] *************************************************************************
ok: [localhost] =>
? |-
lookup(
'file',
playbook_dir ~ '/tmpfiles/' ~ Latest_Build_Number ~ '/testme.json'
) | from_json | json_query('
[].domain[?server]
.server[]
.{
name: [?name].name | [0],
port: [?"listen-port"]."listen-port" | [0]
}
')
: - name: AdminServer
port: '12400'
- name: SERV01
port: '12401'
- name: SERV02
port: '12401'
If the file is on the nodes and not on the controller, you can either slurp the files first or resort to cat, as you did, before applying the same JMESPath query.
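For the slurp route, a minimal sketch (the remote path is just an example, adjust it to where the file lives on the node):
- name: Read the JSON file from the managed node
  slurp:
    src: /tmp/testme.json    # example path, not taken from the question
  register: slurped

- name: Server names and ports
  debug:
    msg: >-
      {{ slurped.content | b64decode | from_json
      | json_query('[].domain[?server].server[]
      .{name: [?name].name | [0], port: [?"listen-port"]."listen-port" | [0]}') }}
slurp returns the file content base64-encoded, so it is decoded with b64decode before the same query is applied.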

ansible print key if inside value is defined

Can somebody please help me with parsing this JSON?
I have this JSON:
{
"declaration": {
"ACS-AS3": {
"ACS": {
"class": "Application",
"vs_ubuntu_22": {
"virtualAddresses": ["10.11.205.167"]
},
"pool_ubuntu_22": {
"members": {
"addressDiscovery": "static",
"servicePort": 22
}
},
"vs_ubuntu_443": {
"virtualAddresses": ["10.11.205.167"],
"virtualPort": 443
},
"pool_ubuntu01_443": {
"members": [{
"addressDiscovery": "static",
"servicePort": 443,
"serverAddresses": [
"10.11.205.133",
"10.11.205.165"
]
}]
},
"vs_ubuntu_80": {
"virtualAddresses": [
"10.11.205.167"
],
"virtualPort": 80
},
"pool_ubuntu01_80": {
"members": [{
"addressDiscovery": "static",
"servicePort": 80,
"serverAddresses": [
"10.11.205.133",
"10.11.205.165"
],
"shareNodes": true
}],
"monitors": [{
"bigip": "/Common/tcp"
}]
}
}
}
}
}
and I am trying this playbook
tasks:
  - name: deploy json file AS3 to F5
    debug:
      msg: "{{ lookup('file', 'parse2.json') }}"
    register: atc_AS3_status
    no_log: true

  - name: Parse json 1
    debug:
      var: atc_AS3_status.msg.declaration | json_query(query_result) | list
    vars:
      query_result: "\"ACS-AS3\".ACS"
      #query_result1: "\"ACS-AS3\".ACS.*.virtualAddresses"
    register: atc_AS3_status1
I got this response
TASK [Parse json 1] ******************************************************************************************************************************************************************************************
ok: [avx-bigip01.dhl.com] => {
"atc_AS3_status1": {
"atc_AS3_status.msg.declaration | json_query(query_result) | list": [
"class",
"vs_ubuntu_22",
"pool_ubuntu_22",
"vs_ubuntu_443",
"pool_ubuntu01_443",
"vs_ubuntu_80",
"pool_ubuntu01_80"
],
"changed": false,
"failed": false
}
}
but I would like to print only the keys that have the key virtualAddresses inside them:
if "ACS-AS3".ACS.*.virtualAddresses is defined, then print the key.
The result should be:
vs_ubuntu_22
vs_ubuntu_443
vs_ubuntu_80
One way to get the keys of a dict, is to use the dict2items filter. This will give vs_ubuntu_22 etc. as "key" and their sub-dicts as "value". Using this we can conditionally check if virtualAddresses is defined in values.
Also, parse2.json can be included via vars_files or include_vars rather than having a task to debug it and register the result.
The task below, using vars_files in the playbook, should get you the intended keys from the JSON:
vars_files:
  - parse2.json

tasks:
  - name: show atc_status
    debug:
      var: item.key
    loop: "{{ declaration['ACS-AS3']['ACS'] | dict2items }}"
    when: item['value']['virtualAddresses'] is defined
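If you would rather collect the matching keys into a single variable instead of printing them one by one, something like the following should work (vs_keys is just an example fact name, not from the question):
- name: collect keys that define virtualAddresses
  set_fact:
    vs_keys: "{{ declaration['ACS-AS3']['ACS'] | dict2items
                 | selectattr('value.virtualAddresses', 'defined')
                 | map(attribute='key') | list }}"

- debug:
    var: vs_keys    # expected: ['vs_ubuntu_22', 'vs_ubuntu_443', 'vs_ubuntu_80']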

Extract value from Ansible task output and create a variable from it

I use elb_application_lb_info module to get info about my application load balancer. Here is the code I am using for it:
- name: Test playbook
  hosts: tag_elastic_role_logstash
  vars:
    aws_access_key: AKIXXXXXXXXXXXXXXXXXXX
    aws_secret_key: gG6a586XXXXXXXXXXXXXXXXXX
  tasks:
    - name: Gather information about all ELBs
      elb_application_lb_info:
        aws_access_key: AKIXXXXXXXXXXXXXXXXXXX
        aws_secret_key: gG6a586XXXXXXXXXXXXXXXXXX
        region: ap-southeast-2
        names:
          - LoadBalancer
      register: albinfo

    - debug:
        msg: "{{ albinfo }}"
This is working fine and I got the following output:
"load_balancers": [
{
"idle_timeout_timeout_seconds": "60",
"routing_http2_enabled": "true",
"created_time": "2021-01-26T23:58:27.890000+00:00",
"access_logs_s3_prefix": "",
"security_groups": [
"sg-094c894246db1bd92"
],
"waf_fail_open_enabled": "false",
"availability_zones": [
{
"subnet_id": "subnet-0195c9c0df024d221",
"zone_name": "ap-southeast-2b",
"load_balancer_addresses": []
},
{
"subnet_id": "subnet-071060fde585476e0",
"zone_name": "ap-southeast-2c",
"load_balancer_addresses": []
},
{
"subnet_id": "subnet-0d5f856afab8f0eec",
"zone_name": "ap-southeast-2a",
"load_balancer_addresses": []
}
],
"access_logs_s3_bucket": "",
"deletion_protection_enabled": "false",
"load_balancer_name": "LoadBalancer",
"state": {
"code": "active"
},
"scheme": "internet-facing",
"type": "application",
"load_balancer_arn": "arn:aws:elasticloadbalancing:ap-southeast-2:117557247443:loadbalancer/app/LoadBalancer/27cfc970d48501fd",
"access_logs_s3_enabled": "false",
"tags": {
"Name": "loadbalancer_test",
"srg:function": "Storage",
"srg:owner": "ISCloudPlatforms#superretailgroup.com",
"srg:cost-centre": "G110",
"srg:managed-by": "ISCloudPlatforms#superretailgroup.com",
"srg:environment": "TST"
},
"routing_http_desync_mitigation_mode": "defensive",
"canonical_hosted_zone_id": "Z1GM3OXH4ZPM65",
"dns_name": "LoadBalancer-203283612.ap-southeast-2.elb.amazonaws.com",
"ip_address_type": "ipv4",
"listeners": [
{
"default_actions": [
{
"target_group_arn": "arn:aws:elasticloadbalancing:ap-southeast-2:117557247443:targetgroup/test-ALBID-W04X8DBT450Q/c999ac1cda7b1d4a",
"type": "forward",
"forward_config": {
"target_group_stickiness_config": {
"enabled": false
},
"target_groups": [
{
"target_group_arn": "arn:aws:elasticloadbalancing:ap-southeast-2:117557247443:targetgroup/test-ALBID-W04X8DBT450Q/c999ac1cda7b1d4a",
"weight": 1
}
]
}
}
],
"protocol": "HTTP",
"rules": [
{
"priority": "default",
"is_default": true,
"rule_arn": "arn:aws:elasticloadbalancing:ap-southeast-2:117557247443:listener-rule/app/LoadBalancer/27cfc970d48501fd/671ad3428c35c834/5b5953a49a886c03",
"conditions": [],
"actions": [
{
"target_group_arn": "arn:aws:elasticloadbalancing:ap-southeast-2:117557247443:targetgroup/test-ALBID-W04X8DBT450Q/c999ac1cda7b1d4a",
"type": "forward",
"forward_config": {
"target_group_stickiness_config": {
"enabled": false
},
"target_groups": [
{
"target_group_arn": "arn:aws:elasticloadbalancing:ap-southeast-2:117557247443:targetgroup/test-ALBID-W04X8DBT450Q/c999ac1cda7b1d4a",
"weight": 1
}
]
}
}
]
}
],
"listener_arn": "arn:aws:elasticloadbalancing:ap-southeast-2:117557247443:listener/app/LoadBalancer/27cfc970d48501fd/671ad3428c35c834",
"load_balancer_arn": "arn:aws:elasticloadbalancing:ap-southeast-2:117557247443:loadbalancer/app/LoadBalancer/27cfc970d48501fd",
"port": 9200
}
],
"vpc_id": "vpc-0016dcdf5abe4fef0",
"routing_http_drop_invalid_header_fields_enabled": "false"
}
]
I need to fetch "dns_name", which is the DNS name of the load balancer, and pass it to another play as a variable.
I tried with json_query but got an error. Here is the code:
- name: save the Json data to a Variable as a Fact
  set_fact:
    jsondata: "{{ albinfo.stdout | from_json }}"

- name: Get ALB dns name
  set_fact:
    dns_name: "{{ jsondata | json_query(jmesquery) }}"
  vars:
    jmesquery: 'load_balancers.dns_name'

- debug:
    msg: "{{ dns_name }}"
And here is the error:
"msg": "The task includes an option with an undefined variable. The error was: Unable to look up a name or access an attribute in template string ({{ albinfo.stdout | from_json }}).\nMake sure your variable name does not contain invalid characters like '-': the JSON object must be str, bytes or bytearray
Any idea how to extract "dns_name" from the json above?
Here is a way to get the dns_name from the above JSON output:
- name: Get Application Load Balancer DNS Name
  set_fact:
    rezultat: "{{ albinfo | json_query('load_balancers[*].dns_name') }}"

- debug:
    msg: "{{ rezultat }}"

Using item in Ansible json_query

I'm trying to loop through a list of keys to grab associated names from some json:
- name: show names
  debug:
    msg: "{{ data.json | json_query(query) }}"
  vars:
    query: "[? key==item].name"
  with_items: "{{ keys.split() }}"
But it never displays properly when I try to run it. The keys are correct, but no data is returned:
TASK [get_help_on_SO: show]
ok: [localhost] => (item=Key1) => {
"msg": []
}
ok: [localhost] => (item=Key2) => {
"msg": []
}
Manually putting in the code works just fine so my query syntax seems to be right:
query: "[? key==`Key1`].name"
TASK [get_help_on_SO : show]
ok: [localhost] => (item=Key1) => {
"msg": [
"FooBar 1"
]
}
ok: [localhost] => (item=Key2) => {
"msg": [
"FooBar 1"
]
}
How can I properly pass the item into the json_query?
You didn't surround the item variable with any Jinja delimiters, so it is never interpreted by Jinja.
You end up comparing key against a bare item identifier inside the JMESPath expression, not against the string stored in the variable item, so nothing matches.
- name: show names
  debug:
    msg: "{{ data.json | json_query(query) }}"
  vars:
    query: "[?key==`{{ item }}`].name"
  with_items: "{{ keys.split() }}"
Given the data
keys: 'key1 key3'
data:
  json: [
    {"key": "key1", "name": "name1"},
    {"key": "key2", "name": "name2"},
    {"key": "key3", "name": "name3"}
  ]
the expected result is
- name1
- name3
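If you prefer a single json_query call without the loop, roughly the same result can be had with a JMESPath contains() expression. A sketch, using the same keys and data as above:
- name: show names (single-query sketch)
  debug:
    msg: "{{ data.json | json_query(query) }}"
  vars:
    query: "[?contains(`{{ keys.split() | to_json }}`, key)].name"
Here the split list is rendered into the query as a backtick JSON literal, so contains() checks each element's key against it and the query returns name1 and name3 in one go.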
It's possible to avoid both the loop and json_query and simplify the solution. The task below
- name: show names
  debug:
    msg: "{{ data.json |
             selectattr('key', 'in', my_keys) |
             map(attribute='name') |
             list }}"
  vars:
    my_keys: "{{ keys.split() }}"
gives
msg:
- name1
- name3