Environment:
Ansible: 2.10.12
Python: 3.6.8
Jinja2: 3.0.1
Given an API response, I'm trying to grab the address, but I'm getting "variable is not defined". Here's a playbook that has the API response as a var:
---
- name: test
gather_facts: no
hosts: localhost
vars:
pool_details: {
"allow": "",
"cache_control": "no-store, no-cache, must-revalidate",
"changed": false,
"connection": "close",
"content_length": "668",
"content_security_policy": "default-src 'self' 'unsafe-inline' 'unsafe-eval' data: blob:; img-src 'self' data: http://127.4.1.1 http://127.4.2.1",
"content_type": "application/json; charset=UTF-8",
"cookies": {},
"cookies_string": "",
"date": "Tue, 24 Aug 2021 16:34:47 GMT",
"elapsed": 0,
"expires": "-1",
"failed": false,
"json": {
"items": [
{
"address": "10.200.136.22",
"connectionLimit": 0,
"dynamicRatio": 1,
"ephemeral": "false",
"fqdn": {
"autopopulate": "disabled"
},
"fullPath": "/Common/sensor01:443",
"generation": 103,
"inheritProfile": "enabled",
"kind": "tm:ltm:pool:members:membersstate",
"logging": "disabled",
"monitor": "default",
"name": "sensor01:443",
"partition": "Common",
"priorityGroup": 0,
"rateLimit": "disabled",
"ratio": 1,
"selfLink": "https://localhost/mgmt/tm/ltm/pool/~Common~sensor01/members/~Common~sensor01:443?ver=16.1.0",
"session": "monitor-enabled",
"state": "up"
}
],
"kind": "tm:ltm:pool:members:memberscollectionstate",
"selfLink": "https://localhost/mgmt/tm/ltm/pool/~Common~sensor01/members?ver=16.1.0"
},
"msg": "OK (668 bytes)",
"pragma": "no-cache",
"redirected": false,
"server": "Jetty(9.2.22.v20170606)",
"status": 200,
"strict_transport_security": "max-age=16070400; includeSubDomains",
"x_content_type_options": "nosniff",
"x_frame_options": "SAMEORIGIN",
"x_xss_protection": "1; mode=block"
}
tasks:
- debug:
var: pool_details
- debug:
var: pool_details.json.items[0].address
The full error, with verbosity enabled, is:
pool_details.json.items[0].address: 'VARIABLE IS NOT DEFINED!: builtin_function_or_method object has no element 0'
My second debug to grab the address isn't working ("not defined"), and I can't figure out why. If I take the same JSON and pipe it to jq, it works just fine:
pbpaste | jq '.json.items[0].address'
"10.200.136.22"
items is a Python dictionary method, so the dot notation pool_details.json.items resolves to that method rather than to the key (hence "builtin_function_or_method" in the error).
To return the value of the key "items", use bracket notation:
- debug:
var: pool_details.json['items'][0].address
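For reference, with the variable above, that debug should print something along these lines (the exact formatting of the debug output may differ):

ok: [localhost] => {
    "pool_details.json['items'][0].address": "10.200.136.22"
}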
If there are more items in the list, you might want to use json_query, e.g.
- set_fact:
list_address: "{{ pool_details.json['items']|
json_query('[].address') }}"
gives
list_address:
- 10.200.136.22
If you don't want to, or can't, install JMESPath, the next option would be the map filter with an attribute argument; the code below gives the same result:
- set_fact:
list_address: "{{ pool_details.json['items']|
map(attribute='address')|
list }}"
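If you only need a single address rather than a list, one option is to append the first filter. A minimal sketch, assuming the single-member pool shown above:

- debug:
    # first picks the single address out of the mapped list
    msg: "{{ pool_details.json['items'] | map(attribute='address') | first }}"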
Related
I am trying to use the ansible.builtin.lookup plugin in order to read a JSON file from the local directory and then pass it as the payload to the ansible.builtin.uri module to send a POST message to a URI endpoint.
Following are the contents of my JSON file (config.json):
{
"Configuration": {
"Components": [
{
"Name": "A",
"Attributes": [
{
"Name": "A1",
"Value": "1",
"Set On Import": "True",
"Comment": "Read and Write"
},
{
"Name": "A2",
"Value": "2",
"Set On Import": "True",
"Comment": "Read and Write"
}
]
}
]
}
}
I need to send the above JSON content as the below string in the payload to the ansible.builtin.uri module:
"{\"Configuration\": {\"Components\": [{\"Name\": \"A\", \"Attributes\": [{\"Name\": \"A1\", \"Value\": \"1\", \"Set On Import\": \"True\", \"Comment\": \"Read and Write\"}, {\"Name\": \"A2\", \"Value\": \"2\", \"Set On Import\": \"True\", \"Comment\": \"Read and Write\"}]}]}}"
I am trying to use the lookup plugin with the to_json filter to read and format the JSON content. Following is my playbook:
- name: import scp
ansible.builtin.uri:
url: "https://{{ inventory_hostname }}/api/config/actions/import"
user: "{{ user }}"
password: "{{ password }}"
method: POST
headers:
Accept: "application/json"
Content-Type: "application/json"
body:
Parameters:
Type: "LOCAL_FILE"
Target: "ALL"
IgnoreCertificateWarning: "Enabled"
Buffer: "{{ lookup('file', 'config.json') | to_json }}"
body_format: json
status_code: 202
validate_certs: no
force_basic_auth: yes
However, the uri module double-escapes all the newline and tab characters. Following is how the payload is sent when I run the playbook:
"invocation": {
"module_args": {
"attributes": null,
"body": {
"Buffer": "\"{\\n\\t\\\"Configuration\\\": {\\n\\t\\t\\\"Components\\\": [\\n\\t\\t\\t{\\n\\t\\t\\t\\t\\\"Name\\\": \\\"A\\\",\\n\\t\\t\\t\\t\\\"Attributes\\\": [\\n\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\\"Name\\\": \\\"A1\\\",\\n\\t\\t\\t\\t\\t\\t\\\"Value\\\": \\\"1\\\",\\n\\t\\t\\t\\t\\t\\t\\\"Set On Import\\\": \\\"True\\\",\\n\\t\\t\\t\\t\\t\\t\\\"Comment\\\": \\\"Read and Write\\\"\\n\\t\\t\\t\\t\\t},\\n\\t\\t\\t\\t\\t{\\n\\t\\t\\t\\t\\t\\t\\\"Name\\\": \\\"A2\\\",\\n\\t\\t\\t\\t\\t\\t\\\"Value\\\": \\\"2\\\",\\n\\t\\t\\t\\t\\t\\t\\\"Set On Import\\\": \\\"True\\\",\\n\\t\\t\\t\\t\\t\\t\\\"Comment\\\": \\\"Read and Write\\\"\\n\\t\\t\\t\\t\\t}\\n\\t\\t\\t\\t]\\n\\t\\t\\t}\\n\\t\\t]\\n\\t}\\n}\"",
"Parameters": {
"IgnoreCertificateWarning": "Enabled",
"Type": "LOCAL_FILE",
"Target": "ALL"
},
},
"body_format": "json",
...
},
Could you please let me know how I can format the payload with the uri module? I appreciate any help.
Edited (5/11/2021):
I made the changes as suggested by @mdaniel in his response and used the string filter instead of to_json. With the suggested change, I can see the JSON being formatted properly into a string with newline ('\n') and tab ('\t') characters. I then tried to use the replace filter to remove the \n and \t characters. However, now the whole string is converted back into a JSON object.
Following is the playbook and the output when using the string filter alone:
...
body:
Parameters:
Type: "LOCAL_FILE"
Target: "ALL"
IgnoreCertificateWarning: "Enabled"
Buffer: "{{ lookup('file', 'config.json') | string }}"
$ ansible-playbook import_file.yml -i hosts --tags
...
"body": {
"HostPowerState": "On",
"Buffer": "{\n\t\"Configuration\": {\n\t\t\"Components\": [\n\t\t\t{\n\t\t\t\t\"Name\": \"A\",\n\t\t\t\t\"Attributes\": [\n\t\t\t\t\t{\n\t\t\t\t\t\t\"Name\": \"A1\",\n\t\t\t\t\t\t\"Value\": \"1\",\n\t\t\t\t\t\t\"Set On Import\": \"True\",\n\t\t\t\t\t\t\"Comment\": \"Read and Write\"\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t\"Name\": \"A2\",\n\t\t\t\t\t\t\"Value\": \"2\",\n\t\t\t\t\t\t\"Set On Import\": \"True\",\n\t\t\t\t\t\t\"Comment\": \"Read and Write\"\n\t\t\t\t\t}\n\t\t\t\t]\n\t\t\t}\n\t\t]\n\t}\n}",
"Parameters": {
"IgnoreCertificateWarning": "Enabled",
"Type": "LOCAL_FILE",
"Target": "ALL"
},
},
Following is the playbook and the output when using the replace filter in conjunction with the string filter:
...
body:
Parameters:
Type: "LOCAL_FILE"
Target: "ALL"
IgnoreCertificateWarning: "Enabled"
Buffer: "{{ lookup('file', 'config.json') | string | replace('\n', '') | replace('\t', '') }}"
...
$ ansible-playbook import_file.yml -i hosts --tags
...
"body": {
"Buffer": {
"Configuration": {
"Components": [
{
"Attributes": [
{
"Comment": "Read and Write",
"Name": "A1",
"Set On Import": "True",
"Value": "1"
},
{
"Comment": "Read and Write",
"Name": "A2",
"Set On Import": "True",
"Value": "2"
}
],
"Name": "A"
}
]
}
},
"Parameters": {
"IgnoreCertificateWarning": "Enabled",
"Type": "LOCAL_FILE",
"Target": "ALL"
},
},
...
Any pointers on how I can remove the \n and \t characters from the string?
You have used to_json on a dict value that is, itself, going to be to_json-ed; Ansible cannot transmit a Python dict over HTTP, so any YAML structure that is not already a string needs to be converted into one first.
What you'll want is just that lookup result (which returns a str, not a dict); Ansible will then apply to_json to the whole body: value for the aforementioned reason.
However, because Ansible is trying to be "helpful", it will auto-coerce a YAML value that it finds starting with { back into a dict -- that's why you need to send the result of lookup through the | string filter to reinforce to Ansible that yes, you really do want it to remain a str in that context:
...
body:
Parameters:
Type: "LOCAL_FILE"
Target: "ALL"
IgnoreCertificateWarning: "Enabled"
Buffer: "{{ lookup('file', 'config.json') | string }}"
Updated answer approach
In light of the comment discussion that the dict coercion continued to be a problem, and that the leading space concerned the OP, the alternative approach is to build up the actual payload structure completely and only "JSON-ify" it before transmission, to keep Ansible and Jinja on the same page about the data types:
- name: import scp
vars:
body_dict:
Parameters:
Type: "LOCAL_FILE"
Target: "ALL"
IgnoreCertificateWarning: "Enabled"
# this will be filled in before submission
# Buffer:
whitespace_free_config_json: >-
{{ lookup('file', 'config.json')
| regex_replace('[\t\n]', '')
| string
}}
ansible.builtin.uri:
...
body: >-
{{ body_dict
| combine({"Buffer": whitespace_free_config_json})
| to_json }}
body_format: json
status_code: 202
I have a role playbook which gets JSON from GitLab:
---
- name: Make an API call
uri:
method: GET
url: "https://gitlab.example.com/api/v4/projects/***/repository/files/Dev/raw?ref=master"
headers:
PRIVATE-TOKEN: **************
register: json_var
- name: show json
debug:
msg: "{{json_var}}"
- name: test
debug:
var: json_var.json.plannedrelease
register: release
- name: debug
debug:
msg: "{{ release }}"
But I can't get the JSON value into a variable. I need only the version "1.0" in the variable release (from "plannedrelease": "1.0"). How can I filter it?
Playbook output is:
PLAY [127.0.0.1] ***************************************************************
TASK [Gathering Facts] *********************************************************
ok: [127.0.0.1]
TASK [get_contour_version : Make an API call] **********************************
ok: [127.0.0.1]
TASK [get_contour_version : show json] *****************************************
ok: [127.0.0.1] => {
"msg": {
"cache_control": "max-age=0, private, must-revalidate, no-store, no-cache",
"changed": false,
"connection": "close",
"content_disposition": "inline; filename=\"Dev\"; filename*=UTF-8''Dev",
"content_length": "402",
"content_type": "text/plain; charset=utf-8",
"cookies": {},
"cookies_string": "",
"date": "Tue, 19 Jan 2021 19:42:33 GMT",
"elapsed": 0,
"expires": "Fri, 01 Jan 1990 00:00:00 GMT",
"failed": false,
"json": {
"Created": "12/06/2020 10:11",
"Key": "123",
"Updated": "01/12/2020 11:51",
"contour": "Dev",
"plannedrelease": "1.0",
"…
TASK [get_contour_version : test] **********************************************
ok: [127.0.0.1] => {
"json_var.json.plannedrelease": "1.0"
}
TASK [get_contour_version : debug] *********************************************
ok: [127.0.0.1] => {
"msg": {
"changed": false,
"failed": false,
"json_var.json.plannedrelease": "1.0"
}
}
PLAY RECAP *********************************************************************
127.0.0.1 : ok=5 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
Probably I used the wrong method for filtering.
Have you tried set_fact (https://docs.ansible.com/ansible/latest/collections/ansible/builtin/set_fact_module.html)? Reading the documentation, the following might work:
- name: Save release var
set_fact:
release: "{{ json_var.json.plannedrelease }}"
I query the Cisco ACI to obtain the advanced VMM provider details for a specific EPG.
The result is successful.
I then register the result to a variable.
I try to search that variable and obtain/extract a single specific piece of information, such as 'dn' or 'encap', as this would allow me to use the information in other plays.
Unfortunately I'm unable to extract the information, as the result comes back in an unusual format. Looking at a debug of the registered variable, it would appear to be a dictionary variable, but no matter what I try, the only item I'm able to access is the 'current' item.
All other items are not accessible as dictionary items.
I have tried to change the variable to a list, but I'm still unable to obtain the information I require.
I've searched forums to see if there is a methodology to convert the variable from a JSON result or dictionary variable to a string and then grep for the information, but had no success.
Ideally I would like to extract the information without installing additional 'apps'.
I will be very grateful if someone can advise how to search for a specific result in an irregularly nested result which doesn't list the items in a regular dictionary format.
- name: Access VMM provider Information
hosts: apics
gather_facts: false
connection: local
#
vars:
ansible_python_interpreter: /usr/bin/python3
#
tasks:
- name: Play 1 Obtain VMM Provider Information
aci_epg_to_domain:
hostname: "{{ apics.hostname }}"
username: "{{ apics.username }}"
password: "{{ apics.password }}"
tenant: Tenant_A
ap: AP_Test
epg: EPG_Test
domain: DVS_Dell
domain_type: vmm
vm_provider: vmware
state: query
validate_certs: no
register: DVS_Result
#
- set_fact:
aci_result1: "{{ DVS_Result.current }}"
- set_fact:
aci_result2: "{{ DVS_Result.fvRsDomAtt.attributes.dn }}"
#
- debug:
msg: "{{ DVS_Result }}"
- debug:
    var: aci_result1
- debug:
    var: aci_result2
DVS_Result
ok: [apic1r] => {
"msg": {
"changed": false,
"current": [
{
"fvRsDomAtt": {
"attributes": {
"annotation": "",
"bindingType": "none",
"childAction": "",
"classPref": "encap",
"configIssues": "",
"delimiter": "",
"dn": "uni/tn-TN_prod/ap-AP_Test/epg-EPG_Test/rsdomAtt-[uni/vmmp-VMware/dom-DVS_Dell]",
"encap": "unknown",
"encapMode": "auto",
"epgCos": "Cos0",
"epgCosPref": "disabled",
"extMngdBy": "",
"forceResolve": "yes",
"instrImedcy": "lazy",
"lagPolicyName": "",
"lcOwn": "local",
"modTs": "2019-08-18T20:52:13.570+00:00",
"mode": "default",
"monPolDn": "uni/tn-common/monepg-default",
"netflowDir": "both",
"netflowPref": "disabled",
"numPorts": "0",
"portAllocation": "none",
"primaryEncap": "unknown",
"primaryEncapInner": "unknown",
"rType": "mo",
"resImedcy": "lazy",
"secondaryEncapInner": "unknown",
"state": "missing-target",
"stateQual": "none",
"status": "",
"switchingMode": "native",
"tCl": "infraDomP",
"tDn": "uni/vmmp-VMware/dom-DVS_Dell",
"tType": "mo",
"triggerSt": "triggerable",
"txId": "8646911284551354729",
"uid": "15374"
}
}
}
],
"failed": false
}
}
######################################
### aci_result1
ok: [apic1r] => {
"aci_result1": [
{
"fvRsDomAtt": {
"attributes": {
"annotation": "",
"bindingType": "none",
"childAction": "",
"classPref": "encap",
"configIssues": "",
"delimiter": "",
"dn": "uni/tn-TN_prod/ap-AP_Test/epg-EPG_Test/rsdomAtt-[uni/vmmp-VMware/dom-DVS_Dell]",
"encap": "unknown",
"encapMode": "auto",
"epgCos": "Cos0",
"epgCosPref": "disabled",
"extMngdBy": "",
"forceResolve": "yes",
"instrImedcy": "lazy",
"lagPolicyName": "",
"lcOwn": "local",
"modTs": "2019-08-18T20:52:13.570+00:00",
"mode": "default",
"monPolDn": "uni/tn-common/monepg-default",
"netflowDir": "both",
"netflowPref": "disabled",
"numPorts": "0",
"portAllocation": "none",
"primaryEncap": "unknown",
"primaryEncapInner": "unknown",
"rType": "mo",
"resImedcy": "lazy",
"secondaryEncapInner": "unknown",
"state": "missing-target",
"stateQual": "none",
"status": "",
"switchingMode": "native",
"tCl": "infraDomP",
"tDn": "uni/vmmp-VMware/dom-DVS_Dell",
"tType": "mo",
"triggerSt": "triggerable",
"txId": "8646911284551354729",
"uid": "15374"
}
}
}
]
}
############################################
### aci_result2
fatal: [apic1r]: FAILED! => {"msg": "The task includes an option with an undefined variable. The error was: 'dict object' has no attribute 'fvRsDomAtt'\n\nThe error appears to be in '/etc/ansible/playbooks/cisco/aci/create_bd_ap_epg3.yml': line 37, column 8, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - set_fact:\n ^ here\n"}
Use json_query. For example:
- debug:
msg: "{{ DVS_Result.current|
json_query('[].fvRsDomAtt.attributes.dn') }}"
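If installing the JMESPath library (which json_query needs on the controller) is not an option, a sketch using only built-in filters, or a plain index given the single-element current list shown above, would be:

- debug:
    # map with a dotted attribute walks into each list element
    msg: "{{ DVS_Result.current | map(attribute='fvRsDomAtt.attributes.dn') | list }}"

- debug:
    # or simply index the first (and only) element
    msg: "{{ DVS_Result.current[0].fvRsDomAtt.attributes.dn }}"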
I am trying to parse the JSON output of the yum module, to conditionally get data.
My playbook looks like below:
---
- hosts: all
become: true
tasks:
- name: list ggk rpms
yum:
list: "{{ item }}"
register: ggk_njk_info
ignore_errors: yes
with_items:
- ggk_base
- njk_tt_client
- debug: msg="{{ item.results }}"
with_items: "{{ ggk_njk_info.results }}"
when: item.results
A part of the output for the debug task looks like below:
"msg": [
{
"arch": "noarch",
"envra": "0:njk_tt_client-2.36.11-1.noarch",
"epoch": "0",
"name": "njk_tt_client",
"release": "1",
"repo": "ggk_Software",
"version": "2.36.11",
"yumstate": "available"
},
{
"arch": "noarch",
"envra": "0:njk_tt_client-2.36.11-1.noarch",
"epoch": "0",
"name": "njk_tt_client",
"release": "1",
"repo": "installed",
"version": "2.36.11",
"yumstate": "installed"
},
{
"arch": "noarch",
"envra": "0:njk_tt_client-2.36.3-1.noarch",
"epoch": "0",
"name": "njk_tt_client",
"release": "1",
"repo": "ggk_Software",
"version": "2.36.3",
"yumstate": "available"
}
]
}
I would like to find the RPM "version" ONLY when its corresponding "yumstate" is "installed".
In this case, I would like to be able to get the version from the entry below:
"repo": "installed",
"version": "2.36.11",
json_query does the job. For example, the task below
- debug:
msg: "{{ ggk_njk_info.results|
json_query('[?yumstate==`installed`].{repo: repo,
version: version}') }}"
gives
"msg": [
{
"repo": "installed",
"version": "2.36.11"
}
]
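If installing JMESPath is not an option, a roughly equivalent sketch using only built-in filters would be (assuming the registered loop results shown above, where each loop item carries its own results list):

- debug:
    # flatten the per-item package lists, keep only installed entries,
    # then pull out their versions
    msg: "{{ ggk_njk_info.results
             | selectattr('results', 'defined')
             | map(attribute='results') | flatten
             | selectattr('yumstate', 'equalto', 'installed')
             | map(attribute='version') | list }}"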
I have a task which performs a GET request to a page. The response body is JSON like the following:
{
"ips": [
{
"organization": "1233124121",
"reverse": null,
"id": "1321411312",
"server": {
"id": "1321411",
"name": "name1"
},
"address": "x.x.x.x"
},
{
"organization": "2398479823",
"reverse": null,
"id": "2418209841",
"server": {
"id": "234979823",
"name": "name2"
},
"address": "x.x.x.x"
}
]
}
I want to extract the fields id and address, and tried (for the id field):
tasks:
- name: get request
uri:
url: "https://myurl.com/ips"
method: GET
return_content: yes
status_code: 200
headers:
Content-Type: "application/json"
X-Auth-Token: "0010101010"
body_format: json
register: json_response
- name: copy ip_json content into a file
copy: content={{json_response.json.ips.id}} dest="/dest_path/json_response.txt"
but I get this error:
the field 'args' has an invalid value, which appears to include a variable
that is undefined. The error was: 'list object' has no attribute 'id'..
Where is the problem?
The error was: 'list object' has no attribute 'id'
json_response.json.ips is a list.
You either need to choose one element (first?): json_response.json.ips[0].id.
Or process this list, for example with the map or json_query filters, if you need all the ids; a map sketch follows below.
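A sketch of the map approach, writing all the ids to the file from the original task (to_nice_json is just one way to serialize the resulting list):

- name: copy all ids into a file
  copy:
    content: "{{ json_response.json.ips | map(attribute='id') | list | to_nice_json }}"
    dest: "/dest_path/json_response.txt"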
An Ansible task to copy content to a file:
copy:
content: "{{ output.stdout[0] }}"
dest: "~/ansible/local/facts/{{ inventory_hostname }}.txt"