Using the keys(@) function to extract keys from an object in Ansible - json

I have a file file.sub which contains this JSON object: {"kas_sub.test1": "true", "kas_sub.test2": "true"}. I would like to extract the keys to get this: kas_sub.test1 kas_sub.test2.
When I try
- shell: 'cat path/to/file.sub'
  register: file1
- debug:
    var: file1.stdout_lines
I got:
TASK [shell] *****************************************************************************************************************
changed: [ansible4]
changed: [control]

TASK [debug] *****************************************************************************************************************
ok: [control] => {
    "file1.stdout_lines": [
        "{\"kas_sub.test1\": \"true\", \"kas_sub.test2\": \"true\"}"
    ]
}
So it's not preserving the JSON format, which I need because I want to use the json_query filter.
- debug:
    msg: "{{ file1.stdout_lines | json_query(value1) }}"
  vars:
    value1: "@[?keys(@)]"
The keys(@) function doesn't return anything:
ok: [control] => {
    "msg": ""
}

Note: this takes for granted that you want to read a file on the target machine.
In a nutshell:
- hosts: your_group
  gather_facts: false
  vars:
    file_to_read: /path/to/file.sub
  tasks:
    - name: slurp file content from target
      slurp:
        src: "{{ file_to_read }}"
      register: slurped_file

    - name: display keys from json inside file
      debug:
        msg: "{{ (slurped_file.content | b64decode | from_json).keys() }}"

Given the file
shell> cat /tmp/file.sub
{"kas_sub.test1": "true", "kas_sub.test2": "true"}
Use jq (if you can). For example, get the keys
- command: jq 'keys' /tmp/file.sub
  register: result
and convert them to a list
keys: "{{ result.stdout|from_yaml }}"
gives
keys:
  - kas_sub.test1
  - kas_sub.test2
This works because jq prints the keys as a JSON array, and JSON is valid YAML, so from_yaml parses the multi-line output straight into a list.
Example of a complete playbook
- hosts: localhost
  vars:
    keys: "{{ result.stdout|from_yaml }}"
  tasks:
    - command: jq 'keys' /tmp/file.sub
      register: result
    - debug:
        var: keys
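If jq is not available, a similar result can be had with Ansible filters alone — a minimal sketch, assuming the file is readable on the controller:
- hosts: localhost
  vars:
    keys: "{{ (lookup('file', '/tmp/file.sub') | from_json).keys() | list }}"
  tasks:
    - debug:
        var: keys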


Syntax to check if var

I have a playbook to get all disk letters configured on my server, and I need a task to verify whether an extra-vars letter is in the list.
For example, I need to check if "F" is in the JSON data below.
Could you please help me with the best syntax?
Thanks
{
  "disks_drives_letter": [
    ["C"],
    ["D"],
    ["E"],
    []
  ]
}
You can use the setup module to get host information like disks. For more information, see the setup module documentation: https://docs.ansible.com/ansible/latest/collections/ansible/builtin/setup_module.html
Example of playbook:
- hosts: localhost
  gather_facts: false
  vars:
    my_disk_drives: ['sda', 'sdb']
  tasks:
    - name: Collect host hardware information
      setup:
        gather_subset:
          - hardware

    - name: Output if disk exists
      debug:
        msg: "{{ item }} exists"
      loop: "{{ my_disk_drives }}"
      when: item in hostvars[inventory_hostname].ansible_devices.keys() | list

    - name: Output if disk does not exist
      debug:
        msg: "{{ item }} does not exist"
      loop: "{{ my_disk_drives }}"
      when: item not in hostvars[inventory_hostname].ansible_devices.keys() | list
Output:
TASK [Output if disk exists]
ok: [localhost] => (item=sda) => {
    "msg": "sda exists"
}
skipping: [localhost] => (item=sdb)

TASK [Output if disk does not exist]
skipping: [localhost] => (item=sda)
ok: [localhost] => (item=sdb) => {
    "msg": "sdb does not exist"
}
Use filters intersect and difference, and declare the lists
my_disks_exist: "{{ ansible_devices.keys()|intersect(my_disks) }}"
my_disks_not_exist: "{{ my_disks|difference(my_disks_exist) }}"
Example of a complete playbook for testing
- hosts: localhost
  vars:
    my_disks: [sda, sdb, sdc]
    my_disks_exist: "{{ ansible_devices.keys()|intersect(my_disks) }}"
    my_disks_not_exist: "{{ my_disks|difference(my_disks_exist) }}"
  tasks:
    - setup:
        gather_subset: devices
    - debug:
        var: ansible_devices.keys()
    - debug:
        var: my_disks_exist
    - debug:
        var: my_disks_not_exist
| flatten helped me, thanks @vladimir-botka:
- name: Get all disk letters from the disks info
  set_fact:
    disks_drives_letters: "{{ win_disk_facts | json_query(query) | flatten }}"

- name: Check if disk_letter is used on server
  fail:
    msg: "The disk letter already exists on the VM"
  when: drive_letter in disks_drives_letters

Ansible reading nested json values and matching variable

I am using this in an Ansible playbook:
- name: Gather info from Vcenter
  vmware_vm_info:
    hostname: "{{ result_item.vcenter }}"
    username: "{{ ansible_username }}"
    password: "{{ ansible_password }}"
    validate_certs: no
  register: vminfo
  loop: "{{ result.list }}"
  loop_control:
    loop_var: result_item
I loop through a CSV which has a list of VMs and their vCenters. The JSON output from the Ansible task is this:
{
    "results": [
        {
            "changed": false,
            "virtual_machines": [
                {
                    "guest_name": "Server1",
                    "guest_fullname": "SUSE Linux Enterprise 11 (64-bit)",
                    "power_state": "poweredOn",
                },
                {
                    "guest_name": "Server2",
                    "guest_fullname": "FreeBSD Pre-11 versions (64-bit)",
                    "power_state": "poweredOn",
                },
Now I need to query this output for the VMs in my CSV (guest_name matches vmname) and use set_fact to record whether each VM in the CSV is poweredOff or poweredOn. Then I can use that as a conditional on whether to power off the VM, based on its current status.
I can't seem to get json_query to work when matching the VM name in the CSV to the JSON output and then getting the corresponding power status. Any ideas?
CSV file:
vmname vcenter
Server1 Vcenter1
Server2 Vcenter1
Q: "set_fact to indicate whether the VMs in the CSV are powered off or powered on."
A: For example
- read_csv:
    path: servers.csv
    dialect: excel-tab
  register: result

- set_fact:
    servers: "{{ result.list|map(attribute='vmname')|list }}"

- set_fact:
    virtual_machines: "{{ virtual_machines|default([]) +
                          [dict(_servers|zip(_values))] }}"
  loop: "{{ vminfo.results }}"
  vars:
    _servers: "{{ servers|intersect(_dict.keys()|list) }}"
    _values: "{{ _servers|map('extract',_dict)|list }}"
    _dict: "{{ item.virtual_machines|
               items2dict(key_name='guest_name', value_name='power_state') }}"

- debug:
    var: virtual_machines
gives
virtual_machines:
  - Server1: poweredOn
    Server2: poweredOn
The vars do the heavy lifting: _dict turns each result's virtual_machines list into a guest_name-to-power_state mapping via items2dict, _servers keeps only the CSV servers present in that mapping, and zip pairs them back into a dictionary. Servers missing in vminfo.results will be silently ignored.
Q: "Use it as a conditional on whether to power off the VM or not."
A: For example, Server1 on the first host
- debug:
    msg: "Host={{ _host }} VM={{ _vm }} is poweredOn"
  when: virtual_machines[_host][_vm] == 'poweredOn'
  vars:
    _host: 0
    _vm: Server1
gives
msg: Host=0 VM=Server1 is poweredOn
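To actually power the VM off on that condition, a task along these lines could follow — a sketch, not from the original answer, assuming the community.vmware collection and the credentials from the question:
- name: Power off Server1 when it is currently powered on
  vmware_guest:
    hostname: Vcenter1                  # assumed vCenter for Server1, per the CSV
    username: "{{ ansible_username }}"
    password: "{{ ansible_password }}"
    validate_certs: no
    name: Server1
    state: poweredoff
  when: virtual_machines[0]['Server1'] == 'poweredOn'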
I suppose, from your example, that you have a TSV, so tab-separated values, and not a CSV, which stands for comma-separated values.
Based on this, the read_csv module, along with dialect: excel-tab, will help you read your TSV.
Then, you will need to use a filter projection to query the JSON based on the data in your TSV file.
You will also need to flatten the projection to get rid of the double list created by both the list in results and the one in virtual_machines.
An example of the resulting JMESPath query, for the Server1 ends up being:
results[].virtual_machines[?
  guest_name == `Server1`
]|[]|[0].power_state
Then, with all this in a playbook, we end up with:
- hosts: localhost
  gather_facts: no

  tasks:
    - read_csv:
        path: servers.csv
        dialect: excel-tab
      register: servers

    - debug:
        msg: >-
          For {{ item.vmname }}, the state is {{
            vminfo |
            json_query(
              'results[].virtual_machines[?
                guest_name == `' ~ item.vmname ~ '`
              ]|[]|[0].power_state'
            )
          }}
      loop: "{{ servers.list }}"
      loop_control:
        label: "{{ item.vmname }}"
      vars:
        vminfo:
          results:
            - changed: false
              virtual_machines:
                - guest_name: Server1
                  guest_fullname: SUSE Linux Enterprise 11 (64-bit)
                  power_state: poweredOn
                - guest_name: Server2
                  guest_fullname: FreeBSD Pre-11 versions (64-bit)
                  power_state: poweredOn
Which yields the recap:
PLAY [localhost] **************************************************************************************************
TASK [read_csv] ***************************************************************************************************
ok: [localhost]
TASK [debug] ******************************************************************************************************
ok: [localhost] => (item=Server1) =>
msg: For Server1, the state is poweredOn
ok: [localhost] => (item=Server2) =>
msg: For Server2, the state is poweredOn
PLAY RECAP ********************************************************************************************************
localhost : ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0

Ansible Extract JSON Tag

I'm trying to work with the Infoblox API and its responses. I need to extract the values of tags from the response, which seems to be in JSON format, but I cannot find a way to do it.
Here is my playbook:
- name: "Checking _node_exporter Service Record for {{ inventory_hostname }}"
local_action:
module: uri
url: "{{ infobloxapiurl }}record:srv?name=_node_exporter.domain.com&target={{ inventory_hostname }}"
force_basic_auth: yes
user: "{{ infobloxuser }}"
password: "{{ infobloxpassword }}"
validate_certs: no
return_content: yes
register: _infoblox_results
- debug:
var: _infoblox_results.json
The _infoblox_results.json variable looks like this:
TASK [prometheus : debug] *******************************************************************************************************************************************************************************************
task path: /ansible/roles/tasks/task.yml:38
ok: [server.domain.com] => {
    "_infoblox_results.json": [
        {
            "_ref": "record:srv/ZG5zLmJpbmRfc3J2JC5fZGVmYXVsdC5jb20udmNpbnQuZXcxL19ub2RlX2V4cG9ydGVyLzAvMC85MTAwL3Zhcm5pc2g3MDJ0c3QuZXcxLnZjaW50LmNvbQ:_node_exporter.domain.com/default",
            "name": "_node_exporter.domain.com",
            "port": 9100,
            "priority": 0,
            "target": "server.domain.com",
            "view": "default",
            "weight": 0
        }
    ]
}
I want to use the data of _ref from _infoblox_results.json, but I wasn't able to extract it with regex_replace (it returns the full _infoblox_results.json):
- name: Get Record ID
  set_fact:
    _rcdid: "{{ _infoblox_results.json | regex_replace('record:srv.*\\/default,', '\\1') }}"

- debug:
    var: _rcdid
  when: _infoblox_results.json != []
Nor with json_query (it returns nothing):
- name: Get Record ID
  set_fact:
    _rcdid: "{{ _infoblox_results.json | json_query('_ref') }}"

- debug:
    var: _rcdid
  when: _infoblox_results.json != []
Can someone please point me into the right direction?
You already have an object in memory, so simply refer to its value: _infoblox_results.json[0]._ref contains the string record:srv/ZG5zLmJpbmRfc3J2JC5fZGVmYXVsdC5jb20udmNpbnQuZXcxL19ub2RlX2V4cG9ydGVyLzAvMC85MTAwL3Zhcm5pc2g3MDJ0c3QuZXcxLnZjaW50LmNvbQ:_node_exporter.domain.com/default.
With that you can split the string on '/' and select the second element (index 1):
- name: Get Record ID
  set_fact:
    _rcdid: "{{ _infoblox_results.json[0]._ref.split('/')[1] }}"

Add a new key-value to a json file using Ansible

I'm using Ansible to automate some configuration steps for my application VM, but I'm having difficulty inserting a new key-value pair into an existing JSON file on the remote host.
Say I have this json file:
{
  "foo": "bar"
}
And I want to insert a new key value pair to make the file become:
{
  "foo": "bar",
  "hello": "world"
}
Since the JSON format is not line-based, I'm excluding the lineinfile module from my options. Also, I would prefer not to use any external modules. Google keeps giving me examples that show how to read a JSON file, but nothing about changing JSON values and writing them back to the file. I'd really appreciate your help!
Since the file is in JSON format, you can import the file into a variable, append the extra key/value pairs you want, and then write it back to the filesystem.
Here is a way to do it:
---
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: load var from file
      include_vars:
        file: /tmp/var.json
        name: imported_var

    - debug:
        var: imported_var

    - name: append more key/values
      set_fact:
        imported_var: "{{ imported_var | default({}) | combine({ 'hello': 'world' }) }}"

    - debug:
        var: imported_var

    - name: write var to file
      copy:
        content: "{{ imported_var | to_nice_json }}"
        dest: /tmp/final.json
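Note that combine replaces top-level keys; if the JSON is nested and a deep merge is wanted, combine's recursive option can be used — a minimal sketch:
- name: deep-merge a nested key/value
  set_fact:
    imported_var: "{{ imported_var | combine({'outer': {'hello': 'world'}}, recursive=True) }}"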
UPDATE:
As the OP clarified, the code should work against a remote host; in this case we can't use include_vars or lookups, but we can use the slurp module.
New code for remote hosts:
---
- hosts: greenhat
  # connection: local
  gather_facts: false
  tasks:
    - name: load var from file
      slurp:
        src: /tmp/var.json
      register: imported_var

    - debug:
        msg: "{{ imported_var.content|b64decode|from_json }}"

    - name: append more key/values
      set_fact:
        imported_var: "{{ imported_var.content|b64decode|from_json | default({}) | combine({ 'hello': 'world' }) }}"

    - debug:
        var: imported_var

    - name: write var to file
      copy:
        content: "{{ imported_var | to_nice_json }}"
        dest: /tmp/final.json
hope it helps
ilias-sp's solution is great!
In my case, it lacked handling for when the base JSON file has to be created first.
So I had to add this task at the beginning of the play (force: false ensures the file is only created when it does not already exist):
- name: Ensure json file exists
  copy:
    content: "{}"
    dest: /tmp/var.json
    force: false
For people who are OK with custom ansible modules: https://github.com/ParticleDecay/ansible-jsonpatch works great!
With this you can simply do:
- name: append key/values
  json_patch:
    src: /tmp/var.json
    operations:
      - op: add
        path: "/hello"
        value: "world"
    pretty: yes
    create: yes
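A related pattern appends an entry to a JSON list instead of adding a key/value pair: read the existing file with a file lookup, concatenate the new item, and write the result back in a single copy task: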
- name: update log
  copy:
    content: "{{ log | to_nice_json }}"
    dest: "{{ log_file }}"
  vars:
    log: "{{ (lookup('file', log_file) | from_json) + ([{'job': (build_id if build_id != '' else 'dev'), 'keystore': ks, 'timestamp': ansible_date_time.iso8601}]) }}"
    log_file: log/log.json
    build_id: "{{ lookup('env', 'BUILD_ID') }}"
  tags: log

How to use mongoDB collection output as variables in ansible

I am able to print the MongoDB data using Ansible, but my requirement is to use the printed data as variables in Ansible.
Here is my Ansible playbook:
---
- hosts: localhost
  vars:
    - i: "db.repo.find({ $and: [{'product': 'Admin'}, {'env':'SHK'}] }).pretty()"
  tasks:
    - name: Printing the retrieved data
      command: mongo Advantage --quiet --eval "{{i}}"
      register: temp

    - name: Printing the retrieved data
      set_fact:
        "{{item}}"
      with_items:
        - [ "{{temp.stdout.split('\t')[0] }}", "{{temp.stdout.split('\t')[1] }}", "{{temp.stdout.split('\t')[2] }}", "{{temp.stdout.split('\t')[3] }}", "{{temp.stdout.split('\t')[4] }}", "{{temp.stdout.split('\t')[5] }}", "{{temp.stdout.split('\t')[6] }}", "{{temp.stdout.split('\t')[7] }}", "{{temp.stdout.split('\t')[8] }}", "{{temp.stdout.split('\t')[9] }}" ]

    - include: /etc/ansible/roles/patchdeployment_3_11_2/tasks/applypatch/applypatch_windows_websphere.yml PR_ID={{PR_ID}}
    #- include: /etc/ansible/roles/patchdeployment_3_11_2/tasks/applypatch/applypatch_linux_websphere.yml
Please help me with this.
Make the mongo output pure-JSON friendly by disabling all fields with custom types (like _id) in the query.
Then use the from_json Ansible filter to parse the output.
- hosts: localhost
  vars:
    qry: "db.repo.findOne({ $and: [{'product': 'Admin'}, {'env':'SHK'}] }, {_id: false})"
  tasks:
    - name: Get data
      command: mongo Advantage --quiet --eval "{{qry}}"
      register: temp

    - name: Save parsed data
      set_fact:
        mongo_result: "{{ temp.stdout | from_json }}"

    - name: Print some data
      debug:
        var: mongo_result.appName
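From there, the parsed fields can be used like any other Ansible variables in later tasks, conditions, or includes — a sketch, assuming the returned document contains the product and env fields from the query above:
- name: Use a field from the parsed document
  debug:
    msg: "Running patch tasks for product {{ mongo_result.product }}"
  when: mongo_result.env == 'SHK'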