Ansible: extract values from nested dictionary object (JSON)

I am transforming existing installation shell scripts for an application into an Ansible job. I am now stuck on a step where I have to find a given set of files and later use them in another task, together with other properties, in a nested loop. The problem is that I can't find a way to transform the find result into a usable form.
Steps:
- name: Finds files to use
  find:
    paths: "{{ item }}"
    file_type: file
    use_regex: yes
    recurse: yes
    patterns:
      - ".*\\.xml$"
      - ".*\\.yml$"
      - ".*\\.hcl$"
      - ".*\\.json$"
  with_items:
    - /etc/<<some_folder>>
    - /opt/<<some_folder>>/conf
    - /opt/<<some_folder>>/x-cluster
    - /opt/<<some_folder>>/config
    - /opt/<<some_folder>>/x_worker/config
  register: "findoutput"

- name: Replace var strings
  replace:
    path: "{{ item.0.path }}"
    regexp: "{{ item.1 }}"
    replace: "{{ item.2 }}"
  with_nested:
    - "{{ findoutput | **<<insert_magic>>** | list }}"
    - "{{ replace_values | dictsort }}"
This approach keeps failing because I am receiving a list of 5 values, one per search iteration, each with all the found files nested inside it, and I haven't found a way to access this for my use.
I will be glad for any help or a pointer in the right direction.

After the find task, flatten the find output and fetch only the path values, like below:
- set_fact:
    formatted_result: "{{ findoutput.results | json_query('[*].files[*].path') | list | flatten }}"
Then use the formatted_result list in your next task to replace the strings.
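For example, the follow-up replace task from the question could then iterate over that flat list of paths. This is only a sketch: it assumes replace_values is the same dictionary of regexp patterns to replacement strings as in the question.
- name: Replace var strings
  replace:
    path: "{{ item.0 }}"
    regexp: "{{ item.1 }}"
    replace: "{{ item.2 }}"
  with_nested:
    - "{{ formatted_result }}"            # flat list of file paths
    - "{{ replace_values | dictsort }}"   # with_nested flattens each pair into item.1 (pattern) and item.2 (replacement)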

Related

Write dictionary into a file

I am trying to fetch a specific software version from my inventory, write the versions into a dictionary, and write this dictionary at the end into a JSON, CSV, or whatever kind of file.
I have the dictionary:
{
    "hostvars | dict2items | json_query(query) | items2dict": {
        "seccheck-node1.foo.bar": "6.0.3",
        "seccheck-node2.foo.bar": "5.6.2"
    }
}
That I get from:
- name: Assemble a new dict in one shot using dict2items + json_query + items2dict
  run_once: yes
  debug:
    var: hostvars | dict2items | json_query(query) | items2dict
  vars:
    query: '[*].{key: key, value: value.softwareversion.stdout }'
But I don't know how I can now put this dictionary into a file. Do I have to give it a name?
- name: Make new file
  copy:
    dest: "/tmp/software-version.json"
    content: "{{ query }}"
  delegate_to: localhost
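One possible sketch, not taken from the thread: give the assembled dictionary a name with set_fact first, then write that fact out with copy and to_nice_json (the fact name software_versions is made up here):
- name: Assemble the dict into a named fact
  run_once: yes
  set_fact:
    software_versions: "{{ hostvars | dict2items | json_query(query) | items2dict }}"
  vars:
    query: '[*].{key: key, value: value.softwareversion.stdout }'

- name: Write the fact to a file on the controller
  run_once: yes
  copy:
    dest: "/tmp/software-version.json"
    content: "{{ software_versions | to_nice_json }}"
  delegate_to: localhost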

Ansible extract value from stdout in json

Hello, I have Ansible output which is in JSON format.
{
    "1000": {
        "name": 1000,
        "lan": "99",
        "numMacs": 0,
        "numArpNd": 0,
        "numRemoteVteps": "n\/a",
        "type": "L3",
        "vrf": "default",
        "isL3svd": false
    }
}
I would like to extract the value of the key name from the stdout output. The following snippet of code is what I tried, but I am not getting the correct result. Any pointers?
- name: save the Json data to a Variable as a Fact
  set_fact:
    name: "{{ data | json_query(jmesquery) }}"
  vars:
    jmesquery: json.name
The key 1000 is not static; it can be dynamic.
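Since the top-level key is not fixed, one approach is to project over all values instead of naming the key. This is a sketch that assumes the variable data holds the JSON object shown above (if the object is nested under a json key, prefix the query with json.):
- name: Extract every "name" value regardless of the top-level key
  set_fact:
    names: "{{ data | json_query('*.name') }}"
    # names -> [1000]

- name: Alternative without json_query, via dict2items
  set_fact:
    names: "{{ data | dict2items | map(attribute='value.name') | list }}"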

transform values of yaml hash into keys of json hash in Ansible

I'm trying to get Ansible to convert an array of hashes into a list of key/value pairs, with the keys being one of the values from the first hash and the values being a different value from the first hash.
An example will help.
I want to convert:
TASK [k8s_cluster : Cluster create | debug result of private ec2_vpc_subnet_facts] ***
ok: [localhost] => {
    "result": {
        "subnets": [
            {
                "availability_zone": "eu-west-1c",
                "subnet_id": "subnet-cccccccc",
            },
            {
                "availability_zone": "eu-west-1a",
                "subnet_id": "subnet-aaaaaaaa",
            },
            {
                "availability_zone": "eu-west-1b",
                "subnet_id": "subnet-bbbbbbbb",
            }
        ]
    }
}
into
eu-west-1a: subnet-aaaaaaaa
eu-west-1b: subnet-bbbbbbbb
eu-west-1c: subnet-cccccccc
I've tried result.subnets | map('subnet.availability_zone': 'subnets.subnet_id') (which doesn't work at all) and json_query('subnets[*].subnet_id'), which simply picks out the subnet_id values and puts them into a list.
I think I could do this with Zip and Hash in Ruby, but I don't know how to make this work in Ansible, or more specifically in JMESPath.
I have generated the list below; I will add a newline to the generated list later (thought to share this first):
---
- name: play
  hosts: localhost
  tasks:
    - name: play
      include_vars: vars.yml

    - name: debug
      debug:
        msg: "{% for each in subnets %}{{ each.availability_zone }}:{{ each.subnet_id }}{% raw %},{% endraw %}{% endfor %}"
Output:
ok: [localhost] => {
    "msg": "eu-west-1c:subnet-cccccccc,eu-west-1a:subnet-aaaaaaaa,eu-west-1b:subnet-bbbbbbbb,"
}
JMESPath does not allow dynamic names in multi-select hashes. I have found an extension to JMESPath that allows such a thing by using key references, but it is not part of the plain JMESPath implementation nor of Ansible.
To do this in plain Ansible, you will have to create a new variable and populate it with a loop. There might be other ways using other filters, but this is the solution I came up with:
- name: Create the expected hash
  set_fact:
    my_hash: >-
      {{
        my_hash
        | default({})
        | combine({ item.availability_zone: item.subnet_id })
      }}
  loop: "{{ subnets }}"

- name: Print result
  debug:
    var: my_hash
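As a loop-free alternative (a sketch, assuming Ansible 2.7+ for the items2dict filter and the same subnets list as above), the key and value names can be remapped directly:
- name: Same result without a loop
  debug:
    msg: "{{ subnets | items2dict(key_name='availability_zone', value_name='subnet_id') }}"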

Ansible : pass a variable in a json_query filter

I need to pass a variable in a json_query filter.
This example, with a fixed string (tutu), works correctly:
- set_fact:
    my_value_exist: "{{ my_json.json | json_query('contains(component.name,`tutu`)') }}"
But I need to pass a variable instead of tutu:
- set_fact:
    my_value_exist: "{{ my_json.json | json_query('contains(component.name,`{{my_var}}`)') }}"
{{my_var}} is a string retrieved in a previous step.
Do you have the correct syntax so that the variable {{my_var}} can be passed correctly as a parameter?
Thanks for your help.
Regards,
Use a helper variable for the task:
- set_fact:
    my_value_exist: "{{ my_json.json | json_query(qry) }}"
  vars:
    qry: 'contains(component.name,`{{my_var}}`)'
If you would like to avoid a helper var, you can splice the second variable in directly by closing the query string and concatenating it with plus signs ( + ), wrapped in escaped double quotes ( \" ) so the backtick literal stays valid JSON, like this:
- set_fact:
    my_value_exist: "{{ my_json.json | json_query('contains(component.name,`\"' + my_var + '\"`)') }}"
I know this is an old question, but it might help someone, since this is the top result on the subject on Google.
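For anyone who wants to try this locally, here is a minimal self-contained sketch (the sample my_json and my_var values are made up) that combines the helper variable with the escaped-quote literal, so the JMESPath backtick literal stays valid JSON:
- hosts: localhost
  gather_facts: false
  vars:
    my_var: "tutu"
    my_json:
      json:
        component:
          name: "tutu-service"
  tasks:
    - set_fact:
        my_value_exist: "{{ my_json.json | json_query(qry) }}"
      vars:
        qry: 'contains(component.name, `"{{ my_var }}"`)'

    - debug:
        var: my_value_exist   # expected: true, since "tutu-service" contains "tutu"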

How can I map a list to a dict with computed values keyed by the list items in Ansible?

Apologies for the awkward title, but I couldn't figure out a better way to phrase what seems like a very common operation:
I have a list like
repos:
  - nrser/x
  - nrser/y
and want to transform it into a dict like
repos_dict:
  nrser/x: nrser_x
  nrser/y: nrser_y
This is super simple in Python:
repos = ['nrser/x', 'nrser/y']
repos_dict = dict((repo, repo.replace('/', '_')) for repo in repos)
# => {'nrser/x': 'nrser_x', 'nrser/y': 'nrser_y'}
But I can't figure out how to accomplish it with Ansible / Jinja2 (short of dropping into Python via a module or plugin, which seems ridiculous for such a basic use case).
It's easy to map the repos to a new list with the underscored names (I need to use them in file paths):
set_fact:
  repo_filename_segments: "{{ repos | map('replace', '/', '_') | list }}"
But then I need to zip them together, and I can't find support for that either (see ziplist1-list2-in-jinja2 and how-to-combine-two-lists-together).
I've tried:
- set_fact:
    repos:
      - beiarea/www
      - beiarea/relib

- set_fact:
    repos_dict: {}

- with_items: "{{ repos }}"
  set_fact:
    "repos_dict[{{ item }}]": "{{ item | replace('/', '_') }}"
But that doesn't work either.
Maybe it's not possible in Ansible / Jinja, but it seems like a really elementary operation to have been overlooked.
Thanks for any solutions or suggestions.
Ansible 2.3 adds (amongst others) a zip filter:
- set_fact:
    repos:
      - beiarea/www
      - beiarea/relib

- set_fact:
    dirnames: "{{ repos | zip(repos | map('replace', '/', '_')) | list }}"

- debug: var=dirnames
# "dirnames": [
#     [
#         "beiarea/www",
#         "beiarea_www"
#     ],
#     [
#         "beiarea/relib",
#         "beiarea_relib"
#     ]
# ]

- debug:
    msg: "{{ item[0] }} => {{ item[1] }}"
  with_items:
    - "{{ dirnames }}" # with_items flattens the first level
I'm still looking for a good way to turn that into a dictionary, for easy lookups.
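For the dictionary itself, here is a sketch that builds on the same zip output (assuming Ansible 2.3+ for the zip filter; dict() over a list of pairs is the standard Jinja2 global, and the combine-in-a-loop approach shown in an earlier answer on this page works as well):
- set_fact:
    repos_dict: "{{ dict(repos | zip(repos | map('replace', '/', '_'))) }}"

- debug: var=repos_dict
# "repos_dict": {
#     "beiarea/www": "beiarea_www",
#     "beiarea/relib": "beiarea_relib"
# }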