Hello, I have Ansible output in JSON format:
{
    "1000": {
        "name": 1000,
        "lan": "99",
        "numMacs": 0,
        "numArpNd": 0,
        "numRemoteVteps": "n\/a",
        "type": "L3",
        "vrf": "default",
        "isL3svd": false
    }
}
I'd like to extract the value of the key name from the stdout output. I tried the following snippet of code, but I'm not getting the correct result. Any pointers?
- name: Save the JSON data to a variable as a fact
  set_fact:
    name: "{{ data | json_query(jmesquery) }}"
  vars:
    jmesquery: json.name
The key 1000 is not static; it can be dynamic.
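Since the top-level key is dynamic, a JMESPath wildcard such as `*.name` (rather than a fixed key) is what's needed. Here is a plain-Python sketch of what that wildcard query computes over the sample output above (assuming the parsed JSON is available as `data`):

```python
import json

# Sample output with a dynamic top-level key (here "1000")
raw = '{"1000": {"name": 1000, "vrf": "default", "type": "L3"}}'
data = json.loads(raw)

# Equivalent of the JMESPath wildcard query `*.name`:
# take the "name" field from every top-level value, whatever the key is.
names = [v["name"] for v in data.values()]
print(names)  # [1000]
```

In the playbook this would correspond to something like `jmesquery: '*.name'`, applied to whichever variable actually holds the parsed dictionary.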
I am trying to fetch a specific software version from my inventory, write it into a dictionary, and finally write this dictionary to a JSON, CSV, or other kind of file.
I have the dictionary:
{
    "hostvars | dict2items | json_query(query) | items2dict": {
        "seccheck-node1.foo.bar": "6.0.3",
        "seccheck-node2.foo.bar": "5.6.2"
    }
}
That I get from:
- name: Assemble a new dict in one shot using dict2items + json_query + items2dict
  run_once: yes
  debug:
    var: hostvars | dict2items | json_query(query) | items2dict
  vars:
    query: '[*].{key: key, value: value.softwareversion.stdout }'
But I don't know how I can put this dictionary into a file now. Do I have to give it a name?
- name: Make new file
  copy:
    dest: "/tmp/software-version.json"
    content: "{{ query }}"
  delegate_to: localhost
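As a plain-Python sketch of what the dict2items + json_query + items2dict pipeline computes (hostnames and versions below are made-up sample data), followed by the serialization a copy task would write:

```python
import json

# Made-up stand-in for hostvars: hostname -> facts
hostvars = {
    "seccheck-node1.foo.bar": {"softwareversion": {"stdout": "6.0.3"}},
    "seccheck-node2.foo.bar": {"softwareversion": {"stdout": "5.6.2"}},
}

# Equivalent of:
# dict2items | json_query('[*].{key: key, value: value.softwareversion.stdout}') | items2dict
versions = {host: facts["softwareversion"]["stdout"] for host, facts in hostvars.items()}

# Equivalent of serializing the dict for the copy task's content
print(json.dumps(versions, indent=4))
```

In Ansible itself, one option is to store the pipeline result with set_fact under a name (say, my_dict) and then use content: "{{ my_dict | to_nice_json }}" in the copy task; content: "{{ query }}" as written would only write the JMESPath query string itself.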
I was running into this particularly painful Ansible task of:
Reading JSON from a file.
Passing the JSON as a string to helm, BUT not quoting it.
- name: deploy release
  community.kubernetes.helm:
    name: my_release
    chart_ref: ./charts/my_chart
    release_namespace: "{{ namespace }}"
    state: "{{ state }}"
    release_values:
      x: "{{ lookup('file', './stuff.json') }}"
What I want the helm values file to look like is:
x: |
  { "hello": "world" }
The issue I ran into with the lookup {{ lookup('file', './stuff.json') }} is that Ansible interprets it as a dict and passes the dict to helm. This does not work, as I need a string. Here's what the output in the helm values file looks like:
x:
  hello: world
Then I tried {{ lookup('file', './stuff.json') | quote }}. Ansible passes a string to helm, but that string has quotes around it, and when I try to read the JSON in my deployment I get a parse error. Here's what the output looks like:
x: '{ "hello": "world" }'
I even tried {{ lookup('file', './stuff.json') | to_json }}, as recommended here, but that failed as well.
Using {{ lookup('file', './stuff.json') | string }} will force Ansible to evaluate it as a string without adding quotes.
There are several examples in Using filters to manipulate data that use this filter.
Documentation for the filter can be found in the Jinja2 documentation. The documentation states that the filter will:
Make a string unicode if it isn’t already. That way a markup string is not converted back to unicode.
I'm not particularly sure why this corrects the issue, but it did.
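The distinction the | string filter preserves can be sketched in plain Python (using JSON rather than YAML serialization, but the point is the same): a parsed structure serializes as a nested mapping, while the raw file text stays a single scalar value.

```python
import json

raw = '{ "hello": "world" }'  # the file content, as a plain string
parsed = json.loads(raw)      # what templating produces when it interprets the lookup

# Passed as a parsed structure, the value becomes a nested object...
print(json.dumps({"x": parsed}))  # {"x": {"hello": "world"}}

# ...while passed as a string, the JSON text stays one scalar value,
# which is what the deployment needs to parse later.
print(json.dumps({"x": raw}))
```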
I am transforming existing installation shell scripts for one application into an Ansible job. I am now stuck on a step where I have to find a given set of files and later use them in another task, with other properties, in a nested loop. The problem is that I can't find a way to transform the find result into a usable form.
Steps:
- name: Finds files to use
  find:
    paths: "{{ item }}"
    file_type: file
    use_regex: yes
    recurse: yes
    patterns:
      - ".*\\.xml$"
      - ".*\\.yml$"
      - ".*\\.hcl$"
      - ".*\\.json$"
  with_items:
    - /etc/<<some_folder>>
    - /opt/<<some_folder>>/conf
    - /opt/<<some_folder>>/x-cluster
    - /opt/<<some_folder>>/config
    - /opt/<<some_folder>>/x_worker/config
  register: "findoutput"
- name: Replace var strings
  replace:
    path: "{{ item.0.path }}"
    regexp: "{{ item.1 }}"
    replace: "{{ item.2 }}"
  with_nested:
    - "{{ findoutput | <<insert_magic>> | list }}"
    - "{{ replace_values | dictsort }}"
This approach keeps failing because I am receiving a list of 5 values, one per search iteration, each with all of the found files nested inside, and I haven't found a way to access this for my use.
I would be glad for any help or a pointer in the right direction.
After the find task, flatten the find output and fetch only the path values, like below:
- set_fact:
    formatted_result: "{{ findoutput.results | json_query('[*].files[*].path') | list | flatten }}"
Then use the formatted_result list in your next task to replace the strings.
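As a plain-Python sketch of what that json_query + flatten pipeline computes (the registered structure and paths below are made-up sample data):

```python
# Made-up stand-in for the registered find output: one result per searched
# directory, each with its own nested "files" list.
findoutput = {
    "results": [
        {"files": [{"path": "/etc/app/a.xml"}, {"path": "/etc/app/b.yml"}]},
        {"files": []},
        {"files": [{"path": "/opt/app/conf/c.json"}]},
    ]
}

# Equivalent of json_query('[*].files[*].path') | flatten:
# collect the path of every file across all per-directory results.
paths = [f["path"] for result in findoutput["results"] for f in result["files"]]
print(paths)
```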
I'm trying to get Ansible to convert an array of hashes into a hash of key/value pairs, with each key being one value from a hash and the corresponding value being a different value from the same hash.
An example will help.
I want to convert :-
TASK [k8s_cluster : Cluster create | debug result of private ec2_vpc_subnet_facts] ***
ok: [localhost] => {
    "result": {
        "subnets": [
            {
                "availability_zone": "eu-west-1c",
                "subnet_id": "subnet-cccccccc",
            },
            {
                "availability_zone": "eu-west-1a",
                "subnet_id": "subnet-aaaaaaaa",
            },
            {
                "availability_zone": "eu-west-1b",
                "subnet_id": "subnet-bbbbbbbb",
            }
        ]
    }
}
into
eu-west-1a: subnet-aaaaaaaa
eu-west-1b: subnet-bbbbbbbb
eu-west-1c: subnet-cccccccc
I've tried result.subnets | map('subnet.availability_zone': 'subnets.subnet_id') (which doesn't work at all) and json_query('subnets[*].subnet_id'), which simply picks out the subnet_id values and puts them into a list.
I think I could do this with zip and Hash in Ruby, but I don't know how to make this work in Ansible, or more specifically in JMESPath.
I have generated the list below; I will add a newline to the generated list later (thought to share this first):
---
- name: play
  hosts: localhost
  tasks:
    - name: play
      include_vars: vars.yml
    - name: debug
      debug:
        msg: "{% for each in subnets %}{{ each.availability_zone }}:{{ each.subnet_id }}{% raw %},{% endraw %}{% endfor %}"
Output:
ok: [localhost] => {
    "msg": "eu-west-1c:subnet-cccccccc,eu-west-1a:subnet-aaaaaaaa,eu-west-1b:subnet-bbbbbbbb,"
}
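The Jinja2 loop in that msg is, in effect, the following join (sample data taken from the question):

```python
subnets = [
    {"availability_zone": "eu-west-1c", "subnet_id": "subnet-cccccccc"},
    {"availability_zone": "eu-west-1a", "subnet_id": "subnet-aaaaaaaa"},
    {"availability_zone": "eu-west-1b", "subnet_id": "subnet-bbbbbbbb"},
]

# Each iteration emits "<az>:<subnet_id>," with a trailing comma,
# exactly as the template does.
msg = "".join(f"{s['availability_zone']}:{s['subnet_id']}," for s in subnets)
print(msg)  # eu-west-1c:subnet-cccccccc,eu-west-1a:subnet-aaaaaaaa,eu-west-1b:subnet-bbbbbbbb,
```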
JMESPath does not allow dynamic names in multi-select hashes. I have found an extension to JMESPath that allows such a thing by using key references, but it is not part of the plain JMESPath implementation, nor of Ansible.
To do this in plain Ansible, you will have to create a new variable and populate it with a loop. There might be other ways using other filters, but this is the solution I came up with:
- name: Create the expected hash
  set_fact:
    my_hash: >-
      {{
        my_hash
        | default({})
        | combine({ item.availability_zone: item.subnet_id })
      }}
  loop: "{{ subnets }}"

- name: Print result
  debug:
    var: my_hash
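As a plain-Python sketch of what the default + combine loop builds up (sample data from the question):

```python
subnets = [
    {"availability_zone": "eu-west-1c", "subnet_id": "subnet-cccccccc"},
    {"availability_zone": "eu-west-1a", "subnet_id": "subnet-aaaaaaaa"},
    {"availability_zone": "eu-west-1b", "subnet_id": "subnet-bbbbbbbb"},
]

# Start from an empty dict (the default({})), then merge in one
# {availability_zone: subnet_id} pair per iteration (the combine()).
my_hash = {}
for item in subnets:
    my_hash = {**my_hash, item["availability_zone"]: item["subnet_id"]}
print(my_hash)
```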
In Ansible, is there a way to convert a dynamic list of key/value pairs located in a JSON variable into variable names and values that can be accessed in a playbook, without using the filesystem?
For example, if I have the following JSON in a variable (in my case, already imported from a URI call):
{
    "ansible_facts": {
        "list_of_passwords": {
            "ansible_password": "abc123",
            "ansible_user": "user123",
            "blue_server_password": "def456",
            "blue_server_user": "user456"
        }
    }
}
Is there a way to convert that JSON variable into the equivalent of:
vars:
  ansible_password: abc123
  ansible_user: user123
  blue_server_password: def456
  blue_server_user: user456
Normally, I'd write the variable to a file, then import it using vars_files:. Our goal is to not write the secrets to the filesystem.
You can use the uri module to make a call and then register the response to a variable:
For example:
- uri:
    url: http://www.mocky.io/v2/59667604110000040ec8f5c6
    body_format: json
  register: response

- debug:
    msg: "{{ response.json }}"

- set_fact: { "{{ item.key }}": "{{ item.value }}" }
  with_dict: "{{ response.json.ansible_facts.list_of_passwords }}"
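Note that with_dict exposes each pair as item.key and item.value (not item.val). As a plain-Python sketch of what that loop does, promoting each pair in the nested dict to its own top-level fact:

```python
# Made-up stand-in for the JSON response body from the uri call.
response_json = {
    "ansible_facts": {
        "list_of_passwords": {
            "ansible_password": "abc123",
            "ansible_user": "user123",
        }
    }
}

# Equivalent of with_dict + set_fact: each key/value pair in the nested
# dict becomes its own entry in the top-level variable namespace.
facts = {}
for key, value in response_json["ansible_facts"]["list_of_passwords"].items():
    facts[key] = value
print(facts["ansible_user"])  # user123
```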