Ansible: Create variables from json string - json

In Ansible, is there a way to convert a dynamic list of key/value pairs that are located in a JSON variable into variable names/values that can be accessed in a Playbook without using the filesystem?
IE - If I have the following JSON in a variable (in my case, already imported from a URI call):
{
  "ansible_facts": {
    "list_of_passwords": {
      "ansible_password": "abc123",
      "ansible_user": "user123",
      "blue_server_password": "def456",
      "blue_server_user": "user456"
    }
  }
}
Is there a way to convert that JSON variable into the equivalent of:
vars:
  ansible_password: abc123
  ansible_user: user123
  blue_server_password: def456
  blue_server_user: user456
Normally, I'd write the variable to a file, then import it using vars_files:. Our goal is to not write the secrets to the filesystem.

You can use the uri module to make the call, register the response in a variable, and then loop over the returned dictionary with set_fact.
For example:
- uri:
    url: http://www.mocky.io/v2/59667604110000040ec8f5c6
    body_format: json
  register: response

- debug:
    msg: "{{ response.json }}"

- set_fact: { "{{ item.key }}": "{{ item.value }}" }
  with_dict: "{{ response.json.ansible_facts.list_of_passwords }}"
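Because set_fact runs once per key, every entry of list_of_passwords becomes a regular host fact afterwards. A minimal follow-up check (a sketch, using the variable names from the sample JSON above):

- debug:
    msg: "blue server user is {{ blue_server_user }}, ansible user is {{ ansible_user }}"

With default settings these facts live only in memory for the duration of the run, which matches the goal of never writing the secrets to the filesystem.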

Related

Ansible extract value from stdout in json

Hello, I have Ansible output which is in JSON format.
{
  "1000": {
    "name": 1000,
    "lan": "99",
    "numMacs": 0,
    "numArpNd": 0,
    "numRemoteVteps": "n\/a",
    "type": "L3",
    "vrf": "default",
    "isL3svd": false
  }
}
I'd like to extract the value of the key name from the stdout output. Below is the snippet I tried, but I'm not getting the correct result. Any pointers?
- name: save the Json data to a Variable as a Fact
  set_fact:
    name: "{{ data | json_query(jmesquery) }}"
  vars:
    jmesquery: json.name
The key 1000 is not static; it can be dynamic.
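One possible approach (a sketch, assuming data holds the parsed JSON shown above and the jmespath Python library is installed) is to use a JMESPath wildcard so the dynamic top-level key does not matter:

- name: extract every "name" value regardless of the dynamic top-level key
  set_fact:
    names: "{{ data | json_query('*.name') }}"

- debug:
    msg: "first name value: {{ names | first }}"

json_query('*.name') returns a list (here [1000]), so pick the element you need with first or an index.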

ansible passing JSON as a string without quoting it

I was running into this particularly painful Ansible task of:
Reading JSON from a file.
Passing the JSON as a string to helm, BUT not quoting it.
- name: deploy release
  community.kubernetes.helm:
    name: my_release
    chart_ref: ./charts/my_chart
    release_namespace: "{{ namespace }}"
    state: "{{ state }}"
    release_values:
      x: "{{ lookup('file', './stuff.json') }}"
What I want the helm values file to look like is:
x: |
  { "hello": "world" }
The issue I ran into with the lookup {{ lookup('file', './stuff.json') }} is that Ansible interprets it as a dict and passes the dict to helm. This does not work, as I need a string. Here's what the output in the helm values file looks like:
x:
  hello: world
Then I tried {{ lookup('file', './stuff.json') | quote }}. Ansible passes a string to helm, but that string has quotes around it. When I try to read the JSON in my deployment, I get a parse error. Here's what the output would look like:
x: '{ "hello": "world" }'
I even tried {{ lookup('file', './stuff.json') | to_json }}, as recommended here, but that failed as well.
Using {{ lookup('file', './stuff.json') | string }} will force Ansible to evaluate it as a string without adding quotes.
There are several examples in Using filters to manipulate data that use this filter.
Documentation for the filter can be found in the Jinja2 documentation. The documentation states that the filter will:
Make a string unicode if it isn’t already. That way a markup string is not converted back to unicode.
I'm not particularly sure why this corrects the issue, but it did.
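Applied to the task from the question, the fix would look like this (reusing the chart and file names from the question):

- name: deploy release
  community.kubernetes.helm:
    name: my_release
    chart_ref: ./charts/my_chart
    release_namespace: "{{ namespace }}"
    state: "{{ state }}"
    release_values:
      # force the file content to stay a plain string instead of being
      # parsed into a dict by Ansible's type detection
      x: "{{ lookup('file', './stuff.json') | string }}"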

transform values of yaml hash into keys of json hash in Ansible

I'm trying to get Ansible to convert an array of hashes into a set of key/value pairs, with the keys being one of the values from each hash and the values being a different value from the same hash.
An example will help.
I want to convert :-
TASK [k8s_cluster : Cluster create | debug result of private ec2_vpc_subnet_facts] ***
ok: [localhost] => {
    "result": {
        "subnets": [
            {
                "availability_zone": "eu-west-1c",
                "subnet_id": "subnet-cccccccc",
            },
            {
                "availability_zone": "eu-west-1a",
                "subnet_id": "subnet-aaaaaaaa",
            },
            {
                "availability_zone": "eu-west-1b",
                "subnet_id": "subnet-bbbbbbbb",
            }
        ]
    }
}
into
eu-west-1a: subnet-aaaaaaaa
eu-west-1b: subnet-bbbbbbbb
eu-west-1c: subnet-cccccccc
I've tried result.subnets | map('subnet.availability_zone': 'subnets.subnet_id') (which doesn't work at all) and json_query('subnets[*].subnet_id'), which simply picks out the subnet_id values and puts them into a list.
I think I could do this with Zip and Hash in Ruby, but I don't know how to make this work in Ansible, or more specifically in JMESPath.
I have generated the list below; I will add a newline to the generated list later (thought to share this first):
---
- name: play
  hosts: localhost
  tasks:
    - name: play
      include_vars: vars.yml

    - name: debug
      debug:
        msg: "{% for each in subnets %}{{ each.availability_zone }}:{{ each.subnet_id }}{% raw %},{% endraw %}{% endfor %}"
Output:
ok: [localhost] => {
    "msg": "eu-west-1c:subnet-cccccccc,eu-west-1a:subnet-aaaaaaaa,eu-west-1b:subnet-bbbbbbbb,"
}
JMESPath does not allow dynamic names in multi-select hashes. I have found an extension to JMESPath that allows this by using key references, but it is not part of the plain JMESPath implementation nor of Ansible.
To do this in plain Ansible, you will have to create a new variable and populate it in a loop. There might be other ways using other filters, but this is the solution I came up with:
- name: Create the expected hash
  set_fact:
    my_hash: >-
      {{
        my_hash
        | default({})
        | combine({ item.availability_zone: item.subnet_id })
      }}
  loop: "{{ subnets }}"

- name: Print result
  debug:
    var: my_hash
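Another option, if you are on Ansible 2.7 or later, is the items2dict filter, which builds the same mapping without an explicit loop (a sketch, assuming the list is available as subnets as in the answer above):

- name: Create the expected hash with items2dict
  set_fact:
    my_hash: "{{ subnets | items2dict(key_name='availability_zone', value_name='subnet_id') }}"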

Provision a JSON file with Ansible while keeping indentation

I want to provision a JSON file with Ansible. The content of this file is a variable in my Ansible playbook.
And very important for my use case: I need the indentation and line breaks to be exactly the same as in my variable.
The variable looks like this :
my_ansible_var:
  {
    "foobar": {
      "foo": "bar"
    },
    "barfoo": {
      "bar": "foo"
    }
  }
And it's used like this in my playbook:
- name: drop the gitlab-secrets.json file
  copy:
    content: "{{ my_ansible_var }}"
    dest: "/some/where/file.json"
Problem: when this task is played, my file is provisioned, but as a "one-line" file:
{ "foobar": { "foo": "bar" }, "barfoo": { "bar": "foo" } }
I tried several other ways:
Retrieving the base64 value of my JSON content and using content: "{{ my_ansible_var | b64decode }}": same problem in the end
Playing with YAML block indicators: none of them helped with that problem
Adding filters like to_json or to_nice_json(indent=2): no more luck here
Question:
How can I provision a JSON file with Ansible while keeping the exact indentation I want?
In your example my_ansible_var is a dict. If you don't need to access its keys in your playbook (e.g. my_ansible_var.foobar.foo) and just want it as a JSON string for your copy task, force it to be a string.
There is a type-detection feature in the Ansible template engine, so if you feed it a dict-like or list-like string, it will be evaluated into an object. See some details here.
This construction will work ok for your case:
---
- hosts: localhost
  gather_facts: no
  vars:
    my_ansible_var: |
      {
        "foobar": {
          "foo": "bar"
        },
        "barfoo": {
          "bar": "foo"
        }
      }
  tasks:
    - copy:
        content: "{{ my_ansible_var | string }}"
        dest: /tmp/out.json
Note the vertical bar (literal block scalar) in the my_ansible_var definition and the | string filter in the content expression.
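If you do later need to access individual keys of that string in the same playbook, one option (my addition, not from the answer above) is to parse it on demand with from_json while still copying the raw string to disk:

- debug:
    msg: "{{ (my_ansible_var | from_json).foobar.foo }}"

This prints bar, while /tmp/out.json keeps the exact formatting of the block scalar.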

How to prevent Jinja2 substitution in Ansible playbook?

In my playbook, a JSON file is included using the include_vars module. The content of the JSON file is as given below:
{
  "Component1": {
    "parameter1": "value1",
    "parameter2": "value2"
  },
  "Component2": {
    "parameter1": "{{ NET_SEG_VLAN }}",
    "parameter2": "value2"
  }
}
After the JSON file is included in the playbook, I am using the uri module to send an HTTP request as given below:
- name: Configure Component2 variables using REST API
  uri:
    url: "http://0.0.0.0:5000/vse/api/v1.0/config/working/Component2/configvars/"
    method: POST
    return_content: yes
    HEADER_x-auth-token: "{{ login_resp.json.token }}"
    HEADER_Content-Type: "application/json"
    body: "{{ Component2 }}"
    body_format: json
As can be seen, the body of the HTTP request is sent with the JSON data Component2. However, Jinja2 tries to substitute {{ NET_SEG_VLAN }} in the JSON file and throws an undefined-variable error. The intention is not to substitute anything inside the JSON file with Jinja2 and to send the body as-is in the HTTP request.
How to prevent the Jinja2 substitution for the variables included from the JSON file?
You should be able to escape the variable even with {{ '{{ NET_SEG_VLAN }}' }} to tell Jinja not to template anything inside that block.
You should be able to escape the variable with {% raw %} and {% endraw %} to tell Jinja not to template anything inside that block.
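For example, the placeholder can be wrapped in the JSON file itself so it survives templating untouched (a sketch of that escaping, using the variable name from the question):

{
  "Component2": {
    "parameter1": "{% raw %}{{ NET_SEG_VLAN }}{% endraw %}",
    "parameter2": "value2"
  }
}

When body: "{{ Component2 }}" is rendered, the raw block is stripped and the literal {{ NET_SEG_VLAN }} is sent to the API.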
!unsafe
From documentation at https://docs.ansible.com/ansible/2.10/user_guide/playbooks_advanced_syntax.html#unsafe-or-raw-strings:
When handling values returned by lookup plugins, Ansible uses a data type called unsafe to block templating. Marking data as unsafe prevents malicious users from abusing Jinja2 templates to execute arbitrary code on target machines. The Ansible implementation ensures that unsafe values are never templated. It is more comprehensive than escaping Jinja2 with {% raw %} ... {% endraw %} tags.
You can use the same unsafe data type in variables you define, to prevent templating errors and information disclosure. You can mark values supplied by vars_prompts as unsafe. You can also use unsafe in playbooks. The most common use cases include passwords that allow special characters like { or %, and JSON arguments that look like templates but should not be templated.
I am using it all the time, like this:
# Load JSON content, as a raw string with !unsafe
- tags: ["always"]
  set_fact:
    dashboard_content: !unsafe "{{ lookup('file', './dash.json') | to_json }}"

# Build dictionary via template
- tags: ["always"]
  set_fact:
    cc: "{{ lookup('template', './templates/cm_dashboard.yaml.j2') | from_yaml }}"

## cm_dashboard.yaml.j2 content:
hello: {{ dashboard_content }}

# Now, "cc" is a dict variable, with the "hello" field protected!