Ansible: passing JSON as a string without quoting it

I was running into this particularly painful Ansible task:
1. Reading JSON from a file.
2. Passing the JSON as a string to helm, but not quoting it.
- name: deploy release
  community.kubernetes.helm:
    name: my_release
    chart_ref: ./charts/my_chart
    release_namespace: "{{namespace}}"
    state: "{{state}}"
    release_values:
      x: "{{ lookup('file', './stuff.json') }}"
What I want the helm values file to look like is:
x: |
  { "hello": "world" }
The issue I ran into with the lookup {{ lookup('file', './stuff.json') }} is that Ansible interprets it as a dict and passes the dict to helm. This does not work, as I need a string. Here's what the output in the helm values file looks like:
x:
  hello: world
Then I tried {{ lookup('file', './stuff.json') | quote }}. Ansible passes a string to helm, but that string has quotes around it. When I try to read the JSON in my deployment, I get a parse error. Here's what the output would look like:
x: '{ "hello": "world" }'
I even tried {{ lookup('file', './stuff.json') | to_json }}, as recommended here, but that failed as well.

Using {{ lookup('file', './stuff.json') | string }} will force Ansible to evaluate it as a string without adding quotes.
There are several examples in Using filters to manipulate data that use this filter.
Documentation for the filter can be found in the Jinja2 documentation. The documentation states that the filter will:
Make a string unicode if it isn’t already. That way a markup string is not converted back to unicode.
I'm not particularly sure why this corrects the issue, but it did.
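For reference, here is a minimal sketch of the corrected task from the question, with the | string filter as the only change (same chart and file names as above):
- name: deploy release
  community.kubernetes.helm:
    name: my_release
    chart_ref: ./charts/my_chart
    release_namespace: "{{namespace}}"
    state: "{{state}}"
    release_values:
      # | string keeps the file content a plain string, so helm receives
      # the raw JSON text rather than a dict parsed from it
      x: "{{ lookup('file', './stuff.json') | string }}"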

Related

Ansible extract value from stdout in json

Hello, I have Ansible output which is in JSON format.
{
  "1000": {
    "name": 1000,
    "lan": "99",
    "numMacs": 0,
    "numArpNd": 0,
    "numRemoteVteps": "n\/a",
    "type": "L3",
    "vrf": "default",
    "isL3svd": false
  }
}
I'd like to extract the value of the key name from the stdout output. The following is the snippet of code I tried, but I'm not getting the correct result. Any pointers?
- name: save the Json data to a Variable as a Fact
  set_fact:
    name: "{{ data | json_query(jmesquery) }}"
  vars:
    jmesquery: json.name
The key 1000 is not static; it can be dynamic.
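One approach (a sketch, not from the original thread) is a JMESPath wildcard, so the query works whatever the top-level key happens to be; this assumes the parsed JSON shown above is already held in a variable called data:
- name: save the Json data to a Variable as a Fact
  set_fact:
    # '*' projects over the values of the dynamic top-level key(s),
    # so this yields a list such as [1000]
    name: "{{ data | json_query('*.name') }}"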

Create a JSON-parsable k8s ConfigMap containing lists of quoted elements with Helm

I was trying to modify a Helm chart (this one), and one of its templates generates a ConfigMap.
The ConfigMap is then loaded and parsed as JSON by the different modules.
The thing is that at some point I need to put a list of strings in the JSON. Passing just the value in the template resulted in an unquoted list. I then tried to use a range to emit it element by element, but then I get a trailing comma. The JSON parser used by the image I'm deploying (over which I have no control) won't accept non-strict JSON, i.e. the last element in the list cannot be followed by a trailing comma.
Here is an example values.yaml:
val:
  - "a"
  - "b"
  - "c"
And some template.tpl:
apiVersion: v1
kind: ConfigMap
metadata:
  name: some-configmap
data:
  cfg.json: |
    {
      "val": [{{ range .Values.val }}{{ . | quote }},{{ end }}]
    }
But this yields:
{
  "val": ["a","b","c",]
}
Which is rejected by the json parser with message like:
internal/modules/cjs/loader.js:1008
    throw err;
    ^

SyntaxError: /etc/config/..2020_08_03_15_32_26.221540866/pelias.json: Unexpected token ] in JSON at position 1744
    at JSON.parse (<anonymous>)
    at Object.Module._extensions..json (internal/modules/cjs/loader.js:1005:27)
    at Module.load (internal/modules/cjs/loader.js:811:32)
    at Function.Module._load (internal/modules/cjs/loader.js:723:14)
    at Module.require (internal/modules/cjs/loader.js:848:19)
    at require (internal/modules/cjs/helpers.js:74:18)
    at getConfig (/code/pelias/schema/node_modules/pelias-config/index.js:66:21)
    at Object.generate (/code/pelias/schema/node_modules/pelias-config/index.js:24:18)
    at Object.<anonymous> (/code/pelias/schema/scripts/create_index.js:2:41)
    at Module._compile (internal/modules/cjs/loader.js:955:30)
If I pass only empty lists, the json parser is happy, but I need those arguments for the rest of the process.
Is there a way to either remove the last comma, or even to load a list of strings in a more elegant manner using helm templates?
(I know I can hardcode the value in my templates, but I would like this deployment to be reusable with other parameters)
Thanks in advance.
[EDIT]
I found a somewhat working strategy by doing:
"val": [{{ join "," .Values.val }}]
The only issue now is that I need to use double quotes in my values.yaml:
val:
  - '"a"'
  - '"b"'
  - '"c"'
This is ok, but I'd be interested in a cleaner solution if anyone has it.
You can use the toJson function to convert sections of your values file into JSON.
Both of the following work (the difference being which you prefer and whether there are other functions you want to apply first):
"val": {{ toJson .Values.val }}
"val": {{ .Values.val | toJson }}
There is also a toYaml function.
Examples
All examples use a slightly modified values.yaml:
config:
  val:
    - a
    - b
    - c
Example 1: Just the array
data:
  cfg.json: |-
    {
      "val": {{ toJson .Values.config.val }}
    }
results in
data:
  cfg.json: |-
    {
      "val": ["a","b","c"]
    }
Example 2: The whole section
data:
  cfg.json: {{ .Values.config | toJson | quote }}
results in
data:
  cfg.json: "{\"val\":[\"a\",\"b\",\"c\"]}"
Example 3: Pretty JSON
data:
  cfg.json: |-
    {{- .Values.config | toPrettyJson | nindent 4 }}
results in
data:
  cfg.json: |-
    {
      "val": [
        "a",
        "b",
        "c"
      ]
    }
Example 4: YAML
data:
  cfg.json: |-
    {{- .Values.config | toYaml | nindent 4 }}
results in
data:
  cfg.json: |-
    val:
    - a
    - b
    - c

transform values of yaml hash into keys of json hash in Ansible

I'm trying to get Ansible to convert an array of hashes into a list of key/value pairs, with the keys being one of the values from each hash and the values being a different value from the same hash.
An example will help.
I want to convert:
TASK [k8s_cluster : Cluster create | debug result of private ec2_vpc_subnet_facts] ***
ok: [localhost] => {
    "result": {
        "subnets": [
            {
                "availability_zone": "eu-west-1c",
                "subnet_id": "subnet-cccccccc",
            },
            {
                "availability_zone": "eu-west-1a",
                "subnet_id": "subnet-aaaaaaaa",
            },
            {
                "availability_zone": "eu-west-1b",
                "subnet_id": "subnet-bbbbbbbb",
            }
        ]
    }
}
into
eu-west-1a: subnet-aaaaaaaa
eu-west-1b: subnet-bbbbbbbb
eu-west-1c: subnet-cccccccc
I've tried result.subnets | map('subnet.availability_zone': 'subnets.subnet_id') (which doesn't work at all) and json_query('subnets[*].subnet_id') (which simply picks out the subnet_id values and puts them into a list).
I think I could do this with Zip and Hash in Ruby, but I don't know how to make this work in Ansible, or more specifically in JMESPath.
I have generated the list below; I will add a new line to the generated list (thought to share this first).
---
- name: play
  hosts: localhost
  tasks:
    - name: play
      include_vars: vars.yml
    - name: debug
      debug:
        msg: "{% for each in subnets %}{{ each.availability_zone }}:{{ each.subnet_id }}{% raw %},{% endraw %}{% endfor %}"
output --->
ok: [localhost] => {
    "msg": "eu-west-1c:subnet-cccccccc,eu-west-1a:subnet-aaaaaaaa,eu-west-1b:subnet-bbbbbbbb,"
}
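A small variation on that loop (a sketch, not part of the original answer) avoids the trailing comma by checking loop.last:
- name: debug
  debug:
    # only emit a comma between items, never after the last one
    msg: "{% for each in subnets %}{{ each.availability_zone }}:{{ each.subnet_id }}{% if not loop.last %},{% endif %}{% endfor %}"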
JMESPath does not allow using dynamic names in multi-select hashes. I have found an extension to JMESPath that allows such a thing by using key references, but it is not part of the plain JMESPath implementation nor of Ansible.
To do this in plain Ansible, you will have to create a new variable and populate it with a loop. There might be other ways using other filters, but this is the solution I came up with:
- name: Create the expected hash
  set_fact:
    my_hash: >-
      {{
        my_hash
        | default({})
        | combine({ item.availability_zone: item.subnet_id })
      }}
  loop: "{{ subnets }}"
- name: Print result
  debug:
    var: my_hash
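As noted above, other filters can get there too. One alternative (a sketch, assuming the same subnets variable) is the items2dict filter, which builds the mapping without an explicit loop:
- name: Create the expected hash with items2dict
  set_fact:
    # use availability_zone as the key and subnet_id as the value of each entry
    my_hash: "{{ subnets | items2dict(key_name='availability_zone', value_name='subnet_id') }}"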

Ansible: Create variables from json string

In Ansible, is there a way to convert a dynamic list of key/value pairs that are located in a JSON variable into variable names/values that can be accessed in a Playbook without using the filesystem?
I.e., if I have the following JSON in a variable (in my case, already imported from a URI call):
{
  "ansible_facts": {
    "list_of_passwords": {
      "ansible_password": "abc123",
      "ansible_user": "user123",
      "blue_server_password": "def456",
      "blue_server_user": "user456"
    }
  }
}
Is there a way to convert that JSON variable into the equivalent of:
vars:
  ansible_password: abc123
  ansible_user: user123
  blue_server_password: def456
  blue_server_user: user456
Normally, I'd write the variable to a file, then import it using vars_files:. Our goal is to not write the secrets to the filesystem.
You can use the uri module to make a call and then register the response to a variable.
For example:
- uri:
    url: http://www.mocky.io/v2/59667604110000040ec8f5c6
    body_format: json
  register: response

- debug:
    msg: "{{ response.json }}"

- set_fact: { "{{ item.key }}": "{{ item.value }}" }
  with_dict: "{{ response.json.ansible_facts.list_of_passwords }}"
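After that loop, each key from the JSON should be available as a top-level fact. A small usage sketch (the variable name is taken from the example JSON above):
- debug:
    # blue_server_user was set by the with_dict loop above
    var: blue_server_user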

How to prevent Jinja2 substitution in Ansible playbook?

In my playbook, a JSON file is included using the include_vars module. The content of the JSON file is as given below:
{
  "Component1": {
    "parameter1": "value1",
    "parameter2": "value2"
  },
  "Component2": {
    "parameter1": "{{ NET_SEG_VLAN }}",
    "parameter2": "value2"
  }
}
After the JSON file is included in the playbook, I am using the uri module to send an HTTP request as given below:
- name: Configure Component2 variables using REST API
  uri:
    url: "http://0.0.0.0:5000/vse/api/v1.0/config/working/Component2/configvars/"
    method: POST
    return_content: yes
    HEADER_x-auth-token: "{{ login_resp.json.token }}"
    HEADER_Content-Type: "application/json"
    body: "{{ Component2 }}"
    body_format: json
As can be seen, the body of the HTTP request is sent with the JSON data Component2. However, Jinja2 tries to substitute {{ NET_SEG_VLAN }} from the JSON file and throws an undefined-variable error. The intention is not to substitute anything inside the JSON file with Jinja2, and to send the body as-is in the HTTP request.
How to prevent the Jinja2 substitution for the variables included from the JSON file?
You should be able to escape the variable with {{'{{NET_SEG_VLAN}}'}} to tell Jinja not to template anything inside that block.
You should be able to escape the variable with {% raw %} and {% endraw %} to tell Jinja not to template anything inside that block.
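A sketch (not from the original answers) of what the raw-escaped value might look like inside the included JSON file:
{
  "Component2": {
    "parameter1": "{% raw %}{{ NET_SEG_VLAN }}{% endraw %}",
    "parameter2": "value2"
  }
}
When the value is eventually templated, the raw block should leave the literal {{ NET_SEG_VLAN }} text in place instead of trying to resolve it.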
!unsafe
From documentation at https://docs.ansible.com/ansible/2.10/user_guide/playbooks_advanced_syntax.html#unsafe-or-raw-strings:
When handling values returned by lookup plugins, Ansible uses a data type called unsafe to block templating. Marking data as unsafe prevents malicious users from abusing Jinja2 templates to execute arbitrary code on target machines. The Ansible implementation ensures that unsafe values are never templated. It is more comprehensive than escaping Jinja2 with {% raw %} ... {% endraw %} tags.
You can use the same unsafe data type in variables you define, to prevent templating errors and information disclosure. You can mark values supplied by vars_prompts as unsafe. You can also use unsafe in playbooks. The most common use cases include passwords that allow special characters like { or %, and JSON arguments that look like templates but should not be templated.
I am using it all the time, like this:
# Load JSON content, as a raw string with !unsafe
- tags: ["always"]
  set_fact:
    cc_dashboard_content: !unsafe "{{ lookup('file', './dash.json') | to_json }}"

# Build dictionary via template
- tags: ["always"]
  set_fact:
    cc: "{{ lookup('template', './templates/cm_dashboard.yaml.j2') | from_yaml }}"

## cm_dashboard.yaml.j2 content:
hello: {{ cc_dashboard_content }}

# Now, "cc" is a dict variable, with "hello" field protected!