Setting a fact from JSON with hyphenated keys

I have JSON coming from an API like this:
{
  "Clusters": {
    "cluster_name": "cluster1",
    "desired_configs": {
      "ams-env": {
        "tag": "15646576543547354",
        "version": 2
      },
      "ams-grafana-env": {
        "tag": "156765743275788",
        "version": 2
      },
      "ams-grafana-ini": {
        "tag": "987657435754385457",
        "version": 2
      }
    }
  }
}
I need to parse it with Ansible. The trouble is that the variable passed in at runtime is the hyphenated part (e.g. ams-env).
I'm able to print the tag with debug: var, but I can't turn it into a fact, and I also can't make it print when I use debug: msg.
This is the play. I would like to take the "tag" for whichever config_name is passed at runtime and create a new var to be used in later tasks:
- name: Parsing Json
  hosts: localhost
  connection: local
  tags: setup_infra
  vars:
    - config_name: ams-env
  tasks:
    - name: access fact
      set_fact:
        access_auths: "{{ lookup('file', 'ambari.json') | from_json }}"
    - name: This works
      debug:
        var: access_auths.Clusters.desired_configs['{{ config_name }}'].tag
    - name: This does not work
      set_fact:
        new_config: "{{ access_auths.Clusters.desired_configs['{{ config_name }}'].tag }}"
    - name: Debug 0.3
      debug:
        var: new_config
Thanks in advance for any help

This happens because of the nested {{ }} Jinja delimiters. The config_name var already resolves to the string "ams-env", so it must not be wrapped in another pair of {{ }} inside the expression.
The following tasks should work:
- debug:
    msg: "tag is {{ access_auths['Clusters']['desired_configs'][config_name]['tag'] }}"
- set_fact:
    new_config: "{{ access_auths['Clusters']['desired_configs'][config_name]['tag'] }}"
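With that fixed, the hyphenated config can be chosen at run time by overriding config_name with extra vars. A minimal usage sketch, assuming the playbook above is saved as parse.yml (the file name is illustrative):

ansible-playbook parse.yml -e config_name=ams-grafana-env

new_config then holds that config's tag ("156765743275788" in the sample JSON) and can be used by later tasks like any other fact.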

Related

Using Key(#) function to extract keys from an object in Ansible

I have a file file.sub which contains this JSON object {"kas_sub.test1": "true", "kas_sub.test2": "true"}. I would like to extract the keys so that I get: kas_sub.test1 kas_sub.test2.
When I try
- shell: 'cat path/to/file.sub'
  register: file1
- debug:
    var: file1.stdout_lines
I got:
TASK [shell] *****************************************************************************************************************
changed: [ansible4]
changed: [control]
TASK [debug] *****************************************************************************************************************
ok: [control] => {
    "file1.stdout_lines": [
        "{\"kas_sub.tes1\": \"true\", \"kas_sub.test2\": \"true\"}"
    ]
}
So the JSON format isn't preserved, which I need because I want to use the json_query filter.
- debug:
    msg: "{{ file1.stdout_lines | json_query(value1) }}"
  vars:
    value1: "#[?keys(#)]"
The keys(#) function doesn't return anything:
ok: [control] => {
    "msg": ""
}
Note: this takes for granted that you want to read a file on the target machine.
In a nutshell:
- hosts: your_group
  gather_facts: false
  vars:
    file_to_read: /path/to/file.sub
  tasks:
    - name: slurp file content from target
      slurp:
        src: "{{ file_to_read }}"
      register: slurped_file
    - name: display keys from json inside file
      debug:
        msg: "{{ (slurped_file.content | b64decode | from_json).keys() }}"
Given the file
shell> cat /tmp/file.sub
{"kas_sub.test1": "true", "kas_sub.test2": "true"}
Use jq (if you can). For example, get the keys
- command: jq 'keys' /tmp/file.sub
  register: result
and convert them to a list
keys: "{{ result.stdout|from_yaml }}"
gives
keys:
  - kas_sub.test1
  - kas_sub.test2
Example of a complete playbook
- hosts: localhost
  vars:
    keys: "{{ result.stdout|from_yaml }}"
  tasks:
    - command: jq 'keys' /tmp/file.sub
      register: result
    - debug:
        var: keys

Ansible Extract JSON Tag

I'm trying to work with the Infoblox API and its responses. I need to extract the values of tags from the response, which seems to be in JSON format, but I cannot find a way to do it.
Here is my playbook:
- name: "Checking _node_exporter Service Record for {{ inventory_hostname }}"
local_action:
module: uri
url: "{{ infobloxapiurl }}record:srv?name=_node_exporter.domain.com&target={{ inventory_hostname }}"
force_basic_auth: yes
user: "{{ infobloxuser }}"
password: "{{ infobloxpassword }}"
validate_certs: no
return_content: yes
register: _infoblox_results
- debug:
var: _infoblox_results.json
The _infoblox_results.json variable looks like this:
TASK [prometheus : debug] *******************************************************************************************************************************************************************************************
task path: /ansible/roles/tasks/task.yml:38
ok: [server.domain.com] => {
    "_infoblox_results.json": [
        {
            "_ref": "record:srv/ZG5zLmJpbmRfc3J2JC5fZGVmYXVsdC5jb20udmNpbnQuZXcxL19ub2RlX2V4cG9ydGVyLzAvMC85MTAwL3Zhcm5pc2g3MDJ0c3QuZXcxLnZjaW50LmNvbQ:_node_exporter.domain.com/default",
            "name": "_node_exporter.domain.com",
            "port": 9100,
            "priority": 0,
            "target": "server.domain.com",
            "view": "default",
            "weight": 0
        }
    ]
}
I want to use the _ref value from _infoblox_results.json, but I wasn't able to extract it with regex_replace (it gives back the full _infoblox_results.json):
- name: Get Record ID
  set_fact:
    _rcdid: "{{ _infoblox_results.json | regex_replace('record:srv.*\\/default,', '\\1') }}"
- debug:
    var: _rcdid
  when: _infoblox_results.json != []
Nor with json_query (it gives back nothing):
- name: Get Record ID
  set_fact:
    _rcdid: "{{ _infoblox_results.json | json_query('_ref') }}"
- debug:
    var: _rcdid
  when: _infoblox_results.json != []
Can someone please point me into the right direction?
You already have an object in memory, so simply refer to its value: _infoblox_results.json[0]._ref contains the string record:srv/ZG5zLmJpbmRfc3J2JC5fZGVmYXVsdC5jb20udmNpbnQuZXcxL19ub2RlX2V4cG9ydGVyLzAvMC85MTAwL3Zhcm5pc2g3MDJ0c3QuZXcxLnZjaW50LmNvbQ:_node_exporter.domain.com/default.
With that you can split the string and select the second element:
- name: Get Record ID
  set_fact:
    _rcdid: "{{ _infoblox_results.json[0]._ref.split('/')[1] }}"

Add a new key-value to a json file using Ansible

I'm using Ansible to automate some configuration steps for my application VM, but I'm having difficulty inserting a new key-value pair into an existing JSON file on the remote host.
Say I have this json file:
{
  "foo": "bar"
}
And I want to insert a new key-value pair so the file becomes:
{
  "foo": "bar",
  "hello": "world"
}
Since the JSON format is not line based, I'm excluding the lineinfile module from my options. Also, I would prefer not to use any external modules. Google keeps giving me examples that show how to read a JSON file, but nothing about changing JSON values and writing them back to the file. I'd really appreciate your help!
Since the file is in JSON format, you could import the file into a variable, append the extra key/value pairs you want, and then write it back to the filesystem.
Here is a way to do it:
---
- hosts: localhost
  connection: local
  gather_facts: false
  vars:
  tasks:
    - name: load var from file
      include_vars:
        file: /tmp/var.json
        name: imported_var
    - debug:
        var: imported_var
    - name: append more key/values
      set_fact:
        imported_var: "{{ imported_var | default([]) | combine({ 'hello': 'world' }) }}"
    - debug:
        var: imported_var
    - name: write var to file
      copy:
        content: "{{ imported_var | to_nice_json }}"
        dest: /tmp/final.json
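For the sample file from the question, the resulting /tmp/final.json should end up roughly like this (to_nice_json pretty-prints with four-space indentation):

{
    "foo": "bar",
    "hello": "world"
}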
UPDATE:
As the OP updated, the code should work against a remote host; in this case we can't use include_vars or lookups, because they run on the controller. We can use the slurp module instead.
NEW code for remote hosts:
---
- hosts: greenhat
  # connection: local
  gather_facts: false
  vars:
  tasks:
    - name: load var from file
      slurp:
        src: /tmp/var.json
      register: imported_var
    - debug:
        msg: "{{ imported_var.content|b64decode|from_json }}"
    - name: append more key/values
      set_fact:
        imported_var: "{{ imported_var.content|b64decode|from_json | default([]) | combine({ 'hello': 'world' }) }}"
    - debug:
        var: imported_var
    - name: write var to file
      copy:
        content: "{{ imported_var | to_nice_json }}"
        dest: /tmp/final.json
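One caveat worth adding (not part of the original answer): combine does a shallow merge by default, so writing to a nested key would replace the whole sub-dict. If the JSON is nested and a deep merge is wanted, the same set_fact can pass recursive=True to combine; a sketch with an illustrative key name:

- name: append nested key/values (deep merge)
  set_fact:
    imported_var: "{{ imported_var | combine({'outer': {'hello': 'world'}}, recursive=True) }}"  # 'outer' is just an example key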
hope it helps
ilias-sp's solution is great!
In my case, it lacked handling for the case where the base JSON file might not exist yet.
So I had to add this task at the beginning of the play:
- name: Ensure json file exists
  copy:
    content: "{}"
    dest: /tmp/var.json
    force: false
For people who are OK with custom ansible modules: https://github.com/ParticleDecay/ansible-jsonpatch works great!
With this you can simply do:
- name: append key/values
  json_patch:
    src: /tmp/var.json
    operations:
      - op: add
        path: "/hello"
        value: "world"
    pretty: yes
    create: yes
- name: update log
  copy:
    content: "{{ log | to_nice_json }}"
    dest: "{{ log_file }}"
  vars:
    log: "{{ (lookup('file', log_file) | from_json) + ([{'job': (build_id if build_id != '' else 'dev'), 'keystore': ks, 'timestamp': ansible_date_time.iso8601}]) }}"
    log_file: log/log.json
    build_id: "{{ lookup('env', 'BUILD_ID') }}"
  tags: log

Ansible parse json and read result into different variables

I've set up a task which queries the GitHub API meta endpoint and returns the following:
{
  "verifiable_password_authentication": true,
  "github_services_sha": "f9e3a6b98d76d9964a6613d581164039b8d54d89",
  "hooks": [
    "192.30.252.0/22",
    "185.199.108.0/22",
    "140.82.112.0/20"
  ],
  "git": [
    "192.30.252.0/22",
    "185.199.108.0/22",
    "140.82.112.0/20",
    "13.229.188.59/32",
    "13.250.177.223/32",
    "18.194.104.89/32",
    "18.195.85.27/32",
    "35.159.8.160/32",
    "52.74.223.119/32"
  ],
  "pages": [
    "192.30.252.153/32",
    "192.30.252.154/32",
    "185.199.108.153/32",
    "185.199.109.153/32",
    "185.199.110.153/32",
    "185.199.111.153/32"
  ],
  "importer": [
    "54.87.5.173",
    "54.166.52.62",
    "23.20.92.3"
  ]
}
What I need to do is get the 3 hook IPs and read each of them into its own variable.
I've tried a couple of solutions I've found around, but nothing seems to work for me.
I've got as far as drilling down into the JSON so I'm returned only the 3 IPs, but how do I get them out and into variables individually?
I gave it a shot using Jinja2 syntax in the variable name part, and (TIL) it looks like Jinja2 syntax is allowed in that part as well!
Please see the playbook below; it processes the hooks list variable and assigns the entries to variables variable_1, variable_2, variable_3 and so on:
- hosts: localhost
  gather_facts: false
  vars:
    counter: 1
    hooks:
      - 192.30.252.0/22
      - 185.199.108.0/22
      - 140.82.112.0/20
  tasks:
    - name: populate vars
      set_fact:
        variable_{{counter}}: "{{ item }}"
        counter: "{{ counter | int + 1 }}"
      with_items:
        - "{{ hooks }}"
    - name: print vars
      debug:
        msg: "variable_1: {{variable_1}}, variable_2: {{variable_2}}, variable_3: {{variable_3}}"
and the output:
[root@optima-ansible ILIAS]# ansible-playbook 50257063.yml
PLAY [localhost] ***********************************************************************************************************************************************************************************************************************
TASK [populate vars] *******************************************************************************************************************************************************************************************************************
ok: [localhost] => (item=192.30.252.0/22)
ok: [localhost] => (item=185.199.108.0/22)
ok: [localhost] => (item=140.82.112.0/20)
TASK [print vars] **********************************************************************************************************************************************************************************************************************
ok: [localhost] => {
    "msg": "variable_1: 192.30.252.0/22, variable_2: 185.199.108.0/22, variable_3: 140.82.112.0/20"
}
PLAY RECAP *****************************************************************************************************************************************************************************************************************************
localhost                  : ok=2    changed=0    unreachable=0    failed=0
[root@optima-ansible ILIAS]#
hope it helps
UPDATE:
Something weird I noticed (also TIL): if you reverse the lines:
variable_{{counter}}: "{{ item }}"
counter: "{{ counter | int + 1 }}"
to:
counter: "{{ counter | int + 1 }}"
variable_{{counter}}: "{{ item }}"
you still end up with the same variable names, _1 to _3, while I would expect to get _2 to _4.
I guess Ansible loops behave differently than expected from other programming languages: all key/value pairs of a single set_fact are templated against the facts as they stood before that loop iteration, so the key always sees the pre-increment counter regardless of the order of the two lines.
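As a side note (not from the original answer): since hooks is already an ordered list, the three addresses can also be referenced by index without creating numbered facts; a minimal sketch:

- set_fact:
    webhook_ip_1: "{{ hooks[0] }}"
    webhook_ip_2: "{{ hooks[1] }}"
    webhook_ip_3: "{{ hooks[2] }}"

or simply use hooks[0], hooks[1] and hooks[2] directly wherever the individual IPs are needed.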
---
- name: Query Github Meta API and get Hook Ips
  hosts: local
  connection: local
  vars:
    counter: 1
  tasks:
    - name: Query API
      uri:
        url: https://api.github.com/meta
        return_content: yes
      register: response
    - name: Populate Hook Variables
      set_fact:
        webhook_ip_{{counter}}: "{{ item }}"
        counter: "{{ counter | int + 1 }}"
      with_items:
        - "{{ response['json']['hooks'] }}"
    - name: print vars
      debug:
        msg: "Variable_1: {{ webhook_ip_1 }}, Variable_2: {{ webhook_ip_2 }}, Variable_3: {{ webhook_ip_3 }}"
Works with GitHub Webhook IPs in a loop
- name: get request to github
  uri:
    url: "https://api.github.com/meta"
    method: GET
    return_content: yes
    status_code: 200
    headers:
      Content-Type: "application/json"
      #X-Auth-Token: "0010101010"
    body_format: json
  register: json_response
- name: GitHub webhook IPs
  debug:
    msg: "{{ item }}"
  with_items: "{{ (json_response.content | from_json).hooks }}"

How do I parse each row of data from the Google Sheets API with Ansible's with_items?

I am using the Google Sheets V4 Values collection and I am having trouble figuring out how to get each row to parse into an {{ item }}.
My Ansible YAML looks like:
tasks:
  - name: debug url
    debug:
      msg: "{{ g_url }}"
  - name: GET data from google spead sheet api
    uri:
      url: "{{ g_url }}"
      return_content: yes
      dest: /tmp/o_gd_form.json
    register: google_data
  - name: whats the dump?
    debug:
      var: "{{ item.0 |to_json }}"
    with_items: "{{ google_data.json.values() }}" # this is the line that needs to be fixed
And the responding JSON looks like:
{
  "range": "Sheet1!A1:D5",
  "majorDimension": "ROWS",
  "values": [
    ["Item", "Cost", "Stocked", "Ship Date"],
    ["Wheel", "$20.50", "4", "3/1/2016"],
    ["Door", "$15", "2", "3/15/2016"],
    ["Engine", "$100", "1", "30/20/2016"],
    ["Totals", "$135.5", "7", "3/20/2016"]
  ]
}
I can't seem to figure out how to write the with_items so it returns an {{ item }} of JSON like ["Engine", "$100", "1", "30/20/2016"].
Any help, or do I need to split this task out to some middleware parser?
The Google sheets api documentation is at:
https://developers.google.com/sheets/samples/reading
To get what you want, use:
- debug: msg="Name={{ item.0 }} Cost={{ item.1 }}"
with_list: "{{ google_data.json['values'] }}"
There are two problems in your code:
values is a special name (it is a dict method in Python/Jinja2), so json.values resolves to the method rather than the key; use json['values'] instead.
with_items flattens the list of lists, so you end up with one long list of strings. Use with_list to iterate over the outer list (the rows) and get the inner lists (the column values) as items.
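To round this out, a small sketch (not from the original answer) that also skips the header row with a Jinja2 slice, so only data rows such as ["Engine", "$100", "1", "30/20/2016"] are iterated:

- debug:
    msg: "Item={{ item.0 }} Cost={{ item.1 }} Stocked={{ item.2 }} Ship Date={{ item.3 }}"
  with_list: "{{ google_data.json['values'][1:] }}"

The [1:] slice drops the ["Item", "Cost", "Stocked", "Ship Date"] header before iterating.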