Add a new key-value to a json file using Ansible - json

I'm using Ansible to automate some configuration steps for my application VM, but I'm having difficulty inserting a new key-value pair into an existing JSON file on the remote host.
Say I have this json file:
{
  "foo": "bar"
}
And I want to insert a new key-value pair so the file becomes:
{
  "foo": "bar",
  "hello": "world"
}
Since JSON is not a line-based format, I'm excluding the lineinfile module from my options. Also, I would prefer not to use any external modules. Google keeps giving me examples of how to read a JSON file, but nothing about changing JSON values and writing them back to the file. I'd really appreciate your help!

Since the file is in JSON format, you can import it into a variable, append the extra key/value pairs you want, and then write it back to the filesystem.
Here is a way to do it:
---
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: load var from file
      include_vars:
        file: /tmp/var.json
        name: imported_var

    - debug:
        var: imported_var

    - name: append more key/values
      set_fact:
        imported_var: "{{ imported_var | default({}) | combine({ 'hello': 'world' }) }}"

    - debug:
        var: imported_var

    - name: write var to file
      copy:
        content: "{{ imported_var | to_nice_json }}"
        dest: /tmp/final.json
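Under the hood, combine is just a dict merge (shallow by default, deep with recursive=True). Outside Ansible, the same append step can be sketched in plain Python, with to_nice_json roughly corresponding to json.dumps with indentation:

```python
import json

def combine(base, extra):
    """Sketch of Ansible's non-recursive combine filter:
    shallow merge, right-hand side wins on key conflicts."""
    merged = dict(base)
    merged.update(extra)
    return merged

imported_var = {"foo": "bar"}           # what include_vars loaded
result = combine(imported_var, {"hello": "world"})

# to_nice_json is roughly json.dumps(..., indent=4)
print(json.dumps(result, indent=4, sort_keys=True))
```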
UPDATE:
As the OP clarified, the code should work against a remote host, and in that case we can't use include_vars or lookups (those run on the controller). We can use the slurp module instead.
New code for remote hosts:
---
- hosts: greenhat
  # connection: local
  gather_facts: false
  tasks:
    - name: load var from file
      slurp:
        src: /tmp/var.json
      register: imported_var

    - debug:
        msg: "{{ imported_var.content | b64decode | from_json }}"

    - name: append more key/values
      set_fact:
        imported_var: "{{ imported_var.content | b64decode | from_json | default({}) | combine({ 'hello': 'world' }) }}"

    - debug:
        var: imported_var

    - name: write var to file
      copy:
        content: "{{ imported_var | to_nice_json }}"
        dest: /tmp/final.json
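The reason for the b64decode in the pipeline is that slurp returns the file content base64-encoded. The round trip can be sketched in Python (the simulated register result only models the content field; the real module also returns source and encoding):

```python
import base64
import json

# Simulate what slurp registers for a small JSON file on the remote host.
file_content = '{\n  "foo": "bar"\n}\n'
slurped = {"content": base64.b64encode(file_content.encode()).decode()}

# Equivalent of: imported_var.content | b64decode | from_json
decoded = json.loads(base64.b64decode(slurped["content"]))
print(decoded)
```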
Hope it helps.

ilias-sp's solution is great!
In my case, it didn't cover the situation where the base JSON file may have to be created first.
So I had to add this task at the beginning of the play:
- name: Ensure json file exists
  copy:
    content: "{}"
    dest: /tmp/var.json
    force: false

For people who are OK with custom ansible modules: https://github.com/ParticleDecay/ansible-jsonpatch works great!
With this you can simply do:
- name: append key/values
  json_patch:
    src: /tmp/var.json
    operations:
      - op: add
        path: "/hello"
        value: "world"
    pretty: yes
    create: yes
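The module follows the JSON Patch (RFC 6902) style of operations. A minimal Python sketch of just the add operation for a top-level path like /hello (real implementations also handle nested paths and array indices):

```python
import json

def json_patch_add(doc, path, value):
    """Minimal sketch of an RFC 6902 'add' op for a single top-level key.
    Returns a new dict; does not mutate the input document."""
    key = path.lstrip("/")
    patched = dict(doc)
    patched[key] = value
    return patched

doc = {"foo": "bar"}
patched = json_patch_add(doc, "/hello", "world")
print(json.dumps(patched, indent=2))  # 'pretty: yes' analogue
```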

- name: update log
  copy:
    content: "{{ log | to_nice_json }}"
    dest: "{{ log_file }}"
  vars:
    log: "{{ (lookup('file', log_file) | from_json) + ([{'job': (build_id if build_id != '' else 'dev'), 'keystore': ks, 'timestamp': ansible_date_time.iso8601}]) }}"
    log_file: log/log.json
    build_id: "{{ lookup('ENV', 'BUILD_ID') }}"
  tags: log
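That task reads the existing log (a JSON list), appends one entry, and rewrites the whole file. The same read-append-write cycle, sketched in Python (the file path is temporary, and the keystore and timestamp values are placeholders for the task's ks and ansible_date_time.iso8601 variables):

```python
import json
import os
import tempfile

# Stand-in for log/log.json, seeded with an empty list.
log_file = os.path.join(tempfile.mkdtemp(), "log.json")
with open(log_file, "w") as f:
    json.dump([], f)

build_id = ""  # an empty BUILD_ID falls back to 'dev', as in the task's ternary
entry = {
    "job": build_id if build_id != "" else "dev",
    "keystore": "my-keystore",               # placeholder for 'ks'
    "timestamp": "2024-01-01T00:00:00Z",     # placeholder for ansible_date_time.iso8601
}

# Equivalent of: (lookup('file', log_file) | from_json) + [entry], then copy
with open(log_file) as f:
    log = json.load(f)
log.append(entry)
with open(log_file, "w") as f:
    json.dump(log, f, indent=4)
```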

Related

Using Key(#) function to extract keys from an object in Ansible

I have a file file.sub which contains this JSON object: {"kas_sub.test1": "true", "kas_sub.test2": "true"}. I would like to extract the keys to get: kas_sub.test1 kas_sub.test2.
When I try:
- shell: 'cat path/to/file.sub'
  register: file1
- debug:
    var: file1.stdout_lines
I got:
TASK [shell] *****************************************************************************************************************
changed: [ansible4]
changed: [control]
TASK [debug] *****************************************************************************************************************
ok: [control] => {
    "file1.stdout_lines": [
        "{\"kas_sub.tes1\": \"true\", \"kas_sub.test2\": \"true\"}"
    ]
}
So it's not preserving the JSON structure, which I need because I want to use the json_query filter.
- debug:
    msg: "{{ file1.stdout_lines | json_query(value1) }}"
  vars:
    value1: "#[?keys(#)]"
The keys(#) function doesn't return anything:
ok: [control] => {
    "msg": ""
}
Note: taking for granted you want to read a file on the target machine.
In a nutshell:
- hosts: your_group
  gather_facts: false
  vars:
    file_to_read: /path/to/file.sub
  tasks:
    - name: slurp file content from target
      slurp:
        src: "{{ file_to_read }}"
      register: slurped_file

    - name: display keys from json inside file
      debug:
        msg: "{{ (slurped_file.content | b64decode | from_json).keys() }}"
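Because from_json yields a real dict, .keys() returns the dotted key names intact; no json_query gymnastics are needed. The same pipeline, isolated in Python:

```python
import base64
import json

# The file content from the question, as slurp would register it (base64).
raw = '{"kas_sub.test1": "true", "kas_sub.test2": "true"}'
content_b64 = base64.b64encode(raw.encode()).decode()

# Equivalent of: slurped_file.content | b64decode | from_json, then .keys()
data = json.loads(base64.b64decode(content_b64))
keys = list(data.keys())
print(keys)
```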
Given the file
shell> cat /tmp/file.sub
{"kas_sub.test1": "true", "kas_sub.test2": "true"}
Use jq (if you can). For example, get the keys
- command: jq 'keys' /tmp/file.sub
  register: result
and convert them to a list
keys: "{{ result.stdout | from_yaml }}"
gives
keys:
  - kas_sub.test1
  - kas_sub.test2
Example of a complete playbook
- hosts: localhost
  vars:
    keys: "{{ result.stdout | from_yaml }}"
  tasks:
    - command: jq 'keys' /tmp/file.sub
      register: result
    - debug:
        var: keys

Ansible set_facts by localhost json file and copy to remote hosts

I'm trying to copy a json file from my localhost to remote hosts, and use the json "version" key value as a parameter in the destination file name.
JSON:
{"name": "test", "version": 3}
YML:
---
- name: Get json file version
  hosts: localhost
  become: true
  tasks:
    - name: Register licenses file
      command: "cat conf/licenses.json"
      register: result
    - name: Save the json data
      set_fact:
        jsondata: "{{ result.stdout | from_json }}"
    - name: Save licenses file version
      set_fact:
        file_version: "{{ jsondata | json_query(jmesquery) }}"
      vars:
        jmesquery: 'version'

- name: Deploy Licenses File
  hosts: hosts
  become: true
  tasks:
    - name: Copy licenses file
      copy:
        src: "conf/licenses.json"
        dest: "/tmp/licenses_{{ file_version }}.json"
When I run the playbook above, the Deploy Licenses File play doesn't find the file_version fact, even though I can see it is successfully saved in the Get json file version play.
Set file_version fact:
ok: [localhost] => {
    "ansible_facts": {
        "file_version": "1"
    },
    "changed": false
}
Error:
The task includes an option with an undefined variable. The error was: 'file_version' is undefined
I think facts are saved per host and are not global to the whole playbook run.
My current workaround is to combine the keys to a single task and then it works correctly, but I prefer to get the version once instead of repeating it for each remote host.
To access facts of another host, you can always use the hostvars special variable.
So, in your case:
dest: "/tmp/licenses_{{ hostvars.localhost.file_version }}.json"
Now, you actually do not need that level of complication with two plays and could well do:
- name: Deploy Licenses File
  hosts: hosts
  become: true
  tasks:
    - name: Copy licenses file
      copy:
        src: "{{ _licence_file }}"
        dest: "/tmp/licenses_{{ file_version }}.json"
      vars:
        _licence_file: conf/licenses.json
        file_version: >-
          {{ (lookup('file', _licence_file) | from_json).version }}
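The lookup('file', ...) | from_json chain reads the file on the controller and parses it; .version then indexes the resulting dict. The same two steps in Python (the file path here is a temporary stand-in for conf/licenses.json):

```python
import json
import os
import tempfile

# Stand-in for conf/licenses.json on the controller.
licence_file = os.path.join(tempfile.mkdtemp(), "licenses.json")
with open(licence_file, "w") as f:
    f.write('{"name": "test", "version": 3}')

# Equivalent of: (lookup('file', _licence_file) | from_json).version
with open(licence_file) as f:
    file_version = json.load(f)["version"]

# The templated destination from the copy task.
dest = "/tmp/licenses_{}.json".format(file_version)
print(dest)
```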
The first play runs on localhost only; therefore the variables declared in the first play are available to localhost only. This is the reason the variables are not available to the hosts in the second play:
The error was: 'file_version' is undefined
Run the tasks of the first play in a block at all hosts with run_once. This way the variables will be available to all hosts in the second play. For example, the simplified playbook below does the job:
- hosts: all
  tasks:
    - block:
        - include_vars:
            file: licenses.json
            name: jsondata
        - set_fact:
            file_version: "{{ jsondata.version }}"
      run_once: true
- hosts: hosts
  tasks:
    - copy:
        src: licenses.json
        dest: "/tmp/licenses_{{ file_version }}.json"
Created file at the remote host
shell> ssh admin@test_11 cat /tmp/licenses_3.json
{"name": "test", "version": 3}
The code can be further simplified. The single play below does the job as well
- hosts: hosts
  tasks:
    - include_vars:
        file: licenses.json
        name: jsondata
      run_once: true
    - copy:
        src: licenses.json
        dest: "/tmp/licenses_{{ jsondata.version }}.json"

Extract part of JSON in ansible playbook

I want to extract a part from a json extra vars input and use this as a variable in further commands.
The extra vars being passed to Ansible are:
{
  "problemUrl": "https://xxxxx.xxxxxxxxx-xxxxx.xxxx/e/58b59a93-xxxx-xxxx-xxxx-91bb5ca1f41c/#problems/problemdetails;pid=-5484403941961857966_1631165040000V2"
}
I want to extract the part -5484403941961857966_1631165040000V2 and store it into a variable.
- name: get pid from URL
  set_fact:
    pidproblem: "{{ problemUrl | urlsplit('fragment') | regex_search('pid=(.+)', '\\1') }}"

- name: show pid
  debug:
    var: pidproblem[0]

- name: update problem with output
  when: state == "OPEN"
  uri:
    url: https://xxxxx.xxxxxxxxx-xxxxx.xxxx/e/58b59a93-xxxx-xxxx-xxxx-91bb5ca1f41c/api/v2/problems/"{{ pidproblem[0] }}"/comments
    method: POST
    headers:
      Content-Type: application/json; charset=utf-8
      Authorization: Api-Token xxxxx
    body_format: json
    body: "{\"message\":\"TEST\",\"context\":\"TEST\"}"
Could the issue be that the id is substituted as "6551567569324750926_1631192580000V2" instead of 6551567569324750926_1631192580000V2?
"url": "https://xxxxx.xxxxxxxxx-xxxxx.xxxx/e/58b59a93-xxxx-xxxx-xxxx-91bb5ca1f41c/api/v2/problems/\"6551567569324750926_1631192580000V2\"/comments"
There is a urlsplit filter which can split a URL into known segments. We can use this to break down the URL and get the last fragment, i.e.
"{{ problemUrl | urlsplit('fragment') }}"
Gives...
problems/problemdetails;pid=-5484403941961857966_1631165040000V2
Now this gives us a more "manageable" string. We can do a regex_search (with groups) on this, to get the pid, like:
- name: get pid from URL
  set_fact:
    pid: "{{ problemUrl | urlsplit('fragment') | regex_search('pid=(-.+)', '\\1') }}"

- name: show pid
  debug:
    var: pid[0]

- name: update problem with output
  uri:
    url: "https://xxxxx.xxxxxxxxx-xxxxx.xxxx/e/58b59a93-xxxx-xxxx-xxxx-91bb5ca1f41c/api/v2/problems/{{ pid[0] }}/comments"
    # other params
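urlsplit('fragment') maps to standard URL parsing, and regex_search with a capture-group argument returns the list of captured groups, hence pid[0]. The same two steps in Python (the host part of the URL is shortened here):

```python
import re
from urllib.parse import urlsplit

problem_url = ("https://example.invalid/e/58b59a93-xxxx/"
               "#problems/problemdetails;pid=-5484403941961857966_1631165040000V2")

# Equivalent of: problemUrl | urlsplit('fragment')
fragment = urlsplit(problem_url).fragment

# Equivalent of: regex_search('pid=(-.+)', '\\1') -- take the captured group
match = re.search(r"pid=(-.+)", fragment)
pid = match.group(1)
print(pid)
```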
Not super-reliable, as we don't know how your URL can change, but you could use a regex filter to extract the pid value:
- hosts: localhost
  vars:
    problemUrl: '{ "problemUrl": "https://xxxxx.xxxxxxxxx-xxxxx.xxxx/e/58b59a93-xxxx-xxxx-xxxx-91bb5ca1f41c/#problems/problemdetails;pid=-5484403941961857966_1631165040000V2;other=false" }'
  tasks:
    - name: set_fact some parameter
      set_fact:
        pid: "{{ (problemUrl | from_json).problemUrl | regex_replace('.*pid=(?P<pid>[^;]*).*', '\\g<pid>') }}"
    - name: "update"
      debug:
        msg: "{{ pid }}"

Writing to a particular cell in csv file from Ansible

I'm trying to find out if there is any option to write to a particular cell in a CSV file from Ansible. I can use the Ansible lookup plugin csvfile to look up a value in the CSV file, but I need to add text to that file. Consider the example below:
empID,name,mark
111,user1
112,user2
113,user3
I need to add the mark for each user; running the playbook should prompt for the empID and marks:
---
- name: Update data
  hosts: localhost
  vars_prompt:
    - name: empid
      prompt: EMP_ID
      private: no
    - name: marks
      prompt: MARKS
      private: no
  tasks:
    - name: Lookup
      debug:
        msg: "{{ lookup('csvfile', '{{ empid }} file=data.csv delimiter=, col=2') }}"
I need help writing to CSV file column C, rows 1, 2, 3...
As @mdaniel mentioned, there's no out-of-the-box write_csv module in Ansible. Without creating your own module, the only workaround I can think of is the following:
You read in the CSV file with read_csv module and register the data table as a dictionary variable.
You add the new values to the dict variable recursively with the key "mark". (Check this post for modifying values in dict variable)
You loop over the dict variable and output each line to a new file with the lineinfile module. The file can be created with a .csv extension, and in each line the values are separated with delimiters (, or ;).
Here is an example:
---
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Read from CSV file
      community.general.read_csv:
        path: data.csv
        key: empID
        delimiter: ','
      register: response

    - name: Create a dict variable from the CSV file
      set_fact:
        data_dict: "{{ response.dict }}"

    - name: Update values in dictionary recursively
      set_fact:
        data_dict: "{{ data_dict | combine(new_item, recursive=true) }}"
      vars:
        new_item: "{ '{{ item.key }}': { 'mark': 'some_value' } }"
      with_dict: "{{ data_dict }}"

    - name: Debug
      debug:
        msg: "{{ data_dict }}"

    - name: Save data_dict headers to a new CSV file
      lineinfile:
        path: new_data.csv
        line: empID,name,mark
        create: yes

    - name: Save data_dict values to a new CSV file
      lineinfile:
        path: new_data.csv
        line: "{{ item.value.empID }},{{ item.value.name }},{{ item.value.mark }}"
      loop: "{{ data_dict | dict2items }}"
And the output CSV file is:
empID,name,mark
111,user1,some_value
112,user2,some_value
113,user3,some_value
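For comparison, the same read, add-column, write flow can be sketched in plain Python with the csv module (the file names mirror the playbook; the input file here is seeded in a temporary directory):

```python
import csv
import os
import tempfile

workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "data.csv")      # stand-in for data.csv
dst = os.path.join(workdir, "new_data.csv")  # stand-in for new_data.csv
with open(src, "w") as f:
    f.write("empID,name\n111,user1\n112,user2\n113,user3\n")

# Read rows as dicts, add the new 'mark' column to each row.
with open(src, newline="") as f:
    rows = list(csv.DictReader(f))
for row in rows:
    row["mark"] = "some_value"

# Write header and rows back out, as the two lineinfile tasks do.
with open(dst, "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["empID", "name", "mark"])
    writer.writeheader()
    writer.writerows(rows)
```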
There doesn't appear to be a provided write_csv mechanism in Ansible, partially because Ansible is not a general-purpose computing platform. However, you can easily write your own module which behaves as you wish, or you may be able to find an existing one in Ansible Galaxy.

How to use mongoDB collection output as variables in ansible

I can print the MongoDB data using Ansible, but my requirement is to use the printed data as variables in Ansible.
Here is my Ansible playbook:
---
- hosts: localhost
  vars:
    - i: "db.repo.find({ $and: [{'product': 'Admin'}, {'env':'SHK'}] }).pretty()"
  tasks:
    - name: Printing the retrieved data
      command: mongo Advantage --quiet --eval "{{ i }}"
      register: temp
    - name: Printing the retrieved data
      set_fact:
        "{{ item }}"
      with_items:
        - [ "{{ temp.stdout.split('\t')[0] }}", "{{ temp.stdout.split('\t')[1] }}", "{{ temp.stdout.split('\t')[2] }}", "{{ temp.stdout.split('\t')[3] }}", "{{ temp.stdout.split('\t')[4] }}", "{{ temp.stdout.split('\t')[5] }}", "{{ temp.stdout.split('\t')[6] }}", "{{ temp.stdout.split('\t')[7] }}", "{{ temp.stdout.split('\t')[8] }}", "{{ temp.stdout.split('\t')[9] }}" ]
    - include: /etc/ansible/roles/patchdeployment_3_11_2/tasks/applypatch/applypatch_windows_websphere.yml PR_ID={{ PR_ID }}
    #- include: /etc/ansible/roles/patchdeployment_3_11_2/tasks/applypatch/applypatch_linux_websphere.yml
Please help me with this.
Make the mongo output pure-JSON friendly by disabling all fields with custom types (like _id) in the query.
Then use the from_json Ansible filter to parse the output.
- hosts: localhost
  vars:
    qry: "db.repo.findOne({ $and: [{'product': 'Admin'}, {'env':'SHK'}] }, {_id: false})"
  tasks:
    - name: Get data
      command: mongo Advantage --quiet --eval "{{ qry }}"
      register: temp
    - name: Save parsed data
      set_fact:
        mongo_result: "{{ temp.stdout | from_json }}"
    - name: Print some data
      debug:
        var: mongo_result.appName
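The key point is that mongo --quiet --eval prints the matched document as JSON text, and excluding _id keeps that text valid JSON, so from_json can turn it back into a structured variable. The parsing step, isolated in Python (the sample document and its fields are hypothetical, standing in for whatever the query returns):

```python
import json

# Hypothetical stdout from: mongo Advantage --quiet --eval "<findOne ... {_id: false}>"
stdout = '{ "appName" : "Admin", "env" : "SHK", "version" : "1.2.3" }'

# Equivalent of: temp.stdout | from_json
mongo_result = json.loads(stdout)
print(mongo_result["appName"])
```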