Passing Values from a CSV file into an Ansible playbook

The files below are my playbook, inventory, and a CSV file (my dictionary).
network_objects.yml
---
- hosts: all
  connection: httpapi
  gather_facts: False
  vars_files:
    - 'my_var.yml'
  tasks:
    - name: add-host
      check_point.mgmt.cp_mgmt_host:
        name: New Host 1
        ip_address: 192.0.2.1
        state: present
        auto_publish_session: yes
    - name: add-group
      check_point.mgmt.cp_mgmt_group:
        name: New Group 1
        members:
          - New Host 1
        state: present
        auto_publish_session: yes
my_var.yml
ansible_httpapi_validate_certs: False
ansible_httpapi_use_ssl: True
ansible_network_os: check_point.mgmt.checkpoint
ansible_python_interpreter: "python3"
ansible_user: admin
ansible_password: password
hosts
[check_point]
checkpoint ansible_host=#IP address of checkpoint
users.csv
Name,IP Address,Comments
PCUSER1,10.11.92.44, Created by User Feb 19 2021
PCUSER2,10.11.12.13, Created by User 02/19/2012
readCSV.yml --> This reads a CSV file called users.csv
- name: "Check for CSV File"
hosts: localhost
gather_facts: no
tasks:
- name: Read CSV File and Return a dictionary
read_csv:
path: users.csv
key: name
register: user
- ansible.builtin.debug:
msg: '{{ user }}
What I want to achieve:
I want to read my CSV and pass those variables in the order the CSV rows are listed, so I can publish the new hosts (with all the Names, IP Addresses, and Comments) to Check Point.
- name: add-host
  check_point.mgmt.cp_mgmt_host:
    name: {{ Get all Names from CSV File }}
    ip_address: {{ Get all IP Addresses from CSV File }}
    comments: {{ Get all Comments from CSV File }}

- name: add-host
  check_point.mgmt.cp_mgmt_host:
    name: "{{ item.key }}"
    ip_address: "{{ item.value['IP Address'] }}"
    comments: "{{ item.value['Comments'] }}"
  loop: "{{ user.dict | dict2items }}"
I'm not sure about the exact shape of the values. With key: Name, I think the registered result is roughly user.dict = { Name: { 'Name': ..., 'IP Address': ..., 'Comments': ... }, ... }, so each item.value is a dictionary keyed by the column headers. Look at the structure of your dictionary; I think the approach is right, but adapt it to your data.
source https://docs.ansible.com/ansible/latest/user_guide/playbooks_loops.html#iterating-over-a-dictionary
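For completeness, a minimal end-to-end sketch under the question's own setup (it assumes the community.general and check_point.mgmt collections are installed and that users.csv sits next to the playbook; the group name check_point comes from the inventory above):
---
- hosts: check_point
  connection: httpapi
  gather_facts: false
  vars_files:
    - 'my_var.yml'
  tasks:
    - name: Read users.csv into a dictionary keyed by Name
      community.general.read_csv:
        path: users.csv
        key: Name
      delegate_to: localhost    # the CSV lives on the controller
      register: user

    - name: Add one Check Point host object per CSV row
      check_point.mgmt.cp_mgmt_host:
        name: "{{ item.key }}"
        ip_address: "{{ item.value['IP Address'] }}"
        comments: "{{ item.value['Comments'] }}"
        state: present
        auto_publish_session: yes
      loop: "{{ user.dict | dict2items }}"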

Related

Ansible set_facts by localhost json file and copy to remote hosts

I'm trying to copy a json file from my localhost to remote hosts, and use the json "version" key value as a parameter in the destination file name.
JSON:
{"name": "test", "version": 3}
YML:
---
- name: Get json file version
  hosts: localhost
  become: true
  tasks:
    - name: Register licenses file
      command: "cat conf/licenses.json"
      register: result
    - name: Save the json data
      set_fact:
        jsondata: "{{ result.stdout | from_json }}"
    - name: Save licenses file version
      set_fact:
        file_version: "{{ jsondata | json_query(jmesquery) }}"
      vars:
        jmesquery: 'version'

- name: Deploy Licenses File
  hosts: hosts
  become: true
  tasks:
    - name: Copy licenses file
      copy:
        src: "conf/licenses.json"
        dest: "/tmp/licenses_{{ file_version }}.json"
When I run the playbook above, the Deploy Licenses File play doesn't find the file_version fact, even though I can see it being successfully saved in the Get json file version play.
Set file_version fact:
ok: [localhost] => {
    "ansible_facts": {
        "file_version": "1"
    },
    "changed": false
}
Error:
The task includes an option with an undefined variable. The error was: 'file_version' is undefined
I think facts are saved per host and are not global to the playbook run.
My current workaround is to combine both plays into a single one, which works, but I'd prefer to fetch the version once instead of repeating it for each remote host.
To access facts of another host, you can always use the hostvars special variable.
So, in your case:
dest: "/tmp/licenses_{{ hostvars.localhost.file_version }}.json"
Now, you actually do not need that level of complication with two plays and could well do:
- name: Deploy Licenses File
  hosts: hosts
  become: true
  tasks:
    - name: Copy licenses file
      copy:
        src: "{{ _licence_file }}"
        dest: "/tmp/licenses_{{ file_version }}.json"
      vars:
        _licence_file: conf/licenses.json
        file_version: >-
          {{ (lookup('file', _licence_file) | from_json).version }}
The first play runs on localhost only, therefore the variables declared in the first play are available to localhost only. This is the reason the variables are not available to the hosts in the second play:
The error was: 'file_version' is undefined
Run the tasks of the first play once (run_once) in a block targeting all hosts. This way the variables will be available to all hosts in the second play. For example, the simplified playbook below does the job:
- hosts: all
  tasks:
    - block:
        - include_vars:
            file: licenses.json
            name: jsondata
        - set_fact:
            file_version: "{{ jsondata.version }}"
      run_once: true

- hosts: hosts
  tasks:
    - copy:
        src: licenses.json
        dest: "/tmp/licenses_{{ file_version }}.json"
Created file at the remote host:
shell> ssh admin@test_11 cat /tmp/licenses_3.json
{"name": "test", "version": 3}
The code can be further simplified. The single play below does the job as well:
- hosts: hosts
  tasks:
    - include_vars:
        file: licenses.json
        name: jsondata
      run_once: true
    - copy:
        src: licenses.json
        dest: "/tmp/licenses_{{ jsondata.version }}.json"

Checking the key value in a JSON file

I'm having trouble verifying a value in a json file on a remote server. I have to write the file once on the remote machine from a template (j2). After that, I start a service that writes additional values to this file.
But when ansible-playbook is rerun, the file gets overwritten because it differs from the template. Before running the task that writes the file from the template, I want to check the file for these unique values.
For testing on a local machine, I do this and everything works:
- name: Check file
  hosts: localhost
  vars:
    config: "{{ lookup('file', 'config.json') | from_json }}"
  tasks:
    - name: Check info
      set_fact:
        info: "{{ config.Settings.TimeStartUP }}"
    - name: Print info
      debug:
        var: info
    - name: Create directory
      when: interfaces | length != 0
      ansible.builtin.file:
        ...
But when I try to do the same in a task for a remote machine, for some reason Ansible looks for the file on the local machine:
all.yml
---
config_file: "{{ lookup('file','/opt/my_project/config.json') | from_json }}"
site.yml
---
- name: Install My_project
  hosts: server
  tasks:
    - name: Checking if a value exists
      set_fact:
        info: "{{ config_file.Settings.TimeStartUP }}"
    - name: Print info
      debug:
        var: info
Error:
fatal: [server]: FAILED! => {"msg": "An unhandled exception occurred while templating '{{ lookup('file','/opt/my_project/config.json') | from_json }}'. Error was a <class 'ansible.errors.AnsibleError'>, original message: An unhandled exception occurred while running the lookup plugin 'file'. Error was a <class 'ansible.errors.AnsibleError'>, original message: could not locate file in lookup: /opt/my_project/config.json. could not locate file in lookup: /opt/my_project/config.json"}
Please tell me how to correctly check a key's value in a JSON file on a remote server.
Fetch the files from the remote hosts first. For example, given the files below for testing:
shell> ssh admin@test_11 cat /tmp/config.json
{"Settings": {"TimeStartUP": "today"}}
shell> ssh admin@test_12 cat /tmp/config.json
{"Settings": {"TimeStartUP": "yesterday"}}
The playbook below
- hosts: test_11,test_12
  gather_facts: false
  tasks:
    - file:
        state: directory
        path: "{{ playbook_dir }}/configs"
      delegate_to: localhost
      run_once: true
    - fetch:
        src: /tmp/config.json
        dest: "{{ playbook_dir }}/configs"
    - include_vars:
        file: "{{ config_path }}"
        name: config
      vars:
        config_path: "{{ playbook_dir }}/configs/{{ inventory_hostname }}/tmp/config.json"
    - debug:
        var: config.Settings.TimeStartUP
will create the directory configs in playbook_dir on the controller and fetch the files from the remote hosts into this directory. See the dest parameter for how the path is built:
shell> cat configs/test_11/tmp/config.json
{"Settings": {"TimeStartUP": "today"}}
shell> cat configs/test_12/tmp/config.json
{"Settings": {"TimeStartUP": "yesterday"}}
Then include_vars stores the dictionary in the variable config:
ok: [test_11] =>
config.Settings.TimeStartUP: today
ok: [test_12] =>
config.Settings.TimeStartUP: yesterday
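If fetching files to the controller is undesirable, an alternative sketch for the original question's setup is to read the remote file in place with the slurp module (it returns the file content base64-encoded) instead of the file lookup:
- hosts: server
  gather_facts: false
  tasks:
    - name: Read config.json from the remote host
      slurp:
        src: /opt/my_project/config.json
      register: config_raw
    - name: Check the key value
      debug:
        msg: "{{ (config_raw.content | b64decode | from_json).Settings.TimeStartUP }}"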

Writing to a particular cell in csv file from Ansible

I'm trying to find out whether there is any option to write to a particular cell in a CSV file from Ansible. I can use the Ansible lookup plugin "csvfile" to look up a value in the CSV file, but I need to add text to that file. Consider the example below:
empID,name,mark
111,user1
112,user2
113,user3
I need to add the mark for each user; running the playbook should prompt for the empID and the marks:
---
- name: Update data
  hosts: localhost
  vars_prompt:
    - name: empid
      prompt: EMP_ID
      private: no
    - name: marks
      prompt: MARKS
      private: no
  tasks:
    - name: Lookup
      debug:
        # templates do not nest, so build the lookup term by concatenation
        msg: "{{ lookup('csvfile', empid ~ ' file=data.csv delimiter=, col=2') }}"
I need help writing to CSV file column C, rows 1, 2, 3, ...
As @mdaniel mentioned, there's no out-of-the-box write_csv module in Ansible. Without creating your own module, the only workaround I can think of is the following:
Read in the CSV file with the read_csv module and register the data table as a dictionary variable.
Add the new values to the dict variable recursively under the key "mark". (Check this post for modifying values in a dict variable.)
Loop over the dict variable and output each line to a new file with the lineinfile module. The file can be created with a .csv extension, and in each line the values are separated with delimiters (, or ;).
Here is an example:
---
- hosts: localhost
  gather_facts: no
  tasks:
    - name: Read from CSV file
      community.general.read_csv:
        path: data.csv
        key: empID
        delimiter: ','
      register: response
    - name: Create a dict variable from the CSV file
      set_fact:
        data_dict: "{{ response.dict }}"
    - name: Update values in dictionary recursively
      set_fact:
        data_dict: "{{ data_dict | combine(new_item, recursive=true) }}"
      vars:
        new_item: "{{ {item.key: {'mark': 'some_value'}} }}"
      with_dict: "{{ data_dict }}"
    - name: Debug
      debug:
        msg: "{{ data_dict }}"
    - name: Save data_dict headers to a new CSV file
      lineinfile:
        path: new_data.csv
        line: empID,name,mark
        create: yes
    - name: Save data_dict values to a new CSV file
      lineinfile:
        path: new_data.csv
        line: "{{ item.value.empID }},{{ item.value.name }},{{ item.value.mark }}"
      loop: "{{ data_dict | dict2items }}"
And the resulting CSV file is:
empID,name,mark
111,user1,some_value
112,user2,some_value
113,user3,some_value
There doesn't appear to be a provided write_csv mechanism in Ansible, partially because Ansible is not a general-purpose computing platform. However, you can easily write your own module that behaves as you wish, or you may be able to find an existing one out on Ansible Galaxy.
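As a lighter alternative to the lineinfile loop above, a sketch that rebuilds the whole file in a single task with copy and an inline Jinja2 template (it assumes the same data_dict produced by the read_csv step earlier):
- name: Write the updated table back as CSV in one task
  copy:
    dest: new_data.csv
    content: |
      empID,name,mark
      {% for row in data_dict.values() %}
      {{ row.empID }},{{ row.name }},{{ row.mark }}
      {% endfor %}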

Ansible save output from multiple hosts

I'm facing an issue with an Ansible playbook: I want to collect info about all servers into a single file. Simply put, I need to gather info from all the servers specified in the hosts file.
Here is my .yml file:
---
- hosts: idrac
  connection: local
  name: Get system inventory
  gather_facts: False
  collections:
    - dellemc.openmanage
  tasks:
    - name: Get system inventory
      dellemc_get_system_inventory:
        idrac_ip: "{{ idrac_ip }}"
        idrac_user: "root"
        idrac_password: "root"
      register: result
    - name: Copy results locally to output file
      copy:
        content: "{{ result }}"
        dest: "./output/system_inventory_output.json"
      delegate_to: localhost
But the problem is that when I check the output file, it contains JSON data from only one server.
I've been browsing the net but so far have found no working solution for this.
Any idea how to achieve that?
Thanks!
Create the output file in a second play, and iterate over all the hosts using a template. Something like this:
---
- hosts: idrac
  connection: local
  name: Get system inventory
  gather_facts: False
  collections:
    - dellemc.openmanage
  tasks:
    - name: Get system inventory
      dellemc_get_system_inventory:
        idrac_ip: "{{ idrac_ip }}"
        idrac_user: "root"
        idrac_password: "root"
      register: system_inventory

- hosts: localhost
  gather_facts: false
  tasks:
    - name: Write results to local output file
      copy:
        dest: "./output/system_inventory_output.json"
        content: |
          {% for host in groups.idrac %}
          === {{ host }} ==
          {{ hostvars[host].system_inventory }}
          {% endfor %}
You might elect to use the template module rather than embedding the template in the content argument of the copy module, as I have done here.
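For example, a minimal sketch of that variant, assuming a template file named system_inventory.j2 next to the playbook:
# system_inventory.j2
{% for host in groups.idrac %}
=== {{ host }} ==
{{ hostvars[host].system_inventory }}
{% endfor %}

- hosts: localhost
  gather_facts: false
  tasks:
    - name: Render results to local output file
      template:
        src: system_inventory.j2
        dest: "./output/system_inventory_output.json"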

Add a new key-value to a json file using Ansible

I'm using Ansible to automate some configuration steps for my application VM, but I'm having difficulty inserting a new key-value pair into an existing json file on the remote host.
Say I have this json file:
{
  "foo": "bar"
}
And I want to insert a new key-value pair so the file becomes:
{
  "foo": "bar",
  "hello": "world"
}
Since the json format is not line-based, I'm excluding the lineinfile module from my options. Also, I would prefer not to use any external modules. Google keeps giving me examples of how to read a json file, but nothing about changing json values and writing them back to the file. I'd really appreciate your help!
Since the file is in json format, you can import the file into a variable, append the extra key/value pairs you want, and then write it back to the filesystem.
Here is a way to do it:
---
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: load var from file
      include_vars:
        file: /tmp/var.json
        name: imported_var
    - debug:
        var: imported_var
    - name: append more key/values
      set_fact:
        imported_var: "{{ imported_var | default({}) | combine({ 'hello': 'world' }) }}"
    - debug:
        var: imported_var
    - name: write var to file
      copy:
        content: "{{ imported_var | to_nice_json }}"
        dest: /tmp/final.json
UPDATE:
As the OP clarified, the code should work against a remote host, so we can't use include_vars or lookups (they run on the controller). We can use the slurp module instead.
New code for remote hosts:
---
- hosts: greenhat
  # connection: local
  gather_facts: false
  tasks:
    - name: load var from file
      slurp:
        src: /tmp/var.json
      register: imported_var
    - debug:
        msg: "{{ imported_var.content | b64decode | from_json }}"
    - name: append more key/values
      set_fact:
        imported_var: "{{ imported_var.content | b64decode | from_json | combine({ 'hello': 'world' }) }}"
    - debug:
        var: imported_var
    - name: write var to file
      copy:
        content: "{{ imported_var | to_nice_json }}"
        dest: /tmp/final.json
hope it helps
ilias-sp's solution is great!
In my case it didn't cover creating the base json file when it doesn't exist yet, so I had to add this task at the beginning of the play:
- name: Ensure json file exists
  copy:
    content: "{}"
    dest: /tmp/var.json
    force: false
For people who are OK with custom ansible modules: https://github.com/ParticleDecay/ansible-jsonpatch works great!
With this you can simply do:
- name: append key/values
  json_patch:
    src: /tmp/var.json
    operations:
      - op: add
        path: "/hello"
        value: "world"
    pretty: yes
    create: yes
Another variant that appends an entry to a json log file in one task (note the lookup plugin name is lowercase env):
- name: update log
  copy:
    content: "{{ log | to_nice_json }}"
    dest: "{{ log_file }}"
  vars:
    log: "{{ (lookup('file', log_file) | from_json) + [{'job': (build_id if build_id != '' else 'dev'), 'keystore': ks, 'timestamp': ansible_date_time.iso8601}] }}"
    log_file: log/log.json
    build_id: "{{ lookup('env', 'BUILD_ID') }}"
  tags: log