Ansible lookup in a file with semicolon-delimited values, keyed on 2 variables (csv)

Input CSV file:
Sysport;name;address;column1;port;column2;column3
host001$port0;host001;x.x.x.10;x.x.x.10:port0,x.x.x.11:port0;port0;port0;envq1
host001$port1;host001;x.x.x.10;x.x.x.10:port1,x.x.x.11:port1;port1;port1;envq1
host001$port2;host001;x.x.x.10;x.x.x.10:port2,x.x.x.11:port2;port2;port2;envq1
host001$port3;host001;x.x.x.10;x.x.x.10:port3,x.x.x.11:port3;port3;port3;envq1
host001$port4;host001;x.x.x.10;x.x.x.10:port4,x.x.x.11:port4;port4;port4;envq1
host001$port5;host001;x.x.x.10;x.x.x.10:port5,x.x.x.11:port5;port5;port5;envq1
Code:
---
- name: lookup example
  # Include host, group of hosts
  hosts: [dummy]
  # Count of servers to run in batch
  serial: 10
  # Collect basic information from servers
  gather_facts: True
  #ignore_unreachable: true
  # Execution tasks
  tasks:
    - shell: ls /directory
      register: port
    - debug:
        msg: "{{ lookup('csvfile', inventory_hostname ~ '$' ~ item file=/ansible/files/repos.csv delimiter=; col=3 }}"
      with_items:
        "{{ port.stdout_lines }}"
Error:
TASK [debug] *******************************************************************
fatal: [dummy1]: FAILED! => {"failed": true, "msg": "ERROR! template error while templating string: expected token ',', got
'file'"}
Expected result: the value of 'server_repo' in the message.

As shown in the fine manual, the csvfile lookup wants the 2nd argument as a string, not a freeform collection of tokens (to say nothing of the missing closing paren in your example):
- debug:
    msg: "{{ lookup('csvfile', inventory_hostname ~ '$' ~ item ~ ' file=/ansible/files/vips.csv delimiter=; col=3') }}"
  with_items:
    "{{ port.stdout_lines }}"
At your discretion, you can also extract that CSV key part into a var, since vars: blocks are evaluated for every iteration:
- debug:
    msg: "{{ lookup('csvfile', my_csv_key ~ ' file=/ansible/files/vips.csv delimiter=; col=3') }}"
  vars:
    my_csv_key: "{{ inventory_hostname ~ '$' ~ item }}"
  with_items:
    "{{ port.stdout_lines }}"
Both work the same, but the second may be a little easier for folks to mentally parse.
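Newer ansible-core releases also accept the lookup options as real keyword arguments instead of the 'key file=... col=...' string; check the csvfile lookup docs for your version. A minimal sketch under that assumption:

- debug:
    msg: "{{ lookup('csvfile', inventory_hostname ~ '$' ~ item, file='/ansible/files/vips.csv', delimiter=';', col=3) }}"
  with_items: "{{ port.stdout_lines }}"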

Related

Ansible reading nested json values and matching variable

I am using this in an Ansible playbook:
- name: Gather info from Vcenter
  vmware_vm_info:
    hostname: "{{ result_item.vcenter }}"
    username: "{{ ansible_username }}"
    password: "{{ ansible_password }}"
    validate_certs: no
  register: vminfo
  loop: "{{ result.list }}"
  loop_control:
    loop_var: result_item
I loop through a CSV which has a list of VMs and their vCenters. The JSON output from the Ansible task is this:
{
    "results": [
        {
            "changed": false,
            "virtual_machines": [
                {
                    "guest_name": "Server1",
                    "guest_fullname": "SUSE Linux Enterprise 11 (64-bit)",
                    "power_state": "poweredOn",
                },
                {
                    "guest_name": "Server2",
                    "guest_fullname": "FreeBSD Pre-11 versions (64-bit)",
                    "power_state": "poweredOn",
                },
Now I need to query this output for the VMs in my CSV (guest_name matches vmname) and use set_fact to indicate whether the VMs in the CSV are poweredOff or poweredOn. Next I can use that as a conditional on whether to power off the VM or not, based on its current status.
I can't seem to get json_query to work when matching the VM name in the CSV to the JSON output and then getting the corresponding power status. Any ideas?
CSV file:
vmname vcenter
Server1 Vcenter1
Server2 Vcenter1
Q: "set_fact to indicate whether the VMs in the CSV are powered off or powered on."
A: For example
- read_csv:
    path: servers.csv
    dialect: excel-tab
  register: result
- set_fact:
    servers: "{{ result.list|map(attribute='vmname')|list }}"
- set_fact:
    virtual_machines: "{{ virtual_machines|default([]) +
                          [dict(_servers|zip(_values))] }}"
  loop: "{{ vminfo.results }}"
  vars:
    _servers: "{{ servers|intersect(_dict.keys()|list) }}"
    _values: "{{ _servers|map('extract',_dict)|list }}"
    _dict: "{{ item.virtual_machines|
               items2dict(key_name='guest_name', value_name='power_state') }}"
- debug:
    var: virtual_machines
gives
virtual_machines:
  - Server1: poweredOn
    Server2: poweredOn
Servers missing in the vminfo.results will be silently ignored.
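If you do want those missing servers to surface, one option (a minimal sketch, not part of the original answer; it reuses the servers and virtual_machines facts built above) is to diff the CSV list against the keys that were actually collected:

- name: Report servers from the CSV that were not found in vminfo
  debug:
    msg: "Not found in vCenter inventory: {{ _missing }}"
  vars:
    # keys collected across all results; 'virtual_machines' is the list of
    # per-result dicts built by the set_fact task above
    _found: "{{ virtual_machines | map('dict2items') | flatten | map(attribute='key') | list }}"
    _missing: "{{ servers | difference(_found) }}"
  when: _missing | length > 0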
Q: "Use it as a conditional on whether to power off the VM or not."
A: For example, Server1 on the first host
- debug:
    msg: "Host={{ _host }} VM={{ _vm }} is poweredOn"
  when: virtual_machines[_host][_vm] == 'poweredOn'
  vars:
    _host: 0
    _vm: Server1
gives
  msg: Host=0 VM=Server1 is poweredOn
I suppose, from your example, that you actually have a TSV, i.e. tab-separated values, and not a CSV, which stands for comma-separated values.
Based on this, the read_csv module, along with dialect: excel-tab, will help you read your TSV.
Then, you will need to use a filter projection to query the JSON based on the data in your TSV file.
You will also need to flatten the projection to get rid of the double list created by both the list in results and the one in virtual_machines.
An example of the resulting JMESPath query, for Server1, ends up being:
results[].virtual_machines[?
  guest_name == `Server1`
]|[]|[0].power_state
Then with all this in a playbook we do end up with:
- hosts: localhost
  gather_facts: no
  tasks:
    - read_csv:
        path: servers.csv
        dialect: excel-tab
      register: servers
    - debug:
        msg: >-
          For {{ item.vmname }}, the state is {{
            vminfo |
            json_query(
              'results[].virtual_machines[?
                guest_name == `' ~ item.vmname ~ '`
              ]|[]|[0].power_state'
            )
          }}
      loop: "{{ servers.list }}"
      loop_control:
        label: "{{ item.vmname }}"
      vars:
        vminfo:
          results:
            - changed: false
              virtual_machines:
                - guest_name: Server1
                  guest_fullname: SUSE Linux Enterprise 11 (64-bit)
                  power_state: poweredOn
                - guest_name: Server2
                  guest_fullname: FreeBSD Pre-11 versions (64-bit)
                  power_state: poweredOn
Which yields the recap:
PLAY [localhost] **************************************************************************************************
TASK [read_csv] ***************************************************************************************************
ok: [localhost]
TASK [debug] ******************************************************************************************************
ok: [localhost] => (item=Server1) =>
msg: For Server1, the state is poweredOn
ok: [localhost] => (item=Server2) =>
msg: For Server2, the state is poweredOn
PLAY RECAP ********************************************************************************************************
localhost : ok=2 changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
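One caveat: if a CSV row has no match in vminfo, the [0].power_state selection renders empty. A small guard for that case (a sketch, not part of the original answer, reusing the same playbook variables) is the default filter:

- debug:
    msg: >-
      For {{ item.vmname }}, the state is {{
        vminfo
        | json_query('results[].virtual_machines[?guest_name == `' ~ item.vmname ~ '`]|[]|[0].power_state')
        | default('unknown', true) }}
  loop: "{{ servers.list }}"
  loop_control:
    label: "{{ item.vmname }}"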

Ansible error when using dict with csv: "The task includes an option with an undefined variable. The error was: 'dict object' has no attribute 'schema'"

I'm trying to get my playbook working with a CSV while using a dict, due to the nature of the data (each task asks for different parts of the data in a row, so I can't use a list?).
BD and EPG will always appear exactly once per file and can be used as a key if necessary.
I'm getting an error: "the task includes an option with an undefined variable". Since the variable (schema) appears as a column header in the CSV file, I must have some sort of syntax issue.
What I am trying to do is loop through the CSV file one row at a time and have schema: "{{ item.schema }}" evaluate to that particular row's value under the schema column, etc.
Playbook:
tasks:
  - name: Read CSV
    read_csv:
      # Name of the csv
      path: ./Create_EPGs_and_BDs.csv
      dialect: excel
      key: bd
    # Creates register value to be used later
    register: csv_data
  #
  ##################################################
  # Creates BD in MSO Template
  ##################################################
  #
  - name: Add a new BD
    cisco.mso.mso_schema_template_bd:
      <<: *aci_login
      state: present
      schema: "{{ item.schema }}"
      template: "{{ item.template }}"
      bd: "{{ item.bd }}"
      vrf:
        name: "{{ item.vrf }}"
    loop: "{{ csv_data.dict|dict2items }}"
  #
  ##################################################
  # Creates EPG in MSO Template
  ##################################################
  #
  - name: Add a new EPG
    cisco.mso.mso_schema_template_anp_epg:
      <<: *aci_login
      state: present
      schema: "{{ item.schema }}"
      template: "{{ item.template }}"
      anp: "{{ item.app_profile }}"
      epg: "{{ item.epg }}"
      bd:
        name: "{{ item.bd }}"
    loop: "{{ csv_data.dict|dict2items }}"
CSV File:
https://i.stack.imgur.com/VlyLP.jpg
What is the correct syntax to pull the csv values for the entire row and then let me access each column's value in the playbook for that particular row?
The dict2items filter is going to transpose your dict into a list of dicts, shaped like [{"key": "the-value-of-bd-for-that-row", "value": {"schema": "schema1", ...}}, ...]
Thus, you just need to add .value in between your item and the dict key it references:
- name: Add a new BD
  cisco.mso.mso_schema_template_bd:
    <<: *aci_login
    state: present
    schema: "{{ item.value.schema }}"
    template: "{{ item.value.template }}"
    # or, if you prefer, item.value.bd
    bd: "{{ item.key }}"
    vrf:
      name: "{{ item.value.vrf }}"
  loop: "{{ csv_data.dict|dict2items }}"
It's your code style, but you'll want to be very careful using only one space for YAML indentation, as it's very easy to make a mistake doing that, and it certainly makes asking questions on SO harder :-)
In the future, using debug: var=item will go a long way toward helping you understand the shape of your data when you encounter any such "has no attribute" error.
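For instance, a throwaway task like this (a sketch, not part of the original answer), placed just before the failing one, prints each loop item so the key/value nesting is obvious:

- name: Inspect the shape of each loop item (temporary troubleshooting task)
  debug:
    var: item
  loop: "{{ csv_data.dict | dict2items }}"
  loop_control:
    # keeps the task output labels short when rows are wide
    label: "{{ item.key }}"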

Read and use values from csv file

I'm struggling to find a solution for a lab project I'm working on right now. I'd like to use a CSV file to populate variables in my playbook when configuring Cisco ACI. I'm using the read_csv module and the latest Ansible 2.9.
Sample CSV:
tenant1;tenant1-vrf;tenant1-app
tenant1;tenant1-vrf2;tenant1-app2
tenant2;;tenant2-vrf2;tenant2-app2
UPDATE: based on Sai's code, I'm not far from reaching the objective. This is the full tasks code.
UPDATE 2: eventually I went back to the read_csv module. It works nicely even for complex things. I hope it helps someone as an example.
tasks:
  - name: Read tenant from CSV file and return a list
    read_csv:
      path: "{{ filename }}"
      delimiter: ;
    register: tenantconfig
  - name: TASK 1 - BUILD tenant
    aci_tenant:
      <<: *aci_login
      validate_certs: no
      use_ssl: yes
      tenant: "{{ item.tenant }}"
      description: "{{ item.tenant }} creation as per {{ filename }} source file"
      state: present
    with_items: "{{ tenantconfig.list }}"
  - name: TASK 2 - BUILD Routing {{ vrf }} for {{ tenant }} on {{ apic_host }}
    aci_vrf:
      <<: *aci_login
      state: present
      validate_certs: no
      use_ssl: yes
      tenant: "{{ item.tenant }}"
      vrf: "{{ item.vrf }}"
      description: "{{ item.vrf }}"
    with_items: "{{ tenantconfig.list }}"
I have changed the answer to dynamically process your input file and assign the tenant and vrf fields wherever you want to use them.
tasks:
  - name: split fields
    command: /usr/bin/awk -F';' '!/^#/ && !/^$/ { print $1 }' tenant1.csv
    register: tenants_out
  #- debug:
  #    msg: "{{ lookup('csvfile', item + ' file=tenant1.csv delimiter=; col=0') }}"
  #  with_items: "{{ tenants_out.stdout_lines }}"
  - name: TASK 1 - BUILD tenant
    aci_tenant:
      state: present
      tenant: "{{ lookup('csvfile', item + ' file=tenant1.csv delimiter=; col=0') }}"
      vrf: "{{ lookup('csvfile', item + ' file=tenant1.csv delimiter=; col=1') }}"
    with_items: "{{ tenants_out.stdout_lines }}"
The input file lines are split by the initial task, and you can directly specify the required tenant and vrf values using "with_items" looping. This is also useful if your input file has multiple lines.

How to loop through two lists & add conditional statement to execute something when one condition is true

I have a question: I've got the SID list and the DB open_mode, and I am trying to run a SQL script on the DB when the two conditions below are satisfied:
1. The DB name ends with '1'.
2. The DB open_mode is 'READ WRITE'.
I am using the Ansible dynamic inventory to get the SIDs from the host and loop through that list, but I am unable to get the two conditions I am adding to work.
- hosts: all
  gather_facts: false
  strategy: free
  tasks:
    - include_vars: roles/oracle/vars/install_vars.yaml
      vars:
        var_list:
          - script_name
    - set_fact:
        ORACLE_HOMES_DIR: "/u01/app/oracle/product"
        DB_HOME: "{{ ORACLE_HOMES_DIR }}/{{ ORACLE_VERSION }}/dbinst_1"
    - name: Copy script to host
      copy:
        src: "{{ playbook_dir }}/{{ script_name }}"
        dest: "/tmp/"
        owner: "{{ USER_ORACLE }}"
        group: "{{ GROUP_ORACLE }}"
        mode: 0755
    - name: Verify if the DB is open READ WRITE (or) not
      become_user: "{{ USER_ORACLE }}"
      environment:
        ORACLE_SID: "{{ sid }}"
        ORACLE_HOME: "{{ ORACLE_HOME }}"
      shell: "echo \"set pagesize 0\n select trim(open_mode) from v\\$database;\" | {{ORACLE_HOME}}/bin/sqlplus -S / as sysdba"
      with_items: "{{ hostvars[inventory_hostname]['sid_list'] }}"
      loop_control:
        loop_var: sid
      register: om
    - name: Get list of sid that are open in READ WRITE mode
      set_fact:
        sid_list: "{{ om.results | selectattr('sid','search','1$') | map(attribute='sid') | list }}"
    - name: Get the OPEN MODE output of the sid's from the list
      set_fact:
        om_out: "{{ om.results | selectattr('stdout') | map(attribute='stdout') | list }}"
    - name: execute sql script
      become_user: "{{ USER_ORACLE }}"
      environment:
        ORACLE_SID: "{{ item.0 }}"
        ORACLE_HOME: "{{ ORACLE_HOME }}"
      shell: "{{ ORACLE_HOME }}/bin/sqlplus / as sysdba #/tmp/{{ script_name }}"
      when: item.1 == 'READ WRITE'
      with_together:
        - "{{ sid_list }}"
        - "{{ om_out }}"
I expected the playbook to execute the SQL script on the DB, but I am getting an error saying "conditional result was False":
TASK [Get list of sid that are open in READ WRITE mode] ****************************************************************************************************************************************************
task path: /uhome/abhi/ansible/sql_script_execute.yaml:44
ok: [dwracdb1] => {
    "ansible_facts": {
        "sid_list": [
            "abhitest1",
            "dw1"
        ]
    },
    "changed": false
}
TASK [Get the SQL output from all the sid's] ***************************************************************************************************************************************************************
task path: /uhome/abhi/ansible/sql_script_execute.yaml:48
ok: [dwracdb1] => {
    "ansible_facts": {
        "om_out": [
            "READ WRITE",
            "READ WRITE"
        ]
    },
    "changed": false
}
TASK [Print om out] ****************************************************************************************************************************************************************************************
task path: /uhome/abhi/ansible/sql_script_execute.yaml:52
ok: [dwracdb1] => (item=[u'abhitest1', u'READ WRITE']) => {
    "msg": "sid output is abhitest1 om output is READ WRITE"
}
ok: [dwracdb1] => (item=[u'dw1', u'READ WRITE']) => {
    "msg": "sid output is dw1 om output is READ WRITE"
}
TASK [execute sql script] **********************************************************************************************************************************************************************************
task path: /uhome/abhi/ansible/sql_script_execute.yaml:61
fatal: [dwracdb1]: FAILED! => {
    "msg": "The conditional check 'item.1 == 'READ WRITE'' failed. The error was: error while evaluating conditional (item.1 == 'READ WRITE'): 'item' is undefined\n\nThe error appears to have been in '/uhome/abhi/ansible/sql_script_execute.yaml': line 61, column 5, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n\n - name: execute sql script\n ^ here\n"
}
First, a formatting hint: move the when condition on your block up to the top so that it's obvious what it controls. When you put it at the bottom, it's not obvious:
- block:
  when:
    - hostvars[inventory_hostname]['sid_list'] is defined
In this task, you're collecting multiple results (one for each entry in sid_list):
- name: Verify if the DB is open READ WRITE (or) not
  become_user: "{{ USER_ORACLE }}"
  environment:
    ORACLE_SID: "{{ sid }}"
    ORACLE_HOME: "{{ ORACLE_HOME }}"
  shell: "echo \"set pagesize 0\n select trim(open_mode) from v\\$database;\" | {{ORACLE_HOME}}/bin/sqlplus -S / as sysdba"
  with_items: "{{ hostvars[inventory_hostname]['sid_list'] }}"
  loop_control:
    loop_var: sid
  register: om
- name: Get list of sid that are open in READ WRITE mode
  set_fact:
    sid_list: "{{ om.results | selectattr('sid','search','1$') | map(attribute='sid') | list }}"
That's why, when you run this task, you end up with a list of results:
- name: Get the SQL output from all the sid's
  set_fact:
    om_out: "{{ om.results | selectattr(\"stdout\",'equalto','READ WRITE') | map(attribute='stdout') | list }}"
You're doing the correct thing here in your debug task using with_together: you need that in order to associate a result in om_out with one of the entries in sid_list:
- name: Print om out
  debug:
    msg: sid output is {{ item.0 }} om output is {{ item.1 }}
  with_together:
    - "{{ sid_list }}"
    - "{{ om_out }}"
You should do the same thing when trying to execute your sql script. Get rid of the block, because you only have a single task and you can't loop a block:
- name: execute sql script
  become_user: "{{ USER_ORACLE }}"
  environment:
    ORACLE_SID: "{{ sid.0 }}"
    ORACLE_HOME: "{{ ORACLE_HOME }}"
  shell: "{{ ORACLE_HOME }}/bin/sqlplus / as sysdba #/tmp/{{ script_name }}"
  when:
    - sid.1 == 'READ WRITE'
  with_together:
    - "{{ sid_list }}"
    - "{{ om_out }}"
  loop_control:
    loop_var: sid
In this loop, sid.0 will be the value from sid_list, and sid.1 will be the corresponding value from om_out.
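As a side note, if you prefer the newer loop keyword over with_together, the same pairing can be written with the zip filter. A sketch under the assumption that sid_list and om_out are equal-length lists (the shell line is copied unchanged from the task above):

- name: execute sql script (zip-based variant of the same pairing)
  become_user: "{{ USER_ORACLE }}"
  environment:
    ORACLE_SID: "{{ sid.0 }}"
    ORACLE_HOME: "{{ ORACLE_HOME }}"
  shell: "{{ ORACLE_HOME }}/bin/sqlplus / as sysdba #/tmp/{{ script_name }}"
  when: sid.1 == 'READ WRITE'
  loop: "{{ sid_list | zip(om_out) | list }}"
  loop_control:
    loop_var: sid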

Why is Ansible unable to read unicode string as JSON?

Summary
When retrieving data using the uri module in Ansible, I am unable to parse a section of it as JSON to retrieve a nested value.
The desired value is the ci field inside the content.data or json.data field (see output below).
Steps to Reproduce
site.yml
---
- hosts: localhost
  gather_facts: false
  tasks:
    - name: Get String
      uri:
        url: "http://localhost/get-data"
        method: POST
        body_format: json
        body: "{ \"kong-jid\": \"run-sn-discovery\" }"
        return_content: yes
      register: output
    - set_fact:
        ci: "{{ output.json.data.ci }}"
    - debug:
        msg: "{{ ci }}"
The {{ output }} variable
{
    u'status': 200,
    u'cookies': {},
    u'url': u'http://kong-demo:8000/get-data',
    u'transfer_encoding': u'chunked',
    u'changed': False,
    u'connection': u'close',
    u'server': u'kong/0.34-1-enterprise-edition',
    u'content': u'{"data":"\\"{u\'ci\': u\'3bb8d625dbac3700e4f07b6e0f96195b\'}\\""}',
    'failed': False,
    u'json': {u'data': u'"{u\'ci\': u\'3bb8d625dbac3700e4f07b6e0f96195b\'}"'},
    u'content_type': u'application/json',
    u'date': u'Thu, 18 Apr 2019 15:50:25 GMT',
    u'redirected': False,
    u'msg': u'OK (unknown bytes)'
}
Result
[user#localhost]$ ansible-playbook site.yml
[WARNING]: Could not match supplied host pattern, ignoring: all
[WARNING]: provided hosts list is empty, only localhost is available
PLAY [localhost] ***************************************************************************************************************
TASK [Pass Redis data to next task as output] **********************************************************************************
ok: [localhost]
TASK [set_fact] ****************************************************************************************************************
fatal: [localhost]: FAILED! => {}
MSG:
The task includes an option with an undefined variable. The error was: 'ansible.utils.unsafe_proxy.AnsibleUnsafeText object' has no attribute 'ci'
The error appears to have been in 'site.yml': line 19, column 7, but may
be elsewhere in the file depending on the exact syntax problem.
The offending line appears to be:
- set_fact:
^ here
exception type: <class 'ansible.errors.AnsibleUndefinedVariable'>
exception: 'ansible.utils.unsafe_proxy.AnsibleUnsafeText object' has no attribute 'ci'
Important Troubleshooting Information
It appears the root issue is related to which Ansible type is being interpreted. I want to parse ci from the output in one task.
The two-task solution shown below works, but this leads me to believe this should be possible in one line...
Two-Task Solution
- set_fact:
    ci: "{{ output.json.data | from_json }}"
- debug:
    msg: "{{ ci['ci'] }}"
But the ci fact set from {{ output.json.data | from_json }} reports a different TYPE than the inline type...
Unicode or Dict?
- debug:
    msg: "{{ output.json.data | from_json | type_debug }}" # returns unicode
- set_fact:
    ci: "{{ output.json.data | from_json }}"
- debug:
    msg: "{{ ci | type_debug }}" # returns dict
Why isn't {{ output.json.data | from_json | type_debug }} the same as {{ ci | type_debug }}?
Although json and data are keys in their respective objects, ci is just part of a larger string (one which happens to look like a JSON object).
If the relevant line in your data structure were:
u'json': {u'data': {'ci': u'3bb8d625dbac3700e4f07b6e0f96195b'}},
then you could expect to use "{{ output.json.data.ci }}", but not when the .ci part is just a normal part of a string.
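To make that point concrete, here is a small self-contained illustration (not from the original answer; the values are copied from the output above) of the two shapes being compared:

# Illustration only: contrast the shape the API returns with the shape
# that direct attribute access would require.
- set_fact:
    # what the API actually returns: 'data' maps to a plain string
    data_as_string: "{u'ci': u'3bb8d625dbac3700e4f07b6e0f96195b'}"
    # what it would have to return for a direct .ci lookup to resolve
    data_as_mapping:
      ci: "3bb8d625dbac3700e4f07b6e0f96195b"
- debug:
    msg: "{{ data_as_mapping.ci }}"  # resolves fine
- debug:
    msg: "{{ data_as_string.ci }}"   # reproduces the error: a string has no attribute 'ci'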