Ansible: parsing/concatenating output of getent module

I am trying to set up a chroot for SFTP users so that they can see user/group names in ls -l output, as per this article. To this end I need to get the output of the getent command and place it into the /chroots/{{ user.username }}/etc/passwd file.
I am trying to use Ansible to replace the command getent passwd sftpuser > /chroots/sftpuser/etc/passwd as follows:
- name: get {{ user.username }} user info
  getent:
    database: passwd
    key: "{{ user.username }}"

- debug:
    var: getent_passwd

- name: create /chroots/{{ user.username }}/etc/passwd file
  lineinfile:
    path: /chroots/{{ user.username }}/etc/passwd
    line: "{{ getent_passwd | from_json }}"
    state: present
    create: yes
    owner: root
    group: root
    mode: '0644'
The getent_passwd variable looks as follows:
ok: [cf1] => {
    "getent_passwd": {
        "testuser1": [
            "x",
            "1001",
            "1002",
            "",
            "/home/testuser1",
            "/usr/sbin/nologin"
        ]
    }
}
But I get this error:

FAILED! => {"failed": true, "msg": "Unexpected templating type error occurred on ({{ getent_passwd | from_json }}): expected string or buffer"}
What is the proper way to get those values supplied by getent_passwd into one flat string joined by ":"?
Is it safe to use the getent module with key: "root" this way, instead of echo "root:x:0:0:not really root:::" >> /chroots/sftpuser/etc/passwd?
One can run getent passwd user1 user2 - is it possible to supply two keys to Ansible's getent module somehow?

What is the proper way to get those values supplied by getent_passwd into one flat string joined by ":"?
For example, using a Jinja2 expression with the join filter:

- debug:
    msg: "{{ user.username }}:{{ getent_passwd[user.username] | join(':') }}"
One can run getent passwd user1 user2 - is it possible to supply two keys to Ansible's getent module somehow?
No. Either a single one or all.
Use an outer loop to request values in the first case, or filter the resulting list in the second; a sketch of the outer loop follows.
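A minimal sketch of the outer-loop approach, assuming a hypothetical sftp_users list variable. Note that each getent call replaces the getent_passwd fact, so the per-item registered result is read back instead:

- name: look up each sftp user
  getent:
    database: passwd
    key: "{{ item }}"
  loop: "{{ sftp_users }}"
  register: getent_results

- name: show each entry as a passwd-style line
  debug:
    msg: "{{ item.item }}:{{ item.ansible_facts.getent_passwd[item.item] | join(':') }}"
  loop: "{{ getent_results.results }}"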


How can I retrieve COUNT(*) values from the Ansible mysql_query module?

I am using Ansible and community.mysql.mysql_query to perform some sanity checks on my database.
I have already figured out that I need to register the output, and that the output holds a parameter named query_result that contains the returned data.
My problem is that all the examples are for a standard SELECT, in which you access the data with param.query_result['column'], but mine has a COUNT(*).
My output for this debug:

- name: debug in db role
  debug:
    msg: |
      result : {{ first_query.query_result }}

is:
ok: [localhost] => {
    "msg": "result : [[{u'COUNT(*)': 16}]]\n"
}
Since the column name contains *, I cannot figure out how to access it in the playbook.
Any thoughts on how I can accomplish this and actually use that count value of 16?
Thanks
That was fast on my part ...
- name: debug in db role
  debug:
    msg: |
      result : {{ first_query.query_result[0][0]['COUNT(*)'] }}
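Bracket notation works because the column name is just a dictionary key. An alternative that sidesteps the awkward key entirely is to alias the count in the SQL itself; a minimal sketch, assuming the community.mysql collection is installed (the database and table names are hypothetical):

- name: count rows with a friendly column name
  community.mysql.mysql_query:
    login_db: mydb                             # hypothetical database
    query: SELECT COUNT(*) AS cnt FROM mypods  # alias avoids the '*' key
  register: first_query

- debug:
    msg: "result : {{ first_query.query_result[0][0]['cnt'] }}"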

Ansible: create a function

I found this page, from answer number 4 by @cobbzilla, useful to my use case.
I just want to ask whether Ansible is capable of having a function that will handle this command suffix:
2>&1 >> /tmp/debug.log
I have already applied this solution to my YAML files, and I was wondering whether the command could be wrapped in a function so that the playbook reads much cleaner.
My sample Ansible YAML:
- name: Deploy
  hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - name: perform 1st script
      shell: bash -c "1st_script.sh 2>&1 >> /var/tmp/debug.log"
    - name: perform 2nd script
      shell: bash -c "2nd_script.sh 2>&1 >> /var/tmp/debug.log"
    - name: perform 3rd script
      shell: bash -c "3rd_script.sh 2>&1 >> /var/tmp/debug.log"
Preliminary note: I am answering your direct question below because this can be useful in other circumstances.
Meanwhile, in the Ansible context, running bash scripts and redirecting their output and errors to a log file on the target is generally not a good idea. You will be left blind if something goes wrong, or if you want to analyze the output, as the module will return an empty stderr and stdout. You will have to rely on analyzing the log file later (and on finding the correct output, since you mix all the scripts' output in there).
On an even wider level, you should use shell/command only when there is no other possibility to get the same job done using existing modules.
If you don't mind writing a few lines of Python, a pretty easy and straightforward way to achieve your goal is to use a custom filter. The example below is quick and dirty. You will probably have to harden its code (escape command characters, check for specific errors...), but this should put you on track.
For the example, I am creating the filter in the filter_plugins folder adjacent to the demo playbook. If you need to distribute that filter across projects, search the Ansible documentation to learn how to wrap it in a role or a collection.
In filter_plugins/shell_filters.py:

def bash_n_log(command, log_file='/var/tmp/debug.log'):
    """Return a formatted string to play the script in bash and log its output."""
    return f'bash -c "{command} 2>&1 >> {log_file}"'


class FilterModule(object):
    """Collection of shell utility filters."""

    def filters(self):
        """Return the filter list."""
        return {
            'bash_n_log': bash_n_log
        }
Then the demo playbook.yml:

- name: Custom shell filter demo
  hosts: localhost
  gather_facts: false
  vars:
    my_commands:
      - echo Hello World
      - ls -l /dev/null
  tasks:
    - name: Play my commands with my filter
      shell: "{{ item | bash_n_log }}"
      loop: "{{ my_commands }}"

    - name: Same example with non default log file
      shell: "{{ item | bash_n_log('/tmp/other.log') }}"
      loop: "{{ my_commands }}"

    - name: Get content of the log files
      slurp:
        path: "{{ item }}"
      register: slurped_logs
      loop:
        - /var/tmp/debug.log
        - /tmp/other.log

    - name: Show log file content
      debug:
        msg: "{{ (item.content | b64decode).split('\n') }}"
      loop: "{{ slurped_logs.results }}"
      loop_control:
        label: "{{ item.item }}"
gives:

$ ansible-playbook playbook.yml

PLAY [Custom shell filter demo] ************************************

TASK [Play my commands with my filter] *****************************
changed: [localhost] => (item=echo Hello World)
changed: [localhost] => (item=ls -l /dev/null)

TASK [Same example with non default log file] **********************
changed: [localhost] => (item=echo Hello World)
changed: [localhost] => (item=ls -l /dev/null)

TASK [Get content of the log files] ********************************
ok: [localhost] => (item=/var/tmp/debug.log)
ok: [localhost] => (item=/tmp/other.log)

TASK [Show log file content] ***************************************
ok: [localhost] => (item=/var/tmp/debug.log) => {
    "msg": [
        "Hello World",
        "crw-rw-rw- 1 root root 1, 3 Mar 27 12:06 /dev/null",
        ""
    ]
}
ok: [localhost] => (item=/tmp/other.log) => {
    "msg": [
        "Hello World",
        "crw-rw-rw- 1 root root 1, 3 Mar 27 12:06 /dev/null",
        ""
    ]
}

PLAY RECAP *********************************************************
localhost : ok=4 changed=2 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0
I just tried this and the simple solution below worked: I added a vars section which points a variable at the command suffix.
- name: Deploy
  hosts: localhost
  connection: local
  gather_facts: false
  vars:
    logger: "2>&1 >> /var/tmp/debug.log"
  tasks:
    - name: perform 1st script
      shell: bash -c "1st_script.sh {{ logger }}"
    - name: perform 2nd script
      shell: bash -c "2nd_script.sh {{ logger }}"
    - name: perform 3rd script
      shell: bash -c "3rd_script.sh {{ logger }}"

Ansible: Invalid JSON when using --extra-vars

Hi community,
I have been struggling with an Ansible issue for days now.
Everything is executed within a Jenkins pipeline.
The Ansible command looks like:
sh """
ansible-playbook ${env.WORKSPACE}/cost-optimization/ansible/manage_dynamo_db.yml \
--extra-vars '{"projectNameDeployConfig":${projectNameDeployConfig},"numberOfReplicas":${numberOfReplicas},"dynamodbtask":${dynamodbtask}}'
"""
And the playbook is:

playbook.yml

---
- hosts: localhost
  vars:
    numberOfReplicas: "{{ numberOfReplicas }}"
    dynamodbtask: "{{ dynamodbtask }}"
    namespace: "{{ projectNameDeployConfig }}"
    status: "{{ status }}"
  tasks:
    - name: "Get replica number for the pods"
      command: aws dynamodb put-item --table-name pods_replicas
      register: getResult
      when: dynamodbtask == "get"

    - name: "Update replica number for specified pods"
      command: |
        aws dynamodb put-item
        --table-name pods_replicas
        --item '{"ProjectNameDeployConfig":{"S":{{ namespace }}},"NumberReplicas":{"N":{{ numberOfReplicas }}}}'
      register: updatePayload
      when: dynamodbtask == "put" and getResult is skipped
However, there is always the following error:
fatal: [localhost]: FAILED! => {"changed": true, "cmd": ["aws", "dynamodb", "put-item", "--table-name",
"pods_replicas", "--item", "{\"ProjectNameDeployConfig\":{\"S\":LERN-PolicyCenterV10},\"NumberReplicas\":
{\"N\":0}}"], "delta": "0:00:01.702107", "end": "2020-02-09 16:58:26.055579",
"msg": "non-zero return code", "rc": 255, "start": "2020-02-09 16:58:24.353472", "stderr": "\nError parsing parameter '--item': Invalid JSON: No JSON object could be decoded\nJSON received: {\"ProjectNameDeployConfig\":{\"S\":LERN-PolicyCenterV10},\"NumberReplicas\":{\"N\":0}}", "stderr_lines": ["", "Error parsing parameter '--item': Invalid JSON: No JSON object could be decoded", "JSON received: {\"ProjectNameDeployConfig\":{\"S\":LERN-PolicyCenterV10},\"NumberReplicas\":{\"N\":0}}"], "stdout": "", "stdout_lines": []}
There are two answers to your question: the simple one and the correct one.
The simple one is that, had you actually fed the JSON into jq or python -m json.tool, you would have observed that namespace is unquoted:

"{\"ProjectNameDeployConfig\":{\"S\": LERN-PolicyCenterV10 },\"NumberReplicas\": {\"N\":0}}"

where I added a huge amount of space but didn't otherwise alter the quotes.
The correct answer is that you should never use Jinja2 to try to assemble structured text when there are filters that do it for you.
What you actually want is the to_json filter:
- name: "Update replica number for specified pods"
command: |
aws dynamodb put-item
--table-name pods_replicas
--item {{ dynamodb_item | to_json | quote }}
vars:
dynamodb_item:
"ProjectNameDeployConfig":
"S": '{{ projectNameDeployConfig }}'
"NumberReplicas":
"N": 0
register: updatePayload
when: dynamodbtask == "put" and getResult is skipped
You'll notice that I changed your variable name, because namespace is the name of a type in Jinja2. You can either call it something like ns, or, as I did here, just use the interpolation value from the vars: block at the top of your playbook, since it doesn't appear to have changed in between.
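To see what the shell actually receives, here is a quick sanity check (a sketch reusing the same vars):

- name: Show the rendered --item argument
  debug:
    msg: "{{ dynamodb_item | to_json | quote }}"
  vars:
    dynamodb_item:
      "ProjectNameDeployConfig":
        "S": '{{ projectNameDeployConfig }}'
      "NumberReplicas":
        "N": 0

With projectNameDeployConfig set to LERN-PolicyCenterV10, this renders a properly quoted JSON object whose string value is wrapped in double quotes, which is exactly what the aws CLI failed to receive before.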

Ansible "set_fact" repository url from json file using filters like "from_json"

Using Ansible "set_fact" module, I need to get repository url from json file using filters like "from_json". I tried in couple ways, and still doesn't get it how is should work.
- name: initial validation
  tags: bundle
  hosts: localhost
  connection: local
  tasks:
    - name: register bundle version_file
      include_vars:
        file: '/ansible/playbook/workbench-bundle/bundle.json'
      register: bundle

    - name: debug registered bundle file
      debug:
        msg: '{{ bundle }}'
I get the JSON that I wanted:
TASK [debug registered bundle file] ************************************
ok: [127.0.0.1] => {
    "msg": {
        "ansible_facts": {
            "engine-config": "git#bitbucket.org/engine-config.git",
            "engine-monitor": "git#bitbucket.org/engine-monitor.git",
            "engine-server": "git#bitbucket.org/engine-server.git",
            "engine-worker": "git#bitbucket.org/engine-worker.git"
        },
        "changed": false
    }
}
Then I'm trying to select each value by key name, to use the value as the URL for npm install of each package on separate instances.
- name: set_fact some parameter
  set_fact:
    engine_url: "{{ bundle.('engine-server') | from_json }}"
And then I get this error:

fatal: [127.0.0.1]: FAILED! => {"failed": true, "msg": "template error while templating string: expected name or number. String: {{ bundle.('engine-server') }}"}

I tried many other ways, like this lookup, and it still fails with other errors. Can someone help me understand how I can find each parameter and store it with set_fact? Thanks
Here is a sample working code to set a variable like in the question (although I don't see much sense in it):
- name: initial validation
  tags: bundle
  hosts: localhost
  connection: local
  tasks:
    - name: register bundle version_file
      include_vars:
        file: '/ansible/playbook/workbench-bundle/bundle.json'
        name: bundle

    - debug:
        var: bundle

    - debug:
        var: bundle['engine-server']

    - name: set_fact some parameter
      set_fact:
        engine_url: "{{ bundle['engine-server'] }}"
The above assumes your input data (which you did not include) is:

{
    "engine-config": "git#bitbucket.org/engine-config.git",
    "engine-monitor": "git#bitbucket.org/engine-monitor.git",
    "engine-server": "git#bitbucket.org/engine-server.git",
    "engine-worker": "git#bitbucket.org/engine-worker.git"
}
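If the end goal is to npm install each package, one fact per key isn't strictly needed; here is a minimal sketch looping over the dict directly (assuming the same bundle.json content and that npm can resolve those URLs):

- name: npm install each package
  command: "npm install {{ item.value }}"
  loop: "{{ bundle | dict2items }}"
  loop_control:
    label: "{{ item.key }}"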

Filter a JSON document in Ansible

I have a JSON reply from a GitHub repository with a list of possible downloads for a certain release (the assets array in the document).
I want to get the browser download URL when the name of an asset ends with x64.AppImage.
In Ansible, the filters are built upon JMESPath, and using its terminal tool I can query the URL with the following expression:

assets[?ends_with(name, 'x64.AppImage')].browser_download_url

With the following playbook, the JSON document is queried and stored in the json_reply variable.
---
- hosts: local
  tasks:
    - name: Get list of Rambox releases
      uri:
        url: "https://api.github.com/repos/saenzramiro/rambox/releases/latest"
        body_format: json
      register: json_reply

    - name: Filter reply
      debug:
        msg: "URL -> {{ item }}"
      with_items:
        - "{{ json_reply.json | json_query(json_filter) }}"
      vars:
        - json_filter: assets[?ends_with(name, 'x64.AppImage')].browser_download_url
However, executing this gives the following error:
fatal: [localhost]: FAILED! => {
    "msg": "JMESPathError in json_query filter plugin:\nIn function ends_with(), invalid type for value: latest-mac.json, expected one of: ['string'], received: \"unknown\""
}
Where latest-mac.json is the first object in the assets array.
How can I make Ansible to iterate over all the assets array and apply my filter?
PS:
If, instead of testing whether the name ends with a word, I specify it directly, the filter works:

assets[?name == 'Rambox-0.5.13-x64.AppImage'].browser_download_url
JSON example:

{
  "url": "https://api.github.com/repos/saenzramiro/rambox/releases/8001922",
  "prerelease": false,
  "created_at": "2017-10-04T21:14:15Z",
  "published_at": "2017-10-05T01:10:55Z",
  "assets": [
    {
      "url": "https://api.github.com/repos/saenzramiro/rambox/releases/assets/4985942",
      "id": 4985942,
      "name": "latest-mac.json",
      "uploader": {
        "login": "saenzramiro",
        "id": 2694669
      },
      "browser_download_url": "https://github.com/saenzramiro/rambox/releases/download/0.5.13/latest-mac.json"
    },
    {
      "url": "https://api.github.com/repos/saenzramiro/rambox/releases/assets/4985640",
      "id": 4985640,
      "name": "Rambox-0.5.13-x64.AppImage",
      "uploader": {
        "login": "saenzramiro",
        "id": 2694669
      },
      "browser_download_url": "https://github.com/saenzramiro/rambox/releases/download/0.5.13/Rambox-0.5.13-x64.AppImage"
    }
  ],
  "tarball_url": "https://api.github.com/repos/saenzramiro/rambox/tarball/0.5.13"
}
The problem of type errors in JMESPath filters is discussed in issue 27299.
You can use the patched json_query.py filter plugin from that issue.
Or apply a double conversion to your object as a workaround: | to_json | from_json.
This converts the object to JSON (thus plain strings) and back, so json_query will treat the strings as a supported type.
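Applied to the playbook from the question, the workaround looks like this (a sketch using the same variables):

- name: Filter reply
  debug:
    msg: "URL -> {{ item }}"
  with_items: "{{ json_reply.json | to_json | from_json | json_query(json_filter) }}"
  vars:
    json_filter: assets[?ends_with(name, 'x64.AppImage')].browser_download_url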
Solution not using JMESPath:

Loop through each asset.
Print the browser URL of the item if it ends with x64.AppImage.
- name: Filter reply
  debug:
    var: item.browser_download_url
  with_items: "{{ json_reply.json.assets }}"
  when: item.browser_download_url | regex_search('x64.AppImage$')
As @helloV said, you can accomplish this using Ansible loops, although there's no reason to involve a regular-expression match. You can use the same test you're already using:
- name: Filter reply
  debug:
    var: item.browser_download_url
  with_items: "{{ json_reply.json.assets }}"
  when: item.name.endswith('x64.AppImage')
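This works because Jinja2 expressions can call Python string methods directly; endswith also avoids having to escape the dot and anchor that the regular-expression variant needs.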
The root problem would appear to be an Ansible bug. The error comes from the following check in the jmespath library:

if actual_typename not in allowed_types:
    raise exceptions.JMESPathTypeError(
        function_name, current,
        self._convert_to_jmespath_type(actual_typename), types)

At the point this code is called, the data type of the values in your JSON response is AnsibleUnsafeText, whereas allowed_types is [str, unicode]. The transformation of values from native types to AnsibleUnsafeText is probably standard Ansible module behavior being imposed by the uri module. We can work around it by using curl instead, like this:
- name: Get list of Rambox releases
  command: >
    curl -s "https://api.github.com/repos/saenzramiro/rambox/releases/latest"
  register: json_reply

And then:

- name: Filter reply
  debug:
    var: item.browser_download_url
  with_items: >
    {{ json_reply.stdout | from_json | json_query('assets[?ends_with(name, `x64.AppImage`)]') }}
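Note the backticks around x64.AppImage: inside json_query, JMESPath treats backtick-quoted tokens as literal values, which sidesteps the quote-nesting problem between YAML, Jinja2, and JMESPath.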