I have the following JSON file called cust.json:
{
  "customer": {
    "CUST1": {
      "zone": "ZONE1",
      "site": "ASIA"
    },
    "CUST2": {
      "zone": "ZONE2",
      "site": "EUROPE"
    }
  }
}
I am using this json file in my main.yml to get a list of customers (CUST1 and CUST2).
main.yml:
- name: Include the vars
  include_vars:
    file: "{{ playbook_dir }}/../default_vars/cust.json"
    name: "cust_json"

- name: Generate customer config
  include_tasks: create_config.yml
  loop: "{{ cust_json.customer }}"
I was hoping the loop would basically pass each customer's code (e.g. CUST1) to create_config.yml, so that something like the following can happen:
create_config.yml:
- name: Create customer config
  block:
    - name: create temporary file for customer
      tempfile:
        path: "/tmp"
        state: file
        prefix: "mycustomerconfig_{{ item }}."
        suffix: ".tgz"
      register: tempfile

    - name: Setup other things
      include_tasks: "othercustconfigs.yml"
Which would result in:
The following files being generated: /tmp/mycustomerconfig_CUST1 and /tmp/mycustomerconfig_CUST2
The tasks within othercustconfigs.yml being run for CUST1 and CUST2.
Questions:
Running the playbook, it fails at this point:
TASK [myrole : Generate customer config ] ************************************************************************************************************************************************************
fatal: [127.0.0.1]: FAILED! => {
"msg": "Invalid data passed to 'loop', it requires a list, got this instead: {u'CUST1': {u'site': u'ASIA', u'zone': u'ZONE1'}, u'CUST2': {u'site': u'EUROPE', u'zone': uZONE2'}}. Hint: If you passed a list/dict of just one element, try adding wantlist=True to your lookup invocation or use q/query instead of lookup."
}
How do I loop over the JSON so that it gets the list of customers (CUST1 and CUST2) correctly? loop: "{{ cust_json.customer }}" clearly doesn't work.
If I manage to get the above working, is it possible to pass the result of the loop to the next include_tasks: "othercustconfigs.yml"? So basically, passing the looped items from main.yml to create_config.yml, and then on to othercustconfigs.yml. Is this possible?
Thanks!!
J
cust_json.customer is a hashmap containing one key for each customer, not a list.
The dict2items filter can transform this hashmap into a list of elements, each containing a key and a value attribute, e.g.:
- key: "CUST1"
value:
zone: "ZONE1"
site: "ASIA"
- key: "CUST2"
value:
zone: "ZONE2"
site: "EUROPE"
With this in mind, you can transform your include to the following:
- name: Generate customer config
  include_tasks: create_config.yml
  loop: "{{ cust_json.customer | dict2items }}"
and the relevant task in your included file to:
- name: create temporary file for customer
  tempfile:
    path: "/tmp"
    state: file
    prefix: "mycustomerconfig_{{ item.key }}."
    suffix: ".tgz"
  register: tempfile
Of course, you can adapt all this to use the value element where needed, e.g. item.value.site.
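As a small, hedged illustration, an extra task inside create_config.yml (the task itself is hypothetical) could read the value side of the current entry like this:

- name: Show the site for the current customer (illustrative only)
  debug:
    msg: "Customer {{ item.key }} is hosted in {{ item.value.site }} ({{ item.value.zone }})"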
You can see the following documentation for in-depth info and alternative solutions:
https://docs.ansible.com/ansible/latest/user_guide/playbooks_filters.html#dict-filter
https://docs.ansible.com/ansible/latest/user_guide/playbooks_loops.html#iterating-over-a-dictionary
https://docs.ansible.com/ansible/latest/user_guide/playbooks_loops.html#with-dict
https://jinja.palletsprojects.com/en/2.11.x/templates/#dictsort
I'm parsing some values using json_query. After extracting the values I'm left with a list of lists, i.e. each element contains a list of values. My goal is to have a single flat list of values.
How can I achieve this?
E.g.:
my_list: [ [1,2],[3,4],[5,6] ]
Should become
my_list: [1,2,3,4,5,6]
I can't use my_list[0] | union(my_list[1]) | union(my_list[2]) because my_list is dynamic.
Use the flatten filter.
Given:
- debug:
    msg: "{{ [ [1,2],[3,4],[5,6] ] | flatten(1) }}"
This yields the expected list:
ok: [localhost] =>
  msg:
  - 1
  - 2
  - 3
  - 4
  - 5
  - 6
And since you state that you are using json_query, note that there is also a way to flatten in JMESPath, called a flatten projection, so you might bake this into your existing query.
As an example:
- debug:
    msg: "{{ [ [1,2],[3,4],[5,6] ] | json_query('[]') }}"
Will also yield the expected result.
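If the nested list already lives in a variable, as in your case, a minimal sketch (assuming the variable is called my_list) that flattens it in place would be:

- name: Flatten my_list one level in place
  set_fact:
    my_list: "{{ my_list | flatten(1) }}"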
You can use a custom filter plugin to handle this kind of Python-like job easily. To do this, create a folder named filter_plugins (make sure to use this reserved name) in the same folder as your playbook, and add your Python filter there.
$ tree
├── nested_list.yml
├── filter_plugins
│   └── nested_union.py
└── inventory
Make sure the filter contains the FilterModule class and filters method:
$ cat nested_union.py
class FilterModule(object):
    def filters(self):
        return {
            'nested_union': self.nested_union,
        }

    def nested_union(self, nested_list):
        return [x for inner_list in nested_list for x in inner_list]
Call the new filter from your Ansible playbook:
---
- name:
  hosts: local
  tasks:
    - name: Union merged lists
      vars:
        my_list: [ [1,2],[3,4],[5,6] ]
      set_fact:
        new_list: "{{ my_list | nested_union }}"
...
Here is the inventory file, just for reference and to complete the example:
[local]
127.0.0.1 ansible_connection=local
And here is the result of the execution:
$ ansible-playbook -i inventory nested_list.yml -v
-- snip --
TASK [Union merged lists]
ok: [127.0.0.1] => {"ansible_facts": {"new_list": [1, 2, 3, 4, 5, 6]}, "changed": false}
Scenario:
I have InSpec Profile-A (10 controls), Profile-B (15 controls), and Profile-C (5 controls).
Profile-A depends on Profile-B and Profile-C.
I have a file in Profile-A which I am parsing with inspec.profile.file('test.json') and executing the 10 controls in the same profile.
I have to pass the same file to Profile-B and Profile-C so that I can execute the other set of tests in each profile as part of the profile dependency.
I am able to successfully parse the test.json file in Profile-A, as the file is in the correct folder path:
myjson = json(content: inspec.profile.file('test.json'))
puts myjson
I have followed the inspec documentation to set up the profile dependency and inputs to the dependant profiles.
https://docs.chef.io/inspec/inputs/
Issue:
The issue is that I am able to pass single input values (like strings, arrays, etc.) to the dependent profiles, but I am not able to pass the entire JSON file so that it can be parsed and the controls executed.
I have tried the following in the profile metadata file
# ProfileB inspec.yml
name: profile-b
inputs:
  - name: file1
  - name: file2

# wrapper inspec.yml
name: profile-A
depends:
  - name: profile-b
    path: ../profile-b
inputs:
  - name: file1
    value: 'json(content: inspec.profile.file('test.json'))'
    profile: profile-b
  - name: file2
    value: 'FILE.read('/path/to/test.json')'
    profile: profile-b
Error:
When I try to load file1 and file2 in profile-b with the following:
jsonfile1 = input('file1')
jsonfile2 = input('file2')
puts jsonfile1
puts jsonfile2
error - no implicit conversion of nil to integer
Goal:
I should be able to pass the file from Profile-A to Profile-B or Profile-C so that the respective dependent profile controls are executed.
I have a variable template
var1.yml
variables:
  - name: TEST_DB_HOSTNAME
    value: 10.123.56.222
  - name: TEST_DB_PORTNUMBER
    value: 1521
  - name: TEST_USERNAME
    value: TEST
  - name: TEST_PASSWORD
    value: TEST
  - name: TEST_SCHEMANAME
    value: SCHEMA
  - name: TEST_ACTIVEMQNAME
    value: 10.123.56.223
  - name: TEST_ACTIVEMQPORT
    value: 8161
When I run the below pipeline
resources:
  repositories:
    - repository: templates
      type: git
      name: pipeline_templates
      ref: refs/heads/master

trigger:
  - none

variables:
  - template: templates/var1.yml#templates

pool:
  name: PoolA

steps:
  - pwsh: |
      Write-Host "${{ convertToJson(variables) }}"
I get the output
{
  build.sourceBranchName: master,
  build.reason: Manual,
  system.pullRequest.isFork: False,
  system.jobParallelismTag: Public,
  system.enableAccessToken: SecretVariable,
  TEST_DB_HOSTNAME: 10.123.56.222,
  TEST_DB_PORTNUMBER: 1521,
  TEST_USERNAME: TEST,
  TEST_PASSWORD: TEST,
  TEST_SCHEMANAME: SCHEMA,
  TEST_ACTIVEMQNAME: 10.123.56.223,
  TEST_ACTIVEMQPORT: 8161
}
How can I modify the pipeline to extract only the keys and values that start with "Test_" and store them in another variable in the same format, so they can be used in other tasks in the same pipeline?
Or iterate through the objects whose keys start with "Test_" and get their values?
The output you have shown is not valid JSON and cannot be processed as JSON. Assuming that it were valid JSON:
{
  "build.sourceBranchName": "master",
  "build.reason": "Manual",
  "system.pullRequest.isFork": "False",
  "system.jobParallelismTag": "Public",
  "system.enableAccessToken": "SecretVariable",
  "TEST_DB_HOSTNAME": "10.123.56.222",
  "TEST_DB_PORTNUMBER": 1521,
  "TEST_USERNAME": "TEST",
  "TEST_PASSWORD": "TEST",
  "TEST_SCHEMANAME": "SCHEMA",
  "TEST_ACTIVEMQNAME": "10.123.56.223",
  "TEST_ACTIVEMQPORT": 8161
}
then you can use the to_entries or with_entries filters of jq to get an object containing only those keys which start with "TEST_":
with_entries(select(.key|startswith("TEST_")))
This will give you a new object as output:
{
  "TEST_DB_HOSTNAME": "10.123.56.222",
  "TEST_DB_PORTNUMBER": 1521,
  "TEST_USERNAME": "TEST",
  "TEST_PASSWORD": "TEST",
  "TEST_SCHEMANAME": "SCHEMA",
  "TEST_ACTIVEMQNAME": "10.123.56.223",
  "TEST_ACTIVEMQPORT": 8161
}
The convertToJson() function is a bit messy, as the "JSON" it creates is not, in fact, valid JSON.
There are several possible approaches I can think of:
Use convertToJson() to pass the non-valid JSON to a script step, convert it to valid JSON there, and then extract the relevant values. I have done this before and it typically works if you have control over the data in the variables. The downside is the risk that the conversion to valid JSON can fail.
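As a rough sketch of that first approach (the file name vars.json and the preceding cleanup step are assumptions; the jq filter is the one from the answer above), a follow-up step could look like this:

steps:
  - bash: |
      # vars.json is assumed to already contain the cleaned-up, valid JSON
      filtered=$(jq -c 'with_entries(select(.key|startswith("TEST_")))' vars.json)
      echo "Filtered variables: $filtered"
      # Make the filtered JSON available to later tasks in the same job
      echo "##vso[task.setvariable variable=testVars]$filtered"
    displayName: Extract TEST_ variables with jq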
Create a YAML loop that iterates over the variables and extracts the ones that begin with Test_. You can find examples of how to write a loop here, but basically it would look like this:
- stage:
  variables:
    firstVar: 1
    secondVar: 2
    Test_thirdVar: 3
    Test_forthVar: 4
  jobs:
    - job: loopVars
      steps:
        - ${{ each var in variables }}:
            - script: |
                echo ${{ var.key }}
                echo ${{ var.value }}
              displayName: handling ${{ var.key }}
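If you only want the Test_ variables inside that loop, a hedged variation (untested sketch) is to wrap the inner step in a conditional insertion using the startsWith expression function:

steps:
  - ${{ each var in variables }}:
      - ${{ if startsWith(var.key, 'Test_') }}:
          - script: |
              echo ${{ var.key }}
              echo ${{ var.value }}
            displayName: handling ${{ var.key }}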
If applicable to your use case, you can create complex parameters (instead of variables) for only the Test_ variables. Using this, you can use the relevant values directly and do not need to extract a subset from your variable list. Note, however, that parameters are inputs to a pipeline and can be adjusted before execution. Example:
parameters:
  - name: non-test-variables
    type: object
    default:
      firstVar: 1
      secondVar: 2
  - name: test-variables
    type: object
    default:
      Test_thirdVar: 3
      Test_forthVar: 4
You can use these by referencing ${{ parameters.Test_thirdVar }} in the pipeline.
I'm calling a web service that returns some JSON. I want to conditionally run a subsequent task based on whether a particular name value is found.
For example, set a value nameExists if and only if the values array contains a name field of myDemo. So in this case, nameExists would be defined:
...
"failed": false,
"json": {
    "values": [
        {
            "id": "1234",
            "name": "myDemo"
        },
        {
            "id": "6789",
            "name": "myDemo2"
        }
    ]
},
"msg": "OK (100 bytes)"
...
Here's what I'm currently trying:
# Call API
- name: Call API
  uri:
    url: myURL
    method: POST
  register: apiCheckResult

- name: Debug Auto tags
  debug:
    msg: "{{ item.name }}"
  loop: "{{ apiCheckResult.json['values'] }}"
  when: item.name == "myDemo"
  register: tagExists
This works, in a way, but it gives me the full JSON output; all I need is a true/false.
Am I on the right track, or is there a better way to achieve this?
You don't (normally) use a debug task to set variables. You probably want to use set_fact. If I understand your question correctly, you want to set a boolean tagExists to true if one of the items in the values list of the API response contains the name myDemo. That might look like this:
- set_fact:
    tagExists: "{{ apiCheckResult.json|json_query('values[?name == `myDemo`]') }}"
"But wait!", you say, "that's not a boolean!". While you are correct, you can treat it like on. For example, after having set tagExists using that task, you could do this:
- debug:
    msg: "The tag exists!"
  when: tagExists
This works because a non-empty list evaluates as a true value in a boolean context (and an empty list evaluates as false). The json_query expression above returns a non-empty list when there is a match, and an empty list otherwise.
If you really want a boolean, you could do this instead:
- set_fact:
    tagExists: "{{ true if apiCheckResult.json|json_query('values[?name == `myDemo`]') else false }}"
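If you would rather not depend on json_query (which requires the jmespath Python library on the control node), a hedged alternative sketch using plain Jinja2 filters would be:

- set_fact:
    tagExists: "{{ apiCheckResult.json['values'] | selectattr('name', 'equalto', 'myDemo') | list | length > 0 }}"

This filters the values list down to items whose name equals myDemo and checks whether anything is left, so tagExists ends up as a real boolean.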
I have been stuck trying to get a particular JSON object when the value of a key matches a variable (string).
My json file looks like this:
"totalRecordsWithoutPaging": 1234,
"jobs": [
{
"jobSummary": {
"totalNumOfFiles": 0,
"jobId": 8035,
"destClientName": "BOSDEKARLSSP010",
"destinationClient": {
"clientId": 10,
"clientName": "BOSDEKARLSSP010"
}
}
},
{
"jobSummary": {
"totalNumOfFiles": 0,
"jobId": 9629,
"destClientName": "BOSDEKARLSSP006",
"destinationClient": {
"clientId": 11,
"clientName": "BOSDEKARLSSP006"
}
}
},
.....
]
}
I read this JSON with result: "{{ lookup('file','CVExport-short.json') | from_json }}" and I can get a single destClientName value with the following code:
- name: Iterate JSON
  set_fact:
    app_item: "{{ item.jobSummary }}"
  with_items: "{{ result.jobs }}"
  register: app_result

- debug:
    var: app_result.results[0].ansible_facts.app_item.destClientName
My goal is to get the value of jobId if the value of destClientName matches some other variable or string in any jobSummary.
I don't have much knowledge of Ansible yet, so any help would be much appreciated.
Update
Ok, I have found one solution.
- name: get job ID
  set_fact:
    job_id: "{{ item.jobSummary.jobId }}"
  with_items: "{{ result.jobs }}"
  when: item.jobSummary.destClientName == '{{ target_vm }}'

- debug:
    msg: "{{ job_id }}"
But I think there might be a better solution than this. Any idea how?
Ansible's json_query filter lets you perform complex filtering of JSON documents by applying JMESPath expressions. Rather than looping over the jobs in the result, you can get the information you want in a single step.
We want to query all jobs which have a destClientName that matches the value in target_vm. Using literal values, the expression yielding that list of jobs would look like this:
jobs[?jobSummary.destClientName == `BOSDEKARLSSP006`]
The result of this, when applied to your sample data, would be:
[
    {
        "jobSummary": {
            "totalNumOfFiles": 0,
            "jobId": 9629,
            "destClientName": "BOSDEKARLSSP006",
            "destinationClient": {
                "clientId": 11,
                "clientName": "BOSDEKARLSSP006"
            }
        }
    }
]
From this result, you want to extract the jobId, so we rewrite the expression like this:
jobs[?jobSummary.destClientName == `BOSDEKARLSSP006`]|[0].jobSummary.jobId
Which gives us:
9629
To make this work in a playbook, you'll want to replace the literal hostname in this expression with the value of your target_vm variable. Here's a complete playbook that demonstrates the solution:
---
- hosts: localhost
  gather_facts: false

  # This is just the sample data from your question.
  vars:
    target_vm: BOSDEKARLSSP006
    results:
      totalRecordsWithoutPaging: 1234
      jobs:
        - jobSummary:
            totalNumOfFiles: 0
            jobId: 8035
            destClientName: BOSDEKARLSSP010
            destinationClient:
              clientId: 10
              clientName: BOSDEKARLSSP010
        - jobSummary:
            totalNumOfFiles: 0
            jobId: 9629
            destClientName: BOSDEKARLSSP006
            destinationClient:
              clientId: 11
              clientName: BOSDEKARLSSP006

  tasks:
    - name: get job ID
      set_fact:
        job_id: "{{ results|json_query('jobs[?jobSummary.destClientName == `{}`]|[0].jobSummary.jobId'.format(target_vm)) }}"

    - debug:
        var: job_id
Update re: your comment
The {} in the expression is a Python string formatting sequence that is filled in by the call to .format(target_vm). In Python, the expression:
'The quick brown {} jumped over the lazy {}.'.format('fox', 'dog')
Would evaluate to:
The quick brown fox jumped over the lazy dog.
And that's exactly what we're doing in that set_fact expression. I could instead have written:
job_id: "{{ results|json_query('jobs[?jobSummary.destClientName == `' ~ target_vm ~ '`]|[0].jobSummary.jobId') }}"
(Where ~ is the Jinja stringifying concatenation operator)