Leverage key and value from matrix object in GitHub actions - json

Assuming I have a step that produces the following output
{
  "foo": 1,
  "bar": 2,
  "baz": 3
}
How can I then use / access both the keys and values of the object above in a matrix strategy, i.e. something like
build-python-images:
  needs:
    - create-python-matrix
  strategy:
    matrix:
      python-builds: ${{ fromJSON(needs.create-python-matrix.outputs.python_builds) }}
  uses: Org/repo/.github/workflows/reusable.yaml@master
  with:
    someinput: ${{ matrix.python-builds.key }}
    anotherinput: ${{ matrix.python-builds.value }}
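One possible approach (a sketch; the step id emit-matrix and the inline jq call are assumptions, not from the question) is to have the producing step emit the object as an array of key/value pairs with jq's to_entries, so each matrix entry exposes both fields:
- id: emit-matrix
  run: |
    # Hypothetical producing step: turn {"foo": 1, "bar": 2, "baz": 3} into
    # [{"key":"foo","value":1}, ...] so each matrix entry exposes .key and .value.
    echo "python_builds=$(echo "$RAW_JSON" | jq -c 'to_entries')" >> "$GITHUB_OUTPUT"
  env:
    RAW_JSON: '{"foo": 1, "bar": 2, "baz": 3}'
The job would still need to map this step output to a job-level output (e.g. outputs: python_builds: ${{ steps.emit-matrix.outputs.python_builds }}), after which the matrix and with usage above should resolve ${{ matrix.python-builds.key }} and ${{ matrix.python-builds.value }} for each entry.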

Related

How to pass Github secrets to JSON file

I wanted to pass secrets from a GitHub action to a JSON file in the same workflow.
# Github secrets
SECRET_TOKEN: 4321
In file.json the SECRET_TOKEN value needs to be fetched.
# file.json
{
  secret_token: "SECRET_TOKEN", # should fetch SECRET_TOKEN from GitHub Actions
  apiId: "blabla"
}
Expected Output:
# file.json
{
  secret_token: "4321",
  apiId: "blabla"
}
You have several options: you can use pure bash and jq to achieve this, or, if you prefer something simpler, use one of the existing actions from the Marketplace, such as this one:
https://github.com/marketplace/actions/create-json
- name: create-json
  uses: jsdaniell/create-json@1.1.2
  with:
    name: "file.json"
    json: '{"app":"blabla", "secret_token":"${{ secrets.SECRET_TOKEN }}"}'
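If you prefer the pure bash and jq route mentioned above, a minimal sketch (the step name and the exact JSON shape are assumptions) could look like:
- name: create file.json with jq
  run: |
    # --arg passes the secret in as a jq variable, so it is quoted and
    # escaped correctly instead of being spliced into the JSON text.
    jq -n --arg token "$SECRET_TOKEN" '{secret_token: $token, apiId: "blabla"}' > file.json
  env:
    SECRET_TOKEN: ${{ secrets.SECRET_TOKEN }}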
I would suggest using the replace-tokens action. As an example, suppose this JSON file:
file.json
{
  secret_token: "#{SECRET_TOKEN}#",
  apiId: "blabla"
}
the action:
- uses: cschleiden/replace-tokens@v1
  with:
    files: 'file.json'
  env:
    SECRET_TOKEN: ${{ secrets.SECRET_TOKEN }}
If you want to use a different token format, you can specify a custom token prefix/suffix. For example, to replace tokens like `{SECRET_TOKEN}` you could add:
- uses: cschleiden/replace-tokens@v1
  with:
    files: 'file.json'
    tokenPrefix: '{'
    tokenSuffix: '}'
  env:
    SECRET_TOKEN: ${{ secrets.SECRET_TOKEN }}

Can't extract data from nested array in Ansible

I am trying to extract the values of vs_name for every item in the array, but it looks like I am doing something wrong and I can't figure out what.
Here is the output I want to parse
ok: [localhost] => {
    "msg": {
        "AV-FAS": {
            "vs_name": "AV-FAS",
            "vs_type": "admin"
        },
        "AV-FAS-01": {
            "vs_name": "AV-FAS-01",
            "vs_type": "node"
        },
        "AV-FAS-02": {
            "vs_name": "AV-FAS-02",
            "vs_type": "node"
        }
    }
}
Here is my code:
- name: populate vs list
  set_fact:
    vs_list: "{{ vs_list|default([]) }} + [ '{{ item.vs_name }}' ]"
  with_items: "{{ output }}"
Q: "Extract the values of vs_name."
A: Use the json_query filter. For example
- set_fact:
    vs_list: "{{ output|json_query('*.vs_name') }}"
gives
"vs_list": [
    "AV-FAS",
    "AV-FAS-01",
    "AV-FAS-02"
]
The next option is to map an attribute. The dict2items filter is needed to convert the dictionary into a list. For example, the task below gives the same result:
- set_fact:
    vs_list: "{{ output|dict2items|
                 map(attribute='value.vs_name')|
                 list }}"
Q: "List vs_name when vs_type = 'admin'"
A: Add the selectattr filter to the pipe. For example
- set_fact:
    vs_list: "{{ output|dict2items|
                 selectattr('value.vs_type', 'eq', 'admin')|
                 map(attribute='value.vs_name')|
                 list }}"
gives
"vs_list": [
    "AV-FAS"
]

how to iterate over list from json in ansible

I tried to register an output to a variable, but I couldn't filter it the way I want.
output:
oc get hpa -o json |jq -r '.items[].spec'
{
  "maxReplicas": 3,
  "minReplicas": 1,
  "scaleTargetRef": {
    "apiVersion": "apps.openshift.io/v1",
    "kind": "DeploymentConfig",
    "name": "hello-openshift"
  },
  "targetCPUUtilizationPercentage": 70
}
{
  "maxReplicas": 4,
  "minReplicas": 2,
  "scaleTargetRef": {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "name": "testrhel"
  },
  "targetCPUUtilizationPercentage": 79
}
Register the output to variable
- name: check for existing
  shell: oc get hpa -o json |jq -r '.items[].spec'
  register: existing
I would like to loop over the output and compare its name to another variable.
- name: set_fact
  exist: {% if item.name == newvar and item.kind == newvar2 %}yes{%else%}no{%endif%}
  loop:
    - "{{ existing }}"

- name: task
  shell: do something
  when: exist == yes
Thanks in advance.
Edit:
Currently I am using the tasks below to get my comparison of the variables.
- name: Get existing hpa output
  shell: oc get hpa -o json -n {{ namespace }} |jq -r '.'
  register: tempvar

- name: set hpa variable to fact
  set_fact:
    existing_deploy: "{{ tempvar.stdout }}"

- name: Comparing existing hpa to new config
  set_fact:
    hpa_exist: "{% if deploy_type == item.spec.scaleTargetRef.kind|lower and deploy_name == item.spec.scaleTargetRef.name|lower %}yes{% else %}no{% endif %}"
  with_items:
    - "{{ existing_deploy['items'] }}"
But the variable gets overwritten when I try to use the when condition:
- name: task a
  include_tasks: a.yml
  when: hpa_exist

- name: task b
  include_tasks: b.yml
  when: not hpa_exist
The deploymentconfig/hello-openshift condition always fails even when it is true, leading to task b being executed, which is not supposed to happen.
Check out the documentation of the shell module.
The output of the shell command on stdout will be in <var>.stdout (so in existing.stdout in your case).
Once you have that, you have the JSON as text, but you want to parse it. To do that, use the from_json filter as shown in this answer.
Summa summarum, your task should look like this:
- name: set_fact
  set_fact:
    exist: "{% if item['scaleTargetRef']['name'] == newvar and item['scaleTargetRef']['kind'] == newvar2 %}yes{% else %}no{% endif %}"
  loop: "{{ existing.stdout | from_json }}"
But your output needs to be a valid list, so basically, it needs to look like this:
[{
  "maxReplicas": 3,
  "minReplicas": 1,
  "scaleTargetRef": {
    "apiVersion": "apps.openshift.io/v1",
    "kind": "DeploymentConfig",
    "name": "hello-openshift"
  },
  "targetCPUUtilizationPercentage": 70
},
{
  "maxReplicas": 4,
  "minReplicas": 2,
  "scaleTargetRef": {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "name": "testrhel"
  },
  "targetCPUUtilizationPercentage": 79
}]
But you might actually have a logic error, because you are looping over the list and overwriting the variable exist on every turn. So you will end up with one variable exist in the end and that will hold the value of the last iteration.
Check out how to register variables with a loop if you need the output of every iteration.
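As a rough sketch of that register-with-a-loop pattern (using debug as a stand-in for whatever per-item task you actually run; newvar and newvar2 are the variables from the question), the registered variable then holds one entry per iteration under .results:
- name: evaluate each spec
  debug:
    msg: "{{ (item['scaleTargetRef']['name'] == newvar and item['scaleTargetRef']['kind'] == newvar2) | ternary('yes', 'no') }}"
  loop: "{{ existing.stdout | from_json }}"
  register: spec_checks

# spec_checks.results[0].msg, spec_checks.results[1].msg, ... keep the
# per-iteration answers instead of overwriting a single fact.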
If you want to do something for each item that meets the condition, you can do this:
- name: check for existing
  shell: oc get hpa -o json | jq -r '.items[].spec'
  register: existing

- name: include tasks a
  include_tasks: a.yml
  when:
    - deploy_type == item['scaleTargetRef']['kind'] | lower
    - deploy_name == item['scaleTargetRef']['name'] | lower
  loop: "{{ existing.stdout | from_json }}"

- name: include tasks b
  include_tasks: b.yml
  when: (deploy_type != item['scaleTargetRef']['kind'] | lower) or
        (deploy_name != item['scaleTargetRef']['name'] | lower)
  loop: "{{ existing.stdout | from_json }}"
You do not need any of the set_fact stuff in that case.

transform values of yaml hash into keys of json hash in Ansible

I'm trying to get Ansible to convert an array of hashes into a list of key/value pairs, with the keys being one of the values from each hash and the values being a different value from the same hash.
An example will help.
I want to convert:
TASK [k8s_cluster : Cluster create | debug result of private ec2_vpc_subnet_facts] ***
ok: [localhost] => {
    "result": {
        "subnets": [
            {
                "availability_zone": "eu-west-1c",
                "subnet_id": "subnet-cccccccc",
            },
            {
                "availability_zone": "eu-west-1a",
                "subnet_id": "subnet-aaaaaaaa",
            },
            {
                "availability_zone": "eu-west-1b",
                "subnet_id": "subnet-bbbbbbbb",
            }
        ]
    }
}
into
eu-west-1a: subnet-aaaaaaaa
eu-west-1b: subnet-bbbbbbbb
eu-west-1c: subnet-cccccccc
I've tried result.subnets | map('subnet.availability_zone': 'subnets.subnet_id') (which doesn't work at all) and json_query('subnets[*].subnet_id'), which simply picks out the subnet_id values and puts them into a list.
I think I could do this with Zip and Hash in Ruby but I don't know how to make this work in Ansible, or more specifically in Jmespath.
I have generated the list below and will add a newline to the generated list (thought to share this first):
---
- name: play
  hosts: localhost
  tasks:
    - name: play
      include_vars: vars.yml

    - name: debug
      debug:
        msg: "{% for each in subnets %}{{ each.availability_zone }}:{{ each.subnet_id }}{% raw %},{% endraw %}{% endfor %}"
output --->
ok: [localhost] => {
    "msg": "eu-west-1c:subnet-cccccccc,eu-west-1a:subnet-aaaaaaaa,eu-west-1b:subnet-bbbbbbbb,"
}
JMESPath does not allow the use of dynamic names in multi-select hashes. I have found an extension to JMESPath that allows such a thing by using key references, but it is not part of the plain JMESPath implementation nor of Ansible.
To do this in plain Ansible, you will have to create a new variable and populate it with a loop. There might be other ways using other filters, but this is the solution I came up with:
- name: Create the expected hash
  set_fact:
    my_hash: >-
      {{
        my_hash
        | default({})
        | combine({ item.availability_zone: item.subnet_id })
      }}
  loop: "{{ subnets }}"
- name: Print result
  debug:
    var: my_hash
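As an aside (a sketch, not part of the answer above), the same mapping can be built without a loop by zipping the two attribute lists and handing them to Jinja2's built-in dict() constructor; this assumes Ansible 2.3+ for the zip filter:
- name: Create the expected hash without a loop
  set_fact:
    my_hash: >-
      {{ dict(subnets | map(attribute='availability_zone')
              | zip(subnets | map(attribute='subnet_id'))) }}

- name: Print result
  debug:
    var: my_hash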

how can i map a list to a dict with computed values keyed by the list items in ansible?

Apologies for the awkward title, but I couldn't figure out a better way to phrase what seems like a very common operation:
I have a list like
repos:
  - nrser/x
  - nrser/y
and want to transform it to a dict like
repos_dict:
  nrser/x: nrser_x
  nrser/y: nrser_y
This is super simple in Python:
repos = ['nrser/x', 'nrser/y']
repos_dict = dict((repo, repo.replace('/', '_')) for repo in repos)
# => {'nrser/x': 'nrser_x', 'nrser/y': 'nrser_y'}
But I can't figure out how to accomplish it with Ansible / Jinja2 (short of dropping into Python via a module or plugin, but that seems ridiculous for such a basic use case).
It's easy to map the repos to a new list with the underscored names (I need to use them in file paths):
set_fact:
  repo_filename_segments: "{{ repos | map('replace', '/', '_') | list }}"
But then I need to zip them together, and I can't find support for that either (see ziplist1-list2-in-jinja2 and how-to-combine-two-lists-together).
I've tried:
- set_fact:
    repos:
      - beiarea/www
      - beiarea/relib

- set_fact:
    repos_dict: {}

- with_items: "{{ repos }}"
  set_fact:
    "repos_dict[{{ item }}]": "{{ item | replace('/', '_') }}"
but that doesn't work either.
Maybe it's not possible in Ansible / Jinja, but it seems like a really elementary operation to have been overlooked.
Thanks for any solutions or suggestions.
Ansible 2.3 adds (amongst others) a zip filter:
- set_fact:
    repos:
      - beiarea/www
      - beiarea/relib

- set_fact:
    dirnames: "{{ repos | zip(repos|map('replace', '/', '_')) | list }}"

- debug: var=dirnames
# "dirnames": [
#     [
#         "beiarea/www",
#         "beiarea_www"
#     ],
#     [
#         "beiarea/relib",
#         "beiarea_relib"
#     ]
# ]
- debug:
    msg: "{{ item[0] }} => {{ item[1] }}"
  with_items:
    - "{{ dirnames }}"  # with_items flattens the first level
I'm still looking for a good way to turn that into a dictionary, for easy lookups.
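One possible way to get that dictionary (a sketch, not part of the original answer; it relies on Jinja2's built-in dict() accepting a list of pairs, together with the same Ansible 2.3+ zip filter):
- set_fact:
    repos_dict: "{{ dict(repos | zip(repos | map('replace', '/', '_'))) }}"

- debug: var=repos_dict
# "repos_dict": {
#     "beiarea/relib": "beiarea_relib",
#     "beiarea/www": "beiarea_www"
# }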