I have a GitHub action step like this (extracted from a larger test.yml file):
    steps:
    - name: Parse
      shell: bash
      env:
        TYPE: ${{matrix.package-type}}
        BV: ${{matrix.builder-version}}
        # This comment is line 63, the "#" is in column 9
        NULL: ${{ matrix.beta-version }}
      run: |
        echo TYPE is "$TYPE"
        echo BV is "$BV"
        printf "Null is '%s'\n" "$NULL"
When I run it, I get the following error:
The workflow is not valid. .github/workflows/test.yml (Line: 64, Col: 9): Unexpected value ''
Why is this line invalid? How do I fix it?
A different cause of the same kind of error: if you're using a reusable workflow, make sure you're not passing runs-on to the job that uses the workflow. The runs-on belongs in the reusable workflow file itself, not in the calling job. Getting this wrong produces errors like:
The workflow is not valid. .github/workflows/REDACTED.yml (Line: 123, Col: 5): Unexpected value 'uses' .github/workflows/REDACTED.yml (Line: 124, Col: 5): Unexpected value 'with'
It turns out there is a quirk in the GitHub Actions YAML parser that treats NULL as a special token. This is actually standard YAML behavior: unquoted NULL (like null, Null, or ~) is the null scalar, even when used as a mapping key. So it parses
NULL: ${{ matrix.beta-version }}
as if it were
'': ${{ matrix.beta-version }}
Changing NULL to null does not help, since that is just another spelling of the null token. (Side note: keys in env must be unique under a case-insensitive comparison, meaning you cannot have both FOO and foo, even though case is preserved when the environment variable name is set.)
The best fix/workaround is to avoid using "NULL" and use something else, like "NIL". However, if you must use "NULL", you can do it by putting it in quotes:
"NULL": ${{ matrix.beta-version }}
In my case it was because go-version looked like this in the matrix section:
matrix:
  go-version: 1.17
  os: [ubuntu-latest, macos-latest]
It needed to be changed to an array: go-version: [1.17]
I have the following JSON structure, generated by Zabbix Discovery key, with the following data:
[{
  "{#SERVICE.NAME}": ".WindowsService1",
  "{#SERVICE.DISPLAYNAME}": ".WindowsService1 - Testing",
  "{#SERVICE.DESCRIPTION}": "Application Test 1 - Master",
  "{#SERVICE.STATE}": 0,
  "{#SERVICE.STATENAME}": "running",
  "{#SERVICE.PATH}": "E:\\App\\Test\\bin\\testingApp.exe",
  "{#SERVICE.USER}": "LocalSystem",
  "{#SERVICE.STARTUPTRIGGER}": 0,
  "{#SERVICE.STARTUP}": 1,
  "{#SERVICE.STARTUPNAME}": "automatic delayed"
},
{
  "{#SERVICE.NAME}": ".WindowsService2",
  "{#SERVICE.DISPLAYNAME}": ".WindowsService2 - Testing",
  "{#SERVICE.DESCRIPTION}": "Application Test 2 - Slave",
  "{#SERVICE.STATE}": 0,
  "{#SERVICE.STATENAME}": "running",
  "{#SERVICE.PATH}": "E:\\App\\Test\\bin\\testingApp.exe",
  "{#SERVICE.USER}": "LocalSystem",
  "{#SERVICE.STARTUPTRIGGER}": 0,
  "{#SERVICE.STARTUP}": 1,
  "{#SERVICE.STARTUPNAME}": "automatic delayed"
}]
So, what I want to do is use JSONPath to get ONLY the object where {#SERVICE.NAME} == '.WindowsService1'...
The problem is, when I try to create the JSONPath it gives me a couple of errors.
Here's what I tried, and what I discovered so far:
JSONPath:
$.[?(@.{#SERVICE.NAME} == '.WindowsService1')]
Error output:
jsonPath: Unexpected token '{': _$_v.{#SERVICE.NAME} == '.WindowsService1'
I also tried the following JSONPath, to match by regular expression:
$.[?(@.{#SERVICE.NAME} =~ '^(.WindowsService1$)')]
It gave me the same error, so the problem is not after the == or =~ ...
What I discovered is that if I REMOVE the curly braces {} and the hashtag #, and replace the dot . in the service-name key with _ (underscore), both in the JSONPath and in the JSON data, it works, like this:
Data without # {} . :
[{
  "SERVICE_NAME": ".WindowsService1",
  [...]
JSONPath for the new data structure:
$.[?(@.SERVICE_NAME == '.WindowsService1')]
But the real problem is, I need to keep the original structure, with the curly braces, dots, and hashtags...
How can I escape those and stop seeing this error?
Thank you...
Use bracket notation and quote the key, which escapes the braces, the hashtag, and the dots:
$.[?(@['{#SERVICE.NAME}'] == '.WindowsService1')]
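For comparison, here is the same selection done in plain Python, where the awkward key only needs ordinary string quoting (services.json is a stand-in filename for your discovery data):

import json

# Load the Zabbix discovery array shown in the question.
with open("services.json") as f:
    services = json.load(f)

# Keep only the objects whose {#SERVICE.NAME} matches.
matches = [svc for svc in services
           if svc.get("{#SERVICE.NAME}") == ".WindowsService1"]
print(json.dumps(matches, indent=2))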
I'm using Airflow 2.2.2 with the latest providers installed as appropriate.
I'm trying to use the Azure and MySQL hooks and have created custom operators with templates defined for what variables can be templated.
When I do so, I get an error saying that conn or var cannot be found
e.g. my passed parameter is
{{ conn.<variable_name> }}
or
{{ var.json.value.<variable_name> }}
I believe this should be possible in > v2.0 but it's not working for me. Any ideas why?
EDIT: Below are snippets of code with some sensitive information removed, let me know if anything else is needed?
DAG error -
Broken DAG: [/home/dags/dag.py] Traceback (most recent call last):
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/dags/dag.py", line 52, in <module>
    wasb_conn_id = {{ conn.wasb }},
NameError: name 'conn' is not defined
task in dag.py
t1 = WasbLogBlobsToCSVOperator(
    task_id='task_xyz',
    wasb_conn_id = {{ conn.wasb }},
Custom operator, using an extended version of the Microsoft Azure WASB hook, used by dag.py:
class WasbLogBlobsToCSVOperator(BaseOperator):
    template_fields = (
        'wasb_conn_id',
    )

    def __init__(
        self,
        *,
        wasb_conn_id: str = 'wasb',
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)
        self.wasb_conn_id = wasb_conn_id
        self.hook = ExtendedWasbHook(wasb_conn_id=self.wasb_conn_id)
There look to be a few things going on here that should help.
1. Jinja templates are string expressions. Try wrapping your wasb_conn_id arg in quotes:
wasb_conn_id = "{{ conn.wasb }}",
2. Templated fields are not rendered until the task runs, meaning the Jinja expression won't be evaluated until the operator's execute() method is called. This is why you are seeing the exception: in __init__, the literal string "{{ conn.wasb }}" is still being used as the conn_id. If you want to use a templated field in the custom operator, you need to move the hook creation into the scope of the execute() method.
3. Why do you need to use a Jinja expression here at all? Since the format for accessing the Connection object via Jinja is {{ conn.<my_conn_id> }}, you could just use the value "wasb" directly.
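Putting points 1 and 2 together, here is a minimal sketch of the operator with the hook creation moved into execute() (ExtendedWasbHook stands for the extended hook from the question):

from airflow.models.baseoperator import BaseOperator

class WasbLogBlobsToCSVOperator(BaseOperator):
    template_fields = ('wasb_conn_id',)

    def __init__(self, *, wasb_conn_id: str = 'wasb', **kwargs) -> None:
        super().__init__(**kwargs)
        # Only store the raw value here; it may still be the unrendered
        # Jinja string "{{ conn.wasb }}" at this point.
        self.wasb_conn_id = wasb_conn_id

    def execute(self, context):
        # template_fields are rendered before execute() runs, so
        # self.wasb_conn_id now holds the real connection id.
        hook = ExtendedWasbHook(wasb_conn_id=self.wasb_conn_id)
        ...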
I have a variable template
var1.yml
variables:
- name: TEST_DB_HOSTNAME
  value: 10.123.56.222
- name: TEST_DB_PORTNUMBER
  value: 1521
- name: TEST_USERNAME
  value: TEST
- name: TEST_PASSWORD
  value: TEST
- name: TEST_SCHEMANAME
  value: SCHEMA
- name: TEST_ACTIVEMQNAME
  value: 10.123.56.223
- name: TEST_ACTIVEMQPORT
  value: 8161
When I run the below pipeline
resources:
  repositories:
  - repository: templates
    type: git
    name: pipeline_templates
    ref: refs/heads/master

trigger:
- none

variables:
- template: templates/var1.yml#templates

pool:
  name: PoolA

steps:
- pwsh: |
    Write-Host "${{ convertToJson(variables) }}"
I get the output
{
  build.sourceBranchName: master,
  build.reason: Manual,
  system.pullRequest.isFork: False,
  system.jobParallelismTag: Public,
  system.enableAccessToken: SecretVariable,
  TEST_DB_HOSTNAME: 10.123.56.222,
  TEST_DB_PORTNUMBER: 1521,
  TEST_USERNAME: TEST,
  TEST_PASSWORD: TEST,
  TEST_SCHEMANAME: SCHEMA,
  TEST_ACTIVEMQNAME: 10.123.56.223,
  TEST_ACTIVEMQPORT: 8161
}
How can I modify the pipeline to extract only the key/value pairs whose keys start with "TEST_", and store them in another variable in the same format so they can be used in other tasks of the same pipeline?
Or, alternatively, iterate over the entries whose keys start with "TEST_" and get their values?
The output you have shown is invalid JSON and cannot be transformed with JSON tooling as-is. Assuming it were valid JSON:
{
  "build.sourceBranchName": "master",
  "build.reason": "Manual",
  "system.pullRequest.isFork": "False",
  "system.jobParallelismTag": "Public",
  "system.enableAccessToken": "SecretVariable",
  "TEST_DB_HOSTNAME": "10.123.56.222",
  "TEST_DB_PORTNUMBER": 1521,
  "TEST_USERNAME": "TEST",
  "TEST_PASSWORD": "TEST",
  "TEST_SCHEMANAME": "SCHEMA",
  "TEST_ACTIVEMQNAME": "10.123.56.223",
  "TEST_ACTIVEMQPORT": 8161
}
then you can use the to_entries or with_entries filters of jq to get an object containing only those keys which start with "TEST_":
with_entries(select(.key|startswith("TEST_")))
This will give you a new object as output:
{
  "TEST_DB_HOSTNAME": "10.123.56.222",
  "TEST_DB_PORTNUMBER": 1521,
  "TEST_USERNAME": "TEST",
  "TEST_PASSWORD": "TEST",
  "TEST_SCHEMANAME": "SCHEMA",
  "TEST_ACTIVEMQNAME": "10.123.56.223",
  "TEST_ACTIVEMQPORT": 8161
}
The convertToJson() function is a bit messy, as the "JSON" it creates is not, in fact, valid JSON.
There are several possible approaches I can think of:
Use convertToJson() to pass the not-quite-JSON to a script step, repair it into valid JSON there, and then extract the relevant values. I have done this before and it typically works if you have control over the data in the variables. The downside is the risk that the conversion to valid JSON can fail; a sketch of the repair step follows below.
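For that first approach, here is a rough Python sketch of the repair step (it assumes the flat output shape shown above, and every value comes out as a string):

import json
import re
import sys

# Read the pseudo-JSON that convertToJson(variables) printed (piped in here).
raw = sys.stdin.read()

# Quote bare keys and values: '  some.key: value,' -> '  "some.key": "value",'.
# A sketch only -- it assumes no braces, commas or colons inside values.
fixed = re.sub(
    r'^(\s*)([^":{}\n]+?):\s*(.+?),?$',
    lambda m: '{}"{}": "{}",'.format(m.group(1), m.group(2).strip(), m.group(3).strip()),
    raw,
    flags=re.MULTILINE,
)
fixed = re.sub(r',(\s*})', r'\1', fixed)  # drop the trailing comma before '}'

data = json.loads(fixed)
test_vars = {k: v for k, v in data.items() if k.startswith("TEST_")}
print(json.dumps(test_vars, indent=2))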
Create a YAML loop that iterates over the variables and extracts the ones that begin with Test_. You can find examples of how to write a loop here, but basically, it would look like this:
- stage:
  variables:
    firstVar: 1
    secondVar: 2
    Test_thirdVar: 3
    Test_forthVar: 4
  jobs:
  - job: loopVars
    steps:
    - ${{ each var in variables }}:
      - script: |
          echo ${{ var.key }}
          echo ${{ var.value }}
        displayName: handling ${{ var.key }}
If applicable to your use case, you can create complex parameters (instead of variables) for only the Test_ values. This way you can use the relevant values directly and don't need to extract a subset from your variable list. Note, however, that parameters are inputs to a pipeline and can be adjusted before execution. Example:
parameters:
- name: non-test-variables
  type: object
  default:
    firstVar: 1
    secondVar: 2
- name: test-variables
  type: object
  default:
    Test_thirdVar: 3
    Test_forthVar: 4
You can use these by referencing ${{ parameters['test-variables'].Test_thirdVar }} in the pipeline (bracket notation is needed because of the hyphen in the parameter name).
I have the following JSON file, called cust.json:
{
  "customer": {
    "CUST1": {
      "zone": "ZONE1",
      "site": "ASIA"
    },
    "CUST2": {
      "zone": "ZONE2",
      "site": "EUROPE"
    }
  }
}
I am using this JSON file in my main.yml to get a list of customers (CUST1 and CUST2).
main.yml:
- name: Include the vars
  include_vars:
    file: "{{ playbook_dir }}/../default_vars/cust.json"
    name: "cust_json"

- name: Generate customer config
  include_tasks: create_config.yml
  loop: "{{ cust_json.customer }}"
I was hoping the loop would basically pass each customer's code (e.g. CUST1) to create_config.yml, so that something like the following can happen:
create_config.yml:
- name: Create customer config
  block:
    - name: create temporary file for customer
      tempfile:
        path: "/tmp"
        state: file
        prefix: "my customerconfig_{{ item }}."
        suffix: ".tgz"
      register: tempfile

    - name: Setup other things
      include_tasks: "othercustconfigs.yml"
Which will result in:
The files /tmp/mycustomerconfig_CUST1 and /tmp/mycustomerconfig_CUST2 being generated.
The tasks within othercustconfigs.yml being run for CUST1 and CUST2.
Questions:
Running the playbook, it fails at this point:
TASK [myrole : Generate customer config] ************************************************************************************************************************************************************
fatal: [127.0.0.1]: FAILED! => {
    "msg": "Invalid data passed to 'loop', it requires a list, got this instead: {u'CUST1': {u'site': u'ASIA', u'zone': u'ZONE1'}, u'CUST2': {u'site': u'EUROPE', u'zone': u'ZONE2'}}. Hint: If you passed a list/dict of just one element, try adding wantlist=True to your lookup invocation or use q/query instead of lookup."
}
How do I loop over the JSON so that I get the list of customers (CUST1 and CUST2) correctly? loop: "{{ cust_json.customer }}" clearly doesn't work.
If I manage to get the above working, is it possible to pass the result of the loop on to the next include_tasks: "othercustconfigs.yml"? So basically, passing the looped items from main.yml to create_config.yml, and then on to othercustconfigs.yml. Is this possible?
Thanks!!
J
cust_json.customer is a hashmap containing one key for each customer, not a list.
The dict2items filter can transform this hashmap into a list of elements, each containing a key and a value attribute, e.g.:
- key: "CUST1"
value:
zone: "ZONE1"
site: "ASIA"
- key: "CUST2"
value:
zone: "ZONE2"
site: "EUROPE"
With this in mind, you can transform your include to the following:
- name: Generate customer config
  include_tasks: create_config.yml
  loop: "{{ cust_json.customer | dict2items }}"
and the relevant task in your included file to:
- name: create temporary file for customer
  tempfile:
    path: "/tmp"
    state: file
    prefix: "my customerconfig_{{ item.key }}."
    suffix: ".tgz"
  register: tempfile
Of course you can adapt all this to use the value element where needed, e.g. item.value.site. As for your second question: the loop variable item remains in scope for tasks included from create_config.yml, so othercustconfigs.yml can reference item.key and item.value directly (see the sketch below for what dict2items produces).
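To see what dict2items actually hands to the loop, here is the equivalent transformation in plain Python (purely illustrative; Ansible applies the filter for you):

customer = {
    "CUST1": {"zone": "ZONE1", "site": "ASIA"},
    "CUST2": {"zone": "ZONE2", "site": "EUROPE"},
}

# dict2items: one {key, value} element per hashmap entry.
items = [{"key": k, "value": v} for k, v in customer.items()]
for item in items:
    print(item["key"], item["value"]["site"])  # CUST1 ASIA, then CUST2 EUROPE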
You can see the following documentation for in-depth info and alternative solutions:
https://docs.ansible.com/ansible/latest/user_guide/playbooks_filters.html#dict-filter
https://docs.ansible.com/ansible/latest/user_guide/playbooks_loops.html#iterating-over-a-dictionary
https://docs.ansible.com/ansible/latest/user_guide/playbooks_loops.html#with-dict
https://jinja.palletsprojects.com/en/2.11.x/templates/#dictsort
I have a group_vars/all file whose starting lines look exactly like this:
##################################
['./roles/openssh/defaults',
'./roles/rsyslog/defaults',
'./roles/tomcat8/defaults',
'./roles/oracle_java/defaults',
'./roles/psp_db/defaults',
'./roles/provision_kill_instance/defaults',
'./roles/kill_app/defaults',
'./roles/base/defaults',
'./roles/ntp/defaults']
##################################
base_google_dns_enabled: false
When I run it with ansible-playbook provision_aws.yml it throws an error:
ERROR: Syntax Error while loading YAML script, /home/nsingh/ansible-psportal/group_vars/all
Note: The error may actually appear before this position: line 13, column 1
base_google_dns_enabled: false
According to my diagnosis, this is because of the things inside [], and even if I put that block at the end of my group_vars file I encounter something similar. Any help would be highly appreciated.
Your syntax is incorrect: a bare list followed by top-level keys is not valid YAML, and a group_vars file must in any case be a mapping of variable names to values. Give the list a name. Try:
---
myroles:
  - './roles/openssh/defaults'
  - './roles/rsyslog/defaults'
  - './roles/tomcat8/defaults'
  - './roles/oracle_java/defaults'
  - './roles/psp_db/defaults'
  - './roles/provision_kill_instance/defaults'
  - './roles/kill_app/defaults'
  - './roles/base/defaults'
  - './roles/ntp/defaults'

base_google_dns_enabled: false
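To see why the original file is rejected, feed it to any YAML parser: a document cannot consist of a bare list followed by top-level keys. A quick sketch with PyYAML:

import yaml

broken = """\
['./roles/openssh/defaults',
 './roles/ntp/defaults']
base_google_dns_enabled: false
"""

try:
    yaml.safe_load(broken)
except yaml.YAMLError as exc:
    # PyYAML reports a parse error right after the list ends,
    # much like the Ansible error above.
    print(exc)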