Using parameters between two jobs with Azure Pipeline templates - parameter-passing

I'm setting values in Job A and want to use them as parameters in Job B; however, Job B cannot get the values from Job A when I use templates. Here is what I am trying:
fetchingfilenames.yml
jobs:
- job: A
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - checkout: myrepo
    persistCredentials: true
  - task: PowerShell@2
    displayName: 'Fetching FileNames'
    inputs:
      targetType: 'inline'
      script: |
        Write-Host "Fetching filenames"
        cd $(valuepath)
        Write-Host $(valuepath)
        ## Listing files in the directory and saving them in an array
        $a=ls
        Write-Host "Files:"$a
        $List = $a | foreach {'$(valuepath)/' + $_}
        Write-Host "Files with complete path:"$List
        $b = '"{0}"' -f ($List -join '","')
        Write-Host $b  #### Output is: "$(valuepath)/file1.yml, $(valuepath)/file2.yml"
        Write-Host "##vso[task.setvariable variable=valuefiles;isOutput=true]$b"
    name: fileoutput
deploy.yml
parameters:
  files: []
jobs:
- job: B
  dependsOn: A
  pool:
    vmImage: 'ubuntu-latest'
  variables:
    filenames: ${{ parameters.files }}
  steps:
  - checkout: myrepo
    persistCredentials: true
  - task: AzureCLI@2
    inputs:
      azureSubscription: 'mysubscription'
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript: |
        echo "fetching filenames"
        echo $(filenames)  #### Error: output is empty
        for i in $(echo $(filenames) | sed "s/,/ /g");  ### it doesn't run this line as it seems it cannot find $(filenames)
        do
          echo "valuefiles= $i";
        done
azure-pipeline.yml
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - azure-pipeline.yml
variables:
  azureSubscription: 'mysubscription'
  containerRegistry: 'myacr'
  repository: 'myrepo'
  chartPath: 'mypath'
resources:
  repositories:
  - repository: pipelinetemplates
    type: github
    name: myorg/mytemplates
    endpoint: myendpoint
  - repository: myrepo
    type: github
    name: myorg/myrepo
    endpoint: myendpoint
    trigger:
      branches:
        include:
        - master
      paths:
        include:
        - myfolder/*
stages:
- stage: Deploy
  variables:
    azureResourceGroup: 'myresourcegroup'
    kubernetesCluster: 'myk8s'
    domain: 'mydomain'
    valuepath: myfolder
  jobs:
  - template: Template-Yaml/fetchingfilenames.yml@pipelinetemplates
  - template: Template-Yaml/deploy.yml@pipelinetemplates
    parameters:
      ## Fetching variable as parameters
      files: $[dependencies.A.outputs['fileoutput.valuefiles']]
If I put Job A directly in azure-pipeline.yml and don't use a template for it, it works perfectly fine; however, when Job A comes from a template, Job B can no longer fetch the value from Job A.
Does anyone know what is missing here?

fetchingfilenames.yml
jobs:
- job: A
  steps:
  - powershell: |
      cd $(System.defaultworkingdirectory)
      $a=ls
      Write-Host "Files:"$a
      $List = $a | foreach {'$(valuepath)/' + $_}
      Write-Host "Files with complete path:"$List
      $b = '"{0}"' -f ($List -join '","')
      Write-Host $b
      Write-Host "##vso[task.setvariable variable=valuefiles;isOutput=true]$b"
    name: power
deploy.yml
parameters:
  files: []
jobs:
- job: B
  dependsOn: A
  variables:
    filenames: ${{ parameters.files }}
  steps:
  - powershell: |
      echo $(filenames)
      foreach($a in $(filenames)){write-host $a}
azure-pipelines.yml:
resources:
  repositories:
  - repository: pipelinetemplates
    type: git
    name: PipelineBuildResourceYaml
stages:
- stage: SA
  pool: Default
  variables:
    valuepath: "myfolder"
  jobs:
  - template: fetchingfilenames.yml@pipelinetemplates
  - template: deploy.yml@pipelinetemplates
    parameters:
      ## Fetching variable as parameters
      files: $[dependencies.A.outputs['power.valuefiles']]
Output of Job B:
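The usual way to consume a cross-job output is to map it into the consuming job's variables with a runtime expression instead of passing it as a template parameter, because template parameters are expanded at compile time, before Job A has produced anything. A minimal sketch of deploy.yml rewritten that way (the job and step names follow the repro above):
jobs:
- job: B
  dependsOn: A
  variables:
    # runtime expression: resolved when Job B starts, not at template expansion
    filenames: $[ dependencies.A.outputs['power.valuefiles'] ]
  steps:
  - powershell: |
      Write-Host $(filenames)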

Related

`GITHUB_PULL_REQUEST_BASE_REF: parameter null or not set` error in github actions

I am getting this error while trying to set up a GitHub Actions workflow. My goal is to set up a workflow that uses another template for linting and fixing SQL. Here is my GitHub folder.
The models folder contains a single SQL file (with the .sql file extension). The content of the sql folder is an SQL file testing.sql with the query: select a,b,c, document as doc from table.
The workflow file contains the following yml file:
on:
  pull_request:
jobs:
  test-check:
    name: runner / sqlfluff (github-check)
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: yu-iskw/action-sqlfluff@v3
        id: lint-sql
        with:
          github_token: ${{ secrets.github_token }}
          reporter: github-pr-review
          sqlfluff_version: "1.2.0"
          sqlfluff_command: "fix" # Or "lint"
          config: "${{ github.workspace }}/.sqlfluff"
          paths: '${{ github.workspace }}/models'
      - name: 'Show outputs (Optional)'
        shell: bash
        run: |
          echo '${{ steps.lint-sql.outputs.sqlfluff-results }}' | jq -r '.'
          echo '${{ steps.lint-sql.outputs.sqlfluff-results-rdjson }}' | jq -r '.'
The .sqlfluff file contains a default configuration from the following site: sqlfluff.
The workflow run is throwing the following error which I couldn't quite figure out:
I don't know what the line 15: GITHUB_PULL_REQUEST_BASE_REF: parameter null or not set part of the error means. I would be glad if anyone can help with this error.
It is a parameter used by yu-iskw/action-sqlfluff action.yml in its entrypoint.sh.
SQL_FILE_PATTERN="${FILE_PATTERN:?}"
SOURCE_REFERENCE="origin/${GITHUB_PULL_REQUEST_BASE_REF:?}"
changed_files=$(git diff --name-only --no-color "$SOURCE_REFERENCE" "HEAD" -- "${SQLFLUFF_PATHS:?}" |
grep -e "${SQL_FILE_PATTERN:?}" |
xargs -I% bash -c 'if [[ -f "%" ]] ; then echo "%"; fi' || :)
Set the github_base_ref parameter to the remote branch you want to compare against (main, for instance).
In your case:
on:
  pull_request:
jobs:
  test-check:
    name: runner / sqlfluff (github-check)
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: yu-iskw/action-sqlfluff@v3
        id: lint-sql
        with:
          github_token: ${{ secrets.github_token }}
          reporter: github-pr-review
          sqlfluff_version: "1.2.0"
          sqlfluff_command: "fix" # Or "lint"
          config: "${{ github.workspace }}/.sqlfluff"
          paths: '${{ github.workspace }}/models'
          github_base_ref: "main" <========================
      - name: 'Show outputs (Optional)'
        shell: bash
        run: |
          echo '${{ steps.lint-sql.outputs.sqlfluff-results }}' | jq -r '.'
          echo '${{ steps.lint-sql.outputs.sqlfluff-results-rdjson }}' | jq -r '.'
(Do not include the <======... part, only the github_base_ref: "main")

How to store and retrieve a value in different steps using github actions

I have a GitHub pipeline with a single job. In the later steps, I want to get a value that was initially set in the first step.
jobs:
  backup:
    name: "Backup"
    runs-on: self-hosted
    steps:
      - uses: actions/checkout@v2
      - name: Update Config
        run: |
          export ORIGINAL_RDS_MAX_EXEC_TIME=123
      - name: Create DB Backup
        run: |
          # do some work
      - name: Cleanup
        if: always()
        run: |
          echo $ORIGINAL_RDS_MAX_EXEC_TIME  # returns nothing
The echo in the final step of this job does not return anything; it seems the originally exported variable is no longer present.
You can set an output parameter and reference that later on:
- name: Set output
  id: setter
  run: |
    echo "::set-output name=foo::bar"
- name: Use output
  run: |
    echo "${{ steps.setter.outputs.foo }}"
or you can add it to the environment:
- name: Set environment variable
  run: |
    echo "FOO=bar" >> "$GITHUB_ENV"
- name: Use environment variable
  run: |
    echo "$FOO"
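Note that the ::set-output command has since been deprecated (October 2022, as mentioned further down); a minimal sketch of the same output pattern using the $GITHUB_OUTPUT file, with the same illustrative foo/bar names:
- name: Set output
  id: setter
  run: echo "foo=bar" >> "$GITHUB_OUTPUT"
- name: Use output
  run: echo "${{ steps.setter.outputs.foo }}"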

setting output variables on Windows using cmd.exe as the shell

How can I set output variables when using shell: cmd on Windows? My repo is hosted at GitHub (not GitLab).
The following is based on this accepted answer
jobs:
  job1:
    runs-on: my-windows-machine
    # Map a step output to a job output
    outputs:
      output1: ${{ steps.step1.outputs.test }}
      output2: ${{ steps.step2.outputs.test }}
    steps:
      - id: step1
        run: |
          echo '::echo::on'
          echo "::set-output name=test::hello"
        shell: cmd
      - id: step2
        run: |
          echo '::echo::on'
          echo "::set-output name=test::world"
        shell: cmd
  job2:
    runs-on: my-windows-machine
    needs: job1
    steps:
      - run: echo ok: ${{needs.job1.outputs.output1}} ${{needs.job1.outputs.output2}}
        shell: cmd
The echo statement in job2 only shows the ok: string if shell: cmd is used in the steps of job1.
As the OP Andrew concludes in the comments, output variables are simply not supported with cmd.exe.
I went ahead and broke up the steps in my workflow to use shell: cmd for the things that need to be processed by cmd.exe, and created a separate step (using the default shell) just to set the output variables.
As an alternative, you can see in "Github Actions, how to share a calculated value between job steps?" the $GITHUB_OUTPUT command, which can be used to define outputs for steps. The outputs can then be used in later steps and evaluated in with and env input sections.
You can see it used in "How can I use a GitHub action's output in a workflow?".
Note: the older ::set-output command has now (Oct. 2022) been deprecated.
name: 'Hello World'
runs:
  using: "composite"
  steps:
    - id: random-number-generator
      run: echo "random-number=$(echo $RANDOM)" >> $GITHUB_OUTPUT
      shell: bash
...
jobs:
  test-job:
    runs-on: self-hosted
    steps:
      - name: Call Hello World
        id: hello-world
        uses: actions/hello-world-action@v1
      - name: Comment
        if: ${{ github.event_name == 'pull_request' }}
        uses: actions/github-script@v3
        with:
          github-token: ${{secrets.GITHUB_TOKEN}}
          script: |
            github.issues.createComment({
              issue_number: context.issue.number,
              owner: context.repo.owner,
              repo: context.repo.repo,
              body: 'Output - ${{ steps.hello-world.outputs.random-number }}'
            })
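Since GITHUB_OUTPUT is just an environment variable pointing at a file, a cmd.exe step should in principle be able to append to it directly. A minimal, untested sketch (the output name test is illustrative):
- id: step1
  shell: cmd
  # no space before >> so the value has no trailing blank
  run: echo test=hello>> "%GITHUB_OUTPUT%"
- shell: cmd
  run: echo ${{ steps.step1.outputs.test }}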

Read JSON file in Github Actions

I want to read a JSON file and use a property in a string in a Github Actions YAML file. How do I do this?
(I want the version of the package.json)
Use the built-in fromJson(value) (see here: https://docs.github.com/en/actions/learn-github-actions/expressions#fromjson)
Reading a file depends on the shell you're using. Here's an example for sh:
name: Test linux job
on:
  push
jobs:
  testJob:
    name: Test
    runs-on: ubuntu-latest
    steps:
      - id: set_var
        run: |
          content=`cat ./path/to/package.json`
          # the following lines are only required for multi line json
          content="${content//'%'/'%25'}"
          content="${content//$'\n'/'%0A'}"
          content="${content//$'\r'/'%0D'}"
          # end of optional handling for multi line json
          echo "::set-output name=packageJson::$content"
      - run: |
          echo "${{fromJson(steps.set_var.outputs.packageJson).version}}"
Multi line JSON handling as per https://github.community/t5/GitHub-Actions/set-output-Truncates-Multiline-Strings/td-p/37870
GitHub issue about set-env / set-output multi line handling: https://github.com/actions/toolkit/issues/403
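With the newer $GITHUB_OUTPUT file, the same multi-line JSON can be passed with a heredoc-style delimiter instead of the manual escaping above. A rough sketch, assuming the same ./path/to/package.json path:
- id: set_var
  run: |
    {
      echo 'packageJson<<EOF'
      cat ./path/to/package.json
      echo 'EOF'
    } >> "$GITHUB_OUTPUT"
- run: |
    echo "${{ fromJson(steps.set_var.outputs.packageJson).version }}"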
Below is a version of the example from the official GHA docs that includes two changes: it loads the JSON from a file (./your.json) and removes newline characters (Source). It then uses fromJson to parse the output and set a matrix variable.
name: build
on: push
jobs:
  job1:
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.set-matrix.outputs.matrix }}
    steps:
      - id: set-matrix
        run: |
          JSON=$(cat ./your.json)
          echo "::set-output name=matrix::${JSON//'%'/'%25'}"
  job2:
    needs: job1
    runs-on: ubuntu-latest
    strategy:
      matrix: ${{fromJson(needs.job1.outputs.matrix)}}
    steps:
      - run: build

on: [push, pull_request]
name: Build
jobs:
  build:
    name: Example
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          path: './'
      - run: |
          echo "`jq '.base_config[0].value="Alpha-21"' config.json `" > config.json
          echo "`jq '.base_config[1].value="1.2.14"' config.json`" > config.json
          echo "`jq '.base_config[2].value="29/12/2020"' config.json `" > config.json
      - uses: EndBug/add-and-commit@v6
        with:
          message: 'Add the version and date'
          add: '*.json --force'
          cwd: './'
          token: ${{ secrets.TOKEN }}
Use a multi line environment variable:
- run: |
    echo 'PACKAGE_JSON<<EOF' >> $GITHUB_ENV
    cat ./package.json >> $GITHUB_ENV
    echo 'EOF' >> $GITHUB_ENV
- run: |
    echo '${{ fromJson(env.PACKAGE_JSON).version }}'
This avoids any need for escaping.
Inspired by the answer from @dastrobu, which adds a key/value pair to $GITHUB_ENV, this uses jq to transform/minify package.json to a single line:
- run: echo "PACKAGE_JSON=$(jq -c . < package.json)" >> $GITHUB_ENV
- run: echo '${{ fromJson(env.PACKAGE_JSON).version }}'
I once used this to get the value from the JSON data; hope this helps:
- name: fetch the json value
  run: |
    githubjson=`cat $GITHUB_EVENT_PATH`
    echo $githubjson
    number=`echo $(jq -r '.number' <<< "$githubjson")`
    PRTitle=`echo $(jq -r '.pull_request.title' <<< "$githubjson")`
    PRUrl=`echo $(jq -r '.pull_request.html_url' <<< "$githubjson")`
    PRBody=`echo $(jq -r '.pull_request.body' <<< "$githubjson")`
With PowerShell:
- name: Read json
  id: read-json
  shell: pwsh
  run: |
    $json = Get-Content yourfile.json | ConvertFrom-Json
    echo "::set-output name=prop::$(echo $json.prop)"
- run: echo ${{ steps.read-json.outputs.prop }}
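The same read can set the output through the non-deprecated $GITHUB_OUTPUT file in PowerShell; a sketch under the same assumptions (yourfile.json and prop are the illustrative names from the snippet above):
- name: Read json
  id: read-json
  shell: pwsh
  run: |
    $json = Get-Content yourfile.json | ConvertFrom-Json
    "prop=$($json.prop)" >> $env:GITHUB_OUTPUT
- run: echo ${{ steps.read-json.outputs.prop }}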
You can easily use the github-script action for this.
- name: "Read JSON"
uses: actions/github-script#v6
id: check-env
with:
result-encoding: string
script: |
try {
const fs = require('fs')
const jsonString = fs.readFileSync('./dir/file.json')
var apps = JSON.parse(jsonString)
} catch(err) {
core.error("Error while reading or parsing the JSON")
core.setFailed(err)
}
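If you want to use a property from that JSON in later steps, the script can return it, which makes the value available as the step's result output. A small sketch under the same assumptions (./dir/file.json and the version property are illustrative):
- name: "Read JSON"
  uses: actions/github-script@v6
  id: read-json-props
  with:
    result-encoding: string
    script: |
      const fs = require('fs')
      // parse the file and hand one property back to the workflow
      const json = JSON.parse(fs.readFileSync('./dir/file.json', 'utf8'))
      return json.version
- run: echo "${{ steps.read-json-props.outputs.result }}"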

How to use output of a powershell command as parameters in Azure pipeline?

I have a PowerShell script task that gets the names of some files from a folder in my git repo and puts them into a variable. I want to use those file names as parameters and use an "each" loop in another task (HelmDeploy@0) so that the task runs once per file name, with the file name passed as the valueFile input.
Here is what I have tried; however, it gives the error Template-Yaml/deploy-jobs.yaml@pipelinetemplates: Expected a sequence or mapping. Actual value '$[dependencies.A.outputs['fileoutput.valuefiles']]' on the line ${{each file in parameters.files}}.
deploy-jobs.yaml
parameters:
  files: []
jobs:
- job: Deploy
  pool:
    vmImage: 'ubuntu-latest'
  variables:
    filenames: ${{ parameters.files }}
  steps:
  - task: HelmInstaller@1
    displayName: 'Installing Helm'
    inputs:
      helmVersionToInstall: '2.15.1'
  - task: HelmDeploy@0
    displayName: 'Initializing Helm'
    inputs:
      connectionType: 'Azure Resource Manager'
      azureSubscription: $(azureSubscription)
      azureResourceGroup: $(azureResourceGroup)
      kubernetesCluster: $(kubernetesCluster)
      command: 'init'
  - task: AzureCLI@2
    inputs:
      azureSubscription: $(azureSubscription)
      scriptType: 'bash'
      scriptLocation: 'inlineScript'
      inlineScript:
        echo "##vso[task.setvariable variable=imgtag]$(az acr repository show-tags --name myacr --repository myrepo --orderby time_desc --top 1 | awk ' /[[:digit:]]/ { print $0 } ' | tr -d '[:space:]')"
  - task: Bash@3
    displayName: 'Fetching repo-tag'
    inputs:
      targetType: 'inline'
      script: |
        echo tag=$(imgtag)
        echo filenames=$(filenames)  ### output is: **/myfolder/dev/file1.yaml,**/myfolder/dev/file2.yaml
  - ${{each file in parameters.files}}:  ## Error
    - task: HelmDeploy@0
      displayName: 'Upgrading helmchart'
      inputs:
        connectionType: 'Azure Resource Manager'
        azureSubscription: $(azureSubscription)
        azureResourceGroup: $(azureResourceGroup)
        kubernetesCluster: $(kubernetesCluster)
        command: 'upgrade'
        chartType: 'FilePath'
        chartPath: $(chartPath)
        install: true
        releaseName: $(releaseName)
        ##valueFile: $(valuefiles)
        valueFile: ${{ file }}
        arguments: '--set image.tag=$(imgtag) --set domain=$(domain)'
The azure-pipeline.yaml file is as follows:
trigger:
  branches:
    include:
    - master
    - refs/tags/v*
  paths:
    exclude:
    - readme.md
variables:
  azureSubscription: 'myazuresubscription'
  chartPath: '**/mychart'
  containerRegistry: 'mysc'
  repository: 'myrepo'
resources:
  repositories:
  - repository: pipelinetemplates
    type: github
    name: 'mygitorg/myrepo'
    endpoint: 'mygitorg'
stages:
- stage: Deploy_Cluster
  variables:
    azureResourceGroup: 'myresourcegroup'
    kubernetesCluster: 'mycluster'
    releaseName: 'mychartreleasename'
    #valueFile: '**/mychart/values.yaml'
    domain: 'mydomain'
  jobs:
  - job: A
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: PowerShell@2
      displayName: 'Fetching ValueFiles'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "Fetching value files"
          cd myfolder
          $a=git ls-files
          $List = $a | foreach {'**/myfolder/dev/' + $_}
          Write-Host $List
          $d = '"{0}"' -f ($List -join '","')
          Write-Host $d  ### output is: "**/myfolder/dev/file1.yaml","**/myfolder/dev/file2.yaml"
          Write-Host "##vso[task.setvariable variable=valuefiles;isOutput=true]$d"
      name: fileoutput
  - template: Template-Yaml/deploy-jobs.yaml@pipelinetemplates  ## Error: expected a sequence or mapping
    parameters:
      files: $[dependencies.Deploy.outputs['fileoutput.valuefiles']]
I got some ideas regarding the use of dependencies from this page: https://www.aaron-powell.com/posts/2019-05-24-azure-pipeline-templates-and-parameters/.
I have googled a lot, but I couldn't find a solution to this issue so far; any help would be appreciated.
Tested the reply suggested by Levi:
parameters:
  files: []
jobs:
#- ${{each file in parameters.files}}:
- job: Deploy
  dependsOn: A
  pool:
    vmImage: 'ubuntu-latest'
  variables:
    filenames: ${{ parameters.file }}
  steps:
  - task: Bash@3
    displayName: 'Fetching repo-tag'
    inputs:
      targetType: 'inline'
      script: |
        ##echo files=$(filenames)  # output is files=file1.yaml,file2.yaml
        for i in $(filenames)
        do
          echo "valuefiles= $i "
        done
Output is: valuefiles= files=file1.yaml,file2.yaml
Testing with PowerShell:
- task: PowerShell@2
  displayName: 'Fetching ValueFiles'
  inputs:
    targetType: 'inline'
    script: |
      foreach ($i in ${{ parameters.files }}) {
        Write-Host "filenames=$i"
      }
Error: ObjectNotFound: ($[dependencies.A.ou…output.valuefiles]]:String) [], ParentContainsErrorRecordException
- task: PowerShell@2
  displayName: 'Fetching ValueFiles'
  inputs:
    targetType: 'inline'
    script: |
      foreach ($i in $(filenames)) {
        Write-Host "filenames=$i"
      }
Error: foreach ($i in ) {
+ ~
Unexpected token ')' in expression or statement.
+ CategoryInfo : ParserError: (:) [], ParseException
The error occurs because the files can only be calculated at runtime, but - ${{each file in parameters.files}} is evaluated at compile time. Check here for more information about variable syntax.
- ${{each file in parameters.files}} won't work if the dynamic values are passed through parameters.
I don't know much about Kubernetes, but if you can manage to do the HelmDeploy work with a PowerShell/Bash script, you can loop over the files inside the script and deploy each file.
- powershell: |
    foreach ($i in $(filenames)) { helm.exe upgrade .... }
A similar issue has been submitted to the Microsoft development team, and it was determined that the above behavior is by design. You can still submit a feature request by clicking "Suggest a feature" and choosing "Azure DevOps".
Update:
azure-pipeline.yaml
trigger:
  branches:
    include:
    - master
stages:
- stage: Deploy_Cluster
  jobs:
  - job: A
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - task: PowerShell@2
      displayName: 'Fetching ValueFiles'
      inputs:
        targetType: 'inline'
        script: |
          Write-Host "Fetching value files"
          cd '$(system.defaultworkingdirectory)'
          $a=git ls-files
          $List = $a | foreach {'**/myfolder/dev/' + $_}
          Write-Host $List
          $d = '"{0}"' -f ($List -join '","')
          Write-Host $d
          Write-Host "##vso[task.setvariable variable=valuefiles;isOutput=true]$d"
      name: fileoutput
  - job: B
    dependsOn: A
    pool:
      vmImage: 'ubuntu-latest'
    variables:
      allfiles: $[dependencies.A.outputs['fileoutput.valuefiles']]
    steps:
    - template: deploy-jobs.yaml
      parameters:
        files: $(allfiles)
deploy-jobs.yaml
parameters:
  files: []
steps:
- task: PowerShell@2
  displayName: 'Fetching ValueFiles'
  inputs:
    targetType: 'inline'
    script: >
      foreach ($i in ${{parameters.files}})
      {
        Write-Host "filenames=$i"
      }
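This works because ${{ parameters.files }} is expanded at template-expansion time into the literal text $(allfiles), and that macro is then replaced at runtime with the quoted, comma-separated list produced by job A, which PowerShell reads as an array literal. Roughly, the step that actually runs looks like the following (the file names are illustrative):
- task: PowerShell@2
  displayName: 'Fetching ValueFiles'
  inputs:
    targetType: 'inline'
    script: >
      foreach ($i in "**/myfolder/dev/file1.yaml","**/myfolder/dev/file2.yaml")
      {
        Write-Host "filenames=$i"
      }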