I have a GitHub Actions workflow dealing with new releases. Some of the last steps build the application for different platforms. Instead of creating multiple steps where each one builds for a different platform, or creating one step running multiple commands, I'm looking for a way to loop a step over each item in an array.
I know there is a matrix for jobs, so this is my pseudo implementation to show what I'm looking for:
jobs:
  do_it:
    runs-on: ubuntu-latest
    steps:
      - name: For each entry in the array...
        strategy:
          matrix:
            target: [ this, that, something ]
        run: echo ${{ matrix.target }}
Is it possible to create something similar to a matrix so that it loops the step multiple times?
As a side note, I know there is a similar question, Using an array of values to repeat a step in GitHub Actions workflow, but I don't want to split the job into multiple jobs because then I'd have to deal with passing all the build artifacts between them.
To loop the step multiple times, try something like this:
jobs:
  do_it:
    runs-on: ubuntu-latest
    steps:
      - name: For each entry in the array...
        env:
          PLATFORMS: "[ linux, macOs, windows ]"  # quoted so YAML treats it as a string, not a sequence
        run: |
          platform_list=$(echo "${{ env.PLATFORMS }}" | tr -d '[],')
          for target in $platform_list; do
            echo "Building for $target platform..."
            # make $target
          done
Basically, the env block sets an environment variable PLATFORMS to a string that looks like an array, and the run step strips the brackets and commas with tr and then iterates over the remaining values with a shell for loop.
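The loop body itself can be sketched as plain shell; if the platform list is kept as a simple space-separated string in the first place, the tr clean-up is not needed at all (the platform names here are just placeholders):

```shell
# Platform list kept as a plain space-separated string: nothing to strip
PLATFORMS="linux macos windows"
for target in $PLATFORMS; do
  echo "Building for $target platform..."
  # the real build command would go here, e.g. make "$target" (target names are assumptions)
done
```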
I'm having difficulty figuring out the syntax for triggering on different event types.
For example, the following gives me a "duplicated mapping key" error on the second pull_request trigger.
on:
  pull_request:
    types: [opened, reopened]
    branches:
      - main
      - develop
  pull_request:
    types: [synchronize]
    branches:
      - main
      - develop
    paths: ['**.h', '**.cpp', '**.hpp', '**.yaml', '**CMakeLists.txt', '**Makefile', '**.spec', '**.py', '**Dockerfile', '**conanfile.txt']
I want the workflow to always run when a PR is first opened (or reopened), but when the branch is subsequently synchronized it should only run if the changes touch one of the specified file types.
To clarify, I already have an on.push event hook that's not shown here for the sake of brevity.
I do believe I need a pull_request.synchronize event to handle updates.
I can't find anything in the documentation on how to do that. I tried combining the two pull_request triggers, but then I get an error that the "types" key is duplicated.
Any ideas?
The documentation does talk about triggering on multiple events, but not on multiple events of the same type, so it isn't entirely clear whether this is possible (the validation errors suggest it isn't).
To make this work you need to define three workflows: two caller workflows, one for each trigger variant with its filters, and a third reusable workflow that both of them invoke via a workflow_call event.
#workflow-1
on:
  pull_request:
    types: [opened, reopened]
    branches:
      - main
      - develop
jobs:
  job:
    uses: ./.github/workflows/workflow-3.yml

#workflow-2
on:
  pull_request:
    types: [synchronize]
    branches:
      - main
      - develop
    paths: ['**.h', '**.cpp', '**.hpp', '**.yaml', '**CMakeLists.txt', '**Makefile', '**.spec', '**.py', '**Dockerfile', '**conanfile.txt']
jobs:
  job:
    uses: ./.github/workflows/workflow-3.yml

#workflow-3
on:
  workflow_call:
jobs:
  job:
    runs-on: ubuntu-latest
    steps:
      - run: do stuff
Context
A reusable workflow in a public repo may be called by appending a reference, which can be a SHA, a release tag, or a branch name, for example:
{owner}/{repo}/.github/workflows/{filename}@{ref}
GitHub's documentation states:
When a reusable workflow is triggered by a caller workflow, the github context is always associated with the caller workflow.
The problem
Since the github context is always associated with the caller workflow, the reusable workflow cannot access the reference, for example the tag v1.0.0. However, knowing the reference is important when the reusable workflow needs to check out the repository in order to make use of composite actions.
Example
Assume that the caller workflow is being executed from within the main branch and calls ref v1.0.0 of a reusable workflow:
name: Caller workflow
on:
  workflow_dispatch:
jobs:
  caller:
    uses: owner/public-repo/.github/workflows/reusable-workflow.yml@v1.0.0
Here is the reusable workflow that uses a composite action:
name: reusable workflows
on:
  workflow_call:
jobs:
  first-job:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3.1.0
        with:
          repository: owner/public-repo
          ref: ${{ github.ref_name }}
      - name: composite action
        uses: ./actions/my-composite-action
In the above snippet, ${{ github.ref_name }} is main instead of v1.0.0, because the github context is always associated with the caller workflow. Therefore the composite action's code is checked out from main and not from v1.0.0, even though the caller asked for v1.0.0.
Hence my question: how is the reusable workflow able to access the reference given by the caller?
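One possible workaround, sketched here as an assumption rather than taken from the question, is to pass the intended reference explicitly as a workflow_call input, since the caller knows which tag it is pinning (the input name `ref` is made up):

```
# Caller workflow: pass the pinned tag explicitly (input name "ref" is an assumption)
jobs:
  caller:
    uses: owner/public-repo/.github/workflows/reusable-workflow.yml@v1.0.0
    with:
      ref: v1.0.0

# Reusable workflow: declare the input and use it for the checkout
on:
  workflow_call:
    inputs:
      ref:
        type: string
        required: true
jobs:
  first-job:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3.1.0
        with:
          repository: owner/public-repo
          ref: ${{ inputs.ref }}
```

The downside is that the tag has to be written twice in the caller, once in uses and once in with, and nothing enforces that the two stay in sync.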
In Azure, I have a release pipeline with two stages. The first stage stores a value in a pipeline variable, and I need to access that variable's value in the second stage's tasks. Currently the value retrieved in the second stage is empty; it is, however, not empty when accessed within the same stage (across multiple tasks in that stage).
I have checked Microsoft's documentation and it seems to only cover YAML pipelines.
How to access stageDependencies variables in multiple stages in Azure Release pipeline using bash
Yes, stage-to-stage dependencies are only available in YAML pipelines.
For a Classic pipeline, we need to pass the values manually so that we can use them in the next stage.
We can use the REST API to update the variable in the Variables tab.
Steps:
1. Define a variable on the release definition's Variables tab.
2. In stage 1, use the REST API (Definitions - Update) to update the value of the release definition variable.
3. Use the updated value of the release definition variable in the second stage.
For details on using the REST API to update the value of a release definition variable, you can follow this ticket:
How to modify Azure DevOps release definition variable from a release task?
Or you could use the Azure CLI to update the variable:
az pipelines variable update --name
[--allow-override {false, true}]
[--detect {false, true}]
[--new-name]
[--org]
[--pipeline-id]
[--pipeline-name]
[--project]
[--prompt-value {false, true}]
[--secret {false, true}]
[--subscription]
[--value]
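For example, a sketch of a concrete invocation (the organization URL, project, pipeline, and variable names are hypothetical; note that az pipelines variable targets pipeline definitions, so whether it covers Classic release definitions is worth verifying against the CLI docs):

```
az pipelines variable update \
    --org https://dev.azure.com/my-org \
    --project MyProject \
    --pipeline-name MyPipeline \
    --name myVariable \
    --value "new-value"
```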
I would like to set a property in a component process so that it is available in all subsequent steps, in the rest of the current process and in all other processes that are called from there.
So, in a component process, I'm using the Deploy Process plugin to set a value to a property, in the scope of the parent request.
Here's the illustration:
Deploy Request
  Application Process: AppProcess1
    Install Component
      component name: Comp1
      component process: Comp1-Proc1
      Step 1: Set Process Request Property
        name: PROP_1
        value: val1
        process request id: ${p:parentRequest.id}
      Step 2: Shell
        Shell Script: echo ${p:PROP_1} --> Output: <empty-string>
      Step 3: Run Component Process
        component process: Comp1-Proc2
        Step 1: Shell
          Shell Script: echo ${p:PROP_1} --> Output: val1
      Step 4: Shell
        Shell Script: echo ${p:PROP_1} --> Output: val1
The problem is that the value is not available via ${p:PROP_1} in the remaining steps of the current process (Comp1-Proc1) unless another component process (Comp1-Proc2) is called first; there the value is available, and after returning to the first process it becomes available there too.
Am I doing something wrong? Is this expected behavior?
I'm using on-premise UrbanCode Deploy, version 7.0.2.3.ifix01.1022337.
I can't find anything in the official UCD documentation, nor in the plugin docs, that would explain the above behavior.
Try Set Property. With this step you can extend the scope of the property you are setting. For example:
Application process 1
  Step 1 - shell
Deploy Request
  Application Process: AppProcess1
    Install Component
      component name: Comp1
      component process: Comp1-Proc1
      Shell 1 - (here use Set Environment Property)
    Install Component
      component name: Comp2
      component process: Comp2-Proc1
      Shell 1 - (here you can refer to it: ${p:YourEnvironmentName/YourPropertyname})
You just need to extend the scope of the property to a higher level.
I cannot get the code coverage configuration working; the report is always 0%. I'm using Codeception coverage with two projects, the first one with:
Yii2
WebDriver module
Weird stuff:
I have two codeception.yml:
/tests/codeception.yml
/codeception.yml
c3.php is not in root. It is on /vendor/codeception/codeception/tests/data/claypit/c3.php
I'm not sure where I have to include c3.php.
Since I'm also not sure which codeception.yml is the right file, I have the same configuration in both files.
actor: Tester
paths:
    tests: tests
    log: tests/_output
    data: tests/_data
    helpers: tests/_support
settings:
    bootstrap: _bootstrap.php
    colors: false
    memory_limit: 1024M
modules:
    config:
coverage:
    enabled: true
    remote: false
    include:
        - /controllers/*
    c3_url: 'http://127.0.0.1/tmsO/#/'
I have the same problem with the second project, the differences are that I'm using:
Yii1
PhpBrowser module
Asserts module
REST module
Thank you in advance. I really need help.
When you use Codeception for acceptance tests with PhpBrowser/WebDriver, you need remote coverage (remote: true). The console will therefore always say 0%, but the report will be saved in the _output directory.
On the remote side, c3.php collects all the needed data, so you need to include it in every call to your application (for more information see https://github.com/Codeception/c3).
You can find the documentation for setting up code coverage here: http://codeception.com/docs/11-Codecoverage
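Putting that advice together, a sketch of the coverage section for this setup (the c3_url and include path are assumptions; c3_url must point at the application entry script that includes c3.php, not at a #/ fragment URL):

```
coverage:
    enabled: true
    remote: true
    include:
        - controllers/*
    c3_url: 'http://127.0.0.1/tmsO/index.php'
```

In the entry script itself (index.php in this sketch), c3.php has to be included before the application boots, e.g. `include __DIR__ . '/c3.php';`.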