How to define a topic trigger in a .yaml for GitHub Actions for deploying a Google Cloud Function?
A working file with an HTTP trigger (the default) is:
# This is a basic workflow to help you get started with Actions
name: CD

# Controls when the workflow will run
on:
  # Triggers the workflow on push or pull request events but only for the main branch
  push:
    branches: [main]
  pull_request:
    branches: [main]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "deploy"
  deploy:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Add "id-token" with the intended permissions.
    permissions:
      contents: "read"
      id-token: "write"

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v2

      - id: "auth"
        name: "Authenticate to Google Cloud"
        uses: "google-github-actions/auth@v0"
        with:
          # Replace with your Workload Identity Provider location
          workload_identity_provider: "${PROJECT_VALUE}"
          # Replace with your GitHub service account
          service_account: "github-actions-service-account@${PROJECT_NAME}.iam.gserviceaccount.com"

      - id: "deploy"
        uses: "google-github-actions/deploy-cloud-functions@v0"
        with:
          # Name of the Cloud Function, same as the entry point name
          name: "${FUNCTION_NAME}"
          # Runtime to use for the function
          runtime: "nodejs16"
This thread has some suggestions, but getting the topic ID is unclear:
https://github.com/google-github-actions/deploy-cloud-functions/issues/5
Adding the following works:
- id: "deploy"
  uses: "google-github-actions/deploy-cloud-functions@v0"
  with:
    # Name of the Cloud Function, same as the entry point name
    name: "${FUNCTION_NAME}"
    # Runtime to use for the function
    runtime: "nodejs16"
    event_trigger_type: "google.pubsub.topic.publish"
    event_trigger_resource: "projects/${PROJECT_VALUE}/topics/${TOPIC_ID}"  # i.e. the string identifying the topic
    event_trigger_service: "pubsub.googleapis.com"
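For anyone else stuck on the topic ID: as far as I can tell it is simply the Pub/Sub topic name, i.e. the last segment of the projects/<project>/topics/<name> resource path. Assuming the gcloud CLI is authenticated against the same project (placeholders as above), something like this should show or create it:
# List existing topics; the final path segment of each name is the TOPIC_ID
gcloud pubsub topics list --project="${PROJECT_VALUE}"
# Or create the topic if it does not exist yet
gcloud pubsub topics create "${TOPIC_ID}" --project="${PROJECT_VALUE}"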
The final issue was that, having already deployed the function with an HTTP trigger, that deployment had to be deleted before the above would deploy successfully.
Related
I've been working on this for months now, but I still can't parse the commit message properly with Python (see script below).
Every commit message in my repository begins with the release's version number.
As of this writing, for example, parsing the commit message should result in the tag:
v8.11.0
I get this error message instead:
I'm not certain whether the variable tag is being created or not.
Python is not working for me. Would anyone have another approach?
# This workflow tests and releases the latest build
name: CI

# Controls when the action will run.
on:
  # Triggers the workflow on push or pull request events but only for the master branch
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build-and-test"
  build-and-test:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v2

      # Use the standard Java action to set up Java
      # we want the latest Java 12
      - uses: actions/setup-java@v1
        with:
          java-version: '12.x'

      # Use the community action to install Flutter
      # we want the stable channel
      - uses: subosito/flutter-action@v1
        with:
          channel: 'stable'

      # Get flutter packages
      - run: flutter pub get

      # Check for any formatting issues in the code.
      - run: flutter format .

      # Analyze our Dart code, but don't fail if there are issues.
      - run: flutter analyze . --preamble --no-fatal-infos --no-fatal-warnings

      # Run our tests
      - run: flutter test --coverage

      # Upload to codecov
      - uses: codecov/codecov-action@v2
        with:
          token: ${{secrets.CODECOV_TOKEN}}
          file: ./coverage/lcov.info

      # Parse a tag from the commit message
      - id: get-tag
        shell: python3 {0}
        run: |
          import json
          import os
          with open(os.environ['GITHUB_EVENT_PATH']) as fh:
              event = json.load(fh)
          tag = event['head_commit']['message'].split()[0]  # <----- tag NOT CREATED?!

      # Create a Release
      - uses: softprops/action-gh-release@v1
        env:
          # This token is provided by Actions, you do not need to create your own token
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          tag_name: v${{ steps.get-tag.outputs.tag }}       # <----- ERROR HERE!
          release_name: ${{ steps.get-tag.outputs.tag }}    # <----- ERROR HERE!
          body: |
            See CHANGELOG.md
          draft: false
          prerelease: false
Using an alternative approach, I'm able to produce a tag using the current date.
This proves that everything works except the part that assigns the tag value using Python.
# Get current datetime in ISO format
- id: date
  run: echo "::set-output name=date::$(date -u +'%Y-%m-%d')"

# Create a Release
- uses: softprops/action-gh-release@v1
  env:
    # This token is provided by Actions, you do not need to create your own token
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
  with:
    tag_name: ${{ steps.date.outputs.date }}v${{ github.run_number }}
    name: ${{ steps.date.outputs.date }}v${{ github.run_number }}
    body: |
      See CHANGELOG.md
    draft: false
    prerelease: false
Any ideas?
steps.get-tag.outputs.tag is not correctly output in your workflow.
You should output it as described in the docs:
- id: get-tag
  shell: python3 {0}
  run: |
    import json
    import os
    with open(os.environ['GITHUB_EVENT_PATH']) as fh:
        event = json.load(fh)
    tag = event['head_commit']['message'].split()[0]
    print("::set-output name=tag::" + tag)  # <--- This line
Is there any way to control which jobs/steps run in a workflow based on changes in a specific folder?
For example:
I have, say, the following folders in my git repo: a, b, c.
On every PR merge to my branch I trigger a workflow that runs jobs
A -> B -> C. I want to run job A only if changes are present under "a/**", B only for "b/**", and so on.
So, if the PR only changes files under "a/**" and "b/**", the workflow will skip job C, making the run A -> B.
You could use the paths-filter custom action with if conditions at the job or step level, using a preliminary setup job to check whether your specific path has been updated and saving the result as an output.
Here is an example:
name: Paths Filter Example
on: [push, workflow_dispatch]
jobs:
  paths-filter:
    runs-on: ubuntu-latest
    outputs:
      output1: ${{ steps.filter.outputs.workflows }}
    steps:
      - uses: actions/checkout@v2
      - uses: dorny/paths-filter@v2
        id: filter
        with:
          filters: |
            workflows:
              - '.github/workflows/**'
      # run only if 'workflows' files were changed
      - name: workflow tests
        if: steps.filter.outputs.workflows == 'true'
        run: echo "Workflow file"
      # run only if no 'workflows' files were changed
      - name: not workflow tests
        if: steps.filter.outputs.workflows != 'true'
        run: echo "NOT workflow file"
  next-job:
    runs-on: ubuntu-latest
    # Wait for the paths-filter job to complete before starting next-job
    needs: paths-filter
    if: needs.paths-filter.outputs.output1 == 'true'
    steps:
      ...
That way, you could have something like this in your jobs: A --> B or A --> C depending on the path that has been updated.
Yes: https://docs.github.com/en/actions/learn-github-actions/events-that-trigger-workflows#registry_package
This is the syntax:
on:
  push:
    paths:
      - 'a/**'
      - 'b/**'
I am trying to set up GitHub Actions for deployment to Azure. What I am trying to do is get the names of some variables from the ARM templates with the given code.
name: Create Initial Resources
on:
  push:
    branches:
      - CreateResources
jobs:
  Read:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout branch
        uses: actions/checkout@v2
        with:
          ref: CreateResources
      - name: get storage account
        run: |
          echo ::set-output name=storage-account-name::$(jq '.parameters.storage_account_name.value' at ./armtemplates/sac/parameters-example.json)
This is the code that I use: the first step checks out the branch, and the second step tries to parse a file in that branch, but I get this error:
jq: error: Could not open file at: No such file or directory
The issue here is the stray at in the command. Please use this:
echo ::set-output name=storage-account-name::$(jq '.parameters.storage_account_name.value' ./armtemplates/sac/parameters-example.json)
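One more detail worth noting (an assumption on my part about how the output will be consumed): the step needs an id for its output to be referenced later, and jq -r strips the surrounding quotes from the captured value. A later step could then use it roughly like this, where read-params is an illustrative step id:
- name: get storage account
  id: read-params
  run: |
    echo ::set-output name=storage-account-name::$(jq -r '.parameters.storage_account_name.value' ./armtemplates/sac/parameters-example.json)
- name: use storage account
  run: echo "Storage account is ${{ steps.read-params.outputs.storage-account-name }}"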
I'm reading this quickstart guide.
The guide provides an example action.yml:
# action.yml
name: 'Hello World'
description: 'Greet someone and record the time'
inputs:
  who-to-greet: # id of input
    description: 'Who to greet'
    required: true
    default: 'World'
outputs:
  time: # id of output
    description: 'The time we greeted you'
runs:
  using: 'docker'
  image: 'Dockerfile'
  args:
    - ${{ inputs.who-to-greet }}
My question concerns the last section in this file, runs:. My workflow will use multiple Docker images. What's the 'right' way to do this? Should I create multiple action.yml files? Should I use multiple repos? Or can I somehow reference multiple Dockerfiles within runs:?
The docs only mention the usage of a single docker image in a docker action.
It is, however, possible to run Docker containers in a step of a normal workflow, as stated in this Stack Overflow thread. This means you can create a composite action to make use of the same features provided by a normal workflow.
name: CI
on: [push]
jobs:
  myjob:
    runs-on: ubuntu-latest # linux required if you want to use docker
    steps:
      - uses: docker://continuumio/anaconda3:2019.07 # runs the anaconda image from DockerHub
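As a sketch of the multi-image case (the image names and args below are arbitrary examples, and what each args string does depends on the image's entrypoint), a single job can simply chain several docker:// steps, so you don't need one action.yml per Dockerfile:
jobs:
  myjob:
    runs-on: ubuntu-latest
    steps:
      # First container: run a command inside the anaconda image
      - uses: docker://continuumio/anaconda3:2019.07
        with:
          args: python --version
      # Second container in the same job, using a different image
      - uses: docker://alpine:3.16
        with:
          args: echo "hello from a second image"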
I am new to GitHub Actions, and I see two things used for configuring steps (correct me if I am wrong): with and env.
What is the difference between these two, and how are they used?
uses: someAction
with:
  x: 10
  y: 20
env:
  x1: 30
  y2: 40
with - is used specifically for passing parameters to the action
env - is used specifically for introducing environment variables, which can be accessed depending on the scope at which they are defined:
workflow envs - can be accessed by all resources in the workflow except services
job envs - can be accessed by all resources under the job except services
step envs - can be accessed by any resource within the step
Here is an example of how parameters are handled.
Let's say an action is created with the following parameter in its action.yaml:
name: 'Npm Audit Action'
inputs:
  dirPath:
    description: 'Directory path of the project to audit'
    required: true
    default: './'
Then we provide this parameter through the with tag in our workflow:
- name: Use the action
  uses: meroware/npm-audit-action@v1.0.2
  with:
    dirPath: vulnerable-project
Then in the action code, if building a Node.js action, we would handle it like this:
const core = require("@actions/core");
const dirPath = core.getInput("dirPath");
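As an aside (based on my understanding of the runner), core.getInput is a thin wrapper: each with: value is exposed to the action as an INPUT_<NAME> environment variable with the name upper-cased, so the parameter above could equivalently be read as:
// Equivalent to core.getInput("dirPath"); the runner exposes the input as INPUT_DIRPATH
const dirPath = (process.env["INPUT_DIRPATH"] || "").trim();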
Envs within actions are accessed differently. If we are building a Node.js action, we would access them through process.env. Going back to our example action:
name: 'Npm Audit Action'
env:
  SOME_ENV: 'hey I am an env'
Then this could be accessed as:
const { SOME_ENV: someEnv } = process.env
You can see in the documentation that with: is used to define an input variable for an action,
while env defines an environment variable, as described here and in jobs.<job_id>.env:
an environment variable defined in a step will override job and workflow variables with the same name, while the step executes.
A variable defined for a job will override a workflow variable with the same name, while the job executes.
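A small sketch of that precedence (the variable name and values below are made up for illustration):
env:
  GREETING: from-workflow          # workflow-level value
jobs:
  demo:
    runs-on: ubuntu-latest
    env:
      GREETING: from-job           # overrides the workflow value while this job runs
    steps:
      - run: echo "$GREETING"      # prints "from-job"
      - run: echo "$GREETING"      # prints "from-step"
        env:
          GREETING: from-step      # overrides the job value for this step only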
You can use either of them to pass secrets:
steps:
  - name: Hello world action
    with: # Set the secret as an input
      super_secret: ${{ secrets.SuperSecret }}
    env: # Or as an environment variable
      super_secret: ${{ secrets.SuperSecret }}
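One more point, based on my understanding: with: only feeds the declared inputs of a uses: step, so for a plain run: step an environment variable is the way to hand over the secret, e.g.:
- name: Use the secret in a shell command
  # deploy.sh is a hypothetical script that reads $SUPER_SECRET
  run: ./deploy.sh
  env:
    SUPER_SECRET: ${{ secrets.SuperSecret }}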