I am currently trying to implement a workflow which requires protobuf to be installed. However, on Ubuntu I have to compile it myself. The problem is that this takes quite some time, so I figured caching this step is the thing to do.
However, I am not sure how I can use actions/cache for this, if at all possible.
The following is how I am installing protobuf and my Python dependencies:
name: Service
on:
  push:
    branches: [develop, master]
jobs:
  test:
    runs-on: ubuntu-18.04
    steps:
      - name: Install protobuf-3.6.1
        run: |
          wget https://github.com/protocolbuffers/protobuf/releases/download/v3.6.1/protobuf-all-3.6.1.tar.gz
          tar -xvf protobuf-all-3.6.1.tar.gz
          cd protobuf-3.6.1
          ./configure
          make
          make check
          sudo make install
          sudo ldconfig
      - uses: actions/checkout@v2
      - name: Install Python dependencies
        run: |
          python -m pip install --upgrade pip setuptools
          pip install -r requirements.txt
How can I cache these run steps so that they don't have to run each time?
I tested the following:
name: Service
on:
  push:
    branches: [develop, master]
jobs:
  test:
    runs-on: ubuntu-18.04
    steps:
      - uses: actions/checkout@v2
      - name: Load from cache
        id: protobuf
        uses: actions/cache@v1
        with:
          path: protobuf-3.6.1
          key: protobuf3
      - name: Compile protobuf-3.6.1
        if: steps.protobuf.outputs.cache-hit != 'true'
        run: |
          wget https://github.com/protocolbuffers/protobuf/releases/download/v3.6.1/protobuf-all-3.6.1.tar.gz
          tar -xvf protobuf-all-3.6.1.tar.gz
          cd protobuf-3.6.1
          ./configure
          make
          make check
      - name: Install protobuf
        run: |
          cd protobuf-3.6.1
          sudo make install
          sudo ldconfig
      - name: Install Python dependencies
        run: |
          python -m pip install --upgrade pip setuptools
          pip install -r requirements.txt
I would also delete all the source files once it's built.
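For reference, a minimal sketch of the same cache step with the protobuf version baked into the cache key, so that bumping the version later would invalidate the cache; the exact key format is an assumption on my part, not something required by actions/cache:

      - name: Load from cache
        id: protobuf
        uses: actions/cache@v1
        with:
          path: protobuf-3.6.1
          # Key includes the OS and the protobuf version, so a new version forces a rebuild.
          key: ${{ runner.os }}-protobuf-3.6.1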
Related
I've got a workflow that has some setup steps that install PHP with a bunch of extensions and Composer.
Is it possible to cache the Install PHP and Install Composer & Dependencies so these steps don't have to happen on every run?
These steps combined take about 4 mins of a 5 min run.
name: Build
on:
  workflow_call:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
        with:
          path: ./src
      - name: Install PHP
        run: |
          sudo apt-get update -y && export DEBIAN_FRONTEND=noninteractive DEBCONF_NONINTERACTIVE_SEEN=true
          echo "tzdata tzdata/Areas select Europe" >> /tmp/preseed.cfg
          echo "tzdata tzdata/Zones/Europe select Berlin" >> /tmp/preseed.cfg
          sudo apt install software-properties-common -y
          sudo add-apt-repository ppa:ondrej/php
          sudo apt update
          sudo apt-get install php8.1 -y --quiet
          sudo apt-get install php8.1-dev libmcrypt-dev php-pear php-xml php8.1-xml -y
          sudo pecl channel-update pecl.php.net
          sudo apt-get install -y libapache2-mod-php8.1 php8.1-common php8.1-gmp php8.1-curl php8.1-soap php8.1-bcmath php8.1-intl php8.1-mbstring php8.1-xmlrpc php8.1-mysql php8.1-gd php8.1-xml php8.1-cli php8.1-zip
          sudo rm /usr/bin/php; sudo ln -sf /usr/bin/php8.1 /usr/bin/php
      - name: Install Composer & Dependencies
        run: |
          cd ./src/ || exit 99
          sudo apt-get install -y git zip libzip-dev openssh-client libmcrypt-dev
          sudo curl -sS https://getcomposer.org/installer | sudo php -- --install-dir=/usr/local/bin --filename=composer
          cp auth.json.pipeline auth.json
          sudo composer self-update --2
          composer install --no-dev --verbose --prefer-dist --no-ansi --no-interaction
I would recommend using the Setup PHP and Install PHP Dependencies with Composer GitHub Actions instead of manually installing PHP and Composer with dependencies.
These actions already have a caching mechanism and additionally provide many other useful features.
For example:
name: PHP
on:
  workflow_dispatch:
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v3
      - uses: 'shivammathur/setup-php@v2'
        name: Setup PHP
        with:
          php-version: '8.1'
      - uses: 'ramsey/composer-install@v2'
        name: Install Composer & Dependencies
      - name: Run Script
        run: |
          php -v
          composer -v
Running this workflow on my little test PHP project with a few dependencies takes about 30 seconds without the cache and about 20 seconds when the cache exists.
Execution demo: Screenshot.
Cache demo: Screenshot.
In addition, there is a Cache GitHub action that can be used if you need to manually cache some things for your workflow.
Learn more about managing caches.
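For illustration, a minimal sketch of using the cache action directly for Composer's package cache; the cache path and key here are assumptions for a typical Composer project, not taken from the workflow above:

      - name: Cache Composer packages
        uses: actions/cache@v3
        with:
          # Composer's default cache directory on the Ubuntu runners (assumed).
          path: ~/.composer/cache/files
          # Key changes whenever composer.lock changes; older caches can still be restored.
          key: composer-${{ hashFiles('**/composer.lock') }}
          restore-keys: composer-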
How do I specify whether to zappa deploy or zappa update my application in GitHub Actions with some sort of if statement?
My workflow is as per below:
name: Dev Deploy
on:
  push:
    branches:
      - mybranch
jobs:
  dev-deploy:
    name: Deploy to Dev
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Set up Python 3.9.10
        uses: actions/setup-python@v1
        with:
          python-version: 3.9.10
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install pytest
          pip install python-Levenshtein
          pip install virtualenv
      - name: Install zappa
        run: pip install zappa
      - name: Install Serverless
        run: npm install -g serverless
      - name: Configure Serverless for zappa Services
        run: serverless config credentials --provider aws --key myAWSKey --secret myAWSSecret
      - name: Deploy to Dev
        run: |
          python -m virtualenv envsp
          source envsp/bin/activate
          zappa deploy dev
If the application has already been deployed once, I get the error
Error: This application is already deployed - did you mean to call update?
In which case I would want to run zappa update dev instead.
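One possible way to handle this (a sketch only; it assumes zappa status dev exits with a non-zero status when the stage has not been deployed yet, which you would want to verify) is to branch inside the deploy step:

      - name: Deploy to Dev
        run: |
          python -m virtualenv envsp
          source envsp/bin/activate
          # First deployment uses "deploy", later runs use "update".
          if zappa status dev; then
            zappa update dev
          else
            zappa deploy dev
          fi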
I am trying to install GSAP 3 with the Shockingly Green package.
The following are the steps recommended by the plugin.
//npm.greensock.com/:_authToken=XXXXXXXXXXXXXXXXXXXXXX
@gsap:registry=https://npm.greensock.com
And
yarn add gsap@npm:@gsap/shockingly
This is the workflow file I am using in GitHub Actions.
name: Deployment
on:
  push:
    branches: [ development ]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v1.4.4
        with:
          node-version: 14.16.0
      - name: Setup PHP with intl
        uses: shivammathur/setup-php@v2
        with:
          php-version: '7.4'
          extensions: intl-67.1
      - name: Install Composer
        run: sudo composer
      - name: Install dependencies
        run: |
          composer install -o
          yarn
        env:
          NPM_AUTH_TOKEN: ${{ secrets.NPM_AUTH_TOKEN }}
      - name: Build
        run: yarn build
      - name: Sync
        env:
          dest: 'root@XXXXXXXXXXX:/var/www/html/wp-content/themes/XXXXXXX'
        run: |
          echo "${{secrets.DEPLOY_KEY}}" > deploy_key
          chmod 600 ./deploy_key
          rsync -chav --delete \
            -e 'ssh -i ./deploy_key -o StrictHostKeyChecking=no' \
            --exclude /deploy_key \
            --exclude /.git/ \
            --exclude /.github/ \
            --exclude /node_modules/ \
            ./ ${{env.dest}}
I have added the proper NPM_AUTH_TOKEN secret with the token provided to me by GSAP.
But I keep getting this error.
yarn install v1.22.17
warning package-lock.json found. Your project contains lock files generated by tools other than Yarn. It is advised not to mix package managers in order to avoid resolution inconsistencies caused by unsynchronized lock files. To clear this warning, remove package-lock.json.
[1/5] Validating package.json...
[2/5] Resolving packages...
[3/5] Fetching packages...
error An unexpected error occurred: "https://npm.greensock.com/@gsap%2fshockingly/-/shockingly-3.8.0.tgz: Request failed \"403 Forbidden\"".
info If you think this is a bug, please open a bug report with the information provided in "/home/runner/work/lacadives-theme/lacadives-theme/yarn-error.log".
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.
Error: Process completed with exit code 1.
I think the action is running against the wrong registry.
Is there a way I can change the registry to @gsap:registry=https://npm.greensock.com?
There were two issues which needed to be addressed:
Setting up .npmrc correctly
And removing the yarn.lock file
My final yml file:
name: Deployment
on:
  push:
    branches: [ development ]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v1.4.4
        with:
          node-version: 14.16.0
      - name: Create NPMRC
        run: |
          echo "//npm.greensock.com/:_authToken=XXXXXXXXXXXXXXXXXXXXXXXXXXXX" >> ~/.npmrc
          echo "@gsap:registry=https://npm.greensock.com" >> ~/.npmrc
      - name: Setup PHP with intl
        uses: shivammathur/setup-php@v2
        with:
          php-version: '7.4'
          extensions: intl-67.1
      - name: Install Composer
        run: sudo composer
      - name: Install dependencies
        run: |
          composer install -o
          rm yarn.lock
          yarn
      - name: Build
        run: yarn build
      - name: Sync
        env:
          dest: 'root@XXXXXXX:/var/www/html/wp-content/themes/XXXXXXXXX'
        run: |
          echo "${{secrets.DEPLOY_KEY}}" > deploy_key
          chmod 600 ./deploy_key
          rsync -chav --delete \
            -e 'ssh -i ./deploy_key -o StrictHostKeyChecking=no' \
            --exclude /deploy_key \
            --exclude /.git/ \
            --exclude /.github/ \
            --exclude /node_modules/ \
            ./ ${{env.dest}}
I want to set up a GitHub Actions container with both Dart and Python. I have used the Dart actions template and installed Python. However, I keep getting an error saying
WARNING: The directory '/github/home/.cache/pip' or its parent directory is not owned or is not writable by the current user. The cache has been disabled. Check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Requirement already satisfied: pip in /__t/Python/3.8.7/x64/lib/python3.8/site-packages (21.0.1)
/__w/_temp/95e6ebc6-5365-42a8-8197-9f5d14c042d3.sh: 2: /__w/_temp/95e6ebc6-5365-42a8-8197-9f5d14c042d3.sh: pip: not found
Here is my yaml file:
name: Dart
on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
jobs:
  build:
    runs-on: ubuntu-latest
    # Note that this workflow uses the latest stable version of the Dart SDK.
    # Docker images for other release channels - like dev and beta - are also
    # available. See https://hub.docker.com/r/google/dart/ for the available
    # images.
    container:
      image: google/dart:latest
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python 3.8
        uses: actions/setup-python@v2
        with:
          python-version: 3.8
      - name: Print Dart SDK version
        run: dart --version
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install flake8 pytest
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
          cd integ_tests
          dart pub get
      # Run uvicorn
      - name: Run uvicorn
        run: |
          cd fastapi/
          uvicorn app.main:app --reload --port 8000
      # run my test
      - name: Run dart test
        run: |
          cd integ_tests
          dart lib/main.dart --dry true
Additionally, I'm concerned that running uvicorn inside the container will make the container hang (since it would never exit). If this is the case, how do I go about starting a local server with uvicorn without letting the container run forever?
EDIT: full log
If I run it with sudo I get an error saying
/__w/_temp/2ffb7222-f1dd-4273-870c-c85ac57b9da3.sh: 1: /__w/_temp/2ffb7222-f1dd-4273-870c-c85ac57b9da3.sh: sudo: not found
I suspect the problem here is still that you are attempting to run pip inside the container. Here's why: the dart --version step sits after setup-python but before the pip installation. I would change the order so that the dart --version step comes right before the "Run dart test" step; that way, all of the Python build and configure happens immediately after setup-python.
I looked at a Python build of mine and on the step of upgrading pip, I get this:
> Run python -m pip install --upgrade pip
Requirement already satisfied: pip in /opt/hostedtoolcache/Python/3.7.9/x64/lib/python3.7/site-packages (21.0.1)
Collecting pytest
I believe this will have uvicorn running outside the container (i.e., in the runner VM).
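As for the worry about the uvicorn step never exiting, here is a small sketch of one way to keep the job from blocking, assuming the app module path from the question (fastapi/app/main.py); running the server in the background is my assumption, not something from the original workflow:

      - name: Run uvicorn
        run: |
          cd fastapi/
          # Start the server in the background so this step returns,
          # then give it a few seconds to boot before the tests hit it.
          nohup uvicorn app.main:app --port 8000 &
          sleep 5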
I have an action on the master branch which, on push/merge, builds a package, uploads it to PyPI, then checks out the develop branch, bumps the version in the develop branch, and pushes to the origin of the develop branch. The develop branch has an action that listens to push/merge and does a snapshot release.
When I push to develop, the develop action works perfectly and does a snapshot release, but when the master branch's action pushes, the push is successful yet the develop action does not get triggered. What am I missing?
Both actions are added below.
name: Build and Upload Package to PyPI | Master Branch
on:
  push:
    branches:
      - master
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Set up Python
        uses: actions/setup-python@v1
        with:
          python-version: '3.5'
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install setuptools wheel twine
          pip install GitPython
          pip install bumpversion
      - name: Strip 'snapshot' from version
        run: sed -i 's/-snapshot//g' setup.py
      - name: Build and publish
        env:
          TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
          TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
          TWINE_REPOSITORY_URL: https://pypi.domain.com
        run: |
          python setup.py sdist bdist_wheel
          twine upload dist/*
      - name: Bump Version and Push to develop
        run: |
          git stash
          git config --local user.email "name@email.com"
          git config --local user.name "username"
          git checkout develop
          python bump_version.py
          cat .bumpversion.cfg
          git remote set-url --push origin https://username:$GITHUB_TOKEN@github.com/repo/path
          git push origin develop
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
name: Build and Upload Package to PyPI | Develop Branch
on:
  push:
    branches:
      - develop
jobs:
  bumpTag_build_and_publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Set up Python
        uses: actions/setup-python@v1
        with:
          python-version: '3.5'
      - name: Install dependencies for setup
        run: |
          python -m pip install --upgrade pip
          pip install setuptools wheel twine
      - name: Build and publish
        env:
          TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
          TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
          TWINE_REPOSITORY_URL: https://pypi.domain.com
        run: |
          python setup.py sdist bdist_wheel
          twine upload dist/*
The provided secrets.GITHUB_TOKEN is intentionally not allowed to trigger workflows. As seen in the documentation:
(...) if an action pushes code using the repository's GITHUB_TOKEN, a new workflow will not run even when the repository contains a workflow configured to run when push events occur.
If you need your automagic push to be "visible" to workflows, you need to create a Personal Access Token, add it to the repo secrets, and use that instead of GITHUB_TOKEN.
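For example, a minimal sketch of the push step from the master workflow using such a token; the secret name PAT_TOKEN is a placeholder you would create yourself:

      - name: Bump Version and Push to develop
        run: |
          git checkout develop
          python bump_version.py
          # Push with a Personal Access Token so the develop workflow gets triggered.
          git remote set-url --push origin https://username:$PAT_TOKEN@github.com/repo/path
          git push origin develop
        env:
          PAT_TOKEN: ${{ secrets.PAT_TOKEN }}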
Note that GitHub assumes you know what you're doing if you use a non-stock token, which means preventing a possible infinite loop is on you. While that's not an issue in your scenario for now (the develop branch does not push anything), it's worth remembering in case one of the workflows changes some day.