I have a directory containing JSON files, and I want to use GitHub Actions to create a new file in the repository that contains an array of all those JSON files.
For example, the directory <my-repo>/configurations contains the files a.json and b.json; I want to create a new file called configs.json containing [<a.json content>,<b.json content>].
The creation must be done dynamically.
Any suggestions?
Solution for config files sitting under the /configs directory:
name: build unified config file
on: [push]
jobs:
  build_file:
    name: build unified config file
    runs-on: ubuntu-latest
    steps:
      - name: Check out the repo
        uses: actions/checkout@v2
      - name: setup python
        uses: actions/setup-python@v2
        with:
          python-version: 3.8
      - name: Run script
        uses: jannekem/run-python-script-action@v1
        with:
          script: |
            import os
            # Write all configs into a single JSON array; open with "w" so the
            # file is created on the first run, and avoid a trailing comma.
            with open("unified.json", "w") as t:
                t.write('[')
                first = True
                for entry in os.scandir('configs'):
                    print(entry.name)
                    with open(entry.path, "r") as f:
                        t.write('' if first else ',')
                        t.write(f.read())
                    first = False
                t.write(']')
            with open("unified.json", "r") as t:
                print(t.read())
      - name: push file to main
        uses: EndBug/add-and-commit@v9
        with:
          add: 'unified.json'
          committer_name: Committer Name
          committer_email: mail@example.com
          default_author: github_actor
          message: 'Update unified config file'
          push: true
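If every file under configs/ is guaranteed to be valid JSON, the inline script can lean on the json module instead of string concatenation: it handles separators, validates each input file, and produces deterministic output. A sketch of that variant, with hypothetical sample files standing in for the repository's configs:

```python
import json
import os

# Hypothetical sample inputs, standing in for the repo's configs/ directory.
os.makedirs("configs", exist_ok=True)
with open("configs/a.json", "w") as f:
    f.write('{"name": "a"}')
with open("configs/b.json", "w") as f:
    f.write('{"name": "b"}')

# Collect every .json file and emit a single JSON array; sorting the
# directory entries makes the output deterministic across runs.
configs = []
for entry in sorted(os.scandir("configs"), key=lambda e: e.name):
    if entry.is_file() and entry.name.endswith(".json"):
        with open(entry.path) as f:
            configs.append(json.load(f))  # fails loudly on malformed input

with open("unified.json", "w") as out:
    json.dump(configs, out)
```

The json.load call also acts as a cheap validation gate: a malformed config breaks the workflow run instead of silently producing a broken unified.json.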
name: Git diff files changed
jobs:
  if-file-changed:
    name: Check if files changed
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v3
        with:
          fetch-depth: 0 # fetch all history of all branches
      - name: Get files changed when pr is created/updated
        # compare source branch with target branch in pull request
        if: github.event_name == 'pull_request'
        id: files_changed_on_pr
        run: |
          git symbolic-ref refs/remotes/origin/HEAD origin/${{ github.base_ref }}
          all_changed_files=$(git --no-pager diff --name-only origin/${{ github.head_ref }} origin/${{ github.base_ref }})
          echo "::set-output name=all_changed_files::$all_changed_files"
          echo all_changed_files = $all_changed_files
      - name: Get output
        id: get_output
        run: |
          all_changed_files=${{ steps.files_changed_on_pr.outputs.all_changed_files }}
          echo $all_changed_files
I see 4 files in the job with id files_changed_on_pr. The output looks like the following:
all_changed_files = .github/workflows/files_changed.yaml
.github/workflows/load_seed_data.yaml
.github/workflows/miscellaneous_test.yaml
.github/workflows/on_pr_dev.yaml
I only see 1 file in the job with id get_output:
.github/workflows/files_changed.yaml
Is there some kind of JSON conversion I need to do before sending those files as output?
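A likely cause, rather than a JSON requirement as such: the ::set-output workflow command only captures a single line, so a newline-separated file list is truncated to its first entry. JSON-encoding the list does help, because it collapses the value to one line that round-trips cleanly. A sketch of the idea in Python (the file list is taken from the output above):

```python
import json

# The list of changed files, as git diff --name-only would report them.
changed = [
    ".github/workflows/files_changed.yaml",
    ".github/workflows/load_seed_data.yaml",
    ".github/workflows/miscellaneous_test.yaml",
    ".github/workflows/on_pr_dev.yaml",
]

# A newline-separated value loses everything after the first line when
# passed through a workflow command; a JSON array is a single line.
encoded = json.dumps(changed)
assert "\n" not in encoded

# The consuming job can decode it back into a list.
decoded = json.loads(encoded)
assert decoded == changed
```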
Introduction
I'm currently working on a project that automatically containerizes a Java project with JIB.
GitHub project link.
Problem
The JIB library is implicitly used inside the YAML file, like this:
- name: Build JIB container and publish to GitHub Packages
  run: |
    if [ ! -z "${{ inputs.module }}" ]; then
      MULTI_MODULE_ARGS="-am -pl ${{ inputs.module }}"
    fi
    if [ ! -z "${{ inputs.main-class }}" ]; then
      MAIN_CLASS_ARGS="-Djib.container.mainClass=${{ inputs.main-class }}"
    fi
    mvn package com.google.cloud.tools:jib-maven-plugin:3.2.1:build \
      -Djib.to.image=${{ inputs.REGISTRY }}/${{ steps.downcase.outputs.lowercase }}:${{ inputs.tag-name }} \
      -Djib.to.auth.username=${{ inputs.USERNAME }} \
      -Djib.to.auth.password=${{ inputs.PASSWORD }} $MULTI_MODULE_ARGS $MAIN_CLASS_ARGS
  shell: bash
When a new version of JIB is released, my Dependabot configuration doesn't update the YAML file.
Configuration of the Dependabot:
version: 2
updates:
  - package-ecosystem: github-actions
    directory: '/'
    schedule:
      interval: weekly
Question
Does someone know how to configure dependabot.yml for an implicitly declared library?
Or how to configure dependabot.yml to automatically create an issue when a new JIB version is released?
You can do it with the hiden-dependency-updater action.
Example of GitHub Workflow you can use:
name: Update hidden dependencies
on:
  schedule:
    - cron: '0 0 * * *'
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: MathieuSoysal/hiden-dependency-updater@v1.1.1
        with:
          files: action.yml # List of files to update
          prefix: "com.google.cloud.tools:jib-maven-plugin:" # Prefix before the version, default is: ""
          suffix: ":build ."
          regex: "[0-9.]*"
          selector: "maven"
          github_repository: "GoogleContainerTools/jib"
      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v4
        with:
          token: ${{ secrets.GITHUB_TOKEN }} # You need to create your own token with pull request rights
          commit-message: update jib
          title: Update jib
          body: Update jib to reflect release changes
          branch: update-jib
          base: main
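Given the prefix/regex/suffix inputs above, the updater's approach is essentially a regex replacement of the pinned version string. A rough Python sketch of the idea (the workflow line and the "latest" version here are hypothetical):

```python
import re

# A hypothetical workflow line pinning jib to an explicit version.
line = "mvn package com.google.cloud.tools:jib-maven-plugin:3.2.1:build \\"

prefix = "com.google.cloud.tools:jib-maven-plugin:"
latest = "3.4.0"  # hypothetical latest release; the action looks this up on GitHub

# Replace whatever version digits follow the prefix with the latest one.
updated = re.sub(re.escape(prefix) + r"[0-9.]*", prefix + latest, line)
assert updated == "mvn package com.google.cloud.tools:jib-maven-plugin:3.4.0:build \\"
```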
From the doc:
The directory must be set to "/" to check for workflow files in
.github/workflows.
- package-ecosystem: "github-actions"
  # Workflow files stored in the
  # default location of `.github/workflows`
  directory: "/"
  schedule:
    interval: "daily"
So: try specifying a different directory, for example:
- package-ecosystem: "github-actions"
  # Workflow files stored in the
  directory: "."
  schedule:
    interval: "daily"
I am trying to add the GitHub run number to a file in the GitHub repository. The file looks like the following:
import json
from importlib import reload
import hashlib
from logging import raiseExceptions
import os
import importlib
qwe = importlib.import_module("asd-64")
The 64 signifies the GitHub run number. I have tried doing the following:
qwe = importlib.import_module("asd-${{ github.run_number }}")
This doesn't work and prints the literal string ${{ github.run_number }}. Is there a way to achieve this?
I've done it using a Python script and placeholders. (It might not be the best solution, but at least it works!)
The file to update would look like this:
test.py
import json
from importlib import reload
import hashlib
from logging import raiseExceptions
import os
import importlib
#placeholder1
The script to update it would look like this:
update_file.py
import os
import re

print("START")

WORKSPACE = os.getenv("WORKSPACE")
GITHUB_RUN_NUMBER = os.getenv("GITHUB_RUN_NUMBER")

FILE = f"{WORKSPACE}/test.py"

with open(FILE, "r") as file:
    content = file.read()

# Replace the placeholder with the import line carrying the run number.
replacement = f'qwe = importlib.import_module("asd-{GITHUB_RUN_NUMBER}")'
content = re.sub(r"#placeholder1", replacement, content)

with open(FILE, "w") as file:
    file.write(content)

print("END")
And the workflow file that would perform the operation would look like this:
workflow.yml
name: ...
on:
  workflow_dispatch:
jobs:
  job1:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2.3.4
      - uses: actions/setup-python@v4
        with:
          python-version: 3.8
      - name: Execute Python script to update test.py file
        run: python .github/scripts/update_file.py
        env:
          WORKSPACE: ${{ github.workspace }}
          GITHUB_RUN_NUMBER: ${{ github.run_number }}
      - run: cat test.py
Here is the related and successful workflow run.
(workflow file, update_file.py file, test.py file)
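The substitution step can also be sanity-checked locally before wiring it into the workflow; a minimal sketch reusing the same placeholder convention (the file content and run number are hypothetical stand-ins):

```python
import re

# Hypothetical stand-ins for the real file content and $GITHUB_RUN_NUMBER.
content = "import importlib\n#placeholder1\n"
run_number = "64"

# Same substitution as update_file.py performs in the workflow.
replacement = f'qwe = importlib.import_module("asd-{run_number}")'
updated = re.sub(r"#placeholder1", replacement, content)

assert 'qwe = importlib.import_module("asd-64")' in updated
```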
I'm running my workflows using GitHub Actions. When I create a pull_request that triggers my workflow, I get the error message at the bottom of my question. What I am trying to do is call my infrastructure/test/main.tf from my audit-account/prod-env directory. What do I need to change in the env section for the directory?
# deploy.yml
name: 'GitHub OIDC workflow'
on:
  pull_request:
    branches:
      - prod

env:
  tf_version: 'latest'
  tg_version: 'latest'
  tf_working_dir: './audit-account/prod-env'

permissions:
  id-token: write
  contents: read

jobs:
  deploy:
    name: 'Build and Deploy'
    runs-on: ubuntu-latest
    steps:
      - name: 'checkout'
        uses: actions/checkout@v2
      - name: configure AWS credentials
        uses: aws-actions/configure-aws-credentials@master
        with:
          aws-region: us-east-1
          role-to-assume: arn:aws:iam::123456789012:role/GitHubActions_Workflow_role
          role-duration-seconds: 3600
      - name: 'Terragrunt Init'
        uses: the-commons-project/terragrunt-github-actions@master
        with:
          tf_actions_version: ${{ env.tf_version }}
          tg_actions_version: ${{ env.tg_version }}
          tf_actions_subcommand: 'init'
          tf_actions_working_dir: ${{ env.tf_working_dir }}
          tf_actions_comment: true
        env:
          TF_INPUT: false
# audit-account/prod-env/terragrunt.hcl
terraform {
  source = "../../../../..//infrastructure/test"
}

include {
  path = find_in_parent_folders()
}
infrastructure/test/main.tf
resource "aws_vpc" "test-vpc" {
  cidr_block       = "10.0.0.0/16"
  instance_tenancy = "default"

  tags = {
    Name = "OIDC"
  }
}
error message:
init: info: initializing Terragrunt configuration in /audit-account/prod-env
init: error: failed to initialize Terragrunt configuration in /audit-account/prod-env
time=2021-11-17T23:55:54Z level=error msg=Working dir infrastructure/test from source file:///github/workspace/audit-account/prod-env does not exist
Your source path for the infrastructure module goes way too far up in the folder structure.
Assuming you have the infrastructure and audit-account directories at the root of the repository, your source would be ../../infrastructure/test. You have it looking 5 folders up from audit-account/prod-env, which puts you 3 folders above the workspace in a folder somewhere on the runner's filesystem.
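The path arithmetic can be checked offline; a quick sketch, assuming the standard /github/workspace checkout location shown in the error message:

```python
import posixpath

# The Terragrunt working directory inside the Actions workspace.
working_dir = "/github/workspace/audit-account/prod-env"

# The question's source: five levels up, which climbs out of the workspace.
bad = posixpath.normpath(
    posixpath.join(working_dir, "../../../../..//infrastructure/test"))

# Two levels up reaches the repository root, where infrastructure/ lives.
good = posixpath.normpath(
    posixpath.join(working_dir, "../../infrastructure/test"))

assert bad == "/infrastructure/test"  # outside /github/workspace entirely
assert good == "/github/workspace/infrastructure/test"
```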
I am trying to figure out how to reference a globally scoped environment variable as input to an action, like so:
name: validate
on: pull_request

env:
  CONFIG_PATH: configuration/conf.json

jobs:
  upload_config:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: create config
        shell: bash -l {0}
        run: |
          mkdir `dirname ${CONFIG_PATH}`
          echo "some config" > ${CONFIG_PATH}
      - name: upload config
        uses: actions/upload-artifact@v1
        with:
          name: config
          path: ${{ CONFIG_PATH }}
However, I am getting an invalid YAML error stating there is an "Unrecognized named-value: 'CONFIG_PATH'". If I try referencing the environment variable like so:
path: ${CONFIG_PATH}
I get a "Path does not exist ${CONFIG_PATH}" error.
Any ideas?
I couldn't find a clear example of it in the docs, but you need to use the env context for this, like so:
path: ${{ env.CONFIG_PATH }}