How to copy a generated file from a GitHub Actions workflow to a Cloud Function?

I'm using a GitHub Actions workflow to deploy to a Google Cloud Function. The steps in my workflow include:
steps:
  - name: "Checkout repository"
    uses: actions/checkout@v3
  # Setup Python so we can install Pipenv and generate requirements.txt.
  - name: "Setup Python"
    uses: actions/setup-python@v4
    with:
      python-version: '3.10'
  - name: "Install Pipenv"
    run: pip install pipenv
  - name: "Generate requirements.txt"
    run: |
      pipenv requirements > requirements.txt
      ls -la
      cat requirements.txt
  - id: "auth"
    name: "Authenticate to Google Cloud"
    uses: "google-github-actions/auth@v0"
    with:
      workload_identity_provider: "..."
      service_account: "..."
  - id: "deploy"
    uses: "google-github-actions/deploy-cloud-functions@v0"
    with:
      name: "my-function"
      runtime: "python310"
Once I've generated the requirements.txt file, I want it to be deployed along with my application code (checked out in the step above). The requirements.txt file gets generated during the build but it never gets deployed. (Confirmed by looking at the source in Cloud Functions.)
How can I ensure this file is deployed along with my application code?
Update 1:
Here is the output of listing the directory contents after generating requirements.txt:
total 56
drwxr-xr-x 6 runner docker 4096 Sep 6 20:38 .
drwxr-xr-x 3 runner docker 4096 Sep 6 20:38 ..
-rw-r--r-- 1 runner docker 977 Sep 6 20:38 .env.example
-rw-r--r-- 1 runner docker 749 Sep 6 20:38 .gcloudignore
drwxr-xr-x 8 runner docker 4096 Sep 6 20:38 .git
drwxr-xr-x 3 runner docker 4096 Sep 6 20:38 .github
-rw-r--r-- 1 runner docker 120 Sep 6 20:38 .gitignore
-rw-r--r-- 1 runner docker 139 Sep 6 20:38 Pipfile
-rw-r--r-- 1 runner docker 454 Sep 6 20:38 Pipfile.lock
-rw-r--r-- 1 runner docker 1276 Sep 6 20:38 README.md
drwxr-xr-x 5 runner docker 4096 Sep 6 20:38 app
drwxr-xr-x 2 runner docker 4096 Sep 6 20:38 data
-rw-r--r-- 1 runner docker 2169 Sep 6 20:38 main.py
-rw-r--r-- 1 runner docker 27 Sep 6 20:38 requirements.txt
Update 2:
Showing the contents of requirements.txt reveals that it contains only:
-i https://pypi.org/simple
No dependencies are included. This could well be the problem but I'm not yet sure why.
Update 3:
The error shown in the deploy stage of the workflow is:
ModuleNotFoundError: No module named 'aiohttp'
This is because there is no requirements.txt file to install prior to running the function. aiohttp just happens to be the first dependency listed in my source code.

As explained by @ianyoung, the problem was with the Pipfile. The generated requirements.txt was effectively empty because the Pipfile declared no packages. A requirements file is a list of all of a project's dependencies, including the dependencies needed by those dependencies, with the specific version of each dependency pinned using a double equals sign (==).
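For illustration, here is a minimal sketch of a Pipfile whose [packages] section actually declares a dependency, so that pipenv requirements emits pinned entries rather than just the index line. aiohttp is used as the example package only because it appears in the question's error; the exact contents are an assumption, not the asker's real file:

[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
# Declare top-level dependencies here; pipenv pins exact versions in Pipfile.lock.
aiohttp = "*"

[requires]
python_version = "3.10"

With this in place, running pipenv lock followed by pipenv requirements should produce a requirements.txt containing a pinned line like aiohttp==<version> for aiohttp and each of its transitive dependencies.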

Related

airflow command not found when installing in Ubuntu via WSL - how to add it to path?

I have Ubuntu 20.04 and python 3.10.6 on WSL.
I have been trying to install airflow, and am getting 'airflow: command not found' when I'm trying to do 'airflow initdb' or 'airflow info'.
I have done
export AIRFLOW_HOME=~/airflow
and when I run
myname@LAPTOP-28BMMQV7:/root$ ls -l ~/.local/bin
I can see airflow in the list of files.
drwxrwxr-x 2 myname myname 4096 Nov 20 14:17 __pycache__
-rwxrwxr-x 1 myname myname 3472 Nov 20 14:17 activate-global-python-argcomplete
-rwxrwxr-x 1 myname myname 215 Nov 20 14:17 airflow
-rwxrwxr-x 1 myname myname 213 Nov 20 14:17 alembic
When I run this command to see where my Python is, I see this:
myname@LAPTOP-28BMMQV7:/root$ ls -l /usr/bin/python*
lrwxrwxrwx 1 root root 10 Aug 18 11:39 /usr/bin/python3 -> python3.10
lrwxrwxrwx 1 root root 17 Aug 18 11:39 /usr/bin/python3-config -> python3.10-config
-rwxr-xr-x 1 root root 5912936 Nov 2 18:53 /usr/bin/python3.10
I also get warnings similar to this:
WARNING: The script pygmentize is installed in '/home/myname/.local/bin' which is not on PATH.
So I need to find a way to add this directory to PATH.
I have found the following advice in the Airflow documentation:
If the airflow command is not getting recognized (can happen on Windows when using WSL), then ensure that ~/.local/bin is in your PATH environment variable, and add it in if necessary:
PATH=$PATH:~/.local/bin
I am not quite sure how to do this.
I also have MySQL Workbench/Server 8.0.31 installed and want to connect it to Airflow instead of SQLite. Can anybody refer me to a good guide on how to set it up correctly?
I have run pip install 'apache-airflow[mysql]'.
You were so close! I think your local Python (and your terminal, whenever you tried airflow db init) was not able to see the airflow you installed because it was not on your PATH.
There is a video series I go to whenever I need to install Airflow for a fellow coworker. The first video shows how to install Airflow locally, the second shows how to write a DAG, and, more importantly, the third shows how to connect to a different database, just like you wanted.
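As a concrete sketch of the PATH fix quoted from the Airflow docs above (assuming a default bash setup on WSL; the exact rc file is an assumption):

# Add ~/.local/bin to PATH for the current session
export PATH="$PATH:$HOME/.local/bin"

# Persist it for future shells (assumes bash; use ~/.zshrc for zsh)
echo 'export PATH="$PATH:$HOME/.local/bin"' >> ~/.bashrc
source ~/.bashrc

# Verify the airflow executable is now found
which airflow && airflow version

For the MySQL side, the usual approach is to point Airflow's metadata database at MySQL via the connection string in airflow.cfg; the values below are placeholders, not tested settings:

# In airflow.cfg (the key lives under [core] in older Airflow, [database] in newer)
sql_alchemy_conn = mysql+mysqldb://airflow_user:airflow_pass@localhost:3306/airflow_db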

Docker and git error while deploying to server

I was using a CI/CD pipeline to deploy my project to the server.
However, it suddenly stopped working and I got two errors: the first is related to git, and the second is a Docker error.
Can somebody help me figure out what the problem could be?
32 out: Total reclaimed space: 0B
33 err: error: cannot pull with rebase: You have unstaged changes.
34 err: error: please commit or stash them.
35 out: docker build -f Dockerfile . -t tourmix-next
36 err: time="20***-10-08T11:06:33Z" level=error msg="Can't add file /mnt/tourmix-main/database/mysql.sock to tar: archive/tar: sockets not supported"
37 out: Sending build context to Docker daemon 255MB
38 out: Step 1/21 : FROM node:lts as dependencies
39 out: lts: Pulling from library/node
40 out: Digest: sha256:b35e76ba744a975b9a5428b6c3cde1a1cf0be53b246e1e9a4874f87034***b5a
41 out: Status: Downloaded newer image for node:lts
42 out: ---> 946ee375d0e0
43 out: Step 2/21 : WORKDIR /tourmix
44 out: ---> Using cache
45 out: ---> 05e933ce4fa7
This is my Dockerfile:
FROM node:lts as dependencies
WORKDIR /tourmix
COPY package*.json ./
RUN npm install --force

FROM node:lts as builder
WORKDIR /tourmix
COPY . .
COPY --from=dependencies /tourmix/node_modules ./node_modules
RUN npx prisma generate
RUN npm run build

FROM node:lts as runner
WORKDIR /tourmix
ENV NODE_ENV production
# If you are using a custom next.config.js file, uncomment this line.
COPY --from=builder /tourmix/next.config.js ./
COPY --from=builder /tourmix/public ./public
COPY --from=builder /tourmix/.next ./.next
COPY --from=builder /tourmix/node_modules ./node_modules
COPY --from=builder /tourmix/package.json ./package.json
COPY --from=builder /tourmix/.env ./.env
# copy the prisma folder
EXPOSE 3000
CMD ["yarn", "start"]
This is my GitHub workflow file:
# This is a basic workflow that is manually triggered
name: Deploy application
# Controls when the action will run. Workflow runs when manually triggered using the UI
# or API.
on:
push:
branches: [master]
# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
# This workflow contains a single job called "greet"
deploy:
# The type of runner that the job will run on
runs-on: ubuntu-latest
# Steps represent a sequence of tasks that will be executed as part of the job
steps:
- name: multiple command
uses: appleboy/ssh-action#master
with:
host: ${{secrets.SSH_HOST}}
username: ${{ secrets. SSH_USER }}
key: ${{ secrets.SSH_PRIVATE_KEY }}
port: ${{ secrets.SSH_PORT}} passphrase: ${{ secrets.SSH_PASSPHRASE}}
script:|
docker system prune -a -f
cd /mnt/tourmix-main
git pull origin master --rebase
make release
docker system prune -a -f
- uses: actions/checkout#v3
with:
clean: 'true'
Start with the first error:
Add a git clean pre-step in your pipeline to clean any private files from your workspace.
If you are using GitLab as a CI/CD platform, use Git clean flags (GitLab Runner 11.10+, Q2 2019).
For a GitHub Action, if the error is on the git pull command, add a git clean -ffdx just before the git pull.
script: |
  docker system prune -a -f
  cd /mnt/tourmix-main
  git clean -ffdx    # <====
  git stash          # <====
  git pull origin master --rebase
  make release
  docker system prune -a -f

Compiling MySQL C API client does not link libmysqlclient.so.20

I'm writing some loadable modules for Zabbix and, as such, compiling shared objects. I've written one which uses the MySQL C API to read some data from tables; it's fairly standard, and includes:
#include <my_global.h>
#include <mysql.h>
My gcc command looks like so (expanded mysql_config for clarity):
gcc -fPIC -shared -o zbx_mysql.so zbx_mysql.c -I/usr/lib64/mysql `mysql_config --cflags` -I/opt/zabbix/3.2/include -L/usr/lib64/mysql -lmysqlclient -lpthread -lm -lrt -ldl
Contents of /usr/lib64/mysql:
-rw-r--r-- 1 root root 21358968 Sep 13 17:15 libmysqlclient.a
lrwxrwxrwx 1 root root 20 Nov 19 23:19 libmysqlclient_r.so.18 -> libmysqlclient.so.18
lrwxrwxrwx 1 root root 24 Nov 19 23:19 libmysqlclient_r.so.18.1.0 -> libmysqlclient.so.18.1.0
lrwxrwxrwx 1 root root 20 Nov 19 23:19 libmysqlclient.so -> libmysqlclient.so.20
lrwxrwxrwx 1 root root 24 Nov 19 23:19 libmysqlclient.so.18 -> libmysqlclient.so.18.1.0
-rwxr-xr-x 1 root root 9580608 Sep 13 17:07 libmysqlclient.so.18.1.0
lrwxrwxrwx 1 root root 24 Nov 19 23:18 libmysqlclient.so.20 -> libmysqlclient.so.20.3.7
-rwxr-xr-x 1 root root 9884704 Sep 13 17:15 libmysqlclient.so.20.3.7
-rw-r--r-- 1 root root 44102 Sep 13 17:13 libmysqlservices.a
drwxr-xr-x 4 root root 28 Nov 19 23:18 mecab
drwxr-xr-x. 3 root root 4096 Nov 19 23:19 plugin
The .so compiles and runs fine on the dev box, but copying it to a box without mysql-devel installed yields the following error:
cannot load module "zbx_mysql.so": libmysqlclient.so.20: cannot open shared object file: No such file or directory
I can only assume this means that libmysqlclient.so.20 isn't being bundled into my .so. I'm pretty much a novice here, so if anyone can advise it'd be greatly appreciated.
Shared libraries aren't "bundled"; that's why they're shared. The machine you're trying to run on is obviously missing the library. The runtime libraries typically aren't in the "-dev" or "-devel" packages.
On your typical *nix system, you can have multiple versions of the same shared library installed, but normally only one development package. If you have the dev package for mysql-client 20 installed, the compiled code will link against that version. If you want your compiled code to link against mysql-client 18, install the older version of the development package.
If you need to be independent of the libraries installed on your target system, one possibility would be to link a static library instead.
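As a hedged sketch of that static-linking option (assuming the libmysqlclient.a shown in the directory listing above was built so it can be linked into a shared object), the link line could be adjusted roughly like this:

# Link libmysqlclient statically, everything else dynamically.
# -Wl,-Bstatic / -Wl,-Bdynamic toggle the linker's library mode mid-command.
gcc -fPIC -shared -o zbx_mysql.so zbx_mysql.c \
    `mysql_config --cflags` -I/opt/zabbix/3.2/include \
    -L/usr/lib64/mysql \
    -Wl,-Bstatic -lmysqlclient -Wl,-Bdynamic \
    -lpthread -lm -lrt -ldl -lstdc++

# Verify no runtime dependency on libmysqlclient remains:
ldd zbx_mysql.so | grep mysql || echo "no dynamic mysql dependency"

Note the trailing -lstdc++: recent libmysqlclient versions are implemented in C++ internally, so the C++ runtime may be needed when linking against the static archive.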

Neo4j - Couldn't load the external resource at: file:

I am using Ubuntu 14.04 and trying to import a csv file, but I'm getting the following error: Couldn't load the external resource at: file:/usr/share/neo4j/import/orders.csv
My query is -
USING PERIODIC COMMIT
LOAD CSV WITH HEADERS FROM "file:///orders.csv" AS row
MATCH (order:Order {orderId: row.SalesOrderID})
MATCH (product:Product {productId: row.ProductID})
MERGE (order)-[pu:PRODUCT]->(product)
ON CREATE SET pu.unitPrice = toFloat(row.UnitPrice), pu.quantity = toFloat(row.OrderQty);
I have placed the csv files at /var/lib/neo4j/import and also changed the permissions with sudo chmod 777 -R /var/lib/neo4j/import, but it is still not working.
The file permissions are as follows:
sachin#sachin:/var/lib/neo4j$ ls -la
total 28
drwxr-xr-x 7 neo4j adm 4096 Aug 31 10:10 .
drwxr-xr-x 76 root root 4096 Aug 30 19:33 ..
drwxr-xr-x 2 neo4j nogroup 4096 Aug 31 10:10 certificates
drwxr-xr-x 4 neo4j adm 4096 Aug 31 10:10 data
drwxrwxrwx 2 neo4j adm 4096 Aug 31 11:16 import
drwxr-xr-x 2 neo4j nogroup 4096 Aug 31 10:10 .oracle_jre_usage
drwxr-xr-x 2 neo4j adm 4096 Jul 28 09:19 plugins
Please help!!! Thanks.
Okay, I've resolved it by creating a new import directory under /usr/share/neo4j, placing the csv files in that directory, and setting its permissions to 777.
Explanation: my error was Couldn't load the external resource at: file:/usr/share/neo4j/import/orders.csv, but I had placed my csv files at /var/lib/neo4j/import. Hope it helps others. Thanks.
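A minimal sketch of the fix described above (assuming the csv files currently live in /var/lib/neo4j/import, as in the question):

# Create the import directory that Neo4j is actually resolving file:/// URLs against
sudo mkdir -p /usr/share/neo4j/import

# Copy the csv files to where Neo4j expects them
sudo cp /var/lib/neo4j/import/*.csv /usr/share/neo4j/import/

# Open up permissions, matching the 777 used in the answer above
sudo chmod -R 777 /usr/share/neo4j/import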

Working with memcache and mysql using memcache_functions_mysql UDFs

I have installed the following packages:
libevent-2.0.21
memcached-1.4.17
libmemcached-0.34
memcached_functions_mysql_1.1
All of the above have been installed successfully.
The output of the ldconfig -v command (the part where it shows that libmemcached libraries have been included) is as follows:
$ldconfig -v
/usr/local/libmemcached/lib:
libmemcached.so.3 -> libmemcached.so.3.0.0
libmemcachedprotocol.so.0 -> libmemcachedprotocol.so.0.0.0
libmemcachedutil.so.0 -> libmemcachedutil.so.0.0.0
But when I try to load the UDFs into MySQL using the install_functions.sql that is shipped with memcached_functions_mysql, it throws the following error:
ERROR 1126 (HY000) at line 38: Can't open shared library 'libmemcached_functions_mysql.so' (errno: 0 libmemcached.so.3: cannot open shared object file: No such file or directory)
And the contents of the plugin directory are:
-rw-r--r-- 1 root root 6.1K Jan 21 13:49 adt_null.so
-rw-r--r-- 1 root root 11K Jan 21 13:49 auth.so
-rw-r--r-- 1 root root 6.0K Jan 21 13:49 auth_socket.so
-rw-r--r-- 1 root root 6.2K Jan 21 13:49 auth_test_plugin.so
-rw-r--r-- 1 root root 35K Jan 21 13:49 ha_example.so
-rw-r--r-- 1 root root 10K Jan 21 13:49 libdaemon_example.so
-rw-r--r-- 1 root root 361K Feb 13 02:47 libmemcached_functions_mysql.a
-rwxr-xr-x 1 root root 1.1K Feb 13 02:47 libmemcached_functions_mysql.la
-rwxr-xr-x 1 root root 167K Feb 13 02:47 libmemcached_functions_mysql.so
-rwxr-xr-x 1 root root 167K Feb 13 02:47 libmemcached_functions_mysql.so.0
-rwxr-xr-x 1 root root 167K Feb 13 02:47 libmemcached_functions_mysql.so.0.0.0
-rw-r--r-- 1 root root 11K Jan 21 13:49 mypluglib.so
-rw-r--r-- 1 root root 5.9K Jan 21 13:49 qa_auth_client.so
-rw-r--r-- 1 root root 11K Jan 21 13:49 qa_auth_interface.so
-rw-r--r-- 1 root root 6.0K Jan 21 13:49 qa_auth_server.so
-rw-r--r-- 1 root root 39K Jan 21 13:49 semisync_master.so
-rw-r--r-- 1 root root 15K Jan 21 13:49 semisync_slave.so
Installing memcached on Ubuntu 12.04.1 LTS (64 bit)
Python - 2.6.8
MySQL – 5.5.28
Zope – 2.12.19
1. Install memcached
apt-get install memcached
2. Install libmemcached-0.34 (the version is very important; it may or may not work with other versions).
Download it from https://launchpad.net/libmemcached/
tar xvf libmemcached-0.34.tar.gz
cd libmemcached-0.34
sudo ./configure --prefix=/usr/lib/libmemcached --with-memcached=/usr/bin/memcached
sudo make
sudo make install
3. Install memcached_functions_mysql (version 1.1 was used at the time of this installation). This provides the UDFs that are invoked by triggers to manipulate the cache.
Download it from https://launchpad.net/memcached-udfs
sudo ./configure --prefix=/usr/local/memcached_mysql --libdir=/usr/lib/mysql/plugin --with-mysql=/usr/bin/mysql_config --with-libmemcached=/usr/lib/libmemcached
sudo make
sudo make install
Navigate to the "sql" folder inside the memcached_functions_mysql directory.
mysql -u <username> -p < install_functions.sql
4. For Zope users
Install python-memcached-1.53
Download it from https://pypi.python.org/pypi/python-memcached/
Navigate to the extracted python-memcached directory.
/home/zope/zope/bin/python setup.py install
5. Edit the script which imports the modules.
allow_module('memcache')
from memcache import Client
allow_class(Client)
allow_module('memcache.Client.get')
allow_module('memcache.Client.set')
This is done so that memcache can be imported and used in your Restricted Python scripts.
If an External Method is used to handle the above case instead, the file does not need to be updated with the above content.
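The original ERROR 1126 (libmemcached.so.3: cannot open shared object file) usually means the MySQL server process cannot resolve the directory libmemcached was installed into at runtime. As a hedged sketch (assuming the /usr/local/libmemcached prefix from the question), registering that path with the dynamic linker looks like this:

# Tell the dynamic linker about the non-standard libmemcached location
echo "/usr/local/libmemcached/lib" | sudo tee /etc/ld.so.conf.d/libmemcached.conf
sudo ldconfig

# Confirm the library is now resolvable, then restart MySQL so mysqld picks it up
ldconfig -p | grep libmemcached
sudo service mysql restart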