Google Cloud Functions: How do you share source code? - google-cloud-functions

I have a Node server and multiple controllers that perform DB operations, as well as helpers (for e-mail, for example), within that directory.
I'd like to use source from that directory within my functions. Assuming the following directory structure:
src/
  server/
    app/controllers/email_helper.js
  fns/
    send-confirm/
What's the best way to use email_helper within the send-confirm function?
I've tried:
Symbolically linking the 'server' directory
Adding a local repo to send-confirm/package.json
Neither of the above works.

In principle, your Cloud Functions can use any other Node.js module, the same way any standard Node.js server would. However, since Cloud Functions needs to build your module in the cloud, it needs to be able to locate those dependency modules from the cloud. This is where the issue lies.
Cloud Functions can load modules from any one of these places:
Any public npm repository.
Any web-visible URL.
Anywhere in the functions/ directory that firebase init generates for you, and which gets uploaded on firebase deploy.
In your case, from the perspective of functions/package.json, the ../server/ directory doesn't fall under any of those categories, and so Cloud Functions can't use your module. Unfortunately, firebase deploy doesn't follow symlinks, which is why that solution doesn't work.
I see two possible immediate fixes:
Move your server/ directory to be under functions/. I realize this isn't the prettiest directory layout, but it's the easiest fix while hacking. In functions/package.json you can then have a local dependency on ./server (see the sketch after this list).
Expose your code behind a URL somewhere. For example, you could package up a .tar and put that on Google Drive, or on Firebase Cloud Storage. Alternatively, you can use a public git repository.
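For fix 1, a minimal sketch of what that could look like (paths and the helper name are taken from the question; exact npm/Firebase CLI behaviour may vary between versions):

# Move the shared code under functions/ (following the question's layout)
$ mv server functions/server

# From inside functions/, add a local dependency; npm records it in
# functions/package.json as  "server": "file:server"
$ cd functions
$ npm install ./server

# Your function source can then require the helper, e.g.
#   const emailHelper = require('server/app/controllers/email_helper');

# Deploy as usual; server/ now lives inside functions/ and gets uploaded
$ firebase deploy --only functions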
In the future, I'd love it if firebase deploy followed symlinks. I've filed a feature request for that in Firebase's internal bug tracker.

Related

google cloud functions: downloading a file to the root directory python 3.9

I have a file I need to get into the Google Cloud Function's directory for a multi-step problem (Matplotlib: custom fonts in Cloud Functions using Python 3.9).
I'm not sure how to do it. Do I do it as a function in Cloud Functions? Or use the console terminal for the project? I tried that and looked in the root directory and there was nothing there. I can only change projects and not change to a specific function directory.
Can someone please show me how to put this file https://www.1001freefonts.com/balthazar.font into the function's file system so it can be called during execution?
When you deploy a Cloud Function to GCP, you can supply a ZIP file or a directory that contains your source code and additional artifacts/files that you may need.
To perform the deployment of the ZIP or directory, you will want to use the gcloud command. A good article on this is Deploying from Your Local Machine.
The detailed documentation on the CLI can be found at gcloud functions deploy.
In your example, you could create a directory that contains your source and your font file and both will be present in the context of the Cloud Function. I believe that if you want to reference the files, you will want to use the local current directory in your code. For example, instead of coding /myfontfile.font you might code ./myfontfile.font.
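As a rough sketch (the function name, entry point, and directory layout below are made up for illustration, not taken from the question), the deployment might look something like this:

# Directory deployed to Cloud Functions -- the font sits next to the code:
#   my-function/
#     main.py
#     requirements.txt
#     balthazar.font
$ gcloud functions deploy my_function \
    --runtime python39 \
    --trigger-http \
    --source ./my-function \
    --entry-point my_function

# Inside main.py, reference the file relative to the source directory, e.g.
#   font_path = os.path.join(os.path.dirname(__file__), "balthazar.font")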
Here are some references to this technique:
Cloud Functions: how to upload additional file for use in code?

Deploying multiple Google Cloud Functions from same repo

The documentation for Google Cloud Functions is a little vague - I understand how to deploy a single function that is contained within index.js - even in a specific directory, but how does one deploy multiple cloud functions which are located within the same repository?
AWS Lambda allows you to specify a specific file and function name:
/my/path/my-file.myHandler
Lambda also allows you to deploy a zip file containing only the files required to run, omitting all of the optional transitive npm dependencies and their resources. For some libraries (e.g. Oracle DB), including node_modules/** would significantly increase the deployment time, and possibly exceed storage limits (it does on AWS Lambda).
The best that I can manage with Google Cloud Function deployment is:
$ gcloud alpha functions deploy my-function \
    --trigger-http \
    --source-url https://github.com/user-name/my-repo.git \
    --source-branch master \
    --source-path lib/foo/bar \
    --entry-point myHandler
...but my understanding is that it deploys lib/foo/bar/index.js which contains function myHandler(req, res) {} ...and all dependencies concatenated in the same file? That doesn't make sense at all - like I said, the documentation is a little vague.
The current deployment tool takes a simple approach. It zips the directory and uploads it. This means you (currently) should move or delete node_modules before executing the command if you don't wish for them to be included in the deployment package. Note that, like Lambda, GCF will resolve dependencies automatically.
As to deployment, please see: gcloud alpha functions deploy --help
Specifically:
--entry-point=ENTRY_POINT
The name of the function (as defined in source code) that will be
executed.
You might opt to use the --source flags to upload the source once, then deploy the functions without uploading again. You can also instruct Google to pull functions from a repo in the same manner. I suggest you write a quick deployment script to help you deploy a list of functions in a single command.
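For example, a bare-bones script along these lines (the function names are illustrative, not from the question; adjust the flags to your gcloud version and project):

#!/usr/bin/env bash
# Deploy several functions from the same source directory in one go.
FUNCTIONS="createUser deleteUser sendEmail"   # example names only

for fn in $FUNCTIONS; do
  gcloud functions deploy "$fn" \
    --trigger-http \
    --source . \
    --entry-point "$fn"
done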

Google cloud - Stackdriver debug reports "File was not found in the executable" for GCE Jetty war

I've been trying to follow the Setting Up Stackdriver Debugger for Java applications on Google Compute Engine guide, but am running into issues with Stackdriver Debug.
I'm building my .war file from a separate build server, then deploying it to my GCE server. I added the agent to the start command via /etc/defaults, and my app appears in the https://console.cloud.google.com/debug control panel. The version I set in the run command matches the revision that shows up in the source-context(s).json files.
However, when I open the app, I see the message:
No source version information was provided by the deployed application
I connected the app's git repo as a mirrored cloud repository, and can browse the source files in the sidebar of the Stackdriver Debug page. But if I browse to a file and add a breakpoint, I get the error "File was not found in the executable."
I have run the gcloud preview app gen-repo-info-file command, which created two basic json files storing my git repo and revision. Is it supposed to do anything else?
I have tried running Jetty using both normal and extracted modes. If I have Jetty first extract the war file, I can see the source-context.json files in the WEB-INF/classes directory.
What am I missing?
https://github.com/GoogleCloudPlatform/cloud-debug-java#extra-classpath mentions that you can update the agentpath flag to point at your WEB-INF/classes directory:
-agentpath:/opt/cdbg/cdbg_java_agent.so=--cdbg_extra_class_path=/opt/tomcat/webapps/myapp/WEB-INF/classes
For multiple class paths:
-agentpath:/opt/cdbg/cdbg_java_agent.so=--cdbg_extra_class_path=/opt/tomcat/webapps/myapp/WEB-INF/classes:/another/path/with/classes
There are a couple of things going on here.
First, it sounds like you are doing the correct thing with gen-repo-info-file. The debugger agent should pick up the json files from the WEB-INF/classes directory.
The debugger uses fuzzy matching to find source files, so as long as the name of the .java file matches a file in your executable, you should not get that error.
The most likely scenario given the information in your question is that you are attaching the debugger to a launcher process, rather than your actual application. Without further details, I can't absolutely confirm that, though.
If you send us more details at cdbg-feedback@google.com, we can look more closely at your case to see if we can understand exactly what's happening, and potentially improve our documentation, since it sounds like you followed the docs pretty closely.

Multiple alias in one account

I've never used fortrabbit before and I have a question about it.
I know I can create apps and define the document root, but let's imagine the following:
I want to go with the Yii2 Framework (Advanced template).
The Advanced template has "two apps" in it (2 folders): the backend and the frontend.
On a real server we would have to create two aliases, e.g.:
admin.myapp.com -> root/backend/www
www.myapp.com -> root/frontend/www
Is it possible to configure fortrabbit to work with this within the same application and share the same resources (MySQL, cache, etc.)?
Your setup is possible at fortrabbit.
Just put both folders in your Git repo and push to fortrabbit. After that you can route the subdomains (www., admin.) to the subfolders (frontend/www, backend/www).
If your project requires a composer install during the deploy process, it will not work out of the box, since we check only for the composer.json/lock in the root of your project.
However, you can define your own custom post-deploy scripts. In these scripts you could call a composer install in the subfolders.
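A rough sketch of such a post-deploy script (how the hook is registered depends on your fortrabbit app settings, so treat this as illustrative):

#!/bin/sh
# Run composer install in each sub-app after the code has been deployed.
(cd backend && composer install --no-dev --prefer-dist)
(cd frontend && composer install --no-dev --prefer-dist)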
Cheers
Oliver (fortrabbit staff)

Openshift: where to put resource files that I want outside of the deployment folder

I'm starting a new web app with OpenShift (JBoss, MySQL). It's the first time I've used OpenShift, and after reading through some docs and experimenting a bit with it, I have one question regarding best practices for the architecture of my app.
There will be some files generated by- or uploaded to the application (resources). I'd like those files to be outside the deployment folder so they are not erased/overwritten when the app deploys again. I have browsed through the directories and I was wondering:
Is it OK to use the /var/lib/openshift/[openshift-id]/app-root/data folder for these files?
Yes, you should use your ~/app-root/data folder for any files that you don't want to be erased when you do a git push. There is also an environment variable, OPENSHIFT_DATA_DIR, that points to that folder. Please note that if you are using a scaled application, that folder is not shared among your gears.
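For example, in a deploy or cron script you might write files there via the environment variable (the file and directory names below are made up for illustration):

# Keep generated/uploaded files in the persistent data directory
UPLOAD_DIR="${OPENSHIFT_DATA_DIR}/uploads"
mkdir -p "$UPLOAD_DIR"
cp /tmp/generated-report.pdf "$UPLOAD_DIR/"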