Is there any way to output logs using log4js on Cloud Functions? - google-cloud-functions

I'm using GCP Cloud Functions. What I want to achieve is to output logs with log4js.
I know that console.xxx() works well, and I have tried it.
Environment:
- Google Cloud Functions
- Functions-framework
- nodejs10 as Runtime
logger.js
const log4js = require('log4js');
const logger = exports = module.exports = {};

log4js.configure({
  appenders: {
    out: { type: 'console' },
    trail: {
      type: 'dateFile',
      filename: './logs/trail',
      pattern: '-yyyy-MMdd-hh.log',
      alwaysIncludePattern: true
    }
  },
  categories: {
    default: { appenders: ['out'], level: 'info' },
    trail: { appenders: ['trail'], level: 'debug' }
  }
});

logger.trail = log4js.getLogger('trail');
index.js
const logger = require('./logger');

exports.spTest = (pubSubEvent, context) => {
  console.log('console.log should appear'); // => properly logged
  logger.trail.error('logger should appear'); // => doesn't show up
};
Thanks in advance!

According to the official documentation link:
Cloud Logging is part of the Google Cloud's operations suite of
products in Google Cloud. It includes storage for logs, a user
interface called the Logs Viewer, and an API to manage logs
programmatically.
Also Custom StackDriver logs
Cloud Functions logs are backed by StackDriver Logging. You can use
the StackDriver Logging library for Node.js to log events with
structured data, enabling easier analysis and monitoring.
const { Logging } = require('@google-cloud/logging');

// ...

// Instantiate the StackDriver Logging SDK. The project ID will
// be automatically inferred from the Cloud Functions environment.
const logging = new Logging();
const log = logging.log('my-custom-log-name');

// This metadata is attached to each log entry. This specifies a fake
// Cloud Function called 'CustomMetrics' in order to make your custom
// log entries appear in the Cloud Functions logs viewer.
const METADATA = {
  resource: {
    type: 'cloud_function',
    labels: {
      function_name: 'CustomMetrics',
      region: 'us-central1'
    }
  }
};

// ...

// Data to write to the log. This can be a JSON object with any properties
// of the event you want to record.
const data = {
  event: 'my-event',
  value: 'foo-bar-baz',
  // Optional 'message' property will show up in the Firebase
  // console and other human-readable logging surfaces
  message: 'my-event: foo-bar-baz'
};

// Write to the log. The log.write() call returns a Promise if you want to
// make sure that the log was written successfully.
const entry = log.entry(METADATA, data);
log.write(entry);
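As a sketch of wiring this into the function from the question (the log name is arbitrary, spTest and the region are taken from the question, and this assumes the @google-cloud/logging package is installed):
const { Logging } = require('@google-cloud/logging');

const logging = new Logging();
const log = logging.log('sp-test-log'); // arbitrary log name

// Point the metadata at this function so entries appear in its logs viewer
const METADATA = {
  resource: {
    type: 'cloud_function',
    labels: {
      function_name: 'spTest',
      region: 'us-central1' // assumption: use your deployment region
    }
  }
};

exports.spTest = async (pubSubEvent, context) => {
  const entry = log.entry(METADATA, { message: 'logger should appear' });
  // Await the write so the entry is flushed before the function exits
  await log.write(entry);
};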
Therefore, I do not think you can use log4js on Cloud Functions.

Related

Google App Scripts User Cache Service: Accessing Cache Information in a Different Script but Same User

Looking into a way of sharing data via Google Apps Script's Cache Service from one web app to another.
Users load the first web page and fill out their information. Once submitted, a function runs on this data and stores it via the cache.
CacheService.getUserCache().put('FirstName','David')
CacheService.getUserCache().put('Surname','Armstrong')
The console log reports that these two elements have been saved to the cache.
However, in the second web app, when the cache is read, the console log returns null:
var cache = CacheService.getUserCache().get('Firstname');
var cache2 = CacheService.getUserCache().get('Surname');
console.log(cache)
console.log(cache2)
Any ideas?
A possible solution would be to implement a service to synchronize the cache between web apps.
This can be achieved by creating a web app that, via POST, lets the individual web apps store their UserCache in the ScriptCache of the "cache synchronizer".
The operation would be very simple:
From the web app that we want to synchronize, we check whether we have a cache for the user.
If it exists, we send it to the server so that it stores it.
If it does not exist, we check whether the server has stored the user's cache.
Here is a sketch of how it could work.
CacheSync.gs
const cacheService = CacheService.getScriptCache()

const CACHE_SAVED_RES = ContentService
  .createTextOutput(JSON.stringify({ "msg": "Cache saved" }))
  .setMimeType(ContentService.MimeType.JSON)

const doPost = (e) => {
  const { user, cache } = JSON.parse(e.postData.contents)
  const localCache = cacheService.get(user)
  if (!localCache) {
    /* If there is no local data, we save it */
    cacheService.put(user, JSON.stringify(cache))
    return CACHE_SAVED_RES
  } else {
    /* If we have data, we send it */
    return ContentService
      .createTextOutput(JSON.stringify(localCache))
      .setMimeType(ContentService.MimeType.JSON)
  }
}
ExampleWebApp.gs
const SYNC_SERVICE = "<SYNC_SERVICE_URL>"
const CACHE_TO_SYNC = ["firstName", "lastName"]
const cacheService = CacheService.getUserCache()

const syncCache = () => {
  const cache = cacheService.getAll(CACHE_TO_SYNC)
  const options = {
    method: "POST",
    payload: JSON.stringify({
      user: Session.getActiveUser().getEmail(),
      cache
    })
  }
  if (Object.keys(cache).length === 0) {
    /* If no cache, try to fetch it from the sync service */
    const res = UrlFetchApp.fetch(SYNC_SERVICE, options)
    const parsedResponse = JSON.parse(JSON.parse(res.toString()))
    Object.keys(parsedResponse).forEach((k) => {
      console.log(k, parsedResponse[k])
      cacheService.put(k, parsedResponse[k])
    })
  } else {
    /* If we have a cache, send it to the sync service */
    const res = UrlFetchApp.fetch(SYNC_SERVICE, options)
    console.log(res.toString())
  }
}

const createCache = () => {
  cacheService.put('firstName', "Super")
  cacheService.put('lastName', "Seagull")
}

const clearCache = () => {
  cacheService.removeAll(CACHE_TO_SYNC)
}
Additional information
The synchronization service must be deployed with access for ANYONE. You can control access via an API key.
This is just an example and is not fully functional; you should adapt it to your needs.
The syncCache function of the web app is reusable, and would be the function you use in all web apps.
There is a disadvantage when retrieving the cache, since you must provide the necessary keys, which forces you to write them manually (e.g. CACHE_TO_SYNC).
You could consider replacing ScriptCache with ScriptProperties.
Documentation
Cache
Properties
Session
The doc says:
Gets the cache instance scoped to the current user and script.
As it is scoped to the script, accessing it from another script is not possible. This is also the case with PropertiesService:
Properties cannot be shared between scripts.
To share, you can use a common file that both scripts can access, like a Drive text file or a spreadsheet.
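For example, a minimal sketch of sharing through a spreadsheet, assuming both scripts have access to the same file (the spreadsheet ID and sheet name are placeholders):
// Writer script: append a key/value pair to a shared spreadsheet
function putShared(key, value) {
  const sheet = SpreadsheetApp.openById('<SHARED_SPREADSHEET_ID>').getSheetByName('cache');
  sheet.appendRow([key, value]);
}

// Reader script: look the value up again from the same spreadsheet
function getShared(key) {
  const rows = SpreadsheetApp.openById('<SHARED_SPREADSHEET_ID>')
    .getSheetByName('cache')
    .getDataRange()
    .getValues();
  const match = rows.find((row) => row[0] === key);
  return match ? match[1] : null;
}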

Autodesk forge configurator inventor add new model authorization error

Hello, we are using Autodesk Forge Configurator Inventor, and we created our own JS function. Below you will find the logic we want to add to the application. On its own we can make it work, but with Forge Configurator Inventor we get an authentication error. We tried a lot of different options but failed to make it load the document.
The error is: GET 401 (Unauthorized)
import repo from '../../Repository';

var options = repo.hasAccessToken() ?
  { accessToken: repo.getAccessToken() } :
  { env: 'Local' };

var documentId = 'urn:MyUrn';

Autodesk.Viewing.Document.load(
  documentId, (doc) => {
    console.log("test");
    let items = doc.getRoot().search(
      {
        type: "geometry",
        role: "3d",
      },
      true
    );
    if (items.length === 0) {
      console.error("Document contains no viewables.");
      return;
    }
    viewer.loadDocumentNode(doc, items[0], {
      keepCurrentModels: true,
      //placementTransform: tr,
    })
      .then(function (model2) {
        secondModel = model2;
        let tr = secondModel.getPlacementTransform();
        let _selecterTr = _selectedModel.getPlacementTransform();
        console.log(_selecterTr);
        tr = _selecterTr;
        secondModel.setPlacementTransform(tr);
        viewer.impl.invalidate(true, true, true);
      });
  }, onDocumentLoadFailure, options);

function onDocumentLoadFailure() {
  console.error('Failed fetching Forge manifest');
}
The main issue is that repo does not provide an access token because it was never needed: the models' SVF contents are always loaded directly from the server, instead of relying on the Model Derivative service to provide them.
You just have to provide an endpoint (e.g. /api/viewables/token) on the server side, by adding a controller that can be called from the client side, to get an access token with the viewables:read scope.
Detailed information here:
https://forge.autodesk.com/blog/drag-and-drop-design-automation-inventor-sample
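As a rough sketch (not taken from the sample project), such a controller just needs to return a two-legged token limited to viewables:read. For example, with Express and axios, where FORGE_CLIENT_ID and FORGE_CLIENT_SECRET are placeholder environment variables:
const express = require('express');
const axios = require('axios');
const router = express.Router();

// Hypothetical endpoint matching the answer: GET /api/viewables/token
router.get('/api/viewables/token', async (req, res) => {
  try {
    // Two-legged OAuth token restricted to the viewables:read scope
    const response = await axios.post(
      'https://developer.api.autodesk.com/authentication/v1/authenticate',
      new URLSearchParams({
        client_id: process.env.FORGE_CLIENT_ID,
        client_secret: process.env.FORGE_CLIENT_SECRET,
        grant_type: 'client_credentials',
        scope: 'viewables:read'
      }),
      { headers: { 'Content-Type': 'application/x-www-form-urlencoded' } }
    );
    res.json(response.data); // { access_token, token_type, expires_in }
  } catch (err) {
    res.status(500).json({ error: 'Failed to obtain Forge token' });
  }
});

module.exports = router;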

Facing challenge to invoke cloud Function from cloud task using oidcToken

I am facing a challenge invoking a Cloud Function from a Cloud Task using an oidcToken.
Here are the details of my IAM setup and code:
const { CloudTasksClient } = require('@google-cloud/tasks');
const client = new CloudTasksClient();

// See https://cloud.google.com/tasks/docs/tutorial-gcf
module.exports = async (payload, scheduleTimeInSec) => {
  const project = process.env.GOOGLE_APPLICATION_PROJECTID;
  const queue = process.env.QUEUE_NAME;
  const location = process.env.QUEUE_LOCATION;
  const callBackUrl = 'https://asia-south2-trial-288318.cloudfunctions.net/cloud-function-node-expres/';

  // Construct the fully qualified queue name.
  const parent = client.queuePath(project, location, queue);
  const body = Buffer.from(JSON.stringify(payload)).toString('base64');

  const task = {
    httpRequest: {
      httpMethod: 'POST',
      url: callBackUrl,
      headers: { 'Content-Type': 'application/json' },
      body
    },
    scheduleTime: {
      seconds: scheduleTimeInSec,
    }
  };

  if (process.env.GOOGLE_APPLICATION_SERVICE_ACCOUNT_EMAIL) {
    task.httpRequest.oidcToken = {
      serviceAccountEmail: process.env.GOOGLE_APPLICATION_SERVICE_ACCOUNT_EMAIL
    };
  }

  const request = {
    parent: parent,
    task: task,
  };

  // Send the create-task request.
  try {
    const [response] = await client.createTask(request);
    return ({ sts: true, taskName: response.name, msg: "Email Schedule Task Created" });
  } catch (e) {
    return ({ sts: true, err: true, errInfo: e, msg: "Unable to Schedule Task. Internal Error." });
  }
};
The process.env.GOOGLE_APPLICATION_SERVICE_ACCOUNT_EMAIL has the Cloud Functions Invoker role, and the Cloud Function has the allAuthenticatedUsers member with the Cloud Functions Invoker role, as per the doc.
But I am still seeing a 401 response received by the Cloud Task, and the Cloud Function is not getting called (see the image below).
Any comment on what's going wrong here?
This seems to be related to the fact that you created the function in Firebase (guessing from the URL). It seems the "Cloud Functions Invoker" role is not enough for Firebase functions. I have replicated similar behavior with a HelloWorld function from Firebase. The error is different (403), but I hope it helps you troubleshoot the same way.
After creating helloWorld in Firebase, I tested it with the gcloud command in the following steps:
Create a service account with the role "Cloud Functions Invoker", or use an existing one.
Download a key for the account in JSON.
Change gcloud to act as the service account:
gcloud auth activate-service-account <service-account@email> --key-file=<key-from-step-2.json>
gcloud functions call helloWorld
As the result of the last action I got this error:
ERROR: (gcloud.functions.call) ResponseError: status=[403], code=[Forbidden], message=[Permission 'cloudfunctions.functions.call' denied on resource 'projects/functions-asia-test-vitooh/locations/us-central1/functions/helloWorld' (or resource may not exist).]
So I created a custom role in IAM: Cloud Functions Invoker plus the permission from the error message, cloudfunctions.functions.call.
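For reference, a sketch of the equivalent gcloud commands (the role ID customFunctionsInvoker and the project ID are placeholders):
# Create a custom role that adds cloudfunctions.functions.call to the invoker permission
gcloud iam roles create customFunctionsInvoker --project=my-project-id \
  --title="Cloud Functions Invoker + call" \
  --permissions=cloudfunctions.functions.invoke,cloudfunctions.functions.call

# Grant the custom role to the service account
gcloud projects add-iam-policy-binding my-project-id \
  --member=serviceAccount:<service-account@email> \
  --role=projects/my-project-id/roles/customFunctionsInvoker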
The function started to work with the same gcloud functions call:
executionId: 3fgndpolu981
result: Hello from Firebase!
I think it will work for you as well: try adding the same permission. If it doesn't work, try the same testing steps.
References:
gcloud auth command
create custom role in Cloud IAM
gcloud function call

how can I properly invoke a google cloud function inside another function [duplicate]

I am using a Cloud Function to call another Cloud Function on the free Spark tier.
Is there a special way to call another Cloud Function, or do you just use a standard HTTP request?
I have tried calling the other function directly like so:
exports.purchaseTicket = functions.https.onRequest((req, res) => {
  fetch('https://us-central1-functions-****.cloudfunctions.net/validate')
    .then(response => response.json())
    .then(json => res.status(201).json(json))
})
But I get the error
FetchError: request to
https://us-central1-functions-****.cloudfunctions.net/validate
failed, reason: getaddrinfo ENOTFOUND
us-central1-functions-*****.cloudfunctions.net
us-central1-functions-*****.cloudfunctions.net:443
It sounds like Firebase is blocking the connection, even though the target is Google-owned and therefore shouldn't be blocked:
the Spark plan only allows outbound network requests to Google owned
services.
How can I use a Cloud Function to call another Cloud Function?
You don't need to go through the trouble of invoking shared functionality via a whole new HTTPS call. You can simply abstract the common bits of code into a regular JavaScript function that gets called by either one. For example, you could modify the template helloWorld function like this:
var functions = require('firebase-functions');

exports.helloWorld = functions.https.onRequest((request, response) => {
  common(response)
})

exports.helloWorld2 = functions.https.onRequest((request, response) => {
  common(response)
})

function common(response) {
  response.send("Hello from a regular old function!");
}
These two functions will do exactly the same thing, but with different endpoints.
To answer the question, you can make an HTTPS request to call another cloud function:
export const callCloudFunction = async (functionName: string, data: {} = {}) => {
  let url = `https://us-central1-${config.firebase.projectId}.cloudfunctions.net/${functionName}`
  await fetch(url, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ data }),
  })
}
(Note we are using the npm package 'node-fetch' as our fetch implementation.)
And then simply call it:
callCloudFunction('search', { query: 'yo' })
There are legitimate reasons to do this. We used this to ping our search cloud function every minute and keep it running. This greatly lowers response latency for a few dollars a year.
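As an illustration, a sketch of such a keep-warm ping as a scheduled function (assuming the callCloudFunction helper above and firebase-functions' Pub/Sub schedules):
const functions = require('firebase-functions');

// Runs every minute and pings the 'search' function to keep an instance warm
exports.keepSearchWarm = functions.pubsub.schedule('every 1 minutes').onRun(async () => {
  await callCloudFunction('search', { query: 'warm-up' });
  return null;
});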
It's possible to invoke another Google Cloud Function over HTTP by including an authorization token. It requires a preliminary HTTP request to obtain the token, which you then use when you call the actual Google Cloud Function that you want to run.
https://cloud.google.com/functions/docs/securing/authenticating#function-to-function
const {get} = require('axios');

// TODO(developer): set these values
const REGION = 'us-central1';
const PROJECT_ID = 'my-project-id';
const RECEIVING_FUNCTION = 'myFunction';

// Constants for setting up the metadata server request
// See https://cloud.google.com/compute/docs/instances/verifying-instance-identity#request_signature
const functionURL = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${RECEIVING_FUNCTION}`;
const metadataServerURL =
  'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=';
const tokenUrl = metadataServerURL + functionURL;

exports.callingFunction = async (req, res) => {
  // Fetch the token
  const tokenResponse = await get(tokenUrl, {
    headers: {
      'Metadata-Flavor': 'Google',
    },
  });
  const token = tokenResponse.data;

  // Provide the token in the request to the receiving function
  try {
    const functionResponse = await get(functionURL, {
      headers: {Authorization: `bearer ${token}`},
    });
    res.status(200).send(functionResponse.data);
  } catch (err) {
    console.error(err);
    res.status(500).send('An error occurred! See logs for more details.');
  }
};
October 2021 update: you should not need to do this from a local development environment. Thank you, Aman James, for clarifying this.
Although the question tag and the other answers concern JavaScript, I want to share a Python example, as it reflects the title and also the authentication aspect mentioned in the question.
Google Cloud Functions provides a REST API interface that includes a call method, which can be used from another Cloud Function.
Although the documentation mentions using Google-provided client libraries, there is still no dedicated one for Cloud Functions in Python.
Instead you need to use the general Google API Client Libraries (this is the Python one).
Probably the main difficulty with this approach is understanding the authentication process.
Generally you need to provide two things to build a client service: credentials and scopes.
The simplest way to get credentials is to rely on the Application Default Credentials (ADC) library. The relevant documentation:
https://cloud.google.com/docs/authentication/production
https://github.com/googleapis/google-api-python-client/blob/master/docs/auth.md
The place to get scopes is each REST API function's documentation page, e.g. the OAuth scope https://www.googleapis.com/auth/cloud-platform.
The complete code example of calling a 'hello-world' cloud function is below.
Before running:
Create a default Cloud Function on GCP in your project.
Keep and note the default service account to use.
Keep the default body.
Note the project_id, function name, and location where you deploy the function.
If you call the function from outside the Cloud Functions environment (locally, for instance), set the environment variable GOOGLE_APPLICATION_CREDENTIALS according to the doc mentioned above.
If you call it from another Cloud Function, you don't need to configure credentials at all.
from googleapiclient.discovery import build
from googleapiclient.discovery_cache.base import Cache
import google.auth
import pprint as pp


def get_cloud_function_api_service():
    class MemoryCache(Cache):
        _CACHE = {}

        def get(self, url):
            return MemoryCache._CACHE.get(url)

        def set(self, url, content):
            MemoryCache._CACHE[url] = content

    scopes = ['https://www.googleapis.com/auth/cloud-platform']

    # If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set,
    # ADC uses the service account file that the variable points to.
    #
    # If the environment variable GOOGLE_APPLICATION_CREDENTIALS isn't set,
    # ADC uses the default service account that Compute Engine, Google Kubernetes Engine,
    # App Engine, Cloud Run, and Cloud Functions provide.
    #
    # See more on https://cloud.google.com/docs/authentication/production
    credentials, project_id = google.auth.default(scopes)

    service = build('cloudfunctions', 'v1', credentials=credentials, cache=MemoryCache())
    return service


google_api_service = get_cloud_function_api_service()

name = 'projects/<project_id>/locations/us-central1/functions/function-1'  # fill in your project ID
body = {
    'data': '{ "message": "It is awesome, you are develop on Stack Overflow language!"}'  # JSON passed as a string
}

result_call = google_api_service.projects().locations().functions().call(name=name, body=body).execute()
pp.pprint(result_call)

# expected output is:
# {'executionId': '3h4c8cb1kwe2', 'result': 'It is awesome, you are develop on Stack Overflow language!'}
These suggestions don't seem to work anymore.
To get this to work for me, I made calls from the client side using httpsCallable and imported the requests into Postman. Some other links at https://firebase.google.com/docs/functions/callable-reference were helpful, but determining where the information was available took a bit of figuring out.
I wrote everything down here, as it takes a bit of explaining and some examples:
https://www.tiftonpartners.com/post/call-google-cloud-function-from-another-cloud-function
Here's an inline version, as the URL might expire.
This 'should' work; it's not tested, but it is based on what I wrote and tested for my own application.
const axios = require('axios');

module.exports = function (name, context) {
  const { protocol, headers } = context.rawRequest;
  const host = headers['x-forwarded-host'] || headers.host;

  // there will be two different paths for
  // production and development
  const url = `${protocol}://${host}/${name}`;
  const method = 'post';
  const auth = headers.authorization;

  return async (...rest) => {
    const data = JSON.stringify({ data: rest });
    const config = {
      method, url, data,
      headers: {
        'Content-Type': 'application/json',
        'Authorization': auth,
        'Connection': 'keep-alive',
        'Pragma': 'no-cache',
        'Cache-control': 'no-cache',
      }
    };
    try {
      const { data: { result } } = await axios(config);
      return result;
    } catch (e) {
      throw e;
    }
  };
};
This is how you would call this function:
const crud = httpsCallable('crud', context);
return await crud('read', ...data);
context comes from the Google Cloud entry point and is the most important piece: it contains the JWT token needed to make the subsequent call to your cloud function (in my example, crud).
To define the other httpsCallable endpoint, you would write an export statement as follows:
exports.crud = functions.https.onCall(async (data, context) => {})
It should work just like magic.
Hopefully this helps.
I found that a combination of two of the methods works best:
const anprURL = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${RECEIVING_FUNCTION}`;
const metadataServerURL =
  'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=';
const tokenUrl = metadataServerURL + anprURL;

// Fetch the token
const tokenResponse = await fetch(tokenUrl, {
  method: "GET",
  headers: {
    'Metadata-Flavor': 'Google',
  },
});
const token = await tokenResponse.text();

const functionResponse = await fetch(anprURL, {
  method: 'POST',
  headers: {
    "Authorization": `bearer ${token}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ "imageUrl": url }),
});

// Convert the response to text
const responseText = await functionResponse.text();

// Convert from text to JSON
const responseJson = JSON.parse(responseText);
Extending Shea Hunter Belsky's answer: be aware that the call to Google's metadata server to fetch the authorization token does not work from a local machine.
Since fetch is not readily available in Node.js and my project was already using the axios library, I did it like this:
const url = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION_NAME}`;
const headers = {
  'Content-Type': 'application/json',
};
const response = await axios.post(url, { data: YOUR_DATA }, { headers });
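A sketch combining this axios style with the metadata-server token from the earlier answer (same REGION / PROJECT_ID / FUNCTION_NAME placeholders; it only works when running inside Google Cloud, not locally):
const axios = require('axios');

const url = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION_NAME}`;
const tokenUrl =
  'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=' + url;

async function callFunction(data) {
  // Fetch an identity token from the metadata server (unreachable from a local machine)
  const tokenResponse = await axios.get(tokenUrl, {
    headers: { 'Metadata-Flavor': 'Google' },
  });
  const token = tokenResponse.data;

  // Call the receiving function with the token attached
  const response = await axios.post(url, { data }, {
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${token}`,
    },
  });
  return response.data;
}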

How to response a plain text in feathersjs websocket api?

I define a Feathers service API as below:
class Monitor {
  find(_) {
    const metrics = prom.register.metrics();
    log.info(metrics);
    return new Promise((resolve) => {
      resolve({ text: metrics });
    });
  }
}

function restFormatter(req, res) {
  res.format({
    'text/plain': function () {
      log('xxxx:', res);
      res.end(`The Message is: "${res.data}"`);
    }
  });
}

module.exports = function () {
  const app = this;

  // Initialize our service with any options it requires
  const service = new Monitor();
  app.configure(rest(restFormatter)).use('/metrics', service);

  // Get our initialized service so that we can bind hooks
  const monitorService = app.service('/metrics');

  // Set up our before hooks
  monitorService.before(hooks.before);

  // Set up our after hooks
  monitorService.after(hooks.after);

  return service;
};

module.exports.Monitor = Monitor;
When I call this API from a browser, I get the response below:
"# HELP nodejs_gc_runs_total Count of total garbage collections.\n# TYPE nodejs_gc_runs_total counter\n\n# HELP nodejs_gc_pause_seconds_total Time spent in GC Pause in seconds.\n# TYPE nodejs_gc_pause_seconds_total counter\n\n# HELP nodejs_gc_reclaimed_bytes_total Total number of bytes reclaimed by GC.\n# TYPE nodejs_gc_reclaimed_bytes_total counter\n"
From the output above you can see that FeathersJS doesn't return the data in plain-text format: it serializes my response text into a JSON string. Below is the output from an Express service shown in the browser:
# HELP nodejs_gc_runs_total Count of total garbage collections.
# TYPE nodejs_gc_runs_total counter
# HELP nodejs_gc_pause_seconds_total Time spent in GC Pause in seconds.
# TYPE nodejs_gc_pause_seconds_total counter
# HELP nodejs_gc_reclaimed_bytes_total Total number of bytes reclaimed by GC.
# TYPE nodejs_gc_reclaimed_bytes_total counter
# HELP newConnection The number of requests served
# TYPE newConnection counter
This output is what I really want. How can I make the FeathersJS service return it?
Below is my FeathersJS configuration:
app
  .use(compress())
  .options('*', cors())
  .use(cors())
  .use('/', serveStatic(app.get('public')))
  .use(bodyParser.json())
  .use(bodyParser.urlencoded({ extended: true }))
  .configure(hooks())
  .configure(rest())
  .configure(
    swagger({
      docsPath: '/docs',
      uiIndex: path.join(__dirname, '../public/docs.html'),
      info: {
        title: process.env.npm_package_fullName,
        description: process.env.npm_package_description
      }
    })
  )
  .configure(
    primus(
      {
        transformer: 'websockets',
        timeout: false
      },
      (primus) => {
        primus.library();
        primus.save(path.join(__dirname, '../public/dist/primus.js'));
      }
    )
  )
  .configure(services)
  .configure(middleware);
You are configuring feathers-rest twice, which is why you still get the old output. Remove app.configure(rest(restFormatter)) from your service file, then either change .configure(rest()) to .configure(rest(restFormatter)) in the main file to apply the formatter to all services, or register a custom middleware for the service that does the formatting just for that service:
app.use('/metrics', service, function (req, res) {
  res.format({
    'text/plain': function () {
      log('xxxx:', res);
      res.end(`The Message is: "${res.data}"`);
    }
  });
});
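For the first option, the change in the main configuration file is a one-liner (reusing the restFormatter from the question):
app
  // ...
  .configure(rest(restFormatter)) // was .configure(rest())
  // ...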