What would be the best automatically available value to return (e.g. in an HTTP function) that uniquely identifies the currently deployed version or deploy timestamp of my functions (so not the version of the Node runtime)? I'm looking for e.g. a number or timestamp that gets incremented automatically whenever I deploy a new version of my functions.
I thought of using this:
const fs = require("fs");

function currentFileTimestamp() {
  var stats = fs.statSync("./index.js");
  var mtime = new Date(stats.mtime);
  return mtime;
}
But this always seems to return Tue Jan 01 1980 00:00:01 GMT+0000.
Does the Firebase Admin API have a "current deployed version" timestamp or something that we could use?
There are some alternatives you can try. On the Node.js 10 runtime there is an environment variable (more information here) called K_REVISION, which holds the version of your Cloud Function.
For Python and older Node.js runtimes there is also an environment variable called X_GOOGLE_FUNCTION_VERSION that you can check for the deployed version.
You can get more details on how to use them, with examples in this similar case here.
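Reading either variable from an HTTP function can be sketched like this; the helper name and the "unknown" fallback are my own, not part of any Cloud Functions API:

```python
import os

def deployed_version():
    # K_REVISION is set on newer runtimes; X_GOOGLE_FUNCTION_VERSION on
    # older ones. Fall back to a placeholder outside Cloud Functions.
    return (os.environ.get("K_REVISION")
            or os.environ.get("X_GOOGLE_FUNCTION_VERSION")
            or "unknown")
```

You could return this string from your HTTP function to identify which deployment served a given request.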
I just deployed a test function to Google Cloud Functions using the Python 3.10 runtime and can confirm that the environment variable is called K_REVISION, as wheatazogg discusses in his post about nodejs10:
import os

def testdeployfunction(data, context):
    for name, value in os.environ.items():
        print("{0}: {1}".format(name, value))
    return 'OK'
I am testing the use of libraries for a first project I want to run in Google Sheets.
I created a Google Sheet, with just a single button (a drawing) in it. To this button, I assigned my script ‘myFunction’.
This ‘myFunction’
Calls ‘justMyTestFunction’ in library Tlib
Logs a locally defined constant string (logInfo) to the Logger
Logs a string, defined in library Tlib, to the Logger
Below you see the content of my Google Sheet Script
const logInfo = 'This is a local sentence; not stored in any library';
function myFunction() {
  TLib.justMyTestFunction();
  Logger.log(logInfo);
  Logger.log(TLib.logSentence);
}
And here is the content of my TLib library:
const logSentence = 'This sentence is stored as a constant in library TestLibrary...';
function justMyTestFunction() {
  Logger.log('This sentence is hardcoded in function justMyTestFunction of library Tlib...');
}
When clicking the button in my Google sheet, the result is the following Logger information:
Stackdriver logs
Aug 31, 2020 22:11:25 Info This sentence is hardcoded in function justMyTestFunction of library Tlib...
Aug 31, 2020 22:11:25 Info This is a local sentence; not stored in any library
Aug 31, 2020 22:11:25 Info null
From which I conclude:
The call to ‘justMyTestFunction’ in library Tlib was successful
The access to the locally defined string was successful and could be logged (of course 😉)
The access to the string defined in the library (Tlib.logSentence) was not successful. As a result, the null value is sent to the Logger
What I cannot understand: apparently the link with the library is OK, because my Sheet can access and execute the function ‘justMyTestFunction’.
But the same Sheet has no access to a ‘global’ constant defined in that library.
Obviously I am missing something trivial here, but I am out of ideas. Can anyone point me to the cause of the problem and its solution?
Thanks a lot!
It seems that, at the current stage, when const is used for a global variable in a library, the value cannot be retrieved from the client. This might be resolved in a future update. As a workaround for now, please use var instead of const, as follows.
From:
const logSentence = 'This sentence is stored as a constant in library TestLibrary...';
To:
var logSentence = 'This sentence is stored as a constant in library TestLibrary...';
After the above modification, please test it again. When the developer mode of the installed library is "ON", you will be able to test the modified script.
Reference:
Libraries
How can I use the Koa library, the Express replacement, in Cloud Functions?
I know Koa uses ES2017 features and makes heavy use of async/await in JavaScript.
Or it might not be needed at all when working with Cloud Functions, because the Firebase system won't send multiple calls to the same Cloud Function until it finishes the previous one?
It's unclear to me.
I know it now requires Node 8.x, and Node.js 8.9.x is now LTS.
Reading from the Cloud Functions docs:
Base Image Cloud Functions uses a Debian-based execution environment
and includes contents of the gcr.io/google-appengine/nodejs Docker
image, with the Node.js runtime installed in the version, specified
above:
FROM gcr.io/google-appengine/nodejs
RUN install_node v6.14.0
To see what is included in the image, you can check its GitHub
project, or pull and inspect the image itself. Updates to the language
runtime (Node.js) are generally done automatically (unless otherwise
notified), and include any changes in the definition of the base
image.
And I saw a pull request back in November 2017, adding Nodejs v8. Here's hoping it can finally land in Google Cloud Functions 🤞🏻
UPDATE: Google Cloud Functions now support Node.js 8 and even Python!
Referring to the release notes from Google... Cloud Functions Release Notes
The supported Node version is still v6, same for Firebase. You need to wait a while before they release v8. I'm pretty sure they will move to v8 when v6 is no longer supported, but hopefully earlier...
Use babel:
index.js:
---------

'use strict';
require('@babel/register')
require('babel-polyfill')
const http = require('http')
const createApp = require('./app/app.js')
const handle = createApp().callback()

if (process.env.IS_LOCAL_DEPLOYMENT) {
  // to use the same code in a local server
  http.createServer(handle).listen(3000)
} else {
  module.exports.http = (request, response) => {
    handle(request, response)
  };
}
app.js:
-------

'use strict';
const Koa = require('koa')

module.exports = () => {
  const app = new Koa()
  app.use(......)
  return app
}
package.json:
------------

"scripts": {
  .
  .
  "start": "export IS_LOCAL_DEPLOYMENT=true && node index"
  .
  .
}
I just saw in Cloud Functions Console editor for one of my functions that Node 8 is now a runtime option. See screenshot:
I'm building an application that stores files into the FIWARE Object Storage. I don't quite understand the correct way of storing files into the storage.
The Python code snippet below, taken from the Object Storage - User and Programmers Guide, shows 2 ways of doing it:
def store_text(token, auth, container_name, object_name, object_text):
    headers = {"X-Auth-Token": token}
    # 1. version
    #body = '{"mimetype":"text/plain", "metadata":{}, "value" : "' + object_text + '"}'
    # 2. version
    body = object_text
    url = auth + "/" + container_name + "/" + object_name
    return swift_request('PUT', url, headers, body)
The 1. version confuses me, because when I first looked at the only Node.js module (repo: fiware-object-storage) that works with Object Storage, it seemed to use the 1. version. As the module was making calls to the old (v1.1) API version instead of the presumably newest (v2.0), and comparing against the Python example, I'm not sure whether that is an outdated way of doing it or not.
As I played more with the module, I realised it didn't work and its code was a total mess. So I forked the project and quickly understood that I would need to rewrite it from the ground up, taking the above-mentioned Python example from the usage guide as a reference. Link to my repo.
As of writing this, the only methods that aren't implemented yet are object storing (PUT) and object fetching (GET).
I had some additional questions about the Object Storage, which I sent to fiware-lab-help@lists.fiware.org, but haven't heard anything back, so I'm asking them here.
I haven't got much experience with writing API libraries. Do I need to worry about the auth token expiring? I presume it is not necessary to re-authenticate every time we interact with the storage: the authentication should happen once when the server starts up (we create an instance), and the library keeps the token internally. Should I implement some kind of mechanism that refreshes the token?
Does the tenant id change? From the quote below I presume that getting a tenant is just a one-time deal; later you can use it in the config to make fewer authentication calls.
A valid token is required to access an object store. This section
describes how to get a valid token assuming an identity management
system compatible with OpenStack Keystone is being used. If the
username, password and tenant details are known, only step 3 is
required. source
During authentication, when fetching tenants, how should I select the "right" one? For now I'm just taking the first one, similar to what the example code does.
Is it true that an object storage container belongs to only a single region?
Use only what you call version 2. Ignore your version 1: it is commented out in the example and should be removed from the documentation.
(1) The token will be valid for some period of time. This could be an hour or a day, depending on the setup. This period of time should be specified in the token that is returned by the authentication service. The token needs to be periodically refreshed.
(2) The tenant id does not change.
(3) Typically only one tenant id is returned. It is possible, however, that you were assigned more than one id, in which case you have to pick which one you are currently using. Containers typically belong to a single tenant and are not shared between tenants.
(4) Containers are typically limited to a single region. This may change in the future when multi-region support for a container is added to Swift.
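Regarding (1), the refresh logic can live in a small client-side cache. Here is a minimal sketch; the class name, the safety_margin parameter, and the fetch_token callable are all illustrative assumptions, not part of any FIWARE or Keystone client API:

```python
import time

class TokenCache:
    """Caches an auth token and re-fetches it shortly before expiry."""

    def __init__(self, fetch_token, safety_margin=60):
        # fetch_token: any callable you supply that authenticates and
        # returns (token, lifetime_in_seconds). Hypothetical interface.
        self._fetch_token = fetch_token
        self._safety_margin = safety_margin
        self._token = None
        self._expires_at = 0.0

    def get(self):
        # Re-authenticate only when the cached token is near expiry.
        if self._token is None or time.time() >= self._expires_at - self._safety_margin:
            self._token, lifetime = self._fetch_token()
            self._expires_at = time.time() + lifetime
        return self._token
```

Each storage request would then take its X-Auth-Token header from cache.get() instead of a token fetched once at startup.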
Solved my troubles and created the NPM module that works with the FIWARE Object Storage: https://github.com/renarsvilnis/fiware-object-storage-ge
I've been using the Google apiclient library in python for various Google Cloud APIs - mostly for Google Compute - with great success.
I want to start using the library to create and control the Google Logging mechanism offered by the Google Cloud Platform.
However, this is a beta version, and I can't find any real documentation or example on how to use the logging API.
All I was able to find are high-level descriptions such as:
https://developers.google.com/apis-explorer/#p/logging/v1beta3/
Can anyone provide a simple example on how to use apiclient for logging purposes?
for example creating a new log entry...
Thanks for the help
Shahar
I found this page:
https://developers.google.com/api-client-library/python/guide/logging
Which states you can do the following to set the log level:
import logging
logger = logging.getLogger()
logger.setLevel(logging.INFO)
However it doesn't seem to have any impact on the output which is always INFO for me.
I also tried setting httplib2 to debuglevel 4:
import httplib2
httplib2.debuglevel = 4
Yet I don't see any HTTP headers in the log :/
I know this question is old, but it is getting some attention, so I guess it might be worth answering it in case someone else comes here.
Stackdriver Logging Client Libraries for Google Cloud Platform are not in beta anymore, as they hit General Availability some time ago. The link I shared contains the most relevant documentation for installing and using them.
After running the command pip install --upgrade google-cloud-logging, you will be able to authenticate with your GCP account, and use the Client Libraries.
Using them is as easy as importing the library with a command such as from google.cloud import logging, then instantiate a new client (which you can use by default, or even pass the Project ID and Credentials explicitly) and finally work with Logs as you want.
You may also want to visit the official library documentation, where you will find all the details of how to use the library, which methods and classes are available, and how to do most of the things, with lots of self-explanatory examples, and even comparisons between the different alternatives on how to interact with Stackdriver Logging.
As a small example, let me also share a snippet that retrieves the five most recent log entries with severity "warning" or higher:
# Import the Google Cloud Python client library
from google.cloud import logging
from google.cloud.logging import DESCENDING

# Instantiate a client
logging_client = logging.Client(project=<PROJECT_ID>)

# Set the filter to apply to the logs; this one retrieves GAE logs from
# the default service with a severity of at least "warning"
FILTER = 'resource.type:gae_app and resource.labels.module_id:default and severity>=WARNING'

# List the entries in DESCENDING order, applying the FILTER,
# and stop after the five most recent entries
for i, entry in enumerate(logging_client.list_entries(order_by=DESCENDING, filter_=FILTER)):  # API call
    print('{} - Severity: {}'.format(entry.timestamp, entry.severity))
    if i >= 4:
        break
Bear in mind that this is just a simple example, and that many things can be achieved using the Logging Client Library, so you should refer to the official documentation pages that I shared in order to get a more deep understanding of how everything works.
However it doesn't seem to have any impact on the output which is
always INFO for me.
add a logging handler, e.g.:

import logging

logger = logging.getLogger()
logger.setLevel(logging.DEBUG)

formatter = logging.Formatter('%(asctime)s %(process)d %(levelname)s: %(message)s')
consoleHandler = logging.StreamHandler()
consoleHandler.setLevel(logging.DEBUG)
consoleHandler.setFormatter(formatter)
logger.addHandler(consoleHandler)
I have a small app deployed on Modulus.io, and there's a section for a variable called METEOR_SETTINGS, which is a JSON object that holds some API keys.
I have two API keys in this object.
For whatever reason, my google-analytics tracker always reports that my tracker is MISSING.
I'm using the iron-router GA package.
Here's a sample of my JSON string:
{ "PrerenderIO": {"token": "xxxxxxxxxxxxx"}, "public": {"ga": {"id": "UA-xxxxxxxx-x"} } }
Has anyone setup a METEOR_SETTINGS successfully with GA and another service? What am I doing wrong?
Be sure to run a modulus project restart every time you change the settings options on the Modulus dashboard admin:
modulus project restart
modulus deploy