I am trying to deploy a Cloud Function, using the GitHub link below, to back up the Datastore.
https://github.com/portsoc/cloud-simple-datastore-backup/blob/master/index.js
After updating the variable BUCKET_NAME with my Cloud Storage bucket name, I ran it in Cloud Shell with the command node index.js, and it backed up the Datastore successfully.
But when I then run the following command to deploy it:
gcloud functions deploy main \
  --runtime nodejs12 --trigger-http --allow-unauthenticated \
  --region=asia-southeast2
After a while, it fails with the error below:
Deploying function (may take a while - up to 2 minutes)...failed.
ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Function failed on loading user code. This is likely due to a bug in the user code. Error message: Error: please examine your function logs to see the error cause: https://cloud.google.com/functions/docs/monitoring/logging#viewing_logs. Additional troubleshooting documentation can be found at https://cloud.google.com/functions/docs/troubleshooting#logging. Please visit https://cloud.google.com/functions/docs/troubleshooting for in-depth troubleshooting documentation.
Any suggestions on this?
Cloud Functions have a specific set of signatures that must be used.
I'm less familiar with JavaScript/Node.js, but I think the function you reference is intended to be invoked directly, as you do with node index.js, and this is incompatible with Cloud Functions.
Please review Write Cloud Functions to understand the signature type that you will need. You will probably have to tweak the authentication in the example to better meet your needs too.
Almost certainly you don't want --allow-unauthenticated either.
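For illustration, here is a minimal sketch of the HTTP signature I mean (my assumption: the exported name must match the deployed function name, since gcloud functions deploy main looks for an export named main unless you pass --entry-point):
exports.main = async (req, res) => {
  // do the backup work here, then terminate the request;
  // an HTTP function must send a response, otherwise the
  // call hangs until it times out.
  res.status(200).send("Backup started");
};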
After changing the code to the version below, I can deploy it.
const { GoogleAuth } = require("google-auth-library");

// fill in your bucket name here:
const BUCKET_NAME = "gs://testinbbk10";

exports.myfunction = async (req, res) => {
  try {
    const auth = new GoogleAuth({
      scopes: "https://www.googleapis.com/auth/cloud-platform",
    });
    const client = await auth.getClient();
    const projectId = await auth.getProjectId();
    console.log(`Project ID is ${projectId}`);
    // ask the Datastore admin API to start a managed export into the bucket
    const res2 = await client.request({
      method: "POST",
      url: `https://datastore.googleapis.com/v1/projects/${projectId}:export`,
      data: {
        outputUrlPrefix: BUCKET_NAME,
      },
    });
    console.log("RESPONSE:");
    console.log(res2.data);
  } catch (error) {
    console.error("ERROR");
    console.error(error);
  }
};
But when I try to access the provided link (deploy link), it shows me the error: could not handle the request.
I am confused about how to properly deploy this as a Google Cloud Function. I just want to deploy one simple Cloud Function to back up Datastore in Google Cloud.
My unit test launch looks like this. As you can see, I have exploited CLI options to install a VSIX my CI/CD has already produced, and then also tried to install ms-vscode-remote.remote-ssh because I want to re-run the tests on a remote workspace.
import * as path from 'path';
import * as fs from 'fs';
import { runTests } from '@vscode/test-electron';

async function main() {
  try {
    // The folder containing the Extension Manifest package.json
    // Passed to `--extensionDevelopmentPath`
    const extensionDevelopmentPath = path.resolve(__dirname, '../../');

    // The path to the extension test runner script
    // Passed to --extensionTestsPath
    const extensionTestsPath = path.resolve(__dirname, './suite/index');

    // Pick the most recently versioned VSIX produced by CI/CD
    const vsixName = fs.readdirSync(extensionDevelopmentPath)
      .filter(p => path.extname(p) === ".vsix")
      .sort((a, b) => a < b ? 1 : a > b ? -1 : 0)[0];

    const launchArgsLocal = [
      path.resolve(__dirname, '../../src/test/test-docs'),
      "--install-extension",
      vsixName,
      "--install-extension",
      "ms-vscode-remote.remote-ssh"
    ];

    const SSH_HOST = process.argv[2];
    const SSH_WORKSPACE = process.argv[3];
    const launchArgsRemote = [
      "--folder-uri",
      `vscode-remote://ssh-remote+testuser@${SSH_HOST}${SSH_WORKSPACE}`
    ];

    // Download VS Code, unzip it and run the integration test
    await runTests({ extensionDevelopmentPath, extensionTestsPath, launchArgs: launchArgsLocal });
    await runTests({ extensionDevelopmentPath, extensionTestsPath, launchArgs: launchArgsRemote });
  } catch (err) {
    console.error(err);
    console.error('Failed to run tests');
    process.exit(1);
  }
}

main();
runTests downloads and installs VS Code, and passes through the parameters I supply. For the local file system all the tests pass, so the extension from the VSIX is definitely installed.
But ms-vscode-remote.remote-ssh doesn't seem to be installed - I get this error:
Cannot get canonical URI because no extension is installed to resolve ssh-remote
and then the tests fail because there's no open workspace.
This may be related to the fact that CLI installation of multiple extensions repeats the --install-extension switch. I suspect the switch name is used as a hash key.
What to do? Well, I'm not committed to any particular course of action, just platform independence. If I knew how to do a platform-independent headless CLI installation of VS Code:latest in a GitHub Action, that would certainly do the trick. I could then use the CLI directly to install the extensions before the tests, and pass the installation path, which would also require a unified way to get the path to VS Code.
Update 2022-07-20
Having figured out how to do a platform-independent headless CLI installation of VS Code:latest in a GitHub Action, followed by installation of the required extensions, I face new problems.
The test framework options include a path to an existing installation of VS Code. According to the interface documentation, supplying this should cause the test to use the existing installation instead of installing VS Code; this is why I thought the above installation would solve my problems.
However, the option seems to be ignored.
My latest iteration uses an extension dependency on remote-ssh to install it. There's a new problem: how to get the correct version of my extension onto the remote host. By default the remote host uses the marketplace version, which obviously won't be the version we're trying to test.
I would first try with only one --install-extension option, just to check if any extension is installed.
I would also check if the same set of commands works locally (install VSCode and its remote SSH extension)
Testing it locally (with only one extension) also allows you to check whether that extension has any dependencies (like Remote SSH - Editing).
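If the repeated --install-extension switch really is being collapsed, you could also sidestep launchArgs entirely: download VS Code first, install each extension with a separate CLI call, then point runTests at that same installation. A sketch of that pattern (function names as in recent @vscode/test-electron versions; the extension IDs and paths below are placeholders):
const cp = require('child_process');
const {
  downloadAndUnzipVSCode,
  resolveCliArgsFromVSCodeExecutablePath,
  runTests,
} = require('@vscode/test-electron');

async function installAndTest() {
  const vscodeExecutablePath = await downloadAndUnzipVSCode('stable');
  const [cliPath, ...cliArgs] = resolveCliArgsFromVSCodeExecutablePath(vscodeExecutablePath);

  // One CLI invocation per extension, instead of repeating
  // --install-extension inside a single launchArgs array.
  for (const ext of ['path/to/your.vsix', 'ms-vscode-remote.remote-ssh']) {
    cp.spawnSync(cliPath, [...cliArgs, '--install-extension', ext], {
      encoding: 'utf-8',
      stdio: 'inherit',
    });
  }

  // Reuse the same installation so the extensions are present.
  await runTests({
    vscodeExecutablePath,
    extensionDevelopmentPath: 'path/to/extension',
    extensionTestsPath: 'path/to/suite/index',
  });
}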
After two weeks of trying to run my site, I'm asking for your help.
Has anyone hosted Sails.JS on PlanetHoster?
My queries don't work because the connection to the database doesn't seem to be established.
Here's an example of some very simple queries:
await User.findOne({ email: email });
Here's what's displayed in the browser error console:
Uncaught (in promise) Error: Request failed with status code 500
I've tried to handle the errors but nothing is displayed...
try {
  await User.findOne({ email: email });
} catch (err) {
  // nothing
}
So I've deduced that it was a problem with calling the database.
Unfortunately, I have no way to read the error logs ...
Yet I've set up the production.js file (config/env/production.js), and when I run NODE_ENV=production node app.js, it still starts in development mode. In fact, PlanetHoster doesn't require running the command sails lift; it just runs the platform already ...
I'm currently at a total loss as to where to go from here, so if you have any suggestions, I'll take them with pleasure.
Thank you
Environment: Sails v1.0.2
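For reference, the database connection belongs in config/env/production.js; a minimal sketch of the shape of that file (the adapter and connection URL below are placeholders, not my real values):
// config/env/production.js (sketch; adapter and URL are placeholders)
module.exports = {
  datastores: {
    default: {
      adapter: 'sails-mysql',
      url: 'mysql://user:password@host:3306/database',
    },
  },
  models: {
    migrate: 'safe', // never auto-migrate in production
  },
};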
I am developing a cross-platform app with NativeScript and Firebase, and I have some cloud functions triggered onCreate and onWrite. When the functions are "cold" (after a longer period of inactivity), I get the error below most of the time and the function fails to execute properly. Subsequent requests do work, though.
Error: Unexpected error while acquiring application default credentials: Could not load the default credentials. Browse to https://developers.google.com/accounts/docs/application-default-credentials for more information.
at GoogleAuth.<anonymous> (/user_code/node_modules/firebase-admin/node_modules/google-auth-library/build/src/auth/googleauth.js:229:31)
at step (/user_code/node_modules/firebase-admin/node_modules/google-auth-library/build/src/auth/googleauth.js:47:23)
at Object.next (/user_code/node_modules/firebase-admin/node_modules/google-auth-library/build/src/auth/googleauth.js:28:53)
at fulfilled (/user_code/node_modules/firebase-admin/node_modules/google-auth-library/build/src/auth/googleauth.js:19:58)
at process._tickDomainCallback (internal/process/next_tick.js:135:7)
My first three lines of functions.js look like this (as in the documentation):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
I've tried using a service account file generated from the console (as described here), but then I get a different error, which I guess is because I'm running it on Google's servers and am not hosting it myself.
Any idea why this is happening and how I can prevent this?
With the new firebase-functions 1.x SDK, you initialize the Admin SDK like this:
const admin = require('firebase-admin');
admin.initializeApp();
Be sure you're using the latest version of that module.
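For example, a minimal background trigger using that initialization might look like this (the collection path and written fields are placeholders, not from your app):
const functions = require('firebase-functions');
const admin = require('firebase-admin');

// No arguments: on Cloud Functions the Admin SDK discovers
// application default credentials by itself.
admin.initializeApp();

// Hypothetical Firestore onCreate trigger.
exports.onUserCreated = functions.firestore
  .document('users/{userId}')
  .onCreate((snapshot, context) => {
    return admin.firestore()
      .doc(`profiles/${context.params.userId}`)
      .set({ createdAt: admin.firestore.FieldValue.serverTimestamp() });
  });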
I am working with my client, so I cloned the git repo and built the application, which uses AWS KMS to generate a data key.
It all works well on the live server, but it fails in my local environment.
Here is the code snippet and the resulting error.
const AWS = require('aws-sdk');

AWS.config.update({ region: 'eu-central-1' });
const kms = new AWS.KMS({ apiVersion: '2014-11-01' });

kms.generateDataKey({
  KeyId: 'XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX',
  KeySpec: 'AES_256',
}).promise()
  .catch(err => {
    console.error('generateDataKey error', err.message, err.stack);
    throw err;
  })
  .then(data => {
    console.log(data);
  });
Is there a way to fix this error?
"GenerateDataKey error Signature expired...."
When you send a request signed using the AWS SigV4 protocol (to KMS or any other AWS service), the request includes a timestamp from when the signature was generated. The tolerance is 5 minutes. This mechanism is in place to make replay attacks harder (they essentially have a smaller window in which to be performed). More information here.
Since the same request is working fine on your server, but failing locally, I think the clock on your local workspace is off by more than five minutes.
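Syncing your system clock is the real fix, but the AWS SDK for JavaScript (v2) can also compensate for drift: with correctClockSkew enabled, it detects skew from a failed request and retries with an adjusted timestamp. A minimal sketch:
const AWS = require('aws-sdk');

// Let the SDK detect clock skew from error responses and
// retry requests with corrected timestamps.
AWS.config.update({
  region: 'eu-central-1',
  correctClockSkew: true,
});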
I am following the tutorial on
https://cloud.google.com/datastore/docs/getstarted/start_nodejs/
trying to use Datastore from my Compute Engine project.
Step 2 in the tutorial mentions that I do not have to create new service account credentials when running from Compute Engine.
I run the sample with:
node test.js abc-test-123
where abc-test-123 is my project ID, and that project has enabled full cloud API access, including the Datastore API.
After uploading the code and executing the sample, I got the following error:
Adams: { 'rpc error': { [Error: Invalid Credentials] code: 401,
errors: [ [Object] ] } }
Update:
I worked around it by changing the default sample code to use JWT credentials (with a generated .json key file), and things are working now.
Update 2:
This is the scope config when I run:
gcloud compute instances describe abc-test-123
And the result:
serviceAccounts:
  scopes:
  - https://www.googleapis.com/auth/cloud-platform
According to the doc:
You can set scopes only when you create a new instance, and cannot change or expand the list of scopes for existing instances. For simplicity, you can choose to enable full access to all Google Cloud Platform APIs with the https://www.googleapis.com/auth/cloud-platform scope.
I still welcome any answer about why the original code did not work in my case.
Thanks for reading!
This most likely means that when you created the instance, you didn't specify the right scopes (datastore and userinfo-email, according to the tutorial). You can check that by executing the following command:
gcloud compute instances describe <instance>
Look for serviceAccounts/scopes in the output.
There are two ways to create an instance with the right scopes:
gcloud compute instances create $INSTANCE_NAME --scopes datastore,userinfo-email
Using the web UI: under Access & Security, enable User Info and Datastore.