I am trying to deploy a Google Cloud Function. I started by just adding the initial requirements to my index.js file:
// Import the Google Cloud client libraries
const nl = require('@google-cloud/language')();
const speech = require('@google-cloud/speech')();
const storage = require('@google-cloud/storage')();
But I get the following message when deploying:
Detailed stack trace: TypeError: require(...) is not a function
This only happens with the @google-cloud/speech and @google-cloud/language modules; the @google-cloud/storage module loads fine as a function (I tested by commenting out the first two).
Any advice will be greatly appreciated.
Borrigan
With reference to this GitHub comment, there were some changes in the google-cloud v2 packages,
so you import packages such as:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({
    // config...
});
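The same v2-style pattern should apply to the other two libraries from the question; the class names below are the named exports those packages document:
const {LanguageServiceClient} = require('@google-cloud/language');
const {SpeechClient} = require('@google-cloud/speech');
const nl = new LanguageServiceClient();
const speech = new SpeechClient();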
Google Cloud Functions are Node.js modules, so the syntax is the same as Node.js syntax.
Regarding your problem:
you have to write
const storage = require('@google-cloud/storage');
(without the () at the end of each statement)
So the correct declaration will be:
// Import the Google Cloud client libraries
const nl = require('@google-cloud/language');
const speech = require('@google-cloud/speech');
const storage = require('@google-cloud/storage');
I hope this helps.
It tells you that whatever you required is not a function and therefore can't be invoked with ().
if you look here: https://www.npmjs.com/package/@google-cloud/language#using-the-client-library
you see that a service object exposing multiple client classes is returned, so you should set it up like this:
const nl = require('@google-cloud/language');
const language = new nl.LanguageServiceClient();
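As a quick sanity check, a minimal usage sketch (the sample text here is arbitrary):
language.analyzeSentiment({
    document: {content: 'Hello, world!', type: 'PLAIN_TEXT'}
}).then(results => {
    console.log(results[0].documentSentiment.score);
});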
Follow the new syntax below:
const {Storage} = require('@google-cloud/storage');
const googleCloudStorage = new Storage({
    projectId: 'your-project-id',
    keyFilename: 'path-to-json-config-file'
});
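A hypothetical usage example; the bucket and file names are placeholders:
googleCloudStorage
    .bucket('my-bucket')
    .file('my-file.txt')
    .download()
    .then(data => console.log(data[0].toString()));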
Can I have multiple Apps Script functions in google.script.run, like:
google.script.run.withSuccessHandler(function).scriptfunction1(variable).scriptfunction2(variable);
No. The .scriptfunction1(variable) function cannot return a script runner. It can only return a value, an object or array that ultimately contains primitives, or undefined.
But you can do something like this (from the documentation):
const myRunner = google.script.run.withFailureHandler(onFailure);
const myRunner1 = myRunner.withSuccessHandler(onSuccess);
const myRunner2 = myRunner.withSuccessHandler(onDifferentSuccess);
myRunner1.doSomething();
myRunner1.doSomethingElse();
myRunner2.doSomething();
I am new to Forge and using Node.js. I am having some difficulty getting a simple 3-legged OAuth process to work.
This is how far I have got (below). It gives me the error "Cannot GET /api/forge/oauth/callback".
I have checked that my callback url matches what is in the Forge App.
Ultimately, what I am trying to achieve is getting the shared link for a newly created file in Fusion Team, or at least opening the browser to the file overview page.
Would anybody be able to help with this?
var express = require('express');
var ForgeSDK = require('forge-apis');
const { stringify } = require('node:querystring');
var opn = require('opn');
// Set up Express web server
var app = express();
// Set the Forge Variables
var FORGE_CLIENT_ID = 'client-id';
var FORGE_CLIENT_SECRET = 'client-secret';
var REDIRECT_URL = 'http://localhost:3000/api/forge/oauth/callback';
// This is for web server to start listening to port 3000
app.set('port', 3000);
var server = app.listen(app.get('port'), function () {
    console.log('Server listening on port ' + server.address().port);
});
// Initialize the 3-legged OAuth2 client, set specific scopes and optionally set the `autoRefresh` parameter to true
// if you want the token to auto refresh
var autoRefresh = true;
var oAuth2ThreeLegged = new ForgeSDK.AuthClientThreeLegged(FORGE_CLIENT_ID, FORGE_CLIENT_SECRET, REDIRECT_URL, [
    'data:read',
    'data:write'
], autoRefresh);
// Generate a URL page that asks for permissions for the specified scopes.
var AuthURL = oAuth2ThreeLegged.generateAuthUrl();
console.log(AuthURL)
opn('https://developer.api.autodesk.com/authentication/v1/authorize?response_type=code&client_id=<<client-id>>&redirect_uri=http://localhost:3000/api/forge/oauth/callback&scope=data:read+data:write&state=undefined', {app: 'chrome'})
Where is the callback API in your code?
BTW, 3-legged OAuth is implemented in Node.js here; you can use it as a reference:
https://learnforge.autodesk.io/#/oauth/3legged/nodejs
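For this question specifically, here is a minimal sketch of the missing callback route, reusing the oAuth2ThreeLegged client from the question; the route path must match the redirect URI registered with your Forge app, and the success/error handling is illustrative:
app.get('/api/forge/oauth/callback', function (req, res) {
    // Exchange the authorization code for an access token
    oAuth2ThreeLegged.getToken(req.query.code)
        .then(function (credentials) {
            // credentials.access_token can now be used for data:read / data:write calls
            res.send('Authentication succeeded');
        })
        .catch(function (err) {
            res.status(500).send(err.toString());
        });
});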
I am using the v3 .NET Google APIs. This seems like it should be a simple task, but nothing I have tried seems to work.
I am able to get the file resource for the file I want by doing this -
var fileRequest = service.Files.Get(fileId);
fileRequest.Fields = "*";
var fileResponse = fileRequest.Execute();
But then I have not been able to actually find a way to download the file. If I use this -
var exportRequest = service.Files.Export(fileResponse.Id, fileResponse.MimeType);
exportRequest.Download(stream);
It returns a 403 Forbidden status with a message stating that Export is only for Google docs.
I've tried this -
var downloadTask = exportRequest.MediaDownloader.DownloadAsync(@"https://docs.google.com/uc?export=download&id=" + fileResponse.Id, file);
downloadTask.Wait();
But that just returns the markup for the page for the user to login. I created the export request using a service account so login should not be required.
Does anyone have a sample of doing this in .NET using the v3 Google API?
Ok, I literally got this to work 5 minutes after posting this. Here is what I found works (I know I tried this before and it did not work, but who knows) -
public static async Task DownloadFile(string fileId, DriveService service)
{
    // Fetch the file metadata (the name and size are used below)
    var fileRequest = service.Files.Get(fileId);
    fileRequest.Fields = "*";
    var fileResponse = fileRequest.Execute();
    // A second Get request performs the actual media download
    var exportRequest = service.Files.Get(fileResponse.Id);
    // you would need to know the file size
    var size = fileResponse.Size;
    var file = new FileStream(fileResponse.Name, FileMode.OpenOrCreate, FileAccess.ReadWrite);
    file.SetLength((long)size);
    var response = await exportRequest.DownloadAsync(file);
    if (response.Status == Google.Apis.Download.DownloadStatus.Failed)
    {
        Console.WriteLine("Download failed");
    }
}
In my Google Cloud Functions script, I want to delete a Google Cloud Storage file by using the following code:
const gcs = require('@google-cloud/storage')()
exports.deletePost = functions.https.onRequest((request, response) => {
    if (!context.auth) {
        throw new functions.https.HttpsError('failed-precondition', 'The function must be called while authenticated.');
    }
    const the_post = request.query.the_post;
    const filePath = context.auth.uid + '/posts/' + the_post;
    const bucket = gcs.bucket('android-com')
    const file = bucket.file(filePath)
    const pr = file.delete()
});
The problem is that I also need to delete a Google Firebase Firestore database entry after having deleted the Storage file. So I would like to know whether I can do that within, for example, a promise returned by delete()?
PS: I can't find the relevant doc.
The code file.delete() is asynchronous and returns a Promise, as defined in the Google Cloud Storage: Deleting objects documentation.
To delete an object from one of your Cloud Storage buckets:
// Imports the Google Cloud client library
const {Storage} = require('#google-cloud/storage');
// Creates a client
const storage = new Storage();
/**
* TODO(developer): Uncomment the following lines before running the sample.
*/
// const bucketName = 'Name of a bucket, e.g. my-bucket';
// const filename = 'File to delete, e.g. file.txt';
// Deletes the file from the bucket
await storage
    .bucket(bucketName)
    .file(filename)
    .delete();
console.log(`gs://${bucketName}/${filename} deleted.`);
It's not overly clear, but because the await syntax is used, this implies that the result of the expression to its right is a Promise.
Note: Most of the useful Google Cloud Storage documentation can be found here.
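Applied to the question's function, a minimal sketch of the chaining, assuming the Firebase Admin SDK is initialized and that the Firestore document path posts/{the_post} is illustrative:
const admin = require('firebase-admin');
// Delete the Storage file first, then the matching Firestore entry;
// returning the chained promise lets Cloud Functions wait for both.
return file.delete()
    .then(() => admin.firestore().doc('posts/' + the_post).delete());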
I am looking into how to export RDS to S3 using Lambda. So far I have found Python, Node and Java APIs, but I can't find a way of programmatically running something like mysqldump without the actual executable.
Is there a way of doing it?
I am thinking of using Node.js to call SHOW CREATE TABLE for each element in SHOW TABLES,
then somehow creating extended INSERT statements.
Yes, it is possible, but you need to include the mysqldump executable in your Lambda package!
Here is a sample Node.js script for creating a backup using mysqldump and uploading it to S3.
var S3 = require('./S3Uploader'); //custom S3 wrapper with stream upload functionality
var fs = require('fs');
var path = require('path');
var util = require('util');
const writeFile = util.promisify(fs.writeFile);
const execFile = util.promisify(require('child_process').execFile);
const exec = util.promisify(require('child_process').exec);
async function backupToS3() {
    var backupName = 'mysqlbackup-' + new Date().toISOString() + '.gz';
    var content = `cd /tmp
BACKUPNAME="[BACKUP_NAME]"
[EXE_PATH]/mysqldump --host [DB_ENDPOINT] --port [DB_PORT] -u [DB_USER] --password="[DB_PASS]" [DB_NAME] | gzip -c > $BACKUPNAME
`;
    content = content.replace('[BACKUP_NAME]', backupName);
    content = content.replace('[DB_ENDPOINT]', 'xx'); // get from lambda environment variables
    content = content.replace('[DB_PORT]', 'xx'); // get from lambda environment variables
    content = content.replace('[DB_USER]', 'xx'); // get from lambda environment variables
    content = content.replace('[DB_PASS]', 'xx'); // get from lambda environment variables
    content = content.replace('[DB_NAME]', 'xx'); // get from lambda environment variables
    content = content.replace('[EXE_PATH]', __dirname + '/tools'); // path where the mysqldump executable is located within the lambda package
    // generate the backup script
    await writeFile('/tmp/db_backup.sh', content);
    fs.chmodSync('/tmp/db_backup.sh', '755');
    // run the script
    var res1 = await execFile('/tmp/db_backup.sh');
    // stream the upload to S3
    var res2 = await S3.uploadFile('/tmp/' + backupName, 'backups');
    // clean up the local backup (this should be cleaned up automatically according to the lambda lifecycle)
    var res3 = await exec('rm /tmp/' + backupName);
    return 'Backup complete';
};
Sample S3Uploader posted here - Loading File directly from Node js req body to S3
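For reference, a minimal sketch of what such a wrapper could look like, assuming the aws-sdk v2 upload() streaming API; the bucket name is a placeholder, and this is not the author's actual code:
// S3Uploader.js (hypothetical sketch)
var AWS = require('aws-sdk');
var fs = require('fs');
var path = require('path');
var s3 = new AWS.S3();
exports.uploadFile = function (filePath, folder) {
    return s3.upload({
        Bucket: 'my-backup-bucket', // assumption: replace with your bucket
        Key: folder + '/' + path.basename(filePath),
        Body: fs.createReadStream(filePath) // streams the file rather than buffering it
    }).promise();
};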