How to add Firebase Storage to Cloud Functions for Firebase? And then use `getDownloadURL`? [duplicate] - google-cloud-functions

This question already has answers here:
Get Download URL from file uploaded with Cloud Functions for Firebase
(25 answers)
Closed 6 days ago.
I'm trying to get a Firebase Storage download URL (with long-lived token) from Cloud Functions for Firebase. I'm guessing two possible issues:
I'm using Google Cloud Storage, not Firebase Cloud Storage.
getDownloadURL() doesn't work from Cloud Functions.
I'm looking at the documentation for Storage and I see sections for
iOS+
Android
Web
Flutter
Admin
C++
Unity
Security & Rules
Why do I not see a section for Cloud Functions for Firebase? Is Admin the same as Cloud Functions for Firebase?
There's a section Extend Cloud Storage with Cloud Functions but it's about triggering Cloud Functions from Storage, not using Cloud Functions to do stuff in Storage.
Let's try to get that download URL. The documentation says to do this:
var storage = firebase.storage();
var gsReference = storage.refFromURL('gs://bucket/images/stars.jpg');
storageRef.child('images/stars.jpg').getDownloadURL()
.then((url) => {
console.log(url);
});
But I create a Storage client with Google Cloud Storage:
import * as functions from "firebase-functions";
const admin = require('firebase-admin');
admin.initializeApp({
apiKey: 'abc123',
authDomain: 'my-app.firebaseapp.com',
credential: admin.credential.cert({serviceAccount}),
databaseURL: 'https://my-project.firebaseio.com',
storageBucket: 'gs://my-project.appspot.com',
});
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();
const bucketName = 'gs://my-project.appspot.com';
const bucket = storage.bucket('my-project.appspot.com');
I can write a file to Firebase Storage:
await storage.bucket(bucketName).file(destFileName).save(file['rawBody']);
But this doesn't work:
await storage.bucket(bucketName).file(destFileName).getDownloadURL()
.then((url) => {
console.log("Download URL: " + url);
return url;
});
That throws an error:
TypeError: storage.bucket(...).file(...).getDownloadURL is not a function
That's making me suspect that getDownloadURL isn't available for Cloud Functions.
Let's try this:
var storageRef = admin.firebase.storage().ref(destFileName);
await storageRef.getDownloadURL()
.then(function (url) {
console.log(url);
});
That throws this error:
TypeError: Cannot read properties of undefined (reading 'storage')
That's worse, it's crashing on storage in the first line and not even getting to getDownloadURL.
Let's try this:
var gsReference = storage.refFromURL(bucketName + '/' + destFileName);
await gsReference.getDownloadURL()
.then(function (url) {
console.log("Download URL: " + url);
return url;
});
That throws the same error:
TypeError: Cannot read properties of undefined (reading 'storage')
Same problem, it can't get past storage in the first line.

I looked into the same problem recently, and you are right: getDownloadURL() does not work in a Cloud Function.
Here is the workaround I currently use to get the download URL of a file in Cloud Storage: it generates a signed download URL with an expiration duration.
exports.signedUrl = functions.region("asia-southeast1").https.onRequest((request, response) => {
const cors = require("cors"); // needed for the cors()(request, response, ...) call below
const { Storage } = require("@google-cloud/storage");
const storage = new Storage({ projectId: "YOUR_PROJECT_ID", keyFilename: "./serviceAccountKey.json" });
const bucketName = "YOUR_BUCKET_NAME";
generateV4ReadSignedUrl("YOUR_FILE_PATH").catch(console.error);
async function generateV4ReadSignedUrl(filename) {
const options = {
version: "v4",
action: "read",
expires: Date.now() + 2 * 60 * 1000, // 2 minutes
};
const [url] = await storage.bucket(bucketName).file(filename).getSignedUrl(options);
console.log(`Generated GET signed URL: ${url}`);
return cors()(request, response, () => {
response.send(url);
});
}
});
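If you specifically need a long-lived token URL like the one the Firebase client SDKs return (v4 signed URLs expire after at most 7 days), a common workaround is to mint the token yourself via the firebaseStorageDownloadTokens metadata field. A minimal sketch, assuming the uuid package and an already-initialized @google-cloud/storage bucket; note that this metadata field is what the Firebase clients read, but it is not an officially documented server-side API:

const { v4: uuidv4 } = require("uuid");

async function getLongLivedDownloadUrl(bucket, filePath) {
  // Attach a fresh token as custom metadata; Firebase tooling reads this field.
  const token = uuidv4();
  await bucket.file(filePath).setMetadata({
    metadata: { firebaseStorageDownloadTokens: token },
  });
  // Same URL shape the Firebase console and client SDKs produce.
  return `https://firebasestorage.googleapis.com/v0/b/${bucket.name}/o/` +
    `${encodeURIComponent(filePath)}?alt=media&token=${token}`;
}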


Resize all existing images stored in firebase storage and update the newly resized image url to database via api call [duplicate]

This question already has an answer here:
How can I resize all existing images in firebase storage?
(1 answer)
Closed 9 months ago.
I have a requirement to resize new and existing images stored in Firebase Storage. For new images, I enabled Firebase's Resize Images extension. For existing images, how can I resize each image and get the newly resized image URL to update back to the database via the API?
Here is my Firebase function that gets the existing image URLs from the database. My question is how to resize those images and get the new image URLs.
const functions = require("firebase-functions");
const axios =require("axios");
async function getAlbums() {
const endpoint = "https://api.mydomain.com/graphql";
const headers = {
"content-type": "application/json",
};
const graphqlQuery = {
"query": `query Albums {
albums {
id
album_cover
}
}`
};
functions.logger.info("Call API");
const response = await axios({
url: endpoint,
method: 'post',
headers: headers,
data: graphqlQuery
});
if(response.errors) {
functions.logger.info("API ERROR : ", response.errors) // errors if any
} else {
return response.data.data.albums;
}
}
exports.manualGenerateResizedImage = functions.https.onRequest(async (req, res) => {
const albums = await getAlbums();
functions.logger.info("No. of Album : ", albums.length);
res.send(`No. of Album : ${albums.length}`); // end the request so it doesn't hang
});
I think the below answer from Renaud Tarnec will definitely help you.
If you look at the code of the "Resize Images" extension, you will see that the Cloud Function that underlies the extension is triggered by a onFinalize event, which means:
When a new object (or a new generation of an existing object) is
successfully created in the bucket. This includes copying or rewriting
an existing object.
So, without rewriting/regenerating the existing images the Extension will not be triggered.
However, you could easily write your own Cloud Function that does the same thing but is triggered, for example, by a call to a specific URL (HTTPS cloud Function) or by creating a new document in a temporary Firestore Collection (background triggered CF).
This Cloud Function would execute the following steps:
Get all the files of your bucket, see the getFiles() method of the
Google Cloud Storage Node.js Client API. This method returns a
GetFilesResponse object which is an Array of File instances.
By looping over the array, for each file, check if the file has a
corresponding resized image in the bucket (depending on the way you
configured the Extension, the resized images may be in a specific
folder)
If a file does not have a corresponding resized image, execute the
same business logic of the Extension Cloud Function for this File.
There is an official Cloud Function sample which shows how to create a Cloud Storage triggered Firebase Function that creates resized thumbnails from uploaded images and saves their download URLs to the database (see the last lines of the index.js file).
Note: If you have a lot of files to process, you should most probably work in batches (see the sketch below), since there is a limit of 9 minutes for Cloud Function execution. Also, depending on the number of images to process, you may need to increase the timeout value and/or the allocated memory of your Cloud Function, see https://firebase.google.com/docs/functions/manage-functions#set_timeout_and_memory_allocation
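For example, here is a hedged sketch of that batching with @google-cloud/storage, listing files page by page so each invocation stays well under the execution limit; processFile is a hypothetical placeholder for the resize/check logic described above:

async function processInBatches(bucket, pageToken) {
  // autoPaginate: false makes getFiles return one page plus the query
  // for the next page, instead of fetching everything at once.
  const [files, nextQuery] = await bucket.getFiles({
    maxResults: 100,
    autoPaginate: false,
    pageToken,
  });
  await Promise.all(files.map((file) => processFile(file)));
  // Hand the token to the next invocation (e.g. via a task queue or a
  // follow-up HTTPS call) rather than looping inside one function.
  return nextQuery ? nextQuery.pageToken : null;
}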
In case someone needs it, this is how I resized the existing images.
const functions = require("firebase-functions");
const axios = require("axios");
const { Storage } = require("#google-cloud/storage");
const storage = new Storage();
// Don't forget to replace with your bucket name
const bucket = storage.bucket("projectid.appspot.com");
async function getAlbums() {
const endpoint = "https://api.mydomain.com/graphql";
const headers = {
"content-type": "application/json",
};
const graphqlQuery = {
query: `query Albums {
albums {
id
album_cover
}
}`,
};
const response = await axios({
url: endpoint,
method: "post",
headers: headers,
data: graphqlQuery,
});
if (response.errors) {
functions.logger.error("API ERROR : ", response.errors); // errors
if any
} else {
return response.data.data.albums;
}
}
function getFileName(url) {
var decodeURI = decodeURIComponent(url);
var index = decodeURI.lastIndexOf("/") + 1;
var filenameWithParam = decodeURI.substr(index);
index = filenameWithParam.lastIndexOf("?");
var filename = filenameWithParam.substr(0, index);
return filename;
}
function getFileNameFromFirestore(url) {
var index = url.lastIndexOf("/") + 1;
var filename = url.substr(index);
return filename;
}
const triggerBucketEvent = async () => {
bucket.getFiles(
{
prefix: "images/albums", // you can add a path prefix
autoPaginate: false,
},
async (err, files) => {
if (err) {
functions.logger.error(err);
return;
}
const albums = await getAlbums();
await Promise.all(
files.map((file) => {
var fileName = getFileNameFromFirestore(file.name);
var result = albums.find((obj) => {
return getFileName(obj.album_cover) === fileName;
});
if (result) {
var file_ext = fileName.substr(
(Math.max(0, fileName.lastIndexOf(".")) || Infinity) + 1
);
var newFileName = result.id + "." + file_ext;
// Copy each file on thumbs directory with the different name
file.copy("images/albums/" + newFileName);
} else {
functions.logger.info(file.name, " not found in album list!");
}
})
);
}
);
};
exports.manualGenerateResizedImage = functions.https.onRequest(async (req, res) => {
await triggerBucketEvent();
res.send("Resize triggered"); // end the request so it doesn't hang
});

Cannot get image from ipfs in vuejs

I want to get an image from IPFS into my Vue/Nuxt project. I already installed ipfs with npm i ipfs. But when I run const node = Ipfs.create(), it shows an error.
This error doesn't always happen; many times it works and I can get the image normally. Has anyone encountered this situation and found a solution?
async downloadImg () {
const node = await Ipfs.create()
const { agentVersion, id } = await node.id()
this.agentVersion = agentVersion
this.id = id
const cid = '/ipfs/QmY2dod6X7GFmqnQ6qCBiaeNxJWa3CYQaxEjGUfL5CqMAj'
// load the raw data from js-ipfs (>=0.40.0)
const bufs = []
for await (const buf of node.cat(cid)) {
bufs.push(buf)
}
const data = Buffer.concat(bufs)
const blob = new Blob([data], { type: 'image/jpeg' }) // 'image/jpeg' is the registered MIME type
this.imageSrc = window.URL.createObjectURL(blob)
},
If I'm right, it can depend on the web protocol you use: over https it works, over other protocols it doesn't.
The Web Crypto API is only available on pages accessed via https. If you're seeing that message, you are probably accessing the page via plain http.
Follow the GitHub link at the top of the stack trace for more explanation and solutions.
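As a quick check, assuming this runs in the browser, you can guard the IPFS setup with the standard isSecureContext flag before calling Ipfs.create():

// Web Crypto (and therefore js-ipfs) needs a secure context.
// window.isSecureContext is true on https:// pages and on http://localhost.
if (!window.isSecureContext) {
  console.warn('Not a secure context: serve the page over https (or localhost) before calling Ipfs.create()')
}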

how can I properly invoke a google cloud function inside another function [duplicate]

I am using a Cloud Function to call another Cloud Function on the free Spark tier.
Is there a special way to call another Cloud Function? Or do you just use a standard http request?
I have tried calling the other function directly like so:
exports.purchaseTicket = functions.https.onRequest((req, res) => {
fetch('https://us-central1-functions-****.cloudfunctions.net/validate')
.then(response => response.json())
.then(json => res.status(201).json(json))
})
But I get the error
FetchError: request to
https://us-central1-functions-****.cloudfunctions.net/validate
failed, reason: getaddrinfo ENOTFOUND
us-central1-functions-*****.cloudfunctions.net
us-central1-functions-*****.cloudfunctions.net:443
That sounds like Firebase is blocking the connection, even though the target is Google-owned and therefore shouldn't be blocked, given that
the Spark plan only allows outbound network requests to Google owned
services.
How can I make use a Cloud Function to call another Cloud Function?
You don't need to go through the trouble of invoking some shared functionality via a whole new HTTPS call. You can simply abstract away the common bits of code into a regular javascript function that gets called by either one. For example, you could modify the template helloWorld function like this:
var functions = require('firebase-functions');
exports.helloWorld = functions.https.onRequest((request, response) => {
common(response)
})
exports.helloWorld2 = functions.https.onRequest((request, response) => {
common(response)
})
function common(response) {
response.send("Hello from a regular old function!");
}
These two functions will do exactly the same thing, but with different endpoints.
To answer the question, you can do an https request to call another cloud function:
import fetch from 'node-fetch' // fetch implementation, as noted below
export const callCloudFunction = async (functionName: string, data: {} = {}) => {
let url = `https://us-central1-${config.firebase.projectId}.cloudfunctions.net/${functionName}`
await fetch(url, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
},
body: JSON.stringify({ data }),
})
}
(Note we are using the npm package 'node-fetch' as our fetch implementation.)
And then simply call it:
callCloudFunction('search', { query: 'yo' })
There are legitimate reasons to do this. We used this to ping our search cloud function every minute and keep it running. This greatly lowers response latency for a few dollars a year.
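For reference, a minimal sketch of that keep-warm ping, assuming the firebase-functions v1 scheduler API and the callCloudFunction helper above:

// import * as functions from 'firebase-functions'
// Ping the 'search' function every minute so an instance stays warm.
export const keepSearchWarm = functions.pubsub
  .schedule('every 1 minutes')
  .onRun(async () => {
    await callCloudFunction('search', { query: 'warmup ping' })
    return null
  })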
It's possible to invoke another Google Cloud Function over HTTP by including an authorization token. It requires a primary HTTP request to calculate the token, which you then use when you call the actual Google Cloud Function that you want to run.
https://cloud.google.com/functions/docs/securing/authenticating#function-to-function
const {get} = require('axios');
// TODO(developer): set these values
const REGION = 'us-central1';
const PROJECT_ID = 'my-project-id';
const RECEIVING_FUNCTION = 'myFunction';
// Constants for setting up metadata server request
// See https://cloud.google.com/compute/docs/instances/verifying-instance-identity#request_signature
const functionURL = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${RECEIVING_FUNCTION}`;
const metadataServerURL =
'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=';
const tokenUrl = metadataServerURL + functionURL;
exports.callingFunction = async (req, res) => {
// Fetch the token
const tokenResponse = await get(tokenUrl, {
headers: {
'Metadata-Flavor': 'Google',
},
});
const token = tokenResponse.data;
// Provide the token in the request to the receiving function
try {
const functionResponse = await get(functionURL, {
headers: {Authorization: `bearer ${token}`},
});
res.status(200).send(functionResponse.data);
} catch (err) {
console.error(err);
res.status(500).send('An error occurred! See logs for more details.');
}
};
October 2021 Update: You should not need to do this from a local development environment, thank you Aman James for clarifying this
Although the question's tag and the other answers concern JavaScript, I want to share a Python example, since it reflects the title and also the authentication aspect mentioned in the question.
Google Cloud Functions provides a REST API that includes a call method, which can be used from another Cloud Function.
Although the documentation mentions using Google-provided client libraries, there is still no dedicated one for Cloud Functions in Python.
Instead you need to use the general Google API Client Libraries (this is the Python one).
Probably the main difficulty with this approach is understanding the authentication process.
Generally you need to provide two things to build a client service:
credentials and scopes.
The simplest way to get credentials is to rely on the Application Default Credentials (ADC) library. The relevant documentation is:
https://cloud.google.com/docs/authentication/production
https://github.com/googleapis/google-api-python-client/blob/master/docs/auth.md
The scopes are listed on each REST API function's documentation page, e.g. OAuth scope: https://www.googleapis.com/auth/cloud-platform
The complete code example of calling a 'hello-world' cloud function is below.
Before running:
Create a default Cloud Function on GCP in your project.
Note the default service account it uses.
Keep the default body.
Note the project_id, the function name, and the location where you deploy the function.
If you call the function from outside the Cloud Functions environment (locally, for instance), set the environment variable GOOGLE_APPLICATION_CREDENTIALS according to the docs mentioned above.
If you call it from another Cloud Function, you don't need to configure credentials at all.
from googleapiclient.discovery import build
from googleapiclient.discovery_cache.base import Cache
import google.auth
import pprint as pp
def get_cloud_function_api_service():
class MemoryCache(Cache):
_CACHE = {}
def get(self, url):
return MemoryCache._CACHE.get(url)
def set(self, url, content):
MemoryCache._CACHE[url] = content
scopes = ['https://www.googleapis.com/auth/cloud-platform']
# If the environment variable GOOGLE_APPLICATION_CREDENTIALS is set,
# ADC uses the service account file that the variable points to.
#
# If the environment variable GOOGLE_APPLICATION_CREDENTIALS isn't set,
# ADC uses the default service account that Compute Engine, Google Kubernetes Engine, App Engine, Cloud Run,
# and Cloud Functions provide
#
# see more on https://cloud.google.com/docs/authentication/production
credentials, project_id = google.auth.default(scopes)
service = build('cloudfunctions', 'v1', credentials=credentials, cache=MemoryCache())
return service
google_api_service = get_cloud_function_api_service()
name = f'projects/{project_id}/locations/us-central1/functions/function-1'  # f-string so project_id is interpolated
body = {
'data': '{ "message": "It is awesome, you are develop on Stack Overflow language!"}' # json passed as a string
}
result_call = google_api_service.projects().locations().functions().call(name=name, body=body).execute()
pp.pprint(result_call)
# expected output is:
# {'executionId': '3h4c8cb1kwe2', 'result': 'It is awesome, you are develop on Stack Overflow language!'}
These suggestions don't seem to work anymore.
To get this to work for me, I made calls from the client side using httpsCallable and imported the requests into Postman. Some other links to https://firebase.google.com/docs/functions/callable-reference were helpful, but determining where the information was available took a bit of figuring out.
I wrote everything down here as it takes a bit of explaining and some examples.
https://www.tiftonpartners.com/post/call-google-cloud-function-from-another-cloud-function
Here's an inline version, in case the URL above expires.
This 'should' work; it's not tested, but it's based on what I wrote and tested for my own application.
const axios = require('axios'); // required for the axios(config) call below
module.exports = function(name,context) {
const {protocol,headers} = context.rawRequest;
const host = headers['x-forwardedfor-host'] || headers.host;
// there will be two different paths for
// production and development
const url = `${protocol}://${host}/${name}`;
const method = 'post';
const auth = headers.authorization;
return async (...rest) => { // async: axios is awaited below
const data = JSON.stringify({data:rest});
const config = {
method, url, data,
headers: {
'Content-Type': 'application/json',
'Authorization': auth,
'Connection': 'keep-alive',
'Pragma': 'no-cache',
'Cache-control': 'no-cache',
}
};
try {
const {data:{result}} = await axios(config);
return result;
} catch(e) {
throw e;
}
}
}
This is how you would call this function.
const crud = httpsCallable('crud',context);
return await crud('read',...data);
context you get from the google cloud entry point and is the most important piece, it contains the JWT token needed to make the subsequent call to your cloud function (in my example its crud)
To define the other httpsCallable endpoint you would write an export statement as follows
exports.crud = functions.https.onCall(async (data, context) => {})
It should work just like magic.
Hopefully this helps.
I found that a combination of two of the methods works best:
const anprURL = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${RECEIVING_FUNCTION}`;
const metadataServerURL =
'http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/identity?audience=';
const tokenUrl = metadataServerURL + anprURL;
// Fetch the token
const tokenResponse = await fetch(tokenUrl, {
method: "GET"
headers: {
'Metadata-Flavor': 'Google',
},
});
const token = await tokenResponse.text();
const functionResponse = await fetch(anprURL, {
method: 'POST',
headers: {
"Authorization": `bearer ${token}`,
'Content-Type': 'application/json',
},
body: JSON.stringify({"imageUrl": url}),
});
// Convert the response to text
const responseText = await functionResponse.text();
// Convert from text to json
const responseJson = JSON.parse(responseText);
Extending Shea Hunter Belsky's answer: be aware that the call to Google's metadata server to fetch the authorization token does not work from a local machine.
Since fetch is not readily available in Node.js and my project was already using the axios library, I did it like this:
const url = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION_NAME}`;
const headers = {
'Content-Type': 'application/json',
};
const response = await axios.post(url, { data: YOUR_DATA }, { headers });

Google Cloud Storage `predefinedAcl` and `file.makePublic()` not working

I'm trying to get permanent download URLs when I upload files to Firebase Storage (Google Cloud Storage) from Firebase Cloud Functions (Google Cloud Functions).
I tried setting predefinedAcl to authenticatedRead and to publicRead. Both resulted in 403 (Forbidden) errors when my app tried to download the files. This is from the documentation for CreateWriteStreamOptions. Here's the code:
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({ projectId: 'languagetwo-cd94d' });
const myBucket = storage.bucket('languagetwo-cd94d.appspot.com');
var mp3Promise = new Promise(function(resolve, reject) {
let options = {
metadata: {
contentType: 'audio/mp3',
public: true
}
};
synthesizeParams.accept = 'audio/mp3';
var file = myBucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
audio.pipe(file.createWriteStream(options));
})
.then(function() {
resolve('http://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3');
})
.catch(error => console.error(error));
});
That code executes without an error, writes the file to Storage, and passes the download URL to the next function. When I try to download the file with this URL:
http://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio/English/United_States-Allison-Female-IBM/catbirds.mp3
I get this error:
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
Anonymous caller does not have storage.objects.get access to languagetwo-cd94d.appspot.com/Audio/English/United_States-Allison-Female-IBM/catbirds.mp3.
</Details>
</Error>
Downloading the file from my app (as an authorized user) I get the 403 (Forbidden) error message.
I've also tried this property, with the same result:
let options = {
metadata: {
contentType: 'audio/webm',
predefinedAcl: 'publicRead'
}
};
Moving on, I tried file.makePublic():
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({ projectId: 'languagetwo-cd94d' });
const myBucket = storage.bucket('languagetwo-cd94d.appspot.com');
var mp3Promise = new Promise(function(resolve, reject) {
let options = {
metadata: {
contentType: 'audio/mp3'
}
};
synthesizeParams.accept = 'audio/mp3';
var file = myBucket.file('Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3');
textToSpeech.synthesize(synthesizeParams)
.then(function(audio) {
audio.pipe(file.createWriteStream(options));
})
.then(function() {
file.makePublic()
.then(function(data) {
console.log(data)
resolve('http://storage.googleapis.com/languagetwo-cd94d.appspot.com/Audio/' + longLanguage + '/' + pronunciation + '/' + word + '.mp3');
})
.catch(error => console.error(error));
})
.catch(error => console.error(error));
});
This code didn't even execute. It crashed at file.makePublic() and the error message was
{ Error: No such object: languagetwo-cd94d.appspot.com/Audio/English/United_States-Allison-Female-IBM/warblers.mp3
I'm not sure what this error means. My guess is that file.createWriteStream() wrote to a file location, and then file.makePublic() couldn't find that file location.
I found the permissions for warblers.mp3 in the Cloud Storage Browser.
The first error is due to the access permissions on the bucket. In the Identity and Access Management (IAM) documentation you can find instructions for checking the roles and permissions that manage bucket access.
The second error could be a consequence of the first one.
Please let me know if this information works for you.
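For instance, if the files are meant to be publicly readable, here is a hedged sketch with @google-cloud/storage (makePublic() exists on both File and Bucket, but fails if uniform bucket-level access is enforced, in which case bucket IAM is the right tool):

const { Storage } = require('@google-cloud/storage');
const storage = new Storage({ projectId: 'languagetwo-cd94d' });
const bucket = storage.bucket('languagetwo-cd94d.appspot.com');

async function makeAudioPublic() {
  // Make a single object publicly readable...
  await bucket
    .file('Audio/English/United_States-Allison-Female-IBM/catbirds.mp3')
    .makePublic();
  // ...or open up the whole bucket (use with care).
  await bucket.makePublic();
}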
You need to change options to this:
let options = {
public: true,
metadata: {
contentType: 'audio/mp3'
}
};
https://googleapis.dev/nodejs/storage/latest/global.html#CreateWriteStreamOptions
I've been struggling with the same issue for a while and in the end here is what solved it for me:
const stream = file.createWriteStream(options)
audio.pipe(stream)
stream.on('finish', () => {
// file APIs will now work
})
The problem is that audio.pipe(file.createWriteStream(options)) creates a stream that writes the file asynchronously. At the moment you call file.makePublic(), there is no guarantee that the write stream has completed.
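Putting the two pieces together, a sketch of the full chain inside the question's mp3Promise executor, waiting for the write stream to finish before calling file.makePublic() (variable names as in the question):

textToSpeech.synthesize(synthesizeParams)
  .then((audio) => new Promise((done, fail) => {
    audio.pipe(file.createWriteStream(options))
      .on('finish', done)  // the object now exists in the bucket
      .on('error', fail);
  }))
  .then(() => file.makePublic())
  .then(() => {
    resolve('http://storage.googleapis.com/languagetwo-cd94d.appspot.com/' + file.name);
  })
  .catch(error => console.error(error));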

How do I create a new GCE VM instance from an instance template using GCE Node.js client?

In Google Compute Engine I can use an instance template to create a new VM from the template. This works fine using the GCE console, and it works fine using the API, too (URL parameter "sourceInstanceTemplate").
How can I create a new GCE-VM from an instance template using googleapis/nodejs-compute (the Node.js GCE SDK)?
google-auth-library-nodejs can be used for accessing the GCE instances.insert API directly.
The following example is adapted from https://github.com/google/google-auth-library-nodejs and works fine if executed within GCE (in particular, in a Google Cloud Function).
const zone = 'some-zone';
const name = 'a-name';
const sourceInstanceTemplate = `some-template-name`;
createVM(zone, name, sourceInstanceTemplate)
.then(console.log)
.catch(console.error);
async function createVM(zone, vmName, templateName) {
const {auth} = require('google-auth-library');
const client = await auth.getClient({
scopes: 'https://www.googleapis.com/auth/cloud-platform'
});
const projectId = await auth.getDefaultProjectId();
const sourceInstanceTemplate = `projects/${projectId}/global/instanceTemplates/${templateName}`;
const url = `https://www.googleapis.com/compute/v1/projects/${projectId}/zones/${zone}/instances?sourceInstanceTemplate=${sourceInstanceTemplate}`;
return await client.request({
url: url,
method: 'post',
data: {name: vmName}
});
}
I can't find the solution in the documentation for the node client. Hopefully my alternate solution helps someone.
const exec = require('child-process-promise').exec;
var create_vm = (zone, vmname, templatename) => {
const cmd = `gcloud compute instances create ${vmname} ` +
`--zone=${zone} ` +
`--source-instance-template=${templatename} `;
return exec(cmd);
};
create_vm('us-central1-c', 'my-instance', 'whatever')
.then(console.log)
.catch(console.error);
You can customize this as far as gcloud lets you. The docs/options for creating an instance are here.
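If you'd rather stay in Node.js without shelling out, the newer @google-cloud/compute client (v3+) accepts sourceInstanceTemplate directly on InstancesClient.insert; a hedged sketch based on the official create-from-template sample:

const compute = require('@google-cloud/compute');

async function createFromTemplate(projectId, zone, vmName, templateName) {
  const instancesClient = new compute.InstancesClient();
  // insert() starts a zonal operation; wait on it with ZoneOperationsClient
  // if you need to block until the VM is actually up.
  await instancesClient.insert({
    project: projectId,
    zone,
    instanceResource: { name: vmName },
    sourceInstanceTemplate: `projects/${projectId}/global/instanceTemplates/${templateName}`,
  });
  console.log(`Requested creation of ${vmName} from template ${templateName}`);
}

createFromTemplate('my-project', 'us-central1-c', 'my-instance', 'whatever')
  .catch(console.error);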