Python - Return a File from Google Cloud Storage - google-cloud-functions

I need to build an API layer to expose a set of files stored within a Private Google Cloud Storage bucket, using Python. I may be using FastAPI for this.
The API would most likely be hosted as a Cloud Run application.
Could I get some guidance on the best practices for this?
My thinking was to return a FastAPI FileResponse from within the Cloud Function,
e.g. return FileResponse(some_file_path)
Is this a valid approach?

If you have files of moderate size, you can load the file from Cloud Storage and serve it directly from your service (Cloud Run, Cloud Functions, or other).
For large files, above a few MB, it's recommended to use a signed URL. On the client request, you generate a signed URL and return it. For a better experience, you can set the HTTP code 307 (temporary redirect) to tell the client to request the new (signed) URL.
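A minimal sketch of both options with FastAPI and the google-cloud-storage client; the bucket name and route are hypothetical, and note that generate_signed_url needs credentials able to sign (a key file, or the IAM signBlob permission on the runtime service account):

from datetime import timedelta

from fastapi import FastAPI
from fastapi.responses import RedirectResponse, StreamingResponse
from google.cloud import storage

app = FastAPI()
client = storage.Client()
BUCKET_NAME = "my-private-bucket"  # hypothetical bucket name

@app.get("/files/{name}")
def get_file(name: str, redirect: bool = True):
    blob = client.bucket(BUCKET_NAME).blob(name)
    if not redirect:
        # Moderate sizes: stream the object bytes through the service.
        return StreamingResponse(blob.open("rb"), media_type="application/octet-stream")
    # Large files: 307-redirect the client to a short-lived signed URL.
    url = blob.generate_signed_url(
        version="v4", expiration=timedelta(minutes=15), method="GET"
    )
    return RedirectResponse(url, status_code=307)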

Related

Calling Firebase Hosting API from a Firebase Cloud Function

I have a Firebase (node.js) cloud function that pulls in some data from my app's Firestore database and builds some static content for the web. I'd like that same cloud function to deploy the static content to Firebase hosting via the Firebase Hosting API, creating a static portion of my site with user generated content.
I understand the general flow thanks to the somewhat clear walkthrough, but am stuck on the first step: getting an access token to call the API. Obviously I'm not going to insecurely put my service account key in the cloud function itself, so the example in the walkthrough doesn't apply. And as I understand it, Firebase cloud functions are already associated with a service account, so presumably there's some way to get an access token to call other Google Cloud services from a cloud function.
So how do I get an access token to call the hosting API from a Cloud Function?
There are some red flags that make me think this isn't possible. For example, all of the use cases in the walkthrough allude to other server environments, as opposed to Google Cloud environments. And yet, this use case is the third bullet in the use-case list in the walkthrough.
I've searched extensively here and elsewhere for guidance, but am not finding anything. There are some older questions about accessing hosted files from a cloud function that aren't relevant. This promising question from 5 years ago about this exact use case only has dead ends.
You can use the google-auth-library package in Cloud Functions to get a token as shown below:
import { GoogleAuth } from "google-auth-library";

// Uses the default credentials of the function's runtime service account.
const token = await new GoogleAuth({
  scopes: ["https://www.googleapis.com/auth/cloud-platform"],
}).getAccessToken();
If you use the Firebase Admin SDK in Cloud Functions, then you can get an access token for the default service account as shown below (do ensure the service account has the required permissions):
import { initializeApp } from "firebase-admin/app";
const admin = initializeApp();
const token = await admin.options.credential?.getAccessToken();
// ^ Google OAuth2 access token object used to authenticate with Firebase services.

Google Cloud Functions with Google Cloud Storage

I am planning to deploy a small project on Google Cloud Functions, and I am using Google Cloud Storage as well.
I notice that instead of using the os module to pick up the credentials, I need to load them from the service account file.
e.g.
from google.oauth2 import service_account
from google.cloud import storage

credentials = service_account.Credentials.from_service_account_file('xx.json')
storage_client = storage.Client(credentials=credentials)
Then I get an error. I don't have much experience to tell which point I should check.
I have done the from google.oauth2 import service_account and from google.cloud import storage imports, and put the credential JSON key under the Cloud Function's temp directory.
I wonder which process or detail I have missed?
Do I need to manually activate the OAuth permission somewhere?
Any pointers? Any advice and comments are welcome.
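One detail worth checking: inside Cloud Functions, the client libraries pick up the runtime service account's Application Default Credentials automatically, so a key file is usually unnecessary. A minimal sketch, assuming that service account already has the needed Storage roles:

from google.cloud import storage

# No key file: in Cloud Functions the client falls back to the
# runtime service account's Application Default Credentials.
storage_client = storage.Client()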

Is there a simple way to host a JSON document you can read and update in Google Cloud Platform?

What I'm trying to do is host a JSON document that will then, essentially, serve as a hosted version of json-server. I'm aware I can do something similar with My JSON Server, but I plan to move my entire architecture to GCP so want to get more familiar with it.
At first I looked into the Storage JSON API, but it seems like that's just for getting data about buckets rather than the items in the buckets itself. I created a bucket called test-json-api and added a test-data.json, but there's seemingly no way to access the data in the json file via this API.
I'm trying to keep it as simple as possible for testing purposes. In time, I'll probably use Firestore, but for now I'd like to avoid all that complexity and instead have a simple GET and a PUT/PATCH to a JSON file.
The Storage JSON API you are talking about is only for getting and updating the metadata, not the data inside the object. Objects in a Google Cloud Storage bucket are immutable; one way to update them is to download the object data from the bucket in your code, update it, then upload it to the bucket again.
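A minimal sketch of that read-modify-write pattern, using the bucket and file names from the question (the added field is hypothetical):

import json

from google.cloud import storage

client = storage.Client()
blob = client.bucket("test-json-api").blob("test-data.json")

# Objects are immutable, so "updating" means replacing the whole object.
data = json.loads(blob.download_as_text())
data["updated"] = True  # hypothetical change
blob.upload_from_string(json.dumps(data), content_type="application/json")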
As you want to deal with JSON files, you may explore using Cloud Datastore or Cloud Firestore. Also, if you wish to use Firebase, you may explore the Firebase Realtime Database.
A very quick and dirty way to make some of the information in the JSON doc easy to read is to include that information in the blob name. For example, the key information in
doc = {'id':3, 'name':'my name', ... }
could be stored in an object called "doc_3_my name", so that it can be read while browsing the bucket. You can then download the right doc if you need to see all the original data. An object name can be up to 1024 bytes of UTF-8 (with some exclusions), which is normally sufficient for surfacing basic information.
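A minimal sketch of that naming trick; the bucket name is the one from the question, and the name format is just illustrative:

import json

from google.cloud import storage

doc = {"id": 3, "name": "my name"}
# Surface the key fields in the object name so they show up when listing the bucket.
blob_name = f"doc_{doc['id']}_{doc['name']}"

storage.Client().bucket("test-json-api").blob(blob_name).upload_from_string(
    json.dumps(doc), content_type="application/json"
)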
Note that you should never store PII like this.
https://cloud.google.com/storage/docs/objects

Google Cloud Function :: Service account :: JWT token and Bearer token

I have a Google Cloud Function. I also have a web application. I want to authenticate requests to the cloud function by using a service account.
I have the json key file.
I know that I have to follow https://cloud.google.com/functions/docs/securing/authenticating#service-to-function. But that leads me to an IAP page that does not apply to Google Cloud Functions.
Similar instructions are found at https://developers.google.com/identity/protocols/oauth2/service-account
But if I follow the Python library instructions there, I end up with this sample code:
import googleapiclient.discovery
sqladmin = googleapiclient.discovery.build('sqladmin', 'v1beta3', credentials=credentials)
response = sqladmin.instances().list(project='exciting-example-123').execute()
This does not directly relate to invoking a cloud function.
This question's answer somewhat deals with my requirement, but it uses a Call API which is only suitable for testing.
Also, I want to expose this API to multiple applications built on other stacks like .NET. So I believe the best option for me will be to use the HTTP method (given on the same page):
https://developers.google.com/identity/protocols/oauth2/service-account#httprest
But whatever I do I am unable to get the signature right.
Any help to get this sorted will be highly appreciated as I am stuck on this for the past few days.
You can use the Google auth library like this:

from google.auth.transport import requests
from google.oauth2.id_token import fetch_id_token

# The audience must be the URL of the Cloud Function you want to invoke.
audience = "my_audience"

request = requests.Request()
token = fetch_id_token(request, audience)
print(token)
The fetch_id_token method will use the default credentials:
- the service account key file defined in the environment variable GOOGLE_APPLICATION_CREDENTIALS, or
- the service account loaded in the Google Cloud environment.
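With the ID token in hand, invoking the function is just a bearer-authenticated HTTP request; a minimal sketch, reusing token from the snippet above and the example trigger URL mentioned later in this thread:

import urllib.request

req = urllib.request.Request(
    "https://us-central1-test-project-name.cloudfunctions.net/function-name",
    headers={"Authorization": f"Bearer {token}"},
)
print(urllib.request.urlopen(req).read())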
For now, I followed this answer in PHP
In the claims section, I removed the scope and instead added a target_audience claim:
"target_audience" => "google-function-http-trigger"
The Cloud Function HTTP trigger will look like https://us-central1-test-project-name.cloudfunctions.net/function-name
This gives the required assertion key.
Then I follow https://developers.google.com/identity/protocols/oauth2/service-account#httprest to get the id_token.
Then, with the id_token as the bearer token, we can call the cloud function.
Please note that the token expires depending on the time set in the "exp" claim. Once expired, you have to redo the steps to generate a new id_token.
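The same flow as a Python sketch: sign a JWT carrying the target_audience claim with the service account's private key, then exchange it at Google's token endpoint for an id_token. PyJWT (with the cryptography package) and requests are assumed, and the key-file path is hypothetical:

import json
import time

import jwt  # PyJWT
import requests

sa = json.load(open("service-account.json"))  # hypothetical key-file path
now = int(time.time())

# Self-signed JWT: target_audience names the function's HTTP trigger URL.
assertion = jwt.encode(
    {
        "iss": sa["client_email"],
        "sub": sa["client_email"],
        "aud": "https://oauth2.googleapis.com/token",
        "iat": now,
        "exp": now + 3600,
        "target_audience": "https://us-central1-test-project-name.cloudfunctions.net/function-name",
    },
    sa["private_key"],
    algorithm="RS256",
)

# Exchange the self-signed assertion for a Google-signed id_token.
resp = requests.post(
    "https://oauth2.googleapis.com/token",
    data={
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": assertion,
    },
)
id_token = resp.json()["id_token"]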
I want to authenticate requests to the cloud function by using a service account.
I am not sure I understand the context correctly, but I would try to assign the roles/cloudfunctions.invoker IAM role to that service account (the one used to run your code in the web application) - see Cloud Functions IAM Roles.
In that case, code running under that service account "can invoke an HTTP function using its public URL".
I reckon no JSON keys are required in this case.

Is it safe to transfer private data through Google Cloud Tasks

I am using Google Cloud Tasks to create tasks and Firebase Functions to handle them.
I can pass a payload to my handler using the body parameter.
But I am not sure whether it is secure to pass private data in the payload without encoding it, or whether it would be better to use a cipher.
Also, I found this:
In App Engine tasks, both the queue and the task handler run within the same Cloud project. Traffic is encrypted during transport and never leaves Google datacenters
but I'm not sure if it applies to this case.
Thanks!
It all depends on what you mean by "secure".
Is your data encrypted at rest? Yes, even in Cloud Tasks (before the task is triggered).
Is your data encrypted in transit? Yes, even when Cloud Tasks triggers your endpoint.
Can someone see the data of a task? Yes, a user with sufficient rights can perform a GET on the task and view its body data.
So, yes, it's secure if you set the correct permissions for your users. You can encrypt the data, with Cloud KMS for example, if the Cloud Tasks payload is private and you don't want an operator who manages Cloud Tasks to access it; but in that case, don't grant that user permissions on Cloud KMS.
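A minimal sketch of that Cloud KMS approach; the project, key ring, and key names are hypothetical:

from google.cloud import kms

client = kms.KeyManagementServiceClient()
key_name = client.crypto_key_path(
    "my-project", "global", "my-keyring", "task-payload-key"  # hypothetical names
)

def encrypt_payload(plaintext: bytes) -> bytes:
    # Only principals with roles/cloudkms.cryptoKeyDecrypter on this key can
    # recover the payload; Cloud Tasks operators only ever see ciphertext.
    return client.encrypt(request={"name": key_name, "plaintext": plaintext}).ciphertext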