I am using Google Cloud Tasks to create tasks and Firebase Functions to handle them.
I can pass a payload to my handler using the body parameter.
But I am not sure whether it is secure to pass private data in the payload without encryption, or whether it would be better to use some cipher.
Also I found this
In App Engine tasks, both the queue and the task handler run within the same Cloud project. Traffic is encrypted during transport and never leaves Google datacenters
but I'm not sure if it applies to this case
Thanks!
It all depends on what you mean by "secure".
Is your data encrypted at rest? Yes, even in Cloud Tasks (before the task is triggered).
Is your data encrypted in transit? Yes, even when Cloud Tasks triggers your endpoint.
Can someone see the data of a task? Yes, a user with sufficient rights can perform a GET on the task and view its body data.
So, yes, it's secure if you set the correct permissions for your users. You can encrypt the data, with Cloud KMS for example, if the Cloud Tasks data is private and you don't want an operator who manages Cloud Tasks to access it; but in that case, don't grant that user permissions on Cloud KMS.
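As an illustration, here is a minimal sketch of that approach, assuming a KMS key and a Tasks queue already exist; the project, location, key ring, key and queue names and the handler URL are placeholders, not values from the question.

from google.cloud import kms, tasks_v2

kms_client = kms.KeyManagementServiceClient()
tasks_client = tasks_v2.CloudTasksClient()

# Encrypt the private payload with Cloud KMS before putting it in the task body.
key_name = kms_client.crypto_key_path("my-project", "europe-west1", "my-keyring", "my-key")
ciphertext = kms_client.encrypt(
    request={"name": key_name, "plaintext": b'{"user_id": 42}'}
).ciphertext

# Create the task; the handler decrypts the body with the same key.
parent = tasks_client.queue_path("my-project", "europe-west1", "my-queue")
task = {
    "http_request": {
        "http_method": tasks_v2.HttpMethod.POST,
        "url": "https://<region>-<my-project>.cloudfunctions.net/myHandler",
        "body": ciphertext,
    }
}
tasks_client.create_task(request={"parent": parent, "task": task})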
Related
I have created an HTTP Google Cloud Function that does not allow unauthenticated requests.
I have created a service account in the project with one role: Cloud Functions Invoker.
This service account is listed as a principal on my HTTP cloud function and shows that role:
I have created a Cloud Scheduler Job to run this function.
In the job, I've specified that I want it to obtain an OIDC token for authenticating requests to the http function:
Whenever I trigger the job, it fails with a message indicating the request is unauthenticated:
Things I've tried:
Recreate the function
Recreate the job
Use a different user (the main service account user - that one doesn't work either)
Do a POST instead of a GET from the scheduler job (I've successfully created scheduled jobs for authenticated http functions before but this is the first time I've done a GET - just grasping at straws really)
Did I miss something? Any idea why it is coming back with the "Unauthenticated" message?
I revisited this today. My IAP-protected HTTP function expects a query string parameter to be passed to it. The Cloud Platform web UI automatically sets the audience to the same URL (including the parameter) when creating the scheduled job. I figured Google knows what they are doing, so I left it that way originally.
Out of desperation I tried removing this parameter from the audience and that made the authentication work properly.
So, I changed the audience from
https://<myProject>.cloudfunctions.net/myFunction?p=abc
to
https://<myProject>.cloudfunctions.net/myFunction
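For reference, here is a rough sketch of that working configuration using the Python client instead of the web UI; the URLs are the placeholders carried over from above, and the invoker service account email is an assumption, so adjust both to your project.

from google.cloud import scheduler_v1

client = scheduler_v1.CloudSchedulerClient()
parent = client.common_location_path("<myProject>", "<region>")

job = scheduler_v1.Job(
    name=f"{parent}/jobs/my-function-job",
    schedule="0 * * * *",
    http_target=scheduler_v1.HttpTarget(
        # The request URI may keep the query string parameter...
        uri="https://<myProject>.cloudfunctions.net/myFunction?p=abc",
        http_method=scheduler_v1.HttpMethod.GET,
        oidc_token=scheduler_v1.OidcToken(
            service_account_email="invoker@<myProject>.iam.gserviceaccount.com",
            # ...but the OIDC audience must not include it.
            audience="https://<myProject>.cloudfunctions.net/myFunction",
        ),
    ),
)
client.create_job(parent=parent, job=job)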
I wish to schedule a Vertex Pipeline and, for now, deploy it from my local machine.
I have defined my pipeline, and it runs well when I deploy it once using create_run_from_job_spec on AIPlatformClient.
When I try to schedule it with create_schedule_from_job_spec, the Cloud Scheduler object is created correctly, with an HTTP endpoint pointing to a Cloud Function. But when the scheduler runs, it fails with a Permission denied error. I have used several service accounts with Owner permissions on the project.
Do you know what could have gone wrong?
Since AIPlatformClient from Kubeflow Pipelines raises a deprecation warning, I also want to use PipelineJob from google.cloud.aiplatform, but I can't see any direct way to schedule the pipeline execution.
I've spent about 3 hours banging my head on this too. In my case, what seemed to fix it was either:
disabling and re-enabling the Cloud Scheduler API. Why did I do this? There is supposed to be a service account called service-[project-number]@gcp-sa-cloudscheduler.iam.gserviceaccount.com. If it is missing, re-enabling the API might recreate it
for older projects there is an additional step: https://cloud.google.com/scheduler/docs/http-target-auth#add
Simpler explanations include not doing some of the following steps:
creating a service account for the scheduler job and granting it the Cloud Functions Invoker role during creation
using this service account (see create_schedule_from_job_spec below)
finding the (sneaky) Cloud Function that was created for you (it will be called something like 'templated_http_request-v1') and adding your service account as a Cloud Functions Invoker on it
response = client.create_schedule_from_job_spec(
    job_spec_path=pipeline_spec,
    schedule="*/15 * * * *",
    time_zone="Europe/London",
    parameter_values={},
    cloud_scheduler_service_account="<your-service-account>@<project_id>.iam.gserviceaccount.com"
)
If you are still stuck, it is also useful to run gcloud scheduler jobs describe <pipeline-name>, as it really helps to understand what the scheduler is doing. You'll see the Cloud Function URL and the POST payload, which is base64 encoded and contains the pipeline yaml, and you'll see that it uses OIDC / a service account for security. It is also useful to view the code of the 'templated_http_request-v1' Cloud Function (sneakily created!). I was able to invoke the Cloud Function from Postman using the payload obtained from the scheduler job.
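If you want to inspect that payload yourself, a tiny Python sketch; the exact field holding the body in the describe output (under httpTarget) is an assumption here, so check your own output.

import base64

# Paste the base64 body value from the httpTarget section of the describe output.
encoded_body = "<base64-body-from-describe-output>"
print(base64.b64decode(encoded_body).decode("utf-8"))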
When I use Dialogflow + Google Assistant, I usually save temporary data with conv.data and save to user storage with conv.user.storage in my webhook Cloud Function (JavaScript).
But when I use Dialogflow + Telegram, I couldn't find an equivalent.
I want to save a score in a quiz: over 10 questions, the system should be able to keep the score in some temporary storage and calculate the total after the quiz is finished.
Both conv.data and conv.user.storage are Actions on Google concepts, so they are only available with that integration.
The rough equivalent of conv.data would be to have a Context with a long lifespan. Your webhook would save the data you're interested in preserving for the session in this context and then retrieve it in a later webhook call.
This is only maintained for a session, however. Any longer-term storage would require you to have a unique identifier for the user which you can use as a key into a database or data store you maintain.
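To make the context idea concrete, here is a minimal sketch of a fulfillment webhook (shown in Python for brevity; the same response shape applies to a JavaScript Cloud Function). The context name quiz-data and the lifespan of 50 are arbitrary example values, not anything Dialogflow requires.

import flask

app = flask.Flask(__name__)

@app.route("/webhook", methods=["POST"])
def webhook():
    req = flask.request.get_json()
    session = req["session"]

    # Read the previous score back from the long-lived context, if present.
    score = 0
    for ctx in req["queryResult"].get("outputContexts", []):
        if ctx["name"].endswith("/contexts/quiz-data"):
            score = int(ctx.get("parameters", {}).get("score", 0))

    score += 1  # e.g. the user answered the current question correctly

    # Return the updated score in a context with a long lifespan so later
    # webhook calls (for all 10 questions) can read it again.
    return flask.jsonify({
        "fulfillmentText": f"Current score: {score}",
        "outputContexts": [{
            "name": f"{session}/contexts/quiz-data",
            "lifespanCount": 50,
            "parameters": {"score": score},
        }],
    })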
I have a requirement where I need to trigger a Cloud Function, which in turn triggers a Dataflow job, once a file is placed in a Google Cloud Storage bucket of another project. Google documentation says it's not possible, please see here: https://cloud.google.com/functions/docs/calling/storage
However, I tried doing this and my cloud function failed to deploy with the error below.
It looks like this is a permission issue, and if the required permissions are given, this should work.
Do I need to add and give Owner permission to @appspot.gserviceaccount.com of the project (project A) from which I am trying to access the bucket of the other project (project B)?
So if the above is true, in my project B IAM page, I will see two entries as below:
@appspot.gserviceaccount.com OWNER
@appspot.gserviceaccount.com EDITOR
Any input on this is much appreciated.
It's not possible to catch events from other projects for now. But there is a workaround.
In the project with the bucket, create a PubSub notification on Cloud Storage (a Python sketch of this step is shown after the gcloud commands below)
On the topic that you created, create a push subscription. Use the Cloud Function's URL, and secure the PubSub call (you can get inspiration from there; if you are stuck, let me know and I will take more time to describe this part)
On the Cloud Function, grant the PubSub service account the cloudfunctions.invoker role
EDIT 1
The security part isn't so easy at the beginning. In your project B (where you have your Cloud Storage bucket), you have created a PubSub topic. On this topic you can create a push subscription authenticated with a service account created in project B. Take care to fill in the audience correctly.
Then, you need to grant this project B service account the roles/cloudfunctions.invoker role on the Cloud Function of project A.
# Create the service account
gcloud iam service-accounts create pubsub-push --project=<ProjectB>

# Create the push subscription
gcloud pubsub subscriptions create <subscriptionName> \
    --push-endpoint=https://<region>-<projectA>.cloudfunctions.net/<functionName> \
    --push-auth-service-account=pubsub-push@<projectB>.iam.gserviceaccount.com \
    --push-auth-token-audience=https://<region>-<projectA>.cloudfunctions.net/<functionName> \
    --topic=<GCSNotifTopic> --project=<ProjectB>

# Grant the service account on the function
gcloud functions add-iam-policy-binding <FunctionName> \
    --member=serviceAccount:pubsub-push@<ProjectB>.iam.gserviceaccount.com \
    --role=roles/cloudfunctions.invoker --project=<projectA>
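The commands above cover the service account, the subscription, and the IAM binding; the Cloud Storage notification itself (step 1 of the list) can be created with gsutil, or with the Python client as in this sketch (bucket and topic names are placeholders):

from google.cloud import storage

client = storage.Client(project="<ProjectB>")
bucket = client.bucket("<bucket-name>")

# Publish OBJECT_FINALIZE events (new files) to the topic used by the push subscription.
notification = bucket.notification(
    topic_name="<GCSNotifTopic>",
    topic_project="<ProjectB>",
    payload_format="JSON_API_V1",
    event_types=["OBJECT_FINALIZE"],
)
notification.create()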
Last traps:
The Cloud Function in project A doesn't have the same signature depending on whether it's an HTTP function (callable by a PubSub push subscription) or a background function (callable by events, such as Cloud Storage events). You need to update this according to the documentation.
The PubSub message sent to the Cloud Function is slightly different. Take care to update the input parsing accordingly (see the sketch below).
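As an illustration of those two traps, here is a minimal sketch of what the project A function could look like as an HTTP function in Python; the function name and the Dataflow call are placeholders.

import base64
import json

def handle_gcs_notification(request):
    # PubSub push wraps the message in an envelope:
    # {"message": {"data": "<base64>", "attributes": {...}}, "subscription": "..."}
    envelope = request.get_json()
    message = envelope["message"]
    attributes = message.get("attributes", {})

    # For Cloud Storage notifications, the data is the JSON object metadata.
    gcs_object = json.loads(base64.b64decode(message["data"]).decode("utf-8"))
    print(f"Event {attributes.get('eventType')} on gs://{gcs_object['bucket']}/{gcs_object['name']}")

    # ... launch the Dataflow job here ...
    return ("", 204)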
Google documentation says it's not possible
This is all you need to know. It is not possible. There are no workarounds, regardless of what sort of error messages you might see.
I (user1) have an Apps Script that connects to a Cloud SQL instance, like so:
var conn = Jdbc.getCloudSqlConnection("jdbc:google:rdbms://blah:instance1/foo")
And it works like a charm. I am trying to share this script as a Library and possibly monetize it in the future.
When I log in as a different user (user2) and add it as a library, it fails when the library runs a function that tries to make the connection:
Failed to establish a database connection. Check connection string, username and password.
So it seems like the connection is coming from user2 even though the code executing the connection is coming from a library that is owned by user1. Is that true? Is there any way to make this work beyond adding user2 as an editor to my Cloud SQL instance?
Yes, this is the expected behavior. The library being owned by a different user does not really matter. The Cloud SQL JDBC connector only authorizes connections if the effective user can edit the database.
To work around this you have to turn this "library" into a web app, running as you and shared with anyone, even anonymous, that accepts queries and commands for your database via HTTP parameters (either GET or POST). This web app will return the results using ContentService in any of the available text formats. I guess the easiest one is JSON, since you can work with regular JS objects in Apps Script and then just JSON.stringify and JSON.parse for the communication.
To make things easier you could then build a library that accesses this web app, abstracting away the UrlFetch calls and parameters from the first script, allowing easy "native" use by other scripts.