AWS SNS subscription to Lambda doesn't automatically create an event source using Java APIs - aws-sdk

Below are the steps to reproduce using the AWS Java API:
1. Create a Node.js Lambda function
2. Create an SNS topic
3. Subscribe the SNS topic to the Lambda function using its ARN
Now if we go to the Lambda function and look at the Event Sources tab, we don't see the SNS topic listed as an event source, so SNS doesn't trigger the Lambda function at all. Looks like an AWS issue to me.
P.S.: The AWS user keys used to reproduce the above issue had blanket access (the Admin policy) attached, so it doesn't look like a permissions issue. If I repeat the above steps via the AWS Console, the event source gets added properly to the Lambda function.
Has anyone encountered this issue before? How can it be resolved? Or are there any workarounds?

You need to add permission for SNS to call Lambda. You can do this by calling the AddPermission API.
Background: SNS uses a push model to invoke Lambda. This means that SNS directly invokes your Lambda function (similar to you calling invokeFunction from the Java SDK). The Lambda IAM permission model requires that each caller be explicitly permitted to call the function.
This is different from the Lambda Kinesis event source, which uses a pull model (the Lambda service polls the Kinesis stream and invokes your function with the records it pulls).
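The missing call can be sketched as follows. This is shown in Python with boto3 for brevity (the Java SDK's AddPermissionRequest takes the same fields); the statement id and function/topic names are placeholder assumptions to adapt:

```python
def sns_invoke_permission(function_name, topic_arn, statement_id="sns-invoke"):
    # Arguments for Lambda's AddPermission API: allow the SNS service
    # principal, scoped to this topic's ARN, to invoke the function.
    return {
        "FunctionName": function_name,
        "StatementId": statement_id,
        "Action": "lambda:InvokeFunction",
        "Principal": "sns.amazonaws.com",
        "SourceArn": topic_arn,
    }

def allow_sns_to_invoke(function_name, topic_arn):
    import boto3  # imported here so the helper above is usable without AWS credentials
    boto3.client("lambda").add_permission(
        **sns_invoke_permission(function_name, topic_arn)
    )
```

After this call (in addition to the Subscribe call you already make), the topic should show up as an event source for the function.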

Related

Permission denied when scheduling Vertex Pipelines

I wish to schedule a Vertex Pipeline, deploying it from my local machine for now.
I have defined my pipeline, which runs well when I deploy it once using create_run_from_job_spec on AIPlatformClient.
When trying to schedule it with create_schedule_from_job_spec, I do get a properly created Cloud Scheduler object, with an HTTP endpoint pointing to a Cloud Function. But when the scheduler runs, it fails with a Permission denied error. I used several service accounts with owner permissions on the project.
Do you know what could have gone wrong?
Since AIPlatformClient from Kubeflow Pipelines raises a deprecation warning, I also want to use PipelineJob from google.cloud.aiplatform, but I can't see any direct way to schedule the pipeline execution.
I've spent about 3 hours banging my head on this too. In my case, what seemed to fix it was either:
disabling and re-enabling the Cloud Scheduler API. Why did I do this? There is supposed to be a service account called service-[project-number]@gcp-sa-cloudscheduler.iam.gserviceaccount.com. If it is missing, then re-enabling the API might fix it
for older projects there is an additional step: https://cloud.google.com/scheduler/docs/http-target-auth#add
Simpler explanations include missing some of the following steps:
create a service account for the scheduler job, granting it the Cloud Functions Invoker role during creation
use this service account (see create_schedule_from_job_spec below)
find the (sneaky) Cloud Function that was created for you (it will be called something like 'templated_http_request-v1') and add your service account as a Cloud Functions invoker
response = client.create_schedule_from_job_spec(
    job_spec_path=pipeline_spec,
    schedule="*/15 * * * *",
    time_zone="Europe/London",
    parameter_values={},
    cloud_scheduler_service_account="<your-service-account>@<project_id>.iam.gserviceaccount.com"
)
If you are still stuck, it is also useful to run gcloud scheduler jobs describe <pipeline-name>, as it really helps to understand what the scheduler is doing. You'll see the Cloud Function URL and the POST payload, which is base64-encoded and contains the pipeline YAML, and you'll see that it uses OIDC with a service account for security. Also useful is viewing the code of the (sneakily created!) 'templated_http_request-v1' Cloud Function. I was able to invoke the Cloud Function from Postman using the payload obtained from the scheduler job.

Google Cloud Function :: Service account :: JWT token and Bearer token

I have a Google Cloud Function. I also have a web application. I want to authenticate requests to the cloud function by using a service account.
I have the json key file.
I know that I have to follow https://cloud.google.com/functions/docs/securing/authenticating#service-to-function. But that is leading me to an IAP page that does not apply to Google Cloud Functions.
Similar instructions are found at https://developers.google.com/identity/protocols/oauth2/service-account
But if I follow the Python library code, I end up with the sample code there:
import googleapiclient.discovery
sqladmin = googleapiclient.discovery.build('sqladmin', 'v1beta3', credentials=credentials)
response = sqladmin.instances().list(project='exciting-example-123').execute()
This does not directly relate to invoking a cloud function.
This question's answer somewhat deals with my requirement but uses a Call API which is only suitable for testing.
Also, I want to expose this API to multiple applications using other tech like .NET. So I believe the best option for me will be to use the HTTP method (given on the same page):
https://developers.google.com/identity/protocols/oauth2/service-account#httprest
But whatever I do I am unable to get the signature right.
Any help getting this sorted will be highly appreciated, as I have been stuck on this for the past few days.
You can use the Google auth library like this:
from google.oauth2.id_token import fetch_id_token
from google.auth.transport import requests

audience = "my_audience"
r = requests.Request()
token = fetch_id_token(r, audience)
print(token)
The fetch_id_token method will use the default credentials:
the service account key file defined in the environment variable GOOGLE_APPLICATION_CREDENTIALS, or
the service account loaded in the Google Cloud environment
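Once you have the token, you pass it as a Bearer token when calling the function. A minimal sketch, assuming the audience is the function's own URL (the URL itself is a placeholder), with urllib used to keep it dependency-free:

```python
def auth_header(id_token):
    # The ID token is sent as a standard Bearer token.
    return {"Authorization": "Bearer " + id_token}

def call_function(url):
    import urllib.request
    from google.auth.transport.requests import Request
    from google.oauth2.id_token import fetch_id_token
    token = fetch_id_token(Request(), url)  # audience = the function URL
    req = urllib.request.Request(url, headers=auth_header(token))
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```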
For now, I followed this answer in PHP.
In the claims section, I removed the scope and instead added a target_audience claim:
"target_audience" => "google-function-http-trigger"
The cloud function HTTP trigger will look like https://us-central1-test-project-name.cloudfunctions.net/function-name
This will give the required assertion key.
Then I follow https://developers.google.com/identity/protocols/oauth2/service-account#httprest to get the id_token.
Then, with the id_token as the Bearer token, we can call the cloud function.
Please note that the token expires depending on the time set in the "exp" claim. Once expired, you have to redo the steps to generate a new id_token.
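The claim set described above can be sketched in Python; and for the signing and exchange steps, google-auth's IDTokenCredentials performs the same flow from a key file so you don't have to build the JWT by hand (the key-file path, service-account email, and function URL below are placeholders):

```python
import time

def id_token_claims(sa_email, target_audience, lifetime=3600):
    # Mirrors the claims above: target_audience replaces scope,
    # and exp controls when the resulting id_token stops working.
    now = int(time.time())
    return {
        "iss": sa_email,
        "aud": "https://oauth2.googleapis.com/token",
        "iat": now,
        "exp": now + lifetime,
        "target_audience": target_audience,
    }

def id_token_from_key_file(key_file, function_url):
    from google.auth.transport.requests import Request
    from google.oauth2 import service_account
    creds = service_account.IDTokenCredentials.from_service_account_file(
        key_file, target_audience=function_url)
    creds.refresh(Request())  # signs the JWT and exchanges it for an id_token
    return creds.token
```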
I want to authenticate requests to the cloud function by using a service account.
I am not sure I understand the context correctly, but I would try to assign the roles/cloudfunctions.invoker IAM role to that service account (the one used to run your code in the web application); see Cloud Functions IAM Roles.
In that case, code running under that service account "Can invoke an HTTP function using its public URL".
I reckon no JSON keys are required in this case.

Invoke AWS SNS Topic from SQL trigger

We want to invoke an AWS SNS topic with some data using a SQL table trigger. In MySQL we have mysql.lambda_async("topic arn"), but how is it possible from SQL? Please give your suggestions.
According to this AWS documentation, the parameters for lambda_async are actually the Lambda function ARN and a JSON payload.
Example from the documentation:
SELECT lambda_sync(
    'arn:aws:lambda:us-east-1:868710585169:function:BasicTestLambda',
    '{"operation": "ping"}');
You can't directly connect SNS to SQL triggers, but you could trigger a Lambda function that publishes the data to an SNS topic or pushes it to an SQS queue.

C# | How to call an MVC Data Access Layer through an AWS Lambda function

We have an AWS Lambda function which triggers whenever a new file arrives in the S3 bucket, and we need to log the activity in our existing MySQL DB by using our existing MVC application project, so that we don't need to rewrite the logging mechanism. Please help me out; I am new to this world.
Thanks!
We solved it through an API: we first trigger our Lambda function, and in it we call our MVC Web API (which is available in our existing web solution), and in that Web API we call our data access layer to do the database logging.
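That flow can be sketched like this: a Lambda handler reading the S3 record and POSTing it to the Web API (shown in Python for brevity; a C# Lambda would follow the same pattern, and the endpoint URL and payload shape are hypothetical):

```python
import json
import urllib.request

def log_payload(bucket, key):
    # Hypothetical body for the MVC Web API's logging endpoint.
    return json.dumps({"source": "s3", "bucket": bucket, "key": key}).encode("utf-8")

def handler(event, context):
    s3 = event["Records"][0]["s3"]  # the S3 put event that triggered us
    req = urllib.request.Request(
        "https://example.com/api/activitylog",  # placeholder Web API URL
        data=log_payload(s3["bucket"]["name"], s3["object"]["key"]),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)  # the Web API writes the log via the existing DAL
```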

DynamoDB: How to call dynamodb api from ZF2 rest application

I have an existing app with MySQL as the backend. We are shifting from MySQL to DynamoDB, and I want to call the DynamoDB API from my ZF2 REST application.
I am working with this module: https://github.com/aws/aws-sdk-php-zf2 (which helps with interacting with DynamoDB)
Any light on the path would be great
Thanks
I was able to accomplish this task by extending AbstractRestfulController; we can use its Get, Create, and Delete methods to call the API.
The AWS module helps with communicating with DynamoDB.