Making an HTTP API request to Amazon Elastic Beanstalk

I'm trying to make an HTTP GET request to
https://elasticbeanstalk.us-east-1.amazonaws.com/?ApplicationName=MyApplicationName&Operation=DescribeEnvironments
and getting
<?xml version="1.0" standalone="no"?>
<ErrorResponse xmlns="http://elasticbeanstalk.amazonaws.com/docs/2010-12-01/">
<Error>
<Type>Sender</Type>
<Code>InvalidClientTokenId</Code>
<Message>No account found for the given parameters</Message>
</Error>
<RequestId>ca83cbc7-f22a-11e3-8380-3bbf7df037f3</RequestId>
</ErrorResponse>
I've tried setting my key and secret as username and password for basic HTTP auth, but clearly this doesn't work.
So how do I add my key and secret to my remote request?

For most AWS usage scenarios it is highly recommended to use one of the many AWS SDKs, which ease working with the APIs via higher-level abstractions. These SDKs also take care of the required (and slightly complex) request signing; an explanation of the several options for providing your AWS credentials can be found in the respective SDK documentation:
The AWS SDKs provide functions that wrap an API and take care of many of the connection details, such as calculating signatures, handling request retries, and error handling. The SDKs also contain sample code, tutorials, and other resources to help you get started writing applications that call AWS. Calling the wrapper functions in an SDK can greatly simplify the process of writing an AWS application.
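As a minimal sketch with the AWS SDK for Python (boto3), assuming credentials are configured in one of the standard locations (environment variables, ~/.aws/credentials, an instance profile) and using the application name from the question, the same call could look like this:

import boto3

# The SDK resolves credentials and signs the request for you.
client = boto3.client("elasticbeanstalk", region_name="us-east-1")
response = client.describe_environments(ApplicationName="MyApplicationName")

for env in response["Environments"]:
    print(env["EnvironmentName"], env["Status"])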
If you really have a need to use the AWS APIs via REST directly, Signing AWS API Requests will guide you through the required steps; see e.g. the section Components of an AWS Signature Version 4 Request within the Signature Version 4 Signing Process documentation for the one that applies to AWS Elastic Beanstalk.
Please note that several services augment that documentation with a tailored one, see e.g. Signing and Authenticating REST Requests for the Amazon S3 variation.
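If you do want to send the raw request yourself, one option (a sketch, not the only way) is to let botocore compute the Signature Version 4 headers instead of implementing the algorithm by hand. The endpoint and query parameters below are taken from the question, and the requests library is assumed to be installed:

import botocore.session
import requests
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest

# Credentials are resolved from the environment / ~/.aws/credentials.
session = botocore.session.get_session()
credentials = session.get_credentials()

url = ("https://elasticbeanstalk.us-east-1.amazonaws.com/"
       "?ApplicationName=MyApplicationName&Operation=DescribeEnvironments")

# Build the request and let botocore add the SigV4 Authorization headers.
request = AWSRequest(method="GET", url=url)
SigV4Auth(credentials, "elasticbeanstalk", "us-east-1").add_auth(request)

# Send the request with the signed headers attached.
response = requests.get(url, headers=dict(request.headers))
print(response.status_code)
print(response.text)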

Related

How to test WebHooks without an on-premise external system?

I'm trying to teach myself about integrating systems via WebHooks.
In a free/hosted GIS system, I can create a WebHook that would, in theory, POST a JSON object to an external system.
The problem is, I don't have an external system that's available right now for receiving the POST.
I think I need some sort of publicly available sample server that would:
Receive the POST requests
Do something with the requests (i.e. create some sort of record)
...so that I could determine if the WebHook worked correctly or not.
How can I test my WebHooks without having an on-premise external system?
I've poked around websites like Postman Echo and AWS Lambda. But to my untrained eye, it seems like they're not quite designed for what I need.
You could use any of these options, depending on your requirements:
You could use the webhook modules in services like Integromat or Zapier to receive the webhook data and then apply transformations to it.
You could deploy a small script on Heroku and use the URL generated there as the target for the webhook calls (see the sketch below).
You could also use services like RequestBin or webhook.site if you just want to inspect the incoming webhook data.
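As a concrete sketch of the second option, here is a minimal receiver (Python/Flask, both assumed to be available on the host; the /webhook path is arbitrary) that accepts the POST, keeps a simple in-memory record, and logs the payload so you can verify the webhook fired:

import os
from flask import Flask, request, jsonify

app = Flask(__name__)
received = []  # in-memory record of incoming calls (reset on restart)

@app.route("/webhook", methods=["POST"])
def webhook():
    payload = request.get_json(silent=True)  # None if the body isn't JSON
    received.append(payload)
    print("Webhook received:", payload)      # visible in the host's logs
    return jsonify({"ok": True, "count": len(received)}), 200

if __name__ == "__main__":
    # Heroku-style hosts supply the port via $PORT; default to 5000 locally.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 5000)))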

Google Cloud function send call to app hosted on GKE

I would like to load data into my DB hosted on GKE using a Cloud Function (small ETL needs; a Cloud Function would be great for that case).
I'm working in the same region. My GKE cluster has an internal load balancer exposing an internal IP.
The method I call works perfectly when invoked from App Engine, but when calling it from a Cloud Function I get a connection error: "can't find client at IP".
I would like to know if this is possible, and if so, what the procedure would be.
Many thanks!
We just released this feature to Beta. You can get started by following our docs:
https://cloud.google.com/functions/docs/connecting-vpc
https://cloud.google.com/appengine/docs/standard/python/connecting-vpc
https://cloud.google.com/vpc/docs/configure-serverless-vpc-access
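Once a Serverless VPC Access connector (see the docs above) is attached to the function, the function can reach the internal load balancer IP directly. A rough sketch of an HTTP-triggered Python function follows; the internal address, path, and payload handling are placeholders for your setup, and requests is assumed to be listed in requirements.txt:

import requests

INTERNAL_LB = "http://10.0.0.10:8080"  # hypothetical internal load balancer address

def load_data(request):
    """HTTP-triggered Cloud Function entry point (Python runtime)."""
    rows = request.get_json(silent=True) or []
    # With the VPC connector attached, this call stays on the internal network.
    resp = requests.post(INTERNAL_LB + "/ingest", json=rows, timeout=30)
    return ("Upstream answered {}".format(resp.status_code), 200)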
This is not possible as of today.
https://issuetracker.google.com/issues/36859738
Thanks for your feedback.
You are totally right. At the moment the instances are only able to receive such requests via the external IP [1].
I have filed a feature request on your behalf so that this functionality might be considered for future deployments. I cannot guarantee this will be implemented or provide an ETA. Nevertheless, rest assured that your feedback is always taken seriously.
We also reached out to our Google Cloud representative, who confirmed this was a highly requested feature that was being looked at, but was unable to provide an ETA as to when it would be released.

Azure API gateway and app service, concurrency limitation?

We have an OData API endpoint hosted in App Service behind an API Management gateway, and we are running into concurrent-call issues; I'm trying to identify where the problem occurs. We use the Standard tier of API Management. Is there a concurrent call limit? Sorry, scanning through the documentation I didn't find a straight answer.
One more question: what is the simplest way to trace the requests and responses the API gateway generates? Thanks.
Adding the header Ocp-Apim-Trace: true to a request will return a link to a complete trace of the request and response. This only works if you are using a subscription key for an administrator user.
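For example, a small sketch in Python with requests; the API URL and subscription key are placeholders, and the trace link is assumed to come back in the Ocp-Apim-Trace-Location response header as described in the APIM tracing docs:

import requests

resp = requests.get(
    "https://my-apim-instance.azure-api.net/my-api/operation",  # hypothetical endpoint
    headers={
        "Ocp-Apim-Subscription-Key": "<administrator-subscription-key>",
        "Ocp-Apim-Trace": "true",
    },
)

# The gateway returns a link to the full trace of the request and response.
trace_url = resp.headers.get("Ocp-Apim-Trace-Location")
if trace_url:
    print(requests.get(trace_url).json())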

Displaying private HTML file from AWS S3

I'm currently hosting a static website on AWS S3. I have parts of the website that I only want AWS Cognito authenticated users to access. These parts of the S3 bucket are restricted to certain roles. As I understand it, once a Cognito user has received their temporary AWS credentials, I need to use the S3 sdk to load the restricted object (index.html) from S3 and display it in the webpage. Is this the correct approach, and once I have the object back from S3, how do I go about loading it into the webpage? Thank you!
You will need application logic that runs in the back-end to control your security and to store/retrieve data. While much of this can be done from the browser, it is open to hacking. Therefore, you need your access control logic in the back-end.
Option 1: API Gateway and Lambda functions
You can have a static web page served out of Amazon S3, which makes API calls to Lambda functions via API Gateway. This is known as the serverless model.
(There is a sample diagram of this pattern on the Serverless Code website.)
Basically, the Lambda functions receive the request, determine whether the user is authorised, determine what the user should receive back (e.g. a pre-signed URL to a different page), and send it back to the web page. The benefit of this design is that it does not require any servers.
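A minimal sketch of what the Lambda side of this could look like, assuming the Cognito authorisation check is enforced upstream (e.g. a Cognito authorizer on the API Gateway method) and that the bucket and key names below are placeholders:

import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # Authorisation is assumed to have been enforced upstream; here we only
    # build a short-lived URL for the restricted object.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-private-site-bucket", "Key": "restricted/index.html"},
        ExpiresIn=300,  # URL is valid for 5 minutes
    )
    return {"statusCode": 200, "body": json.dumps({"url": url})}

The web page can then fetch that URL (or navigate to it) to load the restricted index.html.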
Option 2: Amazon EC2 servers
Alternatively, you can run Amazon EC2 instances fronted by an Elastic Load Balancer. This is a traditional application design that allows you to use many different frameworks. However, there is an ongoing cost for the servers even when nobody is using your application.

Stubbing an API with Azure API Management

I was wondering if someone could provide some information regarding the Azure Management Portal. My question is whether the Portal can cater for stubbing APIs.
I have added an API via the Portal, as well as some operations; however, the documentation isn't clear on whether a real API has to be published to Azure. I was thinking I could add example requests and responses without a real API behind them, i.e. a stub?
You can add a stub easily enough. See the default 'echo' endpoint that comes with each new API that you add to API Management. Copy that pattern and you can build a stub. The API Management documentation is some of the better content for Azure, so I'd recommend reading it.