How to log the request and response in Azure API Management

Is it possible to log the response size, request size, and error message from an API Management instance?
If yes, how can I fetch that data from it?

There is a built-in log-to-eventhub policy that you can use to send basically any information that exists on the context object (meaning the request/response plus a bit more) to an event hub. From there you can use any regular method for processing the events.
See How to log events to Azure Event Hubs in Azure API Management.
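For illustration, a minimal sketch of such a policy, placed in the outbound section so that context.Response is populated. The logger-id must reference a Logger entity you have created against your event hub beforehand ("apim-logger" is an assumed name), and the property list is just an example:

```xml
<outbound>
    <log-to-eventhub logger-id="apim-logger">@{
        // Build a JSON string with the fields of interest from the context object.
        return new JObject(
            new JProperty("requestId", context.RequestId),
            new JProperty("method", context.Request.Method),
            new JProperty("url", context.Request.Url.Path),
            new JProperty("statusCode", context.Response.StatusCode),
            new JProperty("responseSize",
                context.Response.Body == null
                    ? 0
                    : context.Response.Body.As<string>(preserveContent: true).Length)
        ).ToString();
    }</log-to-eventhub>
</outbound>
```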

Use Azure Monitor to configure diagnostic logs from API Management to either Storage, Event Hubs, or Log Analytics. These logs have the data you are looking for.
I would start with the free tier of Log Analytics for easy querying, dashboards, and alerting. Refer to this.
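Once diagnostics are flowing into a workspace, a query along these lines surfaces the sizes and errors per call. This is a sketch against the ApiManagementGatewayLogs table (resource-specific mode); verify the table and column names in your own workspace:

```kusto
// Request/response sizes and any backend error, newest first
ApiManagementGatewayLogs
| where TimeGenerated > ago(24h)
| project TimeGenerated, Method, Url, ResponseCode,
          RequestSize, ResponseSize, LastErrorMessage
| order by TimeGenerated desc
```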
For more custom logging, you can use the log-to-eventhub policy. Refer to this blog.

Related

Is there a way to capture network traces for Azure API Management?

Is there a way to capture network traces for Azure API Management when we make a REST API call?
In my case, it makes a REST API call and the request goes through custom DNS to the destination resource. I want to capture the network traffic so I can analyze it in case of any transient failures.
No, this capability does not exist so far; you have to open a support ticket and get help from the support team.

Writing code to receive messages from IoT Hub and store them in a container in Blob Storage

I would like to write a function, using a Function App in Microsoft Azure, that receives messages from an IoT Hub, converts them from Base64 format to string, and stores them in a container in Blob Storage. Could you please help me do this?
Thanks in advance.
Br,
Masoud
Create an Azure Function with the IoT Hub trigger to receive the messages; this template comes by default when you create the function.
The Connection value should be provided in the local.settings.json file. It is the IoT Hub connection string, which you can get from the Azure Portal > IoT Hub resource > Built-in endpoints, under Hub settings.
Run the function and you will see your messages flowing from IoT Hub to your Azure Function (assuming you have devices or simulators connected that are sending data).
Refer to this article for step-by-step information on receiving IoT Hub messages in an Azure Function.
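To also do the Base64 conversion and the blob write inside the function, a minimal sketch using the in-process C# model could look like this. The connection setting name, container name, and blob path pattern are assumptions; adjust them to your setup:

```csharp
using System;
using System.IO;
using System.Text;
using Microsoft.Azure.EventHubs;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class IoTHubToBlob
{
    [FunctionName("IoTHubToBlob")]
    public static void Run(
        // "IoTHubConnection" must be set in local.settings.json to the
        // Event Hub-compatible connection string from Built-in endpoints.
        [IoTHubTrigger("messages/events", Connection = "IoTHubConnection")] EventData message,
        // Writes one blob per message; container name and name pattern are illustrative.
        [Blob("iot-messages/{rand-guid}.txt", FileAccess.Write)] out string outputBlob,
        ILogger log)
    {
        // The event body arrives as bytes holding a Base64 string; decode both layers.
        string base64Payload = Encoding.UTF8.GetString(
            message.Body.Array, message.Body.Offset, message.Body.Count);
        outputBlob = Encoding.UTF8.GetString(Convert.FromBase64String(base64Payload));
        log.LogInformation($"Decoded message: {outputBlob}");
    }
}
```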
To save the messages coming from sensors in Azure, you can use an Azure Storage account, either a blob container or Table Storage.
Step 1: Go to the IoT Hub resource > Message routing > Custom endpoints > Add, and add a new endpoint for the storage service.
Step 2: Give the endpoint a name, pick a container created for storing the data (messages), and choose the encoding format (JSON or AVRO), file name format, authentication type, etc.
Now we have added a storage account endpoint that routes messages received by IoT Hub to Azure Blob Storage.
The final step is to add the route that sends the data to the storage account endpoint.
Please visit these sources for detailed information:
Saving IoT Hub messages to Azure Blob Storage.
Saving IoT Hub Sensor Data to Azure Table Storage
Official Documentation of Azure Functions IoT Hub Trigger
Note: keep cost in mind; basically, this variant is recommended only for proof-of-concepts or very small, simple projects. Please refer to this article for more information on which storage account to use with the IoT Hub trigger to optimize cost.

Azure API Management - User Metadata

I am using Azure API Management to provide an API gateway for some APIs. To set up a policy for a particular API, I have used a Property (Named Value) to store user metadata, which I then assign to a variable while processing the incoming request. When adding a new user, I need to add metadata for the new user to the JSON. The property value has now grown beyond the size limit, and I cannot add more info to it anymore. I am wondering what the best way is to store my large metadata so that it remains accessible in API Management policies?
Update 1:
I have switched the authentication process from Azure to Auth0, so I can add the user metadata to Auth0's app_metadata; then, in the Azure policies, I validate the JWT from Auth0 and obtain the token claim (app_metadata), as explained in this article. By doing so I can solve the large user metadata (JSON) issue. However, this doesn't solve the problem for other, unrelated user metadata stored in other Properties (Named Values), and moreover the API gateway inbound policies keep growing and are becoming a huge bunch of logic that is not easy to manage and maintain.
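For reference, the policy side of that approach can look roughly like the sketch below. The tenant URL, audience, and variable names are placeholders; output-token-variable-name keeps the validated token around so the claim can be read afterwards:

```xml
<inbound>
    <!-- Validate the Auth0-issued JWT; tenant and audience values are placeholders -->
    <validate-jwt header-name="Authorization" failed-validation-httpcode="401"
                  failed-validation-error-message="Unauthorized"
                  output-token-variable-name="jwt">
        <openid-config url="https://YOUR_TENANT.auth0.com/.well-known/openid-configuration" />
        <audiences>
            <audience>your-api-identifier</audience>
        </audiences>
    </validate-jwt>
    <!-- Read the app_metadata claim from the validated token -->
    <set-variable name="appMetadata"
        value="@(((Jwt)context.Variables["jwt"]).Claims.GetValueOrDefault("app_metadata", ""))" />
</inbound>
```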
At this stage I am looking for a solution to handle all the API gateway inbound policies in a better way and in a more manageable environment, i.e. C#. So my two cents is to implement the API gateway inbound policy logic in a new .NET API and call this new API from the existing API gateway inbound policies, so that it can play a bridge role between the Azure API gateway and the existing API. However, I'm still not sure if this is achievable, and whether the existing API can be called via the new API directly or whether it should be called via the Azure API gateway in some way.
At this point you have to either store it in multiple variables or hardcode it directly in the policy.
After more research I ended up with this solution, which basically suggests storing the user metadata in Azure Cosmos DB and calling the Cosmos API from the API Management policy to access the metadata; the Cosmos API call can also be cached in the policy.
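A rough sketch of that pattern in policy terms, where the endpoint URL, cache key, and variable names are all illustrative; with the one-hour cache, most requests never touch Cosmos at all:

```xml
<inbound>
    <!-- Check APIM's built-in cache before calling out -->
    <cache-lookup-value key="user-metadata" variable-name="userMetadata" />
    <choose>
        <!-- On a cache miss, fetch the metadata from the Cosmos-backed API -->
        <when condition="@(!context.Variables.ContainsKey("userMetadata"))">
            <send-request mode="new" response-variable-name="metadataResponse"
                          timeout="10" ignore-error="false">
                <set-url>https://example-metadata-api.azurewebsites.net/user-metadata</set-url>
                <set-method>GET</set-method>
            </send-request>
            <set-variable name="userMetadata"
                value="@(((IResponse)context.Variables["metadataResponse"]).Body.As<string>())" />
            <!-- Keep it for an hour so subsequent requests are served from cache -->
            <cache-store-value key="user-metadata"
                value="@((string)context.Variables["userMetadata"])" duration="3600" />
        </when>
    </choose>
</inbound>
```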

How to identify the requests received in Azure API Management

We have a production issue where an order is submitted twice. Currently we have an API for orders, which we expose to clients using API Management, and in it we have policies for mapping the customer-facing URL to the actual one.
Now, our actual API got two requests, so we thought the customer submitted twice, but they have confirmed that they did not; so we suspect an issue with API Management firing the request twice.
How can I identify the requests received by API Management?
Is there any chance that API Management will fire a request twice?
Appreciate any pointers.
The only way for APIM to fire a request twice would be by means of the retry policy, or manually using send-request. Otherwise it must be the client calling your API two times. Each request in APIM gets its own unique id, accessible in policies as context.RequestId; this is the main way to track and identify them. But these ids are produced inside APIM itself, and thus are useful only if you're tracking a call from APIM into the backend.
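If it helps with future investigations, you can forward that id to your backend so its logs can be matched with APIM's; a minimal inbound snippet (the header name below is just a made-up convention):

```xml
<inbound>
    <!-- Pass APIM's unique per-request id to the backend for correlation -->
    <set-header name="X-Apim-Request-Id" exists-action="override">
        <value>@(context.RequestId.ToString())</value>
    </set-header>
</inbound>
```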
Your best option now is to try to identify requests by client IP, method, URI, and time frame. APIM allows you to grab logs for certain periods of time (better if kept short) in JSON or CSV with the data I mentioned above. To do that, look into the byRequest report (https://learn.microsoft.com/en-us/rest/api/apimanagement/reports#ReportByRequest), grab the JSON/CSV, and try to identify the calls of interest.
For the future, you could look into onboarding your service to Azure Monitor (https://learn.microsoft.com/en-us/azure/api-management/api-management-howto-use-azure-monitor) or Log Analytics; those provide an easier way to traverse logs.

Implementing IoT PowerBI table schema

I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure but couldn't quite figure out how the JSON schema and Event Hubs work together to display the info on PowerBI.
Can I create a schema and upload it to PowerBI then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you've mentioned. If your gateway is able to make a RESTful call to the Event Hubs entry point, Event Hubs will get this data and store it temporarily for the retention period specified. Then Stream Analytics will consume the data from Event Hubs and enable you to do further processing and divert the data to different outputs. In your case, you can set one of the outputs to be a PowerBI dashboard, which you can authorize with an organizational account (more on that later), and the output will automatically be tied to PowerBI.
The data schema part is interesting: the JSON itself defines the data table schema to be used on the PowerBI side, and it will propagate from Event Hubs to Stream Analytics to PowerBI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the data being streamed in should be in the same format.
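For example, if the first event your gateway sends looks like the purely illustrative payload below, the resulting PowerBI table will have exactly these four fields, and subsequent events should keep the same shape:

```json
{
  "deviceId": "sensor-01",
  "timestamp": "2016-05-01T12:00:00Z",
  "temperature": 21.5,
  "humidity": 48.2
}
```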
If you don't have an organizational account at hand to use with PowerBI, you can register your domain under Azure Active Directory and use that account, since it is considered within your org.
There may be a way of altering the schema afterwards using the PowerBI REST API. Kindly find the links below; I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream analytics with powerbi
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, read it, and send it to PowerBI with Stream Analytics. Listing all the steps here would be too long, so I suggest that you take a look at a series of blog posts I wrote describing how I built a demo similar to what you are trying to achieve. That should give you enough info to get started.
http://guyb.ca/IoTAzureDemo