I am facing an issue: I am not able to find my Logic App in the API Management application.
I have 10 different workflows with HTTP triggers. Through Postman, they work fine.
Both apps (the Logic App and the API Management app) are under the same subscription and resource group, but when I search for the Logic App it is not visible.
In API Management -> Backends, I am able to find it, but I am not sure how to connect the back end with the front end in this approach.
Please help.
I've had the same issue, and it took me a while to find out that the Logic App also needs to run under a "Consumption plan":
https://learn.microsoft.com/en-us/azure/api-management/import-logic-app-as-api#prerequisites
When creating the Logic App you can choose between a Standard and a Consumption plan.
Similar problem. I have a Logic App workflow defined as:
And yet this logic app does not show up when I attempt to import it into API Management:
The Logic App and API Management instances are in the same account and resource group.
We have been working on the integration between Azure DevOps Services and ServiceNow. Our goal is to send Change Requests from ServiceNow to Azure DevOps, where they would become Features or User Stories. Whenever there is an update in Azure DevOps, that update should be sent to ServiceNow, and vice versa.
The idea is to work with the REST API.
From our investigation, we have found that it is possible to send updates to other applications through web hooks. We are still not sure if this will suit our needs and if we are able to work with this. The problem is that the webhooks only support the HTTP POST method, while ServiceNow requires PATCH to update on its side. Is this correct? Is there any way of creating webhooks with the PATCH method?
Another way we could integrate is to create some software that sends the needed response. However, we cannot seem to find a way to automate this response. As I understand it, it will generate a response only when the script runs, not when a work item is updated. Is there any way to trigger the sending of a JSON file with all the information within the work item whenever the work item state is updated?
As a workaround, you can try to create a custom service hook. Here is the document you can refer to.
The Marketplace provides an extension (Azure DevOps Service Hooks DSL). This extension framework is designed to ease the development of your own REST web hook site to do this type of integration. It does this by providing an MVC WebAPI endpoint and a collection of helper methods, implemented as an extensible Domain Specific Language (DSL), for common processing steps and API operations, such as calling back to the TFS/VSTS server that called the endpoint or accessing SMTP services.
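To make the custom service hook idea concrete, here is a minimal sketch of a relay that accepts the POST an Azure DevOps service hook sends and re-issues the update as the PATCH that ServiceNow expects. It assumes the ServiceNow Table API (PATCH /api/now/table/{table}/{sys_id}); the instance URL, the credentials, and the two extract helpers are hypothetical placeholders, since mapping work item fields onto a ServiceNow record is specific to your setup:

```java
import com.sun.net.httpserver.HttpServer;

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ServiceHookRelay {

    // Hypothetical ServiceNow instance and credentials -- replace with your own.
    private static final String SNOW_TABLE_URL =
            "https://yourinstance.service-now.com/api/now/table/change_request/";
    private static final String SNOW_AUTH = "Basic " + Base64.getEncoder()
            .encodeToString("snow-user:snow-password".getBytes(StandardCharsets.UTF_8));

    private static final HttpClient HTTP = HttpClient.newHttpClient();

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // Azure DevOps service hooks can only POST, so accept the POST here
        // and re-issue the update as the PATCH that ServiceNow expects.
        server.createContext("/devops-hook", exchange -> {
            String event = new String(exchange.getRequestBody().readAllBytes(),
                    StandardCharsets.UTF_8);

            // Mapping the work item event onto a ServiceNow record is specific
            // to your setup; these helpers are placeholders for that logic.
            String sysId = extractSysId(event);
            String patchBody = "{\"state\":\"" + extractState(event) + "\"}";

            HttpRequest patch = HttpRequest.newBuilder(URI.create(SNOW_TABLE_URL + sysId))
                    .header("Authorization", SNOW_AUTH)
                    .header("Content-Type", "application/json")
                    .method("PATCH", HttpRequest.BodyPublishers.ofString(patchBody))
                    .build();
            try {
                HTTP.send(patch, HttpResponse.BodyHandlers.discarding());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }

            exchange.sendResponseHeaders(200, -1); // acknowledge the hook, no body
            exchange.close();
        });
        server.start();
    }

    // Placeholder extraction logic -- a real relay would parse the JSON payload
    // that Azure DevOps posts and pull out the linked CR sys_id and new state.
    private static String extractSysId(String event) { return "REPLACE_WITH_SYS_ID"; }
    private static String extractState(String event) { return "REPLACE_WITH_STATE"; }
}
```

Pointing a "work item updated" service hook at /devops-hook would then give you the update-on-state-change behavior asked about above.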
"Is there any way to trigger the sending of a JSON file with all information within the work item whenever the work item state is updated?"
I am not sure if it is possible to trigger that.
But there is a ServiceNow DevOps extension for the integration between Azure DevOps and ServiceNow. You may use that.
Has anyone tried to import a (private, not publicly accessible) API App as an API when your API Management instance is deployed in a vnet? This is different from just pointing to an IP address as explained here, which no doubt will work.
However, directly pointing to an Azure resource, as the Import API App feature does, seems like the nicer approach. I wonder what the exact requirements are. I've tried to import an API App that wasn't part of the same vnet, and it gave a DNS error after importing the API and trying it out in API Management (which only makes sense if it tried to call the API App as if it were hosted on the internet; if the API App is accessible from the internet, there is no issue and it works perfectly).
There is a direct reference between an API resource and the API Management resource (it sets the ResourceId on your ApiManagementBackend). Also, I've set HostNamesDisabled to true on the API App, which specifically claims that "the app is only accessible via API management process"... So Azure should handle the traffic between the two resources internally, right? (This is what I hoped.)
Trying to put both resources in the same vnet would be the next thing to try, but then I'd skip the whole Import API altogether, because then I can just point to the IP address. Which perhaps leads to the conclusion that the Import API functionality is not beneficial in an API Management vnet scenario?
The best approach for me at the moment is to not put the API App in a vnet, and to just disable access to the API App for all IPs except the public endpoint of the API Management instance.
Once private endpoints become generally available for App Services, I might reevaluate this...
I have an endpoint in my MERN app which I would like to expose to developers.
I came across APIM and would like to use it.
After going through the documentation, I would like to know how I can use APIM for my specific endpoint, and where I can let users generate API keys from my client-side React app.
I am also going through the API Management REST API, but I don't know how to generate user-specific API keys...
You could simply mimic what the Developer Portal does using APIM's REST API.
If you are using the Consumption tier of APIM, you can just create a standalone subscription using the Create or Update Subscription API. You don't have to set properties.ownerId in the request payload here.
On the other tiers, standalone subscriptions are not supported yet (but will be, as mentioned in the official announcement blog under New Features), so you will have to create a user first using the Create or Update User API and then create a new subscription, setting this user under properties.ownerId as /users/{userId}.
Since these REST APIs call the Azure Management API, you shouldn't be making these requests from the client; they should be called from your backend.
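As a rough illustration of the standalone case, here is a minimal backend sketch that calls the Create or Update Subscription API with a bearer token for the Azure Management API. The resource names, the subscription id, the api-version, and how the token is obtained are all assumptions you would need to fill in:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ApimSubscriptionClient {

    public static void main(String[] args) throws Exception {
        // All of these values are assumptions -- substitute your own.
        String armToken = System.getenv("ARM_TOKEN"); // Azure AD token for management.azure.com
        String azureSub = "<azure-subscription-id>";
        String rg = "<resource-group>";
        String service = "<apim-service-name>";
        String sid = "my-new-subscription"; // id of the APIM subscription to create

        // Create or Update Subscription; on the Consumption tier no
        // properties.ownerId is needed for a standalone subscription.
        String url = String.format(
                "https://management.azure.com/subscriptions/%s/resourceGroups/%s"
                + "/providers/Microsoft.ApiManagement/service/%s/subscriptions/%s"
                + "?api-version=2022-08-01",
                azureSub, rg, service, sid);

        String body = "{ \"properties\": { \"displayName\": \"Developer key\", \"scope\": \"/apis\" } }";

        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("Authorization", "Bearer " + armToken)
                .header("Content-Type", "application/json")
                .PUT(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        System.out.println(response.statusCode() + "\n" + response.body());
    }
}
```

Depending on the api-version, the subscription keys are either returned directly or retrieved separately via the List Secrets operation on the same subscription resource.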
I have a web app and an API for the app configured and completed, but work is now requesting more apps. The apps are web-heavy with a light API for mobile functionality. A monolithic app seems out of the question, so I decided to make each one individually. Each app will have its own layout, database, and API. However, the one thing I want to share among all apps is the users' passwords, API tokens, and Firebase messaging tokens. A separate app will be created just for authentication, with ID number, password, API token, and FCM token: four simple fields. This single app will be the only one doing any writing to its DB and this single table.
Making requests to the auth app to verify every request to each API seems inefficient, so I was wondering if there is a way for the apps to tap into the auth database and verify tokens and passwords directly. There would be no joining of tables across apps and no cross-app creation/updating/deletion. Problems with keeping models and schemas in sync make sense, but would read-only custom queries eliminate those issues?
Integration at the DB level is messy: any change would need to be made in every application using it, and security is a concern too.
The typical solution for your problem (having several applications share a single authentication source) is OAuth, a way for the multiple apps to delegate authentication to your "Auth App". This is well supported by frameworks such as Devise.
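To sketch what "delegating authentication" can look like in code (rather than reading the auth DB directly), here is a minimal example of one app asking the Auth App whether a token is valid, assuming the Auth App exposes a standard RFC 7662 token introspection endpoint; the URL and client credentials are hypothetical:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class TokenVerifier {

    // Hypothetical introspection endpoint and client credentials on the Auth App.
    private static final String INTROSPECT_URL = "https://auth.example.com/oauth/introspect";
    private static final String CLIENT_AUTH = "Basic " + Base64.getEncoder()
            .encodeToString("app-client-id:app-client-secret".getBytes());

    private static final HttpClient HTTP = HttpClient.newHttpClient();

    /** Returns true if the Auth App reports the token as active (RFC 7662). */
    public static boolean isTokenActive(String token) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(URI.create(INTROSPECT_URL))
                .header("Authorization", CLIENT_AUTH)
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString("token=" + token))
                .build();

        HttpResponse<String> response = HTTP.send(request, HttpResponse.BodyHandlers.ofString());

        // A full implementation would parse the JSON response; checking the
        // "active" flag with a string match keeps this sketch dependency-free.
        return response.statusCode() == 200 && response.body().contains("\"active\":true");
    }
}
```

The apps never touch the auth schema, so the Auth App is free to change its tables without breaking anyone.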
I have an app that uses Spring REST and is deployed on PCF. Now, inside the code, I have to get the number of PCF instances currently running. Can anyone help?
Before I answer this: why do you want to know? It's an anti-pattern for cloud-native apps to know about their peers; they should each be working in total isolation.
You can discover this by looking up the application details by GUID in the CloudController. You can get your current app's GUID from the VCAP_APPLICATION environment variable.
https://apidocs.cloudfoundry.org/245/apps/get_app_summary.html
In order to hit the CloudController, your app will need to know the system domain of your Cloud Foundry (e.g. api.mycf.com) and credentials that allow it to make that request.
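A minimal sketch of that lookup, assuming Jackson for JSON parsing (already on the classpath in a typical Spring app) and that you supply the API endpoint and a valid OAuth token yourself:

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class InstanceCount {

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // VCAP_APPLICATION is injected by Cloud Foundry (only set when running
        // on the platform) and contains the app's GUID.
        JsonNode vcap = mapper.readTree(System.getenv("VCAP_APPLICATION"));
        String appGuid = vcap.get("application_id").asText();

        // The system domain and OAuth token are assumptions here -- supply the
        // CloudController endpoint of your foundation and a token from UAA
        // with permission to read the app.
        String apiBase = "https://api.mycf.com";
        String token = System.getenv("CF_TOKEN");

        HttpRequest request = HttpRequest.newBuilder(
                        URI.create(apiBase + "/v2/apps/" + appGuid + "/summary"))
                .header("Authorization", "Bearer " + token)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The app summary reports both the desired and the running instance counts.
        JsonNode summary = mapper.readTree(response.body());
        System.out.println("desired instances: " + summary.get("instances").asInt());
        System.out.println("running instances: " + summary.get("running_instances").asInt());
    }
}
```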