Azure APIM - import a single controller from an API, or split controllers from a single API into different APIM APIs

I have backend APIs with multiple controllers which split up the operations: some are for 3rd parties, others are for frontend proxies / other micro-services, and then there is a support/admin controller. I don't want all of these in the same APIM API / Product.
Currently I either have to manually edit the OpenAPI definition of the API before importing it into APIM, or I have to manually create the API in APIM and then use the dev tools extractor to export the templates for other environments.
My stack is dotnet 5.0 / 6.0 (ASP.NET) with NSwag to document the API. We're using the Azure APIM development toolkit to automate the bits we can.
I'd like to have an automated pipeline where the backend API is built with separate OpenAPI definitions (or some way to filter controllers during import), the output from that goes into a pipeline for APIM which updates the dev environment APIM, and the result can then be auto-deployed into other environments when needed.
Does anyone else do this type of thing, or do you put the whole API into a single APIM API/Product? Or do you have completely separate backend APIs that make up a microservice? Or something else?

I want to share what we do at work.
The key idea is to have a custom OpenAPI processor (e.g. a command-line tool, so that you can call it in a pipeline) combined with OpenAPI extensions. The processor is basically a YAML or JSON parser, depending on your API spec format.
You create a master OpenAPI spec file that contains all the operations in your controllers.
Create an extension, say x-api-product: "Product A", and add it to the relevant API operations.
Your custom processor takes in the master spec file (e.g. YAML), groups the operations by x-api-product, and outputs a set of new OpenAPI specs (see the sketch below).
Import the output files into APIM.
The thing to think about is how you manage the master API spec file. We follow the API Spec First approach, where we manually create and modify the YAML file, and use OpenAPI Generator to code gen the controllers.
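For illustration, here is a minimal sketch of such a processor in Python using PyYAML. It assumes the grouping extension is named x-api-product as above; the file names and output layout are just placeholders, not anything prescribed by APIM.

    # split_openapi.py - minimal sketch: split a master OpenAPI (YAML) spec into
    # one spec per "x-api-product" value, so each output can be imported as a
    # separate APIM API. Requires PyYAML (pip install pyyaml).
    import copy
    import sys
    import yaml

    HTTP_VERBS = ("get", "post", "put", "patch", "delete", "options", "head")

    def split_by_product(master_path):
        with open(master_path) as f:
            master = yaml.safe_load(f)

        outputs = {}  # product name -> spec containing only that product's operations
        for path, path_item in master.get("paths", {}).items():
            for verb, operation in path_item.items():
                if verb.lower() not in HTTP_VERBS:
                    continue  # skip path-level keys such as "parameters"
                product = operation.get("x-api-product", "default")
                if product not in outputs:
                    spec = copy.deepcopy(master)  # keeps info, servers, components
                    spec["paths"] = {}
                    outputs[product] = spec
                outputs[product]["paths"].setdefault(path, {})[verb] = operation

        for product, spec in outputs.items():
            out_file = f"{product.lower().replace(' ', '-')}.openapi.yaml"
            with open(out_file, "w") as f:
                yaml.safe_dump(spec, f, sort_keys=False)
            print(f"wrote {out_file}")

    if __name__ == "__main__":
        split_by_product(sys.argv[1])  # e.g. python split_openapi.py master.yaml

Each output file can then be imported into APIM as its own API from your pipeline (pruning unused schemas from the copied components section is left out here for brevity).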
Hope this gives you some ideas.

Related

APIM ARM-based customization of the OpenAPI 'front end' with a REST-to-SOAP deployment

I have a WCF backend service which is a SOAP/XML service and I need to expose it to my consumers.
Importing the WSDL is not a problem, but I don't like the naming/URL of the operations, which is based on the SoapAction in the WSDL.
Manually I can change the display name, URL and HTTP verb to make it more 'RESTful' on the outside, but is there a way to automate this?
I'd like to add this to my ARM template somehow.
You can't change how WSDL import works, but the import is only a batch operation of sorts. If you generate an ARM template after you have imported the WSDL, you will be able to use it to recreate your API without the source WSDL, and you will be able to change that template however you see fit.
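To illustrate the "change that template however you see fit" step in an automatable way, here is a rough Python sketch that rewrites operation display names, verbs and URL templates in a generated template. It assumes the operations appear as flat resources of type Microsoft.ApiManagement/service/apis/operations (depending on how the template was generated they may instead be nested child resources, which this sketch does not handle), and the rename table is purely hypothetical.

    # rename_operations.py - rough sketch: post-process a generated APIM ARM
    # template and give WSDL-derived operations friendlier REST-style names.
    # The RENAMES table below is purely hypothetical; adjust to your SoapActions.
    import json
    import sys

    RENAMES = {
        # original operation display name (from the WSDL import) -> new settings
        "GetCustomerDetails": {"displayName": "Get customer", "method": "GET",
                               "urlTemplate": "/customers/{id}"},
    }

    def rename_operations(template_path, out_path):
        with open(template_path) as f:
            template = json.load(f)

        for resource in template.get("resources", []):
            if resource.get("type") != "Microsoft.ApiManagement/service/apis/operations":
                continue  # only rewrite operation resources
            props = resource.get("properties", {})
            new = RENAMES.get(props.get("displayName"))
            if new:
                props.update(new)  # overwrite displayName, method, urlTemplate

        with open(out_path, "w") as f:
            json.dump(template, f, indent=2)

    if __name__ == "__main__":
        rename_operations(sys.argv[1], sys.argv[2])

Running a step like this in your release pipeline keeps the renaming repeatable instead of a one-off manual edit in the portal.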

Generate JSON test data from RAML

Is there a tool that allows me to create random JSON test data starting from a RAML file?
Example: starting from a RAML file describing an API, generate random static JSON responses to register as WireMock mock server mappings, so that I can run automated tests against the API.
I'm working with Java but tools/libraries in other languages would fit too.
Thanks
Maybe one of these suits your needs:
RAML Mock Server: library for validating MockServer calls against a RAML API specification
RAML Tester: test whether a request/response matches a given RAML definition
soapui-raml-plugin: allows you to import RAML files into SoapUI for testing your REST APIs, and to generate a REST mock service for the RAML file being imported
Otherwise check: http://raml.org/projects/projects and filter by type 'test' and language.
Perhaps https://github.com/wavesoft/raml-lipsum might be useful.
Its description seems to indicate that it can do what you ask. The other answers don't seem to generate sample requests for a RAML service, which is, I think, what the question asks for (and what I was asked about at work today).
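None of these tools will post the mappings to WireMock for you, so for the "register in WireMock" half, here is a small Python sketch that registers a static JSON stub via WireMock's admin API. The URL path and payload are hand-written placeholders rather than anything generated from a RAML file, and the WireMock instance is assumed to be running locally on port 8080.

    # register_stub.py - small sketch: register a static JSON response as a
    # WireMock stub mapping via the admin API. The path and payload below are
    # placeholders; in a real setup they would come from your RAML-based generator.
    import json
    import urllib.request

    WIREMOCK_ADMIN = "http://localhost:8080/__admin/mappings"

    def register_stub(url_path, json_payload):
        mapping = {
            "request": {"method": "GET", "urlPath": url_path},
            "response": {
                "status": 200,
                "jsonBody": json_payload,
                "headers": {"Content-Type": "application/json"},
            },
        }
        req = urllib.request.Request(
            WIREMOCK_ADMIN,
            data=json.dumps(mapping).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            print(resp.status, resp.reason)

    if __name__ == "__main__":
        register_stub("/api/users/42", {"id": 42, "name": "Jane Doe"})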

Call a REST service that returns JSON in Mule ESB?

How do I call a REST service that returns JSON in an APIkit-based message flow? I want to prepare the request for the REST service and extract the JSON message in a message flow.
Can anyone help me do this?
Thanks
The recommended way to invoke/consume REST services in Mule is to use the HTTP Request connector.
Do let us know if there is a specific issue you are facing.
The request connector is especially handy when consuming a RESTful API that is described in a RAML file. If you reference the API's RAML file in the connector's configuration, it will proactively offer you the set of available resources and operations contained in the RAML file, as well as enforce the policies described in the file. It will also expose the API metadata to Studio, which can then be used by other elements such as DataWeave to autocomplete fields and make configuration much easier.
Use the REST URI path to invoke the service; you can also get a JSON response as output, depending on the type of service you invoke.

How to use a JSON API without documentation

This might be a weird question but I am open to all suggestions.
The background is that I want to use a script to automatically deploy/remove Docker containers on Jelastic, but unfortunately this part is not well documented in the official Jelastic API documentation. Jelastic provided me with a piece of sample code demonstrating how to use bash to create a new environment with a new Docker container, but it is not enough; I still don't know how to create/remove a Docker container just by looking at the sample code.
Since Jelastic uses a standard JSON API, I am wondering whether there is any tool which can automatically retrieve/detect the parameters I can use with the Jelastic JSON API.
If you were me, how would you get past this when there is no documentation to refer to?
I am keen to use Jelastic, but this issue has stopped me from onboarding. Many thanks.
J.
All the parameters that can be used with the Jelastic JSON API are specified on the http://docs.jelastic.com/api/ page.
To use the JSON API without documentation I suggest you check out the Postman API tool (https://www.getpostman.com/). This application allows you to see all the sent/received data and to pass JSON values without any documentation or any additional steps.
Simplest scenario for beginners: go to the API docs, section Users > Authorization; using the Signin method you obtain the session value, which is necessary for almost all further actions. Then you need to obtain information about the environment (section Environment > Environment): first execute the GetEnvs method, then, using the application identifier of the environment obtained from the previous command, execute the GetEnvInfo method. As a result of this scenario you will get all the parameters and values that can be used with the Jelastic JSON API for a certain type of environment.
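To make that walkthrough concrete, here is a rough Python sketch of the Signin -> GetEnvs -> GetEnvInfo sequence. The base URL, endpoint paths, parameter names and response field names are assumptions taken from the docs linked above, so verify them against http://docs.jelastic.com/api/ and your hoster's platform domain before relying on this.

    # jelastic_walkthrough.py - rough sketch of the Signin -> GetEnvs -> GetEnvInfo
    # flow described above. The base URL, endpoint paths and response field names
    # are assumptions; check them against http://docs.jelastic.com/api/.
    import json
    import urllib.parse
    import urllib.request

    BASE = "https://app.jelastic.example.com/1.0"  # placeholder platform domain

    def call(path, **params):
        url = f"{BASE}/{path}?{urllib.parse.urlencode(params)}"
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)

    # 1. Signin returns the session value used by almost all further calls.
    auth = call("users/authentication/rest/signin",
                login="user@example.com", password="secret")
    session = auth["session"]

    # 2. GetEnvs lists the environments available to the account.
    envs = call("environment/control/rest/getenvs", session=session)
    env_name = envs["infos"][0]["env"]["envName"]  # field names may differ

    # 3. GetEnvInfo returns the full parameter set for one environment.
    info = call("environment/control/rest/getenvinfo",
                envName=env_name, session=session)
    print(json.dumps(info, indent=2))

Inspecting the GetEnvInfo output (in Postman or from a script like this) is the quickest way to discover the parameters available for your particular environment type.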

Creating WSO2 BPEL project for JSON webservices

I have the WSO2 Developer Studio Eclipse plugin downloaded, and I was looking at this tutorial: http://wso2.com/library/tutorials/2010/07/eclipse-bpel-designer-wso2bps-tutorial/. But it seems to be about using SOAP, whereas my webservices, which are written in PHP (on live servers), are REST using JSON. These webservices accept data via HTTP GET and respond with JSON data.
So how can I implement a BPEL project making use of these JSON webservices? Any ideas or suggestions? I am completely new to this. Thank you.
EDIT
When I created the BPEL process, I used HTTP from the dropdown instead of SOAP.
WSO2 uses a custom variant of the Apache ODE engine for executing BPEL processes. ODE provides extensions for REST, which you can try out. However, I am not sure if these extensions also support JSON or if they just allow XML data. Also have a look at this answer.