Disable generation of the generatedSample attribute in Response Representations of API configuration - azure-api-management

Is there a mechanism within Azure API Management to disable the generation of the sample data that is injected into the API specification by the APIM processes?
I have noticed that for APIs with large or complex models and a large number of operations, the generatedSample attribute creates an extremely large overhead in the API configuration. For example, we have an API whose Swagger file is ~260 KB on original import, but by the time it ends up in the APIM repository the configuration file has expanded to over 13 MB. This sample data doesn't appear to be used in the admin or developer portal, so I'm not sure what value it adds to the primary configuration file. I have attempted to clear these values via the repository, but they appear to be recreated after the repository update.

The only way to do so is to provide your own samples.
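As a rough sketch of that approach (assuming a Swagger 2.0 definition and hypothetical file names), you could add your own response examples to the definition before importing it into APIM, so APIM has nothing left to generate:

    import json

    # Load the original Swagger 2.0 definition (hypothetical file name).
    with open("api-original.swagger.json") as f:
        spec = json.load(f)

    # Attach a small, hand-written example to every 200 response that has a
    # schema but no example yet; APIM can then use these instead of
    # generating its own samples.
    PLACEHOLDER_EXAMPLE = {"id": 1, "name": "sample"}

    for path, operations in spec.get("paths", {}).items():
        for verb, operation in operations.items():
            responses = operation.get("responses", {}) if isinstance(operation, dict) else {}
            ok = responses.get("200")
            if ok and "schema" in ok and "examples" not in ok:
                ok["examples"] = {"application/json": PLACEHOLDER_EXAMPLE}

    with open("api-with-examples.swagger.json", "w") as f:
        json.dump(spec, f, indent=2)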

Related

Is there a simple way to host a JSON document you can read and update in Google Cloud Platform?

What I'm trying to do is host a JSON document that will then, essentially, serve as a hosted version of json-server. I'm aware I can do something similar with My JSON Server, but I plan to move my entire architecture to GCP so want to get more familiar with it.
At first I looked into the Storage JSON API, but it seems like that's just for getting data about buckets rather than the items in the buckets themselves. I created a bucket called test-json-api and added a test-data.json, but there's seemingly no way to access the data in the JSON file via this API.
I'm trying to keep it as simple as possible for testing purposes. In time, I'll probably use a Firestore allocation, but for now I'd like to avoid all that complexity, and instead have a simple GET and a PUT/PATCH to a JSON file.
The Storage JSON API you are talking about is only for getting and updating the metadata, not for getting and updating the data inside the object. Objects inside a Google Cloud Storage bucket are immutable, so one way to update them is to download the object data in your code, update it, and then upload it again to the Google Cloud Storage bucket.
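A minimal sketch of that download/update/upload cycle with the google-cloud-storage client library (the bucket and object names are simply the ones mentioned in the question):

    import json
    from google.cloud import storage  # pip install google-cloud-storage

    client = storage.Client()
    bucket = client.bucket("test-json-api")
    blob = bucket.blob("test-data.json")

    # Download the current contents and parse the JSON.
    data = json.loads(blob.download_as_text())

    # Modify the document in memory.
    data["updated"] = True

    # Upload the new contents, replacing the old object.
    blob.upload_from_string(json.dumps(data, indent=2), content_type="application/json")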
As you want to deal with JSON files, you may explore using Cloud Datastore or Cloud Firestore. Also, if you wish to use Firebase, then you may explore the Firebase Realtime Database.
A very quick and dirty way to make it easy to read some of the information in the JSON doc is to use that information in the blob name. For example, the key information in
doc = {'id':3, 'name':'my name', ... }
could be stored in an object called "doc_3_my name", so that it can be read while browsing the bucket. You can then download the right doc if you need to see all the original data. An object name can be up to 1024 bytes of UTF-8 (with some exclusions), which is normally sufficient for surfacing basic information.
Note that you should never store PII like this.
https://cloud.google.com/storage/docs/objects
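A minimal sketch of that naming idea (the field choice and name prefix are just illustrative):

    import json
    from google.cloud import storage

    doc = {"id": 3, "name": "my name", "payload": {"more": "data"}}

    client = storage.Client()
    bucket = client.bucket("test-json-api")

    # Surface the key fields in the object name so they can be read while
    # browsing the bucket; the full document is still stored as the contents.
    blob_name = f"doc_{doc['id']}_{doc['name']}"
    bucket.blob(blob_name).upload_from_string(
        json.dumps(doc), content_type="application/json"
    )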

Retrieve Github Action metadata of GITHUB_TOKEN through API

I am trying to make sure we have a secure way to integrate our cloud and GitHub Actions.
We have multiple accounts in our cloud to reduce the blast radius if there is an issue. For this we need to make sure we can assume the correct role to deploy to the correct sub-account. We were planning to build a discovery capability based on extracting the metadata of the GITHUB_TOKEN generated at runtime.
Is there a way to obtain the repo name or action that generated the GITHUB_TOKEN?

Trigger for status update to send a patch method in Azure DevOps

We have been working on the integration between Azure DevOps Services and ServiceNow. Our goal is to send Change Requests from ServiceNow to Azure DevOps, where they would become Features or User Stories. Whenever there is an update on Azure DevOps, that update should be sent to ServiceNow, and vice versa.
The idea is to work with REST API.
From our investigation, we have found that it is possible to send updates to other applications through Web Hooks. We are still not sure if this will suit our needs and whether we are able to work with this. The problem is that the webhooks only support the HTTP POST method, while ServiceNow requires PATCH to update on its side. Is this correct, and is there any way of creating webhooks with the PATCH method?
Another way we could integrate is to create some software that sends the needed response. However, we cannot seem to find a way to automate this response. As I understand it, it will generate the response only when the script runs, not when a work item is updated. Is there any way to trigger the sending of a JSON file with all the information within the work item whenever the work item state is updated?
As a workaround, you can try to create a custom service hook. Here is the document you can refer to.
The Marketplace provides an extension (Azure DevOps Service Hooks DSL). This extension framework is designed to ease the development of your own REST web hook site to do this type of integration. It does this by providing an MVC WebAPI endpoint and a collection of helper methods, implemented as an extensible Domain Specific Language (DSL), for common processing steps and API operations, such as calling back to the TFS/VSTS server that called the endpoint or accessing SMTP services.
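As a rough illustration of the relay idea (not the DSL extension itself): a small endpoint that receives the Azure DevOps service hook POST and forwards the update to ServiceNow as the PATCH it expects. The ServiceNow instance URL, credentials, table, and field mapping below are assumptions you would replace with your own:

    import requests
    from flask import Flask, request  # pip install flask requests

    app = Flask(__name__)

    SERVICENOW_BASE = "https://example.service-now.com"  # assumed instance URL
    SERVICENOW_AUTH = ("integration.user", "password")   # assumed credentials

    @app.route("/workitem-updated", methods=["POST"])
    def workitem_updated():
        # Azure DevOps service hooks deliver the event as a JSON POST body.
        event = request.get_json()
        resource = event.get("resource", {})

        # Field names follow the workitem.updated event shape; verify them
        # against the payloads your organization actually receives.
        new_state = resource.get("fields", {}).get("System.State", {}).get("newValue")

        # The mapping from work item to ServiceNow record is assumed here
        # (e.g. a custom field on the work item storing the CR sys_id).
        sys_id = resource.get("revision", {}).get("fields", {}).get("Custom.ServiceNowSysId")

        if new_state and sys_id:
            # Relay the update to ServiceNow as a PATCH request.
            requests.patch(
                f"{SERVICENOW_BASE}/api/now/table/change_request/{sys_id}",
                json={"state": new_state},
                auth=SERVICENOW_AUTH,
                timeout=10,
            )
        return "", 204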
Is there any way to trigger the sending of a JSON file with all information within the work item whenever the work item state is updated?
I am not sure if it is possible to trigger that.
But there is a ServiceNow DevOps extension for the integration between Azure DevOps and ServiceNow. You may use that.

Autodesk Design Automation Activities: How to distinguish between input and output parameters

When creating a new activity, input and output parameters are not separated as they were in v2. How does the Design Automation service distinguish between them?
Is it via the verb? I hope not, because sometimes there are poorly designed REST APIs or GraphQL endpoints that require a POST request for receiving data.
You are correct, it is via the verb. Based on the type of verb, the Design Automation service decides whether it needs to download the resource (get) or upload the resource (put | post). V3 also supports other verbs such as read and patch.
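As a sketch of how that looks in an Activity definition (the activity name, engine alias, command line, and parameter names below are placeholders, not a complete working activity):

    # Sketch of a Design Automation v3 Activity payload; parameter direction
    # is expressed through the "verb" on each parameter rather than through
    # separate input/output sections. All names and values are illustrative.
    activity = {
        "id": "MyActivity",
        "engine": "Autodesk.AutoCAD+24",  # placeholder engine alias
        "commandLine": [
            "$(engine.path)\\accoreconsole.exe /i $(args[InputDwg].path)"
        ],
        "parameters": {
            "InputDwg": {
                "verb": "get",            # the service downloads this resource
                "description": "Input drawing",
                "localName": "input.dwg",
            },
            "Result": {
                "verb": "put",            # the service uploads this resource
                "description": "Resulting file",
                "localName": "result.dwg",
            },
        },
    }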

Implementing IoT PowerBI table schema

I'm currently implementing an IoT solution that has a bunch of sensors sending information in JSON format through a gateway.
I was reading about doing this on Azure but couldn't quite figure out how the JSON schema and Event Hubs work together to display the info on Power BI.
Can I create a schema and upload it to Power BI, then connect it to my device?
There are multiple sides to this. To start with, IoT ingestion in Azure is done through Event Hubs, as you've mentioned. If your gateway is able to make a RESTful call to the Event Hubs entry point, Event Hubs will receive this data and store it temporarily for the specified retention period. Stream Analytics will then consume the data from Event Hubs and enable you to do further processing and divert the data to different outputs. In your case, you can set one of the outputs to be a Power BI dashboard, which you authorize with an organizational account (more on that later), and the output will automatically be tied to Power BI. The data schema part is interesting: the JSON itself defines the data table schema to be used on the Power BI side, and it propagates from Event Hubs to Stream Analytics to Power BI with the first JSON package sent. Once the schema is there it is fixed, and the rest of the data being streamed in should be in the same format.
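For example, a gateway-side sketch of sending such a JSON message with the azure-eventhub Python SDK (the connection string, hub name, and fields are placeholders); the keys of the first message sent are what define the table schema on the Power BI side:

    import json
    from azure.eventhub import EventHubProducerClient, EventData  # pip install azure-eventhub

    producer = EventHubProducerClient.from_connection_string(
        conn_str="Endpoint=sb://<namespace>.servicebus.windows.net/;...",  # placeholder
        eventhub_name="sensor-readings",                                    # placeholder
    )

    # Keep the keys of this payload consistent across all messages, since the
    # first message fixes the schema that flows through to Power BI.
    reading = {"deviceId": "sensor-01", "temperature": 21.4, "humidity": 0.52}

    batch = producer.create_batch()
    batch.add(EventData(json.dumps(reading)))
    producer.send_batch(batch)
    producer.close()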
If you don't have an organizational account at hand to use with PowerBI, you can register your domain under Azure Active Directory and use that account since it is considered within your org.
There may be a way of altering the schema afterwards using the Power BI REST API. Kindly find the links below. I haven't tried it myself, though.
https://msdn.microsoft.com/en-us/library/mt203557.aspx
Stream analytics with powerbi
Hope this helps, let me know if you need further info.
One way to achieve this is to send your data to Azure Event Hubs, read it, and send it to Power BI with Stream Analytics. Listing all the steps here would be too long. I suggest that you take a look at a series of blog posts I wrote describing how I built a demo similar to what you are trying to achieve. That should give you enough info to get you started.
http://guyb.ca/IoTAzureDemo