Apigility field accepting an array of embedded objects?

I'd like to create an Apigility REST service which accepts POSTs of, for example, a user object which has a field containing an array of address objects. I can define the field without validators and process the raw JSON in my code, but I'm wondering if there is a better way to do this where the nested objects can also be validated by Apigility?

Apigility has a module called content-validation -- it allows you to configure input filters for your services, and request data will be passed through the input filter for validation and an appropriate ApiProblem response is returned when validation fails. (see https://apigility.org/documentation/api-primer/content-validation)
That still leaves the onus on you to configure an input filter that will suit your needs.
I would check packagist.org for a JSON Schema validator library that can take a JSON schema and a JSON payload and verify that the payload is well formed according to the schema. You can then implement a custom InputFilter around it and bind it to your services. This will give you validation that the main object and sub-objects are well formed (i.e. the user has name, email and birth date, and the address field contains objects that all have street/zip/etc.).
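For illustration, here is a hand-rolled nested check in Python, sketching what a schema-driven validator would do. This is not Apigility's InputFilter, and the field names (name, email, addresses, street, zip) are made up for the example:

```python
import json

# Required fields for the top-level user object and each embedded address.
# These field names are illustrative; a real JSON Schema library would
# replace this hand-rolled check.
USER_REQUIRED = {"name", "email"}
ADDRESS_REQUIRED = {"street", "zip"}

def validate_user(payload: str) -> list:
    """Return a list of validation errors (empty list means valid)."""
    errors = []
    data = json.loads(payload)
    for field in USER_REQUIRED - data.keys():
        errors.append(f"missing field: {field}")
    for i, addr in enumerate(data.get("addresses", [])):
        for field in ADDRESS_REQUIRED - addr.keys():
            errors.append(f"addresses[{i}]: missing field: {field}")
    return errors

good = '{"name": "Ann", "email": "a@example.com", "addresses": [{"street": "Main St", "zip": "12345"}]}'
bad = '{"name": "Ann", "addresses": [{"street": "Main St"}]}'
```

A real JSON Schema library gives you the same shape of result (a list of violations you can map into an ApiProblem response), without hard-coding the required fields.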

Related

How to GET only custom Field Names from Azure Form Recognizer API?

I trained a Custom Form Recognizer Model. It tests great. But I can't find the API endpoint to call that returns ONLY the key/value pairs for the form I sent the model to analyze.
Example:
I trained a custom model to find First name and Last name only
When I POST a PDF to the endpoint:
https://{my-project}.cognitiveservices.azure.com/formrecognizer/documentModels/{my-model}:analyze?api-version=2022-08-31
Then view JSON from the Operation-Location header:
https://form-rec-ocr-test.cognitiveservices.azure.com/formrecognizer/documentModels/{model-name}/analyzeResults/{GUID}?api-version=2022-08-31
I get aaaaallll the text on the submitted PDF instead of only the First name and Last name fields. I just want the key/value pairs to insert them into a table.
What is the correct API endpoint to call? The API docs here are focused on pre-built models instead of custom models.
For some reason, this was not spelled out in any documentation I came across. Found the answer in this video at 36:30.
The data was in the original JSON object all along, just at line 3300, under the document node.
It would simplify things if the Form Recognizer API could return only the document array via a simple query parameter.
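There is no query parameter for this, but the client-side filtering is short once you know where to look: the labelled pairs live under analyzeResult.documents[].fields. A sketch in Python, where the sample result below is abbreviated and hypothetical (field names, model name, and values are placeholders):

```python
import json

# Hypothetical, heavily trimmed analyze result in the 2022-08-31 response
# shape: the labelled key/value pairs sit under analyzeResult.documents[].fields,
# while "content" holds all the raw OCR text you don't want.
sample = json.loads("""
{
  "analyzeResult": {
    "content": "...all the raw OCR text on the PDF...",
    "documents": [
      {
        "docType": "custom:my-model",
        "fields": {
          "FirstName": {"type": "string", "valueString": "Ada", "confidence": 0.98},
          "LastName":  {"type": "string", "valueString": "Lovelace", "confidence": 0.97}
        }
      }
    ]
  }
}
""")

def extract_fields(result: dict) -> dict:
    """Pull only the labelled key/value pairs, ignoring the full OCR content."""
    pairs = {}
    for doc in result["analyzeResult"].get("documents", []):
        for name, field in doc.get("fields", {}).items():
            pairs[name] = field.get("valueString")
    return pairs
```

The returned dictionary maps field names to values, ready to insert into a table row.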

Force an Azure Logic App to recognise a text string as a token and replace it with the token's value at run time

Details of the Azure Logic App:
Triggered by an incoming webhook
The triggering webhook contains a number of json name/value pairs, such as "variable1":"05-05-2021" and "variable2":"A text string"
The first step of the Logic App is to retrieve a json object from an external source.
The retrieved json object contains string values, some of which include text that match tokens that exist in the Logic App e.g. "triggerBody()?['variable1']" and "triggerBody()?['variable2']"
The Logic App then posts the json object to another external source.
Is there any way to make the logic app recognise these string values as tokens and replace the string value with the value of the token?
For example
If the retrieved json object contains the string "The date today is triggerBody()?['variable1']"
And the triggering Webhook provided the value of variable1 as "05-05-2021"
Is there a method I can use to force the Logic App to recognise the relevant string in the json object as a token and replace it with the token's value, so that the string in the json object becomes "The date today is 05-05-2021"?
I've tried adding steps in the app to convert the json object to a string and then back into an object, in the hope that this would force the Logic App to recognise the text triggerBody()?['variable1'] as a token and replace it with the token's value, but that's not working.
Any guidance would be greatly appreciated!
Unfortunately, this isn't possible. The tokens are replaced just once. The alternatives are:
Use the replace function to do this. This will not work dynamically, but should be fine with a small list of known variables.
You could use the inline code action to replace tokens with values. Note that this requires an integration account.
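The replace-function alternative, sketched outside Logic Apps for clarity: for a small, known set of variables, substitute the literal token text with the value the trigger supplied. The variable names below are the hypothetical ones from the question:

```python
import json

# Known trigger values (from the incoming webhook body) and the retrieved
# JSON, which contains the token text as a literal string.
trigger_body = {"variable1": "05-05-2021", "variable2": "A text string"}
retrieved = '{"message": "The date today is triggerBody()?[\'variable1\']"}'

def substitute_tokens(raw_json: str, variables: dict) -> str:
    """Replace each literal triggerBody()?['name'] occurrence with its value."""
    for name, value in variables.items():
        raw_json = raw_json.replace(f"triggerBody()?['{name}']", value)
    return raw_json
```

In the Logic App itself this would be a chain of replace(...) expressions, one per known variable, applied to the stringified JSON before posting it on.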
Originally answered on Microsoft Q&A

Spring Kafka listener to consume JSON and infer the domain object

I have a topic that will be published with multiple types of JSON messages. I don't have control over the publisher code to add any headers, etc., but I want to leverage @KafkaHandler to handle the different JSON messages inferred to their domain objects. I found some references: https://github.com/spring-projects/spring-kafka/tree/master/samples/sample-02
As I don't have control over the publisher code, I want to handle the multiple JSON types with a custom deserializer. Any references for writing a custom deserializer to handle multiple JSON objects with @KafkaHandler?
You can't use class-level listeners without deserialization; it's a catch-22.
To determine which method to call, we need to know the type (after deserialization); we can't infer the type if it hasn't been deserialized yet.
You could write a wrapper for multiple JSON deserializers, either trying each until one succeeds, or by "peeking" at the JSON to determine heuristically which deserializer to call for this record, e.g.:
// peek at the raw payload to pick a deserializer (method names illustrative)
if (json.contains("\"foo\":")) {
    return deserializeWithFooDeser(data, headers);
}
return deserializeWithBarDeser(data, headers);

Logic App dynamic content give null or ""

I want to create a small automation between Jira and Azure. To do this, I execute an HTTP trigger from Jira, which sends all request properties to an Azure Logic App. In the "When a HTTP request is received" step in the Logic App I can see the JSON schema properly, with all the data I need. In the next steps I want, for example, to add a user to an Azure AD group. And the problem starts here.
For example, I want to initialize a variable and set it with a value from the JSON. I choose properties from the dynamic content menu, but after the flow executes the value is always null (even though in the first step's raw output I see the whole schema with data). I tried many things (parse, compose, many different conversions), always without any luck: null or "".
Expected behaviour: when I initialize a variable using properties from Dynamic Content, I want it to hold the value from the input JSON.
Output from Jira
Output with the same JSON sent from Postman
Thanks for any help!
Flow example
Flow result
If you send the JSON with the application/json content type, you can just select the property with the dynamic content; if not, you have to parse it into JSON with the Parse JSON action.
As for the schema, use your JSON data to generate it with Use sample payload to generate schema: paste in the sample JSON payload.
Then you will be able to select the property. However, you cannot select these values as dynamic content; you have to write the expression. The format will be like body('Parse_JSON')['test1'], and if your JSON has array data you need to give the index, like body('Parse_JSON')['test2'][0]['test3'].
Below is my test, you could have a try.
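To see what those expressions select, here is the same navigation over a hypothetical sample payload in Python (test1/test2/test3 are the placeholder property names from the answer):

```python
import json

# The Logic Apps expressions map directly onto plain JSON navigation:
#   body('Parse_JSON')['test1']             -> top-level property
#   body('Parse_JSON')['test2'][0]['test3'] -> property of the first array element
body = json.loads('{"test1": "hello", "test2": [{"test3": "nested"}]}')

top = body["test1"]              # like body('Parse_JSON')['test1']
nested = body["test2"][0]["test3"]  # like body('Parse_JSON')['test2'][0]['test3']
```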

Send custom property with value as JSON array

We want to send some events to Application Insights with data showing which features a user owns, and are available for the session. These are variable, and the list of items will probably grow/change as we continue deploying updates. Currently we do this by building a list of properties dynamically at start-up, with values of Available/True.
Since AI formats each event's data as JSON, we thought it would be interesting to send custom data through as JSON so it can be processed in a similar fashion. Having tried to send data as JSON, though, we ran into an issue where AI seems to add escape characters to the strings:
Eg. if we send a property through with JSON like:
{"Property":[{"Value1"},..]}
It gets saved in AI as:
{\"Property\":[{\"Value1\"},..]}
Has anyone successfully sent custom JSON to AI, or is the platform specifically trying to safeguard against such usage? In our case, where we parse the data out in Power BI, it would simplify and speed up a lot some queries by being able to send a JSON array.
AI treats custom properties as strings, so you'd have to stringify any JSON you want to send (and keep it under the length limit for custom property sizes), and then re-parse it on the other side.
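In other words, serialize before sending and parse after reading. A minimal round-trip sketch in Python; the length limit shown is an assumed placeholder, so check the current Application Insights limits for the real value:

```python
import json

# Stringify the custom data before attaching it as a custom property,
# then re-parse it on the query side (e.g. in Power BI / KQL).
MAX_PROPERTY_LENGTH = 8192  # assumed cap for illustration, not the documented limit

features = {"FeatureA": True, "FeatureB": True, "FeatureC": False}

prop_value = json.dumps(features)   # the string value sent to AI
assert len(prop_value) <= MAX_PROPERTY_LENGTH

restored = json.loads(prop_value)   # what the consumer re-parses
```

The round trip is lossless as long as the serialized string stays under the property-size limit; anything longer would need to be split or trimmed before sending.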