Logic App dynamic content gives null or "" - JSON

I want to create a small automation between Jira and Azure. To do this, I execute an HTTP trigger from Jira, which sends all request properties to an Azure Logic App. In the "When a HTTP request is received" step in the Logic App I can see the JSON schema correctly, with all the data I need. In the next steps I want, for example, to add a user to an Azure AD group. And this is where the problem starts.
For example, I want to initialize a variable and set it with a value from the JSON. I choose the property from the dynamic content menu, but after the flow executes the value is always null (even though in the first step's raw output I can see the whole schema with data). I tried many things - Parse JSON, Compose, many different conversions - always without any luck: null or "".
Expected behaviour: when I initialize the variable using a property from Dynamic Content, it should hold the value from the input JSON.
Output from Jira
Output with the same JSON sent from Postman
Thanks for any help !
--
Flow example
Flow result

If you send the JSON with the application/json content type, you can just select the property from the dynamic content; if not, you have to parse it into JSON format with the Parse JSON action.
As for the schema, you need to use your JSON data to generate it with Use sample payload to generate schema. Paste the sample JSON payload.
Then you will be able to select the property. However, you cannot pick it from dynamic content; you have to write the expression. The format will be like body('Parse_JSON')['test1'], and if your JSON has array data you need to point to the index, like body('Parse_JSON')['test2'][0]['test3'].
Below is my test, you could give it a try.
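For illustration, here is a minimal sketch of how those expressions line up with a sample payload (the property names test1/test2/test3 are just placeholders, not anything from your Jira payload):

    Sample payload pasted into "Use sample payload to generate schema":

        {
          "test1": "value1",
          "test2": [
            { "test3": "value3" }
          ]
        }

    Expressions to read the parsed values:

        body('Parse_JSON')['test1']               -> "value1"
        body('Parse_JSON')['test2'][0]['test3']   -> "value3" (index 0 of the array)

If the trigger already receives the payload with the application/json content type, the same style of expression on triggerBody() usually works without the extra Parse JSON action.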

Related

Azure Logic Apps - Verify content in JSON file

I have a logic app that runs as follows:
Step 1 = Recurrence (the schedule for the logic app to run)
Step 2 = HTTP (Performing a POST call on a custom API that returns a JSON file)
Step 3 = Create Blob (which takes the document returned in Step 2 and uploads it to blob storage)
What I would like to do now is add an extra step between Step 2 and Step 3. After making the HTTP POST call, I would like to verify the content inside the JSON file that gets returned. If there is an error present in the JSON file, I want the Logic App to stop there.
Is there a particular step that I can use in Azure's Logic App to verify the data and have that step decide whether it should continue on or not?
For this requirement, you can refer to my logic app below:
1. I initialize a variable named "resultFromHTTP" to simulate the JSON from your HTTP request (step 2), and I delete some characters so that "resultFromHTTP" is not valid JSON.
2. Then I initialize another variable and use the expression json(variables('resultFromHTTP')) in its value.
3. Run the logic app; it will fail and show an error message like the screenshot below. If the JSON is valid, it will run successfully (see the sketch below).
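As a sketch, the check in steps 2 and 3 boils down to this (the variable name resultFromHTTP and the broken sample string are just examples):

    Value of resultFromHTTP (deliberately broken - the closing brace is missing):

        {"id": 1, "status": "ok"

    Expression used as the value of the second variable:

        json(variables('resultFromHTTP'))

With the broken string the action fails because the value cannot be parsed as JSON, so the run stops there; with a well-formed string the expression succeeds and the run continues.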
===============================Update==============================
For your latest question: if the result JSON from the HTTP request is not valid JSON and not a very long string, you can do it like this:
The contains(...) expression is contains(variables('resultFromHTTP'), 'Data Not Found'). Then you can do what you want under "If true" or "If false".
If the result JSON from the HTTP request is valid JSON, you can use the "Parse JSON" action to parse it, get the specified field, and then check whether it equals "Data Not Found".
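Putting the update together, a minimal sketch of that condition (the 'Data Not Found' text stands in for whatever error marker your API actually returns):

    Condition expression:

        contains(variables('resultFromHTTP'), 'Data Not Found')

    If true:  stop the run (for example with a Terminate action set to Failed)
    If false: continue to the Create Blob step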

Send custom property with value as JSON array

We want to send some events to Application Insights with data showing which features a user owns and has available for the session. These vary, and the list of items will probably grow/change as we continue deploying updates. Currently we do this by building a list of properties dynamically at start-up, with values of Available/True.
Since AI formats each event's data as JSON, we thought it would be interesting to send custom data through as JSON so it can be processed in a similar fashion. Having tried to send data as JSON, though, we bumped into an issue where AI seems to insert escape characters into the strings.
E.g. if we send a property through with JSON like:
{"Property":[{"Value1"},..]}
It gets saved in AI as:
{\"Property\":[{\"Value1\"},..]} ).
Has anyone successfully sent custom JSON to AI, or is the platform specifically trying to safeguard against such usage? In our case, where we parse the data out in Power BI, being able to send a JSON array would simplify and significantly speed up some queries.
AI treats custom properties as strings; you'd have to stringify any JSON you want to send (and keep it under the length limit for custom property values), and then re-parse it on the other side.
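A rough sketch of that stringify-then-reparse approach, here with the Node.js applicationinsights SDK (the event name, property name and feature list are made up for this example):

    // Assumes the 'applicationinsights' npm package and a valid connection string in the environment.
    const appInsights = require("applicationinsights");
    appInsights.setup(process.env.APPLICATIONINSIGHTS_CONNECTION_STRING).start();
    const client = appInsights.defaultClient;

    // Build the feature list dynamically, then send it as one stringified property.
    const features = [
        { name: "FeatureA", available: true },
        { name: "FeatureB", available: true }
    ];
    client.trackEvent({
        name: "SessionFeatures",                             // example event name
        properties: { features: JSON.stringify(features) }  // stored as a string in customDimensions
    });

On the query side (Power BI in this case) the string then has to be parsed back into JSON before the array can be used.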

Pentaho HTTP Post using JSON

I'm brand new to Pentaho and I'm trying to do the following workflow:
read a bunch of lines out of a DB
do some transformations
POST them to a REST web service in JSON
I've got the first two figured out using an input step and the Json Output step.
However I have two problems doing the final step:
1) I can't get the JSON formatted how I want. It insists on doing {""=[{...}]} when I just want {...}. This isn't a big deal - I can work around this since I have control over the web service and I could relax the input requirements a bit. (Note: this page http://wiki.pentaho.com/display/EAI/JSON+output gives an example for the output I want by setting no. rows in a block=1 and an empty JSON block name, but it doesn't work as advertised.)
2) This is the critical one. I can't get the data to POST as JSON. It posts as key=value, where the key is the name I specify in the HTTP Post field name (on the 'Fields' tab) and the value is the encoded JSON. I just want to post the JSON as the request body. I've tried googling on this but can't find anyone else doing it, leading me to believe that I'm just approaching this wrong. Any pointers in the right direction?
Edit: I'm comfortable scripting (in Javascript or another language) but when I tried to use XmlHttpRequest in a custom javascript snippet I got an error that XmlHttpRequest is not defined.
Thanks!
This was trivial...just needed to use the REST Client (http://wiki.pentaho.com/display/EAI/Rest+Client) instead of the HTTP Post task. Somehow all my googling didn't discover that, so I'll leave this answer here in case someone else has the same problem as me.
You need to parse the JSON using a Modified JavaScript step. E.g. if the output value from the JSON Output step is called result and its contents are {"data":[{...}]}, you should use var plainJSON = JSON.stringify(JSON.parse(result).data[0]) to get the plain JSON.
In the HTTP Post step, the Request entity field should be plainJSON. Also, don't forget to add a header for Content-Type as application/json (you might have to add that as a constant).
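Expanded slightly, the Modified JavaScript step could look like this (result is the field produced by the JSON Output step, data is the block name used there):

    // 'result' comes from the JSON Output step, e.g. {"data":[{...}]}
    var parsed = JSON.parse(result);

    // Unwrap the block, keep only the inner object, and re-serialize it.
    var plainJSON = JSON.stringify(parsed.data[0]);

plainJSON then becomes an output field of the step that the HTTP Post step can reference.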

Parse APEX RESTful WebService reference JSON response stored in CLOB001 column

I am trying to generate a report page in my APEX application, using the data obtained from a REST service response.
I added a new RESTful web service reference and specified JSON output.
Then I've generated a query/report page, but this is what's currently being displayed:
Instead, I want the report to display the contents of the data field of the JSON response (a single row with various columns and values).
Is there any straightforward way to show each response element and field in its own row and column, instead of a single column and row with the whole response, like there is with XML RESTful responses?
Consider whether another output type may be handier. If you're putting out JSON and expect to handle it in PL/SQL, then you do need to realize that PL/SQL does not handle it natively.
Want to handle the REST response as JSON in PL/SQL? Then look into some PL/SQL libraries/projects that may do this:
http://reseau.erasme.org/pl-sql-library-for-JSON?lang=en
http://sourceforge.net/projects/pljson/
Then print out the returned HTML through HTP.P calls.
Though honestly, if you want to stick with PL/SQL you would be much better off using XML as the return type, since you can then use all the XML goodies in the database. Though simply spitting out some HTML structure may not be all that interesting (but I digress).
Or call the REST service through an AJAX call from JavaScript, and handle the object there. After all, JSON is JavaScript Object Notation and should be perfect for JavaScript, right? Then simply inject your HTML somewhere in the document. Ideally you would set up a region as a container for this.
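As a rough sketch of that last option (the URL, the name of the data field, and the region id apex-json-report are all placeholders):

    // Call the REST service from the browser and render the 'data' object into a page region.
    fetch("https://example.com/ords/demo/report")                  // placeholder endpoint
        .then(function (response) { return response.json(); })
        .then(function (payload) {
            var row = payload.data;                                // the 'data' field of the response
            var header = "", cells = "";
            Object.keys(row).forEach(function (key) {
                header += "<th>" + key + "</th>";
                cells  += "<td>" + row[key] + "</td>";
            });
            document.getElementById("apex-json-report").innerHTML =
                "<table><tr>" + header + "</tr><tr>" + cells + "</tr></table>";
        });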

Apigility field accepting an array of embedded objects?

I'd like to create an Apigility REST service which accepts POSTs of, for example, a user object which has a field containing an array of address objects. I can define the field without validators and process the raw JSON in my code, but I'm wondering if there is a better way to do this where the nested objects can also be validated by Apigility?
Apigility has a module called content-validation. It allows you to configure input filters for your services; request data will be passed through the input filter for validation, and an appropriate ApiProblem response is returned when validation fails (see https://apigility.org/documentation/api-primer/content-validation).
That still leaves the onus on you to configure an input filter that will suit your needs.
I would check packagist.org for a JSON Schema validator library which can take a JSON schema and a JSON payload and verify that the payload is well formed according to the schema. Then you can easily implement a custom InputFilter and bind it to your services. This will give you validation that the main object and sub-objects are well formed (i.e. the user has name, email and birth date, and the address field contains objects that all have address/street/zip/etc.).