ADF Copy Activity to JSON Sink Expanding Decimal Precision

I'm pulling data from a REST endpoint. The value of the JSON property that I get using Postman and via C# code is
"value": 9.9516
But after a Copy activity from the REST API source to a JSON sink (I also tried an inline sink), the decimal precision expands:
"value": 9.9515999999999991
Update:

Related

Azure Data Factory Copy Data from Rest API to CSV - Json object value isn't output

The values for data.CustomerNumber do not get written to the file; the other elements do.
The CustomerNumber is Pascal case because it is a custom value in the system I am pulling the data out of.
I have tried the map complex values option and the advanced editor, and I have deleted the mapping and started over. I originally was trying to save this to an Azure SQL table but gave up and just tried to write to a file. I have been able to do this type of mapping in other pipelines, so I am really at a loss for why it doesn't work in this case.
The actual JSON content for the mapping

Data Factory: API call returning JSON format won't go to JSON file

In Data Factory I'm using a Copy Data activity where the source is a REST API that returns data in JSON format, with a sink that's a JSON file in ADLS storage.
The problem I'm having is that only the first record in the "Domestic Title Nos" array appears in the JSON file. The remaining records, however many there are, do not appear.
Also, when I "Preview" my data in the Source page, it shows all the nested data in the array.
API Call Returns this JSON data
Mapping in my Copy Data activity
File contents after sink to JSON file type
As shown in the screenshot below, you have used items as the collection reference.
You need to use "Domestic Title Nos" as the collection reference.
Note: For records where the array marked as the collection reference is empty and the check box is selected, the entire record is skipped.
Refer to Schema and data type mapping in copy activity.
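Conceptually, the collection reference tells the copy activity which array to unroll: one output row is written per element of that array, with the parent fields repeated, and (with the box checked) records whose referenced array is empty are skipped. A rough Python sketch of that behavior, where every name other than "Domestic Title Nos" is made up for illustration:

# Rough illustration of what the collection reference does: unroll the chosen
# array, repeating the parent fields once per element. Only "Domestic Title Nos"
# comes from the question; the other field names are hypothetical.
record = {
    "TitleNumber": "ABC123",
    "Domestic Title Nos": [{"No": "DT1"}, {"No": "DT2"}],
}

def unroll(rec, collection_ref):
    items = rec.get(collection_ref) or []
    # An empty array yields no rows here, matching the note about skipped records.
    for item in items:
        yield {"TitleNumber": rec["TitleNumber"], "DomesticTitleNo": item["No"]}

for row in unroll(record, "Domestic Title Nos"):
    print(row)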
In the end it worked out to be easier to download the entire JSON document using this mapping and then use "Flatten" in a Data Flow to extract the parts that I needed.

Logic App dynamic content gives null or ""

I want to create a small automation between Jira and Azure. To do this, I execute an HTTP trigger from Jira, which sends all request properties to an Azure Logic App. In the "When a HTTP request is received" step in the Logic App I can see the JSON schema properly, with all the data I need. In the next steps I want, for example, to add a user to an Azure AD group. And the problem starts here.
For example, I want to initialize a variable and set it with a value from the JSON. I choose the property from the dynamic content menu, but after the flow executes the value is always null (although in the first step's raw output I see the whole schema with data). I have tried many things - Parse JSON, Compose, many different conversions - always without any luck: null or "".
Expected value: when I initialize a variable using a property from dynamic content, I want to get the value from the input JSON.
Output from Jira
Output with the same JSON sent from Postman
Thanks for any help!
--
Flow example
Flow result
If you send the JSON with the application/json content type, you can just select the property with dynamic content; if not, you have to parse it to JSON format with the Parse JSON action.
As for the schema, you need to use your JSON data to generate it with "Use sample payload to generate schema". Paste the sample JSON payload.
Then you will be able to select the property. However, if you cannot get it from dynamic content, you have to write the expression. The format will be like body('Parse_JSON')['test1'], and if your JSON has array data, you need to point at the index, like body('Parse_JSON')['test2'][0]['test3'].
Below is my test; you could give it a try.
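As a rough analogy only, the same nested access in Python over a parsed body shows what those expressions resolve to; the test1/test2/test3 names come from the example expressions above, and the payload itself is made up:

import json

# Hypothetical payload shaped to match the example expressions above.
raw_body = '{"test1": "value1", "test2": [{"test3": "value3"}]}'
body = json.loads(raw_body)           # roughly what the Parse JSON action produces

print(body['test1'])                  # ~ body('Parse_JSON')['test1']
print(body['test2'][0]['test3'])      # ~ body('Parse_JSON')['test2'][0]['test3']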

JSON Syntax Data Formatting

I am wondering if the attached image that represents how JSON data is returned from a stock market API is improperly formatted JSON data.
Here is a link to the API call that the image pictorially represents: https://api.iextrading.com/1.0/stock/market/batch?symbols=A,AAPL&types=chart&range=1d
The API call essentially returns one day of market data for both stock symbols A and AAPL (as an example). However, I thought JSON is typically in [{}, {}, {}] or {[{}, {}]} format, but this API batch request is {{[{}, {}]}}.
It would be much appreciated if someone could provide some insight.
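An object whose keys are the symbols, each holding an object with a "chart" array, is still valid JSON; it just isn't a top-level array. A small Python sketch of walking that shape, where the payload below is a trimmed, made-up stand-in for the real response:

import json

# Trimmed, made-up stand-in for the batch response: an object keyed by symbol,
# each value an object whose "chart" property is an array of data points.
raw = """
{
  "A":    {"chart": [{"minute": "09:30", "average": 68.12}]},
  "AAPL": {"chart": [{"minute": "09:30", "average": 171.03}]}
}
"""

data = json.loads(raw)
for symbol, payload in data.items():
    for point in payload["chart"]:
        print(symbol, point["minute"], point["average"])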

Store variable against each JSON object in a list using DataWeave

I am mapping a list of JSON objects from a database to another list of JSON objects using DataWeave and Transform Message in Anypoint Studio 6.1.1 and Mule 3.8.1.
There are a couple of fields in the input list of JSON objects that do not have a mapping in the new output, and I want to use their values later in the processing but do not want them to be output at the end.
Is there a way I can assign a variable (which will be different for each object) to each JSON object in the output list and then reference it later in the flow?
I just want to see if I can reduce processing time by not having to process one object at a time through DataWeave.
Thanks
You can create another output besides just the Payload in Anypoint Studio. See the following: