Data Factory - Retrieve value from a field with a dash "-" in its name from a JSON file

In my pipeline I reach a 3rd-party database through a REST API GET request. As output I receive a bunch of JSON files. The number of JSON files I have to download (which is also the number of iterations I will need) is in one of the fields of the JSON file. The problem is that the field's name is 'page-count', which contains a "-".
@activity('Lookup1').output.firstRow.meta.page.page-count
Data Factory interprets the dash in the field's name as a minus sign, so I get an error instead of the value of that field.
{"code":"BadRequest","message":"ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression 'activity('Lookup1').output.firstRow.meta.page.['page-count']'","target":"pipeline/Product_pull/runid/f615-4aa0-8fcb-5c0a144","details":null,"error":null}
This is how the structure of the JSON file looks:
"firstRow": {
"meta": {
"page": {
"number": 1,
"size": 1,
"page-count": 7300,
"record-count": 7300
},
"non-compliant-record-count": 7267
}
},
"effectiveIntegrationRuntime": "intergrationRuntimeTest1",
"billingReference": {
"activityType": "PipelineActivity",
"billableDuration": [
{
"meterType": "SelfhostedIR",
"duration": 0.016666666666666666,
"unit": "Hours"
}
]
},
"durationInQueue": {
"integrationRuntimeQueue": 1
}
}
How can I solve this problem?

The syntax below works when retrieving the value of a JSON element with a hyphen, which is otherwise treated as a minus sign by the parser. It does not seem to be documented by Microsoft; however, I managed to get this to work through trial and error on a project of mine.
@activity('Lookup1').output.firstRow.meta.page['page-count']

This worked for us too. We had the same issue where we couldn't reference an output field that contained a dash (-). We referenced this post, used the square brackets and single quotes, and it worked!
Example below.
@activity('get_token').output.ADFWebActivityResponseHeaders['Set-Cookie']
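Outside of Data Factory, the same idea can be illustrated in plain Python: dotted attribute access cannot name a hyphenated key, but bracket notation with a quoted key can. This is only a sketch of the principle, using the structure from the question:

```python
import json

# Hypothetical response body mirroring the structure shown in the question.
response = json.loads("""
{
  "firstRow": {
    "meta": {
      "page": { "number": 1, "size": 1, "page-count": 7300, "record-count": 7300 }
    }
  }
}
""")

# A dotted path cannot express a key containing "-", but bracket notation
# with the quoted key works, just like in the ADF expression above.
page_count = response["firstRow"]["meta"]["page"]["page-count"]
print(page_count)  # 7300
```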


JSON path evaluation

Given this node in a JSON response:
{
    "name": "RFM912Feilkode",
    "valueCodeableConcept": {
        "coding": [
            {
                "system": "http://xxx/error-code",
                "code": "0"
            }
        ],
        "text": "OK"
    }
I want to verify that the text "OK" is present, using Gatling with Scala syntax.
Something like this (pseudo-code-ish):
.check(jsonPath("$..valueCodeableConcept.text").is("OK"))
but this does not work. Any tips on how to "hit" the OK value and check if it exists?
Your JSON isn't valid; it is missing a curly bracket.
I tried this and all works fine:
.check(jsonPath("$.valueCodeableConcept.text").is("OK"))
If you want to extract some JSON based on other elements in the JSON (what you seem to be getting at in the comments on the other answers), then you can use filters in the JSONPath, like
$[?(@.name=="RFM912Feilkode")].valueCodeableConcept.text
which will get you the text element from valueCodeableConcept where the associated name is RFM912Feilkode.
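For readers less familiar with JSONPath filters, the same selection can be sketched in plain Python. The list below is a hypothetical response containing several named nodes; the comprehension is the equivalent of filtering on the name and then descending to the text:

```python
# Hypothetical list of named nodes, mimicking a response with several entries.
nodes = [
    {"name": "SomethingElse", "valueCodeableConcept": {"text": "FAIL"}},
    {
        "name": "RFM912Feilkode",
        "valueCodeableConcept": {
            "coding": [{"system": "http://xxx/error-code", "code": "0"}],
            "text": "OK",
        },
    },
]

# Equivalent of the JSONPath filter: keep only nodes whose name matches,
# then take valueCodeableConcept.text from each match.
texts = [
    n["valueCodeableConcept"]["text"]
    for n in nodes
    if n.get("name") == "RFM912Feilkode"
]
print(texts)  # ['OK']
```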

Mule 4 - How do I process a non-standard CSV file (row by row)

A conventional CSV file contains rows that all have the same columns, plus an optional header row. In this situation, however, I need to process a CSV that seems non-standard: it has a "header" row containing 5 columns, followed by an unspecified number of "body" rows containing about 15 columns each (unrelated to the header row), and it ends with a "footer" row containing 4 columns.
It looks like the entire file is representing an object, something like this:
headerValue1,headerValue2,headerValue3,headerValue4,headerValue5
bodyvalue1,bodyvalue2,bodyvalue3,bodyvalue4,bodyvalue5,bodyvalue6,bodyvalue7,bodyvalue8,bodyvalue9
bodyvalue1,bodyvalue2,bodyvalue3,bodyvalue4,bodyvalue5,bodyvalue6,bodyvalue7,bodyvalue8,bodyvalue9
footervalue1,footervalue2,footervalue3,footervalue4
I need to convert it to JSON format, so I have been trying to set the CSV values into an array of objects using a for loop, but have had no luck (no useful code to post). Neither of Mule's CSV formats, with header or without, seems to work, as each appears to expect every row to contain the same fields, which they don't.
I also tried to define a flat-file schema for it but had no luck either (possibly due to my limited knowledge in the area).
So my question is: how do I correctly, or efficiently, get this CSV data into a usable array or object, or perhaps even straight into JSON format?
DataWeave handles this case: unnamed columns get a generic column name ('column_N'). For an example, see the documentation. A direct payload conversion (i.e. just 'payload') works.
DataWeave example:
... set the CSV into the payload and be sure its MIME type is set to application/csv ...
<ee:transform doc:name="Transform Message">
    <ee:message>
        <ee:set-payload><![CDATA[%dw 2.0
output application/json
---
payload]]>
        </ee:set-payload>
    </ee:message>
</ee:transform>
Output:
[
    {
        "headerValue1": "bodyvalue1",
        "headerValue2": "bodyvalue2",
        "headerValue3": "bodyvalue3",
        "headerValue4": "bodyvalue4",
        "headerValue5": "bodyvalue5",
        "column_5": "bodyvalue6",
        "column_6": "bodyvalue7",
        "column_7": "bodyvalue8",
        "column_8": "bodyvalue9"
    },
    {
        "headerValue1": "bodyvalue1",
        "headerValue2": "bodyvalue2",
        "headerValue3": "bodyvalue3",
        "headerValue4": "bodyvalue4",
        "headerValue5": "bodyvalue5",
        "column_5": "bodyvalue6",
        "column_6": "bodyvalue7",
        "column_7": "bodyvalue8",
        "column_8": "bodyvalue9"
    },
    {
        "headerValue1": "footervalue1",
        "headerValue2": "footervalue2",
        "headerValue3": "footervalue3",
        "headerValue4": "footervalue4"
    }
]
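The header-plus-generic-overflow-columns behavior can be sketched in plain Python as well, which may help when debugging what DataWeave produces. This is an illustrative sketch with hypothetical sample data, not Mule code:

```python
import csv
import io
import json

# Hypothetical CSV matching the question: a 5-column header row, then
# wider body rows whose extra fields have no header name.
raw = """headerValue1,headerValue2,headerValue3,headerValue4,headerValue5
bodyvalue1,bodyvalue2,bodyvalue3,bodyvalue4,bodyvalue5,bodyvalue6,bodyvalue7,bodyvalue8,bodyvalue9
"""

reader = csv.reader(io.StringIO(raw))
header = next(reader)
rows = []
for row in reader:
    # Named columns get the header names; zip stops at the shorter list.
    record = dict(zip(header, row))
    # Overflow columns get generic names, as DataWeave does ("column_N").
    for i, extra in enumerate(row[len(header):], start=len(header)):
        record[f"column_{i}"] = extra
    rows.append(record)

print(json.dumps(rows, indent=2))
```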
If I understand correctly, your file is a flat-file structure consisting of a header, data/body, and footer, where each record/segment is a delimited record. Mule's flat-file reader will not work here, as it only supports fixed-width records. What you can do instead is separate each group of records into strings and then read each record as CSV. This can be done in two transform steps: Transform 1 reads the file and parses the payload into an array of strings; Transform 2 transforms the array of strings into the proper JSON format. Note that there might be a performance issue if your flat file is too big, as this holds the whole payload in memory as strings.
Transform 1:
%dw 2.0
output application/java
---
payload splitBy "\r\n"
Transform 2:
%dw 2.0
output application/json
var sizePayload = sizeOf(payload)
---
{
    header: read(payload[0], "application/csv", {"header": false})[0],
    body: read(payload[1 to sizePayload - 2] joinBy "\n", "application/csv", {"header": false}),
    footer: read(payload[sizePayload - 1], "application/csv", {"header": false})[0]
}
Given a sample file below:
headerValue1,headerValue2,headerValue3,headerValue4,headerValue5
bodyvalue1,bodyvalue2,bodyvalue3,bodyvalue4,bodyvalue5,bodyvalue6,bodyvalue7,bodyvalue8,bodyvalue9
bodyvalue12ndRow,bodyvalue2,bodyvalue3,bodyvalue4,bodyvalue5,bodyvalue6,bodyvalue7,bodyvalue8,bodyvalue92ndRow
footervalue1,footervalue2,footervalue3,footervalue4
This will result in the JSON below:
{
    "header": {
        "column_0": "headerValue1",
        "column_1": "headerValue2",
        "column_2": "headerValue3",
        "column_3": "headerValue4",
        "column_4": "headerValue5"
    },
    "body": [
        {
            "column_0": "bodyvalue1",
            "column_1": "bodyvalue2",
            "column_2": "bodyvalue3",
            "column_3": "bodyvalue4",
            "column_4": "bodyvalue5",
            "column_5": "bodyvalue6",
            "column_6": "bodyvalue7",
            "column_7": "bodyvalue8",
            "column_8": "bodyvalue9"
        },
        {
            "column_0": "bodyvalue12ndRow",
            "column_1": "bodyvalue2",
            "column_2": "bodyvalue3",
            "column_3": "bodyvalue4",
            "column_4": "bodyvalue5",
            "column_5": "bodyvalue6",
            "column_6": "bodyvalue7",
            "column_7": "bodyvalue8",
            "column_8": "bodyvalue92ndRow"
        }
    ],
    "footer": {
        "column_0": "footervalue1",
        "column_1": "footervalue2",
        "column_2": "footervalue3",
        "column_3": "footervalue4"
    }
}
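The split-then-parse idea behind the two transform steps can be cross-checked with a small Python sketch (stdlib only, hypothetical sample data): split the payload into lines, then parse the first, middle, and last records separately:

```python
import csv
import io

# Hypothetical flat file: header row, body rows, footer row.
raw = """headerValue1,headerValue2,headerValue3,headerValue4,headerValue5
bodyvalue1,bodyvalue2,bodyvalue3
footervalue1,footervalue2,footervalue3,footervalue4"""

# Step 1: split the payload into an array of record strings.
lines = raw.splitlines()

def parse(line):
    # Step 2: read one delimited record, keyed column_0..column_N
    # like the generic column names in the output above.
    values = next(csv.reader(io.StringIO(line)))
    return {f"column_{i}": v for i, v in enumerate(values)}

result = {
    "header": parse(lines[0]),
    "body": [parse(l) for l in lines[1:-1]],
    "footer": parse(lines[-1]),
}
```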

How to parse a dynamic JSON response, get a specific value, and pass it as input to the next request

I get a .json file as a response from an API, and from that file I need to parse and find a specific parameter and pass it as input to the next request. How do I do that using Katalon?
If I say
response = JSON.parse("response.json");
it says it is unable to identify the JSON as valid. Can someone help me out with a solution?
Your JSON is invalid, maybe it is a copy-paste issue.
The valid JSON should be
{
    "responseStatusCode": "OK",
    "data": {
        "screenName": "employeeTimeslip",
        "screenType": "Redirect",
        "searchResultCount": 0,
        "rows": [],
        "tabs": [],
        "searchParams": {
            "employeeID": "000092926",
            "timeslipNumber": "201900019701"
        }
    }
}
So, you were missing a "," between "OK" and "data", and two closing curly braces at the end of the file.
You can check JSON files for validity yourself using online JSON validators, for example, this one.
I found a way to read a specific parameter from the JSON response file, as below:
val scn = scenario("ClaimSubmission")
  .exec(http("request_2")
    .post("URL")
    .headers(headers_2)
    .body(RawFileBody("json file path"))
    .check(jsonPath("$..timeslipnumber").find.saveAs("timeslipnumber")))
The timeslip number is then retrieved using: .check(jsonPath("$..timeslipnumber").find.saveAs("timeslipnumber"))
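Stripped of any particular test tool, the extract-and-reuse pattern is just: parse the response, pull out the parameter, and build the next request's body from it. A minimal Python sketch, using hypothetical field names taken from the JSON above:

```python
import json

# Hypothetical response body based on the JSON shown above.
response_text = """
{
  "responseStatusCode": "OK",
  "data": {
    "searchParams": {
      "employeeID": "000092926",
      "timeslipNumber": "201900019701"
    }
  }
}
"""

# Parse the response, extract the parameter, and feed it into the
# body of the next request.
response = json.loads(response_text)
timeslip_number = response["data"]["searchParams"]["timeslipNumber"]
next_request_body = json.dumps({"timeslipNumber": timeslip_number})
print(next_request_body)
```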

JSON Deserialization on Talend

I am trying to figure out how to deserialize this kind of JSON in Talend components:
{
    "ryan@toofr.com": {
        "confidence": 119, "email": "ryan@toofr.com", "default": 20
    },
    "rbuckley@toofr.com": {
        "confidence": 20, "email": "rbuckley@toofr.com", "default": 15
    },
    "ryan.buckley@toofr.com": {
        "confidence": 18, "email": "ryan.buckley@toofr.com", "default": 16
    },
    "ryanbuckley@toofr.com": {
        "confidence": 17, "email": "ryanbuckley@toofr.com", "default": 17
    },
    "ryan_buckley@toofr.com": {
        "confidence": 16, "email": "ryan_buckley@toofr.com", "default": 18
    },
    "ryan-buckley@toofr.com": {
        "confidence": 15, "email": "ryan-buckley@toofr.com", "default": 19
    },
    "ryanb@toofr.com": {
        "confidence": 14, "email": "ryanb@toofr.com", "default": 14
    },
    "buckley@toofr.com": {
        "confidence": 13, "email": "buckley@toofr.com", "default": 13
    }
}
This JSON comes from the Toofr API, whose documentation can be found here.
Here is the actual situation:
For each line retrieved from the database, I call the API and I get this back (the first name, the last name, and the company change every time).
Does anyone know how to modify the tExtractJSONFields component (or use something else) to show the results in tLogRow (for each line in the database)?
Thank you in advance!
EDIT 1:
Here's my tExtractJSONFields:
When using tExtractJSONFields with XPath, you need
1) a valid XPath loop point
2) valid XPath mapping to your structure relative to the loop path
Also, when using XPath with Talend, every value needs a key, and the key cannot change if you want to loop over it. This means the following is invalid:
{
    "ryan@toofr.com": {
        "confidence": 119, "email": "ryan@toofr.com", "default": 20
    },
    "rbuckley@toofr.com": {
        "confidence": 20, "email": "rbuckley@toofr.com", "default": 15
    },
but this structure would be valid:
{
    "contact": {
        "confidence": 119, "email": "ryan@toofr.com", "default": 20
    },
    "contact": {
        "confidence": 20, "email": "rbuckley@toofr.com", "default": 15
    },
So with the corrected data, the loop point might be /contact.
Then the mapping for Confidence would be confidence (the name from the JSON), the mapping for Email would be email, and likewise for Default.
EDIT
JSONPath has a few disadvantages, one of them being that you cannot go back up in the hierarchy. You can try finding the correct query with jsonpath.com.
The loop expression could be $.*. I am not sure if that will satisfy your need, though; it has been a while since I used JSONPath in Talend because of these downsides.
I have been ingesting some complex JSON structures and did this via minimal JSON libraries and tJava components within Talend.
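The core difficulty here is that the top-level keys are the email addresses themselves, so they change with every record. Looping over the values (what a $.* loop expression does) sidesteps that. A minimal Python sketch of the idea, using a trimmed, hypothetical version of the response:

```python
import json

# Trimmed, hypothetical version of the Toofr-style response: the
# top-level keys are email addresses that differ for every record.
raw = """
{
  "ryan@toofr.com":     {"confidence": 119, "email": "ryan@toofr.com",     "default": 20},
  "rbuckley@toofr.com": {"confidence": 20,  "email": "rbuckley@toofr.com", "default": 15}
}
"""

# Equivalent of looping over $.* : iterate the values regardless of the
# changing top-level keys, and read the same fields from each one.
rows = [
    {"confidence": v["confidence"], "email": v["email"], "default": v["default"]}
    for v in json.loads(raw).values()
]
print(rows[0]["email"])  # ryan@toofr.com
```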

Building json path expression - presence of DOT in attribute name

We are working with a legacy system that gives JSON responses, which we are trying to test with JMeter. We are trying to use the JSON Path Extractor plugin for this purpose, but the structure of the JSON is causing an issue in creating JSON path expressions.
The structure of the JSON we receive from the server is as follows.
{
    "ns9.Shopping": {
        "transactionID": "XXXXXNEKIHJO7SRHN1",
        "transactionStatus": "Success",
        "ns9.shoppingResponseIDs": {
            "ns9.owner": "1P",
            "ns9.responseId": "abcdefghijklmnop"
        },
        "ns9.offersGroup": {"ns9.thanksGiving": [
            {
                "ns9.owner": "DL",
                "ns9.retailOffer": [
                    {
                        "ns9.offerId": "offer1DL",
                        "ns9.price": 2639.08,
                        "ns9.currencyCode": "USD",
                        "ns9.taxTotal": 961.08,
                        "ns9.taxCode": "USD",
                        .........
The presence of a . [DOT] in the attribute names is causing issues in my JSON path expressions.
In short, can someone help me find the "transactionID" under "ns9.Shopping"?
You can try adding a Regular Expression Extractor as a child of your HTTP Request element.
Use this regex:
"transactionID": "([a-zA-Z0-9]*)"
I hope this will help you.
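The regex approach can be checked outside JMeter with a quick Python sketch against a hypothetical fragment of the response:

```python
import re

# Hypothetical fragment of the server response from the question.
body = ('{"ns9.Shopping": {"transactionID": "XXXXXNEKIHJO7SRHN1", '
        '"transactionStatus": "Success"}}')

# The same pattern the Regular Expression Extractor would use:
# capture the alphanumeric value that follows the field name.
match = re.search(r'"transactionID": "([a-zA-Z0-9]*)"', body)
transaction_id = match.group(1) if match else None
print(transaction_id)  # XXXXXNEKIHJO7SRHN1
```

As a side note, JSONPath bracket notation may also be worth trying in the JSON Path Extractor, e.g. $['ns9.Shopping'].transactionID, since quoting the key keeps the dot from being read as a path separator.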