How to convert epoch to timestamp in NiFi within a JSON file? - json

I'm having issues with getting an epoch conversion to Timestamp to work properly. So far my example timestamp looks like the following:
{"createTime": 1510932843000}
My end goal is to make it look like the following:
2017-11-17 3:34:03.000
The things I've tried so far are the UpdateRecord and JoltTransformJSON processors. For UpdateRecord I have tried various ways, but they all end in an error. The current code I have for this is:
${field.value:format("yyyy-MM-dd HH:mm:ss.SSS")}
Which results in the following error:
JSON Object due to java.lang.NumberFormatException: For input string: "2017-11-17 15:34:03.000": For input string: "2017-11-17 15:34:03.000"
I have also tried the code without the multiply(1000) to the same effect.
I have also tried a Jolt Transformation of the following code:
{
  "createTime": "${createTime:append('000'):format('yyyy-MM-dd HH:mm:ss.SSS')}"
}
This however results in the following:
"createTime": "1970-01-01 00:00:00.000"
Which isn't what I'm looking for, as it's the incorrect date. Am I doing something wrong within my code itself, or is another factor at play? I've been working on this, searching all over for different kinds of results, and have tried multiple different formats with no success. Any help with this would be greatly appreciated!

My preferred solution:
Use a ScriptedTransformRecord processor:
Record Reader: JsonTreeReader
Record Writer: JsonRecordSetWriter
Script Language: Groovy
Script Body:
import java.time.Instant
import java.time.format.DateTimeFormatter
import java.time.ZoneId

// Formatter for the desired pattern, rendered in an explicit time zone
def formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS").withZone(ZoneId.of("Europe/Bratislava"))
// Read the epoch-millis createTime field and write the formatted value into a new field
record.setValue("createTimeFormatted", formatter.format(Instant.ofEpochMilli(record.getAsLong("createTime"))))
return record
Output json:
{
"createTime" : 1510932843000,
"createTimeFormatted" : "2017-11-17 16:34:03.000"
}
Another approach
Use 2 processors: JoltTransformJSON (convert type from Long to String) -> UpdateRecord (convert date).
JoltTransformJSON processor:
Jolt Transformation DSL: Chain
Jolt specification:
[
  {
    "operation": "modify-overwrite-beta",
    "spec": {
      "createTime": "=toString"
    }
  }
]
UpdateRecord processor:
Record Reader: JsonTreeReader
Record Writer: JsonRecordSetWriter
Replacement Value Strategy: Literal Value
/createTime (dynamic property): ${field.value:format("yyyy-MM-dd HH:mm:ss.SSS")}
Output json:
{
"createTime" : "2017-11-17 16:34:03.000"
}

Related

Apache NiFi Transform json field to timestamp

I have JSON with a unix-timestamp field. I would like to extract the year from it.
So, for example:
{"eventno": "event1",
"unixtimestamp": 1589379890}
Expected result:
{"eventno": "event1",
"unixtime": 2020}
I tried to do this using JoltTransformJSON and the NiFi Expression Language, but my attempts failed. One of them:
[
  {
    "operation": "shift",
    "spec": {
      "unixtime": "${unixtimestamp:multiply(1000):format('yyyy', 'GMT')}"
    }
  }
]
How can I transform it?
@GrigorySkvortsov
The Expression Language syntax should be:
${attribute:expressionLanguage():functions()}
If what you have above isn't just a typo, retest after removing the } after unixtimestamp.
Unit test outside of the Jolt transform with an UpdateAttribute processor to dial in the correct Expression Language chain before putting it back into the Jolt spec.
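For example, a minimal UpdateAttribute sketch to try the chain in isolation (the property name year is just an illustration, and it assumes unixtimestamp is available as a flow file attribute, e.g. extracted beforehand with EvaluateJsonPath):
year (dynamic property): ${unixtimestamp:multiply(1000):format('yyyy', 'GMT')}
If the resulting attribute holds the expected 2020, the same Expression Language chain can be moved back into the Jolt specification.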

Length is not working in JSON Extractor in JMeter

I need to get the count of cards from a JSON file. For this I've used $.storedCards.cards.lenght
in a JSON Extractor, but it doesn't work. There is an error message:
Options AS_PATH_LIST and ALWAYS_RETURN_LIST are not allowed when using path functions!
After that I tried a JSR223 PostProcessor with the following Groovy script:
def jsonText = '''${AllCards}''' //${AllCards} has json value
def json = new JsonSlurper().parseText(jsonText)
log.info( "Json length---------->"+json.resource.size())
${CardsCount} = props.get("4") //vars.put(json.resource.size.toString())
but there is a problem with setting the value to my variable. Also, when I created a variable in Groovy, it was impossible to use it outside of the script.
My JSON file:
"storedCards":
{
"cards":
[
{
"CardId":"123",
"cardBrand":"Visa",
"lastFourDigits":"2968",
},
{
"CardId":"321",
"cardBrand":"Visa",
"lastFourDigits":"2968",
},
..........
],
How can I get the count of cards and set it to my variables? What should I use for this?
Your JSON data seems to be invalid. Assuming you have valid JSON like the example below, I'm answering your question.
{
"storedCards": {
"cards": [
{
"CardId": "123",
"cardBrand": "Visa",
"lastFourDigits": "2968"
},
{
"CardId": "321",
"cardBrand": "Visa",
"lastFourDigits": "2968"
}
]
}
}
You don't need to write Groovy code; you can resolve this using the JSON Extractor. Instead of using the length function, use a JSON Path expression like this:
$.storedCards.cards[*]
Though the variable you used in the JSON Extractor won't give you the count right away, the documentation of another JMeter function, __RandomFromMultipleVars, explains how multi-valued variables work.
Excerpt from the documentation:
The RandomFromMultipleVars function returns a random value based on the variable values provided by Source Variables.
The variables can be simple or multi-valued as they can be generated by the following extractors:
Boundary Extractor
Regular Expression Extractor
CSS Selector Extractor
JSON Extractor
XPath Extractor
XPath2 Extractor
Multi-value vars are the ones that are extracted when you set -1 for Match Numbers. This leads to creation of match number variable called varName_matchNr and for each value to the creation of variable varName_n where n = 1, 2, 3 etc.
So once you use that expression with Match No. set to -1, you will get the count in yourVariableName_matchNr.
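A minimal JSON Extractor configuration sketch (the variable name cards is just an illustration):
Names of created variables: cards
JSON Path expressions: $.storedCards.cards[*]
Match No.: -1
Afterwards the number of cards is available as ${cards_matchNr}, and the individual matches as ${cards_1}, ${cards_2}, and so on.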
Hope this helps.

Return nested JSON in AWS AppSync query

I'm quite new to AppSync (and GraphQL) in general, but I'm running into a strange issue when hooking up resolvers to our DynamoDB tables. Specifically, we have a nested Map structure for one of our item's attributes that is arbitrarily constructed (its complexity and form depend on the type of parent item), a little something like this:
"item" : {
  "name": "something",
  "country": "somewhere",
  "data" : {
    "nest-level-1a": {
      "attr1a" : "foo",
      "attr1b" : "bar",
      "nest-level-2" : {
        "attr2a": "something else",
        "attr2b": [
          "some list element",
          "and another, for good measure"
        ]
      }
    }
  },
  "cardType": "someType"
}
Our accompanying GraphQL type is the following:
type Item {
name: String!
country: String!
cardType: String!
data: AWSJSON! ## note: it was originally String!
}
When we query the item we get the following response:
{
  "data": {
    "genericItemQuery": {
      "name": "info/en/usa/bra/visa",
      "country": "USA:BRA",
      "cardType": "visa",
      "data": "{\"tourist\":{\"reqs\":{\"sourceURL\":\"https://travel.state.gov/content/passports/en/country/brazil.html\",\"visaFree\":false,\"type\":\"eVisa required\",\"stayLimit\":\"30 days from date of entry\"},\"pages\":\"One page per stamp required\"}}"
    }
  }
}
The problem is we can't seem to get the Item.data field resolver to return a JSON object (even when we attach a separate field-level resolver to it on top of the general Query resolver). It always returns a String and, weirdly, if we change the expected field type to String!, the response will replace all : in data with =. We've tried everything with our response resolvers, including suggestions like How return JSON object from DynamoDB with appsync?, but we're completely stuck at this point.
Our current response resolver for our query has been reverted back to the standard response after none of the suggestions in the aforementioned post worked:
## 'Before' response mapping template on genericItemQuery query; same result as the 'After' listed below **
#set($result = $ctx.result)
#set($result.data = $util.parseJson($ctx.result.data))
$util.toJson($result)
## 'After' response mapping template **
$util.toJson($ctx.result)
We're trying to avoid a situation where we need to include supporting types for each nest level in data (since it changes based on parent Item type and in cases like the example I gave it can have three or four tiers), and we thought changing the schema type to AWSJSON! would do the trick. I'm beginning to worry there's no way to get around rebuilding our base schema, though. Any suggestions to the contrary would be helpful!
P.S. I've noticed in the CloudWatch logs that the appropriate JSON response exists under the context.result.data response field, but somehow there's the following transformedTemplate (which, again, I find very unusual considering we're not applying any mapping template except to transform the result into valid JSON):
"arn": ...
"transformedTemplate": "{data={tourist={reqs={sourceURL=https://travel.state.gov/content/passports/en/country/brazil.html, visaFree=false, type=eVisa required, stayLimit=30 days from date of entry}, pages=One page per stamp required}}, resIds=USA:BRA, cardType=visa, id=info/en/usa/bra/visa}",
"context": ...
Apologies for the lengthy question, but I'm stumped.
AWSJSON is a JSON string type, so you will always get back a string value (this is what your type definition must adhere to).
You could try to make a type for the data field which contains all possible fields, and then resolve the fields corresponding to the parent type; alternatively, you could try to implement GraphQL interfaces.
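For illustration only, a rough sketch of what a typed data field might look like, based on the tourist data in the example response above (the type names and field nullability are guesses, not anything AppSync prescribes):
type Reqs {
  sourceURL: String
  visaFree: Boolean
  type: String
  stayLimit: String
}
type Tourist {
  reqs: Reqs
  pages: String
}
type ItemData {
  tourist: Tourist
}
type Item {
  name: String!
  country: String!
  cardType: String!
  data: ItemData
}
The trade-off is the one already noted in the question: every possible shape of data has to be modelled up front.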

How to use jq to get a value of decimal/number type from a JSON response which is not surrounded by " "

I am new to shell scripting and I need some help.
I am trying to use jq to get values from an API response and check them for correctness.
Here is a sample of how the response looks:
{
"data" : {
"transactionType" : "Sales",
"transactionSubType" : "DomesticSale",
"Items" : [ {
"itemID" : "2",
"itemType" : "Good",
"amount" : 5.0,
"tax" : 1.0
} ]
}
}
I am able to get the values for transactionType, transactionSubType, or even the itemID values, etc., as given below:
jq '.data.transactionType'
jq '.data.Items[0].itemID'
for the transaction type and item ID,
but when it comes to values of numeric types, i.e. values without quotes around them, I don't get any value.
I am using similar syntax for the numeric types too, as shown below.
jq '.data.Items[0].amount'
jq '.data.Items[0].tax'
Please help!!!
Your jq invocations are fine, but the sample data is missing a final closing brace ("}"), so perhaps you were not feeding jq properly.
If you're wondering why you didn't see an error message, it's almost certainly because jq 1.5 is not very good about handling incomplete JSON. The problem has since been fixed at "master". With the current version, you'd see something like this:
parse error: Unfinished JSON term at EOF at line 15, column 0
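For instance, assuming the complete response has been saved to a file named response.json (the file name here is just an illustration):
jq '.data.Items[0].amount' response.json
jq '.data.Items[0].tax' response.json
With the full document, these print the numeric amount and tax values; the same invocations work when the response is piped to jq on stdin.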

JMeter JSON Extractor: retrieve the second item from the end of a list

I have a JSON response like below
{
  "queryStartDate": "20170523134739822",
  "queryEndDate": "20170623134739822",
  "Rows": [
    {
      "hasScdHistoryOnly": false,
      "Values": [
        "1",
        "53265",
        "CO"
      ]
    },
    {
      "hasScdHistoryOnly": false,
      "Values": [
        "1",
        "137382",
        "CO"
      ]
    },
    {
      "hasScdHistoryOnly": false,
      "Values": [
        "1",
        "310824",
        "CO"
      ]
    }
  ]
}
I am using JMeter's JSON Extractor post-processor to retrieve the second value from the end of each 'Values' list, i.e. 53265, 137382, 310824.
I've tried to use $.Rows[*].Values[-2:-1] and $.Rows[*].Values[(#.length-2)], according to Stefan's introduction: http://goessner.net/articles/JsonPath/index.html#e2, but neither of them is working. Would you please help me out?
I believe JMeter is using the Jayway JSON Path library, so you should be looking at the Jayway documentation instead.
In general I would recommend using the JSR223 PostProcessor as an alternative to the JSON Path extractors: they are applicable for basic scenarios only, and when it comes to advanced queries and operators their behaviour is flaky.
Add JSR223 PostProcessor as a child of the request which returns above JSON
Make sure you have "groovy" selected in the "Language" drop down and "Cache compiled script if available" box is ticked
Put the following code into "Script" area
// Collect every "Values" array from the response, then store the
// second-to-last element of each one in a numbered JMeter variable
def values = com.jayway.jsonpath.JsonPath.parse(prev.getResponseDataAsString()).read('$..Values')
values.eachWithIndex { val, idx ->
    vars.put('yourVar_' + (idx + 1), val.get(val.size() - 2))
}
It should generate the following JMeter Variables:
yourVar_1=53265
yourVar_2=137382
yourVar_3=310824
which seems to be what you're looking for.
References:
Groovy: Parsing and producing JSON
Apache Groovy - Why and How You Should Use It
Using the View Results Tree's JSON Path Tester, I could see that the expressions you used for extracting the values were not correct (they work in the online JSONPath evaluator but not in JMeter).
Used Expression: $.Rows[*].Values[-2:-1]
Output from JSON Path Tester: No Match Found.
Used Expression: $.Rows[*].Values[(#.length-2)]
Output from JSON Path Tester: Exception: Could not parse token starting at position 16. Expected ?, ', 0-9, *
If the expression $.Rows[*].Values[1] is used it extracts the desired responses.
Used Expression: $.Rows[*].Values[1]
Output from JSON Path Tester:
Result[0]=53265
Result[1]=137382
Result[2]=310824