Matching data in JsonPath with WireMock

I'm trying to create mocks for my login procedure. I use the POST method with a couple of fields and a login object (login, password, etc.).
For that I'm using JsonPath. Code below:
{
    "request": {
        "method": "POST",
        "url": "/login",
        "bodyPatterns" : [
            {"matchesJsonPath" : "$.method"},
            {"matchesJsonPath" : "$.params[?(@.clientVersion == \"1\")]"},
            {"matchesJsonPath" : "$.params.login"},
            {"matchesJsonPath" : "$.params.password"}
        ]
    },
    "response": {
        "status": 200,
        "bodyFileName": "login.json"
    }
}
I'm checking the clientVersion because it's similar to the examples.
My problem is that with the given POST JSON:
{
    "method": "login",
    "params": {
        "clientVersion": "1",
        "login": "test@test.com",
        "password": "681819535da188b6ef2"
    }
}
I receive 404.
However, when I change
{"matchesJsonPath" : "$.params[?(@.clientVersion == \"1\")]"},
to the plain
{"matchesJsonPath" : "$.params.clientVersion"},
everything works just fine.
So, how do I check inside WireMock, using matchesJsonPath, whether a given field equals some value?
How do I apply it to a root field like method in my case?
And while we're at it: I had similar problems with checking that a value is not null. I tried regular expressions and such, with no luck.

This is working in my case:
WireMock:
"request": {
    "urlPathPattern": "/api/authins-portail-rs/authins/inscription/infosperso",
    "bodyPatterns" : [
        {"matchesJsonPath" : "$[?(@.nir == '123456789')]"},
        {"matchesJsonPath" : "$[?(@.nomPatronyme == 'aubert')]"},
        {"matchesJsonPath" : "$[?(@.prenoms == 'christian')]"},
        {"matchesJsonPath" : "$[?(@.dateNaissance == '01/09/1952')]"}
    ],
    "method": "POST"
}
JSON:
{
    "nir": "123456789",
    "nomPatronyme": "aubert",
    "prenoms": "christian",
    "dateNaissance": "01/09/1952"
}

The following worked for me.
"matchesJsonPath" : "$.rootItem.itemA[0].item..[?(@.fieldName=='file')]"
JSON:
{
    "rootItem" : {
        "itemA" : [
            {
                "item" : {
                    "fieldName" : "file",
                    "name" : "test"
                }
            }
        ]
    }
}
WireMock:
{
    "request" : {
        "urlPattern" : "/testjsonpath",
        "method" : "POST",
        "bodyPatterns" : [ {
            "matchesJsonPath" : "$.rootItem.itemA[0].item..[?(@.fieldName=='file')]"
        } ]
    },
    "response" : {
        "status" : 200,
        "body" : "{\"result\": \"success\"}",
        "headers" : {
            "Content-Type" : "application/json"
        }
    }
}

Update WireMock. It should work with newer versions (>= 2.0.0-beta); its JsonPath dependency was very outdated (GitHub #261).
Using the double dots operator is semantically not the same, as the filter will also match elements with the same name deeper down the tree.
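For illustration, with a hypothetical request body like the one below (not from the original question), $..params[?(@.clientVersion == "1")] would still match, because the deep scan also picks up the params object nested inside wrapper, whereas $.params[?(@.clientVersion == "1")] would not:
{
    "method": "login",
    "wrapper": {
        "params": {
            "clientVersion": "1"
        }
    }
}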

Try the double dots (recursive descent) operator:
$..params[?(@.clientVersion == "1")]
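Putting the answers together, here is a sketch of the stub with explicit value checks, assuming WireMock >= 2.0.0-beta so that the bundled Jayway JsonPath supports filter expressions directly and the recursive descent workaround is not needed. The regex filter on login is just one way to require a non-empty value and also needs a reasonably recent Jayway version:
{
    "request": {
        "method": "POST",
        "url": "/login",
        "bodyPatterns": [
            { "matchesJsonPath": "$[?(@.method == 'login')]" },
            { "matchesJsonPath": "$.params[?(@.clientVersion == '1')]" },
            { "matchesJsonPath": "$.params[?(@.login =~ /.+/)]" }
        ]
    },
    "response": {
        "status": 200,
        "bodyFileName": "login.json"
    }
}
The first pattern covers the root-field question: the filter is applied to the document root ($), the same way the answer above applies it with @.nir, @.nomPatronyme, and so on.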

Related

How to parse JSON data using Lua patterns?

I am integrating the Lua module in nginx. I want to check whether a parameter's value is empty or not. With the following code I get the expected result for this JSON request:
Request
{
    "data": {
        "user": {
            "username": "Ethen",
            "type": "PDF"
        }
    },
    "passport": {
        "user": "001"
    }
}
Code
local arg = ngx.req.get_body_data();
L="return "..arg:gsub('("[^"]-"):','[%1]=')
T=loadstring(L)()
ngx.print(T.data.user.username)
if T.data.user.username == "" then
ngx.say("Empty username");
end
But when I use this request, I get an error (attempt to call a nil value, stack traceback):
Request
{
"reference" : "567456314",
"callback_url" : "http://www.example.com/",
"verification_mode" : "any",
"document" : {
"proof" : "data:image/png;base64,iVBOR=“,
"additional_proof": "data:image/png;base64,iVBORw0=",
"supported_types" : ["id_card","driving_license","passport"],
"expiry_date" : "",
"document_number" : ""
},
"address" : {
"proof" : "data:image/png;base64,iVBORw0KG=",
"supported_types" : ["id_card","bank_statement"],
"name" : "",
"issue_date":""
}
}
Code
local arg = ngx.req.get_body_data();
L="return "..arg:gsub('("[^"]-"):','[%1]=')
T=loadstring(L)()
ngx.print(T.document.proof)
if T.document.proof == "" then
ngx.say("Empty proof");
end
What's the problem, and what is the solution? Thanks in advance!
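One thing that stands out in the second request body is the typographic quote (“) after iVBOR=, which the gsub pattern cannot handle; the generated Lua chunk then fails to compile, loadstring returns nil, and calling it raises "attempt to call a nil value". Rather than converting JSON to Lua source, it is usually safer to decode the body with the cjson module that ships with OpenResty. A minimal sketch, assuming lua-cjson is available and using the field names from the request above:
local cjson = require "cjson.safe"   -- safe variant returns nil, err instead of raising

ngx.req.read_body()                  -- ensure the request body has been read
local arg = ngx.req.get_body_data()

local t, err = cjson.decode(arg)     -- nil plus an error message on malformed JSON
if not t then
    ngx.say("Invalid JSON body: ", err)
    return
end

if t.document and t.document.proof == "" then
    ngx.say("Empty proof")
end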

How to get entire parent node using jq json parser?

I am trying to find a value in the JSON file and, based on that, I need to get the entire parent object rather than just that particular block.
Here is my sample JSON:
[{
"name" : "Redirect to Website 1",
"behaviors" : [ {
"name" : "redirect",
"options" : {
"mobileDefaultChoice" : "DEFAULT",
"destinationProtocol" : "HTTPS",
"destinationHostname" : "SAME_AS_REQUEST",
"responseCode" : 302
}
} ],
"criteria" : [ {
"name" : "requestProtocol",
"options" : {
"value" : "HTTP"
}
} ],
"criteriaMustSatisfy" : "all"
},
{
"name" : "Redirect to Website 2",
"behaviors" : [ {
"name" : "redirect",
"options" : {
"mobileDefaultChoice" : "DEFAULT",
"destinationProtocol" : "HTTPS",
"destinationHostname" : "SAME_AS_REQUEST",
"responseCode" : 301
}
} ],
"criteria" : [ {
"name" : "contentType",
"options" : {
"matchOperator" : "IS_ONE_OF",
"values" : [ "text/html*", "text/css*", "application/x-javascript*" ],
}
} ],
"criteriaMustSatisfy" : "all"
}]
I am trying to match "name" : "redirect" inside each behaviors array, and if it matches I need the entire block, including the "criteria" section, which as you can see is under the same {} block.
I managed to find the values using select, but I was not able to get the parent section.
https://jqplay.org/s/BWJwVdO3Zv
Any help is much appreciated!
To avoid unwanted duplication:
.[]
| first(select(.behaviors[].name == "redirect"))
Equivalently:
.[]
| select(any(.behaviors[]; .name == "redirect"))
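For example, run against the sample above (assuming it is saved as sample.json), this prints each matching object in full, criteria section included:
jq '.[] | select(any(.behaviors[]; .name == "redirect"))' sample.json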
You can try this jq command:
<file jq 'select(.[].behaviors[].name=="redirect")'

Unable to parse expectation on mockserver-netty when adding Content-Type header

I have an integration test where I am using mockserver-netty (v5.3.0) with a Spring Boot 2.0 application. Everything is working fine, but if I try to add the Content-Type header, I get the following exception:
java.lang.IllegalArgumentException: Exception while parsing [{
"httpRequest" : {
"method" : "POST",
"body" : {
"type" : "XML",
"xml" : "......"
},
"times" : {
"remainingTimes" : 0,
"unlimited" : true
},
"timeToLive" : {
"unlimited" : true
}
}] for Expectation
at org.mockserver.client.AbstractClient.sendRequest(AbstractClient.java:95)
at org.mockserver.client.AbstractClient.sendExpectation(AbstractClient.java:441)
at org.mockserver.client.ForwardChainExpectation.respond(ForwardChainExpectation.java:25)
The expectation is the following:
{
"httpRequest" : {
"method" : "POST",
"body" : {
"type" : "XML",
"xml" : "......"
}
},
"httpResponse" : {
"statusCode" : 200,
"headers" : {
"Content-Type" : [ "text/xml" ]
},
"body" : "......."
},
"times" : {
"remainingTimes" : 0,
"unlimited" : true
},
"timeToLive" : {
"unlimited" : true
}
}
I create it with the following code:
private static HttpResponse responseWithBody(String responseBody, int statusCode, String contentType) {
return HttpResponse.response()
.withStatusCode(statusCode)
.withHeader("Content-Type",contentType)
.withBody(responseBody);
}
If I just comment out the line with the .withHeader("Content-Type", contentType) call, everything runs fine. Any clue about that?
Thanks a lot.
I had the same problem. It looks like it was caused by a bug in the XML parser used underneath; see https://github.com/jamesdbloom/mockserver/issues/451
Upgrade to mockserver 5.4.1 and you'll be fine :)
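For reference, a sketch of the corresponding dependency bump, assuming you pull mockserver-netty via the usual org.mock-server Maven coordinates:
<dependency>
    <groupId>org.mock-server</groupId>
    <artifactId>mockserver-netty</artifactId>
    <version>5.4.1</version>
    <scope>test</scope>
</dependency>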

Cannot Transform Correctly with Elasticsearch Watcher {{ctx.payload.hits.hits}}

I have a watcher configuration as follows:
{
"trigger": {
"schedule": {
"interval": "5s"
}
},
"input" : {
"search" : {
"request" : {
"indices" : [ "my_index" ],
"types" : [ "my_type" ],
"body" : {
"query" : {
"match_all" : {}
}
}
}
}
},
"transform" : {
"script" : "return [ body: groovy.json.JsonOutput.toJson(ctx.payload.hits.hits)]"
},
"actions" : {
"hbase_webhook" : {
"webhook" : {
"method" : "POST",
"host" : "<some_ip>",
"port" : <some_port>,
"path": "/v0.1/_events",
"body" : "data: {{ctx.payload.body}}"
}
}
}
}
The data posted in the body is not valid JSON. It looks something like:
{ 'data: ': { '{"_index":"my_index","_type":"my_type","_source":{"key":"val"}},"_id":"<some_id>","_score":1.0}': '' } }
I don't know how to parse this output; JSON.parse in Node.js won't parse it correctly either.
Never. Forget. Headers.
I was forgetting:
"headers" {
"Content-type": "application/json"
}
So it was impossible to parse with any tool.
Ran into this while creating an alert for an endpoint where we just wanted to send off the actual records that matched specific criteria. See sample below:
"actions": {
"my_webhook": {
"webhook": {
"scheme": "https",
"host": "webhook.site",
"port": 443,
"method": "post",
"path": "/webhooksiteguidwouldbehere",
"params": {},
"headers": {
"Content-type": "application/json"
},
"body": "{{#toJson}}ctx.payload.hits.hits{{/toJson}}"
}
}
}
Second note:
If the body size is set to 0, your hits will be returned as null. :)
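That size is presumably the one on the search input's body, which limits how many hits the query returns and therefore what ctx.payload.hits.hits contains. A sketch of the relevant part of the watch, using the standard Elasticsearch search body parameters:
"input" : {
    "search" : {
        "request" : {
            "indices" : [ "my_index" ],
            "body" : {
                "size" : 100,
                "query" : { "match_all" : {} }
            }
        }
    }
}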

Sub-records in Avro with Morphlines

I'm trying to convert JSON into Avro using the kite-sdk morphline module. After playing around, I'm able to convert the JSON into Avro using a simple schema (no complex data types).
Then I took it one step further and modified the Avro schema as displayed below (subrec.avsc). As you can see, the schema consists of a subrecord.
As soon as I try to convert the JSON to Avro using the morphlines.conf and the subrec.avsc, it fails.
Somehow the JSON paths "/record_type[]/alert/action" are not translated by the toAvro function.
The morphlines.conf
morphlines : [
{
id : morphline1
importCommands : ["org.kitesdk.**"]
commands : [
# Read the JSON blob
{ readJson: {} }
{ logError { format : "record: {}", args : ["#{}"] } }
# Extract JSON
{ extractJsonPaths { flatten: false, paths: {
"/record_type[]/alert/action" : /alert/action,
"/record_type[]/alert/signature_id" : /alert/signature_id,
"/record_type[]/alert/signature" : /alert/signature,
"/record_type[]/alert/category" : /alert/category,
"/record_type[]/alert/severity" : /alert/severity
} } }
{ logError { format : "EXTRACTED THIS : {}", args : ["#{}"] } }
{ extractJsonPaths { flatten: false, paths: {
timestamp : /timestamp,
event_type : /event_type,
source_ip : /src_ip,
source_port : /src_port,
destination_ip : /dest_ip,
destination_port : /dest_port,
protocol : /proto,
} } }
# Create Avro according to schema
{ logError { format : "WE GO TO AVRO"} }
{ toAvro { schemaFile : /etc/flume/conf/conf.empty/subrec.avsc } }
# Create Avro container
{ logError { format : "WE GO TO BINARY"} }
{ writeAvroToByteArray { format: containerlessBinary } }
{ logError { format : "DONE!!!"} }
]
}
]
And the subrec.avsc
{
"type" : "record",
"name" : "Event",
"fields" : [ {
"name" : "timestamp",
"type" : "string"
}, {
"name" : "event_type",
"type" : "string"
}, {
"name" : "source_ip",
"type" : "string"
}, {
"name" : "source_port",
"type" : "int"
}, {
"name" : "destination_ip",
"type" : "string"
}, {
"name" : "destination_port",
"type" : "int"
}, {
"name" : "protocol",
"type" : "string"
}, {
"name": "record_type",
"type" : ["null", {
"name" : "alert",
"type" : "record",
"fields" : [ {
"name" : "action",
"type" : "string"
}, {
"name" : "signature_id",
"type" : "int"
}, {
"name" : "signature",
"type" : "string"
}, {
"name" : "category",
"type" : "string"
}, {
"name" : "severity",
"type" : "int"
}
] } ]
} ]
}
On { logError { format : "EXTRACTED THIS : {}", args : ["@{}"] } } I get the following output:
[{
/record_type[]/alert/action = [allowed],
/record_type[]/alert/category = [],
/record_type[]/alert/severity = [3],
/record_type[]/alert/signature = [GeoIP from NL, Netherlands],
/record_type[]/alert/signature_id = [88006],
_attachment_body = [{
"timestamp": "2015-03-23T07:42:01.303046",
"event_type": "alert",
"src_ip": "1.1.1.1",
"src_port": 18192,
"dest_ip": "46.231.41.166",
"dest_port": 62004,
"proto": "TCP",
"alert": {
"action": "allowed",
"gid": "1",
"signature_id": "88006",
"rev": "1",
"signature" : "GeoIP from NL, Netherlands ",
"category" : ""
"severity" : "3"
}
}],
_attachment_mimetype=[json/java + memory],
basename = [simple_eve.json]
}]
UPDATE 2017-06-22
You MUST populate the data in the structure in order for this to work, by using addValues or setValues:
{
addValues {
micDefaultHeader : [
{
eventTimestampString : "2017-06-22 18:18:36"
}
]
}
}
After debugging the sources of the morphline toAvro command, it appears that the record is the first object to be evaluated, no matter what you put in your mappings structure.
The solution is quite simple, but unfortunately it took a little extra time: Eclipse, running the Flume agent in debug mode, cloning the source code, and lots of coffee.
Here it goes.
My schema:
{
"type" : "record",
"name" : "co_lowbalance_event",
"namespace" : "co.tigo.billing.cboss.lowBalance",
"fields" : [ {
"name" : "dummyValue",
"type" : "string",
"default" : "dummy"
}, {
"name" : "micDefaultHeader",
"type" : {
"type" : "record",
"name" : "mic_default_header_v_1_0",
"namespace" : "com.millicom.schemas.root.struct",
"doc" : "standard millicom header definition",
"fields" : [ {
"name" : "eventTimestampString",
"type" : "string",
"default" : "12345678910"
} ]
}
} ]
}
Morphlines file:
morphlines : [
{
id : convertJsonToAvro
importCommands : ["org.kitesdk.**"]
commands : [
{
readJson {
outputClass : java.util.Map
}
}
{
addValues {
micDefaultHeader : [{}]
}
}
{
logDebug { format : "my record: {}", args : ["#{}"] }
}
{
toAvro {
schemaFile : /home/asarubbi/Development/test/co_lowbalance_event.avsc
mappings : {
"micDefaultHeader" : micDefaultHeader
"micDefaultHeader/eventTimestampString" : eventTimestampString
}
}
}
{
writeAvroToByteArray {
format : containerlessJSON
codec : null
}
}
]
}
]
The magic lies here:
{
addValues {
micDefaultHeader : [{}]
}
}
And in the mappings:
mappings : {
"micDefaultHeader" : micDefaultHeader
"micDefaultHeader/eventTimestampString" : eventTimestampString
}
Explanation:
Inside the code, the first field name that is evaluated is micDefaultHeader, of type RECORD. As there is no way to specify a default value for a RECORD (logically correct), the toAvro code evaluates it, does not find any value configured in the mappings, and therefore fails, because it (wrongly) detects that the record is empty when it shouldn't be.
However, taking a look at the code, you can see that it only requires a Map object containing no values to please the parser and continue to the next element.
So we add a map object using addValues and fill it with an empty map [{}]. Note that this must match the name of the record that is causing the empty value; in my case "micDefaultHeader".
Feel free to comment if you have a better solution, as this looks like a "dirty fix".