Full JSON match with REST-Assured

I'm using REST-Assured to test an API. My API responds with JSON and, according to the docs, if this is the response:
{
  "id": "390",
  "data": {
    "leagueId": 35,
    "homeTeam": "Norway",
    "visitingTeam": "England"
  },
  "odds": [{
    "price": "1.30",
    "name": "1"
  },
  {
    "price": "5.25",
    "name": "X"
  }]
}
I could test like this:
@Test
public void givenUrl_whenSuccessOnGetsResponseAndJsonHasRequiredKV_thenCorrect() {
    get("/events?id=390")
        .then()
        .statusCode(200)
        .assertThat()
        .body("data.leagueId", equalTo(35));
}
Surely this is readable, but I would like a full comparison of the JSON (i.e. given the JSON response and a canned JSON, ideally stored in a resource file, are the two equal?). Does REST-Assured offer something like that, or do I need to do it myself?

Use REST-Assured's JsonPath to parse the JSON file into a Map and then compare it with Hamcrest matchers. This way the ordering of keys doesn't matter.
import static org.hamcrest.Matchers.equalTo;
import io.restassured.path.json.JsonPath;
...
JsonPath expectedJson = new JsonPath(new File("/path/to/expected.json"));
given()
...
.then()
.body("", equalTo(expectedJson.getMap("")));

Karate is exactly what you are looking for - you can perform a full equality match of a JSON payload in one step.
And for the cases where you have dynamic values (generated keys, timestamps) Karate provides a very elegant way for you to ignore (or just validate the format of) these keys.
One of the primary motivations for creating Karate was to come up with a better alternative to REST-assured. You can refer to this document, which can help you evaluate Karate and make a case for it in your organization: Karate vs REST-assured.

REST-Assured does not support full JSON comparison, only schema validation and assertions on parts of the body, as you already have in your question. What you can do is use Hamcrest's JSON comparator SameJSONAs from the hamcrest-json project (Hamcrest JSON SameJSONAs).
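A minimal sketch of that approach, assuming the uk.co.datumedge hamcrest-json artifact is on the test classpath and the canned payload lives in a test resource file (the file path and endpoint below are placeholders):

import static io.restassured.RestAssured.get;
import static org.hamcrest.MatcherAssert.assertThat;
import static uk.co.datumedge.hamcrest.json.SameJSONAs.sameJSONAs;

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

@Test
public void fullBodyMatchesCannedJson() throws Exception {
    // Canned JSON kept as a test resource (illustrative path)
    String expected = new String(
            Files.readAllBytes(Paths.get("src/test/resources/expected-390.json")),
            StandardCharsets.UTF_8);

    String actual = get("/events?id=390")
            .then()
            .statusCode(200)
            .extract().body().asString();

    // Strict whole-document comparison; relax array ordering if the API does not guarantee it
    assertThat(actual, sameJSONAs(expected).allowingAnyArrayOrdering());
}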

If somebody is looking for a method that doesn't require parsing a JSON file:
You can check the body size first using Matchers.aMapWithSize(size), and then check the contents as usual.
Example:
@Test
public void getAccount_forbidden_whenUserIsAnonymous() {
    RestAssured
        .get("/account")
        .then()
        .statusCode(HttpServletResponse.SC_FORBIDDEN)
        .body("", Matchers.aMapWithSize(5),
            "timestamp", Matchers.notNullValue(),
            "status", Matchers.equalTo(HttpServletResponse.SC_FORBIDDEN),
            "error", Matchers.equalTo("Forbidden"),
            "message", Matchers.equalTo("Access Denied"),
            "path", Matchers.equalTo("/account"));
}

You can also validate the response against a JSON Schema in REST-Assured.
Try this code:
// Base Test [BaseTest.java]
public class BaseTest {
    protected RequestSpecification requestSpecificationToMerge = new RequestSpecBuilder()
            .setBaseUri("BASE URL")
            .setContentType(ContentType.JSON)
            .build();

    @BeforeMethod
    public void setFilter() {
        RestAssured.filters(new AllureRestAssured());
    }
}

// Test [Home.java]
public class Home extends BaseTest {
    @Test(priority = 0)
    public void getHome() {
        given()
            .spec(requestSpecificationToMerge)
            .basePath("/your endpoint")
        .when()
            .get()
        .then()
            .log().body()
            .body(matchesJsonSchemaInClasspath("home.json"));
    }
}
// JSON SCHEMA [home.json]
{
  "type": "object",
  "required": [
    "data",
    "meta",
    "status"
  ],
  "properties": {
    "data": {
      "type": ["array", "null"],
      "items": {
        "type": "object",
        "required": [
          "id",
          "title",
          "sorting"
        ],
        "properties": {
          "id": {
            "type": "integer"
          },
          "title": {
            "type": "string"
          },
          "sorting": {
            "type": "integer"
          }
        }
      }
    },
    "meta": {
      "type": ["object", "null"],
      "required": [
        "pagination"
      ],
      "items": {
        "type": "object",
        "required": [
          "current_page",
          "per_page",
          "total",
          "total_page"
        ],
        "properties": {
          "current_page": {
            "type": "integer"
          },
          "per_page": {
            "type": "integer"
          },
          "total": {
            "type": "integer"
          },
          "total_page": {
            "type": "integer"
          }
        }
      }
    },
    "status": {
      "type": "object",
      "required": [
        "code",
        "message_client",
        "message_server"
      ],
      "properties": {
        "code": {
          "type": "integer",
          "enum": [
            200,
            404
          ]
        },
        "message_client": {
          "type": "string"
        },
        "message_server": {
          "type": "string"
        }
      }
    }
  }
}

Easy way:
String s = "{\"ip\": \"your_ip\"}";
given().log().all()
    .contentType(ContentType.JSON)
    .get("http://ip.jsontest.com/")
    .then().log().all()
    .assertThat().body(containsString(s));
Not so easy way: you can create a custom matcher, or use JsonPath, which has options to compare JSONs.

Apparently, REST-Assured only provides capabilities to validate against a schema, as described here.
However, it's quite simple to make an exact comparison using jackson-databind and JUnit.
First, write a function that compares a body returned by REST-Assured with a file in the resources directory:
import java.io.IOException;
import java.util.Objects;

import org.junit.Assert;
import com.fasterxml.jackson.databind.ObjectMapper;
import io.restassured.response.ResponseBodyExtractionOptions;

void assertJsonEquals(String expectedJsonPath, ResponseBodyExtractionOptions actualBody) throws IOException {
    Assert.assertNotNull("The request returned no body", actualBody);
    final ObjectMapper mapper = new ObjectMapper();
    Assert.assertEquals(
        mapper.readTree(Objects.requireNonNull(getClass().getClassLoader().getResource(expectedJsonPath)).openStream().readAllBytes()),
        mapper.readTree(actualBody.asByteArray())
    );
}
Then, use it as shown below
final ResponseBodyExtractionOptions actualBody = given()
.accept("application/json")
.contentType(MediaType.APPLICATION_JSON)
.when()
.get("...")
.then()
.extract().body();
assertJsonEquals("expected-payload.json", actualBody);

You can use the JSONAssert library to match the entire JSON response. I recently wrote a blog post on how to achieve this.
Below is the basic usage of the library:
JSONAssert.assertEquals(expectedResponse, actualResponse, JSONCompareMode.LENIENT);
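For context, a minimal sketch of how this fits together with REST-Assured (the endpoint and the resource file path are placeholders):

import static io.restassured.RestAssured.get;

import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.skyscreamer.jsonassert.JSONAssert;
import org.skyscreamer.jsonassert.JSONCompareMode;

@Test
public void fullBodyComparisonWithJsonAssert() throws Exception {
    // Canned JSON kept as a test resource (illustrative path)
    String expectedResponse = new String(
            Files.readAllBytes(Paths.get("src/test/resources/expected-390.json")),
            StandardCharsets.UTF_8);

    String actualResponse = get("/events?id=390")
            .then()
            .statusCode(200)
            .extract().body().asString();

    // LENIENT tolerates extra fields and array reordering; use STRICT for an exact match
    JSONAssert.assertEquals(expectedResponse, actualResponse, JSONCompareMode.LENIENT);
}

STRICT mode gives the exact full-document comparison the question asks for; LENIENT is handy when the response may gain new fields.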

Related

RealmList<String> as a JSON Schema - Mongo DB Realm

I have a simple model class from which I need to generate the schema on MongoDB Atlas, but I'm having trouble defining RealmList<String> inside a JSON schema. If I insert "array" as the bsonType, I get an error. What should I write instead?
Model class:
class Note : RealmObject {
    @PrimaryKey
    var _id: ObjectId = ObjectId.create()
    var title: String = ""
    var description: String = ""
    var images: RealmList<String> = realmListOf()
    var date: RealmInstant = RealmInstant.from(System.currentTimeMillis(), 0)
}
Current Schema:
{
  "bsonType": "object",
  "properties": {
    "_id": {
      "bsonType": "objectId"
    },
    "title": {
      "bsonType": "string"
    },
    "description": {
      "bsonType": "string"
    },
    "images": {
      "bsonType": "array"
    },
    "date": {
      "bsonType": "date"
    }
  },
  "required": [
    "_id",
    "title",
    "description",
    "images",
    "date"
  ],
  "title": "Note"
}
I am not sure which mode you're using, but if you're in development mode the server will automatically generate a matching object when you add one in the SDK, as long as the changes are additive (like adding a new object or property).
In the question, the 'images' bson definition looks incomplete:
"images": {
"bsonType": "array"
},
While it is an array, it's an array of strings, so I believe it should look more like this:
"images": {
"bsonType": "array",
"items": {
"bsonType": "string"
}
}
Where the type of items is defined as a string

Logic Apps JSON array parsing throws an error for a single object but works fine for multiple objects

While parsing JSON in an Azure Logic App, the array in my payload can contain either a single object or multiple objects ("Box", as shown in the examples below).
Both types of input are valid, but when only a single object arrives, the parse step throws the error "Invalid type. Expected Array but got Object".
Input 1 (throws an error):
{
  "MyBoxCollection": {
    "Box": {
      "BoxName": "Box 1"
    }
  }
}
Input 2 (works fine):
{
  "MyBoxCollection": [
    {
      "Box": {
        "BoxName": "Box 1"
      },
      "Box": {
        "BoxName": "Box 2"
      }
    }
  ]
}
JSON Schema:
"MyBoxCollection": {
  "type": "object",
  "properties": {
    "box": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "BoxName": {
            "type": "string"
          },
          ......
          .....
          ..
}
Error Details:
[
  {
    "message": "Invalid type. Expected Array but got Object .",
    "lineNumber": 0,
    "linePosition": 0,
    "path": "Order.MyBoxCollection.Box",
    "schemaId": "#/properties/Root/properties/MyBoxCollection/properties/Box",
    "errorType": "type",
    "childErrors": []
  }
]
I used to use the trick of injecting a couple of dummy rows in the resultset as suggested by the other posts, but I recently found a better way. Kudos to Thomas Prokov for providing the inspiration in his NETWORG blog post.
The JSON parse schema accepts multiple choices as type, so simply replace
"type": "array"
with
"type": ["array","object"]
and your parse step will happily parse either an array or a single value (or no value at all).
You may then need to identify which scenario you're in: 0, 1 or multiple records in the resultset? I'm pasting below how you can create a variable (ResultsetSize) which takes one of 3 values (rs_0, rs_1 or rs_n) for your switch:
"Initialize_ResultsetSize": {
"inputs": {
"variables": [
{
"name": "ResultsetSize",
"type": "string",
"value": "rs_n"
}
]
},
"runAfter": {
"<replace_with_name_of_previous_action>": [
"Succeeded"
]
},
"type": "InitializeVariable"
},
"Check_if_resultset_is_0_or_1_records": {
"actions": {
"Set_ResultsetSize_to_0": {
"inputs": {
"name": "ResultsetSize",
"value": "rs_0"
},
"runAfter": {},
"type": "SetVariable"
}
},
"else": {
"actions": {
"Set_ResultsetSize_to_1": {
"inputs": {
"name": "ResultsetSize",
"value": "rs_1"
},
"runAfter": {},
"type": "SetVariable"
}
}
},
"expression": {
"and": [
{
"equals": [
"#string(body('<replace_with_name_of_Parse_JSON_action>')?['<replace_with_name_of_root_element>']?['<replace_with_name_of_list_container_element>']?['<replace_with_name_of_item_element>']?['<replace_with_non_null_element_or_attribute>'])",
""
]
}
]
},
"runAfter": {
"Initialize_ResultsetSize": [
"Succeeded"
]
},
"type": "If"
},
"Process_resultset_depending_on_ResultsetSize": {
"cases": {
"Case_no_record": {
"actions": {
},
"case": "rs_0"
},
"Case_one_record_only": {
"actions": {
},
"case": "rs_1"
}
},
"default": {
"actions": {
}
},
"expression": "#variables('ResultsetSize')",
"runAfter": {
"Check_if_resultset_is_0_or_1_records": [
"Succeeded",
"Failed",
"Skipped",
"TimedOut"
]
},
"type": "Switch"
}
For this problem, I found another Stack Overflow post describing a similar issue. When there is only one "Box", it is rendered as a {key/value pair} rather than an [array] when the data is converted to JSON. I think this is by design, so one option is to add a dummy "Box" record at the source of your XML data, such as:
<Box>specific_test</Box>
and then filter out the "specific_test" entry in the following steps.
Another workaround for your reference:
If your JSON data has only one array, we can use that to make a judgment: check whether the JSON data contains the "[" character. If it does, the expression returns the index of "["; if not, it returns -1.
The expression looks like this:
indexOf('{"MyBoxCollection":{"Box":[aaa,bbb]}}', '[')
When the data doesn't contain "[", the expression returns -1.
Then we can add an "If" condition: if the result is greater than 0, do "Parse JSON" with one schema; if it is -1, do "Parse JSON" with the other schema.
Hope this helps with your problem.
We faced a similar issue. The only solution we found was to manipulate the XML before conversion: we updated the XML nodes that need to be arrays so they are emitted as arrays even when they contain a single element. We used an Azure Function to update the required XML attributes and then returned the XML to Logic Apps for conversion. Hope this helps someone.

How do I get a configurationSection as string?

I am working with .NET Core 2.0 and I want to load a complete configuration section as a string.
To be specific, I want to do JSON Schema validation, and my schema is stored in appsettings.json:
{
  ...
  "schemas": {
    "project": {
      "title": "Project",
      "type": "object",
      "required": [ "param1" ],
      "additionalProperties": false,
      "properties": {
        "param1": {
          "type": "string"
        },
        "param2": {
          ...
        }
      }
    }
  },
  ...
}
Now I want to load the configuration section "schemas.project" as a string and let Json.NET Schema do the schema parsing.
Something like this:
var schemaString = this.configuration.GetSection("schemas.project").Get<string>();
var schema = JSchema.Parse(schemaString);
...
Is there a way to load a complete configuration section as a string? Otherwise I'll read the schema file in as a string.

angularJS $resource response is both array AND object

I've got this JSON file:
[
  {
    "name": "paprika",
    "imgSrc": "img/paprika.jpg"
  },
  {
    "name": "kurkku",
    "imgSrc": "img/kurkku.jpg"
  },
  {
    "name": "porkkana",
    "imgSrc": "img/porkkana.jpg"
  },
  {
    "name": "lehtisalaatti",
    "imgSrc": "img/lehtisalaatti.jpg"
  },
  {
    "name": "parsakaali",
    "imgSrc": "img/parsakaali.jpg"
  },
  {
    "name": "sipula",
    "imgSrc": "img/sipuli.jpg"
  },
  {
    "name": "peruna",
    "imgSrc": "img/peruna.jpg"
  },
  {
    "name": "soijapapu",
    "imgSrc": "img/soijapapu.jpg"
  },
  {
    "name": "pinaatti",
    "imgSrc": "img/pinaatti.jpg"
  }
]
Which I successfully fetch in a factory:
factory('getJson', ['$resource', function($resource) {
    return $resource('json/vegs.json', {}, {
        query: {method: 'GET', isArray: true}
    });
}]);
In my controller I can get the JSON file's content:
var vegs = getJson.query();
$scope.vegs = vegs;
console.log(vegs)
console.log(typeof vegs)
The weird part is that the first console.log produces an array of objects, as expected, while the second says it's an "object", not an array.
I can get the JSON content into my view using {{vegs}}, and I can use ng-repeat as well, though in the controller I can't do vegs[0] or vegs.length; it comes out empty.
I've been breaking my head on this for over 3 hours now :)
This isn't an 'answer', just an observation on one part of your issue. (Sorry, can't comment yet; new to Stack Overflow.)
Just a note on your comment that "the second console says it's an 'object', and not an array": using typeof on an array will always return "object".
There are various (and debated, it seems) ways to test whether something is an array, Array.isArray(obj) for example.
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/isArray

Creating an Avro schema for a simple json

I'm trying to build an Avro schema for the following JSON:
{
  "id": 1234,
  "my_name_field": "my_name",
  "extra_data": {
    "my_long_value": 1234567890,
    "my_message_string": "Hello World!",
    "my_int_value": 777,
    "some_new_field": 1
  }
}
The values for 'id' and 'my_name_field' are known, but the fields in 'extra_data' change dynamically and are unknown.
The Avro schema I had in mind is:
{
  "name": "my_record",
  "type": "record",
  "fields": [
    {"name": "id", "type": "int", "default": 0},
    {"name": "my_name_field", "type": "string", "default": "NoName"},
    {"name": "extra_data", "type": {"type": "map", "values": ["null", "int", "long", "string"]}}
  ]
}
My first idea was to make 'extra_data' a record with a map, but this does not work:
{ "name":"extra_data", "type":{"type":"map", "values":["null","int","long","string"]} }
I get:
AvroTypeException: Expected start-union. Got VALUE_NUMBER_INT
Apache gives some nice examples at https://cwiki.apache.org/confluence/display/Hive/AvroSerDe, but none seem to do the job.
This is the unit test I run to check:
public class AvroTest {

    @Test
    public void readRecord() throws IOException {
        String event = "{\"id\":1234,\"my_name_field\":\"my_name\",\"extra_data\":{\"my_long_value\":1234567890,\"my_message_string\":\"Hello World!\",\"my_int_value\":777,\"some_new_field\":1}}";
        SchemaRegistry<Schema> registry = new com.linkedin.camus.schema.MySchemaRegistry();
        DecoderFactory decoderFactory = DecoderFactory.get();
        ObjectMapper mapper = new ObjectMapper();
        GenericDatumReader<GenericData.Record> reader = new GenericDatumReader<GenericData.Record>();
        Schema schema = registry.getLatestSchemaByTopic("record_topic").getSchema();
        reader.setSchema(schema);
        HashMap hashMap = mapper.readValue(event, HashMap.class);
        long now = Long.valueOf(hashMap.get("now").toString()) * 1000;
        GenericData.Record read = reader.read(null, decoderFactory.jsonDecoder(schema, event));
    }
}
Would appreciate help with this,
Thanks.
If the list of extra data fields is really unknown, using multiple optional value fields may help, like this:
{
  "name": "my_record",
  "type": "record",
  "fields": [
    {"name": "id", "type": "int", "default": 0},
    {"name": "my_name_field", "type": "string", "default": "NoName"},
    {"name": "extra_data", "type": {
      "type": "array",
      "items": {
        "name": "extra_data_entry",
        "type": "record",
        "fields": [
          {"name": "extra_data_field_name", "type": "string"},
          {"name": "extra_data_field_type", "type": "string"},
          {"name": "extra_data_field_value_string", "type": ["null", "string"]},
          {"name": "extra_data_field_value_int", "type": ["null", "int"]},
          {"name": "extra_data_field_value_long", "type": ["null", "long"]}
        ]
      }
    }}
  ]
}
Then you can select the extra_data_field_value_* value based on the extra_data_field_type for that field.
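As a rough consumer-side sketch of that idea, using Avro's generic API (the field names come from the schema above; the printExtraData helper itself is just illustrative):

import java.util.List;
import org.apache.avro.generic.GenericRecord;

// 'read' is the GenericData.Record produced by the reader in the question
@SuppressWarnings("unchecked")
static void printExtraData(GenericRecord read) {
    List<GenericRecord> extras = (List<GenericRecord>) read.get("extra_data");
    for (GenericRecord entry : extras) {
        String name = entry.get("extra_data_field_name").toString();
        String type = entry.get("extra_data_field_type").toString();
        // Pick the populated value field based on the declared type
        Object value;
        if ("int".equals(type)) {
            value = entry.get("extra_data_field_value_int");
        } else if ("long".equals(type)) {
            value = entry.get("extra_data_field_value_long");
        } else {
            value = entry.get("extra_data_field_value_string");
        }
        System.out.println(name + " = " + value);
    }
}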