Postman/Newman JUnit report customization

I'm using Postman and Newman to run automated tests, and I export a JUnit report in order to consume the results in TFS.
However, when I open my .xml report, failures are reported as follows:
<failure type="AssertionFailure">
<![CDATA[Failed 1 times.]]>
</failure>
I would like to know if it is possible to customize the "Failed 1 times." text in order to pass more relevant data about the failure (e.g. the JSON body of the error and a description).
Thank you
Alexandre

Well, finally I found out how to proceed (not a clean way, but sufficient for my purpose so far):
I modify the file C:\Users\<myself>\AppData\Roaming\npm\node_modules\newman\lib\reporters\junit\index.js
The request data and response can be recovered from the 'executions' object:
stringExecutions = JSON.stringify(executions); // shows what the 'executions' object contains
From this I can get general information by JSON-parsing this string and extracting what I want:
jsonExecutions = JSON.parse(stringExecutions)
jsonExecutions[0].response._details.code // gives me the http return code,
jsonExecutions[0].response._details.name // gives me the status,
jsonExecutions[0].response._details.detail //gives a bit more details
Error data (at test case/testsuite level) can be recovered from the 'err.error' object:
stringData = JSON.stringify(err.error); jsonData = JSON.parse(stringData);
From that I extract the data I need, i.e.:
jsonData.name // the error type
jsonData.message // the error detail
jsonData.stacktrace // the error stack
By the way, in the original file the stack cannot be displayed, because there is no 'stack' property on err.error (it is named 'stacktrace').
Finally, failure data (at test step/testcase level) can be recovered from the 'failures' object:
stringFailure = JSON.stringify(failures); jsonFailure = JSON.parse(stringFailure);
From this I extract:
jsonFailure[0].name // the failure type
jsonFailure[0].stack // the failure stack
For my purpose, I add the response details from jsonExecutions to my testsuite error data, which makes the XML report much more verbose than before.
If there is a cleaner/smarter way to do this, do not hesitate to tell me; I'll be grateful.
Next step: do it cleanly by creating a custom reporter. :)
Alexandre

Related

How to use logger to print messages with assert

I am trying to implement a logger in my repo and I am having some issues combining the logger with JUnit assertions.
Sample assertion:
logger.info("Asserting the response.");
assertThat(response.statusCode())
        .withFailMessage("The test failed with status code:" + response.statusCode())
        .isEqualTo(200);
I want to use the logger.error() function in place of withFailMessage, but I can't seem to find a suitable method.
Standard assertions (i.e., assertThat()) are meant to fail immediately with an AssertionError.
If you would like to have custom logic in case of failures, Soft Assertions together with the callback feature might be what you are looking for.
Your example would become something like:
SoftAssertions softly = new SoftAssertions();
softly.setAfterAssertionErrorCollected(error -> logger.error("Assertion failed: {}", error));
softly.assertThat(response.statusCode())
        .isEqualTo(200);
softly.assertAll(); // rethrows the collected errors so the test still fails

How do I get django "Data too long for column '<column>' at row" errors to print the actual value?

I have a Django application. Sometimes in production, when uploading data, I get an error that one of the values is too long. It would be very helpful for debugging if I could see which value was the one that went over the limit. Can I configure this somehow? I'm using MySQL.
It would also be nice if I could enable/disable this on a per-model or column basis so that I don't leak user data to error logs.
When creating model instances from outside sources, one must take care to validate the input or have other guarantees that this data cannot violate constraints.
If you call save() directly without at least calling full_clean() on the model, you bypass Django's validators and will only be alerted to the problem by the database driver, at which point it is harder to obtain diagnostics:
import json

from django.core.exceptions import ValidationError
from django.db import models


class JsonImportManager(models.Manager):
    # "import" is a reserved word in Python, so the method needs another name.
    def import_json(self, json_string: str) -> int:
        data_list = json.loads(json_string)  # list of objects => list of dicts
        failed = 0
        for data in data_list:
            obj = self.model(**data)
            try:
                obj.full_clean()
            except ValidationError as e:
                print(e.message_dict)  # or use a better formatting function
                failed += 1
            else:
                obj.save()
        return failed
This is of course very simple, but it's a good boilerplate to get started with.

Collect JSON object in a file when a Junit test fails

I have ~50 JSON files, each containing an array of models, that are fed into unit tests to compare the resulting configs. The files are named
0.json
1.json... and so on
and each looks like this:
[{model1},{model2},{model3}]
I am trying to run unit tests to compare the resulting configs, and I want the test itself to keep running, collect the models whose assertions fail, and output them to a JSON file somewhere.
Say model2 fails; I want to collect model2 into a file output.json as part of an array.
So far the code looks like this. Even if the test stays file by file that's fine, it will still save me days of effort:
@Test
public void compareAWithB() throws Exception {
    File lbJsonFile1 = new File("src/test/resources/iad_ad3/6.json");
    compareAWithBHelper(lbJsonFile1);
}

public void compareAWithBHelper(File lbJsonFile) throws Exception {
    Model[] dtos = new ObjectMapper().readValue(lbJsonFile, Model[].class);
    for (Model dto : dtos) {
        Model model = ModelConverter.apiToDao(dto);
        String A = doSomeThing();
        String B = doSomething2();
        Assert.assertEquals(A, B);
        // Required: if the assert fails, collect the json object and continue
    }
}
I tried using SoftAssertions in AssertJ, but weirdly it was not printing out all the JSON objects, or maybe I don't really understand the checkThat() method properly.
I tried using collector.checkThat but couldn't get it to work reliably. This is a production area, so I don't have much room for error and want to reduce the manual effort.
I made another attempt to use collectors, following one of the posts on Stack Overflow, but couldn't get it to work reliably:
/*
try {
    collector.checkThat(A, CoreMatchers.equalTo(B));
} catch (AssertionError error) {
    System.out.println(dto.toString());
    throw new AssertionError(error.getMessage());
}
*/
Can someone please help?
If you want to gather all assertion errors and not stop at the first one, then soft assertions are a good candidate. To get started with soft assertions you can follow the guide available here: https://assertj.github.io/doc/#assertj-core-soft-assertions.
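To make this concrete, here is a rough, untested sketch of how the loop from the question could look with AssertJ soft assertions, so that all comparisons keep running and every failing model is written to output.json. Model, ModelConverter, doSomeThing() and doSomething2() are taken from the question; everything else (class name, paths) is an assumption, not a drop-in solution:
import com.fasterxml.jackson.databind.ObjectMapper;
import org.assertj.core.api.SoftAssertions;
import org.junit.Test;

import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class CompareConfigsTest {

    private final ObjectMapper mapper = new ObjectMapper();

    @Test
    public void compareAWithB() throws Exception {
        Model[] dtos = mapper.readValue(new File("src/test/resources/iad_ad3/6.json"), Model[].class);

        SoftAssertions softly = new SoftAssertions();
        List<Model> failedModels = new ArrayList<>();

        for (Model dto : dtos) {
            Model model = ModelConverter.apiToDao(dto);
            String a = doSomeThing();
            String b = doSomething2();

            // If the next assertion fails, the number of collected errors grows,
            // which tells us this particular model is the culprit.
            int errorsBefore = softly.errorsCollected().size();
            softly.assertThat(a).as("config mismatch in %s", dto).isEqualTo(b);
            if (softly.errorsCollected().size() > errorsBefore) {
                failedModels.add(dto);
            }
        }

        // Dump every failing model into output.json as a JSON array.
        if (!failedModels.isEmpty()) {
            mapper.writerWithDefaultPrettyPrinter().writeValue(new File("output.json"), failedModels);
        }

        // Report all collected assertion errors at the end (fails the test if any).
        softly.assertAll();
    }
}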
collector.checkThat does not come from AssertJ (nor does anything else in your code samples), which is a bit confusing; I would suggest writing a reproducible test so that people can help more easily.
Alternatively, since you are dealing with JSON, you can give JsonUnit (https://github.com/lukas-krecan/JsonUnit) a try; it provides first-class JSON assertions.
Hope it helps.

AWS SDK in Java - How to get activities from a worker when multiple executions are ongoing for a state machine

AWS Step Functions
My problem is how to sendTaskSuccess or sendTaskFailure to an activity that is running under a state machine in AWS.
My actual intent is to notify the specific activities which belong to a particular state machine execution.
I can successfully send a notification to all waiting activities by activity ARN, but what I actually need is to send the notification to the specific activity which belongs to a particular state machine execution.
Example: state machine SM1 has two executions ongoing, SM1E1 and SM1E2. In that case I want to sendTaskSuccess to the activity which belongs to SM1E1.
The following is the code I used, but it sends the notification to all activities:
GetActivityTaskResult getActivityTaskResult = client.getActivityTask(
        new GetActivityTaskRequest().withActivityArn("arn detail"));
if (getActivityTaskResult.getTaskToken() != null) {
    try {
        JsonNode json = Jackson.jsonNodeOf(getActivityTaskResult.getInput());
        String outputResult = patientRegistrationActivity.setStatus(json.get("patientId").textValue());
        System.out.println("outputResult " + outputResult);
        SendTaskSuccessRequest sendTaskRequest = new SendTaskSuccessRequest()
                .withOutput(outputResult)
                .withTaskToken(getActivityTaskResult.getTaskToken());
        client.sendTaskSuccess(sendTaskRequest);
    } catch (Exception e) {
        client.sendTaskFailure(
                new SendTaskFailureRequest().withTaskToken(getActivityTaskResult.getTaskToken()));
    }
}
As far as I know you have no control over which task token is returned. You may get one for SM1E1 or SM1E2 and you cannot tell by looking at the task token. GetActivityTask returns "input" so based on that you may be able to tell which execution you are dealing with but if you get a token you are not interested in, I don't think there's a way to put it back so you won't be able to get it again with GetActivityTask later. I guess you could store it in a database somewhere for use later.
One idea you can try is the callback integration pattern. You can specify the Payload parameter in the state definition to include the task token, like this: token.$: "$$.Task.Token". Then use GetExecutionHistory to find the TaskScheduled event of the execution you are interested in, retrieve the parameters.Payload.token value, and use that with sendTaskSuccess.
Here's a snippet of my serverless.yml file that describes the state:
WaitForUserInput: # Wait for the user to do something
  Type: Task
  Resource: arn:aws:states:::lambda:invoke.waitForTaskToken
  Parameters:
    FunctionName:
      Fn::GetAtt: [WaitForUserInputLambdaFunction, Arn]
    Payload:
      token.$: "$$.Task.Token"
      executionArn.$: "$$.Execution.Id"
  Next: DoSomethingElse
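A rough, untested Java sketch of that idea, reusing the AWS SDK for Java v1 classes already shown in the question (the method wrapper, the output JSON, and stopping at the first TaskScheduled event are all simplifying assumptions):
import com.amazonaws.services.stepfunctions.AWSStepFunctions;
import com.amazonaws.services.stepfunctions.model.GetExecutionHistoryRequest;
import com.amazonaws.services.stepfunctions.model.GetExecutionHistoryResult;
import com.amazonaws.services.stepfunctions.model.HistoryEvent;
import com.amazonaws.services.stepfunctions.model.SendTaskSuccessRequest;
import com.amazonaws.util.json.Jackson;
import com.fasterxml.jackson.databind.JsonNode;

// Complete the callback task of one specific execution by digging its task
// token out of that execution's history.
void completeTaskForExecution(AWSStepFunctions client, String executionArn) {
    GetExecutionHistoryResult history = client.getExecutionHistory(
            new GetExecutionHistoryRequest().withExecutionArn(executionArn));

    for (HistoryEvent event : history.getEvents()) {
        if ("TaskScheduled".equals(event.getType())) {
            // "parameters" is the JSON that was passed to the task; with the
            // Payload above it contains the resolved token under Payload.token.
            JsonNode parameters = Jackson.jsonNodeOf(
                    event.getTaskScheduledEventDetails().getParameters());
            String taskToken = parameters.get("Payload").get("token").textValue();

            client.sendTaskSuccess(new SendTaskSuccessRequest()
                    .withTaskToken(taskToken)
                    .withOutput("{\"status\":\"done\"}"));
            break;
        }
    }
}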
I did a POC to check, and below is the solution.
If a token is consumed by getActivityTaskResult.getTaskToken() but your conditions are not satisfied by the request input, you can use the line below to avoid consuming the token:
awsStepFunctionClient.sendTaskHeartbeat(new SendTaskHeartbeatRequest().withTaskToken(taskToken))
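Tying the two answers together, a hedged sketch of the worker side: it assumes the state definition also injects the execution ARN into the activity input (e.g. an executionArn field populated from $$.Execution.Id, which is not shown in the question), and wantedExecutionArn is a made-up variable holding the execution you care about:
GetActivityTaskResult task = client.getActivityTask(
        new GetActivityTaskRequest().withActivityArn("arn detail"));

if (task.getTaskToken() != null) {
    JsonNode input = Jackson.jsonNodeOf(task.getInput());
    // "executionArn" is assumed to be injected into the activity input by the
    // state definition, e.g. "executionArn.$": "$$.Execution.Id".
    String executionArn = input.get("executionArn").textValue();

    if (wantedExecutionArn.equals(executionArn)) {
        client.sendTaskSuccess(new SendTaskSuccessRequest()
                .withTaskToken(task.getTaskToken())
                .withOutput(task.getInput()));
    } else {
        // Not our execution: keep the task alive instead of completing it.
        client.sendTaskHeartbeat(new SendTaskHeartbeatRequest()
                .withTaskToken(task.getTaskToken()));
    }
}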

JUnit4 - format assert message _after_ failure detected

Suppose I have a non-trivial calculation function taking a bunch of parameters, and I have to test it for at least thousands of cases.
I would like a detailed message with all parameter values when a particular case fails. I can format the message string before the check and pass it to an assertXXX method, but that is very inefficient: my test spends most of its time formatting strings.
My question is:
Is there any smart way to format the message string and pass it to JUnit only after a test failure is detected?
One simple approach is to format the message only in the failure branch, so the cost is paid just for the failing cases:
if (foo.conditionThatCanFail()) {
    fail("condition failed for " + foo);
}
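That pattern can be wrapped in a small helper so the expensive formatting only runs when a check actually fails. This is a hand-rolled sketch, not a JUnit API; the helper and variable names are made up:
import static org.junit.Assert.fail;

import java.util.function.Supplier;

public final class LazyAsserts {

    private LazyAsserts() {
    }

    // The Supplier is only invoked when the condition is false, so no message
    // string is built for the thousands of passing cases.
    public static void assertTrueLazy(boolean condition, Supplier<String> message) {
        if (!condition) {
            fail(message.get());
        }
    }
}
A test would then call something like:
LazyAsserts.assertTrueLazy(actual.equals(expected),
        () -> String.format("calculation failed for a=%s, b=%s, c=%s (got %s, expected %s)",
                a, b, c, actual, expected));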
As @bmargulies suggested, some assertion frameworks (like Hamcrest, Fest or Truth) will provide a nicely formatted failure message if an assertion fails.