I have the controller and unit test shown below. The controller returns a sorted list. In the unit test, the sort field is passed correctly; however, the sort direction is not being parsed:
I am following these examples:
Isolated Controller Test can't instantiate Pageable
4. Paging and Sorting
Controller:
@RequestMapping("/telemetry")
public Page<TelemetryMessage> List(Pageable pageable) {
return telemetryMessageRepository.findAll(pageable);
}
Unit Test:
@Before
public void setUp() throws Exception {
mvc = MockMvcBuilders
.standaloneSetup(new TelemetryController(telemetryMessageRepository))
.setCustomArgumentResolvers(new PageableHandlerMethodArgumentResolver())
.setViewResolvers(new ViewResolver() {
@Override
public View resolveViewName(String viewName, Locale locale) throws Exception {
return new MappingJackson2JsonView();
}
})
.build();
// etc.
}
@Test
public void testListReturnsDefault20() throws Exception {
Iterable<TelemetryMessage> expected = telemetryMessageRepository.findAll(new PageRequest(0, 20, Sort.Direction.DESC, "id"));
String json = mapper.writeValueAsString(expected);
MvcResult result = mvc.perform(MockMvcRequestBuilders.get("/telemetry")
.param("sort", "id")
.param("id.dir", "desc")
.accept(MediaType.APPLICATION_JSON))
.andExpect(status().isOk())
.andReturn();
String actual = result.getResponse().getContentAsString();
}
Use a comma to separate the property and the direction, as follows:
MvcResult result = mvc.perform(MockMvcRequestBuilders.get("/telemetry")
.param("sort", "id,desc") // property and direction in a single value
//.param("id.dir", "desc") // no longer needed
.accept(MediaType.APPLICATION_JSON))
.andExpect(status().isOk())
.andReturn();
I'm no expert at unit testing, but I am trying to write a unit test for:
@KafkaListener(id = "group_id", topics = "topic")
public AvroObject listen(AvroObject test, Acknowledgment ack)
But I have no idea how to do it when there is an interface as an argument. I tried the following, but I am not sure whether it is useful or even makes sense as a test:
@InjectMocks
KafkaConsumer kafkaConsumerTest;
@Test
@DisplayName("Assert Valid Consume")
void consumeValidEvent() throws URISyntaxException, IOException, InterruptedException {
// given
AvroObject event = createEvent(); //Create sample object as AvroObject
// when
AvroObject response = kafkaConsumerTest.listen(event, new Acknowledgment() {
@Override
public void acknowledge() {
}
@Override
public void nack(long sleep) {
//do nothing
}
});
// then
assertNotNull(response);
assertEquals(response.getCode1(), 98765);
assertEquals(response.getCode2(), 123456);
}
I was wondering if you can give me the best approach for this situation! cheers
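One possible approach, only as a sketch (it assumes Mockito with the JUnit 5 extension is on the classpath, and it reuses KafkaConsumer, AvroObject and createEvent() from the question above), is to mock the Acknowledgment interface instead of implementing it anonymously, which also lets you verify the acknowledgement:
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.kafka.support.Acknowledgment;

@ExtendWith(MockitoExtension.class)
class KafkaConsumerMockAckTest {

    @InjectMocks
    KafkaConsumer kafkaConsumerTest;   // class under test, as in the question

    @Test
    void consumeValidEventWithMockedAck() throws Exception {
        // given
        AvroObject event = createEvent();                  // sample payload, as in the question
        Acknowledgment ack = mock(Acknowledgment.class);   // stub the interface instead of implementing it

        // when
        AvroObject response = kafkaConsumerTest.listen(event, ack);

        // then
        assertNotNull(response);
        verify(ack).acknowledge();   // only meaningful if listen() is expected to acknowledge the message
    }
}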
I am new to Flink and I am trying to write JUnit test cases to test a KeyedBroadcastProcessFunction. Below is my code. I am currently calling the getDataStreamOutput method in the TestUtils class and passing the input data and pattern rules to it; once the input data is evaluated against the list of pattern rules and satisfies the condition, I get the signal, call the sink function, and return the output data as a string from getDataStreamOutput.
@Test
public void testCompareInputAndOutputDataForInputSignal() throws Exception {
Assertions.assertEquals(sampleInputSignal,
TestUtils.getDataStreamOutput(
inputSignal,
patternRules));
}
public static String getDataStreamOutput(JSONObject input, Map<String, String> patternRules) throws Exception {
env.setParallelism(1);
DataStream<JSONObject> inputSignal = env.fromElements(input);
DataStream<Map<String, String>> rawPatternStream =
env.fromElements(patternRules);
//Generate a key,value pair of set of patterns where key is pattern name and value is pattern condition
DataStream<Tuple2<String, Map<String, String>>> patternRuleStream =
rawPatternStream.flatMap(new FlatMapFunction<Map<String, String>,
Tuple2<String, Map<String, String>>>() {
@Override
public void flatMap(Map<String, String> patternRules,
Collector<Tuple2<String, Map<String, String>>> out) throws Exception {
for (Map.Entry<String, String> stringEntry : patternRules.entrySet()) {
JSONObject jsonObject = new JSONObject(stringEntry.getValue());
Map<String, String> map = new HashMap<>();
for (String key : jsonObject.keySet()) {
String value = jsonObject.get(key).toString();
map.put(key, value);
}
out.collect(new Tuple2<>(stringEntry.getKey(), map));
}
}
});
BroadcastStream<Tuple2<String, Map<String, String>>> patternRuleBroadcast =
patternRuleStream.broadcast(patternRuleDescriptor); // patternRuleDescriptor is the broadcast MapStateDescriptor (not shown here)
DataStream<Tuple2<String, JSONObject>> validSignal = inputSignal.map(new MapFunction<JSONObject,
Tuple2<String, JSONObject>>() {
@Override
public Tuple2<String, JSONObject> map(JSONObject inputSignal) throws Exception {
String source =
inputSignal.getSource();
return new Tuple2<>(source, inputSignal);
}
}).keyBy(0).connect(patternRuleBroadcast).process(new MyKeyedBroadCastProcessFunction());
validSignal.map(new MapFunction<Tuple2<String, JSONObject>,
JSONObject>() {
@Override
public JSONObject map(Tuple2<String, JSONObject> inputSignal) throws Exception {
return inputSignal.f1;
}
}).addSink(new getDataStreamOutput());
env.execute("TestFlink");
return getDataStreamOutput.dataStreamOutput;
}
@SuppressWarnings("serial")
public static final class getDataStreamOutput implements SinkFunction<JSONObject> {
public static String dataStreamOutput;
public void invoke(JSONObject inputSignal) throws Exception {
dataStreamOutput = inputSignal.toString();
}
}
I need to test different inputs against the same broadcast rules, but each time I call this function it repeats the whole process from the beginning: taking the input signal and broadcasting the data. Is there a way I can broadcast once and keep sending inputs to the method? I explored using a CoFlatMapFunction, something like the code below, to combine the data streams and keep sending the input while the method is running, but for that one of the data streams has to keep getting data from a Kafka topic, which again overburdens the method with loading the Kafka utils and server.
DataStream<JSONObject> inputSignalFromKafka = env.addSource(inputSignalKafka);
DataStream<org.json.JSONObject> inputSignalFromMethod = env.fromElements(inputSignal);
DataStream<JSONObject> inputSignal = inputSignalFromMethod.connect(inputSignalFromKafka)
.flatMap(new SignalCoFlatMapper());
public static class SignalCoFlatMapper
implements CoFlatMapFunction<JSONObject, JSONObject, JSONObject> {
@Override
public void flatMap1(JSONObject inputValue, Collector<JSONObject> out) throws Exception {
out.collect(inputValue);
}
@Override
public void flatMap2(JSONObject kafkaValue, Collector<JSONObject> out) throws Exception {
out.collect(kafkaValue);
}
}
I found a link on Stack Overflow, How to unit test BroadcastProcessFunction in flink when processElement depends on broadcasted data, but it confused me a lot.
Is there any way I can broadcast only once, in the @Before method of the test cases, and keep sending different kinds of data to my broadcast function?
You can use a KeyedTwoInputStreamOperatorTestHarness to achieve this. For example, let's assume you have the following KeyedBroadcastProcessFunction, where you define some business logic for both DataStream channels:
public class SimpleKeyedBroadcastProcessFunction extends KeyedBroadcastProcessFunction<String, String, String, String> {
@Override
public void processElement(String inputEntry,
ReadOnlyContext readOnlyContext, Collector<String> collector) throws Exception {
//business logic for how you want to process your data stream records
}
@Override
public void processBroadcastElement(String broadcastInput, Context
context, Collector<String> collector) throws Exception {
//process input from your broadcast channel
}
}
Let's now assume your process function is stateful and makes modifications to the Flink internal state; you would then have to create a test harness inside your test class to ensure you are able to keep track of the state during testing.
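(The BROADCAST_MAP_STATE_DESCRIPTOR referenced in the harness below is not shown in this answer; it is assumed to be a MapStateDescriptor declared in the test class, with key/value types matching the broadcast state your function registers, for example:)
// Assumed broadcast state descriptor; adjust the name and types to match
// the descriptor your KeyedBroadcastProcessFunction actually registers.
private static final MapStateDescriptor<String, String> BROADCAST_MAP_STATE_DESCRIPTOR =
        new MapStateDescriptor<>("broadcast-state", BasicTypeInfo.STRING_TYPE_INFO, BasicTypeInfo.STRING_TYPE_INFO);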
I would then create some unit tests using the following approach:
public class SimpleKeyedBroadcastProcessFunctionTest {
private SimpleKeyedBroadcastProcessFunction processFunction;
private KeyedTwoInputStreamOperatorTestHarness<String, String, String, String> testHarness;
@Before
public void setup() throws Exception {
processFunction = new SimpleKeyedBroadcastProcessFunction();
testHarness = new KeyedTwoInputStreamOperatorTestHarness<>(
new CoBroadcastWithKeyedOperator<>(processFunction, ImmutableList.of(BROADCAST_MAP_STATE_DESCRIPTOR)),
(KeySelector<String, String>) string -> string ,
(KeySelector<String, String>) string -> string,
TypeInformation.of(String.class));
testHarness.setup();
testHarness.open();
}
@After
public void cleanup() throws Exception {
testHarness.close();
}
@Test
public void testProcessRegularInput() throws Exception {
//processElement1 sends elements into your regular stream; the second param is the event time of the record
testHarness.processElement1(new StreamRecord<>("Hello", 0));
//Access records collected during processElement
List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
assertEquals("Hello", records.get(0).getValue())
}
#Test
public void testProcessBroadcastInput() throws Exception {
//processElement2 sends elements into your broadcast stream; the second param is the event time of the record
testHarness.processElement2(new StreamRecord<>("Hello from Broadcast", 0));
//Access records collected during processElement
List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
assertEquals("Hello from Broadcast", records.get(0).getValue())
}
}
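If, as asked above, the goal is to broadcast the rules only once and then push several regular elements, one option is to move the broadcast element into setup(). This is only a sketch, under the assumption that the harness created in setup() is reused by every test method:
@Before
public void setup() throws Exception {
    // ... build and open the testHarness exactly as in the setup() above ...
    // Broadcast the rules once; the test methods then only push regular elements.
    testHarness.processElement2(new StreamRecord<>("some pattern rule", 0));
}

@Test
public void testSeveralInputsAgainstSameBroadcastRules() throws Exception {
    testHarness.processElement1(new StreamRecord<>("first input", 1));
    testHarness.processElement1(new StreamRecord<>("second input", 2));
    // assert on testHarness.extractOutputStreamRecords() as in the tests above
}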
I am new to Java and JUnit. Please help me write a JUnit test case to test the CargoBO method, where equals and hashCode are not implemented. Basically, I need to compare two objects using the EqualsBuilder class in JUnit.
public class CargoBO {
public Cargo cargoDetails(String name,String desc,double length,double width) {
return new Cargo(name,desc,length,width);
}
}
public class CargoJUnit {
Cargo cargo;
@Before
public void createObjectForCargo() {
cargo = new Cargo("audi","des",123.00,234.00);
}
@Test
public void testCargoDetails() {
CargoBO cargoBO = new CargoBO();
//assertTrue(cargo.equals(cargoBO.cargoDetails("audi","des",123.00,234.00)));
Assert.assertEquals(cargo, cargoBO.cargoDetails("audi","des",123.00,234.00));
}
}
The correct test case for your scenario is:
@Test
public void testCargoDetails() {
String name = "test name";
String desc = "desc";
double length = 10d;
double width = 100d;
CargoBO cargoBO = new CargoBO();
Cargo result = cargoBO.cargoDetails(name, desc, length, width);
Assert.assertEquals(name, result.getName());
Assert.assertEquals(desc, result.getDesc());
Assert.assertEquals(length, result.getLength(), 0.0);
Assert.assertEquals(width, result.getWidth(), 0.0);
}
You are testing a method which accepts parameters and calls a constructor, passing those parameters along.
Your test should verify that the given parameters are passed through correctly by the method.
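Since the question specifically asks about EqualsBuilder, here is a minimal alternative sketch (assuming Apache Commons Lang 3 is on the classpath and that Cargo keeps its values in plain fields visible to reflection) that compares the two objects field by field without Cargo needing equals()/hashCode():
import org.apache.commons.lang3.builder.EqualsBuilder;
import org.junit.Assert;
import org.junit.Test;

public class CargoEqualsBuilderTest {

    @Test
    public void testCargoDetailsWithEqualsBuilder() {
        Cargo expected = new Cargo("audi", "des", 123.00, 234.00);
        Cargo actual = new CargoBO().cargoDetails("audi", "des", 123.00, 234.00);

        // reflectionEquals compares all fields via reflection, so Cargo does not
        // need to implement equals()/hashCode() for this assertion to work.
        Assert.assertTrue(EqualsBuilder.reflectionEquals(expected, actual));
    }
}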
I have this custom matcher:
public class CofmanStringMatcher extends TypeSafeMatcher<String> {
private List<String> options;
private CofmanStringMatcher(final List<String> options) {
this.options = Lists.newArrayList(options);
}
@Override
protected boolean matchesSafely(final String sentResult) {
return options.stream().anyMatch(option -> option.equals(sentResult));
}
public static CofmanStringMatcher isCofmanStringOnOfTheStrings(List<String> options) {
return new CofmanStringMatcher(options);
}
@Override
public void describeTo(final Description description) {
System.out.println("in describeTo");
// description.appendText("expected to be equal to of the list: "+options);
}
}
which compares a string against a few candidate strings.
When I run this test code:
verify(cofmanService, times(1))
.updateStgConfigAfterSimulation(argThat(isCofmanStringOnOfTheStrings(ImmutableList.of(expectedConditionsStrings , expectedConditionsStrings2))), eq(Constants.addCommitMsg+SOME_REQUEST_ID));
I get this error:
Comparison Failure: <Click to see difference>
Argument(s) are different! Wanted:
cofmanService.updateStgConfigAfterSimulation(
,
"add partner request id = 1234"
);
-> at com.waze.sdkService.services.pubsub.callback.RequestToCofmanSenderTest.localAndRtValidationSucceeds_deployCofmanStg(RequestToCofmanSenderTest.java:131)
Actual invocation has different arguments:
cofmanService.updateStgConfigAfterSimulation(
"some text"
);
The test fails even though updateStgConfigAfterSimulation is called with a first argument that matches one of the list elements.
I'm using Mockito 1.10 and Hamcrest 1.3.
Here is the method's signature:
void updateStgConfigAfterSimulation(String conditionsMap, String commitMsg) throws Exception
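(One detail worth noting: the blank first argument in the "Wanted:" output above comes from the empty describeTo implementation; re-enabling the commented-out appendText line at least makes the failure message show what the matcher expected, for example:)
@Override
public void describeTo(final Description description) {
    // Describe the expectation so Mockito's "Wanted:" output is readable.
    description.appendText("a string equal to one of: " + options);
}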
I am in the process of rewriting a bottleneck in the code of the project I am on, and in doing so I am creating a top-level item that contains a self-populating Ehcache. I am attempting to write a test to make sure that the basic call chain is established, but when the test executes it hangs when retrieving the item from the cache.
Here are the setup and the test; for reference, mocking is being done with Mockito:
@Before
public void SetUp()
{
testCache = new Cache(getTestCacheConfiguration());
recordingFactory = new EntryCreationRecordingCache();
service = new Service<Request, Response>(testCache, recordingFactory);
}
@Test
public void retrievesResultsFromSuppliedCache()
{
ResultType resultType = mock(ResultType.class);
Response expectedResponse = mock(Response.class);
addToExpectedResults(resultType, expectedResponse);
Request request = mock(Request.class);
when(request.getResultType()).thenReturn(resultType);
assertThat(service.getResponse(request), sameInstance(expectedResponse));
assertTrue(recordingFactory.requestList.contains(request));
}
private void addToExpectedResults(ResultType resultType,
Response response) {
recordingFactory.responseMap.put(resultType, response);
}
private CacheConfiguration getTestCacheConfiguration() {
CacheConfiguration cacheConfiguration = new CacheConfiguration("TEST_CACHE", 10);
cacheConfiguration.setLoggingEnabled(false);
return cacheConfiguration;
}
private class EntryCreationRecordingCache extends ResponseFactory{
public final Map<ResultType, Response> responseMap = new ConcurrentHashMap<ResultType, Response>();
public final List<Request> requestList = new ArrayList<Request>();
@Override
protected Map<ResultType, Response> generateResponse(Request request) {
requestList.add(request);
return responseMap;
}
}
Here is the Service class:
public class Service<K extends Request, V extends Response> {
private Ehcache cache;
public Service(Ehcache cache, ResponseFactory factory) {
this.cache = new SelfPopulatingCache(cache, factory);
}
@SuppressWarnings("unchecked")
public V getResponse(K request)
{
ResultType resultType = request.getResultType();
Element cacheEntry = cache.get(request);
V response = null;
if(cacheEntry != null){
Map<ResultType, Response> resultTypeMap = (Map<ResultType, Response>) cacheEntry.getValue();
try{
response = (V) resultTypeMap.get(resultType);
}catch(NullPointerException e){
throw new RuntimeException("Result type not found for Result Type: " + resultType);
}catch(ClassCastException e){
throw new RuntimeException("Incorrect Response Type for Result Type: " + resultType);
}
}
return response;
}
}
And here is the ResponseFactory:
public abstract class ResponseFactory implements CacheEntryFactory{
@Override
public final Object createEntry(Object request) throws Exception {
return generateResponse((Request)request);
}
protected abstract Map<ResultType,Response> generateResponse(Request request);
}
After wrestling with it for a while, I discovered that the cache wasn't being initialized. Creating a CacheManager and adding the cache to it resolved the problem.
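A minimal sketch of that fix (assuming Ehcache 2.x, where a Cache must be registered with a CacheManager before it can be used; the names mirror the SetUp method from the question):
@Before
public void SetUp()
{
    testCache = new Cache(getTestCacheConfiguration());
    // Register the cache with a CacheManager so it gets initialised;
    // an unregistered Cache cannot serve get()/put() calls.
    CacheManager cacheManager = CacheManager.create();
    cacheManager.addCache(testCache);
    recordingFactory = new EntryCreationRecordingCache();
    service = new Service<Request, Response>(testCache, recordingFactory);
}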
I also had a problem with EHCache hanging, although only in a hello-world example. Adding this to the end fixed it (the application ends normally).
CacheManager.getInstance().removeAllCaches();
https://stackoverflow.com/a/20731502/2736496