How Do I Verify and Capture the Contents of an InputStream in JUnit Mockito?

I have the following setup:
public interface CommandRunner
{
    void run(String cmd, InputStream is) throws IOException;
}

public class CommandRunnerInputFile
{
    private final CommandRunner commandRunner;

    public CommandRunnerInputFile(CommandRunner commandRunner) {
        this.commandRunner = commandRunner;
    }

    public void run(String command, File inputFile) throws IOException {
        try (FileInputStream is = new FileInputStream(inputFile)) {
            this.commandRunner.run(command, is);
        }
    }
}
@ExtendWith(MockitoExtension.class)
public class TestCommandRunnerInputFile
{
    @Mock CommandRunner commandRunner;
    @Captor ArgumentCaptor<InputStream> inputStream;

    private CommandRunnerInputFile commandRunnerInputFile;

    @BeforeEach
    void initService() {
        commandRunnerInputFile = new CommandRunnerInputFile(commandRunner);
    }

    @Test
    public void testHappyPath() throws IOException {
        ClassLoader classLoader = this.getClass().getClassLoader();
        File file = new File(classLoader.getResource("MyTestInputFile.txt").getFile());
        commandRunnerInputFile.run("MyApplication.exe", file);
        verify(commandRunner).run(eq("MyApplication.exe"), inputStream.capture());
        assertEquals('c', (char) inputStream.getValue().read());
    }
}
When I run this test, it fails with the following Exception:
java.io.IOException: Stream Closed
at java.io.FileInputStream.read0(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:207)
This makes sense to me: during execution of the method, the underlying FileInputStream is closed when the try-with-resources block completes. That is to say, at the point in time that the FileInputStream is captured by Mockito it is open, but by the time I verify that it was passed (and attempt to verify its contents) it is closed. What can I do to capture not simply the InputStream object itself, but its contents for verification?
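One way to do this (a minimal sketch, assuming Mockito 2+ and Java 9+ for InputStream.transferTo): consume the stream inside an Answer at invocation time, while it is still open, instead of capturing the stream object and reading it after the fact.

@Test
public void testHappyPathReadsContents() throws IOException {
    ClassLoader classLoader = this.getClass().getClassLoader();
    File file = new File(classLoader.getResource("MyTestInputFile.txt").getFile());

    // Copy the stream's bytes while the stream is still open
    ByteArrayOutputStream captured = new ByteArrayOutputStream();
    doAnswer(invocation -> {
        InputStream is = invocation.getArgument(1);
        is.transferTo(captured); // on Java 8, replace with a manual read loop
        return null;
    }).when(commandRunner).run(eq("MyApplication.exe"), any(InputStream.class));

    commandRunnerInputFile.run("MyApplication.exe", file);

    // Assert against the copied contents rather than the (now closed) stream
    assertEquals('c', (char) captured.toByteArray()[0]);
}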

Related

mockito simulate test exception thrown by service

I am trying to test an exception thrown from a service when I use an instance of it. I am trying to use IMock inside Trans.class and call an IMock method. Below is the code:
public class MockImpl implements IMock {
    public String external(String str) throws Exception {
        if (str.equals("throw")) {
            throw new Exception("Thrown exception.");
        }
        return str;
    }
}

public class Trans {
    private IMock mc;
    public static int failed;

    public String performTrans(String str) throws Exception {
        return call(str);
    }

    private String call(String str) throws Exception {
        mc = new MockImpl();
        try {
            return mc.external(str);
        } catch (Exception e) {
            failed++;
            throw e;
        }
    }
}
In the test class I am trying to do this:
public class TestMock {
    @Test
    public void testMock() throws Exception {
        Trans trans = mock(Trans.class);
        IMock iMock = mock(IMock.class);
        doThrow(new Exception()).when(iMock).external(any(String.class));
        for (int i = 0; i < 10; i++) {
            trans.performTrans("any");
        }
        System.out.println(Trans.failed);
        assertEquals(9, Trans.failed);
    }
}
As I am new to this, I am not sure if my understanding is correct. What I am trying to achieve is: when I call Trans.performTrans(String), the stubbing doThrow(new Exception()).when(iMock).external(any(String.class)); should take effect. How can I tell Mockito (or any test framework) that the exception should be simulated from the IMock service method, even when it is called indirectly?
UPDATE
After trying to test this way:
@RunWith(MockitoJUnitRunner.class)
public class TestMock {
    @Test
    public void testMock() throws Exception {
        Trans trans = new Trans();
        IMock iMock = mock(IMock.class);
        trans.setInter(iMock);
        //doThrow(new Exception()).when(iMock).external(any(String.class));
        trans.performTrans("abc");
        verify(iMock).external(new String("a"));
        System.out.println(Trans.failed);
        assertEquals(9, Trans.failed);
    }
}
I get this error:
Wanted but not invoked: iMock.external("a");
-> at com.app.TestMock.testMock(TestMock.java:32)
Actually, there were zero interactions with this mock.
What could be wrong?
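Worth noting from the code above: call() executes mc = new MockImpl(); on every invocation, which replaces any injected mock, so the mock is never touched and Mockito reports zero interactions. In addition, verify(iMock).external(new String("a")) checks for the argument "a", while the test calls performTrans("abc"). A minimal sketch of a Trans that only falls back to the real implementation when nothing was injected (setInter is the setter assumed by the update above):

public class Trans {
    private IMock mc;
    public static int failed;

    public void setInter(IMock mc) {
        this.mc = mc;
    }

    public String performTrans(String str) throws Exception {
        return call(str);
    }

    private String call(String str) throws Exception {
        if (mc == null) {
            mc = new MockImpl(); // fall back only when no mock was injected
        }
        try {
            return mc.external(str);
        } catch (Exception e) {
            failed++;
            throw e;
        }
    }
}

With this shape, verify(iMock).external("abc") after trans.performTrans("abc") would pass, and the doThrow stubbing would actually fire.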

How to test KeyedBroadcastProcessFunction in Flink?

I am new to Flink and I am trying to write JUnit test cases to test a KeyedBroadcastProcessFunction. Below is my code. I am currently calling the getDataStreamOutput method in a TestUtils class and passing the input data and pattern rules to it. Once the input data is evaluated against the list of pattern rules and satisfies a condition, I get the signal, call the sink function, and return the output data as a String from getDataStreamOutput.
@Test
public void testCompareInputAndOutputDataForInputSignal() throws Exception {
    Assertions.assertEquals(sampleInputSignal,
            TestUtils.getDataStreamOutput(
                    inputSignal,
                    patternRules));
}
public static String getDataStreamOutput(JSONObject input, Map<String, String> patternRules) throws Exception {
    env.setParallelism(1);
    DataStream<JSONObject> inputSignal = env.fromElements(input);
    DataStream<Map<String, String>> rawPatternStream =
            env.fromElements(patternRules);

    // Generate key/value pairs of patterns where the key is the pattern name
    // and the value is the pattern condition
    DataStream<Tuple2<String, Map<String, String>>> patternRuleStream =
            rawPatternStream.flatMap(new FlatMapFunction<Map<String, String>,
                    Tuple2<String, Map<String, String>>>() {
                @Override
                public void flatMap(Map<String, String> patternRules,
                        Collector<Tuple2<String, Map<String, String>>> out) throws Exception {
                    for (Map.Entry<String, String> stringEntry : patternRules.entrySet()) {
                        JSONObject jsonObject = new JSONObject(stringEntry.getValue());
                        Map<String, String> map = new HashMap<>();
                        for (String key : jsonObject.keySet()) {
                            String value = jsonObject.get(key).toString();
                            map.put(key, value);
                        }
                        out.collect(new Tuple2<>(stringEntry.getKey(), map));
                    }
                }
            });

    // patternRuleDescriptor: the MapStateDescriptor for the broadcast state, defined elsewhere
    BroadcastStream<Tuple2<String, Map<String, String>>> patternRuleBroadcast =
            patternRuleStream.broadcast(patternRuleDescriptor);

    DataStream<Tuple2<String, JSONObject>> validSignal = inputSignal.map(new MapFunction<JSONObject,
            Tuple2<String, JSONObject>>() {
        @Override
        public Tuple2<String, JSONObject> map(JSONObject inputSignal) throws Exception {
            String source = inputSignal.getSource();
            return new Tuple2<>(source, inputSignal);
        }
    }).keyBy(0).connect(patternRuleBroadcast).process(new MyKeyedBroadCastProcessFunction());

    validSignal.map(new MapFunction<Tuple2<String, JSONObject>,
            JSONObject>() {
        @Override
        public JSONObject map(Tuple2<String, JSONObject> inputSignal) throws Exception {
            return inputSignal.f1;
        }
    }).addSink(new getDataStreamOutput());

    env.execute("TestFlink");
    return getDataStreamOutput.dataStreamOutput;
}
@SuppressWarnings("serial")
public static final class getDataStreamOutput implements SinkFunction<JSONObject> {
    public static String dataStreamOutput;

    public void invoke(JSONObject inputSignal) throws Exception {
        dataStreamOutput = inputSignal.toString();
    }
}
I need to test different inputs against the same broadcast rules, but each time I call this function it does the whole process from the beginning: take the input signal, broadcast the data, and so on. Is there a way I can broadcast once and keep sending inputs to the method? I explored using a CoFlatMapFunction, something like the code below, to combine the data streams and keep sending input rules while the method is running, but for this one of the data streams has to keep getting data from a Kafka topic, which again burdens the method with loading Kafka utils and a server.
DataStream<JSONObject> inputSignalFromKafka = env.addSource(inputSignalKafka);
DataStream<org.json.JSONObject> inputSignalFromMethod = env.fromElements(inputSignal);
DataStream<JSONObject> inputSignal = inputSignalFromMethod.connect(inputSignalFromKafka)
        .flatMap(new SignalCoFlatMapper());

public static class SignalCoFlatMapper
        implements CoFlatMapFunction<JSONObject, JSONObject, JSONObject> {

    @Override
    public void flatMap1(JSONObject inputValue, Collector<JSONObject> out) throws Exception {
        out.collect(inputValue);
    }

    @Override
    public void flatMap2(JSONObject kafkaValue, Collector<JSONObject> out) throws Exception {
        out.collect(kafkaValue);
    }
}
I found a link on Stack Overflow, How to unit test BroadcastProcessFunction in flink when processElement depends on broadcasted data, but it confused me a lot.
Is there any way I can broadcast only once, in a Before method, and keep sending different kinds of data to my broadcast function in the test cases?
You can use KeyedTwoInputStreamOperatorTestHarness in order to achieve this. For example, let's assume you have the following KeyedBroadcastProcessFunction, where you define some business logic for both DataStream channels:
public class SimpleKeyedBroadcastProcessFunction extends KeyedBroadcastProcessFunction<String, String, String, String> {

    @Override
    public void processElement(String inputEntry,
            ReadOnlyContext readOnlyContext, Collector<String> collector) throws Exception {
        // business logic for how you want to process your data stream records
    }

    @Override
    public void processBroadcastElement(String broadcastInput, Context
            context, Collector<String> collector) throws Exception {
        // process input from your broadcast channel
    }
}
Let's now assume your process function is stateful and makes modifications to Flink internal state. In that case, you would have to create a TestHarness inside your test class to ensure you are able to keep track of the state during testing.
I would then create some unit tests using the following approach:
public class SimpleKeyedBroadcastProcessFunctionTest {

    private SimpleKeyedBroadcastProcessFunction processFunction;
    private KeyedTwoInputStreamOperatorTestHarness<String, String, String, String> testHarness;

    @Before
    public void setup() throws Exception {
        processFunction = new SimpleKeyedBroadcastProcessFunction();
        testHarness = new KeyedTwoInputStreamOperatorTestHarness<>(
                // BROADCAST_MAP_STATE_DESCRIPTOR: the MapStateDescriptor the function
                // uses for its broadcast state, defined elsewhere
                new CoBroadcastWithKeyedOperator<>(processFunction, ImmutableList.of(BROADCAST_MAP_STATE_DESCRIPTOR)),
                (KeySelector<String, String>) string -> string,
                (KeySelector<String, String>) string -> string,
                TypeInformation.of(String.class));
        testHarness.setup();
        testHarness.open();
    }

    @After
    public void cleanup() throws Exception {
        testHarness.close();
    }

    @Test
    public void testProcessRegularInput() throws Exception {
        // processElement1 sends elements into your regular stream;
        // the second param is the event time of the record
        testHarness.processElement1(new StreamRecord<>("Hello", 0));
        // Access records collected during processElement
        List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
        assertEquals("Hello", records.get(0).getValue());
    }

    @Test
    public void testProcessBroadcastInput() throws Exception {
        // processElement2 sends elements into your broadcast stream;
        // the second param is the event time of the record
        testHarness.processElement2(new StreamRecord<>("Hello from Broadcast", 0));
        // Access records collected during processElement
        List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
        assertEquals("Hello from Broadcast", records.get(0).getValue());
    }
}
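A small addendum that follows from the harness API above and speaks to the broadcast-once requirement (a sketch under the same setup assumptions as the tests above): the broadcast element can be pushed a single time, either at the start of a test or in setup() after testHarness.open(), and then several regular elements can be processed against the same broadcast state.

@Test
public void testManyInputsAgainstOneBroadcast() throws Exception {
    // Push the broadcast rules once...
    testHarness.processElement2(new StreamRecord<>("rules", 0));
    // ...then feed several regular records against the same broadcast state
    testHarness.processElement1(new StreamRecord<>("signal-1", 1));
    testHarness.processElement1(new StreamRecord<>("signal-2", 2));
    List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
    // Assertions here depend on the business logic in the process function
}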

Camel - CSV Headers setting not working

I have CSV files without headers. Since I'm using 'useMaps' I want to specify the headers dynamically. If I set the headers statically and then use them in the route, it works fine, as below.
Approach 1 -
@Component
public class BulkActionRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);
        ArrayList<String> list = new ArrayList<String>();
        list.add("DeviceName");
        list.add("Brand");
        list.add("status");
        list.add("type");
        list.add("features_c");
        list.add("battery_c");
        list.add("colors");
        csv.setHeader(list);

        from("direct:bulkImport")
            .convertBodyTo(String.class)
            .unmarshal(csv)
            .split(body()).streaming()
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    GenericObjectModel model = null;
                    HashMap<String, String> csvRecord = (HashMap<String, String>) exchange.getIn().getBody();
                }
            });
    }
}
However, if the list is passed via Camel headers as below, then it does not work.
Approach 2 -
@Component
public class BulkActionRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);

        from("direct:bulkImport")
            .convertBodyTo(String.class)
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    ArrayList<String> fileHeaders =
                            (ArrayList<String>) exchange.getIn().getHeader(Constants.FILE_HEADER_LIST);
                    if (fileHeaders != null && fileHeaders.size() > 0) {
                        csv.setHeader(fileHeaders);
                    }
                }
            })
            .unmarshal(csv)
            .split(body()).streaming()
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    GenericObjectModel model = null;
                    HashMap<String, String> csvRecord = (HashMap<String, String>) exchange.getIn().getBody();
                }
            });
    }
}
What could be missing in Approach 2?
The big difference between approach 1 and 2 is the scope.
In approach 1 you fully configure the CSV data format. This is all done when the Camel Context is created, since the data format is shared within the Camel Context. When messages are processed, it is the same config for all messages.
In approach 2 you just configure the basics globally. The header configuration is within the route and therefore can change for every single message. Every message would overwrite the header configuration of the context-global data format instance.
Without being sure about this, I guess that it is not possible to change a context-global DataFormat inside the routes.
What would you expect (just as an example) when messages are processed in parallel? They would overwrite each other's header configuration.
As an alternative, you could use a POJO where you can do your dynamic marshalling / unmarshalling from Java code.
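For illustration, a minimal sketch of that alternative (assuming Apache Commons CSV on the classpath, and that the header list still arrives in the Constants.FILE_HEADER_LIST exchange header from approach 2): parse each message inside a processor, so every exchange carries its own headers without mutating shared route state.

from("direct:bulkImport")
    .convertBodyTo(String.class)
    .process(new Processor() {
        @Override
        public void process(Exchange exchange) throws Exception {
            List<String> fileHeaders =
                    (List<String>) exchange.getIn().getHeader(Constants.FILE_HEADER_LIST);
            // Parse this message's body with its own header list
            CSVParser parser = CSVParser.parse(
                    exchange.getIn().getBody(String.class),
                    CSVFormat.DEFAULT.withHeader(fileHeaders.toArray(new String[0])));
            List<Map<String, String>> rows = new ArrayList<>();
            for (CSVRecord record : parser) {
                rows.add(record.toMap());
            }
            exchange.getIn().setBody(rows);
        }
    })
    .split(body()).streaming()
    .process(new Processor() {
        @Override
        public void process(Exchange exchange) throws Exception {
            Map<String, String> csvRecord = (Map<String, String>) exchange.getIn().getBody();
            // per-record handling as before
        }
    });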

Exception when embedded Cassandra running multiple test cases: Keyspace ** does not exist

The schema creation is done inside the target class SimpleRepo.java.
public class SimpleRepo {
    private Cluster cluster;
    private Session session;
    private String keyspace = "app";
    private String table = "myTable";

    @Autowired
    public SimpleRepo(Cluster cluster) {
        this.cluster = cluster;
    }

    @PostConstruct
    private void init() {
        session = cluster.connect();
        createSchema();
    }

    public void createSchema() {
        .....
    }
}
When running SimpleTest.java with one test case inside, it passes. When running with two test cases inside, only the first one passes and the second throws the exception: "com.datastax.driver.core.exceptions.InvalidQueryException: Keyspace app does not exist".
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {TestConfig.class, SimpleRepo.class})
@TestExecutionListeners({CassandraUnitTestExecutionListener.class, DependencyInjectionTestExecutionListener.class})
@EmbeddedCassandra
public class SimpleTest {

    @Autowired
    private SimpleRepo simpleRepo;

    @Test
    public void testSave() throws Exception {
        ......
    }

    @Test
    public void testDel() throws IOException {
        ......
    }
}
@Configuration
public class TestConfig {

    @Bean(destroyMethod = "shutdown")
    public Cluster cluster() throws ConfigurationException, TTransportException, IOException, InterruptedException {
        EmbeddedCassandraServerHelper.startEmbeddedCassandra();
        Cluster cluster = Cluster.builder()
                .addContactPoints("127.0.0.1")
                .withPort(9142)
                .build();
        return cluster;
    }
}
Why does the keyspace created inside createSchema() disappear when the second test case runs? How can I fix this problem? Thanks for any guidance.
CassandraUnitTestExecutionListener calls cleanServer() after each test. That calls EmbeddedCassandraServerHelper.cleanEmbeddedCassandra(), which drops all non-system keyspaces.
Your code creates your keyspace only once, in @PostConstruct, so only the first test case can use it.
It looks like you should use @CassandraDataSet to initialize the keyspace for each new test:
https://github.com/jsevellec/cassandra-unit/wiki/Spring-for-Cassandra-unit
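A sketch of what that could look like on the test class above (the CQL file name schema.cql is an assumption; it would hold the keyspace and table creation statements otherwise issued in createSchema()):

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {TestConfig.class, SimpleRepo.class})
@TestExecutionListeners({CassandraUnitTestExecutionListener.class, DependencyInjectionTestExecutionListener.class})
@EmbeddedCassandra
@CassandraDataSet(value = "schema.cql", keyspace = "app")
public class SimpleTest {
    // test cases as before; the data set is re-applied before each test
}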

JMock triggers AssertionError: invocation expected once, never invoked - but it has been invoked

I'm pretty new to programming in Java, but I've tried to start directly with unit testing and therefore also used JMock. I have already implemented some test cases (with JMock) that work, but this one I just can't get to run.
What I did:
I wrote a test class which creates a mock object, and then I expect one invocation (using oneOf). After running the unit test it says it fails, but the logs say otherwise, as I print out the data returned at the invocation using will(returnValue(x)).
The next funny/weird thing is: if I change the oneOf to never, the unit test succeeds, but it throws an exception:
Exception in thread "Thread-2" java.lang.AssertionError: unexpected invocation: blockingQueue.take()
expectations:
expected never, never invoked: blockingQueue.take(); returns
what happened before this: nothing!
Here is the code:
@RunWith(JMock.class)
public class ExecuteGameRunnableTest {

    private Mockery context = new JUnit4Mockery();
    private Thread testObject;
    private BlockingQueue<Game> queueMock;
    private Executor executorMock;

    @SuppressWarnings("unchecked")
    @Before
    public void setUp() {
        queueMock = context.mock(BlockingQueue.class);
        executorMock = context.mock(Executor.class);
        testObject = new Thread(new ExecuteGameRunnable(queueMock, executorMock, true));
    }

    @After
    public void tearDown() {
        queueMock = null;
        executorMock = null;
        testObject = null;
    }

    @Test
    public void testQueueTake() throws InterruptedException {
        final Game game = new Game();
        game.setId(1);
        game.setProcessing(false);
        context.checking(new Expectations() {{
            never(queueMock).take(); will(returnValue(game));
        }});
        testObject.start();
        context.assertIsSatisfied();
    }
}
and the runnable that I'm testing:
public class ExecuteGameRunnable implements Runnable {

    private BlockingQueue<Game> queue;
    private Executor executor;
    private Boolean unitTesting = false;

    static Logger logger = Logger.getLogger(ExecuteGameRunnable.class);

    public ExecuteGameRunnable(BlockingQueue<Game> queue, Executor executor) {
        this.queue = queue;
        this.executor = executor;
    }

    public ExecuteGameRunnable(BlockingQueue<Game> queue, Executor executor, Boolean unitTesting) {
        this(queue, executor);
        this.unitTesting = unitTesting;
    }

    public void run() {
        try {
            do {
                if (Thread.interrupted()) throw new InterruptedException();
                Game game = queue.take();
                logger.info("Game " + game.getId() + " taken. Checking if it is processing"); // THIS ONE PRINTS OUT THE GAME ID THAT I RETURN WITH THE JMOCK FRAMEWORK
                if (game.isProcessing()) {
                    continue;
                }
                game.updateProcessing(true);
                executor.execute(new Runnable() {
                    @Override
                    public void run() {
                        // TODO Auto-generated method stub
                    }
                });
            } while (!unitTesting);
        } catch (InterruptedException ex) {
            logger.info("Game-Execution-Executor interrupted.");
            return;
        } catch (DataSourceException ex) {
            logger.fatal("Unable to connect to DB whilst executing game: " + id_game, ex);
            return;
        }
    }
}
JMock isn't thread-safe by default. It's intended to support unit testing, rather than what is here a very small integration test. Frankly, in this case I'd use a real BlockingQueue rather than a mock one. And there is no way you should have a unitTesting flag in your production code.
One more thing: you don't need to set the fields in the test class to null; JUnit creates a fresh instance of the test class for every test.
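If the mock really must be exercised from another thread, JMock does offer a thread-safe mode; a minimal sketch (JMock 2.6+, which ships org.jmock.lib.concurrent.Synchroniser):

private final Mockery context = new JUnit4Mockery() {{
    // Serialize invocations arriving from the runnable's thread
    setThreadingPolicy(new Synchroniser());
}};

A Synchroniser also provides waitUntil(...), which lets the test block until a JMock state machine reaches an expected state instead of asserting immediately after testObject.start().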