How can I write a JUnit test case for Amazon Rekognition?
import com.amazonaws.services.rekognition.AmazonRekognition;
import com.amazonaws.services.rekognition.AmazonRekognitionClientBuilder;
import com.amazonaws.services.rekognition.model.FaceMatch;
import com.amazonaws.services.rekognition.model.Image;
import com.amazonaws.services.rekognition.model.S3Object;
import com.amazonaws.services.rekognition.model.SearchFacesByImageRequest;
import com.amazonaws.services.rekognition.model.SearchFacesByImageResult;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

public class SearchFaceMatchingImageCollection {
    public static final String collectionId = "MyCollection";
    public static final String bucket = "bucket";
    public static final String photo = "input.jpg";

    public static void main(String[] args) throws Exception {
        AmazonRekognition rekognitionClient = AmazonRekognitionClientBuilder.defaultClient();
        ObjectMapper objectMapper = new ObjectMapper();

        // Get an image object from the S3 bucket.
        Image image = new Image()
                .withS3Object(new S3Object()
                        .withBucket(bucket)
                        .withName(photo));

        // Search the collection for faces similar to the largest face in the image.
        SearchFacesByImageRequest searchFacesByImageRequest = new SearchFacesByImageRequest()
                .withCollectionId(collectionId)
                .withImage(image)
                .withFaceMatchThreshold(70F)
                .withMaxFaces(2);
        SearchFacesByImageResult searchFacesByImageResult =
                rekognitionClient.searchFacesByImage(searchFacesByImageRequest);

        System.out.println("Faces matching largest face in image from " + photo);
        List<FaceMatch> faceImageMatches = searchFacesByImageResult.getFaceMatches();
        for (FaceMatch face : faceImageMatches) {
            System.out.println(objectMapper.writerWithDefaultPrettyPrinter()
                    .writeValueAsString(face));
            System.out.println();
        }
    }
}
I want to write a JUnit test case for the above program. Kindly help me with the same.
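One way to make this testable (a sketch, assuming you extract the search logic from main into a hypothetical searchFaces(AmazonRekognition) helper that returns the matches) is to stub the Rekognition client with Mockito so the test never calls AWS. The AWS model classes and Mockito calls below are real APIs; the helper name and expected values are made up for illustration:

import static org.junit.Assert.assertEquals;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import com.amazonaws.services.rekognition.AmazonRekognition;
import com.amazonaws.services.rekognition.model.FaceMatch;
import com.amazonaws.services.rekognition.model.SearchFacesByImageRequest;
import com.amazonaws.services.rekognition.model.SearchFacesByImageResult;
import java.util.List;
import org.junit.Test;

public class SearchFaceMatchingImageCollectionTest {

    @Test
    public void returnsFaceMatchesForLargestFace() throws Exception {
        // Stub the client so no real AWS call is made.
        AmazonRekognition rekognitionClient = mock(AmazonRekognition.class);
        SearchFacesByImageResult stubbedResult = new SearchFacesByImageResult()
                .withFaceMatches(new FaceMatch().withSimilarity(99.0F));
        when(rekognitionClient.searchFacesByImage(any(SearchFacesByImageRequest.class)))
                .thenReturn(stubbedResult);

        // searchFaces is the hypothetical helper extracted from main.
        List<FaceMatch> matches =
                SearchFaceMatchingImageCollection.searchFaces(rekognitionClient);

        assertEquals(1, matches.size());
        assertEquals(99.0F, matches.get(0).getSimilarity(), 0.001F);
    }
}

The production change is mechanical: move everything after the client creation in main into a static List<FaceMatch> searchFaces(AmazonRekognition client) method and return getFaceMatches() instead of printing.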
I am new to Flink and I am trying to write JUnit test cases for a KeyedBroadcastProcessFunction. Below is my code. I currently call the getDataStreamOutput method in a TestUtils class, passing the input data and the pattern rules. The input data is evaluated against the list of pattern rules; if the input data satisfies a condition, I get the signal, call the sink function, and return the output data as a string from getDataStreamOutput.
@Test
public void testCompareInputAndOutputDataForInputSignal() throws Exception {
    Assertions.assertEquals(sampleInputSignal,
            TestUtils.getDataStreamOutput(
                    inputSignal,
                    patternRules));
}
public static String getDataStreamOutput(JSONObject input, Map<String, String> patternRules) throws Exception {
    env.setParallelism(1);
    DataStream<JSONObject> inputSignal = env.fromElements(input);
    DataStream<Map<String, String>> rawPatternStream =
            env.fromElements(patternRules);

    // Generate key/value pairs of patterns, where the key is the pattern name
    // and the value is the pattern condition.
    DataStream<Tuple2<String, Map<String, String>>> patternRuleStream =
            rawPatternStream.flatMap(new FlatMapFunction<Map<String, String>,
                    Tuple2<String, Map<String, String>>>() {
                @Override
                public void flatMap(Map<String, String> patternRules,
                        Collector<Tuple2<String, Map<String, String>>> out) throws Exception {
                    for (Map.Entry<String, String> stringEntry : patternRules.entrySet()) {
                        JSONObject jsonObject = new JSONObject(stringEntry.getValue());
                        Map<String, String> map = new HashMap<>();
                        for (String key : jsonObject.keySet()) {
                            String value = jsonObject.get(key).toString();
                            map.put(key, value);
                        }
                        out.collect(new Tuple2<>(stringEntry.getKey(), map));
                    }
                }
            });

    BroadcastStream<Tuple2<String, Map<String, String>>> patternRuleBroadcast =
            patternRuleStream.broadcast(patternRuleDescriptor);

    DataStream<Tuple2<String, JSONObject>> validSignal = inputSignal.map(new MapFunction<JSONObject,
            Tuple2<String, JSONObject>>() {
        @Override
        public Tuple2<String, JSONObject> map(JSONObject inputSignal) throws Exception {
            String source = inputSignal.getSource();
            return new Tuple2<>(source, inputSignal);
        }
    }).keyBy(0).connect(patternRuleBroadcast).process(new MyKeyedBroadCastProcessFunction());

    validSignal.map(new MapFunction<Tuple2<String, JSONObject>,
            JSONObject>() {
        @Override
        public JSONObject map(Tuple2<String, JSONObject> inputSignal) throws Exception {
            return inputSignal.f1;
        }
    }).addSink(new getDataStreamOutput());

    env.execute("TestFlink");
    return getDataStreamOutput.dataStreamOutput;
}
@SuppressWarnings("serial")
public static final class getDataStreamOutput implements SinkFunction<JSONObject> {
    public static String dataStreamOutput;

    public void invoke(JSONObject inputSignal) throws Exception {
        dataStreamOutput = inputSignal.toString();
    }
}
I need to test different inputs with the same broadcast rules, but each time I call this function it starts over from the beginning: take the input signal, broadcast the data, and so on. Is there a way I can broadcast once and keep sending inputs to the method? I explored using a CoFlatMapFunction, something like below, to combine the data streams and keep sending the input rules while the method is running. But for that, one of the data streams has to keep getting data from a Kafka topic, and loading the Kafka utils and server would overburden the method.
DataStream<JSONObject> inputSignalFromKafka = env.addSource(inputSignalKafka);
DataStream<org.json.JSONObject> inputSignalFromMethod = env.fromElements(inputSignal);
DataStream<JSONObject> inputSignal = inputSignalFromMethod.connect(inputSignalFromKafka)
        .flatMap(new SignalCoFlatMapper());
public static class SignalCoFlatMapper
        implements CoFlatMapFunction<JSONObject, JSONObject, JSONObject> {

    @Override
    public void flatMap1(JSONObject inputValue, Collector<JSONObject> out) throws Exception {
        out.collect(inputValue);
    }

    @Override
    public void flatMap2(JSONObject kafkaValue, Collector<JSONObject> out) throws Exception {
        out.collect(kafkaValue);
    }
}
I found a link on Stack Overflow, How to unit test BroadcastProcessFunction in flink when processElement depends on broadcasted data, but it confused me a lot.
Is there any way I can broadcast only once, in a @Before method in my test cases, and keep sending different kinds of data to my broadcast function?
You can use a KeyedTwoInputStreamOperatorTestHarness in order to achieve this. For example, let's assume you have the following KeyedBroadcastProcessFunction, where you define some business logic for both DataStream channels:
public class SimpleKeyedBroadcastProcessFunction extends KeyedBroadcastProcessFunction<String, String, String, String> {

    @Override
    public void processElement(String inputEntry,
            ReadOnlyContext readOnlyContext, Collector<String> collector) throws Exception {
        // Business logic for how you want to process your data stream records.
    }

    @Override
    public void processBroadcastElement(String broadcastInput, Context
            context, Collector<String> collector) throws Exception {
        // Process input from your broadcast channel.
    }
}
Let's now assume your process function is stateful and makes modifications to Flink's internal state. You would have to create a test harness inside your test class to keep track of that state during testing.
I would then create some unit tests using the following approach:
public class SimpleKeyedBroadcastProcessFunctionTest {

    private SimpleKeyedBroadcastProcessFunction processFunction;
    private KeyedTwoInputStreamOperatorTestHarness<String, String, String, String> testHarness;

    @Before
    public void setup() throws Exception {
        processFunction = new SimpleKeyedBroadcastProcessFunction();
        testHarness = new KeyedTwoInputStreamOperatorTestHarness<>(
                new CoBroadcastWithKeyedOperator<>(processFunction, ImmutableList.of(BROADCAST_MAP_STATE_DESCRIPTOR)),
                (KeySelector<String, String>) string -> string,
                (KeySelector<String, String>) string -> string,
                TypeInformation.of(String.class));
        testHarness.setup();
        testHarness.open();
    }

    @After
    public void cleanup() throws Exception {
        testHarness.close();
    }

    @Test
    public void testProcessRegularInput() throws Exception {
        // processElement1 sends elements into your regular stream; the second param is the event time of the record.
        testHarness.processElement1(new StreamRecord<>("Hello", 0));
        // Access records collected during processElement.
        List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
        assertEquals("Hello", records.get(0).getValue());
    }

    @Test
    public void testProcessBroadcastInput() throws Exception {
        // processElement2 sends elements into your broadcast stream; the second param is the event time of the record.
        testHarness.processElement2(new StreamRecord<>("Hello from Broadcast", 0));
        // Access records collected during processElement.
        List<StreamRecord<? extends String>> records = testHarness.extractOutputStreamRecords();
        assertEquals("Hello from Broadcast", records.get(0).getValue());
    }
}
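To connect this back to the original question about broadcasting only once: the harness holds its broadcast state for as long as it is open, so you can push the rule elements a single time in the @Before method and have every @Test send only regular-stream elements. A minimal sketch under that assumption (the rule and input strings here are placeholders):

@Before
public void setup() throws Exception {
    // ... build and open testHarness exactly as shown above ...
    // Broadcast the rules once; the broadcast state stays in place for
    // everything this harness instance processes afterwards.
    testHarness.processElement2(new StreamRecord<>("pattern-rule", 0));
}

@Test
public void testFirstInput() throws Exception {
    // Only the regular-stream element changes from test to test.
    testHarness.processElement1(new StreamRecord<>("input-1", 0));
}

@Test
public void testSecondInput() throws Exception {
    testHarness.processElement1(new StreamRecord<>("input-2", 0));
}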
I have CSV files without headers. Since I'm using useMaps, I want to specify the headers dynamically. If I set the headers statically and then use them in the route, it works fine, as below in Approach 1:
@Component
public class BulkActionRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);
        ArrayList<String> list = new ArrayList<String>();
        list.add("DeviceName");
        list.add("Brand");
        list.add("status");
        list.add("type");
        list.add("features_c");
        list.add("battery_c");
        list.add("colors");
        csv.setHeader(list);

        from("direct:bulkImport")
            .convertBodyTo(String.class)
            .unmarshal(csv)
            .split(body()).streaming()
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    GenericObjectModel model = null;
                    HashMap<String, String> csvRecord = (HashMap<String, String>) exchange.getIn().getBody();
                }
            });
    }
}
However, if the list is passed in via a Camel header, as below in Approach 2, it does not work:
@Component
public class BulkActionRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);

        from("direct:bulkImport")
            .convertBodyTo(String.class)
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    ArrayList<String> fileHeaders =
                            (ArrayList<String>) exchange.getIn().getHeader(Constants.FILE_HEADER_LIST);
                    if (fileHeaders != null && fileHeaders.size() > 0) {
                        csv.setHeader(fileHeaders);
                    }
                }
            })
            .unmarshal(csv)
            .split(body()).streaming()
            .process(new Processor() {
                @Override
                public void process(Exchange exchange) throws Exception {
                    GenericObjectModel model = null;
                    HashMap<String, String> csvRecord = (HashMap<String, String>) exchange.getIn().getBody();
                }
            });
    }
}
What could be missing in Approach 2?
The big difference between Approach 1 and Approach 2 is the scope.
In Approach 1 you fully configure the CSV data format. This is all done when the Camel Context is created, since the data format is shared within the Camel Context. When messages are processed, the config is the same for all of them.
In Approach 2 you configure only the basics globally. The header configuration lives inside the route and can therefore change for every single message: every message would overwrite the header configuration of the context-global data format instance.
Without being sure about this, I guess that it is not possible to change a context-global DataFormat inside the routes.
What would you expect (just as an example) when messages are processed in parallel? They would overwrite each other's header configuration.
As an alternative, you could use a POJO to do your dynamic marshalling / unmarshalling from Java code, as sketched below.
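For instance, a minimal sketch of that alternative (the DynamicCsvUnmarshaller class name is hypothetical; it reuses the Constants.FILE_HEADER_LIST header and the CsvDataFormat API exactly as the question uses them): build a fresh data format per exchange inside a processor, so no shared state is mutated:

import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import org.apache.camel.Exchange;
import org.apache.camel.Processor;
import org.apache.camel.dataformat.csv.CsvDataFormat;

public class DynamicCsvUnmarshaller implements Processor {

    @Override
    public void process(Exchange exchange) throws Exception {
        @SuppressWarnings("unchecked")
        ArrayList<String> fileHeaders =
                (ArrayList<String>) exchange.getIn().getHeader(Constants.FILE_HEADER_LIST);

        // Build a data format scoped to this single message, so parallel
        // messages cannot overwrite each other's header configuration.
        CsvDataFormat csv = new CsvDataFormat(",");
        csv.setUseMaps(true);
        csv.setHeader(fileHeaders);

        InputStream body = new ByteArrayInputStream(
                exchange.getIn().getBody(String.class).getBytes(StandardCharsets.UTF_8));
        exchange.getIn().setBody(csv.unmarshal(exchange, body));
    }
}

The route would then call .process(new DynamicCsvUnmarshaller()) followed by .split(body()).streaming(), instead of .unmarshal(csv).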
I am working in AEM, trying to create txt files with JSON output so that I can load them into my unit tests as strings and test my model / model processors. So far I have this:
public String readFile(String path, Charset encoding) throws IOException {
    byte[] encoded = Files.readAllBytes(Paths.get(path));
    return new String(encoded, encoding);
}

private String sampleInput = readFile("/test/resources/map/sample-input.txt", Charset.forName("UTF-8"));
I need sampleInput to take the JSON that is in sample-input.txt and convert it to a string. I am also running into issues with the Charset encoding.
I think the easiest way to manage JSON documents you use for unit testing is to keep them organized on the classpath. Guava provides a neat wrapper for loading classpath resources.
import com.google.common.base.Charsets;
import com.google.common.io.Resources;
import java.io.IOException;
import java.net.URL;
public class TestJsonDocumentLoader {

    private final Class clazz;

    public TestJsonDocumentLoader(Class clazz) {
        this.clazz = clazz;
    }

    public String loadTestJson(String fileName) {
        URL url = Resources.getResource(clazz, fileName);
        try {
            String data = Resources.toString(url, Charsets.UTF_8);
            return data;
        } catch (IOException e) {
            throw new RuntimeException("Couldn't load a JSON file.", e);
        }
    }
}
This can then be used to load arbitrary JSON files placed in the same package as the test class. It is assumed that the files are UTF-8 encoded. I suggest keeping all sources encoded that way, regardless of the OS your team is using. It saves you a lot of trouble with version control.
Let's say you have MyTest in src/test/java/com/example/mytestsuite; then you could place a file data.json in src/test/resources/com/example/mytestsuite and load it by calling
TestJsonDocumentLoader loader = new TestJsonDocumentLoader(MyTest.class);
String jsonData = loader.loadTestJson("data.json");
String someOtherExample = loader.loadTestJson("other.json");
Actually, this could be used for all sorts of text files.
You could also use the ObjectMapper from Jackson as an alternative:
public class JsonResourceObjectMapper<T> {

    private Class<T> model;

    public JsonResourceObjectMapper(Class<T> model) {
        this.model = model;
    }

    public T loadTestJson(String fileName) throws IOException {
        ClassLoader classLoader = this.getClass().getClassLoader();
        InputStream inputStream = classLoader.getResourceAsStream(fileName);
        return new ObjectMapper().readValue(inputStream, this.model);
    }
}
And then set up a fixture in the test, passing a .class:

private JsonClass json;

@Before
public void setUp() throws IOException {
    JsonResourceObjectMapper<JsonClass> mapper = new JsonResourceObjectMapper<>(JsonClass.class);
    json = mapper.loadTestJson("json/testJson.json");
}
Note that the testJson.json file is in the resources/json folder, as @toniedzwiedz mentioned.
You could then use the JSON model as:

@Test
public void testJsonNameProperty() {
    // act
    String name = json.getName();
    // assert
    assertEquals("testName", name);
}
(Jackson 2.8.7)
I am trying to serialize an Iterator of objects with the following code:
static final CsvMapper csvMapper = new CsvMapper();
static final CsvSchema networkSchema = csvMapper.schemaFor(NetworkRecord.class).withHeader();
static final CsvFactory factory = newFactory();

private static CsvFactory newFactory() {
    CsvFactory factory = new CsvFactory();
    factory.setCodec(csvMapper);
    return factory;
}

static void serializeNetworks(final Iterator<NetworkRecord> records, final OutputStream os) throws IOException {
    try (final CsvGenerator generator = factory.createGenerator(os)) {
        generator.setSchema(networkSchema);
        generator.writeObject(records);
        generator.flush();
    }
}
However, the output contains only the header and none of the value rows. (If I serialize the objects in a while loop, everything works as expected.) How can I make Jackson serialize the Iterator correctly?
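One alternative worth sketching (not from the original post): skip the raw CsvGenerator and use an ObjectWriter with a SequenceWriter, both standard Jackson databind APIs, writing each element of the Iterator explicitly:

import com.fasterxml.jackson.databind.ObjectWriter;
import com.fasterxml.jackson.databind.SequenceWriter;
import java.io.IOException;
import java.io.OutputStream;
import java.util.Iterator;

static void serializeNetworks(final Iterator<NetworkRecord> records, final OutputStream os) throws IOException {
    // Bind the writer to the record type and CSV schema so the header and
    // columns are emitted for every row.
    ObjectWriter writer = csvMapper.writerFor(NetworkRecord.class).with(networkSchema);
    try (SequenceWriter seq = writer.writeValues(os)) {
        // Drain the Iterator one record at a time.
        while (records.hasNext()) {
            seq.write(records.next());
        }
    }
}

This mirrors the while loop that already works for you, but keeps the schema handling inside Jackson's writer instead of a hand-managed generator.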
If I do this:
public static volatile ArrayList<Process> processes = new ArrayList<Process>() {
    {
        add(new Process("News Workflow", "This is the workflow for the news segment", "image"));
    }
};
and then this:
String jsonResponse = gson.toJson(processes);
jsonResponse is null.
But if I do this:
public static volatile ArrayList<Process> processes = new ArrayList<Process>();
processes.add(new Process("nam", "description", "image"));
String jsonResponse = gson.toJson(processes);
Json response is:
[{"name":"nam","description":"description","image":"image"}]
Why is that?
I do not know what the problem with Gson is, but do you know that you are creating a subclass of ArrayList here?
new ArrayList<Process>() {
    {
        add(new Process("News Workflow", "This is the workflow for the news segment", "image"));
    }
};
You can check that with
System.out.println( processes.getClass().getName() );
It won't print java.util.ArrayList, but a compiler-generated name such as YourClass$1, the name of the anonymous subclass.
I think you wanted to use a static initializer, as in:
public static volatile ArrayList<Process> processes = new ArrayList<Process>();

static {
    processes.add(new Process("News Workflow", "This is the workflow for the news segment", "image"));
}
It seems that there is a problem with anonymous classes; the same problem occurs here:
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

public class GSonAnonymTest {

    interface Holder {
        String get();
    }

    static Holder h = new Holder() {
        String s = "value";

        @Override
        public String get() {
            return s;
        }
    };

    public static void main(final String[] args) {
        final GsonBuilder gb = new GsonBuilder();
        final Gson gson = gb.create();
        System.out.println("h:" + gson.toJson(h));
        System.out.println(h.get());
    }
}
UPD: see the Gson User Guide - Finer Points with Objects, last point: "...anonymous classes, and local classes are ignored and not included in serialization or deserialization..."