How can we read a JSON file - json

1. Utility to read a JSON file from the file server
2. The utility should run at a scheduled time, let's say 6 AM
3. Error message if the JSON file is not properly formatted
4. Error message if a Category is missing and an Item needs that Category to be saved
5. Entity mapping as per the given relationships in the data model
6. Documentation for the API, preferably using a tool
7. JUnit test cases using Mockito
8. Use either MySQL or Oracle for API development
9. JSON validation for not-null values and value ranges
Thanks

I tried the code below to cover the first point of this problem, but I am not sure what to do to answer the remaining questions related to it:
Create a class named "JsonRead" in Eclipse. In it we use a JSONParser to convert the JSON string in the file to a JSONObject.
Before using the parser, make sure that your file contains valid JSON.
package logicProgramming;

import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import java.util.Iterator;

import org.json.simple.JSONArray;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;

public class JsonRead {
    public static void main(String[] args) {
        // JSONParser converts the JSON text in the file into a JSONObject
        JSONParser parser = new JSONParser();
        try {
            // Parse the JSON file that we created earlier
            Object obj = parser.parse(new FileReader("g:\\newfile.json"));
            JSONObject jsonObject = (JSONObject) obj;
            System.out.println(jsonObject);

            // Read individual values from the JSONObject by key
            String name = (String) jsonObject.get("name");
            System.out.println(name);
            String department = (String) jsonObject.get("department");
            System.out.println(department);
            String branch = (String) jsonObject.get("branch");
            System.out.println(branch);
            long year = (long) jsonObject.get("year");
            System.out.println(year);

            // "remarks" is an array, so cast it to a JSONArray and iterate over it
            JSONArray remarks = (JSONArray) jsonObject.get("remarks");
            Iterator<String> iterator = remarks.iterator();
            while (iterator.hasNext()) {
                // Access each element using the iterator
                System.out.println(iterator.next());
            }
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        } catch (IOException e) {
            e.printStackTrace();
        } catch (ParseException e) {
            e.printStackTrace();
        }
    }
}
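To cover the scheduling requirement (point 2, running the utility at 6 AM), one option that needs no extra framework is a ScheduledExecutorService that computes the delay until the next 6:00 and then repeats every 24 hours. This is only a minimal sketch: the call to JsonRead.main stands in for whatever method actually performs the import, and in a real project a framework scheduler (for example Spring's @Scheduled with a cron expression) or an OS-level cron job would be more typical.
package logicProgramming;

import java.time.Duration;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class JsonReadScheduler {
    public static void main(String[] args) {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();

        // Work out how long to wait until the next 6:00 AM
        LocalDateTime now = LocalDateTime.now();
        LocalDateTime nextRun = now.toLocalDate().atTime(LocalTime.of(6, 0));
        if (!nextRun.isAfter(now)) {
            nextRun = nextRun.plusDays(1);
        }
        long initialDelayMinutes = Duration.between(now, nextRun).toMinutes();

        // Run the JSON import once a day, starting at roughly 6 AM
        scheduler.scheduleAtFixedRate(
                () -> JsonRead.main(new String[0]),   // placeholder for the real import logic
                initialDelayMinutes,
                TimeUnit.DAYS.toMinutes(1),
                TimeUnit.MINUTES);
    }
}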
Thanks

Related

How to test a Datastream with jsonobject in Apache Flink

I am new to testing and I am trying to write unit test cases for a Flink DataStream. The stream takes a JSON object as input and passes it to a process function, which returns a valid or invalid JSON object when certain rule conditions are met. Below is the JUnit test case, where I am trying to compare the JSON object output by the process function with the JSON object from the input file.
@Test
public void testcompareInputAndOutputDataJSONSignal() throws Exception {
org.json.JSONObject jsonObject = toJsonObject();
String input = jsonObject.toString();
String output = SignalDataStreamOutput();
assertEquals(mapper.readTree(input), mapper.readTree(output));
}
Below are my toJsonObject and SignalDataStreamOutput methods:
public static JSONObject toJsonObject() throws IOException, ParseException {
JSONParser jsonParser = new JSONParser();
FileReader fileReader = new FileReader(getFileFromResources("input.json"));
JSONObject obj = (JSONObject) jsonParser.parse(fileReader);
return obj;
}
public String SignalDataStreamOutput() throws Exception {
final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<JSONObject> validSignal = env.fromElements(toJsonObject())
.process(new JsonFilter());
String outputFolder = "output";
validSignal.writeAsText(outputFolder).setParallelism(1);
env.execute();
String content = new String(Files.readAllBytes(Paths.get(outputFolder)));
return content;
}
What I am doing is converting a JSON file to a JSONObject using the toJsonObject method and sending it to a data stream using the SignalDataStreamOutput method, which in turn sends it to a process function in the JsonFilter class and validates it against a set of rules; if it is valid, it returns a JSONObject. When trying to access the JSONObject directly from the stream I am getting a value like org.apache.flink#994jdkeiri, so I am writing the output to a file and reading it back into a string to compare it in the test method. But this is a workaround, and I found a link suggesting the Mockito framework here, so I changed it to use a JSON object like below:
final Collector<JSONObject> collectorMock = (Collector<JSONObject>)Mockito.mock(JsonFilter.class);
final Context contextMock = Mockito.mock(Context.class);
@Test
public void testcompareInputAndOutputDataForValidSignal() throws Exception {
org.json.JSONObject jsonObject = convertToJsonObject();
Mockito.verify(collectorMock).collect(jsonObject);
}
But the above approach is also not working. Can you suggest a simpler approach to test the JSON object?
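One simpler route, assuming JsonFilter extends ProcessFunction<JSONObject, JSONObject> (using the json-simple JSONObject as in the question), is to skip the Flink job entirely and call processElement directly, mocking only the Collector and Context that Flink would normally supply. The sketch below rests on that assumption about JsonFilter's signature; the input object is built inline purely for illustration, and you could instead build it from input.json with your toJsonObject helper.
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;
import org.json.simple.JSONObject;
import org.junit.Test;
import org.mockito.Mockito;

public class JsonFilterTest {

    @Test
    @SuppressWarnings("unchecked")
    public void validJsonIsCollected() throws Exception {
        JsonFilter filter = new JsonFilter();   // the ProcessFunction under test

        // Stand-in for the contents of input.json; field names are illustrative only
        JSONObject input = new JSONObject();
        input.put("signal", "someValue");

        // Mock the collaborators that Flink would normally provide at runtime
        Collector<JSONObject> out = Mockito.mock(Collector.class);
        ProcessFunction<JSONObject, JSONObject>.Context ctx =
                Mockito.mock(ProcessFunction.Context.class);

        filter.processElement(input, ctx, out);

        // If the rules accept the record, it should be emitted unchanged
        Mockito.verify(out).collect(input);
    }
}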

Loading json into my unit test from a text file

I am working in AEM, trying to create txt files with JSON output so that I can load them into my unit tests as strings and test my model / model processors. So far I have this...
public String readFile(String path, Charset encoding) throws IOException
{
byte[] encoded = Files.readAllBytes(Paths.get(path));
return new String(encoded, encoding);
}
private String sampleInput = readFile("/test/resources/map/sample-input.txt", Charset.forName("UTF-8"));
I need sampleInput to hold the JSON that is in 'sample-input.txt' as a string. I am also running into issues with the Charset encoding.
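A plain-JDK alternative is to keep the file under src/test/resources and read it as a classpath resource with an explicit UTF-8 charset, which removes both the hard-coded filesystem path and the encoding ambiguity. A minimal sketch, assuming the file sits at src/test/resources/map/sample-input.txt and Java 9+ is available for InputStream.readAllBytes:
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class ResourceReader {

    // Reads a classpath resource (e.g. "/map/sample-input.txt") into a UTF-8 string
    public static String readResource(String path) throws IOException {
        try (InputStream in = ResourceReader.class.getResourceAsStream(path)) {
            if (in == null) {
                throw new IOException("Resource not found on the classpath: " + path);
            }
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}
In a test this becomes: String sampleInput = ResourceReader.readResource("/map/sample-input.txt");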
I think the easiest way to manage JSON documents you use for unit testing is by keeping them organized in the classpath. Guava provides a neat wrapper for loading classpath resources.
import com.google.common.base.Charsets;
import com.google.common.io.Resources;
import java.io.IOException;
import java.net.URL;
public class TestJsonDocumentLoader {
private final Class<?> clazz; // test class whose package is used to resolve resources
public TestJsonDocumentLoader(Class<?> clazz) {
this.clazz = clazz;
}
public String loadTestJson(String fileName) {
URL url = Resources.getResource(clazz, fileName);
try {
String data = Resources.toString(url, Charsets.UTF_8);
return data;
} catch (IOException e) {
throw new RuntimeException("Couldn't load a JSON file.", e);
}
}
}
This can then be used to load arbitrary JSON files placed in the same package as the test class. It is assumed that the files are UTF-8 encoded. I suggest keeping all sources encoded that way, regardless of the OS your team is using. It saves you a lot of trouble with version control.
Let's say you have MyTest in src/test/java/com/example/mytestsuite; then you could place a file data.json in src/test/resources/com/example/mytestsuite and load it by calling
TestJsonDocumentLoader loader = new TestJsonDocumentLoader(MyTest.class);
String jsonData = loader.loadTestJson("data.json");
String someOtherExample = loader.loadTestJson("other.json");
Actually, this could be used for all sorts of text files.
You could also use Jackson's ObjectMapper as an alternative:
public class JsonResourceObjectMapper<T> {
private Class<T> model;
public JsonResourceObjectMapper(Class<T> model) {
this.model = model;
}
public T loadTestJson(String fileName) throws IOException{
ClassLoader classLoader = this.getClass().getClassLoader();
InputStream inputStream= classLoader.getResourceAsStream(fileName);
return new ObjectMapper().readValue(inputStream, this.model);
}
}
Then set up a fixture in the test, passing in the .class:
private JsonClass json;
@Before
public void setUp() throws IOException {
JsonResourceObjectMapper<JsonClass> mapper = new JsonResourceObjectMapper<>(JsonClass.class);
json = mapper.loadTestJson("json/testJson.json");
}
Note that the testJson.json file is in the resources/json folder, as @toniedzwiedz mentioned.
You could then use the JSON model as follows:
@Test
public void testJsonNameProperty(){
//act
String name = json.getName();
// assert
assertEquals("testName", name);
}
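For completeness, JsonClass is never shown in the answer; it is just a plain POJO that Jackson binds testJson.json to. A minimal assumed version (only the name field is inferred from the assertion above) could look like:
public class JsonClass {

    private String name;   // maps to the "name" property in testJson.json

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}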

Processing JSON using java Mapreduce

I am new to Hadoop MapReduce.
I have an input text file where the data is stored as follows. Here are just a few tuples (data.txt):
{"author":"Sharīf Qāsim","book":"al- Rabīʻ al-manshūd"}
{"author":"Nāṣir Nimrī","book":"Adīb ʻAbbāsī"}
{"author":"Muẓaffar ʻAbd al-Majīd Kammūnah","book":"Asmāʼ Allāh al-ḥusná al-wāridah fī muḥkam kitābih"}
{"author":"Ḥasan Muṣṭafá Aḥmad","book":"al- Jabhah al-sharqīyah wa-maʻārikuhā fī ḥarb Ramaḍān"}
{"author":"Rafīqah Salīm Ḥammūd","book":"Taʻlīm fī al-Baḥrayn"}
This is the Java file that I am supposed to write my code in (CombineBooks.java):
package org.hwone;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.GenericOptionsParser;
//TODO import necessary components
/*
* Modify this file to combine books from the same author into a
* single JSON object,
* i.e. {"author": "Tobias Wells", "books": [{"book":"A die in the country"},{"book": "Dinky died"}]}
* Be aware that this may run on any number of nodes!
*
*/
public class CombineBooks {
//TODO define variables and implement necessary components
public static void main(String[] args) throws Exception {
Configuration conf = new Configuration();
String[] otherArgs = new GenericOptionsParser(conf, args)
.getRemainingArgs();
if (otherArgs.length != 2) {
System.err.println("Usage: CombineBooks <in> <out>");
System.exit(2);
}
//TODO implement CombineBooks
Job job = new Job(conf, "CombineBooks");
//TODO implement CombineBooks
System.exit(job.waitForCompletion(true) ? 0 : 1);
}
}
My task is to create a Hadoop program in “CombineBooks.java”, returned in the “question-2” directory. The program should do the following: given the input author-book tuples, the map-reduce program should produce a JSON object which contains all the books from the same author in a JSON array, i.e.
{"author": "Tobias Wells", "books":[{"book":"A die in the country"},{"book": "Dinky died"}]}
Any idea how it can be done?
First, the JSON classes you are trying to work with are not available to you. To solve this:
Go here and download as zip: https://github.com/douglascrockford/JSON-java
Extract to your sources folder in subdirectory org/json/*
Next, the first line of your code declares the package "org.json", which is incorrect; you should create a separate package, for instance "my.books".
Third, using a combiner here is useless.
Here's the code I ended up with; it works and solves your problem:
package my.books;
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;
import org.json.*;
import javax.security.auth.callback.TextInputCallback;
public class CombineBooks {
public static class Map extends Mapper<LongWritable, Text, Text, Text>{
public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException{
String author;
String book;
String line = value.toString();
String[] tuple = line.split("\\n");
try{
for(int i=0;i<tuple.length; i++){
JSONObject obj = new JSONObject(tuple[i]);
author = obj.getString("author");
book = obj.getString("book");
context.write(new Text(author), new Text(book));
}
}catch(JSONException e){
e.printStackTrace();
}
}
}
public static class Reduce extends Reducer<Text,Text,NullWritable,Text>{
public void reduce(Text key, Iterable<Text> values, Context context) throws IOException, InterruptedException{
try{
JSONObject obj = new JSONObject();
JSONArray ja = new JSONArray();
for(Text val : values){
JSONObject jo = new JSONObject().put("book", val.toString());
ja.put(jo);
}
obj.put("books", ja);
obj.put("author", key.toString());
context.write(NullWritable.get(), new Text(obj.toString()));
}catch(JSONException e){
e.printStackTrace();
}
}
}
public static void main(String[] args) throws Exception {
Configuration conf = new Configuration();
if (args.length != 2) {
System.err.println("Usage: CombineBooks <in> <out>");
System.exit(2);
}
Job job = new Job(conf, "CombineBooks");
job.setJarByClass(CombineBooks.class);
job.setMapperClass(Map.class);
job.setReducerClass(Reduce.class);
job.setMapOutputKeyClass(Text.class);
job.setMapOutputValueClass(Text.class);
job.setOutputKeyClass(NullWritable.class);
job.setOutputValueClass(Text.class);
job.setInputFormatClass(TextInputFormat.class);
job.setOutputFormatClass(TextOutputFormat.class);
FileInputFormat.addInputPath(job, new Path(args[0]));
FileOutputFormat.setOutputPath(job, new Path(args[1]));
System.exit(job.waitForCompletion(true) ? 0 : 1);
}
}
Here's the folder structure of my project:
src
src/my
src/my/books
src/my/books/CombineBooks.java
src/org
src/org/json
src/org/json/zip
src/org/json/zip/BitReader.java
...
src/org/json/zip/None.java
src/org/json/JSONStringer.java
src/org/json/JSONML.java
...
src/org/json/JSONException.java
Here's the input
[localhost:CombineBooks]$ hdfs dfs -cat /example.txt
{"author":"author1", "book":"book1"}
{"author":"author1", "book":"book2"}
{"author":"author1", "book":"book3"}
{"author":"author2", "book":"book4"}
{"author":"author2", "book":"book5"}
{"author":"author3", "book":"book6"}
The command to run:
hadoop jar ./bookparse.jar my.books.CombineBooks /example.txt /test_output
Here's the output:
[pivhdsne:CombineBooks]$ hdfs dfs -cat /test_output/part-r-00000
{"books":[{"book":"book3"},{"book":"book2"},{"book":"book1"}],"author":"author1"}
{"books":[{"book":"book5"},{"book":"book4"}],"author":"author2"}
{"books":[{"book":"book6"}],"author":"author3"}
You can use one of the three options to put the org.json.* classes onto your cluster:
Pack the org.json.* classes into your jar file (this can easily be done using a GUI IDE). This is the option I used in my answer.
Put the jar file containing org.json.* classes on each of the cluster nodes into one of the CLASSPATH directories (see yarn.application.classpath)
Put the jar file containing org.json.* into HDFS (hdfs dfs -put <org.json jar> <hdfs path>) and use job.addFileToClassPath call for this jar file to be available for all of the tasks executing your job on the cluster. In my answer you should add job.addFileToClassPath(new Path("<jar_file_on_hdfs_location>")); to the main
For splittable multi-line JSON, refer to:
https://github.com/alexholmes/json-mapreduce

looping over json response in classic asp

So I am making a REST request to the JIRA API and getting a JSON response that includes all objects.
My request looks like this:
Set restReq = CreateObject("MSXML2.ServerXMLHTTP.3.0")
restReq.open "GET", "URI",False
restReq.setRequestHeader "Authorization","Basic{user:Password}"
restReq.setOption SXH_OPTION_IGNORE_SERVER_SSL_CERT_ERROR_FLAGS,SXH_SERVER_CERT_IGNORE_ALL_SERVER_ERRORS
restReq.send("")
'response.write(restReq.responseText)
The response.write output looks like this (but much longer):
[{"self":"https://JIRA:8343/rest/api/2/project/CT","id":"10004","key":"CT","name":"Core Technologies"}},
{"self":"https://JIRA:8343/rest/api/2/project/CTCCG","id":"10006","key":"CTCCG","name":"CT CCG"}}]
I would like to be able to loop through the response and use the "id", "key" and "name" in an unordered list. I can create a ul, but how do I extract the information I need from the json?
You can check this question relating to the Gson library. It is very small, quick, and easy to use for converting between JSON and objects.
import java.io.FileReader;
import com.google.gson.Gson;
public class Test {
public static void main(String[] args) throws Exception
{
Gson gson = new Gson();
TypeDTO[] myTypes = gson.fromJson(new FileReader("D:\\temp\\inputjson.txt"), TypeDTO[].class);
for (int i = 0; i < myTypes.length; ++i)
System.out.println(myTypes[i].self);
}
static class TypeDTO // static so that Gson can instantiate it without an enclosing Test instance
{
String self;
String id;
String key;
String name;
}
}
inputjson.txt contained:
[{"self":"https://JIRA:8343/rest/api/2/project/CT","id":"10004","key":"CT","name":"Core Technologies"},
{"self":"https://JIRA:8343/rest/api/2/project/CTCCG","id":"10006","key":"CTCCG","name":"CT CCG"}]
Note the absence of the additional } at the end of each line when compared to yours.
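To go from the parsed DTOs to the unordered list the question asks for, a simple loop appended to the main method above would do; this stays in Java like the rest of the answer and is purely illustrative:
// Build an HTML unordered list from the parsed entries (illustrative only)
StringBuilder ul = new StringBuilder("<ul>");
for (TypeDTO t : myTypes) {
    ul.append("<li>")
      .append(t.id).append(" - ")
      .append(t.key).append(" - ")
      .append(t.name)
      .append("</li>");
}
ul.append("</ul>");
System.out.println(ul);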

Using json with Play 2

I'm trying to create a simple application that allows me to create, read, update and delete various users. I have a basic UI-based view, controller and model that work, but wanted to be more advanced than this and provide a RESTful json interface.
However, despite reading everything I can find in the Play 2 documentation, the Play 2 Google groups and the stackoverflow website, I still can't get this to work.
I've updated my controller based on previous feedback and I now believe it is based on the documentation.
Here is my updated controller:
package controllers;
import models.Member;
import play.*;
import play.mvc.*;
import play.libs.Json;
import play.data.Form;
public class Api extends Controller {
/* Return member info - version to serve Json response */
public static Result member(Long id){
ObjectNode result = Json.newObject();
Member member = Member.byid(id);
result.put("id", member.id);
result.put("email", member.email);
result.put("name", member.name);
return ok(result);
}
// Create a new body parser of class Json based on the values sent in the POST
@BodyParser.Of(Json.class)
public static Result createMember() {
JsonNode json = request().body().asJson();
// Check that we have a valid email address (that's all we need!)
String email = json.findPath("email").getTextValue();
if(email == null) {
return badRequest("Missing parameter [email]");
} else {
// Use the model's createMember class now
Member.createMember(json);
return ok("Hello " + name);
}
}
....
But when I run this, I get the following error:
incompatible types [found: java.lang.Class<play.libs.Json>] [required: java.lang.Class<? extends play.mvc.BodyParser>]
In /Users/Mark/Development/EclipseWorkspace/ms-loyally/loyally/app/controllers/Api.java at line 42.
41 // Create a new body parser of class Json based on the values sent in the POST
42 @BodyParser.Of(Json.class)
43 public static Result createMember() {
44 JsonNode json = request().body().asJson();
45 // Check that we have a valid email address (that's all we need!)
46 String email = json.findPath("email").getTextValue();
As far as I can tell, I've copied from the documentation so I would appreciate any help in getting this working.
There appear to be conflicts in the use of the Json class in the Play 2 documentation. To get the example above working correctly, the following imports are used:
import play.mvc.Controller;
import play.mvc.Result;
import play.mvc.BodyParser;
import play.libs.Json;
import play.libs.Json.*;
import static play.libs.Json.toJson;
import org.codehaus.jackson.JsonNode;
import org.codehaus.jackson.node.ObjectNode;
@BodyParser.Of(play.mvc.BodyParser.Json.class)
public static Result sayHello() {
JsonNode json = request().body().asJson();
ObjectNode result = Json.newObject();
String name = json.findPath("name").getTextValue();
if(name == null) {
result.put("status", "KO");
result.put("message", "Missing parameter [name]");
return badRequest(result);
} else {
result.put("status", "OK");
result.put("message", "Hello " + name);
return ok(result);
}
}
Note the explicit reference to the right Json class in @BodyParser.Of.
I'm not sure whether this is a bug or not, but this is the only way I could get the example to work.
Import these two:
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.node.ObjectNode;
According to this documentation: http://fasterxml.github.io/jackson-databind/javadoc/2.0.0/com/fasterxml/jackson/databind/node/ObjectNode.html
Try this:
import play.*;
import play.mvc.*;
import org.codehaus.jackson.JsonNode; //Fixing "error: cannot find symbol" for JsonNode
// Testing JSON
@BodyParser.Of(BodyParser.Json.class) // Or you can import play.mvc.BodyParser.Json
public static Result sayHello() {
JsonNode json = request().body().asJson();
String name = json.findPath("name").getTextValue();
if(name==null) {
return badRequest("Missing parameter [name]");
} else {
return ok("Hello " + name);
}
}
AFAIK, the code you are using has not reached any official Play version (neither 2.0 nor 2.0.1), according to this: https://github.com/playframework/Play20/pull/212
Instead, you can do this (not tested):
if(request().getHeader(play.mvc.Http.HeaderNames.ACCEPT).equalsIgnoreCase("application/json")) {
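Embedded in an action, that check might look roughly like the sketch below. The JSON part reuses the ObjectNode code from the question; the HTML fallback template (views.html.member) is an assumption, not something from the original answer.
// Rough sketch only: serve JSON when the client asks for it, otherwise fall back to HTML.
public static Result member(Long id) {
    Member member = Member.byid(id);
    String accept = request().getHeader(play.mvc.Http.HeaderNames.ACCEPT);
    if (accept != null && accept.equalsIgnoreCase("application/json")) {
        ObjectNode result = Json.newObject();
        result.put("id", member.id);
        result.put("email", member.email);
        result.put("name", member.name);
        return ok(result);
    }
    return ok(views.html.member.render(member));   // hypothetical HTML template
}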
Did you try checking out the documentation for it?
Serving a JSON response looks like:
@BodyParser.Of(Json.class)
public static Result sayHello() {
JsonNode json = request().body().asJson();
ObjectNode result = Json.newObject();
String name = json.findPath("name").getTextValue();
if(name == null) {
result.put("status", "KO");
result.put("message", "Missing parameter [name]");
return badRequest(result);
} else {
result.put("status", "OK");
result.put("message", "Hello " + name);
return ok(result);
}
}
You have imported play.libs.Json and then used the BodyParser.Of annotation with this Json class.
The annotation expects a class which extends play.mvc.BodyParser, so simply replace @BodyParser.Of(Json.class) with @BodyParser.Of(BodyParser.Json.class).