Catching wrong path in JUnit

import org.junit.Test;
import org.junit.Assert;
import org.junit.Assert.*;
import java.io.FileNotFoundException;

ChartIO io = new ChartIO();

@Test
public void test_readFile_SadPath() {
    String path = "abc.txt";
    try {
        io.readFile(path);
    } catch (Exception ex) {
        assertTrue(ex instanceof FileNotFoundException);
    }
}
I want to create a test case for the code below, but it always gives this error:
cannot find symbol
[ERROR] symbol: method assertTrue(boolean)
[ERROR] location: class demo.junit_class_demo.grades.ChartIOTest
public int[] readFile(String filepath) {
    List<Integer> linesList = new ArrayList<>();
    try (BufferedReader reader = new BufferedReader(new FileReader(filepath))) {
        String line;
        while ((line = reader.readLine()) != null) {
            linesList.add(Integer.parseInt(line));
        }
    } catch (FileNotFoundException e) {
        throw new IllegalArgumentException("File " + filepath + " does not exist");
    } catch (IOException e) {
        throw new NumberFormatException();
    }
    int[] linesArray = new int[linesList.size()];
    for (int index = 0; index < linesList.size(); index++) {
        linesArray[index] = linesList.get(index);
    }
    return linesArray;
}
Where is my mistake?
The whole error output:
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /home/X/Desktop/Y/Z/A/project/unit_test_demo/src/test/java/grades/ChartIOTest.java:[22,25] cannot find symbol
symbol: method assertTrue(boolean)
location: class demo.junit_class_demo.grades.ChartIOTest
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.787 s
[INFO] Finished at: 2020-03-16T18:04:35+03:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:testCompile (default-testCompile) on project 481.demo.child: Compilation failure
[ERROR] /home/Z/Desktop/X/Y/K/project/unit_test_demo/src/test/java/grades/ChartIOTest.java:[22,25] cannot find symbol
[ERROR] symbol: method assertTrue(boolean)
[ERROR] location: class demo.junit_class_demo.grades.ChartIOTest
[ERROR]
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
That is the whole error output.
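The compile error comes from the imports: import org.junit.Assert.*; is a regular (non-static) import, which only brings in nested types of Assert, so the method assertTrue is never in scope; a static import is needed. For comparison, here is a minimal sad-path test that compiles (the class name is taken from the error log; note it expects IllegalArgumentException, since readFile catches the FileNotFoundException and rethrows it under that type):

import org.junit.Test;
import static org.junit.Assert.assertTrue;
import static org.junit.Assert.fail;

public class ChartIOTest {

    private final ChartIO io = new ChartIO();

    @Test
    public void test_readFile_SadPath() {
        try {
            io.readFile("abc.txt"); // assumed not to exist on disk
            fail("Expected an exception for a missing file");
        } catch (Exception ex) {
            // readFile wraps FileNotFoundException in an
            // IllegalArgumentException, so assert on that type.
            assertTrue(ex instanceof IllegalArgumentException);
        }
    }
}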

Related

Exception in thread "main" io.restassured.path.json.exception.JsonPathException: Failed to parse the JSON document while trying to automate an API

I am getting the following exception while trying to automate an API:
Exception in thread "main" io.restassured.path.json.exception.JsonPathException: Failed to parse the JSON document
at io.restassured.path.json.JsonPath$ExceptionCatcher.invoke(JsonPath.java:1002)
at io.restassured.path.json.JsonPath$4.doParseWith(JsonPath.java:967)
at io.restassured.path.json.JsonPath$JsonParser.parseWith(JsonPath.java:1047)
at io.restassured.path.json.JsonPath.toJsonString(JsonPath.java:1064)
at io.restassured.path.json.JsonPath.prettify(JsonPath.java:685)
at io.restassured.path.json.JsonPath.prettyPrint(JsonPath.java:700)
at ApiAutomation_BDD.ApiAutomation_BDD.bearerToken_GETRequest.main(bearerToken_GETRequest.java:35)
Caused by: groovy.json.JsonException: Lexing failed on line: 2, column: 1, while reading '<', no possible valid JSON value or punctuation could be recognized.
at groovy.json.JsonLexer.nextToken(JsonLexer.java:86)
at groovy.json.JsonLexer$nextToken.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:130)
at io.restassured.internal.path.json.ConfigurableJsonSlurper.parse(ConfigurableJsonSlurper.groovy:97)
at io.restassured.internal.path.json.ConfigurableJsonSlurper$parse.callCurrent(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:51)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:171)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:185)
at io.restassured.internal.path.json.ConfigurableJsonSlurper.parseText(ConfigurableJsonSlurper.groovy:83)
at io.restassured.path.json.JsonPath$4$1.method(JsonPath.java:965)
at io.restassured.path.json.JsonPath$ExceptionCatcher.invoke(JsonPath.java:1000)
... 6 more
The following is the code I tried (again, I am not sharing any values because of security risks):
public static void main(String args[]) throws IOException {
    RestAssured.baseURI = "{Url}";
    RequestSpecification httpRequest = RestAssured.given()
            .contentType("x-www-form-urlencoded")
            .formParam("grant_type", "")
            .formParam("client_id", "")
            .formParam("scope", "")
            .formParam("client_secret", "");
    Response response = httpRequest.request(Method.GET);
    JsonPath res = response.jsonPath();
    //String BearerToken = response.getBody().asString();
    //System.out.println("The response is:" + res);
    res.prettyPrint();
    int code = response.getStatusCode();
    System.out.println("The status code is" + code);
    assertEquals(code, 200);
}
}
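The "Lexing failed on line: 2, column: 1, while reading '<'" part of the trace says the response body starts with '<', i.e. it is HTML or XML (typically an error page), so there is no JSON for jsonPath() to parse. A defensive sketch along these lines prints the raw body before attempting JSON parsing (the class name and the full MIME content type are my assumptions; many token endpoints also expect POST rather than GET, but GET is kept here to match the question):

import io.restassured.RestAssured;
import io.restassured.http.Method;
import io.restassured.response.Response;
import io.restassured.specification.RequestSpecification;

public class SafeJsonCheck {
    public static void main(String[] args) {
        RestAssured.baseURI = "{Url}"; // elided, as in the question
        RequestSpecification httpRequest = RestAssured.given()
                .contentType("application/x-www-form-urlencoded")
                .formParam("grant_type", "")
                .formParam("client_id", "")
                .formParam("scope", "")
                .formParam("client_secret", "");
        Response response = httpRequest.request(Method.GET);
        System.out.println("Status: " + response.getStatusCode());
        System.out.println("Content-Type: " + response.getContentType());
        String rawBody = response.getBody().asString();
        if (rawBody != null && rawBody.trim().startsWith("<")) {
            // An HTML/XML body (an error page, for instance) cannot be
            // parsed as JSON; print it to see what the server returned.
            System.out.println(rawBody);
        } else {
            response.jsonPath().prettyPrint();
        }
    }
}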

Run Sikuli commands on Katalon-Studio

I'm trying to run a test that contains Sikuli methods via Katalon Studio.
I added the necessary libraries, but I still get this error when it tries to execute the Sikuli command:
[error] ResourceLoaderBasic: checkLibsDir: libs dir is not on system path: C:\Users\roinr\git\QA\Drivers\libs
[action] ResourceLoaderBasic: checkLibsDir: Please wait! Trying to add it to user's path
[info] runcmd: reg QUERY HKCU
[info] runcmd: reg QUERY HKEY_CURRENT_USER\Environment /v PATH
[error] ResourceLoaderBasic: checkLibsDir: Logout and Login again! (Since libs folder is in user's path, but not activated)
[error] Terminating SikuliX after a fatal error! Sorry, but it makes no sense to continue!
If you do not have any idea about the error cause or solution, run again
with a Debug level of 3. You might paste the output to the Q&A board.
This is my script:
import internal.GlobalVariable as GlobalVariable
import java.io.IOException;
import java.util.concurrent.TimeUnit;
import org.openqa.selenium.By;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;
import org.sikuli.script.Key;
import org.sikuli.script.Screen;
//Initializing server
System.setProperty("webdriver.chrome.driver", "C:/selenium/chromedriver.exe");
//Initializing variables
ChromeDriver wd = new ChromeDriver();
WebDriverWait wait = new WebDriverWait(wd,10);
Screen s = new Screen();
wd.manage().window().maximize();
wd.manage().timeouts().implicitlyWait(20, TimeUnit.SECONDS);
System.out.println("*** Login Sikuli ***");
wd.get("https://autoqa-materials-zone.firebaseapp.com/login");
wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//div[2]/div/input")));
wd.findElement(By.xpath("//div[2]/div/input")).click();
try { Thread.sleep(1000l); } catch (Exception e) { throw new RuntimeException(e); }
s.paste("<USERNAME>");
try { Thread.sleep(1000l); } catch (Exception e) { throw new RuntimeException(e); }
s.type(Key.TAB);
try { Thread.sleep(800l); } catch (Exception e) { throw new RuntimeException(e); }
s.paste("<PASSWORD>");
try { Thread.sleep(1000l); } catch (Exception e) { throw new RuntimeException(e); }
wd.findElement(By.xpath("//form[#id='form']//paper-button[.='login']")).click();
try { Thread.sleep(5000l); } catch (Exception e) { throw new RuntimeException(e); }
wd.quit();
Hope you can help me with that. Thanks.

NullPointerException in sessionState$lzycompute in Spark

I am trying to construct a custom receiver to get an input stream from a MySQL database, but a NullPointerException appears.
class CustomReceiver(sqlContext: SQLContext, url: String) extends Receiver[DataFrame](StorageLevel.MEMORY_AND_DISK_2) {

  val connectionProperties = new Properties()
  connectionProperties.setProperty("user", "XXX")
  connectionProperties.setProperty("password", "XXX")
  connectionProperties.setProperty("driver", "com.mysql.jdbc.Driver")

  def onStart() {
    // Start the thread that receives data over a connection
    new Thread("Database Receiver") {
      override def run() { receive() }
    }.start()
  }

  def onStop() {
    // There is nothing much to do as the thread calling receive()
    // is designed to stop by itself if isStopped() returns false
  }

  /** Create a database connection and receive data until receiver is stopped */
  private def receive() {
    var data: DataFrame = null
    try {
      val jdbcQuery = "SQL Query String"
      data = sqlContext.read.jdbc(url, jdbcQuery, connectionProperties)
      if (data != null) {
        store(data)
      }
      restart("Trying to connect again")
    } catch {
      case e: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException =>
        restart("MySQLSyntaxError", e)
      case n: java.lang.NullPointerException =>
        restart("NullPointer", n)
    }
  }
}
Here is the error log:
java.lang.NullPointerException
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:112)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:110)
at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:547)
at org.apache.spark.sql.SparkSession.read(SparkSession.scala:595)
at org.apache.spark.sql.SQLContext.read(SQLContext.scala:504)
at CustomReceiver.CustomReceiver$$receive(receiver.scala:44)
at CustomReceiver$$anon$1.run(receiver.scala:27)
The error is from here:
data = sqlContext.read.jdbc(url, jdbcQuery, connectionProperties)
I think it is because the SQLContext can only be used on the driver side. I can't find an answer to this question, but I found some articles saying that it is impossible to stream from a database. Is this the reason the NullPointerException occurs?
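That reading matches the stack trace: the receiver thread runs on an executor, where the serialized SQLContext has no live session state, so the read triggers the NPE in sessionState$lzycompute. For a one-shot load (rather than a receiver-based stream), a driver-side JDBC read avoids the problem. A minimal sketch, where the master, JDBC URL, and query are all assumptions for illustration:

import java.util.Properties;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class DriverSideJdbcRead {
    public static void main(String[] args) {
        // local[*] is an assumption so the sketch runs standalone.
        SparkSession spark = SparkSession.builder()
                .appName("driver-side-jdbc")
                .master("local[*]")
                .getOrCreate();
        Properties props = new Properties();
        props.setProperty("user", "XXX");
        props.setProperty("password", "XXX");
        props.setProperty("driver", "com.mysql.jdbc.Driver");
        // Note: the second argument must be a table name or a
        // parenthesized subquery with an alias, not a bare SQL string
        // as in the receiver above.
        String table = "(SELECT * FROM some_table) AS t"; // assumed query
        Dataset<Row> df = spark.read()
                .jdbc("jdbc:mysql://host/db", table, props); // assumed URL
        df.show();
        spark.stop();
    }
}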

Hadoop: Unable to run MapReduce program: java.io.IOException: error=12

I'm trying to run a MapReduce program in Hadoop. Basically, it takes a text file as input in which each line is JSON. I'm using json-simple to parse this data in my mapper, and the reducer does some other stuff. I have included the json-simple jar file in the hadoop/lib folder. Here is the code:
package org.myorg;

import java.io.IOException;
import java.util.Iterator;
import java.util.*;
import org.json.simple.JSONArray;
import org.json.simple.JSONObject;
import org.json.simple.parser.JSONParser;
import org.json.simple.parser.ParseException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class ALoc {

    // TextInputFormat supplies LongWritable keys, so the mapper's input
    // key type must be LongWritable to match the map() signature below.
    public static class AMapper extends Mapper<LongWritable, Text, Text, Text> {

        private Text kword = new Text();
        private Text vword = new Text();
        JSONParser parser = new JSONParser();

        public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException {
            try {
                String line = value.toString();
                Object obj = parser.parse(line);
                JSONObject jsonObject = (JSONObject) obj;
                String val = (String) jsonObject.get("m1") + "," + (String) jsonObject.get("m3");
                kword.set((String) jsonObject.get("m0"));
                vword.set(val);
                context.write(kword, vword);
            } catch (IOException e) {
                e.printStackTrace();
            } catch (ParseException e) {
                e.printStackTrace();
            }
        }
    }

    public static class CountryReducer extends Reducer<Text, Text, Text, Text> {

        private Text result = new Text();

        public void reduce(Text key, Iterable<Text> values, Context context)
                throws IOException, InterruptedException {
            int ccount = 0;
            HashMap<Text, Integer> hm = new HashMap<Text, Integer>();
            for (Text val : values) {
                if (hm.containsKey(val)) {
                    Integer n = (Integer) hm.get(val);
                    hm.put(val, n + 1);
                } else {
                    hm.put(val, new Integer(1));
                }
            }
            Set set = hm.entrySet();
            Iterator i = set.iterator();
            String agr = "";
            while (i.hasNext()) {
                Map.Entry me = (Map.Entry) i.next();
                agr += "|" + me.getKey() + me.getValue();
            }
            result.set(agr);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "ALoc");
        job.setJarByClass(ALoc.class);
        job.setMapperClass(AMapper.class);
        job.setReducerClass(CountryReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        job.setInputFormatClass(TextInputFormat.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
When I try to run the job, it gives the error below.
I am running this on a single-node AWS micro instance.
I have been following this tutorial: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
hadoop#domU-18-11-19-02-92-8E:/$ bin/hadoop jar ALoc.jar org.myorg.ALoc /user/hadoop/adata /user/hadoop/adata-op5 -D mapred.reduce.tasks=16
13/02/12 08:39:50 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/02/12 08:39:50 INFO input.FileInputFormat: Total input paths to process : 1
13/02/12 08:39:50 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/02/12 08:39:50 WARN snappy.LoadSnappy: Snappy native library not loaded
13/02/12 08:39:51 INFO mapred.JobClient: Running job: job_201302120714_0006
13/02/12 08:39:52 INFO mapred.JobClient: map 0% reduce 0%
13/02/12 08:40:10 INFO mapred.JobClient: Task Id : attempt_201302120714_0006_m_000000_0, Status : FAILED
java.lang.RuntimeException: Error while running command to get file permissions : java.io.IOException: Cannot run program "/bin/ls": java.io.IOException: error=12, Cannot allocate memory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:475)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:200)
at org.apache.hadoop.util.Shell.run(Shell.java:182)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:710)
at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:443)
at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getOwner(RawLocalFileSystem.java:426)
at org.apache.hadoop.mapred.TaskLog.obtainLogDirOwner(TaskLog.java:267)
at org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:124)
at org.apache.hadoop.mapred.Child$4.run(Child.java:260)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:416)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.io.IOException: java.io.IOException: error=12, Cannot allocate memory
at java.lang.UNIXProcess.<init>(UNIXProcess.java:164)
at java.lang.ProcessImpl.start(ProcessImpl.java:81)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:468)
... 15 more
at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:468)
at org.apache.hadoop.fs.RawLocalFileSystem$RawLocalFileStatus.getOwner(RawLocalFileSystem.java:426)
at org.apache.hadoop.mapred.TaskLog.obtainLogDirOwner(TaskLog.java:267)
at org.apache.hadoop.mapred.TaskLogsTruncater.truncateLogs(TaskLogsTruncater.java:124)
at org.apache.hadoop.mapred.Child$4.run(Child.java:260)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:416)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
I guess you must be trying Hadoop on a micro instance, which has very little memory (~700 MB).
Try lowering the Hadoop heap size (the HADOOP_HEAPSIZE parameter in hadoop/conf/hadoop-env.sh) or adding swap space. The basic reason is a shortage of memory when the JVM forks a child process: the fork momentarily needs to duplicate the parent's address space, so a large heap makes it fail with error=12.
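As a sketch, the relevant knob in hadoop/conf/hadoop-env.sh looks like this (the 256 MB value is an assumption for a ~700 MB instance, not a tested recommendation):

# hadoop/conf/hadoop-env.sh
# The maximum amount of heap to use, in MB (the default is 1000).
# Keeping this small leaves the OS enough room to fork child
# processes such as /bin/ls on a memory-constrained instance.
export HADOOP_HEAPSIZE=256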

Failed to instantiate SLF4J LoggerFactory

So, I'm working from this BoneCP example:
package javasampleapps;

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import com.jolbox.bonecp.BoneCP;
import com.jolbox.bonecp.BoneCPConfig;

/** A test project demonstrating the use of BoneCP in a JDBC environment.
 * @author wwadge
 */
public class BoneCPExample {

    /** Start test
     * @param args none expected.
     */
    public static void main(String[] args) {
        BoneCP connectionPool = null;
        Connection connection = null;
        try {
            // load the database driver (make sure this is in your classpath!)
            Class.forName("com.mysql.jdbc.Driver");
        } catch (Exception e) {
            e.printStackTrace();
            return;
        }
        try {
            // setup the connection pool
            BoneCPConfig config = new BoneCPConfig();
            config.setJdbcUrl("jdbc:mysql://domain/db");
            config.setUsername("root");
            config.setPassword("pass");
            config.setMinConnectionsPerPartition(5);
            config.setMaxConnectionsPerPartition(10);
            config.setPartitionCount(1);
            connectionPool = new BoneCP(config); // setup the connection pool
            connection = connectionPool.getConnection(); // fetch a connection
            if (connection != null) {
                System.out.println("Connection successful!");
                Statement stmt = connection.createStatement();
                ResultSet rs = stmt.executeQuery("SELECT id from batches limit 1"); // do something with the connection
                while (rs.next()) {
                    System.out.println(rs.getString(1)); // should print out "1"
                }
            }
            connectionPool.shutdown(); // shutdown connection pool
        } catch (SQLException e) {
            e.printStackTrace();
        } finally {
            if (connection != null) {
                try {
                    connection.close();
                } catch (SQLException e) {
                    e.printStackTrace();
                }
            }
        }
    }
}
I added SLF4J to my Libraries menu in NetBeans, adding
D:/Documents%20and%20Settings/DavidH/My%20Documents/NetBeansProjects/jars/slf4j-api-1.6.4.jar
and
D:/Documents%20and%20Settings/DavidH/My%20Documents/NetBeansProjects/jars/slf4j-log4j12-1.6.4.jar
to the library.
Then I made a Google Guava library and added the jar that they distribute for that to another library.
I then added both of the libraries to the project and hit run.
I now get this error:
Failed to instantiate SLF4J LoggerFactory
Reported exception:
java.lang.NoClassDefFoundError: org/apache/log4j/Level
at org.slf4j.LoggerFactory.bind(LoggerFactory.java:128)
at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:108)
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:279)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:252)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:265)
at com.jolbox.bonecp.BoneCPConfig.<clinit>(BoneCPConfig.java:60)
at javasampleapps.BoneCPExample.main(BoneCPExample.java:28)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Level
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
... 7 more
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/log4j/Level
at org.slf4j.LoggerFactory.bind(LoggerFactory.java:128)
at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:108)
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:279)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:252)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:265)
at com.jolbox.bonecp.BoneCPConfig.<clinit>(BoneCPConfig.java:60)
at javasampleapps.BoneCPExample.main(BoneCPExample.java:28)
Caused by: java.lang.ClassNotFoundException: org.apache.log4j.Level
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
... 7 more
Java Result: 1
BUILD SUCCESSFUL (total time: 0 seconds)
What can I do to fix this?
If you include slf4j-log4j12-1.6.4.jar, then you must also include the log4j jar. SLF4J is a logging facade, which means it gives you a uniform interface over multiple other logging APIs.
slf4j-log4j12 provides the binding to the log4j API. Since you don't include the log4j library, it throws an error. Alternatively, not including the slf4j-log4j12 library should be enough: if only the slf4j-api library is on the classpath, SLF4J should default to a no-operation logger, AFAIK.
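As a quick sanity check before wiring BoneCP in, a two-line program is enough to exercise the binding (the class name here is made up for illustration). With slf4j-api-1.6.4.jar, slf4j-log4j12-1.6.4.jar, and a log4j 1.2.x jar all on the classpath it logs normally; without the log4j jar it reproduces the NoClassDefFoundError above:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Slf4jBindingCheck {
    public static void main(String[] args) {
        // getLogger() triggers SLF4J initialization, the same step that
        // fails in the BoneCPConfig static initializer above.
        Logger log = LoggerFactory.getLogger(Slf4jBindingCheck.class);
        log.info("SLF4J binding initialized");
    }
}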