Ant sql task throws "no ResultSet available" exception with org.sqlite.JDBC driver

I'm trying to use org.sqlite.JDBC to create and update a SQLite database from Ant.
The sqlitejdbc-v056.jar comes from http://www.zentus.com/sqlitejdbc/ and is the latest version (056).
This is my build.xml:
<?xml version="1.0" encoding="utf-8"?>
<project name="My Project" default="mytarget" basedir=".">
<path id="antclasspath">
<fileset dir="_ant">
<include name="*.jar"/>
</fileset>
</path>
<target name="mytarget">
<property name="antclasspathar" refid="antclasspath" />
<echo message="Classpath is ${antclasspathar}"/>
<sql
driver="org.sqlite.JDBC"
url="jdbc:sqlite:C:/Projects/dummy/test.db"
userid=""
password=""
classpathref="antclasspath"
>
DROP TABLE IF EXISTS people;
CREATE TABLE people (name, occupation);
</sql>
</target>
</project>
This is the output I get:
C:\Projects\dummy>ant -v
Apache Ant version 1.7.1 compiled on June 27 2008
Buildfile: build.xml
Detected Java version: 1.6 in: C:\Program Files (x86)\Java\jdk1.6.0_10\jre
Detected OS: Windows Vista
parsing buildfile C:\Projects\dummy\build.xml with URI = file:/C:/Projects/dummy/build.xml
Project base dir set to: C:\Projects\dummy
[antlib:org.apache.tools.ant] Could not load definitions from resource org/apache/tools/ant/antlib.xml. It could not be found.
Build sequence for target(s) `mytarget' is [mytarget]
Complete build sequence is [mytarget, ]
mytarget:
[echo] Classpath is C:\Projects\dummy\_ant\sqlitejdbc-v056.jar
[sql] connecting to jdbc:sqlite:C:/Projects/dummy/test.db
[sql] Loading org.sqlite.JDBC using AntClassLoader with classpath C:\Projects\dummy\_ant\sqlitejdbc-v056.jar
[sql] Executing commands
[sql] SQL: DROP TABLE IF EXISTS people
[sql] Failed to execute: DROP TABLE IF EXISTS people
BUILD FAILED
java.sql.SQLException: no ResultSet available
at org.sqlite.Stmt.getResultSet(Stmt.java:111)
at org.apache.tools.ant.taskdefs.SQLExec.execSQL(SQLExec.java:567)
at org.apache.tools.ant.taskdefs.SQLExec.runStatements(SQLExec.java:535)
at org.apache.tools.ant.taskdefs.SQLExec$Transaction.runTransaction(SQLExec.java:764)
at org.apache.tools.ant.taskdefs.SQLExec$Transaction.access$000(SQLExec.java:706)
at org.apache.tools.ant.taskdefs.SQLExec.execute(SQLExec.java:449)
at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:288)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
at org.apache.tools.ant.Task.perform(Task.java:348)
at org.apache.tools.ant.Target.execute(Target.java:357)
at org.apache.tools.ant.Target.performTasks(Target.java:385)
at org.apache.tools.ant.Project.executeSortedTargets(Project.java:1337)
at org.apache.tools.ant.Project.executeTarget(Project.java:1306)
at org.apache.tools.ant.helper.DefaultExecutor.executeTargets(DefaultExecutor.java:41)
at org.apache.tools.ant.Project.executeTargets(Project.java:1189)
at org.apache.tools.ant.Main.runBuild(Main.java:758)
at org.apache.tools.ant.Main.startAnt(Main.java:217)
at org.apache.tools.ant.launch.Launcher.run(Launcher.java:257)
at org.apache.tools.ant.launch.Launcher.main(Launcher.java:104)
Total time: 0 seconds

DDL (Data Definition Language, e.g. CREATE, DROP, etc.) statements do not return a ResultSet, while your Ant script is apparently expecting one. At least, that's basically what the SQLException is telling you. I don't do Ant extensively, so I can't go into detail, but you at least need to change the script so that no return value is expected.
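One possible workaround, as a sketch (untested, and assuming the DDL has already executed by the time the driver throws from getResultSet(), so the task would merely log the failure instead of aborting the build), is to let the <sql> task continue past the error via its onerror attribute:
<!-- onerror="continue": log failed statements but keep the build going -->
<sql
driver="org.sqlite.JDBC"
url="jdbc:sqlite:C:/Projects/dummy/test.db"
userid=""
password=""
classpathref="antclasspath"
onerror="continue"
>
DROP TABLE IF EXISTS people;
CREATE TABLE people (name, occupation);
</sql>
Switching to a driver whose Statement.getResultSet() returns null for non-queries, as the JDBC contract requires, may also make the problem disappear.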

Related

org.apache.maven.surefire.booter.SurefireBooterForkException: There was an error in the forked process Test mechanism null

When I execute the tests through Git Bash I get a build failure even though all tests pass, with the following error:
Git Bash command: mvn clean install -B -Pdev -Dcucumber.options=-m -Dtest=EndToEndTestSuite
Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.22.2:test (default-test) on project acceptance-test: There are test failures.
Please refer to dump files (if any exist) [date].dump, [date]-jvmRun[N].dump and [date].dumpstream.
There was an error in the forked process
Test mechanism :: null
org.apache.maven.surefire.booter.SurefireBooterForkException: There was an error in the forked process
Test mechanism :: null
at org.apache.maven.plugin.surefire.booterclient.ForkStarter.fork(ForkStarter.java:656)
at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:282)
at org.apache.maven.plugin.surefire.booterclient.ForkStarter.run(ForkStarter.java:245)
at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeProvider(AbstractSurefireMojo.java:1183)
at org.apache.maven.plugin.surefire.AbstractSurefireMojo.executeAfterPreconditionsChecked(AbstractSurefireMojo.java:1011)
at org.apache.maven.plugin.surefire.AbstractSurefireMojo.execute(AbstractSurefireMojo.java:857)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:957)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:289)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:193)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
My Surefire configuration in the pom.xml looks like this:
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.0.0-M3</version>
<configuration>
<parallel>classes</parallel>
<threadCount>2</threadCount>
<useSystemClassLoader>false</useSystemClassLoader>
</configuration>
</plugin>
Also, I am using JUnit version 4.12.
My intent here is to run the tests in parallel so as to reduce build time.
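For reference, a minimal sketch of an alternative configuration that parallelizes across forked JVMs instead of in-JVM threads (forkCount and reuseForks are standard Surefire parameters; that this setup avoids the fork error here is only an assumption):
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.0.0-M3</version>
<configuration>
<!-- two forked JVMs, reused across test classes, instead of parallel in-JVM threads -->
<forkCount>2</forkCount>
<reuseForks>true</reuseForks>
</configuration>
</plugin>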

Using Xtend in a feature based plugin

I am trying to learn Xtend and Eclipse plugin development at the same time. I created an Eclipse plugin project and added three Xtend classes to implement Parts in the application model. If I leave the project as a plugin-based project and launch it from the product configuration, everything works fine.
However, if I convert it to a feature-based project and try to launch it from the product configuration, I get an error trying to resolve com.google.guava.
Here is the error log:
!SESSION 2018-07-27 06:36:27.121 -----------------------------------------------
eclipse.buildId=unknown
java.version=1.8.0_181
java.vendor=Oracle Corporation
BootLoader constants: OS=win32, ARCH=x86_64, WS=win32, NL=en_US
Framework arguments: -product com.example.e4.rcp.todo.product -clearPersistedState
Command-line arguments: -product com.example.e4.rcp.todo.product -data D:\WiseOldBird\Workspaces\VogellaRcpXtend/runtime-todo.product -dev file:D:/WiseOldBird/Workspaces/VogellaRcpXtend/.metadata/.plugins/org.eclipse.pde.core/todo.product/dev.properties -os win32 -ws win32 -arch x86_64 -consoleLog -clearPersistedState
!ENTRY org.eclipse.equinox.app 0 0 2018-07-27 06:36:28.434
!MESSAGE Product com.example.e4.rcp.todo.product could not be found.
!ENTRY com.example.e4.rcp.todo 2 0 2018-07-27 06:36:28.526
!MESSAGE Could not resolve module: com.example.e4.rcp.todo [75]
Unresolved requirement: Require-Bundle: com.google.guava
!ENTRY org.eclipse.osgi 4 0 2018-07-27 06:36:28.531
!MESSAGE Application error
!STACK 1
java.lang.RuntimeException: No application id has been found.
at org.eclipse.equinox.internal.app.EclipseAppContainer.startDefaultApp(EclipseAppContainer.java:242)
at org.eclipse.equinox.internal.app.MainApplicationLauncher.run(MainApplicationLauncher.java:29)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:134)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:104)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:388)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:243)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:656)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:592)
at org.eclipse.equinox.launcher.Main.run(Main.java:1498)
at org.eclipse.equinox.launcher.Main.main(Main.java:1471)
An error has occurred. See the log file
D:\WiseOldBird\Workspaces\VogellaRcpXtend\.metadata\.plugins\org.eclipse.pde.core\todo.product\1532691387713.log.
Here is the MANIFEST.MF file:
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-Name: Todo
Bundle-SymbolicName: com.example.e4.rcp.todo;singleton:=true
Bundle-Version: 1.0.0.qualifier
Automatic-Module-Name: com.example.e4.rcp.todo
Bundle-RequiredExecutionEnvironment: JavaSE-1.8
Require-Bundle: com.google.guava,
org.eclipse.xtext.xbase.lib,
org.eclipse.xtend.lib,
org.eclipse.xtend.lib.macro
and here is the feature.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<feature
id="com.example.e4.rcp.todo.feature"
label="Feature"
version="1.0.0.qualifier">
<description url="http://www.example.com/description">
[Enter Feature Description here.]
</description>
<copyright url="http://www.example.com/copyright">
[Enter Copyright Description here.]
</copyright>
<license url="http://www.example.com/license">
[Enter License Description here.]
</license>
<requires>
<import plugin="com.google.guava"/>
<import plugin="org.eclipse.xtext.xbase.lib"/>
</requires>
<plugin
id="com.example.e4.rcp.todo"
download-size="0"
install-size="0"
version="0.0.0"
unpack="false"/>
</feature>
I have been unable to figure out how to specify the additional Xtend dependencies in a feature project and need advice.
As the error message says, your product doesn't contain the Guava plugin.
Since you want your product to be feature-based, you have to add a feature that contains the Guava plugin. If necessary, you can define your own feature for that, as in the sketch below.
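For example, a sketch of what that could look like in your own feature.xml (note that an <import> inside <requires> only declares a dependency; it does not include the bundle in the product, so the Xtend runtime bundles have to appear as <plugin> entries in some feature):
<!-- sketch: ship the Xtend runtime bundles with the feature -->
<plugin id="com.google.guava" download-size="0" install-size="0" version="0.0.0" unpack="false"/>
<plugin id="org.eclipse.xtext.xbase.lib" download-size="0" install-size="0" version="0.0.0" unpack="false"/>
<plugin id="org.eclipse.xtend.lib" download-size="0" install-size="0" version="0.0.0" unpack="false"/>
<plugin id="org.eclipse.xtend.lib.macro" download-size="0" install-size="0" version="0.0.0" unpack="false"/>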

Hive 1.2 Metastore Service doesn't start after configuring it for S3 storage instead of HDFS

I have an Apache Spark cluster (2.2.0) in standalone mode. Until now it was using HDFS to store the parquet files. I'm using the Hive Metastore Service of Apache Hive 1.2 to access Spark over JDBC, via the Thriftserver.
Now I want to use S3 Object Storage instead of HDFS. I have added the following configuration to my hive-site.xml:
<property>
<name>fs.s3a.access.key</name>
<value>access_key</value>
<description>Profitbricks Access Key</description>
</property>
<property>
<name>fs.s3a.secret.key</name>
<value>secret_key</value>
<description>Profitbricks Secret Key</description>
</property>
<property>
<name>fs.s3a.endpoint</name>
<value>s3-de-central.profitbricks.com</value>
<description>ProfitBricks S3 Object Storage Endpoint</description>
</property>
<property>
<name>fs.s3a.endpoint.http.port</name>
<value>80</value>
<description>ProfitBricks S3 Object Storage Endpoint HTTP Port</description>
</property>
<property>
<name>fs.s3a.endpoint.https.port</name>
<value>443</value>
<description>ProfitBricks S3 Object Storage Endpoint HTTPS Port</description>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>s3a://dev.spark.my_bucket/parquet/</value>
<description>Profitbricks S3 Object Storage Hive Warehouse Location</description>
</property>
I have the Hive metastore in a MySQL 5.7 database. I have added the following jar files to the Hive lib folder:
aws-java-sdk-1.7.4.jar
hadoop-aws-2.7.3.jar
I have deleted the old Hive metastore schema on MySQL and then started the metastore service with the following command: hive --service metastore & and I get the following error:
java.lang.NoClassDefFoundError: com/fasterxml/jackson/databind/ObjectMapper
at com.amazonaws.util.json.Jackson.<clinit>(Jackson.java:27)
at com.amazonaws.internal.config.InternalConfig.loadfrom(InternalConfig.java:182)
at com.amazonaws.internal.config.InternalConfig.load(InternalConfig.java:199)
at com.amazonaws.internal.config.InternalConfig$Factory.<clinit>(InternalConfig.java:232)
at com.amazonaws.ServiceNameFactory.getServiceName(ServiceNameFactory.java:34)
at com.amazonaws.AmazonWebServiceClient.computeServiceName(AmazonWebServiceClient.java:703)
at com.amazonaws.AmazonWebServiceClient.getServiceNameIntern(AmazonWebServiceClient.java:676)
at com.amazonaws.AmazonWebServiceClient.computeSignerByURI(AmazonWebServiceClient.java:278)
at com.amazonaws.AmazonWebServiceClient.setEndpoint(AmazonWebServiceClient.java:160)
at com.amazonaws.services.s3.AmazonS3Client.setEndpoint(AmazonS3Client.java:475)
at com.amazonaws.services.s3.AmazonS3Client.init(AmazonS3Client.java:447)
at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:391)
at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:371)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:235)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2811)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:104)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:140)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:146)
at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:159)
at org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:177)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:601)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5757)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5990)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5915)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.databind.ObjectMapper
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
The missing class belongs to the Jackson library, so I copied the jackson-*.jar files located in my spark-2.2.0-bin-hadoop2.7/jars/ folder, which are:
jackson-annotations-2.6.5.jar
jackson-core-2.6.5.jar
jackson-core-asl-1.9.13.jar
jackson-databind-2.6.5.jar
jackson-jaxrs-1.9.13.jar
jackson-mapper-asl-1.9.13.jar
jackson-module-paranamer-2.6.5.jar
jackson-module-scala_2.11-2.6.5.jar
jackson-xc-1.9.13.jar
But then I got the following error:
2018-01-05 17:51:00,819 ERROR [main]: metastore.HiveMetaStore (HiveMetaStore.java:main(5920)) - Metastore Thrift Server threw an exception...
java.lang.NumberFormatException: For input string: "100M"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:589)
at java.lang.Long.parseLong(Long.java:631)
at org.apache.hadoop.conf.Configuration.getLong(Configuration.java:1319)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:248)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2811)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:104)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:140)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:146)
at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:159)
at org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:177)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:601)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5757)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5990)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5915)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
I think the error has something to do with some jar version incompatibility, but I'm not able to find the correct versions.
Can someone help me here?
You absolutely cannot mix versions of hadoop-common, hadoop-aws, the AWS S3 SDK and Jackson beyond what everything expects, or you will see stack traces.
And it's all open source, so if you download all the source JARs locally, your IDE will help you find what's causing the stack trace. This is what we all do. It's not magic; modern IDEs (IntelliJ IDEA) even have special stack-trace debugging.
This one is coming in because the value of fs.s3a.multipart.size set in hadoop-common's core-default.xml resource is 100M, which came in with HADOOP-13680 along with the range parsing that handles numbers like "100M" instead of 104857600. This stack trace says "Hadoop 2.8+ configuration".
You could try setting the property in your configs to that numeric value, but it's a warning sign that the versions of the JARs are out of sync, and you will probably only get a few lines further before something else breaks.
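A minimal sketch of that override in hive-site.xml (it only papers over the mismatch; the underlying version skew remains):
<property>
<name>fs.s3a.multipart.size</name>
<!-- plain numeric value (100 MB) that Hadoop 2.7's Configuration.getLong() can parse -->
<value>104857600</value>
</property>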
Fix: make sure that hadoop-common.jar and hadoop-aws.jar are in sync. It looks like you've got the Jackson and AWS ones lined up, though Jackson is complex enough that you can never take that for granted.

Generating a faulty report when running JMeter 3.0 test with Ant

I have been at it for days. Basically I want to create an HTML report from a JMeter test file, using Ant. I am using Linux Ubuntu 16.04.1. Here is my build.xml:
<project name="performance-tests" default="run-performance-tests" basedir=".">
<property name="testpath" value="${user.dir}"/>
<property name="jmeter.home" value="/home/richard/Asjad/apache-jmeter-3.0"/>
<!-- Name of test (without .jmx) -->
<property name="test" value="Test"/>
<path id="jmeter.path">
<fileset dir="${basedir}" includes="/*.jar" />
</path>
<target name="run-performance-tests">
<delete dir="${basedir}/target" quiet="true" failonerror="false"/>
<mkdir dir="${basedir}/target"/>
<!-- Allow jar to be picked up locally -->
<path id="jmeter.classpath">
<fileset dir="${basedir}">
<include name="ant-jmeter*.jar"/>
</fileset>
</path>
<taskdef name="jmeter"
classpathref="jmeter.classpath"
classname="org.programmerplanet.ant.taskdefs.jmeter.JMeterTask" />
<echo message="Running load tests in testing.xml"/>
<jmeter
jmeterhome="${jmeter.home}"
testplan ="${testpath}/${test}.jmx"
resultlog="${basedir}/target/JMeterResults.xml">
<jvmarg value="-Xincgc"/>
<jvmarg value="-Xmx512m"/>
<jvmarg value="-Dproperty=value"/>
<property name="request.threads" value="5"/>
<property name="request.loop" value="50"/>
<property name="jmeter.save.saveservice.assertion_results" value="all"/>
<property name="jmeter.save.saveservice.output_format" value="xml"/>
</jmeter>
<xslt in="${basedir}/target/JMeterResults.xml"
out="${basedir}/target/Test.html"
style="${basedir}/jmeter-results-detail-report.xsl"/>
</target>
</project>
This is what I see when I run the script:
Buildfile: /home/richard/Asjad/apache-jmeter-3.0/extras/build.xml
run-performance-tests:
[delete] Deleting directory /home/richard/Asjad/apache-jmeter-3.0/extras/target
[mkdir] Created dir: /home/richard/Asjad/apache-jmeter-3.0/extras/target
[echo] Running load tests in testing.xml
[jmeter] Executing test plan: /home/richard/Asjad/apache-jmeter-3.0/extras/TestPlan/Test.jmx ==> /home/richard/Asjad/apache-jmeter-3.0/extras/target/JMeterResults.xml
[jmeter] Java HotSpot(TM) 64-Bit Server VM warning: Using incremental CMS is deprecated and will likely be removed in a future release
[jmeter] Writing log file to: /home/richard/Asjad/apache-jmeter-3.0/bin/jmeter.log
[jmeter] Creating summariser <summary>
[jmeter] Created the tree successfully using /home/richard/Asjad/apache-jmeter-3.0/extras/TestPlan/Test.jmx
[jmeter] Starting the test # Fri Jul 29 11:57:25 EEST 2016 (1469782645099)
[jmeter] Waiting for possible Shutdown/StopTestNow/Heapdump message on port 4445
[jmeter] summary = 10 in 00:00:02 = 4.6/s Avg: 214 Min: 105 Max: 344 Err: 0 (0.00%)
[jmeter] Tidying up ... # Fri Jul 29 11:57:27 EEST 2016 (1469782647345)
[jmeter] ... end of run
[xslt] Processing /home/richard/Asjad/apache-jmeter-3.0/extras/target/JMeterResults.xml to /home/richard/Asjad/apache-jmeter-3.0/extras/target/Result_One.html
[xslt] Loading stylesheet /home/richard/Asjad/apache-jmeter-3.0/extras/jmeter-results-detail-report.xsl
BUILD SUCCESSFUL
Total time: 3 seconds
The build succeeds and gives me an HTML file, but when I open it I see a broken report:
(screenshot of the faulty HTML report)
I would be really thankful if someone could at least tell me my mistake or the direction I am supposed to go in.
It looks like the JMeter developers broke the jmeter-results-detail-report.xsl file; the bug was reported here:
https://bz.apache.org/bugzilla/show_bug.cgi?id=59918
It will be fixed in the next Apache JMeter version, 3.1.
In order to revert to the previous behaviour, take the following steps:
Download apache-jmeter-2.13.zip
Extract the jmeter-results-detail-report_21.xsl file from the apache-jmeter-2.13/extras folder and drop it into the "extras" folder of your JMeter 3.0 installation
Modify your build.xml file to use the jmeter-results-detail-report_21.xsl file, like:
<xslt in="${basedir}/target/JMeterResults.xml"
out="${basedir}/target/Test.html"
style="${basedir}/jmeter-results-detail-report_21.xsl"/>
Alternatively you can get the XSLT file, e.g. from here.
See Visualizing JMeter .jtl Files Viewed as An .xsl Stylesheet article for more details on transforming JMeter XML result files to HTML format.
It's a bug that I reported here:
https://bz.apache.org/bugzilla/show_bug.cgi?id=59918
It will be fixed in the next Apache JMeter version, 3.1 or 3.0.1.
You can test whether it's OK by using the nightly build available here:
http://jmeter.apache.org/nightly.html
Why still use Ant + XSLT to generate a report, when since 3.0 there is a new web report with lots of dynamic graphs and tables with significant metrics?
See :
https://jmeter.apache.org/usermanual/generating-dashboard.html
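For reference, a minimal sketch of generating that dashboard from Ant by shelling out to the JMeter CLI (the run-dashboard target name is made up here; it reuses the ${jmeter.home}, ${testpath} and ${test} properties from the build file above, and -n, -t, -l, -e and -o are standard JMeter 3.0 command-line options):
<target name="run-dashboard">
<!-- run the plan in non-GUI mode (-n -t), log results (-l), and generate the HTML dashboard (-e) into an output folder (-o) -->
<exec executable="${jmeter.home}/bin/jmeter" failonerror="true">
<arg line="-n -t ${testpath}/${test}.jmx"/>
<arg line="-l ${basedir}/target/results.jtl"/>
<arg line="-e -o ${basedir}/target/dashboard"/>
</exec>
</target>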

JBoss 7.1 - SQL Server - Datasource configuration (JTDS)

I've been fighting with this problem for several days and have not been able to solve it. I've got a server with an instance of SQL Server 2008 R2 and a JBoss 7.1 installation (I'm using the standalone configuration). I was trying to configure a datasource in the application server to connect to the database, using the jTDS driver. The application server starts correctly, but when I try to test the datasource through the admin console it spools out the following error:
17:49:42,117 WARN [org.jboss.jca.core.connectionmanager.pool.strategy.OnePool] (HttpManagementService-threads - 1) IJ000604: Throwable while attempting to get a new connection: null: javax.resource.ResourceException: Could not create connection
at org.jboss.jca.adapters.jdbc.local.LocalManagedConnectionFactory.getLocalManagedConnection(LocalManagedConnectionFactory.java:277) [ironjacamar-jdbc-1.0.9.Final.jar:1.0.9.Final]
at org.jboss.jca.adapters.jdbc.local.LocalManagedConnectionFactory.createManagedConnection(LocalManagedConnectionFactory.java:235) [ironjacamar-jdbc-1.0.9.Final.jar:1.0.9.Final]
at org.jboss.jca.core.connectionmanager.pool.mcp.SemaphoreArrayListManagedConnectionPool.createConnectionEventListener(SemaphoreArrayListManagedConnectionPool.java:761) [ironjacamar-core-impl-1.0.9.Final.jar:1.0.9.Final]
at org.jboss.jca.core.connectionmanager.pool.mcp.SemaphoreArrayListManagedConnectionPool.getConnection(SemaphoreArrayListManagedConnectionPool.java:343) [ironjacamar-core-impl-1.0.9.Final.jar:1.0.9.Final]
at org.jboss.jca.core.connectionmanager.pool.AbstractPool.getSimpleConnection(AbstractPool.java:397) [ironjacamar-core-impl-1.0.9.Final.jar:1.0.9.Final]
at org.jboss.jca.core.connectionmanager.pool.AbstractPool.getConnection(AbstractPool.java:365) [ironjacamar-core-impl-1.0.9.Final.jar:1.0.9.Final]
at org.jboss.jca.core.connectionmanager.pool.AbstractPool.internalTestConnection(AbstractPool.java:627) [ironjacamar-core-impl-1.0.9.Final.jar:1.0.9.Final]
at org.jboss.jca.core.connectionmanager.pool.strategy.OnePool.testConnection(OnePool.java:88) [ironjacamar-core-impl-1.0.9.Final.jar:1.0.9.Final]
at org.jboss.as.connector.pool.PoolOperations$TestConnectionInPool.invokeCommandOn(PoolOperations.java:121) [jboss-as-connector-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.connector.pool.PoolOperations$1.execute(PoolOperations.java:60) [jboss-as-connector-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.AbstractOperationContext.executeStep(AbstractOperationContext.java:385) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.AbstractOperationContext.doCompleteStep(AbstractOperationContext.java:272) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.AbstractOperationContext.completeStep(AbstractOperationContext.java:200) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.connector.pool.PoolOperations.execute(PoolOperations.java:74) [jboss-as-connector-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.AbstractOperationContext.executeStep(AbstractOperationContext.java:385) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.AbstractOperationContext.doCompleteStep(AbstractOperationContext.java:272) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.AbstractOperationContext.completeStep(AbstractOperationContext.java:200) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.ModelControllerImpl$DefaultPrepareStepHandler.execute(ModelControllerImpl.java:466) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.AbstractOperationContext.executeStep(AbstractOperationContext.java:385) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.AbstractOperationContext.doCompleteStep(AbstractOperationContext.java:272) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.AbstractOperationContext.completeStep(AbstractOperationContext.java:200) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.ModelControllerImpl.execute(ModelControllerImpl.java:121) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.ModelControllerImpl$1.execute(ModelControllerImpl.java:309) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.controller.ModelControllerImpl$1.execute(ModelControllerImpl.java:299) [jboss-as-controller-7.1.1.Final.jar:7.1.1.Final]
at org.jboss.as.domain.http.server.DomainApiHandler.processRequest(DomainApiHandler.java:294)
at org.jboss.as.domain.http.server.DomainApiHandler.doHandle(DomainApiHandler.java:201)
at org.jboss.as.domain.http.server.DomainApiHandler.handle(DomainApiHandler.java:208)
at org.jboss.as.domain.http.server.security.SubjectAssociationHandler.handle(SubjectAssociationHandler.java:51)
at org.jboss.com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:78)
at org.jboss.sun.net.httpserver.AuthFilter.doFilter(AuthFilter.java:69)
at org.jboss.com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:81)
at org.jboss.sun.net.httpserver.ServerImpl$Exchange$LinkHandler.handle(ServerImpl.java:710)
at org.jboss.com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:78)
at org.jboss.as.domain.http.server.RealmReadinessFilter.doFilter(RealmReadinessFilter.java:54)
at org.jboss.com.sun.net.httpserver.Filter$Chain.doFilter(Filter.java:81)
at org.jboss.sun.net.httpserver.ServerImpl$Exchange.run(ServerImpl.java:682)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1110) [rt.jar:1.7.0_11]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:603) [rt.jar:1.7.0_11]
at java.lang.Thread.run(Thread.java:722) [rt.jar:1.7.0_11]
at org.jboss.threads.JBossThread.run(JBossThread.java:122) [jboss-threads-2.0.0.GA.jar:2.0.0.GA]
Caused by: java.sql.SQLException: Cannot open database "SQLEXPRESS" requested by the login. The login failed.
at net.sourceforge.jtds.jdbc.SQLDiagnostic.addDiagnostic(SQLDiagnostic.java:368)
at net.sourceforge.jtds.jdbc.TdsCore.tdsErrorToken(TdsCore.java:2820)
at net.sourceforge.jtds.jdbc.TdsCore.nextToken(TdsCore.java:2258)
at net.sourceforge.jtds.jdbc.TdsCore.login(TdsCore.java:603)
at net.sourceforge.jtds.jdbc.ConnectionJDBC2.<init>(ConnectionJDBC2.java:345)
at net.sourceforge.jtds.jdbc.ConnectionJDBC3.<init>(ConnectionJDBC3.java:50)
at net.sourceforge.jtds.jdbc.Driver.connect(Driver.java:184)
at org.jboss.jca.adapters.jdbc.local.LocalManagedConnectionFactory.getLocalManagedConnection(LocalManagedConnectionFactory.java:249) [ironjacamar-jdbc-1.0.9.Final.jar:1.0.9.Final]
... 39 more
It seems like a simple authentication problem, but the user/password is absolutely correct! Through SQL Server Management Studio I'm able to connect to the db correctly using the following:
Server name: DAN-Aladino-vs.usersad.everis.int\SQLEXPRESS
Authentication: SQL Server Authentication
Login: aladinoDs
Password: aladinoDs
To configure the datasource I've taken the following steps:
1) In JBoss I've created the directory "modules\net\sourceforge\jtds\main".
Inside it I've put jtds-1.2.5.jar and a new module.xml with the following content:
<?xml version="1.0" encoding="UTF-8"?>
<module xmlns="urn:jboss:module:1.0" name="net.sourceforge.jtds">
<resources>
<resource-root path="jtds-1.2.5.jar"/>
<!-- Insert resources here -->
</resources>
<dependencies>
<module name="javax.api"/>
<module name="javax.transaction.api"/>
</dependencies>
</module>
2) I've modified the standalone.xml configuration file, adding the following:
<datasource jndi-name="java:jboss/datasources/AladinoDS" pool-name="AladinoDS" enabled="true" use-java-context="true">
<connection-url>jdbc:jtds:sqlserver://DAN-Aladino-vs.usersad.everis.int:1433/SQLEXPRESS</connection-url>
<driver>JTDS</driver>
<new-connection-sql>select 1</new-connection-sql>
<transaction-isolation>TRANSACTION_READ_COMMITTED</transaction-isolation>
<pool>
<min-pool-size>5</min-pool-size>
<max-pool-size>50</max-pool-size>
</pool>
<security>
<user-name>aladinoDs</user-name>
<password>aladinoDs</password>
</security>
<validation>
<check-valid-connection-sql>select 1</check-valid-connection-sql>
</validation>
<timeout>
<set-tx-query-timeout>true</set-tx-query-timeout>
<blocking-timeout-millis>5000</blocking-timeout-millis>
<idle-timeout-minutes>15</idle-timeout-minutes>
</timeout>
<statement>
<track-statements>false</track-statements>
</statement>
</datasource>
and, in the <drivers> section:
<driver name="JTDS" module="net.sourceforge.jtds">
<driver-class>net.sourceforge.jtds.jdbc.Driver</driver-class>
</driver>
According to what I've found on the web this should be correct, but it still does not work. By the way, I don't want to use Windows Authentication for the datasource, but I tried that way too, unsuccessfully.
I hope someone can find something wrong in my configuration. If it's correct, could it be a problem with the database server/instance configuration? I'm puzzled... through SQL Server Management Studio everything seems to work.
Thank you all,
Luca
(Answered in comments and edits. See Question with no answers, but issue solved in the comments (or extended in chat).)
The OP wrote:
@Jon Skeet: The SQL Server error log shows the following:
01/25/2013 09:47:02,Logon,Unknown,Login failed for user 'aladinoDs'. Reason: Failed to open the explicitly specified database.
So, the problem should be that I'm not using the proper database name.
@CoolBeans: I'm not used to SQL Server (usually I work with Oracle DB), and I'm not familiar with its distinction between databases/instances/logins, but if I'm using the credentials above to successfully log on to the db through SQL Server Management Studio, why am I not able to use them for the datasource? When I connect to the db server, I can see in SQL Server Management Studio, in the "databases" folder, two objects: System Databases and AladinoSFA2. Should I use the latter as the server name in the connection string? That will be my next try. I'll post an update.
SOLVED:
I corrected the connection string: <connection-url>jdbc:jtds:sqlserver://DAN-Aladino-vs.usersad.everis.int:1433/AladinoSFA2</connection-url>
It seems that I have to use the database name, and not the database server name, in the connection string. I don't really understand why, using SQL Server Management Studio, I don't need to specify the database name but only the db server name, while in the datasource, on the contrary, the database name is the one needed.
However, it all works now.
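For reference, in a jTDS connection URL the path segment after the port is the database name, while a named SQL Server instance goes into a separate instance property, following the jTDS URL format jdbc:jtds:sqlserver://<server>[:<port>][/<database>][;instance=<name>] (whether the explicit instance part is needed when connecting on a fixed port is left as an open question):
<!-- sketch: database in the path, named instance as a property -->
<connection-url>jdbc:jtds:sqlserver://DAN-Aladino-vs.usersad.everis.int:1433/AladinoSFA2;instance=SQLEXPRESS</connection-url>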