com.mysql.jdbc.Driver not found on classpath while starting spark sql and thrift server - mysql

I am receiving the errors below when starting the spark-sql shell.
It works when I start the shell using the following command:
./spark-sql --jars /usr/local/hive/lib/mysql-connector-java.jar
But when I start the Thrift server in the same way using the command below, it throws the same error again.
/usr/local/spark/sbin/start-thriftserver.sh --jars /usr/local/hive/lib/mysql-connector-java.jar
Please help me understand how this can be resolved so that I don't have to pass the jar path externally, and why it works for the spark-sql case and not for the Thrift server. Do I need to set a classpath somewhere that I am missing?
Please let me know if you need anything else.
15/10/18 05:15:33 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/10/18 05:15:33 INFO server.AbstractConnector: Started SocketConnector#0.0.0.0:47703
15/10/18 05:15:33 INFO util.Utils: Successfully started service 'HTTP file server' on port 47703.
15/10/18 05:15:33 INFO spark.SparkEnv: Registering OutputCommitCoordinator
15/10/18 05:15:38 INFO server.Server: jetty-8.y.z-SNAPSHOT
15/10/18 05:15:38 INFO server.AbstractConnector: Started SelectChannelConnector#0.0.0.0:4040
15/10/18 05:15:38 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
15/10/18 05:15:38 INFO ui.SparkUI: Started SparkUI at http://192.168.1.12:4040
15/10/18 05:15:38 INFO spark.SparkContext: Added JAR file:/usr/local/hive/lib/mysql-connector-java.jar at http://192.168.1.12:47703/jars/mysql-connector-java.jar with timestamp 1445125538564
15/10/18 05:15:38 INFO client.AppClient$ClientActor: Connecting to master akka.tcp://sparkMaster#192.168.1.12:7077/user/Master...
15/10/18 05:15:38 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20151018051538-0018
15/10/18 05:15:38 INFO client.AppClient$ClientActor: Executor added: app-20151018051538-0018/0 on worker-20151018024224-192.168.1.12-50211 (192.168.1.12:50211) with 4 cores
15/10/18 05:15:38 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20151018051538-0018/0 on hostPort 192.168.1.12:50211 with 4 cores, 512.0 MB RAM
15/10/18 05:15:38 INFO client.AppClient$ClientActor: Executor updated: app-20151018051538-0018/0 is now LOADING
15/10/18 05:15:38 INFO client.AppClient$ClientActor: Executor updated: app-20151018051538-0018/0 is now RUNNING
15/10/18 05:15:39 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 43460.
15/10/18 05:15:39 INFO netty.NettyBlockTransferService: Server created on 43460
15/10/18 05:15:39 INFO storage.BlockManagerMaster: Trying to register BlockManager
15/10/18 05:15:39 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.1.12:43460 with 265.4 MB RAM, BlockManagerId(driver, 192.168.1.12, 43460)
15/10/18 05:15:39 INFO storage.BlockManagerMaster: Registered BlockManager
15/10/18 05:15:39 INFO cluster.SparkDeploySchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
15/10/18 05:15:40 INFO hive.HiveContext: Initializing execution hive, version 0.13.1
15/10/18 05:15:40 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
15/10/18 05:15:40 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/10/18 05:15:40 INFO metastore.ObjectStore: ObjectStore, initialize called
15/10/18 05:15:41 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/10/18 05:15:41 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/10/18 05:15:41 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/10/18 05:15:41 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/10/18 05:15:42 INFO cluster.SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor#192.168.1.12:56227/user/Executor#-1120183734]) with ID 0
15/10/18 05:15:42 INFO storage.BlockManagerMasterEndpoint: Registering block manager 192.168.1.12:34713 with 265.4 MB RAM, BlockManagerId(0, 192.168.1.12, 34713)
15/10/18 05:15:52 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
15/10/18 05:15:52 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
15/10/18 05:15:52 INFO metastore.MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "#" (64), after : "".
15/10/18 05:15:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/10/18 05:15:54 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/10/18 05:16:01 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
15/10/18 05:16:01 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
15/10/18 05:16:03 INFO metastore.ObjectStore: Initialized ObjectStore
15/10/18 05:16:04 WARN metastore.ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 0.13.1aa
15/10/18 05:16:05 INFO metastore.HiveMetaStore: Added admin role in metastore
15/10/18 05:16:05 INFO metastore.HiveMetaStore: Added public role in metastore
15/10/18 05:16:05 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
15/10/18 05:16:05 INFO session.SessionState: No Tez session required at this point. hive.execution.engine=mr.
15/10/18 05:16:05 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
15/10/18 05:16:05 INFO hive.HiveContext: Initializing HiveMetastoreConnection version 0.13.1 using Spark classes.
15/10/18 05:16:06 WARN conf.HiveConf: DEPRECATED: hive.metastore.ds.retry.* no longer has any effect. Use hive.hmshandler.retry.* instead
15/10/18 05:16:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/10/18 05:16:06 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
15/10/18 05:16:06 INFO metastore.ObjectStore: ObjectStore, initialize called
15/10/18 05:16:07 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
15/10/18 05:16:07 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
15/10/18 05:16:07 WARN DataNucleus.Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:105)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.liftedTree1$1(IsolatedClientLoader.scala:170)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.<init>(IsolatedClientLoader.scala:166)
at org.apache.spark.sql.hive.HiveContext.metadataHive$lzycompute(HiveContext.scala:212)
at org.apache.spark.sql.hive.HiveContext.metadataHive(HiveContext.scala:175)
at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:55)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:73)
at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1412)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2453)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2465)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:340)
... 21 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1410)
... 26 more
Caused by: javax.jdo.JDOFatalInternalException: Error creating transactional connection factory
NestedThrowables:
java.lang.reflect.InvocationTargetException
at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)
at java.security.AccessController.doPrivileged(Native Method)
at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)
at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)
at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)
at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:310)
at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:339)
at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:248)
at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:223)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58)
at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:497)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:475)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:523)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:397)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:356)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4944)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:171)
... 31 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)
at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)
at org.datanucleus.store.AbstractStoreManager.<init>(AbstractStoreManager.java:240)
at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:286)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)
at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)
at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)
at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)
at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)
... 60 more
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "dbcp-builtin" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:259)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl.<init>(ConnectionFactoryImpl.java:85)
... 78 more
Caused by: org.datanucleus.store.rdbms.connectionpool.DatastoreDriverNotFoundException: The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:58)
at org.datanucleus.store.rdbms.connectionpool.DBCPBuiltinConnectionPoolFactory.createConnectionPool(DBCPBuiltinConnectionPoolFactory.java:49)
at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)
... 80 more
15/10/18 05:16:07 INFO spark.SparkContext: Invoking stop() from shutdown hook

In Spark 2.x, copy mysql-connector-java-5.1.38-bin.jar to the Spark jars directory:
$ cp $HIVE_HOME/lib/mysql-connector-java-5.1.38-bin.jar $SPARK_HOME/jars/
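A quick sanity check that the connector actually landed on Spark's default classpath (assuming $SPARK_HOME and $HIVE_HOME are set in your environment):
ls $SPARK_HOME/jars/ | grep mysql-connector
The Thrift server should then start without needing the --jars flag.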

The main problem:
"Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient"
You have already copied the MySQL connector into the Hive lib directory; add that jar to the classpath in the spark-env.sh file:
export SPARK_CLASSPATH="/home/hadoop/work/apache-hive-2.0.0-bin/lib/mysql-connector-java-5.1.38-bin.jar"
Finally, place hive-site.xml in the Spark conf folder. This should resolve the problem.
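Note that SPARK_CLASSPATH is deprecated in recent Spark releases; an equivalent, hedged sketch using spark-defaults.conf instead (the jar path is the illustrative one from above):
spark.driver.extraClassPath   /home/hadoop/work/apache-hive-2.0.0-bin/lib/mysql-connector-java-5.1.38-bin.jar
spark.executor.extraClassPath /home/hadoop/work/apache-hive-2.0.0-bin/lib/mysql-connector-java-5.1.38-bin.jar
Since the metastore connection is opened by the driver, spark.driver.extraClassPath is the setting that matters for this particular error.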

Try including the jar in SPARK_CLASSPATH; you can set that in spark-env.sh as well. Which version of Spark are you using? In Spark 1.3 and lower, --jars has issues with adding JDBC drivers.
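One more thing worth trying for the Thrift server case, on the assumption that the metastore JDBC connection is opened by the driver before the --jars jars take effect: pass the jar on the driver classpath explicitly (same jar path as in the question):
/usr/local/spark/sbin/start-thriftserver.sh --driver-class-path /usr/local/hive/lib/mysql-connector-java.jar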

Add SPARK_HOME to your ~/.bash_profile file, like the following:
# set spark directory
export SPARK_HOME=/var/www/python_project/extras/spark/spark-2.4.4-bin-hadoop2.7
Once you have saved the file, run the following command:
source ~/.bash_profile
This saved my day :)
Hope this helps.

Related

Apache Drill 1.17.0 on Windows 10 - Trouble Getting Drill to Run (Embedded Mode)

Details:
Apache Drill 1.17.0
Windows 10 64 bit
Java JDK1.8.0_241
New installation. Unable to get Apache Drill to load successfully.
Command line: c:\Users\floodb\Software\Drill\apache-drill-1.17.0\bin>drill-embedded
Error Received: Error: Failure in starting embedded Drillbit: UNSUPPORTED_OPERATION ERROR: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry.
[Error Id: 7c1b33eb-7a27-4e39-af06-5ba22e5ffae6 ] (state=,code=0)
java.sql.SQLException: Failure in starting embedded Drillbit: UNSUPPORTED_OPERATION ERROR: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry.
There is no 'hadoop_home' environment variable set (as suggested by other posts on StackOverflow).
Partial Log:
2020-02-19 15:55:42,315 [main] INFO o.a.drill.common.util.GuavaPatcher - Google's Stopwatch patched for old HBase Guava version.
2020-02-19 15:55:42,319 [main] INFO o.a.drill.common.util.GuavaPatcher - Google's Closeables patched for old HBase Guava version.
2020-02-19 15:55:42,333 [main] INFO o.a.drill.common.util.GuavaPatcher - Google's Preconditions were patched to hold new methods.
2020-02-19 15:55:42,693 [main] INFO o.a.drill.common.config.DrillConfig - Configuration and plugin file(s) identified in 32ms. Base Configuration:
- jar:file:/C:/Users/floodb/Software/Drill/apache-drill-1.17.0/jars/drill-common-1.17.0.jar!/drill-default.conf
(Bunch of log lines deleted)
2020-02-19 15:55:45,134 [main] INFO o.a.d.c.s.persistence.ScanResult - loading 22 classes for org.apache.drill.common.logical.data.LogicalOperator took 4ms
2020-02-19 15:55:45,138 [main] INFO o.a.d.c.s.persistence.ScanResult - loading 12 classes for org.apache.drill.common.logical.StoragePluginConfig took 3ms
2020-02-19 15:55:45,146 [main] INFO o.a.d.c.s.persistence.ScanResult - loading 15 classes for org.apache.drill.common.logical.FormatPluginConfig took 7ms
2020-02-19 15:55:45,179 [main] INFO o.a.drill.common.config.DrillConfig - User Error Occurred: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry. (null)
org.apache.drill.common.exceptions.UserException: UNSUPPORTED_OPERATION ERROR: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry.
[Error Id: 7c1b33eb-7a27-4e39-af06-5ba22e5ffae6 ]
at org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:637)
at org.apache.drill.common.config.DrillConfig.getInstance(DrillConfig.java:92)
at org.apache.drill.exec.server.DrillbitContext.<init>(DrillbitContext.java:113)
at org.apache.drill.exec.work.WorkManager.start(WorkManager.java:116)
at org.apache.drill.exec.server.Drillbit.run(Drillbit.java:221)
at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init>(DrillConnectionImpl.java:134)
at org.apache.drill.jdbc.impl.DrillJdbc41Factory.newDrillConnection(DrillJdbc41Factory.java:67)
at org.apache.drill.jdbc.impl.DrillFactory.newConnection(DrillFactory.java:67)
at org.apache.calcite.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:138)
at org.apache.drill.jdbc.Driver.connect(Driver.java:75)
at sqlline.DatabaseConnection.connect(DatabaseConnection.java:135)
at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:192)
at sqlline.Commands.connect(Commands.java:1364)
at sqlline.Commands.connect(Commands.java:1244)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:38)
at sqlline.SqlLine.dispatch(SqlLine.java:730)
at sqlline.SqlLine.initArgs(SqlLine.java:410)
at sqlline.SqlLine.begin(SqlLine.java:515)
at sqlline.SqlLine.start(SqlLine.java:267)
at sqlline.SqlLine.main(SqlLine.java:206)
Caused by: java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.drill.common.config.DrillConfig.getInstance(DrillConfig.java:88)
... 22 common frames omitted
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:645)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1230)
at org.apache.hadoop.fs.FileUtil.list(FileUtil.java:1435)
at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:493)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1868)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1910)
at org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:678)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1868)
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1910)
at org.apache.drill.exec.store.dfs.DrillFileSystem.listStatus(DrillFileSystem.java:563)
at org.apache.drill.exec.util.FileSystemUtil.listNonRecursive(FileSystemUtil.java:224)
at org.apache.drill.exec.util.FileSystemUtil.list(FileSystemUtil.java:209)
at org.apache.drill.exec.util.FileSystemUtil.listFiles(FileSystemUtil.java:104)
at org.apache.drill.exec.util.DrillFileSystemUtil.listFiles(DrillFileSystemUtil.java:86)
at org.apache.drill.exec.store.sys.store.LocalPersistentStore.getRange(LocalPersistentStore.java:121)
at org.apache.drill.exec.store.sys.BasePersistentStore.getAll(BasePersistentStore.java:27)
at org.apache.drill.exec.store.StoragePluginRegistryImpl.initPluginsSystemTable(StoragePluginRegistryImpl.java:277)
at org.apache.drill.exec.store.StoragePluginRegistryImpl.<init>(StoragePluginRegistryImpl.java:90)
... 27 common frames omitted
2020-02-19 15:55:46,199 [main] INFO o.apache.drill.exec.server.Drillbit - Shutdown completed (1018 ms).
The problem was that the 32-bit version of the Java JDK was installed. If you are having this problem, check to make sure that the 64-bit version of Java is installed.
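A quick way to check the bitness, assuming the java on your PATH is the JDK in question:
java -version
A 64-bit HotSpot JVM identifies itself with "64-Bit Server VM" in the last line of this output; a 32-bit build reports a plain "Client VM" or "Server VM" without the 64-Bit marker.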

Newbie in Apache Drill: Cannot see Web Console

I downloaded apache-drill-1.2.0 on an Ubuntu 14.04 64-bit box.
I extracted the tar.gz contents, went to the bin folder, and ran Drill.
Then I tried to open http://localhost:8047, but I'm getting a "can't establish a connection to server" error.
I tried to enable HTTPS with http.ssl_enabled: "TRUE", but I still cannot open the Web Console using either http or https.
Here are some relevant logs:
himanshu#himanshu-HP-ProBook-4430s:~/Downloads/softwares/apache-drill-1.2.0/bin$ ./drill-embedded
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/classb/logback-classic-1.0.13.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/11/13 15:38:56 INFO config.DrillConfig: Loading base configuration file at jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-default.conf.
15/11/13 15:38:56 INFO config.DrillConfig: Loading 7 module configuration files at:
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hbase-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-hive-exec-shaded-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-jdbc-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-java-exec-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-mongo-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hive-core-1.2.0.jar!/drill-module.conf.
15/11/13 15:38:56 INFO config.DrillConfig: Loading override config. file at file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/conf/drill-override.conf.
15/11/13 15:38:56 INFO config.DrillConfig: Loading override Properties parameter {user=, password=, zk=local}.
15/11/13 15:39:00 INFO reflections.Reflections: Reflections took 4050 ms to scan 7 urls, producing 5602 keys and 21840 values
com.google.common.base.Stopwatch.elapsed(Ljava/util/concurrent/TimeUnit;)J
apache drill 1.2.0
"got drill?"
0: jdbc:drill:zk=local> !quit
15/11/13 15:39:02 INFO config.DrillConfig: Loading base configuration file at jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-default.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading 7 module configuration files at:
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hbase-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-hive-exec-shaded-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-jdbc-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-java-exec-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-mongo-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hive-core-1.2.0.jar!/drill-module.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading override config. file at file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/conf/drill-override.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading override Properties parameter {user=, password=, zk=local}.
java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.elapsed(Ljava/util/concurrent/TimeUnit;)J
at org.apache.drill.common.util.PathScanner.scanForImplementations(PathScanner.java:110)
at org.apache.drill.common.util.PathScanner.scanForImplementationsArr(PathScanner.java:86)
at org.apache.drill.common.logical.data.LogicalOperatorBase.getSubTypes(LogicalOperatorBase.java:92)
at org.apache.drill.common.config.DrillConfig.<init>(DrillConfig.java:82)
at org.apache.drill.common.config.DrillConfig.create(DrillConfig.java:226)
at org.apache.drill.common.config.DrillConfig.create(DrillConfig.java:152)
at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init>(DrillConnectionImpl.java:92)
at org.apache.drill.jdbc.impl.DrillJdbc41Factory.newDrillConnection(DrillJdbc41Factory.java:66)
at org.apache.drill.jdbc.impl.DrillFactory.newConnection(DrillFactory.java:69)
at net.hydromatic.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:126)
at org.apache.drill.jdbc.Driver.connect(Driver.java:72)
at sqlline.DatabaseConnection.connect(DatabaseConnection.java:167)
at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:213)
at sqlline.Commands.close(Commands.java:925)
at sqlline.Commands.quit(Commands.java:889)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
at sqlline.SqlLine.dispatch(SqlLine.java:742)
at sqlline.SqlLine.begin(SqlLine.java:621)
at sqlline.SqlLine.start(SqlLine.java:375)
at sqlline.SqlLine.main(SqlLine.java:268)
15/11/13 15:39:02 INFO config.DrillConfig: Loading base configuration file at jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-default.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading 7 module configuration files at:
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hbase-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-hive-exec-shaded-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-jdbc-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-java-exec-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-mongo-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hive-core-1.2.0.jar!/drill-module.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading override config. file at file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/conf/drill-override.conf.
15/11/13 15:39:03 INFO config.DrillConfig: Loading override Properties parameter {user=, password=, zk=local}.
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.elapsed(Ljava/util/concurrent/TimeUnit;)J
at org.apache.drill.common.util.PathScanner.scanForImplementations(PathScanner.java:110)
at org.apache.drill.common.util.PathScanner.scanForImplementationsArr(PathScanner.java:86)
at org.apache.drill.common.logical.data.LogicalOperatorBase.getSubTypes(LogicalOperatorBase.java:92)
at org.apache.drill.common.config.DrillConfig.<init>(DrillConfig.java:82)
at org.apache.drill.common.config.DrillConfig.create(DrillConfig.java:226)
at org.apache.drill.common.config.DrillConfig.create(DrillConfig.java:152)
at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init>(DrillConnectionImpl.java:92)
at org.apache.drill.jdbc.impl.DrillJdbc41Factory.newDrillConnection(DrillJdbc41Factory.java:66)
at org.apache.drill.jdbc.impl.DrillFactory.newConnection(DrillFactory.java:69)
at net.hydromatic.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:126)
at org.apache.drill.jdbc.Driver.connect(Driver.java:72)
at sqlline.DatabaseConnection.connect(DatabaseConnection.java:167)
at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:213)
at sqlline.Commands.close(Commands.java:925)
at sqlline.Commands.closeall(Commands.java:899)
at sqlline.SqlLine.begin(SqlLine.java:649)
at sqlline.SqlLine.start(SqlLine.java:375)
at sqlline.SqlLine.main(SqlLine.java:268)
Guava library present in distribution: guava-14.0.1
Please help.
You started with ./drill-embedded
Right. First of all you need to check whether Drill is actually running on port 8047.
Use this command to check whether any process is listening on that port:
netstat -tlnp | grep 8047
You can also use the jps command and check whether there is a process named sqlline.
If it shows a process running on that port, then there may be an issue with how localhost is mapped; in that case try
http://<IP_ADDRESS>:8047
You should start Drill first, then try to open localhost:8047.

Suppressing stack trace when catching exception

I have a short 2-step Spring Batch job that reads from an XML REST service call using a StaxEventItemReader in strict mode. I want it in strict mode because without the ability to connect to the REST service, there is no reason to continue. I want to catch the exception and exit gracefully with a custom message, not with a stack trace.
I'm able to catch the exception using a StepExecutionListener whose afterStep() method checks the failure exceptions (stepExecution.getFailureExceptions()), but the stack trace is still being output. I'm getting my custom message (the 13:05:32,981 line in the STS output below) plus the stack trace. When I used an ItemReadListener with an onError method, onError was not called; I'm guessing because this is an initialization exception, not a read error.
What is the best method for capturing the failed-to-initialize-reader exception and suppressing the stack trace in the output?
SpringBatch Job:
<batch:job id="randomSends2NG">
    <batch:step id="getStations" next="generateAndSend">
        <batch:tasklet>
            <batch:chunk reader="getStationsFromNG_RestXML"
                         processor="idExtractor"
                         writer="stationsWriter"
                         commit-interval="1">
            </batch:chunk>
            <batch:listeners>
                <batch:listener ref="noStationsStopListener" />
            </batch:listeners>
        </batch:tasklet>
    </batch:step>
    <batch:step id="generateAndSend">
        <batch:tasklet>
            <batch:chunk reader="readStationIdsList"
                         processor="createRandomRequests"
                         writer="ngBroadcasterService"
                         commit-interval="400">
            </batch:chunk>
        </batch:tasklet>
    </batch:step>
</batch:job>
STS output with stack trace:
13:05:30,479 [main] INFO SimpleJobLauncher - No TaskExecutor has been set, defaulting to synchronous executor.
13:05:30,776 [main] INFO DepartmentSendCountsNoOpReader - constructing stationSendCountsReader
LightYellow color:43
13:05:31,368 [main] INFO ProcessStatisticsTasklet - constructor...
13:05:31,525 [main] INFO RequestsListReader - constructing requestsListReader
13:05:31,588 [main] INFO Jaxb2Marshaller - Creating JAXBContext with classes to be bound [class com.pevco.pevcotubesystem.StationConfig]
13:05:31,886 [main] INFO SimpleJobLauncher - Job: [FlowJob: [name=randomSends2NG]] launched with the following parameters: [{hostNameOrIP=192.168.100.10:8383}]
13:05:31,902 [main] INFO SimpleStepHandler - Executing step: [getStations]
13:05:32,981 [main] ERROR AbstractStep - Encountered an error executing step getStations in job randomSends2NG
org.springframework.batch.item.ItemStreamException: Failed to initialize the reader
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.open(AbstractItemCountingItemStreamItemReader.java:147)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:307)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:183)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:150)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:131)
at org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:119)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:172)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:202)
at com.sun.proxy.$Proxy8.open(Unknown Source)
at org.springframework.batch.item.support.CompositeItemStream.open(CompositeItemStream.java:96)
at org.springframework.batch.core.step.tasklet.TaskletStep.open(TaskletStep.java:310)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:195)
at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148)
at org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:64)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:67)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:165)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:144)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:134)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:304)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:135)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:128)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.start(CommandLineJobRunner.java:362)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.main(CommandLineJobRunner.java:590)
Caused by: java.lang.IllegalStateException: Input resource must exist (reader is in 'strict' mode)
at org.springframework.batch.item.xml.StaxEventItemReader.doOpen(StaxEventItemReader.java:195)
at org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.open(AbstractItemCountingItemStreamItemReader.java:144)
... 27 more
13:05:32,981 [main] ERROR NoStationsStopListener - Not able to get stations list from NG. Check that NG is running
13:05:32,998 [main] INFO SimpleJobLauncher - Job: [FlowJob: [name=randomSends2NG]] completed with the following parameters: [{hostNameOrIP=192.168.100.10:8383}] and the following status: [FAILED]
13:05:32,998 [main] INFO ClassPathXmlApplicationContext - Closing org.springframework.context.support.ClassPathXmlApplicationContext#4f4a7090: startup date [Wed Jul 01 13:05:29 EDT 2015]; root of context hierarchy
When facing this problem, probably the best solution is to check for the REST service as soon as possible, because "without the ability to connect to the rest service, there is no reason to continue", and you can then "exit gracefully with a custom message".
Before the first step, use a JobExecutionDecider in which you check REST availability, and continue or end your job according to the service's presence or absence; a sketch is below.
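A minimal sketch of such a decider, assuming the hostNameOrIP job parameter visible in the log above; the class name, the probed path, and the status strings are illustrative rather than taken from the original job:

import java.net.HttpURLConnection;
import java.net.URL;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.job.flow.FlowExecutionStatus;
import org.springframework.batch.core.job.flow.JobExecutionDecider;

public class RestAvailableDecider implements JobExecutionDecider {

    @Override
    public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
        // The job is launched with a hostNameOrIP parameter (see the log above).
        String host = jobExecution.getJobParameters().getString("hostNameOrIP");
        // "/stations" is a hypothetical path; point this at the real endpoint.
        return isReachable("http://" + host + "/stations")
                ? new FlowExecutionStatus("REST_UP")
                : new FlowExecutionStatus("REST_DOWN");
    }

    // Availability probe: any HTTP response counts as up, a connect failure as down.
    private boolean isReachable(String url) {
        try {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestMethod("HEAD");
            conn.setConnectTimeout(2000);
            conn.setReadTimeout(2000);
            conn.getResponseCode();
            return true;
        } catch (Exception e) {
            return false;
        }
    }
}

Wired into the job XML before getStations, the decider ends the job quietly when the service is down, so the reader is never opened and no stack trace is logged; the decision element and bean name below are likewise illustrative:

<batch:decision id="checkRest" decider="restAvailableDecider">
    <batch:next on="REST_UP" to="getStations" />
    <batch:end on="REST_DOWN" exit-code="NO_REST" />
</batch:decision>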

Error Setting up MySQL database with Gov Registry 4.6.0

I am trying to set up the Governance Registry 4.6.0 with a MySQL 5.6 Community Edition database.
I followed the setup instructions in the 4.6.0 documentation, and when I start the registry with the wso2server.bat -Dsetup option I get the following error:
C:\Apps\wso2greg-4.6.0\bin>wso2server.bat -Dsetup
JAVA_HOME environment variable is set to C:\Program Files\Java\jre6
CARBON_HOME environment variable is set to C:\Apps\WSO2GR~1.0\bin\..
[2014-03-03 13:52:57,979] INFO {org.wso2.carbon.core.internal.CarbonCoreActivator} - Starting WSO2 Carbon...
[2014-03-03 13:52:57,982] INFO {org.wso2.carbon.core.internal.CarbonCoreActivator} - Operating System : Windows 7 6.1, amd64
[2014-03-03 13:52:57,983] INFO {org.wso2.carbon.core.internal.CarbonCoreActivator} - Java Home : C:\Program Files\Java\jre6
[2014-03-03 13:52:57,983] INFO {org.wso2.carbon.core.internal.CarbonCoreActivator} - Java Version : 1.6.0_29
[2014-03-03 13:52:57,983] INFO {org.wso2.carbon.core.internal.CarbonCoreActivator} - Java VM : Java HotSpot(TM) 64-Bit Server VM 20.4-b02,Sun Microsystems Inc.
[2014-03-03 13:52:57,984] INFO {org.wso2.carbon.core.internal.CarbonCoreActivator} - Carbon Home : C:\Apps\WSO2GR~1.0\bin\..
[2014-03-03 13:52:57,984] INFO {org.wso2.carbon.core.internal.CarbonCoreActivator} - Java Temp Dir : C:\Apps\WSO2GR~1.0\bin\..\tmp
[2014-03-03 13:52:57,984] INFO {org.wso2.carbon.core.internal.CarbonCoreActivator} - User : xxxxx, en-US, America/New_York
[2014-03-03 13:52:58,056] WARN {org.wso2.carbon.core.bootup.validator.util.ValidationResultPrinter} - The default keystore (wso2carbon.jks) is currently being used. To maximize security when deploying to a production environment, configure a new keystore with a unique password in the production server profile.
[2014-03-03 13:52:58,063] INFO {org.wso2.carbon.databridge.agent.thrift.AgentHolder} - Agent created !
[2014-03-03 13:52:58,079] INFO {org.wso2.carbon.databridge.agent.thrift.internal.AgentDS} - Successfully deployed Agent Client
[2014-03-03 13:52:59,096] ERROR {org.wso2.carbon.user.core.internal.Activator} - Cannot start User Manager Core bundle
java.lang.Exception: Error in creating the database
at org.wso2.carbon.user.core.common.DefaultRealmService.initializeDatabase(DefaultRealmService.java:285)
at org.wso2.carbon.user.core.common.DefaultRealmService.<init>(DefaultRealmService.java:90)
at org.wso2.carbon.user.core.common.DefaultRealmService.<init>(DefaultRealmService.java:114)
at org.wso2.carbon.user.core.internal.Activator.startDeploy(Activator.java:70)
at org.wso2.carbon.user.core.internal.BundleCheckActivator.start(BundleCheckActivator.java:61)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:711)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:702)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:683)
at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:381)
at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:390)
at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1176)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:559)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:544)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:457)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:243)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:438)
at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:1)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:340)
Caused by: java.lang.Exception: Error occurred while executing : CREATE INDEX REG_PATH_IND_BY_PATH_VALUE USING HASH ON REG_PATH(REG_PATH_VALUE, REG_TENANT_ID)
at org.wso2.carbon.utils.dbcreator.DatabaseCreator.executeSQL(DatabaseCreator.java:169)
at org.wso2.carbon.utils.dbcreator.DatabaseCreator.executeSQLScript(DatabaseCreator.java:325)
at org.wso2.carbon.utils.dbcreator.DatabaseCreator.createRegistryDatabase(DatabaseCreator.java:61)
at org.wso2.carbon.user.core.common.DefaultRealmService.initializeDatabase(DefaultRealmService.java:278)
... 19 more
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Specified key was too long; max key length is 767 bytes
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.Util.getInstance(Util.java:386)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1054)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4237)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4169)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2617)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2778)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2828)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2777)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:949)
at com.mysql.jdbc.StatementImpl.execute(StatementImpl.java:795)
at org.wso2.carbon.utils.dbcreator.DatabaseCreator.executeSQL(DatabaseCreator.java:139)
... 22 more
[2014-03-03 13:53:05,444] INFO {org.apache.catalina.startup.TaglibUriRule} - TLD skipped. URI: http://tiles.apache.org/tags-tiles is already defined
What is wrong?
Thanks
I guess the same issue has been mentioned in the JIRA. The workaround they have mentioned is:
This occurs when an encoding like UTF-8 is used, because it takes more than 1 byte to represent a character. When an encoding like latin1 is used, this exception does not occur.
You can try it and check.
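A hedged sketch of that workaround, creating the registry database with a single-byte character set before rerunning -Dsetup; the database name, user, and password here are illustrative, not from this post:

CREATE DATABASE regdb CHARACTER SET latin1;
GRANT ALL ON regdb.* TO 'regadmin'@'localhost' IDENTIFIED BY 'regadmin';

With a multi-byte encoding like UTF-8, each character of the indexed REG_PATH_VALUE column can take up to 3 bytes, so the composite key hits MySQL's 767-byte limit far sooner than with single-byte latin1.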

Sonar, Maven, Jenkins, MySQL - Cannot create JDBC driver of class 'com.mysql.jdbc.Driver'

Situation
We're running Sonar with a MySQL database, Jenkins, and a build environment with Maven.
Problem
When executing a Jenkins job, it is aborted with the message:
[ERROR] Failed to execute goal org.codehaus.mojo:sonar-maven-plugin:2.0:sonar (default-cli) on project playground_eb: Can not execute Sonar: Fail to connect to database: Cannot create JDBC driver of class 'com.mysql.jdbc.Driver' for connect URL 'http://192.168.1.220:3306': No suitable driver
Configuration
Servers
192.168.1.220 (Debian 6.0.6):
- runs Jenkins (port 8080)
- Sonar (port 9000)
- MySQL database (port 3306)
192.168.1.221 (Windows 7):
- runs Maven and the whole build environment
Configurations
The database was configured in the sonar.properties file, with localhost.
Sonar and its database were configured via the Jenkins web interface, with the IP.
I guess this could cause errors, but configuring the IP in the sonar.properties file causes Sonar to not even start up anymore.
Full exception stack
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:sonar-maven-plugin:2.0:sonar (default-cli) on project playground_eb: Can not execute Sonar
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: Can not execute Sonar
at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:118)
at org.codehaus.mojo.sonar.Bootstraper.start(Bootstraper.java:65)
at org.codehaus.mojo.sonar.SonarMojo.execute(SonarMojo.java:90)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
... 19 more
Caused by: java.lang.IllegalStateException: Fail to connect to database
at org.sonar.core.persistence.DefaultDatabase.start(DefaultDatabase.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.invokeMethod(ReflectionLifecycleStrategy.java:110)
at org.picocontainer.lifecycle.ReflectionLifecycleStrategy.start(ReflectionLifecycleStrategy.java:89)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.start(AbstractInjectionFactory.java:84)
at org.picocontainer.behaviors.AbstractBehavior.start(AbstractBehavior.java:169)
at org.picocontainer.behaviors.Stored$RealComponentLifecycle.start(Stored.java:132)
at org.picocontainer.behaviors.Stored.start(Stored.java:110)
at org.picocontainer.DefaultPicoContainer.potentiallyStartAdapter(DefaultPicoContainer.java:1009)
at org.picocontainer.DefaultPicoContainer.startAdapters(DefaultPicoContainer.java:1002)
at org.picocontainer.DefaultPicoContainer.start(DefaultPicoContainer.java:760)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:72)
at org.sonar.batch.bootstrap.Module.start(Module.java:67)
at org.sonar.batch.bootstrap.BootstrapModule.doStart(BootstrapModule.java:83)
at org.sonar.batch.bootstrap.Module.start(Module.java:68)
at org.sonar.batch.bootstrapper.Batch.startBatch(Batch.java:75)
at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:60)
at org.sonar.maven3.SonarMojo.execute(SonarMojo.java:142)
at org.codehaus.mojo.sonar.Bootstraper.executeMojo(Bootstraper.java:113)
... 23 more
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot create JDBC driver of class 'com.mysql.jdbc.Driver' for connect URL 'http://192.168.1.220:3306'
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1452)
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1371)
at org.apache.commons.dbcp.BasicDataSource.getLogWriter(BasicDataSource.java:1098)
at org.apache.commons.dbcp.BasicDataSourceFactory.createDataSource(BasicDataSourceFactory.java:350)
at org.sonar.core.persistence.DefaultDatabase.initDatasource(DefaultDatabase.java:131)
at org.sonar.core.persistence.DefaultDatabase.start(DefaultDatabase.java:68)
... 44 more
Caused by: java.sql.SQLException: No suitable driver
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1443)
... 49 more
Just found the problem.
In Jenkins the Sonar database URL was set to
http://192.168.1.221:3306/sonar
Now I set it to the WHOLE JDBC URL (including the jdbc: prefix), just like in the sonar.properties file,
... and now it works.
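For reference, a hedged example of the full JDBC form Sonar expects (the host and database here mirror the server list above; the query parameters are common MySQL connector options, not something from this post):
jdbc:mysql://192.168.1.220:3306/sonar?useUnicode=true&characterEncoding=utf8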