I have a web project in Eclipse that uses a connection to my MySQL database.
I exported the project as a .war file and deployed it to the Tomcat server on a different machine. The MySQL database is also on that machine.
I do this:
envContext = new InitialContext();
DataSource ds = (DataSource) envContext.lookup( "java:comp/env/jdbc/Database" );
con = ds.getConnection();
My Context.xml:
<Resource name="jdbc/Database" auth="Container"
          type="javax.sql.DataSource"
          maxActive="100" maxIdle="30" maxWait="10000"
          username="user"
          password="pw"
          driverClassName="com.mysql.jdbc.Driver"
          url="jdbc:mysql://localhost:3306/databasename"/>
But I get this error:
java.sql.SQLException: com.mysql.jdbc.Driver
at org.apache.tomcat.jdbc.pool.PooledConnection.connectUsingDriver(PooledConnection.java:254)
at org.apache.tomcat.jdbc.pool.PooledConnection.connect(PooledConnection.java:182)
at org.apache.tomcat.jdbc.pool.ConnectionPool.createConnection(ConnectionPool.java:701)
at org.apache.tomcat.jdbc.pool.ConnectionPool.borrowConnection(ConnectionPool.java:635)
at org.apache.tomcat.jdbc.pool.ConnectionPool.init(ConnectionPool.java:486)
at org.apache.tomcat.jdbc.pool.ConnectionPool.<init>(ConnectionPool.java:144)
at org.apache.tomcat.jdbc.pool.DataSourceProxy.pCreatePool(DataSourceProxy.java:116)
at org.apache.tomcat.jdbc.pool.DataSourceProxy.createPool(DataSourceProxy.java:103)
at org.apache.tomcat.jdbc.pool.DataSourceFactory.createDataSource(DataSourceFactory.java:554)
at org.apache.tomcat.jdbc.pool.DataSourceFactory.getObjectInstance(DataSourceFactory.java:242)
at org.apache.naming.factory.ResourceFactory.getObjectInstance(ResourceFactory.java:141)
at javax.naming.spi.NamingManager.getObjectInstance(NamingManager.java:321)
at org.apache.naming.NamingContext.lookup(NamingContext.java:842)
at org.apache.naming.NamingContext.lookup(NamingContext.java:153)
at org.apache.naming.NamingContext.lookup(NamingContext.java:830)
at org.apache.naming.NamingContext.lookup(NamingContext.java:153)
at org.apache.naming.NamingContext.lookup(NamingContext.java:830)
at org.apache.naming.NamingContext.lookup(NamingContext.java:153)
at org.apache.naming.NamingContext.lookup(NamingContext.java:830)
at org.apache.naming.NamingContext.lookup(NamingContext.java:167)
at org.apache.naming.SelectorContext.lookup(SelectorContext.java:156)
at javax.naming.InitialContext.lookup(InitialContext.java:411)
at bachelor.Cronjob.run(Cronjob.java:63)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThr$
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPool$
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:278)
at org.apache.tomcat.jdbc.pool.PooledConnection.connectUsingDriver(PooledConnection.java:246)
... 29 more
In the "WEB-INF/lib"-Folder I got the 'mysql-connector-java-5.1.35-bin.jar'.
Does anyone know why it works in eclipse but not in tomcat on the other machine? Thanks alot!
I am almost sure that if you use a JNDI DataSource defined in Context.xml, the DB driver has to be in $CATALINA_HOME/lib, i.e. it is not enough to have it in WEB-INF/lib.
(Some Tomcat installations have different CATALINA_HOME and CATALINA_BASE, so if you cannot find the proper lib folder, probably the easiest way to locate it is to run the ps axwww | grep -i catalina command and then look for the -Dcatalina.home=... option.)
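A quick way to check which class loader can actually see the driver is a small diagnostic like the sketch below (the class and method names are made up for illustration, and it assumes tomcat-jdbc/catalina is on the compile classpath). It resolves com.mysql.jdbc.Driver against the class loader that loaded Tomcat's JDBC pool, which for a stock installation is the common/server loader backed by $CATALINA_HOME/lib:

public class DriverVisibilityCheck {
    public static void check() {
        // Class loader that loaded the tomcat-jdbc pool (the common/server loader)
        ClassLoader poolLoader = org.apache.tomcat.jdbc.pool.DataSourceProxy.class.getClassLoader();
        try {
            Class.forName("com.mysql.jdbc.Driver", false, poolLoader);
            System.out.println("MySQL driver is visible to Tomcat's common class loader");
        } catch (ClassNotFoundException e) {
            // Same failure mode as the stack trace above
            System.out.println("Driver not visible to the pool's class loader: " + e.getMessage());
        }
    }
}

If this check fails while a plain Class.forName("com.mysql.jdbc.Driver") from the webapp succeeds, the JAR is only in WEB-INF/lib and needs to be copied to $CATALINA_HOME/lib.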
Related
When I try to connect to the Hive MySQL metastore, it throws a "connection refused" error. Given below are the details from pom.xml, hive-site.xml and the code.
spark-submit --class org.com.IoTLogAnalytics.iot_fetch_ad --master yarn
--deploy-mode client --files file:///opt/spark/conf/hive-site.xml /opt/script/jar/com.IoTLogAnalytics-0.0.1-SNAPSHOT.jar
package org.com.IoTLogAnalytics
import org.apache.spark.SparkContext._
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import java.io.File
//import java.util.Properties
object iot_fetch_ad {
  def main(args: Array[String]) {
    // val sconf = new SparkConf()
    //   .setAppName("IoTAnalytics")
    //   // .setMaster("local")
    //   .set("Spark.executor.memory","1g")
    // val sc = new SparkContext(sconf)
    // val sqlc = new SqlContext(sconf)
    val table = "iotlog_tbl"
    // val url = "jdbc:mysql://172.25.140.126:3306"
    // val properties = new Properties()
    // properties.put("user","root")
    // properties.put("password","Password_123")
    // val warehouseLocation = new File("spark-warehouse").getAbsolutePath
    val warehouseLocation = "hdfs://172.25.140.125:8020/user/hive/warehouse"
    val spark = SparkSession.builder()
      .master("yarn")
      .appName("iotla")
      //.config("spark.yarn.jars","hdfs//172.25.140.125:8020/tmp/*.jar")
      .config("spark.sql.warehouse.dir", warehouseLocation)
      .config("hive.metastore.uris", "thrift://cch1wpsteris01:9083")
      .enableHiveSupport()
      .getOrCreate()
    import spark.implicits._
    import spark.sql
    // val mysqlDF = spark.read.jdbc(url, table, properties)
    spark.sqlContext.sql("show databases").show();
    spark.sqlContext.sql("show tables").show();
    spark.sqlContext.sql("use sterisdb")
    val getLog = spark.sqlContext.sql("select * from sterisdb.iotlog_tbl")
    getLog.show()
  }
}
19/06/02 00:14:24 INFO hive.HiveUtils: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
19/06/02 00:14:24 INFO hive.metastore: Trying to connect to metastore with URI thrift://cch1wpsteris01:9083
19/06/02 00:14:24 WARN hive.metastore: Failed to connect to the MetaStore Server...
19/06/02 00:14:24 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
19/06/02 00:14:25 INFO hive.metastore: Trying to connect to metastore with URI thrift://cch1wpsteris01:9083
19/06/02 00:14:25 WARN hive.metastore: Failed to connect to the MetaStore Server...
19/06/02 00:14:25 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
19/06/02 00:14:26 INFO hive.metastore: Trying to connect to metastore with URI thrift://cch1wpsteris01:9083
19/06/02 00:14:26 WARN hive.metastore: Failed to connect to the MetaStore Server...
19/06/02 00:14:26 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
19/06/02 00:14:27 WARN metadata.Hive: Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:271)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listDatabases(SessionCatalog.scala:247)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand.run(databases.scala:44)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3364)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3363)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:194)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:694)
at org.com.IoTLogAnalytics.iot_fetch_ad$.main(iot_fetch_ad.scala:47)
at org.com.IoTLogAnalytics.iot_fetch_ad.main(iot_fetch_ad.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1523)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
... 59 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
... 65 more
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused (Connection refused)
at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:420)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1521)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3005)
at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3024)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1234)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:174)
at org.apache.hadoop.hive.ql.metadata.Hive.<clinit>(Hive.java:166)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:503)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:271)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listDatabases(SessionCatalog.scala:247)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand.run(databases.scala:44)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3364)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3363)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:194)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:694)
at org.com.IoTLogAnalytics.iot_fetch_ad$.main(iot_fetch_ad.scala:47)
at org.com.IoTLogAnalytics.iot_fetch_ad.main(iot_fetch_ad.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
... 73 more
)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:466)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:236)
at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:74)
... 70 more
19/06/02 00:14:27 INFO hive.metastore: Trying to connect to metastore with URI thrift://cch1wpsteris01:9083
19/06/02 00:14:27 WARN hive.metastore: Failed to connect to the MetaStore Server...
19/06/02 00:14:27 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
19/06/02 00:14:28 INFO hive.metastore: Trying to connect to metastore with URI thrift://cch1wpsteris01:9083
19/06/02 00:14:28 WARN hive.metastore: Failed to connect to the MetaStore Server...
19/06/02 00:14:28 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
19/06/02 00:14:29 INFO hive.metastore: Trying to connect to metastore with URI thrift://cch1wpsteris01:9083
19/06/02 00:14:29 WARN hive.metastore: Failed to connect to the MetaStore Server...
19/06/02 00:14:29 INFO hive.metastore: Waiting 1 seconds before next connection attempt.
Exception in thread "main" org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient;
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.hive.HiveSessionStateBuilder.org$apache$spark$sql$hive$HiveSessionStateBuilder$$externalCatalog(HiveSessionStateBuilder.scala:39)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$1.apply(HiveSessionStateBuilder.scala:54)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog$lzycompute(SessionCatalog.scala:90)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.externalCatalog(SessionCatalog.scala:90)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.listDatabases(SessionCatalog.scala:247)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand$$anonfun$2.apply(databases.scala:44)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.execution.command.ShowDatabasesCommand.run(databases.scala:44)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:194)
at org.apache.spark.sql.Dataset$$anonfun$53.apply(Dataset.scala:3364)
at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3363)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:194)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:79)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:694)
at org.com.IoTLogAnalytics.iot_fetch_ad$.main(iot_fetch_ad.scala:47)
at org.com.IoTLogAnalytics.iot_fetch_ad.main(iot_fetch_ad.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstruct
There could be two possible reasons:
It looks like your Hive metastore has been locked and Spark is unable to access it. You need to remove the lock files from the metastore:
rm metastore_db/*.lck
You also need to check whether the metastore server is up and running. Try the command below:
ps -ef | grep metastore
If it is not running, try looking into the log: /var/log/hive/hivemetastore.log
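If you want to confirm from the machine running spark-submit that the metastore port is reachable at all, a plain socket test like the sketch below (host and port taken from the hive.metastore.uris value in the question) separates a network or firewall problem from a Hive configuration problem; it is only a diagnostic, not part of the Spark job:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class MetastorePortCheck {
    public static void main(String[] args) {
        // Host and port from hive.metastore.uris = thrift://cch1wpsteris01:9083
        try (Socket socket = new Socket()) {
            socket.connect(new InetSocketAddress("cch1wpsteris01", 9083), 5000);
            System.out.println("Metastore port is reachable");
        } catch (IOException e) {
            // The same "Connection refused" as in the stack trace means the
            // metastore service is down or the port is blocked
            System.out.println("Cannot reach the metastore: " + e.getMessage());
        }
    }
}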
Hope this helps :)
I have installed IntelliJ IDEA 2018.1 from https://www.jetbrains.com/idea/download/#section=windows
I am getting the error below when opening a project/directory.
My system configuration:
OS Name: Microsoft Windows 10 Home Insider Preview Single Language
OS Version: 10.0.17672
BIOS Version: Dell Inc. 1.5.4
Logs:
E:\Apps\ideaIU-2018.1.4.win\bin>idea.bat
java.lang.RuntimeException: cannot load system cursor: CopyDrop.32x32
at sun.awt.windows.WToolkit.lazilyLoadDesktopProperty(WToolkit.java:894)
at java.awt.Toolkit.getDesktopProperty(Toolkit.java:1808)
at java.awt.dnd.DragSource.load(DragSource.java:131)
at java.awt.dnd.DragSource.<clinit>(DragSource.java:148)
at com.intellij.ide.palette.impl.PaletteWindow.<init>(PaletteWindow.java:90)
at com.intellij.ide.palette.impl.PaletteToolWindowManager.<init>(PaletteToolWindowManager.java:43)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.picocontainer.defaults.InstantiatingComponentAdapter.newInstance(InstantiatingComponentAdapter.java:193)
at com.intellij.util.pico.CachingConstructorInjectionComponentAdapter.doGetComponentInstance(CachingConstructorInjectionComponentAdapter.java:103)
at com.intellij.util.pico.CachingConstructorInjectionComponentAdapter.instantiateGuarded(CachingConstructorInjectionComponentAdapter.java:80)
at com.intellij.util.pico.CachingConstructorInjectionComponentAdapter.getComponentInstance(CachingConstructorInjectionComponentAdapter.java:63)
at com.intellij.openapi.components.impl.ComponentManagerImpl$ComponentConfigComponentAdapter.getComponentInstance(ComponentManagerImpl.java:459)
at com.intellij.openapi.components.impl.ComponentManagerImpl.createComponents(ComponentManagerImpl.java:106)
at com.intellij.openapi.components.impl.ComponentManagerImpl.init(ComponentManagerImpl.java:90)
at com.intellij.openapi.components.impl.ComponentManagerImpl.init(ComponentManagerImpl.java:77)
at com.intellij.openapi.project.impl.ProjectImpl.init(ProjectImpl.java:262)
at com.intellij.openapi.project.impl.ProjectManagerImpl.initProject(ProjectManagerImpl.java:274)
at com.intellij.openapi.project.impl.ProjectManagerImpl.newProject(ProjectManagerImpl.java:190)
at com.intellij.platform.PlatformProjectOpenProcessor.doOpenProject(PlatformProjectOpenProcessor.java:217)
at com.intellij.platform.PlatformProjectOpenProcessor.doOpenProject(PlatformProjectOpenProcessor.java:89)
at com.intellij.ide.impl.ProjectUtil.openOrImport(ProjectUtil.java:131)
at com.intellij.ide.actions.OpenFileAction.doOpenFile(OpenFileAction.java:81)
at com.intellij.ide.actions.OpenFileAction.lambda$actionPerformed$0(OpenFileAction.java:56)
at com.intellij.openapi.fileChooser.ex.FileChooserDialogImpl.choose(FileChooserDialogImpl.java:162)
at com.intellij.openapi.fileChooser.FileChooser.chooseFiles(FileChooser.java:114)
at com.intellij.openapi.fileChooser.FileChooser.chooseFiles(FileChooser.java:91)
at com.intellij.ide.actions.OpenFileAction.actionPerformed(OpenFileAction.java:48)
at com.intellij.openapi.actionSystem.ex.ActionUtil.invokeAction(ActionUtil.java:337)
at com.intellij.openapi.actionSystem.ex.ActionUtil.invokeAction(ActionUtil.java:324)
at com.intellij.ui.components.labels.ActionLink$1.linkSelected(ActionLink.java:60)
at com.intellij.ui.components.labels.LinkLabel.doClick(LinkLabel.java:142)
at com.intellij.ui.components.labels.ActionLink.doClick(ActionLink.java:69)
at com.intellij.ui.components.labels.LinkLabel$MyMouseHandler.mouseReleased(LinkLabel.java:319)
at java.awt.Component.processMouseEvent(Component.java:6548)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3325)
at java.awt.Component.processEvent(Component.java:6313)
at java.awt.Container.processEvent(Container.java:2237)
at java.awt.Component.dispatchEventImpl(Component.java:4903)
at java.awt.Container.dispatchEventImpl(Container.java:2295)
at java.awt.Component.dispatchEvent(Component.java:4725)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4889)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4526)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4467)
at java.awt.Container.dispatchEventImpl(Container.java:2281)
at java.awt.Window.dispatchEventImpl(Window.java:2746)
at java.awt.Component.dispatchEvent(Component.java:4725)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:764)
at java.awt.EventQueue.access$500(EventQueue.java:98)
at java.awt.EventQueue$3.run(EventQueue.java:715)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:90)
at java.awt.EventQueue$4.run(EventQueue.java:737)
at java.awt.EventQueue$4.run(EventQueue.java:735)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:80)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:734)
at com.intellij.ide.IdeEventQueue.defaultDispatchEvent(IdeEventQueue.java:786)
at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.java:723)
at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.java:395)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
Caused by: java.awt.AWTException: Exception: class java.lang.ArrayIndexOutOfBoundsException 171 occurred while creating cursor CopyDrop.32x32
at java.awt.Cursor.getSystemCustomCursor(Cursor.java:363)
at sun.awt.windows.WToolkit.lazilyLoadDesktopProperty(WToolkit.java:892)
... 69 more
I have already tried to install/uninstall/reinstall multiple times, but I get the same error.
Can anyone help me with this one, please?
Thanks,
Try to disable the UI Designer plugin (the PaletteWindow in the stack trace belongs to it) by creating a text file named disabled_plugins.txt inside the config directory, which is under
<SYSTEM DRIVE>\Users\<USER ACCOUNT NAME>\.<PRODUCT><VERSION>\config
with the following content:
com.intellij.uiDesigner
Similar issue: intellij-support.jetbrains.com/hc/en-us/community/posts/207295405/comments/203694684
Config directory location: intellij-support.jetbrains.com/hc/en-us/articles/206544519
I have a Java application running successfully on Amazon Web Services Elastic Beanstalk.
I am trying to set up MySQL and have created a DB instance.
Question 1:
How do I connect my Java app to the database?
I have the following code (after viewing this):
@Bean
public DataSource dataSource() {
    // Openshift
    // String host = System.getenv("OPENSHIFT_MYSQL_DB_HOST");
    // String port = System.getenv("OPENSHIFT_MYSQL_DB_PORT");
    // String username = System.getenv("OPENSHIFT_MYSQL_DB_USERNAME");
    // String password = System.getenv("OPENSHIFT_MYSQL_DB_PASSWORD");
    // AWS
    String host = System.getenv("RDS_HOSTNAME");
    String port = System.getenv("RDS_PORT");
    String username = System.getenv("RDS_USERNAME");
    String password = System.getenv("RDS_PASSWORD");
    String dbname = System.getenv("RDS_DB_NAME");
    DriverManagerDataSource dataSource = new DriverManagerDataSource();
    dataSource.setDriverClassName(environment.getRequiredProperty("jdbc.driverClassName"));
    if (host == null) {
        dataSource.setUrl(environment.getRequiredProperty("jdbc.url"));
        dataSource.setUsername(environment.getRequiredProperty("jdbc.username"));
        dataSource.setPassword(environment.getRequiredProperty("jdbc.password"));
    } else {
        //dataSource.setUrl("jdbc:mysql://" + host + ":" + port + "/");
        // Openshift
        //dataSource.setUrl("jdbc:mysql://" + host + ":" + port + "/" + environment.getRequiredProperty("jdbc.dbname"));
        // AWS
        dataSource.setUrl("jdbc:mysql://" + host + ":" + port + "/" + dbname);
        dataSource.setUsername(username);
        dataSource.setPassword(password);
        System.out.println("jdbc:mysql://" + host + ":" + port + "/" + dbname + ", username: " + username + ", password: " + password);
    }
    return dataSource;
}
(P.S. The commented-out code works on OpenShift's server.)
But I get the following:
org.springframework.web.util.NestedServletException: Request processing failed; nested exception is org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is org.hibernate.exception.JDBCConnectionException: Unable to acquire JDBC Connection
org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:982)
Question 2:
From an existing database, in MySQL Workbench, I have exported SQL scripts to create a schema and tables, and then to populate them. How do I import these into the AWS RDS MySQL database?
More info:
Server Logs:
-------------------------------------
/var/log/tomcat8/localhost.2017-02-21.log
-------------------------------------
21-Feb-2017 14:42:51.133 INFO [localhost-startStop-1] org.apache.catalina.core.ApplicationContext.log 1 Spring WebApplicationInitializers detected on classpath
21-Feb-2017 14:42:51.737 INFO [localhost-startStop-1] org.apache.catalina.core.ApplicationContext.log Initializing Spring FrameworkServlet 'rest'
21-Feb-2017 14:57:36.680 SEVERE [http-nio-8080-exec-7] org.apache.catalina.core.StandardWrapperValve.invoke Servlet.service() for servlet [rest] in context with path [] threw exception [Request processing failed; nested exception is org.springframework.transaction.CannotCreateTransactionException: Could not open JPA EntityManager for transaction; nested exception is org.hibernate.exception.JDBCConnectionException: Unable to acquire JDBC Connection] with root cause
java.net.ConnectException: Connection refused (Connection refused)
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at java.net.Socket.connect(Socket.java:538)
at java.net.Socket.<init>(Socket.java:434)
at java.net.Socket.<init>(Socket.java:244)
at com.mysql.jdbc.StandardSocketFactory.connect(StandardSocketFactory.java:258)
at com.mysql.jdbc.MysqlIO.<init>(MysqlIO.java:306)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2504)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2541)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2323)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:832)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:46)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:408)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:417)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:344)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.springframework.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriverManager(DriverManagerDataSource.java:153)
at org.springframework.jdbc.datasource.DriverManagerDataSource.getConnectionFromDriver(DriverManagerDataSource.java:144)
at org.springframework.jdbc.datasource.AbstractDriverBasedDataSource.getConnectionFromDriver(AbstractDriverBasedDataSource.java:155)
at org.springframework.jdbc.datasource.AbstractDriverBasedDataSource.getConnection(AbstractDriverBasedDataSource.java:120)
at org.hibernate.engine.jdbc.connections.internal.DatasourceConnectionProviderImpl.getConnection(DatasourceConnectionProviderImpl.java:122)
at org.hibernate.internal.NonContextualJdbcConnectionAccess.obtainConnection(NonContextualJdbcConnectionAccess.java:35)
at org.hibernate.resource.jdbc.internal.LogicalConnectionManagedImpl.acquireConnectionIfNeeded(LogicalConnectionManagedImpl.java:99)
at org.hibernate.resource.jdbc.internal.LogicalConnectionManagedImpl.getPhysicalConnection(LogicalConnectionManagedImpl.java:129)
at org.hibernate.resource.jdbc.internal.LogicalConnectionManagedImpl.getConnectionForTransactionManagement(LogicalConnectionManagedImpl.java:247)
at org.hibernate.resource.jdbc.internal.LogicalConnectionManagedImpl.begin(LogicalConnectionManagedImpl.java:254)
at org.hibernate.resource.transaction.backend.jdbc.internal.JdbcResourceLocalTransactionCoordinatorImpl$TransactionDriverControlImpl.begin(JdbcResourceLocalTransactionCoordinatorImpl.java:203)
at org.hibernate.engine.transaction.internal.TransactionImpl.begin(TransactionImpl.java:56)
at org.springframework.orm.jpa.DefaultJpaDialect.beginTransaction(DefaultJpaDialect.java:67)
at org.springframework.orm.jpa.JpaTransactionManager.doBegin(JpaTransactionManager.java:380)
at org.springframework.transaction.support.AbstractPlatformTransactionManager.getTransaction(AbstractPlatformTransactionManager.java:373)
at org.springframework.transaction.interceptor.TransactionAspectSupport.createTransactionIfNecessary(TransactionAspectSupport.java:427)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:276)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213)
at com.sun.proxy.$Proxy55.findAll(Unknown Source)
at com.jobs.spring.controller.CategoryRESTService.findAllCategorys(CategoryRESTService.java:31)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:221)
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:136)
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:114)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:827)
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:738)
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:85)
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:963)
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:897)
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:970)
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:861)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:622)
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:846)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:729)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:292)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.RemoteIpValve.invoke(RemoteIpValve.java:676)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:509)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1104)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:684)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1520)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1476)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
21-Feb-2017 15:09:21.090 INFO [localhost-startStop-2] org.apache.catalina.core.ApplicationContext.log Destroying Spring FrameworkServlet 'rest'
21-Feb-2017 15:09:38.729 INFO [localhost-startStop-1] org.apache.catalina.core.ApplicationContext.log 1 Spring WebApplicationInitializers detected on classpath
21-Feb-2017 15:09:39.329 INFO [localhost-startStop-1] org.apache.catalina.core.ApplicationContext.log Initializing Spring FrameworkServlet 'rest'
UPDATE
I also added rules to open it for everyone, but still get the same error:
The problem has to do with your database and the security group.
Hover over the "i" icon next to the cluster endpoint.
A popup will appear with the security groups.
Click the first default link and you will be redirected to the security group configuration.
Then edit the security group and add an inbound rule to include your local IP.
Keep in mind that this will only work if your RDS instance is in a public subnet.
This link is pretty helpful for troubleshooting.
More information on security groups
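Once the inbound rule is in place, you can sanity-check the connectivity independently of Spring and Hibernate with a plain JDBC connection attempt that reads the same RDS_* environment variables as the dataSource() bean above; this is only a diagnostic sketch and assumes the MySQL driver is already on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class RdsConnectionCheck {
    public static void main(String[] args) {
        // Same environment variables Elastic Beanstalk injects and the dataSource() bean reads
        String url = "jdbc:mysql://" + System.getenv("RDS_HOSTNAME") + ":"
                + System.getenv("RDS_PORT") + "/" + System.getenv("RDS_DB_NAME");
        try (Connection con = DriverManager.getConnection(url,
                System.getenv("RDS_USERNAME"), System.getenv("RDS_PASSWORD"))) {
            System.out.println("Connected to " + url);
        } catch (SQLException e) {
            // "Connection refused" here still points at the security group or network path
            System.out.println("Cannot connect: " + e.getMessage());
        }
    }
}

If this connects but the application still fails, the problem lies in the Spring configuration rather than in the security group.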
I'm having trouble deploying a service created in WSO2 Data Services Server; it is being displayed as faulty.
The error is shown below:
DS Fault Message: The request 'patientDetailsByNumber' contains the query with id 'patientDetailsByNumberSQL' contains a result with no element wrapper
DS Code: UNKNOWN_ERROR
at org.wso2.carbon.dataservices.core.DBDeployer.validateRequestQueryResults(DBDeployer.java:729)
at org.wso2.carbon.dataservices.core.DBDeployer.validateDataService(DBDeployer.java:652)
at org.wso2.carbon.dataservices.core.DBDeployer.createDBService(DBDeployer.java:759)
at org.wso2.carbon.dataservices.core.DBDeployer.processService(DBDeployer.java:1106)
at org.wso2.carbon.dataservices.core.DBDeployer.deploy(DBDeployer.java:174)
at org.apache.axis2.deployment.repository.util.DeploymentFileData.deploy(DeploymentFileData.java:136)
at org.apache.axis2.deployment.DeploymentEngine.doDeploy(DeploymentEngine.java:810)
at org.apache.axis2.deployment.repository.util.WSInfoList.update(WSInfoList.java:144)
at org.apache.axis2.deployment.RepositoryListener.update(RepositoryListener.java:377)
at org.apache.axis2.deployment.RepositoryListener.checkServices(RepositoryListener.java:254)
at org.apache.axis2.deployment.DeploymentEngine.loadServices(DeploymentEngine.java:139)
at org.wso2.carbon.core.CarbonAxisConfigurator.loadServices(CarbonAxisConfigurator.java:464)
at org.apache.axis2.context.ConfigurationContextFactory.createConfigurationContext(ConfigurationContextFactory.java:95)
at org.wso2.carbon.core.CarbonConfigurationContextFactory.createNewConfigurationContext(CarbonConfigurationContextFactory.java:65)
at org.wso2.carbon.core.init.CarbonServerManager.initializeCarbon(CarbonServerManager.java:398)
at org.wso2.carbon.core.init.CarbonServerManager.start(CarbonServerManager.java:219)
at org.wso2.carbon.core.internal.CarbonCoreServiceComponent.activate(CarbonCoreServiceComponent.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.eclipse.equinox.internal.ds.model.ServiceComponent.activate(ServiceComponent.java:260)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.activate(ServiceComponentProp.java:146)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.build(ServiceComponentProp.java:347)
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponent(InstanceProcess.java:620)
at org.eclipse.equinox.internal.ds.InstanceProcess.buildComponents(InstanceProcess.java:197)
at org.eclipse.equinox.internal.ds.Resolver.getEligible(Resolver.java:343)
at org.eclipse.equinox.internal.ds.SCRManager.serviceChanged(SCRManager.java:222)
at org.eclipse.osgi.internal.serviceregistry.FilteredServiceListener.serviceChanged(FilteredServiceListener.java:107)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.dispatchEvent(BundleContextImpl.java:861)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:148)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEventPrivileged(ServiceRegistry.java:819)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.publishServiceEvent(ServiceRegistry.java:771)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistrationImpl.register(ServiceRegistrationImpl.java:130)
at org.eclipse.osgi.internal.serviceregistry.ServiceRegistry.registerService(ServiceRegistry.java:214)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.registerService(BundleContextImpl.java:433)
at org.eclipse.equinox.http.servlet.internal.Activator.registerHttpService(Activator.java:81)
at org.eclipse.equinox.http.servlet.internal.Activator.addProxyServlet(Activator.java:60)
at org.eclipse.equinox.http.servlet.internal.ProxyServlet.init(ProxyServlet.java:40)
at org.wso2.carbon.tomcat.ext.servlet.DelegationServlet.init(DelegationServlet.java:38)
at org.apache.catalina.core.StandardWrapper.initServlet(StandardWrapper.java:1267)
at org.apache.catalina.core.StandardWrapper.loadServlet(StandardWrapper.java:1186)
at org.apache.catalina.core.StandardWrapper.load(StandardWrapper.java:1081)
at org.apache.catalina.core.StandardContext.loadOnStartup(StandardContext.java:5027)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5314)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
The relevant part of the XML file generated from creating this service is shown below:
<operation name="patientDetailsByNumber">
   <call-query href="patientDetailsByNumberSQL">
      <with-param name="patientNumber" query-param="patientNumber"/>
   </call-query>
</operation>
</data>
I am somewhat new to XML and not sure if I'm missing anything obvious. Running MySQL 5.6, WSO2 DSS 3.1.1 and Windows 7.
Any help would be greatly appreciated.
I'm trying to copy data from MySQL (5.1.37) to an Oracle database (11) using Camel and Spring. The configuration looks like this:
<camelContext id="camelContext" xmlns="http://camel.apache.org/schema/spring">
   <route>
      <from uri="direct:start" />
      <to uri="sql:select id, name, description from user?dataSourceRef=fromDataSource" />
      <to uri="sql:insert into user (id, name, description) values (#, #, #)?dataSourceRef=toDataSource"/>
   </route>
</camelContext>
I'm running it using the following code:
public void fetchUsers() throws Exception {
    CamelContext context = (CamelContext) SpringUtil.getBean("camelContext");
    ProducerTemplate template = context.createProducerTemplate();
    context.start();
    template.sendBody("direct:start", "test");
}
However, I'm getting an exception:
10:04:24,208 ERROR [main] DefaultErrorHandler - Failed delivery for (MessageId: ID-UNISTHDW177-55800-1338278663095-0-3 on ExchangeId: ID-UNISTHDW177-55800-1338278663095-0-2). Exhausted after delivery attempt: 1 caught: org.apache.camel.CamelExecutionException: Exception occurred during execution on the exchange: Exchange[Message: [{id=1, name=Carl, description=Developer}, {id=2, name=Lars, description=Developer}]]
org.apache.camel.CamelExecutionException: Exception occurred during execution on the exchange: Exchange[Message: [{id=1, name=Carl, description=Developer}, {id=2, name=Lars, description=Developer}]]
at org.apache.camel.util.ObjectHelper.wrapCamelExecutionException(ObjectHelper.java:1237)
at org.apache.camel.impl.DefaultExchange.setException(DefaultExchange.java:282)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:64)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
at org.apache.camel.processor.SendProcessor$2.doInAsyncProducer(SendProcessor.java:115)
at org.apache.camel.impl.ProducerCache.doInAsyncProducer(ProducerCache.java:285)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:110)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:71)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
at org.apache.camel.processor.interceptor.TraceInterceptor.process(TraceInterceptor.java:91)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
at org.apache.camel.processor.RedeliveryErrorHandler.processErrorHandler(RedeliveryErrorHandler.java:333)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:223)
at org.apache.camel.processor.RouteContextProcessor.processNext(RouteContextProcessor.java:45)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
at org.apache.camel.processor.interceptor.DefaultChannel.process(DefaultChannel.java:304)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:117)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:80)
at org.apache.camel.processor.RouteContextProcessor.processNext(RouteContextProcessor.java:45)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
at org.apache.camel.processor.UnitOfWorkProcessor.process(UnitOfWorkProcessor.java:122)
at org.apache.camel.processor.RouteInflightRepositoryProcessor.processNext(RouteInflightRepositoryProcessor.java:50)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
at org.apache.camel.processor.DelegateAsyncProcessor.processNext(DelegateAsyncProcessor.java:99)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:90)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:71)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:73)
at org.apache.camel.component.direct.DirectProducer.process(DirectProducer.java:61)
at org.apache.camel.processor.UnitOfWorkProcessor.processAsync(UnitOfWorkProcessor.java:150)
at org.apache.camel.processor.UnitOfWorkProcessor.process(UnitOfWorkProcessor.java:117)
at org.apache.camel.util.AsyncProcessorHelper.process(AsyncProcessorHelper.java:99)
at org.apache.camel.processor.DelegateAsyncProcessor.process(DelegateAsyncProcessor.java:86)
at org.apache.camel.processor.UnitOfWorkProducer.process(UnitOfWorkProducer.java:63)
at org.apache.camel.impl.ProducerCache$2.doInProducer(ProducerCache.java:352)
at org.apache.camel.impl.ProducerCache$2.doInProducer(ProducerCache.java:324)
at org.apache.camel.impl.ProducerCache.doInProducer(ProducerCache.java:223)
at org.apache.camel.impl.ProducerCache.sendExchange(ProducerCache.java:324)
at org.apache.camel.impl.ProducerCache.send(ProducerCache.java:169)
at org.apache.camel.impl.DefaultProducerTemplate.send(DefaultProducerTemplate.java:111)
at org.apache.camel.impl.DefaultProducerTemplate.sendBody(DefaultProducerTemplate.java:124)
at org.apache.camel.impl.DefaultProducerTemplate.sendBody(DefaultProducerTemplate.java:131)
at com.unibet.finance.reporting.camel.GordiumFetcherTest.camelShouldFetchGordiumAttributes(GordiumFetcherTest.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:28)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:31)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:76)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
Caused by: java.lang.AbstractMethodError
at org.apache.camel.component.sql.SqlProducer$1.doInPreparedStatement(SqlProducer.java:50)
at org.apache.camel.component.sql.SqlProducer$1.doInPreparedStatement(SqlProducer.java:48)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:586)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:614)
at org.apache.camel.component.sql.SqlProducer.process(SqlProducer.java:48)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
... 70 more
What am I missing here?
This error occurs when your JDBC driver implements an older version of the JDBC API than the one available in the JRE you use to run your application (and which is required by Camel). You need JDBC API version >= 3 in this case.
The problem should be solved by updating your Oracle driver to the latest version (supporting your database).
You wrote in a comment that you had already tried to update the driver version, with no effect. You probably have an older version of the Oracle driver somewhere in the lib/ext directory of your application server. Perhaps you are using Maven to fetch the Oracle JDBC driver while another driver instance is provided by the application server.
This is not a Camel problem, but a JDBC driver and server classpath problem. Fix your driver classpath issue and Camel will do its job :).
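If you want to see which Oracle driver actually wins on the classpath at runtime, a small diagnostic like the one below prints the JAR the driver class was loaded from and the version it reports; oracle.jdbc.OracleDriver is the standard Oracle driver class, the surrounding class name is just for illustration:

import java.sql.Driver;

public class OracleDriverInfo {
    public static void main(String[] args) throws Exception {
        Class<?> cls = Class.forName("oracle.jdbc.OracleDriver");
        // Shows which JAR on the classpath the driver class was actually loaded from
        System.out.println("Loaded from: "
                + cls.getProtectionDomain().getCodeSource().getLocation());
        Driver driver = (Driver) cls.newInstance();
        // A very old ojdbc JAR implements an older JDBC API, which is what
        // produces the AbstractMethodError inside Camel's SqlProducer
        System.out.println("Driver version: "
                + driver.getMajorVersion() + "." + driver.getMinorVersion());
    }
}

Run it with the same classpath your application server gives the application; if the reported location is the server's lib/ext rather than your Maven dependency, that is the copy to remove or update.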