I am trying to create an external table using Hive with Hadoop, but somehow it fails. This is the error I get when I try to run my queries.
02:23:29.516 [HiveServer2-Background-Pool: Thread-39] ERROR hive.ql.exec.DDLTask - org.apache.hadoop.hive.ql.metadata.HiveException: Cannot validate serde: org.openx.data.jsonserde.JsonSerDe
at org.apache.hadoop.hive.ql.exec.DDLTask.validateSerDe(DDLTask.java:3858)
at org.apache.hadoop.hive.ql.plan.CreateTableDesc.toTable(CreateTableDesc.java:700)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3960)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:333)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1858)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1562)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1313)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1084)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1077)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:235)
at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:90)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:299)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1926)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:312)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: Class org.openx.data.jsonserde.JsonSerDe not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2329)
at org.apache.hadoop.hive.ql.exec.DDLTask.validateSerDe(DDLTask.java:3852)
... 22 more
How can I solve it?
The exception says
java.lang.ClassNotFoundException: Class
org.openx.data.jsonserde.JsonSerDe not found
Install the JSON SerDe (download the JARs from http://www.congiu.net/hive-json-serde/ and put them into hive/lib), and read the instructions here: Hive-JSON-Serde
Also, instead of putting the jars into hive/lib, you can try adding them in the Hive session:
ADD JAR /usr/lib/hive/lib/json-serde-1.3.8-jar-with-dependencies.jar;
ADD JAR /usr/lib/hive/lib/json-udf-1.3.8-jar-with-dependencies.jar;
Alternatively, you can try the native Hive JSON SerDe, org.apache.hive.hcatalog.data.JsonSerDe - just change the SerDe class name in the table DDL. It should already be installed. Read more about the differences here: https://docs.aws.amazon.com/athena/latest/ug/json-serde.html
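For example, a minimal DDL sketch using the native SerDe - the table name, columns, and LOCATION here are made up, so adjust them to your data:
CREATE EXTERNAL TABLE my_json_table (
  id STRING,
  payload STRING
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS TEXTFILE
LOCATION '/user/hive/json_data';
Note that on some distributions the native SerDe lives in a separate jar (hive-hcatalog-core-<version>.jar), so if you hit another ClassNotFoundException, ADD JAR that file the same way as above.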
I have the same issue when I use the hive command in cmd, but it works normally when I use beeline with a hive2 connection.
I'm trying to build an application using Play Framework 2.2 and Scala.
I'm not very acquainted with Java environments, so I don't know exactly what's going on.
To make it work with MySQL, I should configure my conf/application.conf like this:
db.default.driver=com.mysql.jdbc.Driver
db.default.url="jdbc:mysql://localhost:3306/sakila"
db.default.user=root
db.default.password="mypass"
Everything seems right to me, but when I try to access it, I get this:
Cannot connect to database [default]
Why? This information is right! The database can be found at localhost:3306/sakila.
What am I doing wrong?
EDIT: Here is my stack trace. It seems to be missing the MySQL connector .jar file, or something like that. What should I do?
[success] Compiled in 734ms
[error] c.j.b.h.AbstractConnectionHook - Failed to obtain initial connection Sleeping for 0ms and trying again. Attempts left: 0. Exception: null.Message:No suitable driver found for mysql://localhost:3306/world
[error] application -
! @6gjp5p29b - Internal server error, for (GET) [/] ->
play.api.Configuration$$anon$1: Configuration error[Cannot connect to database [default]]
at play.api.Configuration$.play$api$Configuration$$configError(Configuration.scala:92) ~[play_2.10.jar:2.2.1]
at play.api.Configuration.reportError(Configuration.scala:570) ~[play_2.10.jar:2.2.1]
at play.api.db.BoneCPPlugin$$anonfun$onStart$1.apply(DB.scala:252) ~[play-jdbc_2.10.jar:2.2.1]
at play.api.db.BoneCPPlugin$$anonfun$onStart$1.apply(DB.scala:243) ~[play-jdbc_2.10.jar:2.2.1]
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) ~[scala-library.jar:na]
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244) ~[scala-library.jar:na]
Caused by: java.sql.SQLException: No suitable driver found for mysql://localhost:3306/world
at java.sql.DriverManager.getConnection(Unknown Source) ~[na:1.7.0_07]
at java.sql.DriverManager.getConnection(Unknown Source) ~[na:1.7.0_07]
at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:363) ~[bonecp.jar:na]
at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416) ~[bonecp.jar:na]
at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) ~[bonecp.jar:na]
at play.api.db.BoneCPPlugin$$anonfun$onStart$1.apply(DB.scala:245) ~[play-jdbc_2.10.jar:2.2.1]
OK, I don't know what's going on.
I put this, and it started to work:
db.default.url="jdbc:mysql://localhost:3306/world"
db.default.driver="com.mysql.jdbc.Driver"
db.default.user="root"
db.default.pass="mypasswrd"
db.default.host="localhost"
Perhaps it needed a more detailed configuration. I just added the db.default.host configuration, quoted everything with "" (I think this is not necessary, but whatever) and checked whether mysql was listed in the Play dependencies listing. Since it was listed there (in fact it was the first entry), and the error didn't say it was a missing library, I just headed to Error when i try connect play with mysql 5.5 and fixed my configuration.
Thanks to everyone!
You need to have the MySQL JDBC driver jar in your dependencies. This page in the documentation shows how to add the derby driver to the dependencies. Substitute the derby coordinates with the MySQL ones.
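For Play 2.2 that means adding the driver to build.sbt, roughly like this - a minimal sketch, assuming Connector/J 5.1.27 (pick whatever version matches your MySQL server):
libraryDependencies += "mysql" % "mysql-connector-java" % "5.1.27"
Then restart your play run session so sbt resolves and downloads the new dependency.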
I am not doing anything in code - I just created an Eclipse connection and can't seem to be able to ping it. The connection properties:
The jar is where I say it is ($GLASSFISH_HOME\domains\domain1\lib\ext), but I am getting:
java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.net.FactoryURLClassLoader.loadClass(URLClassLoader.java:789)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.eclipse.datatools.connectivity.drivers.jdbc.JDBCConnection.createConnection(JDBCConnection.java:297)
at org.eclipse.datatools.connectivity.DriverConnectionBase.internalCreateConnection(DriverConnectionBase.java:105)
at org.eclipse.datatools.connectivity.DriverConnectionBase.open(DriverConnectionBase.java:54)
at org.eclipse.datatools.connectivity.drivers.jdbc.JDBCConnection.open(JDBCConnection.java:81)
at org.eclipse.datatools.enablement.internal.mysql.connection.JDBCMySQLConnectionFactory.createConnection(JDBCMySQLConnectionFactory.java:28)
at org.eclipse.datatools.connectivity.internal.ConnectionFactoryProvider.createConnection(ConnectionFactoryProvider.java:83)
at org.eclipse.datatools.connectivity.internal.ConnectionProfile.createConnection(ConnectionProfile.java:359)
at org.eclipse.datatools.connectivity.ui.PingJob.createTestConnection(PingJob.java:76)
at org.eclipse.datatools.connectivity.ui.PingJob.run(PingJob.java:59)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:53)
That zip is very likely the distribution zip of MySQL Connector/J; inside of that zip is the mysql-connector-java-5.1.27-bin.jar library with the driver. So you need to unzip it first and then add that jar library instead of the zip.
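Something like this, assuming version 5.1.27 and the GlassFish path from the question:
unzip mysql-connector-java-5.1.27.zip
cp mysql-connector-java-5.1.27/mysql-connector-java-5.1.27-bin.jar "$GLASSFISH_HOME/domains/domain1/lib/ext/"
Then point the Eclipse driver definition at the extracted .jar instead of the .zip and retry the ping.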
I'm using Hadoop 0.20.2 with Hive 0.11. I have successfully inserted some CSV files into Hive/HDFS in separate tables. Selects and joins work flawlessly. When trying to analyse some data, I needed to make use of the built-in functions of Hive like:
substr
to_date
rand
etc.
For example:
select sid, request_id, to_date(times), to_unix_timestamp(times) from contents where sid = '5000000032066010373';
sid and request_id are strings here; times is a timestamp column.
Unfortunately, I only get errors (always the same error stack) when using these functions:
java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:354)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 5 more
Caused by: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
... 10 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 13 more
Caused by: java.lang.RuntimeException: Map operator initialization failed
at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:121)
... 18 more
Caused by: java.lang.NoClassDefFoundError: org/codehaus/jackson/JsonFactory
at org.apache.hadoop.hive.ql.udf.generic.GenericUDTFJSONTuple.<clinit>(GenericUDTFJSONTuple.java:56)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:113)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerGenericUDTF(FunctionRegistry.java:526)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.registerGenericUDTF(FunctionRegistry.java:520)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.<clinit>(FunctionRegistry.java:423)
at org.apache.hadoop.hive.ql.exec.DefaultUDFMethodResolver.getEvalMethod(DefaultUDFMethodResolver.java:59)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge.initialize(GenericUDFBridge.java:154)
at org.apache.hadoop.hive.ql.udf.generic.GenericUDF.initializeAndFoldConstants(GenericUDF.java:111)
at org.apache.hadoop.hive.ql.exec.ExprNodeGenericFuncEvaluator.initialize(ExprNodeGenericFuncEvaluator.java:141)
at org.apache.hadoop.hive.ql.exec.Operator.initEvaluators(Operator.java:970)
at org.apache.hadoop.hive.ql.exec.Operator.initEvaluatorsAndReturnStruct(Operator.java:996)
at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:60)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
at org.apache.hadoop.hive.ql.exec.FilterOperator.initializeOp(FilterOperator.java:78)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:186)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:543)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:100)
... 18 more
Caused by: java.lang.ClassNotFoundException: org.codehaus.jackson.JsonFactory
What am I doing wrong here?
SHOW FUNCTIONS;
shows me that these functions are in there...
Try updating the jackson*.jar in your HADOOP_HOME/lib with the ones in HIVE_HOME/lib, restart Hadoop and try submitting the job.
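A rough sketch of that, assuming standard HADOOP_HOME/HIVE_HOME layouts and the Jackson 1.8.8 jars that ship with Hive 0.11 (check the actual file names in your HIVE_HOME/lib):
cp $HIVE_HOME/lib/jackson-core-asl-1.8.8.jar $HIVE_HOME/lib/jackson-mapper-asl-1.8.8.jar $HADOOP_HOME/lib/
# restart Hadoop (0.20.x cluster scripts)
$HADOOP_HOME/bin/stop-all.sh
$HADOOP_HOME/bin/start-all.sh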
I met the same problem while using Hive + Apache Hadoop 0.20.2.
The problem is that hadoop-0.20.2 doesn't have these Jackson jars in the lib folder, so either you add them manually in the Hive console every time:
add jars $HIVE_HOME/lib/jackson-core-asl-1.8.8.jar $HIVE_HOME/lib/jackson-jaxrs-1.8.8.jar $HIVE_HOME/lib/jackson-mapper-asl-1.8.8.jar $HIVE_HOME/lib/jackson-xc-1.8.8.jar;
or you copy all these jars to each machine in the cluster, as in the sketch below.
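For the copy-to-every-node option, something along these lines (assuming passwordless SSH and that $HADOOP_HOME/conf/slaves lists your worker nodes):
# push the Jackson jars from Hive's lib to every slave's Hadoop lib
for host in $(cat $HADOOP_HOME/conf/slaves); do
  scp $HIVE_HOME/lib/jackson-*.jar "$host:$HADOOP_HOME/lib/"
done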
try updating the jackson*.jar in your HADOOP_HOME/lib with the ones in HIVE_HOME/lib, restart Hadoop and try submitting the job.
Thanks, it worked!
I followed the instructions here, but when I attempted it I got the following error:
hudson.util.IOException2: remote file operation failed: /scratch/jenkins/workspace/Xinco Demo Publish/Xinco/target/Xinco-2012-08-30_00-20-05.war at hudson.remoting.Channel#1fc6bdea:s-50b0ae50
at hudson.FilePath.act(FilePath.java:783)
at hudson.FilePath.act(FilePath.java:769)
at com.cloudbees.plugins.deployer.DeployPublisher.perform(DeployPublisher.java:108)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:707)
at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:682)
at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:660)
at hudson.model.Build$RunnerImpl.post2(Build.java:162)
at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:629)
at hudson.model.Run.run(Run.java:1433)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:238)
Caused by: hudson.remoting.ProxyException: hudson.util.IOException2: Server.InternalError - Invalid WEB-INF/cloudbees-web.xml: resource
at com.cloudbees.plugins.deployer.deployables.Deployable.deployFile(Deployable.java:151)
at com.cloudbees.plugins.deployer.deployables.Deployable$DeployFileCallable.invoke(Deployable.java:342)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2048)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:287)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: hudson.remoting.ProxyException: com.cloudbees.api.BeesClientException: Server.InternalError - Invalid WEB-INF/cloudbees-web.xml: resource
at com.cloudbees.api.BeesClient.readResponse(BeesClient.java:850)
at com.cloudbees.api.BeesClient.applicationDeployArchive(BeesClient.java:435)
at com.cloudbees.plugins.deployer.deployables.Deployable.deployFile(Deployable.java:123)
... 11 more
Build step 'Deploy to CloudBees' marked build as failure
The full output can be seen here.
Caused by: hudson.remoting.ProxyException: com.cloudbees.api.BeesClientException: Server.InternalError - Invalid WEB-INF/cloudbees-web.xml: resource
Your cloudbees-web.xml doesn't follow the correct format.
See http://wiki.cloudbees.com/bin/view/RUN/CloudBeesWebXml - as the cloudbees-web.xml needs to be wrapped in an outer <cloudbees-web-app> element
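Roughly like this - an unverified sketch, since only the outer wrapping element is confirmed by the answer above; the resource name, type, params, and namespace URI are all assumptions to adapt to your app:
<cloudbees-web-app xmlns="http://www.cloudbees.com/xml/webapp/1">
  <!-- the whole file must sit inside this single root element -->
  <resource name="jdbc/mydb" auth="Container" type="javax.sql.DataSource">
    <param name="username" value="dbuser"/>
    <param name="password" value="dbpass"/>
  </resource>
</cloudbees-web-app>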
You also don't have to use cloudbees-web.xml if you don't want - if you bind your app to a DB it will make it available as a named datasource automatically.
(see bees app:bind command).
You only have to do this once - and then the app will know about the datasource.
http://developer.cloudbees.com/bin/view/RUN/Resource+Management
and
https://developer.cloudbees.com/bin/view/RUN/DatabaseGuide
(sorry, still working on docs).
I am using Pentaho Data Integration and I am trying to connect to my database via MySQL, but when I do I get this error:
Error connecting to database [devdb2] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database
Exception while loading class
org.gjt.mm.mysql.Driver
org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database
Exception while loading class
org.gjt.mm.mysql.Driver
at org.pentaho.di.core.database.Database.normalConnect(Database.java:368)
at org.pentaho.di.core.database.Database.connect(Database.java:317)
at org.pentaho.di.core.database.Database.connect(Database.java:279)
at org.pentaho.di.core.database.Database.connect(Database.java:269)
at org.pentaho.di.core.database.DatabaseFactory.getConnectionTestReport(DatabaseFactory.java:86)
at org.pentaho.di.core.database.DatabaseMeta.testConnection(DatabaseMeta.java:2464)
at org.pentaho.ui.database.event.DataHandler.testDatabaseConnection(DataHandler.java:533)
at sun.reflect.GeneratedMethodAccessor55.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:329)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:139)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:123)
at org.pentaho.ui.xul.swt.tags.SwtButton.access$500(SwtButton.java:26)
at org.pentaho.ui.xul.swt.tags.SwtButton$4.widgetSelected(SwtButton.java:119)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.eclipse.jface.window.Window.runEventLoop(Window.java:820)
at org.eclipse.jface.window.Window.open(Window.java:796)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:378)
at org.pentaho.ui.xul.swt.tags.SwtDialog.show(SwtDialog.java:304)
at org.pentaho.di.ui.core.database.dialog.XulDatabaseDialog.open(XulDatabaseDialog.java:115)
at org.pentaho.di.ui.core.database.dialog.DatabaseDialog.open(DatabaseDialog.java:62)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:493)
at org.pentaho.di.ui.spoon.delegates.SpoonDBDelegate.newConnection(SpoonDBDelegate.java:478)
at org.pentaho.di.ui.spoon.Spoon.newConnection(Spoon.java:7770)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.ui.xul.impl.AbstractXulDomContainer.invoke(AbstractXulDomContainer.java:329)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:139)
at org.pentaho.ui.xul.impl.AbstractXulComponent.invoke(AbstractXulComponent.java:123)
at org.pentaho.ui.xul.swt.tags.SwtMenuitem.access$100(SwtMenuitem.java:27)
at org.pentaho.ui.xul.swt.tags.SwtMenuitem$1.widgetSelected(SwtMenuitem.java:77)
at org.eclipse.swt.widgets.TypedListener.handleEvent(Unknown Source)
at org.eclipse.swt.widgets.EventTable.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Widget.sendEvent(Unknown Source)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Unknown Source)
at org.eclipse.swt.widgets.Display.readAndDispatch(Unknown Source)
at org.pentaho.di.ui.spoon.Spoon.readAndDispatch(Spoon.java:1183)
at org.pentaho.di.ui.spoon.Spoon.start(Spoon.java:6966)
at org.pentaho.di.ui.spoon.Spoon.main(Spoon.java:567)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.pentaho.commons.launcher.Launcher.main(Launcher.java:134)
Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
Exception while loading class
org.gjt.mm.mysql.Driver
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:423)
at org.pentaho.di.core.database.Database.normalConnect(Database.java:352)
... 50 more
Caused by: java.lang.ClassNotFoundException: org.gjt.mm.mysql.Driver
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at org.pentaho.di.core.database.Database.connectUsingClass(Database.java:414)
... 51 more
If I use SQLite as my Connection Type it works, but no data is returned when I go to explore it. So my question is: how do I get MySQL working, or get the data using SQLite?
Am I missing a library or a class?
I just came across the same issue while trying to query a MySQL Database from Pentaho.
Error connecting to database [Local MySQL DB] : org.pentaho.di.core.exception.KettleDatabaseException:
Error occured while trying to connect to the database
Exception while loading class
org.gjt.mm.mysql.Driver
Expanding on the post by @user979331, the solution is:
Download the MySQL Java Connector / Driver that is compatible with your kettle version
Unzip the zip file (in my case it was mysql-connector-java-5.1.31.zip)
copy the .jar file (mysql-connector-java-5.1.31-bin.jar) and paste it in your lib folder (see the copy example after these steps):
PC: C:\Program Files\pentaho\design-tools\data-integration\lib
Mac: /Applications/data-integration/lib
Restart Pentaho (Data Integration) and re-test the MySQL Connection.
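On a Mac, for example, the copy step is just the following, assuming the connector was extracted into ~/Downloads:
cp ~/Downloads/mysql-connector-java-5.1.31/mysql-connector-java-5.1.31-bin.jar /Applications/data-integration/lib/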
Additional interesting replies from others that could also help:
Make sure to download the right JDBC driver version, compatible with your PDI version:
https://help.pentaho.com/Documentation/8.1/Setup/JDBC_Drivers_Reference#MY_SQL
Take the zip version ("platform independent") to extract the jar file
Put it into the lib folder
See also the proper way to handle it proposed by Ryan Tuck
Missing driver file.
This error is really common for people just getting started with PDI.
Drivers go in \pentaho\design-tools\data-integration\libext\JDBC for PDI. If you are using other tools in the Pentaho suite, you may need to copy drivers to additional locations for those tools. For reference, here are the appropriate folders for some of the other design tools:
Aggregation Designer: \pentaho\design-tools\aggregation-designer\drivers
Metadata Editor: \pentaho\design-tools\metadata-editor\libext\JDBC
Report Designer: \pentaho\design-tools\report-designer\lib\jdbc
Schema Workbench: \pentaho\design-tools\schema-workbench\drivers
If this transformation or job will run on another box, such as a test or production server, don't forget to include copying the jar file and restarting PDI or the Data Integration Server in your deployment considerations.
Turns out I was missing a jar called mysql-connector-java-5.1.2.jar. I added it to this folder (C:\Program Files\pentaho\design-tools\data-integration\lib) and it worked with a MySQL connection, and my data and tables appear.
You need to download mysql-connector-java-5.1.46.tar.gz, not the latest version. The org.gjt.mm.mysql.Driver class that Pentaho uses is not included in the mysql-connector-java-8.xx.yy versions.
In addition to the other answers here, here's how you can do it on Ubuntu (14.04):
sudo apt-get install libmysql-java
This will download mysql-connector-java-5.x.x.jar to /usr/share/java/, which I believe also automatically creates a symlink named mysql-connector-java.jar.
Then, create a symlink in /your/path/to/data-integration/lib/:
ln -s /usr/share/java/mysql-connector-java.jar /your/path/to/data-integration/lib/mysql-connector-java.jar
At the present time, there is a simple way to fix this problem:
Go to Tools → MarketPlace and search for "PDI MySQL Plugin"
Install it ( this will automatically install the missing driver here: data-integration\plugins\databases\pdi-mysql-plugin\lib )
Restart Pentaho
Done.
To be concise and precise: download the JDBC driver (.jar) compatible with your MySQL version and put it in the lib folder.
For example, for MySQL 8.0.2, download Connector/J 8.0.20.
The above answers were helpful, but for unknown reasons they did not seem to work for me.
So if you have already installed MySQL Workbench on your system, instead of downloading the jar files and struggling with the correct version, just go to
C:\Program Files (x86)\MySQL\Connector J 8.0
copy the mysql-connector-java-8.0.12 jar file in that location (it does not matter what version it is) and paste it into C:\Program Files\pentaho\design-tools\data-integration\lib
First of all, you need to download the MySQL connector that is compatible with your Pentaho version. After that, paste it into the data-integration/lib folder and restart Pentaho.
Check this:
https://help.pentaho.com/Documentation/8.1/Setup/JDBC_Drivers_Reference#MY_SQL