JDBC driver not found error in monkeyrunner/jython - mysql

I need to insert something into the DB. I'm using JDBC as the connector, Jython for the script, and MySQL as the DB; the script runs on CentOS.
My code looks something like this:
from com.android.monkeyrunner import MonkeyRunner, MonkeyDevice, MonkeyImage
from com.ziclix.python.sql import zxJDBC
db=zxJDBC.connect("jdbc:mysql://XXX.XXX.XXX.XXX:3306/dbname","USER","PASSWORD","org.gjt.mm.mysql.Driver")
c=db.cursor()
c.execute("INSERT INTO tablename values ('X','X','X')")
Before that, I downloaded and decompressed the file from here (onto the desktop) and added the path to the classpath by doing this:
export PATH=/home/XX/Desktop/mysql-connector-java-5.1.22
When I ran the script, it gave me this error:
zxJDBC.DatabaseError: driver [org.gjt.mm.mysql.Driver] not found
What have I done wrong? Is the driver name correct? I just copied it from one of the tutorials I've seen. Or did I not install the driver correctly?
Thanks.

This is how I managed to solve the error:
Download the JDBC driver here.
Extract the tar.gz file anywhere you want.
Inside that folder you will find mysql-connector-java-5.1.22-bin.jar. Copy it to (in my case) /%android-sdk%/tools/lib.
Add the new location of mysql-connector-java-5.1.22-bin.jar to the classpath.
Write the script like this:
from com.android.monkeyrunner import MonkeyRunner, MonkeyDevice, MonkeyImage
from com.ziclix.python.sql import zxJDBC
db=zxJDBC.connect("jdbc:mysql://XXX.XXX.XXX.XXX:3306/dbname","USER","PASSWORD","com.mysql.jdbc.Driver")
c=db.cursor()
c.execute("INSERT INTO tablename values ('X','X','X')")
db.commit()
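If you want to be a bit more defensive, the same insert can also be written with a parameterised query and explicit cleanup. This is only a sketch of the same approach; the table name and values are the placeholders used above:
from com.ziclix.python.sql import zxJDBC

db = zxJDBC.connect("jdbc:mysql://XXX.XXX.XXX.XXX:3306/dbname",
                    "USER", "PASSWORD", "com.mysql.jdbc.Driver")
c = db.cursor()
try:
    # Parameter markers avoid quoting problems in the inserted values
    c.execute("INSERT INTO tablename VALUES (?, ?, ?)", ["X", "X", "X"])
    db.commit()
finally:
    c.close()
    db.close()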
Hope this helps those who need it in the future. :)

How are you running jython? If you're using the standalone install, i.e. java -jar jython.jar, then from the Java Documentation ...
-jar
When you use this option, the JAR file is the source of all user classes, and other user class path settings are ignored.
... you can't add anything to the classpath. Repackaging the required classes into the Jython jar is one approach, or this answer has an alternative solution: add jython.jar to the classpath as well (either with -cp or CLASSPATH) and run the org.python.util.jython class directly.
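For example (the jar locations and script name here are only illustrative), instead of running with -jar you could launch the script with something like:
java -cp jython.jar:/home/XX/Desktop/mysql-connector-java-5.1.22/mysql-connector-java-5.1.22-bin.jar org.python.util.jython myscript.py
On Windows, separate the classpath entries with ; instead of :.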

I got the same problem on Windows 7 and solved it like this:
Download the JDBC driver.
Add mysql-connector-java-ver-bin.jar to the environment variables, for example:
CLASSPATH : C:\xxx-path\mysql-connector-java-5.1.41-bin.jar
That solved the problem for me.
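For a quick test from a Command Prompt (reusing the same path shown above), you can also set it for the current session before running the script:
set CLASSPATH=%CLASSPATH%;C:\xxx-path\mysql-connector-java-5.1.41-bin.jar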

Related

Pentaho MySQL 8 connection error Driver class 'org.gjt.mm.mysql.Driver' could not be found

While upgrading ETL scripts from MySQL 5 to MySQL 8, as soon as I updated the data-integration/lib jar to mysql-connector-java-8.0.xx.jar, it started failing with the following error:
Driver class 'org.gjt.mm.mysql.Driver' could not be found, make sure the 'MySQL' driver (jar file) is installed.
Could you please try adding both jars to data-integration/lib?
That is what I did: I added the latest 5.x jar, i.e. mysql-connector-java-5.1.48.jar, plus the jar matching my server version (mine is 8.0.19), i.e. mysql-connector-java-8.0.19.jar, and copied both into data-integration/lib.
After testing, it is working fine now.
I spent a lot of time debugging and finally concluded two things; I hope this saves others time in a similar situation.
Reason: there is a hardcoded JDBC driver name in org.pentaho.di.core.database.MySQLDatabaseMeta; it always returns org.gjt.mm.mysql.Driver, which has been removed, and the new driver name com.mysql.jdbc.Driver or com.mysql.cj.jdbc.Driver should be used instead.
Solutions: any of the below resolves it.
Continue using the old JDBC jar.
Modify the method below in org.pentaho.di.core.database.MySQLDatabaseMeta, compile it, and place it in the classes directory.
public String getDriverClass() {
  if (getAccessType() == DatabaseMeta.TYPE_ACCESS_ODBC) {
    return "sun.jdbc.odbc.JdbcOdbcDriver";
  } else {
    return "com.mysql.cj.jdbc.Driver";
  }
}
Use the Generic database connection, then you can specify the driver class yourself. (Based on #Cyrus comment.)
Pentaho open bug reference.
I was working with PDI 9.1; all I did to get rid of this error was to copy these two jar files into data-integration\lib:
mysql-connector-java-5.1.49.jar
mysql-connector-java-5.1.49-bin.jar
The link to the zip folder is mentioned in the above comments
Restart your spoon!
And that's it.
I use Pentaho version 9.1 (9.1.0.0-324) and mysql-connector-java-8.0.25:
1- Make sure Pentaho is not running.
2- Download the MySQL connector from the link below (https://mvnrepository.com/artifact/mysql/mysql-connector-java).
3- Copy the .jar file (mysql-connector-java-8.0.25.jar) and paste it into your lib folder.
Example: C:....\pentaho-data-integration\lib
4- Execute Pentaho (Spoon.bat).
MySQL driver for version 8 changed the class name. Therefore, you must set it as a Generic connection instead, and use com.mysql.jdbc.Driver as the class.
I have fixed the issue by replacing mysql-connector-java-5.1.42-bin.jar with mysql-connector-java-5.1.44.jar
Reference
I am using Pentaho 9.1.0.0-324 and had the same problem. I downloaded the Everything app and with it found that I already had a file named mysql-connector-j-8.0.31.jar in the C:\Program Files (x86)\MySQL\Connector J 8.0\ folder. I simply copied that file into C:\Pentaho\lib and the problem was solved. However, now I am getting another error:
Connection failed. Verify all connection parameters and confirm that the appropriate driver is installed. Access denied for user 'root'@'localhost' (using password: YES)
Update
I solved the Access denied error by reinstalling MySQL with the legacy password option and a password without special symbols.

How do I import a mySQL database table into Solr 6?

I am trying to import a simple MySQL table into Solr and keep failing.
Having ploughed through:
https://gist.github.com/maxivak/3e3ee1fca32f3949f052
https://wiki.apache.org/solr/DataImportHandler
How to import data from mysql to solr
and half a dozen other posts that relate to Solr 4, I am in desperate need of some community help.
I need specific instructions for Solr 6 (installed on RedHat) on how to connect to a MySQL DB. Thanks a lot.
Please find the steps below:
Install Java Runtime Environment (JRE) version 1.8 for Solr 6.1.
Set Java home: export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_101/
Download Solr 6.1 (solr-6.1.0.tgz) from http://www-eu.apache.org/dist/lucene/solr/6.1.0/
Extract the tgz file using "tar zxf solr-6.1.0.tgz".
Open port 8983 so that you can communicate with Solr.
Go to the path ../solr-6.1.0/ and start the server with "bin/solr start".
Create a config folder named myConfig under the configsets directory and move the conf folder into it. Add data-config.xml into myConfig/conf.
Modify the schema.xml under myConfig/conf for the fields used by data-config.xml.
Add a requestHandler entry for org.apache.solr.handler.dataimport.DataImportHandler, pointing at data-config.xml, to solrconfig.xml.
Fire the URL below to create the core:
http://localhost:8983/solr/admin/cores?action=CREATE&name=mycore&instanceDir=my_instance&configSet=myConfig
Add the jars solr-dataimporthandler-6.1.0.jar and solr-dataimporthandler-extras-6.1.0.jar to /home/abhijit/Downloads/solr-6.1.0/server/lib.
Add solr-core-6.1.0.jar to the same path, /home/abhijit/Downloads/solr-6.1.0/server/lib.
Add solr-solrj-6.1.0.jar to the same path as well.
If you do the above steps, you will get an error like:
"Error Instantiating requestHandler, org.apache.solr.handler.dataimport.DataImportHandler failed to instantiate org.apache.solr.request.SolrRequestHandler"
This problem occurs because the data import handler and the request handler are loaded by two different class loaders. To solve it, ensure that Solr loads its jars from the same class loader.
As an alternative to steps 5, 6 and 7: create a folder named lib inside the Solr folder at /home/abhijit/Downloads/solr-6.1.0/server/solr, add a corresponding lib entry for that folder to solrconfig.xml, and comment out the other lib entries in solrconfig.xml.
Then put the jars solr-dataimporthandler-6.1.0.jar and solr-dataimporthandler-extras-6.1.0.jar into /home/abhijit/Downloads/solr-6.1.0/server/solr/lib.
I also commented out the updateRequestProcessorChain in solrconfig.xml.
Command for full import
http://localhost:8983/solr/mycore/dataimport?command=full-import&clean=true&commit=true
If you have any params to pass, do something like this:
http://localhost:8983/solr/mycore/dataimport?command=full-import&clean=true&commit=true&cabinetId=5654174
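If you prefer to trigger the import from a script rather than a browser, here is a minimal sketch in Python (assuming Python 3 on the same host, and the core name mycore from the URLs above):
from urllib.request import urlopen

# Kick off a full import with the same parameters as the URL above
url = ("http://localhost:8983/solr/mycore/dataimport"
       "?command=full-import&clean=true&commit=true")
with urlopen(url) as response:
    # Solr replies with the import status document
    print(response.read().decode("utf-8"))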

Adding Spark CSV dependency to Zeppelin

I'm running an EMR cluster with Spark on AWS.
Spark version is 1.6
When running the following command:
proxy = sqlContext.read.load("/user/zeppelin/ProxyRaw.csv",
format="com.databricks.spark.csv",
header="true",
inferSchema="true")
I get the following error:
Py4JJavaError: An error occurred while calling o162.load.
: java.lang.ClassNotFoundException: Failed to find data source: com.databricks.spark.csv. Please find packages at
http://spark-packages.org
at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.lookupDataSource(ResolvedDataSource.scala:77)
How can I solve this? I assume I should add a package but how do I install it and where?
There are many ways to add packages in Zeppelin:
One of them is to change the conf/zeppelin-env.sh configuration file, adding the package you need (com.databricks:spark-csv_2.10:1.4.0 in your case) to the submit options, since Zeppelin uses the spark-submit command under the hood:
export SPARK_SUBMIT_OPTIONS="--packages com.databricks:spark-csv_2.10:1.4.0"
But let's say you don't actually have access to that configuration. You can then use dynamic dependency loading via the %dep interpreter (deprecated):
%dep
z.load("com.databricks:spark-csv_2.10:1.4.0")
This will require that you load the dependencies before launching or restarting the interpreter.
Another way is to add the dependency you need via the interpreter dependency manager, as described in the following link: Dependency Management for Interpreter.
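Whichever approach you pick, once the spark-csv package is available to the Spark interpreter, the load call from the question should work unchanged in a %pyspark paragraph, for example:
# Same call as in the question; the path and options are the questioner's
proxy = sqlContext.read.load("/user/zeppelin/ProxyRaw.csv",
                             format="com.databricks.spark.csv",
                             header="true",
                             inferSchema="true")
proxy.printSchema()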
Well, first you need to download the CSV lib from the Maven repository:
https://mvnrepository.com/artifact/com.databricks/spark-csv_2.10/1.5.0
Check which Scala version you are using: 2.10 or 2.11.
When you call spark-shell, spark-submit or pyspark (or even Zeppelin), you need to add the --jars option with the path to your lib.
Like this:
pyspark --jars /path/to/jar/spark-csv_2.10-1.5.0.jar
Then you can call it as you did above.
You can see another closely related issue here: How to add third party java jars for use in pyspark

MonkeyRunner and MySQL

I have a problem using MySQL with MonkeyRunner and need help with the MySQL driver name. The start of my script is:
#coding : utf-8
from com.android.monkeyrunner import MonkeyRunner,MonkeyDevice
from com.ziclix.python.sql import zxJDBC
db = zxJDBC.connect('jdbc:mysql://localhost/android','dani','123456','com.mysql.jdbc.Driver')
.
.
etc
Each time I get an error that com.mysql.jdbc.Driver was not found, even when using org.git.mm.mysql.Driver. I set the classpath as follows:
export CLASSPATH=$CLASSPATH:/usr/share/java/mysql-connector-java.5.1.10.jar
Any tips or hints?
Regards.
You need to include the relevant jar in the correct location. There should be a libs folder where you need to put the jar. It should be in the top level of your project folder, at the same level as the src and res folders.

How to register a JDBC driver using jruby-complete.jar?

I'm trying to write a script that is executed with the jruby-complete.jar like so:
java -cp derby.jar; -Djdbc.drivers=org.apache.derby.jdbc.EmbeddedDriver -jar jruby-complete.jar -S my_script.rb
I'm using JVM 1.6.0_11 and JRuby 1.4.
In my jruby script I attempt to connect to the database like this.
connection = Java::com.sql.DriverManager.getConnection("jdbc:derby:path_to_my_DB")
This throws a java.sql.SQLException: "No suitable driver found" exception.
I've tried manually loading the driver into the class loader using Class.forName which gives me the same error.
It looks to me like the class loader being used by the DriverManager is not the same as the current thread's. I've tried setting the current thread's class loader using:
JThread = java.lang.Thread
...
class_loader = JavaLang::URLClassLoader.new(
  [JavaLang::URL.new("jar:file:/derby.jar!/")].to_java(JavaLang::URL),
  JRuby.runtime.jruby_class_loader)
JThread.currentThread().setContextClassLoader(class_loader)
But this doesn't help.
Any ideas?
OK, I downloaded jruby-complete.jar and had a go...
This seems to work for me:
java -classpath c:\ruby\db-derby-10.5.3.0-bin\lib\derby.jar;jruby-complete-1.4.0.jar org.jruby.Main -S derby.rb
When using the -jar switch, the -classpath option is ignored (maybe the CLASSPATH shell var is too). But on the above line, we put both required jars on the class path and pass the class name to execute (i.e. org.jruby.Main). The script being passed in is as per my other answer.
Another option (which I have not tried) would be to alter the jruby-complete.jar manifest file to specify as classpath, as described here:
Adding Classes to the JAR File's Classpath
First, make sure your driver jar is not corrupted (this made me waste a couple of days one time).
Second, read this about JRuby/Java class loaders: JRuby Wiki
Third (because I haven't played with jruby-complete), try this simple script and then see if you can adapt it as you need.
require 'java'
require 'C:\ruby\db-derby-10.5.3.0-bin\lib\derby.jar' # adjust for your machine
include_class "java.sql.DriverManager"
derby = org.apache.derby.jdbc.EmbeddedDriver.new
connection = DriverManager.getConnection("jdbc:derby:derbyDB;create=true")