I updated my Play app to Scala 2.12 and Play 2.6.1, and I also upgraded Slick to v3.2. Before (Scala 2.11.11 and Play 2.5.14) everything was working fine with MySQL driver version 5.1.23, which is the officially supported version according to this page: https://github.com/slick/slick . Now I get
java.lang.ClassNotFoundException: slick.driver.MySQLDriver
even after several sbt clean runs. Is there any form of dependency between the MySQL driver and Scala?
Maybe it was obvious, but I searched the docs: http://slick.lightbend.com/doc/3.2.0/api/#slick.driver.package
and found that the driver package has changed from slick.driver to slick.jdbc.(db)Profile,
so now
slick.driver.MySQLDriver
became
slick.jdbc.MySQLProfile
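For reference, a minimal sketch of what the change looks like in code (the table definition is a made-up example, not from my app; only the import actually changes):

// Slick 3.1 and earlier:
// import slick.driver.MySQLDriver.api._

// Slick 3.2+:
import slick.jdbc.MySQLProfile.api._

// Hypothetical table, just to show the rest of the code stays the same
class Users(tag: Tag) extends Table[(Long, String)](tag, "users") {
  def id   = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def name = column[String]("name")
  def *    = (id, name)
}

val users = TableQuery[Users]

If you use play-slick, the profile reference in application.conf changes the same way, e.g. slick.dbs.default.profile = "slick.jdbc.MySQLProfile$" (assuming the default play-slick configuration keys).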
/build/Debug/ant/pdm.jar!/META-INF/versions/9/org/apache/logging/log4j/util/Base64Util.class: Class Version Error. Please recompile with a supported JDK or check for an update to DashO which supports the new version.
We are using Ant as the build tool and DashO as the code shrinking tool.
We are migrating our code from Log4j 1.x to Log4j 2.x, and we replaced the old jar with the new jars (2.x api & 2.x core).
Since adding the libraries we have been getting this error while building the project. We are using Java 8, and per the official Log4j 2 documentation these Log4j 2 versions do support Java 8.
I found this line when I searched for Base64Util.class in the official 2.x documentation:
link
I tried 2.17.1, 2.15, and 2.13, but no luck.
Why is this unsupported-JDK error coming up even though the project uses Java 8?
In order to support Java 8 and all later releases, the log4j-api and log4j-core artifacts are multi-release jars. The class file that is giving you problems contains Java 9 bytecode.
According to their website, DashO does not support multi-release jars.
Remark: removing the Java 9 classes from log4j-api and log4j-core will break logger context selection and location information on JDK 9 and later, so it is not an option.
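If you want to confirm this on your side, here is a small sketch (in Scala 2.13+, since a Scala REPL makes this quick to try; the jar path is an assumption, point it at the log4j jar you actually ship) that prints the Multi-Release manifest flag and the versioned entries, including the Base64Util class from your error:

import java.util.jar.JarFile
import scala.jdk.CollectionConverters._

// Assumed path; adjust to wherever your log4j-api / log4j-core jars live
val jar = new JarFile("log4j-core-2.17.1.jar")

// Multi-release jars carry "Multi-Release: true" in their manifest
println("Multi-Release: " + jar.getManifest.getMainAttributes.getValue("Multi-Release"))

// The Java 9+ class files (such as the Base64Util variant DashO rejects)
// live under META-INF/versions/9/
jar.entries().asScala
  .filter(_.getName.startsWith("META-INF/versions/"))
  .foreach(e => println(e.getName))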
I am using the json-path 2.4.0 library in Spark jobs, which depends on json-smart 2.x, but Spark's default jars classpath folder (/usr/hdp/2.6.5.0-292/spark2/jars/) contains json-smart 1.x, which always takes precedence, so I am unable to use the json-path 2.x library.
I get the error below every time I run the job:
java.lang.NoSuchFieldError: defaultReader
at com.jayway.jsonpath.spi.json.JsonSmartJsonProvider.&lt;init&gt;(JsonSmartJsonProvider.java:39)
at com.jayway.jsonpath.internal.DefaultsImpl.jsonProvider(DefaultsImpl.java:21)
at com.jayway.jsonpath.Configuration.defaultConfiguration(Configuration.java:174)
A similar issue has been reported earlier:
JSON Path 2.3.0 conflicts with hadoop 2.7 Environment JSON-smart1.2.0.jar
But I haven't found any working solution. Please help.
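For what it's worth, here is a quick diagnostic sketch I can run from spark-shell to show which jar each class is actually loaded from (net.minidev.json.JSONValue is just a class I expect to exist in both json-smart versions; in my case the json-smart location points at the HDP jars folder, which matches the conflict described above):

// Print the jar each conflicting class is loaded from
val jsonSmartSrc = classOf[net.minidev.json.JSONValue]
  .getProtectionDomain.getCodeSource.getLocation
val jsonPathSrc = classOf[com.jayway.jsonpath.Configuration]
  .getProtectionDomain.getCodeSource.getLocation

println(s"json-smart loaded from: $jsonSmartSrc")
println(s"json-path loaded from:  $jsonPathSrc")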
I've been following the 5-minute how-to for setting up an HTAP database with tidb_tispark, and everything works until I get to the Launch TiSpark section. My first issue occurs when executing the line:
docker-compose exec tispark-master /opt/spark-2.1.1-bin-hadoop2.7/bin/spark-shell
But I got around that by changing the Spark version to the one I found inside the container:
docker-compose exec tispark-master /opt/spark-2.3.3-bin-hadoop2.7/bin/spark-shell
My second issue occurs when executing the three-line block:
import org.apache.spark.sql.TiContext
val ti = new TiContext(spark)
ti.tidbMapDatabase("TPCH_001")
When I run the last statement, I get the following output:
scala> ti.tidbMapDatabase("TPCH_001")
2019-07-11 16:14:32 WARN General:96 - Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-2.3.3-bin-hadoop2.7/jars/datanucleus-core-3.2.10.jar."
2019-07-11 16:14:32 WARN General:96 - Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-2.3.3-bin-hadoop2.7/jars/datanucleus-api-jdo-3.2.6.jar."
2019-07-11 16:14:32 WARN General:96 - Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/spark/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/spark-2.3.3-bin-hadoop2.7/jars/datanucleus-rdbms-3.2.9.jar."
2019-07-11 16:14:36 WARN ObjectStore:568 - Failed to get database global_temp, returning NoSuchObjectException
This doesn't prevent me from running the query:
spark.sql("select * from nation").show(30);
But when I follow the tutorial's later steps to modify the database from MySQL, the changes are not reflected immediately in Spark. Furthermore, at some point later (I believe more than 5 minutes), the modified row stops showing up in Spark SQL queries altogether.
I'm rather new to this kind of setup and don't really know how to debug this issue. Searches for the warnings I received weren't illuminating.
I don't know if it's helpful, but when I connect with MySQL this is the server version I get:
Server version: 5.7.25-TiDB-v3.0.0-rc.1-309-g8c20289c7 MySQL Community Server (Apache License 2.0)
I'm one of the main developers of TiSpark. Sorry for the bad experience with it.
Due to a Docker problem on my side I cannot directly reproduce your issue, but it seems you hit one of the bugs fixed recently:
https://github.com/pingcap/tispark/pull/862/files
The tutorial document is not quite up to date and points to an older version. That's why it didn't work with Spark 2.1.1 as described in the tutorial. We will update it ASAP.
Newer versions of TiSpark no longer use tidbMapDatabase; they hook into the catalog directly instead. The tidbMapDatabase method remains for backward compatibility. Unfortunately, tidbMapDatabase had a bug (introduced when we ported it from an older version): it retrieves the timestamp for queries only once, at the moment you call the function. That causes TiSpark to always use the old timestamp for snapshot reads, so newer data is never seen.
In newer versions of TiSpark (TiSpark 2.0+ with Spark 2.3+), databases and tables are hooked directly into the catalog services, and you can simply call:
spark.sql("use TPCH_001").show
spark.sql("select * from nation").show
This should give you fresh data.
So try restarting your Spark driver, run just the two lines of code above, and see if it works.
Let me know if this fixes your problem. Meanwhile, we will check our Docker image to make sure it already contains the fix.
If things still go wrong, please run the code below and let us know your TiSpark version:
spark.sql("select ti_version()").show
Again, sorry for causing you trouble and thanks for trying.
EDIT
To address your comment:
The warning occurs because Spark itself first tries to locate the database in its native catalog, which produces the "Failed to get database" warning. The failover process then delegates the search to TiSpark, which behaves correctly, so this warning can be ignored. It's recommended to add the line below to log4j.properties in your Spark conf folder:
log4j.logger.org.apache.hadoop.hive.metastore.ObjectStore=ERROR
We will polish the docker tutorial image soon. Thank you so much for trying.
I am running Squirrel-SQL version 3.9.0 using JDK 10 on MS-Windows 10. I have configured the Microsoft SQL Server JDBC driver sqljdbc42.jar along with its native DLLs to enable native Kerberos authentication. But when I try to connect to my database, I get the message "JDBC Driver class not found" with java.lang.ClassNotFoundException: javax.xml.bind.DatatypeConverter. How do I fix this issue?
I was able to fix this issue by placing jaxb-api-2.3.0.jar into the squirrel-sql-3.9.0/lib folder and restarting the application.
I had the same problem. I upgraded everything to the latest version of Squirrel, 4.5.1 (January 2023), the latest JDK 19, and the latest SQL Server driver, but the problem kept occurring.
I found another post on Stack Overflow asking which repository has this JAXB jar:
How do I resolve Could not find artifact javax.xml.bind:jaxb-api:pom:2.3.0-b161121.1438 in central (https://repo1.maven.org/maven2)?
This is the link to the JAXB jar; I added it under lib/ in the Squirrel directory, and then it worked:
https://maven.java.net/#nexus-search;gav~javax.xml.bind~jaxb-api~2.3.0-b161121.1438~~
I built my app using MySQL Connector/C to connect to a remote MySQL database. It works fine on the simulator (no errors, no warnings), but when I try to run it on my device (an iPhone 5) I get this error:
No architectures to compile for (ARCHS=armv7 armv7s, VALID_ARCHS=armv7 armv7s)
I tried, as suggested in some answers, changing the settings (Architectures, Build Active Architecture Only, Valid Architectures), but the error remains. Only when I change the settings (Architectures & Valid Architectures) to "armv6" does it build without that error, but then many warnings appear saying:
warning: no rule to process file '(my App dir)/main.m' of type sourcecode.c.objc for architecture armv6
and likewise for all the other .m files. When I try to start the app I get the message:
Xcode cannot run using selected device
I know that the connector library needs to be updated, but is there any solution?
You need to compile the connector lib in Xcode for iOS (armv6, armv7, armv7s, i386), then use the lipo tool to combine the output libs.
Connecting directly to MySQL from the app is not safe. A suggested approach is to set up an Apache+PHP+MySQL server, then use ASIHTTPRequest on the iPhone to connect to your server.