PlayFramework Slick NoSuchMethodError - mysql

I'm having a strange problem while using the new Slick version with MySQL:
[RuntimeException: java.lang.NoSuchMethodError: slick.driver.JdbcProfile$API.streamableQueryActionExtensionMethods(Lslick/lifted/Query;)Lslick/profile/BasicActionComp$$$$6aa48549c0a7603df1fa229cf7177493$$$$sionMethodsImpl;]
In my application.conf:
slick.dbs.default.driver = "slick.driver.MySQLDriver$"
slick.dbs.default.db.driver = "com.mysql.jdbc.Driver"
slick.dbs.default.db.url = "jdbc:mysql://localhost/test?useUnicode=true&useJDBCCompliantTimezoneShift=true&useLegacyDatetimeCode=false&serverTimezone=UTC&useSSL=false"
slick.dbs.default.db.user = "root"
slick.dbs.default.db.password = ""
And the code throwing the exception:
Await.result(db.run(table.result), Duration.Inf)
Evolutions did a good job; the tables were created etc. But here I get this nasty error ;/
My build.sbt:
name := """bettor"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.8"
val utilsDeps = Seq("joda-time" % "joda-time" % "2.9.4",
"com.github.tototoshi" %% "slick-joda-mapper" % "2.2.0",
"org.joda" % "joda-convert" % "1.8.1")
val dbsDeps = Seq("com.typesafe.play" %% "play-slick" % "2.0.0",
"com.typesafe.play" %% "play-slick-evolutions" % "2.0.0",
"mysql" % "mysql-connector-java" % "6.0.2")
val jsonDeps = Seq("org.json4s" %% "json4s-jackson" % "3.4.0",
"org.jsoup" % "jsoup" % "1.9.2")
libraryDependencies ++= Seq(
cache,
ws,
"org.scalatestplus.play" %% "scalatestplus-play" % "1.5.1" % Test
) ++ utilsDeps ++ dbsDeps ++ jsonDeps
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
Any ideas how to solve this issue?
Even using this template without changing anything I get the same error :(
play-slick-mysql

In my case, I had used the Seq("-Xmax-classfile-name", "78") setting suggested here https://stackoverflow.com/a/32862972/1432640 without reading the comments (this case was mentioned there), and struggled with the error for 4+ hours. The nightmare is over! Smeagol is free!

Heh, adding this to build.sbt:
scalacOptions := Seq("-feature")
resolved the problem.

Related

Getting error while creating schema in mysql using slick

Error:
value <> is not a member of (slick.lifted.Rep[String], slick.lifted.Rep[String], slick.lifted.Rep[String])
def * : ProvenShape[Employee] = (id, name, dept) <> (Employee.tupled, Employee.unapply)
My build.sbt looks like this:
name := "AkkaHttpDemo"
version := "0.1"
scalaVersion := "2.13.3"
lazy val root = project in file(".")
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.6.8" ,
"com.typesafe.akka" %% "akka-stream" % "2.6.8",
"com.typesafe.akka" %% "akka-http" % "10.2.0",
"com.typesafe.slick" %% "slick" % "3.3.2",
"com.typesafe.slick" %% "slick-hikaricp" % "3.3.2",
"org.slf4j" % "slf4j-nop" % "1.6.6" % Test,
"mysql" % "mysql-connector-java" % "8.0.21"
)
The table schema looks like this:
package services
import akka.stream.scaladsl._
import slick.jdbc.MySQLProfile._
import slick.ast.ScalaBaseType.stringType
import slick.lifted._
import slick.util._
trait EmployeeStore {
class EmployeeStoreImpl(tag: Tag) extends Table[Employee](tag, _schemaName = Some("MYSCHEMA"), "EMPL") {
def id = column[String]("empid")
def name = column[String]("name")
def dept = column[String]("dept")
def * : ProvenShape[Employee] = (id, name, dept) <> (Employee.tupled, Employee.unapply)
}
val dict = TableQuery[EmployeeStoreImpl]
}
Note: when I tried the code given below, I also got an error:
dict.schema.create
Cannot resolve symbol schema
You need to import the Slick API:
import slick.jdbc.MySQLProfile.api._
You can also replace <> with the more convenient:
def * = (id, name, dept).mapTo[Employee]
You likely don't need the slick.ast, slick.lifted, or slick.util imports, but I appreciate you including the full list of imports in your example.
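Putting both suggestions together, a minimal sketch of the table definition might look like this (assuming Employee is a case class with exactly the fields id, name and dept, in that order):

package services

// the api import brings Tag, Table, TableQuery, column and the column operators into scope
import slick.jdbc.MySQLProfile.api._

case class Employee(id: String, name: String, dept: String)

trait EmployeeStore {
  class EmployeeStoreImpl(tag: Tag) extends Table[Employee](tag, Some("MYSCHEMA"), "EMPL") {
    def id = column[String]("empid")
    def name = column[String]("name")
    def dept = column[String]("dept")
    // mapTo derives the mapping to the case class, replacing <> (Employee.tupled, Employee.unapply)
    def * = (id, name, dept).mapTo[Employee]
  }

  val dict = TableQuery[EmployeeStoreImpl]
}

With the api._ import in scope, dict.schema.create should also resolve; note that it returns a DBIO action that still has to be run against a Database.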

NoSuchMethodError: org.apache.spark.sql.SQLContext.sql

I'm writing a Spark program with the Elasticsearch libraries.
Here is my build.sbt.
scalaVersion := "2.10.5"
val sparkVersion = "2.0.1"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"org.apache.spark" %% "spark-catalyst" % sparkVersion % "provided",
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
)
libraryDependencies += "org.elasticsearch" % "elasticsearch-spark_2.10" % "2.3.3"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "2.3.3"
Here is the error message.
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/Dataset;
at com.minsu.house.BatchProgram.process(BatchProgram.scala:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
And my code is as follows:
val sqlContext = new SQLContext(sparkContext)
val dataframe = sqlContext.sql(sqlString) // <----- HERE !!!
I do not think it has anything to do with the Elasticsearch library.
It simply seems to be a dependency or version problem.
What should I do? Thank you.
The Elasticsearch Spark connector version that you are trying to use does not support Spark 2. You have two options here:
Use Spark 1.6.x since Elasticsearch 2.x does not support Spark 2
Upgrade the Elasticsearch Spark connector and Elasticsearch itself to 5.x
For instance, I used org.elasticsearch:elasticsearch-spark-20_2.11:5.0 and the following Spark 2 code:
// add to your class imports
import org.elasticsearch.spark.sql._
// Use Spark 2.0 SparkSession object to provide your config
val sparkSession = SparkSession.builder().config(...).getOrCreate()
// Optional step, imports things like $"column"
import sparkSession.implicits._
// Specify your index and type in ES
val df = sparkSession.esDF("index/type")
// Perform an action
df.count()
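For reference, the corresponding build.sbt change might look roughly like this (a sketch assuming a move to Scala 2.11 and Elasticsearch 5.0.0; adjust the versions to match your cluster):

scalaVersion := "2.11.8"

val sparkVersion = "2.0.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
)

// replaces elasticsearch-spark_2.10 2.3.3, which only supports Spark 1.x
libraryDependencies += "org.elasticsearch" % "elasticsearch-spark-20_2.11" % "5.0.0"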

Unresolved symbol column - Scala Slick

I'm currently trying to create a table using Slick, and I'm baffled as to which import I'm missing, as the examples I've seen don't seem to have a relevant-looking import in them.
Currently the column, question mark and the O's are all unresolved.
Could someone let me know what I'm doing wrong please?
Here is my table class:
package com.grimey.tabledefinitions
import slick.driver.MySQLDriver.api._
import com.grimey.staticpage.StaticPage
import slick.lifted.Tag
import slick.model.Table
class StaticPageDef(tag: Tag) extends Table[StaticPage](tag, "static_page") {
def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
def pageType = column[String]("page_type")
def contentHtml = column[String]("content_html")
def * = (id.?, pageType, contentHtml) <>(StaticPage, StaticPage.unapply _)
}
And here is my build.sbt:
name := """grimey-cms"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"mysql" % "mysql-connector-java" % "5.1.38",
"com.typesafe.play" %% "play-slick" % "2.0.0",
"com.typesafe.play" %% "play-slick-evolutions" % "2.0.0"
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
fork in run := true
And finally, here is the case class I'm using for the table:
package com.grimey.staticpage
import java.time.LocalDateTime
case class StaticPage(id: Long, htmlContent: String, pageType: String,
created: LocalDateTime, updated: LocalDateTime)
I bet it's something really silly :)
The O object comes from the table, and it varies from driver to driver. Some drivers may not support certain column options supported by others. Therefore you need to import the column options that are specific to your database, MySQL in this case:
import slick.driver.MySQLDriver.api._
You may check this full tutorial on how to use Play + Slick + MySQL: http://pedrorijo.com/blog/play-slick/
Or you may just navigate through the code: https://github.com/pedrorijo91/play-slick3-steps
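As a rough sketch of how the table can look once only the driver api is imported (the slick.model.Table import in particular shadows the lifted Table that the api wildcard provides, so it should be dropped, and the default projection has to line up with the fields of your case class):

package com.grimey.tabledefinitions

import slick.driver.MySQLDriver.api._
import com.grimey.staticpage.StaticPage

// assumes StaticPage is reduced to (id: Long, pageType: String, contentHtml: String);
// the created/updated fields would otherwise need columns and a place in the projection too
class StaticPageDef(tag: Tag) extends Table[StaticPage](tag, "static_page") {
  def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def pageType = column[String]("page_type")
  def contentHtml = column[String]("content_html")
  def * = (id, pageType, contentHtml) <> (StaticPage.tupled, StaticPage.unapply)
}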

Spark java.lang.SecurityException: class "javax.servlet.FilterRegistration"' with sbt

This is driving me insane; I have tried the solutions out there but can't get it to work. I have a project that uses Spark, managed by sbt.
I'm getting the well-known error:
Exception in thread "main" java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
The (partial) build.sbt is:
scalaVersion := "2.11.7"
//Library repositories
resolvers ++= Seq(
Resolver.mavenLocal,
"Scala-Tools Maven2 Repository" at "http://scala-tools.org/repo-releases",
"Java.net repository" at "http://download.java.net/maven/2",
"GeoTools" at "http://download.osgeo.org/webdav/geotools",
"Apache" at "https://repository.apache.org/service/local/repositories/releases/content",
"Cloudera" at "https://repository.cloudera.com/artifactory/cloudera-repos/",
"OpenGeo Maven Repository" at "http://repo.opengeo.org",
"Typesafe" at "https://repo.typesafe.com/typesafe/releases/",
"Spray Repository" at "http://repo.spray.io"
)
//Library versions
val geotools_version = "13.2"
val accumulo_version = "1.6.0-cdh5.1.4"
val hadoop_version = "2.6.0-cdh5.4.5"
val hadoop_client_version = "2.6.0-mr1-cdh5.4.5"
val geowave_version = "0.9.0-SNAPSHOT"
val akka_version = "2.4.0"
val spray_version = "1.3.3"
val spark_version = "1.5.0"
//Library Dependencies
libraryDependencies ++= Seq(
"org.scala-lang" % "scala-library" % scalaVersion.value,
"org.scala-lang" % "scala-reflect" % scalaVersion.value,
"org.geotools" % "gt-data" % geotools_version,
"org.geotools" % "gt-geojson" % geotools_version,
"org.apache.accumulo" % "accumulo-core" % accumulo_version
exclude(org = "javax.servlet", name = "servlet-api")
exclude(org = "javax.servlet", name = "jsp-api"),
"org.apache.hadoop" % "hadoop-common" % hadoop_version
exclude(org = "javax.servlet", name = "servlet-api")
exclude(org = "javax.servlet", name = "jsp-api"),
"org.apache.hadoop" % "hadoop-client" % hadoop_client_version
exclude(org = "javax.servlet", name = "servlet-api")
exclude(org = "javax.servlet", name = "jsp-api"),
"mil.nga.giat" % "geowave-core-store" % geowave_version,
"mil.nga.giat" % "geowave-datastore-accumulo" % geowave_version,
"mil.nga.giat" % "geowave-adapter-vector" % geowave_version,
"com.typesafe" % "config" % "1.3.0",
"com.typesafe.akka" %% "akka-actor" % akka_version,
"io.spray" %% "spray-can" % spray_version,
"io.spray" %% "spray-routing" % spray_version,
"io.spray" %% "spray-testkit" % spray_version % "test",
"org.apache.spark" %% "spark-core" % spark_version,
"org.scalatest" % "scalatest_2.11" % "2.2.4" % "test"
)
test in assembly := {}
I have tried to exclude anything that has javax.servlet in its name, but I must be doing something wrong because it does not work.
Thank you for your time
Adding an exclusion rule on org.mortbay.jetty worked. Something like:
libraryDependencies ++= Seq(...).map(
_.excludeAll(ExclusionRule(organization = "org.mortbay.jetty"))
)
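Applied to the build above, it is just a matter of mapping the existing dependency Seq; a sketch (the per-dependency servlet-api/jsp-api excludes can stay as they are):

libraryDependencies ++= Seq(
  "org.apache.accumulo" % "accumulo-core" % accumulo_version,
  "org.apache.hadoop" % "hadoop-common" % hadoop_version,
  "org.apache.spark" %% "spark-core" % spark_version
  // ... rest of the dependencies from the question
).map(_.excludeAll(ExclusionRule(organization = "org.mortbay.jetty")))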

Connecting to Mysql using Slick 3.0 - No username, no password and bogus driver does not equal error

I'm writing a very simple Scala script to connect to MySQL using Slick 3.
My build.sbt looks like this:
name := "slick_sandbox"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
"com.typesafe.slick" %% "slick" % "3.0.3",
"org.slf4j" % "slf4j-nop" % "1.6.4",
"mysql" % "mysql-connector-java" % "5.1.6"
)
application.conf:
Drivder is an intentional mistake; also, I did not provide a db username or password!
mysqldb = {
url = "jdbc:mysql://localhost/slickdb"
driver = com.mysql.jdbc.Drivder
connectionPool = disabled
keepAliveConnection = true
}
Main.scala
import slick.driver.MySQLDriver.api._
import scala.concurrent.ExecutionContext.Implicits.global
object Main {
def main(args: Array[String]) {
// test to see this function is being run; it IS
println("foobar")
// I expected an error here due to the intentional
// mistake I've inserted into application.conf
// I made sure the conf file is getting read; if I change mysqldb
// to some other string, I get correctly warned it is not a
// valid key
val db = Database.forConfig("mysqldb")
val q = sql"select u.name from users ".as[String]
db.run(q).map{ res=>
println(res)
}
}
}
It compiles OK. Now this is the result I see when I run sbt run on the terminal:
felipe#felipe-XPS-8300:~/slick_sandbox$ sbt run
[info] Loading project definition from /home/felipe/slick_sandbox/project
[info] Set current project to slick_sandbox (in build file:/home/felipe/slick_sandbox/)
[info] Compiling 1 Scala source to /home/felipe/slick_sandbox/target/scala-2.11/classes...
[info] Running Main
foobar
[success] Total time: 5 s, completed Sep 17, 2015 3:29:39 AM
Everything looks deceptively OK; even though I explicitly ran the query on a database that doesn't exist, Slick went ahead as if nothing had happened.
What am I missing here?
Slick runs queries asynchronously, so your program simply didn't give it enough time to execute. In your case you have to wait for the result:
// Await and Duration are needed in addition to the imports already in Main.scala
import scala.concurrent.Await
import scala.concurrent.duration.Duration

object Main {
  def main(args: Array[String]) {
    println("foobar")
    val db = Database.forConfig("mysqldb")
    val q = sql"select u.name from users ".as[String]
    Await.result(
      db.run(q).map { res =>
        println(res)
      }, Duration.Inf)
  }
}