NoSuchMethodError: org.apache.spark.sql.SQLContext.sql - exception

I am writing a Spark program that uses the Elasticsearch libraries.
Here is my build.sbt:
scalaVersion := "2.10.5"
val sparkVersion = "2.0.1"
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-core" % sparkVersion % "provided",
"org.apache.spark" %% "spark-sql" % sparkVersion % "provided",
"org.apache.spark" %% "spark-catalyst" % sparkVersion % "provided",
"org.apache.spark" %% "spark-hive" % sparkVersion % "provided"
)
libraryDependencies += "org.elasticsearch" % "elasticsearch-spark_2.10" % "2.3.3"
libraryDependencies += "org.elasticsearch" % "elasticsearch" % "2.3.3"
Here is the error message:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.SQLContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/Dataset;
at com.minsu.house.BatchProgram.process(BatchProgram.scala:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:674)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
And my code is as follows:
val sqlContext = new SQLContext(sparkContext)
val dataframe = sqlContext.sql(sqlString) // <----- HERE !!!
I do not think it has anything to do with the Elasticsearch library; it simply seems to be a dependency or version problem.
What should I do? Thank you.

The Elasticsearch Spark connector version that you are trying to use does not support Spark 2. You have two options here:
Use Spark 1.6.x since Elasticsearch 2.x does not support Spark 2
Upgrade the Elasticsearch Spark connector and Elasticsearch itself to 5.x
For instance, I used org.elasticsearch:elasticsearch-spark-20_2.11:5.0 and the following Spark 2 code:
// add to your class imports
import org.elasticsearch.spark.sql._
// Use Spark 2.0 SparkSession object to provide your config
val sparkSession = SparkSession.builder().config(...).getOrCreate()
// Optional step, imports things like $"column"
import sparkSession.implicits._
// Specify your index and type in ES
val df = sparkSession.esDF("index/type")
// Perform an action
df.count()
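To follow option 2, the build.sbt needs the Scala version, Spark version, and connector artifact to agree. A sketch under those assumptions (the exact 5.x connector version to pin is an assumption; check the latest release):

```scala
scalaVersion := "2.11.8"

val sparkVersion = "2.0.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"  % sparkVersion % "provided"
)

// The "-20" in the artifact name marks the Spark 2.0 build of the connector,
// and the "_2.11" suffix must match scalaVersion above.
libraryDependencies += "org.elasticsearch" % "elasticsearch-spark-20_2.11" % "5.0.0"
```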

Related

Getting error while creating schema in mysql using slick

Error:
value <> is not a member of (slick.lifted.Rep[String], slick.lifted.Rep[String], slick.lifted.Rep[String])
def * : ProvenShape[Employee] = (id, name, dept) <> (Employee.tupled, Employee.unapply)
My build.sbt looks like this:
name := "AkkaHttpDemo"
version := "0.1"
scalaVersion := "2.13.3"
lazy val root = project in file(".")
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.6.8" ,
"com.typesafe.akka" %% "akka-stream" % "2.6.8",
"com.typesafe.akka" %% "akka-http" % "10.2.0",
"com.typesafe.slick" %% "slick" % "3.3.2",
"com.typesafe.slick" %% "slick-hikaricp" % "3.3.2",
"org.slf4j" % "slf4j-nop" % "1.6.6" % Test,
"mysql" % "mysql-connector-java" % "8.0.21"
)
The table schema looks like this:
package services
import akka.stream.scaladsl._
import slick.jdbc.MySQLProfile._
import slick.ast.ScalaBaseType.stringType
import slick.lifted._
import slick.util._
trait EmployeeStore {
class EmployeeStoreImpl(tag: Tag) extends Table[Employee](tag, _schemaName = Some("MYSCHEMA"),"EMPL") {
def id= column[String]("empid")
def name = column[String]("name")
def dept = column[String]("dept")
def * : ProvenShape[Employee] = (id, name, dept) <> (Employee.tupled, Employee.unapply)
}
val dict = TableQuery[EmployeeStoreImpl]
}
Note: when I tried the code below, I also got an error:
dict.schema.create
Cannot resolve symbol schema
You need to import the Slick API:
import slick.jdbc.MySQLProfile.api._
You can also replace <> with the more convenient:
def * = (id, name, dept).mapTo[Employee]
You likely don't need the slick.ast, slick.lifted, or slick.util imports, but I appreciate you including the full list of imports in your example.
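Putting the answer together, here is a minimal sketch of a compiling table definition. The Employee case class shown is an assumption (the original question does not include it); the single api._ import supplies Table, Tag, column, ProvenShape, and TableQuery:

```scala
package services

// The profile's api._ import brings Table, Tag, column, TableQuery,
// the <> operator, and mapTo into scope.
import slick.jdbc.MySQLProfile.api._

// Assumed shape of the case class; not shown in the original question.
case class Employee(id: String, name: String, dept: String)

trait EmployeeStore {
  class EmployeeStoreImpl(tag: Tag)
      extends Table[Employee](tag, _schemaName = Some("MYSCHEMA"), _tableName = "EMPL") {
    def id   = column[String]("empid")
    def name = column[String]("name")
    def dept = column[String]("dept")
    // mapTo generates the tupled/unapply mapping for you.
    def * = (id, name, dept).mapTo[Employee]
  }
  val dict = TableQuery[EmployeeStoreImpl]
}
```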

PlayFramework Slick NoSuchMethodError

I'm having a strange problem while using the new Slick version with MySQL:
[RuntimeException: java.lang.NoSuchMethodError: slick.driver.JdbcProfile$API.streamableQueryActionExtensionMethods(Lslick/lifted/Query;)Lslick/profile/BasicActionComp$$$$6aa48549c0a7603df1fa229cf7177493$$$$sionMethodsImpl;]
in my application.conf:
slick.dbs.default.driver = "slick.driver.MySQLDriver$"
slick.dbs.default.db.driver = "com.mysql.jdbc.Driver"
slick.dbs.default.db.url = "jdbc:mysql://localhost/test?useUnicode=true&useJDBCCompliantTimezoneShift=true&useLegacyDatetimeCode=false&serverTimezone=UTC&useSSL=false"
slick.dbs.default.db.user = "root"
slick.dbs.default.db.password = ""
And the code throwing the exception:
Await.result(db.run(table.result), Duration.Inf)
Evolutions did a good job, the tables were created, etc. But here I get this nasty error ;/
My build.sbt:
name := """bettor"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.8"
val utilsDeps = Seq("joda-time" % "joda-time" % "2.9.4",
"com.github.tototoshi" %% "slick-joda-mapper" % "2.2.0",
"org.joda" % "joda-convert" % "1.8.1")
val dbsDeps = Seq("com.typesafe.play" %% "play-slick" % "2.0.0",
"com.typesafe.play" %% "play-slick-evolutions" % "2.0.0",
"mysql" % "mysql-connector-java" % "6.0.2")
val jsonDeps = Seq("org.json4s" %% "json4s-jackson" % "3.4.0",
"org.jsoup" % "jsoup" % "1.9.2")
libraryDependencies ++= Seq(
cache,
ws,
"org.scalatestplus.play" %% "scalatestplus-play" % "1.5.1" % Test
) ++ utilsDeps ++ dbsDeps ++ jsonDeps
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
Any ideas how to solve this issue? Even using this template without changing anything, I get the same error :(
play-slick-mysql
In my case, I had used the Seq("-Xmax-classfile-name","78") option suggested at https://stackoverflow.com/a/32862972/1432640 without reading the comments (this case was mentioned there), and was struggling with the error for 4+ hours. The nightmare is over! Smeagol is free!
Heh, adding this to build.sbt:
scalacOptions := Seq("-feature")
resolved the problem.
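For what it's worth, := replaces the entire scalacOptions sequence, which is presumably why this change also cleared a problematic flag such as -Xmax-classfile-name. A sketch of a more targeted alternative, which keeps any other options you may rely on:

```scala
// ':=' overwrites all previously set scalac options.
scalacOptions := Seq("-feature")

// Alternative: keep existing options but filter out only the suspect flag.
scalacOptions ~= (_.filterNot(_.startsWith("-Xmax-classfile-name")))
```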

Unresolved symbol column - Scala Slick

I'm currently trying to create a table using Slick, and I'm baffled as to what import I'm missing, as the examples I've seen don't seem to have a relevant-looking import in them.
Currently, column, the question mark, and the O's are all unresolved.
Could someone let me know what I'm doing wrong please?
Here is my table class:
package com.grimey.tabledefinitions
import slick.driver.MySQLDriver.api._
import com.grimey.staticpage.StaticPage
import slick.lifted.Tag
import slick.model.Table
class StaticPageDef(tag: Tag) extends Table[StaticPage](tag, "static_page") {
def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
def pageType = column[String]("page_type")
def contentHtml = column[String]("content_html")
def * = (id.?, pageType, contentHtml) <>(StaticPage, StaticPage.unapply _)
}
And here is my build.sbt:
name := """grimey-cms"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
"mysql" % "mysql-connector-java" % "5.1.38",
"com.typesafe.play" %% "play-slick" % "2.0.0",
"com.typesafe.play" %% "play-slick-evolutions" % "2.0.0"
)
resolvers += "scalaz-bintray" at "http://dl.bintray.com/scalaz/releases"
fork in run := true
And finally, here is the case class I'm using for the table:
package com.grimey.staticpage
import java.time.LocalDateTime
case class StaticPage(id: Long, htmlContent: String, pageType: String,
created: LocalDateTime, updated: LocalDateTime)
I bet it's something really silly :)
The O object is inherited from the table class, and it varies from driver to driver. Some drivers may not support certain column options supported by others. Therefore, you will need to import the column options that are specific to your database, MySQL in this case:
import slick.driver.MySQLDriver.api._
You may check this full tutorial on how to use Play + Slick + MySQL: http://pedrorijo.com/blog/play-slick/
Or you may just navigate through the code: https://github.com/pedrorijo91/play-slick3-steps

Cannot write an instance of play.api.libs.json.JsLookupResult to HTTP response. Try to define a Writable[play.api.libs.json.JsLookupResult]

I am upgrading a Play Framework server from 2.2.2 to 2.4.4. My application works fine with the old version, but after upgrading it gives various errors like:
Cannot write an instance of play.api.libs.json.JsLookupResult to HTTP response. Try to define a Writable[play.api.libs.json.JsLookupResult]
Ok(dao(0))
^
build.sbt:
name := """TestApp"""
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
//scalaVersion := "2.11.6"
//scalaVersion := "2.10.2"
libraryDependencies ++= Seq(
javaJdbc,
cache,
javaWs,
jdbc,
cache,
"com.typesafe.play" %% "anorm" % "2.4.0",
"com.google.inject" % "guice" % "4.0",
"javax.inject" % "javax.inject" % "1",
"org.reactivemongo" %% "reactivemongo" % "0.10.0",
"org.reactivemongo" %% "play2-reactivemongo" % "0.10.2",
"org.mockito" % "mockito-core" % "1.9.5" % "test",
"org.webjars" % "requirejs" % "2.1.1",
"org.postgresql" % "postgresql" % "9.4-1200-jdbc41",
"postgresql" % "postgresql" % "9.1-901.jdbc4",
"com.typesafe.play" % "play-iteratees_2.10" % "2.2.3",
"com.typesafe.slick" % "slick_2.10" % "2.1.0",
"com.typesafe.play" % "play-jdbc_2.10" % "2.4.4",
"org.apache.flume" % "flume-ng-core" % "1.5.2",
"org.apache.flume" % "flume-ng-sdk" % "1.5.2",
"org.scala-lang" % "scala-compiler" % "2.11.6"
)
app.scala:
package controllers
import play.modules.reactivemongo.MongoController
import play.modules.reactivemongo.json.collection.JSONCollection
import scala.concurrent.Future
import reactivemongo.api.Cursor
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import javax.inject.Singleton
import play.api.mvc._
import play.api.libs.json._
@Singleton
class Application extends Controller with MongoController {
def collection: JSONCollection = db.collection[JSONCollection]("Test")
import models._
import models.JsonFormats._
import reactivemongo.bson._
def index = Action.async {
val cursor: Cursor[JsValue] = collection.
find(Json.obj()).
cursor[JsValue]
val futureUsersList: Future[List[JsValue]] = cursor.collect[List]()
val futurePersonsJsonArray: Future[JsArray] = futureUsersList.map { dao =>
Json.arr(dao)
}
futurePersonsJsonArray.map {
dao =>
Ok(dao(0))
}
}
}
Please let me know where I am going wrong, whether it is in my controller class or in build.sbt. Thanks in advance.
Indexing a JsArray does not return the JsValue at that index but a JsLookupResult, which is either JsDefined or JsUndefined (if, e.g., the index is out of bounds). I think this changed in Play 2.3.
While you can write a JsValue directly to an HTTP response via the default JSON writeable provided by Play, there is no such writeable for a JsLookupResult. You can fix it by pattern matching on the JsLookupResult to extract the value, and handle its absence:
futurePersonsJsonArray.map { dao =>
dao(0) match {
case JsDefined(js) => Ok(js)
case _ => NotFound
}
}
or use toOption to get an Option[JsValue]:
futurePersonsJsonArray.map { dao =>
dao(0).toOption.map(js => Ok(js)).getOrElse(NotFound)
}
However, a better way to fix the code would be to use Json.toJson() directly on the redeemed result of futureUsersList, rather than putting it in a JSON array and then immediately taking it out again:
def index = Action.async {
val cursor: Cursor[JsValue] = collection.
find(Json.obj()).
cursor[JsValue]
val futureUsersList: Future[List[JsValue]] = cursor.collect[List]()
futureUsersList.map { list =>
Ok(Json.toJson(list))
}
}
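The lookup behaviour is analogous to safe indexing on ordinary Scala collections, where Seq.lift returns an Option instead of throwing. A plain-Scala sketch (not Play code) of the same pattern:

```scala
// apply() on a Seq throws on a bad index; lift returns an Option instead,
// much like JsArray indexing returns a JsLookupResult rather than a JsValue.
val xs = List("a", "b")

val present: Option[String] = xs.lift(0)  // Some("a")
val absent: Option[String]  = xs.lift(5)  // None

// Fold the Option into a response, mirroring the JsDefined / JsUndefined
// handling shown above.
def respond(lookup: Option[String]): String =
  lookup.map(v => s"Ok($v)").getOrElse("NotFound")
```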

sending parameters as JSON in POST body using spray

OK, this is really bugging me. I have implemented a REST API using Spray, Scala, and Akka. I have successfully implemented Spray GET and POST routes to query a MySQL database and give JSON output. I am able to send the parameters within the POST body, but I am not able to send the same as JSON. It looks very simple, but it's just not working. I am getting the error "empty reply from server". Please help.
Here is my sample curl command:
curl -H "Content-Type: application/json" -X POST -d '{ "name": "Jane", "favNumber" : 42 }' http://localhost:8080/jsonTesting
I am trying this small sample code to see if it is possible to unmarshal JSON parameters and get the required output. If it works, I will implement the same in my actual code.
import spray.json.DefaultJsonProtocol
import spray.json._
import spray.httpx.SprayJsonSupport
object MyJsonProtocol extends DefaultJsonProtocol with SprayJsonSupport {
implicit val Format = jsonFormat2(Person)
case class Person(name:String, favNumber:Int)
}
//Spray main class
import akka.actor.ActorSystem
import spray.routing.SimpleRoutingApp
import spray.json.DefaultJsonProtocol
import spray.json._
import spray.http.MediaTypes._
import MyJsonProtocol._
import spray.routing._
object SprayTest extends App with SimpleRoutingApp {
implicit val system = ActorSystem()
def testRoute(route: Route): Route = {
post {
respondWithMediaType(`application/json`){
route
}
}
}
startServer(interface = "localhost", port = 8080) {
testRoute {
path("departed") {
parameter('count.as[Boolean]) {count =>
complete(if (count == true) "even ball" else "odd ball")
}
}
} ~
path("jsonTesting") {
post{
entity(as[Person]) { person =>
complete(s"Person: ${person.name} - favoritenumber:${person.favNumber}")
}
}
}
}
}
Here is the error:
Uncaught error from thread [default-akka.actor.default-dispatcher-2] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[default]
java.lang.NoSuchMethodError: spray.json.JsonParser$.apply(Ljava/lang/String;)Lspray/json/JsValue;
at spray.httpx.SprayJsonSupport$$anonfun$sprayJsonUnmarshaller$1.applyOrElse(SprayJsonSupport.scala:36)
at spray.httpx.SprayJsonSupport$$anonfun$sprayJsonUnmarshaller$1.applyOrElse(SprayJsonSupport.scala:34)
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
at spray.httpx.unmarshalling.Unmarshaller$$anon$1$$anonfun$unmarshal$1.apply(Unmarshaller.scala:29)
at spray.httpx.unmarshalling.SimpleUnmarshaller.protect(SimpleUnmarshaller.scala:40)
at spray.httpx.unmarshalling.Unmarshaller$$anon$1.unmarshal(Unmarshaller.scala:29)
at spray.httpx.unmarshalling.SimpleUnmarshaller.apply(SimpleUnmarshaller.scala:29)
at spray.httpx.unmarshalling.SimpleUnmarshaller.apply(SimpleUnmarshaller.scala:23)
at spray.httpx.unmarshalling.UnmarshallerLifting$$anon$3.apply(UnmarshallerLifting.scala:35)
SBT dependencies:
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % "2.3.6",
"com.typesafe.akka" %% "akka-http-experimental" % "0.7",
"io.spray" %% "spray-json" % "1.3.2",
"io.spray" %% "spray-routing" % sprayVersion,
"io.spray" %% "spray-client" % sprayVersion,
"io.spray" %% "spray-testkit" % sprayVersion % "test",
"mysql" % "mysql-connector-java" % "5.1.25",
"com.typesafe.slick" %% "slick" % "3.0.0",
"com.typesafe.scala-logging" %% "scala-logging-slf4j" % "2.1.2"
)