How do I deal with kafka-avro-console-producer failures? - kafka-producer-api

While following the Confluent Platform quickstart, I got stuck at step 5. An error occurred:
[root@CentOS-70 confluent-3.1.2]# ./bin/kafka-avro-console-producer --broker-list localhost:9092 --topic mytopic --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/wei_liu/confluent-3.1.2/share/java/kafka-serde-tools/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/wei_liu/confluent-3.1.2/share/java/schema-registry/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
{"f1": "value1"}
org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: {"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Register schema operation failed while writing to the Kafka store; error code: 50001
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:170)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:187)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:238)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:230)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:225)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:59)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:91)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:72)
at io.confluent.kafka.formatter.AvroMessageReader.readMessage(AvroMessageReader.java:158)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:55)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
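For context, error code 50001 is the Schema Registry's generic "error in the backend datastore" response: the registry itself was reachable, but it failed while writing the new schema to its storage topic in Kafka, so the registry's and broker's own logs are the place to look next. A minimal reachability probe, assuming the quickstart's default registry address of localhost:8081 (adjust if yours differs), might look like:

```shell
# Probe the Schema Registry REST API (quickstart default port 8081 assumed).
# A healthy registry answers /subjects with a JSON array of registered subjects.
REGISTRY_URL="http://localhost:8081"
if curl -s --max-time 5 "$REGISTRY_URL/subjects" > /dev/null 2>&1; then
  STATUS="reachable"
else
  STATUS="not reachable"
fi
echo "schema registry is $STATUS"
```

If the registry responds but registration still fails with 50001, the registry's log usually names the underlying Kafka store error.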

Related

Scala SBT project error while executing run

I'm building a Scala SBT project using MySQL and Quill in IntelliJ IDEA Community.
When I run my project with the sbt run command in the terminal, it gives the following error messages:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by io.netty.util.internal.PlatformDependent0$1 (file:/tmp/sbt_ca022510/target/fc3580dd/177fa64e/netty-all-4.1.6.Final.jar) to field java.nio.Buffer.address
WARNING: Please consider reporting this to the maintainers of io.netty.util.internal.PlatformDependent0$1
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
But I still receive the execution result. (For example, when I run a method that inserts data into a table, the data is inserted even while the error is displayed.)
I searched Google for this error and "org.slf4j.impl.StaticLoggerBinder", but couldn't find anything about it. All I want to know is what this error message means and whether it will impact my results in any way.
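As the output itself says, the "Defaulting to no-operation (NOP) logger implementation" line means no SLF4J backend is on the classpath, so log output is silently discarded; program behavior and results are unaffected. A minimal sketch of wiring in a backend via build.sbt (the backend choice and version number here are illustrative, not taken from the question):

```scala
// build.sbt -- add one SLF4J backend so log output is no longer discarded
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
```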

com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'RECEPTORS.r_name' in 'field list'

I am using the Play 2.3.8 framework to create an API and access MariaDB. When I run the query on the MariaDB console, it works fine, but when I run it from Play I get an error saying the field RECEPTORS.r_name is not available, which is not true.
My code is:
package models.dao

import anorm._
import models.Profile
import play.api.db.DB
import play.api.Play.current

object ProfileDAO {
  def index(r_name: String): List[Profile] = {
    DB.withConnection { implicit c =>
      val results = SQL(
        """
          | SELECT `RECEPTORS.r_name`,`RECEPTORS.pdbCode`, `LIGANDS.l_id`, `LIGANDS.l_score`
          | FROM `RECEPTORS`
          | INNER JOIN `LIGANDS`
          | WHERE `RECEPTORS.r_name`={r_name};
        """.stripMargin).on(
        "r_name" -> r_name
      ).apply()
      results.map { row =>
        Profile(row[String]("r_name"), row[String]("pdbCode"), row[String]("l_id"), row[Double]("l_score"))
      }.force.toList
    }
  }
}
The query that I ran on the MariaDB console is:
SELECT RECEPTORS.r_name, pdbCode, l_id, l_score FROM RECEPTORS INNER JOIN LIGANDS WHERE RECEPTORS.r_name="receptor";
The error when running with Play 2.3.8 is as follows:
laeeq@optiplex:~/Desktop/Backup/Project5/cpvsAPI$ sbt -jvm-debug 9999 run
Listening for transport dt_socket at address: 9999
[info] Loading project definition from /home/laeeq/Desktop/Backup/Project5/cpvsAPI/project
[info] Set current project to cpvsAPI (in build file:/home/laeeq/Desktop/Backup/Project5/cpvsAPI/)
[info] Updating {file:/home/laeeq/Desktop/Backup/Project5/cpvsAPI/}root...
[info] Resolving jline#jline;2.11 ...
[info] Done updating.
--- (Running the application, auto-reloading is enabled) ---
[info] play - Listening for HTTP on /0:0:0:0:0:0:0:0:9000
(Server started, use Ctrl+D to stop and go back to the console...)
SLF4J: The following set of substitute loggers may have been accessed during the initialization phase. Logging calls during this phase were not honored. However, subsequent logging calls to these loggers will work as normally expected. See also http://www.slf4j.org/codes.html#substituteLogger
SLF4J: org.webjars.WebJarExtractor
[info] Compiling 1 Scala source to /home/laeeq/Desktop/Backup/Project5/cpvsAPI/target/scala-2.11/classes...
[info] play - database [default] connected at jdbc:mysql://localhost:3306/db_profile
[info] play - Application started (Dev)
[error] application -
! @766oc7b8l - Internal server error, for (GET) [/profiles/receptor] ->
play.api.Application$$anon$1: Execution exception[[MySQLSyntaxErrorException: Unknown column 'RECEPTORS.r_name' in 'field list']]
at play.api.Application$class.handleError(Application.scala:296) ~[play_2.11-2.3.8.jar:2.3.8]
at play.api.DefaultApplication.handleError(Application.scala:402) [play_2.11-2.3.8.jar:2.3.8]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$14$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:205) [play_2.11-2.3.8.jar:2.3.8]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$14$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:202) [play_2.11-2.3.8.jar:2.3.8]
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) [scala-library-2.11.1.jar:na]
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'RECEPTORS.r_name' in 'field list'
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_151]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_151]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_151]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_151]
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411) ~[mysql-connector-java-5.1.18.jar:na]
You need to quote the table and column individually, so use:
`RECEPTORS`.`r_name`
Otherwise MySQL thinks you are referencing a single column literally named RECEPTORS.r_name in some implicit table.
You need to do this for all your quoted column references. That said, quoting seems unnecessary in this case, so you could also just write RECEPTORS.r_name without backticks.
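Applied to the query in the question (table and column names taken from the question; the {r_name} placeholder is Anorm's), the corrected statement would look like this:

```sql
SELECT `RECEPTORS`.`r_name`, `RECEPTORS`.`pdbCode`, `LIGANDS`.`l_id`, `LIGANDS`.`l_score`
FROM `RECEPTORS`
INNER JOIN `LIGANDS`
WHERE `RECEPTORS`.`r_name` = {r_name};
```

Equivalently, since none of these identifiers needs quoting, dropping the backticks entirely (as in the console query that worked) is also fine.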

Handlebars-proto and logstash-logback-encoder slf4j conflict

My project uses handlebars-proto to bind templates to JSON. I am also trying to use logstash-logback-encoder to log in JSON form for Logstash. Below are my compile dependencies (along with other dependencies):
compile 'net.logstash.logback:logstash-logback-encoder:4.6'
compile 'com.github.jknack:handlebars-proto:4.0.5'
If I remove the handlebars dependency, logging works fine. If handlebars is present, I get the warning below:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/C:/Users/manish/.gradle/caches/modules-2/files-2.1/com.github.jknack/handlebars-proto/4.0.5/5979737344d99e0d8b482e828f247ae86fd0113/handlebars-proto-4.0.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/C:/Users/manish/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-classic/1.1.6/665e3de72f19ec66ac67d82612d7b8e6b3de3cd0/logback-classic-1.1.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
followed by this error:
Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
java.lang.NoSuchMethodError: ch.qos.logback.core.spi.ContextAwareBase.<init>(Lch/qos/logback/core/spi/ContextAware;)V
at net.logstash.logback.composite.CompositeJsonFormatter.<init>(CompositeJsonFormatter.java:106)
at net.logstash.logback.composite.loggingevent.LoggingEventCompositeJsonFormatter.<init>(LoggingEventCompositeJsonFormatter.java:28)
at net.logstash.logback.LogstashFormatter.<init>(LogstashFormatter.java:122)
at net.logstash.logback.LogstashFormatter.<init>(LogstashFormatter.java:118)
at net.logstash.logback.LogstashFormatter.<init>(LogstashFormatter.java:114)
at net.logstash.logback.encoder.LogstashEncoder.createFormatter(LogstashEncoder.java:31)
at net.logstash.logback.encoder.CompositeJsonEncoder.<init>(CompositeJsonEncoder.java:48)
at net.logstash.logback.encoder.LoggingEventCompositeJsonEncoder.<init>(LoggingEventCompositeJsonEncoder.java:23)
at net.logstash.logback.encoder.LogstashEncoder.<init>(LogstashEncoder.java:27)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at java.lang.Class.newInstance(Class.java:379)
at ch.qos.logback.core.joran.action.NestedComplexPropertyIA.begin(NestedComplexPropertyIA.java:122)
at ch.qos.logback.core.joran.spi.Interpreter.callBeginAction(Interpreter.java:276)
at ch.qos.logback.core.joran.spi.Interpreter.startElement(Interpreter.java:148)
at ch.qos.logback.core.joran.spi.Interpreter.startElement(Interpreter.java:130)
at ch.qos.logback.core.joran.spi.EventPlayer.play(EventPlayer.java:50)
I tried excluding slf4j as a transitive dependency from handlebars, but it did not help:
compile('com.github.jknack:handlebars-proto:4.0.5') {
    exclude module: 'slf4j-api'
}
Thanks in advance for help. Let me know if more details are required.
The error message is because the handlebars-proto jar contains an implementation of org.slf4j.impl.StaticLoggerBinder, which means that it has a logging implementation. You are also trying to add your own logging implementation of Logback, and since SLF4J can't use two logging systems at once, it's complaining.
I'm not familiar with the handlebars project, but I'm guessing that the library you have there is a full application that includes a logging system, and not really designed to be used as a dependency by another project. Perhaps there's a different version available that is just the library, and that doesn't try to include its own logging implementation?
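If that guess is right, one option worth trying (assuming the plain handlebars library artifact covers your use case; the version here mirrors the one in the question) is depending on it instead of the all-in-one jar. Note that a Gradle exclude cannot help here, because the binding class is packaged inside the handlebars-proto jar itself rather than pulled in as a separate transitive dependency:

```groovy
// Hypothetical swap: use the plain library artifact, which should not bundle
// its own org.slf4j.impl.StaticLoggerBinder, instead of handlebars-proto.
compile 'com.github.jknack:handlebars:4.0.5'
compile 'net.logstash.logback:logstash-logback-encoder:4.6'
```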

Newbie in Apache Drill: Cannot see Web Console

I downloaded apache-drill-1.2.0 on an Ubuntu 14.04 64-bit box.
I extracted the tar.gz contents, went to the bin folder, and ran Drill.
Then I tried to open http://localhost:8047, but I get a "can't establish a connection to the server" error.
I tried enabling HTTPS with http.ssl_enabled: "TRUE", but I still cannot open the Web Console over either HTTP or HTTPS.
Here are some relevant logs:
himanshu@himanshu-HP-ProBook-4430s:~/Downloads/softwares/apache-drill-1.2.0/bin$ ./drill-embedded
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/local/hadoop/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/classb/logback-classic-1.0.13.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/11/13 15:38:56 INFO config.DrillConfig: Loading base configuration file at jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-default.conf.
15/11/13 15:38:56 INFO config.DrillConfig: Loading 7 module configuration files at:
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hbase-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-hive-exec-shaded-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-jdbc-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-java-exec-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-mongo-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hive-core-1.2.0.jar!/drill-module.conf.
15/11/13 15:38:56 INFO config.DrillConfig: Loading override config. file at file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/conf/drill-override.conf.
15/11/13 15:38:56 INFO config.DrillConfig: Loading override Properties parameter {user=, password=, zk=local}.
15/11/13 15:39:00 INFO reflections.Reflections: Reflections took 4050 ms to scan 7 urls, producing 5602 keys and 21840 values
com.google.common.base.Stopwatch.elapsed(Ljava/util/concurrent/TimeUnit;)J
apache drill 1.2.0
"got drill?"
0: jdbc:drill:zk=local> !quit
15/11/13 15:39:02 INFO config.DrillConfig: Loading base configuration file at jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-default.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading 7 module configuration files at:
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hbase-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-hive-exec-shaded-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-jdbc-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-java-exec-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-mongo-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hive-core-1.2.0.jar!/drill-module.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading override config. file at file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/conf/drill-override.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading override Properties parameter {user=, password=, zk=local}.
java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.elapsed(Ljava/util/concurrent/TimeUnit;)J
at org.apache.drill.common.util.PathScanner.scanForImplementations(PathScanner.java:110)
at org.apache.drill.common.util.PathScanner.scanForImplementationsArr(PathScanner.java:86)
at org.apache.drill.common.logical.data.LogicalOperatorBase.getSubTypes(LogicalOperatorBase.java:92)
at org.apache.drill.common.config.DrillConfig.<init>(DrillConfig.java:82)
at org.apache.drill.common.config.DrillConfig.create(DrillConfig.java:226)
at org.apache.drill.common.config.DrillConfig.create(DrillConfig.java:152)
at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init>(DrillConnectionImpl.java:92)
at org.apache.drill.jdbc.impl.DrillJdbc41Factory.newDrillConnection(DrillJdbc41Factory.java:66)
at org.apache.drill.jdbc.impl.DrillFactory.newConnection(DrillFactory.java:69)
at net.hydromatic.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:126)
at org.apache.drill.jdbc.Driver.connect(Driver.java:72)
at sqlline.DatabaseConnection.connect(DatabaseConnection.java:167)
at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:213)
at sqlline.Commands.close(Commands.java:925)
at sqlline.Commands.quit(Commands.java:889)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)
at sqlline.SqlLine.dispatch(SqlLine.java:742)
at sqlline.SqlLine.begin(SqlLine.java:621)
at sqlline.SqlLine.start(SqlLine.java:375)
at sqlline.SqlLine.main(SqlLine.java:268)
15/11/13 15:39:02 INFO config.DrillConfig: Loading base configuration file at jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-default.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading 7 module configuration files at:
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hbase-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-hive-exec-shaded-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-common-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-jdbc-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-java-exec-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-mongo-storage-1.2.0.jar!/drill-module.conf
- jar:file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/jars/drill-storage-hive-core-1.2.0.jar!/drill-module.conf.
15/11/13 15:39:02 INFO config.DrillConfig: Loading override config. file at file:/home/himanshu/Downloads/softwares/apache-drill-1.2.0/conf/drill-override.conf.
15/11/13 15:39:03 INFO config.DrillConfig: Loading override Properties parameter {user=, password=, zk=local}.
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Stopwatch.elapsed(Ljava/util/concurrent/TimeUnit;)J
at org.apache.drill.common.util.PathScanner.scanForImplementations(PathScanner.java:110)
at org.apache.drill.common.util.PathScanner.scanForImplementationsArr(PathScanner.java:86)
at org.apache.drill.common.logical.data.LogicalOperatorBase.getSubTypes(LogicalOperatorBase.java:92)
at org.apache.drill.common.config.DrillConfig.<init>(DrillConfig.java:82)
at org.apache.drill.common.config.DrillConfig.create(DrillConfig.java:226)
at org.apache.drill.common.config.DrillConfig.create(DrillConfig.java:152)
at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init>(DrillConnectionImpl.java:92)
at org.apache.drill.jdbc.impl.DrillJdbc41Factory.newDrillConnection(DrillJdbc41Factory.java:66)
at org.apache.drill.jdbc.impl.DrillFactory.newConnection(DrillFactory.java:69)
at net.hydromatic.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:126)
at org.apache.drill.jdbc.Driver.connect(Driver.java:72)
at sqlline.DatabaseConnection.connect(DatabaseConnection.java:167)
at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:213)
at sqlline.Commands.close(Commands.java:925)
at sqlline.Commands.closeall(Commands.java:899)
at sqlline.SqlLine.begin(SqlLine.java:649)
at sqlline.SqlLine.start(SqlLine.java:375)
at sqlline.SqlLine.main(SqlLine.java:268)
Guava library present in distribution: guava-14.0.1
Please help.
You started with ./drill-embedded. Right. First of all, you need to check whether Drill is actually running on port 8047.
Use this command to check whether any process is listening on that port:
netstat -tlnp | grep 8047
You can also use the jps command and check whether there is a process named sqlline.
If a process is listening on the port, the problem may be with resolving localhost. In that case, try
http://<IP_ADDRESS>:8047
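The port check above can be written as a small script (8047 is assumed to be Drill's default web port):

```shell
# Check whether anything is listening on the Drill web port (8047 by default).
PORT=8047
if netstat -tln 2>/dev/null | grep -q ":${PORT} "; then
  LISTENING="yes"
else
  LISTENING="no"
fi
echo "process listening on port ${PORT}: ${LISTENING}"
```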
You should start Drill first, then try to open localhost:8047.

slf4j exception with quartz

I am trying to use Quartz in a simple example in my project. I am getting the following exception, and I am not sure what it means. I updated slf4j to 1.6.1 in my POM file, but this still appears:
SLF4J: slf4j-api 1.6.x (or later) is incompatible with this binding.
SLF4J: Your binding is version 1.5.5 or earlier.
SLF4J: Upgrade your binding to version 1.6.x. or 2.0.x
Exception in thread "main" java.lang.NoSuchMethodError: org.slf4j.impl.StaticLoggerBinder.getSingleton()Lorg/slf4j/impl/StaticLoggerBinder;
at org.slf4j.LoggerFactory.bind(LoggerFactory.java:121)
at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:111)
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:268)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:241)
at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:131)
at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:395)
at org.quartz.impl.StdSchedulerFactory.<init>(StdSchedulerFactory.java:249)
............
Any help on this would be highly appreciated. Thanks.
You need all your SLF4J dependencies to use the same version.
SLF4J: Your binding is version 1.5.5 or earlier.
SLF4J: Upgrade your binding to version 1.6.x. or 2.0.x
If you look at your dependency tree, I expect that you'll find more than one version of SLF4J among the various jars it uses.
For example
[INFO] +- org.hibernate:hibernate-core:jar:3.5.3-Final:compile
[INFO] | +- antlr:antlr:jar:2.7.7:compile (version managed from 2.7.6)
[INFO] | \- org.slf4j:slf4j-api:jar:1.5.8:compile
[INFO] +- org.slf4j:slf4j-log4j12:jar:1.5.8:compile
Here the two SLF4J dependencies have the same version, which is what you want.
It looks like the SLF4J binding used inside Quartz is too old. You should exclude the old version from Quartz and explicitly add a newer one to your project. Run mvn dependency:tree and post the result here; I will then be able to give you exact instructions.
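A sketch of what that exclusion could look like in the POM. The Quartz coordinates, the versions, and the slf4j-log4j12 artifact named here are all placeholders; confirm the actual offending artifact against your own mvn dependency:tree output before copying this:

```xml
<!-- Hypothetical: exclude the outdated binding pulled in transitively
     (replace slf4j-log4j12 with whatever mvn dependency:tree shows) -->
<dependency>
  <groupId>org.quartz-scheduler</groupId>
  <artifactId>quartz</artifactId>
  <version>1.8.6</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- Then declare the binding yourself, matching your slf4j-api version -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.6.1</version>
</dependency>
```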