com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'RECEPTORS.r_name' in 'field list'

I am using the Play 2.3.8 framework to create an API that accesses MariaDB. When I run the query on the MariaDB console it works fine, but when I run it from Play I get an error saying that the column RECEPTORS.r_name is not available, which is not true.
My code is:
package models.dao

import anorm._
import models.Profile
import play.api.db.DB
import play.api.Play.current

object ProfileDAO {

  def index(r_name: String): List[Profile] = {
    DB.withConnection { implicit c =>
      val results = SQL(
        """
          | SELECT `RECEPTORS.r_name`, `RECEPTORS.pdbCode`, `LIGANDS.l_id`, `LIGANDS.l_score`
          | FROM `RECEPTORS`
          | INNER JOIN `LIGANDS`
          | WHERE `RECEPTORS.r_name` = {r_name};
        """.stripMargin).on(
        "r_name" -> r_name
      ).apply()

      results.map { row =>
        Profile(row[String]("r_name"), row[String]("pdbCode"), row[String]("l_id"), row[Double]("l_score"))
      }.force.toList
    }
  }
}
The query that I ran on the MariaDB console is:
SELECT RECEPTORS.r_name, pdbCode, l_id, l_score FROM RECEPTORS INNER JOIN LIGANDS WHERE RECEPTORS.r_name="receptor";
The error when running with Play 2.3.8 is as follows:
laeeq@optiplex:~/Desktop/Backup/Project5/cpvsAPI$ sbt -jvm-debug 9999 run
Listening for transport dt_socket at address: 9999
[info] Loading project definition from /home/laeeq/Desktop/Backup/Project5/cpvsAPI/project
[info] Set current project to cpvsAPI (in build file:/home/laeeq/Desktop/Backup/Project5/cpvsAPI/)
[info] Updating {file:/home/laeeq/Desktop/Backup/Project5/cpvsAPI/}root...
[info] Resolving jline#jline;2.11 ...
[info] Done updating.

--- (Running the application, auto-reloading is enabled) ---

[info] play - Listening for HTTP on /0:0:0:0:0:0:0:0:9000

(Server started, use Ctrl+D to stop and go back to the console...)

SLF4J: The following set of substitute loggers may have been accessed
SLF4J: during the initialization phase. Logging calls during this
SLF4J: phase were not honored. However, subsequent logging calls to these
SLF4J: loggers will work as normally expected.
SLF4J: See also http://www.slf4j.org/codes.html#substituteLogger
SLF4J: org.webjars.WebJarExtractor
[info] Compiling 1 Scala source to /home/laeeq/Desktop/Backup/Project5/cpvsAPI/target/scala-2.11/classes...
[info] play - database [default] connected at jdbc:mysql://localhost:3306/db_profile
[info] play - Application started (Dev)
[error] application -

! @766oc7b8l - Internal server error, for (GET) [/profiles/receptor] ->

play.api.Application$$anon$1: Execution exception[[MySQLSyntaxErrorException: Unknown column 'RECEPTORS.r_name' in 'field list']]
    at play.api.Application$class.handleError(Application.scala:296) ~[play_2.11-2.3.8.jar:2.3.8]
    at play.api.DefaultApplication.handleError(Application.scala:402) [play_2.11-2.3.8.jar:2.3.8]
    at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$14$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:205) [play_2.11-2.3.8.jar:2.3.8]
    at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$14$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:202) [play_2.11-2.3.8.jar:2.3.8]
    at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36) [scala-library-2.11.1.jar:na]
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'RECEPTORS.r_name' in 'field list'
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_151]
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_151]
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_151]
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_151]
    at com.mysql.jdbc.Util.handleNewInstance(Util.java:411) ~[mysql-connector-java-5.1.18.jar:na]

You need to quote the table and column individually, so use:
`RECEPTORS`.`r_name`
Otherwise MySQL thinks you are trying to reference a column whose name is literally RECEPTORS.r_name in some implicit table.
You need to do this for all your quoted column references. In this case the quoting seems unnecessary anyway, since none of the identifiers are reserved words, so you could also just write RECEPTORS.r_name without backticks.
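For example, the query inside the SQL("""...""") call would become (identical to yours apart from the quoting):

SELECT `RECEPTORS`.`r_name`, `RECEPTORS`.`pdbCode`, `LIGANDS`.`l_id`, `LIGANDS`.`l_score`
FROM `RECEPTORS`
INNER JOIN `LIGANDS`
WHERE `RECEPTORS`.`r_name` = {r_name};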

Related

Apache Drill 1.17.0 on Windows 10 - Trouble Getting Drill to Run (Embedded Mode)

Details:
Apache Drill 1.17.0
Windows 10 64 bit
Java JDK1.8.0_241
New installation. Unable to get Apache Drill to load successfully.
Command line: c:\Users\floodb\Software\Drill\apache-drill-1.17.0\bin>drill-embedded
Error Received: Error: Failure in starting embedded Drillbit: UNSUPPORTED_OPERATION ERROR: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry.
[Error Id: 7c1b33eb-7a27-4e39-af06-5ba22e5ffae6 ] (state=,code=0)
java.sql.SQLException: Failure in starting embedded Drillbit: UNSUPPORTED_OPERATION ERROR: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry.
There is no 'hadoop_home' environment variable set (checking for this was suggested by other posts on StackOverflow).
Partial Log:
2020-02-19 15:55:42,315 [main] INFO  o.a.drill.common.util.GuavaPatcher - Google's Stopwatch patched for old HBase Guava version.
2020-02-19 15:55:42,319 [main] INFO  o.a.drill.common.util.GuavaPatcher - Google's Closeables patched for old HBase Guava version.
2020-02-19 15:55:42,333 [main] INFO  o.a.drill.common.util.GuavaPatcher - Google's Preconditions were patched to hold new methods.
2020-02-19 15:55:42,693 [main] INFO  o.a.drill.common.config.DrillConfig - Configuration and plugin file(s) identified in 32ms.
Base Configuration:
    - jar:file:/C:/Users/floodb/Software/Drill/apache-drill-1.17.0/jars/drill-common-1.17.0.jar!/drill-default.conf

(Bunch of log lines deleted)

2020-02-19 15:55:45,134 [main] INFO  o.a.d.c.s.persistence.ScanResult - loading 22 classes for org.apache.drill.common.logical.data.LogicalOperator took 4ms
2020-02-19 15:55:45,138 [main] INFO  o.a.d.c.s.persistence.ScanResult - loading 12 classes for org.apache.drill.common.logical.StoragePluginConfig took 3ms
2020-02-19 15:55:45,146 [main] INFO  o.a.d.c.s.persistence.ScanResult - loading 15 classes for org.apache.drill.common.logical.FormatPluginConfig took 7ms
2020-02-19 15:55:45,179 [main] INFO  o.a.drill.common.config.DrillConfig - User Error Occurred: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry. (null)
org.apache.drill.common.exceptions.UserException: UNSUPPORTED_OPERATION ERROR: Failure while attempting to load instance of the class of type org.apache.drill.exec.store.StoragePluginRegistry requested at path drill.exec.storage.registry.

[Error Id: 7c1b33eb-7a27-4e39-af06-5ba22e5ffae6 ]
    at org.apache.drill.common.exceptions.UserException$Builder.build(UserException.java:637)
    at org.apache.drill.common.config.DrillConfig.getInstance(DrillConfig.java:92)
    at org.apache.drill.exec.server.DrillbitContext.<init>(DrillbitContext.java:113)
    at org.apache.drill.exec.work.WorkManager.start(WorkManager.java:116)
    at org.apache.drill.exec.server.Drillbit.run(Drillbit.java:221)
    at org.apache.drill.jdbc.impl.DrillConnectionImpl.<init>(DrillConnectionImpl.java:134)
    at org.apache.drill.jdbc.impl.DrillJdbc41Factory.newDrillConnection(DrillJdbc41Factory.java:67)
    at org.apache.drill.jdbc.impl.DrillFactory.newConnection(DrillFactory.java:67)
    at org.apache.calcite.avatica.UnregisteredDriver.connect(UnregisteredDriver.java:138)
    at org.apache.drill.jdbc.Driver.connect(Driver.java:75)
    at sqlline.DatabaseConnection.connect(DatabaseConnection.java:135)
    at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:192)
    at sqlline.Commands.connect(Commands.java:1364)
    at sqlline.Commands.connect(Commands.java:1244)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:38)
    at sqlline.SqlLine.dispatch(SqlLine.java:730)
    at sqlline.SqlLine.initArgs(SqlLine.java:410)
    at sqlline.SqlLine.begin(SqlLine.java:515)
    at sqlline.SqlLine.start(SqlLine.java:267)
    at sqlline.SqlLine.main(SqlLine.java:206)
Caused by: java.lang.reflect.InvocationTargetException: null
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.drill.common.config.DrillConfig.getInstance(DrillConfig.java:88)
    ... 22 common frames omitted
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:645)
    at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1230)
    at org.apache.hadoop.fs.FileUtil.list(FileUtil.java:1435)
    at org.apache.hadoop.fs.RawLocalFileSystem.listStatus(RawLocalFileSystem.java:493)
    at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1868)
    at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1910)
    at org.apache.hadoop.fs.ChecksumFileSystem.listStatus(ChecksumFileSystem.java:678)
    at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1868)
    at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1910)
    at org.apache.drill.exec.store.dfs.DrillFileSystem.listStatus(DrillFileSystem.java:563)
    at org.apache.drill.exec.util.FileSystemUtil.listNonRecursive(FileSystemUtil.java:224)
    at org.apache.drill.exec.util.FileSystemUtil.list(FileSystemUtil.java:209)
    at org.apache.drill.exec.util.FileSystemUtil.listFiles(FileSystemUtil.java:104)
    at org.apache.drill.exec.util.DrillFileSystemUtil.listFiles(DrillFileSystemUtil.java:86)
    at org.apache.drill.exec.store.sys.store.LocalPersistentStore.getRange(LocalPersistentStore.java:121)
    at org.apache.drill.exec.store.sys.BasePersistentStore.getAll(BasePersistentStore.java:27)
    at org.apache.drill.exec.store.StoragePluginRegistryImpl.initPluginsSystemTable(StoragePluginRegistryImpl.java:277)
    at org.apache.drill.exec.store.StoragePluginRegistryImpl.<init>(StoragePluginRegistryImpl.java:90)
    ... 27 common frames omitted
2020-02-19 15:55:46,199 [main] INFO  o.apache.drill.exec.server.Drillbit - Shutdown completed (1018 ms).
The problem was that the 32-bit version of the Java JDK was installed. If you are having this problem, check to make sure that the 64-bit version of Java is installed.
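One quick way to check is to run java -version; a 64-bit JVM identifies itself as such. Illustrative output (your build numbers will differ):

c:\> java -version
java version "1.8.0_241"
Java(TM) SE Runtime Environment (build 1.8.0_241-b07)
Java HotSpot(TM) 64-Bit Server VM (build 25.241-b07, mixed mode)

If the last line does not say 64-Bit, you are running a 32-bit JVM.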

connecting to mysql db from jenkins pipeline not working

I have Jenkins v2.219 installed (running in Docker, if it matters) and I've installed:
Database plugin v1.5
MySQL Database Plugin v1.3
I've created the simplest pipeline to check the db connection and it doesn't work.
My pipeline is:
import groovy.sql.Sql

node {
    Class.forName("com.mysql.jdbc.Driver")
    def sql = Sql.newInstance("jdbc:mysql://<mysql_db_host>:3306/<db_name>", "myuser", "mypass", "com.mysql.jdbc.Driver")
    sql.execute "SELECT * FROM table"
}
And when I run it I get:
Rebuilds build #47
Running in Durability level: MAX_SURVIVABILITY
Running on Jenkins in /var/jenkins_home/workspace/pipeline_playground
java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:543)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
at org.codehaus.groovy.runtime.callsite.CallSiteClassLoader.loadClass(CallSiteClassLoader.java:54)
at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
at org.codehaus.groovy.reflection.ClassLoaderForClassArtifacts.loadClass(ClassLoaderForClassArtifacts.java:60)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at java_lang_Class$forName$1.callStatic(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallStatic(CallSiteArray.java:56)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:194)
at org.kohsuke.groovy.sandbox.impl.Checker$2.call(Checker.java:191)
at org.kohsuke.groovy.sandbox.GroovyInterceptor.onStaticCall(GroovyInterceptor.java:35)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onStaticCall(SandboxInterceptor.java:186)
at org.kohsuke.groovy.sandbox.impl.Checker$2.call(Checker.java:189)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedStaticCall(Checker.java:193)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:98)
at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:17)
at WorkflowScript.run(WorkflowScript:3)
at ___cps.transform___(Native Method)
at com.cloudbees.groovy.cps.impl.ContinuationGroup.methodCall(ContinuationGroup.java:86)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.dispatchOrArg(FunctionCallBlock.java:113)
at com.cloudbees.groovy.cps.impl.FunctionCallBlock$ContinuationImpl.fixArg(FunctionCallBlock.java:83)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
at com.cloudbees.groovy.cps.Next.step(Next.java:83)
at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:174)
at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:163)
at org.codehaus.groovy.runtime.GroovyCategorySupport$ThreadCategoryInfo.use(GroovyCategorySupport.java:129)
at org.codehaus.groovy.runtime.GroovyCategorySupport.use(GroovyCategorySupport.java:268)
at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:163)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:18)
at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:51)
at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:185)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:405)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$400(CpsThreadGroup.java:96)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:317)
at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:281)
at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:67)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:131)
at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
at jenkins.security.ImpersonatingExecutorService$1.run(ImpersonatingExecutorService.java:59)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Finished: FAILURE
I've also tried running it without Class.forName("com.mysql.jdbc.Driver"); that gave me "java.sql.SQLException: No suitable driver found for ..."
UPDATE:
By the way, I do see:
database-mysql.jar, mysql-connector-java-8.0.13.jar, protobuf-java-3.6.1.jar
at: /plugins/database-mysql/WEB-INF/lib
I had the same problem and this is how I fixed it.
First, link the database jar files into your Jenkins system.
To find out where Jenkins looks for jar files, go to the script console of your Jenkins server and run this code:
println System.getProperty("java.ext.dirs")
Result
/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.fc29.x86_64/jre/lib/ext:/usr/java/packages/lib/ext
Next, go to one of those directories and create symlinks to the jar files of the plugins. On a Linux system:
cd /usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.fc29.x86_64/jre/lib/ext
ln -s /var/lib/jenkins/plugins/database-mysql/WEB-INF/lib/database-mysql.jar
ln -s /var/lib/jenkins/plugins/database-mysql/WEB-INF/lib/mysql-connector-java-8.0.13.jar
Restart Jenkins so that the jar files are loaded.
Now you can query your MySQL database from your Jenkins pipeline:
import groovy.sql.Sql
node {
    def conn = Sql.newInstance("jdbc:mysql://localhost:3306/cms", "fred", "fredpassword", "com.mysql.jdbc.Driver")
    def rows = conn.rows("select username from users LIMIT 10")
    assert rows.size() == 10
    println rows.join('\n')
}
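To verify that the restart actually picked the jars up, a quick check from the same script console (this should print the driver class instead of throwing ClassNotFoundException):

println Class.forName("com.mysql.jdbc.Driver")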

Error when trying to load JDBC sink connector

I am trying to stream data from a Kafka topic to a MySQL database, so far unsuccessfully. Although the source connector works fine (i.e. streaming data from a MySQL database to a Kafka topic), the sink connector fails to load.
Here is my sink-mysql.properties file:
name=sink-mysql
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=test-mysql-jdbc-foobar
connection.url=jdbc:mysql://127.0.0.1:3306/demo?user=user1&password=user1pass
auto.create=true
When I try to execute
./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-jdbc/sink-mysql.properties
the following errors are reported:
[2018-02-01 16:17:43,019] ERROR WorkerSinkTask{id=sink-mysql-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. (org.apache.kafka.connect.runtime.WorkerSinkTask:515)
org.apache.kafka.connect.errors.ConnectException: No fields found using key and value schemas for table: test-mysql-jdbc-foobar
at io.confluent.connect.jdbc.sink.metadata.FieldsMetadata.extract(FieldsMetadata.java:127)
at io.confluent.connect.jdbc.sink.metadata.FieldsMetadata.extract(FieldsMetadata.java:64)
at io.confluent.connect.jdbc.sink.BufferedRecords.add(BufferedRecords.java:71)
at io.confluent.connect.jdbc.sink.JdbcDbWriter.write(JdbcDbWriter.java:66)
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:69)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:495)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:288)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:198)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:166)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[2018-02-01 16:17:43,020] ERROR WorkerSinkTask{id=sink-mysql-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:172)
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:517)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:288)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:198)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:166)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.kafka.connect.errors.ConnectException: No fields found using key and value schemas for table: test-mysql-jdbc-foobar
at io.confluent.connect.jdbc.sink.metadata.FieldsMetadata.extract(FieldsMetadata.java:127)
at io.confluent.connect.jdbc.sink.metadata.FieldsMetadata.extract(FieldsMetadata.java:64)
at io.confluent.connect.jdbc.sink.BufferedRecords.add(BufferedRecords.java:71)
at io.confluent.connect.jdbc.sink.JdbcDbWriter.write(JdbcDbWriter.java:66)
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:69)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:495)
... 10 more
[2018-02-01 16:17:43,021] ERROR WorkerSinkTask{id=sink-mysql-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:173)
Note that the topic test-mysql-jdbc-foobar contains data streamed from MySQL to Kafka; however, I am not able to stream this data from Kafka back to MySQL. The content of sink-mysql.properties looks identical to the one used in the official Confluent documentation, but it doesn't seem to work. Also, the MySQL connector is placed in the right directory (under share/java/kafka-connect-jdbc/).
EDIT
Here is the content of my worker config file:
bootstrap.servers=localhost:9092
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
# Local storage file for offset data
offset.storage.file.filename=/tmp/connect.offsets
plugin.path=share/java
To be able to use the JDBC sink, your messages must have a schema. This can be done by using Avro together with the Schema Registry, or JSON with embedded schemas. In your worker config you've specified:
key.converter.schemas.enable=false
value.converter.schemas.enable=false
This means that the JSON will not contain the schemas.
Here's an example of the JSON that Kafka Connect will produce (as a source) and expect (as a sink) if you enable schemas: https://gist.github.com/rmoff/2b922fd1f9baf3ba1d66b98e9dd7b364
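If you want to stay with JSON, set key.converter.schemas.enable=true and value.converter.schemas.enable=true in the worker config and produce messages in the schema-plus-payload envelope. A minimal sketch of that envelope (the field names here are made up for illustration):

{
  "schema": {
    "type": "struct",
    "fields": [
      {"type": "string", "optional": false, "field": "l_id"},
      {"type": "double", "optional": true, "field": "l_score"}
    ],
    "optional": false,
    "name": "foobar"
  },
  "payload": {"l_id": "abc", "l_score": 0.5}
}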

Unable to connect to host MySQL database on application deployed to CloudBees

I followed the instructions here, but when I attempted it I got the following error:
hudson.util.IOException2: remote file operation failed: /scratch/jenkins/workspace/Xinco Demo Publish/Xinco/target/Xinco-2012-08-30_00-20-05.war at hudson.remoting.Channel@1fc6bdea:s-50b0ae50
at hudson.FilePath.act(FilePath.java:783)
at hudson.FilePath.act(FilePath.java:769)
at com.cloudbees.plugins.deployer.DeployPublisher.perform(DeployPublisher.java:108)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:707)
at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:682)
at hudson.model.AbstractBuild$AbstractRunner.performAllBuildSteps(AbstractBuild.java:660)
at hudson.model.Build$RunnerImpl.post2(Build.java:162)
at hudson.model.AbstractBuild$AbstractRunner.post(AbstractBuild.java:629)
at hudson.model.Run.run(Run.java:1433)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:238)
Caused by: hudson.remoting.ProxyException: hudson.util.IOException2: Server.InternalError - Invalid WEB-INF/cloudbees-web.xml: resource
at com.cloudbees.plugins.deployer.deployables.Deployable.deployFile(Deployable.java:151)
at com.cloudbees.plugins.deployer.deployables.Deployable$DeployFileCallable.invoke(Deployable.java:342)
at hudson.FilePath$FileCallableWrapper.call(FilePath.java:2048)
at hudson.remoting.UserRequest.perform(UserRequest.java:118)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:287)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:662)
Caused by: hudson.remoting.ProxyException: com.cloudbees.api.BeesClientException: Server.InternalError - Invalid WEB-INF/cloudbees-web.xml: resource
at com.cloudbees.api.BeesClient.readResponse(BeesClient.java:850)
at com.cloudbees.api.BeesClient.applicationDeployArchive(BeesClient.java:435)
at com.cloudbees.plugins.deployer.deployables.Deployable.deployFile(Deployable.java:123)
... 11 more
Build step 'Deploy to CloudBees' marked build as failure
The full output can be seen here.
Caused by: hudson.remoting.ProxyException: com.cloudbees.api.BeesClientException: Server.InternalError - Invalid WEB-INF/cloudbees-web.xml: resource
Your cloudbees-web.xml doesn't follow the correct format.
See http://wiki.cloudbees.com/bin/view/RUN/CloudBeesWebXml: the cloudbees-web.xml needs to be wrapped in an outer <cloudbees-web-app> element.
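For reference, the wrapped file looks roughly like this (a sketch from the docs linked above; the appid and resource entries below are placeholders, so check the wiki page for the exact schema):

<?xml version="1.0"?>
<cloudbees-web-app xmlns="http://www.cloudbees.com/xml/webapp/1">
  <appid>myaccount/myapp</appid>
  <resource name="jdbc/mydb" auth="Container" type="javax.sql.DataSource"/>
</cloudbees-web-app>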
You also don't have to use cloudbees-web.xml if you don't want to: if you bind your app to a database, it will be made available as a named datasource automatically (see the bees app:bind command).
You only have to do this once, and then the app will know about the datasource.
http://developer.cloudbees.com/bin/view/RUN/Resource+Management
and
https://developer.cloudbees.com/bin/view/RUN/DatabaseGuide
(sorry, still working on docs).

Error during Sonar Configuration ( use Derby default DB )

I changed my Sonar DB from Oracle to the default Derby. I successfully configured the Sonar server; however, I get an error during the integration with Hudson.
Caused by: java.sql.SQLException: SQL driver not found oracle.jdbc.OracleDriver
at org.sonar.jpa.session.DriverDatabaseConnector.getConnection(DriverDatabaseConnector.java:91)
at org.sonar.jpa.session.AbstractDatabaseConnector.testConnection(AbstractDatabaseConnector.java:185)
... 41 more
Caused by: java.lang.ClassNotFoundException: oracle.jdbc.OracleDriver
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:251)
at org.sonar.jpa.session.DriverDatabaseConnector.getConnection(DriverDatabaseConnector.java:88)
... 42 more
The error states that it can't find the OracleDriver, which I should not be using anymore.
In my Hudson configuration, I have removed my Oracle configuration and replaced it with these:
Any idea on what I configured wrongly?
Fixed by changing the driver to "org.apache.derby.jdbc.ClientDriver". It turns out that the remark "Do not set if you use default embedded" is misleading.
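For reference, the working settings look roughly like this (values are illustrative; Derby's network server listens on port 1527 by default, and sonar/sonar are the stock credentials):

Database URL:      jdbc:derby://localhost:1527/sonar
Database driver:   org.apache.derby.jdbc.ClientDriver
Database login:    sonar
Database password: sonar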