ebtables.c:61:3: error: implicit declaration of function 'xt_compat_calc_jump' [-Werror=implicit-function-declaration] - android-kernel

I am getting the following error while building the recovery image for a LineageOS project and need some help diagnosing and resolving the issue.
Device tree: https://github.com/darran-kelinske-fivestars/android_device_lenovo_tb8504f/tree/lineage-15.1
Vendor tree: https://github.com/darran-kelinske-fivestars/android_vendor_lenovo_tb8504f/tree/lineage-15.1
Kernel source: https://github.com/darran-kelinske-fivestars/android_kernel_lenovo_tb8504f/tree/lineage-15.1
ROM source: https://github.com/LineageOS/android
Command: source build/envsetup.sh && breakfast tb8504f && repo sync --force-sync -q -j6 && mka recoveryimage -j6 | tee recovery.log
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c: In function 'ebt_standard_compat_from_user':
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c:61:3: error: implicit declaration of function 'xt_compat_calc_jump' [-Werror=implicit-function-declaration]
v += xt_compat_calc_jump(NFPROTO_BRIDGE, v);
^
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c: At top level:
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c:76:15: error: variable 'ebt_standard_target' has initializer but incomplete type
static struct xt_target ebt_standard_target = {
^
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c:77:2: error: unknown field 'name' specified in initializer
.name = "standard",
^
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c:77:2: warning: excess elements in struct initializer
error, forbidden warning: ebtables.c:77
CC net/core/gen_estimator.o
/home/lineageos/kernel/lenovo/msm8917/scripts/Makefile.build:257: recipe for target 'net/bridge/netfilter/ebtables.o' failed
make[4]: *** [net/bridge/netfilter/ebtables.o] Error 1
/home/lineageos/kernel/lenovo/msm8917/scripts/Makefile.build:402: recipe for target 'net/bridge/netfilter' failed
make[3]: *** [net/bridge/netfilter] Error 2
/home/lineageos/kernel/lenovo/msm8917/scripts/Makefile.build:402: recipe for target 'net/bridge' failed
make[2]: *** [net/bridge] Error 2
make[2]: *** Waiting for unfinished jobs....
Full log:
https://pastebin.com/v2ZsfRuc

I resolved this issue by replacing the include directory of this kernel tree with the include directory from another project.
I took the replacement source from this directory:
https://github.com/HighwayStar/android_kernel_lenovo_tb8704/tree/tab4-8plus-LA.UM.5.6.r1-0/include
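For reference, a minimal sketch of that swap (the branch name comes from the URL above; the local paths follow the build log and are otherwise assumptions):
cd kernel/lenovo/msm8917
# keep the original headers around, then pull the include/ tree from the tb8704 kernel
mv include include.orig
git clone --depth 1 -b tab4-8plus-LA.UM.5.6.r1-0 https://github.com/HighwayStar/android_kernel_lenovo_tb8704 /tmp/tb8704
cp -r /tmp/tb8704/include ./include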

Related

Cannot install mysql2 0.3.16 on Mac M1

Trying to launch my work project, but when I run bundle I get this error:
Fetching mysql2 0.3.16
Installing mysql2 0.3.16 with native extensions
Gem::Ext::BuildError: ERROR: Failed to build gem native extension.
current directory: /Users/alexandra_shimanovich/.bundle/ruby/2.5.0/gems/mysql2-0.3.16/ext/mysql2
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/bin/ruby -I /Users/alexandra_shimanovich/.rbenv/versions/2.5.5/lib/ruby/site_ruby/2.5.0 -r ./siteconf20220608-34831-zm49z7.rb extconf.rb
--with-mysql-lib\=/opt/homebrew/opt/mysql/lib\ --with-mysql-dir\=/opt/homebrew/opt/mysql\ --with-mysql-config\=/opt/homebrew/opt/mysql/bin/mysql_config\
--with-mysql-include\=/opt/homebrew/opt/mysql/include\ --with-ldflags\=-L/opt/homebrew/opt/openssl@3/lib\ --with-cppflags\=-I/opt/homebrew/opt/openssl@3/include
checking for ruby/thread.h... yes
checking for rb_thread_call_without_gvl() in ruby/thread.h... no
checking for rb_thread_blocking_region()... no
checking for rb_wait_for_single_fd()... no
checking for rb_hash_dup()... no
checking for rb_intern3()... no
-----
Using mysql_config at /opt/homebrew/bin/mysql_config
-----
checking for mysql.h... yes
checking for errmsg.h... yes
checking for mysqld_error.h... yes
-----
Setting rpath to /opt/homebrew/Cellar/mysql/8.0.29/lib
-----
creating Makefile
current directory: /Users/alexandra_shimanovich/.bundle/ruby/2.5.0/gems/mysql2-0.3.16/ext/mysql2
make DESTDIR\= clean
current directory: /Users/alexandra_shimanovich/.bundle/ruby/2.5.0/gems/mysql2-0.3.16/ext/mysql2
make DESTDIR\=
compiling client.c
In file included from client.c:1:
In file included from ./mysql2_ext.h:41:
In file included from ./client.h:18:
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/backward/rubysig.h:14:2: warning: rubysig.h is obsolete [-W#warnings]
#warning rubysig.h is obsolete
^
In file included from client.c:1:
In file included from ./mysql2_ext.h:41:
./client.h:22:1: error: static declaration of 'rb_thread_call_without_gvl' follows non-static declaration
rb_thread_call_without_gvl(
^
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/thread.h:28:7: note: previous declaration is here
void *rb_thread_call_without_gvl(void *(*func)(void *), void *data1,
^
In file included from client.c:1:
In file included from ./mysql2_ext.h:41:
./client.h:29:3: error: use of undeclared identifier 'TRAP_BEG'
TRAP_BEG;
^
./client.h:31:3: error: use of undeclared identifier 'TRAP_END'
TRAP_END;
^
In file included from client.c:11:
./wait_for_single_fd.h:31:10: warning: implicit declaration of function 'rb_thread_select' is invalid in C99 [-Wimplicit-function-declaration]
return rb_thread_select(fd + 1, rfds, wfds, efds, tvp);
^
client.c:21:14: error: static declaration of 'rb_hash_dup' follows non-static declaration
static VALUE rb_hash_dup(VALUE other) {
^
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/intern.h:494:7: note: previous declaration is here
VALUE rb_hash_dup(VALUE);
^
client.c:22:31: warning: '(' and '{' tokens introducing statement expression appear in different macro expansion contexts [-Wcompound-token-split-by-macro]
return rb_funcall(rb_cHash, rb_intern("[]"), 1, other);
^~~~~~~~~~~~~~~
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/ruby.h:1755:23: note: expanded from macro 'rb_intern'
__extension__ (RUBY_CONST_ID_CACHE((ID), (str))) : \
^
client.c:22:31: note: '{' token is here
return rb_funcall(rb_cHash, rb_intern("[]"), 1, other);
^~~~~~~~~~~~~~~
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/ruby.h:1755:24: note: expanded from macro 'rb_intern'
__extension__ (RUBY_CONST_ID_CACHE((ID), (str))) : \
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/ruby.h:1740:5: note: expanded from macro 'RUBY_CONST_ID_CACHE'
{ \
^
client.c:22:31: warning: '}' and ')' tokens terminating statement expression appear in different macro expansion contexts [-Wcompound-token-split-by-macro]
return rb_funcall(rb_cHash, rb_intern("[]"), 1, other);
^~~~~~~~~~~~~~~
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/ruby.h:1755:24: note: expanded from macro 'rb_intern'
__extension__ (RUBY_CONST_ID_CACHE((ID), (str))) : \
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/ruby.h:1745:5: note: expanded from macro 'RUBY_CONST_ID_CACHE'
}
^
client.c:22:31: note: ')' token is here
return rb_funcall(rb_cHash, rb_intern("[]"), 1, other);
^~~~~~~~~~~~~~~
....................
client.c:1396:31: warning: '}' and ')' tokens terminating statement expression appear in different macro expansion contexts [-Wcompound-token-split-by-macro]
rb_const_set(cMysql2Client, rb_intern("BASIC_FLAGS"),
^~~~~~~~~~~~~~~~~~~~~~~~
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/ruby.h:1755:24: note: expanded from macro 'rb_intern'
__extension__ (RUBY_CONST_ID_CACHE((ID), (str))) : \
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/ruby.h:1745:5: note: expanded from macro 'RUBY_CONST_ID_CACHE'
}
^
client.c:1396:31: note: ')' token is here
rb_const_set(cMysql2Client, rb_intern("BASIC_FLAGS"),
^~~~~~~~~~~~~~~~~~~~~~~~
/Users/alexandra_shimanovich/.rbenv/versions/2.5.5/include/ruby-2.5.0/ruby/ruby.h:1755:56: note: expanded from macro 'rb_intern'
__extension__ (RUBY_CONST_ID_CACHE((ID), (str))) : \
^
70 warnings and 14 errors generated.
make: *** [client.o] Error 1
make failed, exit code 2
Gem files will remain installed in /Users/alexandra_shimanovich/.bundle/ruby/2.5.0/gems/mysql2-0.3.16 for inspection.
Results logged to /Users/alexandra_shimanovich/.bundle/ruby/2.5.0/extensions/-darwin-21/2.5.0/mysql2-0.3.16/gem_make.out
An error occurred while installing mysql2 (0.3.16), and Bundler cannot continue.
Make sure that `gem install mysql2 -v '0.3.16' --source 'http://rubygems.org/'` succeeds before bundling.
macOS Monterey 12.2.1 Apple M1
ruby 2.5.5
I have tried all the existing suggestions on the Internet, but still no luck; maybe somebody here can help me.
Thank you!
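One observation from the log above: the extconf checks for rb_thread_blocking_region(), rb_hash_dup() and friends all come back "no", and compilation then fails because mysql2 0.3.16 re-declares functions that Ruby 2.5's own headers already provide, which suggests the gem version simply predates this Ruby. A hedged sketch of the usual way out (the exact version is an assumption; the 0.5.x line builds against modern Rubies, and the mysql_config path is the one from the log):
gem install mysql2 -v '0.5.4' -- --with-mysql-config=/opt/homebrew/opt/mysql/bin/mysql_config
and relax the Gemfile pin accordingly before running bundle again.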

AWS EMR Spark exception on JDBC datasource load

I'm spinning up an emr-5.31.0 AWS EMR cluster with Spark 2.4.6 on board, then logging into spark-shell on the master node and following this tutorial
https://bigdataprogrammers.com/load-data-from-mysql-in-spark-using-jdbc/
to load data from my RDS MySQL instance.
I've uploaded both the connector jar (mysql-connector-java-5.1.49-bin.jar) and the script to the /home/hadoop folder.
Then I proceed as described in the tutorial and get two errors:
[hadoop@ip-172-31-* ~]$ spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/10/09 16:41:31 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
Spark context Web UI available at http://ip-172-31-*.ec2.internal:4040
Spark context available as 'sc' (master = yarn, app id = application_1602254033216_0005).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.4.6-amzn-0
/_/
Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 1.8.0_265)
Type in expressions to have them evaluated.
Type :help for more information.
scala> :require /home/hadoop/mysql-connector-java-5.1.49-bin.jar
Added '/home/hadoop/mysql-connector-java-5.1.49-bin.jar' to classpath.
scala> :load /home/hadoop/test01.scala
Loading /home/hadoop/test01.scala...
import java.sql.{Connection, DriverManager, ResultSet}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.SQLContext
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
error: error while loading package, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/execution/package.class)' has location not matching its contents: contains package object execution
error: error while loading QueryExecution, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/execution/QueryExecution.class)' has location not matching its contents: contains class QueryExecution
error: error while loading package, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/plans/package.class)' has location not matching its contents: contains package object plans
error: error while loading LogicalPlan, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/plans/logical/LogicalPlan.class)' has location not matching its contents: contains class LogicalPlan
error: error while loading package, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/encoders/package.class)' has location not matching its contents: contains package object encoders
error: error while loading ExpressionEncoder, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/encoders/ExpressionEncoder.class)' has location not matching its contents: contains class ExpressionEncoder
error: error while loading Expression, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/expressions/Expression.class)' has location not matching its contents: contains class Expression
error: error while loading NamedExpression, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/expressions/NamedExpression.class)' has location not matching its contents: contains class NamedExpression
error: error while loading DataFrameNaFunctions, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/DataFrameNaFunctions.class)' has location not matching its contents: contains class DataFrameNaFunctions
error: error while loading DataFrameStatFunctions, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/DataFrameStatFunctions.class)' has location not matching its contents: contains class DataFrameStatFunctions
error: error while loading TypedColumn, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/TypedColumn.class)' has location not matching its contents: contains class TypedColumn
error: error while loading package, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/package.class)' has location not matching its contents: contains package object function
error: error while loading ReduceFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/ReduceFunction.class)' has location not matching its contents: contains class ReduceFunction
error: error while loading KeyValueGroupedDataset, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/KeyValueGroupedDataset.class)' has location not matching its contents: contains class KeyValueGroupedDataset
error: error while loading MapFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/MapFunction.class)' has location not matching its contents: contains class MapFunction
error: error while loading Metadata, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/types/Metadata.class)' has location not matching its contents: contains class Metadata
error: error while loading FilterFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/FilterFunction.class)' has location not matching its contents: contains class FilterFunction
error: error while loading MapPartitionsFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/MapPartitionsFunction.class)' has location not matching its contents: contains class MapPartitionsFunction
error: error while loading FlatMapFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/FlatMapFunction.class)' has location not matching its contents: contains class FlatMapFunction
error: error while loading ForeachFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/ForeachFunction.class)' has location not matching its contents: contains class ForeachFunction
error: error while loading ForeachPartitionFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/ForeachPartitionFunction.class)' has location not matching its contents: contains class ForeachPartitionFunction
error: error while loading StorageLevel, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/storage/StorageLevel.class)' has location not matching its contents: contains class StorageLevel
error: error while loading CreateViewCommand, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/execution/command/CreateViewCommand.class)' has location not matching its contents: contains class CreateViewCommand
error: error while loading DataFrameWriter, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/DataFrameWriter.class)' has location not matching its contents: contains class DataFrameWriter
error: error while loading DataStreamWriter, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/streaming/DataStreamWriter.class)' has location not matching its contents: contains class DataStreamWriter
error: error while loading SparkPlan, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/execution/SparkPlan.class)' has location not matching its contents: contains class SparkPlan
scala> :load /home/hadoop/test01.scala
Loading /home/hadoop/test01.scala...
import java.sql.{Connection, DriverManager, ResultSet}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.SQLContext
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
defined object ReadDataFromJdbc
scala> ReadDataFromJdbc.main(Array("batches"))
Started.......Fri Oct 09 16:42:02 UTC 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
[Stage 0:> (0 + 1) / 1]20/10/09 16:42:04 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, ip-172-31-20-13.ec2.internal, executor 1): java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:111)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:55)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD.compute(JDBCRDD.scala:272)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.lang.ClassLoader.findClass(ClassLoader.java:523)
at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:106)
... 25 more
[Stage 0:> (0 + 0) / 1]20/10/09 16:42:05 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
(Connectivity Failed for Table ,org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, ip-172-31-27-165.ec2.internal, executor 2): java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:111)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:55)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD.compute(JDBCRDD.scala:272)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.lang.ClassLoader.findClass(ClassLoader.java:523)
at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:106)
... 25 more
Driver stacktrace:)
The first error appears when I load the Scala script: it loads with some errors, but repeating the same command seems to fix it.
The second error appears once I request the data to be loaded from MySQL: despite the MySQL JDBC connector having been added to the classpath with an earlier command, it fails with java.lang.ClassNotFoundException: com.mysql.jdbc.Driver.
While I believe I can find some directory that Spark will search for the JDBC jar, I'm thoroughly confused by the errors that appear when the script is loaded: why do they appear, and how can they be fixed?
I ended up creating a bootstrap step for the cluster that copies the mysql-connector-java jar to all nodes of the cluster before Spark and Hadoop are even installed.
1. Create the copymysqljar.sh script:
#!/bin/bash
# Make sure the target directories exist on every node.
sudo mkdir -p /home/hadoop
sudo mkdir -p /usr/lib/spark/jars
sudo mkdir -p /usr/lib/hadoop/lib
# Fetch the connector jar from S3 and make it readable by everyone.
aws s3 cp s3://<YOUR_BUCKET>/mysql-connector-java-5.1.49-bin.jar /home/hadoop
chmod 777 /home/hadoop/mysql-connector-java-5.1.49-bin.jar
# Put the jar on both the Spark and Hadoop classpaths.
sudo cp /home/hadoop/mysql-connector-java-5.1.49-bin.jar /usr/lib/spark/jars
sudo cp /home/hadoop/mysql-connector-java-5.1.49-bin.jar /usr/lib/hadoop/lib
2. Save copymysqljar.sh to the S3 bucket identified by s3://<YOUR_BUCKET>.
3. Proceed to cluster creation in AWS with 'Create cluster' > 'Advanced configuration'.
4. During advanced configuration, on step 4, create a custom bootstrap action with s3://<YOUR_BUCKET>/copymysqljar.sh as the script.
5. Start cluster creation.
Alternatively, instead of steps 3, 4 and 5 you can do the same with the AWS command-line tools.
See the official docs on bootstrap actions: https://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-plan-bootstrap.html#CustomBootstrapCopyS3Object
In general, this script takes care of everything for AWS EMR 5.31 with Hadoop, Spark and Zeppelin. You might need to copy the jar to other directories if other tools should connect to MySQL too.
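If re-creating the cluster is too heavy for a quick test, a commonly used alternative (a sketch, untested on this exact setup) is to hand the jar to spark-shell at launch, which puts it on both the driver and executor classpaths, unlike the REPL's :require, which only affects the driver:
spark-shell --jars /home/hadoop/mysql-connector-java-5.1.49-bin.jar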

undefined reference to symbol '_ZN12QSqlDatabase11setHostNameERK7QString'

My program has a basic function for adding data to a database, but the code gives an error when it is compiled.
void MainWindow::AddLocationToDatabase()
{
    QSqlDatabase db = QSqlDatabase::addDatabase("QMYSQL");
    db.setHostName("localhost");
    db.setUserName("root");
    db.setPassword("*******");
    db.setDatabaseName("databasename");
    db.setPort(1111);
    if (db.open()) {
        qDebug() << "connected";
    }
}
/usr/bin/ld: build/debug/mainwindow.o: undefined reference to symbol '_ZN12QSqlDatabase11setHostNameERK7QString'
/usr/lib/x86_64-linux-gnu/libQt5Sql.so.5: error adding symbols: DSO missing from command line
collect2: error: ld returned 1 exit status
make: *** [sub-pcapprogram-make_first-ordered] Error 2
06:57:04: The process "/usr/bin/make" exited with code 2.
When executing step "Make"
When I added "QT += sql" to the project file, there was no error.
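For reference, the qmake side of that fix (a sketch; the .pro file name is whatever your project uses):
# in the project's .pro file: enable the Qt SQL module so the linker pulls in libQt5Sql
QT += sql
Then re-run qmake and make so the regenerated Makefile picks up the module.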

How to build the HandlerSocket plugin for MySQL on CentOS

Environment: CentOS+mysql-5.6.14
This is what I tried:
cd /usr/local/src/
tar zvfx ahiguti-HandlerSocket-Plugin-for-MySQL-1.0.6-71-g159ea6d.tar.gz
cd ahiguti-HandlerSocket-Plugin-for-MySQL-159ea6d/
./autogen.sh
./configure --with-mysql-source=/usr/local/src/mysql-5.1.47 --with-mysql-bindir=/usr/local/app/mysql/bin/ --with-mysql-plugindir=/usr/local/app/mysql/lib/mysql/plugin/ --prefix=/usr/local/app/mysql
make
but when I run 'make', I encounter the following errors:
In file included from database.cpp:16:
mysql_incl.hpp:12:1: warning: "HAVE_CONFIG_H" redefined
<command line>:1:1: warning: this is the location of the previous definition
database.cpp: In member function 'virtual void dena::dbcontext::init_thread(const void*, volatile int&)':
database.cpp:294: error: 'LOCK_thread_count' was not declared in this scope
database.cpp:296: error: 'threads' was not declared in this scope
database.cpp:297: error: 'thread_count' was not declared in this scope
database.cpp:310: error: 'create' is not a member of 'MDL_request'
database.cpp: In member function 'virtual void dena::dbcontext::term_thread()':
database.cpp:337: error: 'LOCK_thread_count' was not declared in this scope
database.cpp:340: error: 'thread_count' was not declared in this scope
database.cpp: In member function 'void dena::dbcontext::cmd_find_internal(dena::dbcallback_i&, const dena::prep_stmt&, ha_rkey_function, const dena::cmd_exec_args&)':
database.cpp:649: error: 'struct st_key' has no member named 'key_parts'
/data/install/mysql-5.6.14/sql/handler.h:2228: error: 'virtual int handler::index_read_map(uchar*, const uchar*, key_part_map, ha_rkey_function)' is protected
database.cpp:689: error: within this context
/data/install/mysql-5.6.14/sql/handler.h:2247: error: 'virtual int handler::index_prev(uchar*)' is protected
database.cpp:694: error: within this context
/data/install/mysql-5.6.14/sql/handler.h:2244: error: 'virtual int handler::index_next(uchar*)' is protected
database.cpp:698: error: within this context
/data/install/mysql-5.6.14/sql/handler.h:2256: error: 'virtual int handler::index_next_same(uchar*, const uchar*, uint)' is protected
database.cpp:701: error: within this context
database.cpp: In member function 'virtual void dena::dbcontext::cmd_open_index(dena::dbcallback_i&, size_t, const char*, const char*, const char*, const char*)':
database.cpp:770: error: cannot convert 'MEM_ROOT*' to 'Open_table_context*' for argument '3' to 'bool open_table(THD*, TABLE_LIST*, Open_table_context*)'
make[2]: *** [handlersocket_la-database.lo] error 1
make[1]: Leaving directory `/usr/local/src/HandlerSocket-Plugin-for-MySQL-1.0.6'
make: *** [all] error 2
You should use version 1.1 of https://github.com/DeNA/HandlerSocket-Plugin-for-MySQL
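A sketch of that approach (the release tag is a placeholder, not verified; note also that the configure line above points --with-mysql-source at a 5.1.47 tree while the errors reference the 5.6.14 source, so the source path should match the server you build against):
git clone https://github.com/DeNA/HandlerSocket-Plugin-for-MySQL.git
cd HandlerSocket-Plugin-for-MySQL
git checkout <1.1-release-tag>    # placeholder: check out the 1.1 release
./autogen.sh
./configure --with-mysql-source=/data/install/mysql-5.6.14 --with-mysql-bindir=/usr/local/app/mysql/bin/ --with-mysql-plugindir=/usr/local/app/mysql/lib/mysql/plugin/ --prefix=/usr/local/app/mysql
make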

Cannot compile MySQL++

I am trying to compile MySQL++ 3.1.0 with the command:
mingw32-make -f Makefile.mingw
After a set of files gets compiled (e.g. beemutex.cpp, cmdline.cpp, connection.cpp, ...), I get these errors:
g++ -c -o mysqlpp_sql_buffer.o -g -mthreads -DUNICODE -D_UNICODE -DMYSQLPP_NO_DLL -DHAVE_MYSQL_SSL_SET -I"C:\Program Files\MySQL\MySQL Server 5.5\include" -MT
mysqlpp_sql_buffer.o -MFmysqlpp_sql_buffer.o.d -MD -MP lib/sql_buffer.cpp
In file included from lib/sql_buffer.h:31:0,
from lib/sql_buffer.cpp:26:
lib/refcounted.h:258:2: error: 'size_t' does not name a type
lib/refcounted.h: In constructor 'mysqlpp::RefCountedPointer<T, Destroyer>::RefCountedPointer()':
lib/refcounted.h:89:2: error: class 'mysqlpp::RefCountedPointer<T, Destroyer>' does not have any field named 'refs_'
lib/refcounted.h: In constructor 'mysqlpp::RefCountedPointer<T, Destroyer>::RefCountedPointer(T*)':
lib/refcounted.h:100:2: error: class 'mysqlpp::RefCountedPointer<T, Destroyer>' does not have any field named 'refs_'
lib/refcounted.h:104:4: error: 'refs_' was not declared in this scope
lib/refcounted.h:104:16: error: expected type-specifier before 'size_t'
lib/refcounted.h:104:16: error: expected ';' before 'size_t'
lib/refcounted.h: In constructor 'mysqlpp::RefCountedPointer<T, Destroyer>::RefCountedPointer(const ThisType&)':
lib/refcounted.h:112:2: error: class 'mysqlpp::RefCountedPointer<T, Destroyer>' does not have any field named 'refs_'
lib/refcounted.h:115:8: error: 'refs_' was not declared in this scope
lib/refcounted.h: In destructor 'mysqlpp::RefCountedPointer<T, Destroyer>::~RefCountedPointer()':
lib/refcounted.h:125:7: error: 'refs_' was not declared in this scope
lib/refcounted.h: In member function 'void mysqlpp::RefCountedPointer<T, Destroyer>::swap(mysqlpp::RefCountedPointer<T, Destroyer>::ThisType&)':
lib/refcounted.h:246:13: error: 'refs_' was not declared in this scope
mingw32-make: *** [mysqlpp_sql_buffer.o] Error 1
Something could be wrong in my configuration, but I can't figure out what. It seems strange that it cannot find, e.g., size_t.
Thank you!
Platform:
Windows 7
MinGW 2011.09 / GCC 4.6.1
MySQL 5.5
mysql++ 3.1.0
Changing size_t() to std::size_t() in refcounted.h solved the problem. It now compiles.
I still have linking problems, but those should be separate issues.
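For reference, a mechanical way to apply that edit (GNU sed; assumes refcounted.h does not already contain std::size_t(, which the errors above suggest, and the .bak copy lets you review the diff):
sed -i.bak 's/\bsize_t(/std::size_t(/g' lib/refcounted.h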