Getting "Unable to get public no-arg constructor" When Using "logstash LogstashEncoder" - json

I'm trying to configure a Spring Boot project to use net.logstash.logback.encoder.LogstashEncoder to format log output as JSON and to provide the ability to use the keyValue method for adding custom fields to the output. I know there are other ways of doing this, but many of our current applications, built on an older version of Spring, use this technique and it works well. In order to update those applications, we'd like to be able to keep the same configuration.
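(For reference, the keyValue usage we rely on looks roughly like the sketch below; the logger, message, and field values are only illustrative.)
import static net.logstash.logback.argument.StructuredArguments.keyValue;

// with LogstashEncoder active, this should add "orderId":"12345" as its own field in the JSON output
log.info("order created {}", keyValue("orderId", "12345"));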
The problem I'm having is that with the following logstash.xml file:
<configuration>
    <appender name="json" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>
    <root level="INFO">
        <appender-ref ref="json" />
    </root>
</configuration>
I'm getting the following error when starting the app:
Exception in thread "restartedMain" java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49)
Caused by: java.lang.IllegalStateException: java.lang.IllegalStateException: Logback configuration error detected:
ERROR in net.logstash.logback.encoder.LogstashEncoder#4f659288 - Error occurred while dynamically loading jackson modules java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: com.fasterxml.jackson.datatype.hibernate5.Hibernate5Module Unable to get public no-arg constructor
at org.springframework.boot.context.logging.LoggingApplicationListener.initializeSystem(LoggingApplicationListener.java:328)
at org.springframework.boot.context.logging.LoggingApplicationListener.initialize(LoggingApplicationListener.java:282)
at org.springframework.boot.context.logging.LoggingApplicationListener.onApplicationEnvironmentPreparedEvent(LoggingApplicationListener.java:240)
at org.springframework.boot.context.logging.LoggingApplicationListener.onApplicationEvent(LoggingApplicationListener.java:216)
at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:176)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:169)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:143)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:131)
at org.springframework.boot.context.event.EventPublishingRunListener.environmentPrepared(EventPublishingRunListener.java:85)
at org.springframework.boot.SpringApplicationRunListeners.lambda$environmentPrepared$2(SpringApplicationRunListeners.java:66)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1541)
at org.springframework.boot.SpringApplicationRunListeners.doWithListeners(SpringApplicationRunListeners.java:120)
at org.springframework.boot.SpringApplicationRunListeners.doWithListeners(SpringApplicationRunListeners.java:114)
at org.springframework.boot.SpringApplicationRunListeners.environmentPrepared(SpringApplicationRunListeners.java:65)
at org.springframework.boot.SpringApplication.prepareEnvironment(SpringApplication.java:344)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:302)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1306)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1295)
at com.lowes.oms.eor.services.order.Application.main(Application.java:33)
... 5 more
Caused by: java.lang.IllegalStateException: Logback configuration error detected:
ERROR in net.logstash.logback.encoder.LogstashEncoder#4f659288 - Error occurred while dynamically loading jackson modules java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: com.fasterxml.jackson.datatype.hibernate5.Hibernate5Module Unable to get public no-arg constructor
at org.springframework.boot.logging.logback.LogbackLoggingSystem.loadConfiguration(LogbackLoggingSystem.java:179)
at org.springframework.boot.logging.logback.LogbackLoggingSystem.reinitialize(LogbackLoggingSystem.java:232)
at org.springframework.boot.logging.AbstractLoggingSystem.initializeWithConventions(AbstractLoggingSystem.java:73)
at org.springframework.boot.logging.AbstractLoggingSystem.initialize(AbstractLoggingSystem.java:60)
at org.springframework.boot.logging.logback.LogbackLoggingSystem.initialize(LogbackLoggingSystem.java:132)
at org.springframework.boot.context.logging.LoggingApplicationListener.initializeSystem(LoggingApplicationListener.java:313)
... 23 more
If I look at com.fasterxml.jackson.datatype.hibernate5.Hibernate5Module, it does have a public no-arg constructor.
Relevant dependency versions:
Spring Boot: 2.7.2
Jackson: 2.13.3 (managed by Spring Boot)
Logback: 1.2.11 (managed by Spring Boot)
Any thoughts on what I could be doing wrong?

It turns out this was an incompatibility between logstash-logback-encoder and jackson-datatype-hibernate5. Whenever I try to use the logstash encoder with jackson-datatype-hibernate5 on the classpath, the error occurs. I've raised an issue in the logstash-logback-encoder GitHub repo here: https://github.com/logfellow/logstash-logback-encoder/issues/878
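If anyone hits the same thing before it is fixed upstream: assuming your logstash-logback-encoder version exposes the findAndRegisterJacksonModules setting (check the encoder's documentation, this is an assumption on my part), disabling the ServiceLoader-based module scan should keep Hibernate5Module from being instantiated at all:
<configuration>
    <appender name="json" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="net.logstash.logback.encoder.LogstashEncoder">
            <!-- assumed setting: skip ServiceLoader discovery of Jackson modules -->
            <findAndRegisterJacksonModules>false</findAndRegisterJacksonModules>
        </encoder>
    </appender>
    <root level="INFO">
        <appender-ref ref="json" />
    </root>
</configuration>
Alternatively, dropping jackson-datatype-hibernate5 from the classpath avoids the clash, if the application doesn't actually need it.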

Related

Error on Running Spring Boot Application on CentOS Tomcat

My Spring Boot application works perfectly on localhost, but it gives an error when I run it on CentOS. This is the error shown on the page.
Root Cause
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration': Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataSource' defined in class path resource [org/springframework/boot/autoconfigure/jdbc/DataSourceConfiguration$Tomcat.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.tomcat.jdbc.pool.DataSource]: Factory method 'dataSource' threw exception; nested exception is org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException: Cannot determine embedded database driver class for database type NONE. If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to active it (no profiles are currently active).
....
Root Cause
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataSource' defined in class path resource [org/springframework/boot/autoconfigure/jdbc/DataSourceConfiguration$Tomcat.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.tomcat.jdbc.pool.DataSource]: Factory method 'dataSource' threw exception; nested exception is org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException: Cannot determine embedded database driver class for database type NONE. If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to active it (no profiles are currently active).
org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:599)
....
Root Cause
org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.apache.tomcat.jdbc.pool.DataSource]: Factory method 'dataSource' threw exception; nested exception is org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException: Cannot determine embedded database driver class for database type NONE. If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to active it (no profiles are currently active).
....
Root Cause
org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException: Cannot determine embedded database driver class for database type NONE. If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to active it (no profiles are currently active).
org.springframework.boot.autoconfigure.jdbc.DataSourceProperties.determineDriverClassName(DataSourceProperties.java:247)
....
application.yml file
# ===============================
# = Hibernate datasource
# ===============================
spring:
  datasource:
    driverClassName: com.mysql.jdbc.Driver
    url: jdbc:mysql://67.209.127.115:3306/blog
    username: shadab
    password: test
# ===============================
# = JPA configurations
# ===============================
  jpa:
    show-sql: true
    hibernate:
      ddl-auto: update
    database-platform: MYSQL
    properties:
      hibernate.dialect: org.hibernate.dialect.MySQL5Dialect
pom.xml
I have included mysql-connector in the pom.xml file, but it still doesn't work.
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <scope>runtime</scope>
</dependency>
You are missing the MySQL dependency from your classpath. Just add this to your Maven/Gradle build:
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
</dependency>
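If the project uses Gradle instead, the equivalent is roughly the following (version left to Spring Boot's dependency management, assuming the Spring Boot plugin is applied):
dependencies {
    // MySQL JDBC driver only needed at runtime
    runtimeOnly 'mysql:mysql-connector-java'
}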

Error when trying to load JDBC sink connector

I am trying, unsuccessfully, to stream data from a Kafka topic to a MySQL database. Although the source connector works fine (i.e. streaming data from a MySQL database to a Kafka topic), the sink connector fails to load.
Here is my sink-mysql.properties file:
name=sink-mysql
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
tasks.max=1
topics=test-mysql-jdbc-foobar
connection.url=jdbc:mysql://127.0.0.1:3306/demo?user=user1&password=user1pass
auto.create=true
When I try to execute
./bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-jdbc/sink-mysql.properties
the following errors are reported:
[2018-02-01 16:17:43,019] ERROR WorkerSinkTask{id=sink-mysql-0} Task threw an uncaught and unrecoverable exception. Task is being killed and will not recover until manually restarted. (org.apache.kafka.connect.runtime.WorkerSinkTask:515)
org.apache.kafka.connect.errors.ConnectException: No fields found using key and value schemas for table: test-mysql-jdbc-foobar
at io.confluent.connect.jdbc.sink.metadata.FieldsMetadata.extract(FieldsMetadata.java:127)
at io.confluent.connect.jdbc.sink.metadata.FieldsMetadata.extract(FieldsMetadata.java:64)
at io.confluent.connect.jdbc.sink.BufferedRecords.add(BufferedRecords.java:71)
at io.confluent.connect.jdbc.sink.JdbcDbWriter.write(JdbcDbWriter.java:66)
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:69)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:495)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:288)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:198)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:166)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[2018-02-01 16:17:43,020] ERROR WorkerSinkTask{id=sink-mysql-0} Task threw an uncaught and unrecoverable exception (org.apache.kafka.connect.runtime.WorkerTask:172)
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:517)
at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:288)
at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:198)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:166)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:170)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:214)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.kafka.connect.errors.ConnectException: No fields found using key and value schemas for table: test-mysql-jdbc-foobar
at io.confluent.connect.jdbc.sink.metadata.FieldsMetadata.extract(FieldsMetadata.java:127)
at io.confluent.connect.jdbc.sink.metadata.FieldsMetadata.extract(FieldsMetadata.java:64)
at io.confluent.connect.jdbc.sink.BufferedRecords.add(BufferedRecords.java:71)
at io.confluent.connect.jdbc.sink.JdbcDbWriter.write(JdbcDbWriter.java:66)
at io.confluent.connect.jdbc.sink.JdbcSinkTask.put(JdbcSinkTask.java:69)
at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:495)
... 10 more
[2018-02-01 16:17:43,021] ERROR WorkerSinkTask{id=sink-mysql-0} Task is being killed and will not recover until manually restarted (org.apache.kafka.connect.runtime.WorkerTask:173)
Note that the topic test-mysql-jdbc-foobar contains data streamed from MySQL to Kafka; however, I am not able to stream this data from Kafka back into MySQL. The content of sink-mysql.properties looks identical to the one used in the official Confluent documentation, but it doesn't seem to work. Also, mysql-connector is placed in the right directory (under share/java/kafka-connect-jdbc/).
EDIT
Here is the content of my worker config file:
bootstrap.servers=localhost:9092
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
# Local storage file for offset data
offset.storage.file.filename=/tmp/connect.offsets
plugin.path=share/java
To be able to use the JDBC Sink, your messages must have a schema. This can be done by using Avro + Schema Registry, or JSON with schemas. In your worker config you've specified:
key.converter.schemas.enable=false
value.converter.schemas.enable=false
This means that the JSON will not contain the schemas.
Here's an example of the JSON that Kafka Connect will produce (as a source) and expect (as a sink) if you enable schemas: https://gist.github.com/rmoff/2b922fd1f9baf3ba1d66b98e9dd7b364
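If you stay with JsonConverter, the worker change is just flipping those flags (the data in the topic must itself carry the schema/payload envelope shown below, so it has to be produced that way too):
key.converter.schemas.enable=true
value.converter.schemas.enable=true
With schemas enabled, each message is a schema/payload envelope roughly like this sketch (the field names here are purely illustrative, not taken from your topic):
{
  "schema": {
    "type": "struct",
    "name": "foobar",
    "optional": false,
    "fields": [
      { "field": "c1", "type": "int32", "optional": false }
    ]
  },
  "payload": { "c1": 1 }
}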

Hive 1.2 Metastore Service doesn't start after configuring it to use S3 storage instead of HDFS

I have an Apache Spark cluster (2.2.0) in standalone mode. Until now it was using HDFS to store the Parquet files. I'm using the Hive metastore service of Apache Hive 1.2 to access Spark over JDBC through the Thrift server.
Now I want to use S3 object storage instead of HDFS. I have added the following configuration to my hive-site.xml:
<property>
    <name>fs.s3a.access.key</name>
    <value>access_key</value>
    <description>Profitbricks Access Key</description>
</property>
<property>
    <name>fs.s3a.secret.key</name>
    <value>secret_key</value>
    <description>Profitbricks Secret Key</description>
</property>
<property>
    <name>fs.s3a.endpoint</name>
    <value>s3-de-central.profitbricks.com</value>
    <description>ProfitBricks S3 Object Storage Endpoint</description>
</property>
<property>
    <name>fs.s3a.endpoint.http.port</name>
    <value>80</value>
    <description>ProfitBricks S3 Object Storage Endpoint HTTP Port</description>
</property>
<property>
    <name>fs.s3a.endpoint.https.port</name>
    <value>443</value>
    <description>ProfitBricks S3 Object Storage Endpoint HTTPS Port</description>
</property>
<property>
    <name>hive.metastore.warehouse.dir</name>
    <value>s3a://dev.spark.my_bucket/parquet/</value>
    <description>Profitbricks S3 Object Storage Hive Warehouse Location</description>
</property>
I have the Hive metastore in a MySQL 5.7 database, and I have added the following jar files to the Hive lib folder:
aws-java-sdk-1.7.4.jar
hadoop-aws-2.7.3.jar
I deleted the old Hive metastore schema in MySQL and then started the metastore service with the command hive --service metastore &, and I get the following error:
java.lang.NoClassDefFoundError: com/fasterxml/jackson/databind/ObjectMapper
at com.amazonaws.util.json.Jackson.<clinit>(Jackson.java:27)
at com.amazonaws.internal.config.InternalConfig.loadfrom(InternalConfig.java:182)
at com.amazonaws.internal.config.InternalConfig.load(InternalConfig.java:199)
at com.amazonaws.internal.config.InternalConfig$Factory.<clinit>(InternalConfig.java:232)
at com.amazonaws.ServiceNameFactory.getServiceName(ServiceNameFactory.java:34)
at com.amazonaws.AmazonWebServiceClient.computeServiceName(AmazonWebServiceClient.java:703)
at com.amazonaws.AmazonWebServiceClient.getServiceNameIntern(AmazonWebServiceClient.java:676)
at com.amazonaws.AmazonWebServiceClient.computeSignerByURI(AmazonWebServiceClient.java:278)
at com.amazonaws.AmazonWebServiceClient.setEndpoint(AmazonWebServiceClient.java:160)
at com.amazonaws.services.s3.AmazonS3Client.setEndpoint(AmazonS3Client.java:475)
at com.amazonaws.services.s3.AmazonS3Client.init(AmazonS3Client.java:447)
at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:391)
at com.amazonaws.services.s3.AmazonS3Client.<init>(AmazonS3Client.java:371)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:235)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2811)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:104)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:140)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:146)
at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:159)
at org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:177)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:601)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5757)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5990)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5915)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.ClassNotFoundException: com.fasterxml.jackson.databind.ObjectMapper
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
The missing class belongs to the Jackson library, so I copied the jackson-*.jar files located in my spark-2.2.0-bin-hadoop2.7/jars/ folder, which are:
jackson-annotations-2.6.5.jar
jackson-core-2.6.5.jar
jackson-core-asl-1.9.13.jar
jackson-databind-2.6.5.jar
jackson-jaxrs-1.9.13.jar
jackson-mapper-asl-1.9.13.jar
jackson-module-paranamer-2.6.5.jar
jackson-module-scala_2.11-2.6.5.jar
jackson-xc-1.9.13.jar
But then I got the following error:
2018-01-05 17:51:00,819 ERROR [main]: metastore.HiveMetaStore (HiveMetaStore.java:main(5920)) - Metastore Thrift Server threw an exception...
java.lang.NumberFormatException: For input string: "100M"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:589)
at java.lang.Long.parseLong(Long.java:631)
at org.apache.hadoop.conf.Configuration.getLong(Configuration.java:1319)
at org.apache.hadoop.fs.s3a.S3AFileSystem.initialize(S3AFileSystem.java:248)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2811)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:100)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2848)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2830)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:389)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:356)
at org.apache.hadoop.hive.metastore.Warehouse.getFs(Warehouse.java:104)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:140)
at org.apache.hadoop.hive.metastore.Warehouse.getDnsPath(Warehouse.java:146)
at org.apache.hadoop.hive.metastore.Warehouse.getWhRoot(Warehouse.java:159)
at org.apache.hadoop.hive.metastore.Warehouse.getDefaultDatabasePath(Warehouse.java:177)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:601)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:620)
at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:461)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:66)
at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5757)
at org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5990)
at org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5915)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:234)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
I think the error here has something to do with a jar version incompatibility, but I'm not able to find the correct versions.
Can someone help me here?
You absolutely cannot mix versions of hadoop-common, hadoop-aws, the AWS S3 SDK and Jackson beyond what each expects, or you will see stack traces like these.
It's all open source, so if you download all the source JARs locally, your IDE will help you find what's causing the stack trace. This is what we all do. It's not magic; modern IDEs (IntelliJ IDEA) even have dedicated stack-trace debugging.
This one is happening because the value of fs.s3a.multipart.size set in hadoop-common's /core-default.xml resource is 100M, which came in with HADOOP-13680 along with range parsing that handles values like "100M" instead of 104857600. This stack trace says "Hadoop 2.8+ configuration".
You could try setting the property in your configs to that numeric value, but it's a warning sign that the JAR versions are out of sync, and you will probably only get a few lines further before something else breaks.
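For example, in hive-site.xml that stop-gap would look like this (104857600 bytes being the numeric equivalent of 100M):
<property>
    <name>fs.s3a.multipart.size</name>
    <value>104857600</value>
    <description>Numeric replacement for the "100M" default that this hadoop-common version cannot parse</description>
</property>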
Fix: make sure that hadoop-common.jar and hadoop-aws.jar are in sync. It looks like you've got the Jackson and AWS ones lined up, though Jackson is complex enough that you can never take that for granted.

Spring Boot DatasourceBean error, Macbook, MySQL

I'm getting the following error when I deploy my application to Tomcat 8. I have MySQL installed on my MacBook:
Caused by: org.springframework.beans.factory.BeanCreationException: Could not autowire field: org.springframework.jdbc.core.namedparam.NamedParameterJdbcTemplate com.catalina.productcarried.dao.impl.ProductCarriedDAOImpl.jdbcTemplate; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration$JdbcTemplateConfiguration': Injection of autowired dependencies failed; nested exception is org.springframework.beans.factory.BeanCreationException: Could not autowire field: private javax.sql.DataSource org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration$JdbcTemplateConfiguration.dataSource; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'dataSource' defined in class path resource [org/springframework/boot/autoconfigure/jdbc/DataSourceAutoConfiguration$NonEmbeddedConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [javax.sql.DataSource]: Factory method 'dataSource' threw exception; nested exception is org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException: Cannot determine embedded database driver class for database type NONE. If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to active it (no profiles are currently active).
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.inject(AutowiredAnnotationBeanPostProcessor.java:573) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.annotation.InjectionMetadata.inject(InjectionMetadata.java:88) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor.postProcessPropertyValues(AutowiredAnnotationBeanPostProcessor.java:331) ~[spring-beans-4.2.4.RELEASE.jar:4.2.4.RELEASE]
... 26 common frames omitted
Here is my application.properties file:
#Database properties local
spring.datasource.url=jdbc:mysql://localhost:3306/simulator?autoReconnect=true&useSSL=false
spring.datasource.username=root
spring.datasource.password=
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
debug=true
Cannot determine embedded database driver class for database type NONE. If you want an embedded database please put a supported one on the classpath. If you have database settings to be loaded from a particular profile you may need to active it (no profiles are currently active).
Your application.properties is not being taken into account for some reason. Note that the driver-class-name is unnecessary; Spring Boot detects it from the url property.
Make sure to review the following:
The MySQL driver is defined in your dependencies (see the dependency sketch after this list)
The application.properties file you provided in the description is in one of the supported locations
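For the first point, a typical Maven declaration would look something like this (the version is normally managed by the Spring Boot parent, assuming you use it); for the second point, the usual supported location is src/main/resources, so the file ends up at the root of the classpath:
<dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <scope>runtime</scope>
</dependency>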

XWiki + MySQL Database: Errors

I have (as far as I know) properly followed all of the instructions on xwiki.org, and I have also searched high and low for answers to my errors, but have unfortunately been unable to resolve them by myself.
I have downloaded XWiki in two different formats: the first is the .exe installer, and the second is the suggested .xar package, both for use on a Windows 8, 64-bit machine.
The first runs perfectly in "standalone" mode, but once the hibernate.cfg.xml file is changed to the preset MySQL settings (which match my database), I get a very long HTTP 500 error. (This is after I added the MySQL connector Java file and created a MySQL database according to the instructions on xwiki.org.) The primary issue seems to be that it cannot find com.mysql.jdbc.Driver; later on the page there is also an error saying it cannot create a database connection pool.
The code pertaining to the error is this:
<!-- DBCP Connection Pooling configuration -->
<property name="dbcp.defaultAutoCommit">false</property>
<property name="dbcp.maxActive">50</property>
<property name="dbcp.maxIdle">5</property>
<property name="dbcp.maxWait">30000</property>
<property name="dbcp.whenExhaustedAction">1</property>
<property name="dbcp.ps.whenExhaustedAction">1</property>
<property name="dbcp.ps.maxWait">120000</property>
<property name="dbcp.ps.maxIdle">20</property>
<property name="connection.provider_class">com.xpn.xwiki.store.DBCPConnectionProvider</property>
and this
<!-- MySQL configuration.
Uncomment if you want to use MySQL and comment out other database configurations.
-->
<property name="connection.url">jdbc:mysql://localhost/xwiki</property>
<property name="connection.username">xwiki</property>
<property name="connection.password">xwiki</property>
<property name="connection.driver_class">com.mysql.jdbc.Driver</property>
<property name="dialect">org.hibernate.dialect.MySQL5InnoDBDialect</property>
<property name="dbcp.ps.maxActive">20</property>
<mapping resource="xwiki.hbm.xml"/>
<mapping resource="feeds.hbm.xml"/>
<mapping resource="activitystream.hbm.xml"/>
<mapping resource="instance.hbm.xml"/>
And the error itself looks like this:
HTTP ERROR 500
Problem accessing /xwiki/bin/view/Main/. Reason:
Error number 3 in 0: Could not initialize main XWiki context
Caused by:
com.xpn.xwiki.XWikiException: Error number 3 in 0: Could not initialize main XWiki context
at com.xpn.xwiki.XWiki.getMainXWiki(XWiki.java:422)
at com.xpn.xwiki.XWiki.getXWiki(XWiki.java:491)
at com.xpn.xwiki.web.XWikiAction.execute(XWikiAction.java:152)
at com.xpn.xwiki.web.XWikiAction.execute(XWikiAction.java:128)
at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:431)
at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:236)
at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1196)
at org.apache.struts.action.ActionServlet.doGet(ActionServlet.java:414)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:669)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1448)
at com.xpn.xwiki.web.ActionFilter.doFilter(ActionFilter.java:121)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.xwiki.wysiwyg.server.filter.ConversionFilter.doFilter(ConversionFilter.java:144)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at com.xpn.xwiki.plugin.webdav.XWikiDavFilter.doFilter(XWikiDavFilter.java:66)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.xwiki.container.servlet.filters.internal.SavedRequestRestorerFilter.doFilter
(SavedRequestRestorerFilter.java:208)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.xwiki.container.servlet.filters.internal.SetCharacterEncodingFilter.doFilter
(SetCharacterEncodingFilter.java:111)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle
(ContextHandlerCollection.java:255)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:368)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest
(AbstractHttpConnection.java:488)
at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete
(AbstractHttpConnection.java:932)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete
(AbstractHttpConnection.java:994)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:640)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:628)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Unknown Source)
Caused by: com.xpn.xwiki.XWikiException: Error number 3202 in 3: Exception while reading
document [xwiki:XWiki.XWikiServerClass]
at com.xpn.xwiki.store.XWikiHibernateStore.loadXWikiDoc(XWikiHibernateStore.java:916)
at com.xpn.xwiki.store.XWikiCacheStore.loadXWikiDoc(XWikiCacheStore.java:290)
at com.xpn.xwiki.XWiki.getDocument(XWiki.java:1515)
at com.xpn.xwiki.XWiki.getDocument(XWiki.java:1558)
at com.xpn.xwiki.XWiki.initializeMandatoryClasses(XWiki.java:885)
at com.xpn.xwiki.XWiki.initXWiki(XWiki.java:855)
at com.xpn.xwiki.XWiki.<init>(XWiki.java:792)
at com.xpn.xwiki.XWiki.getMainXWiki(XWiki.java:402)
... 45 more
Caused by: org.hibernate.HibernateException: Could not create a DBCP pool. There is an error
in the hibernate configuration file, please review it.
at com.xpn.xwiki.store.DBCPConnectionProvider.configure(DBCPConnectionProvider.java:215)
at org.hibernate.connection.ConnectionProviderFactory.newConnectionProvider
(ConnectionProviderFactory.java:143)
at org.hibernate.connection.ConnectionProviderFactory.newConnectionProvider
(ConnectionProviderFactory.java:84)
at org.hibernate.cfg.SettingsFactory.createConnectionProvider(SettingsFactory.java:459)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:90)
at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2863)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2859)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1870)
at com.xpn.xwiki.store.XWikiHibernateBaseStore.initHibernate(XWikiHibernateBaseStore.java:223)
at com.xpn.xwiki.store.XWikiHibernateBaseStore.checkHibernate(XWikiHibernateBaseStore.java:640)
at com.xpn.xwiki.store.XWikiHibernateStore.loadXWikiDoc(XWikiHibernateStore.java:789)
... 52 more
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot load JDBC driver
class 'com.mysql.jdbc.Driver'
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1429)
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1371)
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
at com.xpn.xwiki.store.DBCPConnectionProvider.configure(DBCPConnectionProvider.java:197)
... 62 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:430)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:383)
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory(BasicDataSource.java:1420)
... 65 more
Caused by:
com.xpn.xwiki.XWikiException: Error number 3202 in 3: Exception while reading document
[xwiki:XWiki.XWikiServerClass]
at com.xpn.xwiki.store.XWikiHibernateStore.loadXWikiDoc(XWikiHibernateStore.java:916)
at com.xpn.xwiki.store.XWikiCacheStore.loadXWikiDoc(XWikiCacheStore.java:290)
at com.xpn.xwiki.XWiki.getDocument(XWiki.java:1515)
at com.xpn.xwiki.XWiki.getDocument(XWiki.java:1558)
at com.xpn.xwiki.XWiki.initializeMandatoryClasses(XWiki.java:885)
at com.xpn.xwiki.XWiki.initXWiki(XWiki.java:855)
at com.xpn.xwiki.XWiki.<init>(XWiki.java:792)
at com.xpn.xwiki.XWiki.getMainXWiki(XWiki.java:402)
at com.xpn.xwiki.XWiki.getXWiki(XWiki.java:491)
at com.xpn.xwiki.web.XWikiAction.execute(XWikiAction.java:152)
at com.xpn.xwiki.web.XWikiAction.execute(XWikiAction.java:128)
at org.apache.struts.action.RequestProcessor.processActionPerform(RequestProcessor.java:431)
at org.apache.struts.action.RequestProcessor.process(RequestProcessor.java:236)
at org.apache.struts.action.ActionServlet.process(ActionServlet.java:1196)
at org.apache.struts.action.ActionServlet.doGet(ActionServlet.java:414)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:735)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:848)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:669)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1448)
at com.xpn.xwiki.web.ActionFilter.doFilter(ActionFilter.java:121)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.xwiki.wysiwyg.server.filter.ConversionFilter.doFilter(ConversionFilter.java:144)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at com.xpn.xwiki.plugin.webdav.XWikiDavFilter.doFilter(XWikiDavFilter.java:66)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.xwiki.container.servlet.filters.internal.SavedRequestRestorerFilter.doFilter
(SavedRequestRestorerFilter.java:208)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.xwiki.container.servlet.filters.internal.SetCharacterEncodingFilter.doFilter
(SetCharacterEncodingFilter.java:111)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1419)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:455)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:137)
at org.eclipse.jetty.security.SecurityHandler.handle(SecurityHandler.java:557)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:231)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1075)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:384)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:193)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1009)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)
at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle
(ContextHandlerCollection.java:255)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:154)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)
at org.eclipse.jetty.server.Server.handle(Server.java:368)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest
(AbstractHttpConnection.java:488)
at org.eclipse.jetty.server.AbstractHttpConnection.headerComplete
(AbstractHttpConnection.java:932)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.headerComplete
(AbstractHttpConnection.java:994)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:640)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:628)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:52)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)
at java.lang.Thread.run(Unknown Source)
Caused by: org.hibernate.HibernateException: Could not create a DBCP pool. There is an error
in the hibernate configuration file, please review it.
at com.xpn.xwiki.store.DBCPConnectionProvider.configure(DBCPConnectionProvider.java:215)
at org.hibernate.connection.ConnectionProviderFactory.newConnectionProvider
(ConnectionProviderFactory.java:143)
at org.hibernate.connection.ConnectionProviderFactory.newConnectionProvider
(ConnectionProviderFactory.java:84)
at org.hibernate.cfg.SettingsFactory.createConnectionProvider(SettingsFactory.java:459)
at org.hibernate.cfg.SettingsFactory.buildSettings(SettingsFactory.java:90)
at org.hibernate.cfg.Configuration.buildSettingsInternal(Configuration.java:2863)
at org.hibernate.cfg.Configuration.buildSettings(Configuration.java:2859)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1870)
at com.xpn.xwiki.store.XWikiHibernateBaseStore.initHibernate(XWikiHibernateBaseStore.java:223)
at com.xpn.xwiki.store.XWikiHibernateBaseStore.checkHibernate
(XWikiHibernateBaseStore.java:640)
at com.xpn.xwiki.store.XWikiHibernateStore.loadXWikiDoc(XWikiHibernateStore.java:789)
... 52 more
Caused by: org.apache.commons.dbcp.SQLNestedException: Cannot load JDBC driver
class 'com.mysql.jdbc.Driver'
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory
(BasicDataSource.java:1429)
at org.apache.commons.dbcp.BasicDataSource.createDataSource(BasicDataSource.java:1371)
at org.apache.commons.dbcp.BasicDataSource.getConnection(BasicDataSource.java:1044)
at com.xpn.xwiki.store.DBCPConnectionProvider.configure(DBCPConnectionProvider.java:197)
... 62 more
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:430)
at org.eclipse.jetty.webapp.WebAppClassLoader.loadClass(WebAppClassLoader.java:383)
at org.apache.commons.dbcp.BasicDataSource.createConnectionFactory
(BasicDataSource.java:1420)
... 65 more
Do I have to use the .xar installation version to properly connect to a database? I tried that way also, with XAMPP and Tomcat. Tomcat installed fine and I can easily access it through localhost:(port). However, when I try to access my xwiki, I get an HTTP 404 error saying that the reference cannot be found, which is odd since it shows up with the other files in Tomcat's deployment list, and the Tomcat example application shown there (located in the same directory) works fine...
I'm a bit concerned that even if I could get XWiki to work properly through Tomcat, I would still get the first error I mentioned when I try to connect to the MySQL database.
Side note: based on other errors I have seen while playing around and trying to make things work, I suspect there may be conflicts between the Java libraries found in XWiki and those in Tomcat... but I could be wrong, as I currently have very little knowledge of Java. I've seen several servlet errors and "could not find class/file" errors that, according to a forum I found, are common errors belonging to jasper.jar and jsp-api.jar, both of which are found in Tomcat.
Regardless, I am open to trying installation options that do or do not involve Tomcat. If anyone has had success with installing XWiki on Windows with a MySQL database, your help would be much appreciated!
Looks like you are missing the MySQL driver. Since you converted the default distribution to use MySQL, you will need to download a MySQL driver JAR and deploy it to your 'webapps\xwiki\WEB-INF\lib' directory.
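For example, on Windows something along these lines should do it (the JAR name and install path are placeholders for whatever Connector/J version and XWiki location you actually have); restart the server afterwards so the new JAR is picked up:
copy mysql-connector-java-5.1.38-bin.jar "C:\xwiki\webapps\xwiki\WEB-INF\lib\"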
Regarding the second issue, there is probably no connection between the database driver error, the XAR and the Tomcat 404. The XAR is simply a collection of pages used to import or export content and functionality; it doesn't usually handle low-level concerns like database connectivity. The 404 in Tomcat is probably due to the fact that you accessed an invalid URL. We would need more details on that one.
First, regarding a potential conflict with Tomcat: I can assure you this is not possible, as Tomcat is what the XWiki team uses in production all the time.
Now for MySQL, the error is pretty clear, and if you really did add the MySQL connector then you did not put it in the right place. What version of it did you use, and is it in #xwiki folder#/WEB-INF/lib/?
The XAR package is not a different way of installing XWiki; it's a package containing wiki pages that can be imported into XWiki when it's running. Perhaps you mean the WAR package?
This was resolved some months ago. I was missing some vital information in the set-up, but it works fine now.