Play 2.3 subproject routing "not a member" error

I'm having trouble routing my subproject, even though I think I followed all the steps to include it. My project structure is:
|- mainProject
   |- app
   |- conf
   |- modules
      |- subProject
         |- app
            |- assets
            |- controllers
               |- subProject
                  |- Application.scala
                  |- Assets.scala
                  |- MyCode.java
            |- models
               |- subProject
                  |- MyModel.java
            |- views
               |- subProject
                  |- myView.scala.html
         |- conf
            |- subProject.routes
My subProject.routes file:
GET / subProject.app.controllers.subProject.MyCode.index()
POST /add subProject.app.controllers.subProject.MyCode.add
POST /edit/:id subProject.app.controllers.subProject.MyCode.edit(id: Long)
GET /load/:id subProject.app.controllers.subProject.MyCode.load(id: Long)
GET /list subProject.app.controllers.subProject.MyCode.list()
# Map static resources from the /public folder to the /assets URL path
GET /assets/*file subProject.app.controllers.subProject.Assets.at(path="/public", file)
GET /webjars/*file controllers.WebJarAssets.at(file)
GET /javascriptRoutes subProject.app.controllers.subProject.Application.javascriptRoutes()
My view contains this line:
<script src="@controllers.subProject.routes.Assets.at("javascripts/myJavascript.js")"></script>
And I still get the error:
[error] /path/git/mainProject/modules/subProject/app/views/subProject/myview.scala.html:8: object subProject is not a member of package controllers
[error] <script src="@controllers.subProject.routes.Assets.at("javascripts/myJavascript.js")"></script>
[error] ^
[error] one error found
[error] (subProject/compile:compile) Compilation failed
[error] Total time: 1 s, completed 07-Apr-2015 17:21:56
I think my routes file is wrong, but if I do it like in the docs,
GET / controllers.subProject.MyCode.index()
I get
[error] /path/git/mainProject/modules/subProject/conf/subProject.routes:6: object MyCode is not a member of package controllers.subProject
[error] GET / controllers.subProject.MyCode.index()
Searching for similar problems didn't help. What am I doing wrong?
Edit: I just found that this problem is masking another problem with the reverse routes:
[error] /path/git/mainProject/modules/subProject/target/scala-2.10/src_managed/main/controllers/subProject/routes.java:8: error: cannot find symbol
[error] public static final controllers.subProject.ReverseMyCode MyCode = new controllers.subProject.ReverseMyCode();
[error] ^
[error] symbol: class ReverseMyCode
[error] location: package controllers.subProject
[error] /path/git/mainProject/modules/subProject/target/scala-2.10/src_managed/main/controllers/subProject/routes.java:11: error: package controllers.subProject.javascript does not exist
[error] public static final controllers.subProject.javascript.ReverseMyCode MyCode = new controllers.subProject.javascript.ReverseMyCode();
[error] ^
[error] /path/git/mainProject/modules/subProject/target/scala-2.10/src_managed/main/controllers/subProject/routes.java:16: error: package controllers.subProject.ref does not exist
[error] public static final controllers.subProject.ref.ReverseMyCode MyCode = new controllers.subProject.ref.ReverseMyCode();

I had the same problem. I suppose the package of MyCode.java is modules.subProject.app.controllers.subProject?
That's because your IDE creates your classes in packages that mirror the directory layout.
The package of the files in your subproject must start with controllers.subProject.
And your subProject.routes file must be:
GET / controllers.subProject.MyCode.index()
POST /add controllers.subProject.MyCode.add
POST /edit/:id controllers.subProject.MyCode.edit(id: Long)
GET /load/:id controllers.subProject.MyCode.load(id: Long)
GET /list controllers.subProject.MyCode.list()
For the reverse-routes error, just clean your project: activator clean compile
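For example, a minimal sketch of what Assets.scala could look like after the fix (this mirrors the Play 2.3 sub-project documentation; only the package line is the essential part):

package controllers.subProject

// Reuse Play's built-in asset controller under the subproject's
// own reverse-router namespace.
object Assets extends controllers.AssetsBuilder

With the packages declared this way, the template call @controllers.subProject.routes.Assets.at("javascripts/myJavascript.js") should resolve as expected.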

Related

AWS EMR Spark exception on jdbc datasource load

I'm spinning up an emr-5.31.0 AWS EMR cluster with Spark 2.4.6 on board, then logging in to spark-shell on the master node and following this tutorial:
https://bigdataprogrammers.com/load-data-from-mysql-in-spark-using-jdbc/
for loading data from my RDS MySQL instance.
I've uploaded both the connector jar (mysql-connector-java-5.1.49-bin.jar) and the script to the /home/hadoop folder.
Then I proceed as described in the tutorial and get two errors:
[hadoop@ip-172-31-* ~]$ spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
20/10/09 16:41:31 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
Spark context Web UI available at http://ip-172-31-*.ec2.internal:4040
Spark context available as 'sc' (master = yarn, app id = application_1602254033216_0005).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.6-amzn-0
      /_/
Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 1.8.0_265)
Type in expressions to have them evaluated.
Type :help for more information.
scala> :require /home/hadoop/mysql-connector-java-5.1.49-bin.jar
Added '/home/hadoop/mysql-connector-java-5.1.49-bin.jar' to classpath.
scala> :load /home/hadoop/test01.scala
Loading /home/hadoop/test01.scala...
import java.sql.{Connection, DriverManager, ResultSet}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.SQLContext
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
error: error while loading package, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/execution/package.class)' has location not matching its contents: contains package object execution
error: error while loading QueryExecution, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/execution/QueryExecution.class)' has location not matching its contents: contains class QueryExecution
error: error while loading package, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/plans/package.class)' has location not matching its contents: contains package object plans
error: error while loading LogicalPlan, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/plans/logical/LogicalPlan.class)' has location not matching its contents: contains class LogicalPlan
error: error while loading package, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/encoders/package.class)' has location not matching its contents: contains package object encoders
error: error while loading ExpressionEncoder, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/encoders/ExpressionEncoder.class)' has location not matching its contents: contains class ExpressionEncoder
error: error while loading Expression, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/expressions/Expression.class)' has location not matching its contents: contains class Expression
error: error while loading NamedExpression, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/catalyst/expressions/NamedExpression.class)' has location not matching its contents: contains class NamedExpression
error: error while loading DataFrameNaFunctions, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/DataFrameNaFunctions.class)' has location not matching its contents: contains class DataFrameNaFunctions
error: error while loading DataFrameStatFunctions, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/DataFrameStatFunctions.class)' has location not matching its contents: contains class DataFrameStatFunctions
error: error while loading TypedColumn, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/TypedColumn.class)' has location not matching its contents: contains class TypedColumn
error: error while loading package, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/package.class)' has location not matching its contents: contains package object function
error: error while loading ReduceFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/ReduceFunction.class)' has location not matching its contents: contains class ReduceFunction
error: error while loading KeyValueGroupedDataset, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/KeyValueGroupedDataset.class)' has location not matching its contents: contains class KeyValueGroupedDataset
error: error while loading MapFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/MapFunction.class)' has location not matching its contents: contains class MapFunction
error: error while loading Metadata, class file '/usr/lib/spark/jars/spark-catalyst_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/types/Metadata.class)' has location not matching its contents: contains class Metadata
error: error while loading FilterFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/FilterFunction.class)' has location not matching its contents: contains class FilterFunction
error: error while loading MapPartitionsFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/MapPartitionsFunction.class)' has location not matching its contents: contains class MapPartitionsFunction
error: error while loading FlatMapFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/FlatMapFunction.class)' has location not matching its contents: contains class FlatMapFunction
error: error while loading ForeachFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/ForeachFunction.class)' has location not matching its contents: contains class ForeachFunction
error: error while loading ForeachPartitionFunction, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/api/java/function/ForeachPartitionFunction.class)' has location not matching its contents: contains class ForeachPartitionFunction
error: error while loading StorageLevel, class file '/usr/lib/spark/jars/spark-core_2.11-2.4.6-amzn-0.jar(org/apache/spark/storage/StorageLevel.class)' has location not matching its contents: contains class StorageLevel
error: error while loading CreateViewCommand, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/execution/command/CreateViewCommand.class)' has location not matching its contents: contains class CreateViewCommand
error: error while loading DataFrameWriter, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/DataFrameWriter.class)' has location not matching its contents: contains class DataFrameWriter
error: error while loading DataStreamWriter, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/streaming/DataStreamWriter.class)' has location not matching its contents: contains class DataStreamWriter
error: error while loading SparkPlan, class file '/usr/lib/spark/jars/spark-sql_2.11-2.4.6-amzn-0.jar(org/apache/spark/sql/execution/SparkPlan.class)' has location not matching its contents: contains class SparkPlan
scala> :load /home/hadoop/test01.scala
Loading /home/hadoop/test01.scala...
import java.sql.{Connection, DriverManager, ResultSet}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.SQLContext
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext
defined object ReadDataFromJdbc
scala> ReadDataFromJdbc.main(Array("batches"))
Started.......Fri Oct 09 16:42:02 UTC 2020 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
[Stage 0:> (0 + 1) / 1]20/10/09 16:42:04 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, ip-172-31-20-13.ec2.internal, executor 1): java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:111)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:55)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD.compute(JDBCRDD.scala:272)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.lang.ClassLoader.findClass(ClassLoader.java:523)
at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:106)
... 25 more
[Stage 0:> (0 + 0) / 1]20/10/09 16:42:05 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
(Connectivity Failed for Table ,org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, ip-172-31-27-165.ec2.internal, executor 2): java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:111)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.sql.execution.datasources.jdbc.DriverRegistry$.register(DriverRegistry.scala:45)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:55)
at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$createConnectionFactory$1.apply(JdbcUtils.scala:54)
at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD.compute(JDBCRDD.scala:272)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:310)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1405)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.lang.ClassLoader.findClass(ClassLoader.java:523)
at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:106)
... 25 more
Driver stacktrace:)
The first error appears when I load the Scala script: it loads with a stream of errors, but repeating the same command seems to fix it.
The second error occurs once I request data to be loaded from MySQL: despite the MySQL JDBC connector having been added to the classpath by the earlier command, it fails with java.lang.ClassNotFoundException: com.mysql.jdbc.Driver.
While I believe I can find some directory that Spark will search for the JDBC jar, I'm super confused by the error appearing on load of the script - why does it appear, and how can it be fixed?
I ended up creating a bootstrap step for the cluster that copies the mysql-connector-java jar to all nodes of the cluster before Spark and Hadoop are even installed.
First, create the copymysqljar.sh script:
#!/bin/bash
# Fetch the connector from S3 and copy it where Spark and Hadoop pick up jars.
sudo mkdir -p /home/hadoop
sudo mkdir -p /usr/lib/spark/jars
sudo mkdir -p /usr/lib/hadoop/lib
aws s3 cp s3://<YOUR_BUCKET>/mysql-connector-java-5.1.49-bin.jar /home/hadoop
chmod 777 /home/hadoop/mysql-connector-java-5.1.49-bin.jar
sudo cp /home/hadoop/mysql-connector-java-5.1.49-bin.jar /usr/lib/spark/jars
sudo cp /home/hadoop/mysql-connector-java-5.1.49-bin.jar /usr/lib/hadoop/lib
2. Save copymysqljar.sh to the S3 bucket identified by s3://<YOUR_BUCKET>.
3. Proceed to cluster creation in AWS via 'create cluster' -> 'advanced configuration'.
4. During advanced configuration, on step 4, create a custom bootstrap action with s3://<YOUR_BUCKET>/copymysqljar.sh as the script.
5. Start cluster creation.
Alternatively, instead of steps 3, 4 and 5 you can do the same with the AWS command-line tools, as sketched below.
The official docs cover custom bootstrap actions: https://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-plan-bootstrap.html#CustomBootstrapCopyS3Object
In general, this script takes care of everything for AWS EMR 5.31 with Hadoop, Spark and Zeppelin. You might need to copy the jar to other directories if other tools should connect to MySQL too.
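For reference, a hedged sketch of that CLI alternative (cluster name, instance type/count and roles are illustrative placeholders, not from the original setup; the --bootstrap-actions flag is the relevant part):

aws emr create-cluster \
  --name "spark-mysql-cluster" \
  --release-label emr-5.31.0 \
  --applications Name=Hadoop Name=Spark Name=Zeppelin \
  --instance-type m5.xlarge \
  --instance-count 3 \
  --use-default-roles \
  --bootstrap-actions Path=s3://<YOUR_BUCKET>/copymysqljar.sh,Name=CopyMySQLJar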

In Chisel3, what imports do I need for the Printable examples?

I am trying to follow the example here:
https://github.com/freechipsproject/chisel3/wiki/Printing-in-Chisel#custom-printing
As in the example, I overrode def toPrintable: Printable with a concatenation of p"..." + strings.
In my scala file I
import chisel3._
But I get Scala compile errors: it doesn't know what Printable is, nor what to do with the p interpolator, which makes me think I don't have the right imports.
Is there something I need to import other than chisel3._?
Here is a bit more information on what I am doing and what error I am getting.
I am modifying this file:
https://github.com/chipsalliance/rocket-chip/blob/e6a6c67f30d668e702ddbef93789e9b4f709b237/src/main/scala/tilelink/Bundles.scala
Here is what I have added:
...
import Chisel._
import chisel3.{Printable}
...
final class TLBundleA(params: TLBundleParameters)
  extends TLBundleBase(params) with TLAddrChannel
{
  ...
  override def toPrintable: Printable = {
    p"A:\t" +
    p"opcode[${opcode}]\t" +
    p"param[${param}]\t" +
    p"size[${size}]\t" +
    p"source[${source}]\t" +
    p"address[${address}]\t" +
    p"user[${user}]\t" +
    p"mask[${mask}]\t" +
    p"data[${data}]\t" +
    p"corrupt[${corrupt}]\n"
  }
}
It seems that (unlike import chisel3._), import chisel3.{Printable} works, but I am not doing something right with the p"...". I get this series of errors:
[error] rocket-chip/src/main/scala/tilelink/Bundles.scala:189:5: value p is not a member of StringContext
[error] p"A:\t" +
[error] ^
[error] rocket-chip/src/main/scala/tilelink/Bundles.scala:190:5: value p is not a member of StringContext
[error] p"opcode[${opcode}]\t" +
[error] ^
...
[error] rocket-chip/src/main/scala/tilelink/Bundles.scala:198:5: value p is not a member of StringContext
[error] p"corrupt[${corrupt}]\n"
[error] ^
[error] 10 errors found
[error] (Compile / compileIncremental) Compilation failed
EDIT TO UPDATE:
After looking at the Chisel3 source, I added
import chisel3.{Printable, PrintableHelper}
and now I get a new error which looks more like I am just messing up my string:
[error] (run-main-0) java.lang.IllegalArgumentException: requirement failed
[error] java.lang.IllegalArgumentException: requirement failed
[error] at scala.Predef$.require(Predef.scala:268)
[error] at chisel3.printf$.escaped$1(Printf.scala:28)
[error] at chisel3.printf$.$anonfun$format$3(Printf.scala:32)
[error] at chisel3.printf$.$anonfun$format$3$adapted(Printf.scala:32)
[error] at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
[error] at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
[error] at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
[error] at scala.collection.immutable.StringOps.foreach(StringOps.scala:33)
[error] at scala.collection.TraversableLike.map(TraversableLike.scala:238)
[error] at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
[error] at scala.collection.immutable.StringOps.map(StringOps.scala:33)
[error] at chisel3.printf$.format(Printf.scala:32)
[error] at chisel3.internal.firrtl.Emitter.emit(Emitter.scala:75)
[error] at chisel3.internal.firrtl.Emitter.$anonfun$moduleDefn$4(Emitter.scala:140)
EDIT TO UPDATE:
I added more verbosity to the requires in the printf code and I think it doesn't like tabs (\t):
[error] (run-main-0) java.lang.IllegalArgumentException: requirement failed: char to Int 9 must be >= 32
[error] java.lang.IllegalArgumentException: requirement failed: char to Int 9 must be >= 32
EDIT TO UPDATE:
I modified chisel3 to accept tabs, and now the code compiles and runs, but below is one of the resulting Verilog strings, which is obviously not right. I tried the string2Printable import you suggested but still get the following:
freechips.rocketchip.system.DefaultConfig.v: $fwrite(32'h80000002,"PLIC_TL_IFC A:\tPrintables(ArrayBuffer(PString(opcode[), Decimal(UInt<3>(IO in unelaborated TLMonitor)), PString(]\t)))Printables(ArrayBuffer(PString(param[), Decimal(UInt<3>(IO in unelaborated TLMonitor)), PString(]\t)))Printables(ArrayBuffer(PString(size[), Decimal(UInt<2>(IO in unelaborated TLMonitor)), PString(]\t)))Printables(ArrayBuffer(PString(source[), Decimal(UInt<9>(IO in unelaborated TLMonitor)), PString(]\t)))Printables(ArrayBuffer(PString(address[), Decimal(UInt<28>(IO in unelaborated TLMonitor)), PString(]\t)))Printables(ArrayBuffer(PString(user[), PString(None), PString(]\t)))Printables(ArrayBuffer(PString(mask[), Decimal(UInt<8>(IO in unelaborated TLMonitor)), PString(]\t)))Printables(ArrayBuffer(PString(data[), Decimal(UInt<64>(IO in unelaborated TLMonitor)), PString(]\t)))Printables(ArrayBuffer(PString(corrupt[), Decimal(Bool(IO in unelaborated TLMonitor)), PString(]\n)))"); // #[Monitor.scala 577:40:freechips.rocketchip.system.DefaultConfig.fir#64082.10]
freechips.rocketchip.system.DefaultConfig.v: $fwrite(32'h80000002,"PLIC_TL_IFC TLBundleD(opcode -> %d, param -> %d, size -> %d, source -> %d, sink -> %d, denied -> %d, data -> %d, corrupt -> %d)",io_in_d_bits_opcode,2'h0,io_in_d_bits_size,io_in_d_bits_source,1'h0,1'h0,io_in_d_bits_data,1'h0); // #[Monitor.scala 578:40:freechips.rocketchip.system.DefaultConfig.fir#64090.10]
For reference, here is the callsite where I am actually calling this:
when (bundle.a.fire()) { printf(p"$prefix ${bundle.a.bits}")}
when (bundle.d.fire()) { printf(p"$prefix ${bundle.d.bits}")}
Note the bundle.d.bits looks "okay" because I haven't given it a specific toPrintable function, but bundle.a.bits is a mess.
EDIT: RESOLVED
The problem is that, in the course of debugging, I had changed my toPrintable to be of the form
override def toPrintable: Printable = {
  "A:\t" +
  p"opcode[$opcode]\t" +
  ...
  p"corrupt[$corrupt]\n"
}
The first non-p string made the whole expression plain String concatenation, ignoring the p interpolators. I put the first p"A:\t" back and now it interpolates correctly.
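For reference, a minimal sketch of the corrected form (field names as in the bundle above): every fragment, including the first, goes through the p interpolator, so + combines Printables rather than Strings:

override def toPrintable: Printable = {
  p"A:\t" +                  // p"..." first, so + resolves on Printable
  p"opcode[$opcode]\t" +
  p"param[$param]\t" +
  p"corrupt[$corrupt]\n"
}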
The import chisel3._ should be enough. Try looking at the examples in the Chisel3 repo, such as LFSR16.scala. If that doesn't help, can you show the error message?
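Putting the thread together: when a file imports the compatibility layer with import Chisel._ instead of import chisel3._, the interpolator has to be pulled in explicitly, which is what the edits above found. A sketch of the import set that made the example compile:

import Chisel._
// Printable is the result type; PrintableHelper is the implicit class that
// adds the p"..." interpolator to StringContext.
import chisel3.{Printable, PrintableHelper}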

Error: Attempted to instantiate a Module without wrapping it in Module()

The top module is as follows:
class PE(DataWidth: Int, NumLinks: Int, NumEntries: Int, FifoDepth: Int) extends Module {
  val io = IO(new Bundle {
    ...
  })
  ...
}
I think this is the ordinary style for Chisel3.
I run sbt as follows to lint the code:
sbt 'test:runMain noc.PEMain'
Then I get the error messages below:
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
[info] Running noc.NoCMain
[info] [0.002] Elaborating design...
[error] (run-main-0) chisel3.internal.ChiselException: Error: attempted to instantiate a Module without wrapping it in Module().
[error] chisel3.internal.ChiselException: Error: attempted to instantiate a Module without wrapping it in Module().
[error] at chisel3.internal.throwException$.apply(Error.scala:13)
[error] at chisel3.core.BaseModule.<init>(Module.scala:90)
[error] at chisel3.core.UserModule.<init>(UserModule.scala:18)
[error] at chisel3.core.ImplicitModule.<init>(UserModule.scala:102)
[error] at chisel3.core.LegacyModule.<init>(UserModule.scala:127)
[error] at noc.NumGen.<init>(NoC.scala:328)
[error] at noc.FanIn_Link.<init>(NoC.scala:376)
[error] at noc.PE$$anonfun$12.apply(NoC.scala:490)
[error] at noc.PE$$anonfun$12.apply(NoC.scala:490)
[error] at chisel3.core.Module$.do_apply(Module.scala:49)
[error] at noc.PE.<init>(NoC.scala:490)
[error] at noc.NoCMain$$anonfun$1.apply(NoCMain.scala:27)
[error] at noc.NoCMain$$anonfun$1.apply(NoCMain.scala:27)
...
[error] at chisel3.internal.Builder$$anonfun$build$1.apply(Builder.scala:297)
[error] at chisel3.internal.Builder$$anonfun$build$1.apply(Builder.scala:295)
[error] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[error] at chisel3.internal.Builder$.build(Builder.scala:295)
[error] at chisel3.Driver$.elaborate(Driver.scala:93)
[error] at chisel3.Driver$.execute(Driver.scala:140)
[error] at chisel3.iotesters.setupTreadleBackend$.apply(TreadleBackend.scala:139)
...
[error] at logger.Logger$$anonfun$makeScope$1.apply(Logger.scala:138)
[error] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[error] at logger.Logger$.makeScope(Logger.scala:136)
...
[error] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[error] at chisel3.iotesters.Driver$.execute(Driver.scala:38)
[error] at chisel3.iotesters.Driver$.execute(Driver.scala:100)
[error] at noc.NoCMain$.delayedEndpoint$noc$NoCMain$1(NoCMain.scala:27)
[error] at noc.NoCMain$delayedInit$body.apply(NoCMain.scala:26)
[error] at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
...
[error] at scala.App$class.main(App.scala:76)
[error] at noc.NoCMain$.main(NoCMain.scala:26)
[error] at noc.NoCMain.main(NoCMain.scala)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
...
[error] at java.lang.Thread.run(Thread.java:745)
And lastly this error:
[error] (Test / runMain) Nonzero exit code: 1
I also find this warning:
[warn] Multiple main classes detected. Run 'show discoveredMainClasses' to see the list
Running sbt 'show discoveredMainClasses' shows:
[info] Loading settings from plugins.sbt ...
[info] Loading project definition from /Users/hoge/Desktop/NoC/project
[info] Loading settings from build.sbt ...
[info] Set current project to en-noc (in build file:/Users/hoge/Desktop/NoC/)
[info] *
[success] Total time: 1 s, completed 2019/10/22 2:08:49
What does this error message mean, and how can I fix it?
Running sbt 'testOnly noc.PETester' pointed to:
[info] at chisel3.core.LegacyModule.<init>(UserModule.scala:127)
This is caused at:
val io = IO(new Bundle {
  val No = Output(Vec(NumLinks, UInt((log2Ceil(NumLinks)).W)))
})
It seems that the main problem is:
attempted to instantiate a Module without wrapping it in Module()
This arises when you create a new instance of a class that extends Module without wrapping it in Module(). For example, in your code you are probably doing something like:
val test = new module_class
whereas you should be doing:
val test = Module(new module_class)
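A minimal self-contained sketch of the pattern (module and signal names are invented for illustration; in the stack trace above, the unwrapped new appears to happen where FanIn_Link instantiates NumGen):

import chisel3._

class Child extends Module {
  val io = IO(new Bundle { val out = Output(UInt(8.W)) })
  io.out := 0.U
}

class Parent extends Module {
  val io = IO(new Bundle { val out = Output(UInt(8.W)) })
  // val child = new Child         // wrong: triggers the elaboration error above
  val child = Module(new Child)    // correct: wrap every child module in Module(...)
  io.out := child.io.out
}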

ebtables.c:61:3: error: implicit declaration of function 'xt_compat_calc_jump' [-Werror=implicit-function-declaration]

I am getting the following error while building the recovery image for a LineageOS project and need some help diagnosing and resolving the issue.
Device tree:= https://github.com/darran-kelinske-fivestars/android_device_lenovo_tb8504f/tree/lineage-15.1
Vendor tree:= https://github.com/darran-kelinske-fivestars/android_vendor_lenovo_tb8504f/tree/lineage-15.1
Kernel source:= https://github.com/darran-kelinske-fivestars/android_kernel_lenovo_tb8504f/tree/lineage-15.1
ROM Source:= https://github.com/LineageOS/android
Command: source build/envsetup.sh && breakfast tb8504f && repo sync --force-sync -q -j6 && mka recoveryimage -j6 | tee recovery.log
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c: In function 'ebt_standard_compat_from_user':
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c:61:3: error: implicit declaration of function 'xt_compat_calc_jump' [-Werror=implicit-function-declaration]
v += xt_compat_calc_jump(NFPROTO_BRIDGE, v);
^
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c: At top level:
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c:76:15: error: variable 'ebt_standard_target' has initializer but incomplete type
static struct xt_target ebt_standard_target = {
^
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c:77:2: error: unknown field 'name' specified in initializer
.name = "standard",
^
../../../../../../kernel/lenovo/msm8917/net/bridge/netfilter/ebtables.c:77:2: warning: excess elements in struct initializer
error, forbidden warning: ebtables.c:77
CC net/core/gen_estimator.o
/home/lineageos/kernel/lenovo/msm8917/scripts/Makefile.build:257: recipe for target 'net/bridge/netfilter/ebtables.o' failed
make[4]: *** [net/bridge/netfilter/ebtables.o] Error 1
/home/lineageos/kernel/lenovo/msm8917/scripts/Makefile.build:402: recipe for target 'net/bridge/netfilter' failed
make[3]: *** [net/bridge/netfilter] Error 2
/home/lineageos/kernel/lenovo/msm8917/scripts/Makefile.build:402: recipe for target 'net/bridge' failed
make[2]: *** [net/bridge] Error 2
make[2]: *** Waiting for unfinished jobs....
Full log:
https://pastebin.com/v2ZsfRuc
I resolved this issue by replacing the source in the include directory of this kernel with the include directory from another project.
I replaced the source with the source from this directory:
https://github.com/HighwayStar/android_kernel_lenovo_tb8704/tree/tab4-8plus-LA.UM.5.6.r1-0/include
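Roughly, something like this (hypothetical commands; adjust the branch and destination paths to your checkout layout, and review the resulting diff before building):

git clone -b tab4-8plus-LA.UM.5.6.r1-0 \
    https://github.com/HighwayStar/android_kernel_lenovo_tb8704 /tmp/tb8704
cp -a /tmp/tb8704/include/. kernel/lenovo/msm8917/include/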

Failing to start my spring boot application due to 'javax.sql.DataSource' that could not be found

I am a beginner with Spring Boot and am trying to write a simple Spring Boot application.
My folder structure is as follows:
-> Project
   -> build.gradle
   -> settings.gradle
   -> src/main/java
      -> package
         -> Main.java
         -> UserController.java
         -> UserRepository.java
      -> dto
         -> User.java
   -> src/main/resources
      -> application.properties
My build.gradle is as follows:
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath('org.springframework.boot:spring-boot-gradle-plugin:1.5.6.RELEASE')
        classpath('mysql:mysql-connector-java:5.1.34')
    }
}
apply plugin: 'java'
apply plugin: 'spring-boot'
sourceCompatibility = 1.8
targetCompatibility = 1.8
repositories {
    mavenCentral()
}
dependencies {
    compile(
        'org.springframework.boot:spring-boot-starter-actuator',
        'org.springframework.boot:spring-boot-starter-web',
        'org.springframework.boot:spring-boot-starter-data-jpa'
    )
    compile('mysql:mysql-connector-java')
    testCompile('org.springframework.boot:spring-boot-starter-test')
}
The application.properties is as follows:
spring.jpa.hibernate.ddl-auto=create
spring.datasource.url=jdbc:mysql://localhost:3306/user
spring.datasource.username=root
spring.datasource.password=root
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.jpa.database-platform=org.hibernate.dialect.MySQLDialect
spring.jpa.database=MYSQL
spring.jpa.show-sql = true
My Main.java is as follows:
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;
@SpringBootApplication
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
public class Main {
    public static void main(String[] args) {
        SpringApplication.run(Main.class, args);
    }
}
I am able to build the application successfully. If I run the application, I get "Unable to start application" with the following stacktrace:
2017-08-14 20:43:02.976 WARN 27205 --- [ main] ationConfigEmbeddedWebApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration': Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'javax.sql.DataSource' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {}
2017-08-14 20:43:02.978 INFO 27205 --- [ main] o.apache.catalina.core.StandardService : Stopping service [Tomcat]
2017-08-14 20:43:03.007 INFO 27205 --- [ main] utoConfigurationReportLoggingInitializer :
Error starting ApplicationContext. To display the auto-configuration report re-run your application with 'debug' enabled.
2017-08-14 20:43:03.198 ERROR 27205 --- [ main] o.s.b.d.LoggingFailureAnalysisReporter :
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of constructor in org.springframework.boot.autoconfigure.orm.jpa.HibernateJpaAutoConfiguration required a bean of type 'javax.sql.DataSource' that could not be found.
- Bean method 'dataSource' not loaded because @ConditionalOnProperty (spring.datasource.jndi-name) did not find property 'jndi-name'
- Bean method 'dataSource' not loaded because @ConditionalOnBean (types: org.springframework.boot.jta.XADataSourceWrapper; SearchStrategy: all) did not find any beans
Action:
Consider revisiting the conditions above or defining a bean of type 'javax.sql.DataSource' in your configuration.
I have checked the dependency tree and can find both Hibernate and the MySQL connector in it.
I have tried removing @EnableAutoConfiguration(exclude={DataSourceAutoConfiguration.class}); in that case I get Cannot load driver class: com.mysql.jdbc.Driver
You should move the dto package inside your Main.java class's package; in your case it should be like src/main/java/package/dto.
That way, when Spring Boot scans, your entity will be visible to the scanner.
Make sure you have added the MySQL driver dependency:
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.6</version>
</dependency>
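Since the question's build uses Gradle rather than Maven, the equivalent declaration would be along these lines (a sketch; the version shown is the one already pinned in the question's buildscript block):

dependencies {
    // An explicit version ensures the driver class is resolvable at runtime.
    compile('mysql:mysql-connector-java:5.1.34')
}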