Best way to pass the schema name as a variable to a query - anorm

I have a Play Framework server (with Anorm) that operates against a database with several schemas, all of which have the same tables.
Most of my database-access functions look like this:
def findById(zoneName: String, id: Long): Option[Employee] = {
  DB.withConnection { implicit connection =>
    SQL("""select *
           from """ + zoneName + """.employee
           where employee._id = {id}""")
      .on(
        'id -> id
      ).as(simpleParser.singleOpt)
  }
}
But I know this is the wrong approach: it is not safe against SQL injection, and it is tedious to repeat in every function.
I want to use string interpolation to fix this. It works well with my id variable, but it doesn't with zoneName:
def findById(zoneName: String, id: Long): Option[Employee] = {
  DB.withConnection { implicit connection =>
    SQL"""select *
          from $zoneName.employee
          where employee._id = 1"""
      .as(simpleParser.singleOpt)
  }
}
Gives me:
[info] ! #6lenhal6c - Internal server error, for (GET) [/limbo/br/employee/1] ->
[info]
[info] play.api.Application$$anon$1: Execution exception[[PSQLException: ERROR: syntax error at or near «$1»
[info] Position: 25]]
[info] at play.api.Application$class.handleError(Application.scala:296) ~[play_2.11-2.3.8.jar:2.3.8]
[info] at play.api.DefaultApplication.handleError(Application.scala:402) [play_2.11-2.3.8.jar:2.3.8]
[info] at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$3$$anonfun$applyOrElse$4.apply(PlayDefaultUpstreamHandler.scala:320) [play_2.11-2.3.8.jar:2.3.8]
[info] at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$3$$anonfun$applyOrElse$4.apply(PlayDefaultUpstreamHandler.scala:320) [play_2.11-2.3.8.jar:2.3.8]
[info] at scala.Option.map(Option.scala:146) [scala-library-2.11.5.jar:na]
[info] Caused by: org.postgresql.util.PSQLException: ERROR: syntax error at or near «$1»
[info] Position: 25
[info] at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2198) ~[postgresql-9.3-1102.jdbc4.jar:na]
[info] at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1927) ~[postgresql-9.3-1102.jdbc4.jar:na]
[info] at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255) ~[postgresql-9.3-1102.jdbc4.jar:na]
[info] at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:561) ~[postgresql-9.3-1102.jdbc4.jar:na
I also tested with ${zoneName}, with the same result.
Any help or advice about how to write this would be appreciated. Thank you in advance!

With Anorm string interpolation, any $expression is passed as a statement parameter; that is, if it is a string, it will be escaped/quoted by the JDBC driver.
If you want to substitute part of the SQL statement itself with a string (e.g. a dynamic schema), you can either use concatenation or, since the latest versions (2.4.0-M3 or 2.3.8), the #$expr syntax.
val table = "myTable"
SQL"SELECT * FROM #$table WHERE id=$id"
// SELECT * FROM myTable WHERE id=?
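Applied to the findById function from the question, a minimal sketch could look like this (note that #$ splices the raw string into the statement, so zoneName should be checked against a whitelist of known schemas before being used):

def findById(zoneName: String, id: Long): Option[Employee] = {
  DB.withConnection { implicit connection =>
    // #$zoneName is substituted as-is into the SQL text (validate it first!),
    // while $id is still passed as a bound JDBC parameter.
    SQL"""select *
          from #$zoneName.employee
          where employee._id = $id"""
      .as(simpleParser.singleOpt)
  }
}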

Related

SpecFlow - The argument type 'System.MarshalByRefObject' cannot be converted into parameter type 'TechTalk.SpecRun.Common.Logging.ITestLogger'

In Visual Studio 2019, I can build and run my specific test cases perfectly in Test Explorer. But when I run from the command line, it shows the error below.
Discovered 1 tests
Thread#0: The argument type 'System.MarshalByRefObject' cannot be converted into parameter type 'TechTalk.SpecRun.Commo
Thread#1:
Thread#2:
100% (1/1) completed
Done.
TechTalk.SpecRun.Framework.SpecRunException: At least one test thread aborted. ---> System.Runtime.Remoting.RemotingException: The argument type 'System.MarshalByRefObject' cannot be converted into parameter type 'TechTalk.SpecRun.Common.Logging.ITestLogger'. ---> System.InvalidCastException: Object must implement IConvertible.
at System.Convert.ChangeType(Object value, Type conversionType, IFormatProvider provider)
at System.Runtime.Remoting.Messaging.Message.CoerceArg(Object value, Type pt)
--- End of inner exception stack trace ---
Server stack trace:
at System.Runtime.Remoting.Messaging.Message.CoerceArg(Object value, Type pt)
at System.Runtime.Remoting.Messaging.Message.CoerceArgs(MethodBase mb, Object[] args, ParameterInfo[] pi)
at System.Runtime.Remoting.Messaging.StackBuilderSink.SyncProcessMessage(IMessage msg)
Exception rethrown at [0]:
at System.Runtime.Remoting.Proxies.RealProxy.HandleReturnMessage(IMessage reqMsg, IMessage retMsg)
at System.Runtime.Remoting.Proxies.RealProxy.PrivateInvoke(MessageData& msgData, Int32 type)
at TechTalk.SpecRun.Framework.ITestAssemblyExecutor.Initialize(Int32 threadId, ITestExecutionManager executionManager, IAssemblyReference assemblyReference, ITestLogger logger, String testAssemblyFullPath, String testAssemblyConfigFilePath, TestExecutionConfiguration testExecutionConfiguration)
at TechTalk.SpecRun.Framework.TestThread.InitializeTestThreadExecutor(IAssemblyReference testAssembly, ExecutionModelSettings executionModelSettings, String testTarget)
at TechTalk.SpecRun.Framework.TestThread.GetThreadExecutorFor(TestItem testItem)
at TechTalk.SpecRun.Framework.TestThread.Run(ITestExecutionManager executionManagerForRun)
at TechTalk.SpecRun.Framework.AsyncTestThreadRunner.RunSync(TestExecutionManager executionManager)
--- End of inner exception stack trace ---
Result: test framework error: At least one test thread aborted.
Total: 1
Succeeded: 0
Ignored: 0
Pending: 0
Skipped: 1
Failed: 0
Versions I'm using:
.NET Framework : 4.6.2
SpecFlow : 2.2.1
SpecRun.Runner : 1.6.3
SpecRun.SpecFlow : 1.6.3
SpecRun.SpecFlow.2-2-0 : 1.6.3
Command:
SpecRun.exe run Default.srprofile /basefolder:C:\Development\Project\bin\Debug\Automation\TestAutomation.Specs\ /outputfolder:output /filter:"testpath:Scenario:Search+page+in+Automation"
How can I fix this issue?
It is an active SpecFlow bug: https://github.com/SpecFlowOSS/SpecFlow/issues/1443

Apache Spark SQL get_json_object java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String

I am trying to read a JSON stream from an MQTT broker in Apache Spark with Structured Streaming, read some properties of the incoming JSON, and output them to the console. My code looks like this:
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.get_json_object
import org.apache.spark.sql.types.StringType

val spark = SparkSession
  .builder()
  .appName("BahirStructuredStreaming")
  .master("local[*]")
  .getOrCreate()

import spark.implicits._

val topic = "temp"
val brokerUrl = "tcp://localhost:1883"

val lines = spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic).option("persistence", "memory")
  .load(brokerUrl)
  .toDF().withColumn("payload", $"payload".cast(StringType))

val jsonDF = lines.select(get_json_object($"payload", "$.eventDate").alias("eventDate"))

val query = jsonDF.writeStream
  .format("console")
  .start()

query.awaitTermination()
However, when the JSON arrives I get the following errors:
Exception in thread "main" org.apache.spark.sql.streaming.StreamingQueryException: Writing job aborted.
=== Streaming Query ===
Identifier: [id = 14d28475-d435-49be-a303-8e47e2f907e3, runId = b5bd28bb-b247-48a9-8a58-cb990edaf139]
Current Committed Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: -1}
Current Available Offsets: {MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496]: 0}
Current State: ACTIVE
Thread State: RUNNABLE
Logical Plan:
Project [get_json_object(payload#22, $.id) AS eventDate#27]
+- Project [id#10, topic#11, cast(payload#12 as string) AS payload#22, timestamp#13]
+- StreamingExecutionRelation MQTTStreamSource[brokerUrl: tcp://localhost:1883, topic: temp clientId: paho7247541031496], [id#10, topic#11, payload#12, timestamp#13]
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:300)
at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)
Caused by: org.apache.spark.SparkException: Writing job aborted.
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:92)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:247)
at org.apache.spark.sql.execution.SparkPlan.executeCollect(SparkPlan.scala:296)
at org.apache.spark.sql.Dataset.collectFromPlan(Dataset.scala:3384)
at org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2783)
at org.apache.spark.sql.Dataset.$anonfun$withAction$2(Dataset.scala:3365)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3365)
at org.apache.spark.sql.Dataset.collect(Dataset.scala:2783)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$15(MicroBatchExecution.scala:537)
at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:78)
at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:125)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:73)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$14(MicroBatchExecution.scala:533)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runBatch(MicroBatchExecution.scala:532)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:198)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:351)
at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:349)
at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:166)
at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:56)
at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:160)
at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:279)
... 1 more
Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 1.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1.0 (TID 8, localhost, executor driver): java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:1887)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:1875)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:1874)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1874)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:926)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:926)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2108)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2057)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2046)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:737)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.doExecute(WriteToDataSourceV2Exec.scala:64)
... 34 more
Caused by: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.spark.unsafe.types.UTF8String
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.BaseGenericInternalRow.getUTF8String$(rows.scala:46)
at org.apache.spark.sql.catalyst.expressions.GenericInternalRow.getUTF8String(rows.scala:195)
at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
at org.apache.spark.sql.execution.WholeStageCodegenExec$$anon$1.hasNext(WholeStageCodegenExec.scala:619)
at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.$anonfun$run$2(WriteToDataSourceV2Exec.scala:117)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1394)
at org.apache.spark.sql.execution.datasources.v2.DataWritingSparkTask$.run(WriteToDataSourceV2Exec.scala:116)
at org.apache.spark.sql.execution.datasources.v2.WriteToDataSourceV2Exec.$anonfun$doExecute$2(WriteToDataSourceV2Exec.scala:67)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:405)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:408)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
I am sending the JSON records using the mosquitto broker, and they look like this:
mosquitto_pub -m '{"eventDate": "2020-11-11T15:17:00.000+0200"}' -t "temp"
It seems that every string coming from the Bahir stream source provider raises this error. For instance, the following code also raises it:
spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic).option("persistence", "memory")
  .load(brokerUrl)
  .select("topic")
  .writeStream
  .format("console")
  .start()
It looks like Spark does not recognize strings coming from Bahir, maybe some kind of weird string class version issue. I've tried the following to make the code work:
- setting the Java version to 8
- upgrading the Spark version from 2.4.0 to 2.4.7
- setting the Scala version to 2.11.12
- using the decode function with all possible encoding combinations, instead of .cast(StringType), to transform the "payload" column to String
- using the substring function on the "payload" column to recreate a compatible String
Finally, I got working code by recreating the string using the String constructor and a typed Dataset:
val lines = spark.readStream
  .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider")
  .option("topic", topic).option("persistence", "memory")
  .load(brokerUrl)
  .select("payload")
  .as[Array[Byte]]
  .map(payload => new String(payload))
  .toDF("payload")
This solution is rather ugly, but at least it works.
I believe there is nothing wrong with the code provided in the question, and I suspect a bug on the Bahir or Spark side that prevents Spark from handling Strings coming from the Bahir source.

Testing a DSPComplex ROM

I'm still working on building a DspComplex ROM and have hit what I think may be an actual Chisel problem.
I've built the ROM and can generate Verilog output from the code that looks reasonable, but I can't seem to test the module with even the most basic of testers. I've simplified it below to the most basic check.
The error is a stack overflow like the following:
$ sbt 'testOnly taylor.TaylorTest'
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=8G; support was removed in 8.0
[info] Loading settings from plugins.sbt ...
[info] Loading project definition from /home/jcondley/Zendar/kodo/ZenFPGA/chisel/project
[info] Loading settings from build.sbt ...
[info] Set current project to zen-chisel (in build file:/home/jcondley/Zendar/kodo/ZenFPGA/chisel/)
[info] Compiling 1 Scala source to /home/jcondley/Zendar/kodo/ZenFPGA/chisel/target/scala-2.12/classes ...
[warn] there were 5 feature warnings; re-run with -feature for details
[warn] one warning found
[info] Done compiling.
[info] Compiling 1 Scala source to /home/jcondley/Zendar/kodo/ZenFPGA/chisel/target/scala-2.12/test-classes ...
[warn] there were two deprecation warnings (since chisel3, will be removed by end of 2017); re-run with -deprecation for details
[warn] there were two feature warnings; re-run with -feature for details
[warn] two warnings found
[info] Done compiling.
[info] [0.004] Elaborating design...
[deprecated] DspComplex.scala:22 (1029 calls): isLit is deprecated: "isLit is deprecated, use litOption.isDefined"
[deprecated] DspComplex.scala:22 (1029 calls): litArg is deprecated: "litArg is deprecated, use litOption or litTo*Option"
[deprecated] DspComplex.scala:23 (1029 calls): isLit is deprecated: "isLit is deprecated, use litOption.isDefined"
[deprecated] DspComplex.scala:23 (1029 calls): litArg is deprecated: "litArg is deprecated, use litOption or litTo*Option"
[warn] There were 4 deprecated function(s) used. These may stop compiling in a future release - you are encouraged to fix these issues.
[warn] Line numbers for deprecations reported by Chisel may be inaccurate; enable scalac compiler deprecation warnings via either of the following methods:
[warn] In the sbt interactive console, enter:
[warn] set scalacOptions in ThisBuild ++= Seq("-unchecked", "-deprecation")
[warn] or, in your build.sbt, add the line:
[warn] scalacOptions := Seq("-unchecked", "-deprecation")
[info] [1.487] Done elaborating.
Total FIRRTL Compile Time: 1887.8 ms
Total FIRRTL Compile Time: 770.6 ms
End of dependency graph
Circuit state created
[info] TaylorTest:
[info] TaylorWindow
[info] taylor.TaylorTest *** ABORTED ***
[info] java.lang.StackOverflowError:
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.evaluate(LoFirrtlExpressionEvaluator.scala:264)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.$anonfun$resolveDependency$1(LoFirrtlExpressionEvaluator.scala:453)
[info] at firrtl_interpreter.Timer.apply(Timer.scala:40)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.resolveDependency(LoFirrtlExpressionEvaluator.scala:445)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.getValue(LoFirrtlExpressionEvaluator.scala:81)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.evaluate(LoFirrtlExpressionEvaluator.scala:304)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.$anonfun$resolveDependency$1(LoFirrtlExpressionEvaluator.scala:453)
[info] at firrtl_interpreter.Timer.apply(Timer.scala:40)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.resolveDependency(LoFirrtlExpressionEvaluator.scala:445)
[info] at firrtl_interpreter.LoFirrtlExpressionEvaluator.getValue(LoFirrtlExpressionEvaluator.scala:81)
[info] ...
This looks suspiciously like another ROM issue from here:
https://github.com/freechipsproject/chisel3/issues/642
but trying Chick's response here:
export SBT_OPTS="-Xmx2G -XX:+UseConcMarkSweepGC -XX:+CMSClassUnloadingEnabled -XX:MaxPermSize=2G -Xss2M -Duser.timezone=GMT"
does not seem to solve the issue (and one of the options, MaxPermSize, is ignored).
Is this a legitimate Chisel bug with ROMs or is something else going on here?
Actual module with the ROM:
package taylor

import chisel3._
import chisel3.util._
import chisel3.experimental.FixedPoint
import dsptools.numbers._
import scala.io.Source

class TaylorWindow(len: Int, window: Seq[FixedPoint]) extends Module {
  val io = IO(new Bundle {
    val d_valid_in = Input(Bool())
    val sample = Input(DspComplex(FixedPoint(16.W, 8.BP), FixedPoint(16.W, 8.BP)))
    val windowed_sample = Output(DspComplex(FixedPoint(32.W, 8.BP), FixedPoint(32.W, 8.BP)))
    val d_valid_out = Output(Bool())
  })

  val win_coeff = VecInit(window.map(x => DspComplex.wire(x, FixedPoint(0, 16.W, 8.BP))).toSeq) // ROM storing our coefficients.

  io.d_valid_out := io.d_valid_in
  val counter = RegInit(UInt(10.W), 0.U)
  // Implicit reset
  io.windowed_sample := io.sample * win_coeff(counter)

  when(io.d_valid_in) {
    counter := counter + 1.U
  }
}

object TaylorDriver extends App {
  val filename = "src/test/test_data/taylor_coeffs"
  val coeff_file = Source.fromFile(filename).getLines
  val double_coeffs = coeff_file.map(x => x.toDouble)
  val fp_coeffs = double_coeffs.map(x => FixedPoint.fromDouble(x, 16.W, 8.BP))
  val fp_seq = fp_coeffs.toSeq

  chisel3.Driver.execute(args, () => new TaylorWindow(1024, fp_seq))
}
Tester Code:
package taylor

import chisel3._
import chisel3.util._
import chisel3.experimental.FixedPoint
import dsptools.numbers.implicits._
import scala.io.Source
import chisel3.iotesters
import chisel3.iotesters.{ChiselFlatSpec, Driver, PeekPokeTester}

class TaylorWindowUnitTest(dut: TaylorWindow) extends PeekPokeTester(dut) {
  val filename = "src/test/test_data/taylor_coeffs"
  val coeff_file = Source.fromFile(filename).getLines
  val double_coeffs = coeff_file.map(x => x.toDouble)
  val fp_coeffs = double_coeffs.map(x => FixedPoint.fromDouble(x, 16.W, 8.BP))
  val fp_seq = fp_coeffs.toSeq

  poke(dut.io.d_valid_in, Bool(false))
  expect(dut.io.d_valid_out, Bool(false))
}

class TaylorTest extends ChiselFlatSpec {
  behavior of "TaylorWindow"

  backends foreach { backend =>
    it should s"test the basic Taylow Window" in {
      Driver(() => new TaylorWindow(1024, getSeq()), backend)(c => new TaylorWindowUnitTest(c)) should be (true)
    }
  }

  def getSeq(): Seq[FixedPoint] = {
    val filename = "src/test/test_data/taylor_coeffs"
    val coeff_file = Source.fromFile(filename).getLines
    val double_coeffs = coeff_file.map(x => x.toDouble)
    val fp_coeffs = double_coeffs.map(x => FixedPoint.fromDouble(x, 16.W, 8.BP))
    fp_coeffs.toSeq
  }
}
This looks like a failure in the firrtl-interpreter (one of the Scala-based Chisel simulators), which can have problems with a number of large FIRRTL constructs. If you have Verilator installed, can you try changing
backends foreach {backend =>
to
Seq("verilator") foreach {backend =>
and see what happens? Another thing to try is Treadle, which is the new version of the interpreter; it is not in full release yet, but it is available in snapshot versions of the Chisel ecosystem. It should be able to handle this as well.
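For reference, once you are on a version that ships Treadle (or have Verilator installed), the backend can also be selected explicitly through the tester options. A minimal sketch inside the TaylorTest spec, reusing the TaylorWindow and TaylorWindowUnitTest from the question (the accepted option values depend on your chisel-iotesters version):

  // Select the simulator backend by name: "verilator" needs Verilator installed locally,
  // "treadle" needs a recent/snapshot chisel-iotesters release.
  iotesters.Driver.execute(Array("--backend-name", "verilator"),
      () => new TaylorWindow(1024, getSeq())) { c =>
    new TaylorWindowUnitTest(c)
  } should be (true)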

Use SparkSession.sql() with JDBC

Problem:
I would like to use a JDBC connection to make a custom query with Spark.
The goal of this query is to optimize memory allocation on the workers; because of that, I can't use:
ss.read
  .format("jdbc")
  .option("url", "jdbc:postgresql:dbserver")
  .option("dbtable", "schema.tablename")
  .option("user", "username")
  .option("password", "password")
  .load()
Currently:
I am currently trying to run:
val ss = SparkSession
  .builder()
  .appName(appName)
  .master("local")
  .config(conf)
  .getOrCreate()

ss.sql("some custom query")
Configuration:
url=jdbc:mysql://127.0.0.1/database_name
driver=com.mysql.jdbc.Driver
user=user_name
password=xxxxxxxxxx
Error:
[info] Exception encountered when attempting to run a suite with class name: db.TestUserProvider *** ABORTED ***
[info] org.apache.spark.sql.AnalysisException: Table or view not found: users; line 1 pos 14
[info] at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveRelations$$lookupTableFromCatalog(Analyzer.scala:459)
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:478)
[info] at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:463)
[info] at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:61)
[info] at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:61)
[info] at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
[info] at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:60)
[info] at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:58)
[info] at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:58)
Assumption:
I guess there is a configuration error, but I can't figure out where.
Spark can read and write data to/from relational databases using the JDBC data source (like you did in your first code example).
In addition (and completely separately), Spark allows using SQL to query views that were created over data already loaded into a DataFrame from some source. For example:
val df = Seq(1,2,3).toDF("a") // could be any DF, loaded from file/JDBC/memory...
df.createOrReplaceTempView("my_spark_table")
spark.sql("select a from my_spark_table").show()
Only "tables" (called views, as of Spark 2.0.0) created this way can be queried using SparkSession.sql.
If your data is stored in a relational database, Spark will have to read it from there first, and only then will it be able to execute any distributed computation on the loaded copy. Bottom line: we can load the data from the table using read, create a temp view, and then query it:
ss.read
  .format("jdbc")
  .option("url", "jdbc:mysql://127.0.0.1/database_name")
  .option("dbtable", "schema.tablename")
  .option("user", "username")
  .option("password", "password")
  .load()
  .createOrReplaceTempView("my_spark_table")

// and then you can query the view:
val df = ss.sql("select * from my_spark_table where ... ")
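As an aside, if the concern is pulling the whole table into worker memory, the JDBC source can also be pointed at a subquery instead of a table name, so the filtering runs in the database before Spark loads anything. A sketch (the subquery and column names are placeholders):

ss.read
  .format("jdbc")
  .option("url", "jdbc:mysql://127.0.0.1/database_name")
  // The database executes this subquery; Spark only loads its result set.
  .option("dbtable", "(select id, name from schema.tablename where some_column = 42) as subq")
  .option("user", "username")
  .option("password", "password")
  .load()
  .createOrReplaceTempView("my_filtered_table")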

Chisel AlreadyBoundException

I am developing a simple on-chip memory for an SoC, based on the Sodor scratchpad memory. As a first step, I'm converting a slightly modified version of that design to Chisel 3. Now I'm getting this binding exception, which I can't understand.
[info] - should correctly write and read data *** FAILED ***
[info] chisel3.core.Binding$BindingException: Error: Cannot set as output .M_WR: Already bound to LitBinding()
[info] at chisel3.core.Binding$.bind(Binding.scala:100)
[info] at chisel3.core.Output$.apply(Data.scala:50)
[info] at chisel3.util.ReadyValidIO.<init>(Decoupled.scala:22)
[info] at chisel3.util.DecoupledIO.<init>(Decoupled.scala:72)
[info] at chisel3.util.Decoupled$.apply(Decoupled.scala:81)
[info] at mem.MemPortIO.<init>(memory.scala:40)
[info] at mem.OnChipMemory$$anon$1.<init>(memory.scala:49)
[info] at mem.OnChipMemory.<init>(memory.scala:47)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(memoryTest.scala:33)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(memoryTest.scala:33)
[info] at chisel3.core.Module$.do_apply(Module.scala:35)
[info] at chisel3.Driver$$anonfun$elaborate$1.apply(Driver.scala:194)
[info] at chisel3.Driver$$anonfun$elaborate$1.apply(Driver.scala:194)
[info] at chisel3.internal.Builder$$anonfun$build$1.apply(Builder.scala:206)
[info] at chisel3.internal.Builder$$anonfun$build$1.apply(Builder.scala:204)
[info] at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
[info] at chisel3.internal.Builder$.build(Builder.scala:204)
[info] at chisel3.Driver$.elaborate(Driver.scala:194)
[info] at chisel3.Driver$.execute(Driver.scala:229)
[info] at chisel3.iotesters.setupFirrtlTerpBackend$.apply(FirrtlTerpBackend.scala:110)
[info] at chisel3.iotesters.Driver$.execute(Driver.scala:47)
[info] at chisel3.iotesters.Driver$.apply(Driver.scala:210)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1.apply$mcV$sp(memoryTest.scala:33)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1.apply(memoryTest.scala:33)
[info] at mem.memoryTester$$anonfun$3$$anonfun$apply$1.apply(memoryTest.scala:33)
[info] at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
[info] at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] at org.scalatest.Transformer.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer.apply(Transformer.scala:20)
[info] at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1647)
[info] at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
[info] at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
[info] at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1644)
[info] at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
[info] at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
[info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
[info] at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1656)
[info] at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
[info] at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
[info] at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
[info] at scala.collection.immutable.List.foreach(List.scala:381)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info] at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:390)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:427)
[info] at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
[info] at scala.collection.immutable.List.foreach(List.scala:381)
[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
[info] at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
[info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
[info] at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1714)
[info] at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
[info] at org.scalatest.Suite$class.run(Suite.scala:1424)
[info] at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
[info] at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
[info] at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
[info] at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
[info] at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1760)
[info] at org.scalatest.FlatSpec.run(FlatSpec.scala:1683)
[info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:466)
[info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:677)
[info] at sbt.TestRunner.runTest$1(TestFramework.scala:76)
[info] at sbt.TestRunner.run(TestFramework.scala:85)
[info] at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
[info] at sbt.TestFramework$$anon$2$$anonfun$$init$$1$$anonfun$apply$8.apply(TestFramework.scala:202)
[info] at sbt.TestFramework$.sbt$TestFramework$$withContextLoader(TestFramework.scala:185)
[info] at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
[info] at sbt.TestFramework$$anon$2$$anonfun$$init$$1.apply(TestFramework.scala:202)
[info] at sbt.TestFunction.apply(TestFramework.scala:207)
[info] at sbt.Tests$$anonfun$9.apply(Tests.scala:216)
[info] at sbt.Tests$$anonfun$9.apply(Tests.scala:216)
[info] at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
[info] at sbt.std.Transform$$anon$3$$anonfun$apply$2.apply(System.scala:44)
[info] at sbt.std.Transform$$anon$4.work(System.scala:63)
[info] at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
[info] at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
[info] at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
[info] at sbt.Execute.work(Execute.scala:237)
[info] at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
[info] at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
[info] at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
[info] at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
[info] at java.lang.Thread.run(Thread.java:745)
[info] ScalaCheck
[info] Passed: Total 0, Failed 0, Errors 0, Passed 0
[info] ScalaTest
I'm not clear about which value is causing the problem. At first I thought it was a problem with the Vecs I use in the Mem and IO ports (because of an explanation I found for a similar exception), but the problem persists even after I get rid of them. I even tried removing the cloneType functions.
Now I'm clueless about the cause of the problem. I'm posting the main section of the memory's source code (except parameter declarations and a few simple functions) in case it is useful.
trait MemOpConstants
{
  val MT_X  = Bits(0, 3) // memory transfer type
  val MT_B  = Bits(1, 3)
  val MT_H  = Bits(2, 3)
  val MT_W  = Bits(3, 3)
  val MT_D  = Bits(4, 3)
  val MT_BU = Bits(5, 3)
  val MT_HU = Bits(6, 3)
  val MT_WU = Bits(7, 3)

  val M_X  = Bits("b0", 1) // access type
  val M_RD = Bits("b0", 1) // load
  val M_WR = Bits("b1", 1) // store
}

class MemReq(data_width: Int)(implicit config: Configuration) extends Bundle with MemOpConstants
{
  val addr = UInt(width = config.xprlen)
  val data = Bits(width = data_width)
  val fcn  = Bits(width = M_X.getWidth)  // memory function code
  val typ  = Bits(width = MT_X.getWidth) // memory access type
  override def cloneType = { new MemReq(data_width).asInstanceOf[this.type] }
}

class MemResp(data_width: Int) extends Bundle
{
  val data = Bits(width = data_width)
  override def cloneType = { new MemResp(data_width).asInstanceOf[this.type] }
}

class MemPortIO(data_width: Int)(implicit config: Configuration) extends Bundle // constructor for IO interface of data memory
{
  val req  = Decoupled(new MemReq(data_width))           // ready valid pair
  val resp = (new ValidIO(new MemResp(data_width))).flip // valid signal
  override def cloneType = { new MemPortIO(data_width).asInstanceOf[this.type] }
}

class OnChipMemory(num_ports: Int = 2, num_bytes: Int = (1 << 15), seq_read: Boolean = false)(implicit config: Configuration) extends Module with MemOpConstants
{
  val io = IO(new Bundle{
    val port = Vec(num_ports, (new MemPortIO(data_width = config.xprlen)).flip)
  })

  val num_bytes_per_line = 4
  val num_lines = num_bytes / num_bytes_per_line
  val lsb_idx = log2Up(num_bytes_per_line) // index of lsb in address

  val chipMem = Mem(Vec(4, UInt(width = 32)), num_lines) // memory

  for (i <- 0 until num_ports)
  {
    io.port(i).resp.valid := Reg(next = io.port(i).req.valid)
    io.port(i).req.ready := Bool(true) // for now

    val req_valid = io.port(i).req.valid
    val req_addr  = io.port(i).req.bits.addr
    val req_data  = io.port(i).req.bits.data
    val req_fn    = io.port(i).req.bits.fcn
    val req_typ   = io.port(i).req.bits.typ

    val byte_shift_amt = io.port(i).req.bits.addr(1,0)
    val bit_shift_amt  = Cat(byte_shift_amt, UInt(0,3))

    // mem read
    val r_data_idx = Reg(UInt())
    val data_idx = req_addr >> UInt(lsb_idx)
    val read_data = Bits()
    val rdata = Bits()
    if (seq_read)
    {
      read_data := chipMem(r_data_idx)
      rdata := LoadDataGen((read_data >> Reg(next = bit_shift_amt)), Reg(next = req_typ))
    }
    else
    {
      read_data := chipMem(data_idx)
      rdata := LoadDataGen((read_data >> bit_shift_amt), req_typ)
    }
    io.port(i).resp.bits.data := rdata

    // mem write
    when (req_valid && req_fn === M_WR)
    {
      val wdata = StoreDataGen(req_data, req_typ)
      val wmask = ((StoreMask(req_typ) << bit_shift_amt)(31,0)).toBools
      chipMem.write(data_idx, wdata, wmask)
    }
    .elsewhen (req_valid && req_fn === M_RD)
    {
      r_data_idx := data_idx
    }
  }
}

object StoreDataGen extends MemOpConstants
{
  def apply(din: Bits, typ: Bits): Vec[UInt] =
  {
    val word = (typ.equals(MT_W)) || (typ.equals(MT_WU))
    val half = (typ.equals(MT_H)) || (typ.equals(MT_HU))
    val byte = (typ.equals(MT_B)) || (typ.equals(MT_BU))

    val dout = Mux(Bool(byte), Vec(4, din( 7,0)),
               Mux(Bool(half), Vec(din(15,8), din( 7,0), din(15,8), din( 7,0)),
                               Vec(din(31,25), din(24,16), din(15,8), din( 7,0))))
    return dout
  }
}
It would help to have more of the code with line numbers, but based on the name of the trait MemOpConstants, I would guess that it includes some Chisel literals. The compiler is complaining that it cannot make an IO port out of MemPortIO, since it contains a Bundle MemReq which (through the mixed-in trait) contains literals as elements.
IO ports may only contain pure, unbound Chisel data types: not literals, Regs, Mems, or Wires.
If you want to make the constants available at the IO ports, you should define slots (UInts) to receive them, and then wire those slots up to the actual constant/literal values. Don't place the constants themselves in the IO port definition.
All your definitions in MemOpConstants are literals (constants) of type Bits. They have values and widths. The values are what make them literals (constants), and hence unsuitable for inclusion in an IO port definition.
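A minimal sketch of the kind of change this points to (based only on the code above; widths are hard-coded here as an assumption): stop mixing MemOpConstants into the Bundle itself, so the Bits literals never become Bundle elements, and give fcn/typ their widths directly.

class MemReq(data_width: Int)(implicit config: Configuration) extends Bundle // note: no "with MemOpConstants"
{
  val addr = UInt(width = config.xprlen)
  val data = Bits(width = data_width)
  val fcn  = Bits(width = 1) // memory function code, compared against M_RD / M_WR
  val typ  = Bits(width = 3) // memory access type, compared against the MT_* codes
  override def cloneType = { new MemReq(data_width).asInstanceOf[this.type] }
}

The OnChipMemory module (and StoreDataGen) can keep extending MemOpConstants for the comparisons; only the Bundles used as IO need to stay literal-free.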