Use multiple DB / Schema in a single DBIO in Slick - mysql

Let's say I have the 2 schemas below. Both of them are on the same MySQL server:
master
base
The problem is that I can't combine actions against both schemas in a single database run.
If I execute plain SQL queries via sql"""...""", it works without any problem:
def fooAction(name: String) = {
  sql"""
    SELECT AGE FROM MASTER.FOO_TABLE WHERE NAME = $name
  """.as[String].head
}

def barAction(id: String) = {
  sql"""
    SELECT BAZ FROM BASE.BAR_TABLE WHERE ID = $id
  """.as[String].head
}

def execute = {
  // It doesn't matter which DB I use here, but let's say baseDb points to the BASE schema.
  baseDb.run(for {
    foo <- fooAction("sample")
    bar <- barAction("sample")
  } yield foo + bar)
}
But the code below doesn't work:
class FooTableDAO @Inject() (@NamedDatabase("master") protected val dbConfigProvider: DatabaseConfigProvider) extends HasDatabaseConfigProvider[JdbcProfile] {
  import dbConfig.driver.api._
  val table = TableQuery[FooTable]
  def fooAction(name: String) = table.filter(_.name === name).map(_.age).result.head
}

class BarTableDAO @Inject() (@NamedDatabase("base") protected val dbConfigProvider: DatabaseConfigProvider) extends HasDatabaseConfigProvider[JdbcProfile] {
  import dbConfig.driver.api._
  val table = TableQuery[BarTable]
  def barAction(id: String) = table.filter(_.id === id).map(_.baz).result.head
}

def execute = {
  // com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Table 'BASE.FOO_TABLE' doesn't exist
  baseDb.run(for {
    foo <- fooTableDAO.fooAction("sample")
    bar <- barTableDAO.barAction("sample")
  } yield foo + bar)
}
Since the run goes through baseDb, which points to the BASE schema, it tries to find FOO_TABLE in BASE. All I want Slick to do is use a different schema for each query, but I couldn't find a way.
Currently I do DBIO.from(db.run(**)) when an action against the other schema is needed inside a for-comprehension over a DBIO action, or I run each action separately and wrap the results with EitherT (the monad transformer for Either from the Scala library cats) to keep using a for-comprehension.
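For illustration, a minimal sketch of that workaround, assuming a masterDb handle pointing at the MASTER schema exists alongside baseDb:

def execute = {
  baseDb.run(for {
    // Lift the MASTER-schema action into this DBIO by running it
    // on its own database and wrapping the Future with DBIO.from.
    foo <- DBIO.from(masterDb.run(fooTableDAO.fooAction("sample")))
    bar <- barTableDAO.barAction("sample")
  } yield foo + bar)
}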
Is there any way to handle two or more schemas in a single DBIO action, other than using plain-text queries?
Thanks in advance.

I think (though I am not a MySQL expert) you mean a schema, not a database. At least this is what I see from your SQL samples.
Can't you just use the schema attribute in your Slick table mappings? Here you have a complete answer for using different schemas: https://stackoverflow.com/a/41090987/2239369
Here is the relevant piece of code:
class StudentTable(tag: Tag) extends Table[Student](tag, _schemaName = Option("database2"), "STUDENT") {
  ...
}
(notice the _schemaName attribute).
With this in mind, the answer to this part of the question:
Is there any way to handle more than 2 schemas in a single DBIO Action except using plain text query?
is: Yes, you can.
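Applied to the tables from the question, a sketch might look like this (the column names and types are assumptions based on the queries above; the point is that the mapping pins the schema):

class FooTable(tag: Tag)
    extends Table[(String, String)](tag, _schemaName = Some("MASTER"), "FOO_TABLE") {
  def name = column[String]("NAME")
  def age  = column[String]("AGE")
  def * = (name, age)
}

// Because the schema is fixed in the mapping, fooAction and barAction
// can now be combined in a single baseDb.run without the
// "Table 'BASE.FOO_TABLE' doesn't exist" error.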

Related

How to write an accumulator using ScalaBlackBox?

I want to create some new number types like DspReal for dsptools, such as DspPosit and DspQuire. DspPosit is based on posit, for which I have some Java code, and DspQuire is based on quire, which is a kind of accumulator for posits. Because I just want simulation for now, I have written many ScalaBlackBoxes for their operations, like DspReal. However, I found that a ScalaBlackBox can't model sequential logic. For example, the current output of the quire accumulator depends on its input and its last output, but a ScalaBlackBox can't get the value of the output. In addition, step(n) also influences the output, because the accumulator reads its input once per clock cycle.
I also found some problems with treadle. First, the functions of ScalaBlackBox, such as twoOp and oneOp, are called many times, and I don't know why. Second, step(n) is a function of PeekPokeTester, which can't be accessed from a ScalaBlackBox. Third, when I try to read the current output, the system gives errors.
trait DspBlackBlackBoxImpl extends BlackBoxImplementation with ScalaBlackBox

abstract class DspQuireAccumulator extends DspBlackBlackBoxImpl {
  lazy val accValue = Quire32() // initial value

  /**
    * Sub-classes must implement this one-argument function.
    *
    * @param posit accumulate element
    * @return quire operation result
    */
  def accOp(posit: Posit32): Unit

  def outputDependencies(outputName: String): Seq[String] = {
    outputName match {
      case "out" => Seq("in") // Seq("out", "in") gives errors
      case _     => Seq.empty
    }
  }

  def cycle(): Unit = {}

  def execute(inputValues: Seq[Concrete], tpe: Type, outputName: String): Concrete = {
    val arg1 :: _ = inputValues
    val positArg = Posit32(arg1.value)
    accOp(positArg)
    val result = quire32ToBigInt(accValue)
    ConcreteSInt(result, DspQuire.underlyingWidth, arg1.poisoned).asUInt
  }

  def getOutput(inputValues: Seq[BigInt], tpe: Type, outputName: String): BigInt = {
    val arg1 :: _ = inputValues
    val positArg = Posit32(arg1)
    accOp(positArg)
    quire32ToBigInt(accValue)
  }
}
class DspQuireAddAcc(val name: String) extends DspQuireAccumulator {
  def accOp(posit: Posit32): Unit = accValue += posit
}

class QuireBlackboxAccOperand extends BlackBox {
  val io = IO(new Bundle() {
    val in  = Input(UInt(DspPosit.underlyingWidth.W))
    val out = Output(UInt(DspQuire.underlyingWidth.W))
  })
}

class BBQAddAcc extends QuireBlackboxAccOperand

class TreadleDspQuireFactory extends ScalaBlackBoxFactory {
  def createInstance(instanceName: String, blackBoxName: String): Option[ScalaBlackBox] = {
    blackBoxName match {
      case "BBQAddAcc" => Some(add(new DspQuireAddAcc(instanceName)))
      ...
accOp will be called many times, so if I want to accumulate List(1, 2, 3), the result may be 0 + 1 + 1 + 2 + 2 + ...
And the peek function calls accOp one more time, which also confuses me.
I believe most of your problems at this point are caused by mixing two different approaches. You should not be using BlackBoxImplementation, because it is an older scheme used with the firrtl-interpreter. Just use ScalaBlackBox and implement the methods as described in the wiki page Black Boxes and Treadle and shown in the TreadleTest BlackBoxWithState.
Don't use outputDependencies; instead, specify any dependencies between inputs and outputs with getDependencies. inputChanged will be called whenever an input IO changes, so in that method you want to record or update the internal state of your black box. clockChange will be called whenever a clock changes, and it provides the transition information so you can decide what happens then. Treadle will call getOutput whenever it needs that output of your black box; since you will not have used outputDependencies, you can ignore the inputs and just provide the output value based on your internal state.
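Following that advice, a rough sketch of the asker's DspQuireAddAcc reworked as a stateful ScalaBlackBox (the method names and imports follow the description above as I understand treadle's API; Quire32, Posit32 and quire32ToBigInt are the asker's own helpers):

import firrtl.ir.Type
import treadle.ScalaBlackBox
import treadle.executable.{PositiveEdge, Transition}

class DspQuireAddAccBox(val name: String) extends ScalaBlackBox {
  private var accValue  = Quire32()           // accumulator state
  private var lastInput = Posit32(BigInt(0))  // most recently seen input

  // Record the input whenever it changes; do not accumulate here,
  // so repeated calls (e.g. from peek) cannot double-count.
  override def inputChanged(name: String, value: BigInt): Unit = {
    lastInput = Posit32(value)
  }

  // Accumulate exactly once per rising clock edge.
  override def clockChange(transition: Transition, clockName: String): Unit = {
    if (transition == PositiveEdge) {
      accValue += lastInput
    }
  }

  // The output is derived purely from internal state.
  override def getOutput(inputValues: Seq[BigInt], tpe: Type, outputName: String): BigInt = {
    quire32ToBigInt(accValue)
  }

  // No combinational dependency of "out" on "in".
  override def outputDependencies(outputName: String): Seq[String] = Seq.empty
}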
I am still trying to reproduce a running version of your code here, and it will take me a little time to put it together. If you can try my suggestions above and let me know how it goes, that would be helpful. I am interested in making this feature of Treadle better and easier to use, so all feedback is appreciated.

Fetch the response from SQL, store it in an object, and use conditions?

I have two SQL statements to be executed with a validity check. I need to execute the 1st query, store the response in an object, check whether that object is empty, and execute the second query only if it is not empty.
So I have tried something like the following.
In rolerepository.scala:
override val allQuery = s"""
select UserRoles.* from
(select CASE rbac.roleTypeID
ELSE rbac.name JOIN dirNetworkInfo ni
ON UserRoles.PersonID = ni.PersonID
where ni.Loginname = {loginName}
and UserRoles.roleName in ( 'Business User ','Administrator')"""
(This is just a sample of the query - it is not fully written here.)
Then I map it to an object, with the model class written outside:
override def map2Object(implicit map: Map[String, Any]): HierarchyEntryBillingRoleCheck = {
  HierarchyEntryBillingRoleCheck(str("roleName"), oint("PersonID"))
}
Then I have written the getAll method to execute the query:
override def getAll(implicit loginName: String): Future[Seq[HierarchyEntryBillingRoleCheck]] = {
  doQueryIgnoreRowErrors(allQuery, "loginName" -> loginName)
}
Then I have written the method to check whether the response from the 1st SQL statement is empty or not. This is where I'm stuck and unable to proceed further.
def method1() = {
  val getallresponse = HierarchyEntryBillingRoleCheck
  getallresponse.toString
  if (getallresponse != " ")
    billingMonthCheckRepository.getrepo()
}
I am getting a type mismatch error at the last closing brace, and I don't know what other logic can be used here.
Can anyone explain and give me a solution for this?
I also tried to use a for-comprehension in the controller, but couldn't work out how. I tried:
def getAll(implicit queryParams: QueryParams,
           billingMonthmodel: Seq[HierarchyEntryBillingRoleCheck]): Action[AnyContent] =
  securityService.authenticate() { implicit request =>
    withErrorRecovery { req =>
      toJson {
        repository.getAll(request.user.loginName)
        for {
          rolenamecheck <- billingMonthmodel
        } yield rolenamecheck
      }
    }
  }
You don't say which DB access method you are using (I'm assuming anorm). One way of approaching this is:
Create a case class matching your table
Create a parser matching your case class
Use Option (or Either) to return a row for a specific set of parameters
For example, perhaps you have:
case class UserRole(id: Int, loginName: String, roleName: String)
And then
object UserRole {
  val sqlFields = "ur.id, ur.loginName, ur.roleName"

  val userRoleParser = {
    get[Int]("id") ~
    get[String]("loginName") ~
    get[String]("roleName") map {
      case id ~ loginName ~ roleName =>
        UserRole(id, loginName, roleName)
    }
  }
...
The parser maps the row to your case class. The next step is creating either single-row methods like findById or findByLoginName and multi-row methods, perhaps allForRoleName or other generic filter methods. In your case (assuming a single role per loginName) there might be something like:
def findByLoginName(loginName: String): Option[UserRole] = DB.withConnection { implicit c =>
  SQL(s"select $sqlFields from userRoles ur ...")
    .on('loginName -> loginName)
    .as(userRoleParser.singleOpt)
}
The .as(parser...) part is key. Typically, you'll need at least:
as(parser.singleOpt), which returns an Option of your case class
as(parser *), which returns a List of your case class (you'll need this if multiple roles could exist for a login)
as(scalar[Long].singleOpt), which returns an Option[Long] and is handy for returning counts or exists values
Then, to return to your question a little more directly, you can call your find method, and if it returns something, continue with the second method call, perhaps like this:
val userRole = findByLoginName(loginName)
if (userRole.isDefined)
billingMonthCheckRepository.getrepo()
or, a little more idiomatically
findByLoginName(loginName).map { userRole =>
billingMonthCheckRepository.getrepo()
...
I've shown the find method returning an Option, but in practice we find it more useful to return an Either[String, (your case class)], where the String contains the reason for failure. Either is cool.
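A sketch of that Either-based variant (the WHERE clause and failure message are just illustrative):

def findByLoginName(loginName: String): Either[String, UserRole] =
  DB.withConnection { implicit c =>
    SQL(s"select $sqlFields from userRoles ur where ur.loginName = {loginName}")
      .on('loginName -> loginName)
      .as(userRoleParser.singleOpt) match {
        case Some(userRole) => Right(userRole)
        case None           => Left(s"no role found for login name '$loginName'")
      }
  }

// The caller then pattern matches, and the failure reason travels with the result:
findByLoginName(loginName) match {
  case Right(_)     => billingMonthCheckRepository.getrepo()
  case Left(reason) => println(reason)
}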
On my version of play (2.3.x), the imports for the above are:
import play.api.db._
import play.api.Play.current
import anorm._
import anorm.SqlParser._
You're going to be doing this sort of thing a lot, so it's worth finding a set of patterns that works for you.
Because I find Play's documentation very tough to trudge through if you're unfamiliar with it, I won't just leave a link to it.
You have to inject an instance of your database into your controller. This will then give it to you as a global variable:
@Singleton
class LoginRegController @Inject()(myDB: Database, cc: ControllerComponents) {
  // do stuff
}
But it's bad practice to actually use this connection within the controller, because JDBC calls are blocking, so you should create a model whose method takes the DB as a parameter. Don't make the object's constructor take the DB and store it as a field: for some reason this creates connection leaks, and the connections won't be released when they are done with your query. Not sure why, but that's how it is.
Create a model object that you will use to execute your query. Instead of passing the DB through the object's constructor, pass it through the method you will create:
object DBChecker {
  def attemptLogin(db: Database, password: String): String = {
  }
}
In your method, use db.withConnection { conn => ... } to access your JDBC connection. So, something like this:
object DBChecker {
  def attemptLogin(db: Database, password: String): String = {
    var username: String = ""
    db.withConnection { conn =>
      val query: String = s"SELECT uploaded_by, date_added FROM tableName WHERE password = '$password';"
      val stmt = conn.createStatement()
      val qryResult: ResultSet = stmt.executeQuery(query)
      // then iterate over your ResultSet to get the results from the query
      if (qryResult.next()) {
        username = qryResult.getString("uploaded_by")
      }
    }
    username
  }
}
// But note: please look into the use of PreparedStatement objects;
// doing it this way leaves you vulnerable to SQL injection.
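For comparison, a minimal sketch of the same lookup using a PreparedStatement, which keeps the password out of the SQL string entirely:

import java.sql.ResultSet
import play.api.db.Database

object DBChecker {
  def attemptLogin(db: Database, password: String): String = {
    var username: String = ""
    db.withConnection { conn =>
      // The ? placeholder is bound separately, so the driver escapes the value.
      val stmt = conn.prepareStatement(
        "SELECT uploaded_by, date_added FROM tableName WHERE password = ?")
      stmt.setString(1, password)
      val qryResult: ResultSet = stmt.executeQuery()
      if (qryResult.next()) {
        username = qryResult.getString("uploaded_by")
      }
    }
    username
  }
}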
In your controller, as long as you import the object, you can then call that object's methods from the controller you made in Step 1:
import com.path.to.object.DBChecker

@Singleton
class LoginRegController @Inject()(myDB: Database, cc: ControllerComponents) {
  def attemptLogin(pass: String) = Action { implicit request: Request[AnyContent] =>
    val result: String = DBChecker.attemptLogin(myDB, pass)
    // do your work with the results here
  }
}

Slick 2.1 MySqlSyntaxErrorException on DDL generation

In my tests, I'm trying to set up a test database based on the schema defined in Slick, which in turn is generated with the Slick code generator from an existing database. To do so, I'm extending WithApplication and redefining the around method like so:
abstract class WithDbData extends WithApplication {
  def ds = DB.getDataSource("test")
  def slickDb = Database.forDataSource(ds)

  override def around[T: AsResult](t: => T): Result = super.around {
    setup
    val result = AsResult.effectively(t)
    teardown
    result
  }

  def setup = {
    slickDb.withSession { implicit session =>
      ddl.create
    }
  }

  def teardown = {
    slickDb.withSession { implicit session =>
      ddl.drop
    }
  }
}
But when I run the tests, I'm getting a
MySqlSyntaxErrorException : Column length too big for column 'text' (max = 21845); use BLOB or TEXT instead (null:-2).
I'm using MySQL for development and for testing, and part of the schema generated by the code generator looks like this:
/** Database column text DBType(TEXT), Length(65535,true) */
val text: Column[String] = column[String]("text", O.Length(65535,varying=true))
It seems that the DDL generator is trying to create the text column as a varchar or something like that, instead of as text as originally intended, because in the original database that column is of type text.
In the autogenerated Slick data model I have profile = scala.slick.driver.MySQLDriver, and I've also imported scala.slick.driver.MySQLDriver.simple._ in my test class, so I shouldn't have any problem from mixing drivers.
I'm using play 2.3, slick 2.1.0, codegen 2.1.0 and play-slick 0.8.0.
I'd appreciate any light on this matter.
It's a bug in the code generator: github.com/slick/slick/issues/975
You can work around it by customizing the code generator. See http://slick.typesafe.com/doc/2.1.0/code-generation.html
One easy way to do it is to override def dbType = true. You lose cross-vendor portability but get the exact types. You may have to filter Length out of def options, not sure.
Also discussed here: https://github.com/slick/slick/issues/982
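A sketch of that customization, following the override pattern in the Slick 2.1 code-generation docs (treat the exact override points as an assumption):

import scala.slick.codegen.SourceCodeGenerator
import scala.slick.model.Model

// Emit O.DBType(...) so generated columns keep the database's exact
// types (e.g. TEXT) instead of a VARCHAR sized from O.Length.
class CustomCodegen(model: Model) extends SourceCodeGenerator(model) {
  override def Table = new Table(_) {
    override def Column = new Column(_) {
      override def dbType = true
      // If a conflicting length option is still emitted, it may also
      // need filtering, e.g.:
      // override def options = super.options.filterNot(_.startsWith("O.Length"))
    }
  }
}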
You may change the column manually:
Create the column as a standard varchar(255):
val text: Column[String] = column[String]("text", O.Length(255,varying=true))
After that you can change the database:
alter table your_table change `text` `text` text;
And restore the values:
val text: Column[String] = column[String]("text", O.Length(65535,varying=true))
On the other hand, you can add a custom field definition:
def recipe = column[Option[String]]("text_column", O.DBType("TEXT"))
Note: I changed the column name to text_column because text is a reserved word in MySQL.

Play + scala + MySQL

I'm starting to develop with Play 2.2.1 and Scala 2.10.2. I'm building a CRUD application, following an example from the book "Play for Scala: Covers Play 2", but I have a problem.
In the book there is this code in the model:
import play.api.Play.current
import play.api.db.DB

def getAll: List[Product] = DB.withConnection { implicit connection =>
  sql().map { row =>
    Product(row[Long]("id"), row[Long]("ean"), row[String]("name"), row[String]("description"))
  }.toList
}
But when I try to run it, I receive this error:
value map is not a member of anorm.SqlQuery
Why doesn't .map work?
Thank you!
SqlQuery doesn't have a map function. I'm not sure how the example in the book is supposed to look, but I'm a little wary of it if it's using that clunky syntax for anorm. It should always be preferred to define result set parsers separately from the function itself, as you'll be able to reuse them elsewhere.
import anorm._
import anorm.SqlParser._
import play.api.Play.current
import play.api.db.DB

case class Product(id: Long, ean: Long, name: String, description: String)

object Product {
  /** Describes how to transform a result row to a `Product`. */
  val parser: RowParser[Product] = {
    get[Long]("products.id") ~
    get[Long]("products.ean") ~
    get[String]("products.name") ~
    get[String]("products.description") map {
      case id ~ ean ~ name ~ description =>
        Product(id, ean, name, description)
    }
  }

  def getAll: List[Product] = {
    DB.withConnection { implicit connection =>
      SQL("SELECT * FROM products").as(parser *)
    }
  }
}
I've made the assumption that your table is named products. It's best to use the full column names in parsers (products.id instead of id): if you later need to combine parsers (using joined results), anorm won't get confused by multiple tables using a similar column name like id. The getAll function now looks much cleaner, and we can reuse the parser for other functions:
def getById(id: Long): Option[Product] = {
  DB.withConnection { implicit connection =>
    SQL("SELECT * FROM products WHERE id = {id}")
      .on("id" -> id)
      .as(parser.singleOpt)
  }
}
In the tutorial it is mentioned that:
"Before we can get any results, we have to create a query. With Anorm, you call anorm.SQL with your query as a String parameter:"
import anorm.SQL
import anorm.SqlQuery
val sql: SqlQuery = SQL("select * from products order by name asc")
Is this missing from your code?
SqlQuery is being made private (pending PR), so it should not be used directly. Instead, the SQL("...") or SQL"..." functions can be used, in a safer way.
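For example, both of these forms build the same query; the interpolated one escapes its parameters for you:

// plain form, with a named placeholder
SQL("SELECT * FROM products WHERE id = {id}").on("id" -> id)

// interpolated form (anorm's SQL string interpolator)
SQL"SELECT * FROM products WHERE id = $id"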

How do we update an HSTORE field with Flask-Admin?

How do I update an HSTORE field with Flask-Admin?
The regular ModelView doesn't show the HSTORE field in the Edit view. It shows nothing - no control at all. In the List view, it shows a column with data in JSON notation. That's fine with me.
Using a custom ModelView, I can change the HSTORE field into a TextAreaField. This shows me the HSTORE field in JSON notation in the Edit view, but I cannot edit/update it. In the List view, it still shows the object in JSON notation. Looks fine to me.
class MyView(ModelView):
    form_overrides = dict(attributes=fields.TextAreaField)
When I attempt to save/edit the JSON, I receive this error:
sqlalchemy.exc.InternalError: (InternalError) Unexpected end of string
LINE 1: UPDATE mytable SET attributes='{}' WHERE mytable.id = ...
                                      ^
'UPDATE mytable SET attributes=%(attributes)s WHERE mytable.id = %(mytable_id)s' {'attributes': u'{}', 'mytable_id': 14L}
Now -- using code, I can get something to save into the HSTORE field:
class MyView(ModelView):
    form_overrides = dict(attributes=fields.TextAreaField)

    def on_model_change(self, form, model, is_created):
        model.attributes = {"a": "1"}
        return
This basically overrides the model and puts this object into it. I can then see the object in the List view and the Edit view. Still not good enough - I want to save/edit the object that the user typed in.
I tried to parse the content from the form as JSON and save it, then read it back out. This doesn't work:
class MyView(ModelView):
    form_overrides = dict(attributes=fields.TextAreaField)

    def on_model_change(self, form, model, is_created):
        x = form.data['attributes']
        y = json.loads(x)
        model.attributes = y
        return
json.loads(x) says this:
ValueError: Expecting property name: line 1 column 1 (char 1)
and here are some sample inputs that fail:
{u's': u'ff'}
{'s':'ff'}
However, this input works:
{}
Blank also works
This is my SQL Table:
CREATE TABLE mytable (
id BIGSERIAL UNIQUE PRIMARY KEY,
attributes hstore
);
This is my SQLAlchemy model:
class MyTable(Base):
    __tablename__ = u'mytable'
    id = Column(BigInteger, primary_key=True)
    attributes = Column(HSTORE)
Here is how I added the views to the admin object:
admin.add_view(ModelView(models.MyTable, db.session))
Or, adding the view using the custom ModelView:
admin.add_view(MyView(models.MyTable, db.session))
(But I don't add both views at the same time - I get a Blueprint name collision error; separate issue.)
I also attempted to use a form field converter. I couldn't get it to actually hit the code.
class MyModelConverter(AdminModelConverter):
    def post_process(self, form_class, info):
        raise Exception('here I am')  # but it never hits this
        return form_class

class MyView(ModelView):
    form_overrides = dict(attributes=fields.TextAreaField)
This answer gives you a bit more than asked.
First of all, it "extends" hstore to be able to store actual JSON, not just key-value pairs.
So this structure is also OK:
{"key":{"inner_object_key":{"Another_key":"Done!","list":["no","problem"]}}}
So, first of all, your ModelView should use a custom converter:
class ExtendedModelView(ModelView):
    model_form_converter = CustomAdminConverter
The converter itself should know how to use the hstore dialect:
class CustomAdminConverter(AdminModelConverter):
    @converts('sqlalchemy.dialects.postgresql.hstore.HSTORE')
    def conv_HSTORE(self, field_args, **extra):
        return DictToHstoreField(**field_args)
This one, as you can see, uses a custom WTForms field which converts data in both directions:
class DictToHstoreField(TextAreaField):
    def process_data(self, value):
        if value is None:
            value = {}
        else:
            for key, obj in value.iteritems():
                if (obj.startswith("{") and obj.endswith("}")) or (obj.startswith("[") and obj.endswith("]")):
                    try:
                        value[key] = json.loads(obj)
                    except:
                        pass
        self.data = json.dumps(value)

    def process_formdata(self, valuelist):
        if valuelist:
            self.data = json.loads(valuelist[0])
            for key, obj in self.data.iteritems():
                if isinstance(obj, dict) or isinstance(obj, list):
                    self.data[key] = json.dumps(obj)
                if isinstance(obj, int):
                    self.data[key] = str(obj)
The final step is to actually use this data in the application.
I did not make this generic and nice for SQLAlchemy, since I used it with flask-restful, so I only have an adaptation for flask-restful in one direction, but I think it's easy to get the idea from here and do the rest.
If your case is simple key-value storage, nothing additional needs to be done; just use it as is.
But if you want to unwrap the JSON somewhere in code, it's as simple as this; just wrap it in a function wherever you use it:
if (value.startswith("{") and value.endswith("}")) or (value.startswith("[") and value.endswith("]")):
    value = json.loads(value)
Creating a dynamic field for a nicer non-JSON way of editing the data is also possible, by extending FormField and adding some JavaScript for adding/removing fields, but that is a whole different story; in my case I needed actual JSON storage, with blackjack and lists :)
I was working with the Postgres JSON datatype. The above solution worked great with minor modifications.
I tried:
'sqlalchemy.dialects.postgresql.json.JSON'
'sqlalchemy.dialects.postgresql.JSON'
'dialects.postgresql.json.JSON'
'dialects.postgresql.JSON'
None of these versions worked.
Finally, the following change worked:
@converts('JSON')
And changed class DictToHstoreField to the following:
class DictToJSONField(fields.TextAreaField):
    def process_data(self, value):
        if value is None:
            value = {}
        self.data = json.dumps(value)

    def process_formdata(self, valuelist):
        if valuelist:
            self.data = json.loads(valuelist[0])
        else:
            self.data = '{}'
Although this might not be the answer to your question: by default, SQLAlchemy's ORM doesn't detect in-place changes to HSTORE field values. But fortunately there's a solution: SQLAlchemy's MutableDict type:
from sqlalchemy.ext.mutable import MutableDict

class MyClass(Base):
    __tablename__ = 'mytable'
    id = Column(Integer, primary_key=True)
    attributes = Column(MutableDict.as_mutable(HSTORE))
Now when you change something in-place:
my_object.attributes['some_key'] = 'some value'
The hstore field will be updated after session.commit().