EBean mapping Booleans defaulting to false instead of null - MySQL

I've been using Scala + EBean and I have a problem.
I have a model that looks a bit like this:
case class SomeModel(name: String) extends Model {
var someBool: Boolean = _
}
The problem is that even though the default value of someBool is null in the schema, EBean fills it in with 0 (it maps the field to a TINYINT in MySQL). I should be able to save a null in the field as well.
(Ideally I'd like to keep track of whether or not the field has been set to a value in the model, where a null for a field would mean the field hasn't been filled in yet.)
What would be the best way to solve this?

A possible solution is simply to replace Boolean with java.lang.Boolean. Scala's Boolean is the JVM primitive boolean, which can never be null, whereas java.lang.Boolean is a boxed reference type that can hold null, so EBean can store and read NULL for it. So:
case class SomeModel(name: String) extends Model {
var someBool: java.lang.Boolean = _
}
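With the boxed type, a null in the field can then serve as the "hasn't been filled in yet" marker from the question. A minimal sketch of that check:
val m = SomeModel("example")
// someBool is now a reference type, so it stays null until assigned
if (m.someBool == null) {
  // the field has never been set
} else if (m.someBool) {
  // explicitly set to true (Scala auto-unboxes java.lang.Boolean here)
}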

Related

JDBI select on varbinary and uuid

A legacy MySQL db table has an id column that is non-human-readable raw varbinary (don't ask me why :P)
CREATE TABLE IF NOT EXISTS `tbl_portfolio` (
`id` varbinary(16) NOT NULL,
`name` varchar(128) NOT NULL,
...
PRIMARY KEY (`id`)
);
and I need to select on it based on a java.util.UUID
jdbiReader
.withHandle<PortfolioData, JdbiException> { handle ->
handle
.createQuery(
"""
SELECT *
FROM tbl_portfolio
WHERE id = :id
"""
)
.bind("id", uuid) //mapping this uuid into the varbinary
//id db column is the problem
.mapTo(PortfolioData::class.java) //the mapper out does work
.firstOrNull()
}
Just in case anyone wants to see it, here's the mapper (but again, the mapper is not the problem; binding the UUID to the varbinary id db column is):
class PortfolioDataMapper : RowMapper<PortfolioData> {
override fun map(
rs: ResultSet,
ctx: StatementContext
): PortfolioData = PortfolioData(
fromBytes(rs.getBytes("id")),
rs.getString("name"),
rs.getString("portfolio_idempotent_key")
)
private fun fromBytes(bytes: ByteArray): UUID {
val byteBuff = ByteBuffer.wrap(bytes)
val first = byteBuff.long
val second = byteBuff.long
return UUID(first, second)
}
}
I've tried all kinds of things to get the binding to work but no success - any advice much appreciated!
Finally got it to work, partly thanks to https://jdbi.org/#_argumentfactory, which actually deals with UUID specifically but which I somehow missed despite looking at the JDBI docs for hours. Oh well.
The query can remain like this
jdbiReader
.withHandle<PortfolioData, JdbiException> { handle ->
handle
.createQuery(
"""
SELECT *
FROM tbl_portfolio
WHERE id = :id
"""
)
.bind("id", uuid)
.mapTo(PortfolioData::class.java)
.firstOrNull()
}
But JDBI needs a UUIDArgumentFactory registered:
jdbi.registerArgument(UUIDArgumentFactory(VARBINARY))
where
class UUIDArgumentFactory(sqlType: Int) : AbstractArgumentFactory<UUID>(sqlType) {
override fun build(
value: UUID,
config: ConfigRegistry?
): Argument {
return UUIDArgument(value)
}
}
where
class UUIDArgument(private val value: UUID) : Argument {
companion object {
private const val UUID_SIZE = 16
}
@Throws(SQLException::class)
override fun apply(
position: Int,
statement: PreparedStatement,
ctx: StatementContext
) {
val bb = ByteBuffer.wrap(ByteArray(UUID_SIZE))
bb.putLong(value.mostSignificantBits)
bb.putLong(value.leastSignificantBits)
statement.setBytes(position, bb.array())
}
}
NOTE that registering an ArgumentFactory on the entire Jdbi instance like this will make ALL UUID arguments passed to .bind map to bytes, which MAY not be what you want if other UUID arguments elsewhere in your code base are stored on the MySQL end as something other than VARBINARY. For example, you may have another table with a column where your JVM UUIDs are actually stored as VARCHAR or whatever. In that case, rather than registering the UUID ArgumentFactory on the entire Jdbi instance, only use it ad hoc on the individual queries where it is appropriate.
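If you only need the conversion for this one query, one way to keep it ad hoc is to convert the UUID to bytes yourself and bind the ByteArray directly; Jdbi binds byte arrays to VARBINARY out of the box. A sketch (the uuidToBytes helper is made up here, mirroring the factory above):
import java.nio.ByteBuffer
import java.util.UUID

// hypothetical helper: UUID -> 16-byte array, matching the VARBINARY(16) column
fun uuidToBytes(uuid: UUID): ByteArray =
    ByteBuffer.wrap(ByteArray(16)).apply {
        putLong(uuid.mostSignificantBits)
        putLong(uuid.leastSignificantBits)
    }.array()

// then bind the bytes instead of the UUID itself
handle
    .createQuery("SELECT * FROM tbl_portfolio WHERE id = :id")
    .bind("id", uuidToBytes(uuid)) // ByteArray binds to VARBINARY directly
    .mapTo(PortfolioData::class.java)
    .firstOrNull()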

TypeORM integer entity in MySQL to 'enum string' representation in JSON

I have an existing MySQL database table userdo that has a column userType of type integer, such that an administrator user is represented as the value 0, and so on.
When I work with my TypeORM userdo entity object, I want its JSON form to represent the userType property as a string enumeration. For example, if the userType is 0 in the table, then when serializing/deserializing to JSON (say, when getting a userdo instance as a result of a findOne() and sending it via res.json()), I want:
{
...
userType: 'ADMIN',
...
}
There are several ways I have considered trying to do this:
Using the entity enum type. This doesn't seem to solve my problem because at runtime the enum is just a normal JavaScript integer, and so it is printed in the JSON as such.
Use a ValueTransformer. I haven't investigated this much yet, but I believe this would entail writing some code to marshal between a set of strings ('ADMIN', ...) and the respective integer values (a rough sketch of this follows the list below).
Use the library https://github.com/typestack/class-transformer. I am not certain yet if this will achieve what I want.
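For reference, a rough, untested sketch of what option 2 might look like, using TypeORM's column transformer option (the to/from functions run on write and read); the entity shape, the UserDO class name, and the enum ordering are all assumptions here:
import { Entity, PrimaryGeneratedColumn, Column, ValueTransformer } from "typeorm";

// assumed ordering: 0 -> ADMIN, 1 -> SUPERADMIN, ...
const userTypeNames = ["ADMIN", "SUPERADMIN"];

const userTypeTransformer: ValueTransformer = {
  to: (value: string) => userTypeNames.indexOf(value), // entity -> db integer
  from: (value: number) => userTypeNames[value],       // db integer -> entity string
};

@Entity()
export class UserDO {
  @PrimaryGeneratedColumn()
  id: number;

  // stored as an integer in MySQL, seen as a string on the entity
  @Column({ type: "int", transformer: userTypeTransformer })
  userType: string;
}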
From what I know of SQL, I don't think there is a way of doing this in SQL itself, so I wouldn't expect TypeORM to be able to do it either.
Anyway, the way I would do it is to declare a const object that can do the translation for you. If you're using numbers, then maybe even an array:
const typeToWord = {
0: "ADMIN",
1: "SUPERADMIN",
/*... and so on*/
};
// or: ["ADMIN", "SUPERADMIN"]
const typeToNumber = {
"ADMIN": 0,
"SUPERADMIN": 1,
/* ... */
};
And the idea is to use that right after the query, so the rest of your code uses the translation, e.g. "ADMIN".
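For instance, applied right after the findOne() from the question (the repository and variable names here are assumed):
const user = await userRepository.findOne({ where: { id } });
if (user) {
  // swap the integer for its string form just before serializing
  res.json({ ...user, userType: typeToWord[user.userType] });
}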

ServiceStack.OrmLite: Implementing custom StringConverter affects column type of complex BLOB fields

In a previous SO question I asked how to change the MySQL column type when I have string properties in my POCOs.
The answer, which I wrote myself, was to implement my own StringConverter. It was a sort of acceptable approach, I thought.
However, as I noted in my answer, not only were my string properties affected; so were all the complex properties that OrmLite BLOBs as JSON.
The field types for those complex columns in MySQL also became varchar(255), which of course doesn't hold very much.
The StringConverter was very short and easy.
I said that the default string length is 255:
StringConverter converter = OrmLiteConfig.DialectProvider.GetStringConverter();
converter.StringLength = 255;
I wanted string props defined as 255 chars or smaller to be varchar(255),
string props defined as 256 to 65535 chars to be text,
and string props defined as larger than 65535 chars to be longtext.
MyStringConverter:
public class MyStringConverter : StringConverter
{
public override string GetColumnDefinition(int? stringLength)
{
if (stringLength.GetValueOrDefault() == StringLengthAttribute.MaxText)
return MaxColumnDefinition;
if (stringLength.GetValueOrDefault(StringLength) <= 255)
{
return UseUnicode
? $"NVARCHAR({stringLength.GetValueOrDefault(StringLength)})"
: $"VARCHAR({stringLength.GetValueOrDefault(StringLength)})";
}
else if (stringLength.GetValueOrDefault(StringLength) <= 65535)
{
return $"TEXT";
}
else
{
return "LONGTEXT";
}
}
}
But, as stated above, a property that looks like this (ActionInfo just contains some strings and a List):
public List<ActionInfo> _AvailableActions { get; set; }
produced a varchar(255) column when using MyStringConverter.
Without using MyStringConverter, the column became a longtext.
Question is then: what am I missing? I still want complex, BLOBed fields to become longtext so that JSON BLOBing can be done. I was hoping that the StringConverter only affected string properties?
The column definition of blobbed complex types should use the ReferenceTypeConverter, which by default resolves to DialectProvider.GetStringConverter().MaxColumnDefinition.
You should inherit the RDBMS-specific MySqlStringConverter if you want to change the string behavior of MySQL strings; otherwise you revert to the generic RDBMS behavior, which uses VARCHAR(8000) for strings and for MaxColumnDefinition.
Alternatively you can override MaxColumnDefinition in your own String converter:
public override string MaxColumnDefinition => "LONGTEXT";
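Putting the two suggestions together, a sketch of what the converter might look like (untested here; it assumes MySqlStringConverter exposes the same members used in the question's converter):
public class MyStringConverter : MySqlStringConverter
{
    // blobbed complex types resolve to MaxColumnDefinition, so keep it LONGTEXT
    public override string MaxColumnDefinition => "LONGTEXT";

    public override string GetColumnDefinition(int? stringLength)
    {
        if (stringLength.GetValueOrDefault() == StringLengthAttribute.MaxText)
            return MaxColumnDefinition;

        var length = stringLength.GetValueOrDefault(StringLength);
        if (length <= 255)
            return UseUnicode ? $"NVARCHAR({length})" : $"VARCHAR({length})";
        if (length <= 65535)
            return "TEXT";
        return "LONGTEXT";
    }
}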

Fetch the response from sql, store it in a object and use conditions?

I have two SQL statements to be executed with a validity check. I need to execute the first query, store its response in an object, check whether that object is empty, and execute the second query only if it is not empty.
So, I have tried something like this.
In rolerepository.scala:
override val allQuery = s"""
select UserRoles.* from
(select CASE rbac.roleTypeID
ELSE rbac.name JOIN dirNetworkInfo ni
ON UserRoles.PersonID = ni.PersonID
where ni.Loginname = {loginName}
and UserRoles.roleName in ( 'Business User ','Administrator')"""
(This is just a sample of the query; it is not written out in full here.)
Then I map it to an object, with the model class written elsewhere:
override def map2Object(implicit map: Map[String, Any]): HierarchyEntryBillingRoleCheck = {
  HierarchyEntryBillingRoleCheck(str("roleName"), oint("PersonID"))
}
Then I have written the getAll method to execute the query:
override def getAll(implicit loginName: String): Future[Seq[HierarchyEntryBillingRoleCheck]] = {
  doQueryIgnoreRowErrors(allQuery, "loginName" -> loginName)
}
Then I have written the method to check whether the response from the first SQL statement is empty or not. This is where I'm stuck and not able to proceed further.
def method1()= {
val getallresponse = HierarchyEntryBillingRoleCheck
getallresponse.toString
if (getallresponse != " ")
billingMonthCheckRepository.getrepo()
}
I am getting a type mismatch error at the last closing brace, and I don't know what other logic can be used here.
Can any one of you please explain and give me some solution for this?
I also tried to use a for loop in the controller, but couldn't work out how. I tried:
def getAll(implicit queryParams: QueryParams,
billingMonthmodel:Seq[HierarchyEntryBillingRoleCheck]):
Action[AnyContent] = securityService.authenticate() { implicit request
=> withErrorRecovery { req =>
toJson {
repository.getAll(request.user.loginName)
for {
rolenamecheck <- billingMonthmodel
}yield rolenamecheck
}}}}
You don't say which db access method you are using (I'm assuming Anorm). One way of approaching this is:
Create a case class matching your table
Create a parser matching your case class
use Option (or Either) to return a row for a specific set of parameters
For example, perhaps you have:
case class UserRole (id:Int, loginName:String, roleName:String)
And then
object UserRole {
val sqlFields = "ur.id, ur.loginName, ur.roleName"
val userRoleParser = {
get[Int]("id") ~
get[String]("loginName") ~
get[String]("roleName") map {
case id ~ loginName ~ roleName => {
UserRole(id, loginName, roleName)
}
}
}
...
The parser maps the row to your case class. The next step is creating either single row methods like findById or findByLoginName and multi-row methods, perhaps allForRoleName or other generic filter methods. In your case there might (assuming a single role per loginName) be something like:
def findByLoginName(loginName: String): Option[UserRole] = DB.withConnection { implicit c =>
  SQL(s"select $sqlFields from userRoles ur ...")
    .on('loginName -> loginName)
    .as(userRoleParser.singleOpt)
}
The .as(parser...) call is key. Typically, you'll need at least:
as(parser.singleOpt), which returns an Option of your case class
as(parser *), which returns a List of your case class (you'll need this if multiple roles could exist for a login)
as(scalar[Long].singleOpt), which returns an Option[Long] and is handy for returning counts or exists values
Then, to eventually return to your question a little more directly, you can call your find method, and if it returns something, continue with the second method call, perhaps like this:
val userRole = findByLoginName(loginName)
if (userRole.isDefined)
billingMonthCheckRepository.getrepo()
or, a little more idiomatically
findByLoginName(loginName).map { userRole =>
billingMonthCheckRepository.getrepo()
...
I've shown the find method returning an Option, but in reality we find it more useful to return an Either[String, (your case class)], where the String contains the reason for failure. Either is cool.
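For illustration, a sketch of that Either variant, using the same hypothetical table and parser as above:
def findByLoginName(loginName: String): Either[String, UserRole] = DB.withConnection { implicit c =>
  SQL(s"select $sqlFields from userRoles ur where ur.loginName = {loginName}")
    .on('loginName -> loginName)
    .as(userRoleParser.singleOpt) match {
      case Some(role) => Right(role)                                  // found a row
      case None       => Left(s"No role found for login '$loginName'") // reason for failure
    }
}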
On my version of play (2.3.x), the imports for the above are:
import play.api.db._
import play.api.Play.current
import anorm._
import anorm.SqlParser._
You're going to be doing this sort of thing a lot so worth finding a set of patterns that works for you.
Because I find Play's documentation very tough to trudge through if you're unfamiliar with it, I won't just leave a link to it.
You have to inject an instance of your database into your controller. This will then give it to you as a global variable:
@Singleton
class LoginRegController @Inject()(myDB: Database, cc: ControllerComponents) {
  // do stuff
}
But it's bad practice to actually use this connection within the controller, because JDBC is a blocking operation, so you need to create a Model which takes the db as a parameter to a method. Don't set the constructor of the object to take the DB and store it as a field; for some reason this creates connection leaks, and the connections won't be released when they are done with your query. Not sure why, but that's how it is.
Create a Model object that you will use to execute your query. Instead of passing the DB through the object's constructor, pass it through the method you will create:
object DBChecker {
  def attemptLogin(db: Database, password: String): String = {
  }
}
In your method, use the method .withConnection { conn => to access your JDBC connection. So, something like this:
object DBChecker {
  def attemptLogin(db: Database, password: String): String = {
    var username: String = ""
    db.withConnection { conn =>
      val query: String = s"SELECT uploaded_by, date_added FROM tableName WHERE password = '$password';"
      val stmt = conn.createStatement()
      val qryResult: ResultSet = stmt.executeQuery(query)
      // then iterate over your ResultSet to get the results from the query
      if (qryResult.next()) {
        username = qryResult.getString("uploaded_by")
      }
    }
    username
  }
}
// but note: please look into the use of PreparedStatement objects; doing it this way leaves you vulnerable to SQL injection.
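Following up on that note, a parameterized sketch of the same lookup (same made-up table and column names as above), so the password is bound by the driver instead of interpolated into the SQL:
import java.sql.ResultSet
import play.api.db.Database

object DBCheckerSafe {
  def attemptLogin(db: Database, password: String): String = {
    var username: String = ""
    db.withConnection { conn =>
      // the ? placeholder is bound by the driver, so no SQL injection
      val stmt = conn.prepareStatement("SELECT uploaded_by FROM tableName WHERE password = ?")
      stmt.setString(1, password)
      val qryResult: ResultSet = stmt.executeQuery()
      if (qryResult.next()) {
        username = qryResult.getString("uploaded_by")
      }
    }
    username
  }
}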
In your Controller, as long as you import the object, you can then call that object's methods from the controller you made in Step 1.
import com.path.to.object.DBChecker

@Singleton
class LoginRegController @Inject()(myDB: Database, cc: ControllerComponents) {
  def attemptLogin(pass: String) = Action { implicit request: Request[AnyContent] =>
    val result: String = DBChecker.attemptLogin(myDB, pass)
    // do your work with the results here, e.g. return them in the response
    Ok(result)
  }
}

BLToolkit MapValue not mapping

We are starting to do a conversion to BLToolkit but are hitting some issues and not finding answers. One such issue is the inability to get the MapValue attribute on our DTOs to map properly.
Using T4 templates, we generate this (as an example):
[MapField("counterparty_fl")]
[MapValue(true, 'y')]
[MapValue(false, 'n')]
public bool CounterpartyFlag { get; set; } // flag_yn_TY(1)
Our database is Sybase, and the field counterparty_fl is a char(1) that accepts either 'y' or 'n'.
However, when I look at the SQL generated by the following LINQ query, it is writing [counterparty_fl] = 0. What I need is [counterparty_fl] = 'n'.
var results = (from i in facade.InputList
where (
i.UserIdentifier == criteria.UserId &&
i.CounterpartyFlag == false &&
i.Name == criteria.Name)
select i);
Has anyone had better luck with MapValue? Any suggestions?
This type of mapping is not supported for LINQ queries. The problem is that the CounterpartyFlag field can be mapped to the 'y'/'n' values, but the false in your expression can't be.
You can use an enum for the CounterpartyFlag field type:
// the enum needs a name; YesNo here is just an example
public enum YesNo
{
    [MapValue('y')] Yes,
    [MapValue('n')] No
}
This should work.
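With that in place, the property and the LINQ query from the question would look something like this (YesNo is just the example name from above):
[MapField("counterparty_fl")]
public YesNo CounterpartyFlag { get; set; } // still char(1) 'y'/'n' in Sybase

// the enum member carries the 'y'/'n' mapping, so the generated SQL
// can emit [counterparty_fl] = 'n' instead of 0
var results = from i in facade.InputList
              where i.UserIdentifier == criteria.UserId &&
                    i.CounterpartyFlag == YesNo.No &&
                    i.Name == criteria.Name
              select i;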