I am using the SQLModel library to do a simple select() as described on the official website, but I am getting a Column expression or FROM clause expected error message.
from typing import Optional
from sqlmodel import Field, Session, SQLModel, create_engine, select
from models import Hero
sqrl = f"mysql+pymysql:///roo#asdf:localhost:3306/datab"
engine = create_engine(sqrl, echo=True)
def create_db_and_tables():
    SQLModel.metadata.create_all(engine)

def select_heroes():
    with Session(engine) as session:
        statement = select(Hero)
        results = session.exec(statement)
        for hero in results:
            print(hero)

def main():
    select_heroes()

if __name__ == "__main__":
    main()
This is my models/Hero.py code:
from datetime import datetime, date, time
from typing import Optional
from sqlmodel import Field, SQLModel
class Hero(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str
    secret_name: str
    age: Optional[int] = None
    created: datetime
    lastseen: time
When I run app.py I get the following message: sqlalchemy.exc.ArgumentError: Column expression or FROM clause expected, got <module 'models.Hero' from '/Users/dev/test/models/Hero.py'>.
The error message Column expression or FROM clause expected, got <module 'models.Hero' from '/Users/dev/test/models/Hero.py'> tells us:
that SQLModel / SQLAlchemy unexpectedly received a module object named models.Hero
that you have a module named Hero.py
The import statement from models import Hero only imports the module Hero. Either
change the import to import the model*
from models.Hero import Hero
change the code in select_heroes to reference the model†
statement = select(Hero.Hero)
* It's conventional to use all lowercase for module names; following this convention will help you distinguish between modules and models.
† This approach is preferable in my opinion: accessing the object via the module namespace eliminates the possibility of name collisions (of course it can be combined with lowercase module names).
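For completeness, here is a sketch of what select_heroes from the question looks like with the second approach, keeping the module import and qualifying the class name:

from models import Hero  # imports the module models/Hero.py, not the class

def select_heroes():
    with Session(engine) as session:
        statement = select(Hero.Hero)  # Hero.Hero is the model class inside the Hero module
        results = session.exec(statement)
        for hero in results:
            print(hero)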
I'm in the process of migrating to SQLAlchemy 2.0 and adopting the new Declarative syntax with MappedAsDataclass. I had previously implemented joined table inheritance for my models. The (simplified) code looks like this:
from sqlalchemy import ForeignKey, String
from sqlalchemy.orm import DeclarativeBase, Mapped, MappedAsDataclass, mapped_column
class Base(MappedAsDataclass, DeclarativeBase):
    pass

class Foo(Base):
    __tablename__ = "foo"

    id: Mapped[int] = mapped_column(primary_key=True)
    type: Mapped[str] = mapped_column(String(50))
    foo_value: Mapped[float] = mapped_column(default=78)

    __mapper_args__ = {"polymorphic_identity": "foo", "polymorphic_on": "type"}

class Bar(Foo):
    __tablename__ = "bar"

    id: Mapped[int] = mapped_column(ForeignKey("foo.id"), primary_key=True)
    bar_value: Mapped[float]

    __mapper_args__ = {"polymorphic_identity": "bar"}
The important bit for the question is the default value on foo_value. Because of its presence, a TypeError: non-default argument 'bar_value' follows default argument is raised. Moving fields around within the definition of a single class could make this error go away (though why is it raised in the first place, since field order should not really matter?), but that is not possible with inherited models.
How can I fix or work around this limitation? Am I missing something relevant from the documentation?
It seems I needed to use insert_default with MappedAsDataclass instead of default, as described in the docs.
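A minimal sketch of the adjusted Foo definition following that advice (only the foo_value line changes from the question's code):

class Foo(Base):
    __tablename__ = "foo"

    id: Mapped[int] = mapped_column(primary_key=True)
    type: Mapped[str] = mapped_column(String(50))
    # insert_default sets the column default used at INSERT time without giving
    # the generated dataclass field a default value, so Bar's bar_value no
    # longer follows a defaulted argument
    foo_value: Mapped[float] = mapped_column(insert_default=78)

    __mapper_args__ = {"polymorphic_identity": "foo", "polymorphic_on": "type"}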
Suppose I have a simple SQLAlchemy class and a simple Flask or FastAPI implementation like this:
from sqlalchemy import Column, String
from sqlalchemy.ext.declarative import declarative_base
from pydantic import BaseModel

Base = declarative_base()

class A(Base):
    __tablename__ = 'as'
    my_id = Column(String, primary_key=True)  # a mapped class needs a primary key

class AModel(BaseModel):
    myId: str = None
And a simple endpoint like this:
@app_router.get('/a')
def get_all_a(session: Session = Depends(get_session)):
    return session.query(A).all()
How can I ensure that the list returned by this endpoint is rendered in camelCase, like this:
[{'myId': 'id1'},{'myId': 'id2'}, ...]
Note: My application is rather complex, as I also have pagination implemented and some post-processing requires a little bit more than just snake_case to camelCase conversion, so the simplest solution would be best.
I've tried overriding dict() methods and similar approaches with no luck; I simply cannot understand how FastAPI processes the results to produce the JSON.
require a little bit more than just snake_case to camelCase conversion
Well, if you don't use response_model you don't have a lot of choices.
The solution is to return your dict after converting its keys from snake_case to camelCase; there are functions that do this recursively.
Why is this the best solution?
Using a regex it's very fast, faster than any library that converts dicts to objects, like pydantic.
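A minimal sketch of such a recursive converter (the function names here are just illustrative):

import re

def to_camel(key: str) -> str:
    # "my_id" -> "myId"
    return re.sub(r"_([a-z0-9])", lambda m: m.group(1).upper(), key)

def camelize(obj):
    # recursively convert dict keys, descending into nested dicts and lists
    if isinstance(obj, dict):
        return {to_camel(k): camelize(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [camelize(v) for v in obj]
    return obj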
If you definitely don't want to do this, your only option is to use pydantic models, attrs, or dataclasses and convert your DB query output to one of those model types with camelCase attribute names (dirty).
Since you are using FastAPI, you should use its full power.
I would suggest this:
from typing import List

from sqlalchemy import Column, String
from sqlalchemy.ext.declarative import declarative_base
from pydantic import BaseModel, Field, parse_obj_as

Base = declarative_base()

class A(Base):
    __tablename__ = 'as'
    my_id = Column(String, primary_key=True)

class AModel(BaseModel):
    myId: str = Field(alias="my_id", default=None)

    class Config:
        orm_mode = True  # let pydantic read attributes from the SQLAlchemy objects

@app_router.get('/a', response_model=List[AModel])
def get_all_a(session: Session = Depends(get_session)):
    return parse_obj_as(List[AModel], session.query(A).all())
Keep in mind that having class attributes in camelCase is not good Python practice.
The ideal answer would be not to return camelCase but snake_case, and let your client do the conversion if needed.
I am trying to map a simple case class to JSON using Play version 2.6.2 and Scala 2.11.11:
import play.api.libs.json._
import play.api.libs.json.util._
import play.api.libs.json.Reads._
import play.api.libs.json.Writes._
import play.api.libs.json.Format._
import play.api.libs.functional.syntax._
case class ObjectInfo(
  names: Iterable[String],
  info: Iterable[String]
)

object ObjectInfo {
  /**
   * Mapping to and from JSON.
   */
  implicit val documentFormatter = Json.format[ObjectInfo]
}
getting:
No instance of play.api.libs.json.Format is available for
scala.Iterable[java.lang.String], scala.Iterable[java.lang.String] in
the implicit scope (Hint: if declared in the same file, make sure it's
declared before)
I was expecting Play to automatically map these fields since they're not complex object types but simple Collection of strings.
You provide "too much" implicit stuff with your imports. If you remove all imports but the first one, it will compile and do what you want.
If you enable implicit parameter logging via the scalac option -Xlog-implicits, you will see various "ambiguity" and "diverging implicit expansion" errors. The conflicting imports are import play.api.libs.json.Reads._ / import play.api.libs.json.Writes._ and import play.api.libs.json.Format._. Maybe someone else can explain this conflict in more detail.
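A minimal version following that advice (only the first import kept) would look like this:

import play.api.libs.json._

case class ObjectInfo(
  names: Iterable[String],
  info: Iterable[String]
)

object ObjectInfo {
  /**
   * Mapping to and from JSON.
   */
  implicit val documentFormatter: Format[ObjectInfo] = Json.format[ObjectInfo]
}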
I am new to Slick. I am using MySQL and I am trying to retrieve some datetime values from the database. Here are my imports:
import slick.driver.MySQLDriver.simple._
import scala.slick.driver._
import java.util.Date
and here is the line of the class where the mapping is:
def creationDate = column[Date]("creation_date")
But I am getting this error
could not find implicit value for parameter tt: slick.ast.TypedType[java.util.Date]
Is there a way to map a datetime column from MySQL to a java.util.Date without going through String?
Thank you
The reason you can't use java.util.Date in the column is that it's not supported by Slick; see the Table Rows part of the documentation:
The following primitive types are supported out of the box for JDBC-based databases in JdbcProfile (with certain limitations imposed by the individual database drivers):
Date types: java.sql.Date, java.sql.Time, java.sql.Timestamp
Thus, no implicit TypedType[C] is provided.
def column[C](n: String, options: ColumnOption[C]*)
             (implicit tt: TypedType[C]): Rep[C] = {
If you look at the subtypes of TypedType, you will find three time-related classes in slick.driver.JdbcTypesComponent:
DateJdbcType for java.sql.Date
TimestampJdbcType for java.sql.Timestamp
TimeJdbcType for java.sql.Time
The types defined are thus in line with what is stated in the documentation: three time-related types.
I use Timestamp with Slick 3.0 in my program as follows:
import slick.driver.MySQLDriver.api._
import java.sql.Timestamp
case class Temp(creation_date: Timestamp)

class Tests(tag: Tag) extends Table[Temp](tag, "tests") {
  def creationDate = column[Timestamp]("creation_date")
  def * = creationDate <> ((creationDate: Timestamp) => Temp.apply(creationDate), Temp.unapply _)
}
That way, you just have to convert Timestamp to whatever time-related type you want, back and forth, but that should be no big deal.
Anyway, hope it helps.
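For example, converting between java.util.Date and java.sql.Timestamp at the application boundary is a one-liner each way (a sketch):

import java.util.Date
import java.sql.Timestamp

def toTimestamp(d: Date): Timestamp = new Timestamp(d.getTime)
def toDate(t: Timestamp): Date = new Date(t.getTime)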
Have you tried Joda-Time?
If not, you should give it serious thought. There is also a Slick mapper project for it: https://github.com/tototoshi/slick-joda-mapper
import org.joda.time.DateTime
import com.github.tototoshi.slick.MySQLJodaSupport._
// then it works just the same
def creationDate = column[DateTime]("creation_date")
If you want to use java.util.Date as is, create a mapping from the Date type to Timestamp:
import slick.driver.MySQLDriver.api._
import java.util.Date
import java.sql.Timestamp
implicit def mapDate = MappedColumnType.base[Date, Timestamp](
  d => new Timestamp(d.getTime),
  identity
)
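With that implicit mapping in scope, the column definition from the question should then compile as written:

def creationDate = column[Date]("creation_date")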
I am trying to get some results from a MySQL database and I am getting an error when passing them to a scala.html view. Here is my code:
/* Customers.scala. Its controller */
package controllers

import play.api._
import play.api.mvc._
import models.Customers

object Customers extends Controller {

  def customer = Action {
    val nb_customers = Customers.allCustomers
    Ok(views.html.customer(nb_customers)) // I am having the error here.
  } // End of customer Action.

} // End of Customers controller.

/* Now the Customers.scala model */
package models

import anorm._
import play.api.db._
import play.api.Play.current

case class Customers(CustomersID: Int, Name: String)

object Customers {

  def allCustomers = {
    DB.withConnection { implicit connection =>
      SQL("Select * from Customers")().map { row =>
        Customers(
          CustomersID = row[Int]("CustomersID"),
          Name = row[String]("Name")
        )
      }.toList // SQL ends.
    } // withConnection ends.
  } // End of allCustomers.

} // End of Customers object.
Please note that I am using the JDBC driver for the MySQL connection, configured in the conf/application.conf file.
Please help me out here. Thanks a lot.
There is a namespace conflict between your Customers controller and your Customers model, as both are in scope. There are two things you can do to fix this:
Rename your model to something different, like Customer.
Change Customers.allCustomers to models.Customers.allCustomers to differentiate from controllers.Customers.
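For example, with the second option only the qualified name in the action changes; a sketch:

def customer = Action {
  val nb_customers = models.Customers.allCustomers // fully qualified, so it no longer resolves to controllers.Customers
  Ok(views.html.customer(nb_customers))
}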