I have a working API endpoint that creates a single user.
@app.post("/entity/", response_model=List[schemas.User])
def create_user(user: schemas.User, db: Session = Depends(get_db)):
    crud.create_user(db=db, user=user)
    return JSONResponse(content={"message": "user created successfully"})
class User(BaseModel):
    id: str = Field(default_factory=generate_id)
    first_name: Optional[str] = ""
    last_name: Optional[str] = ""
    username: str

    class Config:
        orm_mode = True
def create_user(db: Session, user: schemas.User):
    db_item = models.User(**user.dict())
    db.add(db_item)
    db.commit()
    db.refresh(db_item)
    return db_item
This works, but I want to create multiple users in one API request.
I guess the create_user function has to look something like this:
def create_user(db: Session, data, user: schemas.User):
    objects = []
    for user in data:
        db_item = models.User(**user.dict())
        objects.append(db_item)
    db.bulk_save_objects(objects)
    db.commit()
I just can't get my head around the right way to do this kind of bulk insert.
You can accept a list of users as the request body.
def create_user(db: Session, users: List[schemas.User]):
    objects = []
    for user in users:
        db_item = models.User(**user.dict())
        objects.append(db_item)
    db.bulk_save_objects(objects)
    db.commit()
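For completeness, the endpoint itself just declares the body as a list and hands it to that function; a minimal sketch (the /entities/ path, the create_users name and the response message are illustrative, not from the original code):

from typing import List

from fastapi import Depends
from fastapi.responses import JSONResponse
from sqlalchemy.orm import Session


@app.post("/entities/")
def create_users(users: List[schemas.User], db: Session = Depends(get_db)):
    # FastAPI parses a JSON array in the request body into List[schemas.User]
    crud.create_user(db=db, users=users)
    return JSONResponse(content={"message": f"{len(users)} users created successfully"})

One caveat: bulk_save_objects does not refresh the inserted objects, so their generated ids are not populated afterwards; if you need the created rows back, db.add_all(objects) followed by a commit (and refresh) is the simpler route.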
First of all, I'm a newcomer to Scala and really need a little help. I need to build a web API, and I'm trying to insert one record into the database, but I have some problems mapping the entity (DB table) to a model (class). I worked with .NET Core Web API (there I used Entity Framework Core; here in Scala I use Slick) and I'm trying to keep the same architecture, but I need some more information, because on the internet I find a lot of different versions and cannot choose the best one.
MySQL is used as the database.
User.scala
case class User(
id: Int = 0,
userName: String,
firstName: String,
lastName: String
) {
override def equals(that: Any): Boolean = true
}
object User {
implicit object UserFormat extends Format[User] {
def writes(user: User): JsValue = {
val userSeq = Seq(
"id" -> JsNumber(user.id),
"userName" -> JsString(user.userName),
"firstName" -> JsString(user.firstName),
"lastName" -> JsString(user.lastName)
)
JsObject(userSeq)
}
def reads(json: JsValue): JsResult[User] = {
JsSuccess(User(
(json \ "id").as[Int].value,
(json \ "userName").as[String].value,
(json \ "firstName").as[String].value,
(json \ "lastName").as[String].value)
)
}
}
def tupled = (this.apply _).tupled
}
class UserMap @Inject()(protected val dbConfigProvider: DatabaseConfigProvider)(implicit ex: ExecutionContext) {
val dbConfig: DatabaseConfig[JdbcProfile] = dbConfigProvider.get[JdbcProfile]
val db: JdbcBackend#DatabaseDef = dbConfig.db
val dbUsers = TableQuery[UserDef]
def getAll(): Unit = {
val action = sql"SELECT Id, UserName, FirstName, LastName FROM Users".as[(Int, String, String, String)]
return db.run(action)
}
def add(user: User): Future[Seq[User]] = {
dbUsers += user
db.run(dbUsers.result)
}
}
UserDef.scala (which is a mapper of db table / entity)
class UserDef(tag: Tag) extends Table[User](tag, "Users") {
def id = column[Int]("Id", O.PrimaryKey, O.AutoInc)
def userName = column[String]("UserName")
def firstName = column[String]("FirstName")
def lastName = column[String]("LastName")
override def * = (id, userName, firstName, lastName) <> (create, extract)
def create(user: (Int, String, String, String)): User = User(user._1, user._2, user._3, user._4)
def extract(user: User): Option[(Int, String, String, String)] = Some((user.id, user.userName,user.firstName,user.lastName))
}
UsersController.scala
def createUser = Action(parse.json) { implicit request => {
val userJson = request.body
var user = new User(
-1,
(userJson \ "userName").as[String].value,
(userJson \ "firstName").as[String].value,
(userJson \ "lastName").as[String].value
)
var users = TableQuery[UserDef]
Await.result(db.run(DBIO.seq(
users += user,
users.result.map(println))), Duration.Inf
)
Ok(Json.toJson(user))
}
}
How I see the problem:
UserDef is an entity and must stay clean, containing only the table column definitions.
UserMap is the bridge between the User class and UserDef (the entity); it can be used as a repository with CRUD methods (getAll(), getById(id), create(user), update(user), delete(id)). It currently sits in the same file as the User class, but should probably be moved to another one.
The User class is the model and should contain only its fields plus the writes/reads (Play JSON specifics).
And now, in the controller:
If I try to insert a record into the database with the current method, I first need to get all rows from the table and then add the new record to that list. What happens if I have 3-4 million records in this table? Fetching all those rows just to insert a single new one is pointless.
Then, after inserting the new row, I need to return it to the client, but how can I get it back with the generated id? (The id is always -1, yet if I fetch the entire list, I can see the correct id on the newest entity.)
Thanks
Finally, I found a good solution and am posting it here; maybe somebody will need it:
UserMap, for me at least, becomes UserRepository. There I have the CRUD operations and maybe some extras:
def getAll(): Future[Seq[User]] = {
db.run(dbUsers.result)
}
def getById(id: Int): Future[Option[User]] ={
val action = dbUsers.filter(_.id === id).result.headOption
db.run(action)
}
def create(user: User): Future[User] = {
val insertQuery = dbUsers returning dbUsers.map(_.id) into ((x, id) => x.copy(id = id))
val action = insertQuery += user
db.run(action)
}
def update(user: User) {
Try( dbUsers.filter(_.id === user.id).update(user)) match {
case Success(response) => db.run(response)
case Failure(_) => println("An error occurred!")
}
}
def delete(id: Int) {
Try( dbUsers.filter(_.id === id).delete) match {
case Success(response) => db.run(response)
case Failure(_) => println("An error occurred!")
}
}
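One remark on the update and delete methods above: the Try only wraps building the query, which practically never fails; errors from the database surface in the Future returned by db.run. A variant that reports them could look like this (just a sketch, meant to live inside the same repository and reuse its dbUsers and db):

def update(user: User): Future[Int] = {
  // Run the update and surface DB errors through the returned Future instead of a Try.
  db.run(dbUsers.filter(_.id === user.id).update(user)).recoverWith {
    case e =>
      println(s"An error occurred: ${e.getMessage}")
      Future.failed(e)
  }
}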
and UsersController:
def getAll() = Action {
var users = Await.result(usersRepository.getAll(), Duration.Inf)
Ok(Json.toJson(users))
}
def getById(id: Int) = Action { implicit request => {
val user = Await.result(usersRepository.getById(id), Duration.Inf)
Ok(Json.toJson(user))
}
}
def create = Action(parse.json) { implicit request => {
val userJson = request.body
var user = new User(
-1,
(userJson \ "userName").as[String].value,
(userJson \ "firstName").as[String].value,
(userJson \ "lastName").as[String].value
)
var createdUser = Await.result(usersRepository.create((user)), Duration.Inf)
Ok(Json.toJson(createdUser))
}
}
def update(id: Int) = Action(parse.json) { implicit request => {
val userJson = request.body
var user = new User(
(userJson \ "id").as[Int].value,
(userJson \ "userName").as[String].value,
(userJson \ "firstName").as[String].value,
(userJson \ "lastName").as[String].value
)
var updatedUser = usersRepository.update(user)
Ok(Json.toJson(user))
}
}
def delete(id: Int) = Action {
usersRepository.delete(id)
Ok("true")
}
Anyway, I know I have some bad blocks of code there, especially in the create and update methods where I convert the JSON to a User.
I wanted to give it a try, so here is a full working example of a Play 2.7 / Scala 2.13 / play-slick 4.0.2 REST API controller bound to a MySQL database.
Since you are starting with Scala, it may be a bit overwhelming at first to get comfortable with Play, Slick, etc.
So here is a humble skeleton (derived from the play-slick GitHub examples).
So first, since we want to write an API, here is the conf/routes file:
GET /users controllers.UserController.list()
GET /users/:uuid controllers.UserController.get(uuid: String)
POST /users controllers.UserController.create()
PUT /users controllers.UserController.update()
DELETE /users/:uuid controllers.UserController.delete(uuid: String)
Nothing too fancy here; we just bind routes to functions in the upcoming controller.
Just notice that the second GET and the DELETE expect a UUID as a path parameter, while JSON bodies will be used for the POST and PUT.
It would be nice to see the model right now, in app/models/User.scala:
package models
import java.util.UUID
import play.api.libs.json.{Json, OFormat}
case class User(
uuid: UUID,
username: String,
firstName: String,
lastName: String
) {
}
object User {
// this is because defining a companion object shadows the case class function tupled
// see: https://stackoverflow.com/questions/22367092/using-tupled-method-when-companion-object-is-in-class
def tupled = (User.apply _).tupled
// provides implicit json mapping
implicit val format: OFormat[User] = Json.format[User]
}
I used a UUID instead of a numerical id, but basically it is the same.
Notice that a JSON serializer/deserializer can be written in just one line (you don't need to spell it out field by field). I also think it is good practice not to hand-roll it into a Seq of fields as in your code, since this serializer will be very useful when converting objects to JSON in the controller (see the round-trip sketch below).
The tupled definition is most likely a hack (see the comment) that will be required later in the DAO...
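As a small aside, with that implicit format in scope the JSON round trip works out of the box; a throwaway sketch, not part of the app:

import java.util.UUID
import play.api.libs.json.Json
import models.User

object JsonRoundTrip extends App {
  val user = User(UUID.randomUUID(), "jdoe", "John", "Doe")
  val js   = Json.toJson(user) // uses the implicit OFormat from the companion object
  val back = js.as[User]       // and back again
  println(js)
  assert(back == user)
}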
Next, we need a controller in app/controllers/UserController.scala:
package controllers
import java.util.UUID
import forms.UserForm
import javax.inject.Inject
import play.api.Logger
import play.api.data.Form
import play.api.i18n.I18nSupport
import play.api.libs.json.Json
import play.api.mvc._
import services.UserService
import scala.concurrent.{ExecutionContext, Future}
import scala.util.{Failure, Success, Try}
class UserController @Inject()(userService: UserService)
(implicit ec: ExecutionContext) extends InjectedController with I18nSupport {
lazy val logger: Logger = Logger(getClass)
def create: Action[AnyContent] = Action.async { implicit request =>
withFormErrorHandling(UserForm.create, "create failed") { user =>
userService
.create(user)
.map(user => Created(Json.toJson(user)))
}
}
def update: Action[AnyContent] = Action.async { implicit request =>
withFormErrorHandling(UserForm.create, "update failed") { user =>
userService
.update(user)
.map(user => Ok(Json.toJson(user)))
}
}
def list: Action[AnyContent] = Action.async { implicit request =>
userService
.getAll()
.map(users => Ok(Json.toJson(users)))
}
def get(uuid: String): Action[AnyContent] = Action.async { implicit request =>
Try(UUID.fromString(uuid)) match {
case Success(uuid) =>
userService
.get(uuid)
.map(maybeUser => Ok(Json.toJson(maybeUser)))
case Failure(_) => Future.successful(BadRequest(""))
}
}
def delete(uuid: String): Action[AnyContent] = Action.async {
Try(UUID.fromString(uuid)) match {
case Success(uuid) =>
userService
.delete(uuid)
.map(_ => Ok(""))
case Failure(_) => Future.successful(BadRequest(""))
}
}
private def withFormErrorHandling[A](form: Form[A], onFailureMessage: String)
(block: A => Future[Result])
(implicit request: Request[AnyContent]): Future[Result] = {
form.bindFromRequest.fold(
errors => {
Future.successful(BadRequest(errors.errorsAsJson))
}, {
model =>
Try(block(model)) match {
case Failure(e) => {
logger.error(onFailureMessage, e)
Future.successful(InternalServerError)
}
case Success(eventualResult) => eventualResult.recover {
case e =>
logger.error(onFailureMessage, e)
InternalServerError
}
}
})
}
}
So here:
Basically, each of our five functions referenced from the routes file checks its input and then delegates the work to an injected UserService (more on that later).
For the create and update functions, you can see that we use Play Forms, which I think is also good practice. Their role is to validate the incoming JSON and marshall it into a User.
Also, you can see that we use Action.async: Scala gives you very powerful leverage with Futures, so let's use it! By doing so, you ensure that your code is non-blocking, which makes much better use of your hardware (see the small contrast sketch just below).
Finally, for GET (one), GET (all), POST and PUT, since we return users and have a serializer, a simple Json.toJson(user) does the work.
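To make the blocking versus non-blocking point concrete, here is a small contrast; both methods would sit inside UserController and are purely illustrative (the 5-second timeout is arbitrary):

import scala.concurrent.Await
import scala.concurrent.duration._

// Blocking: a request-handling thread is parked for the whole DB round trip.
def listBlocking: Action[AnyContent] = Action {
  val users = Await.result(userService.getAll(), 5.seconds)
  Ok(Json.toJson(users))
}

// Non-blocking: we return a Future[Result] and Play resumes when the DB answers.
def listAsync: Action[AnyContent] = Action.async {
  userService.getAll().map(users => Ok(Json.toJson(users)))
}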
Before jumping to the service and DAO, let's see the form, in app/forms/UserForm.scala:
package forms
import java.util.UUID
import models.User
import play.api.data.Form
import play.api.data.Forms.{mapping, nonEmptyText, _}
object UserForm {
def create: Form[User] = Form(
mapping(
"uuid" -> default(uuid, UUID.randomUUID()),
"username" -> nonEmptyText,
"firstName" -> nonEmptyText,
"lastName" -> nonEmptyText,
)(User.apply)(User.unapply)
)
}
Nothing too fancy here, just as the docs say, although there is one trick: when no uuid is provided (the POST case), we generate one.
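If you want to see the form at work outside of a request, binding it from a plain map shows the defaulted uuid; a throwaway sketch, not part of the app:

import forms.UserForm

val bound = UserForm.create.bind(Map(
  "username"  -> "jdoe",
  "firstName" -> "John",
  "lastName"  -> "Doe"
))
// bound.errors is empty and bound.value is Some(User(<freshly generated uuid>, "jdoe", "John", "Doe"))
println(bound.value)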
Now, the service... not strictly required in a case this simple, but in practice it is often good to have an extra layer (dealing with ACLs, for example), in app/services/UserService.scala:
package services
import java.util.UUID
import dao.UserDAO
import javax.inject.Inject
import models.User
import scala.concurrent.{ExecutionContext, Future}
class UserService @Inject()(dao: UserDAO)(implicit ex: ExecutionContext) {
def get(uuid: UUID): Future[Option[User]] = {
dao.get(uuid)
}
def getAll(): Future[Seq[User]] = {
dao.all()
}
def create(user: User): Future[User] = {
dao.insert(user)
}
def update(user: User): Future[User] = {
dao.update(user)
}
def delete(uuid: UUID): Future[Unit] = {
dao.delete(uuid)
}
}
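To illustrate why that extra layer can earn its keep, here is a hypothetical method you could add to UserService (not in the original code; the duplicate-username rule and the createUnique name are just an example):

// Refuse to create a user whose username is already taken.
// In real life you would query by username instead of loading everything.
def createUnique(user: User): Future[User] =
  dao.all().flatMap { existing =>
    if (existing.exists(_.username == user.username))
      Future.failed(new IllegalArgumentException(s"username '${user.username}' is already taken"))
    else
      dao.insert(user)
  }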
Other than that, it is just a thin wrapper around the DAO. And finally, the DAO, in app/dao/UserDao.scala:
package dao
import java.util.UUID
import javax.inject.Inject
import models.User
import play.api.db.slick.{DatabaseConfigProvider, HasDatabaseConfigProvider}
import play.db.NamedDatabase
import slick.jdbc.JdbcProfile
import scala.concurrent.{ExecutionContext, Future}
class UserDAO @Inject()(@NamedDatabase("mydb") protected val dbConfigProvider: DatabaseConfigProvider)(implicit executionContext: ExecutionContext) extends HasDatabaseConfigProvider[JdbcProfile] {
import profile.api._
private val users = TableQuery[UserTable]
def all(): Future[Seq[User]] = db.run(users.result)
def get(uuid: UUID): Future[Option[User]] = {
db.run(users.filter(_.uuid === uuid).result.headOption)
}
def insert(user: User): Future[User] = {
db.run(users += user).map(_ => user)
}
def update(user: User): Future[User] = {
db.run(users.filter(_.uuid === user.uuid).update(user)).map(_ => user)
}
def delete(uuid: UUID): Future[Unit] = {
db.run(users.filter(_.uuid === uuid).delete).map(_ => ())
}
private class UserTable(tag: Tag) extends Table[User](tag, "users") {
def uuid = column[UUID]("uuid", O.PrimaryKey)
def username = column[String]("username")
def firstName = column[String]("firstName")
def lastName = column[String]("lastName")
def * = (uuid, username, firstName, lastName) <> (User.tupled, User.unapply)
}
}
So here I have just adapted the code from the official play-slick example, so I guess I have no better comments than theirs...
Hope the whole thing helps you get a better picture :)
If something is unclear, feel free to ask!
I have a JSON-type column in MySQL and I am using Scala with Slick.
How can I provide support for the JSON column via Slick?
class SampleTable(tag: Tag) extends Table[(String, ??)](tag, "test") {
override def * : ProvenShape[NodeReference] = (name, data)
def name: Rep[String] = column[String]("name", O.PrimaryKey)
def Data: Rep[??] = column[??]("data", O.PrimaryKey)
}
Any help will be appreciated.
Thanks in advance.
You can use a BaseColumnType (via MappedColumnType) that converts your JSON to a String when writing it to the database and parses it back to JSON when reading from the database:
import play.api.libs.json.{JsValue, Json}
private implicit val jsValueMappedColumnType: BaseColumnType[JsValue] =
MappedColumnType.base[JsValue, String](Json.stringify, Json.parse)
class SampleTable(tag: Tag) extends Table[(String, JsValue)](tag, "test") {
def name: Rep[String] = column[String]("name", O.PrimaryKey)
def data: Rep[JsValue] = column[JsValue]("data", O.PrimaryKey)
def * = (name, data)
}
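With that implicit mapping in scope, the column then behaves like any other. A minimal usage sketch, assuming slick.jdbc.MySQLProfile, a configured "db" entry, and some sample data:

import slick.jdbc.MySQLProfile.api._
import play.api.libs.json.{JsValue, Json}
import scala.concurrent.ExecutionContext.Implicits.global

val db      = Database.forConfig("db")
val samples = TableQuery[SampleTable]

val row: (String, JsValue) = ("node-1", Json.obj("enabled" -> true, "retries" -> 3))

val action = for {
  _    <- samples += row                                                      // JsValue is stringified on the way in
  data <- samples.filter(_.name === "node-1").map(_.data).result.headOption   // and parsed back to JsValue on the way out
} yield data

db.run(action) // Future[Option[JsValue]]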
I have a MySQL table mapped as
class HealthReport(tag: Tag) extends Table[(Int, Int, Int, String, Timestamp, Blob)](tag, "LogProcessorHealthReport") {
def id = column[Int]("id")
def usersId = column[Int]("Users_Id")
def tenantId = column[Int]("Tenant_Id")
def ecId = column[String]("EcId")
def reportedOn = column[Timestamp]("ReportedOn")
def healthInfo = column[Blob]("HealthInfo")
def * = (id, usersId, tenantId, ecId, reportedOn, healthInfo)
}
I want to read the contents of healthInfo. How can I do that?
Thanks
Slick 2:
import slick.driver.MySQLDriver.simple._
val db = Database.forConfig("db")
val fourthHealthInfo: Option[Blob] = db withSession { implicit session =>
TableQuery[HealthReport].filter(_.id === 4).map(_.healthInfo).list.headOption
}
val healthInfos: List[Blob] = db withSession { implicit session =>
TableQuery[HealthReport].map(_.healthInfo).list
}
Slick 3:
import slick.driver.MySQLDriver.api._
val db = Database.forConfig("db")
val fourthHealthInfo: Future[Option[Blob]] = db run {
TableQuery[HealthReport].filter(_.id === 4).map(_.healthInfo).result.headOption
}
val healthInfos: Future[Seq[Blob]] = db run {
TableQuery[HealthReport].map(_.healthInfo).result
}
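To get at the actual bytes once you have the Blob (Slick 3 variant; this assumes MySQL Connector/J's default behaviour of materializing the Blob, so it can still be read after the session is closed):

import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

val fourthHealthInfoText: Future[Option[String]] = db.run {
  TableQuery[HealthReport].filter(_.id === 4).map(_.healthInfo).result.headOption
}.map(_.map { blob =>
  val bytes = blob.getBytes(1, blob.length().toInt) // JDBC Blob offsets are 1-based
  new String(bytes, "UTF-8")                        // assuming the blob holds UTF-8 text
})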
I have been using Slick 2 for database access in my Play app, though via the play-slick plugin rather than Slick on its own, and I have added the tototoshi joda-mapper plugin as well. The dependencies are:
"joda-time" % "joda-time" % "2.4"
"org.joda" % "joda-convert" % "1.6"
"com.github.tototoshi" %% "slick-joda-mapper" % "1.2.0"
"com.typesafe.play" %% "play-slick" % "0.6.1"
"com.typesafe.slick" %% "slick" % "2.0.3"
And the case class and projection are:
//Slick imports used
import play.api.db.slick.Config.driver.simple._
import play.api.db.slick.DB
import scala.slick.lifted.ProvenShape
import com.github.tototoshi.slick.MySQLJodaSupport._
case class Design(
var id: Int,
var imageName: String,
var title: String,
var creatorID: Int,
var flagged: Boolean,
var modifiedTimestamp: Instant,
var createdTimestamp: Instant) {
def this() = this(0, "", "", 0, false, DateTime.now.toInstant, DateTime.now.toInstant)
}
class DesignProjection(tag: Tag) extends Table[Design](tag, "designs_47") {
def id: Column[Int] = column[Int]("id", O.PrimaryKey, O.AutoInc)
def imageName: Column[String] = column[String]("des_image_name")
def title: Column[String] = column[String]("des_title")
def creatorID: Column[Int] = column[Int]("des_creator_id")
def flagged: Column[Boolean] = column[Boolean]("des_flagged_link")
def modifiedTimestamp: Column[Instant] = column[Instant]("tt_modified_timestamp")
def createdTimestamp: Column[Instant] = column[Instant]("tt_tweeted_timestamp")
def * : ProvenShape[Design] = (id, imageName, title, creatorID, flagged, modifiedTimestamp, createdTimestamp) <> (
((Design.apply _): (Int, String, String, Int, Boolean, Instant, Instant) => Design).tupled,
Design.unapply)
}
And when I try to list all rows using this method:
def list: List[Design] = {
println("Start Listing")
val result = DB.withSession { implicit session =>
val res = designProjection.list // <- error here
res
}
convertListResultSet(result)
}
I get [RuntimeException: java.lang.AbstractMethodError]. I am more than sure it's because of the DateTime class, but I really don't know what's going wrong. I have used the tototoshi plugins as well. All other cases involving DateTime work fine.
Any help or pointers are really welcome. Thank you.
It is a known issue.
https://github.com/tototoshi/slick-joda-mapper/issues/19
Could you try slick-joda-mapper 1.1.0?
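If it helps, the change is only in the build definition; something like this (a sketch, keeping your other dependencies as they are; the point is pinning slick-joda-mapper to 1.1.0, which was built against Slick 2.0.x, while 1.2.0 targets a newer Slick):

libraryDependencies ++= Seq(
  "joda-time"            %  "joda-time"         % "2.4",
  "org.joda"             %  "joda-convert"      % "1.6",
  "com.github.tototoshi" %% "slick-joda-mapper" % "1.1.0",
  "com.typesafe.play"    %% "play-slick"        % "0.6.1",
  "com.typesafe.slick"   %% "slick"             % "2.0.3"
)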