Custom fields in Many2Many JoinTable - MySQL

I have this model with a custom JoinTable:
type Person struct {
    ID        int
    Name      string
    Addresses []Address `gorm:"many2many:person_addresses;"`
}

type Address struct {
    ID   uint
    Name string
}

type PersonAddress struct {
    PersonID  int
    AddressID int
    Home      bool
    CreatedAt time.Time
    DeletedAt gorm.DeletedAt
}
How is it possible to assign a value to the Home field when creating a new Person?

Method 1
From what I can see in the docs, here's a clean way you might currently do this:
DB.SetupJoinTable(&Person{}, "Addresses", &PersonAddress{})

addr1 := Address{Name: "addr1"}
DB.Create(&addr1)
addr2 := Address{Name: "addr2"}
DB.Create(&addr2)

person := Person{Name: "jinzhu"}
DB.Create(&person)

// Add an association with default values (i.e. Home = false)
DB.Model(&person).Association("Addresses").Append(&addr1)

// Add an association with custom values
DB.Create(&PersonAddress{
    PersonID:  person.ID,
    AddressID: addr2.ID,
    Home:      true,
})
Here we're using the actual join table model to insert a row with the values we want.
We can also filter queries for the association:
addr := Address{}

// Query the association with filters on the join table
DB.Where("person_addresses.home = true").
    Model(&person).
    Association("Addresses").
    Find(&addr)
Method 2
Here's a more magical way, by (ab)using the Context to pass values to a BeforeSave hook, in addition to the SetupJoinTable code from above:
func (pa *PersonAddress) BeforeSave(tx *gorm.DB) error {
    home, ok := tx.Statement.Context.Value("home").(bool)
    if ok {
        pa.Home = home
    }
    return nil
}

// ...

DB.WithContext(context.WithValue(context.Background(), "home", true)).
    Model(&person).
    Association("Addresses").
    Append(&addr2)
This method feels icky to me, but it works.
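If you do go the Context route, a small variation (purely illustrative, not part of the original answer) is to use a dedicated key type instead of a bare string, which avoids key collisions and the go vet warning about string-typed context keys:

type homeCtxKey struct{}

// In the BeforeSave hook, read the value with the same key:
//   home, ok := tx.Statement.Context.Value(homeCtxKey{}).(bool)

DB.WithContext(context.WithValue(context.Background(), homeCtxKey{}, true)).
    Model(&person).
    Association("Addresses").
    Append(&addr2)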

As noted in the official GORM documentation, you can implement hook methods on each table (struct).
You can implement BeforeCreate() and/or AfterCreate() methods for your join table model, and GORM will call them at the right time.
You can do anything inside those methods to achieve your goal; see the sketch below.
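A minimal sketch (using the join table model from the question; the value set inside is only a placeholder) might look like:

// BeforeCreate runs before each row is inserted into person_addresses,
// provided the join table was registered with
// DB.SetupJoinTable(&Person{}, "Addresses", &PersonAddress{}).
func (pa *PersonAddress) BeforeCreate(tx *gorm.DB) error {
    // Set or derive any extra join-table values here.
    pa.Home = true
    return nil
}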
You will find the full documentation here.
enjoy ;)

Related

JDBI select on VARBINARY and UUID

A legacy MySQL DB table has an id column that is non-human-readable raw VARBINARY (don't ask me why :P):
CREATE TABLE IF NOT EXISTS `tbl_portfolio` (
    `id` varbinary(16) NOT NULL,
    `name` varchar(128) NOT NULL,
    ...
    PRIMARY KEY (`id`)
);
and I need to select on it based on a java.util.UUID
jdbiReader
    .withHandle<PortfolioData, JdbiException> { handle ->
        handle
            .createQuery(
                """
                SELECT *
                FROM tbl_portfolio
                WHERE id = :id
                """
            )
            .bind("id", uuid) // binding this UUID to the VARBINARY id column is the problem
            .mapTo(PortfolioData::class.java) // the mapper does work
            .firstOrNull()
    }
Just in case anyone wants to see it, here's the row mapper (but again, the mapper is not the problem; binding the UUID to the VARBINARY id column is):
class PortfolioDataMapper : RowMapper<PortfolioData> {

    override fun map(
        rs: ResultSet,
        ctx: StatementContext
    ): PortfolioData = PortfolioData(
        fromBytes(rs.getBytes("id")),
        rs.getString("name"),
        rs.getString("portfolio_idempotent_key")
    )

    private fun fromBytes(bytes: ByteArray): UUID {
        val byteBuff = ByteBuffer.wrap(bytes)
        val first = byteBuff.long
        val second = byteBuff.long
        return UUID(first, second)
    }
}
I've tried all kinds of things to get the binding to work, but with no success; any advice is much appreciated!
Finally got it to work, partly thanks to https://jdbi.org/#_argumentfactory, which actually deals with UUID specifically but which I somehow missed despite looking at the JDBI docs for hours. Oh well.
The query can remain as it is:
jdbiReader
    .withHandle<PortfolioData, JdbiException> { handle ->
        handle
            .createQuery(
                """
                SELECT *
                FROM tbl_portfolio
                WHERE id = :id
                """
            )
            .bind("id", uuid)
            .mapTo(PortfolioData::class.java)
            .firstOrNull()
    }
But JDBI needs a UUIDArgumentFactory registered:
jdbi.registerArgument(UUIDArgumentFactory(VARBINARY))
where
class UUIDArgumentFactory(sqlType: Int) : AbstractArgumentFactory<UUID>(sqlType) {
    override fun build(
        value: UUID,
        config: ConfigRegistry?
    ): Argument {
        return UUIDArgument(value)
    }
}
where
class UUIDArgument(private val value: UUID) : Argument {

    companion object {
        private const val UUID_SIZE = 16
    }

    @Throws(SQLException::class)
    override fun apply(
        position: Int,
        statement: PreparedStatement,
        ctx: StatementContext
    ) {
        val bb = ByteBuffer.wrap(ByteArray(UUID_SIZE))
        bb.putLong(value.mostSignificantBits)
        bb.putLong(value.leastSignificantBits)
        statement.setBytes(position, bb.array())
    }
}
NOTE that registering an ArgumentFactory on the entire Jdbi instance like this will make ALL UUID arguments passed to .bind map to bytes, which may not be what you want if other parts of your code base bind UUID arguments that are stored in MySQL as something other than VARBINARY. For example, another table might store its UUIDs as VARCHAR. In that case, rather than registering the UUID ArgumentFactory on the entire Jdbi instance, only use it ad hoc on the individual queries where it is appropriate.

How to pass dynamic table name in gorm model

I am using the GORM ORM for my current application. I have one model that corresponds to many tables with identical structures (i.e., the same column names and types). So my requirement is: how can I change the table name dynamically while doing the query?
For example, I have a product model like Product.go:
type Product struct {
    ID       int
    Name     string
    Quantity int
}
And we have different products like shirts, jeans, and so on, with matching tables named shirts, jeans, etc.
Now I want to query a product by product type; how can we do that? I also want to have the tables created through migrations, but since there is only one model, how can we use the AutoMigrate feature with GORM?
Updated for GORM v2
DEPRECATED: TableName will not allow dynamic table name anymore, its result will be cached for future uses.
There is a much easier way to create several tables using the same struct:
// Create tables `shirts` & `jeans` with the same fields as in struct Product
db.Table("shirts").AutoMigrate(&Product{})
db.Table("jeans").AutoMigrate(&Product{})

// Query data from those tables
var shirts []Product
var jeans []Product
db.Table("shirts").Find(&shirts)
db.Table("jeans").Where("quantity > 0").Find(&jeans)
But on second thought, I would suggest using an embedded struct, so that you won't have to call Table in every query and you can also have additional fields per model while still sharing the same base fields.
type ProductBase struct {
    ID       int
    Name     string
    Quantity int
}

type Shirt struct {
    ProductBase
    NeckType string
}

type Jean struct {
    ProductBase
    Ripped bool
}

db.AutoMigrate(&Shirt{}, &Jean{})

shirt, jeans := Shirt{}, make([]Jean, 0)
db.Where("neck_type = ?", "Mandarin Collar").Last(&shirt)
db.Where("ripped").Find(&jeans)
Old answer for GORM v1
You're almost there; add a table field inside the struct:
type Product struct {
    ID       int
    Name     string
    Quantity int
    // private field, ignored by gorm
    table string `gorm:"-"`
}

func (p Product) TableName() string {
    // double check here, make sure the table does exist!!
    if p.table != "" {
        return p.table
    }
    return "products" // default table name
}

// for the AutoMigrate
db.AutoMigrate(&Product{table: "jeans"}, &Product{table: "skirts"}, &Product{})

// to do the query
prod := Product{table: "jeans"}
db.Where("quantity > 0").First(&prod)
Unfortunately, that does not work with db.Find() when you need to query for multiple records... The workaround is to specify your table before doing the query
prods := []*Product{}
db.Table("jeans").Where("quantity > 0").Find(&prods)

Algorithm for verifying user data between two tables, then inserting into another table

Greetings. I need to get details from users and validate all of those User details against another table: if the date doesn't match, insert a row into that table, but if it does match, then don't insert anything. This has to be done for all the users. The domains:
User {
    String orderNumber
    String dealer
    Int UserKm
    String dateUser
    String adviser
    Vehicle vehicle
    String dateCreated
    // this date has to be validated against appointmentDate (DateNext) from the
    // Appointments domain; if it doesn't exist there, you can insert into that table
    Date appointmentDate
}

Appointments {
    User user
    Date managementDate
    Date lasDataApointies
    DateNext appointmentDate
    Date NextdAteAppointment
    Date callDate
    String observations
}
def result = User.executeQuery("""
    select new map(
        mmt.id as id, mmt.orderNumber as orderNumber, mmt.dealer.dealer as dealer,
        mmt.UserKm as UserKm, mmt.dateUser as dateUser, mmt.adviser as adviser,
        mmt.technician as technician, mmt.vehicle.placa as vehicle,
        mmt.dateCreated as dateCreated, mmt.currenKm as currenKm)
    from User as mmt
""")

def result1 = result.groupBy { it.vehicle }
List detailsReslt = []

result1?.each { SlasDataApointing placa, listing ->
    def firsT = listing.first()
    int firstKM = firsT.UserKm
    def lasT = listing.last()
    def lasDataApoint = lasT.id
    int lastKM = lasT.UserKm
    int NextAppointmentKM = lastKM + 5000
    int dayBetweenLastAndNext = lastKM - NextAppointmentKM
    def tiDur = getDifference(firsT.dateUser, lasT.dateUser)
    int dayToInt = tiDur.days
    int restar = firstKM - lastKM
    int kmPerDay = restar.div(dayToInt)
    int nextMaintenaceDays = dayBetweenLastAndNext.div(kmPerDay)
    def nextAppointment = lasT.dateUser + nextMaintenaceDays

    detailsReslt << [placa: placa, nextAppointment: nextAppointment,
                     manageId: lasDataApoint, nextKmUser: NextAppointmentKM]
}

detailsReslt?.each {
    Appointments addUserData = new Appointments()
    addUserData.User = User.findById(it.manageId)
    addUserData.managementDate = null
    addUserData.NextdAteAppointment = null
    addUserData.observations = null
    addUserData.callDate = it.nextAppointment
    addUserData.save(flush: true)
}

println "we now have ${detailsReslt}"
}
Based on the details (which are incomplete) and looking at the code, I can suggest:
There is no need to do a query to a map; you can simply query the list of users and check all the properties like user.vehicle. In any case, you need to check each row.
The groupBy{it.vehicle} is not clear, but if needed you can do it using createCriteria with a "groupProperty" projection.
Create two service methods, one for iterating over all users and one for validating each user:
def validateAppointment(User user) {
    /* your validation logic */
    ....
    if (validation term) {
        Appointments addUserData = new Appointments()
        ...
    }
}

def validateAppointments() {
    List users = User.list()
    users.each { User user ->
        validateAppointment(user)
    }
}
You can trigger the validateAppointments service method from anywhere in the code, or create a scheduled job so it will run automatically based on your needs.
If your list of users is big, then for efficiency you can also do bulk updates; take a look at my post about it: https://medium.com/meni-lubetkin/grails-bulk-updates-4d749f24cba1
I would suggest creating a custom validator using a service, something like this:
class User {

    def appointmentService
    ...

    Date appointmentDate

    static constraints = {
        appointmentDate validator: { val, obj ->
            obj.appointmentService.isDateAppointmentValid(obj.appointmentDate)
        }
    }
}
But keep in mind that validation may run more often than you think. It is triggered by the validate() and save() methods, as you'd expect (as explained in the user guide, v3.1.15). So I'm not sure this is the best way to validate appointmentDate in your domain; you have to be careful about that.
Hope this helps.

One-to-many relationship with GORM in Golang doesn't work

I have two tables:
type Person struct {
    ID        int
    FirstName string
    LastName  string
    Functions []Function
}

type Function struct {
    gorm.Model
    Info   string
    Person Person
}
I create the tables like this:
db.AutoMigrate(&models.Person{}, &models.Function{})
I then initialize the database:
user := models.Person{
    FirstName: "Isa",
    LastName:  "istcool",
    Functions: []models.Function{{Info: "Trainer"}, {Info: "CEO"}},
}
db.Create(&user)
Now the problem is that my Person table only has FirstName and LastName columns, and my Function table only has the Info column.
And when I make my GET request, I get people with a functions field which is always null.
Here is a screenshot from my GET request and my DB.
To see the code, visit my GitHub repo.
Finally found the answer!!
The problem is in my GET functions: I have to use
db.Preload("Functions").Find(&[]models.Person{})
instead of
db.Find(&[]models.Person{})
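For context, a minimal sketch of how that might look when loading people together with their functions (the variable names are illustrative, not from the linked repo):

var people []models.Person
if err := db.Preload("Functions").Find(&people).Error; err != nil {
    log.Fatal(err)
}
for _, p := range people {
    // Each person now carries its associated functions.
    fmt.Println(p.FirstName, p.LastName, len(p.Functions))
}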

How to detect a zeroed value of struct safely in Go?

I tried to write a simple ORM for MySQL, and I ran into a problem when a defined struct is missing some fields:
type User struct {
    ID       int64
    Username string
    Password string
    Email    string
    Comment  string
}

var u = User{Username: "user_0001", Password: "password"}
Some fields of User weren't given a value, so they end up with their zero value, such as "" for string, false for bool, 0 for integers, and so on.
So I am using reflect to get the field names and values and generate an SQL statement to insert the row:
    INSERT INTO User (Id, Username, Password, Email, Comment) Values (0, "user_0001", "password", , ,)
You can see there are some zero-valued strings; if I detect the empty string "" and skip those fields, I may also skip fields that were intentionally set to an empty value.
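For illustration, a rough sketch (not from the question) of the kind of reflection being described, walking struct fields to collect column names and values:

func columnsAndValues(v interface{}) ([]string, []interface{}) {
    rv := reflect.ValueOf(v)
    rt := rv.Type()
    cols := make([]string, 0, rt.NumField())
    vals := make([]interface{}, 0, rt.NumField())
    for i := 0; i < rt.NumField(); i++ {
        cols = append(cols, rt.Field(i).Name)
        vals = append(vals, rv.Field(i).Interface())
        // A zero-valued field (e.g. "") is indistinguishable here from a field
        // deliberately set to "" - which is exactly the problem described above.
    }
    return cols, vals
}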
To handle database columns that may be NULL, you can either use pointers as Friedrich Große suggests, or use the Null... variants found in the database/sql package, for instance sql.NullString.
To use the Null variants, the struct would be:
type User struct {
    ID       int64
    Username string
    Password string
    Email    sql.NullString
    Comment  sql.NullString
}
You can then detect if a value is set by checking NullString.Valid. The downside of using NullString is that you have to add special cases when printing or marshaling, as it doesn't implement the Stringer interface, nor MarshalJSON or UnmarshalJSON. You also have to remember to set NullString.Valid manually when setting the value of the string.
For instance, a test like
func TestNullString(t *testing.T) {
    s := sql.NullString{}
    t.Log(s)

    s.String = "Test"
    s.Valid = true
    t.Log(s)
}
Prints
null_string_test.go:21: { false}
null_string_test.go:25: {Test true}
To print the string value of a NullString you instead have to do:

func TestNullString(t *testing.T) {
    s := sql.NullString{}
    t.Log(s.String)

    s.String = "Test"
    s.Valid = true
    t.Log(s.String)
}
Which prints:
null_string_test.go:21:
null_string_test.go:25: Test
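If you also need the value in JSON, one common workaround (a sketch, not part of the original answer) is to wrap sql.NullString and implement MarshalJSON yourself:

import (
    "database/sql"
    "encoding/json"
)

// NullString marshals to a JSON string when valid, and to null otherwise.
type NullString struct {
    sql.NullString
}

func (ns NullString) MarshalJSON() ([]byte, error) {
    if !ns.Valid {
        return []byte("null"), nil
    }
    return json.Marshal(ns.String)
}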
You can always just insert an empty string into your table column. If it is truly important to know the difference between the zero value and the complete absence of a value, you would need to use pointers.
type User struct {
    ID       int64
    Username *string
}
This changes the zero value to be nil so you can distinguish that from "".
The downside is that this makes this type less easy to use (nil checks are often forgotten in practice) and you have to dereference that field to get the actual string.
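For example (illustrative only):

name := "user_0001"
u := User{ID: 1, Username: &name}

if u.Username != nil {
    fmt.Println(*u.Username) // dereference to get the actual string
} else {
    fmt.Println("username was never set")
}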
In your specific example I don't see why you need to worry about the empty value at all. Can't you just insert "" into the database and enforce validation (non-emptiness) in your code instead of using database constraints?