I have a many-to-many relationship linked by an association table, like this:
left_right_association_table = Table(
    "left_right_association_table", Base.metadata,
    Column('leftside_id', Integer, ForeignKey('leftside_table.id')),
    Column('rightside_id', Integer, ForeignKey('rightside_table.id'))
)
class LeftSide(Base):
    ...
    rightside_members = relationship("RightSide", secondary=left_right_association_table, backref="leftside", lazy="joined")
    ...

class RightSide(Base):
    ...
Now I have a LeftSide instance, with a list of RightSide instances as its rightside_members attribute:
ls = LeftSide(**kw)
rs1 = RightSide(**kw)
rs2 = RightSide(**kw)
ls.rightside_members.append(rs1)
ls.rightside_members.append(rs2)
Then I want to drop one of the list members: rs2.
Step 1: I recreate the rightside_members list:
updated_rightside_members = [RightSide(**rs_attrs) for rs_attrs in ls.rightside_members] # each `rs` object already has primary key
updated_rightside_members.pop() # remove the second item from the list
Step 2: I retrieve the ls from the database:
old_ls = db_session.query(LeftSide).filter(LeftSide.id == ls.id).one()
Step 3: I tack the updated_rightside_members onto the old_ls:
old_ls.rightside_members = updated_rightside_members
Step 4: commit to database.
db_session.commit()
Then I get this Error:
sqlite3.IntegrityError: UNIQUE constraint failed: rightside_table.id
It seems to me that SQLAlchemy thinks I'm trying to put a duplicate rs1 back into the database, instead of deleting its brother rs2 from the database.
How should I do this deletion operation?
I've solved this problem. First, I really want to point out that reading the documentation carefully helps. Initializing an instance of a mapped class doesn't mean the session knows about that object, at least not until you use the Session.add() method.
So, the proper steps should be like this:
Decide whether the incoming args contain an id representing the instance's primary key. If they do, then instead of initializing a new object, the right thing to do is to query it from the database. This way the Session ensures the queried object has a unique entry in its identity map.
If they don't contain an id, then initialize a new instance and append it to the member list.
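The steps above can be sketched as a minimal runnable example, assuming the question's schema (the in-memory SQLite engine and the explicit primary-key values are illustration-only assumptions):

```python
from sqlalchemy import Column, ForeignKey, Integer, Table, create_engine
from sqlalchemy.orm import declarative_base, relationship, sessionmaker

Base = declarative_base()

left_right_association_table = Table(
    "left_right_association_table", Base.metadata,
    Column("leftside_id", Integer, ForeignKey("leftside_table.id")),
    Column("rightside_id", Integer, ForeignKey("rightside_table.id")),
)

class LeftSide(Base):
    __tablename__ = "leftside_table"
    id = Column(Integer, primary_key=True)
    rightside_members = relationship(
        "RightSide", secondary=left_right_association_table, backref="leftside")

class RightSide(Base):
    __tablename__ = "rightside_table"
    id = Column(Integer, primary_key=True)

engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()

ls = LeftSide(id=1)
rs1, rs2 = RightSide(id=10), RightSide(id=11)
ls.rightside_members.extend([rs1, rs2])
session.add(ls)
session.commit()

# To drop rs2, fetch the *persistent* objects (so the identity map is in
# charge) and mutate the collection; never rebuild the member list out of
# brand-new instances that carry existing primary keys.
ls = session.get(LeftSide, 1)
rs2 = session.get(RightSide, 11)
ls.rightside_members.remove(rs2)   # deletes only the association row
session.commit()
```

Removing from the relationship collection deletes the association-table row but leaves the RightSide row itself intact, since no delete-orphan cascade is configured.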
I have a model which is just a relation between two entities, like this:
class SomeModel(models.Model):
    a = OneToOneField(User,primary_key=True,...)
    b = ForeignKey(Car,...)
As per the original design this was correct, since I didn't want a User to have multiple Cars. But in my new design I want to allow multiple Cars per User. So I was trying something like this:
class SomeModel(models.Model):
    class Meta:
        unique_together = (("a", "b"),)
    a = ForeignKey(User,...)
    b = ForeignKey(Car,...)
But during migration, it asks me:
You are trying to add a non-nullable field 'id' to somemodel without a default; we can't do that (the database needs something to populate existing rows).
Please select a fix:
1) Provide a one-off default now (will be set on all existing rows with a null value for this column)
2) Quit, and let me add a default in models.py
Select an option:
I just wanted to turn that OneToOneField into a plain ForeignKey and add a new id, with the two fields unique together. How do I resolve this?
Delete the model and table from the database and create a new model from scratch, or add null=True to the given field. You must have some data (rows) in your table, and now you are creating a new column, so the previous rows need something to put in the newly created column; writing null=True will do that.
I have 5 records in the database and want to update all the values of a particular field called "ret_name".
For that, I have written the following code in views:
def retap_list(request, date_selected):
    obj = ret_tbl.objects.filter(curr_date = date_selected)
    obj[1].update(ret_name = "change")
    print(obj[1].ret_name)
But it is showing me the following AttributeError:
"'ret_tbl' object has no attribute 'update'"
How should I update each row one at a time?
Actually, the update() method works on querysets, not on instances. So you need to filter the objects first and then:
objs = ret_tbl.objects.filter(curr_date=date_selected)
objs.update(ret_name="change")
All of the objects in the queryset will be updated. If you need to update a particular object, it's probably better to use save() instead.
I have models:
class Reference(models.Model):
    name = models.CharField(max_length=50)

class Search(models.Model):
    reference = models.ForeignKey(Reference)
    update_time = models.DateTimeField(auto_now_add=True)
I have an instance of Reference and I need to get the latest searches for that reference. Right now I am doing it this way:
record = Search.objects.filter(reference=reference)\
    .aggregate(max_date=Max('update_time'))
if record:
    update_time = record['max_date']
    searches = reference.search_set.filter(update_time=update_time)
Using 2 queries instead of one is not a big deal, but what if I need to get the last searches for each reference on a page? I would end up with 2x (the number of references) queries, and that would not be good.
I was trying to use this solution https://stackoverflow.com/a/9838438/293962 but it didn't work with a filter by reference.
You probably want to use the latest method.
From the docs, "Returns the latest object in the table, by date, using the field_name provided as the date field."
https://docs.djangoproject.com/en/1.8/ref/models/querysets/#latest
so your query would be
Search.objects.filter(reference=reference).latest('update_time')
I implemented a snippet from someone's gist, but I don't remember the user and don't have the link.
A bit of context:
I have a model named Medicion that stores the measurements of a machine; machines are instances of a model called Equipo. Besides a foreign key to Equipo, Medicion instances have a foreign key to Odometro, a model that serves as a kind of clock or meter. That's why, when I want to retrieve data (measurements, i.e. instances of the Medicion model) for a certain machine, I need to indicate the clock as well; otherwise I would get a lot of messy, unreadable data.
Here is my implementation:
First I retrieve the last dates:
ult_fechas_reg = Medicion.objects.values('odometro').annotate(max_fecha=Max('fecha')).order_by()
Then I create an empty Q object:
mega_statement = Q()  # each ANDed group below is ORed into this
Then, looping over every date retrieved in the annotated queryset, I build up the Q statement:
for r in ult_fechas_reg:
    mega_statement |= (Q(odometro__exact=r['odometro']) & Q(fecha=r['max_fecha']))
Finally, I pass this mega statement to the queryset, which retrieves the last record of the model filtered by the two fields:
resultados = Medicion.objects.filter(mega_statement).filter(
    equipo=equipo,
    odometro__in=lista_odometros).order_by('odometro', 'fecha')  # lista_odometros is a python list containing pks of another model, don't worry about it.
In our project we are "cloning table data" and the "subsequent tables" in the hierarchy. For this, in the domain objects, we write a "clone()" method that creates an object of the class and then sets the properties on the created object. What we are doing in code is like this (this is the clone method of a dummy Book.groovy class representing the book table in the database):
public Book clone(def newParent) {
    def clonedBook = new Book(properties)
    clonedBook.parent = newParent
    clonedBook.<childObject> = <childObject>*.clone(clonedBook)
    return clonedBook
}
The "clone()" method of Book is called from the clone method of the parent table's domain class, and it in turn calls the "clone()" method of the child domain class. In the database the foreign keys are defined as ON DELETE NO ACTION.
While running it this way I was getting an integrity constraint violation exception, which was resolved by changing the foreign keys from ON DELETE NO ACTION to ON DELETE CASCADE. Then I started getting a "HibernateOptimisticLockingFailure" exception. This was also solved, by not passing 'properties' in the constructor of "clonedBook" but instead setting the properties of Book.groovy explicitly on "clonedBook". For example, the "clone()" method becomes:
public Book clone(def newParent) {
    def clonedBook = new Book() // properties not passed here
    clonedBook.name = name      // explicitly set
    clonedBook.price = price    // explicitly set
    ...
}
But I was unable to find out why this solved the problem, or what was wrong with the previous way. I am using Grails 2.4.4 and a MySQL database.
One of my models, which has ForeignKeys, is actually a MySQL view over other tables. The problem I'm running into is that when I delete data from those tables, Django, as described in the "deleting objects" documentation...
When Django deletes an object, it emulates the behavior of the SQL constraint ON DELETE CASCADE -- in other words, any objects which had foreign keys pointing at the object to be deleted will be deleted along with it.
...tries to remove rows from my view, which of course it can't, and so throws the error:
mysql_exceptions.OperationalError: (1395, "Can not delete from join view 'my_db.my_mysql_view'")
Is there any way to specify a ForeignKey constraint on a model which will provide me with all the Django wizardry, but will not cascade deletes onto it? Or, is there a way to ask MySQL to ignore the commands to delete a row from my view instead of raising an error?
Django 1.3a1 and up support this via ForeignKey's on_delete argument.
The following example sets the field to NULL when the referenced object is deleted. See the documentation for more options.
user = models.ForeignKey(User, blank=True, null=True, on_delete=models.SET_NULL)
Harold's answer pointed me in the right direction. This is a sketch of the way I implemented it (on a French legacy database, hence the slightly odd naming convention):
class Factures(models.Model):
    idFacture = models.IntegerField(primary_key=True)
    idLettrage = models.ForeignKey('Lettrage', db_column='idLettrage', null=True, blank=True)

class Paiements(models.Model):
    idPaiement = models.IntegerField(primary_key=True)
    idLettrage = models.ForeignKey('Lettrage', db_column='idLettrage', null=True, blank=True)

class Lettrage(models.Model):
    idLettrage = models.IntegerField(primary_key=True)

    def delete(self):
        """Detaches factures and paiements from the current lettre before deleting."""
        self.factures_set.clear()
        self.paiements_set.clear()
        super(Lettrage, self).delete()
Django's ForeignKey manager has a method called clear() that removes all objects from the related object set. Calling that first, then deleting your object should work. The dependent objects will have their foreign keys set to None (if allowed in your model).
A short description here:
http://docs.djangoproject.com/en/dev/topics/db/queries/#following-relationships-backward
FYI - a feature request for this exists in the django source repository at http://code.djangoproject.com/ticket/7539. It looks like this topic is getting some attention. Hopefully it will be included in future Django releases.
The ticket includes patches to Django's core to implement an "on_delete" optional parameter to models.ForeignKey(...) that lets you specify what happens when the pointed to Model is deleted, including turning off the default ON DELETE CASCADE behavior.
Well, looking at the delete method:
def delete(self):
    assert self._get_pk_val() is not None, "%s object can't be deleted because its %s attribute is set to None." % (self._meta.object_name, self._meta.pk.attname)

    # Find all the objects that need to be deleted.
    seen_objs = CollectedObjects()
    self._collect_sub_objects(seen_objs)

    # Actually delete the objects.
    delete_objects(seen_objs)
I'd say overriding delete should be enough... untested code would be:
def delete(self):
    assert self._get_pk_val() is not None, "%s object can't be deleted because its %s attribute is set to None." % (self._meta.object_name, self._meta.pk.attname)

    # Collect only this object, skipping _collect_sub_objects so related
    # objects are not cascaded.
    seen_objs = CollectedObjects()
    seen_objs.add(model=self.__class__, pk=self.pk, obj=self, parent_model=None)

    # Actually delete the objects.
    delete_objects(seen_objs)
One way is to call the clear method before deleting (documentation here), which basically "clears" the relationship.
One problem, though: it's not automatic by itself. You can choose: call it every time you don't want the cascade, or use the pre_delete signal to send clear before each delete; of course, that will give you problems when you DO want the cascade delete.
Or you can contribute to the Django community and add the keyword argument to delete; maybe it'll be in Django 1.3? :D
Re: http://code.djangoproject.com/ticket/7539
No attention as of Django 1.2.1, June 2010. I guess we need to "watch that space".