I am using the Django admin to modify records in a table. The problem is that whenever I edit an entry and click save, the old entry is left unmodified and a new entry containing the modified details is added instead.
For example, if I have the following:
Aardvark | Orycteropus | Some description | aardvark | animals/images/aardvark.jpg
when I change the first field to Aardvarkon, I get the following:
Aardvark | Orycteropus | Some description | aardvark | animals/images/aardvark.jpg
Aardvarkon | Orycteropus | Some description | aardvark | animals/images/aardvark.jpg
I have the following Django model:
def article_file_name(instance, filename):
    return ANIMAL_IMAGES_BASE_DIR[1:] + instance.ai_species_species_sanitized + '.jpg'

class ai_species(models.Model):
    ai_species_species = models.CharField('Species', max_length=100, primary_key=True, db_column='species')
    ai_species_genus = models.ForeignKey(ai_genera, max_length=50, blank=False, null=False, db_column='genus')
    ai_species_description = models.CharField('Description', max_length=65000, db_column='description')
    ai_species_species_sanitized = models.CharField(max_length=100, blank=False, null=False, db_column='species_sanitized')
    image_url = models.ImageField(max_length=100, storage=OverwriteStorage(), validators=[validate_jpg_extension], upload_to=article_file_name)

    class Meta:
        db_table = 'Species'
        verbose_name = 'Animal species'
        verbose_name_plural = 'Animal species'

    def __unicode__(self):  # Required, don't remove.
        return self.ai_species_species
And the following helpers:
def validate_jpg_extension(value):
    if not value.name.lower().endswith('.jpg') and not value.name.lower().endswith('.jpeg'):
        raise ValidationError(u'Invalid file format! Only jpg or jpeg files allowed!')

class OverwriteStorage(FileSystemStorage):
    def get_available_name(self, name):
        # If the filename already exists, remove it.
        if self.exists(name):
            os.remove(os.path.join(settings.MEDIA_ROOT, name))
        return name
This is very counter-intuitive behavior, and I haven't found any other occurrences of it online. Any help would be greatly appreciated.
Here's the culprit:
ai_species_species = models.CharField('Species', max_length=100, primary_key=True, db_column='species')
Since you've defined the species as the primary key, any time you change this field in the admin it will create a new record (because there isn't already a record with that primary key).
FYI, primary keys aren't supposed to be things that change for a given record, since changing the primary key will invalidate every foreign key (ForeignKey, OneToOneField, and ManyToManyField) that refers to the record.
BTW, you don't need to prefix the field names with ai_species_; it just adds clutter. Removing those prefixes would remove the need for the db_column parameters as well.
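For example, here is a minimal sketch of the model with the natural key demoted to a plain unique field (the field names are illustrative, and switching to a surrogate primary key requires a schema migration):

class ai_species(models.Model):
    # With no primary_key=True on any field, Django adds an implicit
    # auto-incrementing "id" primary key, so renaming a species in the
    # admin updates the row instead of inserting a new one.
    species = models.CharField('Species', max_length=100, unique=True)
    genus = models.ForeignKey(ai_genera)
    description = models.CharField('Description', max_length=65000)
    species_sanitized = models.CharField(max_length=100)
    image_url = models.ImageField(max_length=100, storage=OverwriteStorage(),
                                  validators=[validate_jpg_extension],
                                  upload_to=article_file_name)

    class Meta:
        db_table = 'Species'
        verbose_name = 'Animal species'
        verbose_name_plural = 'Animal species'

    def __unicode__(self):
        return self.species

Note that article_file_name would then need to read instance.species_sanitized instead of instance.ai_species_species_sanitized.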
I have a model with a custom _id that has to be unique, plus soft delete; deleted objects don't have to have a unique _id. So I did it as follows:
from django.db import models
from django.db.models import Max, Q, UniqueConstraint

class MyModel(models.Model):
    _id = models.CharField(max_length=255, db_index=True)
    event_code = models.CharField(max_length=1, blank=True, default='I')
    deleted = models.BooleanField(default=False)
    deleted_id = models.IntegerField(blank=True, null=True)

    objects = MyModelManager()  # manager that filters out deleted objects
    all_objects = MyModelBaseManager()  # manager that returns every object, including deleted ones

    class Meta:
        constraints = [
            UniqueConstraint(fields=['_id', 'event_code', 'deleted', 'deleted_id'], name='unique_id')
        ]

    def delete(self, *args, **kwargs):
        self.deleted = True
        self.deleted_id = self.max_related_deleted_id() + 1
        self.save()

    def undelete(self, *args, **kwargs):
        self.deleted = False
        self.deleted_id = None
        self.save()

    def max_related_deleted_id(self):
        # Get the max deleted_id of deleted objects with the same _id
        max_deleted_id = MyModel.all_objects.filter(
            Q(_id=self._id) & ~Q(pk=self.pk) & Q(deleted=True)
        ).aggregate(Max('deleted_id'))['deleted_id__max']
        return max_deleted_id if max_deleted_id is not None else 0
The whole logic of deleted_id is working (I tested it); the problem is that the UniqueConstraint is not being enforced. For example:
$ MyModel.objects.create(_id='A', event_code='A')
$ MyModel.objects.create(_id='A', event_code='A')
$ MyModel.objects.create(_id='A', event_code='A')
$ MyModel.objects.filter(_id='A').values('pk', '_id', 'event_code', 'deleted', 'deleted_id')
[{'_id': 'A',
'deleted': False,
'deleted_id': None,
'event_code': 'A',
'pk': 1},
{'_id': 'A',
'deleted': False,
'deleted_id': None,
'event_code': 'A',
'pk': 2},
{'_id': 'A',
'deleted': False,
'deleted_id': None,
'event_code': 'A',
'pk': 3}]
Here is the migration that created the unique constraint:
$ python manage.py sqlmigrate myapp 0003
BEGIN;
--
-- Create constraint unique_id on model MyModel
--
ALTER TABLE `myapp_mymodel` ADD CONSTRAINT `unique_id` UNIQUE (`_id`, `event_code`, `deleted`, `deleted_id`);
COMMIT;
Any help is appreciated!
Django version = 2.2
Python version = 3.7
Database = MySQL 5.7
OK, I figured out my problem; I'm posting it here in case someone else runs into it:
The problem is with MySQL: as stated in this post, MySQL allows multiple NULL values in a unique constraint, so I had to change the default of deleted_id to 0, and now it works.
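For reference, a minimal sketch of the changed field (undelete then has to reset deleted_id to 0 instead of None):

    # Use 0 instead of NULL for "not deleted": MySQL treats NULLs as
    # distinct from each other in unique constraints, so the constraint
    # only fires when there is a real value to compare.
    deleted_id = models.IntegerField(blank=True, default=0)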
I have an abstract model:
class ChronoModel(models.Model):
    created = models.DateTimeField(
        u"Create time",
        auto_now_add=True,
        db_index=True
    )
    modified = models.DateTimeField(
        u"Last change time",
        auto_now_add=True,
        db_index=True
    )

    class Meta(object):
        abstract = True
        ordering = ('-created', )
And I have several models inheriting from ChronoModel. My problem is the same for all of them; here is one of those models, for example:
class BatchSession(ChronoModel):
    spent_seconds = models.BigIntegerField(
        u"spent_seconds", default=0, editable=False)
    max_seconds = models.BigIntegerField(
        u"max_seconds", null=True, blank=True)
    comment = models.CharField(
        u"comment", max_length=255, null=True, blank=False,
        unique=True)

    class Meta(ChronoModel.Meta):
        verbose_name = u'Session'
        verbose_name_plural = u'Sessions'
        ordering = ('-modified',)
        db_table = 'check_batchsession'

    def __unicode__(self):
        return u'#{}, {}/{} sec'.format(
            self.id, self.spent_seconds, self.max_seconds)
After creating and applying the migration, there are no indexes on the "created" and "modified" fields.
The command
    python manage.py sqlmigrate app_name 0001 | grep INDEX
shows me:
BEGIN;
....
CREATE INDEX `check_batchsession_e2fa5388` ON `check_batchsession` (`created`);
CREATE INDEX `check_batchsession_9ae73c65` ON `check_batchsession` (`modified`);
....
COMMIT;
But MySQL returns:
mysql> SHOW INDEX FROM check_batchsession;
+--------------------+------------+--------------------------------------------------+--------------+-------------+
| Table | Non_unique | Key_name | Seq_in_index | Column_name |
+--------------------+------------+--------------------------------------------------+--------------+-------------+
| check_batchsession | 0 | PRIMARY | 1 | id |
| check_batchsession | 0 | check_batchsession_comment_558191ed0a395dfa_uniq | 1 | comment |
+--------------------+------------+--------------------------------------------------+--------------+-------------+
2 rows in set (0,00 sec)
How can I resolve it?
Django 1.8.18
MySQL 5.6
It was a Django South problem.
I don't know what happened, but all of my indexes weren't created. (If someone knows, please write it in the comments.)
My solution:
1) remove db_index=True from all fields in ChronoModel
2) makemigrations
3) migrate
4) add db_index=True back to all fields in ChronoModel
5) makemigrations
6) migrate
All my indexes were restored.
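If you want to double-check which indexes actually exist from a Django shell, rather than trusting the sqlmigrate output, here is a sketch using Django 1.8's introspection API:

from django.db import connection

cursor = connection.cursor()
# get_indexes returns a dict of single-column indexes keyed by column name.
print(connection.introspection.get_indexes(cursor, 'check_batchsession'))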
I am inserting a list of Python dictionaries into a Postgres database using SQLAlchemy (via Flask-SQLAlchemy).
One of the tables is a list of all unique items (table 1), while the second is a time series of data related to an item (table 2).
In essence, I want to insert any new row (with a unique hash) into table 1, then insert its data into table 2. If it already exists in table 1, just insert the "child" into table 2, referencing the entry in table 1.
This is one item in the list; the list has a few hundred of these:
{'placement': '5662448s608653114', 't1': datetime.datetime(2018, 4, 15, 17, 47, 7, 434982), 't2': datetime.datetime(2018, 4, 25, 17, 47, 7, 434994), 'camp_id': 1, 'clicks': '0', 'visits': '3', 'conversions': '0', 'revenue': '0'}
I would like to insert 5662448s608653114 into table 1, and then insert all the other data into table 2, where I reference the item not by 5662448s608653114 but by its id in table 1.
So I'd get:
Table 1:
____________________
1| 5662448s608653114
2| 5520103
Table 2:
ID | Pl id | T1 | T2 | cost | revenue | clicks
_______________________________________________
499| 1 |
500| 2 |
I tested this, which does not work:
def write_tracker_data(self):
    for item in self.data:
        ts = Placements(placement_ts_hash=item["placement"])
        pl = TrackerPlacementData(placement_id=ts.id, t1=item["t1"], t2=item["t2"], camp_id=1,
                                  revenue=item["revenue"], clicks=item["clicks"],
                                  conversions=item["conversions"])
        db.session.add(pl)
    db.session.commit()
The above code inserts the data, but with None instead of the id from table 1. It also doesn't seem very efficient; you know that feeling when something can definitely be done a better way...
Here are the model classes for reference:
class Placements(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    traffic_source = db.Column(db.Integer, db.ForeignKey('ts_types.id'))
    placement_ts_hash = db.Column(db.String, index=True)
    placement_url = db.Column(db.String)
    placement_type = db.Column(db.String)
    # Relationship between the unique placement table and tracker_placement_data
    tracker_data = db.relationship("TrackerPlacementData", backref="placement_hash")

class TrackerPlacementData(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    t1 = db.Column(db.DateTime(timezone=True))
    t2 = db.Column(db.DateTime(timezone=True), index=True)
    camp_id = db.Column(db.Integer, db.ForeignKey('campaigns.id'), nullable=False)
    placement_id = db.Column(db.Integer, db.ForeignKey('placements.id'), nullable=True, index=True)
    revenue = db.Column(db.Float)
    clicks = db.Column(db.Integer)
    conversions = db.Column(db.Integer)
Thanks in advance.
Edit: This works, but it doesn't seem very good due to a commit for every item in the loop :/
def write_tracker_data(self):
    for item in self.data:
        ts = Placements(placement_ts_hash=item["placement"])
        db.session.add(ts)
        db.session.commit()

        pl = TrackerPlacementData(placement_hash=ts, t1=item["t1"], t2=item["t2"], camp_id=1,
                                  revenue=item["revenue"], clicks=item["clicks"],
                                  conversions=item["conversions"])
        db.session.add(pl)
        db.session.commit()
Your Placements instance won't have an id until it is committed. This is where the tracker_data relationship can help you...
for item in self.data:
    ts = Placements(placement_ts_hash=item["placement"])
    pl = TrackerPlacementData(
        t1=item["t1"],
        t2=item["t2"],
        camp_id=1,
        revenue=item["revenue"],
        clicks=item["clicks"],
        conversions=item["conversions"]
    )
    ts.tracker_data.append(pl)
    db.session.add(ts)
db.session.commit()
Notice that pl.placement_id is not set to anything. Instead, pl is appended to ts.tracker_data, and everything is taken care of for you when you call commit.
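If you also need the question's "only insert the placement if it's new" behaviour, here is a minimal sketch (assuming the models above) that looks the hash up first and commits once for the whole batch:

for item in self.data:
    # Reuse the existing placement row if this hash is already known.
    ts = Placements.query.filter_by(placement_ts_hash=item["placement"]).first()
    if ts is None:
        ts = Placements(placement_ts_hash=item["placement"])
        db.session.add(ts)
    pl = TrackerPlacementData(
        t1=item["t1"],
        t2=item["t2"],
        camp_id=1,
        revenue=item["revenue"],
        clicks=item["clicks"],
        conversions=item["conversions"]
    )
    # The relationship fills in placement_id at flush time.
    ts.tracker_data.append(pl)
db.session.commit()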
I am trying to get an SQLAlchemy ORM class to automatically:
either look up the foreign key id for a field,
OR,
for entries where the field isn't yet in the foreign key table, add the row to the foreign key table and use the auto-generated id in the original table.
To illustrate:
Class Definition
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.ext.declarative import declarative_base, declared_attr
from sqlalchemy.orm import relationship

Base = declarative_base()

class EquityDB_Base(object):
    @declared_attr
    def __tablename__(cls):
        return cls.__name__.lower()

    __table_args__ = {'mysql_engine': 'InnoDB'}
    __mapper_args__ = {'always_refresh': True}

    id = Column(Integer, primary_key=True)

def fk(tablename, nullable=False):
    return Column("%s_id" % tablename, Integer,
                  ForeignKey("%s.id" % tablename),
                  nullable=nullable)

class Sector(EquityDB_Base, Base):
    name = Column(String(40))

class Industry(EquityDB_Base, Base):
    name = Column(String(40))
    sector_id = fk('sector')
    sector = relationship('Sector', backref='industries')

class Equity(EquityDB_Base, Base):
    symbol = Column(String(10), primary_key=True)
    name = Column(String(40))
    industry_id = fk('industry')
    industry = relationship('Industry', backref='industries')
Using the Class to Set Industry and Sector
for i in industry_record[:]:
    industry = Industry(id=i.id,
                        name=i.name,
                        sector=Sector(name=i.sector_name))
    session.merge(industry)
Result
Unfortunately, when I run this, the database adds individual rows to the sector table for each duplicate use of 'sector_name'. For instance, if 10 industries use 'Technology' as their sector name, I get 10 unique sector_ids, one for each of the 10 industries.
What I WANT is for each sector name that is already in the database to auto-resolve to the appropriate sector_id.
I am clearly just learning SQLAlchemy, but can't seem to figure out how to enable this behavior.
Any help would be appreciated!
See the answer to a similar question: create_or_get entry in a table.
Applying the same logic, you would have something like this:
def create_or_get_sector(sector_name):
    obj = session.query(Sector).filter(Sector.name == sector_name).first()
    if not obj:
        obj = Sector(name=sector_name)
        session.add(obj)
    return obj
and use it like below:
for i in industry_record[:]:
    industry = Industry(id=i.id,
                        name=i.name,
                        sector=create_or_get_sector(sector_name=i.sector_name))
    session.merge(industry)
One thing you should be careful about is which session instance is used inside create_or_get_sector.
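For example, here is a sketch of a version that takes the session explicitly, so the helper always works in whichever session the caller is using:

def create_or_get_sector(session, sector_name):
    obj = session.query(Sector).filter(Sector.name == sector_name).first()
    if not obj:
        obj = Sector(name=sector_name)
        session.add(obj)
        # Flush (without committing) so the new sector gets its id and
        # later lookups in the same transaction can find it.
        session.flush()
    return obj

The call site then becomes sector=create_or_get_sector(session, i.sector_name), which guarantees the sector ends up in the same session that merge is using.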
In my database I want to synchronize two tables. I use the auth_user table (the default table provided by Django) for registration, and there is another table, user-profile, that contains fields such as username, email, age, etc. During the synchronization, how do I update the foreign key?
def get_filename(instance, filename):
    return "upload_files/%s_%s" % (str(time()).replace('.', '_'), filename)

def create_profile(sender, **kwargs):
    if kwargs["created"]:
        p = profile(username=kwargs["instance"], email=kwargs["instance"])
        p.save()

models.signals.post_save.connect(create_profile, sender=User)

class profile(models.Model):
    username = models.CharField(max_length=30)
    email = models.EmailField()
    age = models.PositiveIntegerField(default=15)
    picture = models.FileField(upload_to=get_filename)
    auth_user_id = models.ForeignKey(User)
Here, in the profile table, during synchronization all columns are filled except auth_user_id, and there is an error:
Exception Value:
(1048, "Column 'auth_user_id_id' cannot be null")
You have to alter your table and change the auth_user_id_id column so that it allows NULL.
Something like this:
ALTER TABLE mytable MODIFY auth_user_id_id int;
This assumes auth_user_id_id has an int datatype (columns are nullable by default).
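If you would rather express the same fix on the Django side, so the model and the table stay in agreement, here is a sketch of the profile model with a nullable foreign key:

class profile(models.Model):
    username = models.CharField(max_length=30)
    email = models.EmailField()
    age = models.PositiveIntegerField(default=15)
    picture = models.FileField(upload_to=get_filename)
    # null=True makes the underlying column nullable, matching the ALTER above.
    auth_user_id = models.ForeignKey(User, null=True, blank=True)

Alternatively, the error would also go away if create_profile filled in the foreign key itself, e.g. profile(auth_user_id=kwargs["instance"], ...).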