This is my model:
class zjm_model(models.Model):
    a = models.CharField(max_length=36)
    b = models.CharField(max_length=36)
The zjm_model table already holds a lot of data in my MySQL database, and now I want to add a new field:
class zjm_model(models.Model):
    a = models.CharField(max_length=36)
    b = models.CharField(max_length=36)
    c = models.CharField(max_length=36)
But when I run manage.py syncdb, all it shows is:
No fixtures found.
So how can I add the new field to my database?
Thanks
Database migrations are not built into Django, so you'll need to use a third-party library. I highly recommend south.
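With south (assuming it is installed and added to INSTALLED_APPS, and that your app is named myapp, a hypothetical name here), a rough sketch of the workflow for the change above:
./manage.py convert_to_south myapp
# after adding the new "c" field to the model:
./manage.py schemamigration myapp --auto
./manage.py migrate myapp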
south is very nice and all that, but if this is a very rare one-off thing, then just fire up your favourite MySQL tool and do something like: ALTER TABLE foo ADD COLUMN wotsit VARCHAR(100) - I can't remember the exact syntax...
But +1 for south.
You have to dump your previous data first:
manage.py dumpdata > dump.json
And you can load your data back after syncdb (note that syncdb will not alter an existing table, so you have to drop it first; also, the "c" column has to permit null):
manage.py loaddata dump.json
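For that, declare the new field as nullable, so the reloaded rows (which carry no value for it) remain valid; a one-line sketch of the model change:
c = models.CharField(max_length=36, null=True)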
On a Django 1.11 application which uses MySQL, I have 3 apps, and in one of them I have a Country model:
class Country(models.Model):
    countryId = models.AutoField(primary_key=True, db_column='country_id')
    name = models.CharField(max_length=100)
    code = models.CharField(max_length=3)

    class Meta:
        db_table = 'country'
When I try to makemigrations, I get this error:
django.db.utils.ProgrammingError: (1146, "Table 'dbname.country' doesn't exist")
Even if I run makemigrations for another app which is not related to this model and its database table, using ./manage.py makemigrations another_app, I still get this error.
I've had this problem, and it was because I was initializing a default value somewhere in a model using... the database table that I had just dropped. In a nutshell, I had something like forms.ChoiceField(choices=get_some_data(), ...), where get_some_data() used the database to retrieve some default values.
I wish you had posted the traceback, because in my case it was pretty obvious from the traceback that get_some_data() was using the ORM (something like somemodel.objects.filter(...)).
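For illustration, a minimal sketch of the anti-pattern, with hypothetical names (get_some_data, SomeForm and the Country import stand in for whatever your code actually queries):
from django import forms
from myapp.models import Country  # hypothetical app/model

def get_some_data():
    # hits the database at import time, before the table may exist
    return [(c.pk, c.name) for c in Country.objects.all()]

class SomeForm(forms.Form):
    # choices=get_some_data() runs the query when the module is imported;
    # passing the callable itself, choices=get_some_data, defers it
    country = forms.ChoiceField(choices=get_some_data())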
Somehow, Django thinks you've already created this table and are now trying to modify it, while in fact you've dropped the table externally and started over. If that's the case, delete all of the files in the migrations folders belonging to your apps and start over with ./manage.py makemigrations.
Check whether you have any dependencies; it is possible that some model needs the Country model, in the same app or in another app, like:
class OtherModel(models.Model):
    country = models.ForeignKey(Country)
1. If so, check that INSTALLED_APPS in settings.py lists the apps in the correct order: the app that declares Country must come first, and then its dependents (see the sketch after this list).
2. If the dependent model is in the same app, it needs to be declared after the Country model in models.py.
3. Check whether the error traceback on the console mentions the same errors in models.py or forms.py.
4. Check that makemigrations and migrate are executed with the apps in the correct order: python manage.py makemigrations app_of_country other_app_name
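A minimal sketch of that INSTALLED_APPS ordering in settings.py, with the hypothetical app names from point 4 (app_of_country declares Country; other_app depends on it):
INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'app_of_country',  # declares Country, so it comes first
    'other_app',       # depends on Country, listed after
]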
I have transferred my project from MySQL to PostgreSQL and tried to drop a column as a result of a previous issue (Integer error transferring from MySQL to PostgreSQL), because the error didn't disappear even after I removed the problematic column from models.py and saved.
Tried both with and without quotes.
ALTER TABLE "UserProfile" DROP COLUMN how_many_new_notifications;
Or:
ALTER TABLE UserProfile DROP COLUMN how_many_new_notifications;
Getting the following:
ERROR: relation "UserProfile" does not exist
Here's the model, if it helps:
class UserProfile(models.Model):
    user = models.OneToOneField(User)
    how_many_new_notifications = models.IntegerField(null=True, default=0)

User.profile = property(lambda u: UserProfile.objects.get_or_create(user=u)[0])
I suspected it might have something to do with the mixed case, but I have found no solution through all the similar questions.
Yes, PostgreSQL is a case-aware database, but Django is smart enough to know that: it converts all fields, and it generally converts the model name to a lower-case table name. However, the real problem here is that your model name is prefixed by the app name. Django table names generally look like:
<appname>_<modelname>
You can find out what exactly it is by:
from myapp.models import UserProfile
print (UserProfile._meta.db_table)
Obviously this needs to be typed into the Django shell, which is invoked by ./manage.py shell. The result of this print statement is what you should use in your query; see the sketch below.
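A minimal sketch of the full round trip, assuming the app is named myapp (a hypothetical name), so the table resolves to myapp_userprofile:
from django.db import connection
from myapp.models import UserProfile

table = UserProfile._meta.db_table  # e.g. 'myapp_userprofile'
cursor = connection.cursor()
cursor.execute('ALTER TABLE %s DROP COLUMN how_many_new_notifications' % table)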
Client: DataGrip
Database engine: PostgreSQL
For me, opening a new console worked, because apparently, due to the IDE cache, it was not recognizing the table I had created.
Steps to operate with the tables of a database:
Database (Left side panel of the IDE) >
Double Click on PostgreSQL - #localhost >
Double Click on the name of the database >
Right click on public schema >
New > Console
Good luck.
I want to create migrations to add columns from within my Rails application (not through rails g migration xxxx on the command line). While creating the migration, I want to store its version number for a possible later down operation.
The scenario is: I have an application with generalized tables and their fields, and it can be deployed for multiple customers. I want to provide a way to define custom fields in a table. Once the user selects/inputs the desired data, like table_name, field_name, data_type, etc., I will create a new migration to add the field and store the version number somewhere in the database. This version number will be used to migrate:down in case the user decides to delete the field.
Is there any other better approach than this?
I have implemented this as below:
Depending upon the field_name and table_name I create a migration using:
def create_migration
  # sanitize the user-supplied field name for use in the migration
  field_name_for_db = field_name.gsub(' ', '_').downcase
  migration_name = "add_column_#{field_name_for_db}_to_#{self.table_name}"
  logger.info "cd #{Rails.root} && rails g migration #{migration_name} #{field_name_for_db}:string > #{Rails.root}/tmp/migration_details.txt && rake db:migrate"
  system "cd #{Rails.root} && rails g migration #{migration_name} #{field_name_for_db}:string > #{Rails.root}/tmp/migration_details.txt && rake db:migrate"
  # the generator prints the path of the new file, e.g.
  # db/migrate/20140101120000_add_column_foo_to_bars.rb;
  # the leading timestamp is the migration version
  migration_version = File.read("#{Rails.root}/tmp/migration_details.txt").split('/').last.split("_#{migration_name}").first
  self.migration_name = migration_name
  self.migration_version = migration_version
  self.save
end
In this method I have redirected the output of the migration-generation command to a file; I then retrieve the migration number from that file and store it in the database.
I want to programmatically generate ALTER TABLE statements in SQLAlchemy to add a new column to a table. The column to be added should take its definition from an existing mapped class.
So, given an SQLAlchemy Column instance, can I generate the SQL schema definition(s) I would need for ALTER TABLE ... ADD COLUMN ... and CREATE INDEX ...?
I've played at a Python prompt and been able to see a human-readable description of the data I'm after:
>>> DBChain.__table__.c.rName
Column('rName', String(length=40, convert_unicode=False, assert_unicode=None, unicode_error=None, _warn_on_bytestring=False), table=<Chain>)
When I call engine.create_all() the debug log includes the SQL statements I'm looking to generate:
CREATE TABLE "Chain" (
...
"rName" VARCHAR(40),
...
)
CREATE INDEX "ix_Chain_rName" ON "Chain" ("rName")
I've heard of sqlalchemy-migrate, but that seems to be built around static changes, and I'm looking to dynamically generate schema changes.
(I'm not interested in defending this design, I'm just looking for a dialect-portable way to add a column to an existing table.)
After tracing engine.create_all() with a debugger I've discovered a possible answer:
>>> engine.dialect.ddl_compiler(
... engine.dialect,
... DBChain.__table__.c.rName ) \
... .get_column_specification(
... DBChain.__table__.c.rName )
'"rName" VARCHAR(40)'
The index can be created with:
sColumnElement = DBChain.__table__.c.rName
if sColumnElement.index:
    sIndex = sa.schema.Index(
        "ix_%s_%s" % (rTableName, sColumnElement.name),
        sColumnElement,
        unique=sColumnElement.unique)
    sIndex.create(engine)
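Putting the two pieces together, a sketch of issuing the ALTER TABLE itself, using the table and column from the question (newer SQLAlchemy versions also expose the column rendering via the CreateColumn construct):
import sqlalchemy as sa
from sqlalchemy.schema import CreateColumn

column = DBChain.__table__.c.rName
spec = str(CreateColumn(column).compile(bind=engine))  # '"rName" VARCHAR(40)'
with engine.begin() as conn:
    conn.execute(sa.text('ALTER TABLE "Chain" ADD COLUMN %s' % spec))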
I'm getting an error when I try to dump data to a JSON fixture in Django 1.2.1 on my live server. The live server is running MySQL Server version 5.0.77, and I imported a lot of data into my tables using the phpMyAdmin interface. The website works fine and the Django admin responds as normal. But when I actually try to dump the data of the application that corresponds to those tables, I get this error:
$ python manage.py dumpdata --indent=2 gigs > fixtures/gigs_100914.json
/usr/local/lib/python2.6/site-packages/MySQLdb/__init__.py:34: DeprecationWarning: the sets module is deprecated
from sets import ImmutableSet
Error: Unable to serialize database: Location matching query does not exist.
My Django model for 'gigs' that I'm trying to dump from looks like this in the models.py file:
from datetime import datetime
from django.db import models

class Location(models.Model):
    name = models.CharField(max_length=120, blank=True, null=True)

    class Meta:
        ordering = ['name']

    def __unicode__(self):
        return "%s (%s)" % (self.name, self.pk)

class Venue(models.Model):
    name = models.CharField(max_length=120, blank=True, null=True)
    contact = models.CharField(max_length=250, blank=True, null=True)
    url = models.URLField(max_length=60, verify_exists=False, blank=True, null=True)  # because of single-thread problems, I left this off (http://docs.djangoproject.com/en/dev/ref/models/fields/#django.db.models.URLField.verify_exists)

    class Meta:
        ordering = ['name']

    def __unicode__(self):
        return "%s (%s)" % (self.name, self.pk)

class Gig(models.Model):
    date = models.DateField(blank=True, null=True)
    details = models.CharField(max_length=250, blank=True, null=True)
    location = models.ForeignKey(Location)
    venue = models.ForeignKey(Venue)

    class Meta:
        get_latest_by = 'date'
        ordering = ['-date']

    def __unicode__(self):
        return u"%s on %s at %s" % (self.location.name, self.date, self.venue.name)
Like I say, Django is fine with the data. The site works fine and the relationships seem to operate absolutely fine. When I run the command to see what SQL Django is using:
$ python manage.py sql gigs
/usr/local/lib/python2.6/site-packages/MySQLdb/__init__.py:34: DeprecationWarning: the sets module is deprecated
from sets import ImmutableSet
BEGIN;
CREATE TABLE `gigs_location` (
`id` integer AUTO_INCREMENT NOT NULL PRIMARY KEY,
`name` varchar(120)
)
;
CREATE TABLE `gigs_venue` (
`id` integer AUTO_INCREMENT NOT NULL PRIMARY KEY,
`name` varchar(120),
`contact` varchar(250),
`url` varchar(60)
)
;
CREATE TABLE `gigs_gig` (
`id` integer AUTO_INCREMENT NOT NULL PRIMARY KEY,
`date` date,
`details` varchar(250),
`location_id` integer NOT NULL,
`venue_id` integer NOT NULL
)
;
ALTER TABLE `gigs_gig` ADD CONSTRAINT `venue_id_refs_id_3d901b6d` FOREIGN KEY (`venue_id`) REFERENCES `gigs_venue` (`id`);
ALTER TABLE `gigs_gig` ADD CONSTRAINT `location_id_refs_id_2f8d7a0` FOREIGN KEY (`location_id`) REFERENCES `gigs_location` (`id`);
COMMIT;
I've triple-checked the data and gone through to make sure all the relationships and data are OK after importing. But I'm still getting this error, three days on... I'm stuck with what to do about it. I can't imagine the DeprecationWarning is the problem here. I really need to dump this data back out as JSON.
Many thanks for any help at all.
Could be something similar to this.
Run it with:
python manage.py dumpdata --indent=2 -v 2 --traceback gigs
to see the underlying error.
I once ran into a similar problem where the error message was as mystifying as yours. The cause was a lack of memory on my server. It seems that generating JSON dumps is quite memory-expensive. I had only 60 MB of memory (at djangohosting.ch) and it was not enough to get a dump from a MySQL database whose mysqldump output was only 1 MB.
I was able to find this out by watching the python process hit the 60 MB limit using the top command in a second command line while running manage.py dumpdata in the first one.
My solution: get the MySQL dump and then load it on my desktop PC before generating the JSON dump. That said, for backup purposes, the MySQL dumps are enough.
The command to get a MySQL dump is the following (note there is no space between -p and the password):
mysqldump -u [username] -p[password] [database_name] > [dump_file_name].sql
That said, your problem could be completely different. You should really look at every table that has a foreign key to your Location table and check whether any row points to a previously deleted location. Unfortunately MySQL is very bad at maintaining referential integrity, and you cannot count on it; see the sketch below for one way to find such rows.
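A minimal sketch of that check with the ORM, using the models from the question (run it in ./manage.py shell):
from gigs.models import Gig, Location

valid = set(Location.objects.values_list('pk', flat=True))
orphans = [g.pk for g in Gig.objects.all() if g.location_id not in valid]
print(orphans)  # any ids listed here point at deleted Locations and will break dumpdata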
You can --exclude the particular app which is creating the problem; its database tables will still be there. It worked for me:
python manage.py dumpdata --exclude app_name > backedup_data.json
This error shows up because there's a mismatch between your DB's schema and your models.
You can try to find it manually, or you can just go ahead and install django-extensions:
pip install django-extensions
and use the sqldiff command, which will show you exactly where the problem is:
python manage.py sqldiff -a -t
First and foremost, make your models match what your DB has. Then create the migrations and run a fake migrate:
python manage.py makemigrations && python manage.py migrate --fake
That alone should let you run a dump: as soon as Django sees that the DB's schema matches your models, it will allow it.
Moving forward, you can update your models and re-run the migrations as usual:
python manage.py makemigrations && python manage.py migrate