Hey everyone, I'm fairly new to PythonAnywhere. I'm running a Django app on Python 3.7. I manually added the tables and fields I need via SSH in MySQL Workbench, ran [manage.py inspectdb > /app/models.py], then makemigrations, and when I run migrate I get this:
ERRORS:
auth.Group.permissions: (fields.E340) The field's intermediary table 'auth_group_permissions' clashes with the table name of 'app.AuthGroupPermissions'.
auth.User.groups: (fields.E340) The field's intermediary table 'auth_user_groups' clashes with the table name of 'app.AuthUserGroups'.
auth.User.user_permissions: (fields.E340) The field's intermediary table 'auth_user_user_permissions' clashes with the table name of 'app.AuthUserUserPermissions'.
If I remove the auth tables from the models.py and try to migrate I get:
File "/usr/lib/python3.7/site-packages/MySQLdb/connections.py", line 276, in query _mysql.connection.query(self, query) django.db.utils.OperationalError: (1071, 'Specified key was too long; max key length is 767 bytes')
From what I've read, the problem is a conflict with INSTALLED_APPS in settings.py, but I'm not sure where to go from here to get the migration to work properly.
The extra tables that you noticed were created by Django because you have the app that creates those tables enabled in your INSTALLED_APPS. From the table names involved, I'm guessing it's django.contrib.auth that's adding them. There are probably other tables that are being created that way, but they are just not clashing with the tables you've already created.
The second error you're getting is because you have tried to create a key on a column (or columns) that is too big to be a key. That may still be a result of the auth_ tables clashing. For instance, the Django model may be specifying a key on the id of a table, expecting it to be an integer column, but your database has a large string column for id instead.
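Purely as an illustration of how that limit gets hit (this model is made up, not taken from your schema): on a utf8mb4 MySQL setup that still has the old 767-byte InnoDB index limit, a unique CharField like the one below needs up to 1020 bytes for its index and fails, while max_length=191 (764 bytes) would just fit.

from django.db import models

class LegacyThing(models.Model):
    # hypothetical example: each utf8mb4 character can take 4 bytes, so this
    # unique index can need 255 * 4 = 1020 bytes and exceeds the 767-byte limit
    code = models.CharField(max_length=255, unique=True)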
I suspect that you will continue to have issues as long as you try to keep Django's tables and your legacy tables in the same database. Django does, however, support multiple databases, so you could put your legacy tables in one database and Django's tables in another. That way, they have no way of stepping on each other.
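A minimal sketch of that multiple-databases setup, assuming a second schema called legacy_db (all names below are placeholders, not your actual settings): point Django's own apps at one database, keep your hand-built tables in the other, and query the latter with .using('legacy') or a database router.

DATABASES = {
    'default': {                      # Django's own tables (auth, sessions, ...)
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'django_db',
        'USER': 'user',
        'PASSWORD': 'password',
        'HOST': 'hostname',
    },
    'legacy': {                       # the tables you created by hand
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'legacy_db',
        'USER': 'user',
        'PASSWORD': 'password',
        'HOST': 'hostname',
    },
}

# example query against the legacy database:
# SomeLegacyModel.objects.using('legacy').all()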
Related
While experimenting with my blog app (blogapp) in Django, I created two models (Category and Language) and connected them to another model (Post) with the following fields:
category = models.ManyToManyField(Category)
language = models.ForeignKey(Language)
Then it gave an error like THIS due to the lack of a default value. I tried to roll that back using an amalgam of THIS and THIS, then tried to add a default value using THIS. I got a "django.db.utils.OperationalError (1050, 'Table XXX already exists')" error, then I tried THIS. I tried to revert the migrations by deleting the created migration files from the migrations folder manually. At some point I got a (1054, "Unknown column in 'field list'") error.
Finally I decided to revert to my original starting point. When I connect to my MySQL database using python manage.py dbshell, I see that the server still has two tables that should have been deleted, blogapp_category and blogapp_language. The server is working properly, but I keep getting the "Table XXX already exists" error when I try to add those models again.
Dropping tables from MySQL seems to be the only option at the moment.
When I run
mysql> SHOW COLUMNS FROM blogapp_post;
I do not see any reference to language or category, i.e. no columns named language_id or category_id. I have two questions at the moment:
Is it safe to delete tables manually using:
DROP TABLE blogapp_language;
DROP TABLE blogapp_category;
Will there be any negative effects?
Is there a way to snapshot the database, like git, so that when I revert to an older state, tables added to the database by Django migrations are automatically dropped?
Delete the respective entries from the django_migrations table.
Delete the migrations folder from your app.
Delete the tables created by the app.
Now run makemigrations and migrate.
You can revert using git, but there will be errors and data corrections required.
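Roughly, the steps above look like this from a Django shell (back up the database first; the app name blogapp and the table names are taken from the question and may need adjusting):

from django.db import connection

with connection.cursor() as cursor:
    # remove the migration history rows for the app
    cursor.execute("DELETE FROM django_migrations WHERE app = %s", ["blogapp"])
    # drop the leftover tables created by the app
    cursor.execute("DROP TABLE IF EXISTS blogapp_category")
    cursor.execute("DROP TABLE IF EXISTS blogapp_language")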
I have a frustrating problem when trying to use Code First migrations to create the DB schema on an Azure MySQL database. I have
[DbConfigurationType(typeof(MySqlEFConfiguration))]
specified on my data context, and also:
var configuration = new App.Migrations.Configuration();
var migrator = new DbMigrator(configuration);
migrator.Update();
but a simple table that contains only an ID property and a string doesn't seem to work on Azure. It says
Table 'xxx' already exists, and when it doesn't, it gives another error saying
Specified key was too long; max key length is 767 bytes
What's wrong with MySQL and Code First schema generation?
thanks
The issue is likely one of the following:
Your Entity Code
Your Seed Code
The table probably does already exist.
I would recommend opening the Visual Studio 2015 SQL Server Object Explorer and running a few queries on the tables in question, or just checking whether those tables exist and what data is in them.
I also do not like Code First migrations, for these sorts of reasons. I would recommend following this tutorial series:
https://channel9.msdn.com/Blogs/Have-you-tried-turning-it-off-and-on-again/Creating-a-Database-Project-for-Artificial-Intelligence
https://channel9.msdn.com/Blogs/Have-you-tried-turning-it-off-and-on-again/Deploying-Database-Projects-to-SQL-Azure
https://channel9.msdn.com/Blogs/raw-tech/AI-Part-3-Entity-Framework-and-Unit-Tests
I have to create a simple CRUD panel for an active MySQL database. When I try to migrate my application I receive the following error:
AssertionError: A model can't have more than one AutoField
I've read the following in The Django Book, Chapter 18:
Each generated model has an attribute for every field, including id primary key fields. However, recall that Django automatically adds an id primary key field if a model doesn't have a primary key. Thus, you'll want to remove any lines that look like this:
id = models.IntegerField(primary_key=True)
Not only are these lines redundant, but also they can cause problems if your application will be adding new records to these tables.
I have the same scenario with this field:
id_call = models.BigIntegerField(primary_key=True)
However, if I follow the above suggestion and remove this line, the original application (not the Django application) that uses this table may not work properly, because it could be looking up data in this table by the id_call field.
How can I resolve this situation?
For me, the fix was changing models.AutoField(unique=True) to models.AutoField(primary_key=True) while working with a WordPress database.
I had about four of them in the models.py generated by python manage.py inspectdb --database='olddb' > models.py. Here I used a second database, say olddb; if you use the default database you can omit --database='olddb'.
The errors showed up when I ran python manage.py runserver, so I fixed each such line as described above.
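For illustration, the change looks something like this (the model and table names here are invented, WordPress-style):

from django.db import models

class WpUsers(models.Model):
    # inspectdb generated something like: id = models.AutoField(unique=True)
    # which clashes with the automatic primary key Django wants to add;
    # marking the field as the primary key instead resolves the AssertionError
    id = models.AutoField(primary_key=True)

    class Meta:
        managed = False        # inspectdb marks existing tables as unmanaged
        db_table = 'wp_users'  # assumed table name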
References:
https://docs.djangoproject.com/en/1.8/topics/db/models/#automatic-primary-key-fields
https://docs.djangoproject.com/en/1.8/topics/db/multi-db/#
I have a new client running a Symfony application with 170 or so MySQL tables. He recently updated his MySQL Workbench to the latest revision and is now getting a warning pop-up when he launches the application:
FIX INDEX NAMES
Index names identical to FK names were found in the model, which is not allowed for MySQL5.5 and later. Would you like to rename the indexes?
I am not a DBA, but I understand that the index and foreign key names are clashing. What are the implications of renaming these indexes versus just ignoring the warning?
The Symfony app he is running uses the Doctrine ORM; would any queries or the model need to be updated if the indexes were renamed?
Using the InnoDB engine
Thanks
MySQL Workbench checks, when opening a model, whether there are any duplicate index names and offers to rename them to be unique. Letting it do this has no bad side effects. In fact, it is even necessary in order to be able to apply the model to a server; otherwise the server will refuse to create tables that contain an index with a name that is already taken.
So in short: it's a good idea to let Workbench fix this bug (since duplicate key names are nothing but a bug).
I'm migrating my Django database from sqlite to mysql. I've done the following with no problems:
python manage.py dumpdata > datadump.json
Change your settings.py to the mysql database.
But when I issue the following command python manage.py loaddata datadump.json I get this error:
IntegrityError: (1062, "Duplicate entry '13-13' for key 'from_category_id'")
Can someone tell me how to go about fixing this issue so that I can run the command again and hopefully load my data?
Thanks,
J.
Do you have existing data in the DB?
Try dumping with --indent 4 to get a version that you can eyeball.
Post some sample data
It looks like you either have a duplicate key violation, or the data you are trying to insert does not match the column type; i.e. check the constraints applied in your models.py, the field types, and the tables created in MySQL.
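One way to hunt for the duplicates before re-running loaddata is to scan the dump itself; in this sketch the model label 'app.relatedcategory' and the field names are placeholders for whatever model owns the failing from_category_id key:

import json
from collections import Counter

with open("datadump.json") as f:
    objects = json.load(f)

# count the key values for the model behind the failing unique constraint
pairs = Counter(
    (obj["fields"].get("from_category"), obj["fields"].get("to_category"))
    for obj in objects
    if obj["model"] == "app.relatedcategory"
)

for pair, count in pairs.items():
    if count > 1:
        print("duplicate entry:", pair, "appears", count, "times")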
The question then is: why does it work in SQLite and not in MySQL?
Simple, really: SQLite does not do any strict type checking, i.e. you could easily insert text into an integer field. You will need to clean your data before inserting it into MySQL.
Unlike most SQL databases, SQLite does not restrict the type of data that may be inserted into a column based on the column's declared type. Instead, SQLite uses dynamic typing.
From http://www.sqlite.org/lang_createtable.html
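You can see the quoted behaviour for yourself with a few lines of Python's built-in sqlite3 module:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.execute("INSERT INTO t VALUES ('not a number')")  # SQLite accepts this
print(conn.execute("SELECT n, typeof(n) FROM t").fetchall())
# [('not a number', 'text')] -- MySQL would reject the same insert under strict mode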