I have a frustrating problem when trying to enable Code First migrations to create the DB schema on an Azure MySQL database. I have
[DbConfigurationType(typeof(MySqlEFConfiguration))]
specified on my data context, and I also run
var configuration = new App.Migrations.Configuration();
var migrator = new DbMigrator(configuration);
migrator.Update();
but a simple table that contains only an ID property and a string doesn't seem to work on Azure. It says
Table 'xxx' already exists, and when it doesn't, it gives another error saying
Specified key was too long; max key length is 767 bytes
What's wrong with the MySQL and Code First schema generation?
Thanks.
The issue is likely one of the following:
Your entity code
Your seed code
The table probably does already exist
I would recommend opening the Visual Studio 2015 SQL Server Object Explorer and running a few queries on the tables in question, or at least checking whether those tables exist and what data is in them.
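For example, something like this (just a sketch - the database name is a placeholder, and on MySQL the EF6 history table is typically called __MigrationHistory):
SELECT TABLE_NAME FROM information_schema.TABLES WHERE TABLE_SCHEMA = 'your_database';   -- list the tables that actually exist
SELECT * FROM __MigrationHistory;   -- the migrations EF believes it has already applied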
I also do not like Code First migrations, for these sorts of reasons. I would recommend following this tutorial series:
https://channel9.msdn.com/Blogs/Have-you-tried-turning-it-off-and-on-again/Creating-a-Database-Project-for-Artificial-Intelligence
https://channel9.msdn.com/Blogs/Have-you-tried-turning-it-off-and-on-again/Deploying-Database-Projects-to-SQL-Azure
https://channel9.msdn.com/Blogs/raw-tech/AI-Part-3-Entity-Framework-and-Unit-Tests
Related
I have a .NET 6 application that is currently backed by an MSSQL database maintained by Entity Framework using a Model First approach. I am trying to migrate it to a MySQL database backend, for a variety of reasons.
I have installed MySQL locally (Windows) to start exploring and getting it working. I can migrate the schema easily enough (with either MySQL Workbench or EF), but migrating the data is proving to be a little tricky.
Around half of the tables migrated fine, but the other half, those containing string data, are failing due to errors which look a little like this - the column obviously differs from table to table. The source data is nvarchar in SQL Server, and the destination is type varchar.
Statement execution failed: Incorrect string value: '\xF0\x9F\x8E\xB1' for column 'AwayNote'
Does anyone know how I can get the Migration to run successfully?
The research I have read says to ensure the server and table character sets are aligned, as shown below.
I have set up my source as SQL Server using the FreeTDS ODBC driver.
The data import screen is set up like this - the check box doesn't seem to affect things especially.
I have MySQL set up with this too, which I have also read is important:
[mysql]
default-character-set = utf8mb4
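From what I understand, aligning things at the table level would look something like this (a sketch - the table name is just a placeholder; AwayNote is the column from the error above):
SHOW FULL COLUMNS FROM SomeTable;   -- check which character set the destination columns currently use
ALTER TABLE SomeTable CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;   -- so 4-byte characters such as emoji fit in AwayNote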
I am trying to access my tables in Snowflake using MS Access. I am able to make the connection between them and see the list of all my tables, but I am getting all my tables (from all databases and schemas) even though I specifically mentioned the database and schema to be accessed when creating the Data Source Name (DSN).
And when I try to open a table I get the message: "Cannot define field more than once."
The table I am accessing has a copy under a different database, but the table name and schema name are the same.
ex:
DATABASE_A.SCHEMA.TABLE1
DATABASE_B.SCHEMA.TABLE1
Does anyone have any idea how to resolve this issue?
I can confirm that if you have two Snowflake databases with the same table name you experience this problem. I have been beating my head against a wall and your question gave me the clue. I was able to delete my other database in Snowflake and the error that you described disappeared.
I have come across another problem, though; that will be for another SO question.
Hey everyone, I'm fairly new to PythonAnywhere. I'm running a Django app on Python 3.7. I've manually added the tables and fields I need via SSH in MySQL Workbench, ran [manage.py inspectdb > /app/models.py], then makemigrations, and when I run migrate I get this:
ERRORS:
auth.Group.permissions: (fields.E340) The field's intermediary table 'auth_group_permissions' clashes with the table name of 'app.AuthGroupPermissions'.
auth.User.groups: (fields.E340) The field's intermediary table 'auth_user_groups' clashes with the table name of 'app.AuthUserGroups'.
auth.User.user_permissions: (fields.E340) The field's intermediary table 'auth_user_user_permissions' clashes with the table name of 'app.AuthUserUserPermissions'.
If I remove the auth tables from the models.py and try to migrate I get:
File "/usr/lib/python3.7/site-packages/MySQLdb/connections.py", line 276, in query _mysql.connection.query(self, query) django.db.utils.OperationalError: (1071, 'Specified key was too long; max key length is 767 bytes')
From what I've read, it is conflicting with INSTALLED_APPS in settings.py, but I'm not sure where to go from here to get the migration to work properly.
The extra tables that you noticed were created by Django because you have the app that creates those tables enabled in your INSTALLED_APPS. From the table names involved, I'm guessing it's django.contrib.auth that's adding them. There are probably other tables that are being created that way, but they are just not clashing with the tables you've already created.
The second error you're getting is because you have tried to create a key on a column (or columns) that is too big to be a key. That may still be as a result of the auth_ tables clashing. For instance, the Django model may be specifying a key on the id of a table, expecting it to be an integer column, but your database has a large string column for id instead.
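A quick way to check is to look at what actually exists on the MySQL side (a sketch - the table and column names are placeholders). Under utf8mb4 each character can take up to 4 bytes, so an index over a VARCHAR(255) column needs 1020 bytes, which is over the 767-byte limit of the older InnoDB row format:
SHOW CREATE TABLE some_table;   -- shows the column types and the keys being created
ALTER TABLE some_table MODIFY some_column VARCHAR(191);   -- 191 * 4 = 764 bytes, which fits under the 767-byte limit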
I suspect that you may continue to have issues as long as you try to keep the Django-managed tables and your own tables in the same database. Django does, however, support multiple databases, so you could put your legacy tables in one database and have your Django database in another. That way, they have no way of stepping on each other.
During my experimentation with my blog app (blogapp) in Django, I created two models (Category and Language) and connected them to another model (Post) using the following relations:
category = models.ManyToManyField(Category)
language = models.ForeignKey(Language)
Then it gave an error like THIS due to the lack of a default value. I tried to roll that back by using an amalgam of THIS and THIS. Then I tried to add a default value using THIS. I got a "django.db.utils.OperationalError 1050, Table XXX already exists" error, then I tried THIS. I tried to revert the migrations by deleting the created migration files from the migrations folder manually. At some point I got a django (1054, "Unknown column in 'field list'") error.
Finally I decided to revert back to my original starting point. When I connect to my MySQL database using python manage.py dbshell, I can see that my MySQL server still has two tables that should have been deleted, blogapp_category and blogapp_language. The server is working properly, but I keep getting the "Table XXX already exists" error when I try to add those models.
Dropping tables from MySQL seems to be the only option at the moment.
When I run
mysql> SHOW COLUMNS FROM blogapp_post;
I do not see any reference to language or category, i.e. no columns named language_id or category_id. I have two questions at the moment:
Is it safe to delete tables manually using:
DROP TABLE blogapp_language;
DROP TABLE blogapp_category;
Will there be any negative effects?
Is there a way to freeze the database state, like git, so that when I revert to the old database, tables added by Django migrations are automatically dropped?
Delete the respective entries from the django_migrations table (see the SQL sketch below).
Delete the migrations folder from your app.
Delete the tables created by the app.
Now run makemigrations and migrate.
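For the django_migrations entries and the tables themselves (the first and third steps), the SQL would look something like this (a sketch - the app label and table names are taken from the question):
DELETE FROM django_migrations WHERE app = 'blogapp';   -- remove the recorded migrations for the app
DROP TABLE blogapp_category;   -- drop the tables the app created
DROP TABLE blogapp_language;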
You can revert back using git but there will be errors and data correction requirements.
I am using the migration wizard provided by MySQL Workbench 6.3 to convert a SQL Server database into a MySQL database. I tested the connection between both DBs and they are valid for the migration wizard. Once the migration wizard has completed, I am left with 22 migration warnings, and they are all the same warning:
Truncated key column length for column 0 to 16
I am having a hard time finding any similarities between the tables that are receiving warnings to narrow down the issue. There are tables with the same types of data that are not receiving these errors.
Here is an example of one of the tables affected by this warning.
Does anyone know what is/what could be causing these migration warnings?
If you need more information/images, please let me know.
The migration wizard shows this warning when it finds an index that has a different length on the source and target databases. In fact, you should also get the index name in that message - ... for column <name> from ... - but it's empty. I guess something went wrong, but to investigate that I need to reproduce the issue on my machine. Please file a bug report on bugs.mysql.com and attach a sample database there (you can make it private if you wish). Then paste the link here.
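To compare what actually got created on the MySQL side, something like this can help (the table name is a placeholder):
SHOW INDEX FROM migrated_table;   -- the Sub_part column shows the prefix length of any truncated index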
The warning doesn't matter. Just remember to rename the schema while migrating. I have attached an image for better understanding:
https://i.stack.imgur.com/v6PGK.png