Rails creating schema_migrations - Mysql2::Error: Specified key was too long - mysql

I am using Rails 3.2.6 and MySQL 6.0.9 (but I have exactly the same error on MySQL 5.2.25).
When I create a new database (rake db:create) and then try to load the schema (rake db:schema:load), I get this error:
Mysql2::Error: Specified key was too long; max key length is 767 bytes: CREATE UNIQUE INDEX `unique_schema_migrations` ON `schema_migrations` (`version`)
After hours and hours of research I found these solutions:
1. Change MySQL variable innodb_large_prefix to true (or ON)
This didn't work. I tried it on my Linux server, my Mac and even on Windows - it just doesn't work.
2. Monkeypatch ActiveRecord::SchemaMigration.create_table
I do not need the version column to be 255 characters long (when it is UTF-8, it takes 4 * 255 = 1020 bytes and exceeds the MySQL limit of 767 bytes for keys). I do not need it to be UTF-8 either, but all other tables in the DB are UTF-8 and I have set utf8_czech_ci as the default collation.
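For context, the arithmetic behind that limit (an illustrative aside, not part of the original question): the longest indexable string column depends on the worst-case bytes per character of the column's encoding.
# Longest indexable VARCHAR under InnoDB's 767-byte key limit:
767 / 4 # => 191 characters with a 4-byte encoding such as utf8mb4
767 / 3 # => 255 characters with MySQL's legacy 3-byte utf8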
The method that actually creates the schema_migrations table looks like this:
def self.create_table
  unless connection.table_exists?(table_name)
    connection.create_table(table_name, :id => false) do |t|
      t.column :version, :string, :null => false
    end
    connection.add_index table_name, :version, :unique => true, :name => index_name
  end
end
You can read the whole file on GitHub in rails/rails.
So I tried to add :limit => 100 to the t.column statement, but I did not succeed with this solution either. The problem is that I cannot make this patch load when the original is already in place. In other words, my patch loads before ActiveRecord::SchemaMigration, so it is overwritten.
When I put this in config/initializers/patches/schema_migration.rb:
require 'active_record/scoping/default'
require 'active_record/scoping/named'
require 'active_record/base'

module ActiveRecord
  class SchemaMigration < ActiveRecord::Base
    def self.create_table
      unless connection.table_exists?(table_name)
        connection.create_table(table_name, :id => false) do |t|
          t.column :version, :string, :null => false, :limit => 100
        end
        connection.add_index table_name, :version, :unique => true, :name => index_name
      end
    end
  end
end
It is successfully loaded, but then it is overwritten when the original ActiveRecord::SchemaMigration is loaded.
I tried messing around with ActiveSupport.on_load(:active_record) but that doesn't seem to work either.
Is there a way to load this file after the original ActiveRecord::SchemaMigration is in place and make this patch work?
Do you have any suggestions? I can clarify any part of this question, if it makes no sense to you. Just ask me. I've been stuck with this for too long.
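For reference, a minimal sketch of the load-order idea being attempted here (an illustrative aside, assuming ActiveRecord::SchemaMigration is resolvable at initializer time in this Rails version): force the original class to load first, then reopen it, so the override lands on top of the original instead of being clobbered by it.
# config/initializers/patches/schema_migration.rb
# Hedged sketch: referencing ActiveRecord::SchemaMigration here forces the
# original definition to load first, so redefining create_table below is
# applied on top of it rather than being overwritten later.
ActiveRecord::SchemaMigration.class_eval do
  def self.create_table
    unless connection.table_exists?(table_name)
      connection.create_table(table_name, :id => false) do |t|
        # 100 characters * 4 bytes = 400 bytes, safely under the 767-byte key limit
        t.column :version, :string, :null => false, :limit => 100
      end
      connection.add_index table_name, :version, :unique => true, :name => index_name
    end
  end
end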

A 767-byte key should work. Make sure you use utf8 encoding, not utf16.
I had the same problem, and my mistake was that I had accidentally created a utf16 database.

I suggest you drop your database and recreate it with the following command:
mysql -u root -p -e "CREATE DATABASE {DB_NAME} DEFAULT CHARACTER SET utf8 DEFAULT COLLATE utf8_general_ci;"
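To double-check which character set the database actually ended up with, a quick sanity check from the Rails side (an illustrative aside; select_rows is a standard connection method):
# In a rails console, confirm the database default character set is utf8, not utf16
ActiveRecord::Base.connection.select_rows("SHOW VARIABLES LIKE 'character_set_database'")
# => [["character_set_database", "utf8"]]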

I had the same problem with a column named version, a varchar of length 2000:
class AddVersionToUsers < ActiveRecord::Migration
  def change
    add_column :users, :version, :string, limit: 2000
    add_index :users, :version
  end
end
I was using latin1 (1 character = 1 byte), but now I want to use utf8mb4 (1 character = 4 bytes).
Configuring your database like this, you can get index keys of up to 3072 bytes:
docker run -p 3309:3306 --name test-mariadb -e MYSQL_ROOT_PASSWORD=Cal1mero. -d mariadb:10.2 --character-set-server=utf8mb4 --collation-server=utf8mb4_unicode_ci --innodb-large-prefix=1 --innodb-file-format=barracuda --innodb-file-per-table=1 --innodb-strict-mode=1 --innodb-default-row-format=dynamic
That is enough for latin1 (the key would be 2000 bytes), but with utf8mb4 it would be 8000 bytes. In that case you have some options:
Add a column named hash_version and put the index on that column instead (see the sketch after the fulltext example below; related: Consistent String#hash based only on the string's content).
Make the string shorter; it should work, but it depends on your needs.
Or use a fulltext index in your migration, like this:
class AddVersionToUsers < ActiveRecord::Migration
  def change
    add_column :users, :version, :string, limit: 2000
    add_index :users, :version, type: :fulltext
  end
end
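For the first option above (indexing a digest of the value rather than the value itself), a minimal sketch could look like the following; the hash_version column name and the SHA-256 digest are illustrative choices, not something this answer prescribes:
require 'digest'

class AddHashVersionToUsers < ActiveRecord::Migration
  def change
    # 64 hex characters; even at 4 bytes per character under utf8mb4 that is
    # only 256 bytes, well inside the key limit
    add_column :users, :hash_version, :string, limit: 64
    add_index :users, :hash_version
  end
end

# Keep the digest in sync with the long value on the model:
class User < ActiveRecord::Base
  before_save { self.hash_version = Digest::SHA256.hexdigest(version.to_s) }
end
Lookups then go through the digest, e.g. User.where(hash_version: Digest::SHA256.hexdigest(value)).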
references:
https://mensfeld.pl/2016/06/ruby-on-rails-mysql2error-incorrect-string-value-and-specified-key-was-too-long/
https://codex.wordpress.org/Converting_Database_Character_Sets
https://dev.mysql.com/doc/refman/8.0/en/innodb-restrictions.html
https://docs.oracle.com/cd/E17952_01/mysql-5.7-en/innodb-restrictions.html
https://dba.stackexchange.com/questions/35821/possible-index-on-a-varchar-field-in-mysql


Modifying old Rails migrations then adding a new migration that calls the old ones

MySQL 5.7 began failing old migrations that set a default for columns of type text.
After editing the migration, dropping the development db and re-running all migrations, all worked fine in development.
For production:
1. Is my below understanding correct? Is this the main and/or only issue (besides editing old migrations being frowned upon)?
The issue to look out for is that since you can't drop a production db, these migrations will not be run and so will not be reflected in db/schema.rb. As a result, your code will be error prone because it was written to work with a slightly different schema in development.
2. My solution is to essentially force the migration to re-run by calling its up method in a new migration (see the code below). Is this a reasonable solution, i.e. will it work in all cases (no critical downfalls)?
This is the original migration, made years ago in Rails 2
class AddZipWarpableResHistory < ActiveRecord::Migration[5.2]
  def self.up
    add_column :warpables, :history, :text, :default => "", :null => false
    add_column :warpables, :cm_per_pixel, :float, :default => 0, :null => false
    Warpable.find(:all).each do |w|
      r = w.get_cm_per_pixel
      w.cm_per_pixel = r if r != nil
      w.save
    end
  end

  def self.down
    remove_column :warpables, :history
    remove_column :warpables, :cm_per_pixel
  end
end
After the edits (only the edited lines are included below), in Rails 5:
class AddZipWarpableResHistory < ActiveRecord::Migration[5.2]
  def self.up
    add_column :warpables, :history, :text, :null => false
    Warpable.all.each do |w| # edited this because find(:all) was broken as well
      r = w.get_cm_per_pixel
      w.cm_per_pixel = r if r != nil
      w.save
    end
  end
end
New migration to ensure it works in production:
Note that none of the changes in that migration are modified by any other migrations before this one, so there is nothing extra to account for.
require_relative '20111005211631_add_zip_warpable_res_history'

class EnsureNoWarpableCorruption < ActiveRecord::Migration[5.2]
  def self.up
    AddZipWarpableResHistory.down
    AddZipWarpableResHistory.up
    msg = 'MySQL 5.7 began throwing errors on migrations that set a default for columns of type text. ' +
          'We edited that migration and rerun it here to ensure no data corruption in production'
    change_table_comment(:warpables, msg)
  end

  def self.down
    raise ActiveRecord::IrreversibleMigration
  end
end
Would just editing the production DB manually in MySQL before merging this be an alternative solution if the model code wasn't updated as well?
MySQL doesn't support transactional migrations. Is this the simple solution when working with other databases?
Notes
I didn't execute SQL directly because then we would probably need to start storing db/structure.sql instead of db/schema.rb.
I didn't try to fix any model code or other logic.

Migration to create table raises Mysql2::Error: Table doesn't exist

I wrote a migration with the following:
class CreateTableSomeTable < ActiveRecord::Migration[5.1]
  def change
    create_table :some_tables do |t|
      t.references :user, foreign_key: true
      t.references :author, references: :user, foreign_key: true
      t.text :summary
    end
  end
end
It is a basic migration that creates a database table. However, when I run rails db:migrate, a very odd error message aborts the migration:
Mysql2::Error: Table 'my_database.some_tables' doesn't exist: SHOW FULL FIELDS FROM 'some_tables'
It is as if the error is saying it can't create the table because the table doesn't exist yet, which doesn't make sense.
Things I have looked at and tried:
reviewed database.yml, which seems fine. Nothing has changed, and I have recently run other migrations just fine (though no migrations that created database tables)
ran bundle to ensure all gems were installed
deleted the schema.rb file, recreated the database with data from another copy, and ran rake db:schema:dump to recreate the schema.rb file. I attempted to run the migration again and still got the same error.
I am using Rails 5.1.1 and mysql2 0.4.6.
Any tips on how I can get the migration to run?
I got a similar error when trying to create a new model that has a reference to an existing model that was created before migrating to Rails 5.1.
Although the error message is not very clear about it, in my case it turned out that the problem was a data type mismatch between the primary key of the old model and the foreign key of the new model (MySQL does not permit that). This is because, since Rails 5.1, the default data type of all primary and foreign keys is bigint, but for the old model the primary key type was still integer.
I solved this by converting all the primary and foreign keys of the current models to bigint, so I can use the new Rails defaults and forget about it.
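A minimal sketch of that conversion (hypothetical table and column names; the original answer does not show its migration): foreign key columns have to be converted alongside the primary keys they reference, and on MySQL the AUTO_INCREMENT attribute should be restated because change_column rebuilds the column from only the options it is given.
class ConvertKeysToBigint < ActiveRecord::Migration[5.1]
  def up
    # restate auto_increment so MySQL keeps it on the primary key
    change_column :users, :id, :bigint, auto_increment: true
    change_column :user_images, :user_id, :bigint
  end

  def down
    change_column :user_images, :user_id, :integer
    change_column :users, :id, :integer, auto_increment: true
  end
end
If an actual foreign key constraint exists between the two columns, it may need to be removed before the change and added back afterwards (remove_foreign_key / add_foreign_key).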
A workaround could also be specifying integer type for the new foreign keys so that they match the primary key type of the old models. Something like the following:
class CreateUserImages < ActiveRecord::Migration[5.1]
  def change
    create_table :user_images do |t|
      t.references :user, type: :integer, foreign_key: true
      t.string :url
    end
  end
end
The big issue with ActiveRecord migrations in 5.1 is that ids are now expected to be BIGINT instead of INT, so when you add a column that references another table created before Rails 5.1, the new column is expected to be BIGINT while the referenced id is just an INT, hence the error.
The best solution is to just modify your migration and change the type of the column to int:
class CreateTableSomeTable < ActiveRecord::Migration[5.1]
  def change
    create_table :some_tables do |t|
      t.references :user, foreign_key: true, type: :int
      t.references :author, references: :user, foreign_key: true
      t.text :summary
    end
  end
end
That should work.
I figured out a work around, but it is still very puzzling to me.
The error message in the log file was not exactly pointing to the issue. For some reason (it might be Rails 5.1.1 or it might be mysql2 0.4.6), it doesn't like using references within the create_table block. Very odd, because it has worked for me in the past.
So I changed the migration from this:
class CreateTableSomeTable < ActiveRecord::Migration[5.1]
  def change
    create_table :some_tables do |t|
      t.references :user, foreign_key: true
      t.references :author, references: :user, foreign_key: true
      t.text :summary
    end
  end
end
To this:
class CreateTableSomeTable < ActiveRecord::Migration[5.1]
  def change
    create_table :some_tables do |t|
      t.integer :user_id
      t.integer :author_id
      t.text :summary
    end
  end
end
And it worked.
It is very odd because references works just fine with sqlite3 (I tested this by generating a dummy app, ran a scaffold command with a references column, and ran rails db:migrate and it all worked).
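One caveat with this workaround (an editorial aside, not something the answer itself covers): plain integer columns skip the index and, with foreign_key: true, the constraint that t.references would have added, so those can be declared explicitly:
class CreateTableSomeTable < ActiveRecord::Migration[5.1]
  def change
    create_table :some_tables do |t|
      t.integer :user_id
      t.integer :author_id
      t.text :summary
    end
    # roughly what t.references would otherwise have set up
    add_index :some_tables, :user_id
    add_index :some_tables, :author_id
    add_foreign_key :some_tables, :users, column: :user_id
    add_foreign_key :some_tables, :users, column: :author_id
  end
end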
This drove me nuts, I think I was seeing a different reason for this than what others suggested. In my case it happened because my migration file names didn't exactly match the migration class therein. For example, I had a migration file named 20171205232654_bonus.rb but inside the class was declared as class CreateBonus < ActiveRecord::Migration[5.1]. Once I changed the file name to 20171205232654_create_bonus.rb everything worked.
This might have something to do with the fact that I've been creating migrations only, not full scaffolds, and maybe I did something wrong. I really don't know how I wound up with that mismatch.
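For reference, the convention the migration loader relies on is that the part of the file name after the timestamp, camel-cased, matches the class defined in the file; a small illustration (the table contents are made up):
# db/migrate/20171205232654_create_bonus.rb
# "create_bonus" camel-cases to CreateBonus, so the class must be named accordingly:
class CreateBonus < ActiveRecord::Migration[5.1]
  def change
    create_table :bonuses do |t|
      t.timestamps
    end
  end
end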

Rails change primary id to 64 bit bigint

I am using rails and the mysql2 adapter. I want to change all primary ids and foreign keys to be 64 bit integers instead of the default 32 bit as they are right now for my production database.
Is this possible on the fly or do I have to drop the database, change the structure and import the data again?
If there is a way to do it without dropping the database, even if it's a hack, it would be great to know.
Rails 5.1 added a bigint type for migrations, so you can do this:
change_column :users, :id, :bigint
Source:
http://www.mccartie.com/2016/12/05/rails-5.1.html
While ActiveRecord does not support this directly, you can do it using execute:
class UpdateUserIdLimit < ActiveRecord::Migration
  def up
    # PostgreSQL
    execute('ALTER TABLE users ALTER COLUMN id SET DATA TYPE BIGINT')
    # MySQL
    execute('ALTER TABLE users MODIFY COLUMN id BIGINT(8) NOT NULL AUTO_INCREMENT')
  end

  def down
    raise ActiveRecord::IrreversibleMigration
  end
end
For new tables you should be able to simply do
def change
  create_table :users, id: false do |t|
    t.integer :id, limit: 8, primary_key: true
    t.string :first_name
    t.string :last_name
  end
end
Also starting with Rails 5.1 primary keys will be BIGINT by default.
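Conversely, if a new table needs to keep a 32-bit key so its foreign keys can match older integer primary keys, the default can be overridden per table (a hedged sketch with made-up table and column names):
class CreateLegacyCompatibleTable < ActiveRecord::Migration[5.1]
  def change
    # id: :integer keeps a 32-bit primary key instead of the Rails 5.1 bigint default
    create_table :legacy_items, id: :integer do |t|
      t.string :name
    end
  end
end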

Getting a binary limit from a migration to the schema in Rails

I have a migration for a model with a binary field meant to store a file that can be bigger than 10Mb:
class CreateNewModel < ActiveRecord::Migration
  def change
    create_table :new_model do |t|
      ...
      t.binary :data, limit: 16777216
      ...
    end
  end
end
With the limit information the migration can create a longblob object in a MySQL or MariaDB database, as seen in How do you get Rails to use the LONGBLOB column in mysql?.
The migration seems to work fine on a MariaDB database: data has the longblob type. However, loading directly from the schema gives data a blob type instead of longblob. This means the rake db:setup command is no longer usable, as the schema doesn't reflect the database I want.
This seems pretty evident when one looks at the db/schema.rb file:
create_table "new_model", force: :cascade do |t|
...
t.binary "data"
...
t.datetime "created_at", null: false
t.datetime "updated_at", null: false
end
There is no limit info and as such loading the schema can only lead to blobs and not longblobs.
Why doesn't the limit information get written in the schema?
I don't want to change the schema manually, as that would mean redoing the change after every migration (since migrations regenerate the schema file). What other solutions do I have? Is there a way to force the limit from the migration into the schema?
I've tried using the shorthands described in https://github.com/rails/rails/pull/21688 but it doesn't seem to exist in Rails 4.2.6.

How do I add a check constraint in a Rails migration?

I need to add a new integer column to an existing table in my Rails app. The column can only have values 1, 2, 3, so I'd like to add a check constraint to the table/column. How do I specify this constraint within a Rails migration?
Rails migrations do not provide a way to add constraints directly, but you can still do it in a migration by passing raw SQL to execute().
Create Migration file:
ruby script/generate Migration AddConstraint
Now, in the migration file:
class AddConstraint < ActiveRecord::Migration
  def self.up
    execute "ALTER TABLE table_name ADD CONSTRAINT check_constraint_name CHECK (check_column_name IN (1, 2, 3))"
  end

  def self.down
    execute "ALTER TABLE table_name DROP CONSTRAINT check_constraint_name"
  end
end
Rails 6.1+ Check Constraints
Rails 6.1 added basic support for check constraints to database migrations.
So now, a migration for adding a check constraint which restricts integer column values only to 1, 2, and 3 can be written as follows:
class AddConstraint < ActiveRecord::Migration[6.1]
  def up
    add_check_constraint :table_name, 'check_column_name IN (1, 2, 3)', name: 'check_constraint_name'
  end

  def down
    remove_check_constraint :table_name, name: 'check_constraint_name'
  end
end
Here is a link to the relevant PR where you can find more details about add_check_constraint and remove_check_constraint.
You can do it with Migration Validators gem. See details here: https://github.com/vprokopchuk256/mv-core
With that gem you'll be able to define inclusion validation on db level:
def change
  change_table :table_name do |t|
    t.integer :column_name, inclusion: [1, 2, 3]
  end
end
Moreover, you are able to define how that validation should be implemented and even the error message that should be shown:
def change
  change_table :posts do |t|
    t.integer :priority,
              inclusion: { in: [1, 2, 3],
                           as: :trigger,
                           message: "can't be anything else than 1, 2, or 3" }
  end
end
You can even lift that validation from the migration right up to your model:
class Post < ActiveRecord::Base
  enforce_migration_validations
end
and then the validation defined in the migration will also be defined as an ActiveModel validation in your model:
Post.new(priority: 3).valid?
=> true
Post.new(priority: 4).valid?
=> false
Post.new(priority: 4).errors.full_messages
=> ["Priority can't be anything else than 1, 2, or 3"]
This answer is obsolete as of May 2021
I just published a gem for this: active_record-postgres-constraints. As the README there describes, you can use it with a db/schema.rb file, and it adds support for the following methods in migrations:
create_table TABLE_NAME do |t|
  # Add columns
  t.check_constraint conditions
  # conditions can be a String, Array or Hash
end
add_check_constraint TABLE_NAME, conditions
remove_check_constraint TABLE_NAME, CONSTRAINT_NAME
Note that at this time, only postgres is supported.
I have just worked through getting a PostgreSQL CHECK constraint to work.
Nilesh's solution is not quite complete; the db/schema.rb file won't include the constraint, so tests and any deployments that use db:setup won't get the constraint. As per http://guides.rubyonrails.org/migrations.html#types-of-schema-dumps
While in a migration you can execute custom SQL statements, the schema dumper cannot reconstitute those statements from the database. If you are using features like this, then you should set the schema format to :sql.
I.e., in config/application.rb set
config.active_record.schema_format = :sql
Unfortunately, if you're using PostgreSQL you may get an error when loading the resultant dump; see the discussion at ERROR: must be owner of language plpgsql. I didn't want to go down the PostgreSQL configuration path in that discussion; plus, in any case, I'm fond of having a readable db/schema.rb file. So that ruled out custom SQL in the migration file for me.
The https://github.com/vprokopchuk256/mv-core gem suggested by Valera seems promising, but it only supports a limited set of constraints (and I got an error when I tried to use it, though that may be due to incompatibilities with other gems I'm including).
The solution (hack) I went with is to have the model code insert the constraint. Since it's kind of like a validation, that's where I put it:
class MyModel < ActiveRecord::Base
  validate :my_constraint

  def my_constraint
    unless MyModel.connection.execute("SELECT * FROM information_schema.check_constraints WHERE constraint_name = 'my_constraint'").any?
      MyModel.connection.execute("ALTER TABLE my_models ADD CONSTRAINT my_constraint CHECK ( ...the SQL expression goes here ... )")
    end
  end
end
Of course this does an extra select before each validation; if that's a problem a solution would be to put it in an "after connect" monkey patch such as discussed in How to run specific script after connected to oracle using rails? (You can't simply cache the result of the select, because the validation/constraint addition happens within a transaction that might get rolled back, so you need to check each time.)
You can use the Sequel gem: https://github.com/jeremyevans/sequel
Sequel.migration do
  change do
    create_table(:artists) do
      primary_key :id
      String :name
      constraint(:name_min_length) { char_length(name) > 2 }
    end
  end
end