I want to create migrations to add columns from within my Rails application (not through rails g migration xxxx). While creating the migration, I want to store its version number for a later possible down operation.
The scenario is: I have an application with generalized tables and their fields. The application can be deployed for multiple customers, and I want to provide a way to define custom fields in the tables. Once the user selects/inputs the desired data (table_name, field_name, data_type, etc.), I will create a new migration to add the field and store the version number somewhere in the database. This version number will be used to migrate:down in case the user decides to delete the field.
Is there any other better approach than this?
I have implemented this as below:
Depending upon the field_name and table_name, I create a migration using:
def create_migration
  # Normalize the user-supplied field name so the generator and the
  # file-name parsing below agree on the same snake_case name
  field_name_for_db = field_name.gsub(' ', '_').downcase
  migration_name = "add_column_#{field_name_for_db}_to_#{self.table_name}"
  command = "cd #{Rails.root} && rails g migration #{migration_name} #{field_name_for_db}:string > #{Rails.root}/tmp/migration_details.txt && rake db:migrate"
  logger.info command
  system command
  # The generator prints the path of the new migration file, e.g.
  # db/migrate/20140730071530_add_column_xxx_to_yyy.rb; the digits
  # before "_#{migration_name}" are the version number
  migration_version = File.read("#{Rails.root}/tmp/migration_details.txt").split('/').last.split("_#{migration_name}").first
  self.migration_name = migration_name
  self.migration_version = migration_version
  self.save
end
In this method I redirect the output of the migration-generator command to a file, retrieve the migration number from that file, and then store it in the database.
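For the later delete operation, a rough sketch of the down path (the method name destroy_migration is hypothetical; it assumes the migration_version stored above):
def destroy_migration
  # db:migrate:down runs only the down method of the given VERSION,
  # which removes the column this record's migration added
  system "cd #{Rails.root} && rake db:migrate:down VERSION=#{self.migration_version}"
end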
I am aware of syncdb and makemigrations, but we are restricted from running those in the production environment.
We recently had a couple of tables created on production. As expected, the tables were not visible in the admin for any user.
After that, we had the below 2 queries executed manually on the production SQL server (I ran the migration on my local machine and used a SHOW CREATE TABLE query to fetch the raw SQL):
django_content_type
INSERT INTO django_content_type(name, app_label, model)
values ('linked_urls',"urls", 'linked_urls');
auth_permission
INSERT INTO auth_permission (name, content_type_id, codename)
values
('Can add linked_urls Table', (SELECT id FROM django_content_type where model='linked_urls' limit 1) ,'add_linked_urls'),
('Can change linked_urls Table', (SELECT id FROM django_content_type where model='linked_urls' limit 1) ,'change_linked_urls'),
('Can delete linked_urls Table', (SELECT id FROM django_content_type where model='linked_urls' limit 1) ,'delete_linked_urls');
Now this model is visible to the super-user, who is able to grant access to staff users as well, but staff users can't see it.
Is there any other table entry that needs to be inserted?
Or is there any other way to solve this problem without syncdb or migrations?
We recently had a couple of tables created on production.
I can read what you wrote there in two ways.
First way: you created tables with SQL statements, for which there are no corresponding models in Django. If this is the case, no amount of fiddling with content types and permissions will make Django suddenly use the tables. You need to create models for the tables. Maybe they'll be unmanaged, but they need to exist.
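For illustration, a minimal sketch of such an unmanaged model (the field is an assumption; the table name comes from the question):
from django.db import models

class LinkedUrls(models.Model):
    url = models.CharField(max_length=255)  # hypothetical column

    class Meta:
        managed = False           # Django will never create or alter this table
        db_table = 'linked_urls'  # point at the hand-created table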
Second way: the corresponding models in Django do exist, you just manually created tables for them, so that's not a problem. What I'd do in this case is run the following code, explanations follow after the code:
from django.contrib.contenttypes.management import update_contenttypes
from django.apps import apps as configured_apps
from django.contrib.auth.management import create_permissions

for app in configured_apps.get_app_configs():
    update_contenttypes(app, interactive=True, verbosity=0)

for app in configured_apps.get_app_configs():
    create_permissions(app, verbosity=0)
What the code above does is essentially perform the work that Django performs after it runs migrations. When a migration runs, Django just creates tables as needed; once it is done, it calls update_contenttypes, which scans the models defined in the project and adds to the django_content_type table whatever needs to be added. Then it calls create_permissions to update auth_permission with the add/change/delete permissions that need adding. I've used the code above to force permissions to be created early during a migration. It is useful if I have a data migration, for instance, that creates groups that need to refer to the new permissions.
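As an illustration of that last point, a sketch of such a data migration (the app label, group name, and dependency are assumptions based on this question):
from django.db import migrations

def add_editor_group(apps, schema_editor):
    # Force permissions to exist before we look them up below;
    # this is the same call Django makes after migrate finishes
    from django.apps import apps as configured_apps
    from django.contrib.auth.management import create_permissions
    for app in configured_apps.get_app_configs():
        create_permissions(app, verbosity=0)

    Group = apps.get_model('auth', 'Group')
    Permission = apps.get_model('auth', 'Permission')
    group, _ = Group.objects.get_or_create(name='url_editors')  # hypothetical group
    group.permissions.add(Permission.objects.get(codename='change_linked_urls'))

class Migration(migrations.Migration):
    dependencies = [('urls', '0001_initial')]  # hypothetical dependency
    operations = [migrations.RunPython(add_editor_group)]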
So, finally I had a solution. I did a lot of debugging on Django, and apparently the function below (in django.contrib.auth.backends) does the job of providing permissions.
def _get_permissions(self, user_obj, obj, from_name):
    """
    Returns the permissions of `user_obj` from `from_name`. `from_name` can
    be either "group" or "user" to return permissions from
    `_get_group_permissions` or `_get_user_permissions` respectively.
    """
    if not user_obj.is_active or user_obj.is_anonymous() or obj is not None:
        return set()

    perm_cache_name = '_%s_perm_cache' % from_name
    if not hasattr(user_obj, perm_cache_name):
        if user_obj.is_superuser:
            perms = Permission.objects.all()
        else:
            perms = getattr(self, '_get_%s_permissions' % from_name)(user_obj)
        perms = perms.values_list('content_type__app_label', 'codename').order_by()
        setattr(user_obj, perm_cache_name, set("%s.%s" % (ct, name) for ct, name in perms))
    return getattr(user_obj, perm_cache_name)
So what was the issue?
The issue lay in this query:
INSERT INTO django_content_type(name, app_label, model)
values ('linked_urls',"urls", 'linked_urls');
It looks fine at first, but the actual query executed was:
-- notice the mixed case here. It looked so trivial, I didn't even bother to look into it until I realised what was happening internally
INSERT INTO django_content_type(name, app_label, model)
values ('Linked_Urls',"urls", 'Linked_Urls');
So Django, internally, when running migrate, ensures everything is stored in lower case - and this was the problem!
I executed a separate query to lower-case all the previous inserts, and voila!
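That clean-up boils down to something like this (a sketch; it assumes only the hand-inserted rows carry mixed case):
-- lower-case the content-type rows that were inserted by hand
UPDATE django_content_type SET name = LOWER(name), model = LOWER(model);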
I have transferred my project from MySQL to PostgreSQL and tried to drop a column as a result of a previous issue (Integer error transferring from MySQL to PostgreSQL), because even after I removed the problematic column from models.py and saved, the error didn't disappear.
Tried both with and without quotes.
ALTER TABLE "UserProfile" DROP COLUMN how_many_new_notifications;
Or:
ALTER TABLE UserProfile DROP COLUMN how_many_new_notifications;
Getting the following:
ERROR: relation "UserProfile" does not exist
Here's the model, if it helps:
class UserProfile(models.Model):
    user = models.OneToOneField(User)
    how_many_new_notifications = models.IntegerField(null=True, default=0)

User.profile = property(lambda u: UserProfile.objects.get_or_create(user=u)[0])
I suspected it might have something to do with the mixed case, but I have found no solution in all the similar questions.
Yes, PostgreSQL is a case-aware database, but Django is smart enough to know that. It lower-cases all field names, and it generally converts the model name to a lower-case table name. However, the real problem here is that the table name will be prefixed by the app name. Generally, Django table names look like:
<appname>_<modelname>
You can find out what exactly it is by:
from myapp.models import UserProfile
print (UserProfile._meta.db_table)
Obviously this needs to be typed into the Django shell, which is invoked by ./manage.py shell. The result of this print statement is what you should use in your query.
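For example, if the app were named accounts (an assumption), the print would give accounts_userprofile, and the working statement would be:
ALTER TABLE accounts_userprofile DROP COLUMN how_many_new_notifications;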
Client: DataGrip
Database engine: PostgreSQL
For me, opening a new console worked, because apparently, due to the IDE's cache, it was not recognizing the table I had created.
Steps to operate with the tables of a database:
Database (Left side panel of the IDE) >
Double Click on PostgreSQL - #localhost >
Double Click on the name of the database >
Right click on public schema >
New > Console
GL
I now have a new structure for my database, but I need to import the old data into the new format. For that reason I want to use a Laravel seeder, but I need to somehow connect to the old database, run select queries, and tell the seeder how to put the data into the new database.
Is that possible?
Try the iseed package, an inverse seed generator that creates seeder files from existing database tables. Examples:
php artisan iseed my_table
php artisan iseed my_table,another_table
Visit: https://github.com/orangehill/iseed
Configure your Laravel app to use two MySQL connections (see How to use multiple database in Laravel), one for the new database and the other for the old one.
I'll call them old and new.
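A sketch of the connections section in config/database.php (hosts, credentials, and database names are assumptions):
// config/database.php - two mysql connections, one per database
'connections' => [
    'old' => [
        'driver'   => 'mysql',
        'host'     => env('OLD_DB_HOST', 'localhost'),
        'database' => 'old_database',              // hypothetical name
        'username' => env('OLD_DB_USERNAME'),
        'password' => env('OLD_DB_PASSWORD'),
    ],
    'new' => [
        'driver'   => 'mysql',
        'host'     => env('DB_HOST', 'localhost'),
        'database' => 'new_database',              // hypothetical name
        'username' => env('DB_USERNAME'),
        'password' => env('DB_PASSWORD'),
    ],
],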
In your seeders, read from the old database and write into the new one:
$old_users = DB::connection('old')->table('users')->get();

foreach ($old_users as $user) {
    DB::connection('new')->table('users')->insert([
        'name'     => $user->name,
        'email'    => $user->email,
        'password' => $user->password,
        'old_id'   => $user->id,
        // ...
    ]);
}
Make sure to print messages while seeding, like $this->command->info('Users table seeded');, or even a progress bar (you can access command-line methods from the seeder), so you know at which point of the import you are.
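A progress bar version of the loop above could look like this (a sketch; it assumes the seeder is run through artisan, so $this->command is available):
$old_users = DB::connection('old')->table('users')->get();

// createProgressBar comes from the underlying Symfony console output
$bar = $this->command->getOutput()->createProgressBar(count($old_users));

foreach ($old_users as $user) {
    // ... insert into the 'new' connection as above ...
    $bar->advance();
}

$bar->finish();
$this->command->info(' Users table seeded!');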
Download the package from the Git repo: https://github.com/orangehill/iseed
then update the file src/Orangehill/Iseed/IseedCommand.php.
Add the code below at line number 75:
// update package script
if ($this->argument('tables') === null) {
    $tables = Schema::getConnection()->getDoctrineSchemaManager()->listTableNames();
}
and update the getArguments method in the same file with the code below:
array('tables', InputArgument::OPTIONAL, 'comma separated string of table names'),
Then run php artisan iseed; it will read all the tables from your existing database and create seeders for all of them.
So I am building a new app that needs to import some legacy data from an old app. The old app's database is MySQL, which you obviously can't use on Heroku, but I want to use Postgres. Basically I am doing an ETL via ActiveRecord.
Here's what I have so far:
# config/initializers/legacy_database.rb
LEGACY_DATABASE_URL = "mysql://myusername:#{ENV['LEGACY_DATABASE_PASSWORD']}@host/foo1008801154002"
# app/models/legacy.rb
class Legacy < ActiveRecord::Base
  establish_connection LEGACY_DATABASE_URL
end
# app/models/legacy/user.rb
class Legacy::User < Legacy
  self.table_name = 'users'
end
If I am in the console and I run Legacy::User.count, I get back the correct count. However, if I try to do something like Legacy::User.first, I get the following error:
Legacy::User Load (54.0ms) SELECT `users`.* FROM `users` ORDER BY `users`.`id` DESC LIMIT 1
Mysql::Error: Table 'foo1008801154002.legacies' doesn't exist: SHOW FULL FIELDS FROM `legacies`
ActiveRecord::StatementInvalid: Mysql::Error: Table 'foo1008801154002.legacies' doesn't exist: SHOW FULL FIELDS FROM `legacies`
I'm not sure why Rails is appending .legacies to the table name, nor am I sure how to fix this. I figure it might be some setting in Legacy.connection.
Any advice?
Well, it was relatively simple: I had to add the line self.abstract_class = true to my Legacy base class.
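So the base class from the question becomes:
# app/models/legacy.rb
class Legacy < ActiveRecord::Base
  # No table of its own; stops ActiveRecord from looking up a "legacies" table
  self.abstract_class = true
  establish_connection LEGACY_DATABASE_URL
end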
Just a note: if you have the mysql2 gem in your bundle, then you need to put:
# config/initializers/legacy_database.rb
LEGACY_DATABASE_URL = "mysql2://myusername:#{ENV['LEGACY_DATABASE_PASSWORD']}@host/foo1008801154002"
(or load the mysql gem in your Gemfile)
I'm working on a Rails 3 app. I have added the following observer, but I don't see any of the expected output in the console or log file (I have tried both development and production modes).
cmd : rails g observer auditor
models:
class AuditorObserver < ActiveRecord::Observer
  observe :excel_file

  def after_update(excel_file)
    excel_file.logger.info('New contact added!')
    AuditTrail.new(excel_file, "UPDATED")
    puts "*******************"
    logger.info "********************************************"
  end
end
application.rb:
config.active_record.observers = :auditor_observer
What am I missing here? When I change the database directly (through MySQL Workbench or the command line), I don't see any of the above lines getting executed, neither after_update nor after_save. But after_save works if I'm executing a query through the app itself and calling @excel.save.
How else are we supposed to update data in the DB so that the observer fires?
When you bypass ActiveRecord by modifying the database directly, you naturally bypass all of the ActiveRecord callbacks.
So the answer is to update the data through the application, or to use database triggers instead.
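The trigger route could look roughly like this (a sketch assuming MySQL and hypothetical excel_files/audit_trails tables):
-- Fires even when rows are changed outside the app,
-- e.g. through MySQL Workbench or the command line
CREATE TRIGGER excel_files_audit
AFTER UPDATE ON excel_files
FOR EACH ROW
  INSERT INTO audit_trails (excel_file_id, action, created_at)
  VALUES (NEW.id, 'UPDATED', NOW());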