How to import MySQL data into PostgreSQL on Heroku? (Django project) - mysql

Recently I needed to import MySQL data into a Postgres database on Heroku. Actually it involves two steps:
convert the MySQL data to PostgreSQL
import the PostgreSQL data into Heroku
After consulting plenty of materials and testing several tools on GitHub, I finally succeeded. Here I want to share some of my experience and references.

Firstly, I will list some tools for converting a MySQL database into PostgreSQL format.
mysql-postgresql-converter
: This is the tool I finally used, successfully. Dump the MySQL database in PostgreSQL-compatible format:
mysqldump -u username -p --compatible=postgresql databasename > outputfile.sql
Then use the converter to transform the dump into a *.psql file, and load the new dump into a fresh PostgreSQL database.
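Putting the steps together, the rest of the pipeline looks roughly like this (a sketch: db_converter.py is the converter script in the mysql-postgresql-converter repo, and the file and database names are placeholders):
$ python db_converter.py outputfile.sql outputfile.psql   # convert the MySQL dump to PostgreSQL syntax
$ createdb newdb                                          # fresh target database
$ psql -d newdb -f outputfile.psql                        # load the converted dump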
mysql2postgres: a tool introduced in the Heroku Dev Center; just refer to Migrating from MySQL to Postgres on Heroku. It is based on Ruby. However, I hit an issue after finishing the installation that I could not solve:
You have already activated test-unit 2.5.5, but your Gemfile requires test-unit 3.2.3. #95
py-mysql2pgsql: a similar process to mysql2postgres above, configured by editing a *.yml file. There is a nice reference table in the README file called Data Type Conversion Legend, which compares the data types of MySQL and PostgreSQL; you can manually modify the data types.
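For reference, a minimal configuration sketch (the key names follow the py-mysql2pgsql README as I recall it; verify against your installed version, since all values here are placeholders):
$ py-mysql2pgsql                         # first run writes a template mysql2pgsql.yml
$ cat mysql2pgsql.yml
mysql:
  hostname: localhost
  port: 3306
  username: mysqluser
  password: secret
  database: mydb
destination:
  postgres:
    hostname: localhost
    port: 5432
    username: pguser
    password: secret
    database: mydb
$ py-mysql2pgsql -v -f mysql2pgsql.yml   # run the conversion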
** This website lists some other conversion methods.
Some basic operations in PostgreSQL:
$sudo su - postgres
$createdb testdb
$psql testdb
=# create user username password 'password';
-- To change a password:
=# alter role username password 'password';
=# create database databasename with encoding 'utf8';
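You will usually also want to give the new user access to the new database; this is standard SQL (the names are placeholders):
=# grant all privileges on database databasename to username;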
How to list all databases in postgres: PostgreSQL - SELECT Database
postgres=# \l
List of databases
Name | Owner | Encoding | Collate | Ctype | Access privileges
-----------+----------+----------+---------+-------+-----------------------
postgres | postgres | UTF8 | C | C |
template0 | postgres | UTF8 | C | C | =c/postgres +
| | | | | postgres=CTc/postgres
template1 | postgres | UTF8 | C | C | =c/postgres +
| | | | | postgres=CTc/postgres
testdb | postgres | UTF8 | C | C |
(4 rows)
postgres=#
Now type the command below to connect to a desired database; here, we will connect to the testdb database:
postgres=# \c testdb;
psql (9.2.4)
Type "help" for help.
You are now connected to database "testdb" as user "postgres".
testdb=#
After you create your database, import the converted tables with psql. Kindly note that the database must exist before you import the data.
$psql -h server -d databasename -U username -f data.sql
(sometimes psql needs to be prefixed with sudo -u postgres)
How to generate a dump with pg_dump: creating a dump file
$sudo -u postgres pg_dump -Fc --no-acl --no-owner -h localhost -U postgres databasename > mydb.dump
The next step: how to import the data into Heroku Postgres?
After the previous steps, you have either imported the data into your local PostgreSQL or generated a pg_dump file. Two methods of transferring data to the remote Heroku Postgres are introduced here.
Use a pg_dump file (reference)
Use the raw file URL in the pg:backups restore command:
$ heroku pg:backups:restore 'https://s3.amazonaws.com/me/items/3H0q/mydb.dump' DATABASE_URL
In this case, you should first upload the dump file somewhere with an HTTP-accessible URL; the Heroku Dev Center recommends using Amazon S3.
DATABASE_URL represents the HEROKU_POSTGRESQL_COLOR_URL of the database you wish to restore to. For example, my database add-on is named postgresql-globular-XXXXX.
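If you have the AWS CLI configured, uploading the dump and generating a temporary HTTP-accessible URL might look like this (a sketch; the bucket name is a placeholder):
$ aws s3 cp mydb.dump s3://your-bucket/mydb.dump
$ aws s3 presign s3://your-bucket/mydb.dump   # prints a signed, expiring URL for the restore command above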
Use pg:push
pg:push pushes data from your local psql database into the remote Heroku Postgres database. The command looks like this:
$heroku pg:push mylocaldb DATABASE_URL --app sushi
This command takes the local database "mylocaldb" and pushes it to the database at DATABASE_URL on the app "sushi". Kindly note that the remote database must be empty before performing pg:push, in order to prevent accidental data overwrites and loss.
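If the remote database is not empty, you can wipe it first with pg:reset (destructive, so the Heroku CLI asks you to confirm); a sketch using the same example app:
$heroku pg:reset DATABASE_URL --app sushi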
Actually, this pg:push method is the one I used, and it finally succeeded.
More information about Heroku Postgres can be found in Heroku's official documentation.
** Others:
Viewing logs of your web application in Heroku: heroku logs --tail
How to deploy Python and Django Apps on Heroku?
How to write the Procfile of Django Apps?
A common Procfile of Django projects will look like this:
web: gunicorn yourprojectname.wsgi --log-file -
Here web is a single process type. What you need to modify is yourprojectname.wsgi: just substitute your own project name in the prefix.
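For instance, for a project created with django-admin startproject mysite (a hypothetical name), the Procfile would read:
web: gunicorn mysite.wsgi --log-file -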
How to add Gunicorn to your application?
$ pip install gunicorn
$ pip freeze > requirements.txt
How to run command line in remote Heroku server?
You can execute the bash command in Heroku.
$heroku run bash
Running bash attached to terminal... up, run.1
~ $ ls
Then you can run commands such as ls and cd just like in your local bash.
You can also execute manage.py on the remote Heroku dyno with commands of this pattern: heroku run python manage.py runserver
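For example, a common use is applying Django migrations or creating an admin user on the remote dyno (both are standard manage.py commands; add --app yourapp if the app is not inferred from your git remote):
$heroku run python manage.py migrate
$heroku run python manage.py createsuperuser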

Related

mysqldump and send it to another server without saving on the local machine

I have a database server (MySQL) that does not have enough free space to save a dump of MySQL. I want to create a dump on it and send it straight to an archive server, without saving it on the local host, because of the lack of space. How can I do this on my local database server?
All of my servers are Linux-based.
Thanks in advance
You can do it like this:
mysqldump -u MYSQL_USER -p'MYSQL_PASSWORD' YOUR_DATABASE | gzip -c | ssh you@your_host 'cat > ~/backup.sql.gz'
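Restoring that dump later on the archive server is the reverse pipe (the target database must already exist; names here are placeholders):
gunzip < ~/backup.sql.gz | mysql -u MYSQL_USER -p TARGET_DATABASE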

How to pull mysql database from heroku to local machine

Hi, I have a Ruby on Rails app hosted on Heroku and it is using MySQL as its database.
Now I have to take a backup of the database to my local machine, but I am getting issues while taking the backup.
For this I installed the taps gem and I am using the following command for it:
heroku pg:pull mysql2://username@hostname.cleardb.com/heroku_database local_database --app my_app
but it is giving the error: !Your app has no databases.
Can anyone guide me on how to pull a MySQL database from Heroku to a local machine?
EDIT
I have used following syntax for the command
heroku pg:pull <REMOTE_SOURCE_DATABASE> <LOCAL_TARGET_DATABASE>
and for getting REMOTE_SOURCE_DATABASE I have used following command
heroku config:get DATABASE_URL --app my_app
I referred to link1 and link2 for more detailed Heroku documentation.
The pg:pull command only works with Postgres databases in your Heroku app. But, you are using a third-party MySQL provider. Your database is hosted on the ClearDB servers and it's available to anyone with the right credentials, including both your app server on Heroku and your dev machine.
Even though there aren't special commands to pull the database, you don't need any - plain mysqldump should do.
mysqldump -h hostname.cleardb.com -u username -p heroku_database | mysql local_database
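Note that the local target database must already exist for the pipe to work; a quick way to create it (assuming a local root account):
mysqladmin -u root -p create local_database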
Running $heroku config | grep ^DATABASE will give you something like this:
DATABASE_URL: mysql2://username:password@host/dbname?reconnect=true
From there you can construct your db dump command:
mysqldump -h host -p -u username dbname | mysql local_database
This will prompt you for the password which you received from the previous command. If you wanted to create a script that would automatically include the password from the heroku command you could do something like this:
mysqldump -u username --password=`heroku config | grep ^DATABASE | sed 's/.*[a-z0-9][a-z0-9]*:\([a-z][a-z0-9]*\).*/\1/'` -h host dbname | mysql cedric
In this way you can have a script that will import the database without requiring any user input but also does not expose the password to your database.
(IMPORTANT DISCLAIMER: You MUST have your database.yml configured correctly in order for this to work. I am not responsible for any data you lose as a result of running the below script.)
For Ruby on Rails users ... you could consider writing a Rake task like these db:clone tasks below.
I find myself using this script constantly to clone down from production to development. It's way easier than remembering the mysqldump syntax, much less all of the usernames and passwords involved ...
To clone from production to development:
rake db:clone:production
To clone from staging to development:
rake db:clone:staging
To clone from production to staging:
rake db:clone:production_to_staging
And here's the code, enjoy (and be careful in setting up your database.yml):
namespace :db do
  namespace :clone do
    # Defines helper methods (development_db, test_db, staging_db, production_db)
    # that each return the matching entry from config/database.yml.
    class << self
      %w(development test staging production).each do |env|
        define_method("#{env}_db") do
          Rails.configuration.database_configuration[env]
        end
      end
    end

    # Pipe a mysqldump of from_db straight into to_db.
    def clone_db(from_db, to_db)
      start_time = Time.now
      puts "Cloning Remote DB...."
      system("mysqldump -h#{from_db['host']} -u#{from_db['username']} -p#{from_db['password']} #{from_db['database']} | mysql #{to_db['database']} -u#{to_db['username']} -p#{to_db['password']}")
      puts "Import Successful"
      end_time = Time.now
      puts "===================="
      puts "Job Completed: #{end_time - start_time} Seconds"
    end

    task :staging => :environment do
      clone_db(staging_db, development_db)
    end

    task :production => :environment do
      clone_db(production_db, development_db)
    end

    # Guarded so it can only run in the staging environment.
    task :production_to_staging => :environment do
      clone_db(production_db, staging_db) if Rails.env.staging?
    end
  end
end

MySQL incomplete table data after export and import

We are trying to migrate a very heavy table (about 23GB) from a hosted server to a virtual private cloud with a MySQL instance running in it.
For all the other (smaller) tables that we have, the following procedure works perfectly:
Dump and compress the data on the origin server:
mysqldump -u user -p db table | gzip > table.gz
Send the data to the server that is in the cloud using rsync:
rsync -avz -e ssh root@gold.net:/path/table.gz .
Import the data into MySQL using:
gzip -dc < table.gz | mysql -u user -h {LOCAL IP OF DB INSTANCE} -p db
When using this procedure for the large table (again, about 23GB), rows are missing from the target database (this has been tested more than once). Many rows do load successfully, but it is evident that a large amount of the data is missing.
The MD5 of the .gz file is the same both on the source server and the target one.
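One way to quantify what is missing is to compare per-table row counts on both sides (a sketch; user, db, and table are placeholders from the setup above):
mysql -u user -p -e "SELECT COUNT(*) FROM db.table"                                 # on the origin server
mysql -u user -h {LOCAL IP OF DB INSTANCE} -p -e "SELECT COUNT(*) FROM db.table"    # on the target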
What else could possibly be going wrong here?

How to import a big database from Heroku to local mysql or sqlite3?

As per the title I need to import, but PG backups is giving me strict Postgres SQL that doesn't work with MySQL, and with an unspecified encoding that I guess is UTF-16. Using db:pull takes ages and errors out before finishing.
I'd appreciate any suggestion. Thanks.
Set up PostgreSQL locally, use PG backups to copy the data from Heroku to your local machine, then pg_restore to import it into your new local PostgreSQL. Then you can copy it from PostgreSQL to MySQL or SQLite locally without having to worry about timeouts. Or, since you'd have a functional PostgreSQL installation after that, just start developing on top of PostgreSQL so that your development stack better matches your deployment stack; developing and deploying on the same database is a good idea.
You're probably getting binary dumps (i.e. pg_dump -Fc) from Heroku, that would explain why the dump looks like some sort of UTF-16 nonsense.
You can use the pgbackups addon to export the database dump
$ heroku addons:add pgbackups # To install the addon
$ curl -o latest.dump `heroku pgbackups:url` # To download a dump
Heroku has instructions on how to do this: https://devcenter.heroku.com/articles/heroku-postgres-import-export#restore-to-local-database, which boil down to:
$ pg_restore --verbose --clean --no-acl --no-owner -h localhost -U myuser -d mydb latest.dump
where myuser is the current user and mydb is the current database.
If you're using Postgres.app, it's pretty trivial to copy your production database locally. You can leave out -U myuser unless you have configured it otherwise, and create a database by running $ psql -h localhost and then CREATE DATABASE your_database_name; (from the Postgres.app documentation). Then run the above command and you're set.
Follow these 4 simple steps in your terminal (Heroku Dev Center):
Install the Heroku Backup tool:
$ heroku addons:add pgbackups
Start using it:
$ heroku pgbackups:capture
Download the remote db on Heroku (to your local machine) using curl:
$ curl -o latest.dump `heroku pg:backups public-url`
Load it*:
$ pg_restore --verbose --clean --no-acl --no-owner -h localhost -U YOUR_USERNAME -d DATABASE_NAME latest.dump
Get your username and choose the desired database from your config/database.yml file.
DATABASE_NAME can be your development/test/production db (e.g. mydb_development).
That's it!
UPDATE: It is updated to the new interface

Migrating existing database to Amazon RDS

How can I import existing MySQL database into Amazon RDS?
I found this page on the AWS docs which explains how to use mysqldump and pipe it into an RDS instance.
Here's their example code (use in command line/shell/ssh):
mysqldump acme | mysql --host=hostname --user=username --password acme
where acme is the database you're migrating over, and hostname/username are those from your RDS instance.
You can connect to RDS as if it were a regular mysql server, just make sure to add your EC2 IPs to your security groups per this forum posting.
I had to include the password for the local mysqldump, so my command ended up looking more like this:
mysqldump --password=local_mysql_pass acme | mysql --host=hostname --user=username --password acme
FWIW, I just completed moving my databases over. I used this reference for mysql commands like creating users and granting permissions.
Hope this helps!
There are two ways to import data:
mysqldump: if your data size is less than 1GB, you can directly make use of the mysqldump command and import your data to RDS.
mysqlimport: if your data size is more than 1GB, or the data is in some other format, you can compress the data into flat files and upload it using the mysqlimport command.
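A minimal mysqlimport sketch (mysqlimport loads each file into the table named after the file, e.g. mytable.csv into mytable; the host, credentials, and paths are placeholders):
mysqlimport --local --compress --fields-terminated-by=',' -h myinstance.rds.amazonaws.com -u username -p mydb /path/to/mytable.csv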
I'm a big fan of the SqlYog tool. It lets you connect to your source and target databases and sync schema and/or data. I've also used SQLWave, but switched to SqlYog. Been so long since I made the switch that I can't remember exactly why I switched. Anyway, that's my two cents. I know some will object to my suggestion of Windows GUI tools for MySQL. I actually like the SqlYog product so much that I run it from Wine (works flawlessly from Wine on Ubuntu for me).
This blog might be helpful.
A quick summary of a GoSquared Engineering post:
Configuration + Booting
Select a maintenance window and backup window when the instance will be at lowest load
Choose Multi-AZ or not (highly recommended for auto-failover and maintenance)
Boot your RDS instance
Configure security groups so your apps etc can access the new instance
Data migration + preparation
Enable binlogging if you haven't already
Run mysqldump --single-transaction --master-data=2 -C -q dbname -u username -p > backup.sql on the old instance to take a dump of the current data
Run mysql -u username -p -h RDS_endpoint DB_name < backup.sql to import the data into your RDS instance (this may take a while depending on your DB size)
In the meantime, your current production instance is still serving queries - this is where master-data=2 and binlogging come in
In your backup.sql file, you'll have a line at the top that looks like CHANGE MASTER TO MASTER_LOG_FILE='mysql-bin.000003', MASTER_LOG_POS=350789121;
Get the diff since backup.sql as an SQL file mysqlbinlog /var/log/mysql/mysql-bin.000003 --start-position=350789121 --base64-output=NEVER > output.sql
Run those queries on your RDS instance to update it cat output.sql | mysql -h RDS_endpoint -u username -p DB_name
Get the new log position by finding end_log_pos at the end of the latest output.sql file.
Get the diff since the last output.sql (like step 6) and repeat steps 7 + 8.
The actual migration
Have all your apps ready to deploy quickly with the new RDS instance
Get the latest end_log_pos from output.sql
Run FLUSH TABLES WITH READ LOCK; on the old instance to stop all writes
Start deploying your apps with the new RDS instance
Run steps 6-8 from above to update the RDS instance with the last queries to the old server
Conclusion
Using this method, you'll have a small amount of time (depending on how long it takes to deploy your apps + how many writes your MySQL instance serves - probably only a minute or two) with writes being rejected from your old server, but you will have a consistent migration with no read downtime.
A full and detailed post explaining how we (GoSquared) migrated to RDS with minimal downtime (including error debugging) is available here: https://engineering.gosquared.com/migrating-mysql-to-amazon-rds.
I completely agree with @SanketDangi.
There are two ways of doing this: one way is, as suggested, using either mysqldump or mysqlimport.
I have seen cases where this creates problems while restoring, and the data on the cloud gets corrupted.
However, importing applications to the cloud has become much easier nowadays. You can try uploading your DB server to a public cloud through Ravello.
You can import your database server itself onto Amazon using Ravello.
Disclosure: I work for Ravello.
Simplest example:
# export local db to sql file:
mysqldump -uroot -p --databases qwe_db > qwe_db.sql
# Now you can edit qwe_db.sql file and change db name at top if you want
# import sql file to AWS RDS:
mysql --host=proddb.cfrnxxxxxxx.eu-central-1.rds.amazonaws.com --port=3306 --user=someuser -p qwe_db < qwe_db.sql
The AWS RDS customer data import guide for MySQL is available here: http://aws.amazon.com/articles/2933
Create flat files containing the data to be loaded
Stop any applications accessing the target DB Instance
Create a DB Snapshot
Disable Amazon RDS automated backups
Load the data using mysqlimport
Enable automated backups again
If you are using the terminal this is what worked for me:
mysqldump -u local_username -plocal_password local_db_name | mysql -h myRDS-at-amazon.rds.amazonaws.com -u rds-username -prds_password_xxxxx remote_db_name
and then I used MySQL Workbench (free download) to check it was working, because the command line stayed static after pressing enter; I could probably have put -v at the end to see its output.
Note: there is no space after -p
Here are the steps which I followed and had success with.
Take the mysqldump of the needed database.
mysqldump -u username -p databasename --single-transaction --quick --lock-tables=false >databasename-backup-$(date +%F).sql
(Don't forget to replace username, which is root most of the time, and databasename with the name of the database you are going to migrate to RDS.)
Once prompted, enter your password.
Once done, log in to the RDS instance from your MySQL server (make sure the security groups are configured to allow the connection from EC2 to RDS):
mysql -h hostaddress -P 3306 -u rdsusername -p
(Don't forget to replace hostaddress with the address of your RDS instance and rdsusername with the username for your RDS instance; when prompted, give the password too.)
You can find that hostaddress under Connectivity & security -> Endpoint & port for your RDS database in the AWS Console.
Once logged in, create the database using MySQL commands:
create database databasename;
\q
Once the database is created in RDS, import the SQL file created in Step 1:
mysql -h hostaddress -u rdsusername -p databasename < backupfile.sql
This should import the SQL file to RDS and restore the contents into the new database.
Reference: https://k9webops.com/blog/migrate-an-existing-database-on-mysql-mariadb-to-an-already-running-rds-instance-on-the-aws/