Moving/copying one remote database to another remote database - mysql

I'm using heroku to deploy my app. So far, I've been using a development version of my app, and seeding some data into it. I also have a production version set up on heroku.
Both apps are using a mysql database hosted with ClearDB to store the data.
I simply want to move the data from the development version to the production version.
Using MySQL Workbench, I exported the dev data to a file and tried to import it into the prod db, but I got an access denied error because it tried to log into the dev db with the production credentials.
The databases have identical table/row/column structure. How can I take the data from one and insert it into the other?

Add the Taps gem to your Gemfile:
gem 'taps'
You should then be able to pull your development data down to your local development environment with:
heroku db:pull --app your_development_app_name
and push it to the production environment with:
heroku db:push --app your_production_app_name
This will completely overwrite the target db's schema and data, so make sure you are careful with it. Hope it helps!
_ryan

You can use a combination of the 'mysql' and 'mysqldump' command line
clients to "copy" one database's contents to the other. Here's an example of
how to do this:
mysqldump --single-transaction -u (old_database_username) -p -h (old_database_host) (database_name) | mysql -h (new_host) -u (new_user) -p -D (new_database)
Or, broken across lines with continuations:
mysqldump --single-transaction -u (old_database_username) \
  -p -h (old_database_host) (database_name) | mysql -h (new_host) \
  -u (new_user) -p -D (new_database)
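Applied to the ClearDB setup from the question, the whole pipe might look like the sketch below. Every hostname, user, and database name is a hypothetical placeholder (the real values live in each Heroku app's CLEARDB_DATABASE_URL config var), and the script only prints the command so you can check hosts and users before running it:

```shell
# Hypothetical ClearDB placeholders -- substitute the values from
# each app's CLEARDB_DATABASE_URL config var.
DEV_HOST=us-cdbr-east-01.cleardb.com
DEV_USER=dev_user
DEV_DB=heroku_dev_db
PROD_HOST=us-cdbr-east-02.cleardb.com
PROD_USER=prod_user
PROD_DB=heroku_prod_db

# Build the pipeline as a string and print it for review; once it
# looks right, execute it with: eval "$CMD"
CMD="mysqldump --single-transaction -h $DEV_HOST -u $DEV_USER -p $DEV_DB | mysql -h $PROD_HOST -u $PROD_USER -p -D $PROD_DB"
echo "$CMD"
```

Each `-p` will prompt for the matching password when the command actually runs.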

Related

Retrieve a MySQL database via SSH

I need to copy an entire database from a MySQL installation on a remote machine, via SSH, to my local machine's MySQL.
I know the SSH credentials and both the local and remote MySQL admin user and password.
Is this enough information, and how is it done?
From remote server to local machine
ssh {ssh.user}@{remote_host} \
  'mysqldump -u {remote_dbuser} --password={remote_dbpassword} {remote_dbname} | bzip2 -c' \
  | bunzip2 -dc | mysql -u {local_dbuser} --password={local_dbpassword} -D {local_dbname}
This will dump the remote DB into your local MySQL via pipes:
ssh mysql-server "mysqldump --all-databases --quote-names --opt --hex-blob --add-drop-database" | mysql
Take care with the user accounts in the mysql.user table, since --all-databases copies the mysql system database as well.
Moreover, to avoid typing users and passwords for mysqldump and mysql on the local and remote hosts, you can create a file ~/.my.cnf:
[mysql]
user = dba
password = foobar
[mysqldump]
user = dba
password = foobar
See http://dev.mysql.com/doc/refman/5.1/en/option-files.html
Modified from http://www.cyberciti.biz/tips/howto-copy-mysql-database-remote-server.html (I prefer .sql as the extension for SQL files):
Usually you run mysqldump to create a database copy and backups as
follows:
$ mysqldump -u user -p db-name > db-name.sql
Copy the db-name.sql file using sftp/ssh to the remote MySQL server:
$ scp db-name.sql user@remote.box.com:/backup
Restore database at remote server (login over ssh):
$ mysql -u user -p db-name < db-name.sql
Basically you'll use mysqldump to generate a dump of your database, copy it to your local machine, then pipe the contents into mysql to regenerate the DB.
You can copy the DB files themselves, rather than using mysqldump, but only if you can shutdown the MySQL service on the remote machine.
I would recommend the Xtrabackup tool by Percona. It supports hot copying data via SSH and has excellent documentation. Unlike mysqldump, it copies all elements of the MySQL instance, including user permissions, triggers, replication settings, etc.
1. ssh into the remote machine
2. make a backup of the database using mysqldump
3. transfer the file to your local machine using scp
4. restore the database to your local MySQL
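The steps above can be sketched as a script. Host, user, and database names are hypothetical placeholders, and a DRY_RUN guard prints each command instead of executing it so the sequence can be reviewed before it touches any server:

```shell
# Print rather than execute each command while DRY_RUN=1.
run() {
  if [ "${DRY_RUN:-1}" = "1" ]; then echo "+ $*"; else "$@"; fi
}

DRY_RUN=1                            # set to 0 to actually execute
REMOTE=user@remote.example.com       # hypothetical SSH target
REMOTE_DB=mydb
LOCAL_DB=mydb

run ssh "$REMOTE" "mysqldump -u dbuser -p $REMOTE_DB > /tmp/$REMOTE_DB.sql"  # steps 1-2
run scp "$REMOTE:/tmp/$REMOTE_DB.sql" .                                      # step 3
run sh -c "mysql -u localuser -p $LOCAL_DB < ./$REMOTE_DB.sql"               # step 4
```

With DRY_RUN=1 each line is echoed with a `+` prefix; flip it to 0 once the hosts and credentials look right.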

How to import a big database from Heroku to local mysql or sqlite3?

As per the title, I need to import, but PG backups gives me strict PostgreSQL SQL that doesn't work with MySQL, with an unspecified encoding that I guess is UTF-16. Using db:pull takes ages and errors out before finishing.
I'd appreciate any suggestion. Thanks.
Set up PostgreSQL locally, use PG backups to copy the data from Heroku to your local machine, then use pg_restore to import it into your new local PostgreSQL. From there you can copy it into MySQL or SQLite locally without having to worry about timeouts. Or, since you'd have a working PostgreSQL installation at that point, just start developing on top of PostgreSQL so that your development stack better matches your deployment stack; developing and deploying on the same database is a good idea.
You're probably getting binary dumps (i.e. pg_dump -Fc) from Heroku, which would explain why the dump looks like some sort of UTF-16 nonsense.
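For the PostgreSQL-to-MySQL hop itself, one rough approach is to go through CSV one table at a time. The database and table names below are hypothetical, and the commands are only echoed so nothing connects to a real server:

```shell
TABLE=users
PG_DB=myapp_development
MY_DB=myapp_development

# Export one table from the local PostgreSQL as CSV...
PG_EXPORT="psql -d $PG_DB -c \"\\copy $TABLE TO '/tmp/$TABLE.csv' CSV\""
# ...then bulk-load that CSV into the matching MySQL table.
MY_IMPORT="mysql --local-infile=1 -u root -p $MY_DB -e \"LOAD DATA LOCAL INFILE '/tmp/$TABLE.csv' INTO TABLE $TABLE FIELDS TERMINATED BY ','\""

echo "$PG_EXPORT"
echo "$MY_IMPORT"
```

This glosses over type quirks (booleans, timestamps) and quoting edge cases, so spot-check a few rows after each load.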
You can use the pgbackups addon to export the database dump
$ heroku addons:add pgbackups # To install the addon
$ curl -o latest.dump `heroku pgbackups:url` # To download a dump
Heroku has instructions on how to do this: https://devcenter.heroku.com/articles/heroku-postgres-import-export#restore-to-local-database, which boil down to:
$ pg_restore --verbose --clean --no-acl --no-owner -h localhost -U myuser -d mydb latest.dump
where myuser is the current user and mydb is the current database.
If you're using Postgres.app, it's pretty trivial to copy your production database locally. You can leave out -U myuser unless you have configured it otherwise, and create a database by running $ psql -h localhost and then CREATE DATABASE your_database_name; (see the Postgres.app documentation). Then run the above command and you're set.
Follow these 4 simple steps in your terminal (Heroku Dev Center):
Install the Heroku Backup tool:
$ heroku addons:add pgbackups
Start using it:
$ heroku pgbackups:capture
Download the remote db on Heroku (to your local machine) using curl:
$ curl -o latest.dump `heroku pg:backups public-url`
Load it:
$ pg_restore --verbose --clean --no-acl --no-owner -h localhost -U YOUR_USERNAME -d DATABASE_NAME latest.dump
Get your username and the desired database name from your config/database.yml file.
DATABASE_NAME can be your development/test/production db (e.g. mydb_development).
That's it!
UPDATE: updated for the new interface.

Uploading a huge SQL file

I have 3 huge 5GB .sql files which I need to upload to my server so I can process them. They're too large to run on my computer, so I can't process the information offline.
So the question: how can I upload a huge 5GB SQL file to the MySQL server without using PHP, because its upload limit is 2MB?
The MySQL server is hosted on a remote server miles away, so I can't just plug it in, but I have turned remote access on, which should be of some help.
All ideas welcome.
Thank you for your time,
Paul
You cannot use PHP to import such huge files. You need to use the command line to do so. If you have access to the server, it's quite easy. You can log in to the server through SecureCRT (paid) or PuTTY (free).
Use this command to import the SQL dump into the server, using an IP address:
$ mysql -u username -p -h 202.54.1.10 databasename < data.sql
Or, using a hostname:
$ mysql -u username -p -h mysql.hosting.com database-name < data.sql
If you want to try it out locally first to get used to the command line, you can use this:
$ mysql -u username -p -h localhost data-base-name < data.sql
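Since transfer time over that remote connection is the real bottleneck (the 2MB limit only applies to PHP uploads, not to the mysql client), compressing the dump in flight helps a lot. The snippet below uses a tiny stand-in file so the round-trip can be verified locally; the mysql command reuses the hypothetical credentials from above and is only echoed, not executed:

```shell
# Tiny stand-in for the real 5GB dump, to keep the example self-contained:
printf 'CREATE TABLE t (id INT);\nINSERT INTO t VALUES (1);\n' > data.sql

# Compress without touching the original (-c writes to stdout):
gzip -c data.sql > data.sql.gz

# Stream it into the remote server, decompressing on the fly:
echo "gunzip -c data.sql.gz | mysql -u username -p -h mysql.hosting.com database-name"

# Sanity check: the compressed copy round-trips to the original bytes.
gunzip -c data.sql.gz | cmp - data.sql && echo "round-trip OK"
```

Plain-text SQL dumps typically compress very well, so the upload can shrink to a fraction of 5GB.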
Are you using something like phpMyAdmin? If you have FTP/SFTP access, upload the file and then run it with phpMyAdmin.

Syncing remote database to local?

I'm hoping I can use a shell script that will pull a sql dump down from my production site and into my local database. Ideally, I'd like to be able to run something like this:
sync_site example_com example_local
Where the first argument is the production database and the second is the local database. The remote database is always on the same server, behind SSH, with known MySQL credentials.
Figured it out:
ssh me@myserver.com mysqldump -u user -ppass remote_db | mysql -u user -ppass local_db
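Wrapped up as the sync_site helper the question asked for; the SSH host and MySQL credentials are the hypothetical ones from the one-liner, and the function prints the command so it can be reviewed and then run with eval:

```shell
#!/bin/sh
# sync_site: pull a remote MySQL database down into a local one.
# Usage: sync_site remote_db local_db
# Host and credentials are hypothetical placeholders -- substitute
# the ones for your server.
sync_site() {
  remote_db="$1"
  local_db="$2"
  CMD="ssh me@myserver.com mysqldump -u user -ppass $remote_db | mysql -u user -ppass $local_db"
  # Print the command for review; execute it with: eval "$CMD"
  echo "$CMD"
}

sync_site example_com example_local
```

Putting the passwords on the command line is visible to other users via ps; a ~/.my.cnf file (as shown earlier) avoids that.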

How do I upload a new database structure into MySQL from my development db?

We've been making changes in the MySQL database, adding tables, columns and so forth. I have a development/staging site and production; all are MySQL, with staging and production hosted remotely.
How do I export the table structure and bring it into the production environment?
I have been using phpMyAdmin to administer it to date.
On local dev system:
$ mysqldump -u username -p databasename > export.sql
On remote system:
$ mysql -u username -p databasename
mysql> source pathto/export.sql
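Since the question is about moving structure rather than rows, it may be worth knowing mysqldump's --no-data flag, which emits only the CREATE TABLE statements. Credentials and names below are hypothetical, and the command is printed rather than run:

```shell
# Schema-only dump: CREATE TABLE statements, no INSERTs.
CMD="mysqldump --no-data -u username -p databasename > schema-only.sql"
echo "$CMD"    # review, then run with: eval "$CMD"
```

The resulting schema-only.sql can be sourced on the remote system exactly like the full export above.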
Check out mysqldump, a command line tool that comes with MySQL, here:
http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
Using it, you can take a snapshot of both the structure and the data of your database and import it elsewhere.