I have 3 huge 5GB .sql files which I need to upload to my server so I can process them. They're too large to run on my computer, so I can't process the information offline.
So the question: how can I upload a huge 5GB SQL file to a MySQL server without using PHP, since PHP's upload limit is 2MB?
The MySQL server is hosted on a remote machine miles away, so I can't just plug in directly, but I have turned remote access on, which should be of some help.
All ideas welcome.
Thank you for your time,
Paul
You cannot use PHP to import such huge files. You need to use the command line instead. If you have access to the server, it's quite easy to do. You can log in to the server through SecureCRT (paid) or PuTTY (free).
Use this command to import the SQL file into a database on the server, using the IP address:
$ mysql -u username -p -h 202.54.1.10 databasename < data.sql
Or, using the hostname:
$ mysql -u username -p -h mysql.hosting.com database-name < data.sql
If you want to try it out locally first to get used to the command line, you can use this:
$ mysql -u username -p -h localhost data-base-name < data.sql
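For a file as large as yours, it can also help to compress the connection. Most versions of the mysql client accept a --compress flag (deprecated in recent MySQL 8.0 releases but still widely available) that compresses the client/server protocol, which can noticeably speed up a remote import. The same command as above, as a sketch:
$ mysql --compress -u username -p -h 202.54.1.10 databasename < data.sql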
Are you using something like phpMyAdmin? If you have FTP/SFTP access, upload the file and then run it with phpMyAdmin.
I tried to copy a SQL file from one server into another server's MySQL database:
ssh -i keylocation user@host 'mysql --user=root --password="pass" --host=ipaddress additional_content Additional_Content' | < databasedump.sql
databasedump.sql is a file on server A. I want to copy data from that database file into a database on server B. I tried to connect via SSH to that server (I need a keyfile for that) and then copy the data, but when I run this command in the console, nothing happens. Any help?
Are you able to secure-copy the file over to server B first, and then SSH in to import the dump? Example:
scp databasedump.sql user@server-B:/path/to/databasedump.sql
ssh -i keylocation user@host 'mysql --user=root --password="pass" --database=db_name < /path/to/databasedump.sql'
Edit: typo
I'm not entirely sure how mysql handles stdin, so one thing you can do that should work in one command is:
ssh -i keylocation user@host 'cat - | mysql --user=root --password="pass" --host=ipaddress additional_content Additional_Content' < databasedump.sql
However, it's better to copy the file over first with scp and then import it with mysql. See Dan's answer.
I may have gone completely braindead today, but I'm having trouble copying my DB down to my local MAMP server. I'm not too familiar with mysqldump, etc., but I want to know how to copy a database from a test server to my local MAMP server in the easiest way possible. I have very limited experience with server stuff, but a bit of experience with the command line.
Any straight-forward help would be much appreciated. I look forward to smacking myself in the head when I realise what a dick I've been ;)
Dalogi
mysqldump's the best way:
on the test server: mysqldump -p name_of_db > dump.sql
on the MAMP server: mysql -p name_of_db < dump.sql
The dump file contains the full instructions, in SQL query format, to recreate the database, its table structure, and its data. The -p option forces both programs to prompt for your password. If your MySQL username is different from your system's login account, then you'll need the -u option as well:
mysqldump -p -u yourDBusername name_of_db > dump.sql
mysql -p -u yourDBusername name_of_db < dump.sql
Or do it in one step, piping the remote dump straight into the local database:
mysqldump -h 'remotehost' -uremoteuser -premotepass db_name | mysql -ulocaluser -plocalpass db_name
I'm hoping I can use a shell script that will pull a sql dump down from my production site and into my local database. Ideally, I'd like to be able to run something like this:
sync_site example_com example_local
Where the first argument is the production database and the second is the local database. The remote database is always on the same server, behind SSH, with known MySQL credentials.
Figured it out:
ssh me@myserver.com mysqldump -u user -ppass remote_db | mysql -u user -ppass local_db
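To get the sync_site example_com example_local form you described, that one-liner wraps neatly in a small script. A sketch, with the host and the user/pass credentials as placeholders to fill in:
#!/usr/bin/env bash
# sync_site: pull a remote MySQL database down into a local one.
# Usage: sync_site remote_db local_db
set -euo pipefail
remote_db="$1"
local_db="$2"
# mysqldump runs on the remote server; the pipe and mysql run locally.
ssh me@myserver.com "mysqldump -u user -ppass $remote_db" | mysql -u user -ppass "$local_db"
Save it somewhere on your PATH as sync_site and chmod +x it.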
I'm creating a snippet to be used in my Mac OS X terminal (bash) which will allow me to do the following in one step:
Log in to my server via ssh
Create a mysqldump backup of my Wordpress database
Download the backup file to my local harddrive
Replace my local MAMP Pro MySQL database
The idea is to create a local version of my current online site to do development on. So far I have this:
ssh server 'mysqldump -u root -p'mypassword' --single-transaction wordpress_database > wordpress_database.sql' && scp me@myserver.com:~/wordpress_database.sql /Users/me/Downloads/wordpress_database.sql && /Applications/MAMP/Library/bin/mysql -u root -p'mylocalpassword' wordpress_database < /Users/me/Downloads/wordpress_database.sql
Obviously I'm a little new to this, and I think I've got a lot of unnecessary redundancy in there. However, it does work. Oh, and the ssh command ssh server works because I've set up an alias for that server in my local SSH config.
Here's what I'd like help with:
Can this be shortened? Made simpler?
Am I doing this in a good way? Is there a better way?
How could I add gzip compression to this?
I appreciate any guidance on this. Thank you.
You can dump it out of your server and into your local database in one step (with a hint of gzip for compression):
ssh server "mysqldump -u root -p'mypassword' --single-transaction wordpress_database | gzip -c" | gunzip -c | /Applications/MAMP/Library/bin/mysql -u root -p'mylocalpassword' wordpress_database
The double-quotes are key here, since you want gzip to be executed on the server and gunzip to be executed locally.
I also store my mysql passwords in ~/.my.cnf (and chmod 600 that file) so that I don't have to supply them on the command line (where they would be visible to other users on the system):
[mysql]
password=whatever
[mysqldump]
password=whatever
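With those entries in place on both the server and your local machine, the one-liner above needs no inline passwords at all (a sketch; add the -u flags back if your MySQL usernames differ from your login accounts):
ssh server "mysqldump --single-transaction wordpress_database | gzip -c" | gunzip -c | /Applications/MAMP/Library/bin/mysql wordpress_database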
That's the way I would do it too.
To answer your question #3:
Q: How could I add gzip compression to this?
A: You can run gzip wordpress_database.sql right after the mysqldump command and then scp the gzipped file (wordpress_database.sql.gz) instead.
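Put together, a sketch of the original three-step flow with that change (same hypothetical passwords and paths as in the question):
ssh server "mysqldump -u root -p'mypassword' --single-transaction wordpress_database > wordpress_database.sql && gzip -f wordpress_database.sql"
scp me@myserver.com:~/wordpress_database.sql.gz /Users/me/Downloads/
gunzip -c /Users/me/Downloads/wordpress_database.sql.gz | /Applications/MAMP/Library/bin/mysql -u root -p'mylocalpassword' wordpress_database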
There is a Python script that downloads the SQL dump file to your local machine. You can take a look at the script and modify it a bit to fit your requirements:
download-remote-mysql-dump-local-using-python-script
I want to copy all the tables, fields, and data from my local MySQL server to my hosting site's MySQL. Is there a way to copy all the data? (It's only 26KB, very small.)
In phpMyAdmin, just export a dump using the Export tab and re-import it on the other server using the SQL tab.
Make sure you compare the results, I have had phpMyAdmin screw up the import more than once.
If you have shell access to both servers, a combination of
mysqldump -u username -p databasename > dump.sql
and a
mysql -u username -p databasename < dump.sql
on the target server is the much faster and more reliable alternative, in my experience.
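If the dump needs to move between the two machines in between, scp handles that step (the hostname is a placeholder):
scp dump.sql username@target-server:~/
Then run the mysql import on the target against the copied file.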
Have a look at:
Copying MySQL Databases to Another Machine
Copy MySQL database from one server to another remote server
Please follow these steps:
Create the target database using MySQLAdmin or your preferred method (a command-line sketch follows these steps). In this example, db2 is the target database, into which the source database db1 will be copied.
Execute the following statement on a command line:
mysqldump -h [server] -u [user] -p[password] db1 | mysql -h [server] -u [user] -p[password] db2
Note: There is NO space between -p and [password]
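If you'd rather do step 1 from the command line as well, a minimal sketch using the same placeholders:
mysql -h [server] -u [user] -p[password] -e "CREATE DATABASE db2"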
I copied this from Copy/duplicate database without using mysqldump.
It works fine. Just make sure you are not inside the mysql shell when running this command.
If you have the same version of MySQL on both systems (or versions with a compatible data file format), you may just copy the data files directly. The files are usually kept in /var/lib/mysql/ on Unix systems.
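A sketch of that approach, assuming a Linux box where MySQL runs under systemd and the data directory is in the default location; stop the server first so the files on disk are consistent:
sudo systemctl stop mysql
sudo rsync -a /var/lib/mysql/ user@target-server:/var/lib/mysql/
sudo systemctl start mysql
(user@target-server is a placeholder; do the same stop/start on the target around the copy.)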