Importing a huge SQL file using phpMyAdmin - mysql

I am trying to import a 2 GB SQL file for a table. I tried using BigDump but it failed. Can anybody help me in this regard? I am using phpMyAdmin and have increased upload_max_filesize. Any help will be much appreciated.

How To
Make sure you change both *"post_max_size"* and *"upload_max_filesize"* in your "php.ini" (which is located where your "php.exe" is). Note that post_max_size needs to be at least as large as upload_max_filesize, because the uploaded file arrives inside the POST body.
The following example allows you to upload and import SQL files of up to 128 MB:
post_max_size=128M
upload_max_filesize=128M
Restart Apache and you're all set.
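If you're not sure which php.ini your installation actually reads, the PHP command line can report the one it loads (note this may be a different file than the one Apache uses; a phpinfo() page shows the web server's):
php --ini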
Alternatives
An alternative way to work around the problem is to use the command line. But it's a workaround, and as ugly as workarounds get:
C:\xampp\mysql\bin\mysql.exe -u root -p db_name < C:\some_path\your_sql_file.sql
As you're not going through the web server but using the command line directly, upload and POST size limits don't apply. That's why I call this a "workaround": it doesn't actually solve your original problem.
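For reference, the equivalent on Linux or macOS, assuming the mysql client is on your PATH (the paths here are placeholders):
mysql -u root -p db_name < /some_path/your_sql_file.sql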

I tried using the MySQL console as in this answer and it worked for me! (It will take a while.)

Related

How to execute a MySQL migration script in the MySQL shell?

Despite a comprehensive search, I couldn't find a definitive answer on how to run a MySQL script that sits in a specific directory directly in the MySQL shell.
The MySQL documentation states:
shell> mysql < text_file
So far so good, but the text file (*.sql) sits in a different directory than the MySQL instance, so that doesn't work so easily - unfortunately...
I hope someone can help me out.
Thank you in advance
GG
In the documentation you are quoting, "shell>" refers to a Unix prompt, not the MySQL shell.
In the MySQL shell, just do "source /path/to/file", with "/path/to" being the absolute path to the file.
So I found it out myself ;-)
Just add the source path of the script after the mysql prompt, like:
mysql> SOURCE C:\User\Folder\Folder...
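A complete invocation with a hypothetical path (note that forward slashes also work in SOURCE on Windows and avoid backslash-escaping issues):
mysql> SOURCE C:/Users/you/scripts/migration.sql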
Best GG

How come doing a find/replace on an SQL dump and importing it gives error #2006?

I have the SQL dump for a WordPress install that lives on a domain. I need to make some changes to the site, so I have set up a localhost using MAMP.
If I import the SQL dump as-is, it imports without a problem. However, I need to change all the URLs in the dump to point to localhost instead.
When I use Aptana to do a replace-all on the SQL file from http://example.com to http://localhost/example and try to import the modified file into MySQL, I get the error "#2006 - MySQL server has gone away".
What is the problem here? I have temporarily fixed it by overriding my hosts file so example.com points to my localhost, but this is not a viable long-term option.
I am aware this error usually occurs for files that are too large or when the server is not responding, but I am always able to import the non-modified version of the SQL. Also, there are 9538 replacements being done, so I cannot go through them one by one to find the culprit.
Thank you
(Just realized that this doesn't relate DIRECTLY to your problem, but you still need this info if you are doing search and replace in your MySQL dump.)
Data in the WP database is serialized, so you can't just search and replace: you can't change the data without re-serializing it.
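For example, PHP's serialization format stores each string with an explicit character count, which a plain-text replace leaves stale (illustrative values; the -- annotations are not part of the dump):
s:18:"http://example.com";          -- original: an 18-character string
s:18:"http://localhost/example";    -- naive replace: the prefix still says 18, so unserialize() fails
s:24:"http://localhost/example";    -- what a serialization-aware tool produces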
There are scripts and services that allow you to do a proper Search and Replace.
I usually use this tool (git repository here), and it works perfectly. There are also a couple of plugins that work, and I just discovered this service from a theme-creation company that does the same thing.
Good luck and happy wordpressing.

Export MySQL database schema and data using phpMyAdmin

I'm trying to export a database with phpMyAdmin, but the result is a file that can't be read. In phpMyAdmin I leave the options at their defaults and only add compression. After downloading, I'm unable to open the resulting .sql file (after extracting) in Geany and get a complaint about the file being in an unsupported encoding. Every time I do the export, the file seems to be of a random size, as if the process just stops somewhere.
It's a small database with Joomla stuff that needs to be moved to another site. I would like to use mysqldump but don't have SSH access. Other than the phpMyAdmin in cPanel I can't think of another way to access that data, but phpMyAdmin doesn't seem to be up to the job.
Is there any setting I should have a look at, or some other way to get that data exported?
Try disabling compression or using another compression method. There have been a few bugs in phpMyAdmin; maybe you are stumbling over one of them. Also try opening the SQL file in an ordinary text editor and checking the content; maybe you can get some more information about the problem you are experiencing.
If you have access to the command line, you can also try the following command: mysqldump -u <username> -p <database_name> > dumpfilename.sql
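If the transfer size is a concern, you can also compress on the fly, assuming gzip is available on the server (same placeholders as above):
mysqldump -u <username> -p <database_name> | gzip > dumpfilename.sql.gz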

Exporting Drupal site from localhost causes missing tables

I have a functional site on my local laptop that I am trying to port to a web host. I was successful in doing this once, which must have been a stroke of luck. When an issue forced me to wipe and rebuild, I found myself unable to migrate the site again. Now when I try, I get a bunch of errors about missing tables.
My process was as follows:
Export the MySQL database;
FTP the website files and the SQL dump;
Import the SQL dump into the new SQL database;
Visit the new IP address.
Was I just lucky, and is there any way I can fix it? I do not have that much content, but I would like to avoid recreating it all.
Please help!
Not sure how helpful I can be without knowing what the specific error messages are.
One thing I can think of is dropping the database and recreating it. Assuming you have command-line access:
mysqladmin -u root -p drop database_name
mysqladmin -u root -p create database_name
Then use mysql to import the new dump file. I find this is the safest way, as mysql sometimes complains about "duplicate tables" or something similar when you try to restore over the top of an existing database. I can't give any better advice without knowing what the specific errors are.
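That import step would look something like this (the dump file name is a placeholder):
mysql -u root -p database_name < site_dump.sql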
I suppose you have already done this, but also make sure your server meets the minimum Drupal requirements: https://drupal.org/requirements

Splitting MySQL database into separate tables

Well, I'm not much of a developer or a database expert, but I have a little understanding of these things. I'm trying to dump a database on a VPS using the "mysqldump" command, which works perfectly. But when I try to restore it locally after downloading the dump, it gives me a timeout error.
Can anyone advise me how to dump a database by splitting it into separate tables? The database I'm referring to is pretty large (6 - 7 GB). I actually tried searching, and it confuses me.. even this link here confuses me as to where to start.
Any help is highly appreciated.
Are you restoring with phpMyAdmin? If you try to upload the import, it is probably too large.
You can set a directory where the backup files are stored; then you can select the file in phpMyAdmin without uploading it.
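In phpMyAdmin that directory is the UploadDir directive in config.inc.php; a minimal sketch, where "upload" is a hypothetical directory (a relative path is resolved against the phpMyAdmin installation directory):
$cfg['UploadDir'] = 'upload';
Files placed in that directory on the server then appear in a drop-down on the Import page, bypassing the HTTP upload limits entirely.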
For the timeout when importing, you can increase the timeout settings, or use something like "BigDump".
If you're using mysqldump I'll assume you're familiar with the command line.
On your local machine, use
mysql -u [root] -p [database_name] -v < [database_dump.sql]
enter password: ********
The empty database needs to be created on your local machine first before you can import the structure and data into it (as simple as doing CREATE DATABASE [database_name];).
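If you'd rather create it from the command line without opening an interactive session (same [placeholder] convention as above):
mysql -u [root] -p -e "CREATE DATABASE [database_name];"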
The -v flag runs it in 'verbose' mode so you can see the queries as they run. Omitting '-v' will stop it filling your window with queries, but will also give you that 'is it working or not?' nervous feeling after a few minutes.
This will work on Windows as well as on Linux / Mac / anything else.
Replace my [placeholders] with your own values.
Thank you so much for all your answers! Well, what I was looking for was a dumping method or a similar script to dump the database table by table. Finally I tried dumping to an output file with a .txt extension, which succeeded.
Below is the command I used (I know it's a pretty long process, but I finally got all tables dumped):
mysqldump -u users -p database_name table_name > table_name.txt
I used the current directory for the output file, assuming I'm already in the directory where I want the dump. If you need to write the output file to a specific directory, use /path/to/the/dump/table_name.txt instead of just the table name. And make sure you don't put the password after -p: given alone, -p makes the client prompt for it, whereas anything after the space would be read as the database name. So leave it blank, type the password at the prompt, and it dumps to a text file.
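To avoid typing that once per table, a small shell loop can iterate over the table list. A sketch for a Unix shell, assuming credentials are supplied via a [client] section in ~/.my.cnf so that mysql and mysqldump don't prompt on every iteration (database_name is a placeholder):
for table in $(mysql -N -e 'SHOW TABLES' database_name); do
    mysqldump database_name "$table" > "$table.txt"
done
Here -N suppresses the column-name header so the loop sees only table names.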
I hope this helps!
Once again, thank you so much to the users who came to help me in the first place. :)