I'm trying to import the .sql file from Open Product Data, the POD Database Dump. The site recommends using BigDump, but when I try to run the BigDump PHP script on my localhost, I get this message:
Stopped at the line 339.
At this place the current query includes more than 300 dump lines. That can happen if your dump file was created by some tool which doesn't place a semicolon followed by a linebreak at the end of each query, or if your dump contains extended inserts or very long procedure definitions. Please read the BigDump usage notes for more infos. Ask for our support services in order to handle dump files containing extended inserts.
How can I fix this?
I already tried opening the .sql file directly, but it always freezes or crashes on my Mac.
This is how the error happened: I was using the MySQL shell to import a 16 GB .sql file containing multiple tables, using the source command and optimizations (as in the first solution to this question, with the maximum values indicated in the MySQL documentation). The import seemed to run for a while, and when I checked after three days I found ERROR 2006: MySQL server has gone away. It appears to have failed in the middle of inserting rows into a table that already had hundreds of thousands of rows inserted.
I restarted the server and want to pick up where the import left off, so I don't have to redo all the work and possibly run into the same problem. I got stuck trying the following options:
1) Finding where the problem occurred through logs. Since I hadn't enabled error logging (reference here), I looked for binary logs. The SHOW BINARY LOGS command shows a list of logs as the documentation says it would, but I couldn't view the logs themselves, so I can't figure out where it went wrong (see the sketch after this list).
2) Insert-ignore: I tried the first and second solutions to the question import mysql data interrupted, how to resume?, but kept getting a syntax error when I tried to reference the full path to the .sql file, which is on an external drive.
3) Looking for an insert-ignore option that works with the source command, with which I've imported several other large .sql files; so far I haven't found one.
4) Running snippets of the .sql file using MySQL Workbench: it was unresponsive for a few hours and eventually loaded a blank window. I also tried opening the .sql file in Notepad, Atom, and Sublime; none of them could load it.
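For reference, the binary logs can be inspected once you know a file name from the SHOW BINARY LOGS output; a minimal sketch, where the log file name and path are placeholders:

-- Inside the MySQL shell: list the first events recorded in one binary log
SHOW BINLOG EVENTS IN 'binlog.000042' LIMIT 20;

# From the system shell: decode a binary log into readable SQL
mysqlbinlog /var/lib/mysql/binlog.000042 | tail -n 100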
My last resort would be to break up the .sql file into snippets and then copy and paste potentially hundreds of statements into the MySQL shell, but I'm hoping I don't have to go there (though the splitting can at least be automated, as sketched below).
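A sketch of automated splitting, assuming the dump keeps each statement on its own line (file names are placeholders):

# Split the dump at line boundaries into pieces named part_aa, part_ab, ...
split -l 1000000 huge_dump.sql part_

Splitting by bytes (split -b) could cut a statement in half, so line-based splitting is safer; each piece can then be replayed in order with the source command.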
Any suggestions? I'm using MySQL Community Edition 8.0, and I'm also a SQL newbie, so I might be missing something really obvious. Thanks in advance!
EDIT -
On 2): I figured out how the syntax works with the full path in INSERT IGNORE after rereading the documentation for the query. But the LOAD DATA command has directory restrictions and also unique-key constraints that I might not be able to satisfy.
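Note that LOAD DATA accepts an IGNORE keyword that skips rows violating a unique key; a minimal sketch with placeholder file and table names:

LOAD DATA INFILE '/path/to/products.csv'
IGNORE INTO TABLE products
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';

The directory restriction comes from the secure_file_priv server variable; SHOW VARIABLES LIKE 'secure_file_priv'; shows which directory (if any) the server will read files from.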
I have a mysqldump from a SQL server, and I want to open it in a program like MySQL Workbench or DBeaver so that I can easily search it, remove some values, etc.
I'm trying to use MySQL Workbench, but I'm unsure whether I can import this SQL dump directly from the file like this. I have created a new model and clicked import, and it seems to show all my tables, but they are empty.
Is this possible, and how would I go about it?
MySQL Workbench can restore a dump without loading it first into an editor (and hence can even handle gigabyte-sized dumps). For this, go to the Data Import/Restore admin section,
select your dump, set options (e.g. which parts of the dump to restore), and click Start Import to start the process.
However, this doesn't allow you to change the dump, and because of typical dump sizes even browsing them is often not possible. You can try to load the dump file into an editor if it is not too large (say, around 250 MB, depending on system RAM). If it is much larger, you can only try special tools such as hex editors (which load large files piece by piece).
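If the goal is just to locate or trim a few statements, command-line tools can browse a dump of any size without loading it whole; a small sketch (dump.sql is a placeholder):

# Show the first lines of the dump
head -n 50 dump.sql

# List the line numbers where tables are created or populated
grep -nE 'CREATE TABLE|INSERT INTO' dump.sql | head -n 40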
As far as I remember, mysqldump exports the database to an SQL script with a .sql file extension.
In MySQL Workbench, open this file using File > Open SQL Script, or alternatively Ctrl+Shift+O.
Running the script should then create the database.
I am trying to import my backup .sql file into MySQL. I tried using phpMyAdmin, but it didn't allow a file of this size.
My file is 243 MB, and I've made changes in php.ini. The problem is that out of 1.1M records, only 600k are imported. I also tried the MySQL console from the command line, and the same thing happened there. I'm using MySQL Workbench, and it showed me the error below.
ERROR at line 695259: Unknown command '\a'.
Finished executing script
Operation failed with exitcode 1
What does this mean? How do I import my full data?
The presence of the <br/> tag at the tail of your dump file indicates that you probably used phpMyAdmin to create it and that the page timed out while the dump was in progress. Thus the error is not in your import process but in your dump process.
How can this be fixed? A global solution is to increase the PHP script execution time with set_time_limit; an alternative is to use the mysql console client to create the dump, bypassing phpMyAdmin.
A somewhat tedious but workable solution is to dump one table at a time, as sketched below. This doesn't always work; for example, if one table makes up most of the size of the database, it might still cause a timeout.
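A minimal sketch of the per-table approach, with placeholder credentials and names:

# Dump each table to its own file so no single export runs long enough to time out
mysqldump -u db_user -p shop_db products > products.sql
mysqldump -u db_user -p shop_db orders > orders.sql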
It seems the dump was not created properly; there are a few errors in your SQL file.
During development, the way our local WAMP servers get up-to-date data from the test server is that a dump of the database is made, and we load that .sql file using the source command.
Recently, at the very end of the import, we have been getting errors about the #old variables, which store the original settings, such as foreign-key checks, before they are changed (foreign-key checks are turned off so that the import doesn't throw errors when it recreates tables and attempts to create foreign keys while one of the referenced tables has yet to be created). I have worked out that the cause is that the product table holds more and more data, and at some point the session times out during the import.
I'm wondering what setting I can change (either as part of the SQL session or in the my.ini file) to stop all timeouts, in effect making a session last forever while we are signed in.
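For reference, the session timeouts involved are usually wait_timeout and interactive_timeout; a hedged my.ini sketch, with example values only:

[mysqld]
# Seconds an idle connection may sit before the server drops it (one day here; raise as needed)
wait_timeout = 86400
interactive_timeout = 86400
# Large imports also commonly need a bigger packet ceiling
max_allowed_packet = 256M

The same variables can also be raised for just the current session, e.g. SET SESSION wait_timeout = 86400;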
Strategies for importing large MySQL databases
PHPMyAdmin Import
Chances are if you’re reading this, PHPMyAdmin was not an option for your large MySQL database import. Nonetheless it is always worth a try, right? The most common cause of failure for PHPMyAdmin imports is exceeding the import limit. If you’re working locally or have your own server, you can try changing the MySQL ini settings usually found in the my.ini file located in the MySQL install folder. If you’re working with WAMP on Windows, you can access that file using the WAMP control panel under MySQL > my.ini. Remember to restart WAMP so your new settings will be used. Settings you may want to increase here include:
max_allowed_packet
read_buffer_size
Even with enhanced MySQL import settings you may still find that imports time out due to PHP settings. If you have access to PHP.ini, you can make edits to the maximum execution time and related settings. In WAMP, access the PHP.ini file under the WAMP control panel at PHP > php.ini. Consider raising the limits on the following settings while trying large MySQL imports:
max_execution_time
max_input_time
memory_limit
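A hedged sketch of what those edits might look like (the values are examples only; size them to your dump and hardware):

; my.ini (restart MySQL/WAMP afterwards)
max_allowed_packet = 256M
read_buffer_size = 2M

; php.ini (restart Apache/WAMP afterwards)
max_execution_time = 3600
max_input_time = 3600
memory_limit = 512M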
Using Big Dump staggered MySQL dump importer
If basic PHPMyAdmin importing does not work, you may want to try the Big Dump script from Ozerov.de for staggered MySQL imports. What this useful script does is run your import in smaller blocks, which is exactly what is often needed to successfully import a large MySQL dump. It is a free download available at http://www.ozerov.de/bigdump/.
The process of using Big Dump is fairly simple: you basically position your SQL import file and the Big Dump script together on the server, set a few configs in the Big Dump script and then run the script. Big Dump handles the rest!
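The configs in question are a handful of variables near the top of bigdump.php; the names below match the copies I have seen, but check your version of the script:

// bigdump.php: connection settings and the dump to import
$db_server   = 'localhost';
$db_name     = 'my_great_database';
$db_username = 'db_user';
$db_password = 'db_pass';
$filename    = 'my_import.sql'; // leave empty to pick the file in the browser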
One key point about this otherwise great option is that it will not work at all on MySQL exports that contain extended inserts. So if you have the option to prevent extended inserts, try it; otherwise you will have to use another method for importing your large MySQL file.
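If you control the export, one way to prevent extended inserts is mysqldump's --skip-extended-insert flag, which writes one INSERT per row (at the cost of a larger, slower dump); names here are placeholders:

mysqldump --skip-extended-insert -u db_user -p my_great_database > my_import.sql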
Go command line with MySQL console
If you’re running WAMP (and even if you’re not) there is always the option to cut to the chase and import your large MySQL database using the MySQL console. I’m importing a 4GB database this way as I write this post. Which is actually why I have some time to spend writing, because even this method takes time when you have a 4GB SQL file to import!
Some developers (usually me) are intimidated by opening up a black screen and typing cryptic commands into it. But it can be liberating, and when it comes to MySQL databases it is often the best route to take. In WAMP we access the MySQL console from the WAMP control panel at MySQL > MySQL Console. Now let's learn the 2 simple MySQL Console commands you need to import a MySQL database, command-line style:
use `db_name`
Command use followed by the database name will tell the MySQL console which database you want to use. If you have already set up the database to which you are importing, then you start by issuing the use command. Suppose your database is named my_great_database. In this case, issue the following command in the MySQL Console. Note that commands must end with a semi-colon.
mysql> use my_great_database;
mysql> source sql_import_file.sql
Command source followed by the location of a SQL file will import the SQL file to the database you previously specified with the use command. You must provide the path, so if you’re using WAMP on your local server, start by putting the SQL file somewhere easy to get at such as C:\sql\my_import.sql. The full command with this example path would be:
mysql> source C:\sql\my_import.sql;
After you run that command, the SQL file should begin to be imported. Let the queries run and allow the import to complete before closing the MySQL console.
Further documentation for MySQL command line can be found here: http://dev.mysql.com/doc/refman/5.5/en/mysql.html.
Another solution is to use MySQL Workbench.
This solution worked for me:
max_allowed_packet: increased the size to 8M
read_buffer_size: increased from 256 to 512
I was using the XAMPP control panel on localhost. After making the changes to the my.ini file in the MySQL config, don't forget to quit XAMPP (or WAMP) and restart it for the changes to take effect.
(Four days of head-banging and I finally got it fixed!)
The symptom on import was: #2006 MySQL server has gone away. However, only 10 of the 87 table rows were being imported.
Consider using MySQL Workbench; it's free and handles very large scripts well (from the menu choose File > Open SQL Script; if the file is large, it will ask whether you'd like to run it). It has served me well over the years when working with large SQL dumps.
I have a local database that's about 1 GB, and my remote host is a free host that I am using for testing; I want to make sure everything works before I spend money on a paid host. The problem is that phpMyAdmin on the remote server only allows 50 MB files, which just doesn't cut it, especially since the restore usually fails due to execution-time limits. Below is a list of everything I've tried.
LOCAL
phpmyadmin -----> backing up tables no longer works because of timeouts, even with modified php.ini settings, because of the sheer size of the db
mysqldumper -----> the program creates dumps with plain inserts; there is no option to make it create insert ignores. I'll explain the problem below.
mysqlworkbench -----> creates the database using the database name of my local server (the problem is that my remote server has a different database name, and I can't open a 1 GB .sql file to edit the database name at the very top; the computer just craps out and I have to force-quit Workbench)
sqlsplitter (Mac program) -----> cuts up large .sql or .sql.gz files
REMOTE
phpmyadmin with .gz/.sql files cut up into 20 MB chunks -----> timeout. phpmyadmin's resume function doesn't work either; it just overwrites old data
mysqldumper -----> the process ends in an error randomly midway through my restore on the remote server, using a backup created with mysqldumper on my local computer (single file or multipart, neither works). It could be at 10% completion or at 50%.
bigdump -----> used single and multipart dumps from mysqldumper; same problem, it randomly quits halfway through. Some multiparts completed successfully, but when one failed and I tried the failed part again, it would give me an error saying a unique key already exists in the table. I don't want to unset all my unique keys and have to go through and delete all the duplicates later.
mysqldumper -----> does not work with a dump from mysqlworkbench
bigdump -----> gives me an SQL error, denied for creating database, when using a dump from mysqlworkbench (I cannot open a 1 GB file to delete the one line that says create database)
Does anybody know of a better method to upload to my host? I have no command-line access there and only a 500 MB space limit (no limit on SQL space, though).
Thanks
Use mysqldump. Figure out what the error you're seeing is, and fix it. The mysqldump utility works. I've restored dumpfiles with hundreds of gigabytes of data to servers, and never use anything else. If it doesn't work for you, you're doing something incorrectly.
You can prevent it from writing a USE database-name; statement at the top of the file by invoking it with the database name as the last argument, without using the --databases option before it.
You can add the --insert-ignore command-line option to write all the INSERT statements as INSERT IGNORE, to work around your partial-insert issues.
You can use --no-data to extract a dump file that contains table definitions, not data, and get all of the tables declared first.
You can use the --no-create-info option to extract a dump file with just the inserts, not the table definitions.
http://dev.mysql.com/doc/refman/5.6/en/mysqldump.html
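Putting those flags together, a hedged two-pass export might look like this (user and database names are placeholders):

# Pass 1: table definitions only, with no USE statement (database name is the last argument)
mysqldump -u db_user -p --no-data old_db > schema.sql

# Pass 2: data only, with every INSERT written as INSERT IGNORE
mysqldump -u db_user -p --no-create-info --insert-ignore old_db > data.sql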
You can also use a simple bash loop to extract each table into its own file, so you have smaller files to work with:
# -N suppresses the column header line so it is not mistaken for a table name
for TABLE in $(mysql [args] -N -e 'show tables in database_name'); do mysqldump [args] database_name "$TABLE" > "$TABLE.sql"; done
When restoring the files, add the --compress option to the mysql command-line arguments for a faster transfer, and specify your (new) database name as the last argument, so the client uses the correct database before applying the file, which no longer contains the database name.
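A hedged example of that restore step, run from your local machine and assuming the host allows remote MySQL connections (host and names are placeholders):

# Load the schema first, then the data, into the remote database under its new name
mysql -h remote.example.com -u db_user -p --compress new_db_name < schema.sql
mysql -h remote.example.com -u db_user -p --compress new_db_name < data.sql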