I am new to MySQL Workbench and need help importing a large file for analysis. I read in MySQL workbench table data import wizard extremely slow that the data import wizard shouldn't be used, so I used LOAD DATA INFILE as suggested, but I had issues importing due to secure_file_priv=NULL.
I found a solution in How should I tackle --secure-file-priv in MySQL?, but I don't know how to reset the value of secure_file_priv.
I found some sources that use the command line to do so, but I don't know how to use MySQL Workbench from the command line.
Any help on how to disable this or change it would be appreciated.
Don't touch it; it is basically a security feature.
Use the Data Import under Server in Workbench.
If you want to do it via SQL, everything is already written in How should I tackle --secure-file-priv in MySQL?, which you linked.
It's simple:
Copy your file into the folder that you get as the value when you run SHOW VARIABLES LIKE "secure_file_priv";
Now you can run LOAD DATA INFILE, ideally from the console, because then you have no timeout.
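A minimal sketch, assuming a hypothetical CSV file data.csv and target table my_table; adjust the path to whatever value SHOW VARIABLES returns on your machine, and the delimiters to match your file:
mysql> SHOW VARIABLES LIKE "secure_file_priv";
-- suppose it returns C:\ProgramData\MySQL\MySQL Server 8.0\Uploads\
mysql> LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/data.csv'
    -> INTO TABLE my_table
    -> FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    -> LINES TERMINATED BY '\n'
    -> IGNORE 1 LINES;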
Also read Bulk insert in the manual so that you can optimize a big data import.
If you really want to use an arbitrary folder, which is not recommended, you have to edit the my.ini file and restart the server (both can be done in Workbench as well; see Options File and Startup/Shutdown in the Navigator). There are also other ways to restart the service.
The basics of importing CSV files are well described here.
Related
I have a mysqldump from a SQL server and I want to open it in a program like MySQL Workbench or DBeaver so that I can easily search it, remove some values, etc.
I'm trying to use MySQL Workbench; however, I am unsure if I can import this SQL dump directly into it from the file like this. I have created a new model and clicked import, and it seems to show all my tables, however they are empty.
Is this possible, and how would I go about this?
MySQL Workbench can restore a dump without loading it first into an editor (and hence can even handle gigabyte-sized dumps). For this, go to the Data Import/Restore admin section, select your dump, set options (e.g. which parts of the dump to restore) and click Start Import to start the process.
However, this doesn't allow you to change the dump, and because of typical dump sizes even browsing them is often not possible. You can try to load the dump file into an editor if it is not too large (say around 250MB, depending on system RAM). If it is much larger, you can only try special tools like hex editors (which load large files piece by piece).
As far as I remember, mysqldump exports the database to an SQL script with a .sql file extension.
In MySQL Workbench, open this file using File -> Open SQL Script, or alternatively Ctrl+Shift+O.
Running the script should then create the database.
During development, the way our local WAMP servers get up-to-date data from the test server is that a dump of the database is made, and we load that dump using the source command to run the .sql file.
Recently, at the very end of the import, we have been getting errors about the #old variables, which store the original settings, like foreign key constraints, before they are changed (foreign key constraints are turned off so that the import doesn't throw errors when it recreates tables and attempts to create foreign keys while one of the tables has yet to be created). I have worked out that the cause is that the product table is getting more and more data, and at some point the session times out during the import.
I'm wondering what setting I can set (either as part of the SQL query or in the my.ini file) that will stop all timeouts, in effect making a session last forever while we are signed in.
Strategies for importing large MySQL databases
PHPMyAdmin Import
Chances are if you're reading this, PHPMyAdmin was not an option for your large MySQL database import. Nonetheless it is always worth a try, right? The most common cause of failure for PHPMyAdmin imports is exceeding the import limit. If you're working locally or have your own server, you can try changing the MySQL ini settings usually found in the my.ini file located in the MySQL install folder. If you're working with WAMP on Windows, you can access that file using the WAMP control panel under MySQL > my.ini. Remember to restart WAMP so your new settings will be used. Settings you may want to increase here include the following (see the sketch after this list):
max_allowed_packet
read_buffer_size
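Both of these can also be raised for the running server from the MySQL console if you want to test values before editing my.ini. A sketch with illustrative values, not recommendations; runtime changes revert on restart, so put the final values in my.ini:
mysql> SET GLOBAL max_allowed_packet = 268435456; -- 256M, applies to new connections
mysql> SET GLOBAL read_buffer_size = 1048576; -- 1M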
Even with enhanced MySQL import settings you may still find that imports time out due to PHP settings. If you have access to php.ini, you can edit the maximum execution time and related settings. In WAMP, access the php.ini file under the WAMP control panel at PHP > php.ini. Consider raising the limits on the following settings while trying large MySQL imports:
max_execution_time
max_input_time
memory_limit
Using Big Dump staggered MySQL dump importer
If basic PHPMyAdmin importing does not work, you may want to try the Big Dump script from Ozerov.de for staggered MySQL imports. What this useful script does is run your import in smaller blocks, which is exactly what is often needed to successfully import a large MySQL dump. It is a free download available at http://www.ozerov.de/bigdump/.
The process of using Big Dump is fairly simple: you basically position your SQL import file and the Big Dump script together on the server, set a few configs in the Big Dump script and then run the script. Big Dump handles the rest!
One key point about this otherwise great option is that it will not work at all on MySQL exports that contain extended inserts. So if you have the option to prevent extended inserts, try it, as sketched below. Otherwise you will have to use another method for importing your large MySQL file.
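If you control the export, a hedged sketch of producing a dump without extended inserts (db_name is a placeholder):
C:\> mysqldump --skip-extended-insert --user=root --password db_name > dump.sql
The --skip-extended-insert option writes one INSERT statement per row, which makes the dump larger but keeps it compatible with Big Dump.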
Go command line with MySQL console
If you’re running WAMP (and even if you’re not) there is always the option to cut to the chase and import your large MySQL database using the MySQL console. I’m importing a 4GB database this way as I write this post. Which is actually why I have some time to spend writing, because even this method takes time when you have a 4GB SQL file to import!
Some developers (usually me) are intimidated by opening up a black screen and typing cryptic commands into it. But it can be liberating, and when it comes to MySQL databases it is often the best route to take. In WAMP we access the MySQL console from the WAMP control panel at MySQL > MySQL Console. Now let's learn the 2 simple MySQL Console commands you need to import a MySQL database, command-line style:
use `db_name`
Command use followed by the database name will tell the MySQL console which database you want to use. If you have already set up the database to which you are importing, then you start by issuing the use command. Suppose your database is named my_great_database. In this case, issue the following command in the MySQL Console. Note that commands must end with a semi-colon.
mysql> use my_great_database;
source `sql_import_file.sql`
Command source followed by the location of an SQL file will import the SQL file into the database you previously specified with the use command. You must provide the path, so if you're using WAMP on your local server, start by putting the SQL file somewhere easy to get at, such as C:\sql\my_import.sql. The full command with this example path would be:
mysql> source C:\sql\my_import.sql;
After you run that command, the SQL file should begin to be imported. Let the queries run and allow the import to complete before closing the MySQL console.
Further documentation for MySQL command line can be found here: http://dev.mysql.com/doc/refman/5.5/en/mysql.html.
Another solution is to use MySQL Workbench.
This solution worked for me:
max_allowed_packet: increased to 8M
read_buffer_size: increased from 256 to 512
I am using the XAMPP control panel on localhost. After making the changes to the my.ini file in the MySQL config, don't forget to quit XAMPP (or WAMP) and restart it for the changes to take effect.
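To confirm the new values took effect after the restart, you can check them from the MySQL console:
mysql> SHOW VARIABLES LIKE 'max_allowed_packet';
mysql> SHOW VARIABLES LIKE 'read_buffer_size';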
(Four days of head-banging and I finally got it fixed!)
The symptom on import was: #2006 MySQL server has gone away. However, only 10 of 87 table rows were being imported.
Consider using MySQL Workbench; it's free and handles very large scripts very well (from the menu choose File -> Open SQL Script; if the file is large, it will ask you if you'd like to run it). It has served me well over the years when working with large SQL dumps.
I need suggestions on how to create an SQLite database from an old MySQL database using Windows 7. I have MySQL's .frm and .ibd files, and an SQL file (which I believe to be a dump, not a script, unfortunately). I tried to just use SQLite's .read and got a bunch of syntax errors about LOCK and UNLOCK, which is why I'm guessing it's a dump file. This is a 30-gig database, so recreating it by hand isn't really an option.
Is there any way for me to do something like export to a CSV, then import it into SQLite? I tried to use mysql2sqlite with Cygwin to convert it, and got: ./mysql2sqlite.sh: line 2: $'\r': command not found.
Any ideas?
There is a collection of tools on the official SQLite website.
Wiki Link
I've exported a MySQL v4.0.25 script to an SQL file, and since I can't find an installer for 4.0 anymore, the only option left is to use 4.1.
Now I'm getting the common 1064 error, since v4.0 doesn't have utf-8 (only latin-1) and v4.1 gives me a syntax error.
I'd be okay with editing the files manually, but one of the scripts is 12GB and the other one is 5GB, so I can't even find an editor able to open files that large, and I have a problem at hand with this migration (the files are that big because they are a copy of 2 production DBs with over 10 years of use).
How can I fix or bypass this problem? Any chance I can tell the import script to ignore the lines with errors (I don't even know how many there are)?
If it's still possible, dump the data structures as SQL and the data tables in CSV format using mysqldump --tab=path. This way, any modifications you need to make will be on the much smaller SQL file, keeping the large data files untouched. You could then import the whole thing using the mysqlimport command.
Alternatively, you could always use the mysql --force option when importing your SQL file.
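A rough command-line sketch of both approaches (db_name, the C:/dumps directory, and my_table are placeholders; the --tab directory must be writable by the MySQL server):
C:\> mysqldump --tab=C:/dumps --user=root --password db_name
C:\> mysqlimport --user=root --password db_name C:/dumps/my_table.txt
C:\> mysql --force --user=root --password db_name < my_import.sql
The first command writes one .sql schema file and one .txt data file per table into C:/dumps; mysqlimport then loads each .txt data file back in, and the last line shows the --force variant, which continues past statements that raise errors instead of aborting.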
More information:
MySQL Reference Manual: mysqldump --tab=path option
MySQL Reference Manual: mysqlimport
MySQL Reference Manual: mysql --force
For manually editing the files:
If you are using Linux as your operating system, then you have a wide variety of commands at hand: more, less, sed, etc. sed is good for substitutions like the one in your question. A nice tutorial can be found at http://www.grymoire.com/Unix/Sed.html
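As an illustration, here is a sed one-liner; the TYPE=MyISAM to ENGINE=MyISAM substitution is only an example of the kind of v4.0-era syntax you might have to rewrite, so check what your dump actually contains:
$ sed 's/TYPE=MyISAM/ENGINE=MyISAM/g' big_dump.sql > fixed_dump.sql
Since sed streams the file line by line, this works even on dumps far too large for any editor to open.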
On Windows, I sometimes use PowerShell. I had a similar post on Stack Overflow, "mysqldump without database name", where there is an example of how to replace a string in a dump file.