Browse and modify an SQL dump in MySQL Workbench - mysql

I have a mysqldump from an SQL server and I want to open it in a program like MySQL Workbench or DBeaver so that I can easily search it, remove some values, etc.
I'm trying to use MySQL Workbench, but I'm unsure whether I can import this SQL dump directly from the file. I have created a new model and clicked import, and it seems to show all my tables, but they are empty.
Is this possible, and how would I go about it?

MySQL Workbench can restore a dump without loading it into an editor first (and hence can handle even gigabyte-sized dumps). For this, go to the Data Import/Restore admin section,
select your dump, set options (e.g. which parts of the dump to restore) and click Start Import to start the process.
However, this doesn't let you change the dump, and because of typical dump sizes even browsing them is often not possible. You can try to load the dump file into an editor if it is not too large (say around 250 MB, depending on system RAM). If it is much larger, you can only try special tools like hex editors (which load large files piece by piece).
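If you only need to restore (not browse) and don't mind the command line, the stock mysql client can stream a dump of any size straight to the server. A minimal sketch, assuming the dump is called dump.sql, the target schema mydb already exists, and your account has the needed privileges:

mysql --user=youruser --password mydb < dump.sql

Because the file is read statement by statement, it never has to fit in memory at once.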

As far as I remember, mysqldump exports the database to an SQL script with a .sql file extension.
In MySQL Workbench, open this file using File->Open SQL Script, or alternatively Ctrl+Shift+O.
Running the script should then recreate the database.
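One caveat: a dump created without the --databases option usually contains no CREATE DATABASE or USE statement, so you may need to create and select the target schema before running the script. A minimal sketch, assuming a schema name of mydb:

CREATE DATABASE IF NOT EXISTS mydb;
USE mydb;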

Related

How to import MYSQL database from Windows to production server

Disclaimer: I'm a total greenhorn.
I'm working with PhpStorm on Windows, which offers a convenient way of creating and managing databases during development. Unfortunately, now that I want to push to production, uploading the database to Cloudways (a DigitalOcean server) doesn't seem that simple.
Cloudways' Database Manager has an import function that requires .gz (gzip) files. gzip files cannot be created from directories, but on Windows MySQL creates a directory for each database and fills it with table files (.ibd).
I've read that you can compress directories into .tar files first and then gzip them (database.tar.gz), and that's what I tried. But when I try to import that with the database manager, it fails (screenshot omitted).
Is there any way to do this?
Here's what you need to migrate your data to another MySQL server from your development machine: A text file named whatever.sql containing the definitions and contents of your tables (and views and stored functions and all your other database objects). This kind of text file is often called a mysqldump file. You'll find these files contain a whole mess of SQL statements, mostly CREATE TABLE and INSERT. They also contain some lines like
/*!40103 SET @OLD_TIME_ZONE=@@TIME_ZONE */;
These look like comments, but when you load the file they're handled as SQL statements, so leave them alone.
You can use gzip(1) on your file to make it smaller, and Cloudways will handle it correctly. Things will work either with or without gzip.
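Compressing the dump is a single command with gzip (this assumes a Unix-like shell, or gzip installed on Windows; it replaces the original file with the compressed one):

gzip whatever.sql

This produces whatever.sql.gz, which is the format the Cloudways importer expects.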
How to export
With PhpStorm, open up the database panel and right-click on your database name (not the server name itself, but your application's database). You'll see a menu item Export with mysqldump. Click it. You can keep the default checkbox settings.
Then give a filename for your output and run the export.
I use PhpStorm on Linux, so I'm not totally sure this export works on Windows. If it doesn't, download Ansgar Becker's free and excellent Windows MySQL client program called HeidiSQL. Right-click on the database name, then choose Export Database as SQL. Check the Create Tables box and choose Insert from the Data pulldown.
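If neither GUI cooperates, the same export can be done with mysqldump directly. A sketch with placeholder credentials and database name:

mysqldump --user=yourMySQLUserName --password --routines myDatabaseName > whatever.sql

The --routines flag includes stored procedures and functions; table definitions and row data are dumped by default.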
How to import to production
Log into the production MySQL server.
Create the database and choose it. Cloudways looks like it does that for you. If not, issue these SQL statements.
CREATE DATABASE myDatabaseName;
USE myDatabaseName;
Use an appropriate tool to run all the SQL in your whatever.sql file. Cloudways looks like it does that for you too. If not, this command line, or something similar, might work.
mysql --user=yourMySQLUserName --password=secret \
  --database=myDatabaseName \
  --host=cloudways.example.com < whatever.sql
Your migration will be complete.
Extra bonus: If your whatever.sql file contains the initial state of your production database, you can put it into git (or other source control) and use it whenever you deploy a new instance of your software package.
Don't try to copy those .ibd files and other files managed by MySQL to another machine. If you do, you'll be sorry.

secure_file_priv = NULL in MySQL Workbench on Mac

I am new to MySQL Workbench and need help importing a large file for analysis. I read here,
MySQL workbench table data import wizard extremely slow, that the data import wizard shouldn't be used, so I used LOAD DATA INFILE as suggested, but I had issues importing due to secure_file_priv=NULL.
I found a solution here, How should I tackle --secure-file-priv in MySQL?, but I don't know how to reset the value of secure_file_priv.
I found some sources that use the command line to do so, but I don't know how to use MySQL Workbench from the command line.
Any help on how to disable this or change it would be appreciated.
Don't touch it; it is basically a security feature.
Use the Data Import under Server in Workbench.
If you want to do it with SQL, everything is already written in How should I tackle --secure-file-priv in MySQL?, which you linked.
It's simple:
Copy your file into the folder that you get as the value when you run SHOW VARIABLES LIKE "secure_file_priv";
Now you can run LOAD DATA, ideally from the console, because then you have no timeout; see the sketch below.
Also read about bulk inserts in the manual so that you can optimize a big data import.
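A sketch of that workflow (the directory and table layout here are assumptions; use whatever folder SHOW VARIABLES reports on your machine):

SHOW VARIABLES LIKE "secure_file_priv";
-- suppose it returns /usr/local/mysql/data/files/; then:
LOAD DATA INFILE '/usr/local/mysql/data/files/mydata.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;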
If you really want to use an arbitrary folder, which is not recommended, you have to edit the options file (my.ini on Windows, my.cnf on Mac/Linux) and restart the server. Both can be done in Workbench as well, under Options File and Startup/Shutdown in the admin section. There are also other ways to restart the service.
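A minimal sketch of that edit (an empty string disables the restriction entirely, which is the least safe setting; pointing it at a specific directory is safer):

[mysqld]
secure_file_priv = ""

Restart the MySQL server afterwards for the change to take effect.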
The basics of importing CSV files are also described well in the linked question.

Importing 200k table [duplicate]

During development, the way our local WAMP servers get up-to-date data from the test server is that a dump of the database is made and we load that dump using the source command to run the .sql file.
Recently, at the very end of the import, we have been getting errors about the @old variables, which store the original settings (such as foreign key constraints) before they are changed; the dump turns foreign key constraints off so that the import doesn't throw errors when it recreates tables and creates foreign keys while one of the referenced tables has yet to be created. I have worked out that the cause is that the product table is getting more and more data, and at some point the session times out during the import.
I'm wondering what setting I can change (either as part of the SQL session or in the my.ini file) that will stop all timeouts, in effect making a session last forever while we are signed in.
Strategies for importing large MySQL databases
PHPMyAdmin Import
Chances are, if you're reading this, PHPMyAdmin was not an option for your large MySQL database import. Nonetheless, it is always worth a try, right? The most common cause of failure for PHPMyAdmin imports is exceeding the import limit. If you're working locally or have your own server, you can try changing the MySQL ini settings, usually found in the my.ini file located in the MySQL install folder. If you're working with WAMP on Windows, you can access that file using the WAMP control panel under MySQL > my.ini. Remember to restart WAMP so your new settings will be used. Settings you may want to increase here include the following (example values appear after the list):
max_allowed_packet
read_buffer_size
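In my.ini that might look like this (the values are illustrative, not recommendations; tune them to your machine and dump size):

[mysqld]
max_allowed_packet = 256M
read_buffer_size = 2M

Restart MySQL (or all of WAMP) after saving the file.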
Even with enhanced MySQL import settings, you may still find that imports time out due to PHP settings. If you have access to php.ini, you can edit the maximum execution time and related settings. In WAMP, access the php.ini file under the WAMP control panel at PHP > php.ini. Consider raising the limits on the following settings while trying large MySQL imports (again, example values follow the list):
max_execution_time
max_input_time
memory_limit
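A php.ini sketch (0 and -1 lift the time limits; the memory value is illustrative):

max_execution_time = 0 ; 0 = no time limit
max_input_time = -1    ; -1 = no limit
memory_limit = 512M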
Using Big Dump staggered MySQL dump importer
If basic PHPMyAdmin importing does not work, you may want to try the Big Dump script from Ozerov.de for staggered MySQL imports. What this useful script does is run your import in smaller blocks, which is exactly what is often needed to successfully import a large MySQL dump. It is a free download available at http://www.ozerov.de/bigdump/.
The process of using Big Dump is fairly simple: you basically position your SQL import file and the Big Dump script together on the server, set a few configs in the Big Dump script and then run the script. Big Dump handles the rest!
One key point about this otherwise great option is that it will not work at all on MySQL exports that contain extended inserts. So if you have the option to prevent extended inserts, try it (see the example below). Otherwise you will have to use another method for importing your large MySQL file.
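If you control the export, mysqldump can be told to write one INSERT statement per row instead of extended inserts. The resulting file is larger and slower to import, but Big Dump can split it:

mysqldump --skip-extended-insert mydatabase > mydatabase.sql

(The database name and output file here are placeholders.)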
Go command line with MySQL console
If you’re running WAMP (and even if you’re not) there is always the option to cut to the chase and import your large MySQL database using the MySQL console. I’m importing a 4GB database this way as I write this post. Which is actually why I have some time to spend writing, because even this method takes time when you have a 4GB SQL file to import!
Some developers (usually me) are intimidated by opening up a black screen and typing cryptic commands into it. But it can be liberating, and when it comes to MySQL databases it is often the best route to take. In WAMP, we access the MySQL console from the WAMP control panel at MySQL > MySQL Console. Now let's learn the two simple MySQL console commands you need to import a MySQL database, command-line style:
use `db_name`
Command use followed by the database name will tell the MySQL console which database you want to use. If you have already set up the database to which you are importing, then you start by issuing the use command. Suppose your database is named my_great_database. In this case, issue the following command in the MySQL Console. Note that commands must end with a semi-colon.
mysql> use my_great_database;
source sql_import_file.sql
Command source followed by the location of a SQL file will import the SQL file to the database you previously specified with the use command. You must provide the path, so if you’re using WAMP on your local server, start by putting the SQL file somewhere easy to get at such as C:\sql\my_import.sql. The full command with this example path would be:
mysql> source C:\sql\my_import.sql;
After you run that command, the SQL file should begin to be imported. Let the queries run and allow the import to complete before closing the MySQL console.
Further documentation for MySQL command line can be found here: http://dev.mysql.com/doc/refman/5.5/en/mysql.html.
Another solution is to use MySQL Workbench.
This solution worked for me:
max_allowed_packet: upped in size to 8M
read_buffer_size: upped from 256 to 512
Using Xampp control panel on localhost. After making the changes to the my.ini file in MySQL config, don’t forget to quit Xampp (or Wamp) and restart it for changes to take effect.
(Four days of head-banging and I finally got it fixed!)
Symptoms on import were: #2006 MySQL server has gone away. However, only 10 of the 87 tables were being imported.
Consider using MySQL Workbench; it's free and handles very large scripts very well (from the menu choose File -> Open SQL Script; if the file is large, it will ask if you'd like to run it). It has served me well over the years when working with large SQL dumps.

MySQL 5.0.22 export dump file not importing - syntax errors

I backed up my db with mysqldump from phpMyAdmin, using MySQL 5.0.22. I made no changes to the database file, but the import fails with error 1064. I found many instances of extra spaces using Notepad, but now I cannot find any other such extraneous spaces.
Any suggestions on how to import the file properly?
Thanks.
I encountered problems with mysqldump exports of entire databases that include views. So now I dump the tables and data as one dump, and export the views, stored routines and functions separately. I restore the tables first, then the views etc.
Having come from MS SQL Server and Oracle, I would like to know if there are any totally reliable tools out there for MySQL database backups and restores.
You have done several things wrong here:
Using phpMyAdmin for anything critical, especially backups. It is not production-ready, in my experience. Feel free to use it for unimportant read-only work on noncritical servers, however.
Editing mysqldump files with Notepad (or any other editor). Despite appearances, mysqldump files are not ordinary text files and should not be edited with any editor. They can contain binary data which is not valid in most character encodings, and therefore can probably not be loaded and saved without introducing errors.
Make a fresh dump using mysqldump, which is the only reliable way of making one, and import that. Do not edit mysqldump files using Notepad or any other text tool (this includes the likes of grep, sed etc.).
If you need to edit a mysqldump file, restore it into another (i.e. non-production) database instance, make the necessary changes using SQL commands, and re-dump the database. This may be slow, but it's reliable.
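A sketch of that round trip, with placeholder names throughout (scratch_db is a throwaway schema on a non-production server):

mysql -u root -p -e "CREATE DATABASE scratch_db;"
mysql -u root -p scratch_db < original_dump.sql
mysql -u root -p scratch_db -e "DELETE FROM some_table WHERE some_column = 'unwanted';"
mysqldump -u root -p scratch_db > cleaned_dump.sql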

How do I migrate a populated mySQL database from dev to a shared host?

The title pretty much says it all, but to elaborate: if I build a MySQL database on my local dev machine, populate it with data, and subsequently want to migrate the database to a shared host (in this case, Siteground), how do I do so in a way that keeps structure and data intact?
In this case, I don't have file access to the database server.
Use mysqldump (doc) to dump your database (mysqldump [databasename] for a simple configuration) on your development machine into a dump file (a file containing the SQL statements needed to recover both schema and data). Then import the dump on your shared host using the provided utilities (normally you get phpMyAdmin preinstalled from your hoster, which can import dumps).
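In practice that might look like the following (user, database, and file names are placeholders):

mysqldump -u devuser -p mydatabase > mydatabase.sql

The resulting mydatabase.sql is what you then upload through phpMyAdmin's Import tab on the host.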
In addition to the response made by theomega (namely, dump your development database and then load the dump into your production database), be aware that you may need to enable large SQL insert statements if you have a lot of data. I would recommend you first FTP the file to the host and then do the insert from the file. Each host has its own way of doing it, but if you can connect to the remote server using SSH, you can likely run the import from the command line.
Also, in addition to theomega's answer: most MySQL tools have dump/execute functions for SQL files.
If you're using Navicat, for example, you're just a right-click away:
Right-click on the database you want to export and choose "Dump SQL File". This will let you save the .sql file on your local drive in the folder of your choosing.
Then right-click on the destination database and choose "Execute Batch File". Browse to the newly created .sql file and it will execute all the SQL commands from that file in the destination database, creating a copy of the exported db.