Facing an issue importing a large SQL dump - MySQL

I have a 22 MB SQL file (the Magento table "index_event"). When I try to import it into the MySQL database using MySQL Workbench, Workbench stops responding, so I'm not able to import it.
I have tried to split the statements manually, but some of the INSERT statements are very large and hard to split because each one is a single statement.
Can anyone please suggest how to tackle this situation?

Open a terminal and connect to MySQL using the command below.
mysql -u youruser -p
Now select the database into which you want to import the schema and data.
use your_db_name;
Now provide your SQL file using the command below.
source /home/user/yourdb.sql;
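If the file is too large for Workbench to handle, the same import can also be done in one step from the shell, without opening the MySQL prompt at all. A minimal sketch, reusing the hypothetical user, database and file names from the steps above:
mysql -u youruser -p your_db_name < /home/user/yourdb.sql
If the server rejects very large INSERT statements, the max_allowed_packet server setting may also need to be raised, but that is a server-side change rather than part of this command.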

Related

Why do all my procedures run when I import a mysqldump file?

I execute the following statement from the cmd terminal to import my MySQL Database:
mysql -u root -p database < "C:\Users\Tom\data.sql"
When I open my MySQL database from MySQL Workbench, I've realised that additional tables have been created that I don't recognise. Basically, the stored procedures/routines I have created seem to be running automatically and thus creating many more tables. I don't want this; I'd rather execute routines as I wish using CALL statements in MySQL. Is there a way to stop this from happening?

How to import the data from a data dump (SQLite3) into Django?

I have a data dump file (.sql) containing certain data, and I would like to import it into my database (currently SQLite3, but I'm thinking about changing to MySQL) in order to use it in my test website.
Also, I need to know if it's possible to add the models automatically too. I presume they need to be added manually, but anyhow, if there is a way to do it, please suggest it.
There is a way to generate the Django models automatically, given that you have an existing database (see the sketch below).
However, this is a shortcut. You may then fine-tune your models and add them to your apps as needed.
Structuring your models across apps might force you to use the model's db_table Meta option.
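A minimal sketch of that shortcut, assuming it refers to Django's inspectdb management command and that the existing database is already configured in settings.py (myapp is a hypothetical app name):
python manage.py inspectdb
python manage.py inspectdb > myapp/models.py
The first form prints the inferred model classes to the terminal; the second writes them into a models file you can then fine-tune.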
If at some point you would like to switch databases (SQLite3 -> MySQL), you can export (dump) your current data to JSON. Then you can import (load) the data into the new database (after creating the database tables with the migrate command). To do this you can use the Django management commands below:
Dump data
Load data
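A hedged example of those two commands, assuming a hypothetical data.json as the intermediate file and that the new database has already been created and migrated:
python manage.py dumpdata > data.json
python manage.py loaddata data.json
Run the first command while settings.py still points at the old database; then switch the DATABASES setting, run migrate, and run the second command against the new database.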
I found an alternative answer after researching a bit.
Since I have a PostgreSQL data dump file with the '.sql' extension, I was able to run a single command that imported the whole dump into my local PostgreSQL database. I'm using pgAdmin 4 as my database management tool; psql was installed along with pgAdmin 4, and I added it to my command prompt's PATH so it was accessible.
To import the data dump, I used the command provided below:
psql -U <username> -d <database_name> < <file.sql>
The '<' after <database_name> is necessary, so be sure to include it.
Here <username> is the username of the configured account, <database_name> is the database to which the data dump should be added, and <file.sql> is the file containing the data dump.
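As a concrete illustration with hypothetical values (user postgres, database testdb, and a dump file dump.sql in the current directory), the command would look like:
psql -U postgres -d testdb < dump.sql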

MySQL Workbench - How to clone a database on the same server with a different name?

I am using MySQL Workbench and I want to clone a database on the same server under a different name. It should duplicate all the table structures and data into the new database.
I know the usual way is probably to use Data Export to generate an SQL script of the database and then run the script against the new database, but I encountered some issues with it.
Anyway, is there a better or easier way to do so?
You can use the Migration Wizard from MySQL Workbench. Just choose the same local connection for both the source and target selection, then change the schema name in the manual editing step. If nothing appears in the manual editing step, click Next and the source and targets will appear. Click slowly on the source database name and edit it to the correct name. Go through to the end and voilà - you have two identical databases with different names. Note that you must have already created the target database and granted permissions on it to the MySQL Workbench user.
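As a hedged illustration of that last note, the target schema and the grant could be prepared beforehand with something along these lines, where db_clone and 'youruser'@'localhost' are hypothetical names and the user is assumed to already exist:
mysql -u root -p -e "CREATE DATABASE db_clone; GRANT ALL PRIVILEGES ON db_clone.* TO 'youruser'@'localhost';"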
I tried to do it in MySQL Workbench 8.0. However, I kept receiving an error regarding column statistics. The main idea is to use mysqldump.exe, located in the installation directory of MySQL Workbench, to export the data. So, assuming a Windows platform:
Open PowerShell and navigate to the mysqldump.exe directory. In my case the command is:
cd "C:\Program Files\MySQL\MySQL Workbench 8.0 CE"
Export the database by executing mysqldump with the right arguments:
./mysqldump.exe --host=[hostServerIP] --protocol=tcp --user=[nameOfUser] --password=[yourPassword] --dump-date=FALSE --disable-keys=FALSE --port=[portOfMysqlServer] --default-character-set=utf8 --skip-triggers --column-statistics=0 --result-file="[pathToExportedDataFile]" "[databaseName]"
Without changing the directory, import the exported file (.sql) using the following command in PowerShell:
Get-Content "[pathToExportedDataFile]" | ./mysql.exe --user=[nameOfUser] --password=[yourPassword] --port=[portOfMysqlServer] --host=[hostServerIP] --database=[nameOfNewDatabase] --binary-mode=1
You can check the documentation for more information regarding the mysqldump options.
Please note the following:
Do not forget to replace the values in [] with your own values and to remove the []. Do not remove the quotes ("") where they are present.
Do not swap PowerShell for cmd or something like Git Bash, since the above will not work there.
As far as step 3 is concerned, I created the new database from MySQL Workbench and then ran the PowerShell command.
First, create a new database using the CREATE DATABASE statement.
Second, export all the database objects and data of the database you want to copy using the mysqldump tool.
Third, import the SQL dump file into the new database.
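A minimal end-to-end sketch of those three steps from the command line, using the hypothetical names original_db and db_copy and a user with the required privileges:
mysql -u root -p -e "CREATE DATABASE db_copy;"
mysqldump -u root -p original_db > original_db.sql
mysql -u root -p db_copy < original_db.sql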

mysqldump.exe - how to have the database renamed in the dump file

I'm writing a simple utility that merges a few database dump files into a single one. I have a temporary database (let's name it 'db_temporary') and have to export it to a dump, but in the dump file it should be named 'db_final'. Can I do this using 'mysqldump.exe'? This seems like a trivial task, but I can't find any clue in the 'mysqldump' documentation here: https://dev.mysql.com/doc/refman/4.1/en/mysqldump.html
Big thanks for any help.
The dump does not contain the database name; it only contains the tables and data from the database you've dumped. So yes, you can import the resulting SQL code into any database you wish: just create a database and import the SQL code into that.
At least if you use it in this form:
mysqldump.exe -u USERNAME -p database > database_dump.sql
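To land the dump under the new name from the question, a hedged follow-up would be to create 'db_final' and then import the dump of 'db_temporary' into it:
mysql -u USERNAME -p -e "CREATE DATABASE db_final;"
mysql -u USERNAME -p db_final < database_dump.sql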

How do I export a MySQL database from PHPMyAdmin and import it to SQLite?

I would like to export a database from phpMyAdmin (or MySQL Workbench) and import it into a SQLite database so that I can do local editing and testing without screwing up the live version. I am very new to SQL, so all of the export options, etc., are rather dense to me at this point. I have tried using the default export settings in phpMyAdmin with the command
sqlite3 test_db.db < maindb.sql
as well as
sqlite> .read maindb.sql
But these throw a bunch of syntax errors and 'no such table' errors.
I have also tried the oft-cited mysql2sqlite script found here, but when I try to run it against an export from MySQL Workbench, using the command:
943776/mysql2sqlite.sh maindb.sql | sqlite3 test_db.sqlite
I get the following error:
mysqldump: Got error: 2002: Can't connect to local MySQL server through socket '/var/run/mysql.sock' (2) when trying to connect
Am I not configuring the exports correctly?
Please note that the referenced script connects to the database server itself. It does not expect a dump!
./mysql2sqlite -h example.com -u root -pMySecretPassWord myDbase | sqlite3 database.sqlite
This is the way the script should be executed: with the host, username, password and the MySQL database you would like to dump.
Since database dumps and DBMS features can differ significantly between systems (like MySQL and SQLite3), I would recommend installing a local MySQL server instead of using SQLite3. What have you gained by making changes in SQLite3 if you cannot apply them to the MySQL production database without modification?
An alternative solution is to export an SQL dump of your database and then import it into phpLiteAdmin. From there you can manage your SQLite database inside your browser. When you want to export it, just open the folder where the database is stored and copy the database file.
This solution does not require messing around with scripts, and it's especially handy if you're on a Mac and you're using MAMP, since phpLiteAdmin comes preinstalled with it.