How can I export a single table from a MySQL database?

I need to take a single table, "nametable", from "MYSQLDATABASE1" together with all its privileges and relations, and import it into another database, "MYSQLDATABASE2".

This will dump the table from the specified database 'MYSQLDATABASE1':
mysqldump -u username -p MYSQLDATABASE1 nametable > nametable.sql
This will load the dumped table into the target database 'MYSQLDATABASE2':
mysql -u username -p MYSQLDATABASE2 < nametable.sql
As for the privileges and relationships, I think you will have to look those up and recreate them manually, table by table.
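For the privileges part, one sketch is to replay the account's grants on the other server using SHOW GRANTS; the account name 'appuser'@'localhost' here is a placeholder:

```shell
# Print the account's grants without column headers, append the missing
# trailing semicolons, and save them to a file
mysql -u root -p -N -e "SHOW GRANTS FOR 'appuser'@'localhost';" \
  | sed 's/$/;/' > grants.sql
# Replay the GRANT statements on the server hosting MYSQLDATABASE2
mysql -u root -p < grants.sql
```

SHOW GRANTS prints each statement without a terminating semicolon, which is why the sed step is there.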

Related

Importing a MySQL .sql file appears to randomly select which tables to import

I have database A. I issue this command against it:
mysqldump --host=localhost -uroot -p"mypassword" my_db_name > file.sql
Now I take this file to machine B, which is also running MySQL. I create a database:
create database newdb;
I then:
mysql --host=localhost -uroot -proot newdb < file.sql
My problem is that not all the tables that exist in file.sql are created in the new database! I clearly see CREATE TABLE `users` in the contents of file.sql, followed by thousands of INSERT statements for that table.
But the users table is never created in the new database. I am completely lost as to why.
If you have foreign keys, the tables might be created in the wrong order, and since the constraints can't be created, creating the table fails. Try adding SET FOREIGN_KEY_CHECKS=0 at the beginning of the dump and SET FOREIGN_KEY_CHECKS=1 at the end.
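A minimal sketch of applying that suggestion without hand-editing the dump (file.sql and newdb are the names from the question):

```shell
# Prepend FOREIGN_KEY_CHECKS=0 and append FOREIGN_KEY_CHECKS=1 around the dump
{
  echo 'SET FOREIGN_KEY_CHECKS=0;'
  cat file.sql
  echo 'SET FOREIGN_KEY_CHECKS=1;'
} > file_wrapped.sql
# Import the wrapped dump as before
mysql --host=localhost -uroot -proot newdb < file_wrapped.sql
```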
Delete the whole newdb database;
Restart mysqld;
Run mysqlcheck --repair --all-databases -u root -p on machine B;
Create newdb again (or maybe call it newdb2 just to be sure);
Delete file.sql on machine B, copy file.sql again from machine A and import it with mysql --host=localhost -uroot -proot newdb < file.sql;
Run SHOW ENGINE INNODB STATUS; and/or SHOW TABLE STATUS; and analyze the results.
Copy a CREATE TABLE that failed to work. In the command-line tool mysql, paste it. What messages, if any, do you get? Does it create the table?
Please provide that CREATE TABLE for us; there may be some odd clues.
Also provide the output of SHOW VARIABLES LIKE '%enforce%';
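To pull the failing CREATE TABLE out of the dump for pasting, a sed range is one sketch; this assumes mysqldump's usual layout, where the statement ends at the line starting with a closing parenthesis:

```shell
# Print everything from the CREATE TABLE line for `users`
# down to the closing ") ENGINE=...;" line
sed -n '/CREATE TABLE `users`/,/^)/p' file.sql
```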

How to export thousands of tables from large MySQL database

I have a WordPress multisite MySQL database with tens of thousands of tables, and I need to export several thousand of them so that I can import them into a new database.
I know that I can use mysqldump like this: "mysqldump -u user -p database_name table_1 table_2 table_3 > filename.sql", but what's the best way to make this scale? If it helps, the tables are named as follows: "wp_blogid_tablename", where blogid is the ID of the blog (there are around 1000 blogs to export), and tablename is one of many different table names, for example:
wp_8_commentmeta
wp_8_comments
wp_8_links
wp_8_options
wp_8_postmeta
wp_8_posts
wp_8_referer_blacklist
wp_8_referer_visitLog
wp_8_signups
wp_8_term_relationships
wp_8_term_taxonomy
wp_8_termmeta
wp_8_terms
You can try this, though I have not tested it -
mysqldump -u user -p database_name table_blogid_* > wp_blogid.sql
The first approach might not work, since mysqldump does not expand wildcards in table names. Anyway, here is another solution for you -
mysqldump DBNAME $(mysql -D DBNAME -Bse "show tables like 'wp_8_%'") > wp_8.sql
Or you can try this, get the tables into a file first -
mysql -N information_schema -e "select table_name from tables where table_schema = 'databasename' and table_name like 'wp_8_%'" > wp_8_tables.txt
Now execute the mysqldump command to export the tables -
mysqldump -u user -p database_name `cat wp_8_tables.txt` > wp_blogid.sql
My best solution so far was to create a shell script with an array of numbers (the blog IDs) that loops over them and uses the mydumper command to export the SQL tables to a single directory. The command looked like this:
mydumper --database="mydbname" --outputdir="/path/dir/" --regex="mydbname\.wp_${i}_.*"
(the ${i} is the blog id from the array loop)
Once completed, I was able to load all of the SQL files into the new database with the myloader command:
myloader --database="mynewdbname" --directory="/path/dir/"
The next challenge is to figure out how to DROP all of the exported tables from the original database...
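For the DROP part, one sketch (untested against a live server) is to reuse the wp_8_tables.txt list produced in the earlier answer and turn it into DROP statements with sed, reviewing the file before feeding it back to mysql; "mydbname" is the placeholder database name from above:

```shell
# Turn each table name into a DROP TABLE statement (backticks guard odd names)
sed 's/^/DROP TABLE `/; s/$/`;/' wp_8_tables.txt > drop_wp_8.sql
# Inspect drop_wp_8.sql carefully, then run it against the ORIGINAL database
mysql -u user -p mydbname < drop_wp_8.sql
```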

Restore MySQL table from backup

I have created a backup of a specific table from my database using the command below.
mysqldump -u root -p db_name table_name > table.sql
Is it possible to restore the specific table from the backup without affecting the rest of the tables? In other words, will only the data for the table in my backup file be affected?
The reverse will be:
mysql database_name < database_name.sql
But this is for the whole database. How do I do it with just the table backup?
While the answer given by "Pradeep Reddy" is absolutely correct, there is another way of doing it from inside the mysql prompt, using the SOURCE command.
mysql> USE database_name;
mysql> SOURCE /my_fullpath_to_backup_folder/table.sql
mysql -u root -p databasename < mytable.sql

How to import a mysqldump file into MySQL while excluding certain columns

I have a few large mysqldump files that I need to import into MySQL on my desktop MySQL server. Each dump file contains only one table with multiple columns. I want to import only two selected columns, as the other columns are not required.
Currently I have been importing the dump using the commands stated below, but this imports the table with all columns. I want to import only the selected columns.
CREATE DATABASE DatabaseName;
CONNECT DatabaseName;
SHOW TABLES;
SOURCE D:/DatabaseName
Please suggest.
There are three ways you can go about this:
The easiest and fastest way to do this would be to first import the full dump, then drop the column(s) you don't want:
mysql -u username -ppassword database_name < file_name.sql
mysql -u username -ppassword database_name -e 'ALTER TABLE table_name DROP COLUMN column_name'
If the column you want to remove is very large, you can use sed to preemptively (and quickly) remove the offending column. Something like this (the regex patterns will change based on the data you have in your table):
# First, remove the column name from the INSERT line
sed -i 's/`column_name`,//g' file_name.sql
# Then, remove the column value from the VALUES lines (illustrative pattern)
sed -i -E "s/([0-9]+,'[^']*',[0-9]+,)'[^']*',([0-9]+)/\1\2/g" file_name.sql
# Then, you can import your dump, entirely stripped of that column
mysql -u username -ppassword database_name < file_name.sql
More about sed: https://en.wikipedia.org/wiki/Sed
More about regex: https://en.wikipedia.org/wiki/Regular_expression
If you have access to the DB the dump was made from, you can remove the column from a temp table before dumping:
# First, create the temp table as a clone of your table and copy the data into it
mysql -u username -ppassword database_name -e 'CREATE TABLE table_name2 LIKE table_name; INSERT INTO table_name2 SELECT * FROM table_name; ALTER TABLE table_name2 DROP COLUMN column_name;'
# Dump ONLY the temp table, which is now missing the column
mysqldump -u username -ppassword database_name table_name2 > dump_excluding_column.sql
# Afterwards, delete the temporary table
mysql -u username -ppassword database_name -e 'DROP TABLE table_name2;'
When performing the import, you'll have to rename the table afterwards, e.g.:
# Run the import
mysql -u username -ppassword database_name < dump_excluding_column.sql
# Afterwards, rename the table to remove the "2" suffix
mysql -u username -ppassword database_name -e 'RENAME TABLE table_name2 TO table_name;'
If you need a dump with only the necessary columns:
1) create a temporary database
2) import the dump
3) with some DB GUI, or from the console, alter the tables and keep only the necessary columns
4) back up the resulting database
If you need to import only N columns directly into the database:
1) import the dump
2) with some DB GUI, or from the console, alter the tables and keep only the necessary columns
Simple as that.

How to export mysql database through command line, but ignore some specific table

I have a large MySQL database with almost 434 tables. I would like to export the database but ignore or skip a couple of tables. This needs to be done through the command line, because I have more than 6 GB of data in the database. What is the proper syntax to export all tables but ignore some specific ones?
mysqldump -u root -p database table1 table2 table3 > /var/www/mydb_tables.sql
This command works fine, but it is difficult to list all 434+ table names. I want a command that skips only the specific tables and exports all the remaining ones.
You can use --ignore-table to skip certain tables.
mysqldump -u username -p database --ignore-table=database.table1 --ignore-table=database.table2 > /var/www/mydb_tables.sql
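If the tables to skip follow a naming pattern, the --ignore-table flags can also be generated rather than typed out. A sketch, untested against a live server; the database name and the 'log\_%' pattern are placeholders:

```shell
# Build one --ignore-table flag per matching table, joined onto one line
IGNORES=$(mysql -N -u username -p -e "SELECT CONCAT('--ignore-table=', table_schema, '.', table_name)
  FROM information_schema.tables
  WHERE table_schema = 'database' AND table_name LIKE 'log\_%';" | tr '\n' ' ')
# Dump everything except the ignored tables
mysqldump -u username -p database $IGNORES > /var/www/mydb_tables.sql
```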