Insert 30 million records from one MySQL table into another

I have a MySQL table with 30 million records.
It has about 20 columns, of which I will use 15 to insert into another table.
Now, I can't use PHP to load this large dataset (selecting 30 million rows and loading them into memory isn't feasible), so what would be the best method of loading all these records? MySQL 5.x.
I'm using EMS to connect to the database.

What about doing an INSERT INTO MySmallerTable SELECT Col1, col2, col3... FROM MyBiggerTable?
It might be worth breaking it into multiple INSERTs,
like:
INSERT INTO ... SELECT ... WHERE ID BETWEEN 1 AND 100000;
INSERT INTO ... SELECT ... WHERE ID BETWEEN 100001 AND 200000;
etc.
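For example, here is a minimal stored-procedure sketch of that batching loop. The table names are the illustrative ones from above, the destination column names are my assumption, and it assumes ID is a reasonably dense integer key (gaps only make some batches smaller):
DELIMITER //
CREATE PROCEDURE copy_in_batches()
BEGIN
    DECLARE batch_start INT DEFAULT 1;
    DECLARE max_id INT;
    SELECT MAX(ID) INTO max_id FROM MyBiggerTable;
    -- Copy 100,000 rows at a time so each statement stays small
    WHILE batch_start <= max_id DO
        INSERT INTO MySmallerTable (col1, col2, col3)
        SELECT Col1, col2, col3
        FROM MyBiggerTable
        WHERE ID BETWEEN batch_start AND batch_start + 99999;
        SET batch_start = batch_start + 100000;
    END WHILE;
END //
DELIMITER ;
Then run it with CALL copy_in_batches(); and drop the procedure afterwards.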

You can do this:
INSERT INTO new_table (`col1`,`col2`,`col3`)
SELECT `oldcol1`,`oldcol2`,`oldcol3`
FROM old_table LIMIT 0, 100000
and repeat it in a PHP loop, increasing the LIMIT offset each time. Note that without an ORDER BY clause, MySQL does not guarantee a stable row order between passes, so ordering by a unique key is safer.
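For illustration, the second iteration of that loop would run something like this (the ORDER BY column is my assumption, added so the pages don't overlap):
INSERT INTO new_table (`col1`,`col2`,`col3`)
SELECT `oldcol1`,`oldcol2`,`oldcol3`
FROM old_table
ORDER BY `oldcol1`        -- assumed unique here; any unique key works
LIMIT 100000, 100000;     -- second pass: skip the first batch, take the next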

There are a few ways you can do this, including the one user M_M provided above. I have not used EMS and am not sure what it can and can't do, but I have used Workbench extensively.
A.
Create the new destination table
Create a view on the source table with the columns of interest
Insert into the destination from the view with a simple INSERT INTO destination_table SELECT * FROM source_view (see the sketch after this list)
B.
Use mysqldump
Upload into a new table
Alter the table by dropping the columns you don't need
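A minimal sketch of option A in plain SQL; source_view, destination_table, source_table, and the column list are all illustrative names, not from the question:
CREATE VIEW source_view AS
SELECT col1, col2, col3   -- just the columns of interest
FROM source_table;

INSERT INTO destination_table
SELECT * FROM source_view;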

Related

Remove duplicated data from a very huge table

I have a table that contains more than 500 million records in a MySQL database,
and I need to remove the duplicates from it.
I tried this query on a table containing 20 million records and it was OK, but for the 500 million it takes a very long time:
-- Create temporary table
CREATE TABLE temp_table LIKE names_tbles;
-- Add constraint
ALTER TABLE temp_table ADD UNIQUE(name , family);
-- Copy data
INSERT IGNORE INTO temp_table SELECT * FROM names_tbles;
Is there a better solution?
One option is aggregation rather than INSERT IGNORE. That way, there is no need for the database to manage rejected records:
insert into temp_table(id, name, family)
select min(id), name, family
from names_tbles
group by name, family;
I would take it one step further and suggest adding the unique constraint only after the table is populated, so there is no need for the database to check for duplicates (the query already guarantees that), which should speed up the insert statement.
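Putting both suggestions together, the whole sequence would look like this (table and column names are the ones from the question):
-- Create the copy without the unique index
CREATE TABLE temp_table LIKE names_tbles;

-- Populate it via aggregation; no rejected rows to manage
INSERT INTO temp_table (id, name, family)
SELECT MIN(id), name, family
FROM names_tbles
GROUP BY name, family;

-- Add the constraint only once the data is in place
ALTER TABLE temp_table ADD UNIQUE (name, family);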

MySQL Large Insert Update Query

I have a MySQL 8 RDS (InnoDB) instance that I am trying to insert into and update.
The target table contains approx. 120m rows, and I am trying to insert 2.5m rows into it from a CSV. Some of the data in the source may already exist in the target table, which is constrained by a primary key; in that case the row should be updated.
Having done some research, I have found that the quickest way seems to be to bulk load the CSV into a temporary table, then run:
insert into target_table (col1, col2)
select a.col1, a.col2 from source_table a
on duplicate key update col1 = a.col1, col2 = a.col2;
However this seems to be taking hours.
Is there a best practice to optimise inserts of this sort?
Would it be quicker to split the operation into separate inserts and updates? Can I disable indexes on the target table (I know this is possible for MyISAM)?
Thanks
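For reference, a minimal sketch of that staging-table route end to end; the file path, staging table name, and column list are illustrative, not from the question:
-- Stage the CSV in a table shaped like the target
CREATE TABLE staging LIKE target_table;

LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE staging
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;

-- Upsert from staging into the target in one statement
INSERT INTO target_table (col1, col2)
SELECT s.col1, s.col2
FROM staging s
ON DUPLICATE KEY UPDATE col1 = s.col1, col2 = s.col2;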

How to insert 13 million records with selected columns from one table into another existing table?

I have two tables; one is the main table holding data, and I want to insert data into it from another existing table that has about 13 million records. I'm using this query to insert from the other table:
insert into table1 (column1, col2 ...) select col1, col2... from table2;
Unfortunately, the query fails with a lock wait timeout (Error 1205).
What is the best way to do this in the least time, without hitting the timeout?
If you have a primary key on table2, then you can use that for ordering and inserting in batches:
insert into table1 ( column1, col2 ...)
select col1, col2...
from table2
order by <primary key>
limit 0, 100000
Then repeat this for additional values. (Of course, the 100,000 is arbitrary. A larger value might work. A smaller value might be necessary.)
Another possibility is to remove all indexes and insert triggers from table1, try the insert without them, and then add them back after the new data is in the table.
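A variant I would also consider (my own sketch, not part of the answer above): as the OFFSET grows, MySQL still has to walk past all the skipped rows on every pass, so seeking on the key value directly stays fast. Assuming the primary key is an integer column named id:
SET @last_id = 0;

-- Repeat this block until the INSERT copies zero rows
SELECT MAX(id) INTO @next_id
FROM (SELECT id FROM table2
      WHERE id > @last_id
      ORDER BY id
      LIMIT 100000) AS batch;

INSERT INTO table1 (column1, col2)
SELECT col1, col2
FROM table2
WHERE id > @last_id AND id <= @next_id;

SET @last_id = @next_id;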

How to update all fields from one table to another table in the same database using MySQL?

There are 2 tables in the same database with the same structure. I want to copy all data from one table to the other using MySQL. The source table may have the same number of rows as the destination table, or fewer, or more.
I tried searching. I found 2 approaches:
Approach #1
TRUNCATE destination;
INSERT INTO destination SELECT * FROM source
Approach #2
DROP TABLE destination;
CREATE TABLE destination SELECT * FROM source
Isn't there any other approach involving UPDATE?
With UPDATE? I don't think so.
You can do an INSERT:
Insert into destination
(
    column_1,
    column_2,
    ....
)
SELECT
    column_1,
    column_2,
    ....
FROM source
Note: the number of columns listed for the destination must equal the number of columns selected from the source.
By the way, approach #1 will not always work,
while approach #2 always will.
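Putting the pieces of this answer together, approach #1 with an explicit column list looks like this (the column names are placeholders, as above):
TRUNCATE destination;

INSERT INTO destination (column_1, column_2)
SELECT column_1, column_2
FROM source;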

Transferring all rows of a MySQL table to another

I want to bulk insert all rows from one table into another. I am confused about how to use SELECT with INSERT. Is there a way to have a new table created automatically if it does not exist?
There are two ways to do this:
One is to use INSERT INTO ... SELECT, which will insert the resultset of your query into an existing table with the same data structure as your query:
INSERT INTO MyTable_Backup
SELECT * FROM MyTable
The other is to use CREATE TABLE ... SELECT ..., which will create a new table based on the data structure of your query and insert the resultset:
CREATE TABLE MyTable_Backup
SELECT * FROM MyTable;
However, one thing to note is that this will not match the indexes of the source table. If you need indexes, you need to add them manually.
Triggers and/or SELECT INTO are also recommended here.
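If the indexes matter, one standard alternative worth noting (my addition, not from the answers above) is CREATE TABLE ... LIKE, which copies the source's column definitions and indexes, followed by a plain INSERT ... SELECT:
CREATE TABLE MyTable_Backup LIKE MyTable;   -- same columns and indexes, no rows
INSERT INTO MyTable_Backup
SELECT * FROM MyTable;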