mysql> create database lib;
Query OK, 1 row affected (0.31 sec)
mysql> use lib;
Database changed
mysql> create table library_2
-> (id int AUTO_INCREMENT primary key,
-> Book_name varchar(20),
-> Details varchar(50));
Query OK, 0 rows affected (2.24 sec)
mysql> insert into library_2 values
-> (1,'aaa','bbb'),
-> (2,'ccc','ddd'),
-> (3,'eee','fff'),
-> (4,'ggg','hhh');
Query OK, 4 rows affected (0.46 sec)
Records: 4 Duplicates: 0 Warnings: 0
mysql> select*from library_2;
+----+-----------+-------------------+
| id | Book_name | Details |
+----+-----------+-------------------+
| 1 | aaa | bbb |
| 2 | ccc | ddd |
| 3 | eee | fff |
| 4 | ggg | hhh |
+----+-----------+-------------------+
4 rows in set (0.00 sec)
mysql> create table library_audit2
-> (id int AUTO_INCREMENT primary key,
-> Book_Name varchar(20) not null,
-> Details varchar(50) default null,
-> change_date date,
-> library_id int,
-> foreign key(library_id) REFERENCES library_2(id));
Query OK, 0 rows affected (2.33 sec)
mysql> insert into library_audit2 values
-> (10,'aaa','bbb','2011-9-1',1),
-> (20,'ccc','ddd','2012-8-2',2),
-> (30,'eee','fff','2013-7-3',3),
-> (40,'ggg','hhh','2014-6-4',4);
Query OK, 4 rows affected (0.20 sec)
Records: 4 Duplicates: 0 Warnings: 0
mysql> select*from library_audit2;
+----+-----------+---------+-------------+------------+
| id | Book_Name | Details | change_date | library_id |
+----+-----------+---------+-------------+------------+
| 10 | aaa | bbb | 2011-09-01 | 1 |
| 20 | ccc | ddd | 2012-08-02 | 2 |
| 30 | eee | fff | 2013-07-03 | 3 |
| 40 | ggg | hhh | 2014-06-04 | 4 |
+----+-----------+---------+-------------+------------+
4 rows in set (0.00 sec)
mysql> DELIMITER $$
mysql> create trigger BeforeLibraryDelete1
-> BEFORE DELETE
-> ON library_audit2 FOR EACH ROW
-> BEGIN
-> declare id1 int;
-> select library_id into id1 from library_audit2 where change_date=OLD.change_date;
-> delete from library_2 li where li.id=id1;
-> END $$
Query OK, 0 rows affected (0.45 sec)
mysql> DELIMITER ;
mysql> Delete from library_audit2 where change_date='2011-09-01';
ERROR 1451 (23000): Cannot delete or update a parent row: a foreign key constraint fails (`abc`.`library_audit2`, CONSTRAINT `library_audit2_ibfk_1` FOREIGN KEY (`library_id`) REFERENCES `library_2` (`id`))
I know what this error means, but I need a different trigger query to rectify this problem. This seems to end up wrong no matter what I try. Kindly provide me with a query that works. Also, because MySQL doesn't support INSTEAD OF, don't provide me with a query that has INSTEAD OF DELETE in it; a MySQL replacement for it would be highly appreciated.
You are running in an endless loop.
You have a delete trigger on the table library_audit2, and in that trigger you delete from the same table, which invokes the trigger again, and so on.
The DB won't allow that and returns that error message.
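One possible replacement (a sketch, not tested against your exact data): make it an AFTER DELETE trigger and use OLD.library_id directly, so the trigger never has to query its own table and the audit row is already gone by the time the parent row is deleted:
DELIMITER $$
CREATE TRIGGER AfterLibraryDelete1
AFTER DELETE ON library_audit2
FOR EACH ROW
BEGIN
    -- OLD.library_id is available directly, so there is no need to select from library_audit2
    DELETE FROM library_2 WHERE id = OLD.library_id;
END $$
DELIMITER ;
This can still raise error 1451 if other library_audit2 rows reference the same library_2 id; in that case those audit rows have to go first, or the foreign key needs ON DELETE CASCADE and the delete has to be issued against library_2 instead.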
So I have a table where a column with an auto_increment value accidentally got started from 300 instead of 1, 2, 3, 4... I'm a beginner and I do not know how to change it back to 1, 2, 3, 4... (screenshot of table)
How do I change the 307, 308 to 1, 2, 3, 4...?
I tried to update the table but that did not work.
Step-1) First take a backup of your table data.
Step-2) Truncate the table using the SQL query below.
TRUNCATE TABLE [Your_Table_Name];
Step-3) Then insert the data back into your table from the backup, as sketched below.
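Here is a sketch of those three steps in SQL; the table t and its column val are hypothetical stand-ins for your own names:
-- Step 1: back up the data
CREATE TABLE t_backup AS SELECT * FROM t;
-- Step 2: truncate, which also resets AUTO_INCREMENT to 1
TRUNCATE TABLE t;
-- Step 3: re-insert everything except the old id, letting AUTO_INCREMENT renumber the rows
INSERT INTO t (val) SELECT val FROM t_backup;
-- Optional: drop the backup once the result looks right
DROP TABLE t_backup;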
Alter the table to drop the auto_increment, update the ids, then alter the table again to add the auto_increment back:
drop table if exists t;
create table t
( id int auto_increment primary key, val int);
insert into t values
(307,1),(308,1),(309,1),(310,1),(311,1);
alter table t
modify column id int;
#drop primary key;
show create table t;
update t
set id = id - 306;
alter table t
modify column id int auto_increment;
show create table t;
https://dbfiddle.uk/eBQh6cj8
With MySQL 8.0 you can use a window function to calculate the row numbers and then update the table:
mysql> select * from t;
+-----+------+
| id | val |
+-----+------+
| 307 | 1 |
| 308 | 1 |
| 309 | 1 |
| 310 | 1 |
| 311 | 1 |
+-----+------+
mysql> with cte as ( select id, row_number() over () as rownum from t )
-> update t join cte using (id) set id = rownum;
Query OK, 5 rows affected (0.00 sec)
Rows matched: 5 Changed: 5 Warnings: 0
mysql> select * from t;
+----+------+
| id | val |
+----+------+
| 1 | 1 |
| 2 | 1 |
| 3 | 1 |
| 4 | 1 |
| 5 | 1 |
+----+------+
Then make sure the next id won't be a high value:
mysql> alter table t auto_increment=1;
You can try to set the auto_increment to 1; MySQL automatically advances it to the highest id value in the table, plus 1.
Be aware that this doesn't guarantee subsequent rows will use consecutive values. You can get non-consecutive values if:
You insert greater values explicitly, overriding the auto-increment.
You roll back transactions. Id values generated by auto-increment are not recycled if you roll back.
You delete rows.
Occasionally InnoDB will skip a number anyway. It does not guarantee consecutive values — it only guarantees unique values. You should not rely on the auto-increment to be the same as a row number.
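To illustrate that last point, a quick sketch against the t table above (behavior shown assumes InnoDB on MySQL 8.0, where the counter persists):
DELETE FROM t WHERE id = 5;      -- remove the row with the highest id
INSERT INTO t (val) VALUES (1);  -- the new row gets id 6, not 5; the deleted value is not reused
SELECT MAX(id) FROM t;           -- 6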
Here is one approach to your problem.
Please take note of the following points before proceeding:
Take a backup of your table in case things do not go as expected.
The test case below was performed on MySQL 5.7 with the MyISAM engine.
Step 1: Generate a dummy test table as per your test case.
mysql> CREATE TABLE t (
-> `Id` int(11) NOT NULL AUTO_INCREMENT,
-> `product_id` int(11) DEFAULT 0,
-> PRIMARY KEY (`Id`)
-> ) ENGINE=MyISAM;
Query OK, 0 rows affected (0.03 sec)
-- Inserting dummy data
mysql> INSERT INTO t VALUES (300,1);
Query OK, 1 row affected (0.00 sec)
mysql> INSERT INTO t VALUES (302,1);
Query OK, 1 row affected (0.00 sec)
mysql> INSERT INTO t VALUES (305,1);
Query OK, 1 row affected (0.00 sec)
-- Checking auto_increment value
mysql> show create table t;
+-------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Table | Create Table |
+-------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| t | CREATE TABLE `t` (
`Id` int(11) NOT NULL AUTO_INCREMENT,
`product_id` int(11) DEFAULT '0',
PRIMARY KEY (`Id`)
) ENGINE=MyISAM AUTO_INCREMENT=306 DEFAULT CHARSET=latin1 |
+-------+-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)
mysql> INSERT INTO t (product_id) VALUES (2);
Query OK, 1 row affected (0.01 sec)
-- Below is the resultant table for which we need Id starting from 1,2,3 and so on...
mysql> SELECT * FROM t;
+-----+------------+
| Id | product_id |
+-----+------------+
| 300 | 1 |
| 302 | 1 |
| 305 | 1 |
| 306 | 2 |
+-----+------------+
4 rows in set (0.00 sec)
Step 2: Remove AUTO_INCREMENT from the column and set the Ids manually.
-- Remove AUTO_INCREMENT
mysql> ALTER TABLE t MODIFY COLUMN Id int(11) NOT NULL;
Query OK, 4 rows affected (0.00 sec)
Records: 4 Duplicates: 0 Warnings: 0
-- Set the Id manually starting from 1
mysql> SET @i = 0; UPDATE t SET id = @i := @i + 1;
Query OK, 0 rows affected (0.00 sec)
Query OK, 5 rows affected (0.00 sec)
Rows matched: 5 Changed: 5 Warnings: 0
-- Below is the updated table with Id starting from 1,2,3 and so on...
mysql> SELECT * FROM t;
+----+------------+
| Id | product_id |
+----+------------+
| 1 | 1 |
| 2 | 1 |
| 3 | 1 |
| 4 | 2 |
| 5 | 2 |
+----+------------+
5 rows in set (0.00 sec)
Step 3: Enable AUTO_INCREMENT again for future record insertions.
-- Enable AUTO_INCREMENT again for future record insertions.
mysql> ALTER TABLE t MODIFY COLUMN Id int(11) NOT NULL AUTO_INCREMENT;
Query OK, 5 rows affected (0.01 sec)
Records: 5 Duplicates: 0 Warnings: 0
-- Set the AUTO_INCREMENT value to continue from the highest value of id in the table.
mysql> SELECT MAX(id+1) FROM t;
+-----------+
| MAX(id+1) |
+-----------+
| 6 |
+-----------+
1 row in set (0.00 sec)
mysql> ALTER TABLE t AUTO_INCREMENT=6;
Query OK, 5 rows affected (0.00 sec)
Records: 5 Duplicates: 0 Warnings: 0
-- The table is successfully modified and future records will be inserted with no gaps in the Ids.
mysql> INSERT INTO t (product_id) VALUES (5);
Query OK, 1 row affected (0.00 sec)
mysql> SELECT * FROM t;
+----+------------+
| Id | product_id |
+----+------------+
| 1 | 1 |
| 2 | 1 |
| 3 | 1 |
| 4 | 2 |
| 5 | 2 |
| 6 | 5 |
+----+------------+
6 rows in set (0.00 sec)
The DBCC CHECKIDENT management command is used to reset the identity counter:
DBCC CHECKIDENT (table_name [, { NORESEED | { RESEED [, new_reseed_value]}}])
[ WITH NO_INFOMSGS ]
Example:
DBCC CHECKIDENT ('TestTable', RESEED, 0)
GO
Many times we just need to reseed to the next available Id:
declare @max int
select @max = max([Id]) from [TestTable]
if @max IS NULL --check when max is returned as null
SET @max = 0
DBCC CHECKIDENT ('[TestTable]', RESEED, @max)
This will check the table and reset to the next ID.
You can get help from the link below:
Reset identity seed after deleting records in SQL Server
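Note that DBCC CHECKIDENT is SQL Server syntax. On MySQL the closest equivalent is resetting the table's AUTO_INCREMENT value (a sketch, reusing the TestTable name from above):
ALTER TABLE TestTable AUTO_INCREMENT = 1;
-- InnoDB will not go below the current maximum; the next insert gets MAX(Id) + 1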
My mother says: the mountain that can be seen is not far away, don't stop trying
The problem is related to auto-increment in MySQL. What I'm trying to achieve is to increment an ID value based on the customer number. So basically I insert data sets into a table without any order. Each time a new customer is inserted, I would like the id column to be incremented, but of course kept the same for every row related to that customer; see the table below. Is there any way to achieve that via SQL? I tried my luck with multiple primary keys and also looked into partitioning, but was not able to figure it out by myself.
You can use a query like this:
INSERT INTO autoinc (cid,info,customer)
SELECT
COALESCE(max(cid),0) +1
, 'A Customer 1'
, 12345
FROM autoinc
WHERE customer = 12345;
Sample:
mysql> SELECT * from autoinc;
Empty set (0,00 sec)
mysql> INSERT INTO autoinc (cid,info,customer)
-> SELECT
-> COALESCE(max(cid),0) +1
-> , 'A Customer 1'
-> , 12345
-> FROM autoinc
-> WHERE customer = 12345;
Query OK, 1 row affected (0,00 sec)
Records: 1 Duplicates: 0 Warnings: 0
mysql> SELECT * from autoinc;
+----+------+--------------+----------+
| id | cid | info | customer |
+----+------+--------------+----------+
| 1 | 1 | A Customer 1 | 12345 |
+----+------+--------------+----------+
1 row in set (0,00 sec)
mysql> INSERT INTO autoinc (cid,info,customer)
-> SELECT
-> COALESCE(max(cid),0) +1
-> , 'A Customer 1'
-> , 12345
-> FROM autoinc
-> WHERE customer = 12345;
Query OK, 1 row affected (0,00 sec)
Records: 1 Duplicates: 0 Warnings: 0
mysql> SELECT * from autoinc;
+----+------+--------------+----------+
| id | cid | info | customer |
+----+------+--------------+----------+
| 1 | 1 | A Customer 1 | 12345 |
| 2 | 2 | A Customer 1 | 12345 |
+----+------+--------------+----------+
2 rows in set (0,00 sec)
mysql> INSERT INTO autoinc (cid,info,customer)
-> SELECT
-> COALESCE(max(cid),0) +1
-> , 'B Customer 2'
-> , 9876
-> FROM autoinc
-> WHERE customer = 9876;
Query OK, 1 row affected (0,00 sec)
Records: 1 Duplicates: 0 Warnings: 0
mysql> SELECT * from autoinc;
+----+------+--------------+----------+
| id | cid | info | customer |
+----+------+--------------+----------+
| 1 | 1 | A Customer 1 | 12345 |
| 2 | 2 | A Customer 1 | 12345 |
| 3 | 1 | B Customer 2 | 9876 |
+----+------+--------------+----------+
3 rows in set (0,00 sec)
mysql> INSERT INTO autoinc (cid,info,customer)
-> SELECT
-> COALESCE(max(cid),0) +1
-> , 'A Customer 1'
-> , 12345
-> FROM autoinc
-> WHERE customer = 12345;
Query OK, 1 row affected (0,00 sec)
Records: 1 Duplicates: 0 Warnings: 0
mysql> SELECT * from autoinc;
+----+------+--------------+----------+
| id | cid | info | customer |
+----+------+--------------+----------+
| 1 | 1 | A Customer 1 | 12345 |
| 2 | 2 | A Customer 1 | 12345 |
| 3 | 1 | B Customer 2 | 9876 |
| 4 | 3 | A Customer 1 | 12345 |
+----+------+--------------+----------+
4 rows in set (0,00 sec)
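For reference, a table definition that behaves like the autoinc table in this sample might look like the following (a sketch; the column types are assumptions):
CREATE TABLE autoinc (
  id       INT UNSIGNED NOT NULL AUTO_INCREMENT,  -- row id, filled in by MySQL
  cid      INT UNSIGNED NOT NULL,                 -- per-customer counter, filled in by the INSERT ... SELECT above
  info     VARCHAR(50),
  customer INT UNSIGNED NOT NULL,
  PRIMARY KEY (id)
) ENGINE=InnoDB;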
What you probably need is to have different values for ID for each customer. The easiest way to achieve this is to use an AUTO_INCREMENT column as the PK of your table.
It is an implementation detail that an AUTO_INCREMENT column has consecutive values for consecutively inserted rows, and even that is not always true; it just happens sometimes and is not guaranteed. If an INSERT statement is enclosed in a transaction that is rolled back, the value generated by that insert is skipped. Also, if an INSERT statement that uses ON DUPLICATE KEY UPDATE tries to insert many rows and some of them already exist in the table, the IDs generated for the duplicate keys are skipped.
What I want to stress is that there is no point in trying to get consecutive values from an AUTO_INCREMENT column; it is not even possible to guarantee them.
Back to your problem: if the column ID is the PK of the table and its type is INT AUTO_INCREMENT, then MySQL guarantees there won't be two rows having the same value in the ID column, and this also satisfies your need to have different values for ID for all the rows with the same value in customer.
You could procedurally do this using a stored procedure, which I won't elaborate on (unless requested) as it isn't a simple query (as you're asking for).
A hacky solution would be to bulk insert into a new joining table:
CREATE TABLE auto_inc_customer_id (
id INT UNSIGNED NOT NULL AUTO_INCREMENT,
customer_id INT UNSIGNED NOT NULL, -- Could/should add a FK constraint
PRIMARY KEY (id)
) ENGINE=innodb;
INSERT INTO auto_inc_customer_id (customer_id) SELECT DISTINCT Customer FROM YourExistingTable;
See: http://dev.mysql.com/doc/refman/5.7/en/ansi-diff-select-into-table.html
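Once that mapping table is filled, the generated per-customer id can be joined back onto the original rows, for example (a sketch; YourExistingTable and its Customer column are the hypothetical names used above):
SELECT y.*, m.id AS customer_seq
FROM YourExistingTable y
JOIN auto_inc_customer_id m ON m.customer_id = y.Customer;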
I have two tables:
Company (id int, name varchar, primary key(id));
Product (id int, c_id int, name varchar, foreign key(c_id) references Company(id));
Table 'Company' stores a list of company names and 'Product' stores a list of product names, and one company can have multiple products.
If I have data file like this, tab delimited:
1 Apple iPhone
2 Apple iPad
3 Apple iMac
4 Google Gmail
5 Google Google Search
6 Amazon Kindle
Is it possible to use "LOAD DATA INFILE" to load this file into two tables, where the first column goes to table Company and the second column goes to Product? The question is how to load selected fields into a particular table, instead of loading the full record into one table.
You can try an approach like this, based on using a temporary table. I hope you find it useful.
/path/to/file/file.csv
1,Apple,iPhone
2,Apple,iPad
3,Apple,iMac
4,Google,Gmail
5,Google,Google Search
6,Amazon,Kindle
mysql> DELIMITER //
mysql> DROP TRIGGER IF EXISTS `from_load_data`//
Query OK, 0 rows affected, 1 warning (0.00 sec)
mysql> DROP TABLE IF EXISTS `temp_company_product`//
Query OK, 0 rows affected, 1 warning (0.00 sec)
mysql> DROP TABLE IF EXISTS `product`//
Query OK, 0 rows affected, 1 warning (0.00 sec)
mysql> DROP TABLE IF EXISTS `company`//
Query OK, 0 rows affected, 1 warning (0.00 sec)
mysql> CREATE TABLE `company` (
-> `id` INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
-> `name` VARCHAR(25),
-> UNIQUE KEY `unique_name` (`name`)
-> )//
Query OK, 0 rows affected (0.00 sec)
mysql> CREATE TABLE `product` (
-> `id` INT UNSIGNED AUTO_INCREMENT PRIMARY KEY,
-> `c_id` INT UNSIGNED NOT NULL,
-> `name` VARCHAR(25),
-> FOREIGN KEY (`c_id`) REFERENCES `company`(`id`)
-> ON UPDATE CASCADE ON DELETE CASCADE
-> )//
Query OK, 0 rows affected (0.00 sec)
mysql> CREATE TABLE `temp_company_product` (
-> `id` INT UNSIGNED PRIMARY KEY,
-> `company_name` VARCHAR(25),
-> `product_name` VARCHAR(25)
-> )//
Query OK, 0 rows affected (0.00 sec)
mysql> CREATE TRIGGER `from_load_data` AFTER INSERT ON `temp_company_product`
-> FOR EACH ROW
-> BEGIN
-> INSERT INTO `company` (`name`) VALUES (NEW.`company_name`)
-> ON DUPLICATE KEY UPDATE `name` = VALUES(`name`);
-> INSERT INTO `product` (`c_id`, `name`)
-> SELECT `id`, NEW.`product_name`
-> FROM `company`
-> WHERE `name` = NEW.`company_name`;
-> END//
Query OK, 0 rows affected (0.00 sec)
mysql> LOAD DATA INFILE '/path/to/file/file.csv'
-> INTO TABLE `temp_company_product`
-> FIELDS TERMINATED BY ',' ENCLOSED BY '"'
-> LINES TERMINATED BY '\r\n'
-> (`id`, `company_name`, `product_name`)//
Query OK, 6 rows affected (0.00 sec)
Records: 6 Deleted: 0 Skipped: 0 Warnings: 0
mysql> SELECT
-> `id`,
-> `company_name`,
-> `product_name`
-> FROM
-> `temp_company_product`//
+----+--------------+---------------+
| id | company_name | product_name |
+----+--------------+---------------+
| 1 | Apple | iPhone |
| 2 | Apple | iPad |
| 3 | Apple | iMac |
| 4 | Google | Gmail |
| 5 | Google | Google Search |
| 6 | Amazon | Kindle |
+----+--------------+---------------+
6 rows in set (0.00 sec)
mysql> SELECT
-> `id`,
-> `name`
-> FROM
-> `company`//
+----+--------+
| id | name |
+----+--------+
| 6 | Amazon |
| 1 | Apple |
| 4 | Google |
+----+--------+
3 rows in set (0.00 sec)
mysql> SELECT
-> `id`,
-> `c_id`,
-> `name`
-> FROM
-> `product`//
+----+------+---------------+
| id | c_id | name |
+----+------+---------------+
| 1 | 1 | iPhone |
| 2 | 1 | iPad |
| 3 | 1 | iMac |
| 4 | 4 | Gmail |
| 5 | 4 | Google Search |
| 6 | 6 | Kindle |
+----+------+---------------+
6 rows in set (0.00 sec)
mysql> DROP TRIGGER IF EXISTS `from_load_data`//
Query OK, 0 rows affected (0.00 sec)
mysql> DROP TABLE IF EXISTS `temp_company_product`//
Query OK, 0 rows affected (0.00 sec)
mysql> DELIMITER ;
I don't believe that LOAD DATA is flexible enough to allow you to selectively load certain columns into two different already-existing tables. It was designed to be a fast workhorse, not to be particularly flexible. An alternative would be to load your data into a temporary table, and then use INSERT INTO ... SELECT to move the data into your two tables.
CREATE TABLE temp(id int, company varchar(55), product varchar(55));
LOAD DATA LOCAL INFILE 'file.csv' INTO TABLE temp
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(id, company, product);
And then just use INSERT INTO...SELECT to get the data into your two tables which already exist:
INSERT INTO Company (name)
SELECT DISTINCT company
FROM temp;
INSERT INTO Product (name)
SELECT product
FROM temp;
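One gap in the statements above: Product.c_id is not populated. Assuming Company.name uniquely identifies a company, the id can be looked up with a join instead (a sketch):
INSERT INTO Product (c_id, name)
SELECT c.id, t.product
FROM temp t
JOIN Company c ON c.name = t.company;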
I have a MySQL database with tables t1 and t2. One of the columns in table t1 has a foreign key to t2.
I need to allow the foreign key column to accept NULL values. There is already some important data, so recreating the table is not an option.
I tried the usual ALTER TABLE commands but got a syntax error.
Is there a way to work around it without affecting the database?
This is what I tried:
ALTER TABLE t1 MODIFY fk_column_id NULL;
The missing part is the type definition in the MODIFY statement. With MODIFY you redefine the whole column, so you need to give the type as well. As long as you only change whether it can be NULL, no data will be lost.
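For example, if the column is an INT (an assumption; use whatever type the column actually has), the statement becomes:
ALTER TABLE t1 MODIFY fk_column_id INT NULL;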
Create the referenced table and fill it:
mysql> -- Creating referenced table
mysql> create table `tUser` (
-> `id` int auto_increment not null,
-> `name` varchar(16),
-> primary key (`id`)
-> );
Query OK, 0 rows affected (0.07 sec)
mysql> -- Filling and checking referenced table
mysql> insert into `tUser` (`name`) values ("Jane"), ("John");
Query OK, 2 rows affected (0.04 sec)
Records: 2 Duplicates: 0 Warnings: 0
mysql> select * from `tUser`;
+----+------+
| id | name |
+----+------+
| 1 | Jane |
| 2 | John |
+----+------+
2 rows in set (0.07 sec)
mysql> -- Creating referencing table
mysql> create table `tHoliday` (
-> `id` int auto_increment not null,
-> `userId` int,
-> `date` date,
-> primary key (`id`),
-> foreign key (`userId`) references `tUser` (`id`)
-> );
Query OK, 0 rows affected (0.14 sec)
mysql> -- Filling and checking referencing table
mysql> insert into `tHoliday` (`userId`, `date`) values
-> (1, "2014-11-10"),
-> (1, "2014-11-13"),
-> (2, "2014-10-10"),
-> (2, "2014-12-10");
Query OK, 4 rows affected (0.08 sec)
Records: 4 Duplicates: 0 Warnings: 0
mysql> select * from `tHoliday`;
+----+--------+------------+
| id | userId | date |
+----+--------+------------+
| 1 | 1 | 2014-11-10 |
| 2 | 1 | 2014-11-13 |
| 3 | 2 | 2014-10-10 |
| 4 | 2 | 2014-12-10 |
+----+--------+------------+
4 rows in set (0.05 sec)
mysql> -- Updating foreign key column to allow NULL
mysql> alter table `tHoliday` modify `userId` int null;
Query OK, 0 rows affected (0.08 sec)
Records: 0 Duplicates: 0 Warnings: 0
mysql> -- Inserting line without foreign key
mysql> insert into `tHoliday` (`date`) values ("2014-11-15");
Query OK, 1 row affected (0.06 sec)
mysql> select * from `tHoliday`;
+----+--------+------------+
| id | userId | date |
+----+--------+------------+
| 1 | 1 | 2014-11-10 |
| 2 | 1 | 2014-11-13 |
| 3 | 2 | 2014-10-10 |
| 4 | 2 | 2014-12-10 |
| 5 | NULL | 2014-11-15 |
+----+--------+------------+
5 rows in set (0.03 sec)
I am trying to do an INSERT ... SELECT into a table with constraints that prevent NULL values:
mysql> create table if not exists table1 (
-> id int not null auto_increment,
-> description varchar(45),
-> primary key (`id`)
-> );
Query OK, 0 rows affected (0.01 sec)
mysql> create table if not exists table2 (
-> id int not null auto_increment,
-> description varchar(45) not null,
-> primary key (`id`),
-> unique index `unique_desc` (`description`)
-> );
Query OK, 0 rows affected (0.02 sec)
mysql> insert ignore into table1
-> (description)
-> values("stupid thing"),
-> ("another thing"),
-> (null),
-> ("stupid thing"),
-> ("last thing");
Query OK, 5 rows affected (0.00 sec)
Records: 5 Duplicates: 0 Warnings: 0
mysql> select * from table1;
+----+---------------+
| id | description |
+----+---------------+
| 1 | stupid thing |
| 2 | another thing |
| 3 | NULL |
| 4 | stupid thing |
| 5 | last thing |
+----+---------------+
5 rows in set (0.00 sec)
Cool, we have the source (table1) and destination (table2) tables created, and the source table populated with some duplicate and null data.
If I do a plain INSERT ... SELECT into the destination table, the NULL value comes through as an empty string:
mysql> insert ignore into table2
-> (description)
-> select description
-> from table1;
Query OK, 4 rows affected, 1 warning (0.00 sec)
Records: 5 Duplicates: 1 Warnings: 1
mysql> select * from table2;
+----+---------------+
| id | description |
+----+---------------+
| 3 | |
| 2 | another thing |
| 4 | last thing |
| 1 | stupid thing |
+----+---------------+
4 rows in set (0.00 sec)
This is bad. But some boss brogrammer led me to the answer in this question:
MySQL Insert Select - NOT NULL fields
And now this method gives me the desired result:
mysql> insert ignore into table2
-> (description)
-> select description
-> from table1
-> where description <> '' and description is not null;
Query OK, 3 rows affected (0.00 sec)
Records: 4 Duplicates: 1 Warnings: 0
mysql> select * from table2;
+----+---------------+
| id | description |
+----+---------------+
| 2 | another thing |
| 3 | last thing |
| 1 | stupid thing |
+----+---------------+
3 rows in set (0.00 sec)
Is there a way for me to get the above result without having to manually protect each field using the WHERE clause?
Thanks in advance,
K
This technically answers your question in that you can eliminate the NULLs with a join instead of the WHERE clause (NULL never compares equal, so the NULL rows drop out of the join):
insert ignore into table2
(description)
select t.description from table1 t
join
(
select distinct description from table1
) t1 on (t.description=t1.description);
I am pretty sure, however, that you will need to specify a join for each field. Off the top of my head, I can't think of a way around this.