How can I import data from a text file into a database without giving a primary key in the text file?
So I have a table where I have 3 columns: ID, firstName, lastName.
ID is auto incremented. I would like to read in the names from the text file like that:
John, Smith;
Michael, Jordan;
I don't want to use the primary key, as I don't know what will be the next primary key in the table, that should be done by auto increment.
If I use the text file like this, then I get the error message: Invalid column count...
The settings:
Columns separated with: ,
Columns enclosed with: "
Columns escaped with: \
Lines terminated with: ;
If I use the text file like this:
21, John, Smith; 22, Michael, Jordan;
The file can be imported (with the strange behavior that it also tries to read a third, empty line and reports an error; I don't understand that one either, but it's a different topic)
This is the dump from the table:
CREATE TABLE IF NOT EXISTS `LoginData2` (
`FirstName` varchar(10) NOT NULL,
`LastName` varchar(10) NOT NULL,
`ID` int(4) NOT NULL AUTO_INCREMENT,
PRIMARY KEY (`ID`),
UNIQUE KEY `ID` (`ID`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 AUTO_INCREMENT=1 ;
Your spec says that columns are enclosed with: "
I added quotes around the columns (and took away the spaces):
"John","Smith";"Michael","Jordan";
AND put the auto-increment ID column last. It imported fine with these settings.
This is a dump from my test table. Compare it with yours and see what is different. Also, try creating this table and importing the data above to see how it works.
CREATE TABLE IF NOT EXISTS `users` (
`firstname` varchar(100) NOT NULL,
`lastname` varchar(100) NOT NULL,
`id` int(11) NOT NULL
) ENGINE=InnoDB AUTO_INCREMENT=7 DEFAULT CHARSET=utf8;
ALTER TABLE `users`
ADD PRIMARY KEY (`id`);
ALTER TABLE `users`
MODIFY `id` int(11) NOT NULL AUTO_INCREMENT, AUTO_INCREMENT=7;
UPDATE: When you import with phpMyAdmin, make sure the import settings are correct. Your settings are not the default. I believe you need to choose "CSV using LOAD DATA" and then fill in all the delimiters as you have stated.
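For reference, those settings correspond roughly to a LOAD DATA statement like this (a sketch; the file name names.txt is made up, and listing only the name columns lets ID be filled in by auto-increment):
LOAD DATA LOCAL INFILE 'names.txt'
INTO TABLE LoginData2
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY ';'
(FirstName, LastName);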
Here's what I'm trying to do:
CREATE TABLE IF NOT EXISTS hashes (
id int NOT NULL AUTO_INCREMENT,
text varchar(50) NOT NULL,
hash varchar(64) NOT NULL AS (SHA2(CONCAT(text), 256) STORED,
PRIMARY KEY (id)
) DEFAULT CHARSET=utf8;
And then I want to run an insert like this:
INSERT INTO `hashes` (`text`) VALUES ('testing');
From the research I've done, the id should be automatically generated since auto_increment is enabled, so I don't need to define it in the insert query.
From my CREATE TABLE query, the hash should be automatically generated based upon the data entered into the text field. However, when I run the CREATE TABLE command I get an error with this line:
hash varchar(64) NOT NULL AS (SHA2(CONCAT(text), 256) STORED
I'm just wanting the hash to be automatically generated similar to how CURRENT_TIMESTAMP will automatically generate the current time by default.
What am I doing wrong?
It seems you have a syntax error. You should write NOT NULL after the SHA2 hash function. Please try:
CREATE TABLE IF NOT EXISTS hashes (
id int NOT NULL AUTO_INCREMENT,
text varchar(50) NOT NULL,
hash varchar(64) AS (SHA2(CONCAT(text), 256)) STORED NOT NULL,
PRIMARY KEY (id)
) DEFAULT CHARSET=utf8;
INSERT INTO `hashes` (`text`) VALUES ('testing');
You don't need to declare your hash column as NOT NULL. It's based on another NOT NULL column, text, so the hash will naturally be NOT NULL as well.
You also have forgotten a closing parenthesis.
hash varchar(64) AS (SHA2(CONCAT(text), 256) STORED,
                    1    2      3    3     2^
You need another closing paren where I indicated ^.
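With the closing paren added where indicated (and NOT NULL dropped), the line becomes:
hash varchar(64) AS (SHA2(CONCAT(text), 256)) STORED,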
If you already have the table filled by some content, you can Alter it with :
ALTER TABLE `page` ADD COLUMN `hash` char(64) AS (SHA2(`content`, 256)) AFTER `content`
This solution will add the hash column right after the content one and compute the hash for existing records as well as new ones. A unique index can be added to prevent insertion of duplicate large content, as sketched below.
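A minimal sketch of that unique index (the index name hash_index is made up):
ALTER TABLE `page` ADD UNIQUE INDEX `hash_index` (`hash`);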
So I have someone working on a csv (generated from my table) where they will be updating some fields by hand. This may happen multiple times. I'd like to take the modified csv and update my existing table.
From my understanding, I will need to create a tmp table and then use that to update the existing table. So I can create the temporary table, but how can I iterate through that table and use it to update the existing table?
My SQL querying skills are pretty basic. I think it's possible, but I'm not sure where to start.
You don't need a temporary table. Just make sure the CSV file includes the primary key of the table. Then you can use the REPLACE modifier in LOAD DATA INFILE. From the documentation:
If you specify REPLACE, input rows replace existing rows. In other words, rows that have the same value for a primary key or unique index as an existing row.
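A sketch of what that could look like; the file path, table name, and column list are hypothetical and must match your actual CSV:
LOAD DATA LOCAL INFILE '/path/to/edited.csv'
REPLACE INTO TABLE your_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(id, col1, col2);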
In the CSV you are generating that gets edited, you must include a unique value that will allow you to match the edited record to the original record. Make sure the user doesn't change that column! Also, make sure you have a unique key on that column.
You can then import the edited data into a table with the same (or at least very similar) structure as the original table.
Once the data is imported, you can use an INSERT ... ON DUPLICATE KEY UPDATE ... statement to update the original table. Here's an example:
Main data table:
DROP TABLE IF EXISTS `my_table`;
CREATE TABLE IF NOT EXISTS `my_table` (
`id` INT(10) UNSIGNED NOT NULL AUTO_INCREMENT COMMENT 'Primary Key',
`fld1` VARCHAR(100) NULL,
`fld2` VARCHAR(100) NULL,
`fld3` VARCHAR(100) NULL,
PRIMARY KEY (`id`)
)
ENGINE=MyISAM
AUTO_INCREMENT=1
DEFAULT CHARSET=utf8
COLLATE=utf8_unicode_ci;
Temporary table for edited CSV import:
DROP TABLE IF EXISTS `import_table`;
CREATE TABLE IF NOT EXISTS `import_table` (
`n_id` INT(10) NOT NULL COMMENT 'Original Primary Key',
`n_fld1` VARCHAR(100) NULL,
`n_fld2` VARCHAR(100) NULL,
`n_fld3` VARCHAR(100) NULL,
PRIMARY KEY (`n_id`)
)
ENGINE=MyISAM
AUTO_INCREMENT=1
DEFAULT CHARSET=utf8
COLLATE=utf8_unicode_ci;
Simulated data before export and editing:
INSERT INTO `my_table`
(`fld1`,`fld2`,`fld3`)
VALUES
('John','Doe','Atlanta'),
('Jane','Smith','New York'),
('Bill','Howe','San Antonio'),
('Harry','Fields','Paris');
Simulate the imported, edited records:
INSERT INTO `import_table`
(`n_id`,`n_fld1`,`n_fld2`,`n_fld3`)
VALUES
(1,'John','Doe','Decatur, IL'),
(2,'Jane','Smithsonian','New York, NY'),
(3,'Bill','Bellweather','San Antonio, TX'),
(4,'Harry','Belefonte','Houston, TX');
Merge the imported, edited records back into the main table:
INSERT INTO `my_table`
(`id`,`fld1`,`fld2`,`fld3`)
SELECT `n_id`,`n_fld1`,`n_fld2`,`n_fld3`
FROM `import_table`
ON DUPLICATE KEY UPDATE
`fld1` = `n_fld1`,
`fld2` = `n_fld2`,
`fld3` = `n_fld3`;
I create the MySql table by the following sql statement:
CREATE TABLE IF NOT EXISTS `mytable` (
`agent` varchar(64) NOT NULL,
`name` varchar(40) NOT NULL,
`app` varchar(64) NOT NULL,
PRIMARY KEY (`app`,`agent`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
As you can see, the fields 'app' and 'agent' form the primary key. But unfortunately it doesn't work when I insert the following data; it always shows a duplicate key error on the 'app' field:
app agent name
-------------------------
MyApp ios cde
MyApp android abc
Can anybody tell me what's wrong? Thanks
In your table, app and agent form one composite primary key together; they are not two individual keys.
You'll be able to add many rows with app = 'MyApp' as long as agent differs. And the other way around.
If you want to disallow multiple rows with the same app and multiple rows with the same agent, add normal unique indexes instead.
CREATE TABLE IF NOT EXISTS `mytable` (
`agent` varchar(64) NOT NULL,
`name` varchar(40) NOT NULL,
`app` varchar(64) NOT NULL,
UNIQUE app_index (`app`),
UNIQUE agent_index (`agent`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
A composite primary key in MySQL does not check the individual columns for uniqueness; it only gives a duplicate-key error when you try to insert the same combination of values in multiple records. Neither column will accept NULL values, though.
Eg.
app agent name
-------------------------
MyApp ios cde
MyApp ios abc - this will give you the error "Duplicate entry 'MyApp-ios' for key 'PRIMARY'"
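To reproduce this with statements against the table from the question:
-- both rows insert fine: the (app, agent) combinations differ
INSERT INTO `mytable` (`app`, `agent`, `name`) VALUES ('MyApp', 'ios', 'cde');
INSERT INTO `mytable` (`app`, `agent`, `name`) VALUES ('MyApp', 'android', 'abc');
-- fails: same (app, agent) combination as the first row
INSERT INTO `mytable` (`app`, `agent`, `name`) VALUES ('MyApp', 'ios', 'abc');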
Hope this helps.
I'm trying to import csv file to MYSQL, and I have the following schema.
CREATE TABLE `monitor` (
`id` bigint(20) NOT NULL AUTO_INCREMENT,
`time` time DEFAULT NULL,
`domain_name` text,
`cpu_ns` int(11) DEFAULT NULL,
`cpu_percentage` int(11) DEFAULT NULL,
`mem_bytes` int(11) DEFAULT NULL,
`mem_percentage` int(11) DEFAULT NULL,
`block_rdby` int(11) DEFAULT NULL,
`block_wrby` int(11) DEFAULT NULL,
`net_rxby` int(11) DEFAULT NULL,
`net_wrby` int(11) DEFAULT NULL,
PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1
I'm having issues importing a file with data presented as follows.
16:48:27,2,s-1-VM,220000000.,0.448204684384,262144,0,0,0,60,0
16:48:30,2,s-1-VM,260000000.,0.528932926209,262144,0,0,16384,300,0
16:48:33,2,s-1-VM,300000000.,0.609786677944,262144,0,0,0,180,0
16:48:37,2,s-1-VM,290000000.,0.59000206364,262144,0,0,16384,120,0
16:48:40,2,s-1-VM,270000000.,0.54985661784,262144,0,0,0,649,425
16:48:43,2,s-1-VM,310000000.,0.631207212346,262144,0,0,0,180,0
16:48:46,2,s-1-VM,220000000.,0.44728232907,262144,0,0,20480,60,0
16:48:49,2,s-1-VM,200000000.,0.407008216196,262144,0,0,0,300,0
16:48:52,2,s-1-VM,250000000.,0.508946559213,262144,0,0,0,240,0
16:48:55,2,s-1-VM,240000000.,0.488674160215,262144,0,0,0,120,0
How can I import this into my database?
I have tried the following and I get lots of warnings.
LOAD DATA LOCAL INFILE '/tmp/domain2.csv' INTO TABLE vtop FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
Your help is highly appreciated.
Thank you
If I understand http://dev.mysql.com/doc/refman/5.1/de/load-data.html correctly, you should set ID to NULL (so that it is filled by auto-increment) via
"SET id=NULL"
at the end of the statement. Otherwise column counts and column orders have to match perfectly.
But your columns don't match at all (what is the "2" at position 2?). So create a temp table with the structure of your CSV, and then copy the matching columns over via INSERT INTO ... SELECT, as sketched below.
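A sketch of that approach (the staging table name and the unknown2 column name are made up; the second CSV field is kept only so the column counts match):
CREATE TABLE monitor_staging (
  `time` time,
  `unknown2` int,
  `domain_name` text,
  `cpu_ns` double,   -- double so values like "220000000." load cleanly
  `cpu_percentage` double,
  `mem_bytes` int,
  `mem_percentage` int,
  `block_rdby` int,
  `block_wrby` int,
  `net_rxby` int,
  `net_wrby` int
);
LOAD DATA LOCAL INFILE '/tmp/domain2.csv' INTO TABLE monitor_staging
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n';
INSERT INTO monitor (`time`, domain_name, cpu_ns, cpu_percentage, mem_bytes,
                     mem_percentage, block_rdby, block_wrby, net_rxby, net_wrby)
SELECT `time`, domain_name, cpu_ns, cpu_percentage, mem_bytes,
       mem_percentage, block_rdby, block_wrby, net_rxby, net_wrby
FROM monitor_staging;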
MySQL isn't an AI and can't figure out that 16:48:27 should go into the time field - it'll be trying to stuff that into id instead.
You need to explicitly map the columns in your CSV file to the fields they should go into in the table:
LOAD DATA .... (time, domain_name, x, y, z, foo, bar)
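For the monitor schema above, that could look something like this (a sketch; @skip is a throwaway user variable that swallows the unexplained second CSV field):
LOAD DATA LOCAL INFILE '/tmp/domain2.csv' INTO TABLE monitor
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
(`time`, @skip, domain_name, cpu_ns, cpu_percentage, mem_bytes,
 mem_percentage, block_rdby, block_wrby, net_rxby, net_wrby);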
I'm looking to append a comments table from one WordPress site to another. The users are different. When I import the comments from site B to A, I run into a duplicate key issue; comment_id is already taken.
So how can I resolve this and append the table with a simple .sql file? Would I have to take the user information, generate a new user, check for comments made on site B, pull the content and postID, then go back to site A and recreate the comment for the newly created user!?
What a headache! Thanks.
If your only problem is a duplicate key issue, go to the end of your SQL file after
ENGINE=MyISAM
and make it
ENGINE=MyISAM AUTO_INCREMENT=<a number above the last id in the new database>
or
Query database A for the last id, then add one and use it in a new insert query.
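For that second option, a sketch (assuming WordPress's standard wp_comments table on site A):
SELECT MAX(comment_ID) + 1 AS next_id FROM wp_comments;
Use next_id as the starting value when renumbering the comment ids in the dump from site B.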
Example 1:
CREATE TABLE IF NOT EXISTS `movies` (
`id` int(255) NOT NULL AUTO_INCREMENT,
`title` varchar(255) NOT NULL,
`year` int(4) NOT NULL,
`size` varchar(255) NOT NULL,
`added` date NOT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY `title` (`title`,`year`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1 AUTO_INCREMENT=4 ;
The Inserts From My Dump:
INSERT INTO `movies` (`title`, `year`, `size`, `added`) VALUES
('[REC] 2', 0, '716688', '2011-09-23'),
('5 Days of War', 0, '1435406', '2012-01-09'),
('[REC]', 0, '1353420800', '2011-11-06');
See how I didn't include the PRIMARY KEY (id) column in my inserts, but it will still check against my UNIQUE KEY to see if the title exists. Just a little demo that hopefully helps out. If your table already exists on the new database, then just skip to the inserts and don't include the primary key; it will automatically be set on each new insert to the next available value.