LOAD DATA INFILE MySQL NULL value entry error

I am trying to load a text file into an existing table by issuing the following command:
load data infile "test.txt" into table m_c;
The table has 5 columns: id, title, official, genre, and platform, where id is the primary key with AUTO_INCREMENT set.
Rows were added to the table, but the content was not; instead I got NULL as the value for all columns.
I really need to know why!

LOAD DATA INFILE 'test.txt'
INTO TABLE m_c
(title, genre, platform, official)
SET id = NULL;
Also, how is your file formatted? Tab-delimited? CSV? You may need FIELDS TERMINATED BY or LINES TERMINATED BY clauses.
See the manual.
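For example, if the file were tab-delimited with Unix line endings, the load might look like this (a sketch only; adjust the terminators to match your actual file):
LOAD DATA INFILE 'test.txt'
INTO TABLE m_c
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
(title, official, genre, platform);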

You do not need an ID field in your text file; the system will automatically give you a new id for each row you insert.

Related

UPDATE all rows of a single column using LOAD DATA INFILE

I have a table with 18 columns and I want to update all rows of the column page_count using LOAD DATA INFILE. The file with the data is extremely simple, just \n separated values. I'm not trying to pull off anything impressive here; I just want to update the values in that one single column - about 3000 records. The code I'm using is
LOAD DATA INFILE '/tmp/sbfh_counter_restore' REPLACE INTO TABLE obits FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n' (page_count);
And all this does is add one new row with the entire contents of the file dumped into the column page_count. What am I doing wrong? Should I use phpMyAdmin? I'd be happy to use that as it better fits my skill set ;)
I created the file using SELECT page_count FROM obits INTO outfile '/tmp/sbfh_counter_restore'
Based on what I can understand from the MySQL documentation, LOAD DATA INFILE does not support updating just ONE column of existing rows; it requires ALL columns to be present in the file, in the correct order.
At the very least, you should use SELECT * FROM obits INTO OUTFILE to create the file, then load it back, as that ensures the column order is consistent.
Note too that your file is \n-separated while your statement specifies LINES TERMINATED BY '\r\n'; since MySQL never finds that terminator, it treats the entire file as a single line, which is exactly why everything was dumped into one new row. Also check the primary key (or unique key) of your table: with REPLACE, rows are matched by the key and replaced or inserted based on the matching result. It is likely that page_count is not your primary or unique key.
Hope that helps.
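If re-exporting all 18 columns is impractical, a staging table sidesteps the problem. The following is only a sketch, assuming id is the primary key of obits; the staging table name is illustrative:
-- stage the key/value pairs
CREATE TABLE page_count_stage (
  id INT PRIMARY KEY,
  page_count INT
);
-- export key and value together so rows can be matched later
SELECT id, page_count
INTO OUTFILE '/tmp/sbfh_counter_restore'
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
FROM obits;
-- load the pairs into the staging table
LOAD DATA INFILE '/tmp/sbfh_counter_restore'
INTO TABLE page_count_stage
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
(id, page_count);
-- copy the values across by key, leaving the other 17 columns untouched
UPDATE obits AS o
JOIN page_count_stage AS s ON s.id = o.id
SET o.page_count = s.page_count;
DROP TABLE page_count_stage;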

Reading CSV file into MySQL with subset of columns

Given a table 'products' with the following fields:
id
name
cost
user_id
I want to dump a CSV file containing 'name' and 'cost', and read it back in.
I'm using SELECT name, cost INTO OUTFILE 'data.csv' FROM products;
How can I use LOAD DATA INFILE 'data.csv' INTO TABLE products; to read it back in, given that some columns are not present in the file?
Assuming your id and user_id columns have default values set (or accept NULL), the statement is simple:
LOAD DATA INFILE 'rows.csv' INTO TABLE products (name, cost);
If those columns need values set, then you can set them per row at load time:
LOAD DATA INFILE 'rows.csv' INTO TABLE products (name, cost) SET id = MD5(name), user_id = NULL;
MySQL is quite powerful when it comes to filling in values from a source CSV. Here's a blog article that shows many of the features in the context of a real world example.
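One such feature is capturing a file field into a user variable and transforming it in a SET clause. A minimal sketch, assuming (purely for illustration) that the cost field in the CSV carries a leading currency symbol that must be stripped:
LOAD DATA INFILE 'rows.csv'
INTO TABLE products
FIELDS TERMINATED BY ','
(name, @raw_cost)
SET cost = REPLACE(@raw_cost, '$', '');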

How to bulk load data without entering values for an AUTO_INCREMENT column

Following is the code to bulk load data from a text file.
LOAD DATA LOCAL INFILE 'C:\\file.txt'
INTO TABLE datatable;
I have a table with two columns, an attribute and id, the primary key with the AUTO_INCREMENT index. Values for the attribute are given (one line for each row) in the text file.
I want the id (which is AUTO_INCREMENT) to be generated and incremented automatically. I think it is possible, but what is the way to do it?
Try this one:
LOAD DATA LOCAL INFILE 'C:\\file.txt'
INTO TABLE datatable(`attribute`);
If this doesn't work, the table structure and a few sample rows of your file.txt would help.
You could do a raw import of everything from the .txt into the database (with your given command), so you have just the attributes there, and then add the ID field afterwards:
ALTER TABLE datatable ADD `id` MEDIUMINT NOT NULL AUTO_INCREMENT KEY
For a detailed explanation, there is already a question for that:
Add a column to existing table and uniquely number them

MySQL: import CSV into an existing table

Imagine you have a CSV file with only text in it and line ending \r\n.
like this:
abc
def
hij
...
xyz
You want to import this file into an existing multi-column table, where each line of text needs to go into a row of one specific (currently empty) column (let's name it needed) of the table. Like this:
| a | b | c |needed|
|foo|bar|baz|______|<-- abc
|foo|bar|baz|______|<-- def
...
|foo|bar|baz|______|<-- xyz
The data from the CSV file does not need to be inserted in any particular order. It really does not matter which row of needed ends up with which line from the CSV; as long as every line gets imported, everything is fine.
I've tried lots of things and it's driving me mad, but I can't figure out how this could be done. Can this be solved somehow with LOAD DATA INFILE and an UPDATE/REPLACE/INSERT command? What would you do in a case like this? Is it even possible with MySQL? Or do I need a custom PHP script for this?
I hope the question is clear enough. Please ask if something is unclear to you.
OK, here you go...
Add a new column to stuff populated with 1-600,000
ALTER TABLE stuff ADD COLUMN newId INTEGER UNIQUE AUTO_INCREMENT;
Create a temp table for your CSV file
CREATE TABLE temp (
id INTEGER PRIMARY KEY AUTO_INCREMENT,
data VARCHAR(32)
);
I'm guessing the required length of the data.
Import into the temporary table
LOAD DATA INFILE <wherever> INTO TABLE temp (data);
The explicit (data) column list keeps the file's single field out of the auto-increment id column.
Add the data to stuff
UPDATE stuff AS s
JOIN temp AS t ON t.id=s.newId
SET s.needed=t.data;
Tidy up
DROP TABLE temp;
ALTER TABLE stuff DROP COLUMN newId;
I must admit that I haven't tested the LOAD DATA INFILE or the UPDATE statements, but I have checked all the table fiddling. In particular, MySQL will populate the newId column with consecutive numbers.
I don't think there will be a problem if the number of rows in stuff doesn't match the number of lines in your CSV file; obviously, there will be a few rows with needed still NULL if there are fewer lines than rows.
The easiest way I found was with LibreOffice Calc + Base.
Import/export the data in Base.
But be careful: if you have NOT NULL options on columns and by mistake there is a cell with no data in it, that row will be skipped.
So first disable the NOT NULL options on the columns.
There is also a Microsoft Excel way, but I have not done it, since it required installing a plugin and I was lazy.

LOAD DATA LOCAL INFILE custom value

How to add a custom value using LOAD DATA LOCAL INFILE?
The column time_added is the 7th column, and the file has values for only the first and second columns. For the 7th column, time_added, I want to use the Unix timestamp at load time.
This code isn't working:
$result = mysql_query("LOAD DATA LOCAL INFILE '{$myFile}' INTO TABLE {$table} FIELDS TERMINATED BY ':' LINES TERMINATED BY '\n' SET `time_added`=unix_timestamp()");
Why wouldn't this work?
LOAD DATA INFILE 'file.txt'
INTO TABLE t1
(column1, column2)
SET column7 = unix_timestamp();
The answer given by @iouri indicates the key element to address your question, namely the explicit listing of the columns populated by the file, (column1, column2). This list tells LOAD DATA to consider only these columns when loading data from the file and avoids an error similar to Row 1 doesn't contain data for all columns.
You will still need to list all columns, including custom columns, in the table definition. Also, the column names listed in the parentheses should match the names of the columns defined in the table definition. For example, if the table definition specifies two columns named user and id, then you would need the line (user, id) above the SET column7 = unix_timestamp() line.
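Putting that together with the delimiters from the question (user and id being the hypothetical column names from the example above, and time_added the asker's actual 7th column):
LOAD DATA LOCAL INFILE 'file.txt'
INTO TABLE t1
FIELDS TERMINATED BY ':' LINES TERMINATED BY '\n'
(user, id)
SET time_added = UNIX_TIMESTAMP();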
You may also want to double-check that you want LOAD DATA LOCAL INFILE instead of LOAD DATA INFILE (no LOCAL). As specified in the documentation for LOAD DATA, the LOCAL keyword affects the expected location of the file, and both the server and the client must be configured to allow the LOCAL option.
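For reference, LOCAL loading is gated by the local_infile system variable on the server and a matching opt-in on the client; a minimal sketch:
-- on the server (needs the appropriate privilege):
SET GLOBAL local_infile = 1;
-- and the client must allow it too, e.g. for the mysql CLI:
-- mysql --local-infile=1 ...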