I ran the SQL query below against my MySQL database:
load data local infile '/tmp/ad_packs.csv'
into table ad_packs
fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(company_name,pack_size,remaining,ad_pack_purchase_officer,advertising_contact_officer,updated_ad_pack,email,phone,website,note,organisation_id,categories,modified)
This seems to insert only one row. I have already checked the CSV; it has a header line, and the one row that gets inserted is the header.
Is there a reason why?
Belated answer: I have just had this problem myself and tried everything. The solution for me was to set AUTO_INCREMENT on my id field, and voilà, in came the records.
Are you sure your lines are terminated by '\n'? Try '\r\n'.
Take a look at your table's primary key. If the '\n' vs. '\r\n' issue is not the problem, it is very likely that the first and second rows in your CSV file share the same primary key.
Had the same issue just recently and used the od -c command on the CSV file. It showed that \r was being used for the line breaks.
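To see what line endings your file actually uses, you can dump its bytes the same way. A small self-contained sketch (the sample file and its contents are made up for the demo):

```shell
# Write a small file with Windows-style \r\n line endings, then inspect it.
printf 'name,age\r\nalice,30\r\n' > /tmp/sample.csv

# od -c prints every byte; seeing \r before \n at the end of each line
# means you need LINES TERMINATED BY '\r\n' in the LOAD DATA statement.
od -c /tmp/sample.csv
```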
I am trying to upload a CSV file into a MySQL database, but 0 rows were inserted by the following query. I wonder where I should start looking for the problem.
LOAD DATA LOCAL INFILE "/Applications/xxx/tmp/student_info.csv" INTO TABLE student_info_table
CHARACTER SET utf8
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(name,age,grade) ;
I found the problem was actually in the setup of my database table. It has a foreign key, and the file I was uploading did not contain valid foreign key values.
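A diagnostic sketch for this kind of "0 rows inserted" situation: with LOCAL, many per-row problems (duplicate keys, bad values) become warnings and the offending row is silently skipped rather than aborting the load, so the warnings are where the explanation hides.

```sql
-- Run this immediately after the LOAD DATA LOCAL statement;
-- the warnings name the offending column or constraint.
SHOW WARNINGS;
```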
There is a table with some data in it, and I am inserting additional data from a txt file with this statement:
LOAD DATA LOCAL INFILE 'users.txt' IGNORE INTO TABLE users2
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n'
(@col1, @col2, @col3, @col4) SET login=@col1, name=@col1, balance=@col2, email=@col3, reg_date=@col4;
It works, but if login already exists with a different balance, it creates a row with a duplicate login and a different balance.
I need the function to ignore the line if the login exists. Can anybody help me with that please?
Create a unique index on the login field:
ALTER TABLE users2 ADD UNIQUE KEY (login);
Since your LOAD DATA statement already has the IGNORE keyword, rows whose login already exists will then be skipped. That's all.
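One caveat worth adding: the ALTER will fail if users2 already contains duplicate logins, so it can help to check for them first. A quick sketch:

```sql
-- Find logins that already occur more than once; these must be
-- cleaned up before the unique index can be created.
SELECT login, COUNT(*) AS n
FROM users2
GROUP BY login
HAVING n > 1;
```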
I have an existing table in MySQL with 7 fields: 4 integer (one acting as the auto-increment primary key) and 3 text. This table already has data.
I am collecting new data in Excel and wish to bulk-add it to the MySQL table. Some of the cells in Excel will be blank and need to show as NULL in MySQL, and the primary key won't be part of the Excel sheet; I let MySQL add it automatically. Previously I was adding each record manually through phpMyAdmin, but that is far too time consuming (I have thousands of records to add).
How do I go about this in terms of setting up the Excel sheet correctly and making sure I add to the existing data instead of replacing it? I have heard CSV files are the way? I use phpMyAdmin, if that helps.
Thanks
If you want to append data to a MySQL table, I would use .csv files to do it.
In MySQL:
load data local infile 'yourFile'
into table yourTable (field1, field2, field3)
fields terminated by ',' optionally enclosed by '"'
lines terminated by '\n' -- On Windows you may need '\r\n'
ignore 1 lines; -- Ignore the header line
Check the reference manual: http://dev.mysql.com/doc/refman/5.0/en/load-data.html
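For the blank-cells-as-NULL part of the question: LOAD DATA reads an empty CSV field as an empty string, not NULL, so route the values through user variables and convert. A sketch; the column names (col_a, col_b, col_c) are placeholders for your actual three non-key columns, and the auto-increment primary key is simply left out of the column list so MySQL fills it in:

```sql
-- Blank CSV fields become NULL via user variables and NULLIF.
LOAD DATA LOCAL INFILE 'yourFile.csv'
INTO TABLE yourTable
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@a, @b, @c)
SET col_a = NULLIF(@a, ''),
    col_b = NULLIF(@b, ''),
    col_c = NULLIF(@c, '');
```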
Here is the SQL query I have so far:
LOAD DATA INFILE 'filename.csv'
INTO TABLE zt_accubid
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 4 LINES
I need to be able to end the process once a field with value="xyz" is encountered.
Is this possible?
LOAD DATA INFILE has no such option. There are a couple of workarounds, though:
You could solve the problem outside MySQL by manipulating the file first
As Alain Collins mentioned, if the column containing your marker has only unique values, and you don't run the LOAD DATA inside a transaction, you can use a unique key as a stopper.
You can use a trigger on the table as a stopper
If the marker is near the end of the file, or the overhead is not important to you, you can do a full LOAD DATA into an interim table, then use INSERT INTO ... SELECT to move only the relevant data into your final table
Similarly, you can load all the data and then delete the irrelevant part
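The first workaround (manipulating the file outside MySQL) can be as simple as truncating the file at the marker before loading it. A sketch with awk, assuming the marker value "xyz" appears somewhere on the stopping line; the demo input file here is made up:

```shell
# Demo input (in practice this is your real filename.csv)
printf 'row1,a\nrow2,b\nxyz,stop-here\nrow3,c\n' > filename.csv

# Copy lines until the first line containing the marker "xyz" appears,
# then stop; load the truncated file instead of the original.
awk '/xyz/ { exit } { print }' filename.csv > filename_truncated.csv
cat filename_truncated.csv
```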
Sure - put a row in the database before the load, and put a unique key on it. The LOAD should fail when it hits the duplicate.
I am using the following command:
LOAD DATA INFILE '/sample.txt' INTO TABLE TEST FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';
Whenever I execute this query, the table TEST gets the values from the sample.txt file appended again.
How can I avoid these duplicate entries in the table? I want the file to be loaded only if it hasn't been loaded already.
diEcho's comment is nearly right: identify unique column combinations in the data and declare them as a unique index on the table. But, importantly, you also need to modify your LOAD DATA statement; as it stands, it will fall over at the first occurrence of a duplicate record. You need to add an IGNORE:
LOAD DATA INFILE '/sample.txt' IGNORE INTO TABLE `test`
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';
For preference, though, you should consider loading the full dataset into a staging table and then handling the duplicates a bit more gracefully.
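A sketch of that staging-table approach; the staging table name and the join column (here `id`) are assumptions, so substitute whatever uniquely identifies your rows:

```sql
-- Load everything into an empty copy of the table, then move across
-- only the rows not already present in the real table.
CREATE TABLE test_staging LIKE test;

LOAD DATA INFILE '/sample.txt' INTO TABLE test_staging
FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n';

INSERT INTO test
SELECT s.*
FROM test_staging s
LEFT JOIN test t ON t.id = s.id   -- join on your unique column(s)
WHERE t.id IS NULL;

DROP TABLE test_staging;
```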