I want to insert data into a table via the LOAD DATA command in MySQL, but whenever I run the query, the data is entered only in the first column and the other one is NULL.
My text file is:
1 server
2 client
3 network
4 system
The first column is the error code and the second is a comment. The query is:
load data local infile 'C:/Users/nco/Desktop/help.txt' into table help;
After that I run select * from help;
And the output is:
mysql> select * from help;
+------------+-------------+
| error_code | description |
+------------+-------------+
|          1 | NULL        |
|          2 | NULL        |
|          3 | NULL        |
|          4 | NULL        |
+------------+-------------+
4 rows in set (0.03 sec)
Any idea what the problem might be?
If you created the file on Windows with an editor that uses \r\n as a line terminator, you should use this statement instead:
LOAD DATA LOCAL INFILE 'C:/Users/nco/Desktop/help.txt' INTO TABLE help
LINES TERMINATED BY '\r\n';
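If the terminator is in doubt, a quick way to check (a sketch, not part of the original answer) is to reload and inspect what MySQL actually stored:
TRUNCATE TABLE help;  -- start fresh before re-importing
LOAD DATA LOCAL INFILE 'C:/Users/nco/Desktop/help.txt' INTO TABLE help
LINES TERMINATED BY '\r\n';
SHOW WARNINGS;  -- lists any truncation or conversion problems from the load just run
SELECT error_code, HEX(description) FROM help;  -- a trailing 0D byte would mean a stray \r remains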
Related
I have a table errors with the following columns: error_id, product_id, error_code, which is already filled with some errors. I am using error_id as the primary key and I added a UNIQUE index composed of columns product_id and error_code in order to ensure that there can't be two errors with the same error_code for the same product_id. For example:
+----------+------------+------------+
| error_id | product_id | error_code |
+----------+------------+------------+
|        1 |          4 |       1118 |
|        2 |          4 |       1119 |
|        3 |          4 |       1120 |
|        4 |          5 |       1121 |
+----------+------------+------------+
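For reference, a minimal sketch of such a schema (hypothetical DDL, reconstructed from the description above):
CREATE TABLE errors (
    error_id   INT NOT NULL AUTO_INCREMENT,
    product_id INT NOT NULL,
    error_code INT NOT NULL,
    PRIMARY KEY (error_id),
    -- Forbids two rows with the same (product_id, error_code) pair:
    UNIQUE KEY uq_product_error (product_id, error_code)
);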
I want to import a list of errors from a .csv file, some of which may already be in the errors table. For example:
product_id, error_code
4,1120
4,1121
5,1121
5,1122
To do so, I am using the LOAD DATA statement, and it works properly. For example:
LOAD DATA LOCAL INFILE 'C:/Users/Public/Documents/updated_errors.csv'
IGNORE INTO TABLE errors
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@col1, @col2) set product_id = @col1, error_code = @col2;
As a result, the errors table now looks like this, which is the expected result:
+----------+------------+------------+
| error_id | product_id | error_code |
+----------+------------+------------+
|        1 |          4 |       1118 |
|        2 |          4 |       1119 |
|        3 |          4 |       1120 |
|        4 |          5 |       1121 |
|        5 |          4 |       1121 |
|        6 |          5 |       1122 |
+----------+------------+------------+
However, by doing so, I get a warning for each line that is already in the errors table, notifying me that the UNIQUE key is playing its role:
2 row(s) affected, 2 warning(s):
1062 Duplicate entry '4-1120' for key 'errors.UNIQUE'
1062 Duplicate entry '5-1121' for key 'errors.UNIQUE'
Records: 4 Deleted: 0 Skipped: 2 Warnings: 2
I want to automate this process using LabVIEW because it fits my workflow for this particular task. However, it seems that the library I am using in LabVIEW to access my database does not support warnings: it raises an error saying that my query is wrong. I have double-checked the query by running it directly in Workbench and there is no error, just the aforementioned warnings.
I also double-checked everything on the LabVIEW side and everything seems to work fine with other requests. It just seems that this library treats warnings as errors.
I have tried to change the level of error verbosity with the following statement (intending to change it back after the query). Unfortunately, as I am using a cloud DB, I do not think I can get the SUPER or SYSTEM_VARIABLES_ADMIN privilege.
SET GLOBAL log_error_verbosity = 1
Error Code: 1227. Access denied; you need (at least one of) the SUPER or SYSTEM_VARIABLES_ADMIN privilege(s) for this operation
I tried different combinations of primary keys and unique keys to avoid triggering warnings while keeping the security that prevents me from adding an already existing error but I have not been successful.
I am looking for a way to do one of the following:
Avoid the warnings in case of duplicates with the LOAD DATA statement.
Import the .csv file into the table with another statement that would not trigger any warning. I am thinking of maybe a line-by-line import that checks, for each line, whether the error is already in the table (see the staging-table sketch below).
Any other solution to achieve what I want to do?
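For the second option, one hedged approach (the staging-table name is an assumption) is to load the file into a temporary table and then insert only the rows that are not already present; no duplicate insert is ever attempted, so no 1062 warnings are produced:
-- Hypothetical staging table matching the CSV layout.
CREATE TEMPORARY TABLE errors_staging (
    product_id INT,
    error_code INT
);

LOAD DATA LOCAL INFILE 'C:/Users/Public/Documents/updated_errors.csv'
INTO TABLE errors_staging
FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n'
IGNORE 1 LINES
(product_id, error_code);

-- Insert only the (product_id, error_code) pairs that are not in errors yet.
INSERT INTO errors (product_id, error_code)
SELECT DISTINCT s.product_id, s.error_code
FROM errors_staging s
LEFT JOIN errors e
  ON e.product_id = s.product_id AND e.error_code = s.error_code
WHERE e.error_id IS NULL;

DROP TEMPORARY TABLE errors_staging;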
Edit: I am using the Database Connectivity Toolkit for Big Data by Ovak Technologies in LabVIEW.
I have looked on previous forums, websites, etc. for this and can't find a solution. I keep getting this error despite having 8 columns in both my database table and the CSV file which I'm trying to load into it.
I have included screenshots of my command line, the database table which I'm loading into, and my CSV file.
Any help or suggestions would be much appreciated!
It's the line endings. MySQL isn't getting what it expects, so specify the format of the file using LINES TERMINATED BY '\r\n' or whatever is appropriate for you:
'\r\n' for files that came from Windows systems
'\r' for files from VMS
'\n' for every other source.
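Putting it together, a minimal sketch (the table and file names here are placeholders, since the real ones appear only in the screenshots):
LOAD DATA LOCAL INFILE 'C:/path/to/data.csv'  -- hypothetical path
INTO TABLE my_table                           -- hypothetical table name
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'                    -- match the file's actual line endings
IGNORE 1 LINES;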
The issue is not related to the line endings but to the commas within the data itself, as seen in the "Full Name" column of your data.
I was able to replicate and fix the issue.
Here is the database table used to replicate the issue; please note that the data definitely fits within the columns.
mysql> describe import;
+-------+--------------+------+-----+---------+-------+
| Field | Type         | Null | Key | Default | Extra |
+-------+--------------+------+-----+---------+-------+
| col1  | varchar(255) | YES  |     | NULL    |       |
| col2  | varchar(255) | YES  |     | NULL    |       |
+-------+--------------+------+-----+---------+-------+
2 rows in set (0.01 sec)
Replication:
I am sure that the line terminator is \n, as the file was created on Linux.
# cat /var/lib/mysql-files/import.csv
col1,col2
test1,value1,value2
test2,value3
The SQL statement that triggers the issue:
LOAD DATA INFILE '/var/lib/mysql-files/import.csv'
INTO TABLE import
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
Error:
ERROR 1262 (01000): Row 1 was truncated; it contained more data than there were input columns
Solution:
I had to change the data file and the SQL statement to make the issue go away.
I made sure that the data contained double quotes around the fields:
# cat /var/lib/mysql-files/import.csv
"col1","col2"
"test1","value1,value2"
"test2","value3"
I updated the SQL statement to declare that the fields are enclosed by double quotes, see the ENCLOSED BY '"' clause:
LOAD DATA INFILE '/var/lib/mysql-files/import.csv'
INTO TABLE import
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
Result:
Query OK, 2 rows affected (0.01 sec)
Records: 2 Deleted: 0 Skipped: 0 Warnings: 0
Import was successful:
mysql> select * from import;
+-------+---------------+
| col1  | col2          |
+-------+---------------+
| test1 | value1,value2 |
| test2 | value3 |
+-------+---------------+
2 rows in set (0.00 sec)
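As a side note, if only some fields in a file are quoted (a common CSV style), OPTIONALLY ENCLOSED BY accepts both quoted and unquoted fields; a minimal sketch against the same table:
LOAD DATA INFILE '/var/lib/mysql-files/import.csv'
INTO TABLE import
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'  -- quotes are stripped when present; unquoted fields still load
LINES TERMINATED BY '\n'
IGNORE 1 LINES;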
I need to load data into a table from a file using the LOAD DATA command.
I've got a txt file that looks something like this:
1 "MARCA"#"MODELO"#"MATRICULA"#PRECIO
2 "CITROEN"#"PICASSA"#"CPG-2044"#12000
3 "CITROEN"#"PICASSA"#"CPR-1762"#12500
4 "CITROEN"#"C4"#"FPP-1464"#13500
5 "CITROEN"#"C4"#"FDR-4563"#13000
6 "CITROEN"#"C3"#"BDF-8856"#8000
7 "CITROEN"#"C3"#"BPZ-7878"#7500
8 "CITROEN"#"C2"#"CDR-1515"#5000
9 "CITROEN"#"C2"#"BCC-3434"#4500
Now, my first table is constructed as follows:
mysql> show columns from MARCAS;
+----------+-------------+------+-----+---------+----------------+
| Field    | Type        | Null | Key | Default | Extra          |
+----------+-------------+------+-----+---------+----------------+
| ID_MARCA | int(11)     | NO   | PRI | NULL    | auto_increment |
| MARCA    | varchar(50) | YES  |     | NULL    |                |
+----------+-------------+------+-----+---------+----------------+
Now, I don't really know how to import the data partially, as what I need is to load just the first 'column'. What I came up with is:
load data local infile /myfile.txt
into table MARCAS
fields terminated by '#'
lines terminated by '\n';
but that just does nothing (apart from suspending the terminal).
Help please?
You can also discard an input value by assigning it to a user variable
and not assigning the variable to a table column:
source: http://dev.mysql.com/doc/refman/5.7/en/load-data.html
load data local infile '/myfile.txt'
into table MARCAS
fields terminated by '#'
lines terminated by '\n'
(ID_MARCA, MARCA, @ignore1, @ignore2, @ignore3);
Footnotes:
Your query is most unusual in the sense that you have your column names in uppercase and SQL keywords in lowercase. The usual convention is the other way round!
Note also that the file path must be quoted: your original statement omits the quotes around /myfile.txt, which makes it invalid.
You have said your mysql console gets suspended; I believe what you mean is that it takes a long time to return after this query is typed. If you have a large number of rows, there's nothing unusual in that.
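If the double quotes around the values should not end up in the table, a hedged refinement of the same statement (adding OPTIONALLY ENCLOSED BY to strip them and IGNORE 1 LINES to skip the header row):
load data local infile '/myfile.txt'
into table MARCAS
fields terminated by '#'
optionally enclosed by '"'
lines terminated by '\n'
ignore 1 lines
(ID_MARCA, MARCA, @ignore1, @ignore2, @ignore3);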
I want to import one million rows from a CSV document into a database table.
To do this fast, I use MySQL's LOAD DATA INFILE.
Everything works great, except that there is a problem with lines which are optionally enclosed by ".
The CSV file.
Name|Value\n
Anna|\n
Ben |Test\n
Chip|"My ""special"" value"\n
Deny|"I" like it\n
East|You not\n
The MySQL command.
LOAD DATA LOCAL INFILE 'test.csv'
INTO TABLE `test`
FIELDS TERMINATED BY '|'
ENCLOSED BY '"'
LINES TERMINATED BY "\n"
IGNORE 1 LINES
(`name`, @value)
SET
`value` = nullif(@value, '')
;
The result.
Query OK, 4 rows affected, 1 warning (0.17 sec)
Records: 4 Deleted: 0 Skipped: 0 Warnings: 1
The warnings.
+---------+------+--------------------------------------------+
| Level | Code | Message |
+---------+------+--------------------------------------------+
| Warning | 1265 | Data truncated for column 'value' at row 4 |
+---------+------+--------------------------------------------+
The table.
+----+------+------------------------+
| id | name | value                  |
+----+------+------------------------+
|  1 | Anna | NULL                   |
|  2 | Ben  | Test                   |
|  3 | Chip | My "special" value     |
|  4 | Deny | "I" like it
East|You |
+----+------+------------------------+
How to solve?
Please note:
My problem is not the warning!
As you can see, the CSV file contains 6 lines but only 5 records (without the header).
So I need 5 rows/entries in the MySQL table, but I only have 4 entries.
Your CSV file is invalid: line 5 contains a "-character, so (according to RFC 4180):
the field has to be surrounded by double quotes, and
the quotes inside the field have to be repeated.
Using this, you can successfully import your CSV by modifying it to:
Name|Value
Anna|
Ben |Test
Chip|"My ""special"" value"
Deny|"""I"" like it"
East|You not
This Data truncated for column 'value' at row 4 warning message indicates that your field value is longer than the specified column size. So you have to increase the size of the varchar to the maximum length of your values.
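For example, a minimal sketch (the exact column type and size are assumptions):
-- Hypothetical: widen the column so longer values no longer get truncated.
ALTER TABLE `test` MODIFY `value` VARCHAR(255);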
I have the following simple CSV file which I want to load into a MySQL table:
ViolationUtil1,RU
FiftyFifty2013_prof1,UM
Lunch_util1,RM
...
It contains several rows with two fields separated by a comma. I load it using the following command:
LOAD DATA LOCAL INFILE 'domains.txt'
INTO TABLE domains
FIELDS TERMINATED BY ",";
The table domains is defined in the following way:
CREATE TABLE domains (
domain varchar(63),
code varchar(3)
);
Strangely, the first letter of the first column disappears!
mysql> select * from domains;
+----------------------+------+
| domain               | code |
+----------------------+------+
|iolationUtil1 | RU
|iftyFifty2013_prof1 | UM
|unch_util1 | RM
+----------------------+------+
When I changed the definition of the "code" column to:
code char(2)
I mysteriously got the correct result:
mysql> select * from domains;
+----------------------+------+
| domain               | code |
+----------------------+------+
| ViolationUtil1       | RU   |
| FiftyFifty2013_prof1 | UM   |
| Lunch_util1          | RM   |
+----------------------+------+
What is happening here?
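If the file was created on Windows, the likely explanation (consistent with the line-endings answers above; treat this as a hedged diagnosis, since it is not from the original thread) is the \r\n line endings: with the default LINES TERMINATED BY '\n', each code is stored with a trailing carriage return ('RU\r'), which fits in varchar(3) and garbles the terminal output when printed, while char(2) silently truncates the \r so the display looks correct. The usual fix:
-- Assumes the file has Windows \r\n line endings.
LOAD DATA LOCAL INFILE 'domains.txt'
INTO TABLE domains
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n';
Running SELECT domain, HEX(code) FROM domains; after the original load would show the trailing 0D byte and confirm this.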