Declare NULL values in MySQL [duplicate]

This question already has answers here:
MySQL load NULL values from CSV data
(8 answers)
Closed 4 years ago.
I would like to fill an SQL table using the LOAD DATA INFILE command.
I have previously created a table with a numeric attribute (column), like:
CREATE TABLE table_1 (
    foo INT
);
So now I can use the LOAD DATA INFILE command.
In the file I want to load, missing values are coded as 'NA'. When I run the command, 'NA' values are loaded as '0' rather than as NULL, which is MySQL's standard representation for missing values.
My question is: how can I tell MySQL that 'NA' means a missing value?
Apologies if this is a recurring topic, but as I am not an SQL specialist I was not able to find the answer to this question.
Thanks in advance,
Marc

Something along the lines of
LOAD DATA INFILE 'path/to/file.csv'
INTO TABLE table_1
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
IGNORE 1 LINES
(@foo)
SET foo = NULLIF(@foo, 'NA');
Given that the file path/to/file.csv has the following contents:
foo
1
2
3
NA
4
NA
NA
Here is what you'd see in your table after loading the data
mysql> select * from table_1;
+------+
| foo  |
+------+
|    1 |
|    2 |
|    3 |
| NULL |
|    4 |
| NULL |
| NULL |
+------+
7 rows in set (0.00 sec)
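For reference, MySQL's own marker for NULL in a LOAD DATA input file is the sequence \N (with the default FIELDS ESCAPED BY '\\'), so if you are able to regenerate the file with \N in place of NA, the NULLs load directly and the user-variable/NULLIF step is not needed:
foo
1
2
3
\N
4
\N
\N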

Related

How to join a table with a small text dataset?

I have a table like this:
// poses
+----+------------+-------+
| id | pos_number | value |
+----+------------+-------+
|  1 |      10001 | NULL  |
|  2 |      10002 | NULL  |
+----+------------+-------+
Also, I have raw data (excel file) like this:
10001 | x
10002 | y
As I said, it's an Excel file that I can open as a text file and parse with a regex. I want to join that real table (poses) with the raw data and then update the table. Something like this:
UPDATE poses p
JOIN ( ... ) temp_table ON p.pos_number = temp_table. ...
SET p.value = temp_table. ...
Anyway, what syntax lets me join plain text data against a real table in a MySQL query?
Here is the expected result:
// poses
+----+------------+-------+
| id | pos_number | value |
+----+------------+-------+
|  1 |      10001 | x     |
|  2 |      10002 | y     |
+----+------------+-------+
If the text file is big and you want to do it fast, you should use the LOAD DATA statement to import the text file into a MySQL table:
https://dev.mysql.com/doc/refman/8.0/en/load-data.html
Example (if you have a CSV file with a header as its first line):
LOAD DATA LOCAL INFILE 'file.csv'
INTO TABLE db.table
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
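Putting the pieces together, a minimal sketch of the whole flow, assuming the Excel sheet is exported as a two-column CSV (pos_number, value) with a header line; the staging-table name poses_import and the VARCHAR length are invented here for illustration:
-- staging table that mirrors the raw data
CREATE TEMPORARY TABLE poses_import (
  pos_number INT,
  value      VARCHAR(255)
);

-- load the exported file into the staging table
LOAD DATA LOCAL INFILE 'file.csv'
INTO TABLE poses_import
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(pos_number, value);

-- copy the values over to the real table
UPDATE poses p
JOIN poses_import i ON i.pos_number = p.pos_number
SET p.value = i.value;

DROP TEMPORARY TABLE poses_import;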
I did it this way:
Pattern:
([^ ]+) ([^ ]+)\n
Replacement:
select "$1" as pos_number, "$2" as value union \n
Then remove the trailing union from the last generated line.
Query:
UPDATE poses p join (
<result above>
) x on x.pos_number = p.pos_number
SET p.value = x.value
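With the two sample rows above, the generated derived table and the final UPDATE would look roughly like this (a sketch; single quotes are used for the string literals, which under MySQL's default sql_mode is equivalent to the double quotes produced by the regex replacement):
UPDATE poses p
JOIN (
  -- rows generated by the regex replacement, minus the trailing union
  SELECT '10001' AS pos_number, 'x' AS value
  UNION
  SELECT '10002', 'y'
) x ON x.pos_number = p.pos_number
SET p.value = x.value;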

MySQL "ERROR 1262 (01000): Row 1 was truncated; it contained more data than there were input columns" why is this?

I have looked through previous forum posts, websites, etc. and can't find a solution. I keep getting this error even though both my database table and the CSV file I'm trying to load have 8 columns.
I have included screenshots of my command line, the database table I'm loading into, and my CSV file.
Any help is much appreciated!
Any suggestions on this, please?
It's the line endings. MySQL isn't getting what it expects, so specify the format of the file using LINES TERMINATED BY '\r\n' or whatever is appropriate for you, as sketched after this list:
'\r\n' for files that came from Windows systems
'\r' for files from VMS
'\n' for every other source.
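For example, a minimal sketch for a comma-separated file created on Windows (the file path and table name are placeholders here):
-- tell MySQL the lines end in \r\n, not the default \n
LOAD DATA INFILE '/path/to/yourfile.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r\n';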
The issue is not related to the line endings but to the commas within the data itself, as seen in the "Full Name" column in your data.
I was able to replicate and fix the issue.
Here is the database table used to replicate the issue; note that the data definitely fits within the columns.
mysql> describe import;
+-------+--------------+------+-----+---------+-------+
| Field | Type         | Null | Key | Default | Extra |
+-------+--------------+------+-----+---------+-------+
| col1  | varchar(255) | YES  |     | NULL    |       |
| col2  | varchar(255) | YES  |     | NULL    |       |
+-------+--------------+------+-----+---------+-------+
2 rows in set (0.01 sec)
Replication:
I am sure that the line terminator is \n, as the file was created on Linux.
# cat /var/lib/mysql-files/import.csv
col1,col2
test1,value1,value2
test2,value3
SQL statement that gives the issue:
LOAD DATA INFILE '/var/lib/mysql-files/import.csv'
INTO TABLE import
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
Error:
ERROR 1262 (01000): Row 1 was truncated; it contained more data than there were input columns
Solution:
I had to change both the data file and the SQL statement to make the issue go away.
I made sure that the fields in the data were wrapped in double quotes:
# cat /var/lib/mysql-files/import.csv
"col1","col2"
"test1","value1,value2"
"test2","value3"
I updated the SQL statement so it knows the fields are enclosed by double quotes (see the ENCLOSED BY '"' clause):
LOAD DATA INFILE '/var/lib/mysql-files/import.csv'
INTO TABLE import
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
Result:
Query OK, 2 rows affected (0.01 sec)
Records: 2 Deleted: 0 Skipped: 0 Warnings: 0
Import was successful:
mysql> select * from import;
+-------+---------------+
| col1  | col2          |
+-------+---------------+
| test1 | value1,value2 |
| test2 | value3        |
+-------+---------------+
2 rows in set (0.00 sec)

Loading data to a table from a file

I need to load data into a table from a file using the LOAD DATA command.
I've got a txt file that looks something like this:
1 "MARCA"#"MODELO"#"MATRICULA"#PRECIO
2 "CITROEN"#"PICASSA"#"CPG-2044"#12000
3 "CITROEN"#"PICASSA"#"CPR-1762"#12500
4 "CITROEN"#"C4"#"FPP-1464"#13500
5 "CITROEN"#"C4"#"FDR-4563"#13000
6 "CITROEN"#"C3"#"BDF-8856"#8000
7 "CITROEN"#"C3"#"BPZ-7878"#7500
8 "CITROEN"#"C2"#"CDR-1515"#5000
9 "CITROEN"#"C2"#"BCC-3434"#4500
Now, my first table is constructed as follows:
mysql> show columns from MARCAS;
+----------+-------------+------+-----+---------+----------------+
| Field    | Type        | Null | Key | Default | Extra          |
+----------+-------------+------+-----+---------+----------------+
| ID_MARCA | int(11)     | NO   | PRI | NULL    | auto_increment |
| MARCA    | varchar(50) | YES  |     | NULL    |                |
+----------+-------------+------+-----+---------+----------------+
Now, I don't really know how to import the data partially (what I need is to load just the first 'column'). What I came up with is:
load data local infile /myfile.txt
into table MARCAS
fields terminated by '#'
lines terminated by '\n';
but that just does nothing (apart from suspending the terminal).
Help please?
You can also discard an input value by assigning it to a user variable
and not assigning the variable to a table column:
source: http://dev.mysql.com/doc/refman/5.7/en/load-data.html
load data local infile '/myfile.txt'
into table MARCAS
fields terminated by '#'
lines terminated by '\n'
(ID_MARCA, MARCA, @ignore1, @ignore2, @ignore3);
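Since the text fields in your file are wrapped in double quotes, you will probably also want an optionally enclosed by clause so the quotes are stripped on load. A sketch building on the statement above (not tested against your exact file):
load data local infile '/myfile.txt'
into table MARCAS
fields terminated by '#'
optionally enclosed by '"'  -- strip the double quotes around the text fields
lines terminated by '\n'
(ID_MARCA, MARCA, @ignore1, @ignore2, @ignore3);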
footnotes:
Your query is rather unusual in that your column names are in uppercase and the SQL keywords are in lowercase. The usual convention is the other way round!
You said your MySQL console gets suspended; I believe you mean that it takes a long time to return after the query is run. If you have a large number of rows, there's nothing unusual in that.

MySQL load data ignoring lines - if (optionally) enclosed is set

I want to import one million rows from a CSV document into a database table.
To do this fast I use MySQL's LOAD DATA INFILE.
The command mostly works great.
But there is a problem with lines that are optionally enclosed by ".
The CSV file.
Name|Value\n
Anna|\n
Ben |Test\n
Chip|"My ""special"" value"\n
Deny|"I" like it\n
East|You not\n
The MySQL command.
LOAD DATA LOCAL INFILE 'test.csv'
INTO TABLE `test`
FIELDS TERMINATED BY '|'
ENCLOSED BY '"'
LINES TERMINATED BY "\n"
IGNORE 1 LINES
(`name`, #value)
SET
`value` = nullif(#value, '')
;
The result.
Query OK, 4 rows affected, 1 warning (0.17 sec)
Records: 4 Deleted: 0 Skipped: 0 Warnings: 1
The warnings.
+---------+------+--------------------------------------------+
| Level   | Code | Message                                    |
+---------+------+--------------------------------------------+
| Warning | 1265 | Data truncated for column 'value' at row 4 |
+---------+------+--------------------------------------------+
The table.
+----+------+------------------------+
| id | name | value                  |
+----+------+------------------------+
|  1 | Anna | NULL                   |
|  2 | Ben  | Test                   |
|  3 | Chip | My "special" value     |
|  4 | Deny | "I" like it
East|You |
+----+------+------------------------+
How can this be solved?
Please note: my problem is not the warning!
As you can see, the CSV file contains 6 lines, i.e. 5 records excluding the header.
So I need 5 rows/entries in the MySQL table, but I only get 4 entries.
Your CSV file is invalid: line 5 contains a "-character, so (according to RFC 4180)
the field has to be surrounded by double quotes and
the quotes inside the field have to be repeated.
With that fixed, you can successfully import your CSV by modifying it to:
Name|Value
Anna|
Ben |Test
Chip|"My ""special"" value"
Deny|"""I"" like it"
East|You not
The Data truncated for column 'value' at row 4 warning indicates that a value is longer than the column's defined size, so you have to increase the VARCHAR length to at least the maximum length of your values.
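A sketch of what that change could look like, assuming the value column is currently a short VARCHAR; 255 is an example length, pick whatever covers your longest value:
-- widen the column on the table from the question
ALTER TABLE `test` MODIFY `value` VARCHAR(255);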

mysql load query not working perfectly

I want to insert data into a table via the LOAD DATA command in MySQL, but whenever I run my query the data is entered only into the first column and the other one is NULL.
My text file is:
- 1 server
- 2 client
- 3 network
- 4 system
The first column is the error code and the second is the comment, and my query is:
load data local infile 'C:/Users/nco/Desktop/help.txt' into table help;
After that select * from help;
And the output is:
mysql> select * from help;
+------------+-------------+
| error_code | description |
+------------+-------------+
|          1 | NULL        |
|          2 | NULL        |
|          3 | NULL        |
|          4 | NULL        |
+------------+-------------+
4 rows in set (0.03 sec)
Any idea what the problem might be?
If you created the file on Windows with an editor that uses \r\n as a line terminator, you should use this statement instead:
LOAD DATA LOCAL INFILE 'C:/Users/nco/Desktop/help.txt' INTO TABLE help
LINES TERMINATED BY '\r\n';
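If the two columns in help.txt are separated by tabs (MySQL's default field terminator), a fuller, more explicit form of the same statement might look like this (a sketch; adjust the separator to whatever the file actually uses):
LOAD DATA LOCAL INFILE 'C:/Users/nco/Desktop/help.txt'
INTO TABLE help
FIELDS TERMINATED BY '\t'   -- change if the columns are separated by something else
LINES TERMINATED BY '\r\n'
(error_code, description);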