I am seeing a strange problem when loading my data with sqlldr. Here is my table schema:
CREATE TABLE TEST(
"COL1" VARCHAR2 (255 BYTE),
"COL2" VARCHAR2 (255 BYTE),
"COL3" NUMBER,
"COL4" VARCHAR2 (255 BYTE)
and here is just one row of data I am trying to ingest from the tab delimited file test.txt:
COL1 COL2 COL3 COL4
10 17-cc
Notice that the first two columns are empty (null), so my row is really:
\t\t10\t17-cc
my loader script:
load data
infile 'test.txt'
append into table TEST
fields terminated by "\t" optionally enclosed by '"'
TRAILING NULLCOLS
(COL1,COL2,COL3,COL4)
This will be loaded into my table as:
COL1 COL2 COL3 COL4
10 17-CC (null) (null)
which is incorrect. It seems that the two leading tabs in the data row were ignored and the COL3 value (10) was assigned to COL1. However, if I try to import the data as a comma-separated file:
COL1,COL2,COL3,COL4
,,10,17-cc
it works as expected. Why does the tab-delimited version fail here?
NOTE - Fixed my original wrong answer.
Your TAB is defined just fine. You need the NULLIF statement:
load data
infile 'test.txt'
append into table TEST
fields terminated by "\t" optionally enclosed by '"'
TRAILING NULLCOLS
(COL1 NULLIF(COL1=BLANKS),
COL2 NULLIF(COL2=BLANKS),
COL3 NULLIF(COL3=BLANKS),
COL4 NULLIF(COL4=BLANKS)
)
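After reloading with the NULLIF clauses in place, a quick sanity check; for the sample row the expected result is NULL, NULL, 10, 17-cc:
SELECT COL1, COL2, COL3, COL4 FROM TEST;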
Hi, I have a CSV dataset like:
ukwn,34,2018-10-01,"757,271"
ukwn,3,2018-10-01,"7,342"
"hi",23,2018-10-01,"3,483,887"
I want to insert it into the database, so I wrote this:
LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE app_spend_metric
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col1,col2,col3,col4)
But it fails to insert col4 (the 4th column) correctly, as there is a ',' inside the quotes, like "7,342".
Then I tried:
LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE app_spend_metric
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col1,col2,col3,col4)
But this time it inserts only partial data into col4: out of "7,342" it only stores 7.
If col4 is numeric (e.g., INT), then the problem is as follows:
Parse the line to get the string "7,342"
Strip the enclosing ": 7,342
Store the string into the INT column. This requires converting that string to a number.
Conversion stops at the first non-numeric character, namely the comma.
Result: col4 is set to 7, and ,342 is tossed.
MySQL cannot deal with "thousands-separators" in numbers. But you could strip them:
LOAD ...
(col1, col2, col3, @num)
SET col4 = REPLACE(@num, ',', '')
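For reference, a complete statement along those lines might look like the sketch below (the @num variable name is arbitrary, and it assumes the same file, table, and column names as in the question):
LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE app_spend_metric
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col1, col2, col3, @num)               -- read the 4th field into a user variable
SET col4 = REPLACE(@num, ',', '');     -- strip the thousands separators before storing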
In phpMyAdmin, on the Import tab, how do I specify which columns of the CSV must be skipped by the import?
For example, I have this csv:
col1 col2 col3 col4 col5
a b c x 0
1 2 3 y 1
I need to manually import that csv skipping the 4th column.
With the "CSV using LOAD DATA" format I can specify the column names; in that field I tried to enter these, but without success:
col1, col2, col3, #dummy, col5
Invalid column (#dummy) specified! Ensure that columns names are spelled correctly, separated by commas, and not enclosed in quotes.
and
col1, col2, col3, , col5
SQL query:
LOAD DATA LOCAL INFILE '/tmp/phpBlaBlaBla'
INTO TABLE `tblName` FIELDS TERMINATED BY ';' ENCLOSED BY '"' ESCAPED BY '\' LINES TERMINATED BY '\n' IGNORE 1 LINES
(`col1` , `col2` , `col3` , , `col5`)
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near `col5`)' at line 1
The phpMyAdmin version is 4.0.10.16.
Thanks in advance!
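For what it's worth, when the LOAD DATA statement is written by hand rather than generated by the import dialog, the usual way to skip a field is to read it into a user variable. A minimal sketch, assuming the same table and delimiters as the generated query above (the file path is just a placeholder):
LOAD DATA LOCAL INFILE '/tmp/yourfile.csv'
INTO TABLE `tblName`
FIELDS TERMINATED BY ';' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(`col1`, `col2`, `col3`, @skip, `col5`);   -- @skip absorbs the 4th field and is never stored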
I have a CSV file. How can I tell the sqlldr control file to load missing values as NULL? (i.e. the table schema allows NULL for certain columns)
Example of CSV
1,Name1
2,Name2
3,
4,Name3
Could you help me edit my control file here so that for line 3 the missing value is inserted as NULL in my table?
Table
Create table test
( id Number(2), name Varchar(10), primary key (id) );
Control file
LOAD DATA INFILE '{path}\CSVfile.txt'
INSERT INTO TABLE test
FIELDS TERMINATED BY ','
(id CHAR,
name CHAR
)
I believe all you should have to do is this:
name CHAR(10) NULLIF(name=BLANKS)
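In context, and assuming the same table and file from the question, the whole control file would be along these lines (a sketch, not a tested file):
LOAD DATA INFILE '{path}\CSVfile.txt'
INSERT INTO TABLE test
FIELDS TERMINATED BY ','
(id   CHAR,
 name CHAR(10) NULLIF(name=BLANKS)
)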
You have to hint to SQL*Loader that there might be nulls in your data. There are two ways to give that hint to SQL*Loader.
Use the TRAILING NULLCOLS option:
LOAD DATA INFILE '{path}\CSVfile.txt'
INSERT INTO TABLE test
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(id CHAR,
name CHAR
)
Or recreate your CSV files with enclosed fields and then use OPTIONALLY ENCLOSED BY '"', which lets SQL*Loader clearly see the nulls in your data (nothing between the quotes), e.g. "abcd","":
LOAD DATA INFILE '{path}\CSVfile.txt'
INSERT INTO TABLE test
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(id CHAR,
name CHAR
)
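For the sample file from the question, the re-created version would then look like this, with line 3 carrying an explicit empty value:
"1","Name1"
"2","Name2"
"3",""
"4","Name3"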
I found that using TRAILING NULLCOLS will do the job, BUT only for "blanks" at the end of the record line.
LOAD DATA INFILE '{path}\Your_File'
INSERT INTO TABLE Your_Table
FIELDS TERMINATED BY ","
TRAILING NULLCOLS
(
... your fields
)
Got working import code for CSV files into my 8-column table here:
Load Data LOCAL InFile 'file.csv' into table myTable
Fields Terminated by ','
OPTIONALLY Enclosed by '"'
Lines Terminated by '\n'
IGNORE 1 Lines
(col1, col2, @var3, @var4, col5, col6, col7, col8)
Set
col3 = if(@var3 = '', NULL, @var3),
col4 = if(@var4 = '', NULL, @var4)
;
It's working fine in changing empty entries to NULL values, but....
Is there any way to shorten the Set part so I don't have to specify a condition for each and every column?
I actually need this for 7 of the 8 columns above, and this particular table is rather small.
Is there any way to shorten the Set part
Yes, MySQL provides a shorthand function NULLIF():
col3 = NULLIF(@var3, '') -- etc
so I don't have to specify a condition for each and every column?
Sadly not, although it should be fairly trivial to generate the desired SQL dynamically in your application code.
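Putting the shorthand together, the full statement could look like the sketch below; which seven of the eight columns need the NULL handling is an assumption here (everything but col1), so adjust to taste:
Load Data LOCAL InFile 'file.csv' into table myTable
Fields Terminated by ','
OPTIONALLY Enclosed by '"'
Lines Terminated by '\n'
IGNORE 1 Lines
(col1, @v2, @v3, @v4, @v5, @v6, @v7, @v8)
Set
col2 = NULLIF(@v2, ''),
col3 = NULLIF(@v3, ''),
col4 = NULLIF(@v4, ''),
col5 = NULLIF(@v5, ''),
col6 = NULLIF(@v6, ''),
col7 = NULLIF(@v7, ''),
col8 = NULLIF(@v8, '')
;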
I have 35 CSV files which I want to import into a MySQL table (say 'test'). I want to create one column in the 'test' table (say 'file_name'). This column will contain the name of the CSV from which the data has been imported. The file names are unique IDs, which is why I want to get the file name into the table.
Suppose I have CSV files like X1.csv, X2.csv, X3.csv ... X35.csv. I want a column in the 'test' table called 'file_name' such that the 'test' table looks something like:
col1 -> a, b, c, d
col2 -> x, y, w, z
...
file_name -> X1, X1, X2, X3
Note: I tried to search for this question on the forum but I could not find a suitable solution. Also, I am new to MySQL, so please help even if it is a trivial thing.
I'm not sure this is exactly what you are looking for, but at first sight, you should investigate the LOAD DATA INFILE statement:
LOAD DATA INFILE 'X1.csv' INTO TABLE tbl_name -- Load the content of the CSV file
FIELDS TERMINATED BY ',' ENCLOSED BY '"' -- assuming fields separated by ',' and enclosed by '"'
LINES TERMINATED BY '\r\n' -- assuming end-of-line being '\r\n'
IGNORE 1 LINES -- assuming first line is a header and should be ignored
SET file_name = 'X1'; -- force the column `file_name` to be the name of the file
Please note that with such a statement, each field will go into its own column of the table, and each line of the CSV data file will be loaded as one row in the table. This implies that there will be several rows in the resulting table with the same file name: in fact, one row per data line.
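Since there are 35 files, that statement has to be run once per file, changing both the file name and the SET value; for example, for the second file:
LOAD DATA INFILE 'X2.csv' INTO TABLE tbl_name
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
SET file_name = 'X2';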