SQLLDR load file with single column - sql-loader

First time using sqlldr; I'm simply trying to load a file with a single column, and cannot find a proper way to do it :(
What do I put as a delimiter?
Here's my .ctl file:
load data
infile 'myfile.dat'
into table mytable
fields terminated by ''
(mycolumn)
I keep getting errors in the .log like:
Record 4: Rejected - Error on table ..., column ....
ORA-12899: value too large for column "..." (actual: 80, maximum: 24)
Even though the values in the file are at most 8 characters each:
string1
string2
string3
Any help will be much appreciated.
Many thanks,
G

You don't need the FIELDS TERMINATED BY line in this case, but you should have a TRUNCATE or APPEND, depending on whether you want to keep the existing data or not.
load data
infile 'myfile.dat'
truncate
into table mytable
(mycolumn)
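A hedged aside, since the .dat file itself isn't shown: when SQL*Loader reports "actual: 80" for values that look 8 characters long, the record usually carries invisible padding or a stray carriage return. A small Python sketch of the effect (the padding here is an assumption for illustration):

```python
# Hypothetical record: the visible value is 7 characters, but trailing
# blanks and a carriage return pad it to 80 bytes, which is what the
# loader then tries to fit into the 24-character column.
record = "string1".ljust(79) + "\r"

print(len(record))           # bytes the loader sees
print(len(record.rstrip()))  # characters the user expects
```

If that turns out to be the cause, trimming in the control file (e.g. `mycolumn CHAR(80) "TRIM(:mycolumn)"`) is one way to handle it.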

Why not just change the size of your column to 80?
ALTER TABLE mytable MODIFY (mycolumn VARCHAR2(80));

Related

add fixed value to column in MySQL when loading data from a csv file

I need to enter a text value to represent the year (it will be the same for every row) in a set of data being imported from a csv file. I am getting a syntax error each time. How do I specify the text value so that it will populate the column properly?
Load data local infile 'C:/Users/Candace.....csv'
into table estimate(State, '2010', Population)
fields terminated by ',';
Not tested, though according to the documentation it should work:
LOAD DATA INFILE 'file.csv'
INTO TABLE estimate
(State, Population)
SET Year = 2010;
Relevant part from the doc:
The SET clause can be used to supply values not derived from the input file.

LOAD DATA INFILE - fields terminated by character which also appears in field

I have a large .csv file which I want to import into a MySQL database. I want to use the LOAD DATA INFILE statement on the basis of its speed.
Fields are terminated by -|-. Lines are terminated by |--. Currently I am using the following statement:
LOAD DATA LOCAL INFILE 'C:\\test.csv' INTO TABLE mytable FIELDS TERMINATED BY '-|-' LINES TERMINATED BY '|--'
Most rows look something like this: (Note that the strings are not enclosed by any characters.)
goodstring-|--|-goodstring-|-goodstring-|-goodstring|--
goodstring-|--|-goodstring-|-goodstring-|-|--
goodstring-|-goodstring-|-goodstring-|-goodstring-|-|--
goodstring is a string that does not contain the - character. As you can see, the second or the last column might be empty. Rows like the ones above do not cause any problems. However, the last column may also contain - characters. There might be a row that looks something like this:
goodstring-|--|-goodstring-|-goodstring-|---|--
The string -- in the last column causes problems. MySQL detects six instead of five columns. It inserts a single - character into the fifth column and truncates the sixth. The correct DB row should be ("goodstring", NULL, "goodstring", "goodstring", "--").
A solution would be to tell MySQL to regard everything after the fourth field terminator as part of the fifth column (up until the line terminator). Is this possible with LOAD DATA INFILE? Are there methods that yield the same result, do not require the source file to be edited, and perform about as fast as LOAD DATA INFILE?
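The "everything after the fourth terminator belongs to the fifth field" idea can be sketched outside MySQL. This is a rough Python illustration of the parsing problem, not MySQL's actual parser: a greedy split reproduces the six-field misparse, while capping the number of splits keeps the tail together:

```python
raw = "goodstring-|--|-goodstring-|-goodstring-|---|--"

# Greedy split on the field terminator: the "--" in the last field is
# mistaken for part of another "-|-" terminator, so six fields come out.
greedy = raw.split("-|-")

# Capping at 4 splits keeps everything after the fourth terminator in
# the fifth field; then strip the "|--" line terminator from the tail.
fields = raw.split("-|-", 4)
fields[-1] = fields[-1].removesuffix("|--")

print(greedy)   # ['goodstring', '', 'goodstring', 'goodstring', '-', '-']
print(fields)   # ['goodstring', '', 'goodstring', 'goodstring', '--']
```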
This is my solution:
LOAD DATA
LOCAL INFILE 'C:\\test.csv'
INTO TABLE mytable
FIELDS TERMINATED BY '-|-'
LINES TERMINATED BY '-\r\n'
(col1, col2, col3, col4, @col5, col6)
SET col5 = CASE WHEN col6 IS NOT NULL
                THEN CONCAT(@col5, '-')
                ELSE LEFT(@col5, LENGTH(@col5) - 2)
           END;
(Note that MySQL user variables are written @col5, and the CASE expression can be assigned directly in the SET clause.)
It will turn a row like this one:
goodstring-|--|-goodstring-|-goodstring-|-|--
Into this:
("goodstring", "", "goodstring", "goodstring", NULL)
And a bad row like this one:
goodstring-|--|-goodstring-|-goodstring-|---|--
Into this:
("goodstring", "", "goodstring", "goodstring", "")
I simply drop the last column after the import.

Load data Query with regular expression

I need to load data from a csv file into a particular table. I am loading 5 fields of the csv file into the table. I need to apply a regular expression to a particular field value in the csv file; if it doesn't match, I need to reject that record. Is that possible?
This is my load data query:
LOAD DATA LOCAL INFILE '/test.csv'
INTO TABLE TEST_TABLE
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(@FIELD1, @FIELD2, @FIELD3, @FIELD4, @FIELD5)
SET
FIELD1 = STR_TO_DATE(@FIELD1, '%d-%m-%Y'), FIELD2 = NULLIF(@FIELD2, ''),
FIELD3 = NULLIF(@FIELD3, ''), FIELD4 = NULLIF(@FIELD4, ''),
FIELD5 = NULLIF(@FIELD5, '');
If the value coming in field4 of the csv file is equal to either 200 or 300, I need to keep that record and load the other values; otherwise I need to reject the record.
Sample file::
1),234232323,STATUS,200,33
2),45454545,STATUS,300,33
3),646546445,STATUS,100,33
Here the 1st and 2nd records should be loaded and the 3rd record rejected.
LOAD ...;
DELETE FROM TEST_TABLE WHERE FIELD4 NOT IN (200, 300);
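If loading rejected rows and deleting them afterwards is not acceptable, an alternative is to filter the file before LOAD DATA runs. A hypothetical Python pre-filter sketch, using the sample rows from the question (the in-memory list stands in for reading the real file):

```python
import re

# Keep only the rows whose 4th field is exactly 200 or 300; the
# filtered output is what LOAD DATA would then import as-is.
pattern = re.compile(r"^(200|300)$")

rows = [
    "1),234232323,STATUS,200,33",
    "2),45454545,STATUS,300,33",
    "3),646546445,STATUS,100,33",
]
kept = [r for r in rows if pattern.match(r.split(",")[3])]

print(len(kept))  # 2
```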

mysql load data, issue with boolean field

I'm loading data into a mysql table with the statement:
LOAD DATA LOCAL INFILE "/home/user123/Documents/PartageVB/export tables/pays.csv" INTO TABLE T_PAYS FIELDS TERMINATED BY ";" LINES TERMINATED BY "\n"(id, name, cit,actif);
In the file pays.csv, the column cit is boolean, and gets the value 1 or 0.
My problem is that once LOAD DATA is done, cit always has the value 1 in the mysql table.
Does anyone know where I'm wrong?
In the table, cit type is Bit(1).
Thanks
Please use TINYINT(1) instead of BIT(1). It might work.
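The likely reason cit always ends up as 1: LOAD DATA delivers each field as text, so a BIT(1) column receives the character '0' (byte 0x30) rather than the number 0, and any nonzero byte stored into a bit becomes 1. A small Python illustration of the byte values involved:

```python
# The characters '0' and '1' from the csv are the bytes 0x30 and 0x31;
# both are nonzero, so both truncate to bit value 1 in a BIT(1) column.
for text in ("0", "1"):
    byte = ord(text)               # the byte value MySQL receives
    bit = 1 if byte != 0 else 0    # what lands in the BIT(1) column
    print(text, byte, bit)
```

If changing the column type is not an option, the usual MySQL-side fix is to read the field into a user variable and convert it, e.g. `(id, name, @cit, actif) SET cit = CAST(@cit AS SIGNED)` (sketched here against the question's column list).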

load data local infile - select ignoring last column

I'm trying to import a csv file into a mysql database using load data local infile.
My csv file looks like this:
1;Lubos;Chrapna;92;muz;Topolcany
2;Branislav;Grecni;28;muz;Topolcianky
3;David;Forgac;57;muz;Hronsky Benadik
4;Imrich;Doci;58;muz;Kosice
The table I want to put this csv file in looks like this:
CREATE TABLE IF NOT EXISTS DIM_zakaznik (
id int(5),
meno varchar(15),
priezvisko varchar(15),
vek int(15),
pohlavie varchar(15),
bydlisko varchar(15))
and my query looks like this:
load data local infile 'dim_zakaznik.csv' into table DIM_zakaznik
fields terminated by ';'
enclosed by '\n'
lines terminated by '\n';
It works just fine, and when I look at the table in phpMyAdmin everything looks OK, but my problem is that when I try to run a SELECT, it ignores the last column (called 'bydlisko').
for example I try this:
SELECT * FROM `dim_zakaznik` WHERE `bydlisko`='Topolcany'
and all it does is say: "MySQL returned an empty result set (i.e. zero rows). (Query took 0.0184 sec)"
That shouldn't be an empty result, right?
It's only the last column that doesn't work, and I have no idea why.
When I tried it before with these columns plus one additional column (which then became the last column in the table), it ignored that additional column instead.
I'd really appreciate some help. Thanks in advance!
Do not use the "enclosed by '\n'" clause in your query, as your fields in the CSV are not enclosed by '\n' characters. Try removing that clause, then run your query again.
Also, just as a remark, be sure that you don't truncate column data. The last column is VARCHAR(15): is that enough for a 15-character value? ("Hronsky Benadik" is exactly 15 characters, and a VARCHAR uses one extra byte to store the length, so it needs 16 bytes.)
So, an old one, but I just stumbled across this and noticed there are no good answers to it.
I see this happen every single time I assign and then grade an "upload a csv" homework in a database class. Look closely at what your .csv file actually contains, character by character.
If the query you showed does NOT work:
SELECT * FROM `dim_zakaznik` WHERE `bydlisko`='Topolcany'
then I feel the need to ask, does this query work:
SELECT * FROM `dim_zakaznik` WHERE `bydlisko`=' Topolcany'
If so, then remember: "spaces are for humans", and adjust your .csv file accordingly. (Also, checking whether you need LINES TERMINATED BY '\r\n' will help if your file has Windows-style line endings.)
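The stray-character theory can be sketched in Python. This is a hypothetical reproduction (not the asker's actual file) of what happens when a file with Windows line endings is loaded with LINES TERMINATED BY '\n':

```python
# The last field keeps a stray '\r', so an equality comparison on
# 'Topolcany' matches nothing even though the value looks identical.
line = "1;Lubos;Chrapna;92;muz;Topolcany\r"
fields = line.split(";")

print(fields[-1] == "Topolcany")           # False
print(fields[-1].rstrip() == "Topolcany")  # True
```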