I have a CSV file (12 GB) which was exported from an Oracle database,
formatted like
6436,,N,,,,,,,,,,,,04/01/1999,04/01/1999,352,1270,1270,406,406,1999,1,31/01/1999,0,88,0,A,11/12/2005,N,0,11/12/2005,,,,1270,1,0,,2974,,,,,,,,,,,,,,,,,,,,,,,,
As you can see, it has a lot of null values (mostly in integer columns),
and when I import it into a MySQL database, it fills the null values with zeros,
like
6436,0,0,,0,0,0,0,0,0,0,0,0,0,4,04/01/1999,04/01/1999,1270,1270,406,406,1999,1,31,31/01/1999,88,0,A,11/12/2005,N,0,11/12/2005,0,0,0,1270,1,0,,2974,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,,,0,0,00/00/0000,0,,0,0
What is the real issue here?
Thanks.
I figured it out with the help of @Marc B's comment.
I wrote something like
LOAD DATA LOCAL INFILE '/path/data.csv' INTO TABLE table_name
FIELDS TERMINATED BY ','
IGNORE 1 LINES
(list the columns here, using @variables for the ones that need conversion)
SET
column = IF(LENGTH(@column) = 0, NULL, @column),
date = STR_TO_DATE(@date, '%d/%m/%Y');
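For example, with made-up column names (id, amount and paid_date are placeholders, not the real columns from the Oracle export), the full statement could look like this; NULLIF(@v, '') does the same job as the IF/LENGTH check:
LOAD DATA LOCAL INFILE '/path/data.csv' INTO TABLE table_name
FIELDS TERMINATED BY ','
IGNORE 1 LINES
(id, @amount, @paid_date)
SET
amount = NULLIF(@amount, ''),
paid_date = STR_TO_DATE(@paid_date, '%d/%m/%Y');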
Related
I am loading a local CSV into a table in MySQL. The table has 6 columns (or fields), but after I load the data, only the message_date column is filled. In other words, the other 5 columns in the table are all NULL after the LOAD DATA runs. Why? Do I need to SET all of the other 5 columns to bring them into my table?
Detail: when I do not SET any variable and just import everything as character fields, all of the columns appear. So it looks like something after the IGNORE clause is affecting the other 5 columns.
LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/DadosBrutosEventTracks.csv'
INTO TABLE messages
FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@var1, @var2, @var3, @var4, @var5, @var6)
SET
message_date = STR_TO_DATE(SUBSTRING(@var3, 1, 22), "%Y-%m-%d %H:%i:%s.%f");
You only load data into variables. Therefore no data goes into any columns. Loading data into variables happens instead of loading into columns.
(@var1, @var2, @var3, @var4, @var5, @var6)
Then you set one column:
SET
message_date = STR_TO_DATE(SUBSTRING(@var3, 1, 22), "%Y-%m-%d %H:%i:%s.%f");
The other columns are not mentioned in the SET clause, so they get no data.
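A minimal sketch of a corrected statement, assuming the other five columns are literally named col1, col2, col4, col5 and col6 (the question does not show their real names): every field you want stored must either appear as a column name in the list or be assigned from its @variable in the SET clause.
LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/DadosBrutosEventTracks.csv'
INTO TABLE messages
FIELDS TERMINATED BY ';' LINES TERMINATED BY '\n'
IGNORE 1 LINES
(col1, col2, @var3, col4, col5, col6)
SET
message_date = STR_TO_DATE(SUBSTRING(@var3, 1, 22), "%Y-%m-%d %H:%i:%s.%f");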
I need a way to use the UTC_TIMESTAMP() function in a CSV file.
Currently I'm using the following syntax:
LOAD DATA INFILE '/path/to/file.csv'
INTO TABLE my_table FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n';
I'm not quite sure what to do to keep the UTC_TIMESTAMP() from being enclosed in quotes. When entered in the database, I get the following result
"0000-00-00 00:00:00"
This was the only example I could find on Stack Overflow or on Google for converting a string to a MySQL value:
MySQL CSV import: datetime value
I solved the problem by following the MySQL documentation on this page:
https://dev.mysql.com/doc/refman/5.7/en/load-data.html
About halfway down the page there is a section that shows how to create user variables for the input fields and then set table columns equal to native MySQL functions or to values assigned to those variables (whichever you choose).
The example in the documentation that I'm referring to looks like this.
LOAD DATA INFILE 'file.txt'
INTO TABLE t1
(column1, column2)
SET column3 = CURRENT_TIMESTAMP;
I fixed my problem by restructuring my code like this...
LOAD DATA LOCAL INFILE '/path/to/file.csv'
INTO TABLE veh_icodes FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(id, vcode, year, make, model, body_style, tran_type, engine_cap, drive_train, doors, trim, created_by, updated_by, #created_at, #updated_at, active)
SET created_at = CURRENT_TIMESTAMP, updated_at = CURRENT_TIMESTAMP;
I hope this helps someone. :)
I am kind of new here. I have been searching for 2 days with no luck, so I am posting my question here. Simply put, I need to load data into a table in MySQL. The thing is, the input data for this table will come from two different sources.
For example, below is how the 2 input files will look.
Input_file1
Field: Cust_ID1, Acct_ID1, MODIFIED, Cust_Name, Acct_name, Comp_name1, Add2, Comp_name2, Add4
Sample value: C1001, A1001, XXXXXX, JACK, TIM KFC, SINGAPORE, YUM BRAND, SINGAPORE
Input_file2
Field: ID, MODIFIEDBY, Ref_id, Sys_id
Sample value: 3001, TONY, 4001, 5001
Sorry, I was not able to copy the data as it is in Excel, so I improvised. The ',' separates the values; 'Field' gives the column names, and the corresponding values are under 'Sample value'.
And the table that the above data needs to be loaded into is as follows:
Sample _table_structure
ID
Cust_ID1
Acct_ID1
Ref_id
Sys_id
MODIFIED
MODIFIEDBY
Cust_Name
Acct_name
Comp_name1
Add2
Comp_name2
Add4
What I need to do is load the data from both inputs into this table in one single go. Is this possible? As you can see, the column order does not match either, so I cannot simply append the files and load them, which is the main issue for me.
And no, changing the input sequence is not an option; the data is huge, so that would take too much effort. I would appreciate any help with this. I would also like to know whether a shell or Perl script could be used to do this.
Thanks in advance for the help & time.
load data local infile 'c:\\temp\\file.csv'
into table table_name fields terminated by ',' LINES TERMINATED BY '\r\n' ignore 1 lines
(@col1,@col2,@col3,@col4,@col5,@col6,@col7,@col8,@col9)
set Cust_ID1 = @col1,
Acct_ID1 = @col2,
MODIFIED = @col3,
Cust_Name = @col4, ...;
Do the same thing for CSV file 2, numbering the variables to match its columns (ID, MODIFIEDBY, Ref_id, Sys_id):
load data local infile 'c:\\temp\\file2.csv'
into table table_name fields terminated by ',' LINES TERMINATED BY '\r\n' ignore 1 lines
(@col1,@col2,@col3,@col4)
set ID = @col1,
MODIFIEDBY = @col2,
Ref_id = @col3,
Sys_id = @col4;
This way you can import each file into the table.
Note: please save the Excel files in CSV format and then import.
I have a form where the user updates a table in the database with serial numbers. The problem is that in my .csv file the serial number has the value 0, and after inserting it the table holds 000000; the same for 1, which after inserting becomes 000001. I need it exactly the way it is in the .csv file. My code for the LOAD is:
LOAD DATA LOCAL INFILE 'path_to_file.csv'
INTO TABLE im_seriennummer CHARACTER SET latin1
FIELDS TERMINATED BY ";"
IGNORE 1 LINES
(sn,description_sn);
In .csv file it is like this:
0
1
And in database
000000
000001
In the database sn is varchar(16).
Is this problem familiar to anyone? Please don't tell me to change the type of the field; I need to keep it as varchar since some serial numbers look like MT 002.
The solution, I think, is to use a temp table to import the CSV into.
CREATE TEMPORARY TABLE tmptab LIKE im_seriennummer;
LOAD DATA LOCAL INFILE 'path_to_file.csv'
INTO TABLE tmptab CHARACTER SET latin1
FIELDS TERMINATED BY ";"
IGNORE 1 LINES
(sn,description_sn);
UPDATE tmptab SET sn = RIGHT(CONCAT('000000', sn), 6);
INSERT INTO im_seriennummer
SELECT * FROM tmptab;
DROP TEMPORARY TABLE tmptab;
I need experts' help to get the whole original line using LOAD DATA LOCAL INFILE and put it into my db column.
Sample table:
DROP TABLE IF EXISTS `syslog`;
CREATE TABLE `syslog` (
`the_time` VARCHAR(80) NOT NULL,
`the_key` VARCHAR(30) NOT NULL,
`the_log` VARCHAR(1024) NOT NULL
)
ENGINE = MyISAM;
File : D:/rnd/syslog.csv
"device","date_time","src_ip","dst_ip","log_type","message"
"Fortigate","2012-05-02 12:02:03","192.168.1.1","192.168.1.11","vpn","Sample message1"
"Fortigate","2012-05-02 12:02:04","192.168.1.2","192.168.1.12","vpn","Sample message2"
"Fortigate","2012-05-02 12:02:05","192.168.1.3","192.168.1.13","traffic","Sample message3"
"Fortigate","2012-05-02 12:02:06","192.168.1.4","192.168.1.14","traffic","Sample message4"
"Fortigate","2012-05-02 12:02:07","192.168.1.5","192.168.1.15","vpn","Sample message5"
"Fortigate","2012-05-02 12:02:08","192.168.1.6","192.168.1.16","vpn","Sample message6"
MySQL statement:
SET @delimeter = ",";
LOAD DATA LOCAL INFILE
"D:/rnd/syslog.csv"
INTO TABLE syslog
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
( @device,
@date_time,
@src_ip,
@dst_ip,
@log_type,
@message)
SET the_time = @date_time,
the_key = CONCAT(@src_ip, "~", @dst_ip),
the_log = CONCAT(@device, @delimeter, @date_time, @delimeter, @src_ip, @delimeter, @dst_ip, @delimeter, @log_type, @delimeter, @message);
Currently it is just working with this manual setting:
the_log = CONCAT(@device, @delimeter, @date_time, @delimeter, @src_ip, @delimeter, @dst_ip, @delimeter, @log_type, @delimeter, @message)
Is there any other way to get the whole line? The actual table has 60 columns, so it is not a good idea to build this manually inside the code, and it would not be easy to maintain later.
Objective: to take the CSV data and manipulate it into my own table (meaning my columns are not the same as the CSV's).
The other way is to create a table whose structure matches the CSV file, store the data as is, and use CONCAT only when you SELECT data from the log table.
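For example (just a sketch, assuming a staging table named syslog_raw whose columns mirror the CSV header device, date_time, src_ip, dst_ip, log_type, message), the concatenation then happens at query time:
SELECT date_time AS the_time,
CONCAT(src_ip, '~', dst_ip) AS the_key,
CONCAT_WS(',', device, date_time, src_ip, dst_ip, log_type, message) AS the_log
FROM syslog_raw;
CONCAT_WS takes the separator once, so covering all 60 columns only means extending the column list instead of repeating @delimeter between every pair.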