To get the whole line while using MySQL LOAD DATA LOCAL INFILE - mysql

I need experts' help to get the whole original line using LOAD DATA LOCAL INFILE and put it into my DB column.
Sample table:
DROP TABLE IF EXISTS `syslog`;
CREATE TABLE `syslog` (
`the_time` VARCHAR(80) NOT NULL,
`the_key` VARCHAR(30) NOT NULL,
`the_log` VARCHAR(1024) NOT NULL
)
ENGINE = MyISAM;
File : D:/rnd/syslog.csv
"device","date_time","src_ip","dst_ip","log_type","message"
"Fortigate","2012-05-02 12:02:03","192.168.1.1","192.168.1.11","vpn","Sample message1"
"Fortigate","2012-05-02 12:02:04","192.168.1.2","192.168.1.12","vpn","Sample message2"
"Fortigate","2012-05-02 12:02:05","192.168.1.3","192.168.1.13","traffic","Sample message3"
"Fortigate","2012-05-02 12:02:06","192.168.1.4","192.168.1.14","traffic","Sample message4"
"Fortigate","2012-05-02 12:02:07","192.168.1.5","192.168.1.15","vpn","Sample message5"
"Fortigate","2012-05-02 12:02:08","192.168.1.6","192.168.1.16","vpn","Sample message6"
MySQL statement
SET @delimiter = ",";
LOAD DATA LOCAL INFILE
"D:/rnd/syslog.csv"
INTO TABLE syslog
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(@device,
@date_time,
@src_ip,
@dst_ip,
@log_type,
@message)
SET the_time = @date_time,
the_key = CONCAT(@src_ip, "~", @dst_ip),
the_log = CONCAT(@device, @delimiter, @date_time, @delimiter, @src_ip, @delimiter, @dst_ip, @delimiter, @log_type, @delimiter, @message);
Currently it only works by setting it manually, like
the_log = CONCAT(@device, @delimiter, @date_time, @delimiter, @src_ip, @delimiter, @dst_ip, @delimiter, @log_type, @delimiter, @message)
Is there any other way to get the whole line? The actual file has 60 columns, so it is not a good idea to build this manually inside the code, and it is not easy to maintain later.
Objective: to use the CSV data and manipulate it into my own table (meaning the columns are not the same as in the CSV).

The other way is to create a table with the same structure as the CSV file, store the data as is, and use CONCAT only when you SELECT data from the log table.
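A minimal sketch of that approach, assuming a staging table named syslog_raw whose columns mirror the CSV header from the question (the table name and column sizes are illustrative, not from the original post):
-- staging table that mirrors the CSV columns
DROP TABLE IF EXISTS `syslog_raw`;
CREATE TABLE `syslog_raw` (
`device` VARCHAR(80) NOT NULL,
`date_time` VARCHAR(80) NOT NULL,
`src_ip` VARCHAR(45) NOT NULL,
`dst_ip` VARCHAR(45) NOT NULL,
`log_type` VARCHAR(30) NOT NULL,
`message` VARCHAR(1024) NOT NULL
)
ENGINE = MyISAM;
-- load the file as is, no per-column user variables needed
LOAD DATA LOCAL INFILE 'D:/rnd/syslog.csv'
INTO TABLE syslog_raw
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
-- rebuild the whole line only when reading; CONCAT_WS repeats the delimiter for you
SELECT date_time AS the_time,
CONCAT(src_ip, '~', dst_ip) AS the_key,
CONCAT_WS(',', device, date_time, src_ip, dst_ip, log_type, message) AS the_log
FROM syslog_raw;
With 60 columns the CONCAT_WS list is still long, but it lives in one SELECT (or a view) instead of in the load script, which is easier to maintain.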

Related

Trying to read mm/dd/yyyy format in sql while creating table

So I am very new to SQL and I am trying to create a table into which I will later import a .csv file. In this table there is a timestamp column that I want to set up to read mm/dd/yyyy hh:mi:ss, and I've tried doing this:
create table Particle_counter_HiSam ( time_utc timestamp(m/d/Y hh:mi:ss),...
and I get this error:
ERROR: syntax error at or near "m"
I just can't seem to figure this out.
Any help will do. Thanks!
Create the table with a normal TIMESTAMP column and use SET with STR_TO_DATE in LOAD DATA INFILE, as below.
-- table definition
create table Particle_counter_HiSam ( time_utc timestamp, ... );
-- load data
load data infile 'data.csv'
into table Particle_counter_HiSam
fields terminated BY ',' ESCAPED BY ""
lines terminated by '\r\n'
(@var1, c2, ....)
SET time_utc = STR_TO_DATE(@var1, '%m/%d/%Y %H:%i:%S');
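A quick way to sanity-check the format string before loading (the sample value here is made up for illustration):
SELECT STR_TO_DATE('12/01/2015 13:45:30', '%m/%d/%Y %H:%i:%S');
-- returns 2015-12-01 13:45:30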
If you're creating a table with a timestamp column, just use this:
CREATE TABLE IF NOT EXISTS `Particle_counter_HiSam` (
`date_log` TIMESTAMP NOT NULL
);
Hope this helps.

Load csv file into mysql database

I'm trying to load a big CSV file into MySQL but can't figure out why it fails.
My CSV file looks like this:
_id,"event","unit","created","r1","r2","r3","r4","space_id","owner_id","name","display_name","users__email"
565ce313819709476d7eaf0e,"create",3066,"2015-12-01T00:00:19.604Z","563f592dd6f47ae719be8b38","3","13","7","55ecdd4ea970e6665f7e3911","55e6e3f0a856461404a60fc1","household","household","foo.bar#ace.com"
565ce350819709476d7eaf0f,"complete",3067,"2015-12-01T00:01:19.988Z","21","","","","55e6df3ba856461404a5fdc9","55e6e3f0a856461404a60fc1","Ace","Base","foo.bar#ace.com"
565ce350819709476d7eaf0f,"delete",3067,"2015-12-01T00:01:19.988Z","21","","","","55e6df3ba856461404a5fdc9","55e6e3f0a856461404a60fc1","Ace","Base","foo.bar#ace.com"
565ce350819709476d7eaf0f,"update",3067,"2015-12-01T00:01:19.988Z","21","","","","55e6df3ba856461404a5fdc9","55e6e3f0a856461404a60fc1","Ace","Base","foo.bar#ace.com"
And my code to load the file into MySQL is this:
CREATE DATABASE IF NOT EXISTS analys;
USE analys;
CREATE TABLE IF NOT EXISTS event_log (
_id CHAR(24) NOT NULL,
event_log VARCHAR(255),
unit CHAR(4),
created VARCHAR(255),
r1 VARCHAR(255),
r2 VARCHAR(255),
r3 VARCHAR(255),
r4 VARCHAR(255),
space_id VARCHAR(255),
owner_id VARCHAR(255),
name VARCHAR(255),
display_name VARCHAR(255),
users__email VARCHAR(255),
PRIMARY KEY (_id)
);
LOAD DATA INFILE 'audits.export.csv'
INTO TABLE event_log
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n\r'
IGNORE 1 ROWS;
Everything seems fine, including the query, but I get NULL in every column (and only one row).
Here is the Action Output:
22:31:21 LOAD DATA INFILE 'audits.export.csv' INTO TABLE event_log FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n\r' IGNORE 1 ROWS 0 row(s) affected Records: 0 Deleted: 0 Skipped: 0 Warnings: 0 0.156 sec
I tried to tweak the table and the load query, but it doesn't work.
I'm on Windows 7, using MySQL 5.6 and Workbench.
I heard about GUI solutions and Excel connectors here (Excel Connector), but I prefer to do it programmatically as I need to reuse the code.
Any help? I couldn't solve the problem with similar posts on Stack Overflow.
This doesn't seem to be a valid newline:
LINES TERMINATED BY '\n\r'
change to:
LINES TERMINATED BY '\r\n'
Or use only one of them (it could be a single \n or \r depending on the system or the software that created the CSV file).
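Applied to the statement from the question, the corrected load would be as follows, assuming the file really does use Windows-style \r\n line endings:
LOAD DATA INFILE 'audits.export.csv'
INTO TABLE event_log
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
If the import still reports 0 rows, try LINES TERMINATED BY '\n' on its own, since many exporters write plain \n even on Windows.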
Because some might come here wanting to know how to do this with MySQL Workbench:
Save CSV data to a file, for example foo.csv
Open MySQL Workbench
Choose/Open a MySQL Connection
Right-click on a schema in the Object Browser (left Navigator)
Choose "Table Data Import Wizard"
Select the CSV file, which is foo.csv in our example
Follow the wizard; many configuration options are available, including the separator
When finished, the CSV data will be in a new or existing table (your choice)
For additional information, see the documentation titled Table Data Export and Import Wizard.
I just tested this with the example data provided in the question, and it worked.

using linux shell script insert dump data to Mysql

I use the script below to insert data into MySQL from a text file.
#!/bin/bash
mysql -utest -ptest test << EOF
LOAD DATA INFILE 'test.txt'
INTO TABLE content_delivery_process
FIELDS TERMINATED BY ',';
EOF
In my test file I have a format like
cast , date , name , buy
I can insert the data, but I need a format like the one below:
S.NO | date | name | buy | cast
You can specify the columns you want to import:
From the MySQL Manual:
MySQL LOAD DATA INFILE
The following example loads all columns of the persondata table:
LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata;
By default, when no column list is provided at the end of the LOAD
DATA INFILE statement, input lines are expected to contain a field for
each table column.
If you want to load only some of a table's columns, specify a column
list:
LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata (col1,col2,...);
You must also specify a column list if the order of the fields in the
input file differs from the order of the columns in the table.
Otherwise, MySQL cannot tell how to match input fields with table
columns.
You would add FIELDS TERMINATED BY '|' to import data delimited with a '|' symbol; note that the FIELDS clause goes between the table name and the column list, as sketched below.
Hope this helps.
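For reference, a sketch of the clause placement, reusing the manual's persondata example (col1, col2, col3 are placeholders):
LOAD DATA INFILE 'persondata.txt'
INTO TABLE persondata
FIELDS TERMINATED BY '|'
(col1, col2, col3);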
create table [YOUR TABLE] ( `S.NO` INT AUTO_INCREMENT PRIMARY KEY, `date` DATETIME, `name` VARCHAR(50), `buy` VARCHAR(50), `cast` VARCHAR(50));
load data local infile 'test.txt' ignore into table [YOUR TABLE] fields terminated by ',' lines terminated by '\n' (`cast`, `date`, `name`, `buy`);

Adding a string to a field during import with LOAD DATA LOCAL INFILE

I'm importing a CSV file into a MySQL table with the following query:
"LOAD DATA INFILE 'myfielname.csv'
INTO table customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\r'
IGNORE 3 LINES
(sales,regional,accounts)
";
Is there any way to insert a string of characters before a field that is to be imported?
For example: the field 'sales' above refers to account id numbers, which are used in the application. I'd like to prepend a URL to the account number during import, so the final record in the table will be as follows:
String I want to come before 'sales', but within the same record: http://www.url.com?id=
If a given sales id were 1234, the final record in the table would be http://www.url.com?id=1234
Thanks in advance for your help.
Try something like this:
LOAD DATA LOCAL INFILE 'C:/test.csv'
INTO TABLE test.test1
FIELDS TERMINATED BY ';'
(@test1col, @test2col)
SET test1col = CONCAT('http://url.com?id=', @test1col), test2col = @test2col;
The test csv has 2 columns. I created a test table like this
CREATE TABLE `test1` (
`test1col` varchar(200) DEFAULT NULL,
`test2col` varchar(2000) DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
You could try it immediately with your own table; just make sure you name the columns correctly!
Give it a try; it worked for me.
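Adapted to the customers table from the question (same file name, options, and column names as the original query, using LOCAL as in the answer above; outside the PHP string the double quote no longer needs escaping), a sketch would be:
LOAD DATA LOCAL INFILE 'myfielname.csv'
INTO TABLE customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r'
IGNORE 3 LINES
(@sales, regional, accounts)
SET sales = CONCAT('http://www.url.com?id=', @sales);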

leading zeros in mysql

I have a form where a user updates a table in the database with serial numbers. The problem is that in my .csv file a serial number has the value 0, but after inserting it, it shows up as 000000; same for 1, which after inserting becomes 000001. I need it exactly as it is in the .csv file. My code for the LOAD is:
LOAD DATA LOCAL INFILE 'path_to_file.csv'
INTO TABLE im_seriennummer CHARACTER SET latin1
FIELDS TERMINATED BY ";"
IGNORE 1 LINES
(sn,description_sn)
In the .csv file it is like this:
0
1
And in the database:
000000
000001
In the database sn is varchar(16).
Is this problem familiar to anyone? Please don't tell me to change the type of the field; I need to keep it as varchar since some serial numbers look like MT 002.
The solution, I think, is to use a temporary table to import the CSV into.
CREATE TEMPORARY TABLE tmptab LIKE im_seriennummer;
LOAD DATA LOCAL INFILE 'path_to_file.csv'
INTO TABLE tmptab CHARACTER SET latin1
FIELDS TERMINATED BY ";"
IGNORE 1 LINES
(sn,description_sn);
UPDATE tmptab SET sn = RIGHT(CONCAT('000000', sn), 6);
INSERT INTO im_seriennummer
SELECT * FROM tmptab;
DROP TEMPORARY TABLE tmptab;