I'm trying to load a big CSV file into MySQL but can't figure out why it fails.
My CSV file looks like this:
_id,"event","unit","created","r1","r2","r3","r4","space_id","owner_id","name","display_name","users__email"
565ce313819709476d7eaf0e,"create",3066,"2015-12-01T00:00:19.604Z","563f592dd6f47ae719be8b38","3","13","7","55ecdd4ea970e6665f7e3911","55e6e3f0a856461404a60fc1","household","household","foo.bar@ace.com"
565ce350819709476d7eaf0f,"complete",3067,"2015-12-01T00:01:19.988Z","21","","","","55e6df3ba856461404a5fdc9","55e6e3f0a856461404a60fc1","Ace","Base","foo.bar@ace.com"
565ce350819709476d7eaf0f,"delete",3067,"2015-12-01T00:01:19.988Z","21","","","","55e6df3ba856461404a5fdc9","55e6e3f0a856461404a60fc1","Ace","Base","foo.bar@ace.com"
565ce350819709476d7eaf0f,"update",3067,"2015-12-01T00:01:19.988Z","21","","","","55e6df3ba856461404a5fdc9","55e6e3f0a856461404a60fc1","Ace","Base","foo.bar@ace.com"
And my code to load the file into MySQL is this:
CREATE DATABASE IF NOT EXISTS analys;
USE analys;
CREATE TABLE IF NOT EXISTS event_log (
_id CHAR(24) NOT NULL,
event_log VARCHAR(255),
unit CHAR(4),
created VARCHAR(255),
r1 VARCHAR(255),
r2 VARCHAR(255),
r3 VARCHAR(255),
r4 VARCHAR(255),
space_id VARCHAR(255),
owner_id VARCHAR(255),
name VARCHAR(255),
display_name VARCHAR(255),
users__email VARCHAR(255),
PRIMARY KEY (_id)
);
LOAD DATA INFILE 'audits.export.csv'
INTO TABLE event_log
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n\r'
IGNORE 1 ROWS;
The query runs without errors, but I get NULL in every column (and only one row).
Here is the Action Output:
22:31:21 LOAD DATA INFILE 'audits.export.csv' INTO TABLE event_log FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n\r' IGNORE 1 ROWS 0 row(s) affected Records: 0 Deleted: 0 Skipped: 0 Warnings: 0 0.156 sec
I tried tweaking the table and the load query, but it doesn't work.
I'm on Windows 7, using MySQL 5.6 and Workbench.
I've heard about GUI solutions and Excel connectors (Excel Connector), but I'd prefer to do it programmatically, as I need to reuse the code.
Any help? I couldn't solve the problem with similar posts on Stack Overflow.
This doesn't look like a valid line ending:
LINES TERMINATED BY '\n\r'
change to:
LINES TERMINATED BY '\r\n'
Or use only one of them (it could be a single \n or \r, depending on the system or on the software that created the CSV file).
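For example, a corrected version of the statement from the question might look like this (just a sketch; the right terminator depends on how the file was produced):
LOAD DATA INFILE 'audits.export.csv'
INTO TABLE event_log
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
If the import still reports 0 rows, try '\n' or '\r' as the line terminator instead.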
Because some might come here wanting to know how to do this with MySQL Workbench:
Save CSV data to a file, for example foo.csv
Open MySQL Workbench
Choose/Open a MySQL Connection
Right-click on a schema in the Object Browser (left Navigator)
Choose "Table Data Import Wizard"
Select the CSV file, which is foo.csv in our example
Follow the wizard; many configuration options are available, including the separator
When finished, the CSV data will be in a new or existing table (your choice)
For additional information, see the documentation titled Table Data Export and Import Wizard.
I just tested this on the example data provided in the question, and it worked.
Related
Each time I create my table and then load the .csv file, it does not load all of the data. Below is my code:
CREATE TABLE Parts_Maintenance (
Vehicle_ID BIGINT(20),
State VARCHAR(255),
Repair VARCHAR(255),
Reason VARCHAR(255),
Year YEAR,
Make VARCHAR(255),
Body_Type VARCHAR(255),
PRIMARY KEY (Vehicle_ID)
);
LOAD DATA INFILE '/home/codio/workspace/FleetMaintenanceRecords.csv'
INTO TABLE Parts_Maintenance
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\\n'
IGNORE 1 ROWS;
SELECT * FROM Parts_Maintenance;
Could someone please help me pinpoint what I am doing wrong?
I tried to create the table and bring in a .csv file. The table was created, but the data is not all there and the table looks messed up.
I agree with @barmer: your LINES TERMINATED BY is probably wrong; it's probably \r.
You may use:
LOAD DATA INFILE '/home/codio/workspace/FleetMaintenanceRecords.csv'
INTO TABLE Parts_Maintenance
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 ROWS;
Alternatively, you can use another method with Workbench:
- Right-click on the table
- Choose Table Data Import Wizard
- Choose the file path
- Select the destination table
- Check and configure the import settings
- Click Next to execute
This is the simplest solution.
Goal: I have to import a CSV file into a MySQL server.
Problem: only the first row is inserted, and even that row has wrong values.
The format of my CSV file is:
lifelock,LifeLock,,web,Tempe,AZ,1-May-07,6850000,USD,b
lifelock,LifeLock,,web,Tempe,AZ,1-Oct-06,6000000,USD,a
lifelock,LifeLock,,web,Tempe,AZ,1-Jan-08,25000000,USD,c
mycityfaces,MyCityFaces,7,web,Scottsdale,AZ,1-Jan-08,50000,USD,seed
flypaper,Flypaper,,web,Phoenix,AZ,1-Feb-08,3000000,USD,a
Query to create the SQL table:
create table fund(permalink varchar(20), company varchar(20), numEmps int, category varchar(15),
city varchar(20),state varchar(15),fundedDate Date,raisedAmt int, raisedCurrency longtext,round longtext);
Query to import the CSV file:
load data infile '/var/lib/mysql-files/TechCrunchcontinentalUSA.csv' into table fund
fields terminated by ','
lines terminated by '\n'
(permalink, company, @numEmps, category, city, state, @funded, @raised, raisedCurrency, round)
set numEmps = cast(@numEmps as unsigned), raisedAmt = cast(@raised as unsigned), fundedDate = STR_TO_DATE(@funded, '%d-%b-%Y');
Is your primary key set to AUTO_INCREMENT? Also, check that your CSV file has a newline separator on each line. Sometimes CSV column data can be too big to fit into the DB column you defined: say you define your city column as VARCHAR(20), but the CSV has a row where the city is more than 20 characters.
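If a column really is too short, you could widen it and then check the longest value that was loaded; a sketch against the fund table from the question:
ALTER TABLE fund MODIFY city VARCHAR(100);
SELECT MAX(CHAR_LENGTH(city)) FROM fund;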
You can always validate your CSV with an online validator (make sure you don't have any sensitive data, as it is a third-party tool).
Alternatively, try this if your CSV file and table name are the same:
mysqlimport --lines-terminated-by='\n' --fields-terminated-by=',' --fields-enclosed-by='"' --verbose --local db_name TechCrunchcontinentalUSA.csv
Provide the username and password flags (-uroot -proot) if you have them set up.
SQL query:
load data infile '/var/lib/mysql-files/TechCrunchcontinentalUSA.csv' into table fund
fields terminated by ','
lines terminated by '\r'
(permalink, company, @numEmps, category, city, state, @funded, @raised, raisedCurrency, round)
set numEmps = cast(@numEmps as unsigned), fundedDate = STR_TO_DATE(@funded, '%d-%b-%y'), raisedAmt = cast(@raised as unsigned);
I have changed the line terminator from \n to \r and changed the date format from %d-%b-%Y to %d-%b-%y, because %Y expects a four-digit year (e.g. 2007) while %y expects a two-digit year (e.g. 07).
For the date format specifiers, see the MySQL documentation.
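For example, a quick sanity check against one of the dates in the CSV:
SELECT STR_TO_DATE('1-May-07', '%d-%b-%y');
-- returns 2007-05-01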
I'm new to SQL and using it for something at work.
I have the following table I made:
CREATE TABLE PC_Contacts
(
POC VARCHAR(255) PRIMARY KEY NOT NULL,
Phone_1 VARCHAR(255),
Phone_2 VARCHAR(255)
);
I import some data from a CSV into the table using the following command in PowerShell:
cmd /c 'mysql -u root -p network < CSVImport.sql'. This is what is contained in the CSVImport.sql file:
USE Network;
LOAD DATA INFILE 'C:\\ProgramData\\MySQL\\MySQL Server 5.7\\Uploads\\PC_Contacts.csv'
INTO Table PC_Contacts
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
After populating the table, I noticed that when the last column (Phone_2) has no phone number, the field is left blank instead of being set to NULL. In addition, several characters in the POC column are cut off whenever a phone number is omitted. So I put xxx-xxx-xxxx into the empty columns, repopulated the table, and everything looked clean. How can I avoid doing this and have the table populate itself with NULL values instead?
+------------------+--------------+--------------+
| POC              | Phone_1      | Phone_2      |
+------------------+--------------+--------------+
| April Wilson     | 123-456-7890 | xxx-xxx-xxxx |
| Anton Watson     | 234-567-8901 | 567-890-1234 |
| Ashley Walker    | 345-678-9012 | 456-789-0123 |
+------------------+--------------+--------------+
Names and phone numbers have been altered, of course. If I took xxx-xxx-xxxx out, though, you would see something like 'lson' in the POC column instead of the full name. Any thoughts?
This bit of code, found in this answer, should solve your problem, I think. It checks each value and, if it is empty, stores NULL instead.
It seems to me that what is happening is that when a value is missing in PC_Contacts.csv, the import gets confused, and that is why some data gets trimmed off the other columns.
USE Network;
LOAD DATA INFILE 'C:\\ProgramData\\MySQL\\MySQL Server 5.7\\Uploads\\PC_Contacts.csv'
INTO TABLE PC_Contacts
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(@vone, @vtwo, @vthree)
SET
POC = NULLIF(@vone, ''),
Phone_1 = NULLIF(@vtwo, ''),
Phone_2 = NULLIF(@vthree, '');
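A quick way to verify after re-importing (a sketch):
SELECT POC, Phone_1, Phone_2 FROM PC_Contacts WHERE Phone_2 IS NULL;
If characters are still being cut off in the POC column, the file may have Windows \r\n line endings, in which case LINES TERMINATED BY '\r\n' may help, as discussed in the answers above.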
I'm importing a CSV file into a MySQL table with the following query:
"LOAD DATA INFILE 'myfielname.csv'
INTO table customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\r'
IGNORE 3 LINES
(sales,regional,accounts)
";
Is there any way to insert a string of characters before a field that is to be imported?
For example: the field 'sales' above refers to account ID numbers, which are used in the application. I'd like to prepend a URL to the account number during the import, so the final record in the table will be as follows:
String I want to come before 'sales', but within the same record: http://www.url.com?id=
If a given sales id was 1234 the final record in the table would be http://www.url.com?id=1234
Thanks in advance for your help.
Try something like this:
LOAD DATA LOCAL INFILE 'C:/test.csv'
INTO TABLE test.test1
FIELDS TERMINATED BY ';'
(@test1col, @test2col)
SET test1col = CONCAT('http://url.com?id=', @test1col), test2col = @test2col;
The test CSV has 2 columns. I created a test table like this:
CREATE TABLE `test1` (
`test1col` varchar(200) DEFAULT NULL,
`test2col` varchar(2000) DEFAULT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
You could try it immediately with your own file; just make sure you name the columns correctly!
Give it a try; it worked for me.
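Adapted to the statement from the question, it might look something like this (a sketch; it assumes the first CSV column maps to sales):
LOAD DATA INFILE 'myfielname.csv'
INTO TABLE customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\r'
IGNORE 3 LINES
(@sales, regional, accounts)
SET sales = CONCAT('http://www.url.com?id=', @sales);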
I need expert help to get the whole original line using LOAD DATA LOCAL INFILE and put it into a column in my DB.
Sample table:
DROP TABLE IF EXISTS `syslog`;
CREATE TABLE `syslog` (
`the_time` VARCHAR(80) NOT NULL,
`the_key` VARCHAR(30) NOT NULL,
`the_log` VARCHAR(1024) NOT NULL
)
ENGINE = MyISAM;
File: D:/rnd/syslog.csv
"device","date_time","src_ip","dst_ip","log_type","message"
"Fortigate","2012-05-02 12:02:03","192.168.1.1","192.168.1.11","vpn","Sample message1"
"Fortigate","2012-05-02 12:02:04","192.168.1.2","192.168.1.12","vpn","Sample message2"
"Fortigate","2012-05-02 12:02:05","192.168.1.3","192.168.1.13","traffic","Sample message3"
"Fortigate","2012-05-02 12:02:06","192.168.1.4","192.168.1.14","traffic","Sample message4"
"Fortigate","2012-05-02 12:02:07","192.168.1.5","192.168.1.15","vpn","Sample message5"
"Fortigate","2012-05-02 12:02:08","192.168.1.6","192.168.1.16","vpn","Sample message6"
MySQL statement:
SET @delimeter = ",";
LOAD DATA LOCAL INFILE
"D:/rnd/syslog.csv"
INTO TABLE syslog
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
( @device,
@date_time,
@src_ip,
@dst_ip,
@log_type,
@message)
SET the_time = @date_time,
the_key = CONCAT(@src_ip, "~", @dst_ip),
the_log = CONCAT(@device, @delimeter, @date_time, @delimeter, @src_ip, @delimeter, @dst_ip, @delimeter, @log_type, @delimeter, @message);
Currently it is only working with a manual setting like
the_log = CONCAT(@device, @delimeter, @date_time, @delimeter, @src_ip, @delimeter, @dst_ip, @delimeter, @log_type, @delimeter, @message)
Is there any other way to get the whole line? The actual table has 60 columns, so it is not a good idea to build this manually in the code, and it is not easy to maintain later.
Objective: to use the CSV data and map it into my own table (meaning the columns are not the same as in the CSV).
Another way is to create a table whose structure matches the CSV file, store the data as-is, and use CONCAT only when you SELECT data from the log table.
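A minimal sketch of that idea, assuming a staging table named syslog_raw whose columns mirror the CSV header (the table name and column types are assumptions):
CREATE TABLE syslog_raw (
  device VARCHAR(80),
  date_time VARCHAR(80),
  src_ip VARCHAR(40),
  dst_ip VARCHAR(40),
  log_type VARCHAR(40),
  message VARCHAR(1024)
) ENGINE = MyISAM;
LOAD DATA LOCAL INFILE 'D:/rnd/syslog.csv'
INTO TABLE syslog_raw
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
-- Rebuild the derived values only when reading
SELECT date_time AS the_time,
       CONCAT(src_ip, '~', dst_ip) AS the_key,
       CONCAT_WS(',', device, date_time, src_ip, dst_ip, log_type, message) AS the_log
FROM syslog_raw;
Note that CONCAT_WS rebuilds the line without the original quoting, so it is an approximation of the raw CSV line rather than an exact copy.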