I have created a database and a table. I have also created all the fields I will be needing. I have created 46 fields including one that is my ID for the row. The CSV doesn't contain the ID field, nor does it contain the headers for the columns. I am new to all of this but have been trying to figure this out. I'm not on here being lazy asking for the answer, but looking for directions.
I'm trying to figure out how to import the CSV so that the data starts at the 2nd field, since I'm hoping the auto_increment will fill in the ID field, which is the first field I created.
I tried these instructions with no luck. Can anyone offer some insight?
The column names of your CSV file must match those of your table
Browse to your required .csv file
Select CSV using LOAD DATA options
Check box 'ON' for Replace table data with file
In Fields terminated by box, type ,
In Fields enclosed by box, "
In Fields escaped by box, \
In Lines terminated by box, auto
In Column names box, type column name separated by , like column1,column2,column3
Check box ON for Use LOCAL keyword.
Edit:
The CSV file is 32.4kb
The first row of my CSV is:
Test Advertiser,23906032166,119938,287898,,585639051,287898 - Engager - 300x250,88793551,Running,295046551,301624551,2/1/2010,8/2/2010,Active,,Guaranteed,Publisher test,Maintainer test,example-site.com,,All,All,,Interest: Dental; custom geo zones: City,300x250,-,CPM,$37.49 ,"4,415","3,246",3,0,$165.52 ,$121.69 ,"2,895",805,0,0,$30.18 ,$37.49 ,0,$0.00 ,IMPRESSIONBASED,NA,USD
You can have MySQL set values for certain columns during import. If your id field is set to auto increment, you can set it to null during import and MySQL will then assign incrementing values to it. Try putting something like this in the SQL tab in phpMyAdmin:
LOAD DATA INFILE 'path/to/file.csv' INTO TABLE your_table FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' SET id=null;
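Note that the table here has 46 columns while the CSV rows carry only 45 fields (no ID), and the sample row above contains quoted values with embedded commas such as "4,415". So you will likely also need an ENCLOSED BY clause and an explicit column list so the data does not shift into the id column. A hedged sketch, with made-up placeholder names standing in for your 45 real column names:
LOAD DATA LOCAL INFILE 'path/to/file.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(data_col_1, data_col_2, data_col_3 /* ...continue listing all 45 data columns, in CSV order, omitting id... */);
With the id column left out of the list, auto_increment fills it in on its own.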
Please look at this page and see if it has what you are looking for. It should be all you need since you are dealing with just one table. MySQL LOAD DATA INFILE
So for example you might do something like this:
LOAD DATA INFILE 'filepath' INTO TABLE tablename FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' (column2, column3, column4);
That should give you an idea. There are of course more options that can be added as seen in the above link.
Be sure to use LOAD DATA LOCAL INFILE if the import file is local. :)
Related
I have an unnormalized events-diary CSV from a client that I'm trying to load into a MySQL table so that I can refactor into a sane format. I created a table called 'CSVImport' that has one field for every column of the CSV file. The CSV contains 99 columns, so this was a hard enough task in itself:
CREATE TABLE `CSVImport` (id INT);
ALTER TABLE CSVImport ADD COLUMN Title VARCHAR(256);
ALTER TABLE CSVImport ADD COLUMN Company VARCHAR(256);
ALTER TABLE CSVImport ADD COLUMN NumTickets VARCHAR(256);
...
ALTER TABLE CSVImport ADD COLUMN Date49 VARCHAR(256);
ALTER TABLE CSVImport ADD COLUMN Date50 VARCHAR(256);
No constraints are on the table, and all the fields hold VARCHAR(256) values, except the columns which contain counts (represented by INT), yes/no (represented by BIT), prices (represented by DECIMAL), and text blurbs (represented by TEXT).
I tried to load the data into the table:
LOAD DATA INFILE '/home/paul/clientdata.csv' INTO TABLE CSVImport;
Query OK, 2023 rows affected, 65535 warnings (0.08 sec)
Records: 2023 Deleted: 0 Skipped: 0 Warnings: 198256
SELECT * FROM CSVImport;
| NULL | NULL | NULL | NULL | NULL |
...
The whole table is filled with NULL.
I think the problem is that the text blurbs contain more than one line, and MySQL is parsing the file as if each new line corresponded to one database row. I can load the file into OpenOffice without a problem.
The clientdata.csv file contains 2593 lines, and 570 records. The first line contains column names. I think it is comma delimited, and text is apparently delimited with doublequote.
UPDATE:
When in doubt, read the manual: http://dev.mysql.com/doc/refman/5.0/en/load-data.html
I added some information to the LOAD DATA statement that OpenOffice was smart enough to infer, and now it loads the correct number of records:
LOAD DATA INFILE "/home/paul/clientdata.csv"
INTO TABLE CSVImport
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
But still there are lots of completely NULL records, and none of the data that got loaded seems to be in the right place.
Use mysqlimport to load a table into the database:
mysqlimport --ignore-lines=1 \
--fields-terminated-by=, \
--local -u root \
-p Database \
TableName.csv
I found it at http://chriseiffel.com/everything-linux/how-to-import-a-large-csv-file-to-mysql/
To make the delimiter a tab, use --fields-terminated-by='\t'
The core of your problem seems to be matching the columns in the CSV file to those in the table.
Many graphical mySQL clients have very nice import dialogs for this kind of thing.
My favourite for the job is Windows based HeidiSQL. It gives you a graphical interface to build the LOAD DATA command; you can re-use it programmatically later.
Screenshot: "Import textfile" dialog
To open the "Import textfile" dialog, go to Tools > Import CSV file:
The simplest way I have imported 200+ rows is with the command below in the phpMyAdmin SQL window.
I have a simple table of country with two columns
CountryId,CountryName
Here is the .csv data:
Here is the command:
LOAD DATA INFILE 'c:/country.csv'
INTO TABLE country
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
Keep one thing in mind: a comma must never appear in the second column, otherwise your import will stop.
I used this method to import more than 100K records (~5MB) in 0.046 sec.
Here's how you do it:
LOAD DATA LOCAL INFILE
'c:/temp/some-file.csv'
INTO TABLE your_awesome_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(field_1,field_2 , field_3);
It is very important to include the last line if you have more than one field; otherwise it normally skips the last field (MySQL 5.6.17):
LINES TERMINATED BY '\n'
(field_1,field_2 , field_3);
Then, assuming you have the first row as the title for your fields, you might want to include this line also
IGNORE 1 ROWS
This is what it looks like if your file has a header row.
LOAD DATA LOCAL INFILE
'c:/temp/some-file.csv'
INTO TABLE your_awesome_table
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(field_1,field_2 , field_3);
phpMyAdmin can handle CSV import. Here are the steps:
Prepare the CSV file to have the fields in the same order as the MySQL table fields.
Remove the header row from the CSV (if any), so that only the data is in the file.
Go to the phpMyAdmin interface.
Select the table in the left menu.
Click the import button at the top.
Browse to the CSV file.
Select the option "CSV using LOAD DATA".
Enter "," in the "fields terminated by".
Enter the column names in the same order as they are in the database table.
Click the go button and you are done.
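Behind the scenes, the "CSV using LOAD DATA" option issues a LOAD DATA statement built from those dialog settings. Roughly (a sketch only; the exact statement depends on the options and column names you enter):
LOAD DATA LOCAL INFILE 'your-file.csv'
INTO TABLE your_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
(column1, column2, column3);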
This is a note that I prepared for my own future use, and I am sharing it here in case someone else can benefit.
If you are using MySQL Workbench (currently 6.3 version) you can do this by:
Right-click on "Tables";
Choose Table Data Import Wizard;
Choose your csv file and follow the instructions (JSON could also be used);
The good thing is that you can create a new table based on the csv file you want to import, or load the data into an existing table.
You can fix this by listing the columns in your LOAD DATA statement. From the manual:
LOAD DATA INFILE 'persondata.txt' INTO TABLE persondata (col1,col2,...);
...so in your case you need to list the 99 columns in the order in which they appear in the csv file.
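For the CSVImport table described above, that would look something like this (a sketch; only the first few column names from the question are shown, and the remaining ones must be listed in the order they appear in the CSV):
LOAD DATA INFILE '/home/paul/clientdata.csv'
INTO TABLE CSVImport
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(Title, Company, NumTickets /* ...and the rest of the 99 columns, in CSV order... */);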
Try this, it worked for me
LOAD DATA LOCAL INFILE 'filename.csv' INTO TABLE table_name FIELDS TERMINATED BY ',' ENCLOSED BY '"' IGNORE 1 ROWS;
IGNORE 1 ROWS here ignores the first row which contains the fieldnames. Note that for the filename you must type the absolute path of the file.
I see something strange. You are using the same character for ESCAPING that you use for ENCLOSING. So the engine does not know what to do when it finds a '"', and I think that is why nothing seems to be in the right place.
I think that if you remove the ESCAPED BY line, it should run fine. Like:
LOAD DATA INFILE "/home/paul/clientdata.csv"
INTO TABLE CSVImport
COLUMNS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;
Unless you analyze (manually, visually, ...) your CSV and find which character it uses for escaping. Sometimes it is '\'. But if you do not have one, do not use it.
The mysql command line is prone to too many problems on import. Here is how you do it:
use excel to edit the header names to have no spaces
save as .csv
use free Navicat Lite Sql Browser to import and auto create a new table (give it a name)
open the new table, insert a primary auto-number column for ID
change the type of the columns as desired.
done!
Yet another solution is to use the csvsql tool from the amazing csvkit suite.
Usage example:
csvsql --db mysql://$user:$password@localhost/$database --insert --tables $tablename $file
This tool can automatically infer the data types (default behavior), create the table, and insert the data into the created table. The --overwrite option can be used to drop the table if it already exists; the --insert option populates the table from the file.
To install the suite
pip install csvkit
Prerequisites: python-dev, libmysqlclient-dev, MySQL-python
apt-get install python-dev libmysqlclient-dev
pip install MySQL-python
In case you are using IntelliJ / DataGrip:
https://www.jetbrains.com/datagrip/features/importexport.html
I use mysql workbench to do the same job.
create new schema
open newly created schema
right click on "Tables" and select "Table Data Import Wizard"
give the csv file path and table name, and finally configure your column types, because the wizard sets default column types based on their values.
Note: take a look at mysql workbench's log file for any errors by using "tail -f [mysqlworkbenchpath]/log/wb*.log"
How to import csv files to sql tables
Example file: Overseas_trade_index data CSV File
Steps:
Need to create table for overseas_trade_index.
Need to create columns related to csv file.
SQL Query:
CREATE TABLE trade_index (
id int not null primary key auto_increment,
series_reference varchar(60),
period varchar(60),
data_value decimal(60,0),
status varchar(60),
units varchar(60),
magnitude int(60),
subject text(60),
`group` text(60),
series_title_1 varchar(60),
series_title_2 varchar(60),
series_title_3 varchar(60),
series_title_4 varchar(60),
series_title_5 varchar(60)
);
Need to connect to the MySQL database in the terminal.
=>show databases;
=>use your_database_name;
=>show tables;
Please enter this command to import the csv data to mysql tables.
load data infile '/home/desktop/Documents/overseas.csv' into table trade_index fields terminated by ',' lines terminated by '\n' (series_reference,period,data_value,status,units,magnitude,subject,series_title_1,series_title_2,series_title_3,series_title_4,series_title_5);
Find the overseas trade index data in the SQL database:
select * from trade_index;
If you are using a Windows machine with an Excel spreadsheet loaded, the new MySQL plugin for Excel is phenomenal. The folks at Oracle really did a nice job on that software. You can make the database connection directly from Excel. That plugin will analyse your data, and set up the tables for you in a format consistent with the data. I had some monster big csv files of data to convert. This tool was a big time saver.
http://dev.mysql.com/downloads/windows/excel/
You can make updates from within Excel that will populate to the database online. This worked exceedingly well with mySql files created on ultra inexpensive GoDaddy shared hosting. (Note when you create the table at GoDaddy, you have to select some off-standard settings to enable off site access of the database...)
With this plugin you have pure interactivity between your XL spreadsheet and online mySql data storage.
I know that my answer is late, but I'd like to mention a few other ways to do it.
The easiest one is using command line. The steps will be the following:
Accessing the MySQL CLI by entering the below command:
mysql -u my_user_name -p
Creating a table in the database
use new_schema;
CREATE TABLE employee_details (
id INTEGER,
employee_name VARCHAR(100),
employee_age INTEGER,
PRIMARY KEY (id)
);
Importing the CSV file into a table. We can either mention the file path or store the file in the default directory of the MySQL server.
LOAD DATA INFILE 'Path to the exported csv file'
INTO TABLE employee_details
FIELDS TERMINATED BY ','
IGNORE 1 ROWS;
It's only one of many solutions; I found it in this tutorial.
If loading CSV files into a MySQL database is your daily task, then it is better to automate this process. In this case you can use some third-party tools that allow you to load data on a schedule.
PHP query to import a CSV file into a MySQL database:
$query = <<<EOF
LOAD DATA LOCAL INFILE '$file'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(name,mobile,email)
EOF;
if (!$result = mysqli_query($this->db, $query))
{
exit(mysqli_error($this->db));
}
Sample CSV file data:
name,mobile,email
Christopher Gritton,570-686-3439,ChristopherKGritton@inbound.plus
Brandon Wilson,541-309-5149,BrandonMWilson@inbound.plus
Craig White,516-795-8065,CraigJWhite@inbound.plus
David Whitney,713-214-3966,DavidCWhitney@inbound.plus
Here is sample excel file screen shot:
Save as and choose .csv.
And you will have the .csv data shown below (screenshot) if you open it using Notepad++ or any other text editor.
Make sure you remove the header and that the column order in the .csv matches the MySQL table.
Replace folder_name by your folder name
LOAD DATA LOCAL INFILE
'D:/folder_name/myfilename.csv'
INTO TABLE mail
FIELDS TERMINATED BY ','
(fname,lname ,email, phone);
If it is big data, you can grab a coffee and let it load!
That's all you need.
Change the servername, username, password, dbname, path of your file, tablename, and the fields in your database that you want to insert into.
<?php
$servername = "localhost";
$username = "root";
$password = "";
$dbname = "bd_dashboard";
//Create the connection
$conn = new mysqli($servername, $username, $password, $dbname);
$query = "LOAD DATA LOCAL INFILE
'C:/Users/lenovo/Desktop/my_data.csv'
INTO TABLE test_tab
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(name,mob)";
if (!$result = mysqli_query($conn, $query)){
echo '<script>alert("Oops... Some Error occured.");</script>';
exit();
//exit(mysqli_error());
}else{
echo '<script>alert("Data Inserted Successfully.");</script>';
}
?>
I did it in a simple way using phpMyAdmin. I followed the steps by @Farhan, but all the data was entered in a single column.
How I did:
Created a CSV file and deleted the header row with column names. Kept only data.
I created a table with column names matching the csv columns.
Remember to assign appropriate types to each column.
I then went to the Import tab.
In Browse I selected the CSV file and kept all options as they were.
To my surprise all the data got imported successfully in their appropriate columns.
When executing the MySQL query to import the CSV, I was getting the error
'Error Code: 1290. The MySQL server is running with the --secure-file-priv option so it cannot execute this statement'
So I moved the file to the secure file location:
LOAD DATA INFILE 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/Orders.csv'
INTO TABLE orderdetails.orders
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
The location of the file is 'C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/Orders.csv' because I moved my CSV file to the 'secure_file_priv' location; otherwise I was getting the above error.
You can get your secure_file_priv using query SHOW VARIABLES LIKE "secure_file_priv";
Source: Import CSV file to MySQL (Query or using Workbench)
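If you cannot or do not want to move the file, another option (assuming local_infile is enabled on both the client and the server) is LOAD DATA LOCAL INFILE, which reads the file from the client side and is not restricted by secure_file_priv. A hedged variant with a made-up client-side path:
LOAD DATA LOCAL INFILE 'C:/Users/you/Desktop/Orders.csv'
INTO TABLE orderdetails.orders
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;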
I'm trying to import a text file containing:
http://pastebin.com/qhzrq3M7
Into my database using the command
Load data local infile 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO Table jobs
fields terminated by '\t';
But I keep getting the error Row 1-13 doesn't contain data for all columns
Make sure the last field of each row ends with \t. Alternatively, use LINES TERMINATED BY
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt' INTO TABLE jobs COLUMNS TERMINATED BY '\t' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\r';
\r is a carriage return character, similar to the newline character (i.e. \n)
I faced the same issue. Here is how I fixed it:
Try to open the CSV file using Notepad++ (text editor)
I saw a blank line at the end of my file, so I deleted it.
-- Hurrah, it resolved my issue.
The URL below can also help you resolve the issue.
http://www.thoughtspot.com/blog/5-magic-fixes-most-common-csv-file-problems
If you're on Windows, make sure to use LINES TERMINATED BY '\r\n', as explained in the MariaDB docs.
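Applied to the statement from the question, a hedged version for a Windows-produced file might be:
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE jobs
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\r\n';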
Sounds like LOAD DATA LOCAL INFILE expects to see a value for each column.
You can edit the file by hand (to delete those rows -- they could be blank lines), or you can create a temp table, insert the rows into a single column, and write a MySQL command to split the rows on tab and insert the values into the target table.
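A hedged sketch of that staging approach, with a hypothetical staging table and hypothetical target column names (col1..col3) standing in for the real ones:
-- stage each raw line in a single column; '|' is assumed never to occur in the file
CREATE TABLE staging (line TEXT);
LOAD DATA LOCAL INFILE 'C:/Users/Gary/Desktop/XML/jobs.txt'
INTO TABLE staging
FIELDS TERMINATED BY '|' ESCAPED BY ''
LINES TERMINATED BY '\n'
(line);
-- split each line on tab and insert into the target table
INSERT INTO jobs (col1, col2, col3)
SELECT
SUBSTRING_INDEX(line, '\t', 1),
SUBSTRING_INDEX(SUBSTRING_INDEX(line, '\t', 2), '\t', -1),
SUBSTRING_INDEX(SUBSTRING_INDEX(line, '\t', 3), '\t', -1)
FROM staging;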
Make sure there are no "\"s at the end of any field. In the csv viewed as text this would look like "\," which is obviously a no-no, since that comma will be ignored so you won't have enough columns.
(This primarily applies when you don't have field encasings like quotes around each field.)
I have a CSV file with headers name, dept, class, and the table in my SQL database has columns named id, name, class, dept, block.
How do I store and map these csv content into the sql database?
If you know the order of columns in your CSV, then just remove the header and run the following command on the mysql prompt:
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments)
This will be much much faster than loading each row and creating an ActiveRecord object for each row.
For more information, check out the mysql documentation here for loading data: http://dev.mysql.com/doc/refman/5.1/en/load-data.html.
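Applied to the table described in the question (CSV headers name, dept, class; table columns id, name, class, dept, block), a hedged version with made-up file and table names might look like this (the id and block columns are simply left to their defaults):
load data local infile 'yourfile.csv' into table your_table
fields terminated by ','
enclosed by '"'
lines terminated by '\n'
ignore 1 lines
(name, dept, class);
The IGNORE 1 LINES clause lets you keep the header row in the file instead of removing it first.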
For me, I use CSV-Mapper. It creates an array of hashes, uses the first line (the headers) to create the key names in each hash, and each subsequent line provides the values.
Scroll down the page and check the example # Automagical Attribute Discovery Example and Import to ActiveRecord Example
I want to retrieve data from one of the columns of my table named "hote_line". The column is named "elite", and I want to retrieve data only from that column. It contains 15,346 names with some info. Is there any way to get that into Excel so I can make some changes? I have to add an id number to each name, and it would take a long time to edit each name one by one.
First you need to change the engine of the table to allow CSV export:
ALTER TABLE hote_line ENGINE=CSV;
Then, run this query, which will export your column into a file called test.csv:
SELECT elite
INTO OUTFILE 'test.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n'
FROM hote_line
Excel can then easily import CSV files.
Don't forget to change the engine back to whatever it was, probably InnoDB.
EDIT: Try it like this:
SELECT elite
INTO OUTFILE 'test.csv'
FROM hote_line