How to import an xls file into a MySQL table - csv

I'm using the Ubuntu 14.04 operating system and I want to import an Excel sheet (.xls) into a database.
I am using the phpMyAdmin MySQL administration tool. If anyone can give me the commands to import an xls file into MySQL, I would be grateful.

Import a CSV file into a MySQL table
Change the file type to CSV.
Create a table whose columns match the Excel column names (a sketch follows below).
Place the CSV file in /var/lib/mysql.
Open the phpMyAdmin SQL command pane and execute this SQL command:
LOAD DATA INFILE 'testlist.csv'
INTO TABLE TEST
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Use the load data capability. See
http://dev.mysql.com/doc/refman/5.1/en/load-data.html
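For the "create table" step above, here is a minimal sketch of what the table might look like. The column names id, name and email are hypothetical, since the original spreadsheet's headers aren't shown; adjust the names and types to your own columns.
-- Hypothetical table matching the spreadsheet's column headers (adjust as needed).
CREATE TABLE TEST (
    id    INT,
    name  VARCHAR(255),
    email VARCHAR(255)
);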
Export a MySQL table to a CSV file
SELECT ... FROM ... WHERE ...
INTO OUTFILE 'testlist.csv'
FIELDS TERMINATED BY ','
Here is an example that produces a file in the comma-separated values (CSV) format used by many programs:
SELECT * INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;

Related

How to load a JSON column from a Postgres CSV into MySQL

I am using Postgres and MySQL. Now I want to migrate data from Postgres to MySQL.
I export the data from Postgres to CSV and then reimport it into MySQL.
My CSV export from Postgres is named test.csv. The data column has the Postgres json type.
Test.csv
id,name,data
"1","someName","{""email"": ""a#gmail.com"", ""country"": ""US""}"
In MySQL I use this command to reimport it into a table with the same columns:
LOAD DATA LOCAL INFILE '/tmp/test.csv' INTO TABLE someTable FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES;
When I import it, the import succeeds but the data column is NULL. I don't know why.
If I use this command:
LOAD DATA LOCAL INFILE '/tmp/test.csv' INTO TABLE someTable fields terminated by ',' optionally enclosed by '\"' escaped by '\'' lines terminated by '\n' IGNORE 1 LINES;
MySQL throws this exception:
Invalid JSON text: "The document root must not be followed by other values." at position 4 in value for column 'someTable.data'.
How can I import a CSV with a JSON column into MySQL? Please help. Thank you.
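One approach that matches Postgres's CSV conventions (fields enclosed in double quotes, embedded quotes doubled, no backslash escaping) is a LOAD DATA along these lines. This is only a sketch, assuming someTable has the id, name and data columns shown in the sample file:
LOAD DATA LOCAL INFILE '/tmp/test.csv'
INTO TABLE someTable
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' -- a doubled "" inside an enclosed field is read as a single literal "
ESCAPED BY ''              -- Postgres CSV output does not use backslash escaping
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(id, name, data);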

Importing CSV file to HeidiSQL, error 1148

I'm a complete novice with MySQL, and I'm trying to load some CSV sheets into the server using HeidiSQL -> Tools -> Import CSV. But it keeps giving me this error:
Error 1148: "The used command is not allowed with this MySQL version"
Is there a way to fix that, or maybe another way to load a CSV?
For a query-based CSV import you can use LOAD DATA ... to load CSV files.
For more info, refer to the MySQL LOAD DATA documentation.
Example:
To skip the first line of the CSV, which is the header, use IGNORE 1 LINES:
load data local infile 'D:/std_table.csv' into table local.student
fields terminated by ','
enclosed by '"'
lines terminated by '\r\n'
ignore 1 lines;
For a Windows-based system use \r\n for line termination, and for Linux use \n.
Side Note:
You could also try MySQL Workbench for MySQL, available from the official site.
try this one:
LOAD DATA INFILE 'c:/tmp/discounts.csv'
INTO TABLE discounts
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
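If LOAD DATA LOCAL INFILE itself fails with error 1148, the LOCAL capability is most likely switched off on the server or the client. A quick check, assuming you have a privileged account, looks something like this:
-- See whether the server accepts LOAD DATA LOCAL INFILE at all.
SHOW GLOBAL VARIABLES LIKE 'local_infile';

-- Turn it on server-side (requires SUPER or SYSTEM_VARIABLES_ADMIN);
-- the client connection must also have local-infile enabled.
SET GLOBAL local_infile = 1;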

Export MySQL tables to CSV

According to the MySQL documentation, you should be able to export data to CSV files:
The mysqldump command can also generate output in CSV, other delimited text, or XML format.
But the documentation doesn't reveal how. Can anybody help?
SELECT * FROM `database_name`.`table_name`
INTO OUTFILE 'path_to_folder/filename.csv'
FIELDS ENCLOSED BY '"' TERMINATED BY ';' ESCAPED BY '"'
LINES TERMINATED BY '\r\n';
This is how you do it.

Import a CSV file into MySQL using LOAD DATA INFILE

LOAD DATA INFILE 'returns_inprogress_06-Feb-2016-11-35.csv'
IGNORE INTO TABLE fkreturn
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
(`FSN`,`Product`,`Order Id`,`ORDER ITEM ID`,`SKU Code`,`Return ID`,`Return Created On`,`Return Action`,`Return Type`,`Is FA`,`New Order Item Id`,`Sub Reason`,`Comments`,`Shipment Status`,`Tracking Id`,`Good Quantity`,`Bad Quantity`,`Total Price`,`Quantity`);
When executed, it gives the error "#1290 - The MySQL server is running with the --secure-file-priv option so it cannot execute this statement". I am using WAMP on Windows, and the CSV file is in the www folder. All I want is to ignore duplicates of the primary key (Return ID) and upload the others. Should this field also be marked as unique? Or is there any other easy way to upload a CSV into the table while ignoring duplicates?
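One common way around the --secure-file-priv restriction is to either place the file under the directory the server reports for secure_file_priv, or to add LOCAL so the file is read from the client side instead. The IGNORE keyword already discards rows that duplicate a UNIQUE or PRIMARY KEY value, so Return ID does need such an index for duplicates to be skipped. A rough sketch, assuming local_infile is enabled and the hypothetical path c:/wamp/www/ for the file:
-- Where (if anywhere) the server allows LOAD DATA INFILE to read from;
-- an empty value means anywhere, NULL means server-side file loads are disabled.
SELECT @@secure_file_priv;

-- Alternative: read the file from the client, which --secure-file-priv does not restrict.
-- IGNORE skips rows whose `Return ID` duplicates an existing key value instead of aborting.
LOAD DATA LOCAL INFILE 'c:/wamp/www/returns_inprogress_06-Feb-2016-11-35.csv'
IGNORE INTO TABLE fkreturn
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
ESCAPED BY '"'
LINES TERMINATED BY '\n'
(`FSN`,`Product`,`Order Id`,`ORDER ITEM ID`,`SKU Code`,`Return ID`,`Return Created On`,`Return Action`,`Return Type`,`Is FA`,`New Order Item Id`,`Sub Reason`,`Comments`,`Shipment Status`,`Tracking Id`,`Good Quantity`,`Bad Quantity`,`Total Price`,`Quantity`);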

Importing data from Sybase to MySQL

I'm migrating an old Sybase database to a new MySQL db.
Since Sybase can export data to .dat files (something similar to CSV), I decided to use that.
The problem is that Sybase uses commas as the column separator, and commas inside strings are ignored because the strings are enclosed in single quotes, but MySQL does not handle them that way.
Is there a way to solve the problem? Here's my query:
LOAD DATA LOCAL INFILE 'C:\\UNLOAD\\166.dat'
INTO TABLE linea_col
COLUMNS TERMINATED BY ','
LINES TERMINATED BY '\r\n';
Thank you in advance.
If they are enclosed by single quotes try this:
LOAD DATA LOCAL INFILE 'C:\\UNLOAD\\166.dat'
INTO TABLE linea_col
COLUMNS TERMINATED BY ','
ENCLOSED BY '\''
LINES TERMINATED BY '\r\n';
If that doesn't work then you should do ENCLOSED BY '''' but I'm 99.99% certain the first is correct.
I'm not familiar with how DAT files may differ from CSV, but you can use CSV as the common format.
You can export from Sybase using BCP, forcing the output to use CSV format:
bcp database.dbo.tbl_name out c:\temp\output.csv -c -t, -U user_name -P password -S server_name
Then import into MySQL with
load data local infile 'uniq.csv' into table tblUniq fields terminated by ','
enclosed by '"'
lines terminated by '\n'
(uniqName, uniqCity, uniqComments);