mysqlimport - Import CSV File in MS Windows XAMPP environment - mysql

I am trying to import a CSV file into a MySQL database from the command line. This will later be incorporated into a Windows batch file.
mysqlimport -u user -puserpw --columns=ID,CID,Alerted --fields-terminated-by=',' --local School Customer.csv
All the data loads into the first column of the Customer table.
I want the data from the CSV to be imported into the appropriate columns.
CSV Data format:
ID,CID,Alerted
1,CS,N
2,CS,N
3,CS,N
I would like to use mysqlimport, since it will be easier to add to a Windows batch file.
How can I do this? Please help.

I guess you have to add --lines-terminated-by='\n' to the mysqlimport call. I ran a quick test here and it worked.
mysqlimport --local --ignore-lines=1 --fields-terminated-by=',' --lines-terminated-by='\n' db_name table_name.csv

I had to enclose the option values in double quotes ("):
mysqlimport -u user -puserpw --columns=ID,CID,Alerted --ignore-lines=1 --fields-terminated-by="," --lines-terminated-by="\n" --local School Customer.csv
That fixed it.
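Since the command will eventually live in a batch file, here is a minimal sketch of one way to wrap it; the CSV path is an assumption, and note that mysqlimport derives the target table name (Customer) from the file's base name:
@ECHO OFF
REM Hypothetical wrapper batch file; adjust credentials, database, and path to your setup
SET CSVFILE=C:\data\Customer.csv
mysqlimport -u user -puserpw --columns=ID,CID,Alerted --ignore-lines=1 --fields-terminated-by="," --lines-terminated-by="\n" --local School "%CSVFILE%"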

Try using the complete set of delimiter options when importing the CSV file, like:
fields terminated by ','
enclosed by '"'
lines terminated by '\n'
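For example, a full statement for the asker's file might look like this (the path is assumed to be in the current directory):
LOAD DATA LOCAL INFILE 'Customer.csv'
INTO TABLE Customer
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(ID, CID, Alerted);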
Thanks

Related

Import *.ods file in MySQL without phpMyAdmin

I'm trying to import a *.ods file using mysqlimport or 'load data' instead of the phpMyAdmin interface, because I need to automate the process.
This is my first attempt:
mysqlimport --ignore-lines=1 -u root -p DATABASE /home/luca/Scrivania/lettura.ods
mysqlimport can't load the file because it contains two sheets; I need both, and I can't modify the file structure.
With phpMyAdmin I'm able to upload the file correctly: the content of the two sheets correctly fills the two tables. I know that when importing an *.ods file like this, phpMyAdmin uses the sheet name as the table name for the import, but this is not the behavior of mysqlimport. mysqlimport uses the file name, not the sheet name.
So I've tried this:
mysql -u root -p -e "load data local infile '/home/luca/Scrivania/lettura.ods' into table TABLE_NAME" DATABASE
It returns no error, but the data in the table is totally inconsistent.
Any suggestions?
Thank You
MySQL doesn't know how to read spreadsheets. Your best bet is to use your spreadsheet program to export a CSV, then load that new CSV into the database. Specify comma-delimited loading in your query and try it:
LOAD DATA LOCAL INFILE '/home/luca/Scrivania/lettura.csv'
INTO TABLE TABLE_NAME
FIELDS TERMINATED BY ','
ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
STARTING BY ''
The documentation for file loading contains tons of possible options, if you need them.
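Since the process needs to be automated, a hedged note on the export step: if LibreOffice is installed, it can do the ODS-to-CSV conversion headlessly from the command line (it converts only the first sheet, so a two-sheet file may first need to be split):
soffice --headless --convert-to csv /home/luca/Scrivania/lettura.ods --outdir /home/luca/Scrivania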

Exporting comma delimited data from a very large table

I am trying to get all the data from a very large table (around 13 million rows) on a remote host into a text file. I have tried the following command, but after some time the process gets killed and the console shows the message "Killed."
mysql --user=username --password -h host -e "select * from db.table_name" >> output_file.txt
My primary goal is to copy data from MySQL to Redshift, which I do by getting all the data comma-delimited into a text file, uploading it to S3, and executing a COPY query on Redshift.
P.S. For small tables the above command works properly, but not for large tables.
You could try mysqldump instead. It can be parameterized to output CSV if I recall correctly; note that the delimiter options only work together with --tab, which writes a tablename.sql and a tablename.txt file into the given directory, and because it uses SELECT ... INTO OUTFILE under the hood, that directory must be on the MySQL server host and writable by it. I haven't tried this myself, so you might want to check the docs, but something like this should work:
mysqldump --user=username --password -h host \
--tab=/tmp --fields-terminated-by="," --fields-enclosed-by="\"" --lines-terminated-by="\n" \
dbname tablename
If that does not work, you could try SELECT INTO OUTFILE. You will need to do that directly on the MySQL host like this:
SELECT * INTO OUTFILE '/tmp/data.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\' LINES TERMINATED BY '\n'
FROM db.table_name
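A hedged follow-up, since the exported file then lives on the MySQL host: it still has to be copied to the machine that uploads to S3, e.g. with scp if you have SSH access (username, host, and path here are assumptions):
scp username@host:/tmp/data.csv .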

MySQL export issue?

I am using the following command to export the database, however I can't find the FILE.sql file after executing the command. Where is it stored?
mysqldump -u username -ppassword database_name > FILE.sql
Also, how can I check my home directory? I have looked in Program Files (x86)\MySQL and the respective bin folder in it.
The other way to export data to a CSV file is by using the "OUTFILE" syntax provided by MySQL, as shown below.
[Usual MySQL Query]
INTO OUTFILE '/var/data_exports/huge_data.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
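For example, filling in the placeholder with a hypothetical query, the full statement might look like:
SELECT * FROM database_name.table_name
INTO OUTFILE '/var/data_exports/huge_data.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';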
Read about practical usage and more at ResolveBug.com.
It should be in either the root directory or the current directory from which you ran the dump command.
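If you can't find it, one quick way on Windows is to search the whole drive from its root (bare, recursive listing):
REM Search all of C: for the dump file
dir C:\FILE.sql /s /b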

Import multiple CSV files into mysql

I have about 28 csv files and I need to import this into database as 28 tables.
I tried many things but just couldn't find a way.
You can point MySQL directly at each file and load the data using the following SQL syntax:
load data local infile 'uniq.csv' into table tblUniq
fields terminated by ','
enclosed by '"'
lines terminated by '\n'
Read more here: LOAD DATA INFILE
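Since there are 28 files, a loop saves repetition. Here is a minimal sketch for a Unix shell, assuming each CSV is named after its target table, the tables already exist, and dbname stands in for your database (Windows users: see the batch-file approach below):
# Load every CSV in the current directory into the table with the same base name,
# e.g. tblUniq.csv -> tblUniq
for f in *.csv; do
  t=$(basename "$f" .csv)
  mysql --local-infile=1 -u root -p dbname -e "LOAD DATA LOCAL INFILE '$f' INTO TABLE $t FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n';"
done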
Import CSV file directly into MySQL
It's a good solution for Windows users. Just create a text file "import.bat" with the following code, then run it.
@ECHO OFF
FOR %%I In (db\*.sql) DO mysqlimport.exe --local -uroot -proot vipamoda %%I
PAUSE
More complex code, which first imports the SQL structure and then imports the TXT data:
@ECHO OFF
FOR %%I IN (db\*.sql) DO (
mysql.exe -uroot -proot vipamoda < %%~dpnI.sql
mysqlimport.exe --local -uroot -proot vipamoda %%~dpnI.txt
)
PAUSE
And the dump command that produces the files for this import code is:
mysqldump.exe --compact --add-drop-table --tab=db -uroot -proot vipamoda

How do I store a MySQL query result into a local CSV file?

How do I store a MySQL query result into a local CSV file? I don't have access to the remote machine.
You're going to have to use the command line execution '-e' flag.
$> /usr/local/mysql/bin/mysql -u user -p -h remote.example.com -e "select t1.a,t1.b from db_schema.table1 t1 limit 10;" > hello.txt
This will generate a local hello.txt file in your current working directory containing the output from the query.
Use the CONCAT function of MySQL:
mysql -u <username> -p -h <hostname> -e "select concat(userid,',',surname,',',firstname,',',midname) as user from dbname.tablename;" > user.csv
You can delete the first line, which contains the column header "user".
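Alternatively, a hedged note: the mysql client's -N (--skip-column-names) option suppresses the header row entirely, so there is nothing to delete afterwards:
mysql -N -u <username> -p -h <hostname> -e "select concat(userid,',',surname,',',firstname,',',midname) from dbname.tablename;" > user.csv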
We can use the command-line execution '-e' flag and a simple Python script to generate the result in CSV format.
Create a Python file (let's name it tab2csv) and write the following code in it:
#!/usr/bin/env python
import csv
import sys
# Read tab-separated rows from stdin and write them back out comma-separated
tab_in = csv.reader(sys.stdin, dialect=csv.excel_tab)
comma_out = csv.writer(sys.stdout, dialect=csv.excel)
for row in tab_in:
    comma_out.writerow(row)
Run the following command, updating the MySQL credentials accordingly:
mysql -u orch -p -h database_ip -e "select * from database_name.table_name limit 10;" | python tab2csv > outfile.csv
Result will be stored in outfile.csv.
I haven't had a chance to test it against content with difficult characters yet, but the fantastic mycli may be a solution for many.
Command line
mycli --csv -e "select * from table;" mysql://user@host:port/db > file.csv
Interactive mode:
\n \T csv ; \o ~/file.csv ; select * from table1; \P
\n disables the pager that requires pressing space to display each page
\T csv ; - sets the output format to csv
\o <filename> ; - appends next output to a file
<query> ;
\P turns the pager back on
I was facing this problem and had been reading for some time looking for a solution: importing into Excel, importing into Access, saving as a text file...
I think the best solution for Windows is the following:
use the command INSERT ... SELECT to create a "result" table. The ideal scenario would be to automatically create the fields of this result table, which INSERT ... SELECT cannot do (though CREATE TABLE ... AS SELECT can)
create an ODBC connection to the database
use Access or Excel to extract the data, then save or process it in the way you want
For Unix/Linux, I think the best solution might be using the -e option tmarthal mentioned before and processing the output through a processor like awk to get a proper format (CSV, XML, whatever).
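For instance, a minimal sketch of such a pipeline (connection details and column names are placeholders): the mysql client emits tab-separated rows, and awk rewrites them comma-separated. Note this naive version does not quote fields that themselves contain commas; the tab2csv script above handles that properly.
# Tab-separated mysql output -> comma-separated file; $1=$1 forces awk to
# rebuild each record using the comma output separator
mysql -u user -p -h host -e "select a, b, c from db.tbl;" | awk 'BEGIN { FS="\t"; OFS="," } { $1=$1; print }' > out.csv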
Run a MySQL query that generates the CSV from your app, like below:
SELECT order_id,product_name,qty FROM orders INTO OUTFILE '/tmp/orders.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
It will create the CSV file in the /tmp folder on the database server; INTO OUTFILE always writes to the server's file system.
Then you can add logic to send the file through headers.
Make sure the database user has the FILE privilege and the server can write to the target directory.
You can try doing:
SELECT a,b,c
FROM table_name
INTO OUTFILE '/tmp/file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
We use the OUTFILE clause here to store the output in a CSV file. We enclose the fields in double quotes to handle field values that contain commas, we separate the fields with commas, and we separate individual lines using newlines.