MySQL loop for use with LOAD DATA LOCAL INFILE

I have a set of raw data files named using a pattern like a-1.txt, a-2.txt, etc. I am using the LOAD DATA LOCAL INFILE command in a MySQL script to load the raw data files into the database. That command cannot be run in a stored procedure. I'd like to avoid copy/pasting the LOAD statement 20 times, once per raw data file, in the MySQL script, and would much rather use a LOOP to load the files, but LOOP cannot be used outside of a stored procedure.
What's the best way to handle this? How do I get the MySQL script to do this?

Assuming you are using Bash, you can run the following to generate the SQL file.
rm -f testfile.sql
ls *.txt | xargs -I inputfile echo "LOAD DATA LOCAL INFILE 'inputfile' INTO TABLE mytable FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\r\n';" >> testfile.sql
Then to run it, you can add another line.
mysql --local-infile=1 -h localhost -u root -pXXXXXXX mydatabase < testfile.sql
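If you'd rather skip the intermediate SQL file entirely, a plain shell loop works too. This is just a sketch, assuming the same table (mytable), database (mydatabase), and credentials as above:
for f in a-*.txt; do
  mysql --local-infile=1 -h localhost -u root -pXXXXXXX mydatabase \
    -e "LOAD DATA LOCAL INFILE '$f' INTO TABLE mytable FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\r\n';"
done
This runs one mysql client invocation per file; for 20 files that overhead is negligible, and it avoids the quoting pitfalls of building a script with xargs.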

Related

How to find the rows uploaded via Load Data Infile command

I am uploading a CSV file via the LOAD DATA INFILE command in a bash script.
I need to find out how many lines of data were uploaded, so I can compare that with the number of lines in the CSV.
This is to verify that all the data in the CSV file was uploaded.
I am using a bash script to do the upload because it will be executed bi-hourly every day.
How can I get the number of rows uploaded?
mysql -u $_db_user -p$_db_password -D $_db --local-infile=1 <<EOF
LOAD DATA LOCAL INFILE '$_csv_directory/TempImport.csv'
into table TempImport
fields terminated by ','
enclosed by '"'
lines terminated by '\r\n';
EOF
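One sketch of a solution, building on the script above: MySQL's ROW_COUNT() function reports the rows affected by the preceding statement, so appending it to the same session should yield the insert count, and the client's -N (--skip-column-names) flag leaves just the bare number on stdout:
rows_loaded=$(mysql -u $_db_user -p$_db_password -D $_db --local-infile=1 -N <<EOF
LOAD DATA LOCAL INFILE '$_csv_directory/TempImport.csv'
into table TempImport
fields terminated by ','
enclosed by '"'
lines terminated by '\r\n';
SELECT ROW_COUNT();
EOF
)
csv_lines=$(wc -l < "$_csv_directory/TempImport.csv")
echo "loaded $rows_loaded of $csv_lines lines"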

Import *.ods file in MySQL without phpMyAdmin

I'm trying to import a *.ods file using mysqlimport or 'load data' instead of the phpMyAdmin interface, because I need to automate the process.
This is my first attempt:
mysqlimport --ignore-lines=1 -u root -p DATABASE /home/luca/Scrivania/lettura.ods
mysqlimport can't upload the file because there are two sheets in it; I need both, and I can't modify the file structure.
With phpMyAdmin I'm able to upload the file correctly: the content of the two sheets correctly fills the two tables. I know that when importing an *.ods file like this, phpMyAdmin uses each sheet name as the table name to import into, but this is not the behavior of mysqlimport. Mysqlimport uses the file name, not the sheet name.
So I've tried this:
mysql -u root -p -e "load data local infile '/home/luca/Scrivania/lettura.ods' into table TABLE_NAME" DATABASE
It returns no error, but the data in the table is totally inconsistent.
Any suggestions?
Thank You
MySQL doesn't know how to read spreadsheets. Your best bet is to use your spreadsheet program to export a CSV, then load that new CSV into the database. Specify comma-delimited loading in your query and try it:
LOAD DATA LOCAL INFILE '/home/luca/Scrivania/lettura.csv'
INTO TABLE TABLE_NAME
FIELDS TERMINATED BY ','
ENCLOSED BY '' ESCAPED BY '\\'
LINES TERMINATED BY '\n'
STARTING BY ''
The documentation for file loading contains tons of possible options, if you need them.
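Since the process needs to be automated end to end, the spreadsheet export itself can be scripted as well. As a sketch, LibreOffice can do the conversion headlessly (though, as far as I know, --convert-to csv only exports the first sheet, so a two-sheet file like this one may still need to be split up first):
libreoffice --headless --convert-to csv --outdir /home/luca/Scrivania /home/luca/Scrivania/lettura.ods
That produces /home/luca/Scrivania/lettura.csv, which the LOAD DATA statement above can then read.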

LOAD DATA LOCAL INFILE in a script not populating data (mysql)

I am trying to build an automated CSV-to-MySQL dump which occurs every time the CSV file in a certain directory is updated.
Once the file is updated, the following line of code is executed in a bash script:
mysql -u root -p$MASTER_DB_PASSW < /usr/local/scripts/order.sql
The contents of order.sql are as follows:
use test;
truncate test.ORDER;
load data infile '/home/test/ORDER.csv' into table test.ORDER fields terminated by ','
enclosed by '"'
lines terminated by '\r\n'
IGNORE 1 LINES
(order_date);
For some reason, when I run order.sql manually in Workbench, it works perfectly... but when I run it on the server, the contents of the csv file do not get dumped. Any tips/advice? Thanks in advance.
I solved it.
I had an error in my bash script. I found this by running:
bash -x scriptname
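For anyone debugging a similar failure, a sketch of the same approach (the script name here is hypothetical): trace the script and check the mysql client's exit status rather than assuming the load ran:
bash -x /usr/local/scripts/load_order.sh
mysql -u root -p$MASTER_DB_PASSW < /usr/local/scripts/order.sql || echo "mysql failed with status $?"
With -x, bash prints each command as it executes, which makes a mangled variable or path easy to spot.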

How to use a bash script to write to a mysql table

I'm using a bash script to pull data from online sources. Right now I just have it writing to a text file, but it would be better if the script could automatically put this data into mysql tables. How can this be done? Examples would be helpful.
Suppose you download a .csv file which has a header, and you have a database test in MySQL.
Download the file first.
wget http://domain.com/data.csv -O data.csv
Then load the data into the MySQL table tbl:
cat <<'FINISH' | mysql -uUSERNAME -pPASSWORD test
LOAD DATA INFILE 'data.csv' INTO TABLE `tbl`
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
FINISH
Here USERNAME must have the FILE privilege.
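If granting FILE isn't an option, a hedged alternative is the LOCAL variant: the client reads data.csv from its own working directory (the non-LOCAL form above resolves a relative path on the server side), and it needs only local_infile enabled rather than the FILE privilege:
cat <<'FINISH' | mysql -uUSERNAME -pPASSWORD --local-infile=1 test
LOAD DATA LOCAL INFILE 'data.csv' INTO TABLE `tbl`
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
FINISH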
You can use bash like this:
#!/bin/bash
params="dbname -uuser -ppasswd"
echo "SELECT * FROM table;" | mysql $params
# or
mysql $params <<DELIMITER
SELECT * FROM table;
DELIMITER

How do I store a MySQL query result into a local CSV file?

How do I store a MySQL query result into a local CSV file? I don't have access to the remote machine.
You're going to have to use the command line execution '-e' flag.
$> /usr/local/mysql/bin/mysql -u user -p -h remote.example.com -e "select t1.a,t1.b from db_schema.table1 t1 limit 10;" > hello.txt
Will generate a local hello.txt file in your current working directory with the output from the query.
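The columns in that output are tab-separated rather than comma-separated. If you need real CSV, a naive sketch (it won't quote values that themselves contain tabs or commas) is to pipe through tr:
/usr/local/mysql/bin/mysql -u user -p -h remote.example.com -e "select t1.a,t1.b from db_schema.table1 t1 limit 10;" | tr '\t' ',' > hello.csv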
Use the CONCAT function of MySQL:
mysql -u <username> -p -h <hostname> -e "select concat(userid,',',surname,',',firstname,',',midname) as user from dbname.tablename;" > user.csv
You can delete the first line, which contains the column name "user".
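Alternatively, pass -N (--skip-column-names) to the client so the header row is never printed in the first place:
mysql -N -u <username> -p -h <hostname> -e "select concat(userid,',',surname,',',firstname,',',midname) as user from dbname.tablename;" > user.csv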
We can use the command-line '-e' flag and a simple Python script to generate the result in CSV format.
Create a Python file (let's name it tab2csv) and put the following code in it:
#!/usr/bin/env python
import csv
import sys

# Read tab-separated rows from stdin and write them to stdout comma-separated
tab_in = csv.reader(sys.stdin, dialect=csv.excel_tab)
comma_out = csv.writer(sys.stdout, dialect=csv.excel)
for row in tab_in:
    comma_out.writerow(row)
Run the following command, updating the MySQL credentials as appropriate:
mysql -u orch -p -h database_ip -e "select * from database_name.table_name limit 10;" | python tab2csv > outfile.csv
Result will be stored in outfile.csv.
I haven't had a chance to test it against content with difficult characters yet, but the fantastic mycli may be a solution for many.
Command line
mycli --csv -e "select * from table;" mysql://user@host:port/db > file.csv
Interactive mode:
\n \T csv ; \o ~/file.csv ; select * from table1; \P
\n disables the pager that requires pressing space to display each page
\T csv ; - sets the output format to csv
\o <filename> ; - appends next output to a file
<query> ;
\P turns the pager back on
I faced this problem too and spent some time reading up on solutions: importing into Excel, importing into Access, saving as a text file...
I think the best solution for Windows is the following:
use the command INSERT ... SELECT to create a "result" table (the ideal scenario would be to automatically create the fields of this result table, but this is not possible in MySQL)
create an ODBC connection to the database
use Access or Excel to extract the data and then save or process it the way you want
For Unix/Linux, I think the best solution might be using the -e option tmarthal mentioned before and processing the output through a processor like awk to get a proper format (CSV, XML, whatever).
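As a sketch of that awk post-processing (again naive about values containing commas; the table and credentials are placeholders):
mysql -u user -p -h remote.example.com -e "select a,b,c from db_schema.table1;" | awk 'BEGIN { FS = "\t"; OFS = "," } { $1 = $1; print }' > result.csv
The $1 = $1 assignment forces awk to rebuild each line with the comma output separator.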
Run the MySQL query to generate the CSV from your app, like below:
SELECT order_id,product_name,qty FROM orders INTO OUTFILE '/tmp/orders.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"' LINES TERMINATED BY '\n'
It will create the CSV file in the /tmp folder of the machine the MySQL server runs on, not necessarily the application host.
Then you can add logic to send the file through headers.
Make sure the database user has the FILE privilege and the MySQL server process can write to that location on its file-system.
You can try doing:
SELECT a,b,c
FROM table_name
INTO OUTFILE '/tmp/file.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
We use the OUTFILE clause here to store the output in a CSV file. We enclose the fields in double quotes to handle field values that contain commas, separate the fields with commas, and separate individual lines with newlines.
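One caveat worth checking first: many MySQL installations restrict where the server may write OUTFILE targets via the secure_file_priv variable, so if the statement fails with a permissions-style error, look at its value:
SHOW VARIABLES LIKE 'secure_file_priv';
INTO OUTFILE must point somewhere under that directory, and the target file must not already exist.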