how to create a bulk mysql upload script in linux

I have a MySQL database with tables that have the same column names as the CSV files I receive. I used to use a Windows batch file to upload them into MySQL.
@echo off
echo TRUNCATE TABLE `alarms`; > importalarm.sql
echo Creating list of MySQL import commands
for %%s in (*.csv) do echo LOAD DATA LOCAL INFILE '%%s' INTO TABLE `alarms` FIELDS TERMINATED BY ',' ENCLOSED BY ' ' ESCAPED BY '/' LINES TERMINATED BY '\r\n' IGNORE 1 LINES; >> importcsv.sql
echo mysql -u root -p uninetscan < importalarm.sql
I need to turn this into a Linux script. Any ideas?

Here's how you'd do something similar on Linux:
#!/bin/bash
echo 'TRUNCATE TABLE `alarms`;' > importalarm.sql
echo "Creating list of MySQL import commands"
for f in *.csv; do
echo "LOAD DATA LOCAL INFILE '$f' INTO TABLE \`alarms\` FIELDS TERMINATED BY ',' ENCLOSED BY ' ' ESCAPED BY '/' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;" >> importcsv.sql
done
echo 'mysql -u root -p uninetscan < importalarm.sql'
I noticed that you're sending the output to two files (importalarm.sql and importcsv.sql); not sure if that's what you want, but it's easy to change.
Also, the last line just echoes the command rather than executing it. If you want to execute it, remove the echo and the quotes.
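If the two file names were in fact meant to be one, the whole thing collapses into a single script that builds one SQL file and then runs it. A sketch, using the table, database, and credentials from the question (the mysql call is left commented out so the generation step can be checked first):

```shell
#!/bin/bash
# Sketch: write a TRUNCATE plus one LOAD DATA statement per CSV in the
# current directory into a single SQL file, then feed that to mysql.
build_import_sql() {
    local sqlfile=importalarm.sql
    echo 'TRUNCATE TABLE `alarms`;' > "$sqlfile"
    local f
    for f in *.csv; do
        [ -e "$f" ] || continue   # no CSVs present: leave just the TRUNCATE
        printf "LOAD DATA LOCAL INFILE '%s' INTO TABLE \`alarms\` FIELDS TERMINATED BY ',' ENCLOSED BY ' ' ESCAPED BY '/' LINES TERMINATED BY '\\\\r\\\\n' IGNORE 1 LINES;\n" "$f" >> "$sqlfile"
    done
}
build_import_sql
# Uncomment to actually run the import (modern clients/servers need
# local_infile enabled on both ends):
# mysql --local-infile=1 -u root -p uninetscan < importalarm.sql
```

Generating the file first also gives you a chance to eyeball the statements before anything touches the database.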

Related

How to import multiple csv files to mysql

I have 3000 csv files that are "-enclosed and comma-separated. Each is named after the server the data was collected from. I need to import them into a database so I can work with the data.
For an individual file I've tried:
mysql -uuser -ppassword --local-infile mydatabase -e "LOAD DATA LOCAL INFILE '/root/downloads/test/reports/mytestcsvfile.csv' INTO TABLE results FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES FIELDS FIELDS ESCAPED BY '\' "
and this works fine except that every data element is enclosed in quotes. If I add
ENCLOSED BY '"'
then the " in that parameter stops the import working and just gives me a > prompt.
If I escape the " with
ENCLOSED BY '\"'
then I get an error :
ERROR 1064 (42000) at line 1: You have an error in your SQL syntax; check the manual that corresponds to your MariaDB server version for the right syntax to use near 'FIELDS ENCLOSED BY '"' FIELDS ESCAPED BY '\'' at line 1
I'll need to enclose the whole lot in something like:
#!/bin/bash
FILES=/root/downloads/smtptest/reports/*
for f in $FILES
do
echo "processing $f..."
mysql -uuser -ppassword --local-infile mydatabase -e "LOAD DATA LOCAL INFILE '/root/downloads/test/reports/$f' INTO TABLE results FIELDS TERMINATED BY ',' LINES TERMINATED BY '\n' IGNORE 1 LINES FIELDS FIELDS ESCAPED BY '\' "
done
How can I run the import and strip the "? mysqlimport looks to be ruled out, as it imports into a table named after the file, which isn't something I want.
And...
To allow me to run it for each file at the moment I'm having to run this first:
for f in file*.csv; do echo ":" $f ":"; sed -i "s/^/\"$f\",/" "$f"; done
This adds each server name (the filename) to the start of each line in each file, so that it becomes part of the record in the database. Is there a more elegant way of doing this?
I found the problem: I was using FIELDS multiple times, to introduce the ENCLOSED BY, TERMINATED BY and ESCAPED BY parameters separately. When I removed the repeats, it all started working.
This worked:
#!/bin/bash
for filename in /root/downloads/smtptest/reportsmod/*.csv; do
echo "processing $filename"
mysql -uuser -ppassword --local-infile smtpsslcheck -e "LOAD DATA LOCAL INFILE '$filename' INTO TABLE results FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n' IGNORE 1 LINES"
done
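As for prepending the server name, one alternative to rewriting each file in place with sed -i (a sketch, not from the thread) is to let awk add the filename column on the fly, writing modified copies and leaving the originals untouched:

```shell
#!/bin/bash
# Sketch: emit a copy of each CSV with the quoted filename prepended to
# every line (the header line too, which IGNORE 1 LINES skips anyway).
# Assumes filenames contain no double quotes.
add_name_column() {
    # $1 = input csv, $2 = output csv
    awk -v name="\"$1\"," '{ print name $0 }' "$1" > "$2"
}
mkdir -p modified
for f in *.csv; do
    [ -e "$f" ] || continue
    add_name_column "$f" "modified/$f"
done
```

The LOAD DATA loop would then point at the modified/ directory instead of the originals.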

Add progress bar to MySQL in shell script

I'm using the code below to load data from a file into my database:
mysql -h $_host -u $_db_user -p$_db_password $_db --local_infile=1 -e "use $_db" -e"
LOAD DATA LOCAL INFILE '$_file'
REPLACE INTO TABLE my_db.\`load_temp\`
FIELDS TERMINATED BY 0x2c
OPTIONALLY ENCLOSED BY '\"'
(
\`supplier_Pin\`,
\`products_sl\`,
\`products_long_description\`,
\`rv_no\`,
\`weight\`,
\`products_cost\`
)
; "
The files are around 500 MB, so the import takes a long time.
I need to display a progress bar during the import.
Please suggest a method to do it.
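The thread gives no answer, but one common approach (an assumption, not from the original post) is to pipe the file through pv, which prints a byte-count progress bar, and have MySQL read the rows from stdin instead of opening the file itself:

```shell
#!/bin/bash
# Sketch, assuming pv(1) is installed and that $_host, $_db_user,
# $_db_password, $_db and $_file are set as in the question.
# pv copies the file to stdout while drawing a progress bar;
# LOAD DATA then reads from /dev/stdin rather than the file path.
import_with_progress() {
    pv "$_file" | mysql -h "$_host" -u "$_db_user" -p"$_db_password" \
        "$_db" --local-infile=1 -e "
    LOAD DATA LOCAL INFILE '/dev/stdin'
    REPLACE INTO TABLE my_db.\`load_temp\`
    FIELDS TERMINATED BY 0x2c
    OPTIONALLY ENCLOSED BY '\"'
    (\`supplier_Pin\`, \`products_sl\`, \`products_long_description\`,
     \`rv_no\`, \`weight\`, \`products_cost\`);"
}
```

Because pv knows the file's size up front, it can show percentage and ETA as well as throughput.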

For Files in .... mysql update

I have a batch process that looks like this:
for /F %%a in (Files_april.txt) do (BlaBla_filedecryptionv3.exe -i FullRefresh_%%a.afi -k keys.txt -l customer_layout.txt -o customer_%%a.txt -t d -d \124)
mysql -q -h beast --port=3310 -u jxxxx --password="xxxx" di < load_april.sql
mysql -q -h beast --port=3310 -u xxxxx --password="xxxx" di < exportapril.sql > april.txt
I was wondering if it's possible to use the same %%a idea from the file decryption step in a mysql query.
The load_april.sql looks like:
load data local infile 'W:\\New DLP\\customer_20140331_0.txt' into table di.dlp_monthly_201404
fields terminated by '|'
lines terminated by '\r\n';
load data local infile 'W:\\New DLP\\customer_20140331_1.txt' into table di.dlp_monthly_201404
fields terminated by '|'
lines terminated by '\r\n';
Where the "20140331_0" and "20140331_1" are what I want replaced so it would be something like this:
load data local infile 'W:\\New DLP\\customer_%%a.txt' into table di.dlp_monthly_201404
fields terminated by '|'
lines terminated by '\r\n';
Is this possible? Any help would be great.
You can include an SQL query directly on the mysql command line, rather than loading it from a file. The resulting batch file would look something like:
for /F %%a [...]
mysql [...] -e "load data local infile 'path\\%%a.txt' into [...]"
[...]
Where [...] needs to be expanded according to your needs.

Bash Script for Load Data Infile MySQL

So I'm trying to create a script that will do a batch import of csv files into a table.
I'm having trouble getting the script to work.
Here is the script I'm running:
#!/bin/bash
for f in *.csv
do
"/opt/lampp/bin/mysql -e use test -e LOAD DATA LOCAL INFILE ("$f") INTO TABLE temp_table FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' LINES TERMINATED BY '\n' IGNORE 1 LINES (DATE, TIME, SITE_NAME, SITE_IP, TOTAL_TALKTIME, EDGE_UL_BYTES, EDGE_DL_BYTES);"
done
When I run the script I receive the following error message:
./script.sh: line 5: unexpected EOF while looking for matching `''
./script.sh: line 7: syntax error: unexpected end of file
The LOAD DATA LOCAL INFILE command works fine directly in mysql.
When you want literal double quotes inside a double-quoted string, escape them with \". Since mysql doesn't care about line breaks, you can also split the statement across lines to make it more readable:
#!/bin/bash
for f in *.csv
do
/opt/lampp/bin/mysql -e "use test" -e "
LOAD DATA LOCAL INFILE '$f'
INTO TABLE temp_table
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(DATE, TIME, SITE_NAME, SITE_IP, TOTAL_TALKTIME,
EDGE_UL_BYTES, EDGE_DL_BYTES);"
done
mysql -u<username> -p<password> -h<hostname> <db_name> --local_infile=1 -e "use <db_name>" -e"LOAD DATA LOCAL INFILE '<path/file_name>'
IGNORE INTO TABLE <table_name>
FIELDS TERMINATED BY '\t'
OPTIONALLY ENCLOSED BY '\"'"

How to use a bash script to write to a mysql table

I'm using a bash script to pull data from online sources. Right now I just have it writing to a text file, but it would be better if the script could automatically put this data into mysql tables. How can this be done? Examples would be helpful.
Suppose you download a .csv file that has a header row, and you have a database named test in MySQL.
Download the file first:
wget http://domain.com/data.csv -O data.csv
Dump the data into the MySQL table tbl:
cat <<'FINISH' | mysql -uUSERNAME -pPASSWORD test
LOAD DATA INFILE 'data.csv' INTO TABLE `tbl`
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
FINISH
(The heredoc delimiter is quoted so the shell doesn't treat the backticks around tbl as command substitution.)
Here USERNAME must have the FILE privilege, since LOAD DATA INFILE without LOCAL reads the file on the server.
You can use bash like this:
#!/bin/bash
params="dbname -uuser -ppasswd"
echo "SELECT * FROM table;" | mysql $params
# or
mysql $params <<DELIMITER
SELECT * FROM table;
DELIMITER
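For data that isn't already CSV-shaped, another sketch (table "metrics" and its columns are hypothetical, not from the question) is to have the script generate INSERT statements and pipe them in the same way:

```shell
#!/bin/bash
# Sketch: turn "name value" lines pulled from some online source into
# one INSERT statement per line. Assumes names contain no single quotes
# and values are numeric (so they go into the SQL unquoted).
make_inserts() {
    local name value
    while read -r name value; do
        printf "INSERT INTO metrics (name, value) VALUES ('%s', %s);\n" \
            "$name" "$value"
    done
}
# Usage: make_inserts < scraped.txt | mysql -uUSERNAME -pPASSWORD test
```

For large pulls, LOAD DATA on a temporary file will be much faster than row-by-row INSERTs, but this form is handy for small amounts of scraped data.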