Add a progress bar to a MySQL import in a shell script

I'm using the code below to load data from a file into my database:
mysql -h $_host -u $_db_user -p$_db_password $_db --local_infile=1 -e "use $_db" -e"
LOAD DATA LOCAL INFILE '$_file'
REPLACE INTO TABLE my_db.\`load_temp\`
FIELDS TERMINATED BY 0x2c
OPTIONALLY ENCLOSED BY '\"'
(
\`supplier_Pin\`,
\`products_sl\`,
\`products_long_description\`,
\`rv_no\`,
\`weight\`,
\`products_cost\`
)
; "
The files are around 500 MB, so the import takes a long time. I need to display a progress bar during the import. Please suggest a method to do it.
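One way to get a progress bar is to stream the file through pv, which knows the file size and draws a progress bar on stderr, and point LOAD DATA LOCAL INFILE at the pipe instead of the file. A minimal sketch, assuming pv is installed and the platform exposes the pipe as /dev/stdin (Linux does):

pv "$_file" | mysql -h "$_host" -u "$_db_user" -p"$_db_password" "$_db" --local_infile=1 -e "
LOAD DATA LOCAL INFILE '/dev/stdin'
REPLACE INTO TABLE my_db.\`load_temp\`
FIELDS TERMINATED BY 0x2c
OPTIONALLY ENCLOSED BY '\"'
(\`supplier_Pin\`, \`products_sl\`, \`products_long_description\`, \`rv_no\`, \`weight\`, \`products_cost\`);"

pv prints percentage, throughput, and ETA while mysql consumes the pipe; the LOAD DATA statement itself is unchanged apart from the file name.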

Related

MySQL - LOAD DATA LOCAL INFILE shell script not working

Can someone please assist? I am new to MySQL, and I have noticed that LOAD DATA LOCAL INFILE does not work in the MySQL event scheduler for updating my databases from a normal .csv.
So I'm trying to set up a cron job in Linux to run a shell script that does the LOAD DATA INFILE into my databases, but I'm getting errors on the following shell script. Please help me correct it; see my script layout below...
#!/bin/bash
mysql -u root -p xxxxxxx testdb --local_infile=1 -e"LOAD DATA LOCAL INFILE '/mnt/mysqldb/mysqldb-new/mysql/CK-BATCH-FTP/Acelity/activity.csv'
INTO TABLE acelity_activity
FIELDS TERMINATED BY ';'
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;
Thank you for helping me
You must escape the double quote in ENCLOSED BY, and I think you are also missing the closing double quote in your original code:
#!/bin/bash
mysql -u root -p xxxxxxx testdb --local_infile=1 -e"LOAD DATA LOCAL INFILE '/mnt/mysqldb/mysqldb-new/mysql/CK-BATCH-FTP/Acelity/activity.csv'
INTO TABLE acelity_activity
FIELDS TERMINATED BY ';'
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;"
And you should test the command in Workbench or phpMyAdmin.
Thanks for the help, @nbk.
So I modified the script above a little, and the following worked for me (I removed the DB name from the mysql invocation and added it in the query):
mysql -u root -pxxxxxxx -e"LOAD DATA LOCAL INFILE '/mnt/mysqldb/mysqldb-new/mysql/CK-BATCH-FTP/File/XXXX.csv'
INTO TABLE <DB NAME>.<TABLE NAME>
FIELDS TERMINATED BY ';'
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;"
If you are using MySQL 8, then this will help you.
mysql -udbuser -pXXXXX -h host-pc test1 --local_infile=1 -e "LOAD DATA LOCAL INFILE '/home/abc/data.csv' INTO TABLE tempTable FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\n' (col1, col2);";
Make sure there is no space between the switch and its value for the username and password, i.e. -udbuser and -pXXXXX exactly as in the command above. Also, ENCLOSED BY needs to be written in the format shown.
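Since the original goal was a cron job, here is a minimal sketch of the wiring; the script path, schedule, and log file are assumptions. Keeping the credentials in the cron user's ~/.my.cnf (a [client] section with user= and password=) avoids putting the password in the crontab or the process list:

#!/bin/bash
# /home/user/load_activity.sh -- sketch: import the daily CSV.
# Credentials are read from ~/.my.cnf, so none appear on the command line.
mysql --local_infile=1 testdb -e "LOAD DATA LOCAL INFILE '/mnt/mysqldb/mysqldb-new/mysql/CK-BATCH-FTP/Acelity/activity.csv'
INTO TABLE acelity_activity
FIELDS TERMINATED BY ';'
ENCLOSED BY '\"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS;"

And the crontab entry (crontab -e), running it daily at 02:30 and keeping a log:

30 2 * * * /home/user/load_activity.sh >> /home/user/load_activity.log 2>&1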

Linux bash MySQL load infile

I'm trying to simplify a LOAD DATA LOCAL INFILE by putting it into a .sh file and using bash to run it.
Here is my count_portal.sh file:
mysql -h host -u root -p password
load data local infile
"/workplace/user/dump/count_portal.txt"
replace into table test.icqa_count_portal
fields terminated by '\t' lines terminated by '\n' ignore 1 lines;
And here is how I run it:
bash /home/user/Desktop/count_portal.sh
I get output that doesn't do what it is designed to do. When I simply make count_portal.sh contain just mysql -h host, it logs in when I run the script.
I figured it out. Here is my file.
#!/bin/bash
/usr/bin/mysql --host=host --user=root --password=password --database=test<<EOFMYSQL
load data local infile '/workplace/user/ETLdump/count_portal.txt' replace INTO TABLE count_portal fields terminated by '\t' LINES
TERMINATED BY '\n' ignore 1 lines;
EOFMYSQL
Works flawlessly!
Try removing the space between the password switch and its value, so it would be something like:
mysql -h host -uroot -ppassword etc....
With a space after -p, mysql prompts for the password interactively and treats the next word as the database name; that has caused problems for me when doing similar things inside bash scripts.
Missed -e?
mysql -h host -u root -ppassword dbname -e "load data local infile '/workplace/user/dump/count_portal.txt' replace into table test.icqa_count_portal fields terminated by '\t' lines terminated by '\n' ignore 1 lines;"
I agree with @PasteBT; his answer should work well. I think there is an escaping problem, or a shell variable is empty.
Have you tried this?:
echo "LOAD DATA LOCAL INFILE '/workplace/user/dump/count_portal.txt' REPLACE INTO TABLE test.icqa_count_portal FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n' IGNORE 1 LINES;" | mysql -h host -u root -ppassword
Also, your original shell script is wrong. Could you post again exactly what you did?
mysql -u<username> -p<password> -h<hostname> <db_name> --local_infile=1 -e "use <db_name>" -e"LOAD DATA LOCAL INFILE '<path/file_name>'
IGNORE INTO TABLE <table_name>
FIELDS TERMINATED BY '\t'
OPTIONALLY ENCLOSED BY '\"'"

Exporting comma-delimited data from a very large table

I am trying to get all the data from a very large table (around 13 million rows) on a remote host into a text file. I have tried the following command, but after some time the process gets killed and "Killed." is printed in the console.
mysql --user=username --password -h host -e "select * from db.table_name" >> output_file.txt
My primary goal is to copy data from MySQL to Redshift, which I am doing by getting all the data comma-delimited into a text file, uploading it to S3, and executing a COPY query on Redshift.
P.S. For small tables the above command works properly, but not for large tables.
You could try mysqldump instead. It can be parameterized to output CSV, but only via the --tab option, which writes the data file on the server host (and requires the FILE privilege plus a secure_file_priv setting that allows the directory). I haven't tried this myself, so you might want to check the docs, but something like this should work, producing /tmp/tablename.txt on the server:
mysqldump --user=username --password -h host --tab=/tmp \
--fields-terminated-by="," --fields-enclosed-by="\"" --lines-terminated-by="\n" \
dbname tablename
If that does not work, you could try SELECT INTO OUTFILE. As with --tab, you will need to run it directly on the MySQL host, like this:
SELECT * INTO OUTFILE '/tmp/data.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\' LINES TERMINATED BY '\n'
FROM db.table_name;
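On the original "Killed." symptom: by default the mysql client buffers the whole result set in memory before writing it out, so a 13-million-row SELECT can exhaust RAM and get killed by the OOM killer. The --quick option makes the client stream rows one at a time instead. A rough sketch (the sed step is a naive TSV-to-CSV conversion and assumes the data contains no tabs, commas, or quotes that need real escaping):

mysql --quick --batch --skip-column-names --user=username --password -h host \
-e "SELECT * FROM db.table_name" \
| sed 's/\t/,/g' > output_file.txt

--batch prints tab-separated rows without the ASCII-art table, and --quick keeps the client's memory use flat regardless of table size.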

Cron job - how to export a data table to a CSV/XLS file?

I'm trying to configure a cron job to automatically export a data table from a MySQL DB to a CSV/XLS file every day, and I'm stuck on the command field.
Which command should I use?
Thank you.
Mor
This is an old topic, but for posterity: what I've found works in this case is to simply write the SQL in a separate file, then source that file into mysql. That way, you don't have to worry about quotes and escapes, etc.
So, for example, I would have a SQL file called exportinventory.sql, like so:
SELECT *
FROM inventory INTO OUTFILE '/tmp/inventory.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
And then, from the command line:
mysql -u user -p database < exportinventory.sql
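To schedule that with cron, the interactive -p password prompt won't work; one common approach (a sketch; the schedule, paths, and the use of ~/.my.cnf for credentials are assumptions) is:

# crontab -e for a user whose ~/.my.cnf holds the MySQL credentials;
# run the export every day at 01:00
0 1 * * * /usr/bin/mysql database < /home/user/exportinventory.sql

Note that INTO OUTFILE refuses to overwrite an existing file, so the previous day's /tmp/inventory.csv has to be removed or renamed before each run.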
There are three steps to accomplish this:
1. Write a program that exports the data the way you want (PHP, Java).
2. Write a bash script that sets up the environment properly and calls that program.
3. Install that script in a user's crontab.
You might find information on all three over here.
You can try something like:
mysql -h $DB_HOST -u $DB_USER --password=$DB_PASSWORD $DB_NAME -e "$YOUR_QUERY INTO OUTFILE '$PATH_TO_OUTPUT_FILE' FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'"

How to create a bulk MySQL upload script in Linux

I have a MySQL database with tables that have the same column names as the CSV files I receive. I used to use a Windows batch file to upload them into MySQL.
#echo off
echo TRUNCATE TABLE `alarms`; > importalarm.sql
echo Creating list of MySQL import commands
for %%s in (*.csv) do
echo LOAD DATA LOCAL INFILE '%%s' INTO TABLE `alarms`
FIELDS TERMINATED BY ',' ENCLOSED BY ' '
ESCAPED BY '/'LINES TERMINATED BY '\r\n' IGNORE 1 LINES; >> importcsv.sql;
echo mysql -u root -p uninetscan < importalarm.sql
I need to turn this into a Linux script.
Any ideas?
Here's how you'd do something similar on Linux:
#!/bin/bash
echo 'TRUNCATE TABLE `alarms`;' > importalarm.sql
echo "Creating list of MySQL import commands"
for f in *.csv; do
echo "LOAD DATA LOCAL INFILE '$f' INTO TABLE \`alarms\` FILEDS TERMINATED BY ',' ENCLOSED BY ' ' ESCAPED BY '/' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;" >> importcsv.sql;
done
echo 'mysql -u root -p uninetscan < importalarm.sql'
I noticed that you are sending the output to two files (importalarm.sql and importcsv.sql); I'm not sure if that's what you want, but it's really easy to change.
Also, the last line just echoes the command rather than executing it. If you want to execute it, just remove the echo and the quotes.
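Putting those two remarks together, a sketch of a version that writes everything to one file and actually runs the import (assuming mysql can log in non-interactively, or you are happy with the -p prompt):

#!/bin/bash
# Build a single SQL file: truncate, then one LOAD DATA per CSV in this directory.
echo 'TRUNCATE TABLE `alarms`;' > importalarm.sql
for f in *.csv; do
    echo "LOAD DATA LOCAL INFILE '$f' INTO TABLE \`alarms\` FIELDS TERMINATED BY ',' ENCLOSED BY ' ' ESCAPED BY '/' LINES TERMINATED BY '\r\n' IGNORE 1 LINES;" >> importalarm.sql
done
# Execute it; --local_infile=1 is needed for LOAD DATA LOCAL on most setups.
mysql --local_infile=1 -u root -p uninetscan < importalarm.sql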