I have a bunch of JPEG images stored as BLOBs in a MySQL database which I need to download to my local machine. The following does not work; can someone please advise?
Note: I know the code below just overwrites the same file, but for the purpose of this exercise that does not matter.
IFS=$'\n'
for i in `mysql -sN -u******* -p******** -h****** -e "select my_images from mutable"`; do
echo $i > myimage.jpg
done
I'm not really sure what is not working with your code, but you should be able to fetch all data and save each image like this:
#!/bin/bash
counter=0;
for i in `mysql -s -N -u******* -p******** -h****** -e"select my_images from mutable"`; do
echo "$i" > "image${counter}.jpg";
counter=$((counter+1));
done
#!/bin/bash
counter=0;
for i in `mysql -N -u * -p* -e "SELECT id from multable"`; do
mysql -N -u * -p* -e "SELECT my_images FROM multable WHERE id=$i INTO DUMPFILE '/var/lib/mysql-files/tmp.jpg';"
mv /var/lib/mysql-files/tmp.jpg "image${counter}.jpg"
counter=$((counter+1));
done
This code extracts the images from the BLOB column without corrupting them.
The key point is to use INTO DUMPFILE, which writes each file to the folder indicated by @@secure_file_priv.
You can see the current @@secure_file_priv value with the following command:
$ echo "SELECT @@secure_file_priv" | mysql -u * -p
You can change the @@secure_file_priv value by setting it in my.cnf.
There can be several my.cnf files on a system; you can check the order in which MySQL reads them with the following command:
$ mysqld --help --verbose | grep cnf -A2 -B2
But I still suggest dumping the image file to /var/lib/mysql-files/ first and then copying it, because some versions of MySQL can only use the /var/lib/mysql-files folder even if you have changed @@secure_file_priv to point to another folder.
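For reference, a minimal sketch of the relevant my.cnf section (the path is only an example; the server must be restarted for the change to take effect, since secure_file_priv is read-only at runtime):
[mysqld]
secure_file_priv = /var/lib/mysql-files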
I want to connect to a MySQL database, execute some queries, and export the result to a variable, and all of this needs to be done entirely in a bash script.
I have a code snippet, but it does not work.
#!/bin/bash
BASEDIR=$(dirname $0)
cd $BASEDIR
mysqlUser=n_userdb
mysqlPass=d2FVR0NA3
mysqlDb=n_datadb
result=$(mysql -u $mysqlUser -p$mysqlPass -D $mysqlDb -e "select * from confs limit 1")
echo "${result}" >> a.txt
What's the problem?
The issue was resolved in the chat by using the correct password.
If you further want to get only the data, use mysql with -NB (or --skip-column-names and --batch).
Also, the script needs to quote the variable expansions, or there will be issues with usernames/passwords containing characters that are special to the shell. Additionally, uppercase variable names are usually reserved for system variables.
#!/bin/sh
basedir=$(dirname "$0")
mysqlUser='n_userdb'
mysqlPass='d2FVR0NA3'
mysqlDb='n_datadb'
cd "$basedir" &&
mysql -NB -u "$mysqlUser" -p"$mysqlPass" -D "$mysqlDb" \
-e 'select * from confs limit 1' >a.txt 2>a-err.txt
Ideally though, you'd use a my.cnf file to configure the username and password.
See e.g.
MySQL Utilities - ~/.my.cnf option file
mysql .my.cnf not reading credentials properly?
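For example, a minimal ~/.my.cnf sketch reusing the values from the script above (keep the file private, e.g. chmod 600 ~/.my.cnf):
[client]
user=n_userdb
password=d2FVR0NA3
database=n_datadb
With this in place, the mysql invocation needs no credential flags at all.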
Do this:
result=$(mysql -u $mysqlUser -p$mysqlPass -D $mysqlDb -e "select * from confs limit 1" | grep '^\|' | tail -1)
Bash's $() command substitution can be awkward with output that spans multiple lines, so the hack above greps only the interesting part: the data.
I would like to generate 2 files (same data, same file name, but different paths). How can I do this?
Here is my code
mysql -h -u -p -e "select * from customer" prdb > D:\patha.txt
mysql -h -u -p -e "select * from customer" prdb > C:\patha.txt
Please suggest if there is a better way to do this.
This seems like a use case for tee, if you have PowerShell available (you're on Windows, I suppose). It works like this:
echo "query result" | tee fileC fileD
For the docs of tee (Tee-Object) in PowerShell, see the Microsoft documentation.
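Applied to the commands above, a sketch (HOST, USER, and PASSWORD are placeholders, since the question elides them):
mysql -h HOST -u USER -pPASSWORD -e "select * from customer" prdb | tee D:\patha.txt > C:\patha.txt
tee writes its input to the named file while passing it through to stdout, which the final redirect sends to the second file. Note that PowerShell redirection may re-encode the output, so verify the resulting files.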
I want to monitor the progress while importing multiple MySQL dump files. I am using the script below:
ls -1 *.sql | awk '{ print "source",$0 }' | mysql --batch -u {username} -p {dbname}
After a little research I found the 'pv' command, but I don't know how to use it with this script.
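As a side note, pv is not a MySQL command but a standalone pipe-viewer utility, installed separately via your package manager. A minimal sketch, assuming plain .sql dumps and pv on the PATH: pv can read the files directly (like cat) and shows a progress bar based on their combined size:
pv *.sql | mysql --batch -u {username} -p {dbname}
This replaces the source approach entirely by streaming the concatenated dumps into the client.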
I have a folder with a lot of SQL scripts. I want to run all of them without specifying their names, just a folder name. Is it possible?
You cannot do that natively, but here's a simple bash command:
for sql_file in /path/to/directory/*; do mysql -uUSER -pPASSWORD DATABASE < "$sql_file"; done
here USER, PASSWORD and DATABASE are the corresponding credentials and /path/to/directory is full path to folder that contains your files.
If you want to filter, for example, only .sql files, then:
for sql_file in /path/to/directory/*.sql; do mysql -uUSER -pPASSWORD DATABASE < "$sql_file"; done
This is what worked for me:
1. I created a shell script in the folder with my scripts:
for f in *.sql
do
echo "Processing $f file..."
mysql -u user "-pPASSWORD" -h HOST DATABASE < "$f"
done
The standard mysqldump command that I use is
mysqldump --opt --databases $dbname --host=$dbhost --user=$dbuser --password=$dbpass | gzip > $filename
To dump multiple databases
mysqldump --opt --databases $dbname1 $dbname2 $dbname3 $dbname_etc --host=$dbhost --user=$dbuser --password=$dbpass | gzip > $filename
My question is how do you dump multiple databases from different MySQL accounts into just one file?
UPDATE: When I said 1 file, I meant 1 gzipped file with the different SQL dumps for the different sites inside it.
Nobody seems to have clarified this, so I'm going to give my 2 cents.
Note that my experience is with Bash, and may be exclusive to it, so variables and looping might work differently in your environment.
The best way to get an archive with separate files inside it is to use either zip or tar; I prefer tar due to its simplicity and availability.
Tar itself doesn't do compression, but bundled with bzip2 or gzip it can provide excellent results. Since your example uses gzip I'll use that in my demonstration.
First, let's attack the problem of the MySQL dumps: the mysqldump command does not write separate files (to my knowledge, anyway), so let's build a small workaround that creates 1 file per database.
mysql -s -r -p"$dbpass" --user="$dbuser" -e 'show databases' | while read db; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; done
Now we have a one-liner that exports each database to its own file; to send the files wherever you need, simply edit the part after the > symbol.
Next, let's look at the syntax for tar:
tar -czf <output-file> <input-file-1> <input-file-2>
This form lets us specify any number of files to archive.
The options are broken down as follows.
c - create an archive
z - gzip compression
f - output to a file
j - bzip2 compression (use instead of z)
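For example, bundling two dump files into one gzipped archive (the file names are placeholders):
tar -czf dumps.tar.gz database1.sql database2.sql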
Our next problem is keeping a list of all the newly created files. We'll expand our loop to append to a variable as it works through each database found in MySQL. One pitfall: a loop on the right-hand side of a pipe runs in a subshell, and the variable would vanish when the loop ends, so the version below feeds the loop with process substitution instead.
DBLIST=""; while read db; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done < <(mysql -s -r -p"$dbpass" --user="$dbuser" -e 'show databases')
Now we have a DBLIST variable holding the names of all the files that were created, and we can extend the one-liner to run the tar command after everything has been dumped.
DBLIST=""; while read db; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done < <(mysql -s -r -p"$dbpass" --user="$dbuser" -e 'show databases') && tar -czf "$filename" $DBLIST
This is a very rough approach and doesn't let you specify databases manually. To do that, the following command will create a tar file containing only the databases you list.
DBLIST=""; for db in database1-name database2-name; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done && tar -czf "$filename" $DBLIST
The technique for looping through the databases reported by MySQL comes from the stackoverflow.com question "mysqldump with db in a separate file", simply modified to fit your needs.
And to have the one-liner clean up after itself, simply add the following at the end of the command ($DBLIST is left unquoted so it expands to the individual file names):
&& rm $DBLIST
making the command look like this:
DBLIST=""; for db in database1-name database2-name; do mysqldump -p"$dbpass" --user="$dbuser" "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done && tar -czf "$filename" $DBLIST && rm $DBLIST
For every MySQL server account, dump the databases into separate files
Then concatenate the dump files and compress them with this command:
cat dump_user1.sql dump_user2.sql | gzip > super_dump.gz
There is a similar post on Superuser.com website: https://superuser.com/questions/228878/how-can-i-concatenate-two-files-in-unix
Just in case "multiple databases" literally means "all databases" for you:
mysqldump -u root -p --all-databases > all.sql