Store query in array in bash - mysql

My script needs to store the result of a query in a structure:
#!/bin/bash
user="..."
psw="..."
database="..."
query="select name, mail from t"
customStructure=$(mysql -u$user -p$psw $database -e "$query";)
I have no idea how to store the array of {name, mail} pairs from the query result.
I need a structure like this:
array=[ [name1,mail1] , [name2,mail2], ....., [nameN, mailN] ]
Is there a way to do this in bash?

Bash arrays are initialized like so:
myarray=("hi" 1 "2");
To capture the individual portions of a command's output into an array, we can loop through the output, adding its pieces to the array. That can be done like so:
for i in $(echo "1 2 3 4")
do
myarray+=($i)
done
In your example, it looks like you wish to get the output of a MySQL command and store the parts of its output lines into subarrays. I will show you how to capture lines into arrays, and given that, you should be able to figure out how to put subarrays into that yourself.
while read -r line
do
myarray+=("$line")
done < <(mysql -u${user} -p${psw} ${database} -e "${query}")
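On bash 4.0 or newer, mapfile (also known as readarray) can replace that loop entirely; a minimal sketch using the same variables:
# each output line becomes one array element
mapfile -t myarray < <(mysql -u${user} -p${psw} ${database} -e "${query}")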
It's also worth mentioning that for this kind of MySQL operation, where you don't need the pretty tabular output, you can use MySQL's -B option for 'batch' (tab-separated) output.
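For example, combining -B with -N (--skip-column-names) yields one tab-separated row per line, which can be split into two parallel arrays, since bash has no real nested arrays. A sketch with the same variables, assuming the values themselves contain no tabs or newlines:
names=()
mails=()
while IFS=$'\t' read -r name mail
do
    names+=("$name")
    mails+=("$mail")
done < <(mysql -u${user} -p${psw} ${database} -B -N -e "${query}")

# index i of both arrays refers to the same row:
echo "${names[0]} -> ${mails[0]}"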

Fields within each record can be accessed via read -a, which reads each line into an array split on the original IFS; the surrounding IFS and set -f changes keep the unquoted expansions below from being word-split or glob-expanded further.
#!/bin/bash
user="..."
psw="..."
database="..."
query="select name, mail from t"
OIFS="$IFS" ; IFS=$'\n' ; oset="$-" ; set -f
while IFS="$OIFS" read -a line
do
echo ${line[0]}
echo ${line[1]}
done < <(mysql -u${user} -p${psw} ${database} -e "${query}")
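If the point of the structure is to look a mail address up by its name, an associative array (bash 4+) may serve better than index juggling; a hedged sketch reusing the same connection variables together with -B/-N batch output:
declare -A mail_by_name
while IFS=$'\t' read -r name mail
do
    mail_by_name["$name"]="$mail"
done < <(mysql -u${user} -p${psw} ${database} -B -N -e "${query}")

# look up the mail for a given name (name1 is from the question's example):
echo "${mail_by_name[name1]}"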

Related

Bash loop through CSV leaves the last column value with a newline

I have a case with a loop. My task is to create a JSON file from CSV data in a loop. Unfortunately, when I generate the pk field, the value comes out empty, which breaks my JSON. This is a subset of my CSV:
table,pk
aaa,nik
aab,ida
aac,idb
aad,idc
aae,idd
aef,ide
...
This is my full code:
#!bin/bash
CSV_LIST="/xxx/table_lists.csv"
DATA=${CSV_LIST}
mkdir sqlconn
cd sqlconn
cat ${DATA} |
while IFS=',' read table pk ; do
PK= echo ${pk} | tr -d '\n'
cat > ./sqlservercon_$table.json << EOF
{"name" :"sqlservercon_$table","config":{"connector.class":"io.confluent.connect.jdbc.JdbcSinkConnector","topics":"$table",
...
,"pk.fields":" $PK","pk.mode":"record_value","destination.table.format":"db.dbo.$table","errors.tolerance":"all","flush.size":"10000"
}}
EOF
done
So the rendered result gives me this:
{"name" :"sqlservercon_XXX","config":{"connector.class":"io.confluent.connect.jdbc.JdbcSinkConnector","topics":"XXX",...
,"pk.fields":" ","pk.mode":"record_value","destination.table.format":"XXX.XXX.XXX","errors.tolerance":"all","flush.size":"10000"
}}
But when I leave the pk field as it was:
...,
"pk.fields":" $pk",
...
it gives me a broken JSON file like this:
...,"pk.fields":" id
",...
Any help is appreciated.
UPDATE
When I check my CSV using cat -v table_lists.csv, the last column has a ^M character that ruins the JSON file. But I still don't know how to deal with it.
Following the comments I gave, the script below is working:
#!/bin/bash
cd /home/test
CSV_LIST="/home/test/tableList.csv"
DATA=${CSV_LIST}
# Prepare data file
sed -i "s/\r//g" ${DATA}
# Added for debugging purpose
echo "Creating connection file in JSON for"
# Print file content from 2nd line only
tail --lines=+2 ${DATA} |
while IFS=',' read TABLE PK ; do
# Added for debugging purpose
echo "Table: ${TABLE} and PK: ${PK}"
# Added missing $()
PK_TRIMMED=$(echo ${PK} | tr -d '\n')
cat > ./sqlservercon_${TABLE}.json << EOF
{"name":"sqlservercon_${TABLE}","config":{"connector.class":"io.confluent.connect.jdbc.JdbcSinkConnector","topics":"${TABLE}",...,"pk.fields":"${PK_TRIMMED}","pk.mode":"record_value","destination.table.format":"db.dbo.${TABLE}","errors.tolerance":"all","flush.size":"10000"}}
EOF
done
Okay, after several checks, besides the wrong script I gave here, I investigated the CSV file. I downloaded it directly from Google Sheets; even though it gives me a .csv file, it is not encoded correctly for UNIX/Ubuntu, which is my development environment.
So I decided to do something like this manually (an automated alternative is sketched below):
From the Google spreadsheet, select all the columns that I want to use
Create an empty .csv file
Copy and paste the cells into the .csv file
Replace the " " (double space) separators with ,
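For reference, that cleanup can also be scripted rather than done by hand; a minimal sketch, assuming the only real problem is the Windows line endings:
# remove carriage returns in place (dos2unix table_lists.csv does the same)
sed -i 's/\r$//' table_lists.csv
# or, keeping the original file untouched:
tr -d '\r' < table_lists.csv > table_lists_unix.csv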
And for the loop, because I want to curl it instead of saving the JSON, I do this:
#!/bin/bash
CSV_LIST="/home/admin/kafka/main/config/tables/table_lists.csv"
DATA=${CSV_LIST}
while IFS=',' read table pk; do
curl -X POST http://localhost:8083/connectors -H 'Content-Type:application/json' -d'{"name" :"sqlservercon_'$table'","config":{...,...,"destination.table.format":"db.dbo.'$table'","errors.tolerance":"all",
"flush.size":"10000"
}}' | jq
done < ${DATA}
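Note that this variant reads the raw CSV again, so the carriage returns (and the header line) can still leak into $pk; the same cleanup as in the accepted script can be put in front of the loop, for example:
# drop the header row and strip \r before the fields reach curl
tail --lines=+2 "${DATA}" | tr -d '\r' |
while IFS=',' read -r table pk; do
    : # curl call from the loop above goes here, using "$table" and "$pk"
done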

mysql query inside escaped bash -c string. How to put quotes?

Hello I am having trouble with a very specific line in a bash script.
Here is the code:
ssh $SOURCEIP "/usr/bin/time -f \"%e\" bash -c \"seq $ITER | parallel -n0 \"mysql --silent -h $TARGET -uroot -ppass -e 'SELECT * FROM dbname.tablename WHERE size = $SIZE;' >> out.txt\""
The problem is that I ran out of quotes. The opening, escaped double quote at the beginning of "mysql" closes the one from "bash -c". I have to put the mysql statement in double quotes and the query in single quotes, otherwise I get an error, and I can't figure out how to proceed. I know that I should not pass the password like that; it will be changed later. I get the corresponding warning $ITER times every time I test this because --silent doesn't suppress it.
The problematic code is part of a small shell script that is supposed to just perform this data transfer.
I want to switch to the other machine with ssh first, and not via parallel, for consistency with other scripts.
So basically I need the double quotes around the bash -c command to get this whole parallel operation to work (they are already escaped because of the opening ssh double quotes), and I also need to put the mysql command inside quotes, but they end up closing each other.
Any help will be greatly appreciated.
Thanks in advance.
Largio
Edit: (SOLUTION)
As suggested by @ole-tange, the following command worked for me:
parallel --shellquote | parallel --shellquote
After invoking it in a shell, I pasted the string in question into the prompt and got the escaped string back. I still had trouble figuring out exactly what to paste, but in the end it is just logical.
What I pasted into the quoter, exactly, was:
sql mysql://root:pass@$TARGET/ 'SELECT data FROM db_name.tablename WHERE size = ${SIZE};' >> out.txt
But I still had some problems with the variables inside my query. The problem was that I had to un-escape the two variables $TARGET and $SIZE after everything had been escaped by the parallel quoter. Maybe my approach is overly laborious, but I could not get it to work any other way. Also note that I did not put quotes around the whole sql statement, as I had originally planned, because the quoter now compensates for that. For consistency, here is the final string that I got working in the end (with my changes applied afterwards):
ssh $SOURCEIP "/usr/bin/time -f \"%e\" bash -c \"seq $ITER | parallel -n0 sql\\\ mysql://root:pass@$TARGET/ \\\'SELECT\\\ data\\\ FROM\\\ db_name.tablename\\\ WHERE\\\ size\\\ =\\\ ${SIZE}\\\;\\\'\\\ \\\>\\\>\\\ out.txt\""
GNU Parallel has a quoter:
$ parallel --shellquote
"*\`$
[CTRL-D]
\"\*\\\`\$
And you can do it twice:
$ parallel --shellquote | parallel --shellquote
"*\`$
[CTRL-D]
\\\"\\\*\\\\\\\`\\\$
So just paste the string you want quoted.
But you might want to consider using a function and env_parallel to copy the function to the remote side:
myfunc() {
size=$1
target=$2
sql mysql://root:pass@$target/ "SELECT data FROM db_name.tablename WHERE size = $size;" >> out.txt
}
env_parallel --env myfunc -S $SOURCEIP --nonall myfunc $SIZE $TARGET
Also: Instead of mysql try sql mysql://root:pass@/ 'SELECT * FROM dbname.tablename WHERE size = $SIZE;'
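Another way to take a whole layer of quoting out of the picture is to ship the remote part as a script on ssh's stdin via a quoted here-document, passing the values as positional parameters; a hedged sketch along those lines (untested, and it assumes $ITER, $TARGET and $SIZE contain no whitespace):
ssh "$SOURCEIP" bash -s -- "$ITER" "$TARGET" "$SIZE" <<'EOF'
iter=$1 target=$2 size=$3
# the heredoc delimiter is quoted, so nothing below was touched by the local shell
/usr/bin/time -f "%e" bash -c "seq $iter | parallel -n0 \"mysql --silent -h $target -uroot -ppass -e 'SELECT * FROM dbname.tablename WHERE size = $size;' >> out.txt\""
EOF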

BASH: get data from MySQL where records have spaces

I'm trying to get values from a MySQL database for a later substitution with perl. Everything works fine, but if a record contains a space " ", the variables are set incorrectly.
For example, if in the database I have:
sede = "street"
fede = "calvin and hobbes"
lede = "12"
the result for the variables will be:
$TAGSEDE = "steet"
$TAGFEDE = "calvin"
$TAGLEDE = "and"
I understand there is something wrong in how $DBDATAF is set, but I can't identify it (English isn't my mother tongue, so some misunderstanding is more than a possibility).
DBDATAF=$(mysql -u$DBUSER -p$DBPASS -se "USE $DBNAME; SELECT sede, fede, lede FROM $DBTABL WHERE cf='$CFPI'")
read TAGSEDE TAGFEDE TAGLEDE <<< $DBDATAF
/usr/bin/perl -p -i -e "s/TAGDATAINSERT/$TAGDATAINSERT/g" $i
Your code breaks down for two reasons:
When $DBDATAF is passed to the <<< operator, the tabs are discarded. To keep them, double quote the variable as shown below.
read separates the input line into words using the IFS special variable. By default, it separates on tab, space, or newline. So even though tabs are now preserved when passed to <<<, the read command splits on spaces as well. Setting IFS to a tab makes read split as desired. Putting the IFS assignment and read on one line ensures that IFS returns to the default after read exits.
IFS="$( echo -e '\t' )" read TAGSEDE TAGFEDE TAGLEDE <<< "$DBDATAF"

Select mysql query with bash

How can I run a MySQL query from bash so that each column ends up in a separate array element?
I've tried the following command, but it only works if the content is a single word. For example:
id= 11, text=hello, important=1
If I have an article, for instance, in text, the code will not work properly. I guess I could use cut -f -d, but if "text" contains special characters it won't work either.
while read -ra line; do
id=$(echo "${line[1]}")
text=$(echo "${line[2]}")
important=$(echo "${line[3]}")
echo "id: $id"
echo "text: $text"
echo "important: $important"
done < <(mysql -e "${selectQ}" -u${user} -p${password} ${database} -h ${host})
Bash by default splits strings at any whitespace character. First you need an unambiguous column separator in your output; you can use mysql --batch to get tab-separated output.
From the MySQL man page:
--batch, -B
Print results using tab as the column separator, with each row on a new line. With this option, mysql does not use the history file.
Batch mode results in nontabular output format and escaping of special characters. Escaping may be disabled by using raw mode; see the description for the --raw option
You want the result to be escaped, so don't use --raw, otherwise a tab character in your result data will break the loop again.
To skip the first row (the column names) you can additionally use the option --skip-column-names.
Now you can walk through each line and split it by the tab character.
You can force bash to split by tab only by overriding the IFS variable (Internal Field Separator) temporarily.
Example
# myread prevents collapsing of empty fields
myread() {
local input
IFS= read -r input || return $?
while (( $# > 1 )); do
IFS= read -r "$1" <<< "${input%%[$IFS]*}"
input="${input#*[$IFS]}"
shift
done
IFS= read -r "$1" <<< "$input"
}
# loop though the result rows
while IFS=$'\t' myread id name surname url created; do
echo "id: ${id}";
echo "name: ${name}";
echo "surname: ${surname}";
echo "url: ${url}";
echo "created: ${created}";
done < <(mysql --batch --skip-column-names -e "SELECT id, name, surname, url, created FROM users")
The myread function: all credit to this answer by Stefan Kriwanek.
Attention:
You need to be very careful with quotes and variable delimiters.
If you just echo $row[0] without the curly braces, you will get the wrong result.
EDIT
You still have a problem when a column returns an empty string, because the internal field separator matches any run of the defined character:
row1\t\trow3 will create an array [row1,row3] instead of [row1,,row3]
I found a very nice approach to fix this and updated the example above.
Also, read can directly separate the input stream into variables.
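For reference, the collapse is easy to reproduce; this is exactly what the myread helper above works around:
IFS=$'\t' read -r a b c <<< $'row1\t\trow3'
printf 'a=%s b=%s c=%s\n' "$a" "$b" "$c"   # prints: a=row1 b=row3 c=   (the middle field is lost)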

Escaping MYSQL command lines via Bash Scripting

PHP has mysql_real_escape_string() to correctly escape any characters that might cause problems. What is the best way to mimic this functionality for BASH?
Is there any way to do prepared mysql statements using bash? This seems to be the best way.
Most of my variables won't (shouldn't) have special characters, however I give the user complete freedom for their password. It may include characters like ' and ".
I may be doing multiple SQL statements so I'll want to make a script that takes in parameters and then runs the statement. This is what I have so far:
doSQL.sh:
#!/bin/sh
SQLUSER="root"
SQLPASS="passwor339c"
SQLHOST="localhost"
SQL="$1"
SQLDB="$2"
if [ -z "$SQL" ]; then echo "ERROR: SQL not defined"; exit 1; fi
if [ -z "$SQLDB" ]; then SQLDB="records"; fi
echo "$SQL" | mysql -u$SQLUSER -p$SQLPASS -h$SQLHOST $SQLDB
and an example using said command:
example.sh:
PASSWORD=$1
doSQL "INSERT INTO active_records (password) VALUES ('$PASSWORD')"
Obviously this would fail if the password contained a single quote.
In Bash, printf can do the escaping for you:
$ a=''\''"\;:@[]{}()|&^$@!?, .<>abc123'
$ printf -v var "%q" "$a"
$ echo "$var"
\'\"\\\;:#\[\]\{\}\(\)\|\&\^\$#\!\?\,\ .\<\>abc123
I'll leave it to you to decide if that's aggressive enough.
This seems like a classic case of using the wrong tool for the job.
You've got a lot of work ahead of you to implement the escaping done by mysql_real_escape_string() in bash. Note that mysql_real_escape_string() actually delegates the escaping to the MySQL library which takes into account the connection and database character sets. It's called "real" because its predecessor mysql_escape_string() did not take the character set into consideration, and could be tricked into injecting SQL.
I'd suggest using a scripting language that has a MySQL library, such as Ruby, Python, or PHP.
If you insist on bash, then use the MySQL Prepared Statements syntax.
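As a hedged sketch of that route (not taken from any of the answers here): the value can be passed through base64, which needs no quoting at all, and then bound with PREPARE/EXECUTE. FROM_BASE64 needs MySQL 5.6+, and the variable names are taken from doSQL.sh above; the next answer builds on the same FROM_BASE64 idea.
pw_b64=$(printf %s "$PASSWORD" | base64 -w0)   # only [A-Za-z0-9+/=], safe to splice into SQL

mysql -u"$SQLUSER" -p"$SQLPASS" -h"$SQLHOST" "$SQLDB" <<SQL
SET @pw = FROM_BASE64('$pw_b64');
PREPARE stmt FROM 'INSERT INTO active_records (password) VALUES (?)';
EXECUTE stmt USING @pw;
DEALLOCATE PREPARE stmt;
SQL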
There is no escape from the following construct, no matter what quotes you use:
function quoteSQL() {
printf "FROM_BASE64('%s')" "$(echo -n "$1" | base64 -w0 )"
}
PASSWORD=$1
doSQL "INSERT INTO active_records (password) VALUES ($(quoteSQL "$PASSWORD"));"
# I would prefer piping
printf 'INSERT INTO active_records (password) VALUES (%s);\n' $(quoteSQL "$PASSWORD") | doSQL
mysql_real_escape_string() of course only escapes a single string literal to be quoted, not a whole statement. You need to be clear what purpose the string will be used for in the statement. According to the MySQL manual section on string literals, for inserting into a string field you only need to escape single and double quotation marks, backslashes and NULs. However, a bash string cannot contain a NUL, so the following should suffice:
#escape for MySQL single string
PASSWORD=${PASSWORD//\\/\\\\}
PASSWORD=${PASSWORD//\'/\\\'}
PASSWORD=${PASSWORD//\"/\\\"}
If you will be using the string after a LIKE, you will also probably want to escape % and _.
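For example, in the same parameter-expansion style:
# only needed when the value will appear after LIKE
PASSWORD=${PASSWORD//%/\\%}
PASSWORD=${PASSWORD//_/\\_}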
Prepared statements are another possibility. And make sure you don't use echo -e in your bash.
See also https://www.owasp.org/index.php/SQL_Injection_Prevention_Cheat_Sheet
This will escape apostrophes
a=$(echo "$1" | sed s/"'"/"\\\'"/g)
Please note though that mysql_real_escape_string also escapes \x00, \n, \r, \, " and \x1a. Be sure to escape these for full security.
To escape \x00 for example:
a=$(echo "$1" | sed s/"\x00"/"\\\'"/g)
With a bit of effort you can probably escape these using one sed command.
Sure, why not just use the real thing?
A script, anywhere, such as
~/scripts/mysqli_real_escape.php
#!/bin/php
<?php
$std_input_data = '';
$mysqli = new mysqli('localhost', 'username', 'pass', 'database_name');
if( ftell(STDIN) !== false ) $std_input_data = stream_get_contents(STDIN);
if( empty($std_input_data) ) exit('No input piped in');
if( mysqli_connect_errno( ) ) exit('Could not connect to database');
fwrite ( STDOUT,
$mysqli->real_escape_string($std_input_data)
);
exit(0);
?>
Next, run from bash terminal:
chmod +x ~/scripts/mysqli_real_escape.php
ln -s ~/scripts/mysqli_real_escape.php /usr/bin/mysqli_real_escape
All set! Now you can use mysqli_real_escape in your bash scripts!
#!/bin/bash
MyString="stringW##)*special characters"
MyString="$(printf "$MyString" | mysqli_real_escape )"
Note: From what I understand, command substitution using "$(cmd ..."$var")" is preferred over using backticks. However, as no further nesting would be needed either should be fine.
Further Note: When inside command substitution, "$(...)", a new quote context is created. This is why the quotes around variables do not screw up the string.
This is how I did it, where my-file.txt contains spaces, new lines and quotes:
IFS='' content=$(cat my-file.txt)
mysql <flags> -e "update table set column = $(echo ${content@Q} | cut -c 2-) where something = 123"
Here are a couple Bash functions I wrote, grouped into a library.
It provides methods for proper quoting/escaping strings and identifiers:
##### db library functions #####
# Executes SQL Queries on localhost's MySQL server
#
# @Env
# $adminDBUser: The database user
# $adminDBPassword: The database user's password
#
# @Params
# $@: Optional MySQL arguments
#
# @Output
# >&1: The MySQL output stream
db::execute() {
# Uncomment below to debug
#tee --append debug.sql |
mysql \
--batch \
--silent \
--user="${adminDBUser:?}" \
--password="${adminDBPassword:?}" \
--host=localhost \
"$#"
}
# Produces a quoted string suitable for inclusion in SQL statements.
#
# @Params
# $1: The string to be quoted
#
# @Output
# >&1: The quoted string, suitable for inclusion in SQL statements
db::quoteString() {
local -- string="${1:?}"
local -- bas64String && bas64String=$(printf %s "${string}" | base64)
db::execute <<< "SELECT QUOTE(FROM_BASE64('${bas64String}'));"
}
# Produces a quoted identifier suitable for inclusion in SQL statements.
#
# @Params
# $1: The identifier to be quoted
#
# @Output
# >&1: The quoted identifier suitable for inclusion in SQL statements
db::quoteIdentifier() {
local -- identifier="${1:?}"
local -- bas64Identifier && bas64Identifier=$(printf %s "${identifier}" | base64)
db::execute <<< "SELECT sys.quote_identifier(FROM_BASE64('${bas64Identifier}'))"
}
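A usage sketch for those helpers; the database name, table, and sample value below are illustrative, and adminDBUser/adminDBPassword are assumed to be set for db::execute:
adminDBUser="root"                       # illustrative credentials
adminDBPassword="secret"

name="O'Brien; DROP TABLE users; --"     # hostile input is fine: it is quoted server-side
quotedName=$(db::quoteString "$name")
quotedTable=$(db::quoteIdentifier "users")

db::execute --database=mydb <<< "SELECT * FROM ${quotedTable} WHERE name = ${quotedName};"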
This will work:
echo "John O'hara" | php -R 'echo addslashes($argn);'
To pass it to a variable:
name=$(echo "John O'hara" | php -R 'echo addslashes($argn);')