How to add parentheses and single quotes to a bash string - mysql

EDITED
I am trying to make some inserts into my MySQL database from a bash script. The database has some varchar columns, so I have to quote those values when I build the query.
I execute the query this way:
mysql --user=**** --password=**** -e $ext dbname
I want to put the following text into $ext: insert into tname (string,int) values ('$var',$ivar)
The problem is that when I try to insert the parentheses and the "'", they only end up at the start of the string.
Since my attempt at an example failed, I will post parts of the code:
I read the data from a file and store all the lines in myarray:
while IFS=$'\n' read -r line_data; do
myarray[i]="${line_data}"
((++i))
done < file.txt
and then, in another while loop:
somevar=${myarray[i++]}
var="'$somevar'"
echo $somevar
echo $var
Result:
Hello
'Hello

Your assignment syntax is incorrect. The $ is only used when referring to a variable, not when assigning it a value. Fix the assignment and it should work fine:
$ var=foo
$ ivar=23
$ ext="insert into tname (string,int) values ('$var',$ivar)"
$ echo $ext
insert into tname (string,int) values ('foo',23)
Now you can use it in your command like this:
mysql --user=**** --password=**** -e "$ext" dbname
There are a number of ways this could go wrong, depending on what characters are in the data you're trying to insert, and it might be easier to write to a file instead of a variable.

As @chepner said, it was because each string had a \r character at the end when it was read from the file, which is why that strange thing happened.
Deleting the last character of each string fixes it:
var=${myarray[i++]}
var=${var::-1}
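A slightly safer sketch: instead of unconditionally chopping the last character, remove a trailing carriage return only when it is actually present, so lines that were already clean are left untouched (the sample value is invented):

```shell
# Strip a trailing carriage return (\r) only if present; a no-op otherwise.
line=$'Hello\r'
line=${line%$'\r'}
echo "$line"   # Hello, without the stray \r
```

Unlike ${var::-1}, this will not chop a real character from lines that had no \r.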

Related

How to Insert SQL queries with multiple quotes into a JSON parameter as a shell variable

I have sql file that contain multiple queries inside (as a single line) that contain lots of single quote ('), double quote (") and a backtick (`) inside.
e.g.:
declare x ... ; create table ...; select ... ; . . .
And I have a shell script that updates the queries from that sql file into an existing scheduled query in BigQuery. It reads the file into a shell variable and then uses that variable in the "params" parameter, which expects a JSON-formatted string:
query=`cat queries.sql`
bq update \
--project_id=$project \
--location=$location \
--display_name=$job_name \
--schedule="every day 21:00" \
--params="{\"query\":\"$query\"}" \
--transfer_config \
projects/<id>/locations/<location>/transferConfigs/<config-id>
When I run this shell script, I got this error:
BigQuery error in update operation: Parameters should be specified in JSON format when creating the transfer configuration.
I am pretty sure it's because of these quotes inside the sql script, but how should I build the "params" parameter, which expects JSON-formatted input? How should I do this?
It's quite confusing to work with multiple strings nested inside one another, reading files, using variables, and so on. The way I got it to work is as follows:
# Prevent * from being expanded as a shell wildcard:
set -f
# Change " -> \" and ' -> \" :
query=$(sed 's/"/\\"/g' queries.sql | sed 's/'"'"'/\\"/g')
#create the json var:
json_query='{"query":"'$query'"}'
#Generate de bq command:
cmd="bq update --project_id=$project --location=$location --display_name=$job_name --schedule='every day 21:00' --params='${json_query}' --transfer_config projects/<id>/locations/<location>/transferConfigs/<config-id>"
# Write to a temp file and execute:
echo "$cmd" > tmp
bash tmp
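An alternative sketch that builds the JSON in pure bash, escaping backslashes, double quotes, and newlines explicitly. The helper name json_escape and the sample query are made up for illustration:

```shell
# Hypothetical helper: JSON-escape a string so it can sit inside "..."
json_escape() {
    local s=$1
    s=${s//\\/\\\\}      # \  -> \\  (must be done first)
    s=${s//\"/\\\"}      # "  -> \"
    s=${s//$'\n'/\\n}    # newline -> \n
    printf '%s' "$s"
}

query='select "a", '\''b'\'' from t'
json_query="{\"query\":\"$(json_escape "$query")\"}"
echo "$json_query"   # {"query":"select \"a\", 'b' from t"}
```

This keeps the --params argument valid JSON regardless of the quote mix inside queries.sql. If jq is available, `jq -n --arg q "$query" '{query: $q}'` does the same job with less room for error.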

variable out of mysql query in bash not filled by local bash variables on echo

I would like to query MySQL from bash. The MySQL field contains the text "$variable" (it's an HTML template).
In the bash script you find:
variable="Text for Template";
test=$(/usr/bin/mysql ... --execute="use db; select de from table where id = '1';")
If I do
echo $test;
I just get $variable and not "Text for Template".
What could I change to get the expected value? Any ideas?
For Information:
In the MySQL DB field is a complete HTML page containing different $variables. In the past I solved this by embedding the complete HTML page in the bash script as htmlvar="...$variable". That worked well but was uncomfortable with around 5 templates. So I thought that if I put the HTML page into the database, I could read it into a variable and have it filled in by the local variables in the bash script, as before. But now I only get the plain text of the DB field, and $variable is not expanded.
Thank you !
I'm not sure if I understood your question correctly, but if you want to print your database results you can just do:
$ mysql -u root -p -e "select de from db.table where id = '1';"
but if you want print the variable value you can do :
$ echo "$variable"
Anyway, I believe you cannot print "Text for Template" with the command "echo $test;" given your code.
You are probably looking for indirect variable expansion:
$ variable="Hello there"
$ test=variable
$ echo ${!test}
>> Hello there
Please note that test=variable is written without a dollar sign (not test=$variable). If you have a dollar sign and would like to use it as well, you can do a double echo:
echo `echo $test`
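If the goal really is to expand $variable placeholders inside text fetched from the database, one (risky) sketch is to re-evaluate the fetched text with eval. This is only safe when you fully trust the database contents, since any embedded command substitution would be executed too:

```shell
variable="Text for Template"
# $template stands in here for the text fetched from MySQL:
template='Hello: $variable'
# Re-parse the string so $variable is expanded; dangerous on untrusted input.
eval "result=\"$template\""
echo "$result"   # Hello: Text for Template
```

A safer alternative when available is envsubst from GNU gettext: `echo "$template" | envsubst` expands only exported variables and executes nothing.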

Mysql : Get comma delimited Output

I am connecting to a MySQL server and executing a select statement using Perl backticks. The output of the command is captured in an array as shown below:
my @output = `mysql -u <user> -p<password> -e 'select * from <database_name>.<table_name>' -s`;
The -e option gives me tab-delimited output with each row on a new line (batch mode), and -s gives minimal output in a non-tabular format (silent mode).
Is there an option in the MySQL command to get comma-delimited output instead of tab-delimited?
(NOTE: I want to avoid concatenating values in the sql query)
There is no obvious option to do this (the options are here). You can change the query to get what you want:
select concat_ws(',', col1, col2, . . . )
from <database_name>.<table_name>
But this requires listing all the columns (which I personally think is a good thing). You can also do the substitution after the fact.
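The after-the-fact substitution can be as simple as piping through tr. A minimal sketch, where printf stands in for the actual mysql invocation, and with the caveat that this breaks if the data itself contains commas or tabs:

```shell
# Simulated `mysql -s` output: tab-separated fields, one row per line.
printf 'Bill\tRaleigh\n' | tr '\t' ','   # Bill,Raleigh
```

tr is POSIX, so this works anywhere; for comma-safe output you need a real CSV writer (see the answers below).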

Change output format for MySQL command line results to CSV

I want to get headerless CSV data from the output of a query to MySQL on the command line. I'm running this query on a different machine from the MySQL server, so all those Google answers with "INTO OUTFILE" are no good.
So I run mysql -e "select people, places from things". That outputs stuff that looks kinda like this:
+--------+-------------+
| people | places |
+--------+-------------+
| Bill | Raleigh, NC |
+--------+-------------+
Well, that's no good. But hey, look! If I just pipe it to anything, it turns it into a tab-separated list:
people places
Bill Raleigh, NC
That's better- at least it's programmatically parseable. But I don't want TSV, I want CSV, and I don't want that header. I can get rid of the header with mysql <stuff> | tail -n +2, but that's a bother I'd like to avoid if MySQL just has a flag to omit it. And I can't just replace all tabs with commas, because that doesn't handle content with commas in it.
So, how can I get MySQL to omit the header and give me data in CSV format?
As a partial answer: mysql -N -B -e "select people, places from things"
-N tells it not to print column headers. -B is "batch mode", and uses tabs to separate fields.
If tab separated values won't suffice, see this Stackoverflow Q&A.
The above solutions only work in special cases. You'll get yourself into all kinds of trouble with embedded commas, embedded quotes, other things that make CSV hard in the general case.
Do yourself a favor and use a general solution - do it right and you'll never have to think about it again. One very strong solution is the csvkit command line utilities - available for all operating systems via Python. Install via pip install csvkit. This will give you correct CSV data:
mysql -e "select people, places from things" | csvcut -t
That produces comma-separated data with the header still in place. To drop the header row:
mysql -e "select people, places from things" | csvcut -t | tail -n +2
That produces what the OP requested.
I wound up writing my own command-line tool to take care of this. It's similar to cut, except it knows what to do with quoted fields, etc. This tool, paired with @Jimothy's answer, allows me to get a headerless CSV from a remote MySQL server I have no filesystem access to onto my local machine with this command:
$ mysql -N -e "select people, places from things" | csvm -i '\t' -o ','
Bill,"Raleigh, NC"
csvmaster on github
Here is how to save results to CSV on the client side without additional non-standard tools.
This example uses only mysql client and awk.
One-line:
mysql --skip-column-names --batch -e 'select * from dump3' t | awk -F'\t' '{ sep=""; for(i = 1; i <= NF; i++) { gsub(/\\t/,"\t",$i); gsub(/\\n/,"\n",$i); gsub(/\\\\/,"\\",$i); gsub(/"/,"\"\"",$i); printf sep"\""$i"\""; sep=","; if(i==NF){printf"\n"}}}'
Logical explanation of what is needed to do
First, let's see what the data looks like in raw mode (with the --raw option). The database and table are t and dump3, respectively.
As you can see below, the field starting with "new line" (in the first row) is split across three lines due to the newlines embedded in the value.
mysql --skip-column-names --batch --raw -e 'select * from dump3' t
one line 2 new line
quotation marks " backslash \ two quotation marks "" two backslashes \\ two tabs new line
the end of field
another line 1 another line description without any special chars
Output in batch mode (without the --raw option): each record is turned into one line of text by escaping characters such as backslash, tab, and newline.
mysql --skip-column-names --batch -e 'select * from dump3' t
one line 2 new line\nquotation marks " backslash \\ two quotation marks "" two backslashes \\\\ two tabs\t\tnew line\nthe end of field
another line 1 another line description without any special chars
And data output in CSV format
The key is to save the data in CSV format with properly escaped characters.
The way to do that is to convert the special sequences that mysql --batch produces (\t for tab, \\ for backslash, \n for newline) back into the corresponding bytes in each value (field).
Then each value is escaped with " and also enclosed in ".
By the way, using the same character for escaping and enclosing greatly simplifies output and processing, because you don't have two special characters.
For this reason, all you have to do to the values (from the CSV perspective) is change " to "" within them. In the more common scheme (escaping with \ and enclosing with "), you would first have to change \ to \\ and then change " into \".
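The " -> "" rule above in a two-line bash illustration (the sample value is invented):

```shell
field='say "hi"'
escaped=${field//\"/\"\"}    # double every embedded double quote
printf '"%s"\n' "$escaped"   # prints: "say ""hi"""
```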
And the command explained step by step:
# produce the one-line output shown in step 2
mysql --skip-column-names --batch -e 'select * from dump3' t
# set the field separator to tab, because that is what mysql produces
| awk -F'\t'
# awk iterates over every line/record of the mysql output by default
'{
# the separator is empty because no separator is printed before the first field
sep="";
# iterate over every field, converting it to a proper CSV value
for(i = 1; i <= NF; i++) {
# note: the doubled \\ below means a single \ to awk, because it is escaped
# change \t into the byte corresponding to <tab>
gsub(/\\t/, "\t",$i);
# change \n into the byte corresponding to a newline
gsub(/\\n/, "\n",$i);
# change two \\ into one \
gsub(/\\\\/,"\\",$i);
# make the value CSV-proper: literally change " into ""
gsub(/"/, "\"\"",$i);
# print the field enclosed in ", preceded by the separator
printf sep"\""$i"\"";
# set the separator after the first field is processed - it is not needed earlier
sep=",";
# print a newline after the last field - the CSV record separator
if(i==NF) {printf"\n"}
}
}'
How about using sed? It comes standard with most (all?) Linux systems.
sed 's/\t/<your_field_delimiter>/g'
This example uses GNU sed (Linux). For POSIX sed (AIX/Solaris), I believe you would type a literal TAB instead of \t.
Example (for CSV output):
mysql -B -e "select * from mysql.user" | while read; do sed 's/\t/,/g'; done
localhost,root,,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,,,,,0,0,0,0,,
localhost,bill,*2470C0C06DEE42FD1618BB99005ADCA2EC9D1E19,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,,,,,0,0,0,0,,
127.0.0.1,root,,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,,,,,0,0,0,0,,
::1,root,,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,Y,,,,,0,0,0,0,,
%,jim,*2470C0C06DEE42FD1618BB99005ADCA2EC9D1E19,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,N,,,,,0,0,0,0,,
The mysqldump utility can help you: with the --tab option it's basically a wrapper for the SELECT INTO OUTFILE statement.
Example:
mysqldump -u root -p --tab=/tmp world Country --fields-enclosed-by='"' --fields-terminated-by="," --lines-terminated-by="\n" --no-create-info
This will create the CSV-formatted file /tmp/Country.txt.
If you are using the mysql client, you can set the result format per session, e.g.
mysql -h localhost -u root --result-format=json
or
mysql -h localhost -u root --vertical
Check out the full list of arguments here.
The mysql client detects the type of the output file descriptor: if it is a pipe (S_IFIFO), it does not print ASCII tables; if it is a character device (S_IFCHR), it does.
You can use -t/--table to force ASCII table output, like:
$mysql -t -N -h127.0.0.1 -e "select id from sbtest1 limit 1" | cat
+--------+
| 100024 |
+--------+
-t, --table Output in table format.
You can use spyql to read the tab-delimited output of mysql and generate a comma-delimited CSV and turn off header writing:
$ mysql -e "SELECT 'Bill' AS people, 'Raleigh, NC' AS places" | spyql -Oheader=False "SELECT * FROM csv TO csv"
Bill,"Raleigh, NC"
spyql detects whether the input has a header and what the delimiter is. The output delimiter is a comma by default. You can also specify all of these options manually if you wish:
$ mysql -e "SELECT 'Bill' AS people, 'Raleigh, NC' AS places" | spyql -Idelimiter="'\t'" -Iheader=True -Odelimiter="," -Oheader=False "SELECT * FROM csv TO csv"
Bill,"Raleigh, NC"
I would not turn off header writing on the mysql side, because spyql can take advantage of the header, for example if you choose to generate JSON instead of CSV:
$ mysql -e "SELECT 'Bill' AS people, 'Raleigh, NC' AS places" | spyql "SELECT * FROM csv TO json"
{"people": "Bill", "places": "Raleigh, NC"}
or if you need to reference your columns:
$ mysql -e "SELECT 'Bill' AS people, 'Raleigh, NC' AS places" | spyql -Oindent=2 "SELECT *, 'I am {} and I live in {}.'.format(people, places) AS message FROM csv TO json"
{
"people": "Bill",
"places": "Raleigh, NC",
"message": "I am Bill and I live in Raleigh, NC."
}
Disclaimer: I am the author of spyql

Bash Script Loop Through MySQL

I need a bash script that can retrieve MySQL data from a remote database. I actually have that done, but what I need now is to loop through the records somehow and pass a variable to another bash file.
Here's my MySQL call:
mysql -X -u $MyUSER -p$MyPASS -h$MyHOST -D$MyDB -e 'SELECT `theme_name`, `guid` FROM `themes` WHERE `theme_purchased`="1" AND `theme_compiled`="0";' > themes.xml
download_themes.sh
It exports the data into an XML file called themes.xml right now; I was just trying to figure out some way to loop through the data. I am trying to avoid PHP and Perl and just use bash. Thanks in advance.
something like:
mysql -e "SELECT \`theme_name\`, \`guid\` FROM \`themes\` WHERE \`theme_purchased\`='1' AND \`theme_compiled\`='0'" | while read -r theme_name guid; do
# use $theme_name and $guid variables
echo "theme: $theme_name, guid: $guid"
done
In short: when the output is a pipe, the mysql command separates records with '\n' and fields with '\t'. The read command reads a line, splits it into fields, and puts each field in a variable.
If your data has spaces in the fields, the default read splitting causes problems. There are ways around it; but if you're only reading two fields and one of them shouldn't contain any spaces (like the guid), you can put the 'dangerous' field at the end: read puts everything 'extra' into the last variable.
like this:
mysql -e "SELECT \`guid\`, \`theme_name\` FROM \`themes\` WHERE \`theme_purchased\`='1' AND \`theme_compiled\`='0'" | while read -r guid theme_name; do
# use $theme_name and $guid variables
echo "theme: $theme_name, guid: $guid"
done
Rather than outputting XML, may I suggest you simply use the SELECT INTO OUTFILE syntax or mysql --batch --raw to output tab-separated values. You then have much easier access from bash to the rest of the Unix toolchain, like cut and awk, to retrieve the fields you need and reuse them. No other scripting language is necessary, and you needn't mess with XML.
mysql --batch --raw -u $MyUSER -p$MyPASS -h$MyHOST -D$MyDB -e 'SELECT `theme_name`, `guid` FROM `themes` WHERE `theme_purchased`="1" AND `theme_compiled`="0";' \
| awk '{print "theme: " $1 " guid: " $2}'
The accepted answer does not work when the output contains spaces. The fix is easy (IFS=$'\t' -- note the $):
>mysql ... -BNr -e "SELECT 'funny man', 'wonderful' UNION SELECT 'no_space', 'I love spaces';" | while IFS=$'\t' read theme_name guid; do echo "theme: $theme_name guid: $guid"; done
theme: funny man guid: wonderful
theme: no_space guid: I love spaces
You will, of course, want to substitute your own query.
Piping '|' into while is dangerous, because the loop body runs in a subshell: changes made inside the loop do not take effect in the current script.
By the way, I'd rather not resort to the external-file solution.
I suggest using "process substitution" instead:
while read -r field1 field2 field3
do
    : # use $field1, $field2, $field3 here
done < <(mysql -NB -e "$sql")
#      ^
#      the space between < and <( is required
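A quick demonstration of why the pipe form loses data while the process-substitution form does not (printf stands in for the mysql call; process substitution is a bash feature, plain sh lacks it):

```shell
count=0
printf 'a\nb\n' | while read -r line; do count=$((count+1)); done
echo "after pipe: $count"            # after pipe: 0 (loop ran in a subshell)

count=0
while read -r line; do count=$((count+1)); done < <(printf 'a\nb\n')
echo "after substitution: $count"    # after substitution: 2
```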