Heredocs, variables and single quotes in BASH and MySQL

I'm trying to send some data to a remote MySQL database using a BASH script on GNU/Linux, but I get various errors. Here's the line that's not working:
mysql --host=192.168.0.100 --user=petercapaldi --password=mypassword mystartrekcharacterbase << EOF
INSERT into myfourlegs values ('$PERSON','$THETIME','$THETIME','$THEDATE','$DAYOFWEEK');
EOF
and this too (just in case):
mysql --host=192.168.0.100 --user=petercapaldi --password=mypassword mystartrekcharacterbase << EOF
INSERT into myfourlegs values (\047$PERSON\047,\047$THETIME\047,\047$THETIME\047,\047$THEDATE\047,\047$DAYOFWEEK\047);
EOF

Scrap that. My fault: I missed the first field in the database. The single quotes work as they should with heredocs (i.e. '$VARIABLE' prints 'myvariable' just like $VARIABLE prints myvariable).
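For the record, that is exactly how unquoted heredocs behave: the shell expands $VARIABLES but treats single quotes as ordinary characters, so '$PERSON' inside the heredoc becomes a properly quoted SQL string literal. A minimal sketch (the column name is made up for illustration; note this still breaks if the value itself contains a single quote):
PERSON="Jadzia Dax"
mysql --host=192.168.0.100 --user=petercapaldi --password=mypassword mystartrekcharacterbase << EOF
-- by the time mysql reads this, the shell has already substituted the value
INSERT INTO myfourlegs (person) VALUES ('$PERSON');
EOF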

Related

How do you send multiple commands, including an SQL query, via an SSH tunnel with Plink?

I am trying to create a batch file from a Python script which executes Plink to send an SQL query to an external database via SSH. The script has to run a batch file with multiple command lines to be sent to the server.
Researching on the internet, I found that a solution akin to the code snippet below should work.
(
echo command 1
echo command 2
...
) | plink.exe user@hostname -i sshkey.ppk
Entering my commands would yield the following:
(
echo mysql -u admin -pPassword Database
echo INSERT INTO Table VALUES(DEFAULT, (SELECT ID FROM Another_Table WHERE Another_ID = 'foo'), 'bar', 'foobar', 0, 'date', 1);
) | plink.exe user@hostname -i sshkey.ppk
The problem I have is that I am getting the following error: 'bar' can't be processed syntactically at this point. (I am sorry if the translation is off here; English is not my first language.)
I have checked whether some special characters have to be escaped, but have not found any conclusive answers. Note that the first command is correct and works as intended on its own; only the second command seems to be faulty. Would anybody be willing to provide a solution?
So the answer here is that you need to escape the closing parentheses TWICE, not only once, and thus have to use three "^" characters. This is because the command inside the parentheses is parsed twice, and the "^" that must survive for the second pass has itself to be escaped during the first pass, which is what the third character is for.
See here for details: Escaping parentheses within parentheses for batch file
The code would therefore look like this:
(
echo mysql -u admin -pPassword Database
echo INSERT INTO Table VALUES(DEFAULT, (SELECT ID FROM Another_Table WHERE Another_ID = 'foo'^^^), 'bar', 'foobar', 0, 'date', 1^^^);
) | plink.exe user@hostname -i sshkey.ppk

MySQL query to update multiple rows using an input file from Linux

I'm trying to update multiple rows in a DB using a small script.
I need to update the rows based on some specific user_ids which I have in a list on Linux machine.
#! /bin/bash
mysql -u user -ppassword db -e "update device set in_use=0 where user_id in ()";
As you see above, the user_ids are in a file, let's say /opt/test/user_ids_txt.
How can I import them into this command?
This really depends on the format of user_ids_txt. If we assume it just happens to be in the correct syntax for your SQL IN clause, the following will work:
#! /bin/bash
mysql -u user -ppassword db -e "update device set in_use=0 where user_id in ($(< /opt/test/user_ids_txt))";
The bash interpreter will substitute in the contents of the file. This can be dangerous for SQL queries, so I would echo out the command on the terminal to make sure it is correct before implementing it. You should be able to preview your SQL query by simply running the following on the command line:
echo "update device set in_use=0 where user_id in ($(< /opt/test/user_ids_txt))"
If your file is not in the SQL IN syntax, you will need to edit it (or a copy of it) before running your query. I would recommend something like sed for this.
Example
Let's say your file /opt/test/user_ids_txt is just a list of user_ids in the format:
aaa
bbb
ccc
You can use sed to edit this into the correct SQL syntax (the sed script has to be double-quoted so the single quotes can appear inside it, and the $ in the 2,$ address is escaped so the shell does not try to expand $s):
sed "s/^/'/g; s/$/'/g; 2,\$s/^/,/g" /opt/test/user_ids_txt
The output of this command will be:
'aaa'
,'bbb'
,'ccc'
If you look at this sed command, you will see 3 separate commands separated by semicolons. The individual commands translate to:
1: Add ' to the beginning of every line
2: Add ' to the end of every line
3: Add , to the beginning of every line but the first
Note: If your IDs are strictly numeric, you only need the third command.
This would make your SQL query translate to:
update device set in_use=0 where user_id in ('aaa'
,'bbb'
,'ccc')
Rather than make a temporary file to store this, I would use a bash variable, and simply plug that into the query like this:
#! /bin/bash
in_statement="$(sed "s/^/'/g; s/$/'/g; 2,\$s/^/,/g" /opt/test/user_ids_txt)"
mysql -u user -ppassword db -e "update device set in_use=0 where user_id in (${in_statement})";
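As an aside, the quoting and joining can be done in one pipeline: wrap every line in single quotes with sed, then let paste join the lines with commas. A sketch under the same assumptions about the file layout:
#! /bin/bash
# wrap each line in single quotes, then join with commas: 'aaa','bbb','ccc'
in_statement="$(sed "s/.*/'&'/" /opt/test/user_ids_txt | paste -sd, -)"
mysql -u user -ppassword db -e "update device set in_use=0 where user_id in (${in_statement})"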

How to import a JSON file to PostgreSQL database in BASH

I'm working on a bash script and I need to import a JSON file into my Postgres database. This JSON file is really big, and I have tried different ways, but none of them worked.
I can't post an example of the JSON file because it is really big (about 15 MB).
Using a bash variable to store the data:
VAR=$(cat cucumber.json.1)
su -c "psql -A -t -d tareas -c \"insert into consulta (url, identificador, fecha, artefacto) values ('UNKNOWN', $identificadorBBDD, '$diaBBDD', :'$VAR')\"" postgres
It fails saying the argument list is too long. I think this is because, when the variable is read, the command thinks the variable has ended, which breaks the structure of the query.
I tried to use the Postgres function lo_import, but I got the same result.
I also used psql meta-commands to store the data, but it didn't work:
\set content `cat cucumber.json.1`
create temp table t (j json);
insert into t values (:'content');
Thanks for your help.
Finally, I solved it using the following code in my .sh script:
VAR=$(cat cucumber.json.1)
psql postgres://user:password@localhost:5432/tareas << EOF
insert into consulta (url, identificador, fecha, artefacto) values ('UNKNOWN',
ident, 'date', '$VAR');
EOF
I don't know why it doesn't give me the same error as before, but this way the query works and the JSON file is imported into the database.
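The earlier su -c attempt most likely failed because the whole 15 MB file was packed into a single command-line argument, which overflows the kernel's per-argument limit. The \set idea from the question sidesteps that, since psql itself runs cat, and the :'content' syntax also escapes any single quotes inside the JSON, which the plain '$VAR' interpolation above does not. A sketch along those lines (the identificador and fecha values are placeholders):
psql postgres://user:password@localhost:5432/tareas << 'EOF'
\set content `cat cucumber.json.1`
insert into consulta (url, identificador, fecha, artefacto)
values ('UNKNOWN', 1, now(), :'content');
EOF
The quoted 'EOF' delimiter keeps bash away from the backticks, so psql, not the shell, runs cat.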

How to convert MySQL query output into an array in shell scripting?

I am storing the output of a MySQL query in a variable using shell scripting. The output of the SQL query spans multiple rows. When I checked the count of the variable (which I thought was an array), it gave 1. My code snippet is as follows:
sessionLogin=`mysql -ugtsdbadmin -pgtsdbadmin -h$MYSQL_HOST -P$MYSQLPORT CMDB -e " select distinct SessionID div 100000 as 'MemberID' from SessionLogin where ClientIPAddr like '10.104%' and LoginTimestamp > 1426291200000000000 order by 1;"`
echo "${#sessionLogin[#]}"
How can I store the MySQL query output in an array in shell scripting?
You can loop over the output from mysql and append to an existing array. For example, in Bash 3.1+, a while loop with process substitution is one way to do it (please replace the mysql parameters with your actual command):
output=()
while read -r output_line; do
output+=("$output_line")
done < <(mysql -u user -ppass -hhost DB -e "query")
echo "There are ${#output[#]} lines returned"
Also take a look at the always-excellent BashFAQ.
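On Bash 4.0 and newer, mapfile (also spelled readarray) replaces the whole loop. A sketch with the same placeholder mysql command; the -N flag suppresses the column-header line, which would otherwise become element 0 of the array:
# -t strips the trailing newline from each element
mapfile -t output < <(mysql -N -u user -ppass -hhost DB -e "query")
echo "There are ${#output[@]} lines returned"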

Using a shell script to insert data into a remote MySQL database

I've been trying to get a shell (bash) script to insert a row into a REMOTE database, but I've been having some trouble :(
The script is meant to upload a file to a server, get a URL, hash, and file size, connect to a remote MySQL database, and insert the data into an existing table. I've gotten it working until the remote MySQL database bit.
It looks like this:
#!/bin/bash
zxw=randomtext
description=randomtext2
for file in "$#"
do
echo -n *****
ident= *****
data= ****
size=` ****
hash=`****
mysql --host=randomhost --user=randomuser --password=randompass randomdb
insert into table (field1,field2,field3) values('http://www.example.com/$hash','$file','$size');
echo "done"
done
I'm a total noob at programming so yeah :P
Anyway, I added the \ to escape the brackets as I was getting errors. As it is right now, the script works fine until it connects to the mysql database. It just connects to the mysql database and doesn't run the insert command (and I don't even know if the insert command would work in bash).
PS: I've tried both the mysql commands from the command line one by one, and they worked, though I defined the hash/file/size and didn't have the escaping "".
Anyway, what do you guys think? Is what I'm trying to do even possible? If so how?
Any help would be appreciated :)
The insert statement has to be sent to mysql, not another line in the shell script, so you need to make it a "here document".
mysql --host=randomhost --user=randomuser --password=randompass randomdb << EOF
insert into table (field1,field2,field3) values('http://www.site.com/$hash','$file','$size');
EOF
The << EOF means take everything before the next line that contains nothing but EOF (no whitespace at the beginning) as standard input to the program.
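One detail worth knowing (not part of the original answer): quoting the delimiter turns off expansion inside the here document, which is how you would pass a literal $ through to the program:
hash=abc123
# unquoted delimiter: the shell expands $hash
cat << EOF
expanded: $hash
EOF
# quoted delimiter: the text is passed through verbatim
cat << 'EOF'
not expanded: $hash
EOF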
This might not be exactly what you are looking for but it is an option.
If you want to bypass the annoyance of actually including your query in the sh script, you can save the query as a .sql file (useful sometimes when the query is REALLY big and complicated). This can be done with simple file IO in whatever language you are using.
Then you can simply include in your sh script something like:
mysql -u youruser -pyourpass -h remoteHost < query.sql &
This is called batch mode execution. Note there is no space after -p; with a space, mysql would prompt for a password and treat yourpass as the database name. Optionally, you can include the ampersand at the end to ensure that that line of the sh script does not block.
Also if you are concerned about the same data getting entered multiple times and your rdbms getting inconsistent, you should explore MySql transactions (commit, rollback, etc).
Don't use raw SQL from bash; bash has no sane facility for sanitizing the data beforehand. Generate a CSV file and upload that instead.
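A rough sketch of that CSV route, keeping the hypothetical names from the question (mytable and its columns are stand-ins, and LOCAL INFILE may need to be enabled on both client and server). Note the naive printf does no CSV quoting, so this is only safe while the values cannot contain commas:
#! /bin/bash
# append one row per processed file; URL, name and size come from the loop above
printf '%s,%s,%s\n' "http://www.example.com/$hash" "$file" "$size" >> /tmp/rows.csv
# bulk-load the accumulated rows in a single statement
mysql --host=randomhost --user=randomuser --password=randompass --local-infile=1 randomdb \
  -e "LOAD DATA LOCAL INFILE '/tmp/rows.csv' INTO TABLE mytable FIELDS TERMINATED BY ',' (field1, field2, field3);"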