Execute MySQL Query batch file - mysql

Hi everyone, here is my problem.
I'm trying to run this SQL query:
select *columns* from table *left joins* where *conditions* and column >= date_sub(current_date(),INTERVAL 1 YEAR) order by column INTO OUTFILE 'C:\path\file.csv' FIELDS TERMINATED BY ',' LINES TERMINATED BY '\r\n';
Through this file.bat code:
echo select... | mysql -u user -p******* -D database
But the output file is never created. Instead, a weird file named "date_sub(current_date()" with no extension is made; it contains information about mysql (commands, version, and other stuff), which tells me this part of the query is creating the conflict: >= date_sub(current_date(),INTERVAL 1 YEAR)
I've tried several other tests like:
Adding double quotes to the sql query, but that gives a mysql syntax error.
Using another way to execute the mysql query through the batch file, mysql -u user -p****** -D database select..., but this gives the same output as before.
If I remove the condition that gives me trouble then the file.csv is successfully created.
Is there any other way to work around this problem?
Thanks in advance for the answers.
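One hedged workaround, assuming cmd.exe is the culprit: the batch interpreter treats the unquoted > in >= as output redirection (with = and , acting as token delimiters, which is how a stray file named after the next token appears), so the query never reaches mysql intact. Keeping the SQL out of echo's reach avoids this entirely. A minimal sketch, with the SELECT ... INTO OUTFILE statement saved next to the batch file as query.sql (a placeholder name):

rem file.bat - cmd.exe never parses the query text, so >, (, ) and , are safe
mysql -u user -p******* -D database < query.sql

Alternatively, escape the operator inside the echo as ^>= (caret is cmd's escape character). Note also that MySQL treats backslash as an escape character inside string literals, so the OUTFILE path is safer written with forward slashes, e.g. 'C:/path/file.csv'.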

Related

MySQL query to update multiple rows using an input file from Linux

I'm trying to update multiple rows in a DB using a small script.
I need to update the rows based on some specific user_ids, which I have in a list on a Linux machine.
#! /bin/bash
mysql -u user -ppassword db -e "update device set in_use=0 where user_id in ()";
As you see above, the user_ids are in a file, let's say /opt/test/user_ids_txt.
How can I import them into this command?
This really depends on the format of user_ids_txt. If we assume it just happens to be in the correct syntax for your SQL IN clause, the following will work:
#! /bin/bash
mysql -u user -ppassword db -e "update device set in_use=0 where user_id in ($(< /opt/test/user_ids_txt))";
The bash interpreter will substitute in the contents of the file. This can be dangerous for SQL queries, so I would echo the command out on the terminal to make sure it is correct before running it. You can preview your SQL query by simply running the following on the command line:
echo "update device set in_use=0 where user_id in ($(< /opt/test/user_ids_txt))"
If your file is not in the SQL in syntax you will need to edit it (or a copy of it) before running your query. I would recommend something like sed for this.
Example
Let's say your file /opt/test/user_ids_txt is just a list of user_ids in the format:
aaa
bbb
ccc
You can use sed to edit this into the correct SQL syntax:
sed "s/^/'/g; s/\$/'/g; 2,\$s/^/,/g" /opt/test/user_ids_txt
The output of this command will be:
'aaa'
,'bbb'
,'ccc'
If you look at this sed command, you will see 3 separate commands separated by semicolons. The individual commands translate to:
1: Add ' to the beginning of every line
2: Add ' to the end of every line
3: Add , to the beginning of every line but the first
Note: If your IDs are strictly numeric, you only need the third command, as in the sketch below.
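In that numeric case the whole edit reduces to one command (same file path as above):

sed '2,$s/^/,/' /opt/test/user_ids_txt

Single quotes are safe here because the sed script itself contains no ' characters.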
With the quoted output above, your SQL query translates to:
update device set in_use=0 where user_id in ('aaa'
,'bbb'
,'ccc')
Rather than make a temporary file to store this, I would use a bash variable, and simply plug that into the query like this:
#! /bin/bash
in_statement="$(sed "s/^/'/g; s/\$/'/g; 2,\$s/^/,/g" /opt/test/user_ids_txt)"
mysql -u user -ppassword db -e "update device set in_use=0 where user_id in (${in_statement})";
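An alternative sketch that builds the whole list in one pipeline (assuming POSIX sed and paste, available on any Linux machine): quote every line, then join the lines with commas:

#! /bin/bash
in_statement="$(sed "s/.*/'&'/" /opt/test/user_ids_txt | paste -sd, -)"
echo "update device set in_use=0 where user_id in (${in_statement})"
mysql -u user -ppassword db -e "update device set in_use=0 where user_id in (${in_statement})";

Here sed wraps each id in single quotes (& stands for the matched line) and paste -sd, joins all lines into one comma-separated list; the echo preview is kept for the same safety reason as above.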

#1148 - The used command is not allowed with this MariaDB version

I have to import some data from a CSV file into a table of a db on my Aruba server.
I use the following query:
LOAD DATA LOCAL INFILE 'test.csv' INTO TABLE dailycoppergg
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(
ddmmyy,
lmedollton,
changedolleuro,
euroton,
lmesterton,
delnotiz,
girm,
sgm
)
I tested this query on another Aruba server and it worked correctly, but here I get the following error:
#1148 - The used command is not allowed with this MariaDB version
How can I modify my query to import the CSV file data into the dailycoppergg table? Can you help me, please? Thanks!
The query is fine, but the MySQL client (mysql) disables LOCAL INFILE by default; you need to run it as mysql --local-infile ..., and then the same query should work.
The error message is a legacy one, and it's confusing.
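A hedged sketch of that invocation (the user and database names are placeholders); note that the server must also permit it, via the local_infile system variable:

mysql --local-infile=1 -u user -p database_name

Once connected this way, the same LOAD DATA LOCAL INFILE statement should run.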
Since you're using phpMyAdmin, I highly recommend you just use the Import tab instead of manually entering the import query in the SQL tab. phpMyAdmin can easily import CSV files and I don't see any advantage to entering the query manually.
In MySQL Workbench, add the line below in the Advanced tab of the connection settings, test the connection, and close.
OPT_LOCAL_INFILE=1

redirect sql query results to a .log file

I have a C-shell script that connects to a mysql database and invokes a sql script, which in turn invokes another sql script to run a query and return a report:
#!/bin/csh
set MYSQL=${MYSQL_HOME}/mysql
set REPORT=${CLEADM_HOME}/Scripts/DataValidation/EOreport.sql
${MYSQL} ${CLEDBUSER} <${REPORT}
Then within the eoreport.sql I invoke another script, like so:
Source IERSs.sql
and finally in the IERSs.sql script i need to log the results to a log file but it is not working
SELECT *
FROM TB_EARTHORIENTATIONPARAMETER_UI
INTO OUTFILE '/vobs/tools/Scripts /results.log'
This is not working. All I see is the results of the query printed to the xterm (I'm using tcsh on Solaris, and the database is MySQL). Am I missing something?
I have even done research about the tee command, which is supposed to copy its input to the file that you specify, as follows:
tee /vobs/tools/Scripts/DataValidation/results.txt
SELECT * FROM TB_EARTHORIENTATIONPARAMETER_UI;
but this still outputs results to the screen and leaves my result.txt file empty. What am I missing?
SELECT *
FROM TB_EARTHORIENTATIONPARAMETER_UI
INTO OUTFILE '/vobs/tools/Scripts /results.log'
You have an extra space between Scripts and /; try this:
SELECT *
FROM TB_EARTHORIENTATIONPARAMETER_UI
INTO OUTFILE '/vobs/tools/Scripts/results.log'
Also, you said "leaves my result.txt file empty," and yet you are trying to write a result.log file.

Using MySQL in Powershell, how do I pipe the results of my script into a csv file?

In PowerShell, how do I execute my mysql script so that the results are piped into a CSV file? The result of this script is just a small set of columns that I would like copied into a CSV file.
I can have it go directly to the shell by doing:
mysql> source myscript.sql
And I have tried various little things like:
mysql> source myscript.sql > mysql.out
mysql> source myscript.sql > mysql.csv
in infinite variation, and I just get errors. My db connection is alright because I can do basic table queries from the command line, etc. I haven't been able to find a solution on the web so far either...
Any help would be really appreciated!
You don't seem to be running PowerShell but the mysql command-line tool (though perhaps you started it in a PowerShell console).
Note also that the mysql command-line tool cannot export directly to CSV.
However, to redirect the output to a file just run
mysql mydb < myscript.sql >mysql.out
or e.g.
echo select * from mytable | mysql mydb >mysql.out
(and whatever arguments to mysql you need, like username, hostname)
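If you do need CSV rather than the tab-separated output mysql produces in batch mode, a rough sketch (assuming none of the values themselves contain tabs or commas) is to swap the delimiter on the way through:

mysql mydb < myscript.sql | tr '\t' ',' > mysql.csv

tr rewrites every tab to a comma; for data that may embed delimiters or quotes you would want a real CSV writer instead.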
Are you looking for SELECT INTO OUTFILE? dev.mysql.com/doc/refman/5.1/en/select.html – Pekka
Yep, SELECT INTO OUTFILE worked! But to make sure you get column names, you also need to do something like:
select 'a', 'b', 'c'
union all
select a, b, c
from actual
The first select produces a single row of literals, which becomes the header; the second pulls the real data from the actual table.
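Putting the pieces together, a hedged sketch of a complete export with a header row (the column names a, b, c, the table actual, and the path are placeholders; the path is on the server host). MySQL allows the INTO OUTFILE clause on the final query block of a union, so the header row lands first in the file:

select 'a', 'b', 'c'
union all
select a, b, c
from actual
into outfile '/tmp/mysql.csv'
fields terminated by ',' enclosed by '"'
lines terminated by '\n';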

Using shell script to insert data into remote MYSQL database

I've been trying to get a shell(bash) script to insert a row into a REMOTE database, but I've been having some trouble :(
The script is meant to upload a file to a server, get a URL, a hash, and a file size, connect to a remote MySQL database, and insert the data into an existing table. I've gotten it working up until the remote MySQL database bit.
It looks like this:
#!/bin/bash
zxw=randomtext
description=randomtext2
for file in "$@"
do
echo -n *****
ident= *****
data= ****
size=` ****
hash=`****
mysql --host=randomhost --user=randomuser --password=randompass randomdb
insert into table (field1,field2,field3) values('http://www.example.com/$hash','$file','$size');
echo "done"
done
I'm a total noob at programming so yeah :P
Anyway, I added the \ to escape the brackets, as I was getting errors. As it is right now, the script works fine until it connects to the mysql database: it just connects and never runs the insert command (and I don't even know if the insert command would work from bash).
PS: I've tried both of the mysql commands from the command line one by one, and they worked, though I defined the hash/file/size myself and didn't have the escaping "".
Anyway, what do you guys think? Is what I'm trying to do even possible? If so how?
Any help would be appreciated :)
The insert statement has to be sent to mysql on its standard input, not written as another line in the shell script, so you need to make it a "here document". (Also, if your table is literally named table, it has to be backtick-quoted as `table`, since TABLE is a reserved word in MySQL.)
mysql --host=randomhost --user=randomuser --password=randompass randomdb << EOF
insert into table (field1,field2,field3) values('http://www.site.com/$hash','$file','$size');
EOF
The << EOF means take everything before the next line that contains nothing but EOF (no whitespace at the beginning) as standard input to the program.
This might not be exactly what you are looking for but it is an option.
If you want to bypass the annoyance of actually including your query in the sh script, you can save the query as a .sql file (useful sometimes when the query is REALLY big and complicated). This can be done with simple file IO in whatever language you are using.
Then you can simply include in your sh script something like:
mysql -u youruser -pyourpass -h remoteHost yourdb < query.sql &
This is called batch mode execution. Optionally, you can include the ampersand at the end so that line of the sh script does not block.
Also, if you are concerned about the same data getting entered multiple times and leaving your RDBMS inconsistent, you should explore MySQL transactions (commit, rollback, etc.).
Don't use raw SQL from bash; bash has no sane facility for sanitizing the data beforehand. Generate a CSV file and upload that instead.
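A hedged sketch of that approach, reusing the question's placeholders (the CSV path, the --local-infile flag, and the column list are assumptions): append one CSV row per file inside the loop, then bulk-load the file once at the end:

#!/bin/bash
csv=/tmp/upload_rows.csv
# inside the loop: values travel as data, not as SQL text
printf '%s,%s,%s\n' "http://www.example.com/$hash" "$file" "$size" >> "$csv"
# after the loop: one round trip loads every row
mysql --host=randomhost --user=randomuser --password=randompass --local-infile=1 randomdb \
  -e "load data local infile '$csv' into table \`table\` fields terminated by ',' (field1,field2,field3)"

Because the values arrive as data rather than as SQL text, a stray quote in a filename can no longer break (or inject into) the statement; the caveat is that a comma inside a value would still need proper CSV quoting.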