Running multiple SQL statements in one transaction using MyBatis - MySQL

When trying to run multiple queries like:
<insert id="insertTest">
insert into table1 values('foo');
insert into table2 values('foo');
</insert>
using MyBatis, I get an exception with the SQL error:
You have an error in your SQL syntax ... at line 2
I have tried various combinations of the following settings; all returned the same result.
## JDBC connection properties.
driver=com.mysql.jdbc.Driver
url=jdbc:mysql://localhost:3306/test_db?allowMultiQueries=true
username=root
password=********
# If set to true, each statement is isolated
# in its own transaction. Otherwise the entire
# script is executed in one transaction.
auto_commit=false
# This controls how statements are delimited.
# By default statements are delimited by an
# end of line semicolon. Some databases
# (e.g. MS SQL Server) may require a full line
# delimiter such as GO.
delimiter=;
full_line_delimiter=false
# This ignores the line delimiters and
# simply sends the entire script at once.
# Use with JDBC drivers that can accept large
# blocks of delimited text at once.
send_full_script=true
(settings from the question)

Aren't you missing the column name into which you want to insert the data?
insert into table1 (column_name) values('foo');
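If the driver still rejects multi-statement strings, a common client-side fallback (not shown in this thread) is to split the script on semicolons and execute each statement separately. A minimal sketch in Python, using sqlite3 and throwaway table names purely for illustration:

```python
import sqlite3

# Illustrative multi-statement script; table names are made up.
script = """
CREATE TABLE table1 (name TEXT);
CREATE TABLE table2 (name TEXT);
INSERT INTO table1 VALUES ('foo');
INSERT INTO table2 VALUES ('foo');
"""

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Split on ';' and run each statement on its own. This naive split
# breaks if a string literal contains a semicolon, so keep the script simple.
for stmt in script.split(";"):
    if stmt.strip():
        cur.execute(stmt)
conn.commit()

rows = cur.execute("SELECT name FROM table2").fetchall()
print(rows)  # [('foo',)]
```

The same split-and-execute loop works with any DB-API driver, which sidesteps whether the server or driver allows multiple queries per call.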

Related

MySQL - Error while connecting to MySQL Not all parameters were used in the SQL statement

I use MySQL Connector (Python 3) and I would like to upload the values of a single CSV column into an existing table. I created a new column in the DB with:
ALTER TABLE myTable ADD `TEST` TEXT;
Now I created a Python query; what is the problem there?
# stvk_u is my dataframe
for i, row in stvk_u.iterrows():
    print(row["datas_of_other_csv"])
    cursor.execute("INSERT INTO myTable (TEST) VALUES (%s)", tuple(row["datas_of_other_csv"]))
But I get the error:
Error while connecting to MySQL Not all parameters were used in the SQL statement
Can I not just insert into an existing table? I do not see what is wrong.
Thanks in advance
The statement requires exactly one parameter, while you are providing more than one.
The tuple function splits a string into a tuple of its characters:
$ python3 -c "print(tuple('foo'))"
('f', 'o', 'o')
Correct would be:
cursor.execute(statement, (row["datas_of_other_csv"],))  # note the trailing comma, which makes this a one-element tuple
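To see why the original call fails, compare what tuple() does to a string with what a trailing comma builds:

```python
value = "foo"

# tuple() iterates the string, yielding one character per element,
# so a single %s placeholder receives three parameters:
wrong = tuple(value)
print(wrong)   # ('f', 'o', 'o')

# A trailing comma builds a one-element tuple, which is what
# cursor.execute() expects for a single %s placeholder:
right = (value,)
print(right)   # ('foo',)
```

With `wrong`, the driver sees three parameters for one placeholder, hence "Not all parameters were used in the SQL statement".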

Executing sql strings from csv into mysql database

I've created a huge .csv with only one column; each row is a valid SQL UPDATE statement like:
UPDATE TBL SET LAST = 0 WHERE ID = 1534781;
Is there a way to execute each row as a single SQL query? Also, I'm using DataGrip; if anyone knows of a suitable tool, I would be happy.
To execute a file against your database in DataGrip, just use the context menu when viewing your file in the Files tool window.
A CSV file that contains one column is just called a "file." :-)
The most common way of executing a series of SQL statements in a file is with the command-line mysql client:
mysql -e "source myfile.csv"
How about a script:
begin
update ...
update ...
update ...
...
end;
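Outside DataGrip, such a one-statement-per-line file can also be replayed from a small script: read it line by line and execute each statement. A sketch in Python, using sqlite3 and an in-memory stand-in for the file so the example is self-contained (table and column names are illustrative):

```python
import io
import sqlite3

# Stand-in for the one-UPDATE-per-line "CSV"; in practice this would be open(path).
sql_lines = io.StringIO(
    "UPDATE tbl SET last = 0 WHERE id = 1;\n"
    "UPDATE tbl SET last = 0 WHERE id = 2;\n"
)

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE tbl (id INTEGER, last INTEGER)")
cur.executemany("INSERT INTO tbl VALUES (?, ?)", [(1, 5), (2, 7), (3, 9)])

# Execute each non-empty line as its own statement.
for line in sql_lines:
    stmt = line.strip().rstrip(";")
    if stmt:
        cur.execute(stmt)
conn.commit()

rows = cur.execute("SELECT id, last FROM tbl ORDER BY id").fetchall()
print(rows)  # [(1, 0), (2, 0), (3, 9)]
```

The same loop works with a real MySQL connection by swapping sqlite3 for a MySQL DB-API driver.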

JMeter sql syntax error using parameters with insert

I'm working with JMeter to load-test queries on a MySQL database (MemSQL server, MySQL syntax). I'm using the GUI version of JMeter to create a test plan XML file, then I go to another server and run that XML in non-GUI mode.
I have two queries running, one that selects and one that inserts. Both queries use parameters taken from a csv file I made with a script.
My SELECT statement works just fine with parameters taken from a CSV file, but I run into syntax errors with my INSERT statement:
INSERT INTO customer_transactions_current (`column_name1`, ... , `column_name12`)
VALUES ((${r1},${r2},${r3},${r4},${r5},${r6},${r7},${r8},${r9},${r10},${r11},${r12}));
In the 'CSV Data Set Config' section in GUI mode I chose ',' as the delimiter, and the variable names are r1,...,r12.
Under the query itself I entered the parameter types and again the same names, just as I did for the working SELECT query.
When I run the query I run into a syntax error on the first column (which is of type datetime):
java.sql.SQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '19:00:00,75400492,936988,56,1115,5,2156,8,2,3,909,3))' at line 2
The dates I'm entering are of the form '2018-11-2 20:00:00', and in the CSV file they appear without apostrophes.
It seems the syntax error has something to do with the date, at the position where it contains a space. I tried applying the STR_TO_DATE function to that column but kept getting syntax errors.
BUT when I take some values from the file and run the query manually, it works fine! So my thought is that it has something to do with how JMeter handles the spaces before sending out the query.
Is the problem with my JMeter configuration, given that the query is OK when run manually?
Add apostrophes around the values in the INSERT and remove the unnecessary parentheses:
INSERT INTO customer_transactions_current (`column_name1`, ... , `column_name12`)
VALUES ('${r1}','${r2}','${r3}','${r4}','${r5}','${r6}','${r7}','${r8}','${r9}','${r10}','${r11}','${r12}');
If you have date-format issues, see how to use STR_TO_DATE.

How to run multiple MySQL statements via JDBC sampler in JMeter

I am using JDBC sampler in JMeter 2.13.
I have around 100 delete statements in my JMeter sampler like below:
delete from abc where id >= ${Variable_Name};
delete from qwe where id >= ${Variable_Name};
delete from xyz where id >= ${Variable_Name};
The problem is that when I run a single statement in the JDBC sampler, it works fine. But whenever I try to run 2 or more statements from the sampler, it always throws an error:
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'delete from qwe where id >= 1;
Can someone please suggest a workaround, and how I can overcome this problem?
It seems you cannot execute multiple statements in a single JDBC Request element.
I had a similar situation where I needed to execute some clean up statements on the database before proceeding with the rest of the tests. I was able to achieve this by reading the SQL statements from an external file, using CSV Data Set Config nested in a Loop Controller, in a separate setUp Thread Group.
The elements were placed like this:
And I used the following configurations:
Loop Controller
Loop Count: Forever
CSV Data Set Config
Filename: /path/to/multiple-statements.sql
Variable Name: STMT
Recycle on EOF: False
Stop thread on EOF: True
JDBC Request
Query: ${STMT}
The Loop Controller is set to run forever, as the stop condition is set on the CSV Data Set Config. Each iteration will read one line of the file, set the variable STMT, then JDBC Request will execute the query ${STMT}.
When the end-of-file is reached, the setUp Thread Group will stop and the core test Thread Group will proceed.
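The same file-driven loop is easy to mimic outside JMeter, which can help debug the statements themselves before wiring up the test plan. A Python sketch; sqlite3, the table names, and the `${Variable_Name}` substitution via str.replace are all stand-ins for the JMeter machinery:

```python
import io
import sqlite3

# One statement per line, with a JMeter-style placeholder (names are illustrative).
statements = io.StringIO(
    "DELETE FROM abc WHERE id >= ${Variable_Name};\n"
    "DELETE FROM qwe WHERE id >= ${Variable_Name};\n"
)
variable_name = "3"

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
for table in ("abc", "qwe"):
    cur.execute(f"CREATE TABLE {table} (id INTEGER)")
    cur.executemany(f"INSERT INTO {table} VALUES (?)", [(1,), (2,), (3,), (4,)])

# Like the setUp Thread Group loop: one statement per iteration until EOF.
for line in statements:
    stmt = line.strip().rstrip(";").replace("${Variable_Name}", variable_name)
    if stmt:
        cur.execute(stmt)
conn.commit()

remaining = cur.execute("SELECT COUNT(*) FROM abc").fetchone()[0]
print(remaining)  # 2  (ids 1 and 2 remain)
```

Each line is executed as its own statement, which is exactly why the single-statement-per-JDBC-Request constraint stops being a problem.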

LOAD DATA INFILE with @variable as the file name gives an error

I want to use a variable as the file name in LOAD DATA INFILE. I ran the code below:
SET @d1 = 'C:/Users/name/Desktop/MySQL/1/';
SET @d2 = CONCAT(@d1, '20130114.txt');
LOAD DATA LOCAL INFILE @d2 INTO TABLE Avaya_test (Agent_Name, Login_ID, ACD_Time);
Unfortunately, after running it there is an error like the one below:
"Error Code: 1064. You have an error in your SQL syntax ......"
The variable @d2 is underlined in this code, so this error is caused by the variable.
Can you help me define a file-name variable correctly in LOAD DATA @variable INFILE?
Thank you.
A quotation from the MySQL documentation:
The file name must be given as a literal string. On Windows, specify backslashes in path names as forward slashes or doubled backslashes. The character_set_filesystem system variable controls the interpretation of the file name.
That means it cannot be a parameter of a prepared statement, stored procedure, or anything "server-side". The string/path evaluation must be done client-side.
In general, it is not possible directly but you can use a temporary file.
The procedure is similar to using a PREPARE/EXECUTE combination. Note that you cannot use PREPARE with LOAD DATA INFILE; for this reason you need a temporary file.
Here's an example of how to read a file with today's date:
SET @sys_date = CURRENT_DATE();
SET @trg_file = CONCAT("LOAD DATA INFILE '/home/data/file-", @sys_date, "' INTO TABLE new_data FIELDS TERMINATED BY ' ';");
SELECT @trg_file INTO OUTFILE '/tmp/tmp_script.sql';
SOURCE /tmp/tmp_script.sql;
WARNING: You cannot overwrite files with SELECT ... INTO OUTFILE, so the temporary file must not already exist. This is a serious problem if you want to automate the previous example.
Unfortunately, this does not seem to be possible in MySQL.
In addition to Binary Alchemist's answer, it's also not possible with prepared statements, as LOAD DATA is not on this list: SQL Syntax Allowed in Prepared Statements
You could use something external to generate your LOAD DATA INFILE statement and then run the SQL. For instance, you could create it in Excel.
This syntax can't work since the variables are interpreted by the server, while the file is read by the mysql client.
Therefore, for the client, @d2 is an illegal file name.
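Since the file name must be a literal string in the statement the client sends, another client-side option is to build the LOAD DATA statement in application code instead of in a server variable. A sketch in Python; the path, table, and column names come from the question, and the actual connection step is only indicated in a comment:

```python
def build_load_statement(directory: str, file_name: str) -> str:
    """Assemble a LOAD DATA LOCAL INFILE statement with the path inlined as a literal."""
    path = directory.rstrip("/") + "/" + file_name
    return (
        "LOAD DATA LOCAL INFILE '{}' "
        "INTO TABLE Avaya_test (Agent_Name, Login_ID, ACD_Time)".format(path)
    )

stmt = build_load_statement("C:/Users/name/Desktop/MySQL/1/", "20130114.txt")
print(stmt)

# With a real connection (e.g. mysql-connector-python) this would then be:
# cursor.execute(stmt)
```

Because the concatenation happens in the client program, the server only ever sees a literal file name, which is exactly what the documentation requires.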
My solution works on Linux/Mac/[Windows with bash] (e.g. cygwin)
Create a template with the load SQL, e.g. load.tpl -- note the %FILENAME% placeholder
LOAD DATA LOCAL INFILE '/path/to/data/%FILENAME%'
INTO TABLE Avaya_test ( Agent_Name, Login_ID, ACD_Time );
Then run this command:
for n in `ls *.txt`; do echo $n; sed -e 's/%FILENAME%/'$n'/g' load.tpl | mysql -u databaseUser databaseName; done