How to set the IGNORE flag in mysqlsh importTable util? - mysql

The following triggers a LOAD DATA INFILE statement in MySQL 8:
util.importTable("sample.csv", {schema: "myschema", table: "mytable", dialect: "csv-unix", fieldsTerminatedBy: ";", showProgress: true})
Question: how can I add the IGNORE INTO flag?

util.importTable uses LOAD DATA LOCAL INFILE, which ignores duplicate keys by default[1]. If you want to replace duplicate keys instead, set the replaceDuplicates option to true.
[1] MySQL Reference Manual for LOAD DATA Syntax
Duplicate-Key Handling
(…) With LOCAL, the default behavior is the same as if IGNORE is specified; this is because the server has no way to stop transmission of the file in the middle of the operation.
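So for the call in the question there is nothing extra to set to get IGNORE behaviour; it is already the default with LOCAL. If you want REPLACE semantics instead, the same call with the option switched would look like this (same parameters as the question, only replaceDuplicates added):
util.importTable("sample.csv", {schema: "myschema", table: "mytable", dialect: "csv-unix", fieldsTerminatedBy: ";", replaceDuplicates: true, showProgress: true})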

Related

MySQL LOAD DATA LOCAL INFILE - Would like to fail when delimiters don't match

I am attempting to use MySQL's LOAD DATA LOCAL INFILE statement in Airflow to populate a table from a temp file. The issue I am having is that when the delimiters don't match what is expected, I only get warning messages and the load continues anyway. Is it possible to use the LOCAL load and have the SQL fail entirely if the delimiters don't match? Incorrect data is being loaded to the table when I'd prefer the load to fail instead.
I attempted the non-local version, LOAD DATA INFILE, but I get permission errors because my temp file and the MySQL server are on different hosts, so I think I am stuck with the LOCAL option.
temp_to_table = MySQLOperator(
    task_id='temp_to_table',
    conn_id=mysql_conn_id,
    sql="""
        LOAD DATA LOCAL INFILE '{{ ti.xcom_pull(key='file_path') }}'
        INTO TABLE airflow.users_test
        FIELDS TERMINATED BY '\x01'
        LINES TERMINATED BY '\x02';
    """,
    autocommit=True
)
Here are some of the warning messages I get, but the data continues loading to the table:
[2021-03-02 16:03:03,742] {logging_mixin.py:112} WARNING - /opt/airflow/plugins/mysql_plugins.py:89: Warning: (1265, "Data truncated for column 'id' at row 1")
cur.execute(self.sql)
[2021-03-02 16:03:03,742] {logging_mixin.py:112} WARNING - /opt/airflow/plugins/mysql_plugins.py:89: Warning: (1261, "Row 1 doesn't contain data for all columns")
cur.execute(self.sql)
[2021-03-02 16:03:03,743] {logging_mixin.py:112} WARNING - /opt/airflow/plugins/mysql_plugins.py:89: Warning: (1265, "Data truncated for column 'superuser' at row 2")
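The thread does not include an accepted fix, but since LOCAL implies IGNORE (see the quote above), one workaround (a sketch, not from this thread) is to check the warning count in the same session immediately after the load and have the calling code raise if it is non-zero:
-- Run in the same connection, right after the LOAD DATA LOCAL INFILE statement
SELECT @@warning_count;   -- non-zero means the load was not clean; the caller can raise to fail the task
SHOW WARNINGS;            -- lists the per-row messages seen in the log above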

JMeter sql syntax error using parameters with insert

I'm working with JMeter to load-test queries on a MySQL database (MemSQL server, MySQL syntax). I'm using the GUI version of JMeter to create a test plan XML file, and then I go to another server and run that XML in non-GUI mode.
I have two queries running, one that selects and one that inserts. Both queries use parameters taken from a CSV file I made with a script.
My SELECT statement works just fine with parameters taken from the CSV file, but I run into syntax errors with my INSERT statement:
INSERT INTO customer_transactions_current (`column_name1`, ... , `column_name12`)
VALUES ((${r1},${r2},${r3},${r4},${r5},${r6},${r7},${r8},${r9},${r10},${r11},${r12}));
In GUI mode, under 'CSV Data Set Config', I chose to delimit the data by ',' and the variable names are r1,..,r12.
Under the query itself I entered the parameter types and, again, the same names, just as I did for the working SELECT query.
When I run the query I get a syntax error on the first column (which is of type datetime):
java.sql.SQLSyntaxErrorException: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '19:00:00,75400492,936988,56,1115,5,2156,8,2,3,909,3))' at line 2
The dates I'm entering are of the form '2018-11-2 20:00:00', and in the CSV file they appear without apostrophes.
It seems the syntax error has something to do with the date, specifically the position where it contains a space. I tried wrapping that column in STR_TO_DATE but kept getting syntax errors.
BUT when I take some values from the file and run the query manually, it works fine! So my thought is that it has something to do with how JMeter handles the spaces before sending out the query.
Is the problem with my JMeter configuration, given that the query is OK when run manually?
Add apostrophes to the inserted values and remove the unnecessary parentheses (identifiers keep their backticks, string values get single quotes):
INSERT INTO customer_transactions_current (`column_name1`, ... , `column_name12`)
VALUES ('${r1}','${r2}','${r3}','${r4}','${r5}','${r6}','${r7}','${r8}','${r9}','${r10}','${r11}','${r12}');
If you have a date format issue, see STR_TO_DATE.
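For the datetime column specifically, a sketch of the STR_TO_DATE form (the format string assumes dates like '2018-11-2 20:00:00' from the question; %c and %e tolerate single-digit month and day):
INSERT INTO customer_transactions_current (`column_name1`, ... , `column_name12`)
VALUES (STR_TO_DATE('${r1}', '%Y-%c-%e %H:%i:%s'),'${r2}', ... ,'${r12}');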

MYSQL Bulk insert failed to import the CSV file

I have a CSV file with '~:-' as the field delimiter and '!^(' as the row delimiter. When I execute the BULK INSERT query as follows:
QUERY FOR INSERTION:
"BULK INSERT SAMPLETABLE FROM sample.csv WITH (FIELDTERMINATOR = '~:-',ROWTERMINATOR = '!^(',ERRORFILE = 'C:/log/error.log',KEEPIDENTITY, KEEPNULLS,FIRSTROW = 2, DATAFILETYPE='widechar')".
I get the SQLException: Bulk load: An unexpected end of file was encountered in the data file.
I made sure the file does not end with '\n'. The other files, for tables that load fine, do not end with '\n' either.
Even though the ERRORFILE option is enabled in the SQL query, I don't see the error.log file being created when the SQLException is thrown.
Finally found the root cause: I was migrating data from an older version of the database to a newer one, and there is a schema change in the newer table, hence the SQLException. Ironically, the exception thrown didn't specifically indicate the schema change.
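Not part of the original answer, but a quick way to spot that kind of schema drift before loading is to list the target table's columns and compare them against the file layout (SAMPLETABLE is the table from the query above):
SELECT COLUMN_NAME, DATA_TYPE, ORDINAL_POSITION
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'SAMPLETABLE'
ORDER BY ORDINAL_POSITION;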

Running multiple SQL statements in one transaction using MyBatis

When trying to run multiple queries like:
<insert id="insertTest">
insert into table1 values('foo');
insert into table2 values('foo');
</insert>
using MyBatis I get an exception with the SQL error
You have an error in your SQL syntax ... at line 2
I have tried various combinations of the following settings; all returned the same result.
## JDBC connection properties.
driver=com.mysql.jdbc.Driver
url=jdbc:mysql://localhost:3306/test_db?allowMultiQueries=true
username=root
password=********
# If set to true, each statement is isolated
# in its own transaction. Otherwise the entire
# script is executed in one transaction.
auto_commit=false
# This controls how statements are delimited.
# By default statements are delimited by an
# end of line semicolon. Some databases
# (e.g. MS SQL Server) may require a full line
# delimiter such as GO.
delimiter=;
full_line_delimiter=false
# This ignores the line delimiters and
# simply sends the entire script at once.
# Use with JDBC drivers that can accept large
# blocks of delimited text at once.
send_full_script=true
(settings from the question)
Isn't the column name missing into which you want to insert the data?
insert into table1 (column_name) values('foo');
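Applied to the original mapper, that would look roughly like this (a sketch; column_name stands in for the real column names, and running two statements in one <insert> still relies on allowMultiQueries=true from the JDBC URL above):
<insert id="insertTest">
insert into table1 (column_name) values('foo');
insert into table2 (column_name) values('foo');
</insert>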

Load data Infile @variable into infile error

I want to use a variable as the file name in LOAD DATA INFILE. I run the code below:
Set @d1 = 'C:/Users/name/Desktop/MySQL/1/';
Set @d2 = concat(@d1, '20130114.txt');
load data local infile @d2 into table Avaya_test (Agent_Name, Login_ID, ACD_Time);
Unfortunately, after running it there is an error with a comment like the one below:
"Error Code: 1064. You have an error in your SQL syntax ......"
Variable "#D2" is underlined in this code so it means that this error is caused by this variable.
Can you help me how to define correctly a variable of file name in LOAD DATA #variable infile ?
Thank you.
A citation from MySQL documentation:
The file name must be given as a literal string. On Windows, specify backslashes in path names as forward slashes or doubled backslashes. The character_set_filesystem system variable controls the interpretation of the file name.
That means that it cannot be a variable, a parameter of a prepared statement, a stored procedure argument, or anything "server-side". The string/path evaluation must be done client-side.
In general, it is not possible directly, but you can use a temporary file. The procedure is similar to a PREPARE/EXECUTE combination; however, you cannot use PREPARE with LOAD DATA INFILE, which is why the temporary file is required.
Here's an example of how to read a file with today's date:
SET @sys_date = CURRENT_DATE();
SET @trg_file = CONCAT("LOAD DATA INFILE '/home/data/file-", @sys_date, "' INTO TABLE new_data FIELDS TERMINATED BY ' ';");
SELECT @trg_file INTO OUTFILE '/tmp/tmp_script.sql';
SOURCE /tmp/tmp_script.sql;
WARNING: You cannot overwrite files with MySQL (SELECT ... INTO OUTFILE fails if the file already exists), so the temporary file must not already exist. This is a serious problem if you want to automate the previous example.
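If you do script it, the cleanup has to happen outside of mysql; a minimal sketch (shell) that removes the temporary file between runs:
rm -f /tmp/tmp_script.sql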
Unfortunately, this does not seem to be possible in MySQL.
In addition to Binary Alchemist's answer, it's not possible with prepared statements either, as LOAD DATA INFILE is not on this list: SQL Syntax Allowed in Prepared Statements.
You could use something external to generate your LOAD DATA INFILE statement and then run the SQL. For instance, you could create it in Excel.
This syntax can't work, since the variables are interpreted by the server while the file is read by the mysql client.
Therefore, for the client, @d2 is an illegal file name.
My solution works on Linux/Mac/[Windows with bash] (e.g. cygwin)
Create a template with the load SQL, e.g. load.tpl -- note the %FILENAME% placeholder
LOAD DATA LOCAL INFILE '/path/to/data/%FILENAME%'
INTO TABLE Avaya_test ( Agent_Name, Login_ID, ACD_Time );
Then run this command:
for n in `ls *.txt`; do echo $n; sed -e 's/%FILENAME%/'$n'/g' load.tpl | mysql -u databaseUser databaseName; done