I was running a simple query in MySQL, joining two tables and counting the row pairs where the ids don't match. Both tables have about 500k rows. My query was something like
select count(*) from t1 join t2 on t1.id <> t2.id
and after 300 seconds I got the following error:
Error Code: 1317. Query execution was interrupted
After that, I could not run even a simple query on that table, like
select * from t1 limit 50
All other tables were still working, but my system also went down for a while. Finally, I restarted my MySQL server and everything started working again.
Any idea why my table got stuck?
TIA
Your table was locked. If a query crashes for some reason, you have to kill the query to unlock your table (or restart your MySQL server).
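It is also worth spelling out why the query took so long in the first place: `ON t1.id <> t2.id` matches nearly every pair of rows, so the join degenerates into almost a full cross join. With 500k rows on each side that is on the order of 250 billion result rows. A minimal SQLite-from-Python sketch (illustrative tables, not the asker's schema) shows the quadratic blow-up:

```python
# Demonstration of the inequality-join explosion: the result set is the
# full cross product minus the matching pairs, i.e. n*n - n rows.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t1 (id INTEGER)")
conn.execute("CREATE TABLE t2 (id INTEGER)")

n = 1000  # small stand-in for the asker's 500k rows
conn.executemany("INSERT INTO t1 VALUES (?)", [(i,) for i in range(n)])
conn.executemany("INSERT INTO t2 VALUES (?)", [(i,) for i in range(n)])

# The problematic query shape: JOIN ... ON t1.id <> t2.id
count = conn.execute(
    "SELECT COUNT(*) FROM t1 JOIN t2 ON t1.id <> t2.id"
).fetchone()[0]
print(count)  # 999000, i.e. 1000*1000 - 1000
```

At 500k rows per table the same arithmetic gives 500000² − 500000 ≈ 2.5 × 10¹¹ rows, which is why the server hit the execution timeout.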
I found strange behavior with the following query:
UPDATE llx_socpeople SET no_email=1 WHERE rowid IN (SELECT source_id FROM llx_mailing_cibles where tag = "68d74c3bc618ebed67919ed5646d0ffb");
It takes 1 minute and 30 seconds.
When I split it up into two queries:
SELECT source_id FROM llx_mailing_cibles where tag = "68d74c3bc618ebed67919ed5646d0ffb";
Result is 10842
UPDATE llx_socpeople SET no_email=1 WHERE rowid = 10842;
Result is shown in milliseconds.
Table llx_socpeople has about 7,000 records; llx_mailing_cibles has about 10,000 records.
MySQL Version is: 5.7.20-0ubuntu0.16.04.1
I already tried to optimize/repair both tables with no effect.
Any ideas?
Since the subquery is executed as a dependent subquery, once for each row of the outer query, a long execution time is to be expected.
What I would suggest is to rely on an inner join to perform the update:
UPDATE llx_socpeople AS t1
INNER JOIN llx_mailing_cibles AS t2
ON t1.rowid = t2.source_id
SET t1.no_email=1
WHERE t2.tag = "68d74c3bc618ebed67919ed5646d0ffb";
This way you will get far better performance.
You can troubleshoot slow queries using MySQL's EXPLAIN statement. You can find more details on its dedicated page in the official documentation. It might help you discover any missing indexes.
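To see that the rewrite is safe, both forms target exactly the same rows; the join form just lets MySQL resolve the match once instead of re-running the subquery per row. A sketch using SQLite stand-ins for the Dolibarr tables (table names shortened, data invented) checks the equivalence of the two target sets:

```python
# Verify that the IN-subquery form and the join form select the same rows.
# "id" stands in for the MySQL rowid column of llx_socpeople.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE socpeople (id INTEGER PRIMARY KEY, no_email INTEGER DEFAULT 0);
CREATE TABLE mailing_cibles (source_id INTEGER, tag TEXT);
INSERT INTO socpeople (id) VALUES (1), (2), (3), (4);
INSERT INTO mailing_cibles VALUES (2, 'abc'), (4, 'abc'), (3, 'other');
""")

# Rows targeted by the original IN-subquery form
in_rows = conn.execute(
    "SELECT id FROM socpeople WHERE id IN "
    "(SELECT source_id FROM mailing_cibles WHERE tag = 'abc')"
).fetchall()

# Rows targeted by the join form (expressed as a SELECT here, because
# SQLite's multi-table UPDATE syntax differs from MySQL's UPDATE ... JOIN)
join_rows = conn.execute(
    "SELECT DISTINCT s.id FROM socpeople s "
    "JOIN mailing_cibles m ON s.id = m.source_id WHERE m.tag = 'abc'"
).fetchall()

print(sorted(in_rows), sorted(join_rows))  # both [(2,), (4,)]
```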
Is it possible to select a subquery on the same table, i.e.
...(select * from tableA a, (select somecolumnA, max(somecolumnB) from tableA group by somecolumnA) b)...
where ... indicates the other queries on the other tables, using left joins etc.
Will this cause any slowness in MySQL?
I am actually getting the error "Lost connection to MySQL server during query", and I am checking whether this part is causing the problem.
Running the following works great:
SELECT email FROM User WHERE empNum IN (126,513,74)
However, this takes a very long time to respond (no errors) when using:
SELECT email FROM table1 WHERE empNum IN (
SELECT empNum FROM table2 WHERE accomp = 'onhold' GROUP BY empNum
)
What is causing this?
How about this one?
SELECT DISTINCT table1.email
FROM table1
INNER JOIN table2 USING(empNum)
WHERE table2.accomp = 'onhold'
You should probably make an index on table2.accomp if you use that query often enough:
CREATE INDEX accomp ON table2 (accomp);
or maybe
CREATE INDEX accomp ON table2 (empNum,accomp);
To perform some crude (but decisive) benchmarks:
log in to the MySQL console
clear the query cache(*):
RESET QUERY CACHE;
run the slow query and write down the timing
create an index
clear the query cache
run the slow query and write down the timing
drop the index
create the other index
clear the cache
run the slow query one more time
compare the timings and keep the best index (by dropping the current one and creating the correct one if necessary)
(*) You will need the relevant privileges to run that command. Note also that the query cache (and RESET QUERY CACHE) was removed entirely in MySQL 8.0; on 8.0+ just run each query a couple of times and take the later timings.
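The point of the benchmark loop above is to confirm that the index actually changes the query plan, not just the timing. A sketch of that check, using SQLite's EXPLAIN QUERY PLAN as a stand-in for MySQL's EXPLAIN (the output format differs, and the index here leads with the filter column, i.e. (accomp, empNum)):

```python
# Before the index the plan is a full table scan; after it, an index search.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE table2 (empNum INTEGER, accomp TEXT)")
conn.executemany(
    "INSERT INTO table2 VALUES (?, ?)",
    [(i, "onhold" if i % 10 == 0 else "done") for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows are (id, parent, notused, detail)
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

q = "SELECT empNum FROM table2 WHERE accomp = 'onhold' GROUP BY empNum"

before = plan(q)  # full scan of table2
conn.execute("CREATE INDEX accomp ON table2 (accomp, empNum)")
after = plan(q)   # search via the new index

print(before)
print(after)
```

The same idea applies under MySQL: run EXPLAIN on the slow query before and after CREATE INDEX and check that the `type`/`key` columns change from a full scan to an index lookup.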
I think the join statement you need is:
SELECT email FROM table1
INNER JOIN table2
ON table1.empNum=table2.empNum
AND table2.accomp = 'onhold'
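One caveat with this plain-join rewrite, compared with the DISTINCT version in the earlier answer: if an employee has several 'onhold' rows in table2, their email is returned once per match. A small SQLite sketch (table and column names follow the question, data invented) shows the difference:

```python
# A plain join repeats table1 rows once per matching table2 row;
# DISTINCT collapses them back to the IN-subquery semantics.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE table1 (empNum INTEGER, email TEXT);
CREATE TABLE table2 (empNum INTEGER, accomp TEXT);
INSERT INTO table1 VALUES (126, 'a@x.com'), (513, 'b@x.com');
INSERT INTO table2 VALUES (126, 'onhold'), (126, 'onhold'), (513, 'onhold');
""")

plain = conn.execute(
    "SELECT table1.email FROM table1 JOIN table2 "
    "ON table1.empNum = table2.empNum AND table2.accomp = 'onhold'"
).fetchall()

distinct = conn.execute(
    "SELECT DISTINCT table1.email FROM table1 JOIN table2 "
    "ON table1.empNum = table2.empNum AND table2.accomp = 'onhold'"
).fetchall()

print(len(plain), len(distinct))  # 3 vs 2: empNum 126 appears twice without DISTINCT
```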
I am somewhat inexperienced with MySQL, so please excuse my naivety.
I am trying to merge two tables according to an ISR id number and then insert that data into another table, adr. I am running the query against a MySQL database in Workbench 5.2, and it takes less than a second if I limit it as follows:
TRUNCATE adr;
INSERT into adr
(SELECT drug08q1.isr, drug08q1.drugname, reac08q1.pt
FROM drug08q1
INNER JOIN reac08q1
ON drug08q1.isr = reac08q1.isr LIMIT 0,2815);
SELECT * FROM adr;
If I increase the limit to:
LIMIT 0,20816
the query runs forever and never finishes. I have no idea why increasing the LIMIT beyond some threshold makes it hang. What could I be doing incorrectly?
Thank you in advance!
I have a statement like the one below:
CREATE TABLE INPUT_OUTPUT
SELECT T1_C1,.....,T1_C300, T1_PID from T1
INNER JOIN (SELECT T2_C1,T2_C2,T2_PID FROM T2) as RESPONSE ON T1.T1_PID=RESPONSE.T2_PID
which has been running extremely slowly, for 5 hours now. The two tables have about 4 million rows and a few hundred columns each.
I have an 8-core, 64 GB RAM Ubuntu Linux machine, and using top I can see that not even 3 GB is being used by the mysql process, on just one core, although admittedly its usage is consistently at 100%. It's upsetting that not all cores are being used.
I want to create the table much faster than this.
Should I instead use
CREATE TABLE INPUT_OUTPUT LIKE T1
and then alter INPUT_OUTPUT to add the relevant extra columns from T2 before populating it? I'm not sure of the syntax for that, or whether it would lead to a speed-up.
Does T1_PID have an index? If so, this should run quickly. Run an EXPLAIN of the SELECT part of your query and see what it says.
That said, I don't understand why you need the subquery. What is wrong with:
CREATE TABLE INPUT_OUTPUT
SELECT T1_C1,.....,T1_C300, T1_PID, T2_C1, T2_C2, T2_PID
FROM T1 INNER JOIN T2 ON T1.T1_PID=T2.T2_PID
Using the latter should work if either T1 or T2 has a PID index.
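A sketch of the suggested direct CREATE TABLE ... SELECT, with an index on the join key, using SQLite stand-ins (column lists abbreviated from the question; the data is invented):

```python
# CREATE TABLE ... AS SELECT over an indexed join key: the index lets the
# join do lookups instead of scanning T2 for every T1 row.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE T1 (T1_C1 TEXT, T1_PID INTEGER);
CREATE TABLE T2 (T2_C1 TEXT, T2_PID INTEGER);
INSERT INTO T1 VALUES ('a', 1), ('b', 2), ('c', 3);
INSERT INTO T2 VALUES ('x', 2), ('y', 3), ('z', 9);
""")

# The index the answer asks about; without it the join must scan T2 repeatedly
conn.execute("CREATE INDEX idx_t2_pid ON T2 (T2_PID)")

conn.execute("""
CREATE TABLE INPUT_OUTPUT AS
SELECT T1_C1, T1_PID, T2_C1, T2_PID
FROM T1 INNER JOIN T2 ON T1.T1_PID = T2.T2_PID
""")

rows = conn.execute("SELECT * FROM INPUT_OUTPUT ORDER BY T1_PID").fetchall()
print(rows)  # [('b', 2, 'x', 2), ('c', 3, 'y', 3)]
```

Under MySQL the equivalent index would be `ALTER TABLE T2 ADD INDEX (T2_PID);` (or on T1.T1_PID), created before running the CREATE TABLE ... SELECT.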