"Lost connection to MySQL server during query" kills Windows Service - mysql

I have developed a Windows service using Delphi 2007. It connects to a remote MySQL database over the internet using TAdoConnection and TAdoQuery. I have kept the default CommandTimeout of 30 seconds. I also create the connection/query objects for each new query and free them when done (i.e. I don't open the database connection at startup and keep it open).
Every once in a while the service stops and the event viewer shows "Lost connection to MySQL server during query". I have everything wrapped in exception handlers. My suspicion is that the network drops while the query is executing.
Anyone have any resolution/ideas?
What triggers Windows to shut down the service?
Also, I have the service "Recovery" option set to restart the service, but this never happens.
My next step will be to log when each query starts and compare that to the date/time of the shutdown, because as of now I don't know how long the query runs before the failure.
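The logging step proposed above can be sketched as a small wrapper around whatever actually executes the query. This is a Python stand-in for the Delphi code (the names `run_logged` and `execute` are hypothetical), recording the start time of each query so a crash timestamp can later be matched against it:

```python
import time

query_log = []  # in a real service this would go to a file or the Windows event log

def run_logged(execute, sql):
    """Run `execute(sql)` and record when it started and how it ended."""
    entry = {"sql": sql, "started": time.time()}
    query_log.append(entry)  # logged *before* execution, so a crash still leaves a trace
    try:
        result = execute(sql)
        entry["status"] = "ok"
        return result
    except Exception as exc:
        entry["status"] = f"failed: {exc}"
        raise
```

Because the entry is appended before the query runs, the last log record after a crash identifies the query that was in flight.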

This may not be a direct answer, but I had the same problem a few days ago. My MySQL server is local and I connect using MyDAC components.
After many attempts, I found that the problem came from one table that has BLOB fields. I queried the table like
select * from table
and got this error after the query had fetched around 1600 rows. After some inspection I found the problem came from a few records that appear to be corrupted. When I ran a query like
select * from mytable where id not between 1599 and 1650
it completed without the problem, but if I removed NOT from the query, so that it fetched only those ~50 records, I got the error. That means some records in that range are corrupted. I ran mysqlcheck, but it didn't fix the problem, and some other check tools gave the same result. I haven't tried deleting those records, because I want to know why this happened, but I got busy with other things and left the server alone for a while.
BTW, I used MySQL Query Browser to run the queries, because other tools reported the error without showing how many records had been fetched before the MySQL instance terminated unexpectedly.
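The narrowing-down the answerer did by hand with BETWEEN ranges can be automated as a binary search. This is a hypothetical Python sketch: `fetch_ok(a, b)` is assumed to wrap the real `SELECT * ... WHERE id BETWEEN a AND b` and report whether it completed without the "lost connection" error:

```python
def find_bad_range(fetch_ok, lo, hi):
    """Return the lowest id whose row triggers the fetch error.

    Assumes ids below `lo` are clean and at least one bad row
    exists in [lo, hi] (i.e. the full-range fetch already failed).
    """
    while lo < hi:
        mid = (lo + hi) // 2
        if not fetch_ok(lo, mid):
            hi = mid        # a bad row is in the lower half
        else:
            lo = mid + 1    # lower half is clean; first bad row is higher
    return lo
```

Repeating this after excluding each found id locates every corrupted record in O(log n) queries per record instead of manual range guessing.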

Related

MySql AutoIncrement causes slow MySQL connection

This may be an open-ended question with many possible answers, but I will ask anyway...
I am in the process of migrating MySQL databases to AWS while developing a new web app. For now I update the current MySQL PROD every time I change something in the DEV AWS system, so when I sync DEV with PROD weekly, the table structures stay consistent.
I am also using the Sequelize Node.js package, which by default works best with ID fields when associating tables/models, and I would rather avoid writing raw SQL statements. I therefore needed to create an ID field on an existing table whose previous primary key spanned 3 columns. When I add the ID column, make it the AUTO_INCREMENT primary key, and remove the other 3 columns from the primary key, the server opens very slowly in MS Access, which is what the company currently uses while I develop the web application and slowly move away from the current database.
Why would my changes cause this slowness? And how can I keep the ID field as an AUTO_INCREMENT primary key without making MySQL slow to open? I have verified this is the cause by undoing the change, after which MS Access can connect to the db in 1-2 minutes again.
Apparently once MS Access finally connects to MySQL, everything runs fine afterwards; it is the initial connection that now takes around 15 minutes, where before it took 1-2 minutes.
I have done a lot of research, focusing on MS Access and MySQL configuration with no luck, rather than on what I actually changed that could have caused this, which I do not quite understand.
Thanks in advance.

MySQL hangs a user when running a query

Some technical details before getting to the issue:
MySQL Version: 5.3.1
Engine: Innodb
Server running under Centos
I'm experiencing some trouble with user connections when running certain queries. For this example I will make a query slow enough to reproduce the scenario.
Let's say I have a user "testuser1" and I use it to run a query that takes 2 minutes (long on purpose). While this query is running, I try to run another query with the same user (on different tables), but it doesn't actually run until the first query finishes (it's sequential).
First thing I thought was... well, maybe those tables are locked? (even though I cannot even connect to the database at all).
So I tried a new user "testuser2" while the query was running. In this case I had no problem at all connecting to the database and running queries.
After this I thought the user might be blocked, so I tried connecting "testuser1" to a different database, with no issue.
Next I tried using "testuser1" from a different IP while the first query was running. Again, I didn't experience any issue running my queries.
Does anyone know why I cannot run two queries at a time with the same user from the same IP, and how to fix this problem?
Thanks a lot!
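One plausible explanation (an assumption, since the client tool isn't named): MySQL serializes statements per *connection*, not per user, so if the client reuses a single connection for the same user/IP pair, the second query waits for the first. A toy Python model of that behavior, with all names hypothetical:

```python
import threading
import time

class FakeConnection:
    """Stands in for one client connection: statements on it are serialized."""
    def __init__(self):
        self._lock = threading.Lock()

    def query(self, seconds):
        with self._lock:          # one statement at a time per connection
            time.sleep(seconds)   # pretend the query takes this long

def run_in_parallel(conn_a, conn_b):
    """Fire one 0.2s query on each connection object; return elapsed time."""
    t0 = time.time()
    threads = [threading.Thread(target=conn_a.query, args=(0.2,)),
               threading.Thread(target=conn_b.query, args=(0.2,))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.time() - t0
```

Passing the same object twice models a reused connection (queries take ~0.4s total); two separate objects model two sessions (~0.2s total). If this is the cause, the fix is to make the client open a second connection rather than reuse the first.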

Spotfire connection with MySql

I have been trying to use MySQL data with Spotfire. I had a connection established and managed to make a working dxp file. When I tried to do the same with a second table, it threw this error:
The following error message "Streaming result set com.mysql.jdbc.RowDataDynamic#XXXXXX is still active. No statements may be issued when any streaming result sets are open and in use on a given connection. Ensure that you have called .close() on any active streaming result sets before attempting more queries" comes from MySQL.
Afterwards I opened my already-working dxp file and the error popped up again. It seems I somehow have a connection still open and active, but Spotfire is supposed to manage the connection, as far as I know at least.
After looking this up online, some people say it's a bug, others say I need to close the connection, but I can't find a solution. Please help.
Edit: changing the minimum connections number (which probably closes the open connections) makes it possible for the dxp to run once, but then it fails the second time (after closing and reopening).
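The error text comes from MySQL Connector/J's result-set streaming mode, which forbids issuing any other statement on a connection while a streaming result set is open. One commonly suggested workaround, assuming Spotfire lets you edit the JDBC connection URL in its data source template, is to switch from streaming to cursor-based fetching via Connector/J's connection properties (host, port, database, and fetch size below are placeholders):

```
jdbc:mysql://yourhost:3306/yourdb?useCursorFetch=true&defaultFetchSize=1000
```

`useCursorFetch` and `defaultFetchSize` are standard Connector/J connection properties; whether this resolves the Spotfire-specific behavior is an assumption.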

Effect of Multiple TADOConnection

What is the effect of multiple TADOConnections?
Here is what I did :
I put a TADOConnection on almost every form in my application.
Those TADOConnections connect to the database (MySQL) every time I create an instance of a form.
In average use of the application, about 15 forms will be in use (15 TADOConnections connected to the database). So far my application has run smoothly, but yesterday a user reported the error "MySQL has gone away".
I've encountered that error in the past, and it was because the data was too large or there was a hardware problem. But this time the data is not big and the hardware is in excellent condition. By the way, the connection is local. Could the multiple TADOConnections have produced the error?
The effect of multiple ADOConnections is that you open multiple independent sessions in the database. I wouldn't recommend your approach, in consideration of transaction management and table locking.
Server has gone away: http://dev.mysql.com/doc/refman/5.1/en/gone-away.html
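The answer's suggestion amounts to sharing one connection across all forms; in Delphi that would typically mean a single TADOConnection on a shared data module. A minimal Python sketch of the lazy-shared-connection idea (class and parameter names are hypothetical; `connect` stands in for whatever actually opens the MySQL session):

```python
import threading

class SharedConnection:
    """Lazily open one connection and hand the same instance to every caller."""
    _lock = threading.Lock()
    _conn = None

    @classmethod
    def get(cls, connect):
        # Every form calls get() instead of opening its own connection.
        with cls._lock:
            if cls._conn is None:
                cls._conn = connect()  # opened once, on first use
            return cls._conn
```

With one shared session, transactions and table locks behave consistently across forms, and the server sees one connection instead of fifteen.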

Connecting and disconnecting from Mysql continuously with Excel

Newbie question... sorry.
I have a simple MySQL database running on our intranet (Windows server) which >20 people connect to for searching/inserting records, etc.
This is done with a simple Excel GUI.
Process is:
1. Search strings are typed into Excel cells
2. VBA opens a connection to MySQL and runs the query
3. The retrieved results are placed in Excel
4. The connection to MySQL is closed with VBA
The above process generally takes 0-2 seconds; records retrieved <100.
Everything runs fine so far.
To be able to connect more people in the future, I would like some feedback on whether it is OK to continuously connect to and disconnect from MySQL the way I am doing.
Can it cause some type of crash, memory leak, etc.?
Is there some better way to do this?
I am hoping to get <2000 users, but I understand that the more users are connected, the worse it gets.
By disconnecting after each search/insert, I am hoping to keep the number of live connections as low as possible.
thanks for your input
This constant connecting and disconnecting is an expensive process.
A better way would be to use server-side scripting to manage your connections. This way you would have a single persistent connection to each server, and the users will execute their queries through the single connection. You will also need to implement some sort of job queue for execution.
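The "single persistent connection behind a server-side script" idea in this answer can be sketched as a small connection pool. This is a Python stand-in (the real middle tier could be any server-side language); `connect` is a placeholder for whatever opens the MySQL session, and the pool keeps connections open and lends them out instead of reconnecting per query:

```python
import queue

class ConnectionPool:
    """Keep a fixed set of open connections; callers borrow and return them."""
    def __init__(self, connect, size):
        self._pool = queue.Queue()
        for _ in range(size):
            self._pool.put(connect())   # connections opened once, up front

    def execute(self, fn):
        conn = self._pool.get()         # blocks until a connection is free
        try:
            return fn(conn)             # run the caller's query on it
        finally:
            self._pool.put(conn)        # return it instead of closing
```

Each Excel client would send its query to the middle tier rather than opening its own MySQL connection, so 2000 users share a handful of persistent connections, and queuing happens in `execute` when all of them are busy.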