So this question is a matter of good idea/bad idea. I am using a MySQL connection many times in a short amount of time. I have created my own method calls to update values, insert, delete, etc.
I am reusing the same connection for each of these methods, but I am opening and closing the connection at each call. The problem is that I need to check that the connection is not already open before I try to open it again.
So, the question is: Is there danger in just leaving the MySQL connection open in between method calls? I'd like to just leave it open and possibly improve speed while I am at it.
Thanks for any advice!
Generally speaking, no, you shouldn't be closing it if, in the same class / library / code scope, you're just going to open it again.
This is dependent on the tooling / connection library you're using. If you're using connection pooling, some libraries will not actually close the connection (immediately) but return it to the pool.
The only comment I'll make about reusing a connection is this: if you're using variables that are connection-specific, those variables will still be valid for the same connection, and may cause problems later if another query reads one of them and it holds a stale value from a past query. However, that would also raise questions about the suitability of the variable in the first place.
Opening a connection in MySQL is fairly light (compared with other databases), but you shouldn't be creating extra work if you can avoid it.
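The reuse pattern the question describes, checking whether the connection is open before opening it again, can be sketched in a few lines. This is an illustrative Python sketch, not tied to any particular driver: `connect_fn` is a placeholder for whatever actually opens the connection (e.g. a `mysql.connector.connect(...)` call wrapped in a lambda).

```python
class ReusableConnection:
    """Wraps a single DB connection, opening it lazily and reusing it
    across method calls instead of opening/closing per query."""

    def __init__(self, connect_fn):
        # connect_fn: any zero-argument callable returning a connection
        # object with a close() method (hypothetical stand-in for a real
        # driver call).
        self._connect_fn = connect_fn
        self._conn = None

    def connection(self):
        # Open only if we don't already hold an open connection.
        if self._conn is None:
            self._conn = self._connect_fn()
        return self._conn

    def close(self):
        if self._conn is not None:
            self._conn.close()
            self._conn = None
```

Each of the update/insert/delete methods then calls `connection()` instead of opening its own, and a single `close()` at shutdown tears it down.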
I'm using civetweb as a (websocket) server. I have handlers that query MySQL when data is received; a new thread is spawned for every request.
Until now I was using a single MySQL connection to the database, set up at the start of the program, in combination with the mongoose library. But with the threaded requests this causes headaches, since a MySQL connection isn't thread-safe between the call to mysql_query() and the call to mysql_store_result(). I have tried putting a mutex around these MySQL functions, but then performance drops tenfold (from ~750 requests/second to ~75 requests/second).
What is the correct way to handle this? I've heard about a 'connection pool', but it's hard to find simple examples with Google (or to wrap my head around a sane implementation).
It seems unlikely I'm the first person to encounter such a problem :).
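Since the question asks for a simple connection-pool example: the core idea is a fixed set of pre-opened connections behind a blocking queue, so each thread borrows one exclusively and no mutex is needed around individual queries. Here is a minimal sketch in Python for brevity (`connect_fn` is a placeholder factory standing in for real `mysql_init()`/`mysql_real_connect()` calls in the C API):

```python
import queue
from contextlib import contextmanager

class ConnectionPool:
    """Pre-opens a fixed number of connections; each thread borrows one
    exclusively, so no two threads ever share the same handle."""

    def __init__(self, connect_fn, size=8):
        self._pool = queue.Queue(maxsize=size)
        for _ in range(size):
            self._pool.put(connect_fn())

    @contextmanager
    def acquire(self, timeout=None):
        # Blocks until a connection is free
        # (raises queue.Empty if timeout expires).
        conn = self._pool.get(timeout=timeout)
        try:
            yield conn
        finally:
            # Return it to the pool instead of closing it.
            self._pool.put(conn)
```

With the pool sized to roughly the number of worker threads, each handler does its queries inside `with pool.acquire() as conn:` and the coarse mutex disappears; threads only contend briefly on the queue itself.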
I'm not sure if this will help you. I'd put it in a comment but I don't have enough reputation yet.
I had the same problem in VB.NET when I added multithreading to my application. To correct it, I made sure to call connection.Open before all of my queries and added "pooling=true;" to the end of my MySQL connection string. The connector will then determine whether it needs to open a new connection or can use an existing one.
I'm working on a website that interacts with the database several times on a single page. At the moment, I'm opening a connection once at the top of the script and closing it after every query (at the bottom of the script).
But I read an article stressing the importance of closing a connection after each query, reopening it if another query needs to be made.
This doesn't seem practical, since it would noticeably increase the execution duration.
On the other end of the scale, I know some people don't even close the connection.
So my question is:
Which is the best method in general? Keep the connection open throughout the whole script, or open/close it for each query? And is it worth closing a connection?
I understand there's other related topics, but they all seem to have conflicting answers.
Thank you!
It depends, buddy :). Some developers keep the connection open at peak times, but in general it's not nice to keep an (external) connection alive indefinitely.
A good practice is to keep the connection open for a timeout: for example, keep it open for 10 seconds, and after that close it if the connection has been idle.
I should also mention pooling: databases usually provide connection pools for external access. The database prepares a number of connections (say 256) in advance, so the setup time for the next connection is reduced, because it was prepared beforehand. The details depend on the database's behavior: one database may recycle a closed connection and use it again for another request, while another may create one connection and treat external requests as sessions.
Generally, though, the recommendation is not to keep a database connection open, because it ties up resources (external resources that you cannot manage) and that can reduce performance.
And +1 for a good question :).
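The idle-timeout idea from the answer above can be sketched as a small wrapper. This is an illustrative Python sketch (the `connect_fn` factory and the injectable `clock` are assumptions made so the example is self-contained, not part of any real driver API):

```python
import time

class IdleTimeoutConnection:
    """Keeps the connection open between queries, but drops it once it
    has been idle longer than `timeout` seconds (e.g. the 10 s above)."""

    def __init__(self, connect_fn, timeout=10.0, clock=time.monotonic):
        self._connect_fn = connect_fn  # placeholder connection factory
        self._timeout = timeout
        self._clock = clock            # injectable clock, eases testing
        self._conn = None
        self._last_used = 0.0

    def connection(self):
        now = self._clock()
        if self._conn is not None and now - self._last_used > self._timeout:
            self._conn.close()  # idle too long: drop the stale connection
            self._conn = None
        if self._conn is None:
            self._conn = self._connect_fn()
        self._last_used = now
        return self._conn
```

Queries issued in quick succession share one connection; after a quiet period the next call transparently reconnects.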
When we want to connect to a database from Visual Studio, should we always check (before running another query) whether our connection is open, close it, and open it again?
if (connection is open) {
    close connection;
}
open connection for another use;
Assuming you're using .NET, you should generally open and close the connection for each logical operation. If you're doing multiple queries in the same code path (i.e. you know you're about to do another query) then it makes sense to leave the connection open, but I wouldn't leave it open "just in case".
Bear in mind that the connection pooling in .NET makes it reasonably cheap to "open a connection" as the network connection will still be up anyway. Typically you'd use something like:
using (SqlConnection connection = new SqlConnection(...))
{
    connection.Open();
    using (SqlCommand command = new SqlCommand(sql, connection))
    {
        // Execute the command etc
    }
}
The using statement will "close the connection" (read: return it to the connection pool) automatically when you're done.
Keeping each query (or set of queries) logically separate means it's harder to end up with race conditions, threading issues etc. I wouldn't recommend keeping a connection as a global variable or anything like that.
Of course, if you're using something like LINQ to SQL the connection handling is likely to be mostly done for you.
It depends, but in most cases - the answer is no.
You should close the connection only if there is a relatively long time between queries.
And I personally think the connection should be closed if the queries are not semantically connected, i.e. they come from different workflows or other logically complete pieces.
Especially in your case: if you use connection pooling, it does not matter whether you close the connection or not,
BUT
in the case of improper use of transactions, or session state set with a SET statement that can interfere with future queries, I recommend closing the connection and reopening it, just to be sure that all forgotten transactions are committed or rolled back and the session state is reset to its defaults.
I have a program that constantly queries a MySQL server. Each time I access the server I make a connection and then query it. I wonder if I can actually save time by reusing the same connection and only reconnecting when the connection is closed, given that I can fit many of my queries within the duration of the connection timeout and my program has only one thread.
Yes, this is a good idea.
Remember to use the timeout so you don't leave connections open permanently.
Also, remember to close it when the program exits (even after exceptions).
Yes by all means, re-use the connection!
If you are also doing updates/deletes/inserts through that connection, make sure you commit (or roll back) your transactions properly, so that once you are "done" with the connection it is left in a clean state.
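That "leave it in a clean state" advice is easy to enforce with a small helper that always commits or rolls back. A minimal sketch, assuming a connection object with the usual `commit()`/`rollback()` interface (the object itself is a stand-in, not any particular driver):

```python
from contextlib import contextmanager

@contextmanager
def transaction(conn):
    """Commit on success, roll back on any exception, so the reused
    connection is never left holding an open transaction."""
    try:
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise  # re-raise so the caller still sees the failure
```

Every write then runs inside `with transaction(conn): ...`, and the shared connection comes out clean whether the block succeeded or failed.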
Another option would be to use a connection pooler.
Yes, you should reuse the connection, within reason. Don't leave a connection open indefinitely, but you can batch your queries together so that you get everything done, and then close it immediately afterwards.
Leaving a connection open too long means that under high traffic you might hit the maximum number of possible connections to your server.
Reconnecting often is just slow, causes a lot of unnecessary chatter, and is simply a waste.
Instead, you should look into using the mysql_pconnect function, which will create a persistent connection to the database. You can read about it here:
http://php.net/manual/en/function.mysql-pconnect.php
I'm in a situation where I need to make a call to a stored procedure from Rails. I can do it, but it either breaks the MySQL connection, or is a pseudo hack that requires weird changes to the stored procs. Plus the pseudo hack can't return large sets of data.
Right now my solution is to use system() and call the mysql command line directly. I'm thinking that a less sad solution would be to open my own MySQL connection independent of Active Record's connection.
I don't know of any reasons why this would be bad. But I also don't know the innards of MySQL well enough to know it's 100% safe.
It would solve my problem neatly, in that the controller actions that need to call a stored proc would open a fresh database connection, make the call, and close it. I might sacrifice some performance, but if it works, that's good enough. It also solves the issue of multiple users in the same process (we use mongrel, currently) in edge Rails, where it's now finally thread-safe, as the hack requires two SQL queries and I don't think I can guarantee I'm using the same database connection via Active Record.
So, is this a bad idea and/or dangerous?
Ruby on Rails generally eschews stored procedures, or implementing any other business logic in the database. One might say that you're not following "the Rails way" to be calling a stored proc in the first place.
But if you must call the stored proc, IMO opening a second connection from Ruby must be preferable to shelling out with system(). The latter method would open a second connection to MySQL anyway, plus it would incur the overhead of forking a process to run the mysql client.
You should check out "Enterprise Recipes with Ruby and Rails" by Maik Schmidt. It has a chapter on calling stored procedures from Rails.
MySQL can handle more than one connection per request, though it will increase the load on the database server. You should open the second connection in a 'lazy' manner, only when you are sure you need it on a given request.
Anyway, if performance were important in this application, you wouldn't be using Rails! >:-)
(joking!)
Considering how firmly RoR is intertwined with its own view of dbms usage, you probably should open a second connection to the database for any interaction it doesn't manage for you, just for SoC purposes if nothing else. It sounds from your description like it's the simplest approach as well, which is usually a strong positive sign.
Applications from other languages (esp. e.g. PHP) open multiple connections regularly (which doesn't make it desirable, but at least it demonstrates that mysql won't object.)
We've since tried the latest mysql gem from github and even that doesn't solve the problem.
We've patched the mysql adapter in Rails and that actually does work. All it does is make sure the MySQL connection has no more results before continuing on.
I'm not accepting this answer, yet, because I don't feel 100% that the fix is a good one. We haven't done quite enough testing. But I wanted to put it out there for anyone else looking at this question.