I'm using SQL Magic to connect to a db2 instance. However, I can't seem to find the syntax anywhere on how to close the connection when I'm done querying the database.
You cannot explicitly close a connection using Jupyter SQL Magic; in fact, that is one of the shortcomings of using Jupyter SQL Magic to connect to DB2. You need to end your session to close the DB2 connection. Hope this helps.
This probably isn't very useful, and to the extent it is, it isn't guaranteed to keep working in the future. But if you need a really hackish way to close the connection, I was able to do it this way (for a Postgres DB; I assume it's similar for DB2):
In[87]: connections = %sql -l
Out[87]: {'postgresql://ngd@node1:5432/graph': <sql.connection.Connection at 0x7effdbcf6b38>}
In[88]: conn = connections['postgresql://ngd@node1:5432/graph']
In[89]: conn.session.close()
In[90]: %sql SELECT 1
...
StatementError: (sqlalchemy.exc.ResourceClosedError) This Connection is closed
[SQL: SELECT 1]
[parameters: [{'__name__': '__main__', '__doc__': 'Automatically created module for IPython interactive environment', '__package__': None, '__loader__': None, '__s ... (123202 characters truncated) ... stgresql://ngd#node1:5432/graph']", '_i28': "conn = connections['postgresql://ngd#node1:5432/graph']\nconn.session.close()", '_i29': '%sql SELECT 1'}]]
A big problem is that reconnecting doesn't seem to work afterwards. Even after running %reload_ext sql and trying to connect again, it still thinks the connection is closed when you try to use it. So unless someone knows how to fix that behavior, this is only useful for disconnecting when you don't want to reconnect (to the same DB with the same parameters) before restarting the kernel.
You can also restart the kernel.
This is the simplest way I've found to close all connections at the end of a session. You must restart the kernel to be able to re-establish the connection.
connections = %sql -l
[c.session.close() for c in connections.values()]
Sorry for being late, but I've just started working with SQL Magic and got annoyed with the constant errors appearing. It's a bit of an awkward patch, but this helped me use it.
def multiline_qry(qry):
    try:
        %sql {qry}
    except Exception as ex:
        # ignore the spurious ResourceClosedError; report everything else
        if str(type(ex).__name__) != 'ResourceClosedError':
            template = "An exception of type {0} occurred. Arguments:\n{1!r}"
            message = template.format(type(ex).__name__, ex.args)
            print(message)
qry = '''DROP TABLE IF EXISTS EMPLOYEE;
CREATE TABLE EMPLOYEE(firstname varchar(50),lastname varchar(50));
INSERT INTO EMPLOYEE VALUES('Tom','Mitchell'),('Jack','Ryan');
'''
multiline_qry(qry)
Log out of the notebook first if you want to close the connection.
MySQL scenario:
When I execute "SELECT" queries in MySQL using multiple threads, I get the following message: "Commands out of sync; you can't run this command now". I found that this is due to the limitation of having to wait for ("consume") the results before making another query on the same connection. C++ example:
void DataProcAsyncWorker::Execute()
{
    std::thread(&DataProcAsyncWorker::Run, this).join();
}

void DataProcAsyncWorker::Run()
{
    sql::PreparedStatement *prep_stmt = c->con->prepareStatement(query);
    ...
}
Important:
I can't avoid using a separate thread per query (SELECT, INSERT, etc.), because the module I'm building, which is integrated with Node.js, "locks" the thread until the result is obtained. For this reason I need to run the query in the background (a new thread) and resolve the "promise" with the result obtained from MySQL.
Important:
I am keeping several connections open (for example, 10), and with each SQL call the function chooses one of them. That is, a connection pool that contains 10 established connections, e.g.:
for (int i = 0; i < 10; i++) {
    Com *c = new Com;
    c->id = i;
    c->con = openConnection();
    c->con->setSchema("gateway");
    conns.push_back(c);
}
The problem occurs when executing >= 100 SELECT queries per second. I believe that even with the connections balanced, 100 queries per second is a high number, and a given connection (e.g. conns.at(10)) may still be busy with a result that has not yet been consumed.
My question:
Does PostgreSQL have this limitation as well?
Note:
In the PHP docs for MySQL, mysqli_free_result is required after mysqli_query; if it is not called, I get a "Commands out of sync" error. In contrast, in the PostgreSQL documentation, pg_free_result is completely optional after pg_query.
That said, has anyone using PostgreSQL already faced problems related to "commands out of sync"? Maybe there is another name for this error?
Or is PostgreSQL able to deal with this automatically, freeing the result invisibly on the server side, which is why the error never appears?
You need to finish using one prepared statement (or cursor or similar construct) before starting another.
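With the Connector/C++ (JDBC-style) API shown in the question, that usually means fully reading and then releasing the ResultSet and PreparedStatement before issuing the next query on the same connection. A rough sketch under that assumption, not the asker's actual code (con stands for an open connection like the question's c->con):

#include <memory>
#include <string>
#include <cppconn/connection.h>
#include <cppconn/prepared_statement.h>
#include <cppconn/resultset.h>

// Runs one SELECT and leaves the connection with no pending result.
void runSelect(sql::Connection *con, const std::string &query)
{
    std::unique_ptr<sql::PreparedStatement> prep_stmt(con->prepareStatement(query));
    std::unique_ptr<sql::ResultSet> res(prep_stmt->executeQuery());

    while (res->next()) {
        // read the columns you need here, e.g. res->getString(1)
    }
    // res and prep_stmt are destroyed here, so the next statement on this
    // connection will not hit "Commands out of sync".
}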
"Commands out of sync" is often cured by adding the closing statement.
"Question:
Does PostgreSQL have this limitation as well?"
No, PostgreSQL does not have this limitation.
I am using the Mariaex.start_link method to establish a connection with a MySQL database, and it returns a pid. I was wondering what the best practice is for managing these pids: close and create new ones every time, or keep 1, 2, ..., n pid(s) around as needed?
Also, how would I close that connection or kill that pid? I tried Process.exit with :normal, which doesn't stop it, and I tried it with :kill, but I get an error (probably from Mariaex), and it doesn't seem clean to kill it that way.
Thanks!
You might refer to the Ecto codebase to see how it handles this case.
Basically, it starts a connection, executes a query and stops the Mariaex GenServer immediately after:
with {:ok, conn} <- Mariaex.start_link(opts) do
  value = Ecto.Adapters.MySQL.Connection.execute(conn, sql, [], opts)
  GenServer.stop(conn)
  value
end
There are a few examples out there, but none of them are very clear (or they target old versions).
I want to call a MySQL procedure and check the return status (in Rails 4.2). The most common method I've seen is to call result = ActiveRecord::Base.connection.execute("call example_proc()"), but in some places people wrote that there is a prepared method, result = ActiveRecord::Base.connection.execute_procedure("Stored Procedure Name", arg1, arg2) (however, it didn't compile).
So what is the correct way to call a MySQL procedure and get its status?
Edit:
And how do I send the parameters safely, where the first parameter is an integer, the second a string, and the third a boolean?
Rails 4's ActiveRecord::Base doesn't support an execute_procedure method, though ActiveRecord::Base.connection.execute still works, i.e.:
result = ActiveRecord::Base.connection.execute("call example_proc('#{arg1}','#{arg2}')")
You can try Vishnu's approach below,
or
You can also try
ActiveRecord::Base.connection.exec_query("call example_proc('#{arg1}','#{arg2}')")
Here is the documentation.
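Regarding the edit about passing the integer, string, and boolean parameters safely: interpolating raw values as above is open to SQL injection. A hedged sketch, assuming Rails 4.2 and reusing the hypothetical example_proc, that quotes the values before building the CALL:

# sanitize_sql_array quotes each value and substitutes it for the matching "?"
sql = ActiveRecord::Base.send(:sanitize_sql_array,
        ["call example_proc(?, ?, ?)", arg1.to_i, arg2.to_s, arg3 ? 1 : 0])
result = ActiveRecord::Base.connection.execute(sql)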
In general, you should be able to call stored procedures in a regular where or select method for a given model:
YourModel.where("YOUR_PROC(?, ?)", var1, var2)
As for your comment "Bottom line I want the most correct approach with procedure validation afterwards (for warnings and errors)", I guess it always depends on what you actually want to implement and how readable you want your code to be.
For example, if you want to return rows of YourModel attributes, then it would probably be better to use the above statement with the where method. On the other hand, if you are writing some SQL adapter, then you might want to go down to the ActiveRecord::Base.connection.execute level.
BTW, there is something about stored procedure performance that should be mentioned here. Several databases optimize a stored procedure on its first run. However, the parameters you pass to that first run might not be the ones it will be running with most frequently later on. As a result, your stored procedure might be auto-optimized in a way that is non-optimal for your case. It may or may not happen this way, but it is something you should consider while using stored procedures with dynamic parameters.
I believe you have tried many other solutions and got one error or another, mostly "out of sync" or "closed connection" errors. These errors occur every SECOND time you try to execute the queries. To overcome this, we need a workaround that makes the connection effectively new every time. Here is my solution, which didn't throw any errors.
# check out a connection for the model
conn = ModelName.connection_pool.checkout
# use the new connection to execute the query
@records = conn.execute("call proc_name('params')")
# check the connection back in
ModelName.connection_pool.checkin(conn)
The other approaches failed for me, possibly because ActiveRecord checks connections out and in automatically for each thread. When our method tries to check out a connection just to execute the stored procedure, it might conflict, since there will already be an active connection at the moment the method starts.
So the idea is to manually check out a connection for the model (rather than per thread/function) from the pool, and check it back in once the work is done. This worked great for me.
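Building on that, here is a hedged sketch that wraps the checkout/checkin in begin/ensure, so the connection goes back to the pool even if the call raises (proc_name is the same placeholder as above):

conn = ModelName.connection_pool.checkout
begin
  @records = conn.execute("call proc_name('params')")
ensure
  # always return the connection, even when the procedure call fails
  ModelName.connection_pool.checkin(conn)
end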
I have made a script using the following library:
http://www.autohotkey.com/board/topic/72629-mysql-library-functions/
to connect to my non-local database. However, I'm running into max_user_connections problems, and I think this is because I never close the database connection.
I can't seem to find a way to do that using this library, but I'm not certain; maybe there's something built into AHK that can close any connection to the internet or to a database, or anything else that would work?
Script:
hi() {
    mysql := new mysql
    db := mysql.connect("x","x","x","x") ; host, user, password, database
    if db =
        return
    sql =
    (
    UPDATE something
    SET yo = yo+1
    WHERE id = 1
    )
    result := mysql.query(db, sql)
}
Thanks in advance
The DLL used by that AHK library has the mysql_close function, but it isn't exposed in the AHK library itself.
You can technically call the DLL manually, just like the AHK library does, and see if it'll work.
Since I also need to connect to a MySQL DB via AHK, I'll update this answer when a full solution is available.
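As a rough, untested sketch of that manual call, assuming the library has already loaded libmysql.dll and that the value returned by mysql.connect() is the raw MYSQL* handle:

; close the server connection directly via the MySQL client DLL
DllCall("libmysql.dll\mysql_close", "Ptr", db)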
I've connected to a MySQL database using Perl DBI. I would like to find out which database I'm connected to.
I don't think I can use:
$dbh->{Name}
because I call USE new_database and $dbh->{Name} only reports the database that I initially connected to.
Is there any trick or do I need to keep track of the database name?
Try just executing the query
select DATABASE();
From what I could find, the DBH has access to the DSN that you initially connected with, but not after you made the change. (There's probably a better way to switch databases.)
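With DBI that might look like this (a minimal sketch, assuming $dbh is your existing handle):

# ask the server which database this handle is currently using
my ($dbname) = $dbh->selectrow_array('SELECT DATABASE()');
print "Current database: $dbname\n";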
$dbh->{Name} returns the db name from your db handle.
If you connected to another DB after connecting with your dbh, using the MySQL query "USE db_name", and you did not set up a new Perl DBI handle, then of course $dbh->{Name} will return the database you first connected to... it's not spontaneous generation.
So, to get the connected DB name once the DB handle is set up (for DBI with MySQL):
sub get_dbname {
    my ($dbh) = @_;
    my $connected_db = $dbh->{Name};
    $connected_db =~ s/^dbname=([^;].*);host.*$/$1/;
    return $connected_db;
}
You can ask mysql:
($dbname) = (each %{$dbh->selectrow_hashref("show tables")}) =~ /^Tables_in_(.*)/;
Update: obviously select DATABASE() is a better way to do it :)
When you create a connection object, it is for a certain database, in DBI's case anyway. I don't believe running the SQL USE database_name will affect your connection instance at all. Maybe there is a select_db function (my DBI is rusty) for the connection object, or you'll have to create a new connection to the new database for the connection instance to report it properly.
FWIW - probably not much - DBD::Informix keeps track of the current database, which can change if you do operations such as CREATE DATABASE. The $dbh->{Name} attribute is specified by the DBI spec as the name used when the handle is established. Consequently, there is an Informix-specific attribute $dbh->{ix_DatabaseName} that provides the actual current database name. See: perldoc DBD::Informix.
You could consider requesting that the maintainer(s) of DBD::mysql add a similar attribute.