I have a Message object to which log entries are added as the message is processed.
Domain class Message has:

SortedSet messageLogEntries
static hasMany = [messageLogEntries: MessageLogEntry]

void addLogEntry(String entry) {
    def mle = new MessageLogEntry(logEntry: entry)
    this.addToMessageLogEntries(mle)
    this.save(failOnError: true, flush: true)
    log.debug(entry)
}
I can step through the code: the entry is created, saved, and allocated an id, but when I query the database in MySQL the entry is not there.
This was working, but stopped working after I upgraded from MySQL 5.5 to 5.6.10.
Please help.
If this code is being executed inside a Hibernate transaction, you will not see it in the DB until the transaction is committed. Are you running this code inside an integration test or from a transactional execution stream (e.g. a service)?
In FastAPI, I need to dynamically connect to a database after a POST request, i.e. in the POST request body I receive a database_name that I need to connect to.
So I have tried this:
import databases

@app.post("/computers/", response_model=Computer)
async def create_computer(computer: ComputerIn):
    DATABASE_URL = "postgresql://username:password@localhost/" + computer.database_name
    database = databases.Database(DATABASE_URL)
    database.connect()
    ...
But I get the following error:
File "/home/.local/lib/python3.8/site-packages/databases/backends/postgres.py", line 169, in acquire
    assert self._database._pool is not None, "DatabaseBackend is not running"
AssertionError: DatabaseBackend is not running
Any idea why this might not work?
Thanks
Two things:
You forgot the await keyword. Here's an example taken from the project's GitHub page: https://github.com/encode/databases
# Create a database instance, and connect to it.
from databases import Database
database = Database('sqlite+aiosqlite:///example.db')
await database.connect()
# Create a table.
query = """CREATE TABLE HighScores (id INTEGER PRIMARY KEY, name VARCHAR(100), score INTEGER)"""
await database.execute(query=query)
It is not good practice to connect to a database on every request. It is better to connect once at application startup and then share the connection pool, so that a new connection is not created on every request. Instead, the currently active connection pool is reused, wasting fewer resources.
See https://fastapi.tiangolo.com/advanced/events/ for further information.
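As a minimal sketch of that pattern (assuming a single, fixed DATABASE_URL rather than the per-request database_name from your question; the endpoint body and its query are illustrative only):

from databases import Database
from fastapi import FastAPI

# Placeholder URL; substitute your own credentials and database name.
DATABASE_URL = "postgresql://username:password@localhost/example_db"

app = FastAPI()
database = Database(DATABASE_URL)

@app.on_event("startup")
async def startup():
    # Open the shared connection pool once, when the application starts.
    await database.connect()

@app.on_event("shutdown")
async def shutdown():
    # Release the pool when the application shuts down.
    await database.disconnect()

@app.post("/computers/")
async def create_computer(payload: dict):
    # Reuse the already-connected pool instead of connecting per request.
    rows = await database.fetch_all(query="SELECT 1 AS ok")
    return {"received": payload, "db_check": rows[0]["ok"]}

If the database name genuinely has to come from the request body, one compromise is to keep a small dictionary of already-connected Database instances keyed by database name, so each database is still only connected once rather than on every request.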
Can anyone please provide advice on how to enlist in an MVCC session from SSIS?
Reading from an Ingres DB, we have a requirement to enable MVCC and specify the isolation level from within an SSIS 2008 R2 package.
An existing application runs over this database that DOES NOT use MVCC, and hence it is not appropriate to simply enable MVCC on the existing DBMS. The reason we want our reads to enlist in MVCC is to ensure we do not cause locks and break this existing application (as currently periodically happens when we do not use MVCC to perform these reads).
DB version is Ingres II 10.0.0 (su9.us5/132)
ADO.NET driver is Ingres.Client, Version=2.1.0.0 (Ingres.Client.IngresConnection).
We have a similar requirement to do this programmatically from within TIBCO BusinessWorks, and interactively via e.g. SQuirreL SQL, and meet that need by issuing the following commands via direct SQL execution (via JDBC):
SET LOCKMODE SESSION WHERE LEVEL = MVCC;
SET SESSION ISOLATION LEVEL READ COMMITTED;
In SSIS we can set the session isolation level using the IsolationLevel property of the task/sequence. But I can find no means of issuing the MVCC command directly.
I have attempted to issue the command via an Execute SQL Task step, but I encounter the following error:
Syntax error on line 1. Last symbol read was: 'SET LOCKMODE'
What I've tried, to no avail:
- With or without the terminating ;
- Execute step placed within or outside of a sequence
- Enabling the DelayValidation property, at both the sequence and step level
- Various TransactionOption settings at the sequence and task level (in case they mattered!)
- Setting the lockmode via a Windows environment variable ING_SET = "SET LOCKMODE SESSION WHERE LEVEL = MVCC". But my testing shows this is not honoured by the ADO.NET driver we're using in SSIS (nor, incidentally, is it honoured by the JDBC driver we use for SQuirreL SQL or TIBCO). I believe this is probably an ODBC feature.
- Issuing the command from within an ADO.NET source step within a dataflow. Same syntax error.
- [UPDATE] Also tried wrapping the SET ... commands in an Ingres procedure, but this resulted in syntax errors suggesting the SET ... command is not valid anywhere within a procedure.
Can anyone please provide advice on how to enlist in an MVCC session from SSIS?
At this stage (I believe) we're constrained to the ADO.NET driver, but if there's no other option than to go with ODBC, then so be it.
Answering my own question here.
Two possible approaches were conceived.
1. Use a Script Component (within a Data Flow step)
From within a script component, I was able to issue the SET ... commands directly via ADO.NET.
The problem with this approach was that I wasn't able to retain the connection on which these commands had been run, for subsequent (or parallel, within the same dataflow) ADO.NET source components.
Attempting to work via a specific connection via transactions was no good, because these commands must be issued outside of an ongoing transaction.
So ultimately I had to also issue the source select from within this component, which even then is less than ideal as the subsequent destination insert operation could then not enlist in the same transaction as the source select.
The solution using this approach ended up being:
- Using MVCC, copy the data from a source view into a temp staging table on the source system.
- Then using a transaction, read from the source staging table into the destination system.
The code looks something like this (NB: I had to explicitly add a reference to Ingres .NET Data Provider\v2.1\Ingres.Client.dll):
/* Microsoft SQL Server Integration Services Script Component
 * Write scripts using Microsoft Visual C# 2008.
 * ScriptMain is the entry point class of the script. */
using System;
using System.Data;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;
using Ingres.Client;
using System.Collections.Generic;

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    private bool debug = true;
    private IDTSConnectionManager100 cm;
    private IngresConnection conn;

    public override void AcquireConnections(object Transaction)
    {
        // The connection manager used here must be configured in the Script Component editor's "Connection Managers" page.
        // "Connection" is the (default) strongly typed name for the first connection added.
        // In this case, it needs to be a reference to the xxxxx connection manager (by convention it should be "xxxxx_ADONET").
        cm = this.Connections.Connection;
        conn = (IngresConnection)cm.AcquireConnection(Transaction);
    }

    public override void PreExecute()
    {
        debugMessage("PreExecute", "Started");
        base.PreExecute();

        string viewName = Variables.vViewName;

        List<string> sqlCommandStrings = new List<string>();
        if (Variables.vUseIngresMVCC)
        {
            sqlCommandStrings.Add("SET LOCKMODE SESSION WHERE LEVEL = MVCC");
        }
        sqlCommandStrings.Add("SET SESSION ISOLATION LEVEL READ COMMITTED");
        sqlCommandStrings.Add(String.Format("MODIFY {0}_STAGING TO TRUNCATED", viewName));
        sqlCommandStrings.Add(String.Format("INSERT INTO {0}_STAGING SELECT * FROM {0}", viewName));

        foreach (string sqlCommandString in sqlCommandStrings)
        {
            debugMessage("PreExecute", "Executing: '{0}'", sqlCommandString);
            IngresCommand command = conn.CreateCommand();
            command.CommandText = sqlCommandString;
            int rowsAffected = command.ExecuteNonQuery();
            string rowsAffectedString = rowsAffected >= 0 ? rowsAffected.ToString() : "No";
            debugMessage("PreExecute", "Command executed OK, {0} rows affected.", rowsAffectedString);
        }

        debugMessage("PreExecute", "Finished");
    }

    public override void CreateNewOutputRows()
    {
        // SSIS requires that we output at least one row from this source script.
        Output0Buffer.AddRow();
        Output0Buffer.CompletedOK = true;
    }

    public override void PostExecute()
    {
        base.PostExecute();
        // NB While it is "best practice" to release the connection here, doing so with an Ingres connection will cause a COM exception.
        // This exception kills the SSIS BIDS designer such that you'll be unable to edit this code through that tool.
        // Re-enable the following line at your own peril.
        //cm.ReleaseConnection(conn);
    }

    private void debugMessage(string method, string messageFormat, params object[] messageArgs)
    {
        if (this.debug)
        {
            string message = string.Format(messageFormat, messageArgs);
            string description = string.Format("{0}: {1}", method, message);
            bool fireAgain = true;
            this.ComponentMetaData.FireInformation(0, this.ComponentMetaData.Name, description, "", 0, ref fireAgain);
        }
    }
}
Answering my own question here.
Two possible approaches were conceived.
2. Set up a dedicated MVCC-enabled process of the Ingres DBMS over the existing database, and connect via this
This is the approach we're currently pursuing (as it is supported, and ideally transparent). I will update with details once they are known.
I'm getting an OptimisticLockException when I try to update a managed JPA entity.
The EJB was fetched via:
port = entityManager.find(PortEntity.class, portID);
and then the entity and the entityManager have been passed to a SAX ContentHandler so that the entity can be updated in the endDocument() method. The ContentHandler extracts the time zone information from the data returned by Google's Time Zone API server(s).
The code snippet is:
entityManager.refresh(port);
if (entityManager.contains(port))
    log.info("Contained: " + port);
else
    log.info("NOT Contained: " + port);

port.setTimezone(toTimezone);
entityManager.flush(); // <-- Line 70
And the log file shows:
13:48:05,568 INFO [GeotimezoneHandler] Status: OK
13:48:05,569 INFO [GeotimezoneHandler] Raw offset: 3600.0000000
13:48:05,570 INFO [GeotimezoneHandler] DST offset: 0.0000000
13:48:05,570 INFO [GeotimezoneHandler] Timezone ID: Europe/Madrid
13:48:05,571 INFO [GeotimezoneHandler] Timezone Name: Central European Standard Time
13:48:05,577 INFO [GeotimezoneHandler] Contained: SeaPort[id=ESBCN, name=Barcelona]
13:48:05,591 ERROR [GeotimezoneHandler] Updating curise: javax.persistence.OptimisticLockException: org.hibernate.StaleObjectStateException: Row was updated or deleted by another transaction (or unsaved-value mapping was incorrect): [com.nutrastat.voyager.entity.PortEntity$Sea#ESBCN]
at org.hibernate.ejb.AbstractEntityManagerImpl.wrapStaleStateException(AbstractEntityManagerImpl.java:1390) [hibernate-entitymanager-4.0.1.Final.jar:4.0.1.Final]
at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1308) [hibernate-entitymanager-4.0.1.Final.jar:4.0.1.Final]
at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1289) [hibernate-entitymanager-4.0.1.Final.jar:4.0.1.Final]
at org.hibernate.ejb.AbstractEntityManagerImpl.convert(AbstractEntityManagerImpl.java:1295) [hibernate-entitymanager-4.0.1.Final.jar:4.0.1.Final]
at org.hibernate.ejb.AbstractEntityManagerImpl.flush(AbstractEntityManagerImpl.java:976) [hibernate-entitymanager-4.0.1.Final.jar:4.0.1.Final]
at org.jboss.as.jpa.container.AbstractEntityManager.flush(AbstractEntityManager.java:439) [jboss-as-jpa-7.1.1.Final.jar:7.1.1.Final]
at com.nutrastat.voyager.util.GeotimezoneHandler.endDocument(GeotimezoneHandler.java:70) [voyager-lib.jar:]
So if the entityManager contains the entity, why do I get the exception after modifying it?
As always many thanks for your help
Steve
P.S.
I have looked at this thread, and the MySQL database is using InnoDB, but I don't know how to execute the SELECT @@tx_isolation; command from within my code.
After two days of research I finally found what the problem was.
The entity class inherited a javax.persistence.Version field from its superclass. I had also hand-injected data into the table, and because the version column was defined as allowing nulls I had not bothered to insert a value for it, but one was needed.
using:
Python 2.7.3
SQLAlchemy 0.7.8
PyODBC 3.0.3
I have implemented my own Dialect for the EXASolution DB using PyODBC as the underlying db driver. I need to make use of PyODBC's output_converter function to translate DECIMAL(x, 0) columns to integers/longs.
The following code snippet does the trick:
pyodbc = self.dbapi
dbapi_con = connection.connection
dbapi_version = dbapi_con.getinfo(pyodbc.SQL_DRIVER_VER)
(major, minor, patch) = [int(x) for x in dbapi_version.split('.')]
if major >= 3:
    dbapi_con.add_output_converter(pyodbc.SQL_DECIMAL, self.decimal2int)
I have placed this code snippet in the initialize(self, connection) method of
class EXADialect_pyodbc(PyODBCConnector, EXADialect):
The code gets called and no exception is thrown, but this is a one-time initialization. Later on, other connections are created, and these connections do not pass through my initialization code.
Does anyone have a hint on how connection initialization works with SQLAlchemy, and where to place my code so that it gets called for every new connection created?
This is an old question, but something I hit recently, so an updated answer may help someone else along the way. In my case, I was trying to automatically downcase mssql UNIQUEIDENTIFIER columns (guids).
You can grab the raw connection (pyodbc) through the session or engine to do this:
from pyodbc import SQL_DECIMAL
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine(connection_string)
make_session = sessionmaker(engine)
...
session = make_session()
session.connection().connection.add_output_converter(SQL_DECIMAL, decimal2int)

# or
connection = engine.connect().connection
connection.add_output_converter(SQL_DECIMAL, decimal2int)
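For the original question (having the converter applied to every connection the pool creates, not just the one you happen to grab), one option that should work is SQLAlchemy's connection events. A sketch, assuming decimal2int is the converter function from the question and connection_string is your ODBC URL:

import pyodbc
from sqlalchemy import create_engine, event

engine = create_engine(connection_string)

@event.listens_for(engine, "connect")
def setup_output_converters(dbapi_connection, connection_record):
    # Runs once for every new DBAPI (pyodbc) connection the pool opens,
    # so connections created later get the converter as well.
    dbapi_connection.add_output_converter(pyodbc.SQL_DECIMAL, decimal2int)

Within a custom dialect, the Dialect.on_connect() hook plays a similar per-connection role and may be a better fit than initialize() for this kind of setup.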
I have two databases with the same schema on a SQL Server 2008 R2 instance, named Database1 and Database2. I connected and performed queries against Database1, and then changed to Database2 to fetch my entities using the following code:
this.ConnectionString = "Server=TestServer; Database=Database2;Trusted_Connection=true";

using (IDataAccessAdapter adapter = new DataAccessAdapter(this.ConnectionString))
{
    var entities = new EntityCollection<T>();
    adapter.FetchEntityCollection(entities, null);
    return entities;
}
(The connection string was set before executing the code).
I debugged the application and looked at the value of the connection string; it pointed to Database2.
However, when I executed the above code, the result was returned from Database1. And when I looked at SQL Profiler, the statement was executed against Database1.
So, does anyone know what is going on? Why was the query executed against Database1 and not Database2?
PS: If I used the above connection string with plain ADO.NET, I was able to retrieve data from Database2.
Thanks in advance.
I have figured out what was going on. The reason is that by default LLBLGen Pro uses fully qualified names like [database1].[dbo].[Customer] to access database objects, and the catalog is specified when the entities are generated. So you can't access objects in another catalog just by changing the connection string.
Hence, to change to another database you have to override the default catalog name, using code like the following:
var adapter = new DataAccessAdapter(ConnectionString, false,
    CatalogNameUsage.ForceName, DbName)
{
    CommandTimeOut = TenMinutesTimeOut
};
More information can be found at the following link