I just found a strange problem in Hibernate.
My Java EE web project uses the Hibernate framework together with the json-plugin. My code looks like this:
private User user;
// getters and setters ...

public String getUser() {
    if (findUser(...) != null) {
        user = findUser(...);
        user.setPassword(""); // important: do not send the password to the front end
        return "success";
    } else {
        return "error";
    }
}
The problem is that when this code executes, the user's password is cleared in the database, even though I am sure no update or insert is ever triggered explicitly.
I want to know why. Can anyone figure it out? Thanks!
That's the basic principle of an ORM like Hibernate: you manipulate objects that are mapped to database tables and attached to a persistent session, and every change you make to these objects is automatically and transparently recorded in the database.
If you don't want your changes to the User object to be recorded in the database, you first need to detach the object from the Hibernate session using session.evict(user).
You don't seem to have grasped the basic (and very important) principles of Hibernate. Read its excellent documentation.
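As a rough sketch of what that could look like (session and findUser(id) stand in for whatever session access and lookup helpers you already have):

// Minimal sketch, not your exact code: "session" and "findUser(id)" are assumed helpers.
User found = findUser(id);
if (found != null) {
    session.evict(found);   // detach the entity from the Hibernate session first
    found.setPassword("");  // this change is no longer tracked, so nothing is flushed to the DB
    user = found;
    return "success";
}
return "error";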
I am new to the Broadleaf application. I am able to run the application with the Tomcat + MySQL integration without problems. Now I want to move on with development and customize the site project to my requirements.
I am stuck on persistence in the Broadleaf site module. I have tried using em.merge, which returns my entity but does not save it in the database, and I have also tried @Transactional(value="blTransactionManager"), but the problem persists. I have tried the code below in applicationContext-servlet.xml:
<aop:config>
    <aop:pointcut id="blMyStoreDao" expression="execution(* com.mycompany.dao.StoreDaoImpl.save*(..))"/>
    <aop:advisor advice-ref="blTxAdvice" pointcut-ref="blMyStoreDao"/>
</aop:config>
Here is my controller code:
newStore.setCustomer(customer);
newStore.setProductList(new ArrayList<ProductImpl>());
Store getStore = store.save(em, newStore);
System.out.println(getStore.getCustomer().getUsername());
System.out.println("customer fetched: "+customer.getEmailAddress());
Here is my DAO implementation code:
@Repository("blMyStoreDao")
@Transactional(value = "blTransactionManager")
public class StoreDaoImpl implements StoreDao {

    @PersistenceContext(unitName = "blPU")
    protected EntityManager em;

    @Transactional(value = "blTransactionManager")
    public Store save(EntityManager em, Store store) {
        System.out.println(em);
        System.out.println(store.getCustomer().getUsername());
        Store s = em.merge(store);
        return s;
    }
}
But that didn't resolve my issue either.
The code runs exactly as it should, but it doesn't save my entity in the database.
Can anybody help? Thanks in advance.
There isn't any reason to use <aop:config>, especially in applicationContext-servlet.xml (if anywhere, it should be in the root application context).
You should use @Transactional(TransactionUtils.DEFAULT_TRANSACTION_MANAGER) to annotate your method.
It is likely that your class was not being scanned by Spring. In Broadleaf, there is a default component scan set up in applicationContext.xml to scan com.mycompany.core.
I would recommend verifying that your DAO is actually scanned by Spring and is initialized as a Spring bean. The fact that the entity manager did not get injected indicates that it did not get loaded by Spring correctly. One way to verify this would be to add an @PostConstruct method and print something or set a breakpoint to verify that it gets hit.
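A minimal sketch of what that verification could look like (the import paths, the Broadleaf TransactionUtils constant mentioned above and the StoreDao signature are assumptions, so adjust them to your Broadleaf version):

import javax.annotation.PostConstruct;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import org.broadleafcommerce.common.util.TransactionUtils;
import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

@Repository("blMyStoreDao")
public class StoreDaoImpl implements StoreDao {

    @PersistenceContext(unitName = "blPU")
    protected EntityManager em;

    @PostConstruct
    public void init() {
        // If this line never appears in the log, the class is not being component-scanned by Spring.
        System.out.println("StoreDaoImpl created by Spring, em = " + em);
    }

    @Transactional(TransactionUtils.DEFAULT_TRANSACTION_MANAGER)
    public Store save(Store store) {
        return em.merge(store); // merge returns the managed copy; the transaction flushes it on commit
    }
}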
Update: I think this is down to the Windsor configuration. Does anyone have any idea as to what I have not configured correctly with Windsor?
I am currently using Envers within a C# WebApi project. Windsor is used for IoC.
I have a custom RevisionEntity which adds a User property to record the user who made the data change.
To ensure all configuration was correct, I started off with a "simple string here" being set in the NewRevision method:
public class AuditRevisionListener : IRevisionListener
{
    public void NewRevision(object revisionEntity)
    {
        ((AuditRevision)revisionEntity).User = "Simple string here";
    }
}
and all persisted as expected.
The next step is to persist a full User object, for which I need to obtain the UserService:
public class AuditRevisionListener : IRevisionListener
{
    public void NewRevision(object revisionEntity)
    {
        var userServices = (IUserServices)GlobalConfiguration.Configuration.DependencyResolver.GetService(typeof(IUserServices));
        var user = userServices.GetRequestingUser();
        ((AuditRevision)revisionEntity).User = user;
    }
}
However, the DependencyResolver.GetService call is throwing the error:
"Cannot access a disposed object. Object name: 'Scope cache was already disposed. This is most likely a bug in the calling code.'. "
UPDATE
I have now created a demo project available at https://github.com/ScottFindlater/WindsorEnversIssue
On first setting up the solution all will run fine because the custom Envers RevisionListener is not performing any dependency resolving.
Run the solution, which performs a GET to the HomeController that simply loads one User and modifies another.
Dependency resolving is shown to be working as there is an ActionFilter called DependencyResolverDoesWork which successfully resolves the UserServices.
Envers is shown to be working as the UserAudit table is populated.
To “turn on” the dependency resolving in the custom RevisionListener, navigate to the Domain NHibernate project, Auditing folder, AuditRevisionListener class, NewRevision method, and uncomment the 2 lines of code.
Do a full rebuild, run the solution again, and the project will throw a runtime exception in the WindsorDependencyResolver class, GetService method, with “Cannot access a disposed object”; clicking the View Detail action expands this message to “{"Cannot access a disposed object.\r\nObject name: 'Scope cache was already disposed. This is most likely a bug in the calling code.'."}”.
The comment posted by Roger (thank you so much), which suggests changing the lifestyle to Singleton, does work. However, this demo has been purposely kept simple, and the PerWebRequest lifestyle is needed because the ApplicationServices in the real project have context-related data injected, such as the requesting user, which is used to enforce security.
I am stuck now, and any pointers/answers as to what I have set up wrong will be gratefully received. In addition, I know this has been posted on SO and the Envers forum; I WILL update the answer on both.
I haven't tried to run your sample, but I think this is down to an interplay between the two http modules defined in your web.config (https://github.com/ScottFindlater/WindsorEnversIssue/blob/master/API%20Endpoints/Web.config)
Castle.MicroKernel.Lifestyle.PerWebRequestLifestyleModule - Controls the lifetime of "per web request" components
APIEndpoints.HttpModules.NHibernateSessionCoordinator - Opens a session and begins a transaction at the beginning of each web request, then commits the transaction and disposes the session at the end of the web request
It is at the point where you commit your transaction - at the end of the request, triggered by NHibernateSessionCoordinator, that any changes you've made to objects within your NHibernate ISession actually get written to the database. This is the point at which Envers does its stuff and, in turn, at which you attempt to resolve IUserService from your Windsor container. The exception is thrown because IUserService is registered with the "per web request" lifestyle and Windsor is treating the current web request as complete and has disposed any objects tied to the request.
Have you tried reversing the order in which the HttpModules are defined, e.g. NHibernateSessionCoordinator before PerWebRequestLifestyleModule? This will result in your NHibernate transaction being committed before per web request components are disposed.
I have the following setup:
Spring 3.0.5
Hibernate 3.5.6
MySQL 5.1
To save a record in the DB via Hibernate, I have the following workflow:
Send JSON {id:1,name:"test",children:[...]} to the Spring MVC app and use Jackson to transform it into an object graph (if it is an existing instance, the JSON has the proper ID of the record in the DB set).
Save the object in the DB via a service-layer call (details below).
The save function of the service-layer interface SomeObjectService has the @Transactional annotation on it with readOnly=false and propagation REQUIRED.
The implementation of this service layer, SomeObjectServiceImpl, calls the DAO save method.
The DAO saves the new data via a call to Hibernate's merge, e.g. hibernateTemplate().merge(someObj).
Hibernate's merge first loads the object from the DB via a SELECT.
I have an EntityListener that is wired to Spring (I used the technique from Spring + EntityManagerFactory + Hibernate Listeners + Injection) and listens to @PostLoad.
The listener uses a LockingService to update one field of someObject to mark it as locked (this should actually only happen when someObject is loaded via Hibernate HQL, SQL or Criteria calls, but it also gets called on merge); a sketch of this wiring follows the list.
The LockingService has a function lock(someObj, userId) which is also annotated with @Transactional with readOnly=false and propagation REQUIRED.
The update happens via Query query = sess.createQuery("update someObj set lockedBy=:userId"); followed by query.executeUpdate();.
After merge has loaded the data, it starts updating someObject and inserting the relevant children (<= exactly here is the point where the deadlock happens).
Return the JSON result (this also includes the newly created object ID) back to the client.
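For reference, a minimal sketch of the listener and locking service described above (SomeObject, the where clause and the user id are placeholders; only the annotations and the update-query pattern mirror the description):

import javax.persistence.PostLoad;
import org.hibernate.Query;
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

public class SomeObjectListener {

    @Autowired
    private LockingService lockingService; // injected via the Spring + EntityManagerFactory technique

    @PostLoad
    public void onPostLoad(SomeObject someObj) {
        // Fires on HQL/SQL/Criteria loads, but also on the SELECT issued by merge().
        lockingService.lock(someObj, 1L); // user id hard-coded only for this sketch
    }
}

@Service
class LockingService {

    @Autowired
    private SessionFactory sessionFactory;

    @Transactional(readOnly = false, propagation = Propagation.REQUIRED)
    public void lock(SomeObject someObj, Long userId) {
        Session sess = sessionFactory.getCurrentSession();
        // Paraphrased from the description; the real query presumably targets the loaded row.
        Query query = sess.createQuery("update SomeObject set lockedBy = :userId where id = :id");
        query.setParameter("userId", userId);
        query.setParameter("id", someObj.getId());
        query.executeUpdate(); // this UPDATE inside the outer merge is where the contention begins
    }
}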
The problem, as it seems to me, is that first
the record gets loaded in a transaction,
then gets changed in another (inner) transaction,
and then should get updated again with the data of the outer transaction, but can't be updated because it is locked.
I can see via MySQL's
SHOW OPEN TABLES
that a child table (that is part of the object graph) is locked.
An interesting fact is that the deadlock doesn't occur on the someObj table but rather on a table that represents a child.
I am a bit lost here. Any help is more than welcome.
BTW, could the isolation level maybe get me out of this problem here?
I ended up using @Bozho's HibernateExtendedJpaDialect, which is explained here >> Hibernate, spring, JPS & isolation - custom isolation not supported, to set the isolation to READ_UNCOMMITTED:
@Transactional(readOnly = false, propagation = Propagation.REQUIRED, isolation = Isolation.READ_UNCOMMITTED)
public Seizure merge(Seizure seizureObj);
Not a very nice solution, I know, but at least it solved my problem.
If somebody wants a detailed description, please ask...
I don't know the solution to the problem, but I would not have a transactional lock method. If you need to lock something manually at all, do it within another transactional service method.
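As a rough sketch of that suggestion, reusing the names from the question (not a verified fix, and the DAO is assumed to expose a plain merge method):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

@Service
public class SomeObjectServiceImpl implements SomeObjectService {

    @Autowired
    private SomeObjectDao someObjectDao;

    @Autowired
    private LockingService lockingService;

    @Transactional(readOnly = false, propagation = Propagation.REQUIRED)
    public SomeObject save(SomeObject someObj, Long userId) {
        SomeObject merged = someObjectDao.merge(someObj);
        // lock(...) is a plain, non-@Transactional method here, so it simply joins this transaction
        lockingService.lock(merged, userId);
        return merged;
    }
}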
I've written an ASP.NET MVC 3 application using the Code First paradigm, whereby when I make a change to the model, the Entity Framework automatically attempts to re-create the underlying SQL Server database via DROP and CREATE statements. The problem is that the application is hosted on a third-party remote server which limits the number of databases I can have and does not seem to allow me to programmatically execute "CREATE DATABASE..." statements, as I gather from this error message:
CREATE DATABASE permission denied in database 'master'.
Is there any way to stop the Entity Framework from dropping and attempting to re-create the whole database and instead make it simply drop the tables and re-create them?
After creating the database manually and running the application, I also get the following error, I guess as the Entity Framework tries to modify the database:
Model compatibility cannot be checked because the database does not contain model metadata. Ensure that IncludeMetadataConvention has been added to the DbModelBuilder conventions.
UPDATE: Found this gem through Google; it sounds like it's exactly what you need: http://nuget.org/Tags/IDatabaseInitializer
You can use a different database initializer. Let's say your context is called SampleContext; then your constructor would look like this:
public SampleContext()
{
    System.Data.Entity.Database.SetInitializer(new CreateDatabaseIfNotExists<SampleContext>());
}
Note that the above is the default initializer. You will probably need to create your own custom initializer by implementing IDatabaseInitializer. There's some good info here: http://sankarsan.wordpress.com/2010/10/14/entity-framework-ctp-4-0-database-initialization/
Using EF 4.3 with Migrations, you do not get this behavior - at least I have not seen it. But I also have this set in my code:
public sealed class DbConfiguration : DbMigrationsConfiguration<DatabaseContext>
{
    public DbConfiguration()
    {
        AutomaticMigrationsEnabled = false;
    }
}
The environment of my application: web-based, Spring MVC+Security, Hibernate, MySQL(InnoDB)
I am working on a small database application operated from a web interface. There are specific and known users that handle the stored data. Now I need to keep track of every create/update/delete action a user executes on the database and produce simple, "list-like" reports from this. As of now, I am thinking of a "log" table (columns: userId + timestamp + description etc.). I guess an aspect could be fired upon any C(R)UD operation inserting a log row in this table. But I am not sure this is how it should be done.
I am also aware of the usual MySQL logs as well as log4j. As for the log files, I might need more information than what is available to MySQL. Log4j might be a solution, but I do not see how it is able to write to MySQL tables. Also, I would like to have some associations preserved in my log table (e.g. the user id) to let the DB do the basic filtering etc. Directions on this one are appreciated.
What would you recommend? Is there even any built-in support in Hibernate/Spring or is log4j the right way to go?
Thanks!
Log4j is modular; you can write your own backend that writes the log into a database if you wish to do so. In fact, it even comes with a JDBC appender right out of the box, although take note of the big red warning there.
For Hibernate, you can probably build something on top of interceptors and events that keeps track of all modifications and logs them to a special audit table.
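As a rough illustration of the interceptor route (the AuditLogInterceptor name and the writeLogRow helper are made up; you would register the interceptor on the SessionFactory or per session):

import java.io.Serializable;
import org.hibernate.EmptyInterceptor;
import org.hibernate.type.Type;

public class AuditLogInterceptor extends EmptyInterceptor {

    @Override
    public boolean onSave(Object entity, Serializable id, Object[] state,
                          String[] propertyNames, Type[] types) {
        writeLogRow("CREATE", entity, id);
        return false; // entity state not modified
    }

    @Override
    public boolean onFlushDirty(Object entity, Serializable id, Object[] currentState,
                                Object[] previousState, String[] propertyNames, Type[] types) {
        writeLogRow("UPDATE", entity, id);
        return false;
    }

    @Override
    public void onDelete(Object entity, Serializable id, Object[] state,
                         String[] propertyNames, Type[] types) {
        writeLogRow("DELETE", entity, id);
    }

    private void writeLogRow(String action, Object entity, Serializable id) {
        // Placeholder: insert userId, timestamp and a description into your "log" table,
        // e.g. via plain JDBC or a separate session, not via the intercepted session itself.
        System.out.println(action + " " + entity.getClass().getSimpleName() + "#" + id);
    }
}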
Have you looked into using a MappedSuperclass for C(R)UD operation logging?
@MappedSuperclass
public class BaseEntity {
    @Basic
    @Temporal(TemporalType.TIMESTAMP)
    public Date getLastUpdate() { ... }
    public String getLastUpdater() { ... }
    ...
}

@Entity class Order extends BaseEntity {
    @Id public Integer getId() { ... }
    ...
}
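To actually fill those columns, the callbacks could live on the base class itself; a sketch, assuming JPA-style persistence so that @PrePersist/@PreUpdate fire, with a placeholder user lookup:

import java.util.Date;
import javax.persistence.Basic;
import javax.persistence.MappedSuperclass;
import javax.persistence.PrePersist;
import javax.persistence.PreUpdate;
import javax.persistence.Temporal;
import javax.persistence.TemporalType;

@MappedSuperclass
public abstract class BaseEntity {

    private Date lastUpdate;
    private String lastUpdater;

    @PrePersist
    @PreUpdate
    protected void touchAuditFields() {
        lastUpdate = new Date();
        lastUpdater = currentUserName();
    }

    private String currentUserName() {
        // Placeholder: in a Spring Security setup this would come from SecurityContextHolder.
        return "unknown";
    }

    @Basic
    @Temporal(TemporalType.TIMESTAMP)
    public Date getLastUpdate() { return lastUpdate; }
    protected void setLastUpdate(Date lastUpdate) { this.lastUpdate = lastUpdate; }

    public String getLastUpdater() { return lastUpdater; }
    protected void setLastUpdater(String lastUpdater) { this.lastUpdater = lastUpdater; }
}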
In case you go for a logging solution and are looking to do it yourself, try searching for JDBCAppender; it's not perfect but should work.
In case you want an off-the-shelf product for centralized logging, consider trying logFaces; it can write directly into your own database. (Disclosure: I am the author of this product.)