Converting MySQL to H2

I'm working on a project following this code:
Link to download of code: https://bitbucket.org/vrto/spring-tutorial/get/a66534cc7033.zip
Now what I really want, instead of MySQL, is to have an embedded database running HSQL or H2. So I've been working on my own project trying to implement such a system.
In the tutorial code, he creates a database, and then H2 + Hibernate creates the tables for him, so it would be great if I could do this without having to have an SQL schema or anything to set it up.
So I've gotten rid of the MySQL Maven dependencies and am working on replacing persistence-beans.xml.
Now I have to replace my dataSource with a relevant H2 or HSQL version.
So this is what I've gotten:
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource"
destroy-method="close">
<property name ="driverClassName" value = "org.h2.driver"/>
<property name = "url" value ="jdbc:h2:mem:test;DB_CLOSE_DELAY=-1" />
<property name = "username" value = "sa" />
<property name = "password" value = "" />
</bean>
I also tried this (but again, I've found that having a schema with his code is tricky):
<jdbc:embedded-database id="dataSource" type="H2">
    <jdbc:script location="classpath:schema.sql"/>
    <jdbc:script location="classpath:test-data.sql"/>
</jdbc:embedded-database>
But it fails when running HibernateConfigurationTest.java (and, by extension, all the others).
Any help would be greatly appreciated.
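For reference, this plain-Hibernate sketch is roughly what I'm after: Hibernate creating the tables itself against an in-memory H2 database, so no schema.sql is needed. The bootstrap class name is just an illustration, and the same property keys could go into the Spring sessionFactory bean's hibernateProperties.

import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

// Rough sketch: let Hibernate generate the schema from the mapped entities.
public class InMemoryH2Bootstrap {

    public static SessionFactory build() {
        Configuration cfg = new Configuration();
        cfg.setProperty("hibernate.connection.driver_class", "org.h2.Driver");
        cfg.setProperty("hibernate.connection.url", "jdbc:h2:mem:test;DB_CLOSE_DELAY=-1");
        cfg.setProperty("hibernate.connection.username", "sa");
        cfg.setProperty("hibernate.connection.password", "");
        cfg.setProperty("hibernate.dialect", "org.hibernate.dialect.H2Dialect");
        cfg.setProperty("hibernate.hbm2ddl.auto", "create"); // create the tables at startup
        // register the tutorial's annotated entity classes here,
        // e.g. cfg.addAnnotatedClass(User.class);
        return cfg.buildSessionFactory();
    }
}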
I've managed to get this test to work with my schema:
@ContextConfiguration(locations = "/persistence-beans.xml")
public class HibernateConfigurationTest extends AbstractJUnit4SpringContextTests {

    @Autowired
    private SessionFactory sessionFactory;

    @Test
    public void testHibernateConfiguration() {
        // Spring IoC container instantiated and prepared sessionFactory
        assertNotNull(sessionFactory);
    }
}
The rest of the tests are at this link: http://vrtoonjava.wordpress.com/2012/06/17/part-3-dao-and-service-layer/

Related

Hibernate Connection Leak

I'm kind of new to web dev; I've been a trainee at a company since last year, and I have the following problem:
I'm making a web app with JSF 2.3, Hibernate 5.4.2.Final and c3p0 5.4.2.Final. The thing is, every time I run it and go to the login page, I need to check whether an admin user is already registered (I do a count on the employee table based on the employee code), and if there isn't an administrator, I get a list of country states and render a registration form.
So I get the session from SessionFactory.openSession() in my HibernateUtil class, do the count and clear/close the session, like this snippet:
public Long retornaLong(String query) throws Exception {
    Session session = new HibernateUtil().getSession();
    try {
        return (Long) session.createQuery(query).getSingleResult();
    } finally {
        session.clear();
        session.close();
    }
}
Then I get the country states list from:
@SuppressWarnings("unchecked")
public List<T> retornaList(String query) throws Exception {
    Session session = new HibernateUtil().getSession();
    try {
        return (List<T>) session.createQuery(query).getResultList();
    } finally {
        session.clear();
        session.close();
    }
}
But if I keep refreshing the page (the bean is @ViewScoped), say 15+ times, I eventually get a "too many connections" exception. This doesn't happen if I use one session for both queries. I think there isn't enough time for the session to close, causing a connection leak. I want to use one session for each query; can someone help me? Thanks a lot.
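For reference, the usual pattern I've seen keeps a single static SessionFactory (and with it a single c3p0 pool) behind HibernateUtil; a minimal sketch of that shape is below (the names are assumptions, since my real HibernateUtil isn't shown here), in case my new HibernateUtil() per call is what leaks pools.

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

// Sketch only: the SessionFactory (and its c3p0 pool) is created once per JVM.
public final class HibernateUtil {

    private static final SessionFactory SESSION_FACTORY =
            new Configuration().configure().buildSessionFactory(); // reads hibernate.cfg.xml

    private HibernateUtil() {
    }

    public static Session openSession() {
        // callers still open and close short-lived Sessions as in the snippets above
        return SESSION_FACTORY.openSession();
    }
}

With one shared factory, the per-query open/close pattern above just borrows and returns connections from the same pool of 20 instead of creating new pools.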
My hibernate.cfg.xml
<hibernate-configuration>
    <!-- a SessionFactory instance listed as /jndi/name -->
    <session-factory>
        <!-- properties -->
        <property name="hibernate.dialect">org.hibernate.dialect.MySQLInnoDBDialect</property>
        <property name="hibernate.connection.driver_class">com.mysql.jdbc.Driver</property>
        <property name="hibernate.connection.url">jdbc:mysql://localhost:3306/vetsnpets?useTimezone=true&amp;serverTimezone=UTC</property>
        <property name="hibernate.connection.username">vetsNpets</property>
        <property name="hibernate.connection.password">123</property>
        <property name="hibernate.show_sql">false</property>
        <property name="hibernate.format_sql">false</property>
        <property name="hbm2ddl.auto">validate</property>
        <property name="current_session_context_class">thread</property>
        <!-- C3P0 -->
        <property name="hibernate.c3p0.initialPoolSize">3</property>
        <property name="hibernate.c3p0.minPoolSize">3</property>
        <property name="hibernate.c3p0.maxPoolSize">20</property>
        <property name="hibernate.c3p0.maxStatements">100</property>
        <property name="hibernate.c3p0.maxStatementsPerConnection">5</property>
        <property name="hibernate.c3p0.maxIdleTime">2700</property>
        <property name="hibernate.c3p0.maxIdleTimeExcessConnections">600</property>
        <property name="hibernate.c3p0.acquireIncrement">1</property>
    </session-factory>
</hibernate-configuration>

Shiro: code-duplication in datasource configuration

I use Shiro to implement authentication for my CXF web service. I am using a JDBC realm and configured it with an ini file (attached below). The authentication data is persisted in the same database as the rest of my data, but for the rest of the system I use a properties file (also below) to provide the connection information.
Now the datasource settings are obviously the same in both cases, but I can't seem to find a way to resolve this code duplication. Does anybody more experienced with web application development have a solution? I could change both the Shiro config and the config of the rest of the system if that would help.
Thanks in advance,
zakum
shiro.ini:
[main]
jdbcRealm = org.apache.shiro.realm.jdbc.JdbcRealm
jdbcRealm.permissionsLookupEnabled = true
jdbcRealm.authenticationQuery = SELECT password FROM users WHERE username = ?;
ds = org.postgresql.ds.PGSimpleDataSource
ds.user = postgres
ds.password = password
ds.databaseName = servicedb
ds.serverName = localhost
ds.portNumber = 5432
jdbcRealm.dataSource = $ds
securityManager.realms = $jdbcRealm
service.properties (all settings use the db prefix); it looks like:
db.name = servicedb
db.user = postgres
db.password = password
db.url = //localhost:5432/
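For illustration, the non-Shiro side builds its DataSource from these properties roughly like this (a simplified, hypothetical sketch; ServiceDataSourceFactory and the hard-coded host/port stand in for the real parsing code):

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;
import javax.sql.DataSource;
import org.postgresql.ds.PGSimpleDataSource;

// Hypothetical sketch of the non-Shiro side: the same connection details
// end up both here and in shiro.ini, which is the duplication in question.
public class ServiceDataSourceFactory {

    public static DataSource fromProperties(String path) throws IOException {
        Properties p = new Properties();
        try (FileInputStream in = new FileInputStream(path)) {
            p.load(in);
        }
        PGSimpleDataSource ds = new PGSimpleDataSource();
        ds.setServerName("localhost");   // in the real code this comes from db.url
        ds.setPortNumber(5432);          // likewise parsed from db.url
        ds.setDatabaseName(p.getProperty("db.name"));
        ds.setUser(p.getProperty("db.user"));
        ds.setPassword(p.getProperty("db.password"));
        return ds;
    }
}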
Use a container like Spring to configure Shiro and your JDBC connections. Then you can pass the dataSource as a reference into the JdbcRealm.
<bean id="dataSourceBean" class="org.apache.commons.dbcp.BasicDataSource">
    <property name="driverClassName" value="class for driver"/>
    ... more setup for the data source ...
</bean>
<bean id="jdbcRealm" class="org.apache.shiro.realm.jdbc.JdbcRealm">
<property name="dataSource" ref="dataSourceBean"/>
<property name="permissionsLookupEnabled" value="true"/>
<property name="authenticationQuery" value="SELECT password FROM users WHERE username = ?"/>
</bean>
You could automatically replace tokens during your build process using Ant or Maven (among others).
Here's an example using Ant:
<copy file="shiro.template.ini" tofile="shiro.ini" overwrite="true" />
<replace file="shiro.ini" token="#DB_NAME#" value="servicedb"/>
In the *.template.* files you'd use tokens:
ds.databaseName = #DB_NAME#
which would get replaced with the real values during the build process:
ds.databaseName = servicedb
(and it would obviously be better to specify the tokens/values in a configuration file and apply the replacements to a list of files using globbing patterns)
Ant: https://ant.apache.org/manual/Tasks/replace.html
Maven: https://code.google.com/p/maven-replacer-plugin/
Although this is an old question, I had a similar problem and I solved it the following way. Hope it will be helpful to others.
A Shiro ini can be used to configure any class. For the data source, I have written my application-specific class like this:
import javax.sql.DataSource;

public class MyConfig {

    private static DataSource dataSource;

    public void setDataSource(Object ds) {
        dataSource = (DataSource) ds;
    }

    // your application will use this method to get the data source
    public static DataSource getDataSource() {
        return dataSource;
    }
}
Now in the ini file, I passed the same datasource reference to both the JDBC realm and my class:
myConfig = my.package.MyConfig
myConfig.dataSource = $ds
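For completeness, the rest of the application can then reuse the very same DataSource instance that the realm uses; a small hypothetical caller (the class name and query are only illustrative):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

// Hypothetical caller: reuses the DataSource that shiro.ini injected into MyConfig.
public class UserCounter {

    public long countUsers() throws SQLException {
        DataSource ds = MyConfig.getDataSource();
        try (Connection con = ds.getConnection();
             PreparedStatement ps = con.prepareStatement("SELECT count(*) FROM users");
             ResultSet rs = ps.executeQuery()) {
            return rs.next() ? rs.getLong(1) : 0L;
        }
    }
}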

Binary File To SQL Database Apache Camel

I need some guidance on which approach to use to load binary files from a folder into a MySQL database using Camel. Basically I want to store voice logs from our PBX system in a database. The directory with the voice logs will be a remote directory.
I have designed a prototype, but I am not sure it is really efficient; it works, but I am not happy with the design. Let me explain what I am doing. The Camel route is as follows:
<camelContext xmlns="http://camel.apache.org/schema/spring">
    <package>com.hia.camelone</package>
    <route>
        <from uri="file://c:/CTest/Inbox?noop=true&amp;recursive=true&amp;delay=3000"/>
        <to uri="bean://fileToSQL"/>
        <to uri="jdbc://timlogdb"/>
    </route>
</camelContext>
<bean id="timlogdb" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value=" com.mysql.jdbc.Driver"/>
<property name="url" value="jdbc:mysql://127.0.0.1:3306/TimLog" />
<property name="username" value="root" />
<property name="password" value="blahblah" />
</bean>
<bean id="fileToSQL" class="com.hia.camelone.fileToSQL"/>
And the code for the fileToSQL bean is:
public class fileToSQL {

    public String toString(@Headers Map<String, Object> header, @Body Object body) {
        StringBuilder sb = new StringBuilder();
        String filename = (String) header.get("CamelFileNameOnly");
        String escapedFileName = StringEscapeUtils.escapeJava(filename).replace("\'", "");
        String filePath = StringEscapeUtils.escapeJava((String) header.get("CamelFilePath"));
        sb.append("insert into FileLog ");
        sb.append("(FileName,FileData) values (");
        sb.append("'").append(escapedFileName).append("',").append("LOAD_FILE(\"").append(filePath).append("\")");
        sb.append(")");
        System.out.println(sb.toString());
        System.out.println(body);
        System.out.println(header.toString());
        return sb.toString();
    }
}
OK, a short explanation: I get the file component to consume the files, then I build an SQL string using the MySQL LOAD_FILE() function to load the file.
My thoughts on this:
The LOAD_FILE function only works on the local machine, so this route will only work with the files on the local machine. I could use a file producer to copy the files from some remote directory to a local directory and then use the route. My route would then be something like this:
<route>
    <from uri="file://c:/CTest/Inbox?noop=true&amp;recursive=true&amp;delay=3000"/>
    <to uri="file://c:/outbox"/>
    <to uri="bean://fileToSQL"/>
    <to uri="jdbc://timlogdb"/>
</route>
However, since I have access to the file's content in the message from the file consumer, I should theoretically be able to access the body/content and build an SQL command that does NOT use the LOAD_FILE() function.
The only way I know how to build such a statement is with a JDBC prepared statement. It would be first prize if I could somehow build an insert statement with the content from the file consumer.
Can I create a prepared statement in my fileToSQL bean and pass it to my jdbc component?
Or how do I build an INSERT statement without the LOAD_FILE() function?
Since I have to use the LOAD_FILE() function, I would now have to cater for both Unix and Windows file paths. While this should not be difficult, I just don't like the idea of putting OS-specific code into my application (it feels like a workaround).
Has anybody here ever uploaded binary files to a MySQL database using Camel who can give me some guidance on the points above? While I could work around the problems, I just want to make sure I don't miss an obvious way of doing things.
I had a look around here and only found people working mostly with text files. Guys, please don't even go down the route of me storing the file on the file system and linking it to the database. We have some very specific disaster recovery and legal requirements that force me to store it in a database.
Right, so I managed to find a way, and it was not that difficult. What I essentially did was get rid of the JDBC Camel component in the route. I then injected the data source bean into my fileToSQL bean and used a simple prepared statement to insert the file and its name into MySQL.
As always, code is much more explicit than my English.
<camelContext xmlns="http://camel.apache.org/schema/spring">
    <package>com.hia.camelone</package>
    <route>
        <from uri="file://c:/CTest/Inbox?noop=true&amp;recursive=true&amp;delay=3000"/>
        <to uri="bean://fileToSQL"/>
        <!--<to uri="jdbc://timlogdb"/>-->
    </route>
</camelContext>
<bean id="timlogdb" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value=" com.mysql.jdbc.Driver"/>
<property name="url" value="jdbc:mysql://127.0.0.1:3306/TimLog" />
<property name="username" value="root" />
<property name="password" value="lalala" />
</bean>
<bean id="fileToSQL" class="com.hia.camelone.fileToSQL">
<property name="dataSource" ref="timlogdb"/>
</bean>
As you can see I inject my timlogdb bean into my fileToSQL bean. Spring ROCKS!
So here is my fileToSQL bean.
public class fileToSQL {

    private DriverManagerDataSource dataSource;
    private static final String SQL_INSERT = "insert into FileLog(FileName,FileData) values (?,?)";

    @Handler
    public void toString(@Headers Map<String, Object> header, Exchange exchange) {
        Connection conn = null;
        PreparedStatement stmt = null;
        String filename = StringEscapeUtils.escapeJava(((String) header.get("CamelFileNameOnly")).replace("\'", ""));
        try {
            conn = dataSource.getConnection();
            stmt = conn.prepareStatement(SQL_INSERT);
            stmt.setString(1, filename);
            byte[] filedata = exchange.getIn().getBody(byte[].class);
            stmt.setBytes(2, filedata);
            int s = stmt.executeUpdate();
        } catch (Exception e) {
            System.out.println(e.getMessage());
        } finally {
            try {
                if (stmt != null) {
                    stmt.close();
                }
                if (conn != null) {
                    conn.close();
                }
            } catch (SQLException e) {
                System.out.println(e.getMessage());
            }
        }
    }

    /**
     * @param dataSource the dataSource to set
     */
    public void setDataSource(DriverManagerDataSource dataSource) {
        this.dataSource = dataSource;
    }
}
The guys from Camel did a great job. Camel is truly flexible especially when you combine it with Spring.
What a ride!

Jackson serializationConfig

I am using Jackson JSON in a Spring 3 MVC app. So that I don't have to configure serialization for each and every single Date field, I created a custom ObjectMapper that uses a specific DateFormat:
#Component("jacksonObjectMapper")
public class CustomObjectMapper extends ObjectMapper
{
Logger log = Logger.getLogger(CustomObjectMapper.class);
#PostConstruct
public void afterProps()
{
log.info("PostConstruct... RUNNING");
//ISO 8601
getSerializationConfig().setDateFormat(new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SZ"));
}
//constructors...
}
This custom ObjectMapper is injected into the JsonConverter:
<bean id="jsonConverter" class="org.springframework.http.converter.json.MappingJacksonHttpMessageConverter">
<property name="supportedMediaTypes" value="application/json" />
<property name="objectMapper" ref="jacksonObjectMapper" /> <!-- defined in CustomObjectMapper -->
</bean>
There is no exception in the logs and serialization works, but it is not picking up the date format; it simply serializes to a timestamp. The @PostConstruct annotation works; the log statement from the method shows up in the logs.
Does anyone know why this fails?
You may also need to specify that you want textual Date serialization by doing:
configure(SerializationConfig.Feature.WRITE_DATES_AS_TIMESTAMPS, false);
(although I was assuming that setting a non-null date format might also trigger it, but maybe not)
Also, you can configure the mapper directly from the constructor (which is safe). Not that it should change behavior, but it would remove the need for a separate configuration method.
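Put together, a constructor-configured version of the CustomObjectMapper from the question might look like this (just a sketch built from the calls already shown above; the imports assume Jackson 1.x, i.e. the org.codehaus packages):

import java.text.SimpleDateFormat;
import org.codehaus.jackson.map.ObjectMapper;
import org.codehaus.jackson.map.SerializationConfig;
import org.springframework.stereotype.Component;

@Component("jacksonObjectMapper")
public class CustomObjectMapper extends ObjectMapper {

    public CustomObjectMapper() {
        // write Dates as text instead of numeric timestamps
        configure(SerializationConfig.Feature.WRITE_DATES_AS_TIMESTAMPS, false);
        // same ISO 8601 format the @PostConstruct variant tried to set
        getSerializationConfig().setDateFormat(new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SZ"));
    }
}

Registering it stays the same: the jsonConverter bean above keeps pointing at the jacksonObjectMapper bean.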
I've done the below, which works to get around compatibility issues between Java and PHP timestamps. Java uses milliseconds since the epoch and PHP uses seconds, so it was simpler to use ISO dates.
I declare the message adapter below:
<bean id="messageAdapter"
class="org.springframework.web.servlet.mvc.annotation.AnnotationMethodHandlerAdapter">
<property name="messageConverters">
<list>
<bean id="jacksonJsonMessageConvertor"
class="my.app.MyMappingJacksonHttpMessageConverter"/>
</list>
</property>
</bean>
And MyMappingJacksonHttpMessageConverter looks like this:
public class MyMappingJacksonHttpMessageConverter extends MappingJacksonHttpMessageConverter {

    public MyMappingJacksonHttpMessageConverter() {
        super();
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.configure(Feature.WRITE_DATES_AS_TIMESTAMPS, false);
        setObjectMapper(objectMapper);
    }
}
With the above, all dates are written out in ISO format.
For Spring Boot, in application.properties:
spring.jackson.serialization.fail-on-empty-beans=false

Getting more attributes from CAS than just user id

I am using CAS with the JDBC authentication handler and was wondering: is it possible to get other attributes of the principal object (e.g. firstname, lastname), not just the username, from CAS after successful authentication?
In casServiceValidationSuccess.jsp, I added the following:
<cas:attributes>
    <c:forEach var="attr" items="${assertion.chainedAuthentications[fn:length(assertion.chainedAuthentications)-1].principal.attributes}">
        <cas:${fn:escapeXml(attr.key)}>${fn:escapeXml(attr.value)}</cas:${fn:escapeXml(attr.key)}>
    </c:forEach>
</cas:attributes>
In deployerConfigContext.xml, I added the following:
<bean class="org.jasig.cas.authentication.principal.UsernamePasswordCredentialsToPrincipalResolver" >
**<property name="attributeRepository">
<ref bean="attributeRepository" />
</property>**
</bean>
<bean id="attributeRepository" class="org.jasig.services.persondir.support.jdbc.SingleRowJdbcPersonAttributeDao">
<constructor-arg index="0" ref="dataSource"/>
<constructor-arg index="1" value="select * from bbs_members where {0}" />
<property name="queryAttributeMapping">
<map>
<entry key="username" value="username" />
</map>
</property>
<property name="resultAttributeMapping">
<map>
<entry key="uid" value="uid"/>
<entry key="email" value="email"/>
<entry key="password" value="password"/>
</map>
</property>
</bean>
It works.
One problem I came across while debugging: if you change these JSP or XML files, close the browser afterwards, otherwise the changes won't take effect. Be careful.
To get any user attributes from the DB I did the following:
use PersonDirectoryPrincipalResolver
in deployerConfigContext.xml:
<bean id="primaryPrincipalResolver"
class="org.jasig.cas.authentication.principal.PersonDirectoryPrincipalResolver" >
<property name="attributeRepository" ref="singleRowJdbcPersonMultiplyAttributeDao" />
</bean>
instead of using the standard SingleRowJdbcPersonAttributeDao class, create your own implementation which returns not just one row from the query result but aggregated data from all returned rows:
copy all the code from SingleRowJdbcPersonAttributeDao and change only one method, parseAttributeMapFromResults.
You will have something like this:
public class SingleRowJdbcPersonMultiplyAttributeDao extends AbstractJdbcPersonAttributeDao<Map<String, Object>> {

    ...

    @Override
    protected List<IPersonAttributes> parseAttributeMapFromResults(final List<Map<String, Object>> queryResults, final String queryUserName) {
        final List<IPersonAttributes> peopleAttributes = new ArrayList<IPersonAttributes>(queryResults.size());
        Map<String, List<Object>> attributes = new HashMap<String, List<Object>>();

        for (final Map<String, Object> queryResult : queryResults) {
            for (final Map.Entry<String, Object> seedEntry : queryResult.entrySet()) {
                final String seedName = seedEntry.getKey();
                final Object seedValue = seedEntry.getValue();
                if (attributes.get(seedName) != null && !attributes.get(seedName).get(0).equals(seedValue)) {
                    attributes.get(seedName).add(seedValue);
                } else {
                    List<Object> list = new ArrayList<Object>();
                    list.add(seedValue);
                    attributes.put(seedName, list);
                }
            }
        }

        final IPersonAttributes person;
        final String userNameAttribute = this.getConfiguredUserNameAttribute();
        if (this.isUserNameAttributeConfigured() && attributes.containsKey(userNameAttribute)) {
            // Option #1: An attribute is named explicitly in the config,
            // and that attribute is present in the results from LDAP; use it
            person = new CaseInsensitiveAttributeNamedPersonImpl(userNameAttribute, attributes);
        } else if (queryUserName != null) {
            // Option #2: Use the userName attribute provided in the query
            // parameters. (NB: I'm not entirely sure this choice is
            // preferable to Option #3. Keeping it because it most closely
            // matches the legacy behavior where the new option -- Option #1
            // -- doesn't apply. ~drewwills)
            person = new CaseInsensitiveNamedPersonImpl(queryUserName, attributes);
        } else {
            // Option #3: Create the IPersonAttributes doing a best-guess
            // at a userName attribute
            person = new CaseInsensitiveAttributeNamedPersonImpl(userNameAttribute, attributes);
        }
        peopleAttributes.add(person);

        return peopleAttributes;
    }

    ...
}
and in deployerConfigContext.xml:
<bean id="singleRowJdbcPersonMultiplyAttributeDao"
class="com.scentbird.SingleRowJdbcPersonMultiplyAttributeDao">
<constructor-arg index="0" ref="dataSource" />
<constructor-arg index="1" value="SELECT attributes_table1.*, attributes_table2.attr1, attributes_table2.roles AS roles FROM user_table ut LEFT JOIN roles_table rt ON <condition> LEFT JOIN another_table at ON <condition> WHERE {0}" />
<property name="queryAttributeMapping">
<map>
<entry key="username" value="username" />
</map>
</property>
</bean>
Also, in my case I used the SAML protocol.
As a result, you will get on the client all the attributes your select returns.
For example, if a user has many roles you could get on the client:
User: username, firstname, lastname, email, ... , [ROLE_1, ROLE_2, ROLE_3]
My case works with Spring Security and Grails.
I'm not sure this is a 100% Feng Shui solution :) as it was fast-cooked, but it works in our case.
Hope it helps.
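On the client side, once the attributes are released, retrieval with the Jasig Java CAS client typically looks something like the sketch below (illustrative only; a Spring Security or Grails client may expose them differently):

import java.util.Map;
import javax.servlet.http.HttpServletRequest;
import org.jasig.cas.client.authentication.AttributePrincipal;

// Sketch: after the CAS validation filter has run, the released attributes
// (firstname, lastname, email, roles, ...) are available on the principal.
public class CasAttributeReader {

    public Map<String, Object> readAttributes(HttpServletRequest request) {
        AttributePrincipal principal = (AttributePrincipal) request.getUserPrincipal();
        return principal.getAttributes();
    }
}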
I just spent the last three days attempting to get CAS properly configured. One of the issues I encountered was that I had to explicitly instruct CAS to publish the properties. I did this by:
opening https://localhost/cas/services
going to the 'Manage Services' tab
clicking 'edit' for each service
highlighting the properties you wish to publish
clicking the save button
FWIW, the other issue is that casServiceValidationSuccess.jsp does not contain any code to pass the properties back in the response. I was looking for a solution to this when I found your question. I notice that you have rewritten your implementation.
The definitive and complete solution is the following (for this undocumented feature):
Server side:
a. Add an attributeRepository to your CredentialsToPrincipalResolver.
b. Implement your.package.YourPersonAttributeDao as an IPersonAttributeDao.
c. Declare the attributes that will be transmitted in the assertion to the client.
d. Modify casServiceValidationSuccess.jsp to display the attributes (thanks to xiongjiabin).
Client side. You get all the attributes by doing this:
Due to a formatting problem I can't post the code of the definitive solution... Let me know if you are interested and I will send you an email with all the code.
In addition to the answer provided by @xiongjiabin, if you are using CAS v4+ you probably want to use assertion.primaryAuthentication instead of assertion.chainedAuthentications in casServiceValidationSuccess.jsp:
<cas:attributes>
    <c:forEach var="attr" items="${assertion.primaryAuthentication.principal.attributes}">
        <cas:${fn:escapeXml(attr.key)}>${fn:escapeXml(attr.value)}</cas:${fn:escapeXml(attr.key)}>
    </c:forEach>
</cas:attributes>
If you do use assertion.chainedAuthentications with CAS v4+ then the serviceRegistryDao list of allowedAttributes will be ignored and all attributes will be returned.