How to connect to Oracle and retrieve data - playframework-2.3

I am new to Java and to the Play Framework as well. I have Play running, but I am unable to retrieve data from Oracle. I added the /lib folder with ojdbc6.jar and Play deployed successfully. I know Play is connected to my DB, because it complained when I had an incorrect config. The config below is in application.conf:
db.default.driver=oracle.jdbc.driver.OracleDriver
db.default.url="jdbc:oracle:thin:@hostname:1521:SOMESID"
db.default.user=someuser
db.default.password=somepassword
All of the above seems OK. Now, in Application.java, I am trying to fetch data from the DB. I do not want to use Hibernate or any ORM layer. Application.java is below. I added the getMetaData method to talk to Oracle, but it does not work. Perhaps I have not written the method properly or have not imported the relevant libraries. A working .java file would be very helpful. If I can manage to retrieve data from Oracle, great; otherwise I hope I don't have to go the Spring Framework route. Any help would be greatly appreciated. Thanks
package controllers;
import play.*;
import play.mvc.*;
import views.html.*;
import play.db.*;
public class Application extends Controller {
public static Result index() {
//return ok(index.render("Hello Worldxxx"));
//me: the below is the default return
return redirect(routes.Application.tasks());
}
public static Result tasks() {
return TODO;
}
public static Result newTask() {
return TODO;
}
public static Result deleteTask(Long id) {
return TODO;
}
public static Result getMetaData() {
Connection connection = DB.getConnection();
ResultSet resultSet = connection.prepareStatement("SELECT * FROM sometable").executeQuery();
metaData = resultSet.getMetaData();
connection.close();
return ok(metaData.toString());
}
}
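Since the question asks for a working example, here is a minimal sketch of how the method could be written so that it compiles: the java.sql imports are added, the metadata variable is declared, and the connection is closed with try-with-resources. It assumes Play 2.3's play.db.DB helper and the placeholder table name from the question; the class name is made up, a matching entry in conf/routes is still needed, and it is untested.

package controllers;

import java.sql.Connection;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;

import play.db.DB;
import play.mvc.Controller;
import play.mvc.Result;

public class MetaDataExample extends Controller {

    public static Result getMetaData() {
        // Borrow a connection from Play's default pool (db.default in application.conf)
        try (Connection connection = DB.getConnection()) {
            ResultSetMetaData metaData = connection
                    .prepareStatement("SELECT * FROM sometable")
                    .executeQuery()
                    .getMetaData();
            return ok("sometable has " + metaData.getColumnCount() + " columns");
        } catch (SQLException e) {
            return internalServerError(e.getMessage());
        }
    }
}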

Related

Spring Data R2DBC is not auto-creating tables in MySQL [duplicate]

I am creating a quick project using R2DBC and H2 to familiarize myself with this new reactive stuff. I made a repository that extends ReactiveCrudRepository, and all is well with the world, as long as I use the DatabaseClient to issue a CREATE TABLE statement that matches my entity first...
I understand spring data R2DBC is not as fully featured as spring data JPA (yet?) but is there currently a way to generate the schema from the entity classes?
Thanks
No, there is currently no way to generate schema from entities with Spring Data R2DBC.
I'm using it in a project with a Postgres DB, and it's complicated to manage database migrations, but I managed to wire in Flyway with the synchronous PostgreSQL JDBC driver (Flyway doesn't work with reactive drivers yet) at startup to handle schema migrations.
You still have to write your own CREATE TABLE statements, which shouldn't be that hard. You could even copy your entities into some simple throwaway project as JPA entities, let Hibernate create the schema, and then copy-paste it into a migration file in your R2DBC project.
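For illustration, wiring Flyway in over the blocking JDBC driver while the application itself stays on R2DBC could look roughly like the sketch below. The URL and credentials are placeholders, and it assumes flyway-core and the org.postgresql:postgresql JDBC driver are on the classpath.

import org.flywaydb.core.Flyway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FlywayConfig {

    // Flyway migrates over a plain JDBC URL at startup; the reactive
    // R2DBC connection factory used by the repositories is not involved here.
    @Bean(initMethod = "migrate")
    public Flyway flyway() {
        return Flyway.configure()
                .dataSource("jdbc:postgresql://localhost:5432/mydb", "user", "password")
                .load();
    }
}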
It is possible for tests and for production.
In production, make sure your user has no permission to change the schema, otherwise you may delete tables by mistake! Or use a migration tool like Flyway.
You need to put your schema.sql in the main resources and add the relevant property:
spring.r2dbc.initialization-mode=always
I use H2 for test and Postgres for prod.
I use Gradle, and the driver versions are:
implementation 'org.springframework.boot.experimental:spring-boot-actuator-autoconfigure-r2dbc'
runtimeOnly 'com.h2database:h2'
runtimeOnly 'io.r2dbc:r2dbc-h2'
runtimeOnly 'io.r2dbc:r2dbc-postgresql'
runtimeOnly 'org.postgresql:postgresql'
testImplementation 'org.springframework.boot.experimental:spring-boot-test-autoconfigure-r2dbc'
The BOM version is
dependencyManagement {
imports {
mavenBom 'org.springframework.boot.experimental:spring-boot-bom-r2dbc:0.1.0.M3'
}
}
That's how I solved this problem:
Controller:
@PostMapping(MAP + PATH_DDL_PROC_DB) // PATH_DDL_PROC_DB = "/database/{db}/{schema}/{table}"
public Flux<Object> createDbByDb(
        @PathVariable("db") String db,
        @PathVariable("schema") String schema,
        @PathVariable("table") String table) {
    return ddlProcService.createDbByDb(db, schema, table);
}
Service:
public Flux<Object> createDbByDb(String db,String schema,String table) {
return ddl.createDbByDb(db,schema,table);
}
Repository:
@Autowired
PostgresqlConnectionConfiguration.Builder connConfig;
public Flux<Object> createDbByDb(String db,String schema,String table) {
return createDb(db).thenMany(
Mono.from(connFactory(connConfig.database(db)).create())
.flatMapMany(
connection ->
Flux.from(connection
.createBatch()
.add(sqlCreateSchema(db))
.add(sqlCreateTable(db,table))
.add(sqlPopulateTable(db,table))
.execute()
)));
}
private Mono<Void> createDb(String db) {
PostgresqlConnectionFactory
connectionFactory = connFactory(connConfig);
DatabaseClient ddl = DatabaseClient.create(connectionFactory);
return ddl
.execute(sqlCreateDb(db))
.then();
}
Connection Class:
@Slf4j
@Configuration
@EnableR2dbcRepositories
public class Connection extends AbstractR2dbcConfiguration {
/*
 **********************************************
 * Spring Data JDBC:
 * DDL: does not support JPA.
 *
 * R2DBC
 * DDL:
 *  - does not support JPA
 *  - to achieve DDL, use DatabaseClient
 *
 * DML:
 *  - uses R2dbcRepositories
 *  - R2dbcRepositories is different from
 *    DatabaseClient
 **********************************************
 */
@Bean
public PostgresqlConnectionConfiguration.Builder connectionConfig() {
return PostgresqlConnectionConfiguration
.builder()
.host("db-r2dbc")
.port(5432)
.username("root")
.password("root");
}
@Bean
public PostgresqlConnectionFactory connectionFactory() {
return
new PostgresqlConnectionFactory(
connectionConfig().build()
);
}
}
DDL Scripts:
@Getter
@NoArgsConstructor(access = AccessLevel.PRIVATE)
public final class DDLScripts {
public static final String SQL_GET_TASK = "select * from tasks";
public static String sqlCreateDb(String db) {
String sql = "create database %1$s;";
String[] sql1OrderedParams = quotify(new String[]{db});
String finalSql = format(sql,(Object[]) sql1OrderedParams);
return finalSql;
}
public static String sqlCreateSchema(String schema) {
String sql = "create schema if not exists %1$s;";
String[] sql1OrderedParams = quotify(new String[]{schema});
return format(sql,(Object[]) sql1OrderedParams);
}
public static String sqlCreateTable(String schema,String table) {
String sql1 = "create table %1$s.%2$s " +
"(id serial not null constraint tasks_pk primary key, " +
"lastname varchar not null); ";
String[] sql1OrderedParams = quotify(new String[]{schema,table});
String sql1Final = format(sql1,(Object[]) sql1OrderedParams);
String sql2 = "alter table %1$s.%2$s owner to root; ";
String[] sql2OrderedParams = quotify(new String[]{schema,table});
String sql2Final = format(sql2,(Object[]) sql2OrderedParams);
return sql1Final + sql2Final;
}
public static String sqlPopulateTable(String schema,String table) {
String sql = "insert into %1$s.%2$s values (1, 'schema-table-%3$s');";
String[] sql1OrderedParams = quotify(new String[]{schema,table,schema});
return format(sql,(Object[]) sql1OrderedParams);
}
private static String[] quotify(String[] stringArray) {
String[] returnArray = new String[stringArray.length];
for (int i = 0; i < stringArray.length; i++) {
returnArray[i] = "\"" + stringArray[i] + "\"";
}
return returnArray;
}
}
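As a quick illustration of what quotify plus format produce, a hypothetical call could look like this:

// Hypothetical usage of the helper above: both identifiers come back double-quoted
String ddl = DDLScripts.sqlCreateTable("public", "tasks");
// ddl now contains:
//   create table "public"."tasks" (id serial not null constraint tasks_pk primary key, lastname varchar not null);
//   alter table "public"."tasks" owner to root;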
It is actually possible to load a schema by defining a specific class in this way:
import io.r2dbc.spi.ConnectionFactory
import org.springframework.context.annotation.Bean
import org.springframework.context.annotation.Configuration
import org.springframework.core.io.ClassPathResource
import org.springframework.data.r2dbc.repository.config.EnableR2dbcRepositories
import org.springframework.r2dbc.connection.init.ConnectionFactoryInitializer
import org.springframework.r2dbc.connection.init.ResourceDatabasePopulator
@Configuration
@EnableR2dbcRepositories
class DbConfig {
@Bean
fun initializer(connectionFactory: ConnectionFactory): ConnectionFactoryInitializer {
val initializer = ConnectionFactoryInitializer()
initializer.setConnectionFactory(connectionFactory)
initializer.setDatabasePopulator(
ResourceDatabasePopulator(
ClassPathResource("schema.sql")
)
)
return initializer
}
}
Note that IntelliJ reports the error "Could not autowire. No beans of 'ConnectionFactory' type found", but it is a false positive, so ignore it and rebuild your project.
The schema.sql file has to be put in the resources folder.
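The same initializer can also be written in Java; a minimal sketch using the same spring-r2dbc classes:

import io.r2dbc.spi.ConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.data.r2dbc.repository.config.EnableR2dbcRepositories;
import org.springframework.r2dbc.connection.init.ConnectionFactoryInitializer;
import org.springframework.r2dbc.connection.init.ResourceDatabasePopulator;

@Configuration
@EnableR2dbcRepositories
public class DbConfig {

    @Bean
    public ConnectionFactoryInitializer initializer(ConnectionFactory connectionFactory) {
        // Runs classpath:schema.sql against the R2DBC connection factory at startup
        ConnectionFactoryInitializer initializer = new ConnectionFactoryInitializer();
        initializer.setConnectionFactory(connectionFactory);
        initializer.setDatabasePopulator(new ResourceDatabasePopulator(new ClassPathResource("schema.sql")));
        return initializer;
    }
}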

Integrating multi-node Couchbase with Spring Boot

I am new to Couchbase and am integrating a local Couchbase server with my Spring Boot application, using Couchbase 6.6 and spring-data-couchbase 4.0.5.
I initially had a single node running, and my application was able to insert documents using CrudRepository.
I recently configured multiple nodes on my localhost using Docker images, but now my Spring Boot application doesn't work as expected.
The relevant section of my logs is attached; according to it, the service connects to all 3 nodes but then immediately disconnects for some reason!
My config class looks like this…
public class CouchbaseConfig extends AbstractCouchbaseConfiguration {
@Override
public String getConnectionString() {
return "127.0.0.1";
}
@Override
public String getUserName() {
return "Administrator";
}
@Override
public String getPassword() {
return "password";
}
@Override
public String getBucketName() {
return "bucket";
}
}
Any helpful tips would be appreciated.
Many Thanks

Configure Spring Data Couchbase on a cluster host address

My data people gave me the http://127.0.0.1:8091/pools URL to connect to our Couchbase server, and I've been told the pools suffix is the address for all the nodes in the cluster.
I'm using Spring 4.2.0.RELEASE with spring-data-couchbase 2.0.0.M1 against Couchbase 2.5.1 enterprise edition (build-1083)
Now, if I add the above url as is into the getBootstrapHosts list:
@Override
protected List<String> getBootstrapHosts() {
return Collections.singletonList(couchbaseProperties.getHost());
}
I get a number format exception on the 8091/pools value.
But when using the http://127.0.0.1:8091 URL I get an invalid password exception.
I reckon the first URL is the one to use, but not in the way I tried.
There is probably a method I should override in the AbstractCouchbaseConfiguration class, but looking at the source code didn't really enlighten me.
Here is the Couchbase configuration class.
@Configuration
@EnableCouchbaseRepositories(basePackages = { "com.thalasoft.data.couchbase.repository" })
@ComponentScan(nameGenerator = PackageBeanNameGenerator.class, basePackages = { "com.thalasoft.data.couchbase.config" })
@EnableTransactionManagement
public class CouchbaseConfiguration extends AbstractCouchbaseConfiguration {
private static Logger logger = LoggerFactory.getLogger(CouchbaseConfiguration.class);
@Autowired
private CouchbaseProperties couchbaseProperties;
@Override
protected List<String> getBootstrapHosts() {
return Collections.singletonList(couchbaseProperties.getHost());
}
@Override
protected String getBucketName() {
return couchbaseProperties.getBucketName();
}
@Override
protected String getBucketPassword() {
return couchbaseProperties.getBucketPassword();
}
@Bean
public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
return new PropertySourcesPlaceholderConfigurer();
}
@Bean
public LocalValidatorFactoryBean validator() {
return new LocalValidatorFactoryBean();
}
@Bean
public ValidatingCouchbaseEventListener validationEventListener() {
return new ValidatingCouchbaseEventListener(validator());
}
}
The fact that your database administrators gave you 127.0.0.1 as the address to connect to seems strange, but it could indeed be valid if one node of the cluster is running co-located with the client code...
This URL-based syntax was the one used by the 1.4.x generation of the SDK, and configuration is indeed a bit different in 2.x (reflecting the evolution of the Couchbase SDK between 1.4.x and 2.x): you just need to provide the hostname or IP of each node to bootstrap from, in a list.
You should try with just "127.0.0.1". It is also possible that you need to specify a bucket name and/or a password (ask your administrator). The defaults used by Spring Data Couchbase are "default" for the bucket and "" (empty password), but you can override the getBucketName() and getBucketPassword() methods from AbstractCouchbaseConfiguration to change that.
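For illustration, a bootstrap list with plain hostnames or IPs (no scheme, no port, no /pools suffix) would look roughly like this; the second hostname is a placeholder, and java.util.Arrays needs to be imported:

@Override
protected List<String> getBootstrapHosts() {
    // One entry per node to bootstrap from; the SDK discovers the rest of the cluster from these
    return Arrays.asList("127.0.0.1", "couchbase-node2.example.com");
}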
PS: the Spring Data Couchbase documentation is available here

Query for JSON String using JdbcTemplate to Neo4j?

I want to use a JdbcTemplate and the Neo4j JDBC driver to query my neo4j database and return a JSON string.
Is there an existing method to do this?
I've googled and I can't find one.
It otherwise looks like a matter of creating a home cooked RowMapper as per here.
The query:
MATCH (s:Site) - [r] - (ss:SiteState) return s,ss;
It returns JSON, but for my use case I map it to an object:
public class SiteRowMapper implements RowMapper<Site> {
@Override
public Site mapRow(ResultSet rs, int rowNum) throws SQLException {
Site site = new Site();
SiteState siteState = new SiteState();
Gson json = new Gson();
site = json.fromJson(rs.getString("s"), Site.class);
siteState = json.fromJson(rs.getString("ss"), SiteState.class);
site.setName(siteState.getName());
return site;
}
}
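For completeness, a minimal usage sketch, assuming a DataSource backed by the Neo4j JDBC driver (names other than SiteRowMapper and Site are placeholders):

import java.util.List;

import javax.sql.DataSource;

import org.springframework.jdbc.core.JdbcTemplate;

public class SiteQuery {

    private final JdbcTemplate jdbcTemplate;

    public SiteQuery(DataSource neo4jDataSource) {
        this.jdbcTemplate = new JdbcTemplate(neo4jDataSource);
    }

    public List<Site> findSitesWithState() {
        // The RowMapper above turns each returned node's JSON into a Site
        return jdbcTemplate.query(
                "MATCH (s:Site) - [r] - (ss:SiteState) RETURN s, ss",
                new SiteRowMapper());
    }
}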

How to use a custom MySQL query from my Liferay custom portlet?

I am using Liferay and developing my custom portlet. Now I want to use a custom query to retrieve some data from multiple tables, with joins etc.
I have googled for my problem but can't find a simple, step-by-step procedure to follow.
So could anyone guide me or point me to a tutorial on creating a custom SQL query for my custom portlet?
After this 4th step I built my service in Eclipse, and it completed successfully. Two files were created in the service/persistence package, named AdvertiseFinder.java and AdvertiseFinderUtil.java, but when I try to access the method getAd_DisplayforReports with AdvertiseFinderUtil.getAd_DisplayforReports("Any argument with string"),
it gives me an error that there is no such method in AdvertiseFinderUtil.
I have rebuilt the service after updating my AdvertiseFinderImpl method, but it's not working.
This is my AdvertiseFinderImpl class:
package emenu.advertise.database.service.persistence;
import com.liferay.portal.service.persistence.impl.BasePersistenceImpl;
import emenu.advertise.database.model.ad_display;
import emenu.advertise.database.model.advertise;
import emenu.advertise.database.model.impl.ad_displayImpl;
import java.util.List;
import com.liferay.portal.SystemException;
import com.liferay.portal.kernel.dao.orm.QueryPos;
import com.liferay.portal.kernel.dao.orm.SQLQuery;
import com.liferay.portal.kernel.dao.orm.Session;
import com.liferay.util.dao.orm.CustomSQLUtil;
public class AdvertiseFinderImpl extends BasePersistenceImpl<ad_display> implements advertiseFinder{
public void getall() {
}
// the name of the query
public static String GET_ADVERTISE = AdvertiseFinderImpl.class.getName()
+ ".getAdvertise";
// the method which will be called from the ServiceImpl class
public List<ad_display> getAd_DisplayforReports(String pattern) throws SystemException {
Session session = null;
try {
// open a new hibernate session
session = openSession();
// pull out our query from book.xml, created earlier
String sql = CustomSQLUtil.get(GET_ADVERTISE);
// create a SQLQuery object
SQLQuery q = session.createSQLQuery(sql);
// replace the "Book" in the query string with the fully qualified java class
// this has to be the hibernate table name
q.addEntity("a_ad_display", ad_displayImpl.class);
// Get query position instance
QueryPos qPos = QueryPos.getInstance(q);
// fill in the "?" value of the custom query
// this is same like forming a prepared statement
qPos.add(pattern);
// execute the query and return a list from the db
return (List<ad_display>)q.list();
/*
// use this block if you want to return the no. of rows (count)
int rows = 0;
Iterator<Long> itr = q.list().iterator();
if (itr.hasNext()) { Long count = itr.next();
if (count != null) { rows = count.intValue(); } }
return rows;
*/
} catch (Exception e) {
throw new SystemException(e);
} finally {
closeSession(session);
}
}
}
My default-ext.xml is the following:
<?xml version="1.0"?>
<custom-sql>
<sql file="custom-sql/emenu.xml" />
</custom-sql>
My emenu.xml is here:
<custom-sql>
<sql id="emenu.advertise.database.service.persistence.AdvertiseFinderImpl.getAd_DisplayforReports">
<![CDATA[
SELECT
*
FROM
a_ad_display
]]>
</sql>
</custom-sql>
change
return (List<ad_display>)q.list();
to
return (List<ad_display>) QueryUtil.list(q, getDialect(), -1, -1);
Following are the steps to write custom query / finder methods in Liferay:
Create a new finder called EntityFinderImpl.java in the /generated/service/persistence directory.
Run 'build-service' on the project.
The ServiceBuilder autogenerates the following two extra files: EntityFinder.java and EntityFinderUtil.java.
Now open the EntityFinderImpl.java file and let this class extend BasePersistenceImpl and implement EntityFinder. (This assumes the entity (table name) is defined in service.xml and the other required classes are also autogenerated by ServiceBuilder.)
Now add the required custom method to EntityFinderImpl.java and build the service again to distribute this method to the Util classes.
A custom method can be created using Liferay's DynamicQuery API or an SQL query as follows:
public List<Entity> getCustomDataFromFinder(String... parameters) throws SystemException {
    Session session = null;
    StringBuilder queryString = new StringBuilder();
    Entity entity = new EntityImpl();
    try {
        session = openSession();
        queryString.append(" Write your Query here and conditionally append parameter value(s).");
        SQLQuery query = session.createSQLQuery(queryString.toString());
        query.addEntity("EntityName", EntityImpl.class);
        return (List<Entity>) QueryUtil.list(query, getDialect(), 0, -1);
    }
    catch (Exception e) {
        throw new SystemException(e);
    }
    finally {
        if (session != null) {
            closeSession(session);
        }
    }
}
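Once the service has been rebuilt, ServiceBuilder exposes the finder through the generated Util class, so it can be called along these lines (the names follow the example above and are placeholders):

// Hypothetical call through the ServiceBuilder-generated static wrapper
List<Entity> rows = EntityFinderUtil.getCustomDataFromFinder("somePattern");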