Communication between Swing Client and EJB Container

I'm trying to remove all direct database access from my Swing application, which is why I created an EJB module hosted on a JBoss server where I put all my database calls.
My question is: what is the best middleware I can use to ensure effective communication between the client and the server without degrading my Swing application's performance?
I've seen that JMS could be a solution to my problem, but is it the best one?
Please help!

JMS is preferable for communication between two systems where the failure or downtime of one system would otherwise lead to instability (i.e., it provides availability through asynchronous decoupling). Your database calls will also be highly transactional, so JMS is not the right approach here.
You can call the EJBs directly from your Swing action classes, but that is likely to add a considerable amount of boilerplate code. Another alternative is to expose web services that in turn call the EJBs for database access; that can give you a more service-oriented design.
If it is possible, you could even move your whole Swing application to a dynamic web application, which would open up any number of options.
Reading through the Java EE basics might help you find the solution.
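
If you go the direct-EJB route, here is a minimal sketch of what the client side could look like. CustomerServiceRemote, the JNDI binding, the naming factory class and the provider URL are all assumptions that depend on your EJB module and JBoss version, not something prescribed by JBoss itself:

    import java.util.List;
    import java.util.Properties;
    import javax.naming.Context;
    import javax.naming.InitialContext;
    import javax.naming.NamingException;
    import javax.swing.DefaultListModel;
    import javax.swing.SwingWorker;

    public class CustomerLoader {

        // Hypothetical remote business interface of the EJB; in practice this
        // would live in a shared client JAR produced by the EJB module.
        public interface CustomerServiceRemote {
            List<String> findAllCustomerNames();
        }

        // Look up the remote bean over JNDI; the factory class, URL and JNDI
        // binding are assumptions and depend on the JBoss version/deployment.
        static CustomerServiceRemote lookupService() throws NamingException {
            Properties props = new Properties();
            props.put(Context.INITIAL_CONTEXT_FACTORY, "org.jnp.interfaces.NamingContextFactory");
            props.put(Context.PROVIDER_URL, "jnp://localhost:1099");
            Context ctx = new InitialContext(props);
            return (CustomerServiceRemote) ctx.lookup("CustomerServiceBean/remote");
        }

        // Run the remote call off the Event Dispatch Thread so the UI stays responsive.
        public static void loadCustomersInto(final DefaultListModel<String> model) {
            new SwingWorker<List<String>, Void>() {
                @Override
                protected List<String> doInBackground() throws Exception {
                    return lookupService().findAllCustomerNames();
                }

                @Override
                protected void done() {
                    try {
                        for (String name : get()) {
                            model.addElement(name);   // safe: done() runs on the EDT
                        }
                    } catch (Exception e) {
                        e.printStackTrace();          // replace with real error handling
                    }
                }
            }.execute();
        }
    }

The part that matters for Swing performance is the SwingWorker: the remote call runs off the Event Dispatch Thread, so the UI never blocks on the network.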

Related

ODBC Bridge needed for JSON / RESTful web application

I've been running in circles looking for a way to deliver data from a proprietary flat-file database (based on the ProvideX platform) to a thick-client web application that makes RESTful requests and expects JSON responses.
ProvideX and Sage MAS 90 provided an ODBC driver that works for pulling tables, but I can't think of a good way to connect the dots without needing to program a bunch of server-side code.
Before I go down the path of programming custom server-side middleware, does anyone have any bright ideas, (or obvious ideas that I have overlooked)?
I am not locked into any particular architecture at the moment because we are hashing out requirements for the web application, so any ideas would be helpful.
ProvideX/Sage provides a web services module, but I can't use it because my company has refused to invest in the software module and upgrade costs. Let's not let that be a distraction, however, because I am still looking for a way to use the ODBC driver in this question thread.
ODBC-ODBC bridges exist, but all the ones I know of are commercial.

JSP, MySQL and Geronimo

The task is to create several JSPs through which the user can input information, which is then saved on a database server so that it can be retrieved later.
I'm not sure if this question is constructive enough or not, but I have no idea how to even start. I know what each one of the components means, but that's about it. I have no idea how the whole process works, and I don't know what it's called, so I can't even search for it properly.
Could anyone briefly describe how this system would work from start to finish, and what my first concern should be? I'm mostly interested in the JSP hosting (would Tomcat be a better choice, or is Geronimo much better in my case?) and in connecting the JSPs to the database.
You need several components and layers for an application like that, so the first thing to do is select your technology stack, so that you don't reinvent the wheel and can adopt the best practices your frameworks include. My choice is the Spring Framework.
Your JSPs represent the View Layer of your app. You can use JavaScript/AJAX to embellish your forms and send data to your server.
The data the user enters in the form is received and processed by the Controller Layer. Spring MVC has a neat collection of controllers for you to use. Once the data is ready, you pass it to the Service Layer to execute business logic.
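
As a rough sketch of that Controller Layer; Entry and EntryService are hypothetical names standing in for your form-backing bean and your service interface, not Spring classes:

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Controller;
    import org.springframework.ui.Model;
    import org.springframework.web.bind.annotation.ModelAttribute;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;

    // Hypothetical controller: Entry is a simple form-backing bean and
    // EntryService is your service-layer interface, both defined elsewhere.
    @Controller
    @RequestMapping("/entry")
    public class EntryController {

        private final EntryService entryService;

        @Autowired
        public EntryController(EntryService entryService) {
            this.entryService = entryService;
        }

        // Render the JSP form; "entryForm" is resolved by a ViewResolver,
        // e.g. to /WEB-INF/jsp/entryForm.jsp
        @RequestMapping(method = RequestMethod.GET)
        public String showForm(Model model) {
            model.addAttribute("entry", new Entry());
            return "entryForm";
        }

        // Bind the submitted form fields to an Entry and pass it to the Service Layer
        @RequestMapping(method = RequestMethod.POST)
        public String submit(@ModelAttribute("entry") Entry entry) {
            entryService.save(entry);
            return "redirect:/entry";
        }
    }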
The Service Layer contains the business logic rules. The Spring Framework lets this layer be simple POJOs, and lets you apply transactional logic if you wish. It's highly probable that the Service Layer will need to persist some data in the database, so it invokes the DAO layer.
The classes in the DAO layer have the responsibility of storing data in the database. You can use several frameworks for this, and Spring supports many of them. Spring also ships its own JDBC support, with templates included.
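
A small sketch of such a DAO using Spring's JdbcTemplate; the entry table, its columns, and the Entry bean (the same hypothetical bean as in the controller sketch above) are assumed names:

    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.List;
    import javax.sql.DataSource;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.jdbc.core.RowMapper;
    import org.springframework.stereotype.Repository;

    // Hypothetical DAO; the DataSource (e.g. a MySQL connection pool) is
    // configured in the Spring context and injected here.
    @Repository
    public class JdbcEntryDao {

        private final JdbcTemplate jdbcTemplate;

        @Autowired
        public JdbcEntryDao(DataSource dataSource) {
            this.jdbcTemplate = new JdbcTemplate(dataSource);
        }

        // Persist one form submission
        public void save(Entry entry) {
            jdbcTemplate.update("INSERT INTO entry (title, body) VALUES (?, ?)",
                    entry.getTitle(), entry.getBody());
        }

        // Read everything back so it can be shown in a JSP later
        public List<Entry> findAll() {
            return jdbcTemplate.query("SELECT id, title, body FROM entry",
                    new RowMapper<Entry>() {
                        public Entry mapRow(ResultSet rs, int rowNum) throws SQLException {
                            Entry e = new Entry();
                            e.setId(rs.getLong("id"));
                            e.setTitle(rs.getString("title"));
                            e.setBody(rs.getString("body"));
                            return e;
                        }
                    });
        }
    }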
With that you can start your project. It should run with no problems in Tomcat, Geronimo, or any other Java EE container.

WCF: Best way to get data from Oracle 10g, MySQL and SQL Server 2008 databases?

I am designing a simple C# WCF service using ASP.NET 4.0 and hosted on IIS 7, which will be used by .NET and Java web applications and desktop applications to extract data stored in various databases (both local and remote). I am starting to learn how to use VS2010 and WCF after working for a few years with VS2005 and ASP.NET web services, so I'm somewhat of a noob to WCF but know a bit about web services and Visual Studio.
Does anyone have opinions on what the best approach would be in terms of project/class/file setup in Visual Studio 2010 to do this, seeing as how I want to maximize code re-use and minimize development time yet still have the ability to connect to the different databases? I have a WCF Service Application project for the service, and have generated a WCF Client to use for testing using svcutil.exe, but now I'm at the point where I need to start writing database access layer code (or "model" code for MVC if that's the design route I need to go down).
Any help appreciated, thanks!
Each of the databases will have its own set of nuances during integration. The first thing to do is design your model in a more object-oriented (OO) fashion rather than a relational-DB way. Once such a model is created, you need to implement a mapper layer/classes that map data from relational form to the OO format. Then, for each database, you need to write some data access code. The amount of code you write for data access will depend on the tools/technologies you use. You could look into Entity Framework, NHibernate, or other such ORMs to decrease the code required to access data. But keep in mind that these ORM mappers may require their own set of tweaks to work well with MySQL, Oracle, and SQL Server.

Using "LINQ to SQL" and "WCF Data Services" on top of Data Access Application Block

In my application, the entity database schema is created after application deployment, based on inputs captured from the end user using a tool. I cannot use Entity Framework in this situation, since modeling is not possible without a development environment (Visual Studio). The 'Code First' approach is also ruled out, since it would require code generation, which may lead to needless complexity.
Anyhow, I need a data access layer. I am therefore planning to introduce the Data Access Application Block (DAAB) into my solution. Using SQL Server Management Objects (SMO) I can carry out the DDL, and for data access I will use the DAAB.
Now here is my confusion: can I use LINQ to SQL on top of the DAAB? I want the DAAB to abstract away all the data-access-related complexities and then use LINQ to query. I also have a situation where I need to expose entity data through a RESTful interface (read: OData). Would I be able to expose my data using WCF Data Services via the DAAB?
LINQ is not supported in the DAAB. The DAAB is based on the good old DataSet and DataReader approach. This post has a much more detailed answer with respect to the role of the DAAB:
LINQ support in Enterprise Library Data Access Application Block
I don't yet fully understand your scenario. If your database schema is created after deployment, then how is your front-end application being developed against it (as there won't be any schema yet, if I get your question right)?
If the schema is created after deployment, what functionality is in your deployed application? Are you creating user interfaces on the fly using the dynamic schema that the end user modelled?
Please do correct my understanding; it would also be good if you could give some more info about your scenario.

EDA based SOA and NServiceBus: Why not just use SSIS packages?

I have been investigating NServiceBus. I liked the idea of the pub-sub model, and that the only real coupling between publisher and subscriber is the semantics of the message. Right now we use SQL replication to sync our data between the databases of different functional areas of our software. I hate this because the subscribers couple directly to our private schema, which makes it difficult to change on our side. I was thinking it would be great to replace this with NServiceBus publications, but the change seems a little drastic. What about just using something like SSIS? Can I accomplish the same decoupling using SSIS instead of NServiceBus?
SSIS is metadata-based and therefore still needs to understand the inner schema of all your data sources and sinks; if the underlying metadata for any source or sink changes, your packages will have to change. You are also connecting via MS technologies and are thus platform-coupled. Since you are moving whole sets of data around, it sounds like you may not be temporally coupled (i.e., system A having to wait for system B to respond), but it's hard to tell without knowing more about the systems. Lastly, SSIS must be aware of the physical locations of all the players in the exchange, so you are spatially coupled as well.
In my opinion, you can't get to the same place as NSB without building a lot of the NSB concepts into the packages yourself. That would require something like sending XML messages over SQL Server Service Broker, which has already been solved in NSB (see the NSB Contrib project on GitHub for the Service Broker transport).