Message bridge between TIBCO EMS and Solace

We are migrating from TIBCO EMS to Solace, and in order to minimize disruption we are trying to bridge messages from TIBCO to Solace. Information from TIBCO Support is that messages cannot be routed to another JMS provider; however, I find this improbable. Does anyone have any ideas on how to connect the two systems?

Solace has recently launched an integration tool called HybridEdge, which is based on Apache Camel. Part of the Solace integration is a JMS component (Camel adapter). Using HybridEdge, you could easily set up a "route" (Camel flow) that consumes from TIBCO EMS via the Camel JMS component, using the EMS JMS ConnectionFactory, and bridges to Solace JMS via their component (which uses their JMS ConnectionFactory).
https://github.com/SolaceProducts/solace-hybridedge is where the Solace HybridEdge starter project is. It's an example of how you can get started with HybridEdge.
You would then use the Camel JMS component to connect to EMS. Info on the component is here: http://camel.apache.org/jms.html
Keep in mind that you are bridging two brokers through another piece of middleware (the Camel exchange)... this is bound to add latency and deliver fewer msgs/sec than you are used to with just EMS or Solace alone, especially with persistent messages that need to be acknowledged all the way back.
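A minimal sketch of such a bridge route using the Camel Java DSL. The host names, port, VPN, and queue names are placeholder assumptions, and the vendor connection-factory classes come from the TIBCO EMS and Solace JMS client jars, which you would need on the classpath:

```java
import org.apache.camel.CamelContext;
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.component.jms.JmsComponent;
import org.apache.camel.impl.DefaultCamelContext;

public class EmsToSolaceBridge {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();

        // TIBCO EMS connection factory (URL is a placeholder)
        com.tibco.tibjms.TibjmsConnectionFactory emsCf =
                new com.tibco.tibjms.TibjmsConnectionFactory("tcp://ems-host:7222");

        // Solace JMS connection factory (host/VPN are placeholders)
        com.solacesystems.jms.SolConnectionFactory solCf =
                com.solacesystems.jms.SolJmsUtility.createConnectionFactory();
        solCf.setHost("solace-host");
        solCf.setVPN("default");

        // Register both brokers as Camel JMS components
        context.addComponent("ems", JmsComponent.jmsComponent(emsCf));
        context.addComponent("solace", JmsComponent.jmsComponent(solCf));

        context.addRoutes(new RouteBuilder() {
            public void configure() {
                // Consume from the EMS queue and forward to the Solace queue
                from("ems:queue:orders.in")
                    .to("solace:queue:orders.in");
            }
        });
        context.start();
        Thread.sleep(Long.MAX_VALUE);  // keep the bridge running
    }
}
```

For persistent messages you would typically also make the EMS consumption transacted (or CLIENT_ACKNOWLEDGE) so a message is only acknowledged after Solace has accepted it.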

You could use 'forwarding channels' in Replay for Messaging: https://www.tradeweb.com/institutional/services/replayservice/
Replay for Messaging is a cross-provider messaging database and messaging bridge originally developed at CodeStreet, now owned by Tradeweb (note: I work there). The ReplayServer is written in C++, so it is low-latency, and you can quickly set up bridges between TIBCO EMS and Solace from the web UI, with optional message conversion if needed.
The Replay function can help with testing during the migration process.


Hexagonal Architecture / Ports & Adapters: Application Configuration with multiple driver adapters

I'm looking for some guidance or best practice on how to configure and structure an application that conforms to hexagonal architecture and supports multiple (driver) adapters simultaneously.
My API / application layer / ports represent the boundary of the application. I am now writing the driver adapters, with the goal that the application supports both a console/CLI adapter and a REST adapter in tandem.
Does anyone have any thoughts on approaches to the Main Component that configures and wires the application together?
1. A single main component that configures the full application, including all primary adapters, and loads the application configuration. In this case it would start both the REST services and the CLI console app.
2. A separate main component for each type of primary adapter, i.e. one for the REST application and one for the CLI/console application. My concern is that this will result in a lot of duplication when configuring the application within the boundary (the API services, repositories, etc.).
3. Follow the second approach, but extract the common configuration/wiring into a shared class.
If anyone has any examples they could share that would be interesting to see.
Cheers,
Steve
This is an interesting question.
From my point of view, trying to stay faithful to the pattern as explained by its author: although it would also be possible to run more than one driver adapter for one driver port, the "app as a whole" (let's call it the system, since the app is the hexagon) is an instance of a driver adapter running on each driver port of the hexagon, and a driven adapter implementing each driven port.
The configuration of the system is the adapter to select for each port. When you run the main component, you have to specify which adapter you want for every port.
That said, I studied two approaches in order to run the system:
(1) To have an additional component (name it main component, composition root, startup, init, or whatever you want) that instantiates the driven adapters and the hexagon, and finally instantiates the driver adapters and runs them. This way, the system architecture would look like an app container on the driver side and a plugin architecture on the driven side.
(2) To run each driver adapter on its own. The driver adapter starts the game, asking the hexagon for a driver port instance, and the hexagon in turn asks every driven port for a driven adapter instance.
So regarding your question about the main component: in your example, following my approach (1), I would have two hexagon instances running, but you could have just one; I don't see any problem with that.
I wrote a theoretical article about hexagonal architecture at https://softwarecampament.wordpress.com/portsadapters/ , and now I'm working on an article about how to implement hexagonal architecture, with a code example.
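To make the wiring concrete, here is a minimal, self-contained sketch of the shared-composition-root idea (option 3 in the question, approach (1) above): one hexagon instance built in a single place and handed to two driver adapters. All names (GreetingUseCase, CliAdapter, and so on) are invented for illustration:

```java
// Driver port: the API the hexagon exposes to driver adapters
interface GreetingUseCase {
    String greet(String name);
}

// Driven port: what the hexagon needs from the outside world
interface GreetingRepository {
    String templateFor(String name);
}

// The hexagon: application core, depends only on ports
class GreetingService implements GreetingUseCase {
    private final GreetingRepository repo;
    GreetingService(GreetingRepository repo) { this.repo = repo; }
    public String greet(String name) {
        return String.format(repo.templateFor(name), name);
    }
}

// Driven adapter plugged into the driven port
class InMemoryGreetingRepository implements GreetingRepository {
    public String templateFor(String name) { return "Hello, %s!"; }
}

// Two driver adapters, both talking to the same driver port
class CliAdapter {
    private final GreetingUseCase app;
    CliAdapter(GreetingUseCase app) { this.app = app; }
    String run(String name) { return app.greet(name); }
}
class RestAdapter {
    private final GreetingUseCase app;
    RestAdapter(GreetingUseCase app) { this.app = app; }
    String handleGet(String name) { return "200 " + app.greet(name); }
}

// Main component / composition root: the only place that knows the wiring
public class Main {
    static GreetingUseCase wireApplication() {
        return new GreetingService(new InMemoryGreetingRepository());
    }
    public static void main(String[] args) {
        GreetingUseCase app = wireApplication();  // single hexagon instance
        System.out.println(new CliAdapter(app).run("console"));
        System.out.println(new RestAdapter(app).handleGet("web"));
    }
}
```

The point is that wireApplication() is the only code that knows about driven adapters, so adding a third driver adapter duplicates no configuration.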

Connectors in Activiti BPM

We are currently evaluating Activiti as a possible open-source business process engine. One important requirement is easy integration of external systems (ECM, CRM, SharePoint, SAP, ...) within the processes. During research I found some articles claiming that there are no built-in connectors to other systems, and that the only way to interact with external systems is to invoke Java classes (see http://forums.activiti.org/content/how-create-connector and http://books.google.de/books?id=kMldSaOSgPYC&pg=PA100&lpg=PA100&dq=Bonita+Open+Solution+connectors&source=bl&ots=uwzz5OSten&sig=h2wf0q5J3xAxwN3AZ7Vondemnec&hl=de&sa=X&ei=uwBYUtehHoTqswacrYHgDQ&ved=0CIUBEOgBMAc4Cg#v=onepage&q=Bonita%20Open%20Solution%20connectors&f=false)
How complex is the integration of external systems in Activiti processes? Is it true that there are no built-in connectors? This would be a showstopper criterion for us.
best regards and thanks for your reply
Ben
Currently (as of version 5.14) Activiti has direct connections to:
- Alfresco, for document repositories
- Drools, for rule tasks
- LDAP, for groups and users
- Mule, for sending messages
- Camel, for sending/receiving messages
To integrate any other external system you need to use a Java service task, where you can use Java classes to delegate work from the workflow to your external system. These Java classes can read variables from your workflow, can direct the process to one of its outgoing flows, and can of course use any capability of your external system.
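As an illustration, such a service-task delegate might look like the following sketch. The class name, variable names, and the external-system call are invented; the BPMN reference in the comment uses the standard activiti:class attribute:

```java
import org.activiti.engine.delegate.DelegateExecution;
import org.activiti.engine.delegate.JavaDelegate;

// Referenced from the process definition, e.g.:
// <serviceTask id="syncCrm" activiti:class="com.example.CrmSyncDelegate"/>
public class CrmSyncDelegate implements JavaDelegate {
    @Override
    public void execute(DelegateExecution execution) {
        // Read a process variable set earlier in the workflow
        String customerId = (String) execution.getVariable("customerId");

        // Call your external system here (CRM, SAP, SharePoint, ...)
        // e.g. crmClient.sync(customerId);

        // Write a result back so an exclusive gateway can route on it
        execution.setVariable("crmStatus", "SYNCED");
    }
}
```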

Node.js SOA with JSON web-services - configuration

I am starting research on how to implement SOA (service-oriented architecture) in Node.js with JSON web services.
As a small sub-question: I need an approach/framework/system that provides a universal configuration center for all of the company's web services, so that we don't configure every application with the exact address of every other application, but instead just link to some central server to get that information.
(This should be a very well worked-out topic for XML-based services, so some terminology/approaches/etc. could/should be borrowed.)
Related to
RESTful JSON based SOA Registry
Service Oriented Architecture suggestions
UPDATE: This question is about web-service configuration & orchestration.
Go for an actively maintained framework with a lean architecture. There's one called Geddy and another called Restify. If in doubt, Express can also be used for building web services with JSON.
With any of these, you can then read the centrally stored config from the different app codebases.

interested in zeroMQ but client binding options prove limiting

This is related to an earlier question I had asked about what sort of middleware one can use for developing a client/server app.
Among the options suggested, I was intrigued by zeroMQ and its capabilities.
Since this morning, I have been researching using zeroMQ for my application. However, since my client is Adobe AIR/Flex, I see a steep learning curve in using zeroMQ, because there are no bindings available for ActionScript.
A Google search shows a popular protocol called STOMP that can be used for messaging in Flex-based applications, but there doesn't seem to be a STOMP adapter for zeroMQ either.
This leaves me with other alternatives such as RabbitMQ or ActiveMQ (since they both seem to have STOMP adapters) as possible middleware choices.
How hard/easy is it to develop a STOMP adapter for zeroMQ? I have found hardly any documentation on writing an adapter. Or is it worth writing an adapter for zeroMQ rather than focusing on, say, RabbitMQ, which supports STOMP?
Finally, what are other popular alternatives to STOMP on the Flex client side that could leverage zeroMQ on the middleware side?
Thanks
Dece
STOMP is probably going to be the only option from FLEX since it's by far the simplest MOM protocol available. Furthermore, since it's such a simple protocol, I'm surprised someone has not yet implemented a bridge from STOMP to zeromq.
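To illustrate just how simple the protocol is, here is a self-contained parser for a single STOMP frame (a command line, header lines, a blank line, then a NUL-terminated body); a STOMP-to-zeroMQ bridge would essentially parse frames like this off a TCP socket and republish the body on a zeromq socket. The class is an illustrative sketch, not part of any existing library, and it skips details like header escaping and content-length handling:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class StompFrame {
    public final String command;                          // e.g. SEND, SUBSCRIBE
    public final Map<String, String> headers = new LinkedHashMap<>();
    public final String body;

    private StompFrame(String command, Map<String, String> headers, String body) {
        this.command = command;
        this.headers.putAll(headers);
        this.body = body;
    }

    // A frame looks like: COMMAND\nheader:value\n...\n\nbody\0
    public static StompFrame parse(String raw) {
        String[] parts = raw.split("\n\n", 2);            // split head from body
        String[] headLines = parts[0].split("\n");
        Map<String, String> h = new LinkedHashMap<>();
        for (int i = 1; i < headLines.length; i++) {      // line 0 is the command
            int colon = headLines[i].indexOf(':');
            h.put(headLines[i].substring(0, colon), headLines[i].substring(colon + 1));
        }
        String body = parts.length > 1 ? parts[1].replace("\0", "") : "";
        return new StompFrame(headLines[0], h, body);
    }
}
```

A bridge built on this would map the frame's destination header to a zeromq topic or socket and forward the body bytes.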

How to get a mixed SSIS-J2EE system to communicate via messaging?

I'm currently developing an ETL solution which, for various reasons, includes SSIS components as well as J2EE services.
I need the various components to communicate asynchronously via message queues. However, the obvious constraint is that SSIS only integrates with MSMQ, while it makes sense to use JMS on the Java side.
I have considered the MSMQ/MQSeries Bridge (we use WebsphereMQ internally) but I feel this adds another layer of complexity to the solution.
I now wonder whether there is a simpler solution to achieve cross-platform messaging. The purpose of the messaging approach is really to implement transfer of control between components, rather than pass data. Each component, whether it's a SSIS package or a J2EE service, will read/write from the same underlying database so I wonder if I'm better off just implementing a polling mechanism on either side. Suggestions are welcome.
Christophe.
Depending on your needs, you could write your own bridge to move messages between MSMQ and WMQ. We have done this pretty easily using .NET and the IBM XMS libraries.
http://www-01.ibm.com/support/docview.wss?rs=171&uid=swg24011756&loc=en_US&cs=utf-8&lang=en
You could use an ESB instead of JMS and use the Web Service task in SSIS to connect to and from the ESB via SOAP.
If all you need in the J2EE->SSIS channel is the ability to start an SSIS package from J2EE, I think the simplest solution is to configure a SQL Server Agent job that runs the package, and then invoke the sp_start_job stored procedure from Java - that should be way easier, with fewer additional components involved.
I'm not sure what the best way is to go in the SSIS->J2EE direction.
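A sketch of the Java side of that approach, assuming the Microsoft JDBC driver is on the classpath. The connection string, credentials, and job name are placeholders; sp_start_job is the SQL Server Agent procedure that lives in the msdb database:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class StartSsisJob {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details; point at the msdb database
        String url = "jdbc:sqlserver://db-host:1433;databaseName=msdb";
        try (Connection con = DriverManager.getConnection(url, "user", "password");
             CallableStatement cs = con.prepareCall("{call dbo.sp_start_job(?)}")) {
            // Name of the Agent job that wraps the SSIS package
            cs.setString(1, "Run ETL Package");
            cs.execute();  // returns immediately; the job runs asynchronously
        }
    }
}
```

Note that sp_start_job returns as soon as the job is queued, so if the J2EE side needs to know when the package finishes, it would still have to poll (e.g. sp_help_job or a status table).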