What would it take to persist the Orion events in a database other than MySQL, like PostgreSQL, using Cygnus?
Thanks.
In order to persist Orion context data in PostgreSQL (or any other backend not yet covered by Cygnus) you will have to create your own sink, let's say, OrionPostgreSQLSink.
That being said... don't panic! It should be quite easy :) We on the Cygnus team have written these guidelines on creating new sinks as an external contributor. Basically, creating a new sink for Cygnus is as easy as extending the OrionSink class and implementing the persist() method. In addition, most of the code from OrionMySQLSink could be re-used; probably the most specific part will be the development of a PostgreSQLBackend convenience class.
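As a very rough sketch of how such a sink could look (the persist() parameters, the NotifyContextRequest type and the PostgreSQLBackend helper are assumptions here, so check the contributor guidelines for the exact contract):

```java
import java.util.Map;

// Illustrative sketch only: OrionSink and persist() come from the Cygnus
// "adding new sinks" guidelines; the method parameters and the
// PostgreSQLBackend helper are assumed names used for illustration.
public class OrionPostgreSQLSink extends OrionSink {

    // convenience class you would write yourself, mirroring the MySQL backend
    private final PostgreSQLBackend backend =
            new PostgreSQLBackend("localhost", 5432, "orion", "cygnus", "secret");

    @Override
    void persist(Map<String, String> eventHeaders, NotifyContextRequest notification)
            throws Exception {
        // walk the context elements of the Orion notification and turn each
        // one into rows in the target PostgreSQL table, much like
        // OrionMySQLSink does for MySQL
        for (NotifyContextRequest.ContextElementResponse response
                : notification.getContextResponses()) {
            backend.insertContextData(response.getContextElement());
        }
    }
}
```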
UPDATE:
Some time has passed, and a sink for PostgreSQL is now available in Cygnus! Please check this link for further details.
Recently, I started investigating the Activiti framework to integrate it into my current project.
In our project we use a Teradata database.
So I added the Activiti dependency and created a simple BPMN process for testing purposes.
I tested this process with an H2 in-memory database and it worked fine.
But when I configured the project to use Teradata, I got an exception on Spring Boot application startup.
Caused by: org.activiti.engine.ActivitiException: couldn't deduct database type from database product name 'Teradata'
I have googled and found only this topic:
https://hub.alfresco.com/t5/alfresco-process-services/does-activiti-support-teradata-database/m-p/17587#M287
It seems there is no way to integrate Activiti and Teradata for now.
So the reason I am posting this question is to make sure there really is no way to integrate these two technologies.
Any suggestions and ideas will be welcomed. Thank you.
Activiti is an open source product and can be "adapted" to almost any back-end transactional database. Transaction support is a must, as any BPMN engine is basically a state machine.
Database access is isolated in the entity layer, and database-specific SQL is managed by the iBatis ORM.
To integrate a specific database, you will need to modify the entity and ORM layers.
Certainly possible and actually not that much work (typically about 30 hours in my experience), but it is work you have to do and maintain yourself.
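As a rough illustration of where that work starts in a Spring Boot setup, the startup exception itself can be sidestepped by declaring the database type explicitly instead of letting Activiti deduce it from the JDBC metadata. This is only a sketch assuming the Activiti Spring Boot starter is on the classpath, and it does not remove the need to port the iBatis/MyBatis SQL mappings to Teradata:

```java
import org.activiti.spring.SpringProcessEngineConfiguration;
import org.activiti.spring.boot.ProcessEngineConfigurationConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ActivitiTeradataConfig {

    // Sketch: set the database type explicitly so Activiti does not try to
    // deduce it from the product name "Teradata". The value still has to be
    // one of the dialects Activiti ships SQL mappings for, so real Teradata
    // support additionally means adapting those mappings yourself.
    @Bean
    public ProcessEngineConfigurationConfigurer teradataConfigurer() {
        return (SpringProcessEngineConfiguration config) -> {
            config.setDatabaseType("postgres");      // assumption: nearest existing dialect, not a tested mapping
            config.setDatabaseSchemaUpdate("false"); // manage the Teradata schema yourself
        };
    }
}
```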
Couchbase CLI comes with the cbbackup and cbrestore commands, which I had hoped would allow me to take a database in a known state, back it up, and then restore it somewhere else where only a newly installed instance exists. Unfortunately, it appears that the target database must have all the right buckets set up already and (possibly) that the restore command requires each bucket name to be mentioned explicitly.
This wouldn't pose too much of a problem if I were hand-holding the process, but the goal is to start a new environment in a fully automated fashion, and I'm wondering if someone has a working method of achieving this goal.
If it were me, I'd use the CLI, the REST API or one of the Couchbase SDKs to write something that automates the creation of the target bucket, and then do the restore.
REST API:
http://docs.couchbase.com/couchbase-manual-2.5/cb-rest-api/#creating-and-editing-buckets
CLI:
http://docs.couchbase.com/couchbase-manual-2.5/cb-cli/#couchbase-cli-commands
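For example, a minimal sketch in Java against the 2.5 REST API linked above (host, credentials, bucket name and sizing are placeholders to adapt):

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class CreateBucket {
    public static void main(String[] args) throws Exception {
        // Sketch: create the target bucket through the REST API before
        // running cbrestore against the new cluster.
        URL url = new URL("http://target-cluster:8091/pools/default/buckets");
        String form = "name=mybucket&bucketType=couchbase&ramQuotaMB=512"
                + "&authType=sasl&saslPassword=&replicaNumber=1";

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        String auth = Base64.getEncoder()
                .encodeToString("Administrator:password".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        try (OutputStream out = conn.getOutputStream()) {
            out.write(form.getBytes(StandardCharsets.UTF_8));
        }
        // 202 Accepted means the bucket is being created; wait for it to be
        // ready before kicking off cbrestore.
        System.out.println("HTTP " + conn.getResponseCode());
    }
}
```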
Another option you might look into is to use these same kinds of methods to automate set up of uni-directional XDCR from the source to the target cluster.
I need to broadcast real-time database table changes to my web application (ASP.NET).
I know I can implement it using SignalR and SqlDependency. But the issue is that my database is MySQL. I have done a lot of research on how to implement it using MySQL, but didn't find anything useful.
Please guide me if there is something I can try with MySQL to achieve this?
Thanks.
You should avoid SqlDependency anyway. I always recommend publishing database changes on a service bus and letting SignalR pick them up. This way you can fire state changes anywhere in your domain and SignalR can pick them up.
Check out this library to abstract SignalR from your backend bus (I'm the author):
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/wiki
I have written a couple of methods to retrieve data from LDAP and put it into a MySQL database. I put those methods in a Listener so that they execute while deploying the WAR.
Now, this is a one-time action. That means I have to take all the data from LDAP, put it into the MySQL DB, and then work on the database tables. I have nothing further to do with the LDAP data.
Is there a better way to do this data migration? Since it is one-time work, once the database is created successfully there is no need for these methods.
Please suggest!
Thanks. :)
For migration exercises, look into the open source Pentaho Data Integration tool (PDI, commonly known as Kettle).
There is a slight learning curve, but it's easy to use, and you'll have it forever.
I was asked to do a book manager at university with Hibernate and MySQL. I have a simple question. If I choose to do a web application, Grails already uses Hibernate; GORM runs over Hibernate. So to use MySQL, I only need to configure the JDBC driver in Grails and that's it?
I mean, "for the project you must use Hibernate and MySQL" - these are the requirements. So can I do it that way?
Thanks in advance,
JM
Yes, of course you can.
You'll need to get the MySQL JDBC driver from this location.
Grails? When you're new to programming? Whose idea was this?
Personally, I think that taking on all these unknowns is risky for someone who's "new to programming." Do you know anything about SQL or JDBC? Have you ever written a web project before? This could be difficult.
I don't know how to be more specific. Download the JDBC JAR from the link I gave you.
I'd recommend that you start with a JDBC tutorial first. Hibernate is not for you - yet.
Hibernate is an object-relational mapping (ORM) tool. It's a technology that lets you associate information in relational database tables with objects in your middle tier. So if you have a PERSON table with columns id, first, and last, Hibernate will let you associate those data with the private data members in your Person Java class.
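For instance, a minimal sketch of that mapping using JPA annotations (one way to configure Hibernate; classic .hbm.xml mapping files express the same thing):

```java
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

// Sketch of the mapping described above: each column of PERSON is tied to a
// private field of the Person class.
@Entity
@Table(name = "PERSON")
public class Person {

    @Id
    @Column(name = "id")
    private Long id;

    @Column(name = "first")
    private String first;

    @Column(name = "last")
    private String last;

    // Hibernate requires a no-argument constructor; getters/setters omitted for brevity
    public Person() {
    }
}
```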
That sounds easy, but it gets complicated quickly. Your relational and object models might have one-to-many and many-to-many relationships; Hibernate can help with those. Lazy loading, caching, etc. can be managed by Hibernate.
But it comes at a cost. And it can be difficult if you aren't familiar with it.
If you have a deadline, I'd recommend creating a Java POJO interface for your persistence classes and doing the first implementation using JDBC. If you want to swap it out for Hibernate later on you can do it without affecting clients, but you'll have a chance of making progress without Hibernate that way.
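A minimal sketch of that idea (names are illustrative): clients depend only on the interface, so the plain-JDBC implementation can later be replaced by a Hibernate-backed one without touching them.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import javax.sql.DataSource;

// Minimal POJO for this sketch, same id/first/last shape as the PERSON table.
class Person {
    final long id;
    final String first;
    final String last;

    Person(long id, String first, String last) {
        this.id = id;
        this.first = first;
        this.last = last;
    }
}

// The persistence interface clients code against.
interface PersonDao {
    Person findById(long id) throws SQLException;
}

// First implementation: plain JDBC, no Hibernate involved.
class JdbcPersonDao implements PersonDao {
    private final DataSource dataSource;

    JdbcPersonDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    @Override
    public Person findById(long id) throws SQLException {
        String sql = "SELECT id, first, last FROM PERSON WHERE id = ?";
        try (Connection con = dataSource.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    return null;
                }
                return new Person(rs.getLong("id"), rs.getString("first"), rs.getString("last"));
            }
        }
    }
}
```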