Spring Batch integration with spring cloud data flow on local server to add spring admin capabilities - spring-batch-admin

I have a basic Spring Batch app that runs on embedded Apache Tomcat in Spring Boot, and I need to add Spring Batch Admin capabilities to it. As per the latest Spring docs, I need to use Spring Cloud Data Flow to do this (https://docs.spring.io/spring-batch-admin/). So now I need to use Spring Cloud Data Flow and integrate my Spring Batch app on a local server. I just want it to run on my local machine under Tomcat, without deploying it to any cloud environment like Cloud Foundry or OpenShift. Is it possible? I am sure it's possible. I would like some references/examples of this type of integration and a starter guide for integrating a Spring Batch app. Do I need to create tasks in Spring Cloud Data Flow to run my Spring Batch app? Any sample examples/pseudocode to guide me would make this easier.

As described in the migration-guide, you can use the "local" variant of Spring Cloud Data Flow (SCDF) as a replacement for Spring Batch Admin (SBA).
SCDF is a simple Spring Boot application that you can run as a standalone Java process, similar to how you're running your application today.
Also, as described in the migration-steps, you'd have to port your existing batch workload to the Spring Cloud Task model, and that should be a straightforward process - use this Spring Batch sample. For the most part, you'd copy/paste the business logic into a Spring Cloud Task application, and all the infrastructure including schemas, the repository, and the other batch goodies will continue to work. There are a few more complex implementations in task-app-starters, which can be used as a reference, too.
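As a rough sketch of that porting step (the class, job, and step names below are placeholders, not taken from any real project), a batch job wrapped as a Spring Cloud Task application could look like this:

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask              // registers the app as a Spring Cloud Task
@EnableBatchProcessing   // existing batch infrastructure keeps working
public class BatchTaskApplication {

    // Your existing job definition can usually be copied over unchanged.
    @Bean
    public Job job(JobBuilderFactory jobs, StepBuilderFactory steps) {
        Step step = steps.get("step1")
                .tasklet((contribution, chunkContext) -> {
                    // existing business logic goes here
                    return RepeatStatus.FINISHED;
                })
                .build();
        return jobs.get("job1").start(step).build();
    }

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}
```

Once packaged, such an app can be registered as a task in SCDF and launched from the dashboard.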
Lastly, you can use SCDF's dashboard for monitoring and management.

Related

Configuration of Axon Framework 4.2 without Spring Boot

I'm new to the Axon framework. I'm evaluating it with Axon Server.
Using Spring Boot is very convenient and worked right away.
The problem is, we are aiming at using Spring Boot processes tied together with non-Spring-Boot processes (Hybris Commerce from SAP).
My question is: what do I have to do to configure Axon framework 4.2 without Spring Boot? (Without Spring Boot, the AxonServerAutoConfiguration does not work.)
Thanks in advance!
Axon provides the Configuration API, which has been given its own module as of release 4.
This package contains the Configurer, AggregateConfigurer, EventProcessingConfigurer and SagaConfigurer which should give you all the handles to define any set up with buses, aggregates and query models.
The Reference Guide typically provides snippets on how to configure a given component, both through the Configuration API and Spring Boot as separate code tabs.
Additionally, it is expected that the AxonServerAutoConfiguration does not work without Spring Boot, as it is a Spring-Boot-dedicated solution for configuring your application.
Even without it though, Axon will (through a Service Loader mechanism) auto-load the ServerConnectorConfigurerModule once you've created a default Configurer. This should give you the required infrastructure components to configure an application using Axon Server without utilizing Spring Boot.
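A minimal sketch of that non-Spring-Boot setup could look like the following (the commented-out aggregate registration is a placeholder for your own types):

```java
import org.axonframework.config.Configuration;
import org.axonframework.config.Configurer;
import org.axonframework.config.DefaultConfigurer;

public class AxonWithoutBoot {

    public static void main(String[] args) {
        // Creating a default Configurer triggers the Service Loader mechanism,
        // which picks up the ServerConnectorConfigurerModule automatically
        // when axon-server-connector is on the classpath.
        Configurer configurer = DefaultConfigurer.defaultConfiguration();

        // Aggregates, event handlers, sagas, etc. are registered here, e.g.:
        // configurer.configureAggregate(GiftCard.class);

        Configuration configuration = configurer.buildConfiguration();
        configuration.start();

        // ... use configuration.commandGateway(), configuration.queryGateway(), ...

        configuration.shutdown();
    }
}
```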

How to wire/configure two pubsub gcp projects in one spring boot application with spring cloud?

Currently, we're working on a Spring Boot application in a GCP project which connects to a Pub/Sub endpoint in the same GCP project, but also to a Pub/Sub endpoint in another GCP project. I want to use the plain Spring Cloud GCP PubSub components, but with those I see no way to configure a second PubSub connection to a second GCP project. Also, with a single service account, the PubSubTemplate object gives me no apparent way to target a topic in a project other than the service account's own. Is the only way to implement/extend PubSubAdmin/PubSubTemplate myself, or is there a solution like the multiple connections and templates available for JPA databases?
Kind regards
Sven
Provided you have the correct privileges in the other project, you can publish/subscribe to topics in different projects by using a fully qualified topic/subscription name,
e.g. for publishing to a topic: pubSubTemplate.publish("projects/other-project/topics/the-topic", "payload").
This should be available with the latest spring-cloud-gcp version. Please see this issue for more details.
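Assuming a recent spring-cloud-gcp version and an injected PubSubTemplate, the cross-project usage might look like this sketch (the project, topic, and subscription names are placeholders):

```java
import org.springframework.cloud.gcp.pubsub.core.PubSubTemplate;
import org.springframework.stereotype.Component;

@Component
public class CrossProjectPubSub {

    private final PubSubTemplate pubSubTemplate;

    public CrossProjectPubSub(PubSubTemplate pubSubTemplate) {
        this.pubSubTemplate = pubSubTemplate;
    }

    public void publishToOtherProject() {
        // A fully qualified name targets a topic outside the default project
        pubSubTemplate.publish("projects/other-project/topics/the-topic", "payload");
    }

    public void subscribeToOtherProject() {
        // The same works for subscriptions
        pubSubTemplate.subscribe(
                "projects/other-project/subscriptions/the-subscription",
                message -> message.ack());
    }
}
```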

Deploy Spring Microservices on Openshift

I need to deploy a few microservices on OpenShift. These microservices are implemented using Spring Cloud. I use Spring Eureka for service discovery/load balancing and Spring Zuul for service routing.
From what I understand, Openshift already provides these features ( service discovery, load balancing, routing ) via Kubernetes.
With this being said, can I integrate Spring Eureka and Spring Zuul with the OpenShift platform?
Wouldn't it be redundant to add the Spring Eureka and Spring Zuul components to OpenShift, since the platform itself already provides these microservice features?
I was thinking of removing the service registry & routing Spring components and just implement routing using Openshift. However, that would leave the project heavily dependent on this cloud platform.
What would your approach be? Use the features provided by the OpenShift (routing, load balancing) or use the ones provided by the Spring framework and try to integrate them with the cloud platform?
Thanks
It would indeed be redundant.
Eureka can be replaced by Kubernetes services (they provide a load balancer and a domain name for a group of pods).
Zuul can be replaced by OpenShift Routes for exposing your services.
If you are using a platform, use the platform-provided functionality. Kubernetes services will be used on any Kubernetes-based platform, so I think that's the easy one to replace while keeping your coupling to the platform low. The routing can be more difficult: if Zuul is only used for routing, replace it with the OpenShift router; if Zuul also has other responsibilities, like security, it might be better to stick with Zuul.
I agree with #Jeff, and I want to add a point about using Spring Zuul as a gateway instead of OpenShift routes:
If you use Spring Zuul as a gateway, you provide access to your cluster from a single point. Otherwise, your clients must know the URLs exposed by the OpenShift routes, which increases the complexity of your code and makes it harder to maintain. A major benefit of using an API gateway is that it encapsulates the internal structure of the application.
The other point is about security. If you use OpenShift routes to expose your internal microservices, you actually open them directly to the public. In addition, if you want to use JWT or security tokens, you should choose Spring Zuul.
The API gateway can also provide each kind of client with a specific API. This reduces the number of round trips between the client and the application.
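To illustrate the security point, here is a minimal sketch of a Zuul pre-filter that rejects requests without a bearer token (the header check and rejection logic are illustrative assumptions, not a complete security setup):

```java
import javax.servlet.http.HttpServletRequest;

import com.netflix.zuul.ZuulFilter;
import com.netflix.zuul.context.RequestContext;
import org.springframework.http.HttpStatus;

public class AuthFilter extends ZuulFilter {

    @Override
    public String filterType() { return "pre"; }

    @Override
    public int filterOrder() { return 1; }

    @Override
    public boolean shouldFilter() { return true; }

    @Override
    public Object run() {
        RequestContext ctx = RequestContext.getCurrentContext();
        HttpServletRequest request = ctx.getRequest();
        String authHeader = request.getHeader("Authorization");
        if (authHeader == null || !authHeader.startsWith("Bearer ")) {
            // Reject unauthenticated requests before they reach any service
            ctx.setSendZuulResponse(false);
            ctx.setResponseStatusCode(HttpStatus.UNAUTHORIZED.value());
        }
        return null;
    }
}
```

Registering such a filter as a Spring bean applies the check to every routed request, which is hard to replicate with plain OpenShift routes.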

Spring Integration - FTP is not working with OpenShift

I have used Spring Integration FTP to copy files from an FTP server to the local machine and process them. It works fine locally with Spring Boot and Spring Integration, but after deploying the application to OpenShift, it does not look for the FTP files.
For example:
Locally, FileReadingMessageSource is getting invoked:
[task-scheduler-5] o.s.i.file.FileReadingMessageSource : Created message: [GenericMessage [payload=file-temp\abcd.xml, headers={id=30e5289a-aba6-19db-1d81-3036fca251b0, timestamp=1464675579294}]]
But it is not invoked in OpenShift. Is there any special configuration required for it to work?
As per the Linux team, OpenShift does not support FTP clients, so it is not possible to run FTP with Spring Integration in OpenShift.
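For reference, the kind of local setup described in the question can be sketched with the Spring Integration Java DSL as follows (host, credentials, and directories are placeholder assumptions, and a recent Spring Integration version is assumed):

```java
import java.io.File;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.ftp.dsl.Ftp;
import org.springframework.integration.ftp.session.DefaultFtpSessionFactory;

@Configuration
public class FtpFlowConfig {

    @Bean
    public DefaultFtpSessionFactory ftpSessionFactory() {
        DefaultFtpSessionFactory factory = new DefaultFtpSessionFactory();
        factory.setHost("ftp.example.com"); // placeholder host and credentials
        factory.setUsername("user");
        factory.setPassword("secret");
        return factory;
    }

    @Bean
    public IntegrationFlow ftpInboundFlow() {
        return IntegrationFlows
                .from(Ftp.inboundAdapter(ftpSessionFactory())
                                .localDirectory(new File("file-temp"))
                                .remoteDirectory("/remote"),
                        e -> e.poller(Pollers.fixedDelay(5000)))
                .handle(message -> {
                    // process the downloaded file (the payload is a local File)
                    System.out.println("Received: " + message.getPayload());
                })
                .get();
    }
}
```

Note that this only works where outbound FTP connections are allowed; per the answer above, that was not the case in the OpenShift environment in question.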

Working with azure(mysql database) in node javascript with cloud API

I'm working with an API app on Azure, deploying an API written in Node.js which stores data in MongoDB.
Make a new API for web & mobile apps from Azure portal.
Choose MongoDb by adding MongoLab module from Azure.
Create a table (collection) and populate it with few entries.
Prepare our Git repository from Azure portal and link it to our local Git on computers.
Decide which Node.js modules to use and set the dependencies.
Edit the configuration file for Node.js.
Make the main API file with the following functionalities:
Connection to the database.
Running the service.
Making CRUD operation services (CREATE, READ, DELETE...)
Testing our API on a browser.
Making an application using our API
My question: how can I adapt these steps to store data in a MySQL database (Azure)?
I cannot fully understand your requirement. Do you want to use a MySQL database to store the data of your Node.js API application? If not, please clarify your purpose.
To connect to MySQL from Node.js, you can use a third-party MySQL handler module.
For example:
node-mysql - A pure node.js JavaScript Client implementing the MySql protocol
or
Sequelize - A promise-based ORM for Node.js and io.js
Any further concern, please feel free to let me know.