I am using Spring Integration FTP to copy files from an FTP server to a local directory and then process them. It works fine locally with Spring Boot and Spring Integration, but after deploying the application to OpenShift it no longer picks up the FTP files.
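The flow is wired up roughly like the sketch below (Java DSL; the host, credentials, directories and polling interval are placeholders, not the real values):

import java.io.File;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.dsl.Pollers;
import org.springframework.integration.ftp.dsl.Ftp;
import org.springframework.integration.ftp.session.DefaultFtpSessionFactory;

@Configuration
public class FtpInboundConfig {

    @Bean
    public DefaultFtpSessionFactory ftpSessionFactory() {
        DefaultFtpSessionFactory factory = new DefaultFtpSessionFactory();
        factory.setHost("ftp.example.com"); // placeholder host
        factory.setPort(21);
        factory.setUsername("user");        // placeholder credentials
        factory.setPassword("secret");
        return factory;
    }

    @Bean
    public IntegrationFlow ftpInboundFlow() {
        return IntegrationFlows
                .from(Ftp.inboundAdapter(ftpSessionFactory())
                                .remoteDirectory("/remote")             // placeholder remote directory
                                .localDirectory(new File("file-temp"))  // local download directory
                                .autoCreateLocalDirectory(true),
                        e -> e.poller(Pollers.fixedDelay(5000)))        // poll every 5 seconds
                .handle(message -> System.out.println("Processing " + message.getPayload()))
                .get();
    }
}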
For example:
Locally, FileReadingMessageSource is invoked:
[task-scheduler-5] o.s.i.file.FileReadingMessageSource : Created message: [GenericMessage [payload=file-temp\abcd.xml, headers={id=30e5289a-aba6-19db-1d81-3036fca251b0, timestamp=1464675579294}]]
But it is not invoked in OpenShift. Is there any special configuration required for it to work?
As per the Linux team, OpenShift does not support an FTP client, so it is not possible to run FTP with Spring Integration in OpenShift.
I have a Spring Boot Admin server deployed in OpenShift with the help of the fabric8 Maven plugin, and I also have several applications deployed in OpenShift.
The Spring Boot Admin server (SBAS) uses Spring Cloud Kubernetes discovery to discover the services (applications) registered/running in the namespace/cluster, i.e. automatic client discovery.
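The admin server itself is set up along the lines of the sketch below (the class name is illustrative; the annotations are the standard SBA / Spring Cloud Kubernetes ones):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.client.discovery.EnableDiscoveryClient;

import de.codecentric.boot.admin.server.config.EnableAdminServer;

@SpringBootApplication
@EnableAdminServer        // turns this Boot app into the Spring Boot Admin server
@EnableDiscoveryClient    // lets Spring Cloud Kubernetes discover the apps in the namespace/cluster
public class AdminServerApplication {   // illustrative class name

    public static void main(String[] args) {
        SpringApplication.run(AdminServerApplication.class, args);
    }
}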
SBAS discovers the applications as expected, which is fine, but some applications registered in SBAS are health-checked over HTTP and some over HTTPS, as shown below.
I have no idea why SBAS uses HTTP for some apps and HTTPS for others to check their health.
Since SBAS uses HTTPS and port 8443 for those apps, it shows them as offline, even though the applications are exposed only on HTTP port 8080.
I have compared the applications' code and OpenShift configurations, but I don't see any difference, and I don't know how to fix this issue.
I am new to all of the above concepts; could someone help me?
I didn't find a solution for this issue, but I did find a workaround that helped me.
Since I am using only port 8080, I deleted the other ports, such as 8443 and 8778, via the OpenShift YAML, as shown below. But if you have to expose more ports, this won't help.
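A trimmed-down sketch of the relevant part of the deployment YAML after the change (names and surrounding structure are illustrative, not my exact file):

spec:
  template:
    spec:
      containers:
        - name: my-app              # placeholder container name
          ports:
            - containerPort: 8080   # only 8080 is kept
              protocol: TCP
          # the containerPort entries for 8443 and 8778 were deleted here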
I'm deploying WSO2 API Manager 2.6.0 with an external MySQL database, and I'm trying to have my APIs persist when I change my deployment.
Currently I have two deployments using the same external database, one local and the other hosted on an AWS EKS cluster. When I create an API on my local deployment, I can only view it on my AWS deployment if I'm logged in to the Store, and vice versa for my localhost deployment.
The expected and desired behaviour is that all APIs created on either deployment should be displayed in the Store whether I'm logged in or not. Are there any configurations I can change to make this happen?
Here is the doc I used to configure the external database: https://docs.wso2.com/display/AM260/Installing+and+Configuring+the+Databases
I am using Red Hat OpenShift and Spring Cloud / Netflix OSS. I have developed the applications and can get everything working locally on my machine. However, when I deploy to OpenShift I cannot get the applications to register with the Eureka server. I can get Eureka running, but when I deploy the Eureka client applications they do not register with it.
I have updated the properties to point to the Eureka server, but the applications do not register. I am running Eureka on port 8761. The properties of my application (the Zuul application) are as follows:
spring.application.name=netflix-zuul-api-gateway-server
server.port=8765
eureka.instance.hostname=eureka-service.currency-conversion-service.svc
eureka.client.service-url.default-zone=http://${eureka.instance.hostname}:8761/eureka/
eureka.instance.preferIpAddress=false
I have pointed the application at the service hostname that was allocated to the Eureka server in OpenShift, and I have added the port to the Eureka client URL, as can be seen above. I am not sure whether this is correct or whether I should be using the route instead. I am new to OpenShift, and there does not seem to be much information on the net about Spring Boot and Spring Cloud on OpenShift.
Can anybody help, please? Thanks.
I have a basic Spring Batch app that runs on embedded Apache Tomcat in Spring Boot. I need to add Spring Batch Admin capabilities to it. As per the latest Spring docs, I need to use Spring Cloud Data Flow to do this (https://docs.spring.io/spring-batch-admin/). So now I need to use Spring Cloud Data Flow and integrate my Spring Batch app with a local server. I just want it to run on my local machine under Tomcat, without deploying it to any cloud environment like Cloud Foundry or OpenShift. Is that possible? I am sure it is. I would like some references/examples of this type of integration and a starter guide for integrating a Spring Batch app. Do I need to create tasks in Spring Cloud Data Flow to run my Spring Batch app? Some sample examples or pseudo-code to guide me would make this easy.
As described in the migration-guide, you can use the "local" variant of Spring Cloud Data Flow (SCDF) as a replacement for Spring Batch Admin (SBA).
SCDF is a simple Spring Boot application that you can run as a standalone Java process, similar to how you're running your application today.
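For example (the version number here is only illustrative), you can download the local-server uber-jar and start it directly:

java -jar spring-cloud-dataflow-server-local-1.7.4.RELEASE.jar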
Also, as described in the migration-steps, you'd have to port your existing batch workload to the Spring Cloud Task model, and that should be a straightforward process; use this Spring Batch sample as a guide. For the most part, you'd copy/paste the business logic into a Spring Cloud Task application, and all the infrastructure, including the schemas, the repository, and the other batch goodies, will continue to work. There are a few more complex implementations in the task-app-starters that can be used as a reference, too.
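As a rough illustration of that porting step (the class name below is made up, not from your app), the task wrapper can be as small as:

import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

@SpringBootApplication
@EnableTask               // records each run in the task repository so SCDF can track it
@EnableBatchProcessing    // your existing Spring Batch job configuration stays as-is
public class BatchTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}

Your existing job and step beans can then move over unchanged; the task layer only adds the bookkeeping that SCDF needs.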
Lastly, you can use SCDF's dashboard for monitoring and management.
Can I deploy Mule on any application server? If so, how do we deploy the Mule examples?
I have configured my Eclipse to run JBoss, but the Mule flows don't get deployed to the JBoss server; the synchronisation gives an error (null pointer).
When I run it as a Mule Application it runs fine, but it starts the Mule server.
How is Mule deployed in production? Do we need the Mule server running in production alongside the application server?
Can we package everything into one artifact (ESB + application) and deploy it in the application server?
You have a choice regarding production deployments of Mule:
Use Mule standalone as your first choice: it comes packed with all the modules/transports, is production grade (control scripts...), and supports hot application reloads.
If your deployment environment forces you to deploy on application servers (like JBoss), package your Mule application as a web application (WAR).
If you decide to package your Mule application as a WAR, I strongly suggest you use Maven to do so, as each Mule module/transport requires numerous dependencies. Dealing with this by hand would be insane.
Also be sure to use servlet inbound endpoints instead of HTTP inbound endpoints; otherwise Mule will open another HTTP server inside the web container. You want Mule to use the servlet container for its inbound HTTP requests.
Yes, you can. You might want to take a look at the Embedding Mule in a Java Application or Webapp manual page, and at Deploying Mule as a Service to Tomcat.