Failed to connect to Jira API: failed to access class BasicHttpCache from class CachingHttpAsyncClient - apache-httpclient-4.x

I ran into an issue while connecting to the Jira API from existing code in an EAR application running on WildFly 19.0.1.Final:
1- the added code:
AsynchronousJiraRestClientFactory factory = new AsynchronousJiraRestClientFactory();
JiraRestClient client = factory.createWithBasicHttpAuthentication(URI.create("https://jiralink.com"), "username", "password");
2- the dependencies:
org.apache.httpcomponents:httpasyncclient-cache:4.1.4
org.apache.httpcomponents:httpclient-cache:4.5.5
3- the issue:
Caused by: java.lang.IllegalAccessError: failed to access class org.apache.http.impl.client.cache.BasicHttpCache from class org.apache.http.impl.client.cache.CachingHttpAsyncClient (org.apache.http.impl.client.cache.BasicHttpCache is in unnamed module of loader 'deployment.MegaPack.ear.httpclient-cache-4.5.5.jar' #51bff63e; org.apache.http.impl.client.cache.CachingHttpAsyncClient is in unnamed module of loader 'deployment.MegaPack.ear.httpasyncclient-cache-4.1.4.jar' #1746dc55)
    at org.apache.http.impl.client.cache.CachingHttpAsyncClient.<init>(CachingHttpAsyncClient.java:174) ~[httpasyncclient-cache-4.1.4.jar:4.1.4]
    at com.atlassian.httpclient.apache.httpcomponents.ApacheAsyncHttpClient.
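The error says the two cache jars were loaded by different deployment classloaders. BasicHttpCache is package-private, and package-private access only works when both classes share a package and a classloader, so CachingHttpAsyncClient (served by the httpasyncclient-cache jar's loader) cannot touch BasicHttpCache (served by the httpclient-cache jar's loader). A small diagnostic sketch, run inside the deployed application, to confirm the split (class names are taken from the stack trace):

public class CacheLoaderCheck {
    public static void main(String[] args) throws ClassNotFoundException {
        // If the two printed loaders differ, the IllegalAccessError above is expected.
        Class<?> cache = Class.forName("org.apache.http.impl.client.cache.BasicHttpCache");
        Class<?> client = Class.forName("org.apache.http.impl.client.cache.CachingHttpAsyncClient");
        System.out.println("BasicHttpCache loader:         " + cache.getClassLoader());
        System.out.println("CachingHttpAsyncClient loader: " + client.getClassLoader());
    }
}

Getting both jars served by a single loader (for example, placing them together in the EAR's lib/ directory rather than as top-level deployment entries) is one way to make the runtime packages match; the exact packaging fix depends on how the EAR is assembled.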

Related

Spring Boot Apache Kafka: ListenerExecutionFailedException Listener failed

Trying to read messages in the consumer, I get the following exception:
org.springframework.kafka.listener.ListenerExecutionFailedException: Listener failed; nested exception is org.springframework.kafka.support.serializer.DeserializationException: failed to deserialize; nested exception is org.springframework.messaging.converter.MessageConversionException: failed to resolve class name. Class not found
...
Caused by: org.springframework.messaging.converter.MessageConversionException: failed to resolve class name. Class not found
I've been looking at the deserialiser but I cannot seem to find the right way to resolve it.
I am working on an application split across different microservices.
Right now I am working on the logic to send emails to newly registered users. So for this scenario, I have two microservices; the user service and the email service.
User Management - Producer - application.yml
kafka:
  properties:
    security.protocol: 'PLAINTEXT'
  template:
    default-topic: user-creation
  producer:
    bootstrap-servers: ${kafka_bootstrap_servers:localhost:9092}
    value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
    key-serializer: org.apache.kafka.common.serialization.StringSerializer
Email service - Consumer - application.yml
kafka:
  properties:
    security.protocol: 'PLAINTEXT'
  consumer:
    bootstrap-servers: ${kafka_bootstrap_servers:localhost:9092}
    group-id: user-creation-consumer
    auto-offset-reset: earliest
    # Configures the Spring Kafka ErrorHandlingDeserializer that delegates to the 'real' deserializers
    key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
    value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
    properties:
      # Delegate deserializers
      spring.json.trusted.packages: '*'
      spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
      spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
The user management service uses a Kafka topic, user-creation, to notify the other microservices when a user is created.
private final KafkaTemplate<String, RegisteredUser> kafkaTemplate;

public void sendMessage(RegisteredUser registeredUser) {
    log.info("########## Sending message: {}", registeredUser.toString());
    this.kafkaTemplate.send(new ProducerRecord<>("user-creation", registeredUser));
}
The email service listens for updates on the user-creation topic:
@KafkaListener(topics = "user-creation")
@Service
@Slf4j
public class Consumer {

    @KafkaHandler
    public void listen(String string) {
        log.info("Received String message {}", string);
    }

    @KafkaHandler
    public void listen(ConsumerRecord<String, NewUser> record) {
        log.info("Received NewUser object {}", record.value());
    }

    @KafkaHandler(isDefault = true)
    public void consume(@Payload Object data) {
        log.info("received data='{}'", data);
    }
}
The two services are split to avoid tight coupling; hence the RegisteredUser DTO used in user creation is not accessible to the email service or the other services. I am using a very similar class with the same signature and fields, but deserialization still fails.
What is the best way to handle such a scenario? I am quite new to Kafka, so I am not sure how to progress - most tutorials online have the producer and consumer in the same code base, so the DTO can easily be shared.
The idea is that the RegisteredUser DTO has fields useful for other services, so it will include more data - I only need to read part of it.
TIA
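A minimal sketch of one way out (an assumption, not the only approach): have the consumer deserialize every record into its own local NewUser class and ignore the producer's __TypeId__ header, so the services never need to share the RegisteredUser DTO. Only the NewUser class and the broker/group settings come from the question; the bean wiring is illustrative.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;
import org.springframework.kafka.support.serializer.JsonDeserializer;

@Configuration
public class KafkaConsumerConfig {

    @Bean
    public ConsumerFactory<String, NewUser> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "user-creation-consumer");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        // Always produce a NewUser, ignoring the type headers written by the producer.
        JsonDeserializer<NewUser> value = new JsonDeserializer<>(NewUser.class);
        value.setUseTypeHeaders(false);
        value.addTrustedPackages("*");

        return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(),
                new ErrorHandlingDeserializer<>(value));
    }
}

The same effect should be reachable from configuration alone: set spring.json.value.default.type to the consumer-local class and spring.json.use.type.headers: 'false' under the delegate properties in the consumer YAML, or stop the headers at the source with spring.json.add.type.headers: 'false' on the producer. If the local class only declares the subset of fields it reads, annotate it with Jackson's @JsonIgnoreProperties(ignoreUnknown = true) so extra producer fields are skipped.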

Connect to MySQL from Google Dataflow

I am trying to connect to an AWS RDS MySQL instance from Google Dataflow. I created a Java program to build the pipeline. The job is created successfully, but the MySQL connection always fails with the following error:
java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:338)
at com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:308)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
at com.google.cloud.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
at com.google.cloud.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
at com.google.cloud.dataflow.worker.MapTaskExecutorFactory.create(MapTaskExecutorFactory.java:154)
at com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:308)
at com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:264)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:133)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:113)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:100)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: com.mysql.cj.jdbc.exceptions.CommunicationsException: Communications link failure
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at org.apache.beam.sdk.io.jdbc.JdbcIO$ReadFn$DoFnInvoker.invokeSetup(Unknown Source)
at com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:63)
at com.google.cloud.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:45)
at com.google.cloud.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:94)
at com.google.cloud.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:74)
at com.google.cloud.dataflow.worker.MapTaskExecutorFactory.createParDoOperation(MapTaskExecutorFactory.java:415)
at com.google.cloud.dataflow.worker.MapTaskExecutorFactory$3.typedApply(MapTaskExecutorFactory.java:326)
... 14 more
Caused by: com.mysql.cj.jdbc.exceptions.CommunicationsException:
Communications link failure
Caused by: java.net.SocketTimeoutException: connect timed out
The Java source code is below:
public class MySQLToBQ {
    public static void main(String[] args) throws Exception {
        DataflowPipelineOptions options = PipelineOptionsFactory.as(DataflowPipelineOptions.class);
        options.setProject("project_name");
        options.setStagingLocation("gs://staging");
        options.setTempLocation("gs://temp");
        options.setRunner(DataflowRunner.class);
        options.setJobName("MySQL-To-BQ-" + new SimpleDateFormat("yyyyMMdd-HHmmss").format(new Date()));
        System.out.println("Job Name " + options.getJobName());
        Pipeline p = Pipeline.create(options);

        DataSourceConfiguration mysqlConfig = JdbcIO.DataSourceConfiguration.create(
                "com.mysql.cj.jdbc.Driver", "jdbc:mysql://mysql_host:3306/mysql_database")
            .withUsername("user")
            .withPassword("password");

        p.apply("mysql_source", JdbcIO.<SourceRow>read()
                .withDataSourceConfiguration(mysqlConfig)
                .withQuery("query")
                .withCoder(SerializableCoder.of(SourceRow.class))
                .withRowMapper(new JdbcIO.RowMapper<SourceRow>() {
                    @Override
                    public SourceRow mapRow(ResultSet resultSet) throws Exception {
                        SourceRow datarow = new SourceRow();
                        ResultSetMetaData rsmd = resultSet.getMetaData();
                        for (int i = 1; i <= rsmd.getColumnCount(); i++) {
                            datarow.add(rsmd.getColumnName(i), resultSet.getString(i));
                        }
                        return datarow;
                    }
                }))
            .apply(table + "_transform", ParDo.of(new TransformToTableRow()))
            .apply(table + "_destination", BigQueryIO.writeTableRows()
                .to("table_name")
                .withSchema(getSchema())
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE));

        p.run();
    }
}
I was able to create a Compute Engine VM instance and successfully connect to the MySQL database from there.
On Dataflow you cannot whitelist a fixed IP to give the workers access to a SQL instance. I'm not sure about AWS RDS, but for Cloud SQL you should use the JDBC socket factory instead: https://cloud.google.com/sql/docs/mysql/connect-external-app#java
For Java, you can use public access together with this library: https://github.com/GoogleCloudPlatform/cloud-sql-jdbc-socket-factory.
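For reference, connecting through the socket factory looks roughly like this (a sketch for Cloud SQL only, not RDS; the instance and database names are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;

public class CloudSqlConnect {
    public static void main(String[] args) throws Exception {
        // The socket factory from the linked repository replaces IP whitelisting.
        String url = "jdbc:mysql:///mysql_database"
                + "?cloudSqlInstance=project:region:instance"
                + "&socketFactory=com.google.cloud.sql.mysql.SocketFactory";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("Connected: " + conn.isValid(5));
        }
    }
}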
Did you follow Connect to an Amazon Aurora DB cluster from outside a VPC to make the instance publicly accessible?
To connect to an Amazon Aurora DB cluster directly from outside the VPC, the instances in the cluster must meet the following requirements:
The DB instance must have a public IP address
The DB instance must be running in a publicly accessible subnet
As well as configuring your DB instance so that it can be connected to from outside a VPC, you can also secure the connections using Transport Layer Security (TLS), formerly known as Secure Sockets Layer (SSL).
You'll need to do that prior to being able to connect to it. Based on the code sample (jdbc:mysql://mysql_host:3306/mysql_database), it doesn't look like that's a public host.
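If the RDS instance is made publicly accessible as described above, only the JDBC URL in the pipeline's data source configuration needs to change; a sketch with a placeholder RDS endpoint and illustrative TLS/timeout parameters:

import org.apache.beam.sdk.io.jdbc.JdbcIO;

// Placeholder endpoint; useSSL and connectTimeout are illustrative, not required.
JdbcIO.DataSourceConfiguration mysqlConfig = JdbcIO.DataSourceConfiguration.create(
        "com.mysql.cj.jdbc.Driver",
        "jdbc:mysql://mydb.abc123.us-east-1.rds.amazonaws.com:3306/mysql_database"
            + "?useSSL=true&connectTimeout=10000")
    .withUsername("user")
    .withPassword("password");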

Kohana 'Model not found' in Amazon EC2

I have a Kohana 3.3 project working on a Linux server, and I moved it to an Amazon EC2 Linux instance.
It correctly loads all 'classes/Model/xxxxx.php' models, but fails when a model has no file definition (models that reside only in the database), showing a 'Model not found' error.
I also have some problems with model properties, showing 'The aaaaaa property does not exist in the Model_Bbbbbb class'.
I am aware of the PSR-0 implementation in Kohana >3.2.
Model not found error:
ErrorException [ Fatal Error ]: Class 'Model_role' not found
Property error:
Kohana_Exception [ 0 ]: The team property does not exist in the Model_User class
These are my implementations:
/application/classes/Model/user.php
class Model_User extends Model_Auth_User
{
    public function rules()
    {
        …
    }

    protected $_has_many = array(
        'team' => array('through' => 'user_teams'),
    );
}
Database tables:
roles
users
user_teams
Since this code was working on the previous Linux server, I have ruled out PSR-0 problems, and I think this is a misconfiguration of the Amazon Linux AMI.
Any idea?
Model not found error:
ErrorException [ Fatal Error ]: Class 'Model_role' not found
You need a PHP class file for each model. It can be very simple, such as class Model_Role extends ORM {}, but it needs to exist and be placed in the right directory for autoloading.
Property error:
Kohana_Exception [ 0 ]: The team property does not exist in the Model_User class
You probably need to update the $_table_columns property in your Model_User to include the team field. The other possibility is that your database doesn't have the team column in the users table.

Flyway migration, Unable to obtain Jdbc connection from DataSource

I am trying to use Flyway to create and manage a MySQL database. Here is the code I have so far.
FlywayMigration.java : class that applies the migration
public class FlywayMigration
{
    public FlywayMigration(DatabaseConfiguration configuration, Flyway flyway)
    {
        flyway.setDataSource(configuration.getDataSource());
        flyway.migrate();
    }

    public static void main(String[] args)
    {
        new FlywayMigration(new DatabaseConfiguration("database.properties"), new Flyway());
    }
}
DatabaseConfiguration.java : configuration class; it builds the data source passed to the Flyway.setDataSource method
public class DatabaseConfiguration
{
    private final Logger LOGGER = LoggerFactory.getLogger(this.getClass());
    private PropertiesUtil prop = null;

    public DatabaseConfiguration(String file)
    {
        prop = new PropertiesUtil(file);
    }

    public String getDataSourceClass()
    {
        return prop.getProperty("mysql.data.source.class");
    }

    public String getURL()
    {
        return prop.getProperty("mysql.url");
    }

    public String getHostName()
    {
        return prop.getProperty("mysql.host.name");
    }

    public String getDatabaseName()
    {
        return prop.getProperty("mysql.database.name");
    }

    public DataSource getDataSource()
    {
        MysqlDataSource dataSource = new MysqlDataSource();
        dataSource.setURL(getURL());
        dataSource.setUser(prop.getProperty("mysql.user.name"));
        dataSource.setPassword(null);
        return dataSource;
    }
}
database.properties is the file where I store the database information; the password can be null:
mysql.data.source.class=com.mysql.jdbc.Driver
mysql.url=jdbc:mysql://localhost:3306/vmrDB
mysql.host.name=localhost
mysql.database.name=vmrDB
mysql.user.name=root
And I get the following error in my trace:
Exception in thread "main" org.flywaydb.core.api.FlywayException: Unable to obtain Jdbc connection from DataSource
at org.flywaydb.core.internal.util.jdbc.JdbcUtils.openConnection(JdbcUtils.java:56)
at org.flywaydb.core.Flyway.execute(Flyway.java:1144)
at org.flywaydb.core.Flyway.migrate(Flyway.java:811)
at com.bt.sitb.vmr.migration.FlywayMigration.<init>(FlywayMigration.java:10)
at com.bt.sitb.vmr.migration.FlywayMigration.main(FlywayMigration.java:15)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
Can someone please tell me why the DataSource from MySQL is not connecting?
It looks like Flyway cannot connect to the database.
One reason for this is that the database in the database URL does not exist.
Question: does your database schema exist?
If your answer is no, then:
connect to jdbc:mysql://localhost:3306/mysql
also specify the schema to use for migration with flyway.setSchemas(configuration.getDatabaseName())
you also need to call flyway.init() before you can migrate your database (see the sketch below).
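A sketch of these steps, using the Flyway 3.x-era API that matches the stack trace above; the credentials and schema name are taken from database.properties:

import org.flywaydb.core.Flyway;

public class InitAndMigrate {
    public static void main(String[] args) {
        Flyway flyway = new Flyway();
        // Connect to the always-present 'mysql' schema instead of the missing vmrDB.
        flyway.setDataSource("jdbc:mysql://localhost:3306/mysql", "root", null);
        // Tell Flyway which schema to manage; it can create it if it does not exist.
        flyway.setSchemas("vmrDB");
        flyway.init();     // baseline first, per the step above (baseline() in later versions)
        flyway.migrate();
    }
}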
Ran into this same issue. Apparently, the problem was with my .properties file. The jar was using the one packaged with it and not the external one. So I moved my external properties file out of the resources folder and into the root directory of the jar and problem solved!
Hope this helps someone.
I had this same issue when working on a Java application in Debian 10 using Tomcat Application server.
I defined the connection strings for the database in the context.xml file; however, when I start the application and try to log in, I get the error:
Exception in thread "main" org.flywaydb.core.api.FlywayException: Unable to obtain Jdbc connection from DataSource
at org.flywaydb.core.internal.util.jdbc.JdbcUtils.openConnection(JdbcUtils.java:56)
at org.flywaydb.core.Flyway.execute(Flyway.java:1144)
Here's what I figured out:
I finally realized that the application was using internally defined database connection strings that were packaged with it, and they differed from my own connection strings defined in the context.xml file.
The solution for me was to either modify the packaged connection strings or use the same values in my context.xml file.
That's all.
I hope this helps.

main() throws exception from createEntityManager() when using EclipseLink

I have a simple program using JPA entities to write into a Derby DB (the entities were generated from existing DB tables). I am using Eclipse, and there is a working connection between the Derby client and the server via the Eclipse Data Source Explorer.
Here is the start of my main():
import javax.persistence.*;
import java.sql.Timestamp;
import java.util.*;

public class start {

    /**
     * @param args
     */
    private static final String PERSISTENCE_UNIT_NAME = "zodiac";
    private static EntityManagerFactory factory;

    public static void main(String[] args) {
        try {
            factory = Persistence.createEntityManagerFactory(PERSISTENCE_UNIT_NAME);
            EntityManager em = factory.createEntityManager();
            System.out.println("after factory gen");
When the line with createEntityManager() is executed, the following exception is thrown:
[EL Info]: 2012-03-07 22:46:21.892--ServerSession(253038357)--EclipseLink, version: Eclipse Persistence Services - 2.3.2.v20111125-r10461
[EL Severe]: 2012-03-07 22:46:22.064--ServerSession(253038357)--Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.3.2.v20111125-r10461): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: java.sql.SQLException: No suitable driver
Error Code: 0
Any idea what the problem is? Thanks.
If you're in Eclipse you need to add the driver to your project classpath. It sounds like you already have a data source, so you must have defined the driver library. All you need to do is "Add Library" on your Java Build Path, choose "Connectivity Driver Definition", and then pick the Derby driver from the drop-down list of available driver definitions.
FYI, there's a checkbox in the New JPA Project wizard where you can select "add driver to classpath" to do this when you create a new project.
Of course you can also add the derbyclient.jar to your classpath directly or define a user library that includes it.
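A quick classpath check (a sketch, not specific to EclipseLink): the "No suitable driver" SQLException above usually means no registered driver accepts the JDBC URL, most often because the Derby client jar is missing at runtime.

public class DriverCheck {
    public static void main(String[] args) {
        try {
            // The Derby network-client driver class shipped in derbyclient.jar.
            Class.forName("org.apache.derby.jdbc.ClientDriver");
            System.out.println("Derby client driver found on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("derbyclient.jar is missing from the runtime classpath");
        }
    }
}

If the class is found but the error persists, the JDBC URL in persistence.xml is the next thing to check.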
--Shaun