CAS cannot find authentication handler that supports UsernamePasswordCredential - cas

I have a custom handler like this:
public class DatabaseAuthenticationHandler extends AbstractJdbcUsernamePasswordAuthenticationHandler {
    @Override
    protected AuthenticationHandlerExecutionResult authenticateUsernamePasswordInternal(
            UsernamePasswordCredential credential, String originalPassword) throws GeneralSecurityException, PreventedException {
        final String username = credential.getUsername();
        logger.debug("***Username:" + username);
        logger.debug("***Password:" + credential.getPassword());
        return createHandlerResult(credential, new SimplePrincipal(), null);
    }

    @Override
    public boolean supports(final Credential credential) {
        return true;
    }
}
To me, this should always log a user in no matter what. But I see in the logs this:
ERROR [org.apereo.cas.authentication.PolicyBasedAuthenticationManager]
- <Authentication has failed. Credentials may be incorrect or CAS cannot find authentication handler that supports
[UsernamePasswordCredential(username=sadf, source=MyJDBCAuthenticationManager)] of type [UsernamePasswordCredential].
Examine the configuration to ensure a method of authentication is defined and analyze CAS logs at DEBUG level to trace the authentication event.
which makes no sense to me, as I can see in the logs that CAS is calling the authenticateUsernamePasswordInternal method. Obviously this handler supports, well, everything.
Why can't I log in?

I think you're best off using principalFactory.createPrincipal to create the principal rather than returning a new SimplePrincipal().
In your AuthenticationEventExecutionPlanConfigurer & DatabaseAuthenticationHandler, add the following:
AuthenticationEventExecutionPlanConfigurer.java
@Autowired
@Qualifier("principalFactory")
private PrincipalFactory principalFactory;

@Bean
public DatabaseAuthenticationHandler databaseAuthenticationHandler() {
    return new DatabaseAuthenticationHandler(principalFactory);
}
DatabaseAuthenticationHandler
public class DatabaseAuthenticationHandler extends AbstractJdbcUsernamePasswordAuthenticationHandler {
    private final PrincipalFactory principalFactory;

    public DatabaseAuthenticationHandler(PrincipalFactory principalFactory) {
        this.principalFactory = principalFactory;
    }

    @Override
    protected AuthenticationHandlerExecutionResult authenticateUsernamePasswordInternal(
            UsernamePasswordCredential credential, String originalPassword) throws GeneralSecurityException, PreventedException {
        final String username = credential.getUsername();
        logger.debug("***Username:" + username);
        logger.debug("***Password:" + credential.getPassword());
        /////// below here's the change /////////
        return createHandlerResult(credential, this.principalFactory.createPrincipal(username), null);
    }

    @Override
    public boolean supports(final Credential credential) {
        return true;
    }
}
See if the above works, thanks.

The root cause of this problem is that you pass a null parameter to the createHandlerResult method; change it to new ArrayList<>(). I also ran into this problem (my CAS version is 5.3.9). I tried the solution given by Ng Sek Long, but it didn't work, so I set out to solve it myself. I searched for the error message in the CAS code and found it in the PolicyBasedAuthenticationManager class:
try {
    PrincipalResolver resolver = this.getPrincipalResolverLinkedToHandlerIfAny(handler, transaction);
    LOGGER.debug("Attempting authentication of [{}] using [{}]", credential.getId(), handler.getName());
    this.authenticateAndResolvePrincipal(builder, credential, resolver, handler);
    AuthenticationCredentialsThreadLocalBinder.bindInProgress(builder.build());
    Pair<Boolean, Set<Throwable>> failures = this.evaluateAuthenticationPolicies(builder.build(), transaction);
    proceedWithNextHandler = !(Boolean) failures.getKey();
} catch (Exception var15) {
    LOGGER.error("Authentication has failed. Credentials may be incorrect or CAS cannot find authentication handler that supports [{}] of type [{}]. Examine the configuration to ensure a method of authentication is defined and analyze CAS logs at DEBUG level to trace the authentication event.", credential, credential.getClass().getSimpleName());
    this.handleAuthenticationException(var15, handler.getName(), builder);
    proceedWithNextHandler = true;
}
In the above code snippet, the authenticateAndResolvePrincipal method declares two kinds of exception. Looking at this method, I found a line of code that can throw both of them:
AuthenticationHandlerExecutionResult result = handler.authenticate(credential);
The key code that leads to this problem is in the DefaultAuthenticationHandlerExecutionResult class:
public DefaultAuthenticationHandlerExecutionResult(final AuthenticationHandler source, final CredentialMetaData metaData, final Principal p, @NonNull final List<MessageDescriptor> warnings) {
    this(StringUtils.isBlank(source.getName()) ? source.getClass().getSimpleName() : source.getName(), metaData, p, warnings);
    if (warnings == null) {
        throw new NullPointerException("warnings is marked @NonNull but is null");
    }
}
So, if you call createHandlerResult(credential, new SimplePrincipal(), null), a NullPointerException is thrown at runtime. It is caught by the catch (Exception var15) block, which logs the error message you see.
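A minimal sketch of the corrected method, combining both answers (it assumes the handler has the principalFactory field from the first answer and that java.util.ArrayList is imported):

@Override
protected AuthenticationHandlerExecutionResult authenticateUsernamePasswordInternal(
        final UsernamePasswordCredential credential, final String originalPassword)
        throws GeneralSecurityException, PreventedException {
    // Pass an empty, non-null warnings list so the @NonNull check in
    // DefaultAuthenticationHandlerExecutionResult no longer throws a NullPointerException.
    return createHandlerResult(credential,
            this.principalFactory.createPrincipal(credential.getUsername()),
            new ArrayList<>());
}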

Related

CAS difference between Candidate/Registered and Sorted and registered Authentication Handler

I am having a blocker situation with CAS 6.0.x and I can't get past it. I am unable to log in with a UsernamePasswordCredential. I have even removed all actual checks and simply return a result.
Here is the code:
public class MyDatabaseAuthenticationHandler extends AbstractJdbcUsernamePasswordAuthenticationHandler {
    public MyDatabaseAuthenticationHandler(String name, ServicesManager servicesManager, PrincipalFactory principalFactory, Integer order, DataSource dataSource) {
        super(name, servicesManager, principalFactory, order, dataSource);
    }

    @Override
    protected AuthenticationHandlerExecutionResult authenticateUsernamePasswordInternal(UsernamePasswordCredential credential, String originalPassword) throws GeneralSecurityException, PreventedException {
        return createHandlerResult(credential, this.principalFactory.createPrincipal(credential.getUsername()), null);
    }

    @Override
    public boolean supports(final Credential credential) {
        return true;
    }
}
Here is my config:
#Configuration("My6CasConfiguration")
public class My6CasConfiguration implements AuthenticationEventExecutionPlanConfigurer {
#Autowired
#Qualifier("principalFactory")
private PrincipalFactory principalFactory;
#Bean
public AuthenticationHandler getMyJdbcAuthenticationHandler() {
return new MyDatabaseAuthenticationHandler("MYJDBCAuthenticationManager",
servicesManager,
principalFactory,
0,
customDataSource());
}
#Override
public void configureAuthenticationExecutionPlan(AuthenticationEventExecutionPlan plan) {
plan.registerAuthenticationHandler(getMyJdbcAuthenticationHandler());
}
}
This is what I am getting in the logs:
2019-06-02 11:07:38,544 DEBUG [org.apereo.cas.authentication.DefaultAuthenticationEventExecutionPlan] - <Candidate/Registered authentication handlers for this transaction are [[org.apereo.cas.authentication.handler.support.HttpBasedServiceCredentialsAuthenticationHandler@277fd34b, com.xxx.cas.handler.MyDatabaseAuthenticationHandler@58b0ac93, org.apereo.cas.adaptors.x509.authentication.handler.support.X509CredentialsAuthenticationHandler@41078c16]]>
2019-06-02 11:07:38,544 DEBUG [org.apereo.cas.authentication.DefaultAuthenticationEventExecutionPlan] - <Sorted and registered authentication handler resolvers for this transaction are [[org.apereo.cas.authentication.handler.ByCredentialSourceAuthenticationHandlerResolver@663cfaa1, org.apereo.cas.authentication.handler.RegisteredServiceAuthenticationHandlerResolver@272f62cc]]>
2019-06-02 11:07:38,545 DEBUG [org.apereo.cas.authentication.DefaultAuthenticationEventExecutionPlan] - <Authentication handler resolvers for this transaction are [[org.apereo.cas.authentication.handler.ByCredentialSourceAuthenticationHandlerResolver@663cfaa1, org.apereo.cas.authentication.handler.RegisteredServiceAuthenticationHandlerResolver@272f62cc]]>
2019-06-02 11:07:38,549 ERROR [org.apereo.cas.authentication.PolicyBasedAuthenticationManager] - <Authentication has failed. Credentials may be incorrect or CAS cannot find authentication handler that supports [UsernamePasswordCredential(username=asdf, source=MYJDBCAuthenticationManager)] of type [UsernamePasswordCredential]. Examine the configuration to ensure a method of authentication is defined and analyze CAS logs at DEBUG level to trace the authentication event.>
2019-06-02 11:07:38,550 ERROR [org.apereo.cas.authentication.PolicyBasedAuthenticationManager] - <[MYJDBCAuthenticationManager]: [warnings is marked @NonNull but is null]>
What am I doing wrong that this is not working? And what is the difference between Candidate/Registered, Sorted/Registered, and handler resolvers?
The fact that my custom class only shows up in the first makes me think I have configured something wrong.
Any ideas?

Spring AMQP RPC consumer and throw exception

I have a consumer (@RabbitListener) in RPC mode and I would like to know whether it is possible to throw an exception that can be handled by the publisher.
To make my explanation clearer, the case is as follows:
The publisher sends a message in RPC mode.
The consumer receives the message and checks its validity; if the message cannot be processed, for example because of missing parameters, I would like to throw an exception. The exception can be a specific business exception or a particular AmqpException, but I want the publisher to be able to handle it, provided it does not time out first.
I tried with AmqpRejectAndDontRequeueException, but my publisher does not receive the exception, just an empty response.
Is this possible, or is it maybe not good practice to implement it like that?
EDIT 1:
After @GaryRussell's response, here is the resolution of my question:
For the @RabbitListener I created an error handler:
@Configuration
public class RabbitErrorHandler implements RabbitListenerErrorHandler {
    @Override
    public Object handleError(Message message, org.springframework.messaging.Message<?> message1, ListenerExecutionFailedException e) {
        throw e;
    }
}
Define the bean in a configuration class:
@Configuration
public class RabbitConfig extends RabbitConfiguration {
    @Bean
    public RabbitTemplate getRabbitTemplate() {
        Message.addWhiteListPatterns(RabbitConstants.CLASSES_TO_SEND_OVER_RABBITMQ);
        return new RabbitTemplate(this.connectionFactory());
    }

    /**
     * Define the RabbitErrorHandler bean.
     * @return the initialized RabbitErrorHandler bean
     */
    @Bean
    public RabbitErrorHandler rabbitErrorHandler() {
        return new RabbitErrorHandler();
    }
}
Create the @RabbitListener with parameters, where rabbitErrorHandler is the bean I defined previously:
@Override
@RabbitListener(queues = "${rabbit.queue}",
        errorHandler = "rabbitErrorHandler",
        returnExceptions = "true")
public ReturnObject receiveMessage(Message message) {
For the RabbitTemplate I set this attribute:
rabbitTemplate.setMessageConverter(new RemoteInvocationAwareMessageConverterAdapter());
When the message is processed by the consumer but it throws an error, I obtain a RemoteInvocationResult which contains the original exception in e.getCause().getCause().
See the returnExceptions property on @RabbitListener (since 2.0). Docs here.
The returnExceptions attribute, when true will cause exceptions to be returned to the sender. The exception is wrapped in a RemoteInvocationResult object.
On the sender side, there is an available RemoteInvocationAwareMessageConverterAdapter which, if configured into the RabbitTemplate, will re-throw the server-side exception, wrapped in an AmqpRemoteException. The stack trace of the server exception will be synthesized by merging the server and client stack traces.
Important
This mechanism will generally only work with the default SimpleMessageConverter, which uses Java serialization; exceptions are generally not "Jackson-friendly" so can’t be serialized to JSON. If you are using JSON, consider using an errorHandler to return some other Jackson-friendly Error object when an exception is thrown.
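A minimal sketch of that sender-side setup, assuming an existing ConnectionFactory and made-up exchange and routing key names:

import org.springframework.amqp.AmqpRemoteException;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.amqp.support.converter.RemoteInvocationAwareMessageConverterAdapter;

public class RpcClient {

    private final RabbitTemplate template;

    public RpcClient(ConnectionFactory connectionFactory) {
        this.template = new RabbitTemplate(connectionFactory);
        // Re-throws the server-side exception, wrapped in an AmqpRemoteException
        this.template.setMessageConverter(new RemoteInvocationAwareMessageConverterAdapter());
    }

    public Object call(Object request) {
        try {
            // Exchange and routing key names are placeholders
            return template.convertSendAndReceive("some.exchange", "some.routing.key", request);
        } catch (AmqpRemoteException e) {
            // e.getCause() is the exception thrown by the @RabbitListener on the server side
            throw new IllegalStateException("RPC call failed remotely", e.getCause());
        }
    }
}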
What worked for me was:
On the "serving" side:
Service
#RabbitListener(id = "test1", containerFactory ="BEAN CONTAINER FACTORY",
queues = "TEST QUEUE", returnExceptions = "true")
DataList getData() {
// this exception will be transformed by rabbit error handler to a RemoteInvocationResult
throw new IllegalStateException("mon expecion");
//return dataHelper.loadAllData();
}
On "requesting" side :
Service
public void fetchData() throws AmqpRemoteException {
    var response = (DataList) amqpTemplate.convertSendAndReceive("TEST EXCHANGE", "ROUTING NAME", new Object());
    Optional.ofNullable(response)
            .ifPresentOrElse(this::setDataContent, this::handleNoData);
}
Config
@Bean
AmqpTemplate amqpTemplate(ConnectionFactory connectionFactory, MessageConverter messageConverter) {
    var rabbitTemplate = new RabbitTemplate(connectionFactory);
    rabbitTemplate.setMessageConverter(messageConverter);
    return rabbitTemplate;
}

@Bean
MessageConverter jsonMessageConverter() {
    ObjectMapper objectMapper = new ObjectMapper();
    objectMapper.configure(SerializationFeature.FAIL_ON_EMPTY_BEANS, false);
    objectMapper.disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);
    objectMapper.registerModule(new JavaTimeModule());
    var jsonConverter = new Jackson2JsonMessageConverter(objectMapper);
    DefaultClassMapper classMapper = new DefaultClassMapper();
    Map<String, Class<?>> idClassMapping = Map.of(
            DataList.class.getName(), DataList.class,
            RemoteInvocationResult.class.getName(), RemoteInvocationResult.class
    );
    classMapper.setIdClassMapping(idClassMapping);
    jsonConverter.setClassMapper(classMapper);
    // json converter with returned exception awareness
    // this will transform RemoteInvocationResult into a AmqpRemoteException
    return new RemoteInvocationAwareMessageConverterAdapter(jsonConverter);
}
You have to return a message as an error, which the consuming application can choose to treat as an exception. However, I don't think normal exception handling flows apply with messaging. Your publishing application (the consumer of the RPC service) needs to know what can go wrong and be programmed to deal with those possibilities.
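A minimal sketch of that pattern, using a hypothetical reply wrapper (all names here are made up for illustration):

// Hypothetical reply type: either a payload or an error description, sent back as a normal message.
public class RpcReply<T> implements java.io.Serializable {
    private T payload;
    private String errorCode;    // null when the call succeeded
    private String errorMessage;

    public static <T> RpcReply<T> ok(T payload) {
        RpcReply<T> reply = new RpcReply<>();
        reply.payload = payload;
        return reply;
    }

    public static <T> RpcReply<T> error(String code, String message) {
        RpcReply<T> reply = new RpcReply<>();
        reply.errorCode = code;
        reply.errorMessage = message;
        return reply;
    }

    public boolean isError() {
        return errorCode != null;
    }

    public T getPayload() {
        return payload;
    }

    public String getErrorCode() {
        return errorCode;
    }

    public String getErrorMessage() {
        return errorMessage;
    }
}

The listener would populate the error fields instead of throwing, and the publisher would check isError() on the reply and decide for itself whether to raise an exception.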

Apache HttpComponents CookieStore Not Storing Cookies

I'm using HttpComponents 4.5.2 and I'm trying to store cookies as I need to use them for login and other requests. The code works fine whilst the application is still running, but the problem here is when I restart it, the cookies that were supposed to be stored in CookieStore are not there. Here's what I've written:
public static void main(String[] args) throws InterruptedException
{
    RequestConfig globalConfig = RequestConfig.custom()
            .setCookieSpec(CookieSpecs.STANDARD).build();
    BasicCookieStore cookieStore = new BasicCookieStore();
    HttpClientContext context = HttpClientContext.create();
    context.setCookieStore(cookieStore);
    CloseableHttpAsyncClient httpclient = HttpAsyncClients.custom()
            .setDefaultRequestConfig(globalConfig)
            .setDefaultCookieStore(cookieStore)
            .build();
    httpclient.start();
    login(httpclient, context);
}

public static void login(CloseableHttpAsyncClient httpClient, HttpClientContext context) throws InterruptedException
{
    JSONObject json = new JSONObject("{ email : blahblahblah1, password : blahblahblah2 }");
    StringEntity requestEntity = new StringEntity(
            json.toString(),
            ContentType.APPLICATION_JSON);
    HttpPost postMethod = new HttpPost("http://localhost:8080/login");
    postMethod.setEntity(requestEntity);
    final CountDownLatch latch = new CountDownLatch(1);
    httpClient.execute(postMethod, context, new FutureCallback<HttpResponse>() {
        public void completed(final HttpResponse response) {
            latch.countDown();
            System.out.println(postMethod.getRequestLine() + "->" + response.getStatusLine());
            //System.out.println(context.getCookieStore().getCookies().size());
        }

        public void failed(final Exception ex) {
            latch.countDown();
            System.out.println(postMethod.getRequestLine() + "->" + ex);
        }

        public void cancelled() {
            latch.countDown();
            System.out.println(postMethod.getRequestLine() + " cancelled");
        }
    });
    latch.await();
}
I've read the HttpComponents documentation and the section 3.5 about cookies says:
HttpClient can work with any physical representation of a persistent cookie store that implements the CookieStore interface. The default CookieStore implementation called BasicCookieStore is a simple implementation backed by a java.util.ArrayList. Cookies stored in an BasicClientCookie object are lost when the container object get garbage collected. Users can provide more complex implementations if necessary
So I'm wondering if it's left to its users to implement some kind of structure that can effectively store cookies, or if I'm missing something.
Yes, using a BasicCookieStore backed by an ArrayList means that when your JVM exits, the data there is lost, just like any ArrayList in memory.
The BasicCookieStore class also implements Serializable, so you can use that to persist it to disk and restore it on application startup if the file is there.
You can borrow some code from the test verifying that flow, TestBasicCookieStore#testSerialization.
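A minimal sketch of that approach, assuming plain Java serialization to a local file (the file name is made up for illustration):

import org.apache.http.impl.client.BasicCookieStore;

import java.io.*;

public class CookieStorePersistence {

    private static final File COOKIE_FILE = new File("cookies.ser");

    // Write the cookie store to disk before shutdown.
    public static void save(BasicCookieStore cookieStore) throws IOException {
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(COOKIE_FILE))) {
            out.writeObject(cookieStore);
        }
    }

    // Read the cookie store back on startup, or fall back to an empty one.
    public static BasicCookieStore load() {
        if (!COOKIE_FILE.exists()) {
            return new BasicCookieStore();
        }
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(COOKIE_FILE))) {
            return (BasicCookieStore) in.readObject();
        } catch (IOException | ClassNotFoundException e) {
            return new BasicCookieStore();
        }
    }
}

You would call load() when building the client (passing the result to setDefaultCookieStore) and save() before shutdown.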

JAX-RS Exception Mapper not working in Grizzly container

Working on a Jersey web application with a team, as the project got bigger we decided to switch from Tomcat to Grizzly so we could deploy parts of the project on different port numbers. What I've found now is that our custom exception handling no longer works; instead I always get the Grizzly HTML error page.
Example exception:
public class DataNotFoundException extends RuntimeException {

    private static final long serialVersionUID = -1622261264080480479L;

    public DataNotFoundException(String message) {
        super(message);
        System.out.println("exception constructor called"); //this prints
    }
}
Mapper:
@Provider
public class DataNotFoundExceptionMapper implements ExceptionMapper<DataNotFoundException> {

    public DataNotFoundExceptionMapper() {
        System.out.println("mapper constructor called"); //doesnt print
    }

    @Override
    public Response toResponse(DataNotFoundException ex) {
        System.out.println("toResponse called"); //doesnt print
        ErrorMessage errorMessage = new ErrorMessage(ex.getMessage(), 404, "No documentation yet.");
        return Response.status(Status.NOT_FOUND)
                .entity(errorMessage)
                .build();
        //ErrorMessage is a simple POJO with 2 string and 1 int field
    }
}
I'm not sure where the source of the problem is; if needed I can provide more information/code. What's the problem, and what can I try?
EDIT:
Main.class:
public class Main {
    /**
     * Main method.
     * @param args
     * @throws Exception
     */
    public static void main(String[] args) throws Exception {
        ...
        List<ServerInfo> serverList = new ArrayList<ServerInfo>();
        serverList.add(new ServerInfo(
                "api", 8450,
                new ResourceConfig().registerClasses(
                        the.package.was.here.ApiResource.class)
        ));
        for (ServerInfo server : serverList) {
            server.start();
        }
        System.out.println("Press enter to exit...");
        System.in.read();
        for (ServerInfo server : serverList) {
            server.stop();
        }
    }
}
EDIT2:
Based on this question I've tried using the ServerProperties.RESPONSE_SET_STATUS_OVER_SEND_ERROR property set to "true", which only helped a little. I still get the Grizzly HTML page when the exception happens, but now I see my exception (+ stack trace) in the body of the page.
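For reference, a minimal sketch of how that property can be set on the ResourceConfig from the Main class above (the resource class name mirrors the placeholder used earlier, and the usual org.glassfish.jersey.server imports are assumed):

ResourceConfig config = new ResourceConfig()
        .registerClasses(the.package.was.here.ApiResource.class)
        // Let JAX-RS responses drive the status instead of Grizzly's sendError HTML page
        .property(ServerProperties.RESPONSE_SET_STATUS_OVER_SEND_ERROR, "true");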
You're only registering one resource class for the entire application
new ResourceConfig().registerClasses(
        eu.arrowhead.core.api.ApiResource.class
)
The mapper needs to be registered also
new ResourceConfig().registerClasses(
        eu.arrowhead.core.api.ApiResource.class,
        YourMapper.class
)
You can also use package scanning, which will pick up all classes and automatically register them, if they are annotated with @Path or @Provider
new ResourceConfig().packages("the.packages.to.scan")

Jersey Jackson data entity filtering JsonMappingException on collection

I have an issue when trying to put in place the “selectable entity filtering” feature. My setup is the following:
// In your Pom
<dependency>
    <groupId>org.glassfish.jersey.ext</groupId>
    <artifactId>jersey-entity-filtering</artifactId>
</dependency>
....
//Somewhere in resourceConfig: Register entity-filtering selectable feature.
register(SelectableEntityFilteringFeature.class);
property(SelectableEntityFilteringFeature.QUERY_PARAM_NAME, "select");
register(JacksonFeature.class);
…..
Before registering the “selectable entity filtering” everything was working fine; I tested that a lot.
After registering the “selectable entity filtering” I get the following error:
[2016-02-15 17:25:36] - DEBUG EntityMapper:116 [http-bio-8080-exec-3] Preparing query INSERT INTO
[2016-02-15 17:25:43] - ERROR JsonMappingExceptionMapper:29 [http-bio-8080-exec-3] Malformed Json!
com.fasterxml.jackson.databind.JsonMappingException: Can not resolve PropertyFilter with id 'java.util.HashMap'; no FilterProvider configured
at com.fasterxml.jackson.databind.ser.std.StdSerializer.findPropertyFilter(StdSerializer.java:285)
at com.fasterxml.jackson.databind.ser.std.MapSerializer.serialize(MapSerializer.java:459)
at com.fasterxml.jackson.databind.ser.std.MapSerializer.serialize(MapSerializer.java:29)
at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:129)
at com.fasterxml.jackson.databind.ObjectWriter.writeValue(ObjectWriter.java:851)
at com.fasterxml.jackson.jaxrs.base.ProviderBase.writeTo(ProviderBase.java:650)
at org.glassfish.jersey.jackson.internal.FilteringJacksonJaxbJsonProvider.writeTo(FilteringJacksonJaxbJsonProvider.java:135)
at org.glassfish.jersey.message.internal.WriterInterceptorExecutor$TerminalWriterInterceptor.invokeWriteTo(WriterInterceptorExecutor.java:265)
at org.glassfish.jersey.message.internal.WriterInterceptorExecutor$TerminalWriterInterceptor.aroundWriteTo(WriterInterceptorExecutor.java:250)
at org.glassfish.jersey.message.internal.WriterInterceptorExecutor.proceed(WriterInterceptorExecutor.java:162)
at org.glassfish.jersey.server.internal.JsonWithPaddingInterceptor.aroundWriteTo(JsonWithPaddingInterceptor.java:106)
at org.glassfish.jersey.message.internal.WriterInterceptorExecutor.proceed(WriterInterceptorExecutor.java:162)
at org.glassfish.jersey.server.internal.MappableExceptionWrapperInterceptor.aroundWriteTo(MappableExceptionWrapperInterceptor.java:86)
It seems that the issue comes from the
StdSerializer.findPropertyFilter(StdSerializer.java:285)
protected PropertyFilter findPropertyFilter(SerializerProvider provider,
        Object filterId, Object valueToFilter)
        throws JsonMappingException
{
    FilterProvider filters = provider.getFilterProvider();
    // Not ok to miss the provider, if a filter is declared to be needed.
    if (filters == null) {
        throw new JsonMappingException("Can not resolve PropertyFilter with id '"+filterId+"'; no FilterProvider configured");
    }
    PropertyFilter filter = filters.findPropertyFilter(filterId, valueToFilter);
    // But whether unknown ids are ok just depends on filter provider; if we get null that's fine
    return filter;
}
I don't understand why the filtering is activated even for POST requests. The strange thing is I didn't put the “select” query parameter in the request!
Could you please help?
It seems that when you are using the SelectableEntityFilteringFeature and you put a Collection as an entity in the Response, you get a JsonMappingException. To me it is a bug. The workaround is to wrap your collection in a GenericEntity so that it can be serialized by Jersey/Jackson.
return Response.status(Status.OK)
.entity(new GenericEntity<Set<MyEntity>>(entityIDs) {}).build();
// Use GenericEntity to avoid JsonMappingException because of the new flow with Filtering
I am using SecurityEntityFilteringFeature and I ran into the same error: the FilterProvider looked up by provider.getFilterProvider() inside StdSerializer.findPropertyFilter is returning null.
My solution is:
@Provider
public class JsonMappingExceptionOnCollectionResponseFilter implements ContainerResponseFilter {

    @Override
    public void filter(ContainerRequestContext requestCtx, ContainerResponseContext responseCtx) throws IOException {
        ObjectWriterInjector.set(new ObjectWriterModifier() {
            @Override
            public ObjectWriter modify(EndpointConfigBase<?> endpoint, MultivaluedMap<String, Object> responseHeaders, Object valueToWrite, ObjectWriter w, JsonGenerator g) throws IOException {
                SimpleFilterProvider filterProvider = new SimpleFilterProvider();
                SimpleBeanPropertyFilter simpleBeanPropertyFilter = new SimpleBeanPropertyFilter() {
                    @Override
                    protected boolean include(BeanPropertyWriter writer) {
                        return true;
                    }

                    @Override
                    protected boolean include(PropertyWriter writer) {
                        return true;
                    }
                };
                // add a pass-through filter for each entity class that needs one
                filterProvider.addFilter("your entity class", simpleBeanPropertyFilter);
                return w.with(filterProvider);
            }
        });
    }
}