I'm having issues saving a JPA entity through CrudRepository. save() seems to return the original object that was passed to it without persisting it to the database. I know this is not the correct way to do it, but for testing purposes, if I bind an EntityManagerHolder via TransactionSynchronizationManager before the save, the entity persists correctly. This leads me to think there may be an issue with the transaction manager.
In addition, if I bypass the repository and use the EntityManager (em) directly, I get the same result; however, if I call em.flush(), I get a TransactionRequiredException: no transaction is in progress.
@Transactional
public class UserServiceImpl implements UserService {

    @Autowired
    private UserRepository userRepository;

    @Autowired
    private EntityManagerFactory emf;

    @Override
    @Transactional(readOnly = false)
    public User save(User user) {
        // EntityManager em = emf.createEntityManager();
        // Object persists when adding the following line:
        // TransactionSynchronizationManager.bindResource(emf, new EntityManagerHolder(em));
        return userRepository.save(user);
    }
}
@ComponentScan(basePackages = {"..."})
@Import(value = {DataContextConfig.class, SecurityConfig.class})
public class AppConfig {
}

@Configuration
@ComponentScan(basePackages = {".."})
@Import(DataConfig.class)
public class DataContextConfig {
}
@Configuration
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
@EnableJpaRepositories(value = {"com.repository"}, entityManagerFactoryRef = "entityManagerFactory", transactionManagerRef = "transactionManager")
@PropertySource("classpath:data.properties")
public class DataConfig {
    ...
    @Bean
    public PlatformTransactionManager transactionManager() {
        JpaTransactionManager txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(entityManagerFactory().getObject());
        return txManager;
    }
    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        factory.setPackagesToScan("...");
        factory.setJpaPropertyMap(jpaProperties());
        factory.setDataSource(dbSource());
        return factory;
    }

    @Bean
    public DriverManagerDataSource dbSource() {
        DriverManagerDataSource driverManagerDataSource = new DriverManagerDataSource();
        driverManagerDataSource.setDriverClassName(environment.getRequiredProperty("jdbc.driverClassName"));
        driverManagerDataSource.setUrl(environment.getRequiredProperty("jdbc.url"));
        driverManagerDataSource.setUsername(environment.getRequiredProperty("jdbc.username"));
        driverManagerDataSource.setPassword(environment.getRequiredProperty("jdbc.password"));
        return driverManagerDataSource;
    }
}
I have uploaded a small project that isolates the exception. Unzip it and run AccountTester.class: http://www14.zippyshare.com/v/81636273/file.html
As per your sample project, below are the only classes I modified to avoid your "TransactionRequiredException: no transaction is in progress" problem and successfully insert an account:
package com.jpa.base.repository;

import com.jpa.base.entity.Account;
import org.springframework.data.repository.CrudRepository;
import org.springframework.data.repository.query.Param;
import org.springframework.stereotype.Repository;

@Repository
public interface AccountRepository extends CrudRepository<Account, Long> {

    public Account findByEmailAddress(@Param(value = "emailAddress") String emailAddress);

    public Account findByAccountId(@Param(value = "accountId") Long accountId);
}
You need to mark your Spring Data JPA repository with @Repository (not your service).
package com.jpa.base.service.impl;

import com.jpa.base.entity.Account;
import com.jpa.base.repository.AccountRepository;
import com.jpa.base.service.AccountService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service("accountService")
public class AccountServiceImpl implements AccountService {

    @Autowired
    private AccountRepository accountRepository;

    @Override
    @Transactional(readOnly = true)
    public Account findByAccountId(Long accountId) {
        return accountRepository.findByAccountId(accountId);
    }

    @Override
    @Transactional(readOnly = true)
    public Account findByEmailAddress(String emailAddress) {
        return accountRepository.findByEmailAddress(emailAddress);
    }

    @Override
    @Transactional
    public Account save(Account account) {
        return accountRepository.save(account);
    }
}
Note the changes: @Repository was removed from the service (it needs to be on your Spring Data JPA repository interface) and the accountRepository is used to persist your entity in your save(...) method.
Not sure why you were trying to use the EntityManagerFactory to create a new EntityManager; if you really need an EntityManager instance, you should just inject the configured EntityManager, not the factory. This also happens to be the reason for your TransactionRequiredException.
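If direct EntityManager access really is needed, a minimal sketch of injecting the container-managed EntityManager instead of the factory might look like this (the service class name here is hypothetical):

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import com.jpa.base.entity.Account;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class AccountPersistenceService { // hypothetical name

    // Container-managed, transaction-aware EntityManager; no factory handling required
    @PersistenceContext
    private EntityManager em;

    @Transactional
    public Account saveDirectly(Account account) {
        em.persist(account); // joins the surrounding Spring-managed transaction
        return account;
    }
}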
Anyhoo, why bother with all that when you can just use your repository to persist your entity?
Running your AccountTester now produces the desired functionality:
...
Hibernate: insert into account (email_address, name, version) values (?, ?, ?)
INFO : com.jpa.base.entity.AccountTester - Account Saved: Account Id: 3, Email Address: james.brown@outlook.com, Name: James Brown, Version: 0
It turns out there were multiple @EnableTransactionManagement(mode = AdviceMode.ASPECTJ) annotations within the application. This was the cause of the problem; after removing the one inside the Neo4j @Configuration class, the problem went away.
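For illustration only (the Neo4j class name below is hypothetical, since the original does not show it), the offending setup looked roughly like this, and the fix was deleting the duplicate annotation:

// JPA configuration: keeps the annotation
@Configuration
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
public class DataConfig { /* ... */ }

// Hypothetical Neo4j configuration: this duplicate annotation was removed
@Configuration
// @EnableTransactionManagement(mode = AdviceMode.ASPECTJ)  <-- removed to fix the problem
public class Neo4jConfig { /* ... */ }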
When I run my tests, the code under test sends a message to RabbitMQ. How do I take that message back when the test is over?
public interface RabbitProducer {

    String OUTPUT = "rabbitmq_producer_channel";

    @Output(OUTPUT)
    MessageChannel output();
}

public class SysGroupServiceImpl {

    @Autowired
    private RabbitProducer rabbitProducer;

    @Override
    public Result remove(Collection<? extends Serializable> idList) {
        rabbitProducer.output().send(MessageBuilder.withPayload(idList)
                .setHeader("x-delay", 5000)
                .setHeader("MessageType", "GroupBatchDelete").build());
        return Result.booleanResult(true);
    }
}
@SpringBootTest
@Transactional
@Rollback
public class SysGroupServiceTest {

    @Autowired
    private SysGroupService sysGroupService;

    @Test
    void removeTest() {
        sysGroupService.remove(Stream.of("1").collect(Collectors.toList()));
    }
}
I use Spring Cloud Stream to integrate with RabbitMQ, and all the relevant code is above. Is there a way to mock this out? I tried the following scheme, but because of the x-delay header I get the error "No exchange type x-delayed-message":
<dependency>
    <groupId>com.github.fridujo</groupId>
    <artifactId>rabbitmq-mock</artifactId>
</dependency>

@Component
public class RabbitMqMock {

    @Bean
    public ConnectionFactory connectionFactory() {
        return new CachingConnectionFactory(MockConnectionFactoryFactory.build());
    }
}
I know little about mocks. Can a mock create an x-delayed-message exchange?
I want to test my AttributeConverter implementation using @DataJpaTest.
Test code:
@RunWith(SpringRunner.class)
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
class FooRepositoryTest {

    @Autowired
    private FooRepository repository;

    @Test
    void getPojoTest() {
        FooEntity fooEntity = repository.findById("foo").orElseThrow();
        FooPojo fooPojo = fooEntity.getJsonPojo();
        // some assertion
    }
}
Entity
@Entity
@Data
@NoArgsConstructor
public class FooEntity {
    ....

    @Column(columnDefinition = "JSON")
    @Convert(converter = FooConverter.class)
    private FooPojo data;

    ....
}
Attribute Converter
public class FooConverter implements AttributeConverter<FooPojo, String> {

    @Autowired
    private ObjectMapper mapper;

    @SneakyThrows
    @Override
    public String convertToDatabaseColumn(FooPojo attribute) {
        return mapper.writeValueAsString(attribute);
    }

    @SneakyThrows
    @Override
    public FooPojo convertToEntityAttribute(String dbData) {
        return mapper.readValue(dbData, FooPojo.class);
    }
}
With my code above, when I run getPojoTest(), the @Autowired ObjectMapper in the converter is null. When I try the same test with @SpringBootTest instead, it works just fine. Is there any workaround to use @DataJpaTest and the ObjectMapper together?
A better alternative to creating your own ObjectMapper is adding the @AutoConfigureJson annotation:
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@AutoConfigureJson
public class FooRepositoryTest {
}
This is also what @JsonTest uses.
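For comparison, a minimal sketch of the "create your own ObjectMapper" fallback mentioned above: it avoids injection entirely (under @DataJpaTest no ObjectMapper bean is in the sliced context, as the docs excerpt below explains), at the cost of losing Spring Boot's customized mapper:

import java.io.IOException;

import javax.persistence.AttributeConverter;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;

public class FooConverter implements AttributeConverter<FooPojo, String> {

    // Plain static instance instead of an @Autowired bean
    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(FooPojo attribute) {
        try {
            return MAPPER.writeValueAsString(attribute);
        } catch (JsonProcessingException e) {
            throw new IllegalStateException(e);
        }
    }

    @Override
    public FooPojo convertToEntityAttribute(String dbData) {
        try {
            return MAPPER.readValue(dbData, FooPojo.class);
        } catch (IOException e) {
            throw new IllegalStateException(e);
        }
    }
}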
From the docs:

@DataJpaTest can be used if you want to test JPA applications. By default it will configure an in-memory embedded database, scan for @Entity classes and configure Spring Data JPA repositories. Regular @Component beans will not be loaded into the ApplicationContext.
I am working with Kafka and Spring Boot and I need to send a JSON object to Kafka. The point is that I am able to send an object as JSON by configuring KafkaTemplate, but only for this one object.
package com.bankia.apimanager.config;

import com.bankia.apimanager.model.RequestDTO;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.support.serializer.JsonSerializer;

import java.util.HashMap;
import java.util.Map;

@Configuration
public class KafkaConfiguration {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return props;
    }

    @Bean
    public ProducerFactory<String, RequestDTO> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, RequestDTO> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
package com.bankia.apimanager.controller;

import com.bankia.apimanager.model.RequestDTO;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.MediaType;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.util.concurrent.ListenableFuture;
import org.springframework.util.concurrent.ListenableFutureCallback;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/infrastructure")
public class InfraStructureRequestController {

    private final static Logger LOG = LoggerFactory.getLogger(InfraStructureRequestController.class);

    private static final String TOPIC = "test";

    @Autowired
    private KafkaTemplate<String, RequestDTO> sender;

    @RequestMapping(value = "/test", method = RequestMethod.GET)
    public String postMessage() {
        ListenableFuture<SendResult<String, RequestDTO>> future = sender.send(TOPIC, new RequestDTO("Hola", "Paco"));
        future.addCallback(new ListenableFutureCallback<SendResult<String, RequestDTO>>() {
            @Override
            public void onSuccess(SendResult<String, RequestDTO> result) {
                LOG.info("Sent message with offset=[" + result.getRecordMetadata().offset() + "]");
            }

            @Override
            public void onFailure(Throwable ex) {
                LOG.error("Unable to send message due to : " + ex.getMessage());
            }
        });
        return "OK";
    }
}
But what if I now want to send a new DTO object? Do I have to declare a new KafkaTemplate<String, NewObject> and autowire each KafkaTemplate declared in the configuration, one per object type? Is there another way to declare just one KafkaTemplate in which I can send any type of object that will automatically be serialized as JSON?
I think you can specify a generic KafkaTemplate<String, Object> and set the producer value serializer to JsonSerializer, like this:
@Configuration
public class KafkaConfiguration {

    @Value("${spring.kafka.bootstrap-servers}")
    private String bootstrapServers;

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
        return props;
    }

    @Bean
    public ProducerFactory<String, Object> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
Referring to your code:
The value serializer is correctly defined as JsonSerializer, which will convert objects of any type to JSON.
@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return props;
}
Change <String, RequestDTO> to <String, Object> everywhere in KafkaConfiguration and in the controller.
Keep in mind that generics are only checked at compile time and are then erased (type erasure), so the declared value type does not restrict what the serializer sees at runtime.
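For illustration, a sketch of the controller after that change; RequestDTO comes from the question, while OtherDTO is a hypothetical second payload type to show that one template now covers both:

import com.bankia.apimanager.model.RequestDTO;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/infrastructure")
public class InfraStructureRequestController {

    private static final String TOPIC = "test";

    @Autowired
    private KafkaTemplate<String, Object> sender; // one template for every payload type

    @RequestMapping(value = "/test", method = RequestMethod.GET)
    public String postMessage() {
        // JsonSerializer converts each payload to JSON, regardless of its class
        sender.send(TOPIC, new RequestDTO("Hola", "Paco"));
        sender.send(TOPIC, new OtherDTO()); // hypothetical second DTO
        return "OK";
    }
}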
There are two scenarios:
Scenario #1
If you want to use KafkaTemplate to send any type (as mentioned in your question) to Kafka, there is no need to declare your own KafkaTemplate bean, because Spring Boot does this for you in KafkaAutoConfiguration.
package org.springframework.boot.autoconfigure.kafka;

...

@Configuration(proxyBeanMethods = false)
@ConditionalOnClass(KafkaTemplate.class)
@EnableConfigurationProperties(KafkaProperties.class)
@Import({ KafkaAnnotationDrivenConfiguration.class, KafkaStreamsAnnotationDrivenConfiguration.class })
public class KafkaAutoConfiguration {

    private final KafkaProperties properties;

    public KafkaAutoConfiguration(KafkaProperties properties) {
        this.properties = properties;
    }

    @Bean
    @ConditionalOnMissingBean(KafkaTemplate.class)
    public KafkaTemplate<?, ?> kafkaTemplate(ProducerFactory<Object, Object> kafkaProducerFactory,
            ProducerListener<Object, Object> kafkaProducerListener,
            ObjectProvider<RecordMessageConverter> messageConverter) {
        KafkaTemplate<Object, Object> kafkaTemplate = new KafkaTemplate<>(kafkaProducerFactory);
        messageConverter.ifUnique(kafkaTemplate::setMessageConverter);
        kafkaTemplate.setProducerListener(kafkaProducerListener);
        kafkaTemplate.setDefaultTopic(this.properties.getTemplate().getDefaultTopic());
        return kafkaTemplate;
    }
}
Some notes:
This config class is annotated with @ConditionalOnClass(KafkaTemplate.class), which means (from the Spring docs): @Conditional that only matches when the specified classes are on the classpath.
The kafkaTemplate bean method is annotated with @ConditionalOnMissingBean(KafkaTemplate.class), which means (from the Spring docs): @Conditional that only matches when no beans meeting the specified requirements are already contained in the BeanFactory.
Important! In the pure Java world, KafkaTemplate<?, ?> is not a subtype of, for example, KafkaTemplate<String, RequestDTO>, so you can't do this:

KafkaTemplate<?, ?> kf1 = ...;
KafkaTemplate<String, RequestDTO> kf2 = kf1; // compile-time error

because Java parameterized types are invariant, as mentioned in Effective Java, third edition, item 31. But in the Spring world that is OK, and the template will be injected into your own service. You only need to specify your own generic types on your KafkaTemplate fields.
For example:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaService {

    @Autowired
    private KafkaTemplate<Integer, String> kafkaTemplate1;

    @Autowired
    private KafkaTemplate<Integer, RequestDTO> kafkaTemplate2;
}
Scenario #2
If you need to restrict the value type of the Kafka record, then you need to specify your own KafkaTemplate bean, something like this:
@Configuration(proxyBeanMethods = false)
@ConditionalOnClass(KafkaTemplate.class)
@EnableConfigurationProperties(CorridorTracingConfiguration.class)
public class CorridorKafkaAutoConfiguration {

    @Bean
    @ConditionalOnMissingBean(KafkaTemplate.class)
    public KafkaTemplate<?, AbstractMessage> kafkaTemplate(ProducerFactory<Object, AbstractMessage> kafkaProducerFactory,
            ProducerListener<Object, AbstractMessage> kafkaProducerListener,
            ObjectProvider<RecordMessageConverter> messageConverter,
            KafkaProperties properties) { // injected as a parameter, since this class has no properties field
        KafkaTemplate<Object, AbstractMessage> kafkaTemplate = new KafkaTemplate<>(kafkaProducerFactory);
        messageConverter.ifUnique(kafkaTemplate::setMessageConverter);
        kafkaTemplate.setProducerListener(kafkaProducerListener);
        kafkaTemplate.setDefaultTopic(properties.getTemplate().getDefaultTopic());
        return kafkaTemplate;
    }
}
Now this can only be injected as KafkaTemplate<String, AbstractMessage> kafkaTemplate (the key type can be anything else instead of String), but you can send any subtype of AbstractMessage to Kafka through it.
An example usage:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaService {

    @Autowired
    private KafkaTemplate<String, AbstractMessage> kafkaTemplate;

    public void makeTrx(TrxRequest trxRequest) {
        kafkaTemplate.send("fraud-request", trxRequest.fromAccountNumber(), new FraudRequest(trxRequest));
    }
}
@Accessors(chain = true)
@Getter
@Setter
@EqualsAndHashCode(callSuper = true)
@ToString(callSuper = true)
public class FraudRequest extends AbstractMessage {

    private float amount;
    private String fromAccountNumber;
    private String toAccountNumber;

    ...
}
To restrict the key of the Kafka message, follow the same approach as above.
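For symmetry, a hypothetical sketch that restricts the key type instead (UserKey is an assumed class; the body mirrors the bean shown above):

@Bean
@ConditionalOnMissingBean(KafkaTemplate.class)
public KafkaTemplate<UserKey, ?> kafkaTemplate(ProducerFactory<UserKey, Object> kafkaProducerFactory,
        ProducerListener<UserKey, Object> kafkaProducerListener,
        ObjectProvider<RecordMessageConverter> messageConverter) {
    KafkaTemplate<UserKey, Object> kafkaTemplate = new KafkaTemplate<>(kafkaProducerFactory);
    messageConverter.ifUnique(kafkaTemplate::setMessageConverter);
    kafkaTemplate.setProducerListener(kafkaProducerListener);
    return kafkaTemplate;
}

Such a template can then be injected as KafkaTemplate<UserKey, Object>, restricting the key type while leaving the value type open.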
I encountered a problem when I wanted to work with two databases: I wanted to use table 1 in database 1 and table 2 in database 2. I tried many approaches, but none of them seemed to work.
spring.datasource.primary.url = jdbc:mysql://localhost:3306/mydb?useUnicode=true&characterEncoding=UTF-8
spring.datasource.primary.username = root
spring.datasource.primary.password = xxxx
spring.datasource.primary.driverClassName=com.mysql.jdbc.Driver
spring.datasource.secondary.url = jdbc:mysql://localhost:3306/testdb?useUnicode=true&characterEncoding=UTF-8
spring.datasource.secondary.username = root
spring.datasource.secondary.password = xxxx
spring.datasource.secondary.driverClassName=com.mysql.jdbc.Driver
Above is my application.properties. Then I used @Primary to set spring.datasource.primary as the primary database in a config file.
@Entity
@Table(name = "User")
public class User {

    @Id
    @NotNull
    @Column(name = "phoneid")
    private String phoneid;
}

public interface UserDAO extends CrudRepository<User, String> {

    public User findByPhoneid(String phoneid);
}
I want to connect to database spring.datasource.primary and use the table User in it.
@Entity
@Table(name = "Favorite_Restaurant")
public class FavoriteRestaurant {

    @Id
    @NotNull
    @Column(name = "favorite_restaurantid")
    private int favoriteRestaurantId;
}

public interface FavoriteRestaurantDAO extends JpaRepository<FavoriteRestaurant, Integer> {

    public List<FavoriteRestaurant> findAll(Sort sort);
}
I want to connect to database spring.datasource.secondary and use the table FavoriteRestaurant in it.
However, when I autowire UserDAO and FavoriteRestaurantDAO in my service, it seems that both are wired against the primary database. How can I inject FavoriteRestaurantDAO from the secondary database? Help!
To be able to use several datasources you need several persistence unit configurations.
I will assume that you've got datasourceA and datasourceB to configure.
We have one configuration class for each of our persistence units. The listing below contains the class for datasourceA (you will have to copy and adjust the configuration for datasourceB).
It's also a good idea not to mix entities from different persistence units, so we have separated them by package. We created an empty class, SpringRootPackageMarker, to tell Spring which packages to scan (a sketch of the marker class follows the note below).
Note: the SpringRootPackageMarker class is used both in @EnableJpaRepositories and in the getDatasourceAEntityManagerFactoryBean method.
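As a minimal sketch (the package name is an assumption), the marker is just an empty class placed in the root package that persistence unit A should scan:

package com.example.datasourcea.model; // hypothetical root package of persistence unit A

// Empty marker class; referenced from the configuration instead of a fragile package-name string
public final class SpringRootPackageMarker {

    private SpringRootPackageMarker() {
    }
}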
So this is how we do it:
#DependsOn("transactionManager")
#EnableJpaRepositories(
basePackageClasses = SpringRootPackageMarker.class,
entityManagerFactoryRef = "datasourceAEntityManager",
transactionManagerRef = "transactionManager")
public class DatasourceAPersistenceUnitConfiguration {
private static final String DATASOURCE_A_PERSISTENT_UNIT_NAME = "datasourceAPU";
#DependsOn("transactionManager") // for unit tests
#Bean(name = "datasourceAEntityManager")
public LocalContainerEntityManagerFactoryBean getDatasourceAEntityManagerFactoryBean() {
final LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
factory.setPersistenceUnitName(DATASOURCE_A_PERSISTENT_UNIT_NAME);
factory.setDataSource(getDatasourceA());
factory.setJpaVendorAdapter(getDatasourceAJpaVendorAdapter());
factory.setPackagesToScan(SpringRootPackageMarker.class.getPackage().getName());
Properties jpaProperties = getDatasourceAJpaProperties();
factory.setJpaProperties(jpaProperties);
return factory;
}
#Bean
public DataSource getDatasourceA() {
DataSource datasource = null;
// prepare datasource A;
return datasource;
}
private JpaVendorAdapter getDatasourceAJpaVendorAdapter() {
final HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
//custom configuration for datasource A
return vendorAdapter;
}
private Properties getDatasourceAJpaProperties() {
Properties jpaProperties = new Properties();
//custom properties
return jpaProperties;
}
}
}
If you plan to inject the EntityManager into your application, you'll have to do it this way:
@PersistenceContext(unitName = DatasourceAPersistenceUnitConfiguration.DATASOURCE_A_PERSISTENT_UNIT_NAME)
private EntityManager manager;
Finally, I solved this problem by adding @EnableAutoConfiguration above my config class:
@Configuration
@EnableJpaRepositories(basePackages = "datamodel.dao", entityManagerFactoryRef = "localEntityManagerFactory", transactionManagerRef = "localTransactionManager")
@EnableTransactionManagement
@EnableAutoConfiguration // the key to making Spring Boot aware of your config!
public class MainDataConfig {

    @Bean
    @ConfigurationProperties(prefix = "datasource.main")
    @Primary
    public DataSource localDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @Primary
    public LocalContainerEntityManagerFactoryBean localEntityManagerFactory(final EntityManagerFactoryBuilder builder) {
        return builder.dataSource(localDataSource()).packages("datamodel.domain")
                .persistenceUnit("mainPersistenceUnit").build();
    }

    @Bean
    @Primary
    public JpaTransactionManager localTransactionManager(@Qualifier("localEntityManagerFactory") final EntityManagerFactory factory) {
        return new JpaTransactionManager(factory);
    }
}
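A second configuration for the other datasource would mirror this one without the @Primary annotations; a hypothetical sketch (bean names, packages, and the property prefix are assumptions):

@Configuration
@EnableJpaRepositories(basePackages = "datamodel2.dao", entityManagerFactoryRef = "secondaryEntityManagerFactory", transactionManagerRef = "secondaryTransactionManager")
@EnableTransactionManagement
public class SecondaryDataConfig {

    @Bean
    @ConfigurationProperties(prefix = "datasource.secondary")
    public DataSource secondaryDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean secondaryEntityManagerFactory(final EntityManagerFactoryBuilder builder) {
        return builder.dataSource(secondaryDataSource()).packages("datamodel2.domain")
                .persistenceUnit("secondaryPersistenceUnit").build();
    }

    @Bean
    public JpaTransactionManager secondaryTransactionManager(@Qualifier("secondaryEntityManagerFactory") final EntityManagerFactory factory) {
        return new JpaTransactionManager(factory);
    }
}

Repositories living under datamodel2.dao would then be wired against the secondary database.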
I'm trying to unit test a Spring Boot controller and one of my @Autowired fields is coming back null.
I have two autowired fields in this controller:
public class UserProfileController {

    @Autowired
    private UserProfileService profileService;

    @Autowired
    private IDataValidator dataValidatorImpl;
My test class is as follows:
@RunWith(SpringJUnit4ClassRunner.class)
@WebIntegrationTest
@SpringApplicationConfiguration(classes = UserProfileServiceApplication.class)
public class ControllerTest {

    private MockMvc mockMvc;

    @Mock
    UserProfileService profileServiceMock;

    @Autowired
    ApplicationContext actx;

    @InjectMocks
    private UserProfileController profileController;

    @Before
    public void setup() {
        // Process mock annotations
        String[] asdf = actx.getBeanDefinitionNames();
        for (int i = 0; i < asdf.length; i++) {
            System.out.println(asdf[i]);
        }
        MockitoAnnotations.initMocks(this);

        // Setup Spring test in standalone mode
        this.mockMvc = MockMvcBuilders.standaloneSetup(profileController).build();
    }

    /**
     * All this does is verify that we return the correct datatype and HTTP status.
     * @throws Exception
     */
    @Test
    public void testGetProfileSuccess() throws Exception {
        Mockito.when(profileServiceMock.getProfile(Mockito.any(HashMap.class))).thenReturn(new HashMap<String, Object>());
        mockMvc.perform(get("http://localhost:8095/UserName?tenantId=tenant1"))
                .andExpect(status().isOk())
                .andExpect(content().contentType(TestUtil.APPLICATION_JSON_UTF8));

        // verify profileService was only used once
        Mockito.verify(profileServiceMock, Mockito.times(1)).getProfile(Mockito.any(HashMap.class));

        // verify we're done interacting with profile service
        Mockito.verifyNoMoreInteractions(profileServiceMock);
    }
If I leave IDataValidator untouched in the test class, it comes up null and I get an NPE. If I @Spy the DataValidatorImpl, it cannot find the properties from the Spring environment that it needs to work.
How can I just let the IDataValidator autowire itself and keep its Spring environment context, as if I were running the application normally?
When I print all beans in my @Before setup() method, I can see DataValidatorImpl in the list.
When you mock your controller with

    MockMvcBuilders.standaloneSetup(profileController).build();

the controller is replaced in the context. Since you did not inject any IDataValidator into it, the field is null.
The simplest solution is to autowire the real IDataValidator into your test class and inject it into the controller.
In your controller:
public class UserProfileController {

    private UserProfileService profileService;
    private IDataValidator dataValidatorImpl;

    @Autowired
    public UserProfileController(UserProfileService profileService, IDataValidator dataValidatorImpl) {
        this.profileService = profileService;
        this.dataValidatorImpl = dataValidatorImpl;
    }
And in your test:
@RunWith(SpringJUnit4ClassRunner.class)
@WebIntegrationTest
@SpringApplicationConfiguration(classes = UserProfileServiceApplication.class)
public class ControllerTest {

    private MockMvc mockMvc;

    private UserProfileService profileService;

    @Autowired
    private IDataValidator dataValidator;

    @Before
    public void setup() {
        UserProfileService profileService = Mockito.mock(UserProfileService.class);
        UserProfileController controller = new UserProfileController(profileService, dataValidator);

        // Setup Spring test in standalone mode
        this.mockMvc = MockMvcBuilders.standaloneSetup(controller).build();
    }
}
If I understand correctly, you want to inject UserProfileController with a real Validator and a mock Service.
In this case I suggest using the @ContextConfiguration annotation, which allows configuring the context in the test. You'll need to create a Configuration class:
@RunWith(SpringJUnit4ClassRunner.class)
@WebIntegrationTest
@SpringApplicationConfiguration(classes = UserProfileServiceApplication.class)
public class ControllerTest {

    private MockMvc mockMvc;

    @Mock
    UserProfileService profileServiceMock;

    @Autowired
    ApplicationContext actx;

    // comment this line out
    // @InjectMocks
    @Autowired
    private UserProfileController profileController;

    @Before
    public void setup() {
        // Process mock annotations
        String[] asdf = actx.getBeanDefinitionNames();
        for (int i = 0; i < asdf.length; i++) {
            System.out.println(asdf[i]);
        }
        // comment this line out
        // MockitoAnnotations.initMocks(this);
    }

    @Configuration
    public static class Config {

        // wire validator - if it is not wired by other configurations
        @Bean
        Validator validator() {
            return new Validator();
        }

        // wire mock service
        @Bean
        public UserProfileService profileService() {
            return mock(UserProfileService.class);
        }
    }
}
Okay, I swear I did this the first time, but when trying to recreate the error thrown for jny it actually worked.
My solution is to inject via the @Spy annotation and get the bean from the ApplicationContext in my @Before setup method.
public class ControllerTest {

    private MockMvc mockMvc;

    @Mock
    UserProfileService profileServiceMock;

    @Spy
    IDataValidator dataValidator;

    @Autowired
    ApplicationContext actx;

    @InjectMocks
    private UserProfileController profileController;

    @Before
    public void setup() {
        dataValidator = (IDataValidator) actx.getBean("dataValidatorImpl");
        MockitoAnnotations.initMocks(this);

        // Setup Spring test in standalone mode
        this.mockMvc = MockMvcBuilders.standaloneSetup(profileController).build();
    }
}