JUnit testing for exceptions

What is my mistake in the code below, especially in 'testSetEmailString()'?
package de.hhn.pp.addressBook.test;
import static org.junit.Assert.assertEquals;
import java.util.Arrays;
import java.util.Collection;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;
import de.hhn.pp.addressBook.logic.ContactImplementation;
import de.hhn.pp.addressbook.api.exceptions.StringNotValidException;
@RunWith(Parameterized.class)
public class Asd {
private String email;
private boolean expectedValue;
private ContactImplementation contact;
public Asd(final String email, final boolean expectedValue) {
this.email = email;
this.expectedValue = expectedValue;
}
/**
* Data.
*
* @return the collection
*/
@Parameters
public static Collection<Object[]> data() {
Object[][] data = new Object[][] {
// INVALID EMAIL ADDRESSES
// it is not allowed to have digits in the Top Level Domain
{ "personalproductivity#hotmail.com2", false },
// it is not allowed to have the '#' twice
{ "personal#productivity#hotmail.de", false },
// it is not allowed to have special character such as '!'
{ "personal!productivity!#hotmail.de", false },
// Top Level Domain cannot start with a '.'
{ "personalproductivity#.gmail.com", false },
// email address must contain a '#'
{ "personalproductivity.de", false },
// email address must contain a Top Level Domain
{ "personalproductivity#", false },
// it is not allowed to have double '.'
{ "personal..productivity#web.de", false },
// it is not allowed to start with a '.'
{ ".personalproductivity#gmx.edu", false },
// VALID EMAIL ADDRESSES
{ "personalproductivity#hotmail.de", true },
{ "personal+productivity#hotmail.com", true },
{ "personalproductivity2015#web.edu", true },
{ "personal.productivity-SS15#gmail.com", true } };
return Arrays.asList(data);
}
@Test(expected = StringNotValidException.class)
public void testSetEmailString() {
contact.setEmail(email);
}
@Test
public void testIsEmailPatternValid() {
boolean result = ContactImplementation.isEmailPatternValid(email);
assertEquals("Result", this.expectedValue, result);
}
}
In my implemented class, the method 'setEmail(String)' sets the new email address after checking whether the static method 'isEmailPatternValid' returns true. If the email is invalid, a StringNotValidException is thrown.
The isEmailPatternValid test works fine, but I also want to test the setEmail method for better code coverage, and I am having trouble with the exception testing.
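A minimal sketch of one way to do this (not a definitive fix; it assumes ContactImplementation has a no-argument constructor): with the Parameterized runner, every data row drives every test method, so expected = StringNotValidException.class cannot hold for the rows with valid addresses, and the contact field is never initialized, so the test would fail with a NullPointerException before setEmail is even called. Letting the expectation depend on expectedValue avoids both problems:

@Test
public void testSetEmailString() {
    contact = new ContactImplementation(); // assumes a no-arg constructor exists
    try {
        contact.setEmail(email);
        // no exception was thrown, so this row must be one of the valid addresses
        assertTrue("expected StringNotValidException for: " + email, expectedValue);
    } catch (StringNotValidException e) {
        // the exception was thrown, so this row must be one of the invalid addresses
        assertFalse("did not expect StringNotValidException for: " + email, expectedValue);
    }
}

This needs import static org.junit.Assert.assertTrue; and import static org.junit.Assert.assertFalse;. Alternatively, keep expected = StringNotValidException.class and split the parameters into two test classes, one fed only invalid addresses and one fed only valid ones.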

Related

Spring WebFlux - Add a wrapping class before serialization

I'm developing APIs for an exam project, and I want their responses to consistently use a wrapping class on all of them (Telegram Bot API style, for those who know it).
So, for example, having these two classes:
public class User {
public int id;
public String name;
}
public class Item {
public int id;
public String itemName;
public User owner;
}
What Spring returns to me is this output:
{
"id": 1,
"itemName": "theItem",
"owner": {
"id": 2,
"name": "theUser"
}
}
What I want instead is for this output to be returned:
{
"ok": true,
"data": {
"id": 1,
"itemName": "theItem",
"owner": {
"id": 2,
"name": "theUser"
}
}
}
Maybe using a class wrapper like this:
public class ResponseWrapper<T> {
public boolean ok;
public T data;
}
Is it possible to do this?
I understand you need a global setting to convert all your responses into a standard one. For this you can implement ResponseBodyAdvice and have a common structure for all your API responses. Refer to this link for a detailed example.
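For reference, a minimal Spring MVC sketch of that idea (the ApiResponse wrapper and the class name are illustrative, not taken from the linked example):

import org.springframework.core.MethodParameter;
import org.springframework.http.MediaType;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.http.server.ServerHttpRequest;
import org.springframework.http.server.ServerHttpResponse;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.servlet.mvc.method.annotation.ResponseBodyAdvice;

// Illustrative wrapper, mirroring the ResponseWrapper from the question.
class ApiResponse<T> {
    public boolean ok = true;
    public T data;
    ApiResponse(T data) { this.data = data; }
}

@ControllerAdvice
class WrappingResponseBodyAdvice implements ResponseBodyAdvice<Object> {

    @Override
    public boolean supports(MethodParameter returnType,
                            Class<? extends HttpMessageConverter<?>> converterType) {
        return true; // wrap every controller response
    }

    @Override
    public Object beforeBodyWrite(Object body, MethodParameter returnType,
                                  MediaType selectedContentType,
                                  Class<? extends HttpMessageConverter<?>> selectedConverterType,
                                  ServerHttpRequest request, ServerHttpResponse response) {
        // avoid double wrapping if a controller already returns the wrapper
        return body instanceof ApiResponse ? body : new ApiResponse<>(body);
    }
}

Note that this is the Spring MVC variant; it does not apply to WebFlux, which is what the edit below addresses.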
Edit: For spring-webflux you can extend ResponseBodyResultHandler and override handleResult. An example is given in this answer
I thank @JustinMathew for the help. In the end, in my case (using Spring WebFlux with Kotlin), the ResponseBodyResultHandler class was more useful to me.
// File: /MicroserviceApplication.kt
@SpringBootApplication
class MicroserviceApplication {
@Autowired
lateinit var serverCodecConfigurer: ServerCodecConfigurer
@Autowired
lateinit var requestedContentTypeResolver: RequestedContentTypeResolver
@Bean
fun responseWrapper(): ResponseWrapper = ResponseWrapper(
serverCodecConfigurer.writers, requestedContentTypeResolver
)
}
// File: /wrapper/model/Response.kt
data class Response<T>(
val ok: Boolean,
val data: T?,
val error: Error? = null
) {
data class Error(
val value: HttpStatus,
val message: String?
)
}
// File: /wrapper/ResponseWrapper.kt
class ResponseWrapper(writers: List<HttpMessageWriter<*>>, resolver: RequestedContentTypeResolver) :
ResponseBodyResultHandler(writers, resolver) {
override fun supports(result: HandlerResult): Boolean =
(result.returnType.resolve() == Mono::class.java)
|| (result.returnType.resolve() == Flux::class.java)
@Throws(ClassCastException::class)
override fun handleResult(exchange: ServerWebExchange, result: HandlerResult): Mono<Void> {
val body = when (val value = result.returnValue) {
is Mono<*> -> value
is Flux<*> -> value.collectList()
else -> throw ClassCastException("The \"body\" should be Mono<*> or Flux<*>!")
}
.map { r -> Response(true, r, null) }
.onErrorMap { e ->
if (e !is Response.Error)
Response.Error(HttpStatus.INTERNAL_SERVER_ERROR, "Internal Server Error")
else e
}
.onErrorResume { e -> Mono.just(Response(false, null, e as Response.Error)) }
return writeBody(body, returnType, exchange)
}
companion object {
@JvmStatic
private fun methodForReturnType(): Mono<Response<Any>>? = null
private val returnType: MethodParameter = MethodParameter(
ResponseWrapper::class.java.getDeclaredMethod("methodForReturnType"), -1
)
}
}
Edit: I turned this answer into a library for Spring WebFlux 2.7.3, available here.
P.S. I also took a cue from this other question, which faces the same problem but with Java.

Reading Very Complex JSON using Spring Batch

My objective is to read a very complex JSON using Spring Batch. Below is the sample JSON.
{
"order-info" : {
"order-number" : "Test-Order-1"
"order-items" : [
{
"item-id" : "4144769310"
"categories" : [
"ABCD",
"DEF"
],
"item_imag" : "http:// "
"attributes: {
"color" : "red"
},
"dimensions" : {
},
"vendor" : "abcd",
},
{
"item-id" : "88888",
"categories" : [
"ABCD",
"DEF"
],
.......
I understand that I would need to create a Custom ItemReader to parse this JSON.
Kindly provide me some pointers. I am really clueless.
I am now not using CustomItemReader. I am using Java POJOs. My JsonItemReader is as per below:
@Bean
public JsonItemReader<Trade> jsonItemReader() {
ObjectMapper objectMapper = new ObjectMapper();
JacksonJsonObjectReader<Trade> jsonObjectReader =
new JacksonJsonObjectReader<>(Trade.class);
jsonObjectReader.setMapper(objectMapper);
return new JsonItemReaderBuilder<Trade>()
.jsonObjectReader(jsonObjectReader)
.resource(new ClassPathResource("search_data_1.json"))
.name("tradeJsonItemReader")
.build();
}
The exception which I now get is :
java.lang.IllegalStateException: The Json input stream must start with an array of Json objects
From similar posts in this forum I understand that I need to use JsonObjectReader. "You can implement it to read a single json object and use it with the JsonItemReader (either at construction time or using the setter)".
How can I do this either at construction time or using the setter? Please share some code snippet for the same.
The delegate of MultiResourceItemReader should still be a JsonItemReader. You just need to use a custom JsonObjectReader with the JsonItemReader instead of JacksonJsonObjectReader. Visually, this would be: MultiResourceItemReader -- delegates to --> JsonItemReader -- uses --> your custom JsonObjectReader.
Could you please share a code snippet for the above?
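Not from the answer above, but roughly what that wiring could look like, reusing the GenericJsonObjectReader shown in the next answer (the "order-items" target path and the resource names are only illustrative):

import org.springframework.batch.item.json.JsonItemReader;
import org.springframework.batch.item.json.JsonObjectReader;
import org.springframework.batch.item.json.builder.JsonItemReaderBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.ClassPathResource;

// Construction time: hand the custom JsonObjectReader to the builder.
@Bean
public JsonItemReader<Trade> tradeJsonItemReader() {
    JsonObjectReader<Trade> objectReader =
            new GenericJsonObjectReader<>(Trade.class, "order-items");
    return new JsonItemReaderBuilder<Trade>()
            .jsonObjectReader(objectReader)
            .resource(new ClassPathResource("search_data_1.json"))
            .name("tradeJsonItemReader")
            .build();
}

Or, using the setters instead of the builder:

JsonItemReader<Trade> reader = new JsonItemReader<>();
reader.setJsonObjectReader(new GenericJsonObjectReader<>(Trade.class, "order-items"));
reader.setResource(new ClassPathResource("search_data_1.json"));
reader.setName("tradeJsonItemReader");

When several files are involved, this JsonItemReader is then set as the delegate of the MultiResourceItemReader, as described above.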
JacksonJsonItemReader is meant to parse from a root node that is already an array node, so it expects your json to start with '['.
If you want to parse a complex object - in this case, one that has many parent nodes/properties before it gets to the array - you should write a reader. It is really simple to do, and you can follow JacksonJsonObjectReader's structure. Here follows an example of a generic reader for a complex object, with the respective unit tests.
The unit test
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.BlockJUnit4ClassRunner;
import org.springframework.core.io.ByteArrayResource;
import com.example.batch_experiment.dataset.Dataset;
import com.example.batch_experiment.dataset.GenericJsonObjectReader;
import com.example.batch_experiment.json.InvalidArrayNodeException;
import com.example.batch_experiment.json.UnreachableNodeException;
import com.fasterxml.jackson.databind.ObjectMapper;
@RunWith(BlockJUnit4ClassRunner.class)
public class GenericJsonObjectReaderTest {
GenericJsonObjectReader<Dataset> reader;
@Before
public void setUp() {
reader = new GenericJsonObjectReader<Dataset>(Dataset.class, "results");
}
@Test
public void shouldRead_ResultAsRootNode() throws Exception {
reader.open(new ByteArrayResource("{\"result\":{\"results\":[{\"id\":\"a\"}]}}".getBytes()) {});
Assert.assertTrue(reader.getDatasetNode().isArray());
Assert.assertFalse(reader.getDatasetNode().isEmpty());
}
@Test
public void shouldIgnoreUnknownProperty() throws Exception {
String jsonStr = "{\"result\":{\"results\":[{\"id\":\"a\", \"aDifferrentProperty\":0}]}}";
reader.open(new ByteArrayResource(jsonStr.getBytes()) {});
Assert.assertTrue(reader.getDatasetNode().isArray());
Assert.assertFalse(reader.getDatasetNode().isEmpty());
}
@Test
public void shouldIgnoreNullWithoutQuotes() throws Exception {
String jsonStr = "{\"result\":{\"results\":[{\"id\":\"a\",\"name\":null}]}}";
try {
reader.open(new ByteArrayResource(jsonStr.getBytes()) {});
Assert.assertTrue(reader.getDatasetNode().isArray());
Assert.assertFalse(reader.getDatasetNode().isEmpty());
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
@Test
public void shouldThrowException_OnNullNode() throws Exception {
boolean exceptionThrown = false;
try {
reader.open(new ByteArrayResource("{}".getBytes()) {});
} catch (UnreachableNodeException e) {
exceptionThrown = true;
}
Assert.assertTrue(exceptionThrown);
}
@Test
public void shouldThrowException_OnNotArrayNode() throws Exception {
boolean exceptionThrown = false;
try {
reader.open(new ByteArrayResource("{\"result\":{\"results\":{}}}".getBytes()) {});
} catch (InvalidArrayNodeException e) {
exceptionThrown = true;
}
Assert.assertTrue(exceptionThrown);
}
@Test
public void shouldReadObjectValue() {
try {
reader.setJsonParser(new ObjectMapper().createParser("{\"id\":\"a\"}"));
Dataset dataset = reader.read();
Assert.assertNotNull(dataset);
Assert.assertEquals("a", dataset.getId());
} catch (Exception e) {
Assert.fail(e.getMessage());
}
}
}
And the reader:
import java.io.IOException;
import java.io.InputStream;
import java.util.logging.Logger;
import org.springframework.batch.item.ParseException;
import org.springframework.batch.item.json.JsonObjectReader;
import org.springframework.core.io.Resource;
import com.example.batch_experiment.json.InvalidArrayNodeException;
import com.example.batch_experiment.json.UnreachableNodeException;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;
/*
* This class follows the structure of and functions similarly to JacksonJsonObjectReader, with
* the difference that it expects an object as the root node, instead of an array.
*/
public class GenericJsonObjectReader<T> implements JsonObjectReader<T>{
Logger logger = Logger.getLogger(GenericJsonObjectReader.class.getName());
ObjectMapper mapper = new ObjectMapper();
private JsonParser jsonParser;
private InputStream inputStream;
private ArrayNode targetNode;
private Class<T> targetType;
private String targetPath;
public GenericJsonObjectReader(Class<T> targetType, String targetPath) {
super();
this.targetType = targetType;
this.targetPath = targetPath;
}
public JsonParser getJsonParser() {
return jsonParser;
}
public void setJsonParser(JsonParser jsonParser) {
this.jsonParser = jsonParser;
}
public ArrayNode getDatasetNode() {
return targetNode;
}
/*
* JsonObjectReader interface has an empty default method and must be implemented in this case to set
* the mapper and the parser
*/
@Override
public void open(Resource resource) throws Exception {
logger.info("Opening json object reader");
this.inputStream = resource.getInputStream();
JsonNode jsonNode = this.mapper.readTree(this.inputStream).findPath(targetPath);
if (!jsonNode.isMissingNode()) {
this.jsonParser = startArrayParser(jsonNode);
logger.info("Reader open with parser reference: " + this.jsonParser);
this.targetNode = (ArrayNode) jsonNode; // for testing purposes
} else {
logger.severe("Couldn't read target node " + this.targetPath);
throw new UnreachableNodeException();
}
}
@Override
public T read() throws Exception {
try {
if (this.jsonParser.nextToken() == JsonToken.START_OBJECT) {
T result = this.mapper.readValue(this.jsonParser, this.targetType);
logger.info("Object read: " + result.hashCode());
return result;
}
} catch (IOException e) {
throw new ParseException("Unable to read next JSON object", e);
}
return null;
}
/**
* Creates a new parser from an array node
*/
private JsonParser startArrayParser(JsonNode jsonArrayNode) throws IOException {
JsonParser jsonParser = this.mapper.getFactory().createParser(jsonArrayNode.toString());
if (jsonParser.nextToken() == JsonToken.START_ARRAY) {
return jsonParser;
} else {
throw new InvalidArrayNodeException();
}
}
@Override
public void close() throws Exception {
this.inputStream.close();
this.jsonParser.close();
}
}

GraphQL java send custom error in json format

I am working in a GraphQL application where I have to send a custom error object / message in JSON, irrespective of whether the error occurs in the servlet or the service.
Expected error response
{
"errorCode": 400, // error code goes here
"errorMessage": "my error message"
}
It will be helpful if someone could guide me to achieve the above requirement.
GraphQL specification defines a clear format for the error entry in the response.
According to the spec, it should look like this (assuming JSON format is used):
"errors": [
{
"message": "Name for character with ID 1002 could not be fetched.",
"locations": [ { "line": 6, "column": 7 } ],
"path": [ "hero", "heroFriends", 1, "name" ]
"extensions": {/* You can place data in any format here */}
}
]
So you won't find a GraphQL implementation that allows you to extend it and return something like this in the GraphQL execution result, for example:
"errors": [
{
"errorMessage": "Name for character with ID 1002 could not be fetched.",
"errorCode": 404
}
]
However, the spec lets you add data in whatever format in the extension entry. So you could create a custom Exception on the server side and end up with a response that looks like this in JSON:
"errors": [
{
"message": "Name for character with ID 1002 could not be fetched.",
"locations": [ { "line": 6, "column": 7 } ],
"path": [ "hero", "heroFriends", 1, "name" ]
"extensions": {
"errorMessage": "Name for character with ID 1002 could not be fetched.",
"errorCode": 404
}
}
]
It's quite easy to implement this on GraphQL Java, as described in the docs. You can create a custom exception that overrides the getExtensions method and create a map inside the implementation that will then be used to build the content of extensions:
public class CustomException extends RuntimeException implements GraphQLError {
private final int errorCode;
public CustomException(int errorCode, String errorMessage) {
super(errorMessage);
this.errorCode = errorCode;
}
@Override
public Map<String, Object> getExtensions() {
Map<String, Object> customAttributes = new LinkedHashMap<>();
customAttributes.put("errorCode", this.errorCode);
customAttributes.put("errorMessage", this.getMessage());
return customAttributes;
}
@Override
public List<SourceLocation> getLocations() {
return null;
}
@Override
public ErrorType getErrorType() {
return null;
}
}
then you can throw the exception passing in the code and message from inside your data fetchers:
throw new CustomException(400, "A custom error message");
Now, there is another way to tackle this.
Assuming you are working on a Web application, you can return errors (and data, for that matter) in whatever format you want, although that is a bit awkward in my opinion. GraphQL clients, like Apollo, adhere to the spec, so why would you want to return a response in any other format? But anyway, there are lots of different requirements out there.
Once you get a hold of an ExecutionResult, you can create a map or object in whatever format you want, serialise that as JSON and return this over HTTP.
Map<String, Object> result = new HashMap<>();
result.put("data", executionResult.getData());
List<Map<String, Object>> errors = executionResult.getErrors()
.stream()
.map(error -> {
Map<String, Object> errorMap = new HashMap<>();
errorMap.put("errorMessage", error.getMessage());
errorMap.put("errorCode", 404); // get the code somehow from the error object
return errorMap;
})
.collect(toList());
result.put("errors", errors);
// Serialize "result" and return that.
But again, having a response that doesn't comply with the spec doesn't make sense in most of the cases.
The other posted answer didn't work for me.
I found a solution by creating the following classes:
1) A throwable CustomException of GraphQLError type (just like the one mentioned in another answer).
2) A GraphQLError adaptor, which is not a Throwable.
3) A custom GraphQLErrorHandler to filter the custom exception.
Step 1:
The below throwable CustomGraphQLException implements GraphQLError because the GraphQLErrorHandler interface accepts errors only of type GraphQLError.
public class CustomGraphQLException extends RuntimeException implements GraphQLError {
private final int errorCode;
private final String errorMessage;
public CustomGraphQLException(int errorCode, String errorMessage) {
super(errorMessage);
this.errorCode = errorCode;
this.errorMessage = errorMessage;
}
@Override
public List<SourceLocation> getLocations() {
return null;
}
@Override
public ErrorType getErrorType() {
return null;
}
@Override
public String getMessage() {
return this.errorMessage;
}
@Override
public Map<String, Object> getExtensions() {
Map<String, Object> customAttributes = new HashMap<>();
customAttributes.put("errorCode", this.errorCode);
customAttributes.put("errorMessage", this.getMessage());
return customAttributes;
}
}
Step 2:
A non-throwable adaptor of GraphQLError is created to avoid the stack-trace of the above custom exception being passed in the final GraphQL Error Response.
public class GraphQLErrorAdaptor implements GraphQLError {
private final GraphQLError graphQLError;
public GraphQLErrorAdaptor(GraphQLError graphQLError) {
this.graphQLError = graphQLError;
}
@Override
public List<SourceLocation> getLocations() {
return graphQLError.getLocations();
}
@Override
public ErrorType getErrorType() {
return graphQLError.getErrorType();
}
@Override
public String getMessage() {
return graphQLError.getMessage();
}
@Override
public Map<String, Object> getExtensions() {
return graphQLError.getExtensions();
}
}
Step 3:
A custom GraphQLErrorHandler is implemented to filter the custom CustomGraphQLException and avoid its replacement with the default graphQL error response.
public class CustomGraphQLErrorHandler implements GraphQLErrorHandler {
public CustomGraphQLErrorHandler() { }
public List<GraphQLError> processErrors(List<GraphQLError> errors) {
List<GraphQLError> clientErrors = this.filterGraphQLErrors(errors);
List<GraphQLError> internalErrors = errors.stream()
.filter(e -> isInternalError(e))
.map(GraphQLErrorAdaptor::new)
.collect(Collectors.toList());
if (clientErrors.size() + internalErrors.size() < errors.size()) {
clientErrors.add(new GenericGraphQLError("Internal Server Error(s) while executing query"));
errors.stream().filter((error) -> !this.isClientError(error)
).forEach((error) -> {
if (error instanceof Throwable) {
LOG.error("Error executing query!", (Throwable) error);
} else {
LOG.error("Error executing query ({}): {}", error.getClass().getSimpleName(), error.getMessage());
}
});
}
List<GraphQLError> finalErrors = new ArrayList<>();
finalErrors.addAll(clientErrors);
finalErrors.addAll(internalErrors);
return finalErrors;
}
protected List<GraphQLError> filterGraphQLErrors(List<GraphQLError> errors) {
return errors.stream().filter(this::isClientError).collect(Collectors.toList());
}
protected boolean isClientError(GraphQLError error) {
return !(error instanceof ExceptionWhileDataFetching) && !(error instanceof Throwable);
}
protected boolean isInternalError(GraphQLError error) {
return (error instanceof ExceptionWhileDataFetching) &&
(((ExceptionWhileDataFetching) error).getException() instanceof CustomGraphQLException);
}
}
Step 4:
Configure the CustomGraphQLErrorHandler in GraphQLServlet. I am assuming you are using spring-boot for this step.
@Configuration
public class GraphQLConfig {
@Bean
public ServletRegistrationBean graphQLServletRegistrationBean(
QueryResolver queryResolver,
CustomGraphQLErrorHandler customGraphQLErrorHandler) throws Exception {
GraphQLSchema schema = SchemaParser.newParser()
.schemaString(IOUtils.resourceToString("/library.graphqls", Charset.forName("UTF-8")))
.resolvers(queryResolver)
.build()
.makeExecutableSchema();
return new ServletRegistrationBean(new SimpleGraphQLServlet(schema,
new DefaultExecutionStrategyProvider(), null, null, null,
customGraphQLErrorHandler, new DefaultGraphQLContextBuilder(), null,
null), "/graphql");
}
}
Reference

Upload pdf in HTML and Deserialize json file

I'm trying to upload a file in HTML and then send it to my database via Restangular.
My frontend is a combination of Angular and TypeScript, but the upload is a form.
<form enctype="multipart/form-data">
<fieldset class="form-group" ng-repeat="field in $ctrl.metadata.fields">
<label ng-if="field.inputType !== 'hidden'" for="{{field.propertyKey}}"><strong>{{field.name}}</strong></label>
<input ng-if="field.inputType !== 'select' && field.inputType !== 'file'" class="form-control" type="{{field.inputType}}" name="{{field.propertyKey}}" id="{{field.propertyKey}}" ng-model="$ctrl.data[field.propertyKey]"/>
<input ng-if="field.inputType === 'file'" class="form-control" ngf-select type="{{field.inputType}}" name="{{field.propertyKey}}" id="{{field.propertyKey}}" ng-model="$ctrl.data[field.propertyKey]"/>
<sp-dropdown ng-if="field.inputType === 'select'" value="$ctrl.data[field.propertyKey]" api-domain="field.linkedObjectApiDomain" linked-object-name="field.linkedObjectName"></sp-dropdown>
</fieldset>
<button class="btn btn-primary" ng-click="$ctrl.save({item: $ctrl.data})">Save</button>
<button ng-if="$ctrl.metadata.buttons.hasOpen" class="btn btn-primary" ng-click="$ctrl.open()">Open</button>
</form>
I did the databinding of the file with ng-file-upload.
Upon saving we enter this typescript save method.
public save(item: any): any {
console.log("item to save is ", item);
console.log("rapport is ", item["rapport"]);
if (item.id === undefined) {
this.restService.save(this.metadata.apiDomain, item).then((addedItem: any) => {
toastr.success(`${addedItem.naam} successfully created.`, `Overzicht Dossiers Created`);
});
} else {
this.restService.update(this.metadata.apiDomain, item).then((updatedItem: any) => {
toastr.success(`${updatedItem.naam} successfully updated.`, `Overzicht Dossiers Updated`);
});
}
}
The second log with the file gives the json:
lastModified:1463402787393
lastModifiedDate:Mon May 16 2016 14:46:27 GMT+0200 (Romance (zomertijd))
name:"Rapport.pdf"
size:83605
type:"application/pdf"
upload:Promise
webkitRelativePath:""
__proto__:File
On the server side I'm using a Spring project which I didn't set up myself, but the important files are my class which should store this data:
Dossier
/*
* To change this license header, choose License Headers in Project Properties.
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
package be.ugent.lca.data.entities;
import be.ugent.sherpa.entity.BaseEntity;
import java.sql.Date;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.JoinColumn;
import javax.persistence.Lob;
import javax.persistence.ManyToOne;
import javax.persistence.OneToOne;
/**
*
* @author Sam
*/
@Entity
//@JsonDeserialize(using = DossierDeserializer.class)
//@JsonSerialize(using = DossierSerializer.class)
public class Dossier extends BaseEntity{
private String externDossierNr;
private String internDossierNr;
private Date datum;
private Boolean doc;
private Date refKlantDatum;
private String refKlantVerwijzing;
private String verantw;
@OneToOne(fetch=FetchType.LAZY, mappedBy="dossier")
private Offerte offerte;
private String status;
@ManyToOne(fetch=FetchType.EAGER)
@JoinColumn(name = "persoon")
private Persoon persoon;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "OrganisatieFirma")
private OrganisatieFirma organisatieFirma;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "OrganisatieIntern")
private OrganisatieIntern organisatieIntern;
@Lob
@Column(length=100000)
private byte[] rapport;
public Offerte getOfferte() {
return offerte;
}
public void setOfferte(Offerte offerte) {
this.offerte = offerte;
}
public byte[] getRapport() {
return rapport;
}
public void setRapport(byte[] rapport) {
this.rapport = rapport;
}
public OrganisatieFirma getOrganisatieFirma() {
return organisatieFirma;
}
public String getExternDossierNr() {
return externDossierNr;
}
public void setExternDossierNr(String externDossierNr) {
this.externDossierNr = externDossierNr;
}
public String getInternDossierNr() {
return internDossierNr;
}
public void setInternDossierNr(String internDossierNr) {
this.internDossierNr = internDossierNr;
}
public void setOrganisatieFirma(OrganisatieFirma organisatieFirma) {
this.organisatieFirma = organisatieFirma;
}
public OrganisatieIntern getOrganisatieIntern() {
return organisatieIntern;
}
public void setOrganisatieIntern(OrganisatieIntern organisatieIntern) {
this.organisatieIntern = organisatieIntern;
}
public Persoon getPersoon() {
return persoon;
}
public void setPersoon(Persoon persoon) {
this.persoon = persoon;
}
public String getStatus() {
return status;
}
public void setStatus(String status) {
this.status = status;
}
public Date getDatum() {
return datum;
}
public void setDatum(Date datum) {
this.datum = datum;
}
public Date getRefKlantDatum() {
return refKlantDatum;
}
public void setRefKlantDatum(Date refKlantDatum) {
this.refKlantDatum = refKlantDatum;
}
public String getRefKlantVerwijzing() {
return refKlantVerwijzing;
}
public void setRefKlantVerwijzing(String refKlantVerwijzing) {
this.refKlantVerwijzing = refKlantVerwijzing;
}
public String getVerantw() {
return verantw;
}
public void setVerantw(String verantw) {
this.verantw = verantw;
}
public Boolean getDoc() {
return doc;
}
public void setDoc(Boolean doc) {
this.doc = doc;
}
}
and my repository for this class
/*
* To change this license header, choose License Headers in Project Properties.
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
package be.ugent.lca.data.repository;
import be.ugent.lca.data.entities.Dossier;
import be.ugent.lca.data.query.DossierQuery;
import be.ugent.sherpa.repository.RestRepository;
import org.springframework.data.rest.core.annotation.RepositoryRestResource;
/**
*
* @author Sam
*/
@RepositoryRestResource(collectionResourceRel = "dossiers", path = "dossiers")
public interface DossierRepository extends RestRepository<Dossier, DossierQuery<?>>{
}
When trying to save a file to my database the server gives this exception
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Can not deserialize instance of byte[] out of START_OBJECT token
This led me to believe that I have to write my own deserializer for Dossier
Thus:
package be.ugent.lca.data.entities.deserializers;
import be.ugent.lca.data.entities.Dossier;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.ObjectCodec;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonNode;
import java.io.IOException;
public class DossierDeserializer extends JsonDeserializer {
@Override
public Dossier deserialize(JsonParser jsonParser,
DeserializationContext deserializationContext) throws IOException {
ObjectCodec oc = jsonParser.getCodec();
JsonNode root = oc.readTree(jsonParser);
Dossier dossier = new Dossier();
dossier.setExternDossierNr(root.get("externDossierNr").asText());
dossier.setInternDossierNr(root.get("internDossierNr").asText());
return dossier;
}
}
But my problem is that I don't know how exactly to deserialize the file json, since writing out root.get("rapport") gives back an empty string.
Any help would be much appreciated.
I've worked out the file upload.
First of all I split the file upload from the rest of my data so I won't have to rewrite the automatic deserialization for everything that does work.
this.restService.save(this.metadata.apiDomain, item).then((addedItem: any) => {
toastr.success(`${addedItem.naam} successfully created.`, `Overzicht Dossiers Created`);
console.log("created item ", addedItem);
var fd = new FormData();
fd.append("rapport", item["rapport"]);
this.restService.one('dossiers/' + addedItem.id + '/rapport').withHttpConfig({transformRequest: angular.identity}).customPOST(fd, '', undefined, {'Content-Type': undefined}).then(
(addedDossier: any) => {
console.log("posted dossier ", addedDossier);
}
);
});
In the callback of my normal save I do the custom POST to dossiers/{id}/rapport; for this I need a custom controller.
@BasePathAwareController
@RequestMapping("/dossiers/{id}")
@ExposesResourceFor(Dossier.class)
public class DossierController {
The BasePathAwareController makes sure that all automatically generated paths that you don't override keep existing.
@Autowired
private DossierRepository dossierRepository;
With this I inject my repository to connect to my database.
@RequestMapping(path = "/rapport", method = RequestMethod.POST)//,headers = "content-type=multipart/form-data")
public @ResponseBody String postRapport(@PathVariable("id") Long id, @RequestParam("rapport") MultipartFile file) {
String name = "rapport";
System.out.println("Entered custom file upload with id " + id);
if (!file.isEmpty()) {
try {
byte[] bytes = file.getBytes();
Dossier dossier = dossierRepository.findOne(id);
dossier.setRapport(bytes);
dossierRepository.save(dossier);
return "You successfully uploaded " + name + " into " + name + "-uploaded !";
} catch (Exception e) {
return "You failed to upload " + name + " => " + e.getMessage();
}
} else {
return "You failed to upload " + name + " because the file was empty.";
}
}
Like this I'm able to successfully upload my file.

Keep tags order using SnakeYAML

I'm trying to translate yaml files to json, but the translation re-orders the tags...
Ex, YAML source:
zzzz:
b: 456
a: dfff
aa:
s10: "dddz"
s3: eeee
bbb:
- b1
- a2
snakeYAML produces:
{
"aa": {
"s3": "eeee",
"s10":"dddz"
},
"bbb":[
"b1",
"a2"
],
"zzzz": {
"a": "dfff",
"b":456
}
}
Create the following class in your code. It is a tweaked version of the SnakeYAML source that uses LinkedHashMap and LinkedHashSet, which keep insertion order, instead of TreeMap and TreeSet, which auto-sort the entries.
import java.beans.FeatureDescriptor;
import java.beans.IntrospectionException;
import java.beans.Introspector;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Field;
import java.lang.reflect.Method;
import java.lang.reflect.Modifier;
import java.util.*;
import org.yaml.snakeyaml.error.YAMLException;
import org.yaml.snakeyaml.introspector.*;
import org.yaml.snakeyaml.util.PlatformFeatureDetector;
public class CustomPropertyUtils extends PropertyUtils {
private final Map<Class<?>, Map<String, Property>> propertiesCache = new HashMap<Class<?>, Map<String, Property>>();
private final Map<Class<?>, Set<Property>> readableProperties = new HashMap<Class<?>, Set<Property>>();
private BeanAccess beanAccess = BeanAccess.DEFAULT;
private boolean allowReadOnlyProperties = false;
private boolean skipMissingProperties = false;
private PlatformFeatureDetector platformFeatureDetector;
public CustomPropertyUtils() {
this(new PlatformFeatureDetector());
}
CustomPropertyUtils(PlatformFeatureDetector platformFeatureDetector) {
this.platformFeatureDetector = platformFeatureDetector;
/*
* Android lacks much of java.beans (including the Introspector class, used here), because java.beans classes tend to rely on java.awt, which isn't
* supported in the Android SDK. That means we have to fall back on FIELD access only when SnakeYAML is running on the Android Runtime.
*/
if (platformFeatureDetector.isRunningOnAndroid()) {
beanAccess = BeanAccess.FIELD;
}
}
protected Map<String, Property> getPropertiesMap(Class<?> type, BeanAccess bAccess) {
if (propertiesCache.containsKey(type)) {
return propertiesCache.get(type);
}
Map<String, Property> properties = new LinkedHashMap<String, Property>();
boolean inaccessableFieldsExist = false;
switch (bAccess) {
case FIELD:
for (Class<?> c = type; c != null; c = c.getSuperclass()) {
for (Field field : c.getDeclaredFields()) {
int modifiers = field.getModifiers();
if (!Modifier.isStatic(modifiers) && !Modifier.isTransient(modifiers)
&& !properties.containsKey(field.getName())) {
properties.put(field.getName(), new FieldProperty(field));
}
}
}
break;
default:
// add JavaBean properties
try {
for (PropertyDescriptor property : Introspector.getBeanInfo(type)
.getPropertyDescriptors()) {
Method readMethod = property.getReadMethod();
if ((readMethod == null || !readMethod.getName().equals("getClass"))
&& !isTransient(property)) {
properties.put(property.getName(), new MethodProperty(property));
}
}
} catch (IntrospectionException e) {
throw new YAMLException(e);
}
// add public fields
for (Class<?> c = type; c != null; c = c.getSuperclass()) {
for (Field field : c.getDeclaredFields()) {
int modifiers = field.getModifiers();
if (!Modifier.isStatic(modifiers) && !Modifier.isTransient(modifiers)) {
if (Modifier.isPublic(modifiers)) {
properties.put(field.getName(), new FieldProperty(field));
} else {
inaccessableFieldsExist = true;
}
}
}
}
break;
}
if (properties.isEmpty() && inaccessableFieldsExist) {
throw new YAMLException("No JavaBean properties found in " + type.getName());
}
System.out.println(properties);
propertiesCache.put(type, properties);
return properties;
}
private static final String TRANSIENT = "transient";
private boolean isTransient(FeatureDescriptor fd) {
return Boolean.TRUE.equals(fd.getValue(TRANSIENT));
}
public Set<Property> getProperties(Class<? extends Object> type) {
return getProperties(type, beanAccess);
}
public Set<Property> getProperties(Class<? extends Object> type, BeanAccess bAccess) {
if (readableProperties.containsKey(type)) {
return readableProperties.get(type);
}
Set<Property> properties = createPropertySet(type, bAccess);
readableProperties.put(type, properties);
return properties;
}
protected Set<Property> createPropertySet(Class<? extends Object> type, BeanAccess bAccess) {
Set<Property> properties = new LinkedHashSet<>();
Collection<Property> props = getPropertiesMap(type, bAccess).values();
for (Property property : props) {
if (property.isReadable() && (allowReadOnlyProperties || property.isWritable())) {
properties.add(property);
}
}
return properties;
}
public Property getProperty(Class<? extends Object> type, String name) {
return getProperty(type, name, beanAccess);
}
public Property getProperty(Class<? extends Object> type, String name, BeanAccess bAccess) {
Map<String, Property> properties = getPropertiesMap(type, bAccess);
Property property = properties.get(name);
if (property == null && skipMissingProperties) {
property = new MissingProperty(name);
}
if (property == null) {
throw new YAMLException(
"Unable to find property '" + name + "' on class: " + type.getName());
}
return property;
}
public void setBeanAccess(BeanAccess beanAccess) {
if (platformFeatureDetector.isRunningOnAndroid() && beanAccess != BeanAccess.FIELD) {
throw new IllegalArgumentException(
"JVM is Android - only BeanAccess.FIELD is available");
}
if (this.beanAccess != beanAccess) {
this.beanAccess = beanAccess;
propertiesCache.clear();
readableProperties.clear();
}
}
public void setAllowReadOnlyProperties(boolean allowReadOnlyProperties) {
if (this.allowReadOnlyProperties != allowReadOnlyProperties) {
this.allowReadOnlyProperties = allowReadOnlyProperties;
readableProperties.clear();
}
}
public boolean isAllowReadOnlyProperties() {
return allowReadOnlyProperties;
}
/**
* Skip properties that are missing during deserialization of YAML to a Java
* object. The default is false.
*
* @param skipMissingProperties
* true if missing properties should be skipped, false otherwise.
*/
public void setSkipMissingProperties(boolean skipMissingProperties) {
if (this.skipMissingProperties != skipMissingProperties) {
this.skipMissingProperties = skipMissingProperties;
readableProperties.clear();
}
}
public boolean isSkipMissingProperties() {
return skipMissingProperties;
}
}
Then, create your Yaml instance like this:
DumperOptions options = new DumperOptions();
CustomPropertyUtils customPropertyUtils = new CustomPropertyUtils();
Representer customRepresenter = new Representer();
customRepresenter.setPropertyUtils(customPropertyUtils);
Yaml yaml = new Yaml(customRepresenter, options);
Profit!
Whether property order is kept depends on the Java implementation and is not guaranteed.
In order to control the YAML generation you will need to implement your own custom Representer overriding getProperties; see the example below:
package io.github.rockitconsulting.test.rockitizer.configuration;
import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;
import org.yaml.snakeyaml.DumperOptions;
import org.yaml.snakeyaml.introspector.Property;
import org.yaml.snakeyaml.representer.Representer;
/**
* Custom implementation of {@link Representer} and {@link Comparator}
* to keep the needed order of javabean properties of model classes,
* thus generating the understandable yaml
*
*/
public class ConfigurationModelRepresenter extends Representer {
public ConfigurationModelRepresenter() {
super();
}
public ConfigurationModelRepresenter(DumperOptions options) {
super(options);
}
protected Set<Property> getProperties(Class<? extends Object> type) {
Set<Property> propertySet;
if (typeDefinitions.containsKey(type)) {
propertySet = typeDefinitions.get(type).getProperties();
} else {
// fall back to the default property discovery when no type definition is registered
propertySet = getPropertyUtils().getProperties(type);
}
List<Property> propsList = new ArrayList<>(propertySet);
Collections.sort(propsList, new BeanPropertyComparator());
return new LinkedHashSet<>(propsList);
}
class BeanPropertyComparator implements Comparator<Property> {
public int compare(Property p1, Property p2) {
if (p1.getType().getCanonicalName().contains("util") && !p2.getType().getCanonicalName().contains("util")) {
return 1;
} else if(p2.getName().endsWith("Name")|| p2.getName().equalsIgnoreCase("name")) {
return 1;
} else {
return -1;
} // returning 0 would merge keys
}
}
}
The below snippet shows the usage of the newly created class to generate the yaml structure:
DumperOptions options = new DumperOptions();
ConfigurationModelRepresenter customRepresenter = new ConfigurationModelRepresenter();
Yaml yaml = new Yaml(customRepresenter, options);
StringWriter writer = new StringWriter();
yaml.dump(suite, writer);
FileWriter fw = new FileWriter(rootPathTestSrc + "config_gen.yml", false);
fw.write(writer.toString());
fw.close();
This approach is much cleaner compared to the one suggested above.