postgresql - search on json using hibernate specification

I am facing an issue while searching the json field. Any help would be appreciated. Here are the sample code snippets and the error I am facing.
Entity class
@TypeDef(name = "cardjsonb", typeClass = JsonBinaryType.class)
public class TransactionEntity {
    @Type(type = "cardjsonb")
    @Column(name = "card_details", columnDefinition = "jsonb")
    private TransactionCardDetails cardDetails;
}
public class TransactionCardDetails {
private Date expiryDate; //MM-YYYY
private String cardOnName;
private String cardNumber;
}
specification snippet
return (root, query, builder) -> builder.equal(
builder.function("json_extract_path_text", JsonBinaryType.class, root.get("cardDetails"), builder.literal("cardOnName")),
builder.literal("David")
);
Error details
ERROR 20628 --- [nio-8080-exec-2] o.h.engine.jdbc.spi.SqlExceptionHelper : ERROR: function json_extract_path_text(jsonb, character varying) does not exist
Hint: No function matches the given name and argument types. You might need to add explicit type casts.
Position: 1732

If you are using jsonb as input, you need to use the jsonb_extract_path_text function instead.
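For reference, here is a minimal sketch of the corrected specification, assuming the same mapping as in the question (the result type is also switched to String.class, since the function returns text):
return (root, query, builder) -> builder.equal(
    builder.function("jsonb_extract_path_text", String.class,
        root.get("cardDetails"), builder.literal("cardOnName")),
    builder.literal("David")
);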

Related

How to access nested Json Object Value into JPA Entity Class

I have a payload like this
{
"eventId":"ep9_0579af51",
"eventTime":"5/11/2022 5:50:58 PM",
"eventType":"UpdateTransaction",
"meta":{
"userId":"vkp",
"resourceType":"Transaction/DataDocs"
}
}
I need to map these json fields into a single entity class.
@PostMapping(path = "/id", consumes = "application/json")
public ResponseEntity<ImportTrans> importTransaction(@RequestBody ImportTrans importTrans) {
    return ResponseEntity.of(Optional.ofNullable(repository.save(importTrans)));
}
@Table(name = "IMPORT_TRANS")
@Entity
public class ImportTrans implements Serializable {
    @Id
    private Long processId; // auto-generated
    private String eventId;
    private Date eventTime;
    private String eventType;
    private String userId; // I don't want to create a new class for meta.
                           // Is there any way I can access meta.userId in ImportTrans?
    private String resourceType;
}
How can I access the data from meta in ImportTrans without creating a separate class for it?
You should modify your request body before it reaches the controller.
"You must consider the application performance factors on your own
before implementation"
Option 1. Using RequestBodyAdvice (see the sketch after this list).
Option 2. Using Spring HandlerInterceptor.
Option 3. Use AOP
Option 4. Using HTTP Filter.
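As an illustration of option 1, here is a minimal, hypothetical sketch of a RequestBodyAdvice that flattens meta into top-level fields before Jackson binds the body to ImportTrans (the class name and the flattening behaviour are assumptions, not code from the question):
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.lang.reflect.Type;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import org.springframework.core.MethodParameter;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpInputMessage;
import org.springframework.http.converter.HttpMessageConverter;
import org.springframework.web.bind.annotation.ControllerAdvice;
import org.springframework.web.servlet.mvc.method.annotation.RequestBodyAdviceAdapter;

@ControllerAdvice
public class FlattenMetaRequestBodyAdvice extends RequestBodyAdviceAdapter {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public boolean supports(MethodParameter methodParameter, Type targetType,
                            Class<? extends HttpMessageConverter<?>> converterType) {
        // only rewrite request bodies that will be bound to ImportTrans
        return ImportTrans.class.equals(targetType);
    }

    @Override
    public HttpInputMessage beforeBodyRead(HttpInputMessage inputMessage, MethodParameter parameter,
                                           Type targetType,
                                           Class<? extends HttpMessageConverter<?>> converterType) throws IOException {
        ObjectNode root = (ObjectNode) mapper.readTree(inputMessage.getBody());
        JsonNode meta = root.remove("meta");
        if (meta instanceof ObjectNode) {
            // copy meta.userId, meta.resourceType, ... up to the top level
            root.setAll((ObjectNode) meta);
        }
        byte[] rewritten = mapper.writeValueAsBytes(root);
        return new HttpInputMessage() {
            @Override
            public InputStream getBody() {
                return new ByteArrayInputStream(rewritten);
            }

            @Override
            public HttpHeaders getHeaders() {
                return inputMessage.getHeaders();
            }
        };
    }
}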
The below solution only works if you are using a separate DTO class.
private Map<String, String> meta = new HashMap<>();
String userID = importTrans.getMeta().get("userId");
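Expanded into a minimal sketch of such a DTO (the class name ImportTransDto and the String eventTime field are assumptions for illustration):
import java.util.HashMap;
import java.util.Map;

public class ImportTransDto {
    private String eventId;
    private String eventTime; // parse/convert to a date type as needed
    private String eventType;
    private Map<String, String> meta = new HashMap<>(); // "meta" binds as a plain map

    public Map<String, String> getMeta() { return meta; }
    public void setMeta(Map<String, String> meta) { this.meta = meta; }
    // remaining getters and setters omitted
}
The controller would then accept ImportTransDto, read meta.get("userId") and meta.get("resourceType"), and copy the values into the ImportTrans entity before saving.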
I hope the above pieces of information answered your question.

springboot server returns error 400 after post request using postman

Whenever I try to send this over Postman:
{
"date": "2021-11-05 12:32:32",
"start": "start",
"destination": "destination",
"provider": "provider",
"driver":1,
"vehicule":1
}
I get a 400 Bad Request error. I'm using both the @RestController and @RequestBody annotations while also setting the content type to JSON.
I get this error in the debugger:
2021-11-09 16:57:52.086 WARN 11748 --- [nio-8080-exec-1] .w.s.m.s.DefaultHandlerExceptionResolver : Resolved [org.springframework.http.converter.HttpMessageNotReadableException: JSON parse error: Cannot deserialize value of type `java.util.Date` from String "2021-11-06 12:32:32.0": not a valid representation (error: Failed to parse Date value '2021-11-06 12:32:32.0': Cannot parse date "2021-11-06 12:32:32.0": while it seems to fit format 'yyyy-MM-dd'T'HH:mm:ss.SSSX', parsing fails (leniency? null)); nested exception is com.fasterxml.jackson.databind.exc.InvalidFormatException: Cannot deserialize value of type `java.util.Date` from String "2021-11-06 12:32:32.0": not a valid representation (error: Failed to parse Date value '2021-11-06 12:32:32.0': Cannot parse date "2021-11-06 12:32:32.0": while it seems to fit format 'yyyy-MM-dd'T'HH:mm:ss.SSSX', parsing fails (leniency? null))
at [Source: (PushbackInputStream); line: 3, column: 17] (through reference chain: com.siam.HRAssistTool.Entity.Schedule["date"])]
I don't understand how I should fix this, which I assume is a date-format-related issue.
When I remove the time from the JSON body and only leave the date, I get this error:
2021-11-09 17:34:55.418 WARN 11748 --- [nio-8080-exec-4] .w.s.m.s.DefaultHandlerExceptionResolver : Resolved [org.springframework.http.converter.HttpMessageNotReadableException: JSON parse error: Cannot construct instance of `com.siam.HRAssistTool.Entity.Vehicule` (although at least one Creator exists): no int/Int-argument constructor/factory method to deserialize from Number value (1); nested exception is com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot construct instance of `com.siam.HRAssistTool.Entity.Vehicule` (although at least one Creator exists): no int/Int-argument constructor/factory method to deserialize from Number value (1)
at [Source: (PushbackInputStream); line: 8, column: 20] (through reference chain: com.siam.HRAssistTool.Entity.Schedule["vehicule"])]
My Schedule entity:
@Entity
public class Schedule implements Serializable {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;
    private Date date;
    private String Start;
    private String destination;
    @OneToOne(fetch = FetchType.LAZY)
    private Staff driver;
    @OneToOne(fetch = FetchType.LAZY)
    private Vehicule vehicule;
    private String provider;
    // constructors, getters and setters
}
My controller:
@RestController
public class ScheduleController {
    @Autowired
    ScheduleService scheduleService;

    @PostMapping(value = "/schedule/create")
    public @ResponseBody String createSchedule(@RequestBody Schedule schedule) {
        System.out.println(schedule.toString());
        return scheduleService.addSchedule(schedule);
    }
    // other crud operations
}
First of all, replace Date with LocalDate, which is part of the Java Time API. With this, you can configure Jackson to handle serialization and deserialization of such a type easily. Add the following dependency:
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-jsr310</artifactId>
<version>2.11.0</version>
</dependency>
And then configure Jackson accordingly:
@Configuration
public class JacksonConfiguration {
    @Bean
    public ObjectMapper objectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        JavaTimeModule javaTimeModule = new JavaTimeModule();
        objectMapper.registerModule(javaTimeModule);
        objectMapper.configure(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS, false);
        return objectMapper;
    }
}
Then, please avoid using Entities in your Controller, either as a request or response type. Instead, use DTOs, which are specific representations of your core model Entities.
public class ScheduleCreationDto {
private LocalDate date;
private String Start;
private String destination;
private Long driverId; // I am guessing the ID is a Long
private Long vehiculeId; // I am guessing the ID is a Long
private String provider;
//constructors, getters and setters
}
This should now be used as the request body:
@RestController
public class ScheduleController {
    @Autowired
    ScheduleService scheduleService;

    @PostMapping(value = "/schedule/create")
    public @ResponseBody String createSchedule(@RequestBody ScheduleCreationDto scheduleCreationDto) {
        return scheduleService.addSchedule(scheduleCreationDto);
    }
    // other crud operations
}
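With this DTO in place, a request body along the following lines should bind cleanly (the ISO format is what LocalDate expects, and the driverId/vehiculeId key names come from the DTO above):
{
"date": "2021-11-05",
"start": "start",
"destination": "destination",
"provider": "provider",
"driverId": 1,
"vehiculeId": 1
}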
You also need to change ScheduleService so that it creates a Schedule based on ScheduleCreationDto. Most of the properties require a simple mapping, but others (driverId and vehiculeId) require you to actually fetch those Entities from the database using the provided ID. Something similar to the following should be done in your ScheduleService:
@Service
public class ScheduleService {
    @Autowired
    ScheduleRepository scheduleRepository;
    @Autowired
    StaffRepository staffRepository;
    @Autowired
    VehiculeRepository vehiculeRepository;

    public String addSchedule(ScheduleCreationDto scheduleCreationDto) {
        Optional<Staff> driver = staffRepository.findById(scheduleCreationDto.getDriverId());
        Optional<Vehicule> vehicule = vehiculeRepository.findById(scheduleCreationDto.getVehiculeId());
        if (driver.isPresent() && vehicule.isPresent()) {
            Schedule schedule = new Schedule(scheduleCreationDto.getDate(), scheduleCreationDto.getStart(),
                    scheduleCreationDto.getDestination(), driver.get(), vehicule.get(), scheduleCreationDto.getProvider());
            scheduleRepository.save(schedule);
        }
        return // whatever String you want to return; you should actually return the created Schedule, but that is a different topic
    }
    // other crud operations
}
A 400 Bad Request error with Postman and a Spring Boot API happened, in my case, for three reasons:
1. The first is that the JSON format of the request is wrong, like sending:
{ key: value }
Or:
{ "key" : "value"
This is clearly not your case.
2. The second cause was sending keys different from what the object was expecting. For example:
@PostMapping
public ResponseEntity<Object> save(@RequestBody @Valid ClassOfReciveObject reciveObject) {
    return ResponseEntity.status(HttpStatus.CREATED).body("OK");
}
If ClassOfReciveObject has these properties:
{
public String age;
public String name;
}
and you send other keys in Postman, like the following, you will get a Bad Request:
{
"country":"Brazil",
"Continent":"America"
}
3. The third case where I got this error was because of the private access modifier on the attributes of the class; make them public, or find another way to resolve it (for example, by adding getters and setters):
public class ClassOfObjectRecived {
public String param1;
public String param2;
}
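For completeness, a minimal sketch of the more common alternative, keeping the fields private and exposing getters and setters so Jackson can bind them (reusing the class name from above):
public class ClassOfObjectRecived {
    private String param1;
    private String param2;

    public String getParam1() { return param1; }
    public void setParam1(String param1) { this.param1 = param1; }
    public String getParam2() { return param2; }
    public void setParam2(String param2) { this.param2 = param2; }
}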

Switch from JsonStringType to JsonBinaryType when the project uses both MySQL and PostgreSQL

I have a problem with a JSON column when it's necessary to switch from PostgreSQL to MariaDB/MySQL.
I use Spring Boot + JPA + Hibernate + hibernate-types-52.
The table I want to map is like this:
CREATE TABLE atable(
...
acolumn JSON,
...
);
OK, it works for both PostgreSQL and MariaDB/MySQL.
The problem is when I want to deploy an application that switches easily from one to the other, because the correct hibernate-types-52 implementations for PostgreSQL and MySQL/MariaDB are different.
This works on MySQL/MariaDB:
@Entity
@Table(name = "atable")
@TypeDef(name = "json", typeClass = JsonStringType.class)
public class Atable {
    ...
    @Type(type = "json")
    @Column(name = "acolumn", columnDefinition = "json")
    private JsonNode acolumn;
    ...
}
This works on PostgreSQL:
@Entity
@Table(name = "atable")
@TypeDef(name = "json", typeClass = JsonBinaryType.class)
public class Atable {
    ...
    @Type(type = "json")
    @Column(name = "acolumn", columnDefinition = "json")
    private JsonNode acolumn;
    ...
}
Any kind of solution to switch between JsonBinaryType and JsonStringType (or any other way to solve this) is appreciated.
With the Hypersistence Utils project, you can just use JsonType, which works with PostgreSQL, MySQL, Oracle, SQL Server, or H2.
So, use JsonType instead of JsonBinaryType or JsonStringType:
@Entity
@Table(name = "atable")
@TypeDef(name = "json", typeClass = JsonType.class)
public class Atable {
    @Type(type = "json")
    @Column(name = "acolumn", columnDefinition = "json")
    private JsonNode acolumn;
}
That's it!
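For reference, the dependency would look roughly like the following; the artifact suffix must match your Hibernate version, and the coordinates/version shown here are an assumption to be checked against Maven Central (JsonType is also shipped in recent releases of the com.vladmihalcea:hibernate-types-52 artifact already used in the question):
<dependency>
    <groupId>io.hypersistence</groupId>
    <artifactId>hypersistence-utils-hibernate-55</artifactId>
    <!-- version is an example; use the latest release for your Hibernate version -->
    <version>3.7.0</version>
</dependency>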
There are some crazy things you can do - with the limitation that this only works for specific types and columns:
First, to replace the static @TypeDef with a dynamic mapping:
You can use a HibernatePropertiesCustomizer to add a TypeContributorList:
@Configuration
public class HibernateConfig implements HibernatePropertiesCustomizer {

    @Value("${spring.jpa.database-platform:}")
    private Class<? extends Dialect> dialectClass;

    @Override
    public void customize(Map<String, Object> hibernateProperties) {
        AbstractHibernateType<Object> jsonType;
        if (dialectClass != null && PostgreSQL92Dialect.class.isAssignableFrom(dialectClass)) {
            jsonType = new JsonBinaryType(Atable.class);
        } else {
            jsonType = new JsonStringType(Atable.class);
        }
        hibernateProperties.put(EntityManagerFactoryBuilderImpl.TYPE_CONTRIBUTORS,
                (TypeContributorList) () -> List.of(
                        (TypeContributor) (TypeContributions typeContributions, ServiceRegistry serviceRegistry) ->
                                typeContributions.contributeType(jsonType, "myType")));
    }
}
So this is limited to Atable.class now, and I have named this custom JSON type 'myType', i.e. you annotate your property with @Type(type = "myType").
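On the entity, the mapping would then look roughly like this (a sketch based on the table from the question; the static @TypeDef is no longer needed):
@Entity
@Table(name = "atable")
public class Atable {
    @Type(type = "myType") // resolved via the TypeContributor registered above
    @Column(name = "acolumn", columnDefinition = "json")
    private JsonNode acolumn;
}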
I'm using the configured Dialect here, but in my application I'm checking the active profiles for DB-specific profiles.
Also note that TypeContributions.contributeType(BasicType, String...) is deprecated since Hibernate 5.3. I haven't looked into the new mechanism yet.
So that covers the @Type part, but if you want to use Hibernate schema generation, you'll still need the @Column(columnDefinition = "...") bit, so Hibernate knows which column type to use.
This is where it starts feeling a bit yucky. We can register an Integrator to manipulate the mapping Metadata:
hibernateProperties.put(EntityManagerFactoryBuilderImpl.INTEGRATOR_PROVIDER,
(IntegratorProvider) () -> Collections.singletonList(JsonColumnMappingIntegrator.INSTANCE));
As a demo I'm only checking for PostgreSQL and I'm applying the dynamic columnDefinition only to a specific column in a specific entity:
public class JsonColumnMappingIntegrator implements Integrator {

    public static final JsonColumnMappingIntegrator INSTANCE = new JsonColumnMappingIntegrator();

    @Override
    public void integrate(
            Metadata metadata,
            SessionFactoryImplementor sessionFactory,
            SessionFactoryServiceRegistry serviceRegistry) {
        Database database = metadata.getDatabase();
        if (PostgreSQL92Dialect.class.isAssignableFrom(database.getDialect().getClass())) {
            Column acolumn =
                    (Column) metadata.getEntityBinding(Atable.class.getName()).getProperty("acolumn").getColumnIterator().next();
            acolumn.setSqlType("json");
        }
    }

    @Override
    public void disintegrate(SessionFactoryImplementor sessionFactory, SessionFactoryServiceRegistry serviceRegistry) {
    }
}
metadata.getEntityBindings() would give you all Entity Bindings, over which you can iterate and then iterate over the properties. This seems quite inefficient though.
I'm also not sure whether you can set things like 'IS JSON' constraints etc., so a custom create script would be better.

Mapping BIGDECIMAL from Mongo, lack CONSTRUCTOR

I have a problem when I want to get an object from Mongo with a BigDecimal field.
I have the following structure in Mongo:
{
"_id":ObjectId("546b07420c74bf96c7c3cd5f"),
"accountId":"1",
"modelVersion":"seasonal_optimized",
"yearMonth":"20143",
"income":{
"unscaled":{
"$numberLong":"68500"
},
"scale":2
},
"expense":{
"unscaled":{
"$numberLong":"125900"
},
"scale":2
}
}
And the entity is:
@Data
@NoArgsConstructor
@AllArgsConstructor
@Document(collection = "forecasts")
public class Forecast {
    private String accountId;
    private LocalDate monthYear;
    private String modelVersion;
    private BigDecimal income;
    private BigDecimal expense;
}
I'm trying to retrieve an object from Mongo, but I get the following error:
org.springframework.data.mapping.model.MappingInstantiationException: Failed to instantiate java.math.BigDecimal using constructor NO_CONSTRUCTOR with arguments.
Can anybody help me?
Thank you!
It's an old question, but I had the same issue using Spring Data Mongo and the aggregation framework, specifically when I wanted a BigDecimal as the return type.
A possible workaround is to wrap your BigDecimal in a class and provide it as your output type, i.e.:
@Data
public class AverageTemperature {
    private BigDecimal averageTemperature;
}
@Override
public AverageTemperature averageTemperatureByDeviceIdAndDateRange(String deviceId, Date from, Date to) {
    final Aggregation aggregation = newAggregation(
            match(
                    Criteria.where("deviceId")
                            .is(deviceId)
                            .andOperator(
                                    Criteria.where("time").gte(from),
                                    Criteria.where("time").lte(to)
                            )
            ),
            group("deviceId")
                    .avg("temperature").as("averageTemperature"),
            project("averageTemperature")
    );
    AggregationResults<AverageTemperature> result = mongoTemplate.aggregate(aggregation, mongoTemplate.getCollectionName(YourDocumentClass.class), AverageTemperature.class);
    List<AverageTemperature> mappedResults = result.getMappedResults();
    return (mappedResults.isEmpty() ? null : mappedResults.get(0));
}
In the example above the aggregation calculates the average temperature as BigDecimal.
Keep in mind that the default BigDecimal converter maps it to a string in MongoDB when saving (see the Spring Data mapping/conversion documentation).
Retrieving the document as-is should not be a problem anymore with Spring Data Mongo. The versions I used:
org.springframework.boot:spring-boot-starter-data-mongodb:jar:2.0.1.RELEASE
spring-data-mongodb:jar:2.0.6.RELEASE
org.mongodb:mongodb-driver:jar:3.6.3

How to have a field member which is persisted in another schema?

Assume the following (I'm using MySQL):
@PersistenceCapable(identityType = IdentityType.APPLICATION, detachable = "true")
public class TclRequest2 {
    @PrimaryKey
    @Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY)
    private long id;

    @Persistent(column = "userid")
    @Column(jdbcType = "INTEGER", length = 11, allowsNull = "false", defaultValue = "1")
    private Member member; // This object's table is in another schema

    // Getters and setters
}
The field member is persisted in another schema. I could solve this by specifying the "catalog" attribute in the Member class's @PersistenceCapable annotation, but that would kill the flexibility of specifying the schema name in the properties file, since I'm configuring JDO in a properties file.
Thank you.