I have a MyBatis mapper method that inserts data from a POJO into a table. Initially I was receiving a SQL exception (invalid column type 1111) which turned out to be because one of the values I was inserting into a column was null.
As described in this post, the solution was to specify the jdbcType of the null variable (setting it to #{myVariable, jdbcType=VARCHAR} worked fine in my case).
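For reference, the fixed mapper looks roughly like this (an annotation-style sketch; the interface, table, and column names are illustrative, not my actual code):

import org.apache.ibatis.annotations.Insert;

public interface MyMapper {
    // jdbcType=VARCHAR tells MyBatis which SQL type to send when
    // myVariable is null, which avoids the "invalid column type 1111" error.
    @Insert("INSERT INTO my_table (my_column) VALUES (#{myVariable, jdbcType=VARCHAR})")
    void insertRecord(MyPojo pojo);
}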
My question is how to test this fix? I have a test method that calls the mapper insert method and I'm inserting a POJO with null values for fields that haven't had their jdbcType specified. I would expect the test to fail with a SQL exception but the test is passing successfully.
I am using the R2DBC jasync driver to generate and execute an update query on a MySQL table containing a JSON column.
I am trying to do a JSON_SET operation on the column using R2dbcEntityTemplate with the org.springframework.data.relational.core.query.Update.update and org.springframework.data.relational.core.query.Update.set methods. I want to generate a query equivalent to:
UPDATE mytable SET json_col = JSON_SET(json_col, '$.path_to_update', CAST('{"a":"b"}' AS JSON)) WHERE id=:id
But I get the following error:
invalid json text: "the document is empty." at position 0 in value for column
From what I understand, Update.set() is not delegating the JSON_SET call to the database; instead, the database tries to interpret the generated string as JSON, which is not valid JSON at position 0.
How can I use Update.set() so that the MySQL function call is delegated to the database before the value is interpreted as JSON?
I tried using dbClient.sql().bind() and it worked, but the same thing does not work through dbTemplate.update().
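For reference, this is roughly the DatabaseClient version that does work (it assumes an injected DatabaseClient named dbClient; parameter names are illustrative):

// Binding the JSON fragment as a parameter keeps JSON_SET in the SQL,
// so MySQL evaluates the function instead of receiving a pre-rendered string.
dbClient
    .sql("UPDATE mytable SET json_col = JSON_SET(json_col, '$.path_to_update', CAST(:json AS JSON)) WHERE id = :id")
    .bind("json", "{\"a\":\"b\"}")
    .bind("id", id)
    .fetch()
    .rowsUpdated();  // subscribe/await as appropriate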
I am trying to build a DatabaseTable (my custom object) by querying JDBC DatabaseMetaData and ResultSet.
The code below works fine when I run it against a MySQL database, but it fails with an exception when run against an Exasol database.
private DatabaseTableColumnMetadata getTableAndColumnNames(DatabaseMetaData metaData)
        throws SQLException {
    ResultSet tables = metaData.getTables(null, null, "%", null);
    while (tables.next()) {
        if (tables.getString("TABLE_TYPE").equalsIgnoreCase("TABLE")
                && ((ResultSetImpl) tables).getConnection().getCatalog()
                        .equals(tables.getString("TABLE_CAT"))) {
            DatabaseTable table = DatabaseTable.builder()
                    .name(tables.getString("TABLE_NAME"))
                    .schema(tables.getString("TYPE_SCHEM"))
                    .build();
        }
    }
}
The exception thrown is as below:
java.lang.ClassCastException: com.exasol.jdbc.EXAResultSet cannot be cast to com.mysql.cj.jdbc.result.ResultSetImpl
The exception is thrown at the line where the tables object is cast to ResultSetImpl.
I have both jars in my project: exajdbc.jar as well as mysql-connector.jar.
Any help or clue to solve this problem, please.
I'm not sure why you've written this expression:
((ResultSetImpl) tables).getConnection().getCatalog()
In this expression you are casting tables to a MySQL-specific result-set implementation class, just so that you can get access to its getConnection() method. It's perhaps not surprising you're getting an error attempting to use this code to read data from an Exasol database, because in this situation tables will be an Exasol-specific result-set implementation class.
Surely this result set comes from the same database connection as the DatabaseMetaData object that gets passed into your method? I would expect the getConnection() method of the DatabaseMetaData object to return the same connection.
Try replacing the expression above with the following:
metaData.getConnection().getCatalog()
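Applied to your method, the fix would look roughly like this (a sketch based on the code in your question; note that the standard JDBC column name for the schema in getTables() results is TABLE_SCHEM, not TYPE_SCHEM):

private DatabaseTableColumnMetadata getTableAndColumnNames(DatabaseMetaData metaData)
        throws SQLException {
    // Portable: DatabaseMetaData already knows the connection it came from,
    // so no driver-specific cast is needed.
    String catalog = metaData.getConnection().getCatalog();
    try (ResultSet tables = metaData.getTables(null, null, "%", null)) {
        while (tables.next()) {
            if (tables.getString("TABLE_TYPE").equalsIgnoreCase("TABLE")
                    && catalog.equals(tables.getString("TABLE_CAT"))) {
                DatabaseTable table = DatabaseTable.builder()
                        .name(tables.getString("TABLE_NAME"))
                        .schema(tables.getString("TABLE_SCHEM"))
                        .build();
                // ... collect the table as in your original code
            }
        }
    }
    // ... build and return the DatabaseTableColumnMetadata
}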
I'm using Spring Boot 1.3.3 and created a REST controller to add a JSON object into MongoDB collections.
The data to be added from the JSON object is a subset of the information received in the request, so I have created a JSON request object (DTO) and an entity object (model) to be stored in the Mongo collection.
I'm facing an issue now: the JSON request object is populated with default values for the integer (0) and boolean (false) data types even when these fields are not populated in the request message. I don't want to store these values in the database.
I have added spring.jackson.serialization-inclusion=non-null and spring.jackson.serialization-inclusion=non-default properties to my application.properties file, but the fields are still populated with default values.
Could anyone please help me out in resolving this issue and bypassing the default values? NOTE: It works fine for String data types, as they are null by default if not set.
Thanks in advance
String attributes accept the null value, while primitive attributes have a default value; for example, 0 is the default value for int attributes. To avoid having these values, use Integer instead.
Please use this annotation above the fields in your bean class with which you are facing the problem, and tell me if your problem is solved:
@JsonInclude(Include.NON_NULL)
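For example, a DTO along these lines (field names are illustrative):

import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonInclude.Include;

@JsonInclude(Include.NON_NULL)
public class MyRequestDto {
    private String name;
    private Integer count;   // wrapper instead of int: stays null when absent
    private Boolean active;  // wrapper instead of boolean: stays null when absent
    // getters and setters omitted
}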
Thanks
We're using Neo4j 1.8.2 Advanced with Spring Data Neo4j 2.2.0.RELEASE and Spring Framework 3.2.0.RELEASE. We use a lot of custom queries, defined with the @Query annotation in our repository interfaces.
While writing tests we encountered a problem with one of our queries. As far as we know, when a query uses a start node which does not exist, Neo4j throws a NotFoundException, which is then translated to a DataRetrievalFailureException.
But for our query the exception is translated to an InvalidDataAccessResourceUsageException instead, which normally indicates that the query is incorrect. We think our query looks fine, so we don't understand why the InvalidDataAccessResourceUsageException is thrown. When we test it with an existing start node, the query returns the expected results.
The query:
START person = node({0})
MATCH person -[attributeRel:ATTRIBUTE]-> attribute -[:ATTRIBUTE_CATEGORY]-> category
WHERE attributeRel.value! <> 'N' AND attributeRel.value! <> 'Unbekannt/nicht bewertet'
RETURN category, COLLECT(attribute), COLLECT(attributeRel)
ORDER BY category.name
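For context, the query is declared on a repository method roughly like this (the interface and method names are illustrative, not our actual code):

import java.util.Map;

import org.springframework.data.neo4j.annotation.Query;
import org.springframework.data.neo4j.repository.GraphRepository;

public interface PersonRepository extends GraphRepository<Person> {

    // {0} is bound to the method's first parameter, our start node.
    @Query("START person = node({0}) "
            + "MATCH person -[attributeRel:ATTRIBUTE]-> attribute -[:ATTRIBUTE_CATEGORY]-> category "
            + "WHERE attributeRel.value! <> 'N' AND attributeRel.value! <> 'Unbekannt/nicht bewertet' "
            + "RETURN category, COLLECT(attribute), COLLECT(attributeRel) "
            + "ORDER BY category.name")
    Iterable<Map<String, Object>> findAttributesGroupedByCategory(Person person);
}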
Is this a bug in Spring Data Neo4j, is our query wrong, or is the exception type correct and we just don't understand why it gets thrown?
Here is the scenario.
I have a JPA entity class with the field
@Temporal(TemporalType.TIMESTAMP)
@Column(name = "CREATED_DATE", length = 19)
private Date createdDate = null;
This field is mapped to a column CREATED_DATE of type datetime (I am using MySQL) in a table.
When I select the rows of the table, the CREATED_DATE column shows both the date and the time. When I fetch the row data using query.getSingleResult and typecast the returned object to the entity class, the createdDate field of the entity also has both date and time:
request = (MyEntityClass) query.getSingleResult();
System.out.println(request.getCreatedDate());
prints 2012-05-18 06:32:57.0 in the catalina.out log.
Now I am sending this entity class object to a particular client request as a JSON object. The JSON response the client receives, though, when printed, shows only the date, not the date and time:
ClientResponse<String> response = clientRequest.get(String.class);
System.out.println("Response getResource ::"+response.getStatus()+"::"+response.getEntity());
console output:
Response getResource ::200::[{ ....... ,"createdDate":"18-05-2012", ......}]
I am using the RESTEasy JSON API on the server side.
I have seen a similar question (similar SO post) on SO, but the answer discussed there doesn't seem to work for me.
Please help.
The problem is related to the way the object is serialized to JSON. You probably configured it in one way or another (annotation, default implementation, config file) so that dates are serialized this way.
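If Jackson 2.x is the JSON provider in your RESTEasy setup, one possible fix is to pin the format on the entity field (a sketch; the pattern is whatever you actually want to emit):

import java.util.Date;
import com.fasterxml.jackson.annotation.JsonFormat;

// Serializes the full timestamp instead of a date-only string.
@JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "yyyy-MM-dd HH:mm:ss")
private Date createdDate;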