"Invalid JSON Number" Spring Data Mongo 2.0.2 - json

** FIXED **
All I had to do was add an apostrophe before and after each argument index,
i.e., change:
@Query(value = "{'type': 'Application','name': ?0,'organizationId': ?1}", fields = "{_id:1}")
to:
@Query(value = "{'type': 'Application','name': '?0','organizationId': '?1'}", fields = "{_id:1}")
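Presumably this works because Spring Data now runs the @Query value through a JSON parser before binding the parameters, and a bare ?0 does not scan as a valid JSON value, while the quoted '?0' does. A minimal sketch of the fixed repository; the interface name and the MongoRepository extends clause are my assumptions, the annotated method itself is taken from the question:

// Hypothetical repository interface; only the @Query method comes from the question.
public interface PoliciesRepository extends MongoRepository<Policies, String> {

    @Query(value = "{'type': 'Application', 'name': '?0', 'organizationId': '?1'}", fields = "{_id: 1}")
    Policies findPolicyByNameAndOrganizationId(String name, String organizationId);
}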
===================
I recently upgraded my MongoDB and my Spring-Data-MongoDB Driver.
I used to access my MongoDB through mongoRepository using this code:
@Query(value = "{'type': 'Application','name': ?0,'organizationId': ?1}", fields = "{_id:1}")
Policies findPolicyByNameAndOrganizationId(String name, String organizationId);
Where Policies is the object I want to consume.
After performing the Spring update, I now get the following error when accessing the method above:
org.bson.json.JsonParseException: Invalid JSON number
I fear this is because I use Spring's MongoConverter (in the case of this specific object only) to map documents to objects.
Here is my Reader Converter:
public class ApplicationPolicyReadConverotor implements Converter<Document, ApplicationPolicy> {

    private MongoConverter mongoConverter;

    public ApplicationPolicyReadConverotor(MongoConverter mongoConverter) {
        this.mongoConverter = mongoConverter;
    }

    @Override
    public ApplicationPolicy convert(Document source) {
        ApplicationPolicyEntity entity = mongoConverter.read(ApplicationPolicyEntity.class, source);
        ApplicationPolicy policy = new ApplicationPolicy();
        addFields(policy, entity);
        addPackages(policy, entity);
        return policy;
    }

    // addFields(...) and addPackages(...) omitted for brevity
}
And here is my Writer Converter:
public class ApplicationPolicyWriteConvertor implements Converter<ApplicationPolicy, Document> {

    private MongoConverter mongoConverter;

    public ApplicationPolicyWriteConvertor(MongoConverter mongoConverter) {
        this.mongoConverter = mongoConverter;
    }

    @Override
    public Document convert(ApplicationPolicy source) {
        System.out.println("mashuWrite");
        ApplicationPolicyEntity target = new ApplicationPolicyEntity();
        copyFields(source, target);
        copyPackages(source, target);
        Document doc = new Document();
        mongoConverter.write(target, doc);
        return doc;
    }

    // copyFields(...) and copyPackages(...) omitted for brevity
}
I checked the Spring reference (2.0.2) regarding MongoConverter and how it works, and at this stage I think I'm doing it correctly.
Other objects that do not use mapping/conversions suffer no problems.
So did this object (ApplicationPolicy), until I upgraded my Mongo and my Spring driver.
My MongoDB is 3.4.10 and my Spring Data MongoDB driver is 2.0.2.
Here's the code that initializes the MappingMongoConverter object (and adds my custom converters):
SimpleMongoDbFactory simpleMongoDbFactory = new SimpleMongoDbFactory(client, dbName);
DefaultDbRefResolver defaultDbRefResolver = new DefaultDbRefResolver(simpleMongoDbFactory);
MongoMappingContext mongoMappingContext = new MongoMappingContext();
MappingMongoConverter mappingMongoConverter =
        new MappingMongoConverter(defaultDbRefResolver, mongoMappingContext);
mappingMongoConverter.setMapKeyDotReplacement("_dot_");

// Adding custom read and write converters for permission policy.
mappingMongoConverter.setCustomConversions(new MongoCustomConversions(Arrays.asList(
        new ApplicationPolicyWriteConvertor(mappingMongoConverter),
        new ApplicationPolicyReadConverotor(mappingMongoConverter))));
mappingMongoConverter.afterPropertiesSet();

final MongoTemplate template = new MongoTemplate(simpleMongoDbFactory, mappingMongoConverter);
return template;
I know for sure that the ReaderConverter works (at least in some cases), since other parts of the software use the custom ReaderConverter I've written and it behaves as expected.
Also, when using debug mode (IntelliJ), I never reach the conversion code block when invoking the following:
@Query(value = "{'type': 'Application','name': ?0,'organizationId': ?1}", fields = "{_id:1}")
Policies findPolicyByNameAndOrganizationId(String name, String organizationId);
So basically I'm kinda clueless. I have a sense my converter implementation is messy, but I couldn't fix it.

Related

Deep copy of properties from one object to another with Jackson?

I am performing an UPDATE operation such that all the non-null properties set in the incoming POJO are copied into another object (of the same type), and this should happen for nested properties too.
Ex:
POJO:
public class Person {
    private String homePhoneNumber;
    private String officePhoneNumber;
    private Address address;

    public String getHomePhoneNumber() {
        return homePhoneNumber;
    }
    // getters/setters
}

public class Address {
    private String street;
    private String houseNumber;

    public String getStreet() {
        return street;
    }
    // getters/setters
}
// Source
Person sourcePerson = new Person();
sourcePerson.setHomePhoneNumber("123");
Address address1 = new Address();
address1.setStreet("Street");
sourcePerson.setAddress(address1);

// Dest person
Person destPerson = new Person();
destPerson.setOfficePhoneNumber("456");
destPerson.setHomePhoneNumber("123");
Address address2 = new Address();
address2.setStreet("Street2");
address2.setHouseNumber("246");
destPerson.setAddress(address2);

ObjectMapper mapper = new ObjectMapper();
// skip setters for null values
mapper.setDefaultSetterInfo(JsonSetter.Value.forValueNulls(Nulls.SKIP));
Person result = mapper.updateValue(destPerson, sourcePerson);
So I want all non-null properties set in sourcePerson to be copied into destPerson, i.e. overriding only those properties of destPerson which sourcePerson has set, keeping the other properties unchanged.
Using
Person result = mapper.updateValue(destPerson, sourcePerson);
is not working for nested properties: it replaces the whole Address object from source to destination.
I searched through Jackson and found its merge feature:
mapper.setDefaultMergeable(true);
However, adding this configuration makes null values in sourcePerson nullify those in destPerson too, which seems strange.
mapper.configOverride(Address.class).setMergeable(true);
This configuration does what I wanted. But I have many nested POJO resources, so I don't want a specific configuration for each POJO.
Can this be achieved with Jackson in a clean way?
You can start by enabling error reporting with respect to merging: com.fasterxml.jackson.databind.MapperFeature#IGNORE_MERGE_FOR_UNMERGEABLE needs to be false.
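A minimal sketch of that setting (the mapper variable is mine; the feature flag itself is real jackson-databind API):

ObjectMapper mapper = new ObjectMapper();
// Fail loudly instead of silently ignoring merge on properties that cannot be merged.
mapper.disable(MapperFeature.IGNORE_MERGE_FOR_UNMERGEABLE);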
It's indeed strange that mapper.configOverride() sort of works, but mapper.setDefaultMergeable() does not.
I don't see setters in your example. Aren't you using @JsonSetter annotations in the Person class by any chance? They would override the mapper configuration.
In the jackson-databind unit tests, I see they use mapper.readerForUpdating() rather than mapper.updateValue():
private final ObjectMapper MAPPER = objectMapperBuilder()
        // 26-Oct-2016, tatu: Make sure we'll report merge problems by default
        .disable(MapperFeature.IGNORE_MERGE_FOR_UNMERGEABLE)
        .build();

public void testBeanMergingWithNullDefault() throws Exception
{
    // By default `null` should simply overwrite value
    ConfigDefault config = MAPPER.readerForUpdating(new ConfigDefault(5, 7))
            .readValue(aposToQuotes("{'loc':null}"));
    assertNotNull(config);
    assertNull(config.loc);

    // but it should be possible to override setting to, say, skip
    // First: via specific type override
    // important! We'll specify for value type to be merged
    ObjectMapper mapper = newObjectMapper();
    mapper.configOverride(AB.class)
            .setSetterInfo(JsonSetter.Value.forValueNulls(Nulls.SKIP));
    config = mapper.readerForUpdating(new ConfigDefault(137, -3))
            .readValue(aposToQuotes("{'loc':null}"));
    assertNotNull(config.loc);
    assertEquals(137, config.loc.a);
    assertEquals(-3, config.loc.b);

    // Second: by global defaults
    mapper = newObjectMapper();
    mapper.setDefaultSetterInfo(JsonSetter.Value.forValueNulls(Nulls.SKIP));
    config = mapper.readerForUpdating(new ConfigDefault(12, 34))
            .readValue(aposToQuotes("{'loc':null}"));
    assertNotNull(config.loc);
    assertEquals(12, config.loc.a);
    assertEquals(34, config.loc.b);
}
It's also worth trying com.fasterxml.jackson.annotation.JsonMerge directly in the Person class.
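A minimal sketch of that approach, reusing the field names from the question (@JsonMerge exists since jackson-annotations 2.9):

public class Person {
    private String homePhoneNumber;
    private String officePhoneNumber;

    // Merge the incoming Address into the existing one instead of replacing it wholesale.
    @JsonMerge
    private Address address;

    // getters/setters
}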

Castle windsor: how to pass arguments to deep dependencies?

I have the following dependency chain:
IUserAppService
IUserDomainService
IUserRepository
IUserDataContext - UserDataContextImpl(string conn)
All the interfaces above and their implementations are registered in a Castle Windsor container. When I use one connection string, everything works fine.
Now we want to support multiple databases. In UserAppServiceImpl.cs, we want to get a different IUserRepository (a different IUserDataContext) according to userId, as below:
// UserAppServiceImpl.cs
public UserInfo GetUserInfo(long userId)
{
    var connStr = userId % 2 == 0 ? "conn1" : "conn2";
    //var repo = container.Resolve<IUserRepository>(....)
}
How can I pass the argument connStr to UserDataContextImpl?
Since the connection string is runtime data in your case, it should not be injected directly into the constructors of your components, as explained here. Since the connection string is contextual data, however, it would be awkward to pass it along all the public methods in your object graph.
Instead, you should hide it behind an abstraction that allows you to retrieve the proper value for the current request. For instance:
public interface ISqlConnectionFactory
{
    SqlConnection Open();
}
An implementation of ISqlConnectionFactory could itself depend on a dependency that allows retrieving the current user id:

public interface IUserContext
{
    int UserId { get; }
}
Such connection factory might therefore look like this:
public class SqlConnectionFactory : ISqlConnectionFactory
{
    private readonly IUserContext userContext;
    private readonly string con1;
    private readonly string con2;

    public SqlConnectionFactory(IUserContext userContext,
        string con1, string con2) {
        ...
    }

    public SqlConnection Open() {
        // Pick the proper connection string for the current user.
        var connStr = userContext.UserId % 2 == 0 ? con1 : con2;
        var con = new SqlConnection(connStr);
        con.Open();
        return con;
    }
}
This leaves us with an IUserContext implementation. Such implementation will depend on the type of application we are building. For ASP.NET it might look like this:
public class AspNetUserContext : IUserContext
{
    public int UserId =>
        int.Parse((string) HttpContext.Current.Session["UserId"]);
}
You have to start at the beginning of your dependency resolution and bind each of the derived dependencies to a "named" registration.
GitHub docs link: https://github.com/castleproject/Windsor/blob/master/docs/inline-dependencies.md
Example:
I have an IDataContext for MSSQL and another for MySQL.
This example is in Unity, but I am sure Windsor can do the same.
container.RegisterType(Of IDataContextAsync, dbEntities)("db", New InjectionConstructor())
container.RegisterType(Of IUnitOfWorkAsync, UnitOfWork)("UnitOfWork", New InjectionConstructor(New ResolvedParameter(Of IDataContextAsync)("db")))

' Exceptions example
container.RegisterType(Of IRepositoryAsync(Of Exception), Repository(Of Exception))("iExceptionRepository",
    New InjectionConstructor(New ResolvedParameter(Of IDataContextAsync)("db"),
        New ResolvedParameter(Of IUnitOfWorkAsync)("UnitOfWork")))

SQL container:

container.RegisterType(Of IDataContextAsync, DataMart)(New HierarchicalLifetimeManager)
container.RegisterType(Of IUnitOfWorkAsync, UnitOfWork)(New HierarchicalLifetimeManager)

' Brands
container.RegisterType(Of IRepositoryAsync(Of Brand), Repository(Of Brand))
Controller code: no changes are required at the controller level.
Results: I can now have my MSSQL context do its work and my MySQL context do its work without any developer having to understand the container configuration. The developer simply consumes the correct service and everything is wired up for them.

Spring MVC Test, MockMVC: Conveniently convert objects to/from JSON

I am used to JAX-RS and would like to have similar comfort when sending requests using Spring MVC and working with the responses, i.e. on the client side inside my tests.
On the server (controller) side I'm quite happy with the automatic conversion, i.e. it suffices to just return an object instance and have JSON in the resulting HTTP response sent to the client.
Could you tell me how to work around the manual process of converting objectInstance to jsonString or vice versa in these snippets? If possible, I'd also like to skip configuring the content type manually.
String jsonStringRequest = objectMapper.writeValueAsString(objectInstance);
ResultActions resultActions = mockMvc.perform(post(PATH)
        .contentType(MediaType.APPLICATION_JSON)
        .content(jsonStringRequest));

String jsonStringResponse = resultActions.andReturn().getResponse().getContentAsString();
Some objectInstanceResponse = objectMapper.readValue(jsonStringResponse, Some.class);
For comparison, with the JAX-RS client API I can easily send an object using request.post(Entity.entity(objectInstance, MediaType.APPLICATION_JSON_TYPE)) and read the response using response.readEntity(Some.class).
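As far as I know MockMvc has no built-in equivalent, but a thin helper can hide the plumbing. A minimal sketch, with invented names (JsonTestUtils, jsonPost, fromResult), assuming a shared ObjectMapper:

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.http.MediaType;
import org.springframework.test.web.servlet.ResultActions;
import org.springframework.test.web.servlet.request.MockHttpServletRequestBuilder;
import org.springframework.test.web.servlet.request.MockMvcRequestBuilders;

public class JsonTestUtils {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Build a POST request with the body serialized as JSON and the content type preset.
    public static MockHttpServletRequestBuilder jsonPost(String path, Object body)
            throws JsonProcessingException {
        return MockMvcRequestBuilders.post(path)
                .contentType(MediaType.APPLICATION_JSON)
                .content(MAPPER.writeValueAsString(body));
    }

    // Read the response body back into the given type.
    public static <T> T fromResult(ResultActions actions, Class<T> type) throws Exception {
        return MAPPER.readValue(actions.andReturn().getResponse().getContentAsString(), type);
    }
}

Usage would then shrink to something like:
Some response = JsonTestUtils.fromResult(mockMvc.perform(JsonTestUtils.jsonPost(PATH, objectInstance)), Some.class);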
If you have lots of response objects, you could create a generic JSON-to-object mapper factory. It could then detect the object type from a generic response (all response objects inherit from the same generic class) and respond/log properly on a bad mapping attempt.
I do not have a code example at hand, but as pseudocode:
public abstract class GenericResponse {
    public String responseClassName = null;
    // get/set
}
In the server code, add the name of the actual response object to this class.
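For instance, on the server side it might look like this (MyResponse is a hypothetical subclass, purely illustrative):

MyResponse resp = new MyResponse();
// Let the client-side factory know which concrete type this payload carries.
resp.setResponseClassName(MyResponse.class.getSimpleName());
return resp;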
The JsonToObject factory
public class ConverterFactory<T> {

    private final ObjectMapper mapper = new ObjectMapper();
    private final Class<T> objectType;

    public ConverterFactory(Class<T> type) {
        objectType = type;
    }

    public T convert(String jsonString) throws IOException {
        // Type check
        GenericResponse genResp = mapper.readValue(jsonString, GenericResponse.class);
        if (objectType.getSimpleName().equals(genResp.getResponseClassName())) {
            // ObjectMapper code
            return mapper.readValue(jsonString, objectType);
        } else {
            // Error handling
            throw new IllegalArgumentException("Unexpected response type: " + genResp.getResponseClassName());
        }
    }
}
I think this could be extended with annotations to do more automation magic on the response (start checking with BeanPostProcessor).
@Component
public class AnnotationWorker implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(final Object bean, String name) throws BeansException {
        ReflectionUtils.doWithFields(bean.getClass(), field -> {
            // make the field accessible if defined private
            ReflectionUtils.makeAccessible(field);
            if (field.getAnnotation(MyAnnotation.class) != null) {
                field.set(bean, log); // 'log' comes from the surrounding class in the original project
            }
        });
        return bean;
    }
}
The above code snippet is copied from my current project and it injects into fields; you'd need to change it so that it works for methods, wherever you may need that.
Having all of this implemented may be tricky, and I can't say for certain that it works, but it's something to try if you don't mind a bit of educational work.

WinRT XmlAnyElement and Serialization

We have a Windows Store application that communicates with our server using XML for requests/responses, serialized with the XmlSerializer. The issue we are encountering is that one of our types can contain arbitrary XML in one of its properties. In non-WinRT applications, the usage would have been:
public sealed class ItemExtension {
    [XmlAttribute("source")]
    public string Source { get; set; }

    [XmlAnyElement]
    public XmlElement[] Data { get; set; }
}
This would allow us to have XML in our database like
<extension source="foo"><randomXml><data/></randomXml></extension>
In WinRT, XmlElement is not included: System.Xml.XmlElement does not exist, and Windows.Data.Xml.Dom.XmlElement is not compatible. The documentation mentions XElement, but XElement is not a supported WinRT type, so the WinRT project won't compile if I try using it.
Is this a bug with Windows Store applications, or is there a sufficient workaround?
Thanks.
So far I've only found a hack to get this working. If we use
[XmlAnyElement]
public object Data { get; set; }
this will properly deserialize existing data. Inspecting it in the debugger shows it is of type System.Xml.XmlElement, which isn't exposed in WinRT, so there's no way to set it directly. Since we figured out that the XmlSerializer can instantiate and access System.Xml.XmlElement, we use it to handle setting the property: we take an object/XML snippet, wrap it in container XML for a wrapper type that carries [XmlAnyElement], and call Deserialize on it so the XmlSerializer instantiates an XmlElement, which can then be set on the target object we wish to serialize.
For getting data out, since reading this property throws an exception in the UI layer, and accessing InnerXml/OuterXml throws as well, we are left with using the XmlSerializer to serialize the XmlElement back into a string, which can then be used however you want.
public sealed class XmlAnyElementContainer
{
    [XmlAnyElement]
    public object Data { get; set; }
}

public void SetData(object extensionObject)
{
    var objectSerializer = new XmlSerializer(extensionObject.GetType());
    var settings = new XmlWriterSettings()
    {
        Indent = false,
        OmitXmlDeclaration = true
    };

    var sb = new StringBuilder();
    using (var xmlWriter = XmlWriter.Create(sb, settings))
    {
        objectSerializer.Serialize(xmlWriter, extensionObject);
    }

    string objectXml = sb.ToString();
    string newXml = "<XmlAnyElementContainer>" + objectXml + "</XmlAnyElementContainer>";
    var xmlAnySerializer = new XmlSerializer(typeof(XmlAnyElementContainer));
    using (var sr = new StringReader(newXml))
    {
        [TargetPropertyToSerialize] = (xmlAnySerializer.Deserialize(sr) as XmlAnyElementContainer).Data;
    }
}

Session scoped managed bean not available in servlet when using another browser than IE

I have been using the following bit of code in a servlet to locate a session-scoped backing bean (as suggested by BalusC) without problems until recently. Now it only works in Internet Explorer: Chrome and Firefox appear to be getting a totally new backing bean rather than the original one. When calling functions on the backing bean, it falls over with null pointer errors for objects in the backing bean that were definitely initialized in the original.
FacesContext facesContext = FacesUtil.getFacesContext(req, res);
ProductSelection productSelection = (ProductSelection) facesContext.getApplication()
        .evaluateExpressionGet(facesContext, "#{productSelection}", ProductSelection.class);
if (productSelection.getProductType() == null)
{
    System.out.println("Servlet: product type is NULL; did not get the original backing bean");
}
else
{
    System.out.println("Servlet: product type is: " + productSelection.getProductType().getProductTypeName());
}
It is a while since I tested this code, and there have been several updates to Java since then, but I'm not sure whether those are the cause, whether I have changed something in my configuration, or whether Chrome and Firefox have changed something in their code (unlikely). Is anyone else having similar problems? I am at a loss as to where to go from here, as there do not appear to be any errors associated with not finding the backing bean, and my debugging skills for the Java library code are not that great (they don't comment their code very well and it is hard to follow); any suggestions would be greatly appreciated.
I am using NetBeans 7.01, JSF 2.0, GlassFish 3.1, and a Derby database. I tested it on my tower and my laptop and it is doing it on both (Win XP and Win 7). The JRE is 7 update 40, build 1.7.0_40-b43; the JDK is 1.6.0_04. Chrome is version 29.0.1547.76 m, Firefox is 23.0.1, Internet Explorer is 8.0.6001.18702.
The FacesUtil is slightly different from BalusC's code (but it was working fine):
package searchselection;

import javax.faces.FactoryFinder;
import javax.faces.component.UIViewRoot;
import javax.faces.context.FacesContext;
import javax.faces.context.FacesContextFactory;
import javax.faces.lifecycle.Lifecycle;
import javax.faces.lifecycle.LifecycleFactory;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;

// By BalusC http://balusc.blogspot.com
// Utility to get the FacesContext.
// Used by the CriteriaServlet to get the backing bean when the user submits a customised
// search criteria object.
public class FacesUtil
{
    // Getters -----------------------------------------------------------------

    public static FacesContext getFacesContext(ServletRequest request, ServletResponse response)
    {
        // Get current FacesContext.
        FacesContext facesContext = FacesContext.getCurrentInstance();

        // Check current FacesContext.
        if (facesContext == null)
        {
            // Create new Lifecycle.
            LifecycleFactory lifecycleFactory = (LifecycleFactory) FactoryFinder.getFactory(FactoryFinder.LIFECYCLE_FACTORY);
            Lifecycle lifecycle = lifecycleFactory.getLifecycle(LifecycleFactory.DEFAULT_LIFECYCLE);

            // Create new FacesContext.
            FacesContextFactory contextFactory = (FacesContextFactory) FactoryFinder.getFactory(FactoryFinder.FACES_CONTEXT_FACTORY);
            facesContext = contextFactory.getFacesContext(
                    request.getServletContext(), request, response, lifecycle);

            // Create new View.
            UIViewRoot view = facesContext.getApplication().getViewHandler().createView(
                    facesContext, "");
            facesContext.setViewRoot(view);

            // Set current FacesContext.
            FacesContextWrapper.setCurrentInstance(facesContext);
        }
        return facesContext;
    }

    // Helpers -----------------------------------------------------------------

    // Wrap the protected FacesContext.setCurrentInstance() in an inner class.
    private static abstract class FacesContextWrapper extends FacesContext
    {
        protected static void setCurrentInstance(FacesContext facesContext)
        {
            FacesContext.setCurrentInstance(facesContext);
        }
    }
}
Kind thanks in advance...
Workaround: the session ID changes on Firefox and Chrome when the servlet is called from the applet, for some reason. I ended up storing the session ID and setting it on the HttpURLConnection to the servlet, which forces the servlet to get the correct backing bean.
In the productSelection backing bean:
private String sessionID = ""; // With getter
...
FacesContext facesContext = FacesContext.getCurrentInstance();
HttpSession session = (HttpSession) facesContext.getExternalContext().getSession(false);
sessionID = session.getId();
On the web page containing the applet, I use a JavaScript function to wait until the applet is fully loaded before telling it to load a criteria file, which the user will modify and then send back to the backing bean for processing. I simply pass the session ID along with the criteria file to the applet:
<SCRIPT language="javascript">
    function waitUntilLoaded()
    {
        if (document.criteriaApplet.isActive())
        {
            var object = document.getElementById("criteriaApplet");
            criteriaApplet.loadCriteriaFile((object.codeBase + "#{productSelection.productUsage.searchCriteriaObjectUrl}"), "#{productSelection.sessionID}");
        }
        else
        {
            setTimeout(waitUntilLoaded, 500);
        }
    }
</SCRIPT>
In the applet button code, to submit the modified criteria file back to the backing bean via the servlet, I added the session ID to the HttpURLConnection:
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setDoInput(true);
connection.setDoOutput(true);
connection.setUseCaches(false);
connection.setRequestMethod("POST");
connection.setRequestProperty("Content-Type", "application/x-java-serialized-object");
connection.setRequestProperty("Cookie","JSESSIONID=" + sessionID);
ObjectOutputStream out = new ObjectOutputStream(connection.getOutputStream());
out.writeObject(searchSubmitObject);
out.flush();
out.close();
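For completeness, the servlet side would then read the object back with an ObjectInputStream, along these lines (a sketch of my understanding, not code from the question; kept Java 6 compatible):

@Override
protected void doPost(HttpServletRequest req, HttpServletResponse res)
        throws ServletException, IOException {
    ObjectInputStream in = new ObjectInputStream(req.getInputStream());
    try {
        // The serialized criteria object written by the applet.
        Object searchSubmitObject = in.readObject();
        // Locate the backing bean via FacesUtil.getFacesContext(req, res) as shown above.
    } catch (ClassNotFoundException e) {
        throw new ServletException(e);
    } finally {
        in.close();
    }
}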