Is there a way to create a dictionary of namespaces from a pyxb binding class?
A dictionary mapping what to what?
The set of all namespaces known to PyXB (which is defined by the set of binding modules that have been imported) can be obtained from pyxb.namespace.Namespace.AvailableNamespaces(). The elements are Namespace instances.
I have a model_A which I am using in various parent models (Parent_1 and Parent_2).
Each parent model has a configuration reference.
The problem I am having is that model_A gets the configuration reference from the last parent model I configure.
Is there a way that each reference of model_A inherits the configuration reference from the Parent_X in which it is referenced?
In other words, is there any option to set the Source location of the dictionary to "inherit"?
The solution I ended up with was to define a third dictionary for the reusable components holding the common parameters, and to link this dictionary against the dictionaries of the parent objects. To summarise:
Common Dictionary: with common parameters. Reusable models link against this dictionary
Parent Dictionary 1: Has a link to Common + Specific Params
Parent Dictionary 2: Has a link to Common + its own specific params (same structure as Parent Dictionary 1)
Hope this helps other people facing similar problems.
I am trying to write a SessionCustomizer that will translate camelCase field names to under_score. I have found solutions that basically follow the following strategy: loop over the ClassDescriptor objects in Session.getDescriptors().values(), and then loop over the DatabaseMapping objects in ClassDescriptor.getMappings().
The problem is that this only reaches the direct attributes of a class; it does nothing to the attributes of an @Embeddable class, a list of which is an attribute of the main class through @ElementCollection (and ends up in a separate table).
I believe such an attribute is encoded as an AggregateCollectionMapping (an indirect subclass of DatabaseMapping), but I cannot find the list of DatabaseMapping objects that this should in turn have, and which I would like to loop over again.
EclipseLink uses AggregateCollectionMapping and AggregateObjectMapping for embedded attribute mappings.
The class descriptor of an embedded attribute can be found at rootEntityClassDescriptor#mappings#embeddedMapping#referenceDescriptor (i.e. getMappings() on the root descriptor, then getReferenceDescriptor() on the aggregate mapping). The reference descriptor contains the attribute mappings of the embedded entity.
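To tie the two together, here is a rough sketch of what such a customizer could look like, recursing from the root descriptors into the reference descriptors of the aggregate mappings. The method names are from the EclipseLink API as far as I recall, and the toUnderscore helper is just a hypothetical illustration:

import org.eclipse.persistence.config.SessionCustomizer;
import org.eclipse.persistence.descriptors.ClassDescriptor;
import org.eclipse.persistence.mappings.DatabaseMapping;
import org.eclipse.persistence.mappings.DirectToFieldMapping;
import org.eclipse.persistence.sessions.Session;

public class UnderscoreNamingCustomizer implements SessionCustomizer {

    @Override
    public void customize(Session session) throws Exception {
        for (ClassDescriptor descriptor : session.getDescriptors().values()) {
            renameFields(descriptor);
        }
    }

    private void renameFields(ClassDescriptor descriptor) {
        for (DatabaseMapping mapping : descriptor.getMappings()) {
            if (mapping.isDirectToFieldMapping()) {
                // Rename the column of a plain attribute
                DirectToFieldMapping direct = (DirectToFieldMapping) mapping;
                direct.getField().setName(toUnderscore(direct.getAttributeName()));
            } else if (mapping.isAggregateObjectMapping() || mapping.isAggregateCollectionMapping()) {
                // Recurse into the descriptor of the embedded (@Embeddable) class,
                // which also covers @ElementCollection attributes
                renameFields(mapping.getReferenceDescriptor());
            }
        }
    }

    // Hypothetical helper: camelCase -> under_score
    private String toUnderscore(String name) {
        return name.replaceAll("([a-z0-9])([A-Z])", "$1_$2").toLowerCase();
    }
}

The customizer would then be hooked in via the eclipselink.session.customizer property in persistence.xml.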
I have two domain classes, one is the parent and the other is the child, and I have a hasMany relationship between them. The Parent class has many children, and the Child class belongs to the Parent class.
Here is a code example:
class Parent {
    String name
    static hasMany = [childs: Child]
    static constraints = {
    }
}
class Child {
    String name
    static belongsTo = [parent: Parent]
    static constraints = {
    }
}
The problem is that as soon as I get the parent object, the child objects associated with the parent are also fetched. But when I convert the object to JSON I don't see the child objects completely; I can only see the IDs of the child objects. I want to see all columns of the child objects instead of only the id.
Converted JSON response:
[{"class":"project.Parent","id":1,
"name":"name1","childs":[{"class":"Child","id":1},{"class":"Review","id":2}]}]
But I want the response to contain the name of the child objects too, as follows:
[{"class":"project.Parent","id":1,"name":"name1",
"childs":[{"class":"Child","id":1,"name":"childname1"},
{"class":"Review","id":2,"name":"childname2"}
]
}]
Any help greatly appreciated.
Thanks in advance.
The issue is with the use of the default JSON converter. Here are your options:
1. Default - all fields, shallow associations
   a. render blah as JSON
2. Global deep converter - change all JSON converters to use deep association traversal
   a. grails.converters.json.default.deep = true
3. Named config marshaller using provided or custom converters
   a. JSON.createNamedConfig('deep') {
          it.registerObjectMarshaller( new DeepDomainClassMarshaller(...) )
      }
   b. JSON.use('deep') {
          render blah as JSON
      }
4. Custom class-specific closure marshaller
   a. JSON.registerObjectMarshaller(MyClass) { return map of properties }
   b. render myClassInstance as JSON
5. Custom controller-based closure to generate a map of properties
   a. convert(object) {
          return map of properties
      }
   b. render convert(blah) as JSON
You are currently using Option 1, which is the default.
The simplest thing you can do is use Option 2 to set the global deep converter, but be aware that this affects ALL domain classes in your app. That means that if you have a large tree of associations culminating in a top-level object and you try to convert a list of those top-level objects, the deep converter will execute all of the queries needed to fetch all of the associated objects and their associated objects in turn - you could load an entire database in one shot :) Be careful.
The latest Grails automatically deep-converts, but you are probably a victim of lazy loading.
The children are not loaded at the time of access, and hence the JSON converter cannot convert them to JSON.
The workaround is to put this in the Parent class:
static mapping = { childs lazy: false }
user dbrin is correct, but there's one more option. You could also use the Grails GSON Plugin:
https://github.com/robfletcher/grails-gson#readme
The plugin adds some more features when dealing with JSON data.
The suggested solution works, however I had some trouble referencing "grailsApplication". It turns out that you can inject it like any other service. I put the following code into the BootStrap.groovy file. Also, the DeepDomainClassMarshaller class handles bidirectional circular references quite well, but take care that the JSON payload does not get too big after all the deep dereferencing.
package aisnhwr

import grails.converters.JSON
import grails.core.GrailsApplication
import org.grails.web.converters.marshaller.json.DeepDomainClassMarshaller

class BootStrap {

    GrailsApplication grailsApplication

    def init = { servletContext ->
        JSON.createNamedConfig('deep') {
            it.registerObjectMarshaller( new DeepDomainClassMarshaller(false, grailsApplication) )
        }
    }

    def destroy = {
    }
}
I'm trying to change the (de)serialization of a list in one of my classes.
The objects in the list shall be serialized as an int (their JPA id) and deserialized accordingly. Serialization is simple.
For the deserialization I have a class that can translate the id into the object if the id and the class are known.
How do I get the necessary class from Jackson? All default Jackson deserializers have a constructor like this: protected StdDeserializer(Class<?> vc), so the information is present somewhere.
Is there a way to access it during deserialization?
Or before the deserializer is constructed by Jackson?
Or inside the HandlerInstantiator?
I only want to overwrite the default deserializer for certain references, so I can't just write a provider or a custom module.
I made it work from inside the deserializer with the help of the ContextualDeserializer interface, as this supplies the deserializer with the target property.
public JsonDeserializer<?> createContextual(DeserializationContext ctxt, BeanProperty property) throws JsonMappingException {
    Class<?> vc = null;
    if (property.getType().isCollectionLikeType()) {
        vc = property.getType().getContentType().getRawClass();
    } else {
        vc = property.getType().getRawClass();
    }
    return new ResourcePathDeserializer(vc, converter);
}
This solution is not perfect, as I only get the raw class of the declared type or of its generic content type (which might be a parent class or an interface), but that is enough for my requirements.
It would be better if I could access the "real" class that Jackson resolved, but for me this works.
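For anyone wanting to see how that hook fits into a complete deserializer, here is a sketch. ResourcePathDeserializer is the asker's class; the EntityResolver interface and the assumption that the ids arrive as JSON numbers are placeholders standing in for the actual id-to-object converter:

import java.io.IOException;

import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.BeanProperty;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonMappingException;
import com.fasterxml.jackson.databind.deser.ContextualDeserializer;

public class ResourcePathDeserializer extends JsonDeserializer<Object>
        implements ContextualDeserializer {

    // Stand-in for the asker's id-to-object translator
    public interface EntityResolver {
        Object findById(Class<?> type, long id);
    }

    private final Class<?> targetType;      // resolved per property in createContextual
    private final EntityResolver converter;

    public ResourcePathDeserializer(EntityResolver converter) {
        this(null, converter);
    }

    public ResourcePathDeserializer(Class<?> targetType, EntityResolver converter) {
        this.targetType = targetType;
        this.converter = converter;
    }

    @Override
    public JsonDeserializer<?> createContextual(DeserializationContext ctxt, BeanProperty property)
            throws JsonMappingException {
        // Use the element type for collection-like properties, the declared type otherwise
        Class<?> vc = property.getType().isCollectionLikeType()
                ? property.getType().getContentType().getRawClass()
                : property.getType().getRawClass();
        return new ResourcePathDeserializer(vc, converter);
    }

    @Override
    public Object deserialize(JsonParser p, DeserializationContext ctxt) throws IOException {
        // The JSON payload carries only the JPA id
        long id = p.getLongValue();
        return converter.findById(targetType, id);
    }
}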
First of all, there is nothing fancy about writing a Module: it is just a way of plugging things in, like custom (de)serializers. So there is no need to avoid that, and you will most likely need to write a module to do what you want.
In general it is not a good idea to try to create "universal" serializers or deserializers, and it will probably run into problems. But it depends on what exactly you are trying to do.
Type information will either be:
Implicit from context: you are writing a (de)serializer for type T, and register it for it, so that's your type
Passed by Jackson when the (de)serializer is being constructed, via the Module interface: modules are asked if they happen to have a (de)serializer for type T. SimpleModule will only use a basic Class-to-implementation mapping (that's where "simple" comes from), but a full custom Module has access to the incoming type.
But I don't know if the above will work for your use case. The type information must be available from the static type (the declared content type for the list).
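To illustrate that second case, here is a rough sketch of a full custom Module whose Deserializers callback sees the requested type and can decide per type whether to return a custom deserializer. It reuses the ResourcePathDeserializer and EntityResolver placeholders from the sketch above, and the set of id-mapped types is an assumption:

import java.util.Set;

import com.fasterxml.jackson.core.Version;
import com.fasterxml.jackson.databind.BeanDescription;
import com.fasterxml.jackson.databind.DeserializationConfig;
import com.fasterxml.jackson.databind.JavaType;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.databind.deser.Deserializers;

public class IdRefModule extends Module {

    // Assumption: the application knows up front which classes are (de)serialized by id
    private final Set<Class<?>> idMappedTypes;
    private final ResourcePathDeserializer.EntityResolver converter;

    public IdRefModule(Set<Class<?>> idMappedTypes, ResourcePathDeserializer.EntityResolver converter) {
        this.idMappedTypes = idMappedTypes;
        this.converter = converter;
    }

    @Override
    public String getModuleName() {
        return "id-ref-module";
    }

    @Override
    public Version version() {
        return Version.unknownVersion();
    }

    @Override
    public void setupModule(SetupContext context) {
        context.addDeserializers(new Deserializers.Base() {
            @Override
            public JsonDeserializer<?> findBeanDeserializer(JavaType type,
                    DeserializationConfig config, BeanDescription beanDesc) {
                // Unlike SimpleModule's fixed Class-to-instance map, this callback
                // receives the requested type and can decide dynamically.
                if (idMappedTypes.contains(type.getRawClass())) {
                    return new ResourcePathDeserializer(type.getRawClass(), converter);
                }
                return null; // let Jackson fall back to its default deserializer
            }
        });
    }
}

It would be registered with objectMapper.registerModule(new IdRefModule(...)). Note that this hooks in per type, not per property, which is exactly the limitation mentioned in the question.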
LINQ to SQL allows table mappings to automatically convert back and forth to Enums by specifying the type for the column - this works for strings or integers.
Is there a way to make the conversion case-insensitive, or to add a custom mapping class or extension method into the mix, so that I can specify what the string should look like in more detail?
The reason for doing so would be to supply a nicer naming convention inside some new funky C# code, in a system where the data schema is already set (and is being relied upon by some legacy apps), so the actual text in the database can't be changed.
You can always add a partial class with the same name as your LinqToSql class, and then define your own properties and functions. These will then be accessible as properties and methods on the object, the same way the auto-generated LinqToSql members are accessible.
Example: You have a LinqToSql class named Car which maps to the Car table in the DB. You can then add a file to App_Code with the following code in it:
public partial class Car {
    // Add properties and methods to extend the functionality of Car
}
I am not sure if this totally meets your requirement of changing the way that Enums are mapped to a column. However, you could add a property whose getter/setter performs the enum mapping you need while keeping things case-insensitive.