I have some C# classes generated by protoc in project A, and I'm consuming these classes in project B. Project B serializes/deserializes to JSON using Newtonsoft.Json (for CosmosDB). There's currently an issue in the Microsoft.Azure.DocumentDB.Core 1.9.1 release which prevents controlling the (de)serialization of models by any method except decorating the model properties with attributes (e.g. [JsonProperty]). In short, I need Newtonsoft.Json to use camel casing for property names, and the only way I know to do that is to decorate the generated protocol buffer models with [JsonProperty("myProp")].
The types are fairly large and I'd rather not do this manually if possible. My hope is there's a way to tell protoc to generate the classes with custom attributes on the properties?
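For contrast, here is a minimal sketch of the resolver-based configuration that the 1.9.1 issue rules out; with a serializer you control, camel casing needs no attributes at all (MyGeneratedMessage is a hypothetical stand-in for one of the protoc-generated types):

using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

// Attribute-free camel casing with plain Newtonsoft.Json. The DocumentDB
// 1.9.1 issue prevents plugging settings like these into the client, which
// is why property attributes are the fallback.
var settings = new JsonSerializerSettings
{
    ContractResolver = new CamelCasePropertyNamesContractResolver()
};

var message = new MyGeneratedMessage(); // hypothetical generated type
string json = JsonConvert.SerializeObject(message, settings);
var back = JsonConvert.DeserializeObject<MyGeneratedMessage>(json, settings);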
I'm trying to implement logic that serializes and deserializes a very complex legacy object. The object contains other complex objects that are spread across various other projects. There are even some JAXB 1 objects, generated from XSD schemas, that are part of the chain.
Right now there is already a working version that uses XStream, and it has worked for years. But since Java 17 there are issues because of the new restrictions on reflective access to private fields across modules. Exceptions like this started to appear:

module {A} does not "opens {package}" to {B}

One of the things that bothers me is that all of these packages are from third-party libs, and I can't even find any objects from them in the model chain.
So I started implementing a new serialization based on the Jackson databind API, but now I'm wondering whether it is going to solve the issues at all. Does Jackson also use reflection to serialize and deserialize? What is the best way to configure the ObjectMapper, and what should I change in the objects I need to work with, to keep reflection usage as low as possible?
Right now I've configured the ObjectMapper as:
// turn off all member auto-detection so only explicitly
// annotated properties are (de)serialized
objectMapper.disable(MapperFeature.AUTO_DETECT_CREATORS,
        MapperFeature.AUTO_DETECT_FIELDS,
        MapperFeature.AUTO_DETECT_GETTERS,
        MapperFeature.AUTO_DETECT_IS_GETTERS);
objectMapper.disable(SerializationFeature.FAIL_ON_EMPTY_BEANS);
and I'm annotating all fields that should be serialized with @JsonProperty, but I'm not sure this is going to help. Okay, I assume I must have getters and setters, but should I also annotate them? Does the annotation have anything to do with reflection usage at all?
Looking for some help serializing deeply nested Java objects to JSON. The constraint is that I cannot add any annotations or change the current Java code. I'm looking for a powerful JSON library that has configuration options to convert Java to JSON without altering the Java objects themselves. The following are some of the options that might be required:
Specify include/exclude fields/methods. Should be able to specify this at nested levels: A is composed of B, and B is composed of C; I should be able to specify include/exclude at C.
Include/exclude objects at nested levels.
Rename fields from Java property names when converting to JSON (for nested-level objects too).
Manage circular dependencies.
I was looking at Jackson and Gson for these requirements. While there are tons of options using annotations to specify serialization config when writing new Java POJOs, I am looking for options where I can specify serialization properties without changing the current Java code. Jackson and Gson do seem to have options for this, but they're not documented in depth.
Which library is easier to configure for the above requirements? Any other powerful library besides Jackson/Gson? Any pointers will be of great help.
Thanks much for your time.
As for Jackson, you might consider using so-called mix-in annotations (see, for example, http://www.studytrails.com/java/json/java-jackson-mix-in-annotation.jsp), which allow you to specify annotations to use without adding them directly to the legacy classes.
This would let you use annotation-based configuration, but leave actual classes untouched.
But given all of your requirements, it may be better to just use the tree model of Jackson or Gson (work with JsonNode or the like) and handle conversions manually, to your liking.
You may then be able to convert the tree value into a POJO; Jackson, for example, has methods for doing this (ObjectMapper.treeToValue(), .valueToTree(), .convertValue()) which allow conversion between structurally compatible representations.
I am mapping some pre-existing business objects to our database using Entity Framework. These objects were originally persisted with a home-grown data access method, but we wanted to try Entity Framework on them now that it offers Code First. It was my expectation that this would be fairly simple, but now I am having some doubts.
I am trying to use only attributes to accomplish this so that I don't have some of the mapping here, some of it there, and still more of it over there....
When I query for entities, I am getting System.Data.Entity.DynamicProxies.MyClass_23A498C7987EFFF2345908623DC45345 and similar objects back. These objects have the data from the associated record there as well as related objects (although those are DynamicProxies also).
What is happening here? Is something going wrong with my mapping? Why is it not bringing back MyBusinessObject.MyClass instead?
That has nothing to do with mapping. The types you see are called dynamic proxies. At runtime EF derives a class from every type you map and uses it instead of your type. These classes have some additional internal logic inside overridden property setters and getters. That logic is needed for lazy loading and dynamic change tracking of attached entities.
This behaviour can be turned off on the context instance:
context.Configuration.ProxyCreationEnabled = false;
Your navigation properties will no longer be loaded automatically once you do this, and you will have to use eager loading (the Include method in queries) or explicit loading.
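To make both halves of that concrete, here is a minimal sketch; MyClass comes from the question, while MyContext and the RelatedThings navigation property are illustrative names:

using System.Data.Entity;
using System.Linq;

public class MyContext : DbContext
{
    public MyContext()
    {
        // Return plain MyClass instances instead of the derived
        // System.Data.Entity.DynamicProxies.* types.
        this.Configuration.ProxyCreationEnabled = false;
    }

    public DbSet<MyClass> MyClasses { get; set; }
}

// With proxies disabled there is no lazy loading, so related data must
// be pulled in eagerly. RelatedThings is a hypothetical navigation property.
using (var context = new MyContext())
{
    var items = context.MyClasses
        .Include(c => c.RelatedThings)
        .ToList();
}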
We're currently using GWT RPC for serialization on a GWT project, but we're maintaining two sets of objects: the object that we need to retrieve from/save to the database, and a version of the object that is safe for GWT RPC serialization (no enums, BigDecimal, etc.).
We're spending a lot of effort writing code that merely converts from one format to the other format. In addition it's pretty painful to make any changes to the data model because it has to be changed in two places.
I was thinking that we could use a combination of Spring 3.0 MVC and Jackson to replace the RPC calls with JSON calls. If we built JavaScript objects for GWT to hold this JSON data, then it would remove the need for any property conversion code. However we'd still have to maintain two sets of objects - one JavaScriptObject for the client side code and the server side representation.
To eliminate this layer, I'd like to take a Java object and have it produce a GWT JavaScriptObject with the JSNI getters/setters exposed. Is there a library out there that could do this automatically?
We eventually dropped GWT and went with a Spring MVC/jQuery solution, but I did find the protostuff library which looked like it could do most of what I was looking for.
What if you need to create POCO objects from a dbml file? Do you use a generator, and if so, which one? Or do you write the POCOs manually?
Say you want to make your objects persistence ignorant and share them with clients, and so you create a DAO pattern for the communication between client, DAO, and L2S objects; this is a question about disconnected design using LINQ to SQL. Suppose that the POCOs used by the client should be as primitive as they can be, without dependencies (EntityRef<>, EntitySet<>, attributes, etc.), and of course you could map the L2S object onto the POCO with the appropriate data.
Any help and any corrections on the concept would be really helpful!
I would be tempted to say "wait until EF in .NET 4.0", which has much improved POCO support (compared to the current EF) and hopefully a POCO T4 template in VS2010.
At the moment SqlMetal will emit rich objects; while LINQ to SQL can work with POCO types, you would have to write the POCOs yourself, or use XSLT / T4 / whatever on the dbml.
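For illustration, the hand-written route is straightforward if verbose; a persistence-ignorant stand-in for a generated entity is just a plain class (Customer and its members are hypothetical):

// No EntityRef<>/EntitySet<> members, no mapping attributes, no base
// class: clients can consume this without referencing System.Data.Linq.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}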
SqlMetal can emit an XML mapping file from an input DBML file via the /map[:file] switch. This removes attributes from the generated class files, which is a step closer to POCO - you just have to remember to initialize your data context instances from the XML mapping file.
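Initializing a context from the mapping file might look like this (file and type names are illustrative):

using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

// Load the external mapping emitted by sqlmetal /map:MyModel.map and
// hand it to the DataContext instead of relying on attributes.
string connectionString = "..."; // your connection string
var mapping = XmlMappingSource.FromUrl("MyModel.map");
using (var context = new DataContext(connectionString, mapping))
{
    var customers = context.GetTable<Customer>().ToList();
}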
Removing EntitySet<T> and EntityRef<T> references is harder, and I'm not sure it's something I would recommend as you would lose a lot of useful functionality. However, it is possible - you need to manually manipulate the DBML file that you pass to SqlMetal by removing all <Association> elements. You could do this using LINQ to XML as a custom step in your build process, for example.
This would basically disable associations in the output mapping file and classes, as SqlMetal will only generate EntitySet / EntityRef code for <Association> mappings. You lose the ability to manage parent-child relationships automatically though.
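A minimal pre-processing step along those lines might look like this (file names are illustrative; the namespace is the standard dbml schema namespace, worth double-checking against your own file):

using System.Xml.Linq;

// Strip every <Association> element so SqlMetal generates no
// EntitySet<T>/EntityRef<T> members from this copy of the model.
XNamespace dbml = "http://schemas.microsoft.com/linqtosql/dbml/2007";
var doc = XDocument.Load("MyModel.dbml");
doc.Descendants(dbml + "Association").Remove();
doc.Save("MyModel.Poco.dbml");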
That would give you a pretty close POCO pattern - the only other thing you would get is the INotifyPropertyChanging implementation, but I think you could justify hanging onto that as it is fairly generic.
If that doesn't meet your needs, then you could look at doing your own code generation: check out the T4 templates for LINQ to SQL, which work in VS 2008 and are based on SqlMetal, but give you the option to totally customize the output to suit your needs, as they use T4 for template specification and output generation.
We also use LINQ to SQL and needed to write our own model classes from L2S results. After a lot of googling I found T4 POCO templates for LINQ to SQL and EF, which use .dbml or .edmx files as a source and generate POCO entities.
Link to download at the bottom of the article or duplicated here.
We used it as a base and then customized it for our needs.