Entity Framework 4.1 and T4 class generation. Is this design overkill?

I am trying to get some design validation on modeling a domain using EF4.1 and T4.
At design time I run a customized T4 POCO generator template that reads the EDMX and creates 3 partial classes:
1) A domain-level class (where any specific business methods will reside). This is only generated once; once generated, it's owned by the developer.
2) A POCO class with just the properties and virtual navigation properties to related objects, loaded lazily. This can be regenerated if/when any underlying columns in the database change.
3) A metadata class with an internal class whose properties are decorated with data annotations to provide additional column-level validation before inserting/updating data.
Is this overkill? I liked the separation, namely between the POCO and the domain object, so that I can add methods to the partial domain object at any time without having to worry about losing them when the T4 template has to be rerun after the underlying data specs change. What about the metadata class? Is it unnecessary if my application will be performing field validation?
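For concreteness, here is a rough sketch of what the three generated partials might look like for a hypothetical Order entity (the names, properties, and annotations are illustrative only, not taken from the actual template):

using System;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System.Linq;

// 1) Order.Domain.cs - generated once, then owned by the developer
public partial class Order
{
    public decimal CalculateTotal()
    {
        // Business logic lives here and survives template re-runs.
        return Lines.Sum(l => l.Quantity * l.UnitPrice);
    }
}

// 2) Order.Poco.cs - regenerated whenever the EDMX changes
public partial class Order
{
    public int OrderId { get; set; }
    public DateTime OrderDate { get; set; }
    public string CustomerName { get; set; }
    public virtual ICollection<OrderLine> Lines { get; set; }
}

// 3) Order.Metadata.cs - buddy class carrying column-level data annotations
[MetadataType(typeof(Order.OrderMetadata))]
public partial class Order
{
    internal class OrderMetadata
    {
        [Required]
        public DateTime OrderDate { get; set; }

        [StringLength(100)]
        public string CustomerName { get; set; }
    }
}

public class OrderLine
{
    public int Quantity { get; set; }
    public decimal UnitPrice { get; set; }
}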

Related

WPF+REST+EF: what is the best way to organize DTO's?

I have a WPF MVVM app with 3 layers:
UI
Services
DAL
and some item, for example an Order. I need 3 DTOs:
A class for the MVVM layer, with PropertyChanged notification;
A class for the JSON deserializer (objects fetched via the REST API);
A class for Entity Framework (to cache data in the DB).
Well, I could use ONE class for all three cases, but that would mix different attributes (from EF, JSON, MVVM) and create excess dependencies between layers.
Another way: make 3 classes, so each layer has its own class, and use AutoMapper to convert quickly between them. Not bad, but then I have 3 almost identical (90%) copies of each DTO class... not an elegant solution.
What is the best approach? What do you use?
Thanks.
What is the best approach? What do you use?
The second approach, i.e. defining your business objects in a separate assembly that you can reference from all your applications. These classes should not implement any client-specific interfaces such as INotifyPropertyChanged, but should be pure POCO classes that contain business logic only.
In your WPF application, you then create a view model class that implements the INotifyPropertyChanged interface and wraps any properties of the business object that it makes sense to expose to and bind to from the view.
The view model then has a reference to the model and the view binds to the view model. This is typically how the MVVM design pattern is (or should be) implemented in a WPF application. The view model class contains your application logic, for example how to notify the view when a data bound property value is changed, and the model contains the business logic that is the same across all platforms and applications.
Of course this means that you will end up with a larger number of classes in total but this is not necessarily a bad thing as each class has its own responsibility.
The responsibility of a view model is to act as a model for the application-specific XAML view, whereas the responsibility of the model class is to implement the business logic and the responsibility of the DTO class is simply to transfer the data between the different tiers. This is a far better solution - at least in my opinion, and probably in most enterprise architects' opinions as well - than defining a single class that implements all kinds of UI-specific logic just for the sake of reducing the number of classes.
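A minimal sketch of that split, assuming a simple Order business object and a hand-written view model (all names here are illustrative):

using System.ComponentModel;

// Plain business object defined in a shared assembly - no UI concerns
public class Order
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
}

// WPF-specific view model that wraps the business object and raises change notifications
public class OrderViewModel : INotifyPropertyChanged
{
    private readonly Order _order;

    public OrderViewModel(Order order)
    {
        _order = order;
    }

    public string CustomerName
    {
        get { return _order.CustomerName; }
        set
        {
            if (_order.CustomerName == value) return;
            _order.CustomerName = value;
            OnPropertyChanged("CustomerName");
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    private void OnPropertyChanged(string propertyName)
    {
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
    }
}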

Debugging Entity Framework DBContext API Mappings

I am mapping some pre-existing Business Objects to our database using Entity Framework. These objects were originally using a home-grown data access method, but we wanted to try Entity Framework on them now that it supports Code First. It was my expectation that this would be fairly simple, but now I am having some doubts.
I am trying to use only attributes to accomplish this so that I don't have some of the mapping here, some of it there, and still more of it over there....
When I query for entities, I am getting System.Data.Entity.DynamicProxies.MyClass_23A498C7987EFFF2345908623DC45345 and similar objects back. These objects have the data from the associated record there as well as related objects (although those are DynamicProxies also).
What is happening here? Is something going wrong with my mapping? Why is it not bringing back MyBusinessObject.MyClass instead?
That has nothing to do with mapping. Those types you see are called dynamic proxies. At runtime, EF derives a class from every type you map and uses it instead of your type. These classes have some additional internal logic inside overridden property setters and getters. That logic is needed for lazy loading and dynamic change tracking of attached entities.
This behaviour can be turned off in context instance:
context.Configuration.ProxyCreationEnabled = false;
Your navigation properties will not be loaded automatically once you do this, and you will have to use eager loading (the Include method in queries) or explicit loading.
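For example, assuming a hypothetical MyContext with a DbSet<Order> Orders and an Order.Lines navigation property, turning proxies off and eagerly loading the related data might look like this:

using System.Collections.Generic;
using System.Data.Entity;   // brings the lambda Include extension method into scope
using System.Linq;

public static class OrderQueries
{
    public static List<Order> LoadOrdersWithLines()
    {
        using (var context = new MyContext())   // MyContext: hypothetical DbContext
        {
            context.Configuration.ProxyCreationEnabled = false;

            // With proxies disabled there is no lazy loading, so related data must be loaded
            // eagerly here (or explicitly via context.Entry(order).Collection(o => o.Lines).Load()).
            return context.Orders
                          .Include(o => o.Lines)
                          .ToList();
        }
    }
}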

Entity Framework Code First, pointing to the entities

I get the exception 'The entity type [TYPE] is not part of the model for the current context.' when trying to run my application.
My best guess so far is that it doesn't recognize my type as a type that it has mapped. This could very well be since it is a type loaded at runtime. This type comes from a different assembly.
How does EF Code First find all its entities to map, and how can I make it find my types?
EF is not designed to support this scenario directly. EF is an ORM, and ORMs are mostly built for the case where you specify the types you want to use and map at design time and simply use them at runtime. That doesn't mean it is impossible to create types at runtime (with code mapping), but it is much more complex.
The context must know about all the types it should map and about their mapping. If you create the context with no reference to your new type, it simply doesn't know about it. How can you solve it? I can think of two options:
Emit the context code as well and make sure the emitted context contains a public property of type DbSet<YourEmittedEntityType> (to use the default mapping conventions), or emit an OnModelCreating method as well to specify custom mapping.
Emit a configuration class (derived from EntityTypeConfiguration<YourEmittedEntityType>) for your new entity as well. This class will specify the mapping of the new entity to your database table. Once you have your configuration you can manually create a DbModelBuilder, register all the necessary entity type configurations including the ones created at runtime, build a DbModel, compile it, and use the DbCompiledModel to construct a new instance of the DbContext (see the sketch after this answer). Just make sure you cache the DbCompiledModel for subsequent usages because its construction is very time consuming.
In both cases make sure the table used to persist and retrieve the new entity already exists, and turn off any database initializers - you must maintain your database manually.
Of course, this is only the first step. Now you need to emit/generate the code that will use your new entity and context - be aware that EF doesn't work with interfaces and inheritance is handled specially, so in most scenarios you need code working with your emitted type directly.
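A rough sketch of the second option follows. It is written against a statically known entity type for readability; in the real scenario the entity (and possibly its configuration) would themselves be emitted, and all names here are hypothetical:

using System.Data.Common;
using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Data.Entity.ModelConfiguration;

public class RuntimeEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class RuntimeEntityConfiguration : EntityTypeConfiguration<RuntimeEntity>
{
    public RuntimeEntityConfiguration()
    {
        ToTable("RuntimeEntities");   // the table must already exist - no initializer will create it
        HasKey(e => e.Id);
    }
}

public static class RuntimeModelFactory
{
    // Build once and cache the result - compiling the model is very expensive.
    public static DbCompiledModel BuildCompiledModel(DbConnection connection)
    {
        var builder = new DbModelBuilder();
        builder.Configurations.Add(new RuntimeEntityConfiguration());
        return builder.Build(connection).Compile();
    }

    public static DbContext CreateContext(DbConnection connection, DbCompiledModel compiledModel)
    {
        // Entities are then reachable through context.Set<RuntimeEntity>() or the non-generic Set(Type).
        return new DbContext(connection, compiledModel, contextOwnsConnection: false);
    }
}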

AutoMapper classes with a Transient lifestyle in IoC

I'm using AutoMapper to map domain entities to view models in an ASP.NET MVC app. I register these mapping classes in Castle Windsor so they are available to the controller through constructor dependency injection. These mapping classes have a virtual CreateMap method where I can override AutoMapper's mapping, telling it how to map fields from the entity to the view model, which fields to ignore, pointing to methods that transform the data, etc. All of this is working well; big kudos to the people behind AutoMapper!
So far I've been registering the mapping classes with a Singleton lifestyle in Windsor, but one of them needs to use the IAuthorizationRepository from Rhino.Security which needs to have its components registered as Transient. This forces me to register the mapping classes also as transient, because a singleton mapping class holding a reference to a transient IAuthorizationRepository causes problems the second time the mapper is used (i.e., ISession is already closed errors).
Is it a waste of resources to register all of these mapping classes with a Transient lifestyle, which will cause the mapping class to be instantiated and the CreateMap method to run each time the system wants to map a domain entity to a view model?
Or should I try to find a way to separate the IAuthorizationRepository from the mapping class so I can keep the mapping classes as Singletons?
Thanks
Dan
Another way around it is to use the TypedFactoryFacility; then, instead of injecting IAuthorizationRepository into your singletons, you can inject Func<IAuthorizationRepository>.
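For illustration, a sketch of how that might be wired up (the mapping class name is hypothetical, the Rhino.Security namespace is assumed, and its components are assumed to be registered elsewhere as transient, per its requirements):

using System;
using Castle.Facilities.TypedFactory;
using Castle.MicroKernel.Registration;
using Castle.Windsor;
using Rhino.Security.Interfaces;   // IAuthorizationRepository (namespace assumed)

// Hypothetical mapping class: stays a singleton, but resolves a fresh
// transient IAuthorizationRepository every time it actually needs one.
public class OrderMappingProfile
{
    private readonly Func<IAuthorizationRepository> _repositoryFactory;

    public OrderMappingProfile(Func<IAuthorizationRepository> repositoryFactory)
    {
        _repositoryFactory = repositoryFactory;
    }

    public virtual void CreateMap()
    {
        var repository = _repositoryFactory();   // new instance per call, so no stale ISession
        // ... configure the AutoMapper map using the repository ...
    }
}

public static class ContainerSetup
{
    public static IWindsorContainer Configure()
    {
        var container = new WindsorContainer();

        // The facility lets Windsor satisfy Func<T> dependencies as delegate factories.
        container.AddFacility<TypedFactoryFacility>();

        container.Register(
            Component.For<OrderMappingProfile>().LifeStyle.Singleton);

        // Rhino.Security components (including IAuthorizationRepository) are
        // assumed to be registered elsewhere with a transient lifestyle.
        return container;
    }
}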

Does Model Driven Architecture play nice with LINQ-to-SQL or Entity Framework?

My newly created system was created using the Model Driven Architecture approach, so all I have is the model (let's say comprehensive 'Order' and 'Product' classes). These are fully tested classes that support the business of my application. Now it's time to persist these classes as objects on the hard drive and at some later time retrieve them in the same state (thinking very abstractly here). Typically I'd create an IOrderRepository interface and eventually an ADO.NET-driven OrderRepository class with methods such as GetAll(), GetById(), Save(), etc... or at some point a BinaryFormatter-driven OrderRepository class that serves a similar purpose through this same common interface.
Is this approach just not conducive to LINQ-to-SQL or the Entity Framework? Something that attempts to build my model from a pre-existing DB structure just seems wrong. Could I take advantage of these technologies but retain this 'MDA' approach to software engineering?
... notice I did not mention that this was a Web App. It may or may not be -- and shouldn't matter.
In general, I think that you should not make types implementing business methods and types used for O/R mapping the same type. I think this violates the single responsibility principle. The point of your entity types is to bridge the gap between relational space and object space. The point of your business types is to have collections of testable behavior. Instead, I would suggest that you project from your entity types onto your business types when materializing objects from the database. Separating these two allows your business methods and data mappings to evolve independently, which is very important, especially if you cannot always control the schema of the database. I explain this idea more fully in this presentation.
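A minimal sketch of that projection idea, assuming hypothetical Order types and a DbContext exposing DbSet<OrderEntity>:

using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

// Entity type: exists only to bridge relational space and object space
public class OrderEntity
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

// Business type: fully tested behavior, no persistence concerns
public class Order
{
    public Order(int id, decimal total)
    {
        Id = id;
        Total = total;
    }

    public int Id { get; private set; }
    public decimal Total { get; private set; }

    public bool QualifiesForFreeShipping()
    {
        return Total >= 50m;
    }
}

public interface IOrderRepository
{
    IList<Order> GetAll();
}

public class MyDbContext : DbContext
{
    public DbSet<OrderEntity> Orders { get; set; }
}

// The repository materializes entity types and projects them onto business types,
// so the mapping and the business behavior can evolve independently.
public class OrderRepository : IOrderRepository
{
    private readonly MyDbContext _context;

    public OrderRepository(MyDbContext context)
    {
        _context = context;
    }

    public IList<Order> GetAll()
    {
        return _context.Orders
                       .AsEnumerable()                          // materialize first: EF cannot translate the constructor call
                       .Select(e => new Order(e.Id, e.Total))
                       .ToList();
    }
}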