PLINQO / LINQ-To-SQL - Generated Entity Self Save Method?

Hi, I'm trying to create a basic data model / layer.
The idea is to have:
Task task = TaskRepository.GetTask(2);
task.Description = "The task has changed";
task.Save();
Is this possible? I've tried the code below.
Note: The TaskRepository.GetTask() method detaches the Task entity.
I'd expect this to work; any ideas why it doesn't?
Thanks
public partial class Task
{
// Place custom code here.
public void Save()
{
using (TinyTaskDataContext db = new TinyTaskDataContext { Log = Console.Out })
{
db.Task.Attach(this);
db.SubmitChanges();
}
}
#region Metadata
// For more information about how to use the metadata class visit:
// http://www.plinqo.com/metadata.ashx
[CodeSmith.Data.Audit.Audit]
internal class Metadata
{
// WARNING: Only attributes inside of this class will be preserved.
public int TaskId { get; set; }
[Required]
public string Name { get; set; }
[Now(EntityState.New)]
[CodeSmith.Data.Audit.NotAudited]
public System.DateTime DateCreated { get; set; }
}
#endregion
}

Having done some reading, I've realised I was implementing the Repository pattern incorrectly. I should have been adding the Save method to the repository for convention's sake.
However, the actual problem I was having with regard to committing the disconnected entity was due to optimistic concurrency. The DataContext's job is to keep track of the state of its entities. When entities become disconnected, you lose that state.
I've found you either need to add a timestamp field to the database table or set the UpdateCheck property on each column in the dbml file.
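As an illustration only, here is a minimal sketch of what those two options look like in a LINQ to SQL column mapping (the RowVersion property name is hypothetical):
// Option 1: add a rowversion/timestamp column; LINQ to SQL then uses it as the only concurrency check.
[Column(IsVersion = true, IsDbGenerated = true, UpdateCheck = UpdateCheck.Never)]
public System.Data.Linq.Binary RowVersion { get; set; }
// Option 2: turn off the per-column concurrency check on the mapped members.
[Column(UpdateCheck = UpdateCheck.Never)]
public string Description { get; set; }
With either option in place, a detached entity can be attached as modified before submitting:
db.Task.Attach(this, true); // true = treat the attached instance as modified
db.SubmitChanges();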
Here is some info about UpdateCheck.
Some useful links about disconnected LINQ and PLINQO:
Great info on implementing the Repository pattern with LINQ
Short tutorial on updating and reattaching entities
Previously answered question
Rick Strahl on LINQ to SQL and attaching entities

There is no need for this line (Task task = new Task();). The above should work, although I've never seen it implemented in this manner. Have you thought about using the managers? Are you running into any runtime errors?
Thanks
-Blake Niemyjski

Related

How to return complex objects with lazy loading to Web API

I am creating a Web API to expose Entity Framework models.
Following a number of posts I have read, I have done a few bits in my webapi.config file:
//Ignore circular references due to the VIRTUAL property on some objects.
GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
//Remove the XML formatter. We don't need XML, just JSON.
config.Formatters.Remove(config.Formatters.XmlFormatter);
DefaultContractResolver resolver = (DefaultContractResolver)config.Formatters.JsonFormatter.SerializerSettings.ContractResolver;
resolver.IgnoreSerializableAttribute = true;
In my Web API controllers, I am disabling ProxyCreation on the DB context.
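(For reference, disabling proxy creation is typically a one-liner on the context; the db variable below stands in for whatever DbContext instance the controller uses:)
// Prevent EF from returning dynamic proxy types that the JSON serializer can't handle cleanly.
db.Configuration.ProxyCreationEnabled = false;
db.Configuration.LazyLoadingEnabled = false; // optionally stop lazy loading explicitly as well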
Generally this is doing what I need. However, I need to return a UserProfile object which has a virtual UserAdditionalInfos property, as below.
[Serializable]
public class UserProfile
{
[Key]
[DatabaseGenerated(DatabaseGeneratedOption.None)]
public int UserId { get; set; }
public virtual List<UserAdditionalInfos> AdditionalDetails { get; set; }
}
If I try to make an API call to get the UserProfile object, I get an error at the point where it tries to lazy load the UserAdditionalInfos list. I expect this, as I have switched off proxy creation. But if I switch it back on, I get a proxy-encoded string returned in the JSON rather than the object I would like.
Short of manually creating a 'flat' object for my API, is there any solid workaround available? I'm sure this is a common problem?
Cheers
OK, I managed to figure this out by adding an optional 'Includes' string to my interfaces, which I then split and apply to the query itself. Thanks for the insight, all!
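The exact code isn't shown, but a minimal sketch of that kind of approach (assuming EF 5/6; the context, set and method names here are hypothetical) might look like this:
// Requires: using System.Linq; using System.Data.Entity; (for the string-based Include extension)
public UserProfile GetUserProfile(int userId, string includes = null)
{
    using (var db = new MyContext())
    {
        db.Configuration.ProxyCreationEnabled = false;

        IQueryable<UserProfile> query = db.UserProfiles;

        // Split the comma-separated navigation property names and eager load each one.
        if (!string.IsNullOrWhiteSpace(includes))
        {
            foreach (var include in includes.Split(','))
            {
                query = query.Include(include.Trim());
            }
        }

        return query.FirstOrDefault(u => u.UserId == userId);
    }
}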

How do I populate a Data Access Layer Model Efficiently?

I'm working on developing my first Data Driven Domain using Dependency Injection in ASP.net.
In my Data Access Layer I have created some domain data models, for example:
public class Company {
public Guid CompanyId { get; set; }
public string Name { get; set; }
}
public class Employee {
public Guid EmployeeId { get; set; }
public Guid CompanyId { get; set; }
public string Name { get; set; }
}
I have then developed an interface such as:
public interface ICompanyService {
IEnumerable<Model.Company> GetCompanies();
IEnumerable<Model.Employee> GetEmployees();
IEnumerable<Model.Employee> GetEmployees(Guid companyId);
}
In a separate module I have implemented this interface using Linq to Sql:
public class CompanyService : ICompanyService {
public IEnumerable<Model.Employee> GetEmployees()
{
return EmployeeDb
.OrderBy(e => e.Name)
.Select(e => e.ToDomainEntity())
.AsEnumerable();
}
}
Where ToDomainEntity() is implemented in the employee repository class as an extension method to the base entity class:
public Model.Employee ToDomainEntity()
{
return new Model.Employee {
EmployeeId = this.EmployeeId,
CompanyId = this.CompanyId,
Name = this.Name
};
}
To this point, I have more or less followed the patterns as described in Mark Seeman's excellent book 'Dependency Injection in .NET' - and all works nicely.
I would like however to extend my basic models to also include key reference models, so the domain Employee class would become:
public class Employee {
public Guid EmployeeId { get; set; }
public Guid CompanyId { get; set; }
public Company Company { get; set; }
public string Name { get; set; }
}
and the ToDomainEntity() function would be extended to:
public Model.Employee ToDomainEntity()
{
return new Model.Employee {
EmployeeId = this.EmployeeId,
CompanyId = this.CompanyId,
Company = (this.Company == null) ? null : this.Company.ToDomainEntity(),
Name = this.Name
};
}
I suspect that this might be 'bad practice' from a domain modelling point of view, but the problem I have encountered would also, I think, hold true if I were to develop a specific View Model to achieve the same purpose.
In essence, the problem I have run into is the speed/efficiency of populating the data models. If I use the ToDomainEntity() approach described above, Linq to Sql creates a separate SQL call to retrieve the data for each Employee's Company record. This, as you would expect, increases the time taken to evaluate the SQL expression quite considerably (from around 100ms to 7 seconds on our test database), particularly if the data tree is complex (as separate SQL calls are made to populate each node/sub-node of the tree).
If I create the data model 'inline'...
public IEnumerable<Model.Employee> GetEmployees()
{
return EmployeeDb
.OrderBy(e => e.Name)
.Select(e => new Model.Employee {
EmployeeId = e.EmployeeId,
/* Other field mappings */
Company = new Model.Company {
CompanyId = e.Company.CompanyId,
/* Other field mappings */
}
}).AsEnumerable();
}
Linq to SQL produces a nice, tight SQL statement that natively uses the 'inner join' method to associate the Company with the Employee.
I have two questions:
1) Is it considered 'bad practice' to reference associated data classes from within a domain class object?
2) If this is the case, and a specific View Model is created for the purpose, what is the right way of populating the model without having to resort to creating inline assignment blocks to build the expression tree?
Any help/advice would be much appreciated.
The problem is caused by having both data layer entities and domain layer entities and needing a mapping between the two. Although you can get this to work, this makes everything very complex, as you are already experiencing. You are making mappings between data and domain, and will soon add many more mappings for these same entities, because of performance reasons and because other business logic and presentation logic will need different data.
The only real solution is to ditch your data entities and create POCO model objects that can directly be serialized to your backend store (SQL server).
POCO entities are something LINQ to SQL has supported from day one, but I think it would be better to migrate to Entity Framework Code First.
When doing this, you can expose IQueryable<T> interfaces from your repositories (you currently call your repository ICompanyService, but a better name would be ICompanyRepository). This allows you to write efficient LINQ queries. When querying directly over a query provider you can prevent loading complete entities. For instance:
from employee in this.repository.GetEmployees()
where employee.Company.Name.StartsWith(searchString)
select new
{
employee.Name,
employee.Company.Location
};
When working with IQueryable<T>, LINQ to SQL and Entity Framework will translate this into a very efficient SQL query that only returns the employee name and company location from the database, with the filtering done inside the database (as opposed to filtering in your .NET application when GetEmployees() returns an IEnumerable<T>).
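A minimal sketch of such a repository interface, reusing the question's model types (the interface name is the one suggested above):
public interface ICompanyRepository
{
    IQueryable<Model.Company> GetCompanies();
    IQueryable<Model.Employee> GetEmployees();
}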
You can ask Linq2Sql to preload certain entities (as opposed to lazy loading them) using the DataLoadOptions.LoadWith method; see: http://msdn.microsoft.com/en-us/library/bb534268.aspx.
If you do this with the Company entity, then I think Linq2Sql won't have to go back to the database to fetch it again.
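A minimal sketch of that approach, assuming a LINQ to SQL context (the context and table names here are hypothetical):
using (var db = new CompanyDataContext())
{
    var options = new System.Data.Linq.DataLoadOptions();
    options.LoadWith<Employee>(e => e.Company); // eagerly load Company together with each Employee
    db.LoadOptions = options;                   // must be assigned before the first query executes

    var employees = db.Employees.OrderBy(e => e.Name).ToList();
}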

Entity Framework/Linq to sql model to business model

I'm coming from a stored-procedure, hand-written data access layer approach. I am trying to understand where I should fit LINQ to SQL or Entity Framework into my normal planning. I normally separate the business layer from the DAL layer and use a repository in between.
It seems that people will either use the generated classes from LINQ to SQL, extend them by using the partial class, or do a full separation and map the generated LINQ classes to separate business entities. I am partial to the separate business entities. However, this seems to be counterintuitive.
One of my last projects used DDD and the Entity Framework. When an object needed updating, it moved the business entity to the repository layer, which, when going to the DAL layer, would create a context and then requery the object. It would then update the values and resubmit.
I didn't see much point, as the data context wasn't kept around and an extra query was needed to grab the object before updating. Normally I would just do the update (if concurrency wasn't an issue).
So my questions come down to:
Does it make sense to separate LINQ to SQL generated classes into business entities?
Should the data context be kept around, or is that impractical?
Thanks for your time; I'm trying to make sure I understand. I normally like to separate things out, as it makes the code cleaner to understand, even in smaller projects.
I currently hand-roll my own Dto classes and DataContext instead of using the auto-generated code files from LINQ to SQL. To give some background on my solution architecture/modeling, I have a "Contract" project and a "Dal" project. (Also a "Model" project, but I'll try to stay focused here on the Dal only.) Hand-rolling my own Dtos and DataContext makes everything a lot smaller and simpler; I'll give a few examples of how I do that here.
I never return a Dto object out of the Dal; in fact I make sure to declare them as internal. The way I return them is by casting them to an interface (the interfaces are located in my "Contract" layer). We'll make a simple "PersonRepository" that implements the IPersonRetriever and IPersonSaver interfaces.
Contracts:
public interface IPersonRetriever
{
IPerson GetPersonById(Guid personId);
}
public interface IPersonSaver
{
void SavePerson(IPerson person);
}
Dal:
public class PersonRepository : IPersonSaver, IPersonRetriever
{
private string _connectionString;
public PersonRepository(string connectionString)
{
_connectionString = connectionString;
}
IPerson IPersonRetriever.GetPersonById(Guid id)
{
using (var dc = new PersonDataContext(_connectionString))
{
return dc.PersonDtos.FirstOrDefault(p => p.Id == id);
}
}
void IPersonSaver.SavePerson(IPerson person)
{
using (var dc = new PersonDataContext(_connectionString))
{
var personDto = new PersonDto
{
Id = person.Id,
FirstName = person.FirstName,
Age = person.Age
};
dc.PersonDtos.InsertOnSubmit(personDto);
dc.SubmitChanges();
}
}
}
PersonDataContext:
internal class PersonDataContext : System.Data.Linq.DataContext
{
static MappingSource _mappingSource = new AttributeMappingSource(); // necessary for pre-compiled linq queries in .Net 4.0+
internal PersonDataContext(string connectionString) : base(connectionString, _mappingSource) { }
internal Table<PersonDto> PersonDtos { get { return GetTable<PersonDto>(); } }
}
[Table(Name = "dbo.Persons")]
internal class PersonDto : IPerson
{
[Column(Name = "PersonIdentityId", IsPrimaryKey = true, IsDbGenerated = false)]
internal Guid Id { get; set; }
[Column]
internal string FirstName { get; set; }
[Column]
internal int Age { get; set; }
#region IPerson implementation
Guid IPerson.Id { get { return this.Id; } }
string IPerson.FirstName { get { return this.FirstName; } }
int IPerson.Age { get { return this.Age; } }
#endregion
}
You will need to add the "Column" attribute to all of your Dto properties, but notice that if there is a one-to-one correlation between the name of the actual table column and what you want the field to be exposed as on the interface, you won't need to add any of the named parameters. In this example the person's ID is stored in the database as "PersonIdentityId", yet I only want my interface to expose the field as "Id".
That's how I do my Dal layer; I believe this layer should be dumb, real dumb. Dumb in the sense that it is only there for CRUD (Create, Retrieve, Update and Delete) operations. All of the business logic would go into my "Model" project, which would consume and utilize the IPersonSaver and IPersonRetriever interfaces.
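As a rough sketch of what that consumption might look like in the "Model" project (the PersonService class below is hypothetical; it only shows the shape of the dependency):
public class PersonService
{
    private readonly IPersonRetriever _retriever;
    private readonly IPersonSaver _saver;

    public PersonService(IPersonRetriever retriever, IPersonSaver saver)
    {
        _retriever = retriever;
        _saver = saver;
    }

    public IPerson GetPerson(Guid personId)
    {
        // Business rules and validation would live here, not in the Dal.
        return _retriever.GetPersonById(personId);
    }
}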
Hope this helps!

How to prevent the linq-to-sql designer from undoing my changes

Thanks for your attention in advance,
I’ve met an issue with LINQ-2-SQL designer in VS 2008 SP1 which has made me CRAZY. I use Linq2sql as my DAL. It seems Linq2sql speeds up coding in the first step but lots of issues arise in feature specifically with table or object inheritance.
In this case I have a class Entity that all other entity classes generated by Linq2sql designer inherit from.
public abstract class Entity
{
public virtual Guid ID { get; protected set; }
}
public partial class User : monius.Data.Entity
{
}
And the following is generated by the L2S designer (DataModel.designer.cs):
[Column(Storage = "_ID", AutoSync = AutoSync.OnInsert, DbType = "UniqueIdentifier NOT NULL",
IsPrimaryKey = true, IsDbGenerated = true, UpdateCheck = UpdateCheck.Never)]
[DataMember(Order = 1)]
public System.Guid ID
{
get
{
return this._ID;
}
set
{
if ((this._ID != value))
{
this.OnIDChanging(value);
this.SendPropertyChanging();
this._ID = value;
this.SendPropertyChanged("ID");
this.OnIDChanged();
}
}
}
When I compile the code, VS warns me:
Warning 1 'User.ID' hides inherited member 'Entity.ID'. To make the current member override that implementation, add the override keyword. Otherwise add the new keyword.
That warning is obvious, so I have to change the code generated by the L2S designer (DataModel.designer.cs) to:
[…]
public override System.Guid ID
{
…
protected set
…
}
And the code compiled with no error or warning and everyone was happy. But that is not the end of the story.
As soon as I make changes to entities in the diagram (dbml), or even open the dbml file to view it, any manual change I made to the designer file vanishes. POOF! Redo it AGAIN. That is a painful job.
Now I wonder if there is a way to stop the L2S designer from overwriting my changes to the auto-generated code.
I'd appreciate it if someone could kindly help me with this issue.
I suppose the questioner has overcome the issue by now, but I'll add an answer for the benefit of others who, like myself, googled their way to this post.
If you need an override modifier on your LINQ to SQL generated property:
1) open the dbml
2) right-click the property to which you wish to add the override (or virtual, new, or new virtual) keyword and select Properties
3) change the Inheritance Modifier property to what you desire.
If all your base class does is:
public abstract class Entity
{
public virtual Guid ID { get; protected set; }
}
...then why not make it an interface instead?
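A minimal sketch of that alternative, which keeps the designer-generated ID property untouched (IEntity is a hypothetical name):
public interface IEntity
{
    Guid ID { get; }
}
// The partial class declares the interface; the designer-generated public ID property
// satisfies it implicitly, so DataModel.designer.cs never has to be edited by hand.
public partial class User : IEntity
{
}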
Alternatively, you may want to look into using Damien Guard's T4 templates to customize the output of the code generator: http://l2st4.codeplex.com/

ErrorProvider (from Windows Forms) for ASP.NET & linq-to-sql?

I am trying to figure out how to notify the user which field failed to validate.
I set up a LINQ to SQL DataContext class to access a database from ASP.NET pages. Since user input will come both from web forms and from Excel file imports, I would like to write the validation logic in one place. The idea is that when I import from Excel, I will collect the error messages for each row and display a summary somehow. A logical place seems to be extending the classes generated by LINQ to SQL. According to most documentation and examples, I should do something like this:
public partial class Customer
{
partial void OnTitleChanging(string value)
{
if (!Char.IsUpper(value[0]))
{
throw new ValidationException("Title must start with an uppercase letter.");
}
}
}
The problem with this approach is that validation will stop on the first failed field.
In Windows Forms (Link1), if I define an ErrorProvider component on the form and set its DataSource property to the BindingSource, the exception is indicated by a red circle next to the validated control. The tooltip of this red circle shows the exception message.
Is there something similar for ASP.NET pages? I am using the ListView control with inline editing.
Updates:
- I actually did something similar to what Nick Carver is suggesting (Link2). Instead of throwing an exception, I record an error message.
public partial class PQSSClassesDataContext
{
public partial class ErrorFeilds
{
private static List<string> Messages = new List<string>();
public void AddErrorMessage(string message)
{
Messages.Add(message);
}
public List<string> GetErrorMessages()
{
return Messages;
}
}
}
I am actually stuck on how to map the error message to the field; that's why I was looking for something like ErrorProvider. I am already using events instead of exceptions to record errors. Any idea how to mark the corresponding failed field from the code-behind file?
Any help appreciated.
What we have done in the past is simply keep an error collection on the DataContext: extend it by adding something like a List<ValidationError>. Then all you need to do is override SubmitChanges(), check whether you have any validation errors, and decide to abort, throw them, or handle them however you wish at that point... all before calling base.SubmitChanges().
We're in an ASP.NET per-request life cycle, but if your context is around longer, make sure to clear the error list.
It's handy for your ValidationError class/objects to contain a reference to a common base or interface that all your classes implement, so you can point back to the object from the error later if needed (e.g. to get the ID for putting the error labels or other info in the right place).
Example classes:
public class ValidationError {
    public string Message { get; set; }
    public IBase Source { get; set; } // reference back to the object that failed validation
}
public interface IBase {
    long ID { get; set; }
    DateTime DateModified { get; set; }
}
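A rough sketch of the SubmitChanges() override described above (reusing the asker's PQSSClassesDataContext; the decision to throw is just one option):
public partial class PQSSClassesDataContext
{
    public List<ValidationError> ValidationErrors = new List<ValidationError>();

    public override void SubmitChanges(System.Data.Linq.ConflictMode failureMode)
    {
        // Abort (here, by throwing) if any validation errors were recorded before submitting.
        if (ValidationErrors.Count > 0)
        {
            throw new InvalidOperationException(ValidationErrors.Count + " validation error(s) were recorded.");
        }
        base.SubmitChanges(failureMode);
    }
}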
There is the ValidationSummary control, which works with the individual validation controls to show a list of errors. But the job the WinForms ErrorProvider does is performed in ASP.NET by the individual validation controls, which derive from the Label control.
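For marking a specific failed field from code-behind, one possibility is a CustomValidator placed next to each field, with its IsValid flag set from the recorded messages; the control IDs and the ListView wiring below are hypothetical:
// Assumes an <asp:CustomValidator ID="TitleValidator" runat="server" Display="Dynamic" />
// next to the Title field in the ListView's EditItemTemplate, plus a ValidationSummary on the page.
protected void ValidateCustomer(Customer customer)
{
    if (string.IsNullOrEmpty(customer.Title) || !Char.IsUpper(customer.Title[0]))
    {
        var validator = (CustomValidator)ListView1.EditItem.FindControl("TitleValidator");
        validator.IsValid = false;
        validator.ErrorMessage = "Title must start with an uppercase letter.";
    }
}
A ValidationSummary control then lists those messages, which is roughly the ASP.NET counterpart of the WinForms ErrorProvider behaviour described above.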