Hard to update an Entity created by another LINQ to SQL context - linq-to-sql

This has been bugging me all day.
I have an entity with several references that I get from a context which I then dispose. I make some changes and call SubmitChanges(). Calling SubmitChanges() without .Attach() seems to simply do nothing, and when I use .Attach() I get the exception:
An attempt has been made to Attach or Add an entity that is not new, perhaps having been loaded from another DataContext. This is not supported.
Any ideas?

L2S is very picky about updating an entity that came from a different DB context. In fact, you cannot do it unless you 'detach' it first from the context it came from. There are a couple different ways of detaching an entity. One of them is shown below. This code would be in your entity class.
public virtual void Detach()
{
    // Clearing these event handlers severs the entity's link to the
    // DataContext that loaded it, so a new context will accept it
    PropertyChanging = null;
    PropertyChanged = null;
}
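A minimal usage sketch of the detach-then-attach flow; MyDataContext, Customer, and the id variable are illustrative names, not from the original question:
Customer customer;
using (var readContext = new MyDataContext())
{
    customer = readContext.Customers.First(c => c.Id == id);
}

customer.Detach();          // drop the event hooks from the old context
customer.Name = "Updated";

using (var writeContext = new MyDataContext())
{
    // Attach as modified; this overload assumes a version/timestamp column
    // (or UpdateCheck.Never on the members) for the concurrency check
    writeContext.Customers.Attach(customer, true);
    writeContext.SubmitChanges();
}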
In addition to this, you can also serialize your entity using WCF-based serialization. Something like this:
object ICloneable.Clone()
{
    // Round-tripping through the DataContractSerializer yields a copy
    // with no event subscriptions back to the original DataContext
    var serializer = new DataContractSerializer(GetType());
    using (var ms = new System.IO.MemoryStream())
    {
        serializer.WriteObject(ms, this);
        ms.Position = 0;
        return serializer.ReadObject(ms);
    }
}
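The clone has no ties to the original context, so a fresh context will accept it. A short sketch of how that might look, again with illustrative names and assuming a timestamp column:
var copy = (Customer)((ICloneable)customer).Clone();
copy.Name = "Updated";

using (var context = new MyDataContext())
{
    context.Customers.Attach(copy, true); // attach as modified
    context.SubmitChanges();
}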

Related

Exception when calling the SaveChanges method while adding, removing or modifying entities

I am working with Entity Framework 4.1, adding a control to the database using AddObject() and saving it using the SaveChanges() method.
But once I delete that added control and try to add the same one again, I get this error over and over (Store update, insert, or delete statement affected an unexpected number of rows (0). Entities may have been modified or deleted since entities were loaded. Refresh ObjectStateManager entries.) and am unable to add it. Only after I close my application and try again am I able to add that control.
I have searched a lot for what is going wrong but could not find a solution, as I am new to Entity Framework.
In this scenario I was calling the SaveChanges() method of the Entity Framework object context after every operation (add, delete, modify) and getting the exception back to back. It got solved by wrapping the save in a transaction and recovering from the concurrency exception, by calling a method like this:
public void Save(object entity)
{
    // This method lives on the ObjectContext subclass; note that the
    // underlying connection must be open before a transaction can start
    using (var transaction = Connection.BeginTransaction())
    {
        try
        {
            SaveChanges();
            transaction.Commit();
        }
        catch (OptimisticConcurrencyException)
        {
            // For modified/deleted entities, take the store values;
            // for added entities, detach the stale entry instead
            if (ObjectStateManager.GetObjectStateEntry(entity).State == EntityState.Deleted ||
                ObjectStateManager.GetObjectStateEntry(entity).State == EntityState.Modified)
                this.Refresh(RefreshMode.StoreWins, entity);
            else if (ObjectStateManager.GetObjectStateEntry(entity).State == EntityState.Added)
                Detach(entity);
            AcceptAllChanges();
            transaction.Commit();
        }
    }
}
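A sketch of how this might be invoked; MyEntities, Controls, and Control are illustrative names, not from the original post:
using (var context = new MyEntities()) // hypothetical ObjectContext subclass
{
    context.Connection.Open(); // BeginTransaction needs an open connection

    var control = new Control { Name = "Button1" }; // hypothetical entity
    context.Controls.AddObject(control);
    context.Save(control);
}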

Enable ChangeTracking In Child Objects Using STE

I'm using STE and I want to enable change tracking for an object and its children. What I currently have to do is something like this.
int id = 1;
using (CustomerEntities context = new CustomerEntities())
{
    CustomerSection custSection = context.CustomerSections
        .Include("Customers")
        .SingleOrDefault(p => p.ID == id);
    custSection.StartTracking();

    foreach (Customer cust in custSection.Customers)
    {
        cust.StartTracking();
    }

    return custSection;
}
What I am looking for is a way to automatically enable change tracking for the child objects too, without having to loop through each one and explicitly tell it to start tracking changes.
Thanks in advance for any insight.
Most probably you are using Self-Tracking Entities in combination with WCF. In that case there is no need to enable change tracking manually; it is already done for you. The T4 template that generates the STEs includes a method decorated with the [OnDeserialized] attribute, which starts tracking once entities are deserialized (which normally happens after they reach the client and are converted back into runtime class instances from the XML that WCF generated for transport). See the exact code example:
[OnDeserialized]
public void OnDeserializedMethod(StreamingContext context)
{
    // Runs automatically when WCF deserializes the entity on the client
    IsDeserializing = false;
    ChangeTracker.ChangeTrackingEnabled = true;
}
Search your generated entities or the T4 template and you will soon find this code.
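Because of this, client code typically needs none of the explicit StartTracking() calls; a sketch with illustrative service and entity names:
// Tracking is switched on during deserialization, so the client just
// mutates the graph and sends it back
CustomerSection section = client.GetCustomerSection(1);
section.Customers.First().Name = "Renamed";   // recorded by the ChangeTracker
client.UpdateCustomerSection(section);        // server applies tracked changes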

Is there a workaround for lack of bi-directional serialization

In my project we serialize disconnected Linq-to-SQL entities (mainly to preserve them between postbacks). The code in use for that is fairly straightforward:
public static string Serialize<P>(this P entity)
{
    using (StringWriter writer = new StringWriter())
    using (XmlTextWriter xmlWriter = new XmlTextWriter(writer))
    {
        DataContractSerializer serializer = new DataContractSerializer(typeof(P));
        serializer.WriteObject(xmlWriter, entity);
        xmlWriter.Flush();
        return writer.ToString();
    }
}
It works fine, but after deserialization all of the object's EntityRef children are gone, replaced with just a foreign key value. It looks like this problem is due to the lack of bi-directional serialization.
Is there an existing workaround for this problem?
Try changing the Serialization Mode property to "Unidirectional" on the LINQ to SQL .dbml file. I ran into this issue when using L2S in a web service.
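For the return trip you will also need a deserialization helper; a minimal sketch mirroring the Serialize<P> method above (assuming System.IO, System.Xml and System.Runtime.Serialization are available):
public static P Deserialize<P>(this string xml)
{
    using (var reader = XmlReader.Create(new StringReader(xml)))
    {
        var serializer = new DataContractSerializer(typeof(P));
        return (P)serializer.ReadObject(reader);
    }
}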

Linq to SQL and concurrency with Rob Conery repository pattern

I have implemented a DAL using Rob Conery's spin on the repository pattern (from the MVC Storefront project), where I map database objects to domain objects using Linq and use Linq to SQL to actually get the data.
This is all working wonderfully, giving me the full control over the shape of my domain objects that I want, but I have hit a problem with concurrency that I thought I'd ask about here. I have concurrency working, but the solution feels like it might be wrong (just one of those glitchy feelings).
The basic pattern is:
private MyDataContext _dataContext;
private Table<Task> _tasks;

public Repository(MyDataContext dataContext)
{
    _dataContext = dataContext;
}

public IEnumerable<Domain.Task> GetTasks()
{
    // Hold on to the context's table so its change tracking survives
    // the mapping to domain objects
    _tasks = _dataContext.Tasks;
    return from t in _tasks
           select new Domain.Task
           {
               Name = t.Name,
               Id = t.TaskId,
               Description = t.Description
           };
}

public void SaveTask(Domain.Task task)
{
    Task dbTask = null;
    // Logic for new tasks omitted...
    dbTask = (from t in _tasks
              where t.TaskId == task.Id
              select t).SingleOrDefault();
    dbTask.Description = task.Description;
    dbTask.Name = task.Name;
    _dataContext.SubmitChanges();
}
So with that implementation I've lost concurrency tracking because of the mapping to the domain task. I get it back by storing the private Table, which is my datacontext's list of tasks at the time of getting the original task.
I then update the tasks from this stored Table and save what I've updated.
This is working - I get change conflict exceptions raised when there are concurrency violations, just as I want.
However, it just screams to me that I've missed a trick.
Is there a better way of doing this?
I've looked at the .Attach method on the datacontext but that appears to require storing the original version in a similar way to what I'm already doing.
I also know that I could avoid all this by doing away with the domain objects and letting the Linq to SQL generated objects flow all the way up my stack - but I dislike that just as much as I dislike the way I'm handling concurrency.
I worked through this and found the following solution. It works in all the test cases I (and more importantly, my testers!) can think of.
I am using the .Attach() method on the datacontext, and a TimeStamp column. This works fine the first time you save a particular primary key back to the database, but I found that the datacontext then throws a System.Data.Linq.DuplicateKeyException: "Cannot add an entity with a key that is already in use."
The workaround I created was to add a dictionary that stores the item I attach the first time around; every subsequent time I save, I reuse that item.
Example code is below. I do wonder if I've missed any tricks - concurrency is pretty fundamental, so the hoops I'm jumping through seem a little excessive.
Hopefully the below proves useful, or someone can point me towards a better implementation!
private Dictionary<int, Payment> _attachedPayments = new Dictionary<int, Payment>();

public void SavePayments(IList<Domain.Payment> payments)
{
    Dictionary<Payment, Domain.Payment> savedPayments =
        new Dictionary<Payment, Domain.Payment>();

    // Items with a zero id are new
    foreach (Domain.Payment p in payments.Where(p => p.PaymentId != 0))
    {
        // The list of attached payments that works around the linq datacontext
        // duplicatekey exception
        if (_attachedPayments.ContainsKey(p.PaymentId)) // Already attached
        {
            Payment dbPayment = _attachedPayments[p.PaymentId];
            // Just a method that maps domain to datacontext types
            MapDomainPaymentToDBPayment(p, dbPayment, false);
            savedPayments.Add(dbPayment, p);
        }
        else // Attach this payment to the datacontext
        {
            Payment dbPayment = new Payment();
            MapDomainPaymentToDBPayment(p, dbPayment, true);
            _dataContext.Payments.Attach(dbPayment, true);
            savedPayments.Add(dbPayment, p);
        }
    }

    // There is some code snipped but this is just brand new payments
    foreach (var payment in newPayments)
    {
        Domain.Payment payment1 = payment;
        Payment newPayment = new Payment();
        MapDomainPaymentToDBPayment(payment1, newPayment, false);
        _dataContext.Payments.InsertOnSubmit(newPayment);
        savedPayments.Add(newPayment, payment);
    }

    try
    {
        _dataContext.SubmitChanges();

        // Grab the Timestamp into the domain object
        foreach (Payment p in savedPayments.Keys)
        {
            savedPayments[p].PaymentId = p.PaymentId;
            savedPayments[p].Timestamp = p.Timestamp;
            _attachedPayments[savedPayments[p].PaymentId] = p;
        }
    }
    catch (ChangeConflictException)
    {
        foreach (ObjectChangeConflict occ in _dataContext.ChangeConflicts)
        {
            Payment entityInConflict = (Payment)occ.Object;
            // Use the datacontext refresh so that I can display the new values
            _dataContext.Refresh(RefreshMode.OverwriteCurrentValues, entityInConflict);
            _attachedPayments[entityInConflict.PaymentId] = entityInConflict;
        }
        throw;
    }
}
I would look at trying to utilise the .Attach method by passing the 'original' and 'updated' objects, thus achieving true optimistic concurrency checking from LINQ2SQL. This IMO would be preferable to using version or datetime stamps either in the DBML objects or your domain objects. I'm not sure how MVC allows for this idea of persisting the 'original' data, however. I've been trying to investigate the validation scaffolding in the hope that it's storing the 'original' data, but I suspect it is only as good as the most recent post (and/or failed validation), so that idea may not work.
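A minimal sketch of that two-object Attach overload, assuming the original Payment values were preserved somewhere across the round trip (all names illustrative):
using (var context = new MyDataContext())
{
    // Attach(updated, original): LINQ to SQL builds the UPDATE's WHERE
    // clause from the original values, giving optimistic concurrency
    // without a timestamp column
    context.Payments.Attach(updatedPayment, originalPayment);
    context.SubmitChanges();
}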
Another crazy idea I had was this: override GetHashCode() for all of your domain objects so that the hash represents the unique set of data for that object (minus the ID, of course). Then, either manually or with a helper, bury that hash in a hidden field in the HTML POST form and send it back to your service layer with your updated domain object. Do the concurrency checking in your service layer or data layer (by comparing the original hash with a newly extracted domain object's hash), but be aware that you need to check for and raise concurrency exceptions yourself. It's nice to use the DBML functions, but the whole idea of abstracting away the data layer is to not depend on a particular implementation's features. So having full control of the optimistic concurrency checking on your domain objects in your service layer (for example) seems like a good approach to me.
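A rough sketch of that idea; every name here is illustrative, and a hash is only a heuristic check (distinct data can collide):
// Hypothetical domain object: hash over the data fields, excluding the ID
public override int GetHashCode()
{
    unchecked
    {
        int hash = 17;
        hash = hash * 23 + (Name ?? string.Empty).GetHashCode();
        hash = hash * 23 + (Description ?? string.Empty).GetHashCode();
        return hash;
    }
}

// Hypothetical service-layer check: the form posts back the hash taken
// when the record was first rendered
public void UpdateTask(Domain.Task updated, int originalHash)
{
    Domain.Task current = _repository.GetTask(updated.Id);
    if (current.GetHashCode() != originalHash)
        throw new InvalidOperationException("Concurrency conflict detected.");
    _repository.SaveTask(updated);
}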

Maximum 'Units of Work' in one page request?

It's not one, is it? I have a method that gets five lists from different repositories. Each call opens and closes a new DataContext. Is this OK to do, or should I wrap everything in one DataContext? In this case it is not straightforward to use the same DataContext, but I am afraid that opening and closing numerous DataContexts in one page request is not good.
Here is an article on just that subject...
Linq to SQL DataContext Lifetime Management
He recommends one per request, and I have implemented that pattern in a few applications and it has worked well for me.
He talks a little about that in his article... His quick and dirty version makes a reference to System.Web and does something like this:
private TDataContext _DataContext;

public TDataContext DataContext
{
    get
    {
        if (_DataContext == null)
        {
            if (HttpContext.Current != null)
            {
                // Inside a web request: cache one context per request
                if (HttpContext.Current.Items[DataContextKey] == null)
                {
                    HttpContext.Current.Items[DataContextKey] = new TDataContext();
                }
                _DataContext = (TDataContext)HttpContext.Current.Items[DataContextKey];
            }
            else
            {
                // Outside a web request (e.g. unit tests): just create one
                _DataContext = new TDataContext();
            }
        }
        return _DataContext;
    }
}
But then he recommends you take the next step: get rid of the reference to System.Web, use dependency injection, and create your own IContainer that can determine the lifespan of your datacontext based on whether you're running in a unit test, a web application, etc.
Example:
public class YourRepository
{
    public YourRepository(IContainer<DataContext> container)
    {
    }
}
Then replace HttpContext.Current.Items[DataContextKey] with _Container[DataContextKey].
Hope this helps...
I use one Unit of Work per request and built an IHttpModule that manages the unit of work lifecycle, creating it at the start of the request and disposing it afterwards. The current unit of work is stored in HttpContext.Current.Items (hidden behind Local.Data).
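A bare-bones sketch of such a module; the module, key, and context names are all illustrative:
public class UnitOfWorkModule : IHttpModule
{
    private const string Key = "UnitOfWork";

    public void Init(HttpApplication application)
    {
        application.BeginRequest += OnBeginRequest;
        application.EndRequest += OnEndRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        // Create the unit of work for this request
        HttpContext.Current.Items[Key] = new MyDataContext();
    }

    private static void OnEndRequest(object sender, EventArgs e)
    {
        // Dispose it when the request finishes
        var unitOfWork = HttpContext.Current.Items[Key] as IDisposable;
        if (unitOfWork != null)
        {
            unitOfWork.Dispose();
        }
    }

    public void Dispose() { }
}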