I ran into an interesting problem while using DLINQ. When I instantiate an entity, calling .SubmitChanges() on the DataContext will insert a new row into the database - without having ever called .Insert[All]OnSubmit(...).
//Code sample:
Data.NetServices _netServices = new Data.NetServices(_connString);
Data.ProductOption[] test = new Data.ProductOption[]
{
    new Data.ProductOption
    {
        Name = "TEST1",
        // Notice the assignment here
        ProductOptionCategory = _netServices.ProductOptionCategory.First(poc => poc.Name == "laminate")
    }
};
_netServices.SubmitChanges();
Running the code above inserts a new row in the database. I noticed this effect while writing an app to parse an XML file and populate some tables: there were 1000+ inserts when I was only expecting around 50 or so, and I finally isolated this behavior.
How can I prevent these objects from being persisted implicitly?
Thanks,
-Charles
Think of the relationship as having two sides. When you set one side of the relationship, the other side needs to be updated, so in the case above, as well as setting the ProductOptionCategory, it is effectively adding the new object to the ProductOptions collection on the "laminate" ProductOptionCategory side.
The workaround is, as you have already discovered, to set the underlying foreign key instead. LINQ to SQL will then not track the object in the usual way, and will require an explicit indication that it should persist the object.
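A minimal sketch of that workaround, assuming the generated ProductOption entity exposes the foreign-key column as a ProductOptionCategoryId property and the category entity has an Id key (both names are guesses based on the question; check your .dbml):

int laminateId = _netServices.ProductOptionCategory
    .First(poc => poc.Name == "laminate").Id;

Data.ProductOption option = new Data.ProductOption
{
    Name = "TEST1",
    // Assigning the FK column directly does not wire up the two-sided
    // relationship, so the new object is never attached to the context.
    ProductOptionCategoryId = laminateId
};

// Nothing is inserted unless you explicitly opt in:
// _netServices.ProductOption.InsertOnSubmit(option);
_netServices.SubmitChanges();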
Of course the best solution for performance would be to determine from the source data which objects you don't want to add and never create the instance in the first place.
I have a view ObjectDisplay that is composed of two relevant tables: Object and State. State represents the state of an Object, and the view pulls some of the details from the most recent State for each Object.
On the page that is displaying this information, a user can enter some comments, which creates a new State. After creating the new State, I immediately pull the Object from ObjectDisplay and send it back to be dropped into a partial view and replace the Object in the grid on the page.
// Add new State.
db.States.Add(new State()
{
    ObjectId = objectId,
    Comments = comments,
    UserName = username
});

// Save the changes (executes all of the above).
db.SaveChanges();

// Return the new Object information.
return db.Objects.Single(c => c.ObjectId == objectId);
According to my db trace, the Single call occurs about 70 ms after the SaveChanges call, and it occurs on the same SPID.
Now for the issue: the database defaults the value of RecordDate in State to GETUTCDATE(); I don't provide the date myself. What I'm seeing is that the Object returned has the old State's information. When I refresh the page, all the correct information is there, but the wrong information is returned in the initial call from the database/EF.
So... what could be wrong? Could the view not be updating quickly enough? Could something be going on with EF? I don't really know where to start looking.
If you've previously loaded the same Object entity in the same DbContext, EF will return the cached instance with the stale values, and ignore the values returned from SQL.
The simplest solution is to reload the entity before returning it:
var result = db.Objects.Single(c => c.ObjectId == objectId);
db.Entry(result).Reload();
return result;
This is indeed odd. In SQL Server, views are not persisted by default and therefore show changes in the underlying data right away. You can create a clustered index on a view, which effectively persists the query, but in that case the data is updated synchronously, so you should see the change right away.
If you are working with snapshot isolation, your changes might not be visible to other SPIDs right away; but as you are on the same SPID and do not use snapshot isolation, this can't be the culprit either.
The only thing left at this point is the application layer. Are you actually using the result of the Single call higher up in the call stack, or does that get lost somewhere? I assume that a refresh of the page uses a different code path, which would explain why it works there.
We need to clone a complex data structure from one org to another. It contains a series of custom SObjects, including parents and children.
The flow would be the following: on the origin org, we just JSON.serialize the list of SObjects we want to send; then, on the target org, we JSON.deserialize that list of objects. So far so good.
The problem is that we cannot insert those SObjects directly, since they contain the origin org's IDs, and Salesforce won't let us insert objects that already have IDs.
The solution we found is to manually insert the object hierarchy, maintaining a map of originId > targetId and fixing the relationships by hand. However, we wonder whether Salesforce provides an easier way to do such a thing, or whether someone knows a better way.
Is there a built-in way in Salesforce to do this? Or are we stuck with a tedious manual process?
A List.deepClone() call with preserveIds = false might deal with one problem. Then:
Consider using the upsert operation to build the relationships for you.
Upsert can not only prevent duplicates but also maintain hierarchies.
You'll need an external Id field on the parent, though not on the children.
/* Prerequisites to run this example successfully:
   - having a field Account_Number__c that will be marked as ext. id (you can't mark the standard one, sadly)
   - having an account in the DB with such a value (but the point of the example is to NOT query for its Id)
*/
Account parent = new Account(Account_Number__c = 'A364325');
Contact c = new Contact(LastName = 'Test', Account = parent);
upsert c;
System.debug(c);
System.debug([SELECT AccountId, Account.Account_Number__c FROM Contact WHERE Id = :c.Id]);
If you're not sure whether it will work for you, play with Data Loader's upsert function; it might help you understand.
If you have a hierarchy more than two levels deep on the same sObject type, I think you'd still have to upsert the levels in the correct order (or use the Database.upsert version and keep rerunning it for the failed records).
I've done some searches (over the web and SO) but so far have been unable to find something that directly answers this:
Is there any way to force L2S to use a Stored Procedure when accessing a database?
This is different from simply using SPROCs with L2S: the thing is, I'm relying on LINQ to lazy-load elements by accessing them through the generated "Child Property". If I use a SPROC to retrieve the elements of one table, map them to an entity in LINQ, and then access a child property, I believe that LINQ will retrieve the records from the DB using dynamic SQL, which goes against my purpose.
UPDATE:
Sorry if the text above isn't clear. What I really want is something like the "Default Methods" for Update, Insert and Delete, but for Select. I want every access to be done through a SPROC, but I also want to use the child properties.
Just so you don't think I'm crazy: my DAL is built using child properties, and I was accessing the database through L2S using dynamic SQL, but last week the client told me that all database access must be done through SPROCs.
I don't believe there is a switch or setting that, out of the box, automagically maps to using sprocs the way you are describing. But there is no reason why you couldn't alter the generated DBML file to do what you want. If I had two related tables, a Catalog table and a CatalogItem table, the LINQ to SQL generator would naturally give me a CatalogItems property on Catalog, with code like:
private EntitySet<CatalogItem> _CatalogItems;

[global::System.Data.Linq.Mapping.AssociationAttribute(Name="CatalogItem", Storage="_CatalogItems", ThisKey="Id", OtherKey="CatalogId")]
public EntitySet<CatalogItem> CatalogItems
{
    get
    {
        return this._CatalogItems;
        // Replace this line with a sproc call that ultimately
        // returns the expected type.
    }
    set
    {
        this._CatalogItems.Assign(value);
        // Replace this line with a sproc call that ultimately
        // does a save operation.
    }
}
There is nothing stopping you from changing that code to make sproc calls there. It'd be some effort for larger applications, though, and I'd make sure you'd actually get the benefit from it that you think you would.
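For instance, a getter backed by a sproc might look roughly like the sketch below. GetCatalogItemsByCatalogId is a made-up sproc dragged into the designer, and _context stands for a DataContext reference you would have to wire into the entity yourself, since L2S entities don't carry one by default:

get
{
    if (!this._CatalogItems.HasLoadedOrAssignedValues)
    {
        // Populate the set from the sproc instead of letting
        // LINQ to SQL issue its dynamic-SQL lazy load.
        this._CatalogItems.Assign(this._context.GetCatalogItemsByCatalogId(this.Id));
    }
    return this._CatalogItems;
}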
How about loading the child entities using the partial OnLoaded() method in the parent entity? That would allow you to avoid messing with generated code. Of course it would no longer be a lazy load, but it's a simple way to do it.
For example:
public partial class Supplier
{
    public List<Product> Products { get; set; }

    partial void OnLoaded()
    {
        // GetProductsBySupplierId is the SP dragged into your dbml designer.
        // (dataContext here must be a reference to the owning DataContext;
        // L2S entities don't hold one by default.)
        Products = dataContext.GetProductsBySupplierId(this.Id).ToList();
    }
}
Call your stored procedure this way:
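// Sketch, assuming the sproc was dragged onto the .dbml designer so it
// is exposed as a method on the DataContext (the context and category
// names here are illustrative):
NorthwindDataContext db = new NorthwindDataContext();
List<Product> products = db.GetProductsByCategoryName("Beverages").ToList();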
Where GetProductsByCategoryName is the name of your stored procedure.
http://weblogs.asp.net/scottgu/archive/2007/08/16/linq-to-sql-part-6-retrieving-data-using-stored-procedures.aspx
In LINQ to SQL, if I update an object in the context but haven't called SubmitChanges, is there a way to "undo" or abandon that update so that the changes won't get submitted when I eventually call SubmitChanges?
For example, if I've updated several objects and then decide I want to abandon the changes to one of them before submitting.
Part 2: same question for Entity Framework, v3.5
Both LINQ to SQL and Entity Framework can refresh an object from the database (assuming you still have the active context). In LINQ to SQL:
_dbContext.Refresh(RefreshMode.OverwriteCurrentValues, yourObj);
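In Entity Framework 3.5, the equivalent call on ObjectContext uses EF's own RefreshMode enum, along the lines of:

_objectContext.Refresh(System.Data.Objects.RefreshMode.StoreWins, yourObj);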
A more appropriate way would be to treat the context as a unit of work, in which case you would no longer have an active context when refreshing the object. You would simply dispose of the context you're currently using and get a fresh copy of the object from a new context.
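A minimal sketch of that unit-of-work style (the context, table, and property names are made up for illustration):

// The context holding the pending change we want to abandon:
using (var db = new MyDataContext(_connString))
{
    var item = db.Items.Single(i => i.Id == id);
    item.Name = "edited";
    // Simply never call SubmitChanges(); disposing the context
    // discards the pending update.
}

// A brand-new context hands back the entity with its database values:
using (var db = new MyDataContext(_connString))
{
    var fresh = db.Items.Single(i => i.Id == id);
}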
I think you can use .GetOriginalEntityState(yourEntity) to retrieve the original values, then set your updated entity back to the original:
Dim db As New YourDataContext()

' Get an entity.
Dim e1 As YourEntity = db.Table1.First()

' Update the entity.
e1.SomeProperty = "New Value"

' Retrieve a copy with the original values and copy them back.
Dim originalEntity = db.Table1.GetOriginalEntityState(e1)
e1.SomeProperty = originalEntity.SomeProperty

db.SubmitChanges()
It's very much pseudo-code, but I think it conveys the right idea. Using this method, you could also undo just one or more property changes without refreshing the entire entity.
If I make changes to an existing LINQ object by assigning a "new" object of the same type (with different values), SubmitChanges does not make the changes in the database. Why not?
existing = new Data.Item { a = 1, b = 2, ... };
vs
existing.a = 1;
existing.b = 2;
Because you are not changing the object; you are assigning a new object to the variable.
You need to assign to the fields one by one (or use InsertOnSubmit... but that will create a new row in the database, and it does not sound like that is what you want to do).
That approach would sort of work if you were assigning the newly created object to a field of an object that LINQ to SQL knows about, but once again, that would be creating a new object rather than changing the one the field previously pointed to (which could leave a bunch of garbage rows in your database if you never get rid of them).
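To make the contrast concrete, here is a minimal sketch of the field-by-field update that LINQ to SQL does track (the context and property names are illustrative):

using (var db = new MyDataContext(_connString))
{
    // Load the tracked instance and mutate it in place; the generated
    // property setters notify the change tracker.
    var existing = db.Items.Single(i => i.Id == id);
    existing.a = 1;
    existing.b = 2;
    db.SubmitChanges(); // emits an UPDATE for the modified columns
}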