Fluent NHibernate, SQL Server and string length specification not working as expected - sql-server-2008

I am following the Summer of NHibernate tutorials, but instead of the XML mappings I am using Fluent NHibernate to do the mappings.
My Customer entity class is:
public class Customer
{
public virtual int CustomerId { get; set; }
public virtual string Firstname { get; set; }
public virtual string Lastname { get; set; }
}
The corresponding mapping class is:
public class CustomerMap: ClassMap<Customer>
{
public CustomerMap()
{
Id(x =>x.CustomerId);
Map(x => x.Firstname).Length(50).Nullable();
Map(x => x.Lastname).Length(50).Nullable();
ImportType<CustomerFirstnameCounter>();
}
}
My DAO class is:
public int AddCustomer( Customer customer )
{
using( ISession session = GetSession() )
{
using( ITransaction tx = session.BeginTransaction() )
{
try
{
int newId = ( int ) session.Save( customer );
session.Flush();
tx.Commit();
return newId;
}
catch( GenericADOException )
{
tx.Rollback();
throw;
}
}
}
}
And finally my test is:
[Test]
public void AddCustomerThrowsExceptionOnFail()
{
// Arrange
Customer customer = BuildInvalidCustomer();
// Act
_provider.AddCustomer( customer );
// Assert
}
When the test runs, no exception is thrown! So my first question is whether anyone can see what is wrong with my mapping.
Now, in the DB, the Firstname field is set as varchar(50). When I debug the test, I see that the data is inserted but truncated (I do get warning messages). So this might indicate that I haven't set the DB up properly. Can anyone point me in the direction of how to prevent this truncation of data in SQL Server?

This answer should help you.
I would also use the Data Annotations StringLengthAttribute to validate your properties.
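For illustration, a hedged sketch of what that could look like on the Customer entity (StringLengthAttribute lives in System.ComponentModel.DataAnnotations; it is metadata only, so something such as Validator.ValidateObject or a validation framework still has to evaluate it before the save):
using System.ComponentModel.DataAnnotations;

public class Customer
{
    public virtual int CustomerId { get; set; }

    // Validation metadata only; NHibernate will not enforce this by itself.
    [StringLength(50)]
    public virtual string Firstname { get; set; }

    [StringLength(50)]
    public virtual string Lastname { get; set; }
}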

.Length(50) does not check lengths at run-time. This is only used if you are generating the database schema from the mappings.
If you wish to validate the length of values, you will have to either do this manually or use a validation framework such as NHibernate Validator.
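For example, a minimal sketch of the manual approach inside the DAO from the question (the 50-character limit simply mirrors the mapping; this is plain C#, nothing NHibernate-specific):
public int AddCustomer( Customer customer )
{
    // NHibernate will not enforce .Length(50) at run-time, so reject
    // over-long values before they reach SQL Server and get truncated.
    if( customer.Firstname != null && customer.Firstname.Length > 50 )
        throw new ArgumentException( "Firstname exceeds 50 characters." );
    if( customer.Lastname != null && customer.Lastname.Length > 50 )
        throw new ArgumentException( "Lastname exceeds 50 characters." );

    // ... the existing session/transaction code from the question follows here ...
}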

Related

MySQL and EF Core 6 error The LINQ expression could not be translated

I recently updated our project from EF Core 2.2.6 to 6.x (along with an upgrade from .NET Core 3.1 to .NET 6) and now I'm getting errors like the one stated in the title whenever the query gets even a little complicated. One such case is when you add a GroupBy clause. Below is an example of a failing query.
_context.MyTable
.Where(a => a.Name.Contains("service"))
.GroupBy(ss => ss.IsServiceSpecific)
The entire error is:
The LINQ expression 'DbSet< MyTable >()
.Where(a => a.Name.Contains("service"))
.GroupBy(ss => ss.IsServiceSpecific)' could not be translated. Either rewrite the query in a form that can be translated, or switch
to client evaluation explicitly by inserting a call to 'AsEnumerable',
'AsAsyncEnumerable', 'ToList', or 'ToListAsync'
The setup at this MySQL::Entity Framework Core Support URL is exactly what I did (there are only two steps to set it up). My DI config looks like this:
builder.Services.AddEntityFrameworkMySQL()
.AddDbContext<MydbContext>(options =>
{
options.UseMySQL(builder.Configuration.GetConnectionString("DefaultConnection"));
});
It will execute simple queries, but more complex ones always generate this error. It says to rewrite the query and force client-side evaluation by using AsEnumerable or ToList, but I don't want to drag all that data to the client, and I expect that a simple GroupBy can be translated and handled server-side.
I did find one article that talks about this problem, but I can't tell whether it's suggesting an actual solution.
This shouldn't be this hard and I feel like I'm missing something simple.
Model
internal class Post
{
public int PostId { get; set; }
public string? Title { get; set; }
public string? Content { get; set; }
public int BlogId { get; set; }
}
DBContext
internal class BloggingContext : DbContext
{
public DbSet<Post>? Posts { get; set; }
public string DbPath { get; }
public BloggingContext()
{
var path = Environment.GetFolderPath(Environment.SpecialFolder.Desktop);
DbPath = $"{path}{Path.DirectorySeparatorChar}blogging.db";
}
protected override void OnConfiguring(DbContextOptionsBuilder options)
=> options.UseSqlite($"Data Source={DbPath}");
}
Main
internal class Program
{
static void Main(string[] args)
{
using (var db = new BloggingContext())
{
var posts = db.Posts
    .Where(s => s.Title.Contains("Hello"))
    .GroupBy(g => g.BlogId == 1994)
    .Select(s => new { Key = s.Key, Counts = s.Count() })
    .ToList();
foreach (var p in posts)
{
Console.WriteLine(p);
}
}
}
}
Conclusion: add a Select projection after the GroupBy so the grouping can be translated server-side.
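Applied to the failing query from the question, a hedged sketch (table and property names taken from the question) might look like this:
var counts = _context.MyTable
    .Where(a => a.Name.Contains("service"))
    .GroupBy(ss => ss.IsServiceSpecific)
    // Projecting the group lets EF Core translate the GroupBy to SQL
    // instead of trying to return the groups themselves.
    .Select(g => new { g.Key, Count = g.Count() })
    .ToList();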

NHibernate Lazy Loaded Property with Composite Key Issue (Works in SQLite doesn't in SQL Server 2008)

So I have a class:
public class objWalkdown {
private objWalkdown_ID _walkdownId = new objWalkdown_ID();
public virtual objWalkdown_ID Id {
get { return _walkdownId; }
set { _walkdownId = value; }
}
public virtual String facility {
get { return _walkdownId.facility; }
set { _walkdownId.facility = value; }
}
public virtual String walkdown {
get { return _walkdownId.walkdown; }
set { _walkdownId.walkdown = value; }
}
public virtual String activity {
get { return _walkdownId.activity; }
set { _walkdownId.activity = value; }
}
public virtual Boolean complete { get; set; }
public virtual Byte[] filedata { get; set; }
}
That utilizes a composite id setup in another class:
[Serializable]
public class objWalkdown_ID {
public virtual String facility { get; set; }
public virtual String walkdown { get; set; }
public virtual String activity { get; set; }
}
From my research, this serializable composite key is required in order to facilitate lazy loading on a property (i.e. a regular composite key setup will not work). Shown here is my Fluent NHibernate mapping class:
public class objWalkdown_ORM : ClassMap<objWalkdown> {
public objWalkdown_ORM() {
Id(x => x.Id);
Map(x => x.facility);
Map(x => x.walkdown);
Map(x => x.activity);
Map(x => x.complete);
Map(x => x.filedata)
.CustomType("BinaryBlob")
.LazyLoad();
Table("tbPDFs");
}
}
This worked 100% when NHibernate was hooked up to my SQLite test database, but when I moved to SQL Server 2008 things broke a bit. The only real DB change was from the SQLite BINARY type to the SQL Server 2008 VARBINARY(MAX) type. With lazy loading disabled everything works on both databases; it's only when my property is tagged with the lazy loading option that things start to break. Here is a sample of how I'm accessing the data:
public Stream getWalkdown(String Facility, String Walkdown, String Activity) {
//Use NHibernate Session setup "Per-Request"
ISession session = (ISession)HttpContext.Current.Items["NHSession"];
//Build composite Id
objWalkdown_ID id = new objWalkdown_ID() { facility = Facility, walkdown = Walkdown, activity = Activity };
//Get the entity
objWalkdown walkdown = session.Get<objWalkdown>(id);
//Lazy Load File Data and return a memory stream
return new MemoryStream(walkdown.filedata);
}
Please focus on the lazy loading portion and don't ask me not to use it; the purpose of this question is to determine exactly what is wrong with the lazy loading. As I stated, this works 100% with SQLite, and it also works under both SQLite and SQL Server 2008 if the lazy loading is removed, so I don't think it's the actual data types on the back end.
The specific error I encounter when inspecting the walkdown object (prior to the lazy load) is "System.InvalidOperationException - Invalid attempt to read when no data is present". However, there definitely is data present, so I'm thinking this is an issue with the NHibernate proxy.
Here is a stack trace:
at System.Data.SqlClient.SqlDataReader.ReadColumnHeader(Int32 i)
at System.Data.SqlClient.SqlDataReader.IsDBNull(Int32 i)
at NHibernate.Driver.NHybridDataReader.IsDBNull(Int32 i)
at NHibernate.Type.NullableType.NullSafeGet(IDataReader rs, String name)
at NHibernate.Type.NullableType.NullSafeGet(IDataReader rs, String[] names, ISessionImplementor session, Object owner)
at NHibernate.Persister.Entity.AbstractEntityPersister.InitializeLazyPropertiesFromDatastore(String fieldName, Object entity, ISessionImplementor session, Object id, EntityEntry entry)
at NHibernate.Persister.Entity.AbstractEntityPersister.InitializeLazyProperty(String fieldName, Object entity, ISessionImplementor session)
at NHibernate.Intercept.AbstractFieldInterceptor.InitializeField(String fieldName, Object target)
at NHibernate.Intercept.AbstractFieldInterceptor.Intercept(Object target, String fieldName, Object value)
at NHibernate.Intercept.DefaultDynamicLazyFieldInterceptor.Intercept(InvocationInfo info)
at objWalkdownProxy.get_filedata()
I hope someone can give me some advice on a proper solution. I already have a workaround, but it involves HQL and eagerly loading the entity when I require the file data, which is not what I want to be doing.
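For reference, a hedged sketch of that HQL workaround (the "fetch all properties" clause asks NHibernate to load lazy properties such as filedata eagerly; the parameter names here are assumptions):
objWalkdown walkdown = session
    .CreateQuery("from objWalkdown w fetch all properties " +
                 "where w.facility = :f and w.walkdown = :w and w.activity = :a")
    .SetString("f", Facility)
    .SetString("w", Walkdown)
    .SetString("a", Activity)
    .UniqueResult<objWalkdown>();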
Unfortunately, the answer was... a glitch. Something went wrong or corrupted the table in MSSQL2008. During my testing/debugging I created a brand new table with the exact same specifications and everything worked 100%.
One thing I would like to note, though, is that during my research I found out that a primary key in MSSQL2008 can be at most 900 bytes in size, so if I ever want to have NHibernate generate my schema for me I should really put a .Length(900) on the Id field.

There is already an open DataReader associated with this Connection which must be closed first + asp.net mvc

I have a MySQL database with a table entities that has multiple fields like entity_title, entity_description, ... . The table also has 3 foreign keys: user_id, region_id and category_id.
In my Index View I would like to show all the entities in a table (show the title, description, ... , the user name, the region name and the category name).
This is what I do in my Controller:
public ActionResult Index()
{
var model = this.UnitOfWork.EntityRepository.Get();
return View(model);
}
In my Repository I do this:
public virtual IEnumerable<TEntity> Get(
Expression<Func<TEntity, bool>> filter = null,
Func<IQueryable<TEntity>, IOrderedQueryable<TEntity>> orderBy = null,
string includeProperties = "")
{
IQueryable<TEntity> query = _dbSet;
if (filter != null)
{
query = query.Where(filter);
}
foreach (var includeProperty in includeProperties.Split
(new char[] { ',' }, StringSplitOptions.RemoveEmptyEntries))
{
query = query.Include(includeProperty);
}
if (orderBy != null)
{
return orderBy(query).ToList();
}
else
{
return query.ToList();
}
}
I always get the error Input string was not in a correct format on the last line (return query.ToList()).
But when I inspect _dbSet after the line IQueryable<TEntity> query = _dbSet; it already gives the error: There is already an open DataReader associated with this Connection which must be closed first.
This probably happens because I want to select from more than one table. But how can I fix this? I tried adding MultipleActiveResultSets=True to my ConnectionString like this:
<connectionStrings>
<add name="reuzzeCS" connectionString="server=localhost;uid=root;pwd=*****;Persist Security Info=True;database=reuzze;MultipleActiveResultSets=True"" providerName="MySql.Data.MySqlClient" />
But that gave me the error that the keyword doesn't exist, because I work with MySql.Data.MySqlClient.
The Query executed is:
{SELECT
Extent1.entity_id,
Extent1.entity_title,
Extent1.entity_description,
Extent1.entity_starttime,
Extent1.entity_endtime,
Extent1.entity_instantsellingprice,
Extent1.entity_shippingprice,
Extent1.entity_condition,
Extent1.entity_views,
Extent1.entity_created,
Extent1.entity_modified,
Extent1.entity_deleted,
Extent1.user_id,
Extent1.region_id,
Extent1.category_id
FROM entities AS Extent1}
But when the query is executed and I try to expand the results, I get the error There is already an open DataReader associated with this Connection which must be closed first.
EDIT:
My full repository:
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Entity;
using System.Linq;
using System.Linq.Expressions;
using System.Text;
using System.Threading.Tasks;
namespace App.Data.orm.repositories
{
// REPO FROM TEACHER
public class GDMRepository<TEntity> where TEntity : class
{
internal GDMContext _context;
internal DbSet<TEntity> _dbSet;
public GDMRepository(GDMContext context)
{
this._context = context;
this._dbSet = _context.Set<TEntity>();
}
public virtual IEnumerable<TEntity> Get(
Expression<Func<TEntity, bool>> filter = null,
Func<IQueryable<TEntity>, IOrderedQueryable<TEntity>> orderBy = null,
string includeProperties = "")
{
IQueryable<TEntity> query = _dbSet;
if (filter != null)
{
query = query.Where(filter);
}
foreach (var includeProperty in includeProperties.Split
(new char[] { ',' }, StringSplitOptions.RemoveEmptyEntries))
{
query = query.Include(includeProperty);
}
if (orderBy != null)
{
return orderBy(query).ToList();
}
else
{
return query.ToList();
}
}
public virtual TEntity GetByID(object id)
{
return _dbSet.Find(id);
}
public virtual void Insert(TEntity entity)
{
_dbSet.Add(entity);
}
public virtual void Delete(object id)
{
TEntity entityToDelete = _dbSet.Find(id);
Delete(entityToDelete);
}
public virtual void Delete(TEntity entity)
{
if (_context.Entry(entity).State == EntityState.Detached)
{
_dbSet.Attach(entity);
}
_dbSet.Remove(entity);
}
public virtual void Update(TEntity entity)
{
_dbSet.Attach(entity);
_context.Entry(entity).State = EntityState.Modified;
}
}
}
GDMContext class:
using App.Data.orm.mappings;
using System;
using System.Collections.Generic;
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration.Conventions;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace App.Data.orm
{
public class GDMContext:DbContext
{
public GDMContext() : base("reuzzeCS") { }
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
//REMOVE STANDARD MAPPING IN ENTITY FRAMEWORK
modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
//REGISTER MAPPERS
modelBuilder.Configurations.Add(new UserMapping());
modelBuilder.Configurations.Add(new PersonMapping());
modelBuilder.Configurations.Add(new RoleMapping());
modelBuilder.Configurations.Add(new EntityMapping());
modelBuilder.Configurations.Add(new MediaMapping());
modelBuilder.Configurations.Add(new BidMapping());
modelBuilder.Configurations.Add(new CategoryMapping());
modelBuilder.Configurations.Add(new AddressMapping());
modelBuilder.Configurations.Add(new RegionMapping());
modelBuilder.Configurations.Add(new MessageMapping());
}
}
}
My entity Model:
public class Entity
{
public Int64 Id { get; set; }
[Required(ErrorMessage = "Title is required")]
[StringLength(255)]
[DisplayName("Title")]
public string Title { get; set; }
[Required(ErrorMessage = "Description is required")]
[DisplayName("Description")]
public string Description { get; set; }
[Required]
public DateTime StartTime { get; set; }
[Required]
public DateTime EndTime { get; set; }
/*[Required(ErrorMessage = "Type is required")]
[StringLength(16)]
[DisplayName("Type")]
public string Type { get; set; }*/
[Required]
public decimal InstantSellingPrice { get; set; }
public Nullable<decimal> ShippingPrice { get; set; }
public Condition? Condition { get; set; }
public Nullable<Int64> Views { get; set; }
[Required]
public DateTime CreateDate { get; set; }
public Nullable<DateTime> ModifiedDate { get; set; }
public Nullable<DateTime> DeletedDate { get; set; }
public Int32 UserId { get; set; }
public Int32 RegionId { get; set; }
public Int16 CategoryId { get; set; }
public virtual User User { get; set; }
public virtual Region Region { get; set; }
public virtual Category Category { get; set; }
//public virtual ICollection<Category> Categories { get; set; }
public virtual ICollection<User> Favorites { get; set; }
public virtual ICollection<Bid> Bids { get; set; }
public virtual ICollection<Media> Media { get; set; }
}
public enum Condition
{
New = 1,
Used = 2
}
My Entity Mapping:
internal class EntityMapping : EntityTypeConfiguration<Entity>
{
public EntityMapping()
: base()
{
this.ToTable("entities", "reuzze");
this.HasKey(t => t.Id);
this.Property(t => t.Id).HasColumnName("entity_id").HasDatabaseGeneratedOption(DatabaseGeneratedOption.Identity);
this.Property(t => t.Title).HasColumnName("entity_title").IsRequired().HasMaxLength(255);
this.Property(t => t.Description).HasColumnName("entity_description").IsRequired();
this.Property(t => t.StartTime).HasColumnName("entity_starttime").IsRequired();
this.Property(t => t.EndTime).HasColumnName("entity_endtime").IsRequired();
//this.Property(t => t.Type).HasColumnName("entity_type").IsRequired();
this.Property(t => t.InstantSellingPrice).HasColumnName("entity_instantsellingprice").IsRequired();
this.Property(t => t.ShippingPrice).HasColumnName("entity_shippingprice").IsOptional();
this.Property(t => t.Condition).HasColumnName("entity_condition").IsRequired();
this.Property(t => t.Views).HasColumnName("entity_views").IsOptional();
this.Property(t => t.CreateDate).HasColumnName("entity_created").IsRequired().HasDatabaseGeneratedOption(DatabaseGeneratedOption.Computed);
this.Property(t => t.ModifiedDate).HasColumnName("entity_modified").IsOptional();
this.Property(t => t.DeletedDate).HasColumnName("entity_deleted").IsOptional();
this.Property(t => t.UserId).HasColumnName("user_id").IsRequired();
this.Property(t => t.RegionId).HasColumnName("region_id").IsRequired();
this.Property(t => t.CategoryId).HasColumnName("category_id").IsRequired();
//FOREIGN KEY MAPPINGS
this.HasRequired(t => t.User).WithMany(p => p.Entities).HasForeignKey(f => f.UserId).WillCascadeOnDelete(false);
this.HasRequired(t => t.Region).WithMany(p => p.Entities).HasForeignKey(f => f.RegionId);
this.HasRequired(t => t.Category).WithMany(p => p.Entities).HasForeignKey(f => f.CategoryId);
//MANY_TO_MANY MAPPINGS
this.HasMany(t => t.Favorites)
.WithMany(t => t.Favorites)
.Map(mc =>
{
mc.ToTable("favorites");
mc.MapLeftKey("entity_id");
mc.MapRightKey("user_id");
});
}
}
Link to stacktrace image!
UPDATE:
base {SELECT
Extent1.entity_id,
Extent1.entity_title,
Extent1.entity_description,
Extent1.entity_starttime,
Extent1.entity_endtime,
Extent1.entity_instantsellingprice,
Extent1.entity_shippingprice,
Extent1.entity_condition,
Extent1.entity_views,
Extent1.entity_created,
Extent1.entity_modified,
Extent1.entity_deleted,
Extent1.user_id,
Extent1.region_id,
Extent1.category_id
FROM entities AS Extent1} System.Data.Entity.Internal.Linq.InternalQuery {System.Data.Entity.Internal.Linq.InternalSet}
Your problem is that the MySQL connector probably doesn't support multiple active result sets, and because of that the setting in the connection string didn't help you.
So please try this instead of your code.
Edit:
query.Include("User").Include("Region").Include("Category").ToList();
Let me know if you get the same error after this change.
Update:
I have changed some things for you. Please use this code instead of your method:
public virtual IEnumerable<TEntity> Get(
Expression<Func<TEntity, bool>> filter = null,
Func<IQueryable<TEntity>, IOrderedQueryable<TEntity>> orderBy = null,
string includeProperties = "")
{
IQueryable<TEntity> query = _dbSet;
if (filter != null)
{
query = query.Where(filter);
}
if (orderBy != null)
{
return orderBy(query.Include("User").Include("Region").Include("Category")).ToList();
}
else
{
return query.Include("User").Include("Region").Include("Category").ToList();
}
}
Update 2:
It is not about closing the connection; EF manages the connection correctly. My understanding of this problem is that multiple data retrieval commands are executed on a single connection (or a single command with multiple selects), and the next DataReader is executed before the first one has finished reading. The only way to avoid the exception is to allow multiple nested DataReaders, i.e. turn on MultipleActiveResultSets. Another scenario where this always happens is when you iterate through the result of a query (IQueryable) and trigger lazy loading for a loaded entity inside the iteration.
And there are several Stack Overflow posts with solutions to your question:
1: Entity Framework: There is already an open DataReader associated with this Command
2: How to avoid "There is already an open DataReader associated with this Connection which must be closed first." in MySql/net connector?
3: Error: There is already an open DataReader associated with this Command which must be closed first
My personal advice: don't spend more time on this error; it's a waste of time and energy, and you can always fall back to a manual query. So please try different approaches.
You don't need to split and format queries to avoid the Input string was not in a correct format error.
You can do this instead of return query.ToList();
return _dbSet
    .Include("User")
    .Include("Region")
    .Include("Category")
    .ToList();
I think you can solve it using the SO links above.
And my main question is:
Entity Framework supports the ORM concept, so why don't you try it that way? Changing your approach to use the ORM concept may solve this problem. Here is a link for that; please see this tutorial.
UPDATE
OK, so from your stack trace it looks like the "open DataReader associated..." error was a red herring. Maybe that was Visual Studio and its IntelliSense debugger visualizer trying to show you the values contained in your DbSet while a connection was still open, or something like that.
To me, it looks like the MySqlDataReader is doing its job of enumerating the results and mapping them to POCOs.
Maybe there is a column that is a varchar(..) or something of that sort on a table in your database, and on your POCO its mapped property is of type Int32. If there is an empty string or a value that isn't a number in the database, then an Input string was not in a correct format exception is to be expected when EF tries to convert that value to an int. I just tried this now to confirm.
I think the issue is that MySQL doesn't support MARS and maybe it also doesn't support lazy loading. While I couldn't find anything official to say this is the case, I found a few posts with the same issue as yours:
http://www.binaryforge-software.com/wpblog/?p=163
MySQL + Code First + Lazy Load problem !
http://forums.mysql.com/read.php?38,259559,267490
Up until fairly recently I thought that calling ToList() on an IQueryable would load the results into memory and that navigation properties would not be lazy loaded; this is not strictly true. While the result will be materialized into memory, any virtual navigation properties of that result will still be lazy loaded if you try to access them.
At a high level, lazy loading works because Entity Framework overrides your 'virtual' navigation properties and uses its own implementation to load entities from the database.
My guess is that in your view, or somewhere else in your code, you are accessing a property that you haven't explicitly loaded using an Include, and that EF is trying to do this lazy load on a connection that is already in use. That is why you see:
There is already an open DataReader associated with this Connection which must be closed first
I would turn off lazy loading by doing the following:
public class GDMContext:DbContext
{
public GDMContext() : base("reuzzeCS")
{
base.Configuration.LazyLoadingEnabled = false;
}
}
Hope this helps.
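With lazy loading off, the navigation properties have to be loaded explicitly. Since the repository's Get method already accepts a comma-separated includeProperties string, a hedged example of the controller call could be:
public ActionResult Index()
{
    // Eagerly load the navigations the view needs so nothing triggers
    // a lazy load on the same connection while the view renders.
    var model = this.UnitOfWork.EntityRepository.Get(
        includeProperties: "User,Region,Category");
    return View(model);
}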
According to your stack trace, the framework appears to have an issue converting a string to an integer. To quote another SO answer, "EF throws error each time you set a type in the model that is different from the table."
You have a few options.
If you are using a code first approach, I suggest you regenerate your database.
If you are using a "code second" approach (which maps your database tables to POCO classes), then I suggest you regenerate your database.
If you have not had luck with either of the above, you may at least narrow down which column is having the issue by testing each of your integer-based columns like this:
public ActionResult Index()
{
var model1 = this.UnitOfWork.EntityRepository.Get(
includeProperties: "category_id");
// Did that produce an error? If not, try another column:
var model2 = this.UnitOfWork.EntityRepository.Get(
includeProperties: "region_id");
// etc.
// If you get to your original code, then try testing other columns
var model = this.UnitOfWork.EntityRepository.Get();
return View(model);
}
What if the above does not work? There could be an issue with your connection string, as mentioned in this SO answer. This is probably a long shot, given that your stack trace does not seem to stumble over creating a connection, but it is worth noting that in your connection string for MARS there appears to be an extra double quote (though I am sure that is probably just a transcription error). In any case, if you cannot get any query to work, ensure your connection string looks normal, like the following:
<connectionStrings>
<add name="reuzzeCS"
connectionString=
"server=localhost;database=Reuzze;User Id=root;password=P4ssw0rd"
providerName="MySql.Data.MySqlClient" />
</connectionStrings>
What if the above does not work? Check that your version of EntityFramework plays nice with your version of MySQL.
Let's simplify the problem. Nobody can help you unless you want to be helped. First of all, I must mention that MultipleActiveResultSets=True in your connection string, according to MSDN:
Is a feature that works with SQL Server to allow the execution of multiple batches on a single connection. When MARS is enabled for use with SQL Server, each command object used adds a session to the connection.
So that doesn't work with MySQL!
I think that you need to specify the port number in your connection string, like:
<connectionStrings>
<add name="reuzzeCS" connectionString="server=localhost;uid=root;pwd=*****;Persist Security Info=True;database=reuzze;" providerName="MySql.Data.MySqlClient" />
</connectionStrings>
Edit:
Also, you need to use Connector/Net, the fully-managed ADO.NET driver for MySQL, from this page. It works for most basic scenarios of DB interaction and also has basic Visual Studio integration. According to this page you need the connector too.
I hope this is useful. Regards.

Entity Framework 4.1 Code-First and Inserting New One-to-Many Relationships

I am having trouble persisting a new object graph to the context with a one-to-many relationship. I am using the Entity Framework 4.1 release and implementing a Code-First approach. I am using an existing SQL 2008 database and have implemented a context derived from DbContext. I have two classes, Person and Address. A Person can contain 0 or more Addresses, defined as such:
public class Person
{
public Person()
{
Addresses = new List<Address>();
}
public int PersonId { get; set; }
***Additional Primitive Properties***
public virtual ICollection<Address> Addresses { get; set; }
}
public class Address
{
public int AddressId { get; set; }
public int AddressTypeId { get; set; }
***Additional Primitive Properties***
public int PersonId { get; set; }
public virtual Person Person { get; set; }
}
I am trying to create a new instance of Person with two addresses. However, when I add this structure to the context and save, only the first Address in the collection is persisted. The second has its Person navigation property set to null and is not associated with the Person object; the first one in the list, however, is associated.
var person = new Person();
var mailingAddress = new Address() { AddressTypeId = 1 };
person.Addresses.Add(mailingAddress);
var billingAddress = new Address() { AddressTypeId = 2 };
person.Addresses.Add(billingAddress);
context.People.Add(person);
context.SaveChanges();
It does not throw an exception, but the second item in the Address collection is just not saved.
Does anybody have any good ideas on why only the first would be saved? Thank you.
After hours of troubleshooting/trial and error, I've solved my problem.
My POCO classes are also used in a disconnected environment, where the objects are detached from the context, modified, and then re-attached. In order to determine which navigation property collection items were affected, I overrode the Equals and GetHashCode methods in the Address class to determine equality. Apparently this affects the ability of EF 4.1 to insert a complete collection of navigation property objects???
Here are the original equality methods which caused the issue:
public override bool Equals(object obj)
{
Address address = obj as Address;
if (address == null) return false;
return address.AddressId == this.AddressId;
}
public override int GetHashCode()
{
return this.AddressId.GetHashCode();
}
In order to correct the problem, I created a custom equality comparer for the navigation object rather than including it directly in the Address class.
public class AddressEqualityComparer : IEqualityComparer<Address>
{
public bool Equals(Address address1, Address address2)
{
if (address1.AddressId == address2.AddressId)
return true;
else
return false;
}
public int GetHashCode(Address address)
{
return address.AddressId.GetHashCode();
}
}
My context.People.Add method call worked as expected after I made this change.
If anyone knows why overriding the equality methods in the class causes EF 4.1 to only insert the first item in the collection, that would be great information.
As hinted at already, it's because the GetHashCode method is using the ID of all the siblings, which will be 0 at the point Entity Framework compares them. Comment just that out and you will be good to go.
I had the exact same issue and this post led me to it. I didn't even bother to look at my EntityBase code... it's so old and hadn't changed in forever until now.
So a big thank you for your research!
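As an alternative to commenting the overrides out entirely, here is a hedged sketch (not from the original answers) of an identity-based Equals/GetHashCode that treats unsaved entities, whose AddressId is still 0, as distinct so the change tracker does not collapse them:
public override bool Equals(object obj)
{
    var other = obj as Address;
    if (other == null) return false;

    // Unsaved entities all have AddressId == 0, so fall back to reference
    // equality; otherwise two new Addresses would compare as equal.
    if (this.AddressId == 0 || other.AddressId == 0)
        return ReferenceEquals(this, other);

    return this.AddressId == other.AddressId;
}

private int? _hashCode;
public override int GetHashCode()
{
    // Cache the first value so the hash stays stable even after the
    // database assigns an ID on insert.
    if (_hashCode == null)
        _hashCode = this.AddressId == 0
            ? base.GetHashCode()
            : this.AddressId.GetHashCode();
    return _hashCode.Value;
}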
Here is another way to attempt the add. Worth a shot. This code may not be exact; I typed it freehand.
var person = new Person
{
    Addresses =
    {
        new Address() { AddressTypeId = 1 },
        new Address() { AddressTypeId = 2 }
    }
};
context.People.Add(person);
context.SaveChanges();

Using database default values with Linq to SQL codewise

I am using Dynamic Data with LINQ to SQL and SQL Server 2008.
I have a GUID column that gets its value from the column's default value, newid(). When I set IsDbGenerated to true in the designer it works like a charm.
But when I refresh the table this property is set back to false again. So I added it to the metadata. For some reason it's not being picked up; "00000000-0000-0000-0000-000000000000" is being inserted in the database. The DisplayName and ReadOnly changes are being picked up.
What am I missing?
[MetadataType(typeof(CMS_Data_HistoryMetadata))]
public partial class CMS_Data_History
{
}
[TableName("Content")]
public class CMS_Data_HistoryMetadata
{
[DisplayName("Pagina Title")]
public object pageTitleBar { get; set; }
[ReadOnly(true)]
[DisplayName("Versie")]
public object version_date { get; set; }
[ColumnAttribute(IsDbGenerated = true)]
public object entity_id;
}
I solved the problem by extending the partial insert and update methods and checking there whether the GUID is filled:
partial void InsertCMS_Data_History(CMS_Data_History instance)
{
    if (instance.entity_id == Guid.Empty)
    {
        instance.entity_id = Guid.NewGuid();
    }
    this.ExecuteDynamicInsert(instance);
}
partial void UpdateCMS_Data_History(CMS_Data_History instance)
{
    if (instance.version_date == DateTime.MinValue)
    {
        instance.version_date = DateTime.Now;
    }
    this.ExecuteDynamicUpdate(instance);
}