Alternative for filtered Include in EF Core v2.1.11 - ef-core-2.1

I'm trying to optimize some code and came across filtered Include, only to discover that it isn't possible in the version I'm using.
The code/query that I'm trying to optimize is this:
var context = new EntitiesContext();
var sources = context.Schools.Include(s => s.Sections).AsNoTracking();
I also tried using Entity Framework Plus, but it doesn't seem to cater for my needs. I'd like to eliminate looping over the results just to filter the data.
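For what it's worth, filtered Include didn't ship until EF Core 5.0, so on 2.1 the usual workaround is to project the filtered children explicitly so the filter is translated to SQL. A minimal sketch, assuming a hypothetical IsActive flag on Section:
var sources = context.Schools
    .AsNoTracking()
    .Select(s => new
    {
        School = s,
        // the Where runs in the database, not in a client-side loop
        FilteredSections = s.Sections.Where(sec => sec.IsActive).ToList()
    })
    .ToList();
The trade-off is that you get an anonymous (or DTO) shape back, rather than School entities with their Sections navigation populated.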

Related

Splitting a feature collection by system index in Google Earth Engine?

I am trying to export a large feature collection from GEE. I realize that the Python API allows for this more easily than the JavaScript API does, but given a time constraint on my research, I'd like to see if I can extract the feature collection in pieces and then append the separate CSV files once exported.
I tried to use a filtering function to perform the task, one that I've seen used before with image collections. Here is a mini example of what I am trying to do:
Given a feature collection of 10 spatial points called "points" I tried to create a new feature collection that includes only the first five points:
var points_chunk1 = points.filter(ee.Filter.rangeContains('system:index', 0, 5));
When I execute this function, I receive the following error: "An internal server error has occurred"
I am not sure why this code is not executing as expected. If you know more than I do about this issue, please advise on alternative approaches to splitting my sample, or on where the error in my code lurks.
Many thanks!
system:index is actually an ID given by GEE to each feature, and it's not supposed to be used like an index into an array. I think the JS API should be enough to export a large FeatureCollection, but there is a way to do what you want without relying on system:index, since it might not be consistent.
First, it would be a good idea to know the number of features you are dealing with, because calling size().getInfo() on a large feature collection can freeze the UI and sometimes make the tab unresponsive. Here I have defined chunk and collectionSize. They must be defined client-side, because we want to call Export inside the loop, which is not possible in server-side loops. Within the loop, you simply create a subset of features starting at different offsets by converting the collection to a list and converting each subset back to a FeatureCollection.
var chunk = 1000;
var collectionSize = 10000; // get this once, e.g. with a one-off print(fc.size())
for (var i = 0; i < collectionSize; i = i + chunk) {
  // take `chunk` features starting at offset i, then rebuild a FeatureCollection
  var subset = ee.FeatureCollection(fc.toList(chunk, i));
  Export.table.toAsset(subset, "description", "/asset/id");
}

Avoiding code duplication when loading parameters

I have a question about avoiding code duplication. I have a scenario class that contains a set of objects, and these objects are then used to run the algorithm. I pass the scenario class to the objects that need it, and I find myself with several 'loadings' of these parameters, something like this:
allEMCameraMotions = scenario.getEMCams();
gTObject = scenario.getGroundTruth();
visualCov = gTObject.getVisualCov();
eMCov = gTObject.getEMCov();
tempCov = formTemporalCovariance(gTObject, toEstimateWindowSize);
intrinsics = gTObject.getIntrinsics();
I have noticed that this is repeated in several functions, and I would like to know if there is any strategy out there to avoid this duplication.
Thank you!
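One common strategy here is a parameter object: do the loading once in a small class and pass that object around, so each function stops re-deriving the same values. A minimal sketch in C#, with hypothetical types (the snippet doesn't show them) and assuming formTemporalCovariance lives on some CovarianceUtils helper:
public sealed class ScenarioParameters
{
    public List<CameraMotion> AllEMCameraMotions { get; }  // types here are assumed
    public GroundTruth GTObject { get; }
    public Matrix VisualCov { get; }
    public Matrix EMCov { get; }
    public Matrix TempCov { get; }
    public Intrinsics Intrinsics { get; }

    public ScenarioParameters(Scenario scenario, int toEstimateWindowSize)
    {
        // every lookup happens exactly once, here
        AllEMCameraMotions = scenario.getEMCams();
        GTObject = scenario.getGroundTruth();
        VisualCov = GTObject.getVisualCov();
        EMCov = GTObject.getEMCov();
        TempCov = CovarianceUtils.formTemporalCovariance(GTObject, toEstimateWindowSize);
        Intrinsics = GTObject.getIntrinsics();
    }
}
Functions that previously repeated the six assignments now just accept a ScenarioParameters argument.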

Fetching strategy encapsulation for Entity Framework 4.1 and NHibernate

I created a project to test out NHibernate 3+ vs. Entity Framework 4.1, wrapping each in a repository and making it very testable using interfaces, etc.
I do not want to expose either ORM outside of the repositories (I do not even expose IQueryables). Everything should be handled in that layer, and until I tried to handle fetching in an abstract way, everything was good.
Microsoft's implementation of eager loading uses either magic strings (yuck) or LINQ expressions (yay) on the Include function. Their syntax follows something like this:
IQueryableThing.Include(o => o.Person);
IQueryableThing.Include(o => o.Company.Contact);
IQueryableThing.Include(o => o.Orders.Select(p => p.LineItem.Cost));
The first will just load the associated person (the parent).
The second will load the associated company and each company's contact (parent and grandparent).
The third will load all associated orders, and the line items and costs for each order.
It's a pretty slick implementation.
NHibernate uses a slightly different approach. It still uses LINQ expressions, but makes heavier use of extension methods (a fluent approach).
IQueryableThing.Fetch(o => o.Person);
IQueryableThing.Fetch(o => o.Company).ThenFetch(o => o.Contact);
IQueryableThing.FetchMany(o => o.Orders).ThenFetch(p => p.LineItem).ThenFetch(q => q.Cost);
(I'm not sure if the third line is the correct syntax.)
I can encapsulate a list of expressions in a separate class and then apply those expressions to the IQueryable within that class. So what I would need to do is standardize on the Microsoft expression syntax and then translate it into NHibernate's syntax by walking the expression tree and rebuilding each expression.
This is the part that's really tricky. I have to maintain a particular order of operations to call the correct function on the IQueryable (the chain must start with either Fetch or FetchMany, with each subsequent call being ThenFetch or ThenFetchMany), which stops me from using the built-in ExpressionVisitor class.
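To make the encapsulation concrete, here is a minimal sketch of the repository-side abstraction, with hypothetical names (IFetchingStrategy, With, ApplyTo). The EF 4.1 side is simple because Include composes flatly; an NHibernate implementation of the same interface is where the Fetch/ThenFetch translation described above would live:
using System;
using System.Collections.Generic;
using System.Data.Entity; // EF 4.1's Include(lambda) extension method
using System.Linq;
using System.Linq.Expressions;

public interface IFetchingStrategy<T>
{
    IQueryable<T> ApplyTo(IQueryable<T> query);
}

public class EfFetchingStrategy<T> : IFetchingStrategy<T> where T : class
{
    private readonly List<Expression<Func<T, object>>> paths =
        new List<Expression<Func<T, object>>>();

    // record a path using the Microsoft expression syntax
    public EfFetchingStrategy<T> With(Expression<Func<T, object>> path)
    {
        paths.Add(path);
        return this;
    }

    // replay the recorded paths onto the query
    public IQueryable<T> ApplyTo(IQueryable<T> query)
    {
        return paths.Aggregate(query, (q, path) => q.Include(path));
    }
}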
Edit:
I finally created an expression parser that will take any level of nesting of properties, collections, and selects on collections, and produce an array of expressions. Unfortunately, the built-in Fetch extension methods do not take a LambdaExpression as a parameter.
The part I am currently stuck on is not being able to use the built-in Fetch definitions from NHibernate. It looks like I may have to call the Remotion library's functions directly, or register my own extension methods that will satisfy their parser.
Funky.
Have you tried using NHibernateUtil.Initialize()? I haven't attempted to do what you are doing, but I think Initialize will work akin to Include().
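For what it's worth, a minimal usage sketch (the entity and variable names are illustrative): Initialize forces lazy members of an already-loaded object to load, rather than shaping the query the way Include does:
// requires: using NHibernate;
var order = session.Get<Order>(id);          // hypothetical entity and session
NHibernateUtil.Initialize(order.Person);     // force the lazy proxy to load
NHibernateUtil.Initialize(order.LineItems);  // force the lazy collection to load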

Which DAL libraries support stored procedure execution and results materialisation

I'm used to EF because it usually works just fine once you get to know it well enough to optimize your queries. But.
What would you choose when you know you'll be working with large quantities of data? I know I wouldn't want to use EF in the first place and cripple my application. I would write highly optimised stored procedures and call those to get certain very narrow results (with many joins, so they probably wouldn't just return particular entities anyway).
So I'm a bit confused about which DAL technology/library I should use. I don't want to use the SqlConnection/SqlCommand way of doing it, since I would have to write much more code, and that code is likely to hide some obscure bugs.
I would like to keep the bug surface as small as possible and use a technology that accommodates my process, not vice versa...
Is there any library that gives me the possibility to:
provide the means of simple SP execution by name
provide automatic materialisation of returned data so I could just provide certain materialisers by means of lambda functions?
like:
List<Person> result = Context.Execute("StoredProcName", record => new Person {
    Name = record.GetData<string>("PersonName"),
    UserName = record.GetData<string>("UserName"),
    Age = record.GetData<int>("Age"),
    Gender = record.GetEnum<PersonGender>("Gender")
    ...
});
or even calling a stored procedure that returns multiple result sets, etc.:
List<Question> result = Context.ExecuteMulti("SPMultipleResults", q => new Question {
    Id = q.GetData<int>("QuestionID"),
    Title = q.GetData<string>("Title"),
    Content = q.GetData<string>("Content"),
    Comments = new List<Comment>()
}, c => new Comment {
    Id = c.GetData<int>("CommentID"),
    Content = c.GetData<string>("Content")
});
Basically this last one wouldn't work, since the context doesn't have any knowledge of how to bind the two together... but you get the point.
So to put it all down to a single question: Is there a DAL library that's optimised for stored procedure execution and data materialisation?
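For comparison, the single-result-set Execute imagined above is straightforward to sketch over plain ADO.NET. Context, GetData<T> and GetEnum<T> are the question's invented API; SprocRunner below is a hypothetical stand-in:
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public static class SprocRunner
{
    // execute a stored procedure by name and materialise each row via the lambda
    public static List<T> Execute<T>(string connectionString, string storedProcName,
                                     Func<IDataRecord, T> materialise)
    {
        var results = new List<T>();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(storedProcName, connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    results.Add(materialise(reader));
            }
        }
        return results;
    }
}
Usage then looks close to the question's wish, e.g. SprocRunner.Execute(connStr, "StoredProcName", r => new Person { Name = (string)r["PersonName"] }). Multiple result sets would need IDataReader.NextResult and a second materialiser.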
Business Layer Toolkit might be exactly what's needed here. It's a lightweight ORM tool that supports lots of scenarios, including multiple result sets, although those seem very complicated to do.

How can I force Linq to SQL NOT to use the cache?

When I make the same query twice, the second time it does not return new rows from the database (I guess it just uses the cache).
This is a Windows Forms application, where I create the DataContext when the application starts.
How can I force Linq to SQL not to use the cache?
Here is a sample function where I have the problem:
public IEnumerable<Orders> NewOrders()
{
    return from order in dataContext.Orders
           where order.Status == 1
           select order;
}
The simplest way would be to use a new DataContext - given that most of what the context gives you is caching and identity management, it really sounds like you just want a new context. Why did you want to create just the one and then hold onto it?
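A minimal sketch of that (the context type name is assumed; note the ToList() so the results are materialised before the context is disposed):
public IEnumerable<Orders> NewOrders()
{
    using (var dataContext = new MyDataContext()) // fresh context per query
    {
        // ToList() executes the query now, so the rows outlive the context
        return dataContext.Orders.Where(order => order.Status == 1).ToList();
    }
}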
By the way, for simple queries like yours it's more readable (IMO) to use "normal" C# with extension methods rather than query expressions:
public IEnumerable<Orders> NewOrders()
{
    return dataContext.Orders.Where(order => order.Status == 1);
}
EDIT: If you never want it to track changes, then set ObjectTrackingEnabled to false before you do anything. However, this will severely limit its usefulness. You can't just flip the switch back and forth (having made queries in between). Changing your design to avoid the singleton context would be much better, IMO.
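For reference, a minimal sketch of that switch (the context type name is assumed):
var dataContext = new MyDataContext();     // hypothetical context type
dataContext.ObjectTrackingEnabled = false; // must be set before the first query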
HOW you add an object to the DataContext can determine whether or not it will be included in future queries.
Will NOT add the new InventoryTransaction to future in-memory queries
In this example I'm setting the foreign-key ID directly and then adding the object to the context.
var transaction = new InventoryTransaction()
{
    AdjustmentDate = currentTime,
    QtyAdjustment = 5,
    InventoryProductId = inventoryProductId
};
// Table<T> in Linq-to-SQL inserts via InsertOnSubmit (it has no Add method)
dbContext.InventoryTransactions.InsertOnSubmit(transaction);
dbContext.SubmitChanges();
Linq-to-SQL isn't clever enough to see this as something that belongs in the previously cached in-memory list of InventoryTransactions.
WILL add the new InventoryTransaction to future in-memory queries
var transaction = new InventoryTransaction()
{
    AdjustmentDate = currentTime,
    QtyAdjustment = 5
};
// EntitySet<T>.Add wires up the relationship, so the cached parent sees it
inventoryProduct.InventoryTransactions.Add(transaction);
dbContext.SubmitChanges();
Wherever possible, use the collections in Linq-to-SQL when creating relationships, rather than setting the IDs directly.
In addition as Jon says, try to minimize the scope of a DataContext as much as possible.