LINQ variable to list of string without using column names? - linq-to-sql

In a C# ASP.NET MVC project, I'm trying to build a List<string> from a LINQ variable.
This might be a pretty basic thing, but I just cannot get it to work without using the actual column names for the data in that variable. In the interest of keeping the program as dynamic as possible, I'm leaving it up to a stored procedure to get the data out. There can be any number of columns, named any which way, depending on where the data is fetched from. All I care about is getting all of their values into a List<string> so that I can compare user-input values against them in the program.
Referring to the columns by name in the code means I'd have to write dozens of overloaded methods that all basically do the same thing. Below is non-functioning pseudocode, but it should give an idea of what I mean.
// call the stored procedure
var courses = db.spFetchCourseInformation().ToList();
// if the data fails a check on a single row, it will not pass the check
bool passed = true;
foreach (var i in courses)
{
    // each row should be cast into a list of strings, which can then be validated
    // on a row-by-row basis
    List<string> courseRow = new List<string>();
    courseRow = courses[i]; // yes, obviously this is wrong syntax
    int matches = 0;
    foreach (string k in courseRow)
    {
        if (validator.checkMatch(k))
        {
            matches++;
        }
    }
    if (matches == 0)
    {
        passed = false;
        break;
    }
}
Below is an example of how I currently have to do it, because I need to use the column names:
for (int i = 0; i < courses.Count; i++)
{
    int matches = 0;
    if (validator.checkMatch(courses[i].Name))
        matches++;
    if (validator.checkMatch(courses[i].RandomOtherColumn))
        matches++;
    if (validator.checkMatch(courses[i].RandomThirdColumn))
        matches++;
    if (validator.checkMatch(courses[i].RandomFourthColumn))
        matches++;
    /* etc...
     * etc...
     * you get the point
     * and one of these for each and every possible variation from the stored procedure, NOT good practice
     */
}
Thanks for the help!

I'm not 100% sure what problem you are trying to solve (matching user input to a particular record in the DB?), but I'm pretty sure you're going about this in slightly the wrong fashion by putting the data in a List.
It should be possible to get your user input into an IDictionary, with the key being the column name and the object being the input data field.
Then when you get the data from the SP, you can read it back through a DataReader (a la http://msmvps.com/blogs/deborahk/archive/2009/07/09/dal-access-a-datareader-using-a-stored-procedure.aspx).
DataReaders are indexed by column name, so if you run through the keys in the input IDictionary, you can check the DataReader to see whether it has matching data.
using (SqlDataReader reader = Dac.ExecuteDataReader("CustomerRetrieveAll", null))
{
    while (reader.Read())
    {
        foreach (var key in userInputDictionary.AllKeys)
        {
            // compare the user-supplied value against the column of the same name
            var data = reader[key];
            if (data != userInputDictionary[key]) continue;
        }
    }
}
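To connect that back to the matching logic in the question, here is a rough, untested sketch. It reuses the Dac.ExecuteDataReader helper from the snippet above and assumes the user input is an IDictionary<string, string> keyed by column name (hence .Keys rather than .AllKeys); the stored procedure name is just a placeholder.
bool passed = true;
using (SqlDataReader reader = Dac.ExecuteDataReader("spFetchCourseInformation", null))
{
    while (reader.Read())
    {
        int matches = 0;
        foreach (string key in userInputDictionary.Keys)
        {
            // compare the user-supplied value with the column of the same name
            object data = reader[key];
            if (data != null && data.ToString() == userInputDictionary[key])
                matches++;
        }
        if (matches == 0)
        {
            passed = false;
            break;
        }
    }
}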
Still not sure about the problem you are solving, but I hope this helps!

A little creative reflection should do the trick.
var courses = db.spFetchCourseInformation();
var values = courses.SelectMany(c => c.GetType().GetProperties() // gets the properties of your object
    .Select(property => property.GetValue(c, null))); // gets the value of each property
List<string> stringValues = new List<string>(
    values.Select(v => v == null ? string.Empty : v.ToString()) // some of those values will likely be null
          .Distinct()); // remove duplicates
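If the goal is still the original row-by-row check, the same reflection trick can be folded into the validation loop. A rough sketch, assuming the validator.checkMatch(string) helper from the question:
bool passed = true;
foreach (var course in db.spFetchCourseInformation().ToList())
{
    // pull every column value of this row into a List<string>, no column names needed
    List<string> courseRow = course.GetType()
        .GetProperties()
        .Select(property => property.GetValue(course, null))
        .Select(v => v == null ? string.Empty : v.ToString())
        .ToList();

    // the row passes if at least one of its values matches the user input
    if (!courseRow.Any(value => validator.checkMatch(value)))
    {
        passed = false;
        break;
    }
}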

Related

BIRT Report with JSON Column data to table

I have a PostgreSQL table that stores some information in a JSON column. I'd like to take the data from this JSON column and populate a secondary table in the BIRT report.
I've got the data into a JSON object just fine and I can use it, but table #2 won't populate any data even though there is data in the JSON object.
DataSource1 is attached to DataSet1, which is PostgreSQL. DataSource2 is a scripted data source, and it has DataSet2 attached to it with columns defined.
In DataSet1 I have the onFetch script building my JSON object:
vars["invoiceData"] = JSON.parse(row["i_invoice"]);
Then in DataSet2 (the scripted dataset) I have this for the fetch event:
// Get the length of the object
len = vars["invoiceData"].labor.length;
count = 0; // Counter used to step through each item in the JSON object.
// Loop through the JSON object adding it to the scripted data source
if (count < len && len != 0) {
    row["hrs"] = vars["invoiceData"].labor[count].hrs;
    row["desc"] = vars["invoiceData"].labor[count].desc;
    row["rate"] = vars["invoiceData"].labor[count].rate;
    row["amount"] = vars["invoiceData"].labor[count].amount;
    count++;
    return true;
}
return false;
If I echo out the length of my array, it has one row in it, but the table never gets any data; it is always empty in my report.
You are mixing initialization code with loop code.
Just think about this fragment:
count = 0;
if (...) {
    count++;
    return true;
}
return false;
You need to split and modify this code. Basically, for a scripted data set you need three things:
A counter variable (or something similar). This is best declared as a report variable.
Initialization of the counter variable in the data set's open event.
Use of the counter variable in the data set's fetch event, e.g. modifying it and comparing it to the data length.
Your code will return the first data row in an endless loop, because what BIRT does is:
open()
while fetch():
emit the current row
Thus, your code is definitely wrong.
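As a rough, untested sketch of that split (using a plain script variable for brevity; a report variable, as suggested above, would be more robust):
// DataSet2 open event: initialize the counter once
count = 0;

// DataSet2 fetch event: emit one row per call until the labor array is exhausted
len = vars["invoiceData"].labor.length;
if (count < len) {
    row["hrs"] = vars["invoiceData"].labor[count].hrs;
    row["desc"] = vars["invoiceData"].labor[count].desc;
    row["rate"] = vars["invoiceData"].labor[count].rate;
    row["amount"] = vars["invoiceData"].labor[count].amount;
    count++;
    return true;
}
return false;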
On the other hand, the behavior you describe doesn't match what I would expect, which is a never-ending report. So there are probably other errors in your report. What does the "Problems" view in the BIRT designer show?

Unable to create a constant value of type 'T'

I have a table called Subjects.
I have another table called Allocations, which stores the allocations of the Subjects.
I have a DataGridView that is populated with subject allocations from the Allocations table.
Now I need to get the Subjects that are not in the DataGridView.
To do this:
I get all Subjects from the ObjectContext.
Then I get all the Subjects that are allotted from the DataGridView (this returns an in-memory collection).
Then I use LINQ's Except method to filter the results, but it throws the following exception:
"Unable to create a constant value of type 'ObjectContext.Subjects'. Only primitive types ('such as Int32, String, and Guid') are supported in this context."
Below is my code:
public static IOrderedQueryable<Subject> GetSubjects()
{
    return OBJECTCONTEXT.Subjects.OrderBy(s => s.Name);
}

private IQueryable<Subject> GetAllocatedSubjectsFromGrid()
{
    return (from DataGridViewRow setRow in dgv.Rows
            where !setRow.IsNewRow
            select setRow.DataBoundItem).Cast<Allocation>() // I know the problem lies somewhere in this function
           .Select(alloc => alloc.Subject).AsQueryable();
}

private void RUN()
{
    IQueryable<Subject> AllSubjects = GetSubjects(); // Gets
    IQueryable<Subject> SubjectsToExclude = GetAllocatedSubjectsFromGrid();
    IQueryable<Subject> ExcludedSubjects = AllSubjects.Except(SubjectsToExclude.AsEnumerable());
    // Throws "Unable to create a constant value of type 'OBJECTCONTEXT.Subject'. Only primitive types ('such as Int32, String, and Guid') are supported in this context."
}
As a result of googling, I found that this happens because LINQ can't compare an in-memory collection (the records from the DataGridView) with the ObjectContext (data from the DB).
I'm a little short of time and have not tested it, but I guess you can try to get it all into memory. So instead of
IQueryable<Subject> AllSubjects = GetSubjects(); // Gets
you do
List<Subject> AllSubjects = GetSubjects().ToList();
List<Subject> SubjectsToExclude = GetAllocatedSubjectsFromGrid().ToList();
List<Subject> ExcludedSubjects = AllSubjects.Except(SubjectsToExclude).ToList();
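One caveat with the in-memory Except (not from the original answer): unless Subject overrides Equals/GetHashCode, Except compares by reference, so subjects materialized from the grid may never match the instances loaded from the context. A key-based comparer is safer; this sketch assumes the entity key is an ID property:
class SubjectIdComparer : IEqualityComparer<Subject>
{
    public bool Equals(Subject x, Subject y)
    {
        if (x == null || y == null) return ReferenceEquals(x, y);
        return x.ID == y.ID; // assumes ID is the entity key
    }

    public int GetHashCode(Subject obj)
    {
        return obj.ID.GetHashCode();
    }
}

// usage
List<Subject> ExcludedSubjects = AllSubjects
    .Except(SubjectsToExclude, new SubjectIdComparer())
    .ToList();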
I got around this by comparing keys in a Where clause rather than using Except.
So instead of:
var SubjectsToExclude = GetAllocatedSubjectsFromGrid();
var ExcludedSubjects = AllSubjects.Except(SubjectsToExclude.AsEnumerable());
Something more like:
var subjectsToExcludeKeys =
    GetAllocatedSubjectsFromGrid()
        .Select(subject => subject.ID);
var excludedSubjects =
    AllSubjects
        .Where(subject => !subjectsToExcludeKeys.Contains(subject.ID));
(I'm guessing what your entity's key looks like though.)
This allows you to keep everything in Entity Framework, rather than pulling everything into memory.

JSON results are returned in a different order than expected

I am following Phil Haack's example on using jQuery Grid with ASP.NET MVC. I have it working, and it works well... except for one minor problem. When I sort the columns by something other than the ID, the JSON data returned from the server is very... well... wrong. Here is my controller method.
[HttpPost]
public ActionResult PeopleData(string sidx, string sord, int page, int rows)
{
    int pageIndex = Convert.ToInt32(page) - 1;
    int pageSize = rows;
    int totalRecords = repository.FindAllPeople().Count();
    int totalPages = (int)Math.Ceiling((float)totalRecords / (float)pageSize);
    var people = repository.FindAllPeople()
                           .OrderBy(sidx + " " + sord)
                           .Skip(pageIndex * pageSize)
                           .Take(pageSize);
    var jsonData = new
    {
        total = totalPages,
        page = page,
        records = totalRecords,
        rows = (
            from person in people
            select new
            {
                i = person.PersonID,
                cell = new List<string> { SqlFunctions.StringConvert((double)person.PersonID), person.PersonName }
            }
        ).ToArray()
    };
    return Json(jsonData);
}
When I sort by PersonID in the grid, I get this data back (I just used the name of the current ID as the name, e.g. 1 = One, 2 = Two, etc.)
{"total":1,"page":1,"records":6,"rows":[{"i":1,"cell":[" 1","One"]},{"i":2,"cell":[" 2","Two"]},{"i":3,"cell":[" 3","Three"]},{"i":4,"cell":[" 4","Four"]},{"i":5,"cell":[" 5","Five"]},{"i":6,"cell":[" 6","Six"]}]}
When I sort by PersonName, however, every other row has the order (the ID vs. the name) flipped around, so when I show it in the table the PersonName ends up in the ID column and the ID in the name column. Here is the JSON result.
{"total":1,"page":1,"records":6,"rows":[{"i":5,"cell":[" 5","Five"]},{"i":4,"cell":["Four"," 4"]},{"i":1,"cell":[" 1","One"]},{"i":6,"cell":["Six"," 6"]},{"i":3,"cell":[" 3","Three"]},{"i":2,"cell":["Two"," 2"]}]}
Anybody have any insight into what I've done wrong that causes this to happen?
Update
So, I have learned that what is happening is that my array values flip for every other item in the array. For example, if I populate my database with:
[A, B, C]
then for every even-numbered result (or odd, if you're counting from 0), my data comes back as:
[C, B, A]
So, ultimately, my JSON row data is something like:
[A, B, C]
[C, B, A]
[A, B, C]
[C, B, A]
...etc
This is always happening and always consistent. I am going a bit crazy trying to figure out what's going on because it seems like it should be something simple.
I have the same problem with my data, which is of INT type.
If the elements in my query (A, B, C) are of NVARCHAR type, I do not have this problem.
So the problem is evidently in the SqlFunctions.StringConvert function.
Try using the method described here. If you use fields instead of properties in repository.FindAllPeople(), you should look at the commented-out part of the code, where FieldInfo and GetField are used instead of PropertyInfo and GetProperty.
I found the solution here: linq to entities orderby strange issue
The issue ultimately stems from the fact that LINQ to Entities has trouble handling strings. When I was using the SqlFunctions.StringConvert method, the conversion was performed incorrectly (although I must admit I don't fully understand why the order was then switched around).
In either case, per the above post, the solution was to do the selection locally so that I could "force" LINQ to Entities to work with strings properly. From this, my final code is:
var people = repository.FindAllPeople()
                       .OrderBy(sidx + " " + sord)
                       .Skip(pageIndex * pageSize)
                       .Take(pageSize);

// Due to a problem with LINQ to Entities working with strings,
// all string work has to be done locally.
var local = people.AsEnumerable();
var rowData = local.Select(person => new
{
    id = person.PersonID,
    cell = new List<string> {
        person.PersonID.ToString(),
        person.PersonName
    }
}).ToArray();

var jsonData = new
{
    total = totalPages,
    page = page,
    records = totalRecords,
    rows = rowData
};
return Json(jsonData);

Using Linq to update a List<T> of objects to DB

I have a form that returns a List of FlatSessie objects.
In my edit view I edit a few FlatSessies and they are returned to my POST method in that List.
In my DB I have Sessies, which I map to FlatSessies and back using AutoMapper.
Now I cannot get LINQ to make the update to the DB for me.
The code:
[HttpPost]
public ActionResult Sessies(int id, int? modID, int? projID, string schooljaarparam, List<FlatSessie> sl) {
    if (ModelState.IsValid) {
        foreach (FlatSessie s in sl) { // run over all the FlatSessies that were posted
            Models.Sessies ses = Mapper.Map<FlatSessie, Sessies>(s); // map each one to a Sessies object
            // get the original Sessie by ID; if one exists, do an update, otherwise (the ID will be 0) do an insert
            List<Sessies> origReeks = _db.Sessies.Where(p => p.Ses_ID == ses.Ses_ID).ToList();
            if (origReeks.Count > 0) {
                // it's an update
                UpdateModel(origReeks.First()); // doesn't work
                //_db.Sessies.Attach(ses, origReeks.First()); // doesn't work either, gives me an error on the used ID...
                _db.SubmitChanges();
            } else {
                // no sessies yet, add them; this works
                _db.Sessies.InsertOnSubmit(ses);
                _db.SubmitChanges();
            }
        }
        TempData["okmsg"] = "De sessies zijn opgeslagen"; // "The sessions have been saved"
        return RedirectToAction("Index");
    }

    // if not valid, return the view data that the view needs
    Module m = _db.Modules.First(md => md.Mod_ID == modID.Value);
    int antses = m.Mod_AantalSessies.Value;
    List<List<SelectListItem>> lpllst = new List<List<SelectListItem>>(antses);
    for (int i = 0; i < antses; i++) {
        lpllst.Add(MvcApplication.lesplaatsList(schooljaarparam, -1));
    }
    ViewData["lesplist"] = lpllst;
    ViewData["lglist"] = MvcApplication.lesgeverList();
    return View(sl);
}
It might work to provide a prefix to UpdateModel (FlatSessie[n], where n is such that it matches the actual input name of the model element in question) so that it knows which properties to map onto the object, but because you are getting a list of these it might not. Have you tried updating the retrieved model directly with the data from the matching FlatSessie object?
Also, once you get this to work, you might want to do a single SubmitChanges for all inserts/updates (outside the loop) so that the entire submit is wrapped in a single transaction. If there are errors, this makes it easier to correct them and resubmit, since you won't have some changes already committed causing further potential errors.
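A rough, untested sketch of that suggestion, assuming AutoMapper's Mapper.Map(source, destination) overload is available to copy the posted values onto the already-tracked entity:
foreach (FlatSessie s in sl) {
    Sessies posted = Mapper.Map<FlatSessie, Sessies>(s);
    Sessies existing = _db.Sessies.FirstOrDefault(p => p.Ses_ID == posted.Ses_ID);
    if (existing != null) {
        // update the entity LINQ to SQL is already tracking instead of attaching a new instance
        Mapper.Map(s, existing);
    } else {
        _db.Sessies.InsertOnSubmit(posted);
    }
}
_db.SubmitChanges(); // one call, so all inserts and updates go out in a single transaction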

Linq to SQL: Queries don't look at pending changes

Follow up to this question. I have the following code:
string[] names = new[] { "Bob", "bob", "BoB" };
using (MyDataContext dataContext = new MyDataContext())
{
    foreach (var name in names)
    {
        string s = name;
        if (dataContext.Users.SingleOrDefault(u => u.Name.ToUpper() == s.ToUpper()) == null)
            dataContext.Users.InsertOnSubmit(new User { Name = name });
    }
    dataContext.SubmitChanges();
}
...and it inserts all three names ("Bob", "bob" and "BoB"). If this were LINQ to Objects, it wouldn't.
Can I make it look at the pending changes as well as what's already in the table?
I don't think that would be possible in general. Imagine you made a query like this:
dataContext.Users.InsertOnSubmit(new User { GroupId = 1 });
var groups = dataContext.Groups.Where(grp => grp.Users.Any());
The database knows nothing about the new user (yet) because the insert hasn't been committed, so the generated SQL query might not return the Group with Id = 1. The only way the DataContext could take the not-yet-submitted insert into account in cases like this would be to fetch the whole Groups table (and possibly more tables, if they are affected by the query) and perform the query on the client, which is of course undesirable. I guess the L2S designers decided it would be counterintuitive if some queries took not-yet-committed inserts into account while others didn't, so they chose to never take them into account.
Why don't you use something like
foreach (var name in names.Distinct(StringComparer.InvariantCultureIgnoreCase))
to filter out duplicate names before hitting the database?
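Applied to the loop from the question, that would look roughly like this (a sketch; note that the case-insensitive Distinct only removes duplicates within the incoming array, so the database check is still needed for names that already exist in the table):
foreach (var name in names.Distinct(StringComparer.InvariantCultureIgnoreCase))
{
    string s = name;
    if (dataContext.Users.SingleOrDefault(u => u.Name.ToUpper() == s.ToUpper()) == null)
        dataContext.Users.InsertOnSubmit(new User { Name = name });
}
dataContext.SubmitChanges();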
Why don't you try something like this:
foreach (var name in names)
{
    string s = name;
    if (dataContext.Users.SingleOrDefault(u => u.Name.ToUpper() == s.ToUpper()) == null)
    {
        dataContext.Users.InsertOnSubmit(new User { Name = name });
        break;
    }
}
I'm sorry, I don't understand LINQ to SQL that well.
But when I look at the code, it seems you are telling it to insert all the records at once (similar to a transaction) using SubmitChanges, while you are checking for their existence in the DB before the records have been inserted at all.
EDIT: Try putting the SubmitChanges inside the loop and you will see the code run as you expect.
You can query the appropriate ChangeSet collection, such as:
if (dataContext.Users
        .Union(dataContext.GetChangeSet().Inserts)
        .Except(dataContext.GetChangeSet().Deletes)
        .SingleOrDefault(u => u.Name.ToUpper() == s.ToUpper()) == null)
This will create a union of the values in the Users table and the pending inserts, and will exclude the pending deletes.
Of course, you might want to create a changeSet variable to avoid multiple calls to GetChangeSet, and you may need to cast the objects in the collections to the appropriate type. For the Inserts and Deletes collections, you may want to filter with something like
...GetChangeSet().Inserts.Where(o => o.GetType() == typeof(User)).OfType<User>()...
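Putting those pieces together, a minimal untested sketch of what the check could look like (it assumes the User entity from the question; note that AsEnumerable() pulls the Users table into memory so that Union and Except run on objects):
ChangeSet changeSet = dataContext.GetChangeSet();
IEnumerable<User> pendingInserts = changeSet.Inserts.OfType<User>();
IEnumerable<User> pendingDeletes = changeSet.Deletes.OfType<User>();

bool exists = dataContext.Users.AsEnumerable()
    .Union(pendingInserts)
    .Except(pendingDeletes)
    .Any(u => u.Name.ToUpper() == s.ToUpper());

if (!exists)
    dataContext.Users.InsertOnSubmit(new User { Name = name });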