Razor WebPages: How to make data from the database available to several pages?

My question is quite similar to this one.
I fetch the same data (attributes of objects, one row per object) from the database for two of my pages: one page to edit the data and one to view it. To reduce redundant code and improve maintainability, I want to write the code that loads the data only once.
My idea was to use _PageStart.cshtml, but with that I can only store strings in the PageData dictionary, not objects.
So what is the best way to make the rows from the database available on several pages?
Here is how I get the data from the database:
var db = Database.Open("mydb");
var rows = db.Query("select * from motors");

var motors = new System.Data.DataTable();
motors.Columns.Add("posX", typeof(int));
motors.Columns.Add("posY", typeof(int));

foreach (var row in rows)
{
    motors.Rows.Add(row.posX, row.posY);
}
I would like to access the DataTable motors on different pages.

All you have to do is create a page that contains only your code. Then, on whatever page you want it displayed, just use:
@RenderPage("~/yourpage.cshtml")
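Another common pattern in ASP.NET Web Pages is to move the data access into a static helper class under App_Code, so every page can call it and get the DataTable back directly (the class and file names here are illustrative, not from the original question):

```csharp
// App_Code/MotorData.cs -- hypothetical shared helper
public static class MotorData
{
    public static System.Data.DataTable GetMotors()
    {
        var db = WebMatrix.Data.Database.Open("mydb");

        var motors = new System.Data.DataTable();
        motors.Columns.Add("posX", typeof(int));
        motors.Columns.Add("posY", typeof(int));

        // db.Query returns IEnumerable<dynamic>, one item per row
        foreach (var row in db.Query("select * from motors"))
        {
            motors.Rows.Add(row.posX, row.posY);
        }

        return motors;
    }
}
```

Any page can then do `var motors = MotorData.GetMotors();`, which sidesteps the PageData limitation entirely because you are passing a real object, not a string.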

Related

What is the least resource-intensive way to perform "findManyOrCreate" in Laravel Eloquent?

I have an array of post codes coming from an input:
$postCodes = collect(["BK13TV", "BK14TV", "BK15TV", "BK16TV"]);
In my database I already have two of the post codes - "BK13TV", "BK16TV".
So I would like to run something like this:
$postCodeModels = PostCode::findManyOrCreate($postCodes->map(function ($postCode) {
    return ['code' => $postCode];
}));
My initial approach was to load all the post codes, then diff them against the postCodes from the input like so:
PostCode::createMany($postCodes->diff(PostCode::all()->pluck('code')))
However, here it means that I am loading the entire content of post_codes table, which just seems wrong.
In the ideal case, this would return all post code models matching the passed post codes as well as would create entries for post codes that did not exist in the database.
First I need to retrieve existing postcodes:
$existingPostCodes = PostCode::whereIn('code', $postCodes)->get();
Then find all the post codes from the input that are not yet stored in the database:
$newPostCodes = $postCodes->diff($existingPostCodes->pluck('code'));
Insert the missing ones, then retrieve all the matching models in one go:
PostCode::insert($newPostCodes->map(function ($code) {
    return ['code' => $code];
})->all());

$postCodeModels = PostCode::whereIn('code', $postCodes)->get();
Admittedly, this still takes three queries, but it does eliminate loading an entire table's worth of data.
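The three steps can be wrapped into a reusable helper. This is a sketch of a custom static method on the model (`findManyOrCreate` is not a built-in Eloquent API), and it assumes `code` is the only column you need to fill:

```php
// Hypothetical helper on the PostCode model
public static function findManyOrCreate($codes)
{
    // Query 1: models that already exist
    $existing = static::whereIn('code', $codes)->get();

    // Codes present in the input but missing from the database
    $missing = $codes->diff($existing->pluck('code'));

    if ($missing->isNotEmpty()) {
        // Query 2: one bulk insert for every missing code
        static::insert($missing->map(function ($code) {
            return ['code' => $code];
        })->all());
    }

    // Query 3: all requested models, old and newly created
    return static::whereIn('code', $codes)->get();
}
```

Three queries total regardless of input size, and nothing close to the full table is ever loaded.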

Copy a record and its descendants in Laravel 5.4

I have the following database structure and Eloquent models:
Document: id
Pages: id, document_id
Assets: id, page_id
So I have documents that have many pages, which in turn have many assets.
I have a really ugly method on my Document model that was working up until now; it iterates over all of the pages, then the assets, and replicates them, storing the new foreign keys on the new records.
foreach ($pages as $page) {
    $replicatedPage = $page->replicate();
    $replicatedPage->document_id = $newDocument->id;
    $replicatedPage->save();

    foreach ($page->assets as $asset) {
        $replicatedAsset = $asset->replicate();
        $replicatedAsset->page_id = $replicatedPage->id;
        $replicatedAsset->save();
    }
}
Unfortunately, this has now gotten extremely slow as people are uploading almost 200 pages with 200 assets per page (40,000 assets in total).
I looked into the saveMany() function too, but under the hood it does exactly what I am doing above, iterating and saving the models one by one.
What I really need is a nice query-builder way to copy all of these at the SQL level, but I cannot figure out how to do it. Or even a raw SQL statement I can embed in a raw query builder call.
Any ideas?
Use the insert() method to make it faster. Eager load the document with all of its pages and assets, iterate over that data, and build an array for each table. An example for the pages table (do the same for the assets table):
$replicatedPages = [];
foreach ($pages as $page) {
    $attributes = $page->toArray();
    unset($attributes['id']);                      // let the database assign new IDs
    $attributes['document_id'] = $newDocument->id; // point the copy at the new document
    $replicatedPages[] = $attributes;
}
Page::insert($replicatedPages);
With this approach you will execute just two queries for inserting all of the document's pages and assets, instead of executing 40,000 queries.
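If you do want to push the copy down to the SQL level as the question asks, an INSERT ... SELECT can clone all pages of a document in a single statement. The column names below are assumptions based on the models above, and note that copying the assets additionally requires mapping old page IDs to new ones (for example via a temporary mapping table), which the bulk-insert approach above glosses over as well:

```sql
-- Copy every page of document 1 onto the new document 2 in one statement;
-- 'title' and 'body' stand in for whatever columns the pages table really has
INSERT INTO pages (document_id, title, body)
SELECT 2, title, body
FROM pages
WHERE document_id = 1;
```

The database does the whole copy server-side, so no row data ever travels to PHP and back.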

Kentico Smart Search Index - any way to read it with the API?

We need to get a search box autocomplete working based on Kentico search indexes, but half the site is in the CMS app pages, and half is in MVC. So the autocomplete webpart works on the CMS app pages, but not the MVC ones.
An option we're exploring is to use the Twitter Typeahead js library in both sides of the site, which requires the search terms to be in a json file.
So we'd like to be able to load the search index terms via the Kentico API and then write that out to a json file.
The SearchIndexInfo object doesn't seem to have a way to get the index's terms that it writes out to the index files.
Update
For clarification: we can do the search via the API, but the search result items come back with only the title and content fields; they do not contain all the search terms that are stored in the index files.
For instance, a search index for a custom page type might build the index based on the DocumentName, Description, Location, City, Company Name, DesignCategory fields. All of those will be stored in the index somewhere, so how do we read the terms that are stored in the index?
Not just the results, which would only have DocumentName(title) and Description (content).
We're basically trying to convert the search index files into a json representation, not the search results.
Of course, if the SmartSearchDialog webpart just does its predictive search on only the title and content fields, then we would go with that, but I believe the SmartSearchDialog performs an actual search, does it not?
Thanks
There is an API for it:
// Gets the search index
SearchIndexInfo index = SearchIndexInfoProvider.GetSearchIndexInfo("NewIndex");

if (index != null)
{
    // Prepares the search parameters
    SearchParameters parameters = new SearchParameters()
    {
        SearchFor = "home",
        SearchSort = "##SCORE##",
        Path = "/%",
        ClassNames = "",
        CurrentCulture = "EN-US",
        DefaultCulture = CultureHelper.EnglishCulture.IetfLanguageTag,
        CombineWithDefaultCulture = false,
        CheckPermissions = false,
        SearchInAttachments = false,
        User = (UserInfo)MembershipContext.AuthenticatedUser,
        SearchIndexes = index.IndexName,
        StartingPosition = 0,
        DisplayResults = 100,
        NumberOfProcessedResults = 100,
        NumberOfResults = 0,
        AttachmentWhere = String.Empty,
        AttachmentOrderBy = String.Empty,
    };

    // Performs the search and saves the results into a DataSet
    System.Data.DataSet results = SearchHelper.Search(parameters);

    if (parameters.NumberOfResults > 0)
    {
        // The search found at least one matching result, and you can handle the results
    }
}
More details here.
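If the goal really is the raw terms rather than search results: Kentico's smart search indexes are Lucene.Net indexes stored on disk, so you could open the index folder directly with Lucene.Net and enumerate its terms. This is an assumption-heavy sketch, not an official Kentico API; the index path and the Lucene.Net 3.x calls should be verified against your installation:

```csharp
// Hypothetical: read raw terms straight out of the Lucene index files.
// The path follows Kentico's usual App_Data layout for smart search indexes.
var directory = Lucene.Net.Store.FSDirectory.Open(
    @"C:\inetpub\wwwroot\CMS\App_Data\CMSModules\SmartSearch\NewIndex");

using (var reader = Lucene.Net.Index.IndexReader.Open(directory, readOnly: true))
{
    var terms = reader.Terms();
    while (terms.Next())
    {
        var term = terms.Term;
        // term.Field is the indexed field name (e.g. a page type column),
        // term.Text is the stored term itself
        System.Diagnostics.Debug.WriteLine(term.Field + ": " + term.Text);
    }
}
```

From that enumeration you could serialize field/term pairs to the JSON file Typeahead expects, covering every indexed field rather than just title and content.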
Roman's answer in the comments doesn't look like it would work, and thinking about it some more, we were perhaps overcomplicating things and not asking the right question.
Instead of trying to replicate the search index in JSON for the Twitter Typeahead autocomplete, a better way may be to keep it simple and just use the search results' title and content fields.
Then, to get additional fields (such as project location) into the content field of the search results, we can customize the index build code (via CMSLoaderAttribute) to append the extra fields to the SearchDocument's Content field.

Umbraco Get properties of a Blog Post

I have a blog repository that contains a list of blog posts. I am accessing all blog posts with this code:
var contentType = ApplicationContext.Services.ContentTypeService.GetContentType("BlogPost");
var blogPostList = ApplicationContext.Services.ContentService.GetContentOfContentType(contentType.Id);
Now I am accessing custom data type properties by using
foreach (var blog in blogPostList)
{
    foreach (var property in blog.Properties)
    {
        // property.Value is available here
    }
}
Now I can access the properties and get their values, but for a few properties I get a JSON string, which is of no use to me because I would need to create models to parse it properly.
Is there any way to use GetPropertyValue in this situation, or some other way to get properly formatted JSON?
Stop right there!!! If this is for the front end DO NOT use the ContentService. This is specifically for the back end of the site, and is VERY database intensive.
For all front end code, you should be using IPublishedContent. This queries the content cache, and is MUCH faster, as there is no database access.
If you have an UmbracoHelper (which you will in Umbraco controllers), you have access to it out of the box, otherwise you need to spin up a helper. Here's an example:
var umbracoHelper = new Umbraco.Web.UmbracoHelper(Umbraco.Web.UmbracoContext.Current);
var blogPosts = umbracoHelper.TypedContentAtXPath("//BlogPost[@isDoc]");
You will then have a list of pages, and can use the standard .GetPropertyValue methods to get the values out of the fields. Note, the XPath query here isn't super efficient, you could make it a lot more specific if you wanted to.
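On IPublishedContent, the typed GetPropertyValue&lt;T&gt; extension runs any registered property value converters, so JSON-backed editors can come back as usable objects instead of raw strings. A short sketch, with example property aliases that you would replace with your own:

```csharp
// Aliases ("title", "relatedLinks") are examples, not from the original question
foreach (var post in blogPosts)
{
    var title = post.GetPropertyValue<string>("title");

    // Editors that ship a property value converter (or one you install)
    // return a typed object here rather than a JSON string
    var links = post.GetPropertyValue<Newtonsoft.Json.Linq.JArray>("relatedLinks");
}
```

If a property still comes back as a raw JSON string, that usually means no value converter is registered for that editor, and deserializing with Json.NET yourself is the fallback.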

How are IQueryables dealt with in ASP.NET MVC views?

I have some tables in a MySQL database that represent records from a sensor. One of the features of the system I'm developing is to display these records to the web user, so I used an ADO.NET Entity Data Model to create an ORM, used LINQ to SQL to get the data from the database, and stored it in a ViewModel I designed, so I can display it using the MVCContrib Grid Helper:
public IQueryable<TrendSignalRecord> GetTrends()
{
    var dataContext = new SmgerEntities();

    var trendSignalRecords = from e in dataContext.TrendSignalRecords
                             select e;

    return trendSignalRecords;
}

public IQueryable<TrendRecordViewModel> GetTrendsProjected()
{
    var projectedTrendRecords = from t in GetTrends()
                                select new TrendRecordViewModel
                                {
                                    TrendID = t.ID,
                                    TrendName = t.TrendSignalSetting.Name,
                                    GeneratingUnitID = t.TrendSignalSetting.TrendSetting.GeneratingUnit_ID,
                                    //{...}
                                    Unit = t.TrendSignalSetting.Unit
                                };

    return projectedTrendRecords;
}
I call the GetTrendsProjected method and then use LINQ to select only the records I want. It works fine in my development scenario, but when I test it in a real scenario, where the number of records is far greater (around a million), it stops working.
I put in some debug messages to test it, and everything works fine until it reaches the return View() statement, where it simply stops, throwing a MySqlException: Timeout expired. That left me wondering whether the data I send to the page is retrieved lazily by the page itself (i.e. the query only runs against the database when the view enumerates the displayed items).
All of my other pages use the same set of tools: MVCContrib Grid Helper, ADO.NET, Linq to SQL, MySQL, and everything else works alright.
You absolutely should paginate your data set before executing your query if you have millions of records. This could be done using the .Skip and .Take extension methods. And those should be called before running any query against your database.
Trying to fetch millions of records from a database without pagination would very likely cause a timeout at best.
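A minimal sketch of that pagination, applied to the GetTrendsProjected method from the question (the page-size values are illustrative):

```csharp
const int pageSize = 50;
int page = 1; // current page number, e.g. taken from a query string parameter

var pagedTrends = GetTrendsProjected()
    .OrderBy(t => t.TrendID)         // a stable ordering is required before Skip/Take
    .Skip((page - 1) * pageSize)
    .Take(pageSize)
    .ToList();                       // forces execution here as a single paged query
```

Because Skip/Take are composed onto the IQueryable before enumeration, the provider translates them into a LIMIT/OFFSET query, and calling ToList() in the controller means the view receives a small in-memory list instead of a deferred query over a million rows.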
Well, assuming the information in this blog is correct, the .AsPagination method requires you to sort your data by a particular column. It's possible that doing an OrderBy on a table with millions of records is simply a time-consuming operation and times out.