I have to print a large number of rows (say 100,000) in an HTML page. The rows come from a socket connection delivering roughly 500-700 rows per second, and all of them need to be appended to the HTML body. ng-repeat only manages to append the rows while the socket connection is paused, and even then it takes at least 4-5 seconds to append just 2,000 records.
I can't use pagination or lazy loading; everything needs to be on the same page, all loaded at once.
Need help with it.
ng-repeat is always slow when you have a lot of data to display. I can suggest a few options.
1- Go for a client-side infinite scroll with ng-repeat; you can check this for more details.
2- Use $interval to append batches of records to the collection that is bound to the ng-repeat (see the sketch after this list).
3- Replace the ng-repeat with a grid control that supports virtualization.
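Here is a rough sketch of option 2, assuming AngularJS 1.x and a WebSocket-like socket object (the socket variable and the 500 ms flush interval are illustrative). Incoming rows are buffered outside the scope and flushed in batches, so Angular runs one digest per flush instead of one per row:

```javascript
// Template assumed to contain something like:
//   <div ng-repeat="row in rows track by $index">{{ row }}</div>
angular.module('app', []).controller('RowsCtrl', function ($scope, $interval) {
  var buffer = [];      // rows arriving from the socket are parked here first
  $scope.rows = [];     // only this array is bound to ng-repeat

  // the socket handler only fills the buffer and never touches $scope,
  // so no digest cycle is triggered per incoming row
  socket.onmessage = function (msg) {
    buffer.push(JSON.parse(msg.data));
  };

  // every 500 ms, move whatever has accumulated into the bound array;
  // one digest per batch keeps the UI responsive while rows stream in
  $interval(function () {
    if (buffer.length) {
      $scope.rows = $scope.rows.concat(buffer.splice(0, buffer.length));
    }
  }, 500);
});
```

Even with batching, 100,000 live DOM nodes will eventually strain the browser, which is why option 3 (virtualization) is worth considering alongside this.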
I have a question about best practice for fetching large JSON data from a React application.
Let's say we fetch from the URL foo.com, and we also have from/to variables like 0 to 6 or whatever we put there. The response JSON contains about 10,000 elements.
I need all of these rows because I need to sort across all of them, but rendering them all together takes too much time.
Is there a good practice for this?
Fetch them all and the client waits (what I'm using at the moment)
Fetch n items at a time until everything is fetched (better speed?)
Fetch n items, show the already fetched data, and fetch the remaining values in the background (in this case only the sorting has to wait for the last fetch, but the client can already look around the page)
Fetching 10K items at once is not a good idea in most cases. Neither is rendering 10K items on the screen. The trick is to find a sweet spot in the UX that allows you to NOT render/fetch this many items.
You need to ask yourself what the most important thing is for the user on this page.
Do they want to see the topmost records?
Do they mostly care about seeing and eyeballing the items?
Are they searching for a particular item?
If you can't change the API to do the sorting for you on the server, then here is what I would do (a rough sketch follows the list of benefits below):
load the first 100 items, sort them and then render them on the screen
kick off the fetch for the next 100 items and show a loading message to the user
once the second page is fully fetched, sort all 200 and indicate on the UI that "there is new data, click here to see", but still only render the top 100
repeat until you have all the items, but give the user the ability to cancel at any time
The benefits are:
√ the user sees data as soon as the first page is there
√ you always render 100 items with a "show more" button, so your DOM stays lean
√ they can cancel at any time if they have found what they were looking for
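A rough JavaScript sketch of that loop, assuming the endpoint accepts from/to query parameters as in the question; compareItems, showTop100 and the URL are illustrative placeholders:

```javascript
// placeholder comparator; replace with whatever the app actually sorts by
function compareItems(a, b) {
  return a.name.localeCompare(b.name);
}

async function fetchAllPaged({ url, pageSize = 100, onPage, signal }) {
  let from = 0;
  let all = [];
  for (;;) {
    const res = await fetch(`${url}?from=${from}&to=${from + pageSize}`, { signal });
    const page = await res.json();
    if (page.length === 0) break;               // nothing left to fetch
    all = all.concat(page).sort(compareItems);  // keep the full set sorted
    onPage(all);                                // UI can show "there is new data, click here to see"
    from += pageSize;
  }
  return all;
}

// Usage: render only the top 100 after each page and let the user cancel.
const controller = new AbortController();
fetchAllPaged({
  url: 'https://foo.com/items',
  onPage: (items) => showTop100(items.slice(0, 100)),  // showTop100 is hypothetical
  signal: controller.signal,
}).catch(() => { /* an AbortError lands here when the user cancels */ });
// wire controller.abort() to a "cancel" button
```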
I'm hoping this will be a rather simple question to answer, as I'm not looking for any specific code. I have a table on a classic ASP page populated from a SQL Server database. I've just set the table up so that each row is clickable and takes you to a page to edit the data in that row. My question is this: would I be better off trying to reuse the recordset that populated the table, or should I reconnect to the database and pull just the record I want to edit?
As always: it depends. It depends on what you need to edit about the record. It depends on how far apart your DB and site are from each other. It depends on which machine, if the DB and site are on separate machines, is more powerful.
That being said, you should make a new call for that specific record. The reason mainly being because of a specification you made in your question:
...and takes you to a page to edit the data in the row
You should not try to pass a record set between pages. There are a few reasons for this:
Only collect what you need
Make sure data is fresh
Consider how your program will scale
On point 1, there are two ways to look at this. One is that you are passing the entire record set between pages when you only need one record; there are few situations where another DB call would cost more than that. The other is that you are only passing one record, which would make me question your design: why does this record set carry every field related to a record? You are selecting far too much for a simple result list. And if the record really is that small, why do you need a new page at all? Why not just reveal an edit template for the item inline if it is that minimal?
On point 2, consider the following scenario. You are discussing with a coworker how you need to change a customer's record. You pull up this result set in an application, but then nature calls and you step away from your desk. The coworker gets called by the customer and asked why the record is not updated yet. To placate the customer, your coworker makes the changes. Now you are working from a stale record set and may overwrite the additional changes your coworker made while you were away. This all happens because you never refresh the record set; you just pass the old one from page to page.
On point 3, we can look back at point 1 a bit. Let us say that you are passing 5 fields now. You then decide that you need a comments field attached to one of your existing fields. Do you intend to pass 2,000 characters of that comment field to the next page? What if each of the 5 fields needs a comment? Do you intend to pass 10,000 characters for a properly paged record set of 10? And if you do not page the record set, do you pass that much for a full 126 records?
There are more reasons too. Will you be able to keep your records secure passing them this way? Will this affect your users' experience if they have a weak computer and cannot build that large a POST request quickly? Generally it is better to keep only what you need, and in most situations your result set will not contain everything you need to edit.
Background
I have an ASP.net MVC 4 web application written in VB and Razor, and using MySQL as its data source.
I need a view to display a table containing an ever-expanding amount of data (potentially tens of thousands of rows, and maybe more).
In order to continue further development, I have temporarily implemented a basic data table where all rows are written to the page and then handled by the data table thereafter. This works fine with up to a few hundred rows, but the more rows there are, the slower it gets and the longer the page takes to load!
Question
How do I implement the data table in such a way that data is retrieved and displayed only when needed so as to keep consistent page loading times, but also keep the searching and sorting functionality?
Additional notes
My guess is that the data-table must call something server side to pass only the required data, but I have no idea where to begin with this.
Paging
Only display a certain number of rows per page.
You can use .Take(100) to retrieve only the first 100 rows, and .Skip(100).Take(100) to get the second 100 rows, and so on.
Filtering, sorting and searching should be done server-side. Keep in mind that you should FIRST sort/filter/search and only then apply .Take(100).
The solution I found was to use an Ajax source for the data table. I added a GET method to my controller that returns the JSON array required to populate the DataTable.
Here is the website I found which provided the solution:
http://www.codeproject.com/Articles/177335/Refreshing-content-of-the-table-using-AJAX-in-ASP
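For the client side, this is commonly wired up with jQuery DataTables server-side processing. A minimal sketch, assuming jQuery and DataTables are loaded and that the MVC controller exposes an action at /Records/GetPage (an illustrative name) returning the JSON shape DataTables expects (draw, recordsTotal, recordsFiltered, data):

```javascript
$('#recordsTable').DataTable({
  processing: true,   // show a "processing" indicator while the server works
  serverSide: true,   // paging, sorting and searching are delegated to the server
  ajax: {
    url: '/Records/GetPage',
    type: 'GET'
  },
  pageLength: 100     // only 100 rows are ever rendered in the DOM at once
});
```

With serverSide enabled, DataTables sends the current page, sort column and search term on every request, so the controller can apply the sort/filter first and then Skip/Take just the slice it needs.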
Let's suppose I have some records in MySQL and I want to output the records gradually. For example:
At the beginning, 2-4 records are output to the page; then, as I scroll down, new records are appended to the page while the previous records at the top disappear. I want to replace classic pagination this way. Is this possible to do?
Also, can you tell me some jQuery methods that would be useful for this task, and some tricks to combine them?
I think I need to use Ajax here, yes?
This is the first time I am doing something like this.
Thanks!
You will certainly need Ajax for this. I would recommend listening for when the user has scrolled past a certain point using $(window).scrollTop() and issuing your Ajax request from there.
You will want to keep track of how many records you have already loaded (for the LIMIT clauses in future queries) and of the Y position at which you want scrolling to trigger the data load.
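A minimal sketch, assuming jQuery, a container div with id="records", and a server script /records.php that accepts offset and limit parameters and returns the next rows as ready-to-append HTML (all of those names are illustrative):

```javascript
var loaded = 0;        // how many records have been fetched so far
var limit = 4;         // rows per batch, in the 2-4 range from the question
var loading = false;   // guard so overlapping requests are not fired

$(window).on('scroll', function () {
  // trigger when the user is within 200px of the bottom of the document
  var nearBottom = $(window).scrollTop() + $(window).height()
                   > $(document).height() - 200;
  if (!loading && nearBottom) {
    loading = true;
    $.get('/records.php', { offset: loaded, limit: limit }, function (html) {
      $('#records').append(html);
      // remove rows that have scrolled far past the top so the DOM stays small,
      // which is what replaces classic pagination here
      $('#records').children().slice(0, -limit * 3).remove();
      loaded += limit;
      loading = false;
    });
  }
});
```

On the server, the offset and limit values map straight onto a LIMIT offset, count clause in the MySQL query.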
I have a website and want to display search results dynamically, meaning that as the user interacts with controls and selects options, the search results are populated in real time - i.e. the user doesn't need to click the search button.
The data is stored in a MySQL relational database.
Now I know this is likely to lead to a large server load once the user base grows beyond a certain size - are there any ways to mitigate this?
Max.
One way to mitigate the server load would be to introduce a slight delay before posting back to the server after each control is changed. If you give the user 3 seconds or so to fill in an additional field, they may have time to add another search parameter, which could eliminate an extraneous query or two.
Also, I always like to set a maximum number of results returned.
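A minimal sketch of that delay in plain JavaScript, assuming an input with id="search", a results container with id="results", and a server script /search.php that takes a q parameter and enforces its own LIMIT (all names are illustrative):

```javascript
let debounceTimer = null;

document.getElementById('search').addEventListener('input', (event) => {
  const query = event.target.value;
  clearTimeout(debounceTimer);
  // wait for a short pause in typing (about 3 seconds, as suggested above)
  // so that every keystroke does not produce its own database query
  debounceTimer = setTimeout(async () => {
    const res = await fetch('/search.php?q=' + encodeURIComponent(query) + '&limit=50');
    document.getElementById('results').innerHTML = await res.text();
  }, 3000);
});
```

The limit parameter here mirrors the second suggestion: the server should cap the number of rows it returns regardless of what the client asks for.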