How to split a large JSON fetch in React

I have a question about best practices for fetching large JSON data from a React application.
Let's say we fetch from the URL foo.com and we also have from/to variables, like 0 to 6 or whatever we put there, and the response JSON has about 10,000 elements.
I need all of these rows because I need to sort across all of them, but rendering them all together takes too much time.
Is there a good practice for this?
1. Fetch them all while the client waits (what I use at the moment)
2. Fetch n items at a time until everything is fetched (better speed?)
3. Fetch n items, show the already fetched data, and fetch the remaining values in the background (in this case only the sorting has to wait for the last fetch, but the client can already look around the page)

Fetching 10K items at once is not a good idea in most cases, and neither is rendering 10K items on the screen. The trick is to find a sweet spot in the UX that allows you NOT to render/fetch that many items.
You need to ask yourself what is the most important thing for the user on this page.
Do they want to see the topmost "blah" record?
Do they mostly care about seeing and eyeballing the items?
Are they searching for a particular item?
If you can't change the API to do the sorting for you on the server, then here is what I would do (a rough sketch follows after the list):
1. Load the first 100 items, sort them, and render them on the screen.
2. Kick off the fetch for the next 100 items and show a loading message to the user.
3. Once the second page is fully fetched, sort all 200 and, before rendering, indicate in the UI that "there is new data, click here to see it", but still only render the top 100.
4. Repeat until you have all the items, but give the user the ability to cancel at any time.
The benefits are:
√ the user sees data as soon as the first page arrives
√ you always render 100 items with a "show more" button, so your DOM stays lean
√ they can cancel at any time once they have found what they were looking for
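For illustration, here is a rough sketch of those steps in React. The /items path, the page size, sorting by name, and the Row shape are assumptions; the from/to query parameters follow the question.

```typescript
import { useEffect, useState } from 'react';

type Row = { id: number; name: string };   // assumed shape of one element
const PAGE_SIZE = 100;

export function RowList() {
  const [rows, setRows] = useState<Row[]>([]);        // everything fetched so far, kept sorted
  const [visible, setVisible] = useState(PAGE_SIZE);  // how many rows we actually render
  const [done, setDone] = useState(false);

  useEffect(() => {
    let cancelled = false;   // unmounting (or a cancel button) stops the background loop
    (async () => {
      for (let from = 0; !cancelled; from += PAGE_SIZE) {
        const res = await fetch(`https://foo.com/items?from=${from}&to=${from + PAGE_SIZE}`);
        const page: Row[] = await res.json();
        if (cancelled) return;
        if (page.length > 0) {
          // merge and re-sort; only the `visible` slice below is rendered
          setRows(prev => [...prev, ...page].sort((a, b) => a.name.localeCompare(b.name)));
        }
        if (page.length < PAGE_SIZE) {   // short or empty page: nothing left to fetch
          setDone(true);
          return;
        }
      }
    })();
    return () => { cancelled = true; };
  }, []);

  return (
    <div>
      {!done && <p>Loading more data…</p>}
      <ul>{rows.slice(0, visible).map(r => <li key={r.id}>{r.name}</li>)}</ul>
      {visible < rows.length && (
        <button onClick={() => setVisible(v => v + PAGE_SIZE)}>Show more</button>
      )}
    </div>
  );
}
```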

Related

Table data with sort, search query and pagination passed to a different component in Angular

I will try to explain this as simply as possible.
There is a table that initially gets the first 30 of n rows and has sorting and search using mat-table. Even when sorted, only the first 30 rows are sent, and when the user goes to the next page the next 30 rows are fetched from the backend, with a new request each time.
Each row has a button that takes you to another component showing detailed data about that specific row.
This component has a previous and next feature that shows the detailed view of the next or previous row in the same order as displayed in the table (sorted, search result, page number).
Currently the table rows are built again in the backend (Django) with all the sort, search and other parameters, then the current row is located and the next and previous rows are sent back (taking a minimum of 5 hits on the DB).
Hence it is very slow.
In the frontend I can only pass the data of the current page, which breaks next/previous across page boundaries.
How do I properly tackle this?
Normal search UIs don't work on 30 rows at a time. Instead, they first search the entire dataset, then 'paginate' the results. (Or is that what you intended to say?)
There are details that can let the processing work fast, and there may be details that prevent it. Please go into detail about the table structure and the possible search criteria.
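For what it's worth, here is a minimal sketch of the "search/sort the whole dataset, then paginate into it" idea applied to the previous/next buttons. The /api/rows endpoint and parameter names are made up for illustration and are not part of the question's API.

```typescript
// Ask the backend for the row at a given position under the SAME sort/search
// state the table used; "next"/"previous" then become position +/- 1, and the
// backend can answer with a single ORDER BY ... LIMIT 1 OFFSET query.
interface TableState {
  sortField: string;
  sortDir: 'asc' | 'desc';
  search: string;
}

async function fetchRowAt(position: number, state: TableState): Promise<unknown> {
  const params = new URLSearchParams({
    sort: state.sortField,
    dir: state.sortDir,
    search: state.search,
    offset: String(position),   // absolute position in the sorted/filtered result
    limit: '1',
  });
  const res = await fetch(`/api/rows?${params}`);   // hypothetical endpoint
  return res.json();
}
```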

ng-repeat not working with large set of data

I have to print (say) 100,000 rows in an HTML page. These rows come from a socket connection delivering around 500-700 rows per second, and all of them need to be appended to the HTML body. ng-repeat appends the rows only when the socket connection is paused, and even then it takes a minimum of 4-5 seconds to append just 2,000 records.
I can't use pagination or lazy loading; everything needs to be on the same page, all loaded at once.
Need help with it.
ng-repeat is always slow when you have a lot of data to display. I can suggest a few options (a sketch of option 2 follows after the list):
1- Go for a client-side infinite scroll option with ng-repeat; you can check this for more details.
2- Use $interval to append records in chunks to the collection that is bound to ng-repeat.
3- Try replacing ng-repeat with a grid control that supports virtualization.
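Here is a minimal sketch of option 2, assuming AngularJS is already loaded on the page; the module/controller names, chunk size and interval are placeholders, and the socket wiring is not shown.

```typescript
declare const angular: any;   // AngularJS global, assumed to be loaded via a script tag

angular.module('app').controller('RowsCtrl', ['$scope', '$interval',
  function ($scope: any, $interval: any) {
    const buffer: any[] = [];   // rows arriving from the socket are parked here
    $scope.rows = [];           // ng-repeat binds to this array
    const CHUNK = 200;

    // Called by the socket handler (not shown) for each incoming batch of rows.
    $scope.onSocketRows = (incoming: any[]) => buffer.push(...incoming);

    const timer = $interval(() => {
      if ($scope.rows.length >= buffer.length) return;   // nothing new to show yet
      const next = buffer.slice($scope.rows.length, $scope.rows.length + CHUNK);
      $scope.rows.push(...next);   // only CHUNK rows enter the DOM per tick
    }, 100);

    $scope.$on('$destroy', () => $interval.cancel(timer));
  },
]);
```

This way each digest cycle only has to render a bounded number of new rows instead of everything the socket has delivered so far.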

Complex HTML issue with a huge data set to show on one page

Person -> Item -> Work -> Component.
These are the main tables in the database.
I have to search for Items by a criterion, which gives a list. I join Person to get each item's "parent". After that there may or may not be a record in the Work table; if there is, I also join Work, and I join the list of Components if any can be found.
The original code used nested tables. It would crash the browser, because that design took too much memory, and it was extremely slow at around 150 records.
I rewrote the nested tables with divs. Performance got a huge boost, but it starts to get slow again because of the buttons and the design. (Before, it couldn't show 200 records even after waiting 10 minutes; now it displays 5k rows in 23 seconds.)
Some of my benchmark logs:
SQL execution time: 0.18448090553284 seconds. Found 5624 rows.
For each result processing took: 0.29220700263977 seconds.
Writing the HTML code took:0.4107129573822 seconds.
Rows in HTML: 26551 headers + data.
Total Cells in HTML: 302491 headers and data.
Time until DOMready: 23691 milliseconds (in JavaScript)
0.18 + 0.29 + 0.41 = 0.88, so the server side takes around 1 second in total.
But when the browser actually has to show (paint) it, it takes about 20 seconds!
Please don't suggest paging! The customer (the end user) wants to see all the data on one web page, for whatever reason. No comment here.
Running on an i7 processor with 8/16 GB RAM is an accepted requirement.
Most of the data rows have a collapse/expand button.
Most of the data rows have the CRUD buttons: Add, Edit, Delete, View details.
All 4 kinds of data table have headers, and they don't match the other kinds' headers, neither in length nor in number of columns.
When I just list the data on a blank page (without the design) in a single table, it takes 2 or 3 seconds, not 20-30.
The original nested-table solution has buttons with functionality in each data row.
I would like to reuse that and not implement it again.
My idea is to go back to the original nested-table design (so I don't have to re-implement a lot of the button functionality), then display only the top-level table, collapsed, with expand buttons. An AJAX call would then fetch the second-level data, and when that is ready the 3rd level, then the 4th (roughly like the sketch below).
The users are on the intranet, or even on the same PC as the server, so maybe this is acceptable? It also avoids blocking the user interface for a long time.
How would you handle this case, when showing a next-page button with 20 records per page is not an option?
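A minimal sketch of that lazy-expand idea; the endpoint URLs, the markup and the field handling below are placeholders, not the actual application code.

```typescript
// Fetch one deeper level (Item/Work/Component) only when its parent row is expanded,
// and cache the result so re-expanding does not hit the server again.
async function expandRow(
  level: 'item' | 'work' | 'component',
  parentId: number,
  container: HTMLElement,
): Promise<void> {
  if (container.dataset.loaded === 'true') {   // already fetched once
    container.hidden = false;
    return;
  }
  const res = await fetch(`/api/${level}?parent=${parentId}`);   // hypothetical endpoint
  const rows: Array<Record<string, string>> = await res.json();
  container.innerHTML = rows
    .map(r => `<div class="row">${Object.values(r).join(' | ')}</div>`)
    .join('');
  container.dataset.loaded = 'true';
  container.hidden = false;
}
```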

ASP.NET MVC: How to quickly display a large amount of data in DataTables

Background
I have an ASP.NET MVC 4 web application written in VB and Razor, using MySQL as its data source.
I need a view to display a table containing an ever-expanding amount of data (potentially tens of thousands of rows and maybe more).
In order to continue further development, I have temporarily implemented a basic DataTable where all rows are written to the page and then handled by the DataTable thereafter. This works fine with up to a few hundred rows, but the more rows there are, the slower it gets, and page loading times plummet!
Question
How do I implement the data table in such a way that data is retrieved and displayed only when needed so as to keep consistent page loading times, but also keep the searching and sorting functionality?
Additional notes
My guess is that the data-table must call something server side to pass only the required data, but I have no idea where to begin with this.
Paging
Only display a certain number of rows per page.
You can use .Take(100) to retrieve only the first 100 rows, and .Skip(100).Take(100) to get the second 100 rows, and so on.
Filtering, sorting and searching should be done server-side. Keep in mind that you should FIRST sort/filter/search, and only then apply .Take(100).
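To illustrate that ordering (expressed here in TypeScript rather than LINQ, purely as a conceptual sketch): filter and sort the full set first, and only then slice out the requested page.

```typescript
// Conceptual equivalent of .Skip(page * pageSize).Take(pageSize).
function getPage<T>(
  rows: T[],
  page: number,                      // zero-based page index
  pageSize: number,
  matches: (row: T) => boolean,      // search/filter predicate
  compare: (a: T, b: T) => number,   // sort order
): T[] {
  return rows
    .filter(matches)                 // 1. search/filter the whole set
    .sort(compare)                   // 2. sort what is left
    .slice(page * pageSize, (page + 1) * pageSize);   // 3. only then take one page
}
```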
The solution I found for this was to use an Ajax source for the DataTable. I added a GET method to my controller and returned the required DataTables JSON array to populate the DataTable.
Here is the website where I found the solution:
http://www.codeproject.com/Articles/177335/Refreshing-content-of-the-table-using-AJAX-in-ASP
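For reference, a minimal sketch of what that Ajax-sourced DataTable setup can look like on the client; the /Orders/GetData action, the column names and the page length are placeholders, and jQuery plus the DataTables plugin are assumed to be loaded on the page.

```typescript
declare const $: any;   // jQuery with the DataTables plugin, loaded via script tags

$('#orders-table').DataTable({
  serverSide: true,      // paging, sorting and searching are delegated to the controller
  processing: true,      // show a "Processing..." indicator while the Ajax request runs
  ajax: { url: '/Orders/GetData', type: 'GET' },   // hypothetical action returning DataTables JSON
  pageLength: 100,       // only one page of rows is ever rendered in the DOM
  columns: [
    { data: 'Id' },
    { data: 'Name' },
    { data: 'Created' },
  ],
});
```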

Displaying search results dynamically as the user interacts with controls

I have a website and want to display search results dynamically, meaning that as the user interacts with controls and selects options, the search results are populated in real time, i.e. the user doesn't need to click the search button.
The data is stored in a MySQL relational database.
Now I know this is likely to lead to a large server load once the user base grows beyond a certain size. Are there any ways to mitigate this?
Max.
One way to mitigate the server load would be to introduce a slight timer delay before posting back to the server after each control is changed. If you give the user 3 seconds or so to fill in an additional field, they may have time to add another search parameter, which could eliminate an extraneous query or two (a sketch of this follows below).
Also, I always like to set a maximum number of results returned.
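A small sketch of that delay idea as a plain debounce; the /search endpoint, the 3-second delay and the 100-row cap are placeholders.

```typescript
// Fire the search only after the user has stopped changing controls for `delayMs`.
function debounce<A extends unknown[]>(fn: (...args: A) => void, delayMs: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: A): void => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

const showResults = (rows: unknown) => console.log(rows);   // placeholder renderer

// Post the current control values and cap the number of results returned.
const runSearch = (filters: Record<string, string>) => {
  fetch('/search?' + new URLSearchParams({ ...filters, limit: '100' }))
    .then(res => res.json())
    .then(showResults);
};

const runSearchDebounced = debounce(runSearch, 3000);
// e.g. call runSearchDebounced({ category: 'books' }) from each control's change handler.
```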