I have created a wholesale order form in the form of a large HTML table with lots of number inputs. I've noticed that switching between number inputs and typing values into them is very slow in this table (i.e. when I click in an input, it takes time for the cursor to show up; when I type in an input, it takes time for the character to show up). Is there any way for me to remove the lag without paginating the table rows?
You can view and play around with the table here (use guest password "braese").
A screenshot of a Chrome Dev Tools performance recording for clicking inside a single input (I'm not really sure what to make of this):
I did some more digging and testing, and it turns out the lag (update layer tree) when clicking/typing in inputs is only an issue in Chrome. This answer on another question points to issues with Chrome since version 46. It's a shame that such a popular browser has allowed an issue like this to go on for 2+ years. We will have to paginate our order form since Chrome is so popular with our client's customers.
Wow, my Chrome almost crashed :)
It is your styles.
You are forcing a new compositing layer to be created every time you apply a translate transform.
Check out the Layers panel in Chrome Dev Tools.
Test whether it runs better with only the table (no transforms) :)
Also, order your styles and scripts: styles first, then scripts.
Cheers!
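A quick way to run that "only the table" test, assuming the transforms are applied as inline styles and the table has an id along the lines of order-table (both of those are assumptions, not details from the page above):

```javascript
// Hypothetical diagnostic: strip inline translate transforms inside the table
// so Chrome stops promoting those elements to separate compositing layers.
// Transforms coming from a stylesheet would have to be disabled there instead.
document.querySelectorAll('#order-table [style*="translate"]').forEach(el => {
  el.style.transform = 'none';
});
// Then retry clicking/typing in the inputs and watch whether the
// "Update Layer Tree" time in the Performance panel drops.
```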
Related
I have an html page which shows many rows of data which the user needs to be able to edit. So, a lot of input fields.
In every browser except macOS Safari there is no problem at all. In Safari, however, there are problems: clicking on (focusing) an input sometimes takes a few seconds, and typing has some serious lag.
Having read a lot of topics, I still do not have a working solution.
The one that somewhat works is wrapping EVERY individual input inside tags. But while typing then works without any lag, focusing still takes several seconds.
This solution, mentioned by Roman:
Why does Safari Mobile have trouble handling many input fields on iOS 8
I do not use any extensions or whatever, just plain macOS Safari.
Does someone have any other solution that finally solves this strange behavior? I'm tearing my hair out here.
We have an application which displays a (complex) table of (~500) rows which can be edited via a modal dialog (PrimeFaces). Displaying the table (or doing any ajaxy things like changing some status on save buttons) takes much more time in IE9+ (5-6s) than in Chrome or Firefox (< 1s).
In Internet Explorer's profiler, I see that most of the time is taken by the jQuery.attr() method, which is called by PrimeFaces' updateFormStateInput().
I really don't know how to go deeper in identifying the cause of this problem or if this bad performance is considered normal for IE.
Switching to Chrome/Firefox is not an option, as our users have other applications that only work with IE (SharePoint).
So, is there anything I can do to solve or identify the problem? (Other than removing each component one by one and seeing if that improves performance.)
The question may be a bit theoretical (did I mean rhetorical?) or fuzzy, but here it is anyway.
I'm planning to build a web page where content will be added through Ajax requests and displayed using tabs and panes. The initial view has just one tab and shows a list of links. Clicking on a link opens a new tab/pane. The user can navigate through the tabs and close them, just like in a web browser.
I could build a UI able to display an unlimited number of tabs, but that implies adding DOM content to as many -- and possibly a lot of -- panes. More likely, I'll set a limit to how many tabs can be open simultaneously. But what should that limit be? What rule would you follow? What's your experience in how much content a DOM document can hold without impact on performance?
There is no way to answer your question. I, for example, have Chrome and Firefox running side by side right now. Together the browsers need 9.85 GB of shared memory (so some of that RAM is shared between them all; I only have 8 GB of RAM, and about 2 GB is actually in use).
There is no JavaScript API to tell when memory gets tight; the only indication is that the machine is starting to swap or that the browser is crashing.
So you have the following solutions:
Limit the user to something that is too small. Good for you, bad for power users like me.
Set the limit too big. Bad for you, since users will start complaining how your stupid site makes their browser crash but good for me.
Instead of setting a limit, clear the panes which are not visible. Save the state of the pane somewhere (server, local storage) and restore it when the user opens it. Quite some effort but it would solve your problem.
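A rough sketch of that last option, assuming each pane's content can simply be serialized as HTML; the key prefix and function names are made up for illustration:

```javascript
// Hypothetical sketch: keep only the visible pane in the DOM, park the rest in
// localStorage, and rebuild a pane's markup when its tab is reopened.
function closePane(tabId, paneElement) {
  // Persist whatever state you need; here we just keep the inner HTML.
  localStorage.setItem('pane:' + tabId, paneElement.innerHTML);
  paneElement.remove(); // free the DOM nodes
}

function openPane(tabId, container) {
  const saved = localStorage.getItem('pane:' + tabId);
  const pane = document.createElement('div');
  pane.innerHTML = saved !== null ? saved : ''; // nothing saved: re-fetch via Ajax instead
  container.appendChild(pane);
  return pane;
}
```

Bear in mind that localStorage only holds a few megabytes per origin, so for very large panes the server-side variant is the safer place to park state.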
I am developing a jQtouch app and each request done via ajax creates a new div in the document for the loaded content. Only a single div is shown at any one time.
How many divs can I have before the app starts getting unresponsive and slow?
Anyone have any ideas on this?
EDIT: It's an iPad app running in Safari, and it would be fewer than 1,000 divs with very basic content.
I've had tens of thousands, maybe even a hundred thousand divs, on screen at once.
Performance is either fine or bad, depending largely on one thing: are the divs parsed from HTML, or generated dynamically in JavaScript?
Parsed from HTML means you have a LARGE HTML source, and that can make browsers hang. Generating them in JS is surprisingly fast, even on Internet Explorer, which is the slowest of all the browsers at JS.
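A small sketch of the "generated in JS" approach, batching everything through a DocumentFragment so the live document is only touched once (the count and class name are arbitrary):

```javascript
// Hypothetical sketch: build a large number of divs in script instead of
// shipping them in the HTML source. The fragment is assembled off-DOM and
// appended in one go, so layout happens once rather than per div.
const fragment = document.createDocumentFragment();

for (let i = 0; i < 10000; i++) {       // arbitrary count
  const div = document.createElement('div');
  div.className = 'row';                // assumed class name
  div.textContent = 'Item ' + i;
  fragment.appendChild(div);
}

document.body.appendChild(fragment);    // single insertion into the live DOM
```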
To be honest, if you really need an absolute answer to this question, then you might want to reconsider your design.
No answer given here will be right, as it depends upon many factors that are specific to your application. E.g. heavy vs. little CSS use, size of the divs, amount of actual graphics rendering required per div, target browser/platform, number of DOM event listeners etc..
Just because you can doesn't mean that you should! :-)
As others have said, there's really no answer.
However, in this talk about the Google Maps API version 3, the speaker brings up the number ten thousand several times, as a basic threshold for browser unhappiness.
http://code.google.com/apis/maps/documentation/javascript/
Without defining a particular environment, it's not possible to answer your question.
And even then, anything anyone tells you is just a guess. You need to do your own testing on real-world configurations with different browsers and hardware. You'll also need to establish some performance benchmarks to decide what "too slow" even means.
I've been able to add several thousand divs without a problem. Depends on what you'll be doing afterwards, of course, and the memory on the client machine. Everyone else is right about that.
As Harpo said, 10K is probably a good ceiling. At one time, I noticed speed problems starting at about 4K divs, but hardware has improved since then.
And, as Neil N said, adding the divs via scripting is better than having a huge HTML source.
And, to answer Harpo's comment, one way to "break it up" so the JS doesn't lock the page and produce a "page is running slowly" error is to have each "add a div" routine finish by setting a timer, with the timer in turn calling your "add a div" function again (roughly the sketch below).
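Roughly what that timer loop looks like; the chunk size and the placeholder content are arbitrary:

```javascript
// Hypothetical sketch of chunked insertion: add divs in small batches and
// yield back to the browser between batches so the page never appears frozen.
function addDivsInChunks(total, chunkSize, container) {
  let added = 0;

  function addChunk() {
    const stop = Math.min(added + chunkSize, total);
    for (; added < stop; added++) {
      const div = document.createElement('div');
      div.textContent = 'Row ' + added;  // placeholder content
      container.appendChild(div);
    }
    if (added < total) {
      setTimeout(addChunk, 0);           // let the browser breathe, then continue
    }
  }

  addChunk();
}

// Example: 10,000 divs, 200 at a time, into the body.
addDivsInChunks(10000, 200, document.body);
```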
Now, MY question is: is it possible to "paint" so that you don't need to add thousands of divs? This can be done with the canvas tag with some browsers, but I don't think it's possible with VML (the excanvas project) on IE. Or is it? I think VML "paints" by adding new elements to the DOM, at which point you may as well use DIVs, unless it's a simple shape.
Is it possible to alter the source of an image via scripting? (the image in the DOM, of course -- not the original image on the server.)
I've recently been developing an application using tables with a large number (hundreds) of rows. The rows contain tabular data, and each row contains two drop down lists as well as a link which displays hidden table rows directly below it.
I am currently experiencing lengthy delays when attempting to select a value from the drop down, and when displaying the hidden table rows.
Is the table the source of my problem here?
From what I gather, HTML tables can appear slow to render if widths are not explicitly stated. If they aren't, the browser has to finish loading the contents of the cells before it can calculate the correct widths.
MSDN has some information here on their "Building High Performance HTML Pages" article that may help; they suggest the following (regarding tables specifically):
Set the table-layout CSS attribute to fixed on the table.
Explicitly define col objects for each column.
Set the WIDTH attribute on each col.
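If it helps, here is roughly what those three suggestions look like when applied from script; the table id and the column widths are placeholders, and it assumes the table has no caption element (a colgroup would otherwise belong right after it):

```javascript
// Hypothetical sketch of the MSDN advice: fixed table layout plus explicit
// <col> widths, so the browser can size columns without reading every cell.
const table = document.getElementById('data-table'); // assumed id
table.style.tableLayout = 'fixed';

const colgroup = document.createElement('colgroup');
['120px', '80px', '80px', '60px'].forEach(width => { // placeholder widths
  const col = document.createElement('col');
  col.style.width = width;
  colgroup.appendChild(col);
});
table.insertBefore(colgroup, table.firstChild);
```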
The problem is more than likely the rendering of the controls (100 select boxes) rather than the table layout. How many items are there in the drop down list? Does it perform the same way in all browsers on different operating systems?
Sorry to be so vague, but generally tables aren't slow to render; they're mostly looked down upon because of their lack of accessibility (and much of that idealism comes down to the fact that screen readers don't like them).
I can state from experience that it's almost certainly the dropdown lists that are causing the slowness. Tables with hundreds of rows can render almost instantaneously if they only contain text, but add a dropdown list to each row with just a few dozen options, and now even 50 rows will take a noticeable amount of time.
Try to design around the need for dropdowns in the table rows - if this is a form (and why else would you have dropdowns?), have the user select the row they wish to edit, and then put the editable controls only in that row, either via AJAX if you're into that sort of thing, or a more traditional "detail view" approach.
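A bare-bones sketch of that "controls only in the selected row" idea done purely client-side; the table class, cell class, and option list are all assumptions:

```javascript
// Hypothetical sketch: keep every row as plain text and swap a <select> into
// a cell only when the user clicks it, so hundreds of dropdowns never exist
// in the DOM at the same time.
const STATUS_OPTIONS = ['Pending', 'Shipped', 'Cancelled']; // assumed options

document.querySelector('table.data').addEventListener('click', event => {
  const cell = event.target.closest('td.editable'); // assumed class on editable cells
  if (!cell || cell.querySelector('select')) return;

  const current = cell.textContent.trim();
  const select = document.createElement('select');
  STATUS_OPTIONS.forEach(value => {
    select.appendChild(new Option(value, value, false, value === current));
  });

  cell.textContent = '';
  cell.appendChild(select);
  select.focus();

  // Turn the cell back into plain text once a choice has been made.
  select.addEventListener('change', () => { cell.textContent = select.value; });
});
```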
I've not noticed any slowdowns when displaying static tables with a few thousand entries, but adding effects like sorting or dynamic zebra-striping can quickly bog down even a modern browser on a fast computer. Make sure you're not running any JavaScript iterations across the entire table.
Whatever you come up with, test the approach in a couple of browsers on a couple of platforms, with some slower machines. I once had an app that was speedy everywhere except in Safari for Macintosh. It turned out to be something about the way it rendered the dropdowns. There's just no substitute for experimentation. Uhm, I meant testing.
If you know how wide each column should be, try specifying table-layout: fixed in the CSS for the table and see if that makes any difference (it should stop the browser trying to re-render the whole table just because you've toggled visibility on a few rows.)
IE does not handle very large DOM manipulations well at all.
If you have two selects and a link in each row, and let's assume each select has 5 options, that's (((5 options + 1 select) * 2) + 1 link + 3 tds + 1 tr), or at least 17 DOM nodes per row. Plus, if I'm not mistaken, IE treats each text item as its own DOM node, so add another 10 nodes for the drop down text and 1 for the link text: 28 DOM nodes per row.
I agree with the others that it's not only your table that is causing the slow loading but the population of the dropdown lists as well.
If it helps, you can try paginating your table. JavaScript frameworks such as jQuery or Ext JS can handle the pagination and AJAX.
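For what it's worth, the client-side half of that can be as small as the sketch below; the table id and page size are just placeholders:

```javascript
// Hypothetical sketch of simple client-side pagination: only the rows that
// belong to the current page are left visible.
function showPage(table, page, rowsPerPage) {
  const rows = table.querySelectorAll('tbody tr');
  const start = page * rowsPerPage;
  rows.forEach((row, index) => {
    row.style.display = (index >= start && index < start + rowsPerPage) ? '' : 'none';
  });
}

// Example: show the first 50 rows of an assumed #orders table.
showPage(document.getElementById('orders'), 0, 50);
```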