Adding content to the DOM: any limits?

The question may be a bit theoretical (did I mean rhetorical?) or fuzzy, but here it is anyway.
I'm planning to build a web page where content will be added through Ajax requests and displayed using tabs and panes. The initial view has just one tab and shows a list of links. Clicking on a link opens a new tab/pane. The user can navigate through the tabs and close them, just like in a web browser.
I could build a UI able to display an unlimited number of tabs, but that implies adding DOM content to as many -- and possibly a lot of -- panes. More likely, I'll set a limit on how many tabs can be open simultaneously. But what should that limit be? What rule would you follow? In your experience, how much content can a DOM document hold before performance suffers?

There is no way to answer your question. I, for example, have Chrome and Firefox running side by side right now. Together the browsers claim 9.85GB of memory, much of it shared between their processes; the machine has only 8GB of RAM, and only about 2GB of it is actually in use.
There is no JavaScript API to tell when memory gets tight; the only indication is that the machine is starting to swap or that the browser is crashing.
So you have the following solutions:
Set the limit too small. Good for you, bad for power users like me.
Set the limit too big. Bad for you, since users will start complaining that your stupid site makes their browser crash, but good for me.
Instead of setting a limit, clear the panes which are not visible. Save the state of each pane somewhere (server, local storage) and restore it when the user reopens it. Quite some effort, but it would solve your problem.
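As a rough illustration of that last option (the pane IDs, the storage key format and the loadPaneViaAjax helper are all made up here), something like this could work:

// Sketch: persist a pane's content before emptying it, restore it when reopened.
function closePane(paneId) {
  var pane = document.getElementById(paneId);
  // Save whatever state you need (here: the raw HTML) under a per-pane key.
  localStorage.setItem('pane:' + paneId, pane.innerHTML);
  // Drop the heavy content from the DOM to free memory.
  pane.innerHTML = '';
}

function openPane(paneId) {
  var pane = document.getElementById(paneId);
  var saved = localStorage.getItem('pane:' + paneId);
  if (saved !== null) {
    pane.innerHTML = saved;   // restore from local storage instead of re-fetching
  } else {
    loadPaneViaAjax(paneId);  // hypothetical helper doing the original Ajax request
  }
}

Keep in mind that localStorage is small (usually around 5MB) and synchronous, so for very large panes keeping the state on the server is the safer option.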

Related

thumbnails keep reloading on page scroll

The thumbnails on my photos page take a long time to load; then, if you scroll down the page and back up, they have somehow vanished and need to reload. I'm using an old version of Chrome on Windows XP, so I'm sure that's half my problem, but it still does this on my cell (Galaxy S5). I'm a novice coder, so please go easy on me lol. Here is the link to my website and the photos page:
http://www.mikemicalizzicontracting.com/photos.html
I can't really help with your precise question, as we don't have your code, but I am in a good mood, so I'll just give you some tips to make your website quicker.
Some tips to optimize your images:
1) Resize your images
Right now your images are displayed at 207 × 154, which is small, but they are downloaded at their original size. You should resize them so that the file size (in bytes) is much lower.
You can do that with a lot of online tools, or write a simple script for it, in bash for instance. You will very likely find a ready-made script, or you can just install ImageMagick and use it directly. The scripting approach also depends on your platform (whether you are using OS X, Windows, Linux, ...).
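If a script in JavaScript suits you better than bash, here is a minimal Node.js sketch using the sharp package instead of ImageMagick; the folder names, target width and quality are placeholders:

// Sketch: batch-resize images with the "sharp" npm package (npm install sharp).
const fs = require('fs');
const path = require('path');
const sharp = require('sharp');

const srcDir = 'photos';        // placeholder source folder
const outDir = 'photos-small';  // placeholder output folder
fs.mkdirSync(outDir, { recursive: true });

fs.readdirSync(srcDir).forEach(function (name) {
  sharp(path.join(srcDir, name))
    .resize({ width: 414 })     // about 2x the 207px display width, for sharpness
    .jpeg({ quality: 80 })      // reasonable quality/size trade-off
    .toFile(path.join(outDir, name))
    .catch(function (err) { console.error('Skipped ' + name + ':', err.message); });
});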
2) Cache your images first
It is always painful for the user to wait for images to display. So what you need to do is use the browser cache (if you are a novice you will have some trouble, but it will be worth it) to store the image data and then display the images from that cache.
So the strategy is (a sketch follows after this list):
If the image is in the cache, display it from there.
If it isn't, download it from your server.
Once it is downloaded, put it in the cache, then display it from there.
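A minimal sketch of that cache-first strategy, using the browser's Cache API (the cache name is arbitrary, and this API is only available in secure HTTPS contexts):

// Sketch: 1) serve from cache, 2) otherwise fetch, 3) store a copy, then display.
function displayImage(url, imgElement) {
  caches.open('photo-cache').then(function (cache) {
    return cache.match(url).then(function (cached) {
      if (cached) {
        return cached;                      // 1) already cached: use it
      }
      return fetch(url).then(function (response) {
        cache.put(url, response.clone());   // 3) store a copy for next time
        return response;                    // 2) freshly downloaded
      });
    });
  }).then(function (response) {
    return response.blob();
  }).then(function (blob) {
    imgElement.src = URL.createObjectURL(blob);
  });
}

Note that for plain image requests the ordinary HTTP cache, driven by Expires/Cache-Control headers, already does most of this for free; the explicit version above only buys you more control.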
3) (More advanced) Preload the images
If your home page is very easy to display (not a lot of images, not much script to run, etc.), you can start downloading some images on the client and just store them in the cache.
This should always be a background task, meaning it must not slow down the user's navigation. So, depending on what the user is doing, you may have to pause the request and resume it later.
If the user goes to the photo page while an image is still being downloaded for the cache, you don't need to start all over again; just continue the download.
This last tip can take a while to develop, but it is a good thing to do when you want to display very high quality images, or just plenty of them.
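As a rough sketch of that kind of background preloading (the URL list and the pause condition are placeholders): assigning a URL to an Image object makes the browser download it into its cache, and chaining the downloads keeps the work light. Strictly speaking this only pauses between downloads rather than interrupting one in flight.

// Sketch: preload images one at a time, with a simple pause/resume switch.
var queue = ['photo1.jpg', 'photo2.jpg', 'photo3.jpg']; // placeholder URLs
var paused = false;

function preloadNext() {
  if (paused || queue.length === 0) return;
  var img = new Image();
  img.onload = preloadNext;   // only start the next download when this one finishes
  img.onerror = preloadNext;  // a failure should not stall the queue
  img.src = queue.shift();    // triggers the download into the browser cache
}

function pausePreloading()  { paused = true; }
function resumePreloading() { paused = false; preloadNext(); }

preloadNext();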

Should the number of html elements being loaded on a page be monitored?

I have an application that displays a page with 5-10000 rows in a table element and has a drop down menu that switches to other views with a similar structure. Currently I do an async request when switching between views (I also clear/remove the current view from the document) and load the appropriate list each time; however, I was thinking of
1) loading all views in the background before they are requested for viewing so they load instantly on click.
and
2) Just hiding a particular set of rows rather than removing it so if the client navigates back it will be instant as well.
This would mean that potentially tens of thousands of HTML elements would be loaded into the current document; is this a problem? What if it was more than tens of thousands?
Loading 10000+ HTML elements onto your page is not a very smart idea. If the user's computer is of normal to fast speed, the user may experience slight lag, and if the user's computer is slow, it may even cause a system crash (depending on the amount of RAM on the computer).
A technique you could explore is to lazy-load the HTML elements: when the user scrolls down to a particular portion of the page, the elements for that portion are loaded via AJAX (also known as "infinite scrolling").
This means that the browser's rendering engine does not have to render so many elements in one go, which saves memory and increases speed.
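As an illustration, here is a minimal infinite-scroll sketch using IntersectionObserver; the endpoint, the sentinel element and the row fields are all hypothetical:

// Sketch: load the next batch of rows when a marker element scrolls into view.
var nextPage = 0;
var tbody = document.querySelector('#data-table tbody');       // assumed table body
var sentinel = document.querySelector('#load-more-sentinel');  // assumed marker at the bottom

var observer = new IntersectionObserver(function (entries) {
  if (!entries[0].isIntersecting) return;
  fetch('/api/rows?page=' + nextPage)           // hypothetical endpoint
    .then(function (res) { return res.json(); })
    .then(function (rows) {
      rows.forEach(function (row) {
        var tr = document.createElement('tr');
        tr.textContent = row.label;             // assumed field name
        tbody.appendChild(tr);
      });
      nextPage++;
    });
});
observer.observe(sentinel);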
It depends on the HTML elements, but I would say as a rule of thumb, don't load upfront.
When I say it depends on the elements, I mean take a look at facebook for example. They load maybe 10 or 20 items into the feed, and then add more as you scroll, because each item is so rich in content (photos, videos, etc).
However, on the flip side of that, think about how much info is in each row: if it's, say, less than 500 bytes, 500 × 10,000 = 5MB, which isn't an awful load for a web request, and if you can cache intelligently, maybe it will be a lot less than that.
The bottom line is, don't assume that the number of HTML elements matters so much, think about the amount of data that it amounts to. My advice, cross that bridge when you get there. Monitor your request times, if they are too long, come up with an AJAX page loader, which shouldn't be too hard.
The size of your HTML document would be considerable...
I once had a page with 5,000-10,000 items (table rows), and the browser (IE) took far too long to download, parse, and render it.
The best solution seems to me to be setting up a web service with a lazy-loading system.
So IMHO, yes, the number of elements in an HTML document should be monitored.
Yes, of course! The number of elements in an HTML document can be monitored. Use Firebug in Firefox!
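For a quick check you don't even need an extension; one line in the browser console (Firebug's console, or any dev tools) counts every element in the document:

console.log(document.getElementsByTagName('*').length); // total element count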

how many div's can you have before the dom slows and becomes unstable?

I am developing a jQtouch app and each request done via ajax creates a new div in the document for the loaded content. Only a single div is shown at any one time.
How many div's can I have before the app starts getting unresponsive and slow?
Anyone have any ideas on this?
EDIT: It's an iPad app running in Safari, and it would be fewer than 1,000 divs with very basic content.
I've had tens of thousands, maybe even a hundred thousand divs, on screen at once.
Performance is either fine or bad, depending on one thing:
Are the divs parsed from HTML or generated dynamically in JavaScript?
Parsed from HTML means you have a LARGE HTML source, and that can make browsers hang. Generated in JS, it is surprisingly fast, even on Internet Explorer, which is the slowest of all browsers for JS.
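As a rough illustration of the generated-in-JS case (the count and content are arbitrary), batching the divs into a DocumentFragment means the live DOM is only touched once:

// Sketch: create many divs off-DOM, then attach them in a single insertion.
var fragment = document.createDocumentFragment();
for (var i = 0; i < 10000; i++) {
  var div = document.createElement('div');
  div.textContent = 'Item ' + i;
  fragment.appendChild(div);         // no reflow yet: the fragment is off-DOM
}
document.body.appendChild(fragment); // one insertion, one layout pass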
To be honest, if you really need an absolute answer to this question, then you might want to reconsider your design.
No answer given here will be right, as it depends upon many factors that are specific to your application: heavy vs. light CSS use, size of the divs, amount of actual graphics rendering required per div, target browser/platform, number of DOM event listeners, etc.
Just because you can doesn't mean that you should! :-)
As others have said, there's really no answer.
However, in this talk about the Google Maps API version 3, the speaker brings up the number ten thousand several times, as a basic threshold for browser unhappiness.
http://code.google.com/apis/maps/documentation/javascript/
Without defining a particular environment, it's not possible to answer your question.
And even then, anything anyone tells you is just a guess. You need to do your own testing on real-world configurations with different browsers and hardware. You'll also need to establish some performance benchmarks to decide what "too slow" even means.
I've been able to add several thousand divs without a problem. Depends on what you'll be doing afterwards, of course, and the memory on the client machine. Everyone else is right about that.
As Harpo said, 10K is probably a good ceiling. At one time, I noticed speed problems starting at about 4K divs, but hardware has improved since then.
And, as Neil N said, adding the divs via scripting is better than having a huge HTML source.
And, to answer Harpo's comment, one way to "break it up" so that JS doesn't lock the page and produce a "page is running slowly" error is to call a timer at the end of each "add a div" routine, and the timer in turn calls your "add a div" function again.
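A minimal sketch of that timer-based approach (the batch size and the div contents are placeholders); yielding back to the browser between batches keeps the page responsive:

// Sketch: add divs in small batches, letting the browser repaint in between.
function addDivsInBatches(total, batchSize) {
  var added = 0;
  function addBatch() {
    for (var i = 0; i < batchSize && added < total; i++, added++) {
      var div = document.createElement('div');
      div.textContent = 'Div #' + added;
      document.body.appendChild(div);
    }
    if (added < total) {
      setTimeout(addBatch, 0); // yield so events are handled and no "slow script" warning appears
    }
  }
  addBatch();
}

addDivsInBatches(10000, 200);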
Now, MY question is: is it possible to "paint" so that you don't need to add thousands of divs? This can be done with the canvas tag with some browsers, but I don't think it's possible with VML (the excanvas project) on IE. Or is it? I think VML "paints" by adding new elements to the DOM, at which point you may as well use DIVs, unless it's a simple shape.
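For browsers that do support canvas, the painting idea looks roughly like this (sizes, colors and positions are arbitrary); one canvas element replaces thousands of divs:

// Sketch: paint thousands of small rectangles on a single canvas element.
var canvas = document.createElement('canvas');
canvas.width = 1000;
canvas.height = 1000;
document.body.appendChild(canvas);

var ctx = canvas.getContext('2d');
for (var i = 0; i < 10000; i++) {
  ctx.fillStyle = 'rgb(' + (i % 256) + ',100,150)';
  ctx.fillRect((i * 13) % 990, (i * 7) % 990, 10, 10); // arbitrary layout
}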
Is it possible to alter the source of an image via scripting? (the image in the DOM, of course -- not the original image on the server.)

Reducing load time, or making the user think the load time is less

I've been working on a website, and we've managed to reduce the total content for a page load from 13.7 MiB to 2.4 MiB, but the page still takes forever to load.
It's a Joomla site (ick), and it has a lot of redundant DOM elements (2,000+ for the home page) and makes 60+ HTTP requests per page load, counting all the CSS, JS, and image requests. Unlike Drupal, Joomla won't merge them all on the fly, and they have to be kept separate or else the Joomla components will go nuts.
What can I do to improve load time?
Things I've done:
Added background colors to DOM elements that have large images as their background, so the color is loaded first, then the image
Reduced excessively large images to much smaller file sizes
Reduced DOM elements to ~2000, from ~5000
Loading CSS at the start of the page, and JavaScript at the end
(Not totally possible: Joomla always injects its own JavaScript and CSS in the header.)
Minified most JavaScript
Set up caching and gzipping on the server
Uncached size is 2.4 MB, cached is ~300 KB, but even with so many DOM elements, the page takes a good bit of time to render.
What more can I do to improve the load time?
Check out this article.
http://www.smashingmagazine.com/2010/01/06/page-performance-what-to-know-and-what-you-can-do/
If the link gets removed or lost, the tools mentioned are:
YSlow (by Yahoo)
Google's Page speed
AOL's web page test
Smush.it (Image compression tool)
It sounds like you've done a great job of working around the real problem: those giant graphics. You can probably squeeze some more efficiency out of caching, minifying, etc., but there has to be a way to reduce the size of the images. I worked with a team of some of the pickiest designers on Earth and they never required uncompressed JPEGs. Do you mean images cut out of Photoshop and saved at full quality (10)? If so, the real solution (and I appreciate that you may not be able to accomplish this) is to have a hard conversation where you explain to the design company, "You are not your users." If the purpose of the site is only to impress other visual designers with the fidelity of your imagery, maybe it's OK. If the purpose of the site is to be a portfolio that gains your company work, they need to reassess who their audience is and what the audience wants. Which, I'm guessing, is not 2-minute load times.
Have you enabled HTTP Compression (gzip) on your web servers? That will reduce the transfer size of all text-based files by 60-90%. And your page will load 6-10x faster.
Search StackOverflow or Google for how to enable it. (It varies per server software: Apache, IIS, etc).
Are all the DOM elements necessary? If they are, is it possible to hide them as the page loads? Essentially, you would have your important need-to-be-there DOM elements render with the page, and then, once the document is loaded, you could unhide the rest of the elements as necessary:
$('.hidden').removeClass('hidden')
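Expanding that snippet into a small sketch (the .hidden class and the deferred markup are assumptions): mark the non-essential elements with a class that hides them in CSS, then reveal them once the document is ready:

// Assumes a CSS rule like: .hidden { display: none; }
$(document).ready(function () {
  setTimeout(function () {
    $('.hidden').removeClass('hidden'); // reveal deferred elements after the first paint
  }, 0);
});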
Evidently, there are plugins for compressing and combining JS/CSS files:
http://extensions.joomla.org/extensions/site-management/site-performance
http://extensions.joomla.org/extensions/site-management/site-performance/7350
I would say you can't do anything. You're close to the absolute limit.
Once you get to a critical point, you'll have to compare the amount of effort you'd need to put into further compressing the site against the effort of throwing the most bandwidth-expensive components out the window. For example, you might spend 20 hours further compressing your site by 1%. You might also spend 40 hours throwing the Joomla theme away and starting again, saving 50% (though probably not).
I hope you find an easy answer. I feel your pain, as I am struggling to finish a Drupal project that has been massacred by a designer who hasn't implemented his own code in years and has become institutionalized in a cubicle somewhere near a tuna salad sandwich.

iFrame Best Practices

I have a large, hi-def JavaScript-intensive image banner for a site I'm designing. What is everyone's opinion of using iframes so that you incur the load time only once? Is there a CSS alternative to the iframe?
Feel free to preview the site.
It is very much a work in progress.
I should also have mentioned that I would like the banner rotation to keep moving. When the visitor clicks on a link, the banner rotation starts over. It would be nice if the "animation" kept rotating, regardless of the page the user visits.
Well, in that case I would strongly recommend not doing that. The only real way of achieving it is to have the actual website content in the iframe, which means that you suddenly have lots of downsides to the site: not being able to bookmark URLs easily because the address bar doesn't change; accessibility concerns; etc.
I think you'll find that most people won't care that it reloads again. Once a visitor lands on your website, they'll marvel at the wonderful banner immediately, and then will continue to ignore it while they browse your site - until an image they haven't seen appears and distracts them away from your content.
Keep the rotation random enough, and with enough images, and people will stop to look at it from whatever page they're on.
I find the main challenge with iFrame headers is resizing. Since the font in your header is of static size, I don't see a problem with using an iFrame. Although I'm not sure if it's really intensive enough to be worth it.
Well, the browser appears to cache all seven banner images upon the first load, and runs them out from the cache (for each subsequent page) thereafter. I don't think you have a problem :D
Try it out with Firebug's Net monitoring tool in Firefox.
This may work without CSS also, but if you use CSS to load the background and your server is configured correctly, the image should already only be downloaded once.
Usually the browser will re-request a resource only after asking the server whether it has been modified since the last time it was downloaded. In that case, the only things sent back and forth are the HTTP headers, no content.
If you want to ensure the image is only downloaded once, add an .htaccess or apache2.conf rule to make the image expire a few days in the future, so that users will only request it again if their cache is cleared or the expiration date passes. An .htaccess file is probably overkill for your case, though results may vary.
You could have it load the main page once, then asynchronously load the other elements when needed (ajax). If you did that, an iFrame would not be necessary. Here is an example of loading only the new material.
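That linked example isn't reproduced here, but the general idea could look like this minimal jQuery sketch (the #content container is hypothetical): only the page-specific content is fetched and swapped in, so the banner element is never touched and keeps rotating.

// Sketch: replace only the content area, leaving the banner untouched.
function navigate(url) {
  $('#content').load(url + ' #content'); // fetch the page and insert just its #content fragment
}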
While using iframes as a sort of master page/template for your pages might be a good thing, iframes have a known negative impact on searchability/SEO.
It might also be unnecessary in the first place because once your images are loaded the first time (and with the large high-def images you have on your site, that would be slow no matter what you do) the images are cached by browsers and will not be reloaded until the user clears their cache or does a Ctrl+F5.