HTML max number of <img> elements

Does anybody know the upper limit on how many img elements can be loaded in a given web page? I'm doing some image-processing analysis that I'm presenting in the browser, but the page keeps crashing until I trim it down to a more limited number of figures. This may be a function of image size as well; in my case it's a bunch of 500x500 images. I'll wind up doing a pagination solution, but it would be nice to max out the page length at the same time. This is a local page stored on my desktop, referencing a bunch of local BMP files.
To throw out some figures for an upper limit: both IE 10 and Chrome crashed on a page with 12,000 images.

A limit on the number of img tags is not part of the HTML standard. Browser implementations like Chrome and IE may have their own arbitrary limits, but that would not be part of the HTML standard. 12,000 images is quite a lot to hold in RAM, and that could be your issue.
If each image were 1 MB and you opened 12,000 of them, that would be about 12 GB of RAM right there. I'm not sure what you're trying to do exactly, but HTML is not normally used for browser rendering at that scale; that's supercomputing territory.
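As a rough sketch of the pagination approach the asker mentions, you could keep only a fixed-size batch of images in the DOM at once. The file names, batch size, and element IDs below are made up for illustration:

<div id="gallery"></div>
<button id="next">Next page</button>
<script>
var PAGE_SIZE = 200;   // images kept in the DOM at once (tune to taste)
var TOTAL = 12000;     // total number of images
var page = 0;
var gallery = document.getElementById('gallery');

function renderPage() {
  gallery.innerHTML = '';                  // drop the previous batch so its memory can be reclaimed
  var end = Math.min((page + 1) * PAGE_SIZE, TOTAL);
  for (var i = page * PAGE_SIZE; i < end; i++) {
    var img = document.createElement('img');
    img.src = 'img/frame-' + i + '.bmp';   // hypothetical local file names
    img.width = 500;                       // the 500x500 size from the question
    img.height = 500;
    gallery.appendChild(img);
  }
}

document.getElementById('next').onclick = function () { page++; renderPage(); };
renderPage();
</script>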

Related

Is it good to display an image as src="data:image/jpg;base64,..."?

I have two ways to approach displaying an image on my webpage.
One way is to simply pass the path of the image in the src attribute.
The second way is to pass base64-encoded data in the src attribute.
Will any problems occur in the future? Or longer page loading times with the second approach?
Suggestions, please.
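For concreteness, the two approaches look like this (the file name and the truncated data are placeholders):

<img src="images/photo.jpg" alt="photo">                          <!-- way 1: path -->
<img src="data:image/jpeg;base64,/9j/4AAQSkZJRg..." alt="photo">  <!-- way 2: inline data -->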
Loading the image in the page will be slower up front (larger HTML file), but faster overall (fewer requests to the server). Note that IE7 and lower have no support for this, and IE8 doesn't support images over 32 KB. Encoding in base64 also increases the image size by about a third.
In my opinion, it makes sense for icons in CSS files, occasional small thumbnails, etc. But large images should be loaded normally, from separate files.
If you have a path: use it.
First of all, data URIs haven't always been well supported, so you can't guarantee they work everywhere.
Second, you'll get things like cache management for free, without writing any code.
Yes, the second approach is something like 8 times slower than the first one. That's base64 encoding for you.
I use base64 for small images, and the benefit of this is that they can be stored in a database. Use them for things like thumbnails, profile pictures, etc... anything that is around 100 or 200 KB.
EDIT: Sorry, my bad... it's a 37% bigger file size with base64, not 8 times slower.
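As a sketch of how such a data URI might be produced before being stored in a database (Node.js here; the file name is hypothetical):

// encode-image.js: turn a small image file into a data URI
const fs = require('fs');

const bytes = fs.readFileSync('thumb.png');   // hypothetical thumbnail
const dataUri = 'data:image/png;base64,' + bytes.toString('base64');

// The string is roughly 37% longer than the binary, per the correction above.
console.log(dataUri.slice(0, 40) + '...', dataUri.length + ' chars');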

Should the number of HTML elements being loaded on a page be monitored?

I have an application that displays a page with 5,000-10,000 rows in a table element, and has a drop-down menu that switches to other views with a similar structure. Currently I do an async request when switching between views (I also clear/remove the current view from the document) and load the appropriate list each time; however, I was thinking of:
1) loading all views in the background before they are requested for viewing, so they load instantly on click,
and
2) just hiding a particular set of rows rather than removing it, so that if the client navigates back it will be instant as well.
This would mean that potentially tens of thousands of HTML elements would be loaded into the current document; is this a problem? What if it were even more than that?
Loading 10,000+ HTML elements onto your page is not a very smart idea. If the user's computer is of normal to fast speed, the user may experience slight lag; if the user's computer is slow, it may even crash the browser (depending on the amount of RAM in the machine).
A technique you could explore is lazy-loading the HTML elements: when the user scrolls down to a particular portion of the page, the elements for it are fetched via AJAX. (This is also known as "infinite scrolling".)
This means the browser's rendering engine does not have to render so many elements in one go, which saves memory and increases speed.
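A minimal sketch of that idea using IntersectionObserver; the table ID, the sentinel element, and the /rows endpoint are assumptions, not anything from the question:

<table><tbody id="rows"></tbody></table>
<div id="sentinel"></div>
<script>
var tbody = document.getElementById('rows');
var offset = 0;

// Fetch and append the next chunk of rows whenever the sentinel scrolls into view.
new IntersectionObserver(function (entries) {
  if (!entries[0].isIntersecting) return;
  fetch('/rows?offset=' + offset + '&limit=100')   // hypothetical endpoint returning <tr> markup
    .then(function (res) { return res.text(); })
    .then(function (html) {
      tbody.insertAdjacentHTML('beforeend', html);
      offset += 100;
    });
}).observe(document.getElementById('sentinel'));
</script>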
It depends on the HTML elements, but I would say as a rule of thumb: don't load everything upfront.
When I say it depends on the elements, I mean take a look at Facebook, for example. They load maybe 10 or 20 items into the feed and then add more as you scroll, because each item is so rich in content (photos, videos, etc.).
However, on the flip side of that, think about how much info is in each row. If it's, say, less than 500 bytes, then 500 x 10,000 = 5 MB, which isn't an awful load for a web request, and if you can cache intelligently, maybe it will be a lot less than that.
The bottom line is: don't assume that the number of HTML elements matters so much; think about the amount of data it amounts to. My advice: cross that bridge when you get there. Monitor your request times, and if they are too long, come up with an AJAX page loader, which shouldn't be too hard.
The size of your HTML document would be considerable...
I once had a page with 5,000-10,000 items (table rows), and the browser (IE) took far too long to download, parse, and render it.
The best solution seems to me to be a web service with a lazy-loading system.
So IMHO, yes, the number of elements in an HTML document should be monitored.
Yes, of course! The number of elements in an HTML document can be monitored. Use Firebug in Firefox!

Do PNG images slow down the rendering of an HTML page?

I'm using several PNG images (via CSS) in a site's template (XHTML and CSS).
I've kept the PNGs as small and as optimized as possible, but when testing the site in any browser (Safari, Firefox, the IEs) it takes at least 2 seconds to render.
Unfortunately I can't share the code here, but I can say that I've removed all JavaScript and my HTML is fairly small (about 250 lines, no tables) and validates correctly.
I would like to know whether the PNGs are the "guilty" party, as this is my first site done almost exclusively with PNGs (instead of GIFs + JPEGs). (I won't support IE6, so no need for hacks.)
No, they do not take long to render (unless you have a really slow computer). What does take time is retrieving a lot of small files. When you query a web server for a small file, transferring the file itself does not take long, but setting up each connection, etc., adds up.
So what you should do is make what is called a "sprite": combine all the small images into one large one and "cut" them with CSS. How it is done, and what it is exactly, is explained here:
http://css-tricks.com/css-sprites/
and here
http://www.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/
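A minimal sprite sketch, assuming a combined image named sprite.png laid out as a row of 16x16 tiles (all names and offsets are made up):

/* One combined image: a single HTTP request serves every icon. */
.icon {
  background-image: url('sprite.png');
  width: 16px;
  height: 16px;
  display: inline-block;
}
.icon-home  { background-position: 0 0; }       /* first tile  */
.icon-user  { background-position: -16px 0; }   /* second tile */
.icon-close { background-position: -32px 0; }   /* third tile  */

In the markup you would then write something like <span class="icon icon-user"></span>.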
If you have many images (they don't have to be PNGs), then download times will be impacted. By default, browsers have a limited number of threads to download content with (IIRC FF has 4), so the more images you have, the longer things will take.
Additionally, if you don't specify dimensions on your images, the browser can only correctly lay out the page once each image arrives. It will need to reflow the layout for every such image, which is both expensive and time-consuming.
In short, ensure you don't have too many images to download and that the HTML markup gives the right dimensions for them.
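For example (the file name and sizes here are made up), explicit dimensions let the browser reserve the layout space before the file arrives:

<img src="images/chart.png" width="500" height="300" alt="chart">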
One workaround for having many images is to use CSS sprites.
Check this link. Read under the "Optimize Images" tab.
Best Practices for Speeding Up Your Web Site
Speed up Images Load Time
I hope this was the thing you needed.

What is the maximum resolution of an HTML page?

I have a few questions in this regard:
1. When you create an internet page, does the program automatically create it at 75dpi?
2. Could we create a 300dpi page, and would it work on the internet?
3. What is the maximum DPI resolution you can get on a web page?
1. Unless the entire web page is just an image file, web pages don't specify a resolution like that. HTML defines the layout and contents of the page; the video and printer drivers determine the resolution it is displayed or printed at.
2. Meaningless question; see #1.
3. See #2.
To answer your questions (I'm presuming you're talking about images on a web page, rather than the web page itself, which is created in HTML, etc.):
1. You should create the image at 72dpi. Most programs with 'Save for web' functionality should convert the image to 72dpi, but you may need to do this yourself.
2. You could put a 300dpi image on the web and it should display correctly in pretty much all browsers (and should print at the correspondingly higher resolution), but this is a bad idea, as it'll be much slower to load/render, will consume bandwidth, etc. As such, I'd really recommend sticking to 72dpi. If you want a high-resolution version of an image, link to the raw image file or create a resolution-independent version (PDF or SVG, etc.).
3. As above, there's no maximum (although the web site's visitors' machines will eventually grind to a halt attempting to decode an 'n' DPI image).
The web displays graphics at 72dpi. If you make an image at 300dpi, it's going to look much larger on the screen than was intended.
DPI, be it 72 or 300, is only relevant when going to an output device like a printer; talking about DPI for web graphics is meaningless. On the web, all images are shown 1:1: a pixel of data in the image is a pixel of data on the screen.
You can use any DPI you want for a web image. It makes no difference in how it will be displayed.
BUT: if you are working for the web, you can no longer measure things in inches, centimeters, or picas. You need to start working with all dimensions in pixels. If you are viewing your graphics in Photoshop, make sure your view is set to 100%. Then you'll be seeing the same thing that will be displayed in the browser.
Everyone's browser is different, so a conservative estimate for a static page design is that your page content should be about 900 pixels or so wide. (People are used to scrolling down, so your page height can be whatever you want.)

How much faster is it to use inline/base64 images for a web site than just linking to the hard file?

How much faster is it to use a base64-inlined image to display images, as opposed to simply linking to the hard file on the server?
url(data:image/png;base64,.......)
I haven't been able to find any type of performance metrics on this.
I have a few concerns:
You no longer gain the benefit of caching
Isn't base64 a LOT larger in size than a PNG/JPEG file?
Let's define "faster" as: the time it takes for a user to see a fully rendered HTML web page.
'Faster' is a hard thing to answer because there are many possible interpretations and situations:
Base64 encoding will expand the image by a third, which will increase bandwidth utilization. On the other hand, including it in the file will remove another GET round trip to the server. So, a pipe with great throughput but poor latency (such as a satellite internet connection) will likely load a page with inlined images faster than if you were using distinct image files. Even on my (rural, slow) DSL line, sites that require many round trips take a lot longer to load than those that are just relatively large but require only a few GETs.
If you do the base64 encoding from the source files with each request, you'll be using up more CPU, thrashing your data caches, etc., which might hurt your server's response time. (Of course, you can always use memcached or the like to resolve that problem.)
Doing this will of course prevent most forms of caching, which could hurt a lot if the image is viewed often - say, a logo that is displayed on every page, which could normally be cached by the browser (or a proxy cache like Squid) and requested once a month. It will also prevent the many, many optimizations web servers have for serving static files using kernel APIs like sendfile(2).
Basically, doing this will help in certain situations, and hurt in others. You need to identify which situations are important to you before you can really figure out if this is a worthwhile trick for you.
I have done a comparison between two HTML pages containing 1800 one-pixel images.
The first page declares the images inline:
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC">
In the second one, images reference an external file:
<img src="img/one-gray-px.png">
I found that when the same image is loaded multiple times, if it is declared inline the browser performs a request for each copy (I suppose it base64-decodes it once per image), whereas in the other scenario the image is requested once per document.
The document with inline images loads in about 250 ms and the document with linked images in about 30 ms.
(Tested with Chromium 34)
The scenario of an HTML document with multiple instances of the same inline image doesn't make much sense a priori. However, I found that the jQuery Lazyload plugin defines an inline placeholder by default for all "lazy" images and sets their src attribute to it. So, if the document contains lots of lazy images, a situation like the one described above can happen.
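For anyone who wants to reproduce a comparison like this, here is a sketch that generates the two test pages; it assumes Node.js and a local one-gray-px.png:

// make-test-pages.js: one page with 1800 inline copies, one with 1800 linked copies
const fs = require('fs');

const N = 1800;
const b64 = fs.readFileSync('one-gray-px.png').toString('base64');

const inline = Array(N).fill('<img src="data:image/png;base64,' + b64 + '">').join('\n');
const linked = Array(N).fill('<img src="img/one-gray-px.png">').join('\n');

fs.writeFileSync('inline.html', '<!doctype html>\n' + inline);
fs.writeFileSync('linked.html', '<!doctype html>\n' + linked);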
You no longer gain the benefit of caching
Whether that matters would vary according to how much you depend on caching.
The other (perhaps more important) thing is that if there are many images, the browser won't fetch them all simultaneously (i.e. in parallel), but only a few at a time, so the protocol ends up being chatty. If there's some network end-to-end delay, the total is roughly (number of images / parallel connections) x round-trip delay; for example, 100 images over 4 connections with a 100 ms round trip is on the order of 2.5 seconds before the last image is loaded.
Isn't base64 a LOT larger in size than a PNG/JPEG file?
The file format / image compression algorithm is the same, I take it, i.e. it's still PNG.
Using base64, each 8-bit output character carries 6 bits of data, so the binary data is inflated by a ratio of 8-to-6, i.e. by about 33%.
How much faster is it
Define 'faster'. Do you mean HTTP performance (see below) or rendering performance?
You no longer gain the benefit of caching
Actually, if you're doing this in a CSS file it will still be cached. Of course, any changes to the CSS will invalidate the cache.
In some situations this could be a huge performance boost compared to making many HTTP connections. I say some situations because you can likely take advantage of techniques like image sprites for most stuff, but it's always good to have another tool in your arsenal!
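As a sketch of the CSS-file variant mentioned above (the selector is made up and the base64 data is a truncated placeholder):

/* styles.css: shipped once and cached like any other stylesheet */
.logo {
  background-image: url('data:image/png;base64,iVBORw0KGgo...');  /* placeholder data */
  width: 32px;
  height: 32px;
}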