Is there a "law of diminishing returns" for converting images to Base64 as opposed to simply using the images themselves? - html

Say I have some icons on my site (32x32, 64x64, 128x128, etc.), then converting them to Base64 makes sense, right?
Now say I have some high resolution images that are 2MB, 3MB, 4MB, and greater that I am using on my site. Does it make sense to convert these larger images to Base64 or is it "smarter" to simply keep using them as .jpg/.png/.gif/etc.?
If there is such a "law", "rule of thumb", etc. what is the line in the sand?
EDIT: While this post was marked as a duplicate, the linked "original" is from over 10 years ago; browser technology, computers, and the web itself, has changed significantly since then. What would be the answer for today's technology?

The answer to the question is: yes, and it depends.
If we rephrase the question to: Does the law of diminishing returns apply to using base64 for embedding images in a page?
a) Yes, the law applies
b) It depends on: image count and size, and your setup (HTTP/1.1 vs HTTP/2, connection type, etc.)
The reason is that more images require more connections, which imply more handshakes, unless you are using keep-alive connections or HTTP/2 streams. And if the images are bigger, converting them from base64 back to binary (plus decompression) requires more computing, so the round trips you save come at a CPU expense.
In general, if you have lots of images (icons, for example), you could embed as base64. But in that case you also have the following options:
a) Image atlas: combining all the small images into a single image (one request) and showing only the portion you need on the page (see the sketch after this list).
b) Converting to alternative formats, such as fonts or SVG, and again rendering what you need. Example: Open Iconic.
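For illustration, option (a) usually comes down to a few lines of CSS like the following; the file name, class names, and offsets here are made up, assuming a 64x32 atlas holding two 32x32 icons side by side:
.icon { width: 32px; height: 32px; background: url("icons.png") no-repeat; } /* one request serves every icon */
.icon-home { background-position: 0 0; } /* left tile of the atlas */
.icon-search { background-position: -32px 0; } /* right tile */
Used as <span class="icon icon-home"></span>, the browser downloads icons.png once and each rule simply shifts the visible 32x32 window.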

Related

Is it good to display an image as src="data:image/jpg;base64,..."?

I have two ways to approach displaying an image on my webpage.
One way is to simply pass the path of the image in the src attribute.
The second way is to pass base64 data in the src attribute.
Will any problems occur in the future, or longer page loading times with the second approach?
Suggestions please.
Loading the image in the page will be slower up front (larger html file), but faster overall (fewer requests to the server). Note that IE7 and lower have no support for this, and IE8 doesn't support images over 32k. (Source) Encoding in base64 also increases the image size by 1/3. (Source)
In my opinion, it makes sense for icons in CSS files, occasional small thumbnails, etc. But large images should be loaded normally.
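As a rough sketch of that icons-in-CSS case (the selector name is made up; the payload is a one-pixel gray PNG):
.bullet { background: url("data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC") no-repeat; }
The icon ships inside the stylesheet: no extra request, but it is re-downloaded whenever the CSS itself changes.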
If you have a path: use it.
First of all, data: URIs have historically had uneven browser support, so you can't guarantee they work everywhere.
Second, you'll get things like cache management for free, without writing any code.
Yes, the second approach is something like 8 times slower than the first one. That’s base64 encoding.
I use base64 for small images, and the benefit is that they can be stored in a database. Use them on things like thumbnails, profile pictures, etc... anything that is around 100 or 200 KB.
EDIT: Sorry, my bad... base64 gives a 37% bigger file size.

How to calculate overhead of a web request?

Suppose there is a fancy button to be put on a website. And the design of the button is such that parts of it can be sliced and applied as a repeating background.
I often slice images and apply them as repeating backgrounds this way, so one button image is split into several different images. I do this to reduce the total size of the images used.
My team leader told me not to slice the images. If you slice a button into three parts, there would be three web requests. And this will slow down the site.
I find it hard to believe that the overhead of three requests would be more than using the entire image. So I just want to know how to calculate the amount of bytes transferred per web request.
You could use the net tab in Firebug to see the time taken for each web request; it's also broken down into the time it takes to download each component of the response.
If you reuse the three images a lot on the site, then you will save requests and bandwidth (your gut reaction). However, the three images need to be retrieved initially -- so you must actually reuse them.
The crux is this: due to HTTP pipelining, the overhead is pretty much negligible (especially when you consider that this is HTTP -- a string-based protocol). Three images being retrieved probably have about the same latency as retrieving a single image.
-- edit: after comment from Ericlaw --
Pipelining is indeed not widely supported. However this does not mean you don't stand to gain from three images. Just make sure you reuse them A LOT throughout your website.
-- edit --
Browsers also open multiple connections to a server; the norm was 2 connections per host last I checked, and recent browsers open more (typically around 6).
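To put rough numbers on the per-request overhead, here is approximately what one of the three image requests looks like on the wire (the path is made up; exact headers vary by browser):
GET /images/button-left.png HTTP/1.1
Host: www.example.com
Accept: image/png,image/*;q=0.8,*/*;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
That's on the order of a couple hundred bytes, and the response headers add a few hundred more; Firebug's net tab will show you the exact figures for your setup, which you can weigh against the bytes saved by slicing.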

How much faster is it to use inline/base64 images for a web site than just linking to the hard file?

How much faster is it to use base64 inline images to display images, as opposed to simply linking to the hard file on the server?
url(data:image/png;base64,.......)
I haven't been able to find any type of performance metrics on this.
I have a few concerns:
You no longer gain the benefit of caching
Isn't base64 A LOT larger in size than a PNG/JPEG file?
Let's define "faster" as in: the time it takes for a user to see a full rendered HTML web page
'Faster' is a hard thing to answer because there are many possible interpretations and situations:
Base64 encoding will expand the image by a third, which will increase bandwidth utilization. On the other hand, including it in the file will remove another GET round trip to the server. So, a pipe with great throughput but poor latency (such as a satellite internet connection) will likely load a page with inlined images faster than if you were using distinct image files. Even on my (rural, slow) DSL line, sites that require many round trips take a lot longer to load than those that are just relatively large but require only a few GETs.
If you do the base64 encoding from the source files with each request, you'll be using up more CPU, thrashing your data caches, etc., which might hurt your server's response time. (Of course you can always use memcached or such to resolve that problem.)
Doing this will of course prevent most forms of caching, which could hurt a lot if the image is viewed often - say, a logo that is displayed on every page, which could normally be cached by the browser (or a proxy cache like Squid) and requested once a month. It will also prevent the many, many optimizations web servers have for serving static files using kernel APIs like sendfile(2).
Basically, doing this will help in certain situations, and hurt in others. You need to identify which situations are important to you before you can really figure out if this is a worthwhile trick for you.
I have done a comparison between two HTML pages containing 1800 one-pixel images.
The first page declares the images inline:
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC">
In the second one, images reference an external file:
<img src="img/one-gray-px.png">
I found that when loading the same image multiple times, if it is declared inline, the browser performs a request for each instance (I suppose it base64-decodes it once per image), whereas in the other scenario, the image is requested once per document.
The document with inline images loads in about 250 ms and the document with linked images does it in 30 ms.
(Tested with Chromium 34)
The scenario of an HTML document with multiple instances of the same inline image doesn't make much sense a priori. However, I found that the jQuery lazyload plugin defines an inline placeholder by default for all the "lazy" images, and assigns it to their src attribute. So if the document contains lots of lazy images, a situation like the one described above can happen.
You no longer gain the benefit of caching
Whether that matters would vary according to how much you depend on caching.
The other (perhaps more important) thing is that if there are many images, the browser won't fetch them all in parallel, but only a few at a time -- so the protocol ends up being chatty. If there's some network end-to-end delay, then (number of images ÷ images fetched at a time) × end-to-end delay per image adds up to a noticeable time before the last image is loaded.
Isn't base64 A LOT larger in size than a PNG/JPEG file?
The file format / image compression algorithm is the same, I take it, i.e. it's PNG.
Using Base64, each 8-bit character carries only 6 bits of the original data (every 3 bytes of binary become 4 characters, e.g. "Man" encodes to "TWFu"), so the binary data is expanded by a ratio of 8 to 6, i.e. about 33% bigger.
How much faster is it
Define 'faster'. Do you mean HTTP performance (see below) or rendering performance?
You no longer gain the benefit of caching
Actually, if you're doing this in a CSS file it will still be cached. Of course, any changes to the CSS will invalidate the cache.
In some situations this could be used as a huge performance boost over many HTTP connections. I say some situations because you can likely take advantage of techniques like image sprites for most stuff, but it's always good to have another tool in your arsenal!

sprites vs image slicing

I don't have much experience with the sprite approach to images (http://www.alistapart.com/articles/sprites). Anyone care to share some pros/cons of sprites vs. old-school slices?
The main advantage of sprites is that the browser has to request fewer pictures from the web server. That reduces the number of HTTP requests and makes it possible to compress the parts of the design more effectively. These two points are also the disadvantages of sliced images.
Here you can see some good examples how sprites improve the loading speed of web pages:
http://css-tricks.com/css-sprites/
Pros:
It's far easier on the server to serve a single large image than many small ones.
It's (slightly) faster for a web browser to load such an image.
Browsers only load images as they need them - if you are using multiple images in a rollover, the browser would "pause" the first time you roll over the element. This can be solved with sprites, because there is only one image to load (see the sketch after this list).
Cons:
It's kind of a pain to code (more so than using multiple images at least)
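To make the rollover point concrete, here is a minimal sketch assuming a hypothetical btn.png that stacks the normal state (top half) above the hover state (bottom half):
.btn { width: 200px; height: 40px; background: url("btn.png") 0 0 no-repeat; }
.btn:hover { background-position: 0 -40px; } /* shift to the hover state: no second request, no pause */
Both states arrive in the one download, so the first roll-over is instant.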
One often overlooked downside of using CSS sprites is memory footprint:
https://web.archive.org/web/20130605000516/http://blog.vlad1.com/2009/06/22/to-sprite-or-not-to-sprite/
Unless the sprite image is carefully constructed, you end up with incredible amounts of wasted space. My favourite example is from WHIT TV’s web site, where this image is used as a sprite. Note that this is a 1299×15,000 PNG. It compresses quite well — the actual download size is around 26K — but browsers don’t render compressed image data. When this image is downloaded and decompressed, it will use almost 75MB in memory (1299 × 15,000 × 4).
When sprites get loaded into the browser, they are stored uncompressed. A 26 KB file can uncompress to take up a whopping 75 MB of RAM. You should be mindful of using sprites with very large dimensions.
There's also the issue of what happens in browsers with poor CSS support (legacy browsers). The sprites may end up totally broken.
Sprites
Pros:
Less HTTP connections to the server
Faster loading on broadband
Cons:
No encapsulation: If you want to change one image, you have to change the sprite
It is difficult to set up individual images in CSS without a tool
Doesn't degrade: if the browser doesn't support CSS, you are in trouble
CSS Sprites:
Pros:
Graceful degradation in unsupported browsers (text can be shown when background images for links are not allowed)
Fewer HTTP requests
Each image file carries separate overhead (such as a color table), so image slicing has more overhead than CSS sprites
Cons:
Poses a problem if images are turned off in the browsers (rare case though)
Image slicing:
Pros:
The user perceives a faster load, since the image loads piece by piece.
Slices can load on demand, e.g. when the user places the mouse over the image.
Cons:
The web pages might have a larger total size on the client side, even though that might not be the case on the server side.
The main drawback of sprites is it makes it hard to read/maintain/modify your CSS. It can be difficult to remember the exact pixel offsets within the sprite.
Pros of using sprites:
Since one image is used for everything, it requires less load on the HTTP server.
Cons:
- Harder to code: you must know the coordinates of each image inside the sprite to display it correctly, and once you change the size of an image, you need to adjust all the offsets...
- A big image can keep the page waiting, whereas with individual images a user on a slow internet connection can see them appear one by one.
Best practice:
Use sprites for things like rollover images.
Look into using a CSS sprite generator (we use SmartSprites). That way you can do slices locally, and have your build process generate a spritemap. It's the best of both worlds.
Also, if SmartSprites isn't for you, there are definitely others; however, I like it because it reduces the amount of work up front AND during changes.
Cons
- slower on older browsers / hover effects may not work on them (Opera 6)
- if not used carefully they can get very/too huge (group them sensibly!)
- tedious work to set them up
Pros
- fewer bytes transferred, because one big image is smaller than all the individual images combined (one header / color table)
- fewer HTTP requests
I prefer the middle ground of grouping similar images (normal, hover, selected page, the parent page of the selected page) rather than having all the images in one file. To make these, you slice the image as normal in Photoshop or Illustrator, open the files up, and combine them with a shortcut key. I wrote a Photoshop script that combines images into CSS sprites. You will have multiple HTTP connections, but you won't have the load delay on hover.

How can I improve loading times on a static HTML site?

We are working on a website and noticed that the GIF images (100 KB - 200 KB) are loading very slowly.
The site is a static site with CSS/HTML.
Does anyone have any pointers as to why the images might be loading slowly?
Would using JPGs improve the performance?
Here is the HTML code for that image:
<div><img src="images/mainImg_3.gif"></div>
They're loading slow because they're huge. 200KB is a very big image file. I don't know exactly what the going recommendation is for web images, but it's a very good idea to keep them under 50K.
GIF images are not very efficient for photographic images. You should experiment with other formats like JPG and PNG to see if you can get the same quality with smaller file size. You should be able to shrink the file size quite a bit while retaining the quality.
Another trick: use thumbnails. Save two versions of each image, one 25% the size (by resolution) of the other. Your site visitors can click the thumbnail if they want to see more. This will speed up loading times and reduce your bandwidth bill.
Are your images properly sized? If you're displaying them on your webpage at 300x300 pixels, make sure the original image is the same size.
This helps two-fold: one, it's less data to download, and two, it doesn't require extra processing power for the browser to resize. Additionally, the image will look crisper if it's the exact size.
As for the difference between GIF and JPG (with the exception of transparency, which JPG doesn't support), it boils down to what your image contains. GIF uses a color table and a map into that color table to store the image, while JPG uses a lossy compression algorithm. So if your image contains little color variation, you will get a smaller file size from a GIF. Conversely, if you have a photograph with lots of color variation, you'll want to use a JPG.
Also take a look at YSlow
It will analyze your site for you and tell you where the bottlenecks might be.
Enable caching for image files (the example below also adds CSS and JS caching), which will ensure users don't download files twice. If you are using Apache 1.3 or 2 (this uses mod_expires):
ExpiresActive On
<FilesMatch "\.(ico|gif|jpe?g|png|js|css)$">
ExpiresDefault "access plus 1 year"
</FilesMatch>
It is probably a reasonable idea to enable gzip compression for HTML and CSS. In Apache 2 (mod_deflate):
SetOutputFilter DEFLATE
and in Apache 1.3 (mod_gzip):
mod_gzip_on Yes
It depends on how large the image is; if you're displaying wallpaper-sized images at 100 KB, that's not too bad. If you're displaying thumbnails of this size, then you have a problem.
JPEG images are lossy but can be compressed with ease. Depending on how much compression you select, you can really decrease image sizes with JPEG.
You should specify the height and width attributes on the img tag; see below for the W3Schools explanation of why. See http://www.codinghorror.com/blog/archives/000807.html for more radical techniques.
Tip: It is a good practice to specify both the height and width attributes for an image. If these attributes are set, the space required for the image is reserved when the page is loaded. However, without these attributes, the browser does not know the size of the image, and cannot reserve the appropriate space for it. The effect will be that the page layout will change during loading (while the images load).
Tip: Do not rescale images with the height and width attributes! Downsizing a large image with the height and width attributes forces a user to download the large image (even if it looks small on the page). The correct method is to rescale the image with a program, before using it on a page.
This is copied from http://www.w3schools.com/tags/att_img_height.asp
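Applied to the image from the question, that advice looks like this (assuming the file really is 300x300 pixels):
<img src="images/mainImg_3.gif" width="300" height="300" alt="">
The browser can then reserve the 300x300 box before the GIF arrives, so the layout doesn't shift while loading.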
The fastest-loaded resource is always the resource that doesn't need to be loaded at all. I.e., apart from shrinking your images to reasonable sizes, you should read about HTTP caching.
You should instruct your web server to deliver responses with proper caching information so that user agents may reuse locally cached versions.
Mark Nottingham wrote a tutorial about HTTP caching. It's a good starting point.
And this is a tutorial about Apache configuration for HTTP caching.
Look into the free tool Smush.it!
It's co-developed by Stoyan Stefanov and Nicole Sullivan (of Yahoo!) and incorporates every image tidbit from YSlow (from Yahoo!) and the Yahoo! Developer Network findings.
It will analyze your image(s) and determine, using a set of server-side tools, what the optimal image type is (e.g. PNG8, PNG24, GIF, JPG, etc.) and also create the optimized image... e.g. even if you feed it a PNG image, it will find the best method to compress it and "smush it" to its smallest possible file size.
Then take the output image, and then serve it up from a (cookie-less domain) if you can, preferably on a CDN, with far-future expires headers, with gzip compression.