To reduce the number of requests on the server, I have embedded some images (PNG & SVG) as Base64 directly into the CSS. (It's automated in the build process.)
like this:
background: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAFWHRTb2Z0d2FyZQBBZG etc...);
Is this a good practice? Are there reasons to avoid it? Are there any major browsers that don't support data URLs?
Bonus question:
Does it make sense to do this for the CSS & JS also?
Is this a good practice? Are there reasons to avoid it?
It's usually good practice only for very small CSS images that will be used together (like CSS sprites), when IE compatibility doesn't matter and saving the request is more important than cacheability.
It has a number of notable downsides:
Doesn't work at all in IE6 and 7.
In IE8, works only for resources up to 32 KB in size. This limit applies after Base64 encoding; in other words, no more than 32768 characters.
It saves a request but bloats the HTML page (or stylesheet) instead, and makes the images uncacheable: they get loaded every time the containing page or style sheet is loaded.
Base64 encoding bloats image sizes by 33% (see the sketch after this list).
If served in a gzipped resource, data: images are almost certainly going to be a terrible strain on the server's resources: image data is traditionally very CPU-intensive to compress, with very little reduction in size.
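To see that 33% figure for yourself, here is a quick Node.js sketch; the image file name is just a placeholder:

    // base64-overhead.js: measure the Base64 size penalty for one file.
    const fs = require('fs');

    const binary = fs.readFileSync('icon.png');   // placeholder file name
    const base64 = binary.toString('base64');

    console.log('binary bytes :', binary.length);
    console.log('base64 chars :', base64.length);
    // Prints roughly 33%: Base64 maps every 3 bytes to 4 ASCII characters.
    console.log('overhead     :',
      ((base64.length / binary.length - 1) * 100).toFixed(1) + '%');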
Common answers here seem to suggest this is not needed, for a set of legitimate reasons.
However, all of these seem to neglect modern app behavior and build processes.
It's not impossible (and actually quite easy) to design a simple build step that walks through a folder of images and generates a single CSS file containing all the images in that folder.
This CSS file will be fully cached and will dramatically reduce round trips to the server, which, as correctly suggested by @MemeDeveloper, is one of the biggest performance hits.
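As an illustration, here is a minimal Node.js sketch of such a build step; the folder name, MIME map, and class-name scheme are placeholder choices, not a standard tool:

    // images-to-css.js: embed every image in a folder as a data URI.
    const fs = require('fs');
    const path = require('path');

    const IMAGE_DIR = 'images';   // placeholder folder
    const MIME = { '.png': 'image/png', '.svg': 'image/svg+xml' };

    const rules = fs.readdirSync(IMAGE_DIR)
      .filter(file => MIME[path.extname(file)])
      .map(file => {
        const data = fs.readFileSync(path.join(IMAGE_DIR, file)).toString('base64');
        const cls = '.img-' + path.basename(file, path.extname(file));
        return cls + ' { background: url(data:' + MIME[path.extname(file)] +
               ';base64,' + data + '); }';
      });

    // One cacheable stylesheet instead of one request per image.
    fs.writeFileSync('images.css', rules.join('\n'));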
Sure, it's a hack, no doubt, just as sprites are a hack. In a perfect world this wouldn't be needed; until then, it's a viable practice if what you need to fix is:
A page with multiple images that are not easily "spritable".
Round trips to the server are an actual bottleneck (think mobile).
Speed (down to the millisecond) is really that important for your use case.
You don't care (as you should, if you want the web to move forward) about IE5 and IE6.
That's my view.
It's not a good practice. Some browsers do not support data URIs (e.g. IE 6 and 7), or support is limited (e.g. 32 KB for IE8).
See also the Wikipedia article for complete details on the disadvantages of data URIs:
Data URI scheme
Disadvantages
Data URIs are not separately cached from their containing documents (e.g. CSS or HTML files) so data is downloaded every time the containing documents are redownloaded.
Content must be re-encoded and re-embedded every time a change is made.
Internet Explorer through version 7 (approximately 15% of the market as of January 2011), lacks support.
Internet Explorer 8 limits data URIs to a maximum length of 32 KB.
Data is included as a simple stream, and many processing environments (such as web browsers) may not support using containers (such as multipart/alternative or message/rfc822) to provide greater complexity such as metadata, data compression, or content negotiation.
Base64-encoded data URIs are 1/3 larger in size than their binary equivalent. (However, this overhead is reduced to 2-3% if the HTTP server compresses the response using gzip)
Data URIs make it more difficult for security software to filter content.
I was using data URIs for about a month, and I've just stopped using them because they made my stylesheets absolutely enormous.
Data URIs do work in IE6/7 (you just need to serve an MHTML file to those browsers).
The one benefit I got from using data URIs was that my background images rendered as soon as the stylesheet was downloaded, as opposed to the gradual loading we see otherwise.
It's nice that we have this technique available, but I won't be using it too much in the future. I do recommend trying it out, though, just so you know for yourself.
I'd be more inclined to use CSS sprites to combine the images and save on requests. I've never tried the Base64 technique, but it apparently doesn't work in IE6 and IE7. It also means that if any image changes, you have to redeliver the whole lot, unless you have multiple CSS files, of course.
I have no idea about general best practices but I for one would not like to see that kind of thing if I could help it. :)
Web browsers and servers have a whole load of caching machinery built in, so I would have thought your best bet was to just get your server to tell the client to cache image files. Unless you have loads of really small images on a page, the overhead of multiple requests shouldn't be a big deal. Browsers generally reuse the same connection to request many files, so no new network connections are established; unless the volume of traffic in HTTP headers is significant compared to the size of the image files, I wouldn't worry about multiple requests too much.
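For what it's worth, here is a minimal sketch of "telling the client to cache" using Node's built-in http module; the one-year max-age and the naive file handling are illustrative assumptions, not production code:

    // cache-images.js: serve PNGs with a far-future Cache-Control header.
    const http = require('http');
    const fs = require('fs');

    http.createServer((req, res) => {
      // Sketch only: no path sanitising, PNGs only, everything else 404s.
      if (req.url.endsWith('.png') && fs.existsSync('.' + req.url)) {
        res.writeHead(200, {
          'Content-Type': 'image/png',
          // Cache for a year: the browser won't re-request until then.
          'Cache-Control': 'public, max-age=31536000'
        });
        fs.createReadStream('.' + req.url).pipe(res);
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(8080);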
Are there reasons why you think there are too many requests going to the server at the moment?
I would suggest it for tiny images that are used very often, for example common icons of a web application.
Tiny, because Base64 encoding increases the size.
Often used, because that justifies the longer initial load time.
Of course, support problems with older browsers have to be kept in mind. It might also be a good idea to use the capability of a framework to automatically inline the images as data URLs, such as GWT's ClientBundle, or at least to use CSS classes instead of adding the image to the element's style directly.
More information is gathered here: http://davidbcalhoun.com/2011/when-to-base64-encode-images-and-when-not-to/
Related
Google PageSpeed complains when you have blocking CSS in an external file. In HTTP/1 this probably makes sense, but what about now, with HTTP/2?
If you inline critical CSS (above the fold), those bytes still need to be downloaded, parsed and everything else, all before the document renders.
With HTTP/2, there is no need to make another connection, since the existing one can be reused, so that is not an overhead. Plus, with server push you can even push the CSS file before it's requested.
So... is inlining critical CSS still a recommended thing?
I agree that on heavy sites you probably don't want to download all the CSS. For example, if you are visiting the gallery, you only need gallery.css, not profile.css, not forum.css, etc. But that is manageable with chunks and other techniques (while still using external CSS files, no need to inline them).
Inlining also makes the CSS uncacheable.
Am I missing something?
This has nothing to do with the possible duplicate question. Whoever marked this as a duplicate has no idea what critical CSS is, or probably didn't even read this question.
Yes it can still help. A lot.
When you download the HTML you still need to wait for that to download, then process it, and then make another request for the CSS file(s) and wait for those to download.
While downloading additional resources is quicker under HTTP/2, and there is not as much of a bottleneck when you have a lot of additional resources to download, the CSS file still can't be requested until the HTML file has been downloaded and processed. Additionally, the CSS file is usually prioritised by the browser, since it's render-blocking, so it will usually be one of the first resources requested, meaning the avoidance of head-of-line blocking in HTTP/2 is not as beneficial for CSS resources.
When HTTP/2 push becomes more commonplace it may not have as much impact, as a request for the HTML page can also push the CSS file needed for it, but that adds complexity and there are still some questions about how it will work (e.g. if the browser already has the CSS file, the server should somehow know not to push it).
I wrote a blog post on this topic if you want more detail (and it is on an HTTP/2 site): https://www.tunetheweb.com/blog/inlining-css-is-not-for-me/ . I'm still not a big fan of this process, as I explain in that post...
Don't inline it if it doesn't make sense. I guess inlining ten lines of CSS won't kill you, but inlining the equivalent of 18 kB of gzip-compressed CSS is just madness.
Just use HTTP/2 Push to be sure the browser gets the CSS as early as possible.
Worst case, HTTP/2 push will push the resource multiple times, but browsers reset pushed streams that they consider fresh in cache (use ETags). So the worst case of HTTP/2 push is still slightly better than inlining.
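For the curious, a minimal sketch of server push using Node's built-in http2 module; the certificate paths and file names are placeholders:

    // push-css.js: push the stylesheet along with the HTML response.
    const http2 = require('http2');
    const fs = require('fs');

    const server = http2.createSecureServer({
      key: fs.readFileSync('server.key'),    // placeholder paths
      cert: fs.readFileSync('server.crt')
    });

    server.on('stream', (stream, headers) => {
      if (headers[':path'] === '/') {
        // Push the CSS before the browser even asks for it.
        stream.pushStream({ ':path': '/style.css' }, (err, pushStream) => {
          if (err) return;
          pushStream.respond({ ':status': 200, 'content-type': 'text/css' });
          pushStream.end(fs.readFileSync('style.css'));
        });
        stream.respond({ ':status': 200, 'content-type': 'text/html' });
        stream.end(fs.readFileSync('index.html'));
      }
    });

    server.listen(8443);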
The cache issue with HTTP/2 push is largely mitigated by a cache-digests polyfill (until the real thing is available in browsers):
https://www.shimmercat.com/en/blog/articles/cache-digests/
For a quick video introduction to cache digests: https://www.youtube.com/watch?v=Zq1YF3ri98k
We use it in production and we are very satisfied.
As a general note, take automatic optimization recommendations with a grain of salt until these tools complete their transition to HTTP/2.
Am I missing something?
You're right in general, for modern browsers and HTTP/2. For mobile and old browsers, on GPRS or other slow, high-latency connections, not exactly. In some cases you can gain an advantage from fewer documents being downloaded and from faster browser parsing by inlining the CSS. Additionally, if you're dynamically adding the CSS and something goes wrong, the inlined CSS still works; the same holds for any resource that can be inlined in HTML.
It's not really about downloading all the files, it's about how long it takes to download those files.
If you use critical CSS, the design is shown immediately, so it's really faster than first downloading the CSS files: the critical CSS is directly in the <head> of your page, so the page renders right away without blocking resources (e.g. downloading the big CSS file).
If you have a CSS file that is, for example, 5 MB, the browser first has to download that file before the design is shown.
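To make the idea concrete, here is a minimal Node.js build-step sketch that inlines the critical rules into the <head> and defers the full stylesheet; the file names and the preload/onload pattern are illustrative choices, and dedicated tools do this more robustly:

    // inline-critical.js: inject critical CSS at build time.
    const fs = require('fs');

    const criticalCss = fs.readFileSync('critical.css', 'utf8');
    const html = fs.readFileSync('index.html', 'utf8');

    // Inline the critical rules; load the full stylesheet without blocking.
    const output = html.replace(
      '</head>',
      '<style>' + criticalCss + '</style>\n' +
      '<link rel="preload" href="/full.css" as="style" ' +
      'onload="this.rel=\'stylesheet\'">\n</head>'
    );

    fs.writeFileSync('index.build.html', output);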
In order to find a good internship, I am engaged in answering these questions. There is one question I am not clear about and have never heard relevant information on before, so I want to know: what is progressive rendering?
Progressive rendering is the name given to techniques used to render content for display as quickly as possible.
It used to be much more prevalent in the days before broadband internet, but it's still useful in modern development, as mobile data connections are increasingly popular (and unreliable!).
Examples of such techniques:
Lazy loading of images, where (typically) some JavaScript loads an image when it comes into the browser's viewport instead of loading all images at page load (see the sketch after this list).
Prioritizing visible content (or above-the-fold rendering), where you include only the minimum CSS/content/scripts necessary for the part of the page that is rendered first in the user's browser, so it displays as quickly as possible; you can then use deferred JavaScript (DOMContentLoaded/load) to load other resources and content.
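To make the lazy-loading item concrete, here is a minimal browser-side sketch using the (modern) IntersectionObserver API; the data-src attribute convention is an assumption:

    // lazy-load.js: load each image only when it scrolls into view.
    // Assumes markup like <img data-src="photo.jpg"> as a placeholder.
    const observer = new IntersectionObserver(entries => {
      for (const entry of entries) {
        if (entry.isIntersecting) {
          const img = entry.target;
          img.src = img.dataset.src;   // swap in the real image
          observer.unobserve(img);     // load it only once
        }
      }
    });

    document.querySelectorAll('img[data-src]')
      .forEach(img => observer.observe(img));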
This pertains to the idea of sending data in chunks when doing server-side rendering, rather than all at once. The advantage is a reduced TTFB (time to first byte), i.e. the period between the browser making the HTTP request and receiving the first byte.
See the 'JSConf' talk below for a detailed explanation:
https://www.youtube.com/watch?v=aRaQe9n1lPk
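A minimal Node.js sketch of the chunked approach; the timer merely simulates a slow data fetch:

    // chunked-render.js: flush the <head> early, stream the rest later.
    const http = require('http');

    http.createServer((req, res) => {
      res.writeHead(200, { 'Content-Type': 'text/html' });
      // Flush the <head> immediately so the browser can start fetching
      // the stylesheet while the rest of the page is still being built.
      res.write('<html><head><link rel="stylesheet" href="/app.css">' +
                '</head><body>');
      setTimeout(() => {   // stand-in for a slow database call
        res.write('<main>...content...</main>');
        res.end('</body></html>');
      }, 500);
    }).listen(8080);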
Progressive rendering is a concept to render content to the browser optimally by rendering the critical content first and the non-critical parts progressively as needed by the user. This article explains it very nicely.
I'm testing a website's speed using the PageSpeed Insights tool.
In the results page, one of the warnings suggested that I reduce the byte size of my CSS, HTML and JS files.
At first I tried to remove comments, but nothing changed.
How can I do that?
Should I remove spaces and tabs?
It seems like a very time-consuming operation; is it worth it?
The act of removing spaces, tabs and other useless characters is called minification.
You don't need to do it by hand; there are a lot of services that can minify files for you. For example:
http://www.willpeavy.com/minifier/
Be careful if you have jQuery code: sometimes minifiers remove spaces in the wrong place.
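To show what minification actually does, here is a deliberately naive Node.js sketch; real minifiers parse the code properly, and crude regexes like these are exactly how spaces end up removed in the wrong place:

    // naive-minify.js: a toy CSS minifier (do not use on real projects).
    const fs = require('fs');

    const css = fs.readFileSync('styles.css', 'utf8');   // placeholder name
    const minified = css
      .replace(/\/\*[\s\S]*?\*\//g, '')   // strip comments
      .replace(/\s+/g, ' ')               // collapse whitespace
      .replace(/\s*([{}:;,])\s*/g, '$1')  // drop spaces around punctuation
      .trim();

    fs.writeFileSync('styles.min.css', minified);
    console.log(css.length + ' -> ' + minified.length + ' bytes');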
You have two things to do to reduce page size:
Minify CSS & JS files
On the server side, if you are running your website via Apache, you can install APC for page caching. You'll get better performance.
APC
In addition to the CSS minifier/prettifier tools above, I recommend proCSSor for optimizing CSS files. It offers a variety of advanced options.
I've never found those tools to be much use beyond giving some tips about what might be slowing things down. Minifying is unlikely to achieve much. If you want to speed up your site, save the page and see what the largest files are. Generally they will be the image files rather than the code; see if you can reduce those.
Also, try testing it on two servers: is your host slow?
If your HTML file is massive, that suggests a problem with the site's structure; it is rare that a page needs to be that large.
Finally, large JavaScript files are most likely to be things like jQuery. If Google hosts these, use the hosted version. That way, it will probably already be in the user's cache and not impact your loading time.
EDIT, after further testing and incorporating the issues discussed in the comments below:
PageSpeed Insights is an utterly amateurish tool, and there are much more effective ways to speed up the rendering time than minifying the code.
As a matter of standard, it advises reducing HTML, CSS and JS file sizes if they are not minified. A much, much better tool is the Pingdom Website Speed Test, which compares rendering speed to the average of the sites it is asked to test and gives the download times of the site's components.
Just test www.gezondezorg.org on both and see the enormous difference in test results, on which the Google tool is dead wrong. It advises reducing the CSS and JS files, while its own figures (click the respective headers) show that doing so would reduce their sizes by 3.8 and 7.9 kB, respectively. That comes down to less than 1 millisecond of download-time difference! (1 millisecond = 1/1000 of a second; broadband internet presumed.)
It also says that I did do a good thing: enable caching. That is BS as well, because my .htaccess file tells browsers to check for newly updated files on every visit, and to refresh cached files whenever they have been updated. Tests confirm that all browsers heed that instruction.
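For reference, that "check on every visit" scheme corresponds to ETag revalidation. A minimal Node.js sketch, with a placeholder file name:

    // revalidate.js: the browser revalidates and gets a cheap 304
    // whenever the file has not changed.
    const http = require('http');
    const fs = require('fs');
    const crypto = require('crypto');

    http.createServer((req, res) => {
      const body = fs.readFileSync('page.html');
      const etag = '"' + crypto.createHash('md5').update(body).digest('hex') + '"';

      if (req.headers['if-none-match'] === etag) {
        res.writeHead(304);   // not modified: no body is sent
        res.end();
        return;
      }
      res.writeHead(200, {
        'Content-Type': 'text/html',
        'Cache-Control': 'no-cache',   // cache, but revalidate every time
        'ETag': etag
      });
      res.end(body);
    }).listen(8080);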
Furthermore, that site is not intended to be viewed on mobile phones; there is just way too much text on it for that. Nevertheless, PageSpeed Insights defaults to showing the results of testing against mobile-phone criteria.
More effective ways to speed up the rendering
So, minifying hardly does anything to speed up the rendering time. What does help is the following:
Put your CSS code and JavaScript in as few files as possible, ideally one file each; that saves browser-to-server (BTS) requests (a sketch follows this list). (Do keep in mind that quite a number of JavaScripts need the DOM to be fully loaded first, so in practice it comes down to putting the scripts in at most two files: a pre-body and a post-body file.)
Optimize large images for the web. Photoshop and the like even have a special function for that, reducing the file size while keeping the quality good enough for web use.
For images that serve as full-size backgrounds for containers, use image sprites. That saves BTS requests as well.
Code the HTML and JS files so that there is no rendering dependency on files from external domains, such as Twitter, Facebook, Google Analytics, advertising agencies, etc.
Make sure to get a web host that responds swiftly, has sufficient processing capacity, and has (almost) 100% up-time.
Use vanilla/native JS as much as possible. Use jQuery or other libraries only for tasks that would otherwise be too difficult or too time-consuming. jQuery is not only an extra file to download, it is also processed more slowly than native JS.
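A minimal Node.js sketch of the file-combining step from the first item above; the file names and their order are placeholders:

    // combine.js: concatenate several CSS files into one (the same idea
    // works for the pre-body and post-body JS bundles).
    const fs = require('fs');

    const cssFiles = ['reset.css', 'layout.css', 'theme.css'];  // order matters
    const combined = cssFiles
      .map(file => '/* ' + file + ' */\n' + fs.readFileSync(file, 'utf8'))
      .join('\n');

    fs.writeFileSync('site.css', combined);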
Lastly, you should realize that:
having the server minify the code on the fly generally results in a much slower response from the server;
minifying code makes it unreadable;
de-minifying tools are notorious for their poor performance.
Minifying resources refers to eliminating unnecessary bytes, such as extra spaces, line breaks, and indentation. Compacting HTML, CSS, and JavaScript can speed up downloading, parsing, and execution time. In addition, for CSS and JavaScript, it is possible to further reduce the file size by renaming variable names as long as the HTML is updated appropriately to ensure the selectors continue working.
You can find plenty of online tools for this purpose, a few of them are below.
HTML Minify
CSS Minify
JS Minify
good luck!
Sometimes, on completely valid browsers but over a hindered Internet connection, the webpage loads without some of the external CSS files, resulting in an ugly page.
Is there a way to prevent this without resorting to embedding all of the CSS in the HTML?
I guess you might be hitting a timeout when fetching the CSS file. You could try caching the CSS file on the client side by using far-future headers, and minify the CSS so it has a small file size and can be grabbed quickly.
Try to use as few CSS files as possible, because every single CSS file triggers a separate HTTP request; fewer CSS files means fewer HTTP requests, which automatically increases speed. And minify the CSS as well.
A general rule of performance is to reduce the number of HTTP transactions. This is particularly important in these days of add-ins. Each HTTP transaction adds an overhead of about 1 kB up and down in headers; it adds load on the server and delays rendering. It also opens up the risk of network timeouts, especially a problem on 3G phone networks.
Regarding CSS, it's better to have a single larger file than lots of smaller ones, to avoid exactly the problems you're experiencing. If you minify the file (even without otherwise optimising it), that will also get rid of the comments and whitespace.
Similarly, it's worth combining jQuery add-ins into a single file for the same reasons.
Does anyone know how I can reduce my page size while it's loading? For example, is there any way I can reduce my CSS file's size and my JavaScript file's size?
Gzip compression
Minification
Combining multiple requests into one
GZIP compression is effective on text-based responses such as stylesheets and HTML, but not images.
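A minimal Node.js sketch of gzipping such a text response with the built-in zlib module; the file name is a placeholder:

    // gzip-response.js: compress the stylesheet when the client allows it.
    const http = require('http');
    const fs = require('fs');
    const zlib = require('zlib');

    http.createServer((req, res) => {
      const gzip = (req.headers['accept-encoding'] || '').includes('gzip');
      res.writeHead(200, {
        'Content-Type': 'text/css',
        ...(gzip && { 'Content-Encoding': 'gzip' })
      });
      const file = fs.createReadStream('styles.css');
      (gzip ? file.pipe(zlib.createGzip()) : file).pipe(res);
    }).listen(8080);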
Minification is effective on CSS and Javascript (more so on Javascript). There are various solutions available depending on your language and framework. This obfuscates the code that is sent by removing whitespace and shortening identifier names where possible.
Combining multiple requests into one cuts down a lot on latency, even if it doesn't save much bandwidth. It results in a faster perceived load time for users. You can combine multiple CSS files into one, which is easy. With images, you can use a technique known as sprites.
YUI's Compressor is one option.
YSlow is decent for getting an idea of why it's slow.
It does tend to make recommendations for large sites rather than small sites, so apply common sense.
Do you mean bytes transferred? If so, you would use GZIP compression of the HTML. In Apache, use mod_deflate. There are also servlet filters out there.
w3compiler is a tool that will do exactly what you want.
There is a great book on this called High Performance Web Sites. It's an O'Reilly book that takes you through the various steps of client-side performance improvements. It covers the topics you mention, such as CSS and JS file sizes, among other things.