MY CONDITION
I have multiple pages, index.html and menu.html, and a single stylesheet, style.css.
More than half of the styling code needed for index.html is not needed in menu.html.
MY CONFUSION
Should I create three CSS files: index.css, global.css, and menu.css? Each stylesheet would be focused on its corresponding HTML document, with global.css for the global styling.
MY DOUBT
Will doing so affect my page load performance or not? Here we are just loading multiple files, but the total number of lines has decreased significantly.
As Heretic Monkey pointed out, website performance is a very complicated topic and there are no easy answers. Sometimes a single large bundle is better; sometimes a few smaller files are.
If you would like to split your CSS into multiple files during development but still publish it as a single file on the website, you could use a CSS preprocessor like Sass. Check it out: https://sass-lang.com/guide
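For instance, here is a minimal sketch of that workflow using the sass package's JavaScript API; the partial and entry file names are assumptions for the example, not from the question:

// build-css.js - a sketch, assuming "npm install sass" and that the styles
// have been split into Sass partials (_global.scss, _index.scss, _menu.scss)
// pulled in by a main.scss entry file via @use.
const fs = require("fs");
const sass = require("sass");

// Compile the split-up sources into one compressed stylesheet for publishing.
const result = sass.compile("main.scss", { style: "compressed" });
fs.writeFileSync("style.css", result.css);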
Anyway, it sounds like your project is pretty small, so I would not worry about this kind of optimization yet. It is pretty crazy how much CSS (and how many other resources) modern websites use.
For example, this is one of four CSS bundles that this Stack Overflow page uses: https://cdn.sstatic.net/Sites/stackoverflow/primary.css?v=c05ce93d5306
More important on the performance side is that the resources (CSS and others) are cached properly. That means the user only needs to download the CSS file once; after that, the browser will use a locally cached copy of the file. Caching can be configured on your web server or the hosting service you are using.
Keep in mind, though, that caching can be a bit risky: if it is configured wrong, your visitors might end up running old styles or JavaScript. For example, here is a good read on the topic: https://simonhearne.com/2022/caching-header-best-practices/
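As a rough illustration of one common safer pattern (long cache lifetimes combined with fingerprinted file names), here is a sketch using Express; the paths and max-age value are made up for the example:

// A sketch, assuming an Express app; "public" and the 30-day lifetime are
// illustrative. Putting a fingerprint in the file name (e.g.
// style.c05ce93d.css) lets a new deploy bypass stale cached copies.
const express = require("express");
const app = express();

app.use(express.static("public", { maxAge: "30d" }));
app.listen(3000);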
I like the idea of encapsulating my CSS into separate files. This also brings the added advantage of being able to easily minify the CSS. But I know performance is negatively impacted by the overhead needed to pull these separate files from the server.
To address the latter point, people often suggest inlining the style, or at least putting the CSS in the HEAD of the HTML document. I'm not going to inline because then editing the style becomes a nightmare. I can consider putting it in the head to increase performance, but I do not want to put it there minified: I won't be able to read it, and it will be a pain to adjust the CSS once minified.
So my question is: which of these two is the better option, in terms of performance?
Minified external CSS file
CSS placed in the HEAD but not minified
You are not considering browser-side caching in your evaluation. It is almost ALWAYS better to serve CSS in an external file when you will be using the same CSS throughout a multi-page website. The reason is that once the CSS is downloaded on the first page visit, assuming you have expiry headers set properly, the browser will not need to download it on subsequent page loads until the expiry TTL has passed. This even holds true across multiple user sessions, such that if a user visits the site some days or weeks later, they may not need to download the CSS at all. If you served in-page CSS, it would need to be downloaded on every page load.
Also, minifying typically does not give that big a performance boost, as most server-to-browser connections apply text compression to the transmitted content anyway.
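You can check this yourself with Node's built-in zlib by comparing gzipped sizes before and after a crude minification (the sample CSS string is invented):

const zlib = require("zlib");

const css = "body {\n  margin: 0;\n  color: #333;\n}\n";
// Crude "minification": collapse runs of whitespace.
const minified = css.replace(/\s+/g, " ").trim();

console.log("raw:     ", css.length, "bytes,", zlib.gzipSync(css).length, "gzipped");
console.log("minified:", minified.length, "bytes,", zlib.gzipSync(minified).length, "gzipped");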
Of course, it is also usually much easier to maintain CSS in an external file, as you have pointed out.
The best option would be to:
Minify them all and bundle them on the server side with something like Bundles for ASP.NET or brewer for Node.js; that way you remove the overhead you mentioned above.
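A minimal sketch of such server-side bundling in Node, in case it helps (the file names are invented, and real bundlers do far more than this):

const fs = require("fs");

// Concatenate the individual stylesheets, then strip comments and collapse
// whitespace so a single small file goes over the wire.
const files = ["reset.css", "layout.css", "theme.css"];
const bundle = files
  .map((f) => fs.readFileSync(f, "utf8"))
  .join("\n")
  .replace(/\/\*[\s\S]*?\*\//g, "") // drop /* ... */ comments
  .replace(/\s+/g, " ")             // collapse whitespace
  .trim();

fs.writeFileSync("bundle.min.css", bundle);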
To expand on my comment:
Generally, when optimising web page loading, you want to minimise the number of HTTP requests that the browser makes as these are expensive, time-wise; even requests for small files require the browser to send its request to a server, wait for the response, and then act accordingly. From that perspective, the best thing would be to put all the code for your page into a single file. However, this would be a page maintenance nightmare, and it also fails to take into account caching of resources by browsers, as covered by @MikeBrant.
A single CSS file (potentially composed of several concatenated, minified files) is a good compromise between separation of style (CSS) and content (HTML), and performance. The same applies to JavaScript. You can also consider using a content delivery network (CDN) for JavaScript if you're using a common library like jQuery, as the user's browser may already have the library cached from visiting another site. Google's CDN serves a number of useful libraries.
Generally, you'll get far bigger performance gains from optimising images, enabling server compression, and removing extraneous JavaScript than you will from minifying or inlining CSS. Images are almost always the "heaviest" elements of a page, and it is often very easy to reduce image size by 20-50% while maintaining decent quality.
I'm testing a website's speed using the PageSpeed Insights tool.
In the results page, one of the warnings suggested I reduce the byte size of the CSS, HTML, and JS files.
At first I tried removing comments, but nothing changed.
How can I do that?
Should I remove spaces and tabs?
It seems like a very time-consuming operation; is it worth it?
The action of removing spaces, tabs, and other useless characters is called minification.
You don't need to do it by hand; there are a lot of services that can minify files for you.
for example:
http://www.willpeavy.com/minifier/
Be careful if you have jQuery code: sometimes these tools remove spaces in the wrong places.
You have two things to do to reduce page size:
Minify CSS & JS files
On the server side, if you are running your website via Apache, you can install APC for page caching. You'll get better performance:
APC
In addition to the CSS minifier/prettifier tools above, I recommend using proCSSor for optimizing CSS files. It offers a variety of advanced options.
I've never found those tools to be much use beyond giving some tips on what might be slowing a site down. Minifying is unlikely to achieve much. If you want to speed up your site, save the page and see which files are the largest. Generally they will be the image files rather than the code; see if you can reduce those.
Also, try testing it on two servers: is your host slow?
If your html file is massive, that suggests a problem with the site's structure - it is rare that a page needs to be large.
Finally, large JavaScript files are most likely to be things like jQuery. If Google hosts these, then use the hosted version. That way, it will probably already be in a user's cache and not impact your loading time.
EDIT, after further testing and incorporating the issues discussed in the comments below:
PageSpeed Insights is an utterly amateurish tool, and there are much more effective ways to speed up rendering than minifying the code. As a matter of standard advice it tells you to reduce HTML, CSS, and JS file sizes if they are not minified. A much, much better tool is the Pingdom Website Speed Test, which compares a site's rendering speed to the average of the sites it has been asked to test and gives the download times of the site's components.
Just test www.gezondezorg.org on both, and see the enormous difference in test results, on which the Google tool is dead wrong. It advises reducing the CSS and JS files, while its own figures (click the respective headers) show that doing so would reduce their sizes by only 3.8 and 7.9 kB, respectively. That comes down to less than 1 millisecond of download time difference (assuming broadband internet)!
Also, it says I did do a good thing: enable caching. That is BS as well, because my .htaccess file tells browsers to check for newly updated files at every visit and to refresh cached files whenever they have been updated. Tests confirm that all browsers heed that instruction.
Furthermore, that site is not intended to be viewed on mobile phones; there is just way too much text on it for that. Nevertheless, PageSpeed Insights defaults to showing the results of testing against mobile-phone criteria.
More effective ways to speed up the rendering
So, minifying hardly does anything to speed up the rendering time. What does do that is the following:
Put your CSS code and JavaScript as much as possible into one file each. That saves browser-to-server (BTS) requests. (Do keep in mind that quite a number of JavaScripts need the DOM to be fully loaded first, so in practice it comes down to putting the scripts into at most two files: a pre-body and a post-body file.)
Optimize large images for the web. Photoshop and the like even have a special function for that, reducing the file size while keeping the quality good enough for use on the web.
For images that serve as full-size backgrounds for containers, use image sprites. That saves BTS requests as well.
Code the HTML and JS files so that there is no rendering dependency on files from external domains, such as from Twitter, Facebook, Google Analytics, advertisement agencies, etc.
Make sure to get a web-host that will respond swiftly, has a sufficient processing capacity, and has a(n almost) 100% up-time.
Use vanilla/native JS as much as possible; use jQuery or other libraries only for tasks that would otherwise be too difficult or too time-consuming. jQuery is not only an extra file to download, it is also processed more slowly than native JS (see the small example after this list).
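For example (an invented one-liner, not taken from any particular site):

// jQuery version: $("#menu").addClass("open");
// Native equivalent - no extra file to download or parse:
document.querySelector("#menu").classList.add("open");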
Lastly, you should realize that:
having the server minify the codes on the fly generally results in a much slower response from the server;
minifying a code makes it unreadable;
de-minifying tools are notorious for their poor performance.
Minifying resources refers to eliminating unnecessary bytes, such as extra spaces, line breaks, and indentation. Compacting HTML, CSS, and JavaScript can speed up download, parse, and execution time. In addition, for CSS and JavaScript, it is possible to further reduce the file size by shortening names, as long as the HTML is updated appropriately to ensure the selectors continue working.
You can find plenty of online tools for this purpose, a few of them are below.
HTML Minify
CSS Minify
JS Minify
good luck!
After reading up on critical path css, I was wondering how I could embed this into my builds. Are there any finished tools out there that does this already? The process needs to be automatable to avoid the inline CSS getting out of sync with other CSS.
If there is no such tool today, I can see how I could make one (say, a Grunt plugin) using this experimental script together with PhantomJS, but there is no point in re-inventing the wheel (if there is one already).
I had exactly the same idea - if you're still looking, I built exactly what we both wanted:
Critical Path CSS Generator. (I didn't end up using the tool you linked to, since it misses pseudo-selectors, media queries, non-webkit-prefixed CSS rules, etc.)
More documentation is on the way, but basically just install PhantomJS first and then call the script like this:
phantomjs penthouse.js http://yourSite.com/page1 yourSite.css > yourSite-criticalcss-page1.css
phantomjs penthouse.js http://yourSite.com/page2 yourSite.css > yourSite-criticalcss-page2.css
You can pass in minified CSS as well as unminified; I don't modify the CSS except for removing unmatched selectors and rules (and I remove comments).
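(Penthouse has since also been published as an npm package, so if you would rather call it from Node than shell out to PhantomJS, here is a minimal sketch, with file names matching the commands above:)

const fs = require("fs");
const penthouse = require("penthouse"); // npm install penthouse

penthouse({
  url: "http://yourSite.com/page1",
  css: "yourSite.css", // path to the full stylesheet on disk
}).then((criticalCss) => {
  fs.writeFileSync("yourSite-criticalcss-page1.css", criticalCss);
});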
Use IISpeed or the Apache/Nginx PageSpeed modules
Google maintains some wonderful modules called PageSpeed that work with Apache and Nginx front servers. For those on .NET, just use IISpeed, the IIS equivalent of the PageSpeed modules. It is commercial and costs $100, but it is quite marvelous from a front-end perspective in what it does, and (among lots of other things) it handles the main problem with using Penthouse: dealing with changing/dynamic content generation.
It works by injecting some JavaScript into the head of the page for some of its first visitors, analysing which CSS rules are actually being used. After some rounds, it collects these CSS rules and injects them as inline CSS in the head of that page for all subsequent visitors.
This is totally automatic and works on any ASP.NET page. You then avoid having to manually run Penthouse (mentioned above) on every page you want to speed up, and having to remember to keep that CSS up to date (otherwise it will go stale at some point, messing up your styles).
Penthouse is still great for pages where the content is mostly static.
Is there an online tool into which we can input the HTML source of a page, and that will minify the code?
I would do that for .aspx files, as it is not a good idea to make the web server gzip them...
Perhaps try HTML Compressor; it has published before-and-after results showing what it can do (including for Stack Overflow itself).
It features many options for optimizing your pages, up to and including script minification (YUI Compressor, Google Closure Compiler, or your own compressor) where it would be safe. The default option set is quite conservative, so you can start with that and experiment with enabling more aggressive options.
The project is extremely well documented and supported.
Don't do this. Or rather, if you insist on it, do it after any more significant site optimizations are complete. Chances are very high that the cost/benefit for this effort is negligible, especially if you were planning to manually use online tools to deal with each page.
Use YSlow or Page Speed to determine what you really need to do to optimize your pages. My guess is that reducing bytes of HTML will not be your site's biggest problem. It's much more likely that compression, cache management, image optimization, etc will make a bigger difference to the performance of your site overall. Those tools will show you what the biggest problems are -- if you've dealt with them all and still find that HTML minification makes a significant difference, go for it.
(If you're sure you want to go for it, and you use Apache httpd, you might consider using mod_pagespeed and turning on some of the options to reduce whitespace, etc., but be aware of the risks.)
Here is a short answer to your question: you should minify your HTML, CSS, and JS. There is an easy-to-use tool called Grunt that lets you automate a lot of tasks, among them JS, CSS, and HTML minification, file concatenation, and many others.
The answers written here are extremely outdated, or sometimes don't even make sense. A lot has changed since way back in 2009, so I will try to answer this properly.
Short answer: you should definitely minify your HTML. It is trivial today and gives roughly a 5% speedup. For the longer answer, read the whole thing below.
Back in the old days, people minified CSS/JS manually (by running it through some specific tool to minify it). It was kind of hard to automate the process and definitely required some skill. Knowing that a lot of high-profile sites even now do not use gzip (which is trivial), it is understandable that people were reluctant to minify HTML.
So why were people minifying JS, but not HTML? When you minify JS, you do the following things:
remove comments
remove blanks (tabs, spaces, newlines)
change long names to short (var isUserLoggedIn to var a)
This gave a lot of improvement even in the old days. But in HTML you could not change long names for short ones, and there was almost nothing to comment in the first place. So the only thing left was to remove spaces and newlines, which gives only a small improvement.
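To make that concrete, here is an invented before/after (the function names are made up; stubs are included so the snippet runs):

// Stubs for the example:
function checkSession() { return true; }
function showDashboard() { console.log("dashboard"); }

// Before minification:
//   // show the dashboard to logged-in users
//   var isUserLoggedIn = checkSession();
//   if (isUserLoggedIn) { showDashboard(); }

// After: comments and blanks removed, the long name shortened.
var a=checkSession();if(a)showDashboard();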
One wrong argument made here is that because content is served with gzip, minification does not make sense. This is totally wrong. Yes, gzip decreases the improvement from minification, but why should you gzip comments and whitespace if you can properly trim them and gzip only the important part? It is the same as having a folder to archive that contains some junk you will never use, and deciding to just zip it instead of cleaning it out first and then zipping it.
Another argument for why minification is pointless is that it is tedious. Maybe that was true in 2009, but new tools have appeared since. Right now you do not need to minify your markup manually. With something like Grunt it is trivial to install grunt-contrib-htmlmin (which relies on HTMLMinifier by @kangax) and configure it to minify your HTML. All you need is about two hours to learn Grunt and configure everything, and then it all happens automatically in under a second. That one second (which you can even reduce to nothing with grunt-contrib-watch) does not sound so bad for roughly 5% improvement (even with gzip).
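For reference, a minimal Gruntfile sketch for grunt-contrib-htmlmin; the src/dist paths are placeholders, while removeComments and collapseWhitespace are the plugin's documented options:

// Gruntfile.js
module.exports = function (grunt) {
  grunt.initConfig({
    htmlmin: {
      dist: {
        options: { removeComments: true, collapseWhitespace: true },
        files: { "dist/index.html": "src/index.html" },
      },
    },
  });
  grunt.loadNpmTasks("grunt-contrib-htmlmin");
  grunt.registerTask("default", ["htmlmin"]);
};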
One more argument is that CSS and JS are static, while HTML is generated by the server, so you cannot pre-minify it. That was also true in 2009, but nowadays more and more sites look like single-page apps, where the server is thin and the client does all the routing, templating, and other logic. The server only sends JSON and the client renders it. Here you have a lot of HTML for the page and its various templates, all of which can be minified ahead of time.
So to finish my thoughts:
Google is minifying its HTML.
PageSpeed asks you to minify your HTML.
It is trivial to do.
It gives roughly 5% improvement.
It is not the same as gzip.
I wrote a web tool to minify HTML. http://prettydiff.com/?m=minify&html
This tool operates using these rules:
All HTML comments are removed
Runs of white space characters are converted to single space characters
Unnecessary white space characters inside tags are removed
White space characters between two tags, where one of the two tags is not a singleton, are removed
All content inside a style tag is presumed to be CSS and is minified as such
All content inside a script tag is presumed to be JavaScript, unless provided a different media type, and then minified as such
The CSS and JavaScript minification uses a heavily forked form of JSMin. This fork is extended to support CSS natively and also SCSS syntax. Automatic semicolon insertion is supported for JavaScript minification; however, automatic curly-brace insertion is not yet supported.
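To give a rough idea of what rules 1, 2, and 4 do, here is a toy sketch (not Pretty Diff's actual implementation; it ignores the singleton check and the style/script rules):

function naiveHtmlMin(html) {
  return html
    .replace(/<!--[\s\S]*?-->/g, "") // rule 1: strip HTML comments
    .replace(/\s+/g, " ")            // rule 2: collapse runs of white space
    .replace(/> </g, "><")           // rule 4, crudely: drop space between tags
    .trim();
}

console.log(naiveHtmlMin("<p>\n  Hello <!-- hi -->\n</p>\n<div> </div>"));
// -> "<p> Hello </p><div></div>"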
This worked for me:
http://minify.googlecode.com/git/min/lib/Minify/HTML.php
It's not a ready-made online tool, but since it's a simple PHP include, it's easy enough to run yourself.
I would not save the compressed files, though; do this dynamically if you really have to, and it's always a better idea to enable gzip server compression.
I don't know how involved that is in IIS/.NET, but in PHP it's as trivial as adding one line to the global include file.
CodeProject has a published sample project (http://www.codeproject.com/KB/aspnet/AspNetOptimizer.aspx?fid=1528916&df=90&mpp=25&noise=3&sort=Position&view=Quick&select=2794900) to handle some of the following situations...
Combining ScriptResource.axd calls into a single call
Compress all client side scripts based on the browser capability including gzip/deflate
A ScriptMinifier to remove comments, indentations, and line breaks.
An HTML compressor to compress all html markup based on the browser capability including gzip/deflate.
And, most importantly, an HTML minifier to write the complete HTML onto a single line and minify it as far as possible (under construction).
For the Microsoft .NET platform there is a library called WebMarkupMin, which performs minification of HTML code.
In addition, there is a module for integrating this library into ASP.NET MVC: WebMarkupMin.Mvc.
Try http://code.mini-tips.com/html-minifier.html; it is a .NET library for HTML minification.
HtmlCompressor is a small, fast, and very easy-to-use .NET library that minifies the given HTML or XML source by removing extra whitespace, comments, and other unneeded characters without breaking the content structure. As a result, pages become smaller in size and load faster. A command-line version of the compressor is also available.