Reduce page size - HTML

Does anyone know how I can reduce my page size while it's loading? For example, is there any way I can reduce the size of my CSS and JavaScript files?

Gzip compression
Minification
Combining multiple requests into one
GZIP compression is effective on text-based responses such as stylesheets, scripts and HTML, but not on images, which are already compressed.
Minification is effective on CSS and JavaScript (more so on JavaScript). There are various solutions available depending on your language and framework. It shrinks what is sent by removing whitespace, stripping comments, and shortening identifier names where possible.
Combining multiple requests into one cuts down a lot on latency, even if it doesn't save much bandwidth, and results in a faster perceived load time for users. You can combine multiple CSS files into one, which is easy. With images, you can use a technique known as CSS sprites.
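To see why GZIP helps text-based responses but not images, compress both kinds of data and compare. A minimal sketch in Python, with fabricated sample data (random bytes stand in for already-compressed image data):

import gzip, os

css = ("body { margin: 0; padding: 0; }\n" * 200).encode()  # repetitive text
blob = os.urandom(len(css))  # stands in for already-compressed PNG/JPEG bytes

for name, data in (("stylesheet", css), ("image-like blob", blob)):
    print(name, len(data), "bytes ->", len(gzip.compress(data)), "bytes gzipped")

The stylesheet shrinks to a small fraction of its size; the random blob barely changes.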

YUI's Compressor is one option.
YSlow is decent for getting an idea as to why it's slow.
It does tend to make recommendations for large sites rather than small sites, so apply common sense.

Do you mean bytes transferred? If so, you would use GZIP compression of the HTML. In Apache, use mod_deflate. There are also servlet filters out there.

w3compiler is a tool that will do exactly what you want.

There is a great book on this called High Performance Web Sites. It's an O'Reilly book that takes you through the various steps of client-side performance upgrades. It covers the topics you mention, such as CSS and JS file sizes, amongst other things.

Related

BEM HTML Byte Size... does it matter?

I'm working within a CMS which has a limit of 32,000 characters on an include field (I can have multiple includes on each page/template).
Using BEM, our header/navigation element has blown past the 32,000 limit, which has led me to question BEM and HTML performance.
My research so far suggests that HTML size doesn't matter too much (CSS performance does), but I'm uneasy asking for the limit to be increased (45,000 would be sufficient), as this is loaded on every page before any content.
The large number of links is hidden away in a mega drop-down menu that has evolved mainly to support SEO requests. In time some may be dropped into the footer, but for the time being I need to have indexable links within the header.
Any advice would be greatly appreciated. I am running minification / gzip compression at the server level.
You should not pay attention to plain-text size; only gzipped size matters. And as research shows, BEM naming after gzip comes out almost identical to manually minified class names.
So you may use BEM without any worries.
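If you want to check this yourself, gzip the same markup with BEM names and with hand-minified names and compare the output sizes. A toy sketch in Python (the repeated markup is fabricated for illustration, not a benchmark):

import gzip

bem = '<li class="header__nav-item header__nav-item--active"></li>' * 500
short = '<li class="a b"></li>' * 500

for name, html in (("BEM", bem), ("minified", short)):
    raw = html.encode()
    print(name, len(raw), "bytes plain,", len(gzip.compress(raw)), "bytes gzipped")

The plain sizes differ hugely, but gzip's back-references eat the repeated long class names, so the compressed sizes end up close together.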
General suggestions regarding performance in terms of bandwidth and parsing:
Minify all your CSS and JS
Compress your HTML and send files using gzip
Leverage browser caching (set your headers) and consider using CDNs for JS/CSS common libraries
Optimize any images
Use a shortened naming convention for your BEM classes
Analyse your site using tools such as Google PageSpeed or similar software and incrementally improve it

HTML - reduce byte size

I'm testing a website's speed using the PageSpeed Insights tool.
In the results page, one of the warnings suggested that I reduce the byte size of the CSS, HTML and JS files.
At first I tried removing comments, but nothing changed.
How can I do that?
Should I remove spaces and tabs?
It seems like a very long operation; is it worth it?
The action of removing spaces, tabs and useless characters is called minification.
You don't need to do it by hand; there are a lot of services that can minify files for you,
for example:
http://www.willpeavy.com/minifier/
Be careful if you have jQuery code: sometimes minifiers remove spaces in the wrong place.
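For the curious, the core of a minifier is small. A deliberately naive sketch in Python; real minifiers handle strings, selectors and edge cases that these regexes would mangle, which is exactly the kind of problem the jQuery caveat above points at:

import re

def minify_css(css):
    # Drop /* ... */ comments, collapse whitespace, tighten around punctuation.
    # Naive: will break CSS that contains braces or semicolons inside strings.
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)
    css = re.sub(r"\s+", " ", css)
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)
    return css.strip()

print(minify_css("body {  margin: 0;  /* reset */  padding: 0; }"))
# prints: body{margin:0;padding:0;}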
You have two things to do to reduce page size:
Minify CSS & JS files
On the server side, if you are running your website via Apache, you can install APC for caching. You'll have better performance.
APC
In addition to the CSS minifier/prettifier tools above, I recommend proCSSor for optimizing CSS files. It offers a variety of advanced options.
I've never found those tools to be much use beyond giving some tips about what might be slowing things down. Minifying is unlikely to achieve much. If you want to speed up your site, save the page and see what the largest files are. Generally they will be the image files rather than the code; see if you can reduce those.
Also, try and test it on two servers - is your host slow?
If your HTML file is massive, that suggests a problem with the site's structure; it is rare that a page needs to be large.
Finally, large JavaScript files are most likely to be things like jQuery. If Google hosts these, then use the hosted version; that way, it will probably already be in a user's cache and not impact your loading time.
EDIT, after further testing and incorporating the issues discussed in the comments below:
PageSpeed Insights is an utterly amateurish tool that, as a matter of standard advice, tells you to reduce HTML, CSS and JS file sizes if they are not minified; there are much more effective ways to speed up the rendering time than minifying the code. A much, much better tool is the Pingdom Website Speed Test, which compares rendering speed to the average of the sites it is asked to test, and gives the download times of the site's components.
Just test www.gezondezorg.org on both, and see the enormous difference in test results, about which the Google tool is dead wrong. It advises reducing the CSS and JS files, while its own figures (click the respective headers) show that doing so would reduce their sizes by only 3.8 and 7.9 kB, respectively. That comes down to less than 1 millisecond of download time difference (1 millisecond = 1/1000 of a second; broadband internet presumed)!
Also, it says that I did a good thing: enable caching. That is BS as well, because my .htaccess file tells browsers to check for newly updated files at every visit, and to refresh cached files whenever they have been updated. Tests confirm that all browsers heed that instruction.
Furthermore, that site is not intended to be viewed on mobile phones; there is just way too much text on it for that. Nevertheless, PageSpeed Insights defaults to showing the results of testing against mobile-phone criteria.
More effective ways to speed up the rendering
So, minifying hardly does anything to speed up the rendering time. What does do that is the following:
Put your CSS code and JavaScripts as much as possible into one file each. That saves browser-to-server (BTS) requests. (Do keep in mind that quite a number of JavaScripts need the DOM to be fully loaded first, so in practice it comes down to putting the scripts as much as possible into two files: a pre-body and a post-body file. See the sketch after this list.)
Optimize large images for the web. Photoshop and the like even have a special function for that, reducing the file size while keeping the quality good enough for use on the web.
In the case of images that serve as full-size backgrounds for containers: use image sprites. That saves BTS requests as well.
Code the HTML and JS files so that there is no rendering dependency on files from external domains, such as from Twitter, Facebook, Google Analytics, advertisement agencies, etc.
Make sure to get a web-host that will respond swiftly, has a sufficient processing capacity, and has a(n almost) 100% up-time.
Use vanilla/native JS as much as possible. Use jQuery or other libraries only for tasks that would otherwise be too difficult or too time-consuming. jQuery is not only an extra file to download; it is also processed more slowly than native JS.
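As an illustration of the first point in the list, the build step that combines files can be a few lines. A sketch in Python; the file names are hypothetical, adapt them to your own layout:

from pathlib import Path

def combine(sources, target):
    # Concatenate text assets into one file, labelling each part with its origin.
    parts = ["/* --- %s --- */\n%s" % (src, Path(src).read_text()) for src in sources]
    Path(target).write_text("\n".join(parts))

combine(["css/reset.css", "css/layout.css", "css/theme.css"], "css/all.css")
combine(["js/menu.js", "js/gallery.js"], "js/post-body.js")  # DOM-dependent scripts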
Lastly, you should realize that:
having the server minify the code on the fly generally results in a much slower response from the server;
minifying a code makes it unreadable;
de-minifying tools are notorious for their poor performance.
Minifying resources refers to eliminating unnecessary bytes, such as extra spaces, line breaks, and indentation. Compacting HTML, CSS, and JavaScript can speed up downloading, parsing, and execution time. In addition, for CSS and JavaScript it is possible to reduce the file size further by shortening variable and class names, as long as the HTML is updated appropriately to ensure the selectors continue working.
You can find plenty of online tools for this purpose, a few of them are below.
HTML Minify
CSS Minify
JS Minify
good luck!

Mobiles, HTML, CSS (& laziness)

With regard to mobile websites on smartphones:
Assuming that:
HTML code is rarely a huge amount of data
Compressed JS files are not so heavy
Images are often loaded via CSS (or at least always could be)
It's the same sequence (PHP + SQL = HTML) on the server side.
It seems much faster to do it this way (serve one HTML document and hide the elements that mobiles don't need via CSS), and quite easy to maintain.
And even if:
It's not graceful at all (hiding useless elements instead of generating sharp, beautiful HTML)
Useless code is loaded and processed.
Best practices for mobile websites don't recommend doing it this way.
Is it a good idea to rely only on different CSS to create a mobile version of a website? (Actually on different header templates, in order not to load useless JS)
It's probably a bad idea to serve HTML with elements that you know will be useless to your users.
Even small numbers of kilobytes make a difference to mobile download speed.
It means your CSS and JavaScript need to be more complicated.
Your users might see the hidden content if the CSS or JS is slow to load.
It will take more processing power (I think CSS styles will still be applied to the hidden elements).
It's likely to be easier to manage on the server
But to answer the question "Is it a good idea to rely only on different CSS to create a mobile version of a website?":
Yes, if you want your mobile users to have the same content as your large-screen users, which you probably should, as this is normally what users want.
No if you want to serve them different content.
Speaking for Belgium, I know a lot of people are still on EDGE instead of 3G, and loading a webpage takes some time. If we had to load pages made your way, we would indeed be loading a lot of useless code, giving us quite a bad experience.
I'd suggest you stop being lazy and write your mobile websites the way they should be written. Think of your visitors and user experience; it honestly isn't that much of an effort.
I think you basically answered your own question already. Like BoltClock said, do what you want, but I sure wouldn't recommend doing things your way.

Should I embed images as data/base64 in CSS or HTML

To reduce the number of requests to the server, I have embedded some images (PNG & SVG) as base64 directly into the CSS. (It's automated in the build process.)
like this:
background: url(data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAFWHRTb2Z0d2FyZQBBZG etc...);
Is this a good practice? Are there some reasons to avoid this? Are there any major browsers that don't have data URL support?
Bonus question:
Does it make sense to do this for the CSS & JS also?
Is this a good practice? Are there some reasons to avoid this?
It's a good practice usually only for very small CSS images that are going to be used together (like CSS sprites) when IE compatibility doesn't matter, and saving the request is more important than cacheability.
It has a number of notable downsides:
Doesn't work at all in IE6 and 7.
Works for resources only up to 32k in size in IE8. This is the limit that applies after base64 encoding. In other words, no longer than 32768 characters.
It saves a request, but bloats the HTML page or style sheet instead, and makes images uncacheable: they get loaded every time the containing page or style sheet is loaded.
Base64 encoding bloats image sizes by 33%.
If served in a gzipped resource, data: images are almost certainly going to be a terrible strain on the server's resources! Images are traditionally very CPU intensive to compress, with very little reduction in size.
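The 33% figure follows from how base64 works: every 3 bytes of input become 4 characters of output. You can verify it in two lines of Python (random bytes stand in for image data):

import base64, os

image = os.urandom(3000)  # stands in for a 3 kB PNG
print(len(image), "->", len(base64.b64encode(image)))  # prints: 3000 -> 4000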
Common answers here seem to suggest this is not needed, for a set of legitimate reasons.
However, all of these seem to neglect modern app behavior and build processes.
It's not impossible (and actually quite easy) to design a simple process that walks through a folder of images and generates a single CSS file with all the images of that folder (see the sketch below).
This CSS will be fully cached and will dramatically reduce round trips to the server, which, as @MemeDeveloper correctly suggests, is one of the biggest performance hits.
Sure, it's a hack, no doubt, the same way sprites are a hack. In a perfect world this wouldn't be needed; until then, it's a possible practice if what you need to fix is:
Pages with multiple images that are not easily "spritable".
Round trips to the server are an actual bottleneck (think mobile).
Speed (down to the millisecond) is really that important for your use case.
You don't care (as you should, if you want the web to go forward) about IE5 and IE6.
That's my view.
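A sketch of such a build step in Python; the folder layout and the class-naming scheme are invented for illustration:

import base64
from pathlib import Path

def folder_to_css(folder, target):
    # Walk a folder of PNGs and emit one CSS file, with each image
    # inlined as a data URI under a class named after the file.
    rules = []
    for img in sorted(Path(folder).glob("*.png")):
        data = base64.b64encode(img.read_bytes()).decode("ascii")
        rules.append(".img-%s { background: url(data:image/png;base64,%s); }" % (img.stem, data))
    Path(target).write_text("\n".join(rules))

folder_to_css("images", "images.css")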
It's not a good practice. Some browsers do not support data URIs (e.g. IE 6 and 7), or support is limited (e.g. 32 KB for IE8).
See also this Wikipedia article for complete details on the Data URI disadvantages:
Data URI scheme
Disadvantages
Data URIs are not separately cached from their containing documents (e.g. CSS or HTML files) so data is downloaded every time the containing documents are redownloaded.
Content must be re-encoded and re-embedded every time a change is made.
Internet Explorer through version 7 (approximately 15% of the market as of January 2011), lacks support.
Internet Explorer 8 limits data URIs to a maximum length of 32 KB.
Data is included as a simple stream, and many processing environments (such as web browsers) may not support using containers (such as multipart/alternative or message/rfc822) to provide greater complexity such as metadata, data compression, or content negotiation.
Base64-encoded data URIs are 1/3 larger in size than their binary equivalent. (However, this overhead is reduced to 2-3% if the HTTP server compresses the response using gzip)
Data URIs make it more difficult for security software to filter content.
I was using data URIs for about a month, and I've just stopped using them because they made my stylesheets absolutely enormous.
Data URIs do work in IE6/7 (you just need to serve an MHTML file to those browsers).
The one benefit I got from using data URIs was that my background images rendered as soon as the stylesheet was downloaded, as opposed to the gradual loading we see otherwise.
It's nice that we have this technique available, but I won't be using it too much in the future. I do recommend trying it out, though, just so you know for yourself.
I'd be more inclined to use CSS sprites to combine the images and save on requests. I've never tried the base64 technique, but it apparently doesn't work in IE6 and IE7. It also means that if any image changes then you have to redeliver the whole lot, unless you have multiple CSS files, of course.
I have no idea about general best practices but I for one would not like to see that kind of thing if I could help it. :)
Web browsers and servers have a whole load of caching machinery built in, so I would have thought your best bet was simply to get your server to tell the client to cache image files. Unless you have loads of really small images on a page, the overhead of multiple requests isn't that big a deal. Browsers generally use the same connection to request many files, so no new network connections are being established; unless the volume of traffic through HTTP headers is significant compared to the size of the image files, I wouldn't worry about multiple requests too much.
Are there reasons why you think there are too many requests going to the server at the moment?
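Normally you would set caching headers in your Apache or nginx configuration, but to make the idea concrete, here is a toy Python server that adds a Cache-Control header to static assets; the one-week lifetime is an arbitrary example:

from http.server import HTTPServer, SimpleHTTPRequestHandler

class CachingHandler(SimpleHTTPRequestHandler):
    def end_headers(self):
        # Let clients cache static assets for a week (604800 seconds).
        if self.path.endswith((".png", ".jpg", ".gif", ".css", ".js")):
            self.send_header("Cache-Control", "public, max-age=604800")
        super().end_headers()

HTTPServer(("", 8000), CachingHandler).serve_forever()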
I would suggest it for tiny images that are used very often, for example common icons of a web application.
Tiny, because the Base64 encoding increases the size
Often used, because this justifies the longer initial load time
Of course, support problems with older browsers have to be kept in mind. It might also be a good idea to use the capability of a framework to automatically inline the images as data URLs, such as GWT's ClientBundle, or at least to use CSS classes instead of adding the image to the element's style directly.
More information is gathered here: http://davidbcalhoun.com/2011/when-to-base64-encode-images-and-when-not-to/

Multiple CSS files and performance

Would it be wise to combine all CSS into a single file? Would there be any performance increase? I have to assume that an HTTP request is made to get each file on initial page load, and reducing those requests would seem to make sense.
Are there any reasons NOT to combine all CSS into a single file (such as maintainability or other performance issues)?
Merging all of your CSS files into one will absolutely gain performance. Whether that performance is noticeable depends on your load, number of requests, etc. For the average blog, this will have close to zero impact.
Read Best Practices for Speeding Up Your Web Site at Yahoo! Developer. It'll explain things way better than I can.
As you say, a reason not to merge CSS files is maintainability. However, there are many tools out there which automatically merge and minify your CSS files into one.
You should check out the YUI Compressor; it will help you with merging and minifying your CSS files.
Would it be wise to combine all CSS into a single file? Would there be any performance increase? I have to assume that an HTTP request is made to get each file on initial page load, and reducing those requests would seem to make sense.
Yes, but do the combination at build or run time, and don't try to maintain a single file by hand if you started with multiple ones.
In addition to the number of HTTP requests, it is also important to set the right expiration headers in the response.
Are there any reasons NOT to combine all CSS into a single file (such as maintainability or other performance issues)?
It is not necessary to maintain a single file, but it is good to serve a single file, because the CSS data gets merged anyway.
The YUI Compressor is a good tool for JavaScript and CSS minification.
Would it be wise to combine all CSS into a single file? Would there be any performance increase? I have to assume that an HTTP request is made to get each file on initial page load, and reducing those requests would seem to make sense.
Yes, yes, yes.
Are there any reasons NOT to combine all CSS into a single file?
No.
(such as maintainability)
Combine them at build time, not at development time, and there won't be a maintainability problem.
When not all pages need all the CSS, splitting it into multiple files might be faster.
Maintainability becomes an issue if the CSS file gets really huge, or when different teams need to coordinate their work on it.
There might be another reason, to NOT combine files:
If the combined stylesheet file gets too large, Internet Explorer 9 will ignore some of the styles.
See: IE 9 ignoring CSS rules