What are the best practices for caching HTML? Should it be avoided?

I have been working on a few ways to optimize my webpages. I have optimized the delivery of everything else, and now I am thinking about making some optimizations related to the HTML.
Since HTML can change frequently, I was thinking caching it would be counterproductive or overkill, but I know exactly when my HTML is going to change (it changes every six hours). Is caching my webpage for, let's say, 1 or 2 hours worth it? What are some best practices that I should follow in case I decide to cache HTML?

For most smaller to midsize web sites, I would think this would be overkill.
In order to determine how much time is being saved, the question is: how large is a given page on a site, and how often will people be clicking between pages? (Or, for that matter, are your users on particularly slow connections?) Unless the pages are very large, or people are spending a lot of time clicking between pages, the time spent loading the pure HTML should not be especially significant.
The CSS and JavaScript, sure, that makes sense to cache, but I believe most browsers handle that caching automatically. But loading HTML content is a relatively quick process; unless you are using a vast amount of HTML-generated graphics or other items (e.g. HTML5's SVG) I can't see much need to store that for later.
Think about how your users browse your sites - will they be visiting each page only a few times, or is your site built in such a way that they have to keep clicking back and forth between pages?
Any site where people click back and forth often enough for HTML caching to become a serious concern might benefit from being built as a single-page application (SPA) instead; at that point it is better to think of those pages as different parts of one page.
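If you do decide to cache, since you know the HTML changes every six hours, the usual approach is a short max-age plus a validator so browsers can revalidate cheaply. A minimal sketch of the response headers (the ETag value is hypothetical):

    HTTP/1.1 200 OK
    Cache-Control: public, max-age=3600
    ETag: "6h-build-42"

On later visits the browser sends If-None-Match: "6h-build-42", and the server can answer 304 Not Modified with no body, so even after the hour expires, an unchanged page costs little more than a round trip.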

Related

Should I preload a large image that's above the fold?

I'm excited about the rel="preload" attribute, because it looks like it can help speed up page render times.
The use case is a web page with a large image above the fold. Right now, Chrome doesn't start downloading the image until after fetching jQuery (a fairly heavy file). With preloading enabled, they download in parallel.
But I'm reading conflicting reports about whether I should use preload for things that are in visible HTML elements elsewhere (as opposed to things made visible by user interaction, like a dropdown menu).
This post seems to recommend not preloading:
When not to use preloading:
When the asset is referred to somewhere else on the same page.
When you're not sure the user will actually require that asset. Like on a page visitors only go to 3% of the time.
While this one seems to indicate it was really helpful for a similar situation on the Financial Times website:
When the Financial Times introduced a Link preload header to their site, they shaved 1 second off the time it took to display the masthead image...
So which is it? Should I provide an early "hint" to display the always-shown, above-the-fold image? Or should I just let the browser get to it in the usual order?
I think that in cases involving performance optimization, you really want to run an A/B test to determine which option you should choose. There is no cookie-cutter answer for image preloading that applies to everybody as a best practice.
One of the biggest tenets of a favorite book of mine, Lean Enterprise, is to use A/B tests to prove or disprove a HiPPO (highest paid person's opinion). Certain opinions carry a lot of weight, both in your organization and on the internet. Because of their importance and reputation, those opinions veer towards the realm of fact, even though they may not be facts.
The practice of measuring performance empirically is also touted by another book I love which deals with performance tuning - Code Complete, 2nd Ed. In that book, McConnell gives several code examples where you would expect one piece of code to be optimal when in fact it performed poorly (see chapters 25 and 26). One of his key points is that you should always test a performance optimization. If it isn't worth testing, it isn't worth writing the "highly performant" code in the first place. McConnell's premise doesn't just apply to his low-level coding examples, but to high-level decisions such as preloading images above the fold as well.
I can also attest to the importance of A/B testing at a professional level. I used to work on Amazon's SEO team and we A/B tested everything. The fact of the matter is that you never really know how customers will respond to something. Nobody can predict customer behavior - not even Jeff Bezos - and you really need to back up your hypothesis with real data to prove or disprove the validity of what you're doing.
Even though you can find multiple blog articles and online sources debating whether preloading is better or not, you don't really know whether it will work for you until you've tried it. Different people have different servers with different performance characteristics, different network topologies, etc. You just don't know which way is better for you until you have the data. If you launch your A/B test and find that your bounce rate goes up with preloading, then you know you have to dial back your treatment and return to your control. If, however, you find that customers don't get bored waiting and click through at a higher rate than before, then you have a winner, and you dial up the treatment - deleting the control code entirely after a period.
I hope that helps.
The article is wrong about those statements. Preloading is used by the developer because he already knows those assets are needed on the page. Preloading is not a "hint". Prefetching is more in line with what the article describes: telling the browser that an asset may possibly be needed.
If an image is above the fold, then you know it will be needed, and that is exactly what preloading is for.
Addy Osmani of Google:
preload is a declarative fetch, allowing you to force the browser to make a request for a resource without blocking the document's onload event.
I would use rel="preload" for images only if they are referenced in CSS or JavaScript that would lead to a rendering bottleneck later in the page load, if user-driven events will request the image and a delay would have a detrimental impact on the user experience, or if you are seeing rendering issues caused by this large image.
I understand your image is above the fold, and if it's in the source code from the beginning, I would let the browser prioritize what to download first.
On the other hand, you could always A/B test this change and see if it works for you.
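For reference, a preload can be declared either in the markup or as an HTTP header (the latter is what the Financial Times did). A minimal sketch, with a hypothetical file path:

    <!-- In the <head>, before the blocking scripts: -->
    <link rel="preload" href="/img/masthead.jpg" as="image">

    <!-- Or as a response header: -->
    Link: </img/masthead.jpg>; rel=preload; as=image

Either way, the <img> that references /img/masthead.jpg later in the body picks up the already-fetched resource instead of waiting its turn behind jQuery.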

Maximum file size for a website

I have a quick question, and this might not be the right place to ask it, but I hope someone can help me because I'm curious.
I'm creating a one-page website with a lot of styling (it isn't designed to be web-friendly and uses a lot of images). This isn't a big problem, because we believe we're aiming at a more design-oriented audience who most likely have fairly good PCs and a pretty decent internet connection, and the site is scaled down a lot for mobile.
But still, I wonder what people recommend. My site is currently 2.2 MB on loading (is there a website where you can check this, by the way? I just made a guess by adding up the file sizes of the images etc.). I can still optimize a lot, but I think getting under 1 MB will be a hard task. Is this good enough, or has anyone experienced that it isn't?
Thanks in advance
You shouldn't worry about the size of your website (especially since you know your target audience) if the user experience doesn't suffer. However, you should optimize everything you can without sacrificing the design. Google's Page Speed might help you a lot. They even have the total size calculator you wanted. There are tons of similar tools available online as well.
Also, read this about last year's website size trends: http://www.theglobeandmail.com/technology/tech-news/bloated-web-pages-costly-for-smartphone-users/article9355125/
According to the website HTTP Archive, which regularly studies the top 10,000 most-visited sites online, the average web page now weighs in at about 1.3 megabytes, up about 35 per cent in the last year.
UPDATE
Inspired by #pwdst's comments, I'd like to add that if you want to support mobiles, tablets, etc., there's no need to sacrifice the look of the main site - you can use media queries and practically serve a different presentation to those users. Of course, you can go even further and make a separate website for them (usually on a subdomain).
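A minimal sketch of that media-query approach (selector and file names are hypothetical):

    /* Default: full design for desktop users */
    .hero { background-image: url(big-hero.jpg); }

    /* Smaller screens get a lighter presentation */
    @media (max-width: 600px) {
      .hero { background-image: none; background-color: #333; }
    }

Because the mobile rule overrides the background image, most browsers never download big-hero.jpg on small screens at all, which is exactly the bandwidth saving you're after.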
If you use dev tools, for example in Chrome, you can see the number of requests and total transferred data under the "Network" tab. To accurately replicate the experience of your users, you should do this with an empty cache - "Incognito" or "Private" tabs can be a great way to do this in most browsers. It's also worth remembering that the experience when testing locally will be drastically different from that when the effects of latency and upstream bandwidth come into play.
Be extremely careful about assumptions regarding users if you want to maintain usability. "Good enough" is extremely contextual; it may be "good enough" for a user on a desktop, laptop or premium tablet on a solid DSL connection - but anything but on an EDGE or 3G connection on a phone. Wherever possible, assumptions should be backed up by analysis of user-agent strings from server logs, or from analytics software such as Google Analytics. It's also worth remembering that you may have relatively few mobile or tablet requests, not because mobile or tablet users don't want to visit the website - or choose to visit using another device - but because it is difficult, slow or even impossible to use on their device of choice. This case study from an engineer at YouTube illustrates how customers were being completely excluded from the site.
In general the page should be as small as possible, and you may want to look at lazy loading in addition to other optimisations, which will help initial rendering time - in fairness, though, your estimated size is not much larger than the HTTP Archive average page weight of 1,681 kB as of January 15th this year. Don't forget that other resources can also use a lot of space - JavaScript makes up 274 kB of that average - and it's not clear if you have factored that into your estimate. Aggressive use of caching will help return visitors and minimise the data transfer required in those circumstances.
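In modern browsers, lazy loading can be as simple as the native loading attribute; a minimal sketch with a hypothetical image path:

    <!-- Deferred until the image approaches the viewport -->
    <img src="/img/gallery-12.jpg" loading="lazy" alt="Gallery photo">

Anything below the fold is then fetched only as the visitor scrolls toward it, so the initial page weight drops without removing any imagery.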
I always try to consider whether you can justify the page-weight cost against the functionality. What is the objective of visitors to your site? Does the added functionality or imagery help this sufficiently to justify the performance cost? Many performance-orientated developers are now actively setting a page-weight budget, but this is harder for single-page apps, where you cannot be as granular with resources.
I would recommend using image preloading, which can help your page loading times even when the page is large.

Why do most websites have their HTML in just a few lines?

I searched for this question, both on Google and Stack Overflow, but couldn't find what I wanted.
Whenever I see the source code of a website, like Facebook or Google, their HTML code is crammed into just one line. Why do they do it? What is the significance of doing it? Do I need to do it for small websites as well, say maybe a school website?
The main Google page serves millions of page views every hour. If it's one byte longer, that means gigabytes of additional data are transferred over the Internet every day.
That's why big sites with lots of traffic really squeeze every bit out of their HTML code. For small sites, this isn't really an issue.
Here's a Stack Overflow answer with a lot of reasons why websites use "bad practices" in general: Why do big sites use 'bad practices'?
As others have said, though, the "one big line" issue is usually motivated by both performance considerations and obfuscation.
Frankly, unless you're serving hundreds (thousands?) of page loads per minute, I wouldn't worry about it.
As Aaron Digulla said, it's used to keep the byte size down.
It's also used to remove whitespace (which can cause errors in PHP and JS) and comments, and to keep pesky people from trying to steal code. In minified form, it's harder to read and therefore harder to copy.
We call that "minification" (removing unnecessary information from the code: whitespace, comments, etc.). It's not only for HTML; you can do it for JavaScript, CSS, etc. as well. Minification improves the overall performance of the application.
Wikipedia explanation:
http://en.wikipedia.org/wiki/Minification_%28programming%29
You can apply it to small websites as well. There are tools available for it, and it is considered a best practice.
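To make the idea concrete, a minimal before-and-after sketch of the same markup:

    <!-- Before: readable, with whitespace and comments -->
    <ul class="nav">
      <li><a href="/">Home</a></li> <!-- main navigation -->
      <li><a href="/about">About</a></li>
    </ul>

    <!-- After minification: renders the same, fewer bytes -->
    <ul class="nav"><li><a href="/">Home</a></li><li><a href="/about">About</a></li></ul>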

What's a good design strategy to make an existing website responsive without risking losing good search engine ranking

I've been working on redesigning a website for a client, and I'm trying to figure out the best approach for making my client's new website as responsive as possible without jeopardizing his good search engine rankings. The client wants to add a bunch of new content to the website as well as a few new pages. The current website uses transitional HTML, and the new website is going to need some HTML5.
I'm tempted to use a responsive framework like Bootstrap for the new design, but my concern is completely screwing up his good rankings when I launch the new website. Another thing that I'm trying to take into consideration is that the existing website is an old Dreamweaver template. A lot of the markup and code kind of sucks, for lack of a better term. It would be a lot easier for me to maintain completely fresh files. But again, I don't want to mess up his rankings.
So would it be better to start fresh and hope for the best when I launch? Or would it be better to remove the content that he doesn't want, add the new content, and maybe add a couple of media queries? Or is there a better approach?
It's the content that got him where he is in the rankings, so as long as it remains you won't have much to worry about.
If the old HTML was poorly written, using more semantic HTML should help their rankings as it will give the search engines a better understanding of their content's weight and meaning. Using microformats would be a good thing as well.
If you change URLs, make sure to set up 301 redirects so all of the links they've earned will be "passed" to the new URLs. Also submit an XML sitemap so the search engines are aware of the new URLs.
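For example, a minimal Apache .htaccess sketch (the paths are hypothetical):

    # Permanently redirect a renamed page to its new URL
    Redirect 301 /old-services.html /services/

A 301 tells search engines the move is permanent, so the old URL's accumulated link equity is transferred rather than lost.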
Regardless of what you do, when you make the change there will be a fluctuation in their rankings. This is normal. Be patient, and wait at least two weeks before panicking if their rankings don't return.
Make sure to monitor Google Webmaster Tools after you make your switch to see if any errors are reported (404 errors will occur if you change URLs, and that's OK).
Read this article: How to Turn Any Site Into a Responsive Site. It may make your life easier.
If the core markup of the content is semantic and structured properly, you could do pretty much anything you want and your ranking should not degrade; if anything, it will most likely improve. Also, if the content itself is mostly unchanged, then you're safe.

Mobiles, HTML, CSS (& laziness)

With regards to mobile websites on smartphones;
Assuming that:
HTML code is rarely a huge amount of data
Compressed JS files are not so heavy
Images are often loaded via CSS (or at least they always could be)
It's the same sequence (PHP + SQL = HTML) on server-side.
It seems much faster to do it this way, and quite easy to maintain.
And even if:
It's not graceful at all (hiding useless elements instead of generating sharp, beautiful HTML)
Useless code is loaded and processed.
Best practices for mobile websites recommend against doing it this way.
Is it a good idea to rely only on different CSS to create a mobile version of a website? (Actually on different header templates, in order not to load useless JS.)
It's probably a bad idea to serve HTML with elements that you know will be useless to your users.
Small amounts of kB make a difference to mobile download speed.
It means your CSS and JavaScript need to be more complicated.
Your users might see the hidden content if the CSS or JS is slow to load.
It will take more processing power (CSS styles will, I think, still be applied to the hidden elements).
It is, however, likely to be easier to manage on the server.
But to answer the question, "Is it a good idea to rely only on different CSS to create a mobile version of a website?":
Yes, if you want your mobile users to have the same content as your large-screen users - which you probably should, as this is normally what users want.
No, if you want to serve them different content.
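For context, the CSS-only approach under discussion is essentially this (a sketch with hypothetical class names):

    /* Hide desktop-only elements on small screens */
    @media (max-width: 480px) {
      .sidebar,
      .promo-banner { display: none; }
    }

Note that the hidden markup (and any <img> inside it) is still downloaded and parsed; display: none only removes it from rendering, which is the wasted bandwidth the points above warn about.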
Speaking for Belgium, I know a lot of people are still on EDGE instead of 3G, and loading a webpage takes some time. If we had to load pages made your way, we would indeed be loading a lot of useless code, giving us quite a bad experience.
I'd suggest you stop being lazy and write your mobile websites the way they should be written. Think of your visitors and the user experience; it honestly isn't that much of an effort.
I think you basically answered your own question already. Like BoltClock said, do what you want, but I sure wouldn't recommend doing things your way.