Does the amount of HTML on a page affect speed?

Of course the amount of HTML loaded all at once will affect speed. But will the overall amount of HTML on a website page affect performance speed?
If so, by how much? Will roughly 1,000 list items make a noticeable difference?
e.g.
I have a website that loads HTML per request and removes the previously loaded HTML. The user will sometimes go back and forth, which requires loading each time. Instead of reloading it each time, I'd like to just hide the loaded HTML and reshow it when they request it again.
Here's a link to an example on my website
Edit: For those who want to close this, this question falls under "a practical, answerable problem that is unique to software development".

Yes, the amount of HTML affects page speed.
That's why many developers today minify their HTML as part of the build process.
BUT, the better question is not HOW MUCH HTML you load, but WHAT KIND of HTML you load.
For example: if you have 200 paragraph elements with lots of text, it will be just fine. But what if you have 100 big IMG elements? In that case, maybe hiding them (but still loading them) is not such a good idea, and it's better to load them on demand.
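The back-and-forth navigation described in the question can be handled with a small fragment cache: load each piece of HTML once, then reuse it on later visits instead of re-requesting it. A minimal sketch, assuming the site fetches HTML fragments by some key (`fetchFragment` is a hypothetical stand-in for whatever request the site actually makes):

```javascript
// Keep each fetched HTML fragment in a Map so revisiting a view reuses
// the markup instead of requesting it again.
const fragmentCache = new Map();

function getFragment(key, fetchFragment) {
  if (!fragmentCache.has(key)) {
    // First visit: do the (expensive) load and remember the result.
    fragmentCache.set(key, fetchFragment(key));
  }
  return fragmentCache.get(key);
}

// In the browser you would then show the cached markup, e.g.:
// container.innerHTML = getFragment('page-2', loadPage);
```

Whether this beats simply hiding/reshowing the nodes depends on what the fragments contain; for plain text lists either approach is cheap, while image-heavy fragments are better loaded on demand as the answer suggests.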

Related

Will removing an HTML section block from all my website pages affect SEO and my website's rank?

I have a hidden HTML section block that is heavy and requires multiple HTTP requests, which slows down my page load speed.
I am trying to implement lazy loading for this specific section so that it only renders after someone clicks the [SHOW] button.
I want to know: will removing this entire HTML section from my web pages cause a problem for Google or other bots and damage my rank, or not?
Thanks.
Improving the overall performance of your site is always a good idea. In fact, page speed directly affects your ranking, so by all means improve it:
https://moz.com/learn/seo/page-speed
https://developers.google.com/web/updates/2018/07/search-ads-speed
Hidden text, on the other hand, might hurt your ranking, at least if done the wrong way:
https://www.searchenginejournal.com/hidden-text-seo/208866/
Since in your case, the hidden parts seem to hugely impair your page loading, lazy-loading them would be the way to go.
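A minimal sketch of that lazy-loading idea, with illustrative names (`makeShowHandler` and `loadSection` are not from the question): defer the heavy section until the [SHOW] click, and load it at most once.

```javascript
// `loadSection` stands in for the expensive work (the extra HTTP
// requests); `container` only needs an `innerHTML` property, so a
// plain object can substitute for a DOM node outside the browser.
function makeShowHandler(container, loadSection) {
  let loaded = false;
  return function onShowClick() {
    if (loaded) return;                  // load at most once
    container.innerHTML = loadSection(); // render on first click only
    loaded = true;
  };
}

// In the browser, something like:
// document.querySelector('#show').addEventListener('click',
//   makeShowHandler(document.querySelector('#heavy'), loadHeavySection));
```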

What are the best practices for caching HTML? Should it be avoided?

I have been working on a few factors to optimize my webpages. I have optimized the delivery of everything else and now I am thinking about making some optimization related to the HTML.
Since HTML can change frequently, I was thinking caching it would be counterproductive or overkill, but I know exactly when my HTML is going to change (it changes every six hours). Is caching my webpage for, let's say, 1 or 2 hours worth it? What are some best practices I should follow in case I decide to cache HTML?
For most smaller to midsize web sites, I would think this would be overkill.
In order to determine how much time is being saved, the question is: how large is a given page on a site, and how often will people be clicking between pages? (Or, for that matter, are your users on particularly slow connections?) Unless the pages are very large, or people are spending a lot of time clicking between pages, the time spent loading the pure HTML should not be especially significant.
The CSS and JavaScript, sure, it makes sense to cache those, but I believe most browsers handle that caching automatically. Loading HTML content is a relatively quick process; unless you are using a vast amount of markup-generated graphics or other items (e.g. inline SVG), I can't see much need to store it for later.
Think about how your users browse your sites - will they be visiting each page only a few times, or is your site built in such a way that they have to keep clicking back and forth between pages?
Any site that would have people clicking back and forth sufficiently often to merit this might benefit from being built as a single-page application (SPA). If people are clicking back and forth enough that caching HTML becomes a serious concern, it might be better to think of those as different parts of one page.
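If the asker does decide to cache, the fixed six-hour regeneration schedule suggests computing `max-age` as the time remaining until the next regeneration, rather than a flat 1-2 hours. A rough sketch, assuming regeneration happens on six-hour boundaries from midnight UTC (an assumption, not something stated in the question):

```javascript
// Returns how many seconds remain until the next six-hour boundary,
// suitable as a Cache-Control max-age for a page regenerated on that
// schedule (boundaries measured from midnight UTC; adjust if the real
// regeneration times differ).
function maxAgeSeconds(nowMs) {
  const periodMs = 6 * 60 * 60 * 1000; // six hours in milliseconds
  return Math.floor((periodMs - (nowMs % periodMs)) / 1000);
}

// A server would then send a header along the lines of:
//   Cache-Control: public, max-age=<maxAgeSeconds(Date.now())>
```

This way no visitor is ever served HTML that outlives a regeneration, while pages fetched just after a regeneration get cached for nearly the full six hours.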

Prevent loading of entire CSS stylesheet - only load needed content?

I'm trying to improve my site speed as I integrate certain features of a theme I purchased into my site and came across this interesting feature on the website of said theme.
The problem I have now is my CSS style sheet is absolutely massive (220kb) and takes ages (3-4 seconds) to load. I'm working to reduce that, but that's a different story...
When I try to load that exact same page on the theme dev's website, it loads much faster. Upon inspection, I find that on my website the 'Size' and 'Content' amounts for the stylesheet in the dev tools are about identical.
On the developer's page, the size is much smaller than the content. Does this mean that ONLY the elements called upon in the HTML are downloaded from the stylesheet? If yes, how does one accomplish this, because it sounds tremendously efficient.
I also thought it was because the dev's page was caching the content on my browser but I've disabled cache and it still only loads a portion of the content.
My website's cache works by the way, I'm just interested in speeding up that initial load time.
Any thoughts?
Relevant imagery:
Cache Disabled:
Dev's website with small size/content ratio - http://i.imgur.com/B3uhOWF.png
My website with ~1:1 size/content ratio - http://i.imgur.com/Ymu2YRz.png
Off the top of my head, you could minify your CSS. There are a few free tools around, including Page Speed Insights for Chrome.
See https://developers.google.com/speed/docs/insights/MinifyResources
It looks like your CSS is already in a minified file, but perhaps the content isn't!
Just so people aren't confused: the solution was to enable gzip on the server, which you can read more about at https://developers.google.com/speed/articles/gzip (but still minify your CSS as well).
Thanks to user worldofjr, I found the answer to be ENABLING GZIP on the server.
Unbelievable increase in speed.
That should be required reading whenever someone signs up for web hosting. I can't believe my site has gone a year without this when it took two seconds to enable!
"On the developer's page, the size is much smaller than the content. Does this mean that ONLY the elements called upon in the HTML are downloaded from the stylesheet? If yes, how does one accomplish this, because it sounds tremendously efficient."
That's not generally how CSS works. And what you describe might not be very efficient even if you did somehow implement it: the server would have to know which styles to send, which would require the browser to load the page and send in its list, and the server would have to generate the custom CSS and send it back...
You certainly can limit which CSS loads where, and you should. But that has to be done manually, at least in my experience.
Anyway, a few things to consider before you proceed:
I think 200k is really, really big for a stylesheet. On most sites I work on, the total CSS weight might be 20k, or less. I'd investigate why yours is so huge; maybe something has gone wrong. Minifying might help, but it isn't going to knock you down a whole order of magnitude.
However, even if it is the world's largest CSS file, 200k isn't THAT big. Not 4-5 seconds big. I'm working on a site right now where an entire page weighs ~350kb and I have it loading in under a second. Again, I suspect something else has gone wrong. The size of that CSS is probably indicative of something but I suspect it's not purely the time it takes the file to download.

What prevents HTML pages from taking advantage of progressive rendering?

I've noticed that some pages begin to render almost immediately, while others sometimes have to wait until many or all of the resources (JavaScript, images, CSS) have been downloaded. The worst case seems to be a large page on a slow connection or server. One specific page I'm looking at comes out to almost 2 MB, with 30 different .js files, a dozen .css files, and 80 images.
I'm aware of the suggestions at http://developer.yahoo.com/performance/rules.html, but what would be preventing the browser from attempting to render the page until the last element has been downloaded?
There are a handful of reasons why this may happen. The most common that I see are large tables.
For example, Internet Explorer doesn't like to render a table until it is finished loading.
Each browser is a bit different though, in how they render things that are still downloading.
The general rule is not to use structural markup to control layout. With the styles at the start of the page, a rendering engine knows what it has to do to render a certain part of the page, but otherwise it has to wait for a part to download before it knows how to render it properly.
With that in mind:
Tables should rarely (never?) be used for layout purposes.
Parts that should reasonably be rendered first (sidebars, toolbars, and anything that frames the page) should appear near the top of the HTML document.
The huge JavaScript libraries in use today are different in that they only need to be loaded (and cached) once.
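One way around the large-table problem is to render very long tables or lists in chunks, so the browser can paint the first rows while the rest are still being appended. A hedged sketch with illustrative names (`appendRows` and `schedule` are injected callbacks, not standard APIs; in a browser `schedule` might be `setTimeout` or `requestAnimationFrame`):

```javascript
// Splits `rows` into chunks of `chunkSize` and hands each chunk to
// `appendRows` via `schedule`. The synchronous default schedule makes
// the function easy to exercise outside a browser.
function renderInChunks(rows, chunkSize, appendRows, schedule = fn => fn()) {
  for (let i = 0; i < rows.length; i += chunkSize) {
    const chunk = rows.slice(i, i + chunkSize);
    schedule(() => appendRows(chunk));
  }
}

// In the browser, something like:
// renderInChunks(rowData, 50, appendRowsToTable, fn => setTimeout(fn, 0));
```

Yielding between chunks gives the rendering engine a chance to paint what it already has, instead of blocking until the whole table exists.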

Organizing css for large sites

I have a more general design question relating to css and websites. I know that it is good computer science to normalize code as much as possible and avoid duplication. So it would stand to reason to me, at least in theory that one would do the same when organizing stylesheets for a website.
So when I started on my most recent website, I started out with this same philosophy. It worked OK for my first few pages, and while I was only testing in Firefox...
However as my site grew and as I added pages, multiple layouts (and browsers) I found this philosophy broke down really quickly. Ultimately over time I have moved to the following approach:
I have a very limited top-level CSS file for each master page layout in my site; it contains classes for well-known styles across that layout, as well as CSS for the master page.
I keep specific CSS styles for each page.
I keep specific CSS styles for embeddable page elements / controls.
I ended up taking this route so that I could trust that changes on one page wouldn't accidentally break other pages in the site resulting in a lot of regression bugs.
What do other people do when approaching this? Is this a good or bad approach? I do see cons to this approach: some pages are very similar, so making a significant change means changing more CSS code. But I also feel that the pros outweigh the cons on a daily basis.
What do other developers think about this philosophy? Good? Bad? Just curious really...
To me it's one of those situations where I weighed the difference between my ideals (I try to keep very tight code) and the frustration of changing requirements on one page breaking 20 other pages because I changed a div width by a few pixels (upsetting a float on another page, for instance).
Thanks for your input
Just like any other type of code, if you are duplicating your CSS code all over the place, you are asking for trouble. Maintenance is going to get harder and harder for you as time goes on.
The best way to not have issues with a change on one page affecting other pages detrimentally is to have a style guide that drives your UI layout and design. If your style guide defines the HTML and CSS classes to use for a given design element, then they will all always be displayed the same across all pages. If a specific page's element needs to be changed, you change the HTML to use a different class and then build new CSS for that class (and add it to your style guide for reuse). The style guide also allows you to make sure that your HTML is uniform across all developers working on the site, which means even less of a chance of CSS changes causing problems as you do more development.
Another point you need to remember with CSS is that every one of those .css files you create and reference on a page is an HTTP request. If every page and control has its own CSS file, you are really hurting your users' experience on the site by bogging down the total request download time for every single page request. It also makes it less likely for their browser to cache the .css files, because the cache has a limited amount of space; if you keep filling it with more and more .css files, they are going to get dumped from the cache more quickly. Yes, you can combine .css files programmatically in a handler so your page only makes one request per page, but then you have additional server overhead, and the caching issue still remains (unless you have a single request for all .css files on your site, which defeats the purpose of what you're trying to do here anyway).
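The "combine .css files programmatically" idea in the last paragraph can be as simple as a build-time concatenation step. A minimal in-memory sketch (`combineCss` is an illustrative name; a real build step would read the files from disk and also minify the result):

```javascript
// Concatenate per-page / per-control stylesheets into one file so each
// page makes a single stylesheet request. `sheets` is an array of
// { name, content } objects; the name comment keeps the combined
// output debuggable.
function combineCss(sheets) {
  return sheets
    .map(sheet => `/* ${sheet.name} */\n${sheet.content}`)
    .join('\n');
}
```

Doing this once at build time, rather than per-request in a handler, avoids the server overhead the answer mentions while still giving the browser a single cacheable file.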