Webpage elements in code limit [closed] - html

Is there a hard (or somewhat-hard) limit on the number of HTML elements (no CSS, no scripts) a webpage can load in a browser?

Kind of a dumb set of questions. Good Code influences speed a lot. Better hardware lowers the layman's ability to distinguish Good from Bad.
A case could be made that we've hit the point in hardware where only extreme cases of Bad code are distinguishable, but that's a discussion for some other time.
Write Good Code.
Don't feed the Dissonance.
As for your specific questions...
1) Based on the memory available, yes. As to how many, "it depends" is the only honest answer: it hinges on a lot of things, but the limit is usually higher than you'd reasonably hit.
2) Pulling this out of my ass, but generally, the reason your page is slow is (in order of likelihood):
Total File Size is Too Damn Big for Available Bandwidth
JS hooks on spammy events (such as scroll; never hook scroll directly; see the throttling sketch after this list)
JS that causes repaint and reflow
Rendering in general
General use of JS
JS fetch/parse delay (async/defer and bottom-of-page-load are your friends)
CSS parsing & application
HTML parsing (assuming not well-formed)
HTML parsing (assuming well-formed)
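To make the scroll-event point above concrete, here is a minimal sketch (mine, not the original answer's) of throttling scroll work with requestAnimationFrame, so the real handler body runs at most once per frame no matter how fast the events fire:

// Hypothetical sketch: coalesce scroll events so the expensive work
// runs at most once per animation frame, never once per event.
let scheduled = false;

window.addEventListener('scroll', () => {
  if (scheduled) return;             // a frame is already pending
  scheduled = true;
  requestAnimationFrame(() => {
    scheduled = false;
    // ...do the actual scroll-dependent work here...
    console.log('scroll position:', window.scrollY);
  });
});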
3) Inline styles really don't add much in the grand scheme. Their scope is absurdly small. They honestly contribute more to file size than to parse/apply cost.
4) Not by much. CSS is declarative and static. Microsoft tried dynamic CSS for a while (the expression() value syntax) but very quickly realized how bad an idea that was and killed it. JS is the real bottleneck, not CSS.
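For the curious, that dead "dynamic CSS" was IE's expression() value, which re-evaluated a JS expression during layout. A from-memory illustration of the removed feature, not something to use today:

/* Old IE-only syntax, shown for illustration; it was dropped precisely
   because the constant re-evaluation was a performance disaster. */
#content {
  width: expression(document.body.clientWidth > 800 ? "800px" : "auto");
}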
5) Depends on hardware and usage. 10 mil blank <meta /> tags in the head is probably going to do very little. 10 mil <div> tags with a ton of supporting CSS and JS, on the other hand, will chug real bad if the machine isn't beefy (and 64-bit).
One of the pages at work that is in desperate need of a refactor has around 16,800 elements currently, as well as a lot of JS. The page chugs fairly badly at the start, but is reasonably responsive once it's finished loading. Meanwhile, another is sitting around 89.7k elements. In spite of being orders of magnitude larger, it doesn't have a ton of JS backing it up and many of the elements are (usually) display: none, so it's actually a much faster and more responsive page (until you hit Ctrl+F).
6) Not significantly. Some stuff can chug extra hard in one browser rather than another, but that's usually due to an implementation bug in the browser being prodded by some really odd combination or usage of features the page hits.
7) Better hardware can handle more complex stuff faster, assuming it can be taken advantage of. A 128-core computer can't magically multithread your serial O((n^7)!) script, though the JIT optimizer sure as hell might try.
8) Server hardware does very little to affect client-side speed unless it's swamped and thus running into the server-side bandwidth limit. Server hardware does do a lot to affect the speed at which it can produce pages with a heavy server-side backend to them. Again, having the server do something that's O((n^7)!) will be slower the worse its hardware is, and that gets compounded significantly by the number of users requesting that operation (assuming you can't/don't lock and return a cached copy of the first request's result).
A bit of further reading on Repaints and Reflows... The title of the article itself is absurdly bad, but the content is good. In-depth explanations (and even further readings) on repaints and reflows, which, while not quite the same today (8 years later), are still a key concept in page efficiency. Most of these are also still applicable due to how the browser has to render some elements (such as tables without table-layout: fixed;).
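As one concrete illustration of the repaint/reflow point (my sketch, not the article's): interleaving layout reads and style writes forces a reflow per element, while batching them lets the browser recalculate layout once.

// '.box' is a stand-in selector for this sketch.
const boxes = document.querySelectorAll('.box');

// Bad: each iteration reads layout (offsetHeight) right after the
// previous write, forcing a synchronous reflow every time:
// boxes.forEach(box => { box.style.height = (box.offsetHeight * 2) + 'px'; });

// Better: batch all the reads, then all the writes.
const heights = Array.from(boxes, box => box.offsetHeight); // reads only
boxes.forEach((box, i) => {
  box.style.height = (heights[i] * 2) + 'px';               // writes only
});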

Related

Empty Lines in HTML source bad for seo? [closed]

I prefer to write my HTML clearly, so I may use empty lines here and there. For example:
<div>

    <!-- Seasons -->
    <table class="giantTable">
        ...
    </table>

    <!-- Prices -->
    <table class="giantTable">
        ...
    </table>

</div>
Today my new workmate told me that this is bad for SEO, because Google would need more time to parse the site and might abort with a timeout. I have never heard of this. Do I really have to write spaghetti code again?
Google does use page-load and rendering time as one metric (of over 200!) for determining your page rank, so to an extent your colleague is right (although timeouts are not the issue; he is wrong on that).
However, you can have the best of both worlds :) Write your HTML as you normally do, and then minify it before deployment.
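As a minimal sketch of that deployment step, here is one way to do it with the html-minifier npm package (one tool among many; the file names are placeholders):

// Deploy-time step: read the readable source, write the minified copy.
var fs = require('fs');
var minify = require('html-minifier').minify;

var src = fs.readFileSync('index.src.html', 'utf8');  // your readable HTML

var out = minify(src, {
  collapseWhitespace: true,  // the empty lines in question disappear here
  removeComments: true       // <!-- Seasons --> etc. are dev-only notes
});

fs.writeFileSync('index.html', out);                  // what you deploy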
Note that there are a number of tools for analysing your site performance (both online, and as browser plugins - e.g. YSlow), and it's a very sensible thing to do. You can have numerous bottlenecks in your web-site, and can often get some quick wins that significantly improve the responsiveness of your site.
As always with optimisation though - measure first! Don't just randomly implement supposed improvements until you have measured the bottlenecks, and then confirmed the improvement is an improvement.
The sentiment isn't entirely off. Google does now consider the speed of your pages as a factor, and excessive white-space in code can increase payload size. Google themselves recommend minifying your code ( https://developers.google.com/speed/docs/best-practices/payload#MinifyHTML ), and this can be done without too much overhead on the web server.
I find the biggest culprit in dynamic websites comes from using loads of space in the middle of for/while loops, so cutting down on that can make a big difference. Also, try using tabs instead of spaces and you'll cut your white-space big-time.
Even if this were true (which I've never heard before, although RB's point above is a good one), there are many other things that contribute to your page ranking far more than that would.
Google made an awesome SEO guide which I always check out; it's really pretty and easy to read as well, what with all the pictures of robots. It's definitely worth checking out - Google SEO Guide
It isn't bad at all; search engines ignore white space. Otherwise everyone would be trying to write code all on one line.
http://jamesmartell.com/matt-cutts/is-excessive-whitespace-in-the-html-source-bad/
This document describes how to do SEO for Google (it is quite extensive). A first glance over all the pages doesn't turn up anything about compressing your HTML.

Deprecated tags and attributes? [closed]

Upon visiting an older site it can be common to find deprecated tags and attributes. Yet as they are deprecated and not obsolete, they are still valid and supported. A 'good' programmer tends to stay away from these if at all possible, yet just how bad is it to use them?
Upon looking up information about deprecated elements, I find the most common information supplied is:
Browsers should continue to support deprecated tags and attributes, but eventually these tags are likely to become obsolete and so future support cannot be guaranteed.
Obviously if something breaks you should change it. Yet should we actively change instances of deprecated elements in code we had worked on previously? Either a personal site, or a huge site with a huge view rate, just how important is it to stay as up-to-date on accepted elements?
How much warning is given, after a tag or attribute becomes deprecated, before it is decided it will become obsolete? Any? What about other web formats?
On any site, be it an overhaul, or just noticing a line of deprecated code, should it all be actively removed? Or is it worth the saved time and energy to just wait until it is truly dropped and change it all at once?
Should you just be lazy (like I wish I could be), or is it worth the effort to be that up-to-date?
Should you just be lazy (like I wish I could be), or is it worth the effort to be that up-to-date?
It's always important to be lazy, that's why we have tags marked as deprecated instead of just removed.
I always tell developers to be lazy, but only lazy enough that you don't stop yourself from doing good things. There's such a thing as too lazy, and such a thing as not lazy enough. DRY is an example of good lazy. Relying on deprecated tags is an example of bad lazy. Using divs to hand-build table layout is an example of inefficient overwork.
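To illustrate good lazy versus bad lazy with a small hypothetical snippet:

<!-- Bad lazy: deprecated presentational markup, repeated on every page -->
<center><font color="red">Sale ends Friday!</font></center>

<!-- Good lazy: mark up the meaning once and style it with a reusable class -->
<p class="notice">Sale ends Friday!</p>

with .notice { text-align: center; color: red; } living in the stylesheet, where one edit fixes every page.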
On any site, be it an overhaul, or just noticing a line of deprecated code, should it all be actively removed? Or is it worth the saved time and energy to just wait until it is truly dropped and change it all at once?
If you're doing maintenance, and it's a bug report, it's always important to fix it, obviously, but in the case of being proactive, the question becomes, what's the most important thing for you to do right now. If you have some 20% time and you want to fix it, then fix it. If you have tasks that need to be done, then you need to focus on those first.
Either a personal site, or a huge site with a huge view rate, just how important is it to stay as up-to-date on accepted elements?
This is really what you're asking about. On your own personal site, that's definitely 20% time, so do that as you want.
On a massive site with lots of throughput, you need to try and keep it up to date so it continues to offer good benefit to the users. If the site suddenly stops working for lots of users, they're going to stop coming. Then a massive site with lots of throughput turns into a dead site taking up space on the internet.

why do people still use tables, inline css, et al? [closed]

When you learn HTML and so forth nowadays, the mantra is always "Clean code = better code".
So why do sites like MobileMe and Google and Facebook all use huge amounts of tables and other non-semantically-correct code?
Thanks!
Because people still use IE6, unfortunately, and it's so incredibly bad at CSS as to make it almost worthless for CSS selectors of any sophistication. Until IE6 is gone and dead dead dead in the cold ground, you're still going to see a lot of this.
If you could see what SharePoint generates, you would probably go into seizures.
Clean code is better, yes.
But working code is much much better )
Because sometimes that's the path of least resistance. It's not always about being ideologically pure, it's about being pragmatic and getting the job done in this crazy, multi-browser, multi-platform world.
Because it's easier.
While the purist in me will also strive for semantic tags and external CSS for layout, the pragmatist in me needs to get this site up by 6pm (so I can go home to my wife and a nice warm dinner!), and there's just this little problem with [insert browser here*] that could easily be solved with a bit of conditional CSS, or a table or something.
There are other reasons for high-traffic sites like Google and Facebook to use inline CSS and JS: bandwidth. Every external file you reference is one extra round trip to the server to fetch. Of course, that doesn't really explain the style="xxx" attributes as opposed to just inline <style> blocks, but it still reduces the size of the page. Also, while caching does tend to reduce the number of trips, there is still a significant number of requests made with a "clean" cache, and you want to optimise for that case too.
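To spell out that round-trip trade-off with a tiny sketch (file names are placeholders):

<!-- One fewer request: small critical rules inlined in the page itself -->
<style>
  .header { height: 60px; background: #fff; }
</style>

<!-- Versus an extra round trip the first time through, but cacheable after -->
<link rel="stylesheet" href="/css/site.css">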
* Not always IE (but it mostly is)
I had an affiliate marketing client the other day who wanted me to make him a web template where he could go in and edit it with Adobe Dreamweaver (some app I don't use because I'm a Linux user). So, being the web-savvy guy I am, I did it in XHTML with cross-platform CSS that I have learned over the years, using DIVs primarily, and only using TABLES for form field alignment simply because of the 80/20 rule. It was lean code in as few lines as possible, loaded super fast, and worked on all browsers from IE6 on up.
But then I handed it off to him, and he was visibly frustrated and asked me to make changes. He didn't like the CSS because he couldn't cut and paste sections to another page and have the styling carry over. Instead, he wanted me to switch everything to inline styles. Next, he couldn't edit the floating DIVs very well, and would undo my cross-platform work I had done, so he wanted it reverted back to tables. The end result was XHTML + CSS for the shell of the page that centers everything into the middle and adds the fancy graphics around the page. Then, I used PHP "include" separation for headers and footers. The final part was the middle of the page, and that was his domain. I had to compose it in TABLEs with inline styles. At that point, he was happy and I had a compromise.
So, keep this in mind that there are some cases where you have to use TABLE formatting and inline styles because that's all the client really knows how to manipulate.

Why should I not use HTML frames? [closed]

I haven't used frames since 1998. They seem like a bad idea and in all my development I've never had a situation where frames were the right solution, or even a decent solution.
However, I'm now working with an internal web application written by another group, and the entire site is built as a header / left-side-menu / right-side-content frameset.
For one, when VPN'd into my network I constantly get a "website.com/frames.html cannot be found" error message. This doesn't happen when I'm on the internal network.
Second, the app has a built in email/messaging system. The number of unread messages is shown in the left side menu frame as "Messages (3)" but the count doesn't update as I read the messages. The developer told me since it was in a frame I needed to right click on the menu and 'Refresh'. Seriously????
So, my programming related question is, what reasons do you have for not using frames in a website?
Although they solved a problem at the time they were created (updating part of a "page" while keeping in place a non-updating part), framesets were criticised in terms of usability pretty much from the start, as they break generic functions of the browser, such as:
bookmarking, and copy-and-pasting URLs to share
printing the page as displayed on the screen
reloading the page: since the URL has generally not changed, you will often be taken back to the site's homepage or default frameset; manually reloading some frames is possible, but not obvious to the user
back and forward buttons are ambiguous: do they undo/redo the last frame change, or take you to the last time the URL bar changed?
The heaviest burden of avoiding framesets - including the same content on every page - is trivial to solve if you are using any server-side language to generate your HTML, even if all it provides is a "server side include". Unlike framesets, a server-side include could occur anywhere on the page; building a site with a server-side scripting language or templating system has other obvious advantages too.
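For example, with Apache's mod_include (the paths here are placeholders), the shared parts can be merged in anywhere in the page, something a frameset can't do:

<!-- page.shtml: navigation and footer are stitched in server-side,
     so the browser receives one complete page with one URL. -->
<body>
  <!--#include virtual="/includes/nav.html" -->
  <main>
    ...page-specific content...
  </main>
  <!--#include virtual="/includes/footer.html" -->
</body>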
There is still an advantage to being able to update small areas of the page without reloading the entire content, which can be achieved via AJAX. This sometimes leads people to create interfaces with all the problems of framesets outlined above, but that is hardly an argument in favour of framesets. Again, a site built with well-designed AJAX functionality can achieve things which framesets don't even begin to address.
One good reason to avoid frames today is that they have been made obsolete in HTML5: Chapter 11, Obsolete features
11.2 Non-conforming features
Elements in the following list are entirely obsolete, and must not be used by authors:
[...]
frame
frameset
noframes
Either use iframe and CSS instead, or use server-side includes to generate complete pages with the various invariant parts merged in.
The #1 reason? Users hate them.
Even if they offered advantages in other areas (separation of code, application design, speed etc) they are part of the user interface. If users don't approve, don't use them.
Frames were vaguely useful when you had a static web site, to avoid repeating the navigation menu on every page, for example. They also reduced the overall size of a page.
Both these arguments are obsolete now: sites don't hesitate to serve fat pages, and most of them are dynamically built, so including such navigational parts (or status, etc.) is no problem.
The "why" part is well answered above, partly by your own question (you hit a limitation, although it can be overridden with a bit of JS).
My number 1 reason not to use frames is because they break the bookmark (aka favorite) feature of browsers.
With the technology that exists today, frames have become obsolete. But if your legacy project still uses them, you can make the messages update with some AJAX.
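For instance, a hypothetical sketch of fixing the unread-count complaint from the question (the endpoint URL and JSON shape are assumptions):

// Poll the server and refresh the "Messages (n)" label in place,
// so nobody has to right-click a frame and hit Refresh.
function refreshMessageCount() {
  fetch('/api/unread-count')                  // assumed endpoint
    .then(res => res.json())
    .then(data => {
      document.getElementById('msg-count').textContent =
        'Messages (' + data.unread + ')';
    });
}

setInterval(refreshMessageCount, 30000);      // every 30 seconds
refreshMessageCount();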
Just because of the cell phone and iPad craze doesn't mean that highly functional, full-featured sites are suddenly "obsolete". Those who decided to make framesets obsolete seem to be the same complainers who never figured out their full potential in the first place, or maybe they're the lobbyists of the mega-corporate cell phone and tablet makers who couldn't be bothered to make a decent frames-capable browser for their itty-bitty screens.
Admittedly, iFrames can handle simple jobs like scrolling and/or displaying independent segments within a single page pretty well, and I use them for that inside my own frames-based website, but getting them to work as the foundation for a site itself is a nightmare. Trust me, I know, because my website is one of the most sophisticated frameset-based sites on the Internet and I've been looking at the pros and cons of transposing it all to iFrames. Nightmare is an understatement.
I can already hear the whiners saying, "Well why did you build it that way in the first place then?"... and the answer is A: because I'm not lazy, and B: because a frames-based site is the most functional, visually appealing, and user-friendly format for an information-based site with hundreds of pages of content that doesn't have to rely on a server. By that I mean all but the external advertising can be viewed straight off a flash drive. No MySQL or PHP needed.
Here are some of the issues I've encountered:
The objection to orphaned pages can be easily handled with JavaScript.
The objection regarding bookmarking is irrelevant unless you use no frames at all.
Content-specific bookmarking can be handled with an "Add Bookmark" JavaScript function.
The objection regarding SEO is easily handled by an XML sitemap and JavaScript.
Laying out dynamically sized frames is far easier and more dependable with standard framesets.
Targeting and replacing nested framesets from an external frame is easier with standard framesets.
In-house scripts like JavaScript searches and non-server-dependent shopping carts that are too complex for cookies don't seem possible with iFrames; or if they are, it's way more hassle to get them working than with standard frames.
All that being said, I like the single-page appeal of iFrames, and when they can actually do all the same stuff for my site as easily as standard frames do now, then I'll migrate. In the meantime, this nonsense about them being "obsolete" is as irksome as the other so-called "upgrades" they've foisted on us over the years without thinking it all the way through.
So what does all this boil down to for the question of whether or not to use framesets? The answer is that it all depends on what you want your site to do and on what platform it will mostly be viewed on. At some point it becomes impractical to make a multi-page site work well without some frames or iFrame integration. However if you're just creating a basic profile page that displays well on a cell phone or tablet, don't bother with framesets.
They almost always make people angry. What more do you need?
Frames are really useful on some occasions. If you are creating a local webpage that serves only for reading, with no interactivity involved, and the website will not be public on the internet, all the reasons not to use frames are removed. For example, in a user manual for an application that is developed solely in HTML, frames are really useful for keeping a table of contents on the left in a simple and easy-to-code way. Also, if you have proper navigation within the website, then the back-button ambiguity is removed completely.

Is true HTML debugging possible?

I've been a web developer for quite some time and what has helped me in learning is to visually see what is going on.
That's the reason for tools like Aardvark, Web Developer, Firebug and many others.
But when I saw the Gecko Reflow Videos, they just blew my mind.
Then my question is: is it possible to truly debug HTML (step through each element)? Or come close to it?
What I've been doing a lot is using Aardvark to remove elements, but Aardvark has its issues with "background" and same-size elements and not being able to target those.
UPDATE: I've been trying to write a good update for this question, since it has left me thinking about it more, but English isn't my primary language, so it's been tough.
In past years it was the browsers that had the task of becoming compatible with the standards. As they get closer to that goal, it is we who should be thinking about what we can truly create when browser-compatibility problems are minimal, and whether there are techniques we can utilize that make rendering a page faster.
We can think of the past decades as the early years of HTML/CSS, where the main goal was just to get the thing to work. Now we should be looking for techniques that speed up the current process. An example of this is in the video above, where the Gecko engine runs through the code twice. Why is that? And are there other instances where it's doing unnecessary things (even though they work and are compatible)?
This is something that clearly needs to be tested to be confirmed, hence my original question about a true debugger.
My $0.02:
"True" HTML debugging, in the sense you're talking about, is not technically possible, because there is no requirement of HTML user agents (web browsers) to render HTML elements in a particular order, nor is there anything like an atomic unit of execution like a "statement".
For instance, when rendering a table, should a user agent reserve space for each <tr> before rendering their child <td>s (breadth-first)? Or should it render each child <td> and each <td>s child and so forth (depth-first)? In practice, user agents make all kinds of guesses to try to render pages as quickly as possible. In other words, there would be no guarantee that debug-order will match actual render-order, nor should there be.
HTML can be thought of as a declarative language in this sense, in that it specifies what should be done (the page rendered to spec) but not exactly how to do it (exactly which order to render elements to the screen). In general, it's best to assume that everything happens at once, although the W3C does give some tips on speeding up <table> rendering based on how user agents should render <table> elements.
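One of those table tips in sketch form: with table-layout: fixed, column widths come from the first row (or the <col> elements), so the browser can start laying out the table before it has seen every cell.

<table style="table-layout: fixed; width: 100%">
  <col style="width: 30%">
  <col style="width: 70%">
  <tr><td>Label</td><td>Long content can stream in without reshuffling the columns</td></tr>
</table>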
IMO, the webdev toolbar and Firebug are the best we've got, where we can edit/disable specific HTML elements and CSS rules.
ok - serious answer.
Judging by the comments on the sites that I've followed from that link, I think that you and I both know that there probably isn't. There are a lot of smart blokes and blokettes on those threads, and they all seem to point towards "no, this is all clever $4!# that won't help us understand rendering".
However, I think that what your question might want to emphasise is that rendering at a browser level is very interesting.
Let me just throw this one out there. Do you think that putting body { overflow: scroll; } as a default might speed us up just a little???
In my professional opinion, there's really only one effective tool for time-factoring / assessing / debugging within the html milieu: The WebDev Iterator
Personally, I feel that as long as your HTML validates to the W3C spec, isn't that all that matters? One should develop their HTML to spec and let browser companies worry about their bugs (which are pretty rare these days) rather than focus on old browser mistakes of the past.
HTML Validator plugin for Firefox (aka Tidy) is all any web developer needs to see if their markup is correct, what's wrong, and where it's wrong.
Even if you could do true debugging, each browser parses HTML its own way, so even if you could step through Firefox to see how a rendering bug occurs, that won't help you with IE or Safari/Chrome at all, because they do their parsing in their own manner. This isn't like PHP, .NET or Java, where the parsing of the code is the same for everybody; debugging makes sense there.
Then my question is: is it possible to truly debug HTML (step through each element)? Or come close to it?
You could probably step through the page rendering process by running Firefox under gdb, or modify an open-source browser to have a "step" button, but I really doubt this will achieve anything useful.
CSS isn't that complicated: everything is basically a box with a width/height/padding/margin. The problem with web development (CSS particularly) is that every browser implements rendering slightly differently (some more differently than others).
If you want to know the render order to speed up your page load, I'd say you're going about this the wrong way. The browser rendering the page probably accounts for maybe 5% of the load time; the rest is page-generation time and network latency.
You could possibly shave 2ms off your page load by reordering some tags and using a different CSS positioning method, or you could reduce the page-generation time by 200ms by caching, and halve the network latency by setting up a second web server nearer your users. Compressing your logo better or minifying your JavaScript would most likely improve load time (universally, across all browsers!).
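As a minimal sketch of those server-side wins, assuming a Node/Express stack (my assumption, not the answer's):

// Gzip every response and let browsers cache static assets,
// attacking network latency rather than render time.
const express = require('express');
const compression = require('compression');

const app = express();
app.use(compression());                 // gzip responses
app.use(express.static('public', {
  maxAge: '30d'                         // long cache lifetime for assets
}));

app.listen(8080);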
Basically, if you're concerned about load time, there are much better places to start. If you're concerned about how the page is being rendered, Firebug(-Lite) and http://browsershots.org (or a virtual machine or two) are all you need!