I would like to know if there are any web standards regarding web page file size: how much should the total assets on a page weigh, assuming users have an average internet connection?
Thanks
It depends entirely on your users' platform.
Mobile
These are estimates; it's surprisingly difficult to find good comparison charts.
Normal phone networks: anywhere from 5-20 kbps
EDGE is pretty slow, at around 40 kbps as far as I can tell
3G will download at a maximum of around 80 kbps
4G will be incredibly fast and is the future, but it isn't out yet
It is important though to realise that mobile users have a higher tolerance for slow loads. So a 3 second wait for a mobile page shouldn't put many people off.
Broadband
Depending on your visitors' location, broadband speeds can vary considerably.
http://www.speedtest.net/global.php#0
Africa has an average of 1.93 Mbps, South America 3 Mbps, and the rest of the world is > 6 Mbps. These (as far as I can tell) are averages.
So if your visitor base is biased towards Africa or South America, you should aim for a lighter page design. For the rest of the world it becomes less of an issue.
As broadband speeds improve worldwide, page sizes will become largely irrelevant. But that day is not here yet!
The Importance of Speed
I studied online responsiveness a bit and can't find resources at the moment to back this up, but as far as I know Google did some research and found users were sensitive to page delays of just 30 ms, meaning they had a favourable bias towards whichever of two otherwise equal pages loaded at least 30 ms faster. So speed can help you get a leg up on the competition, and the human brain is a lot more sensitive to speed than we might assume. Remember, though, that users on different platforms have different tolerance levels.
Recommendation
The faster the better! A well-designed, CSS-driven layout will naturally be small and have a lot of other benefits down the line (SEO). If your page is loading slowly, find the bottleneck (server or file sizes) and aim to reduce it so that the page loads naturally and speedily on your audience's target platforms.
There are no real standards, seeing as connection speeds vary massively across the planet and even from user to user.
For general pointers on keeping sizes down, see Google's Pagespeed rules: Minimize payload size
Don't forget about latency, either. If you have to have a lot of data on a page, it's much better to deliver it in one or two large items (e.g. one CSS sprite image instead of 20 small ones). For every item, you incur at least one round trip's worth of latency delay.
Even if your page is "small" by whatever standard you use, if it has a large number of items on it, it's going to load slowly for remote users. This is one of the main benefits of CDNs: moving content closer to users so latency is lower.
There is no standard that limits the size of a web page to a fixed size, so I can't quote you a source that says "a web page must be no greater than 100kb" as no such source exists.
The size of the page is entirely related to the audience visiting it. If you have an "average" audience, you could confidently send 200 kB (inclusive of all images, includes and HTML). If you are targeting mobile devices, you will want to make your pages much smaller than this. If you are publishing data for consumption by online services, it may be acceptable to send megabyte pages.
So this is where, as a web designer, you need to use your judgement to decide how big to make the page, and to test its speed in conditions similar to those of your audience.
Aside from page size, there are various things you can do to make your requests faster, including HTTP compression and minimising the number of requests (e.g. combining your scripts into a single include, placing script includes at the bottom of your page, and so on).
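As a rough sketch of what that looks like in the markup (the file names here are just placeholders for your own combined includes):

    <!-- one combined stylesheet in the head... -->
    <head>
        <link rel="stylesheet" href="/css/site.min.css">
    </head>
    <body>
        <!-- ...page content... -->
        <!-- ...and one combined script just before the closing body tag -->
        <script src="/js/site.min.js"></script>
    </body>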
Here is a helpful tool you can use with Firebug inside Firefox to view the details of each request:
https://developer.yahoo.com/yslow/
I maintain a Django forum where users can leave messages and photos for one another (or for the public at large). Think of it as a counterpart to 9gag.
I benchmarked my website's performance via GTmetrix. One of the things it rated me low on is specifying image dimensions. Specifically, it advises me to add width and height attributes to all <img> tags, so that browsers have an easier time rendering the page, minimizing repainting.
All uploaded images are dynamically resized and saved in the DB, maintaining their aspect ratio. The only way for me to include their heights and widths in their <img> tags in the template is if I saved their width and height in the DB as well, or read their width and height when serving them for viewing.
Either way, it's going to take some computing power on my part. It seems the trade-off here is between the server's computing time and the users' browsers', i.e. server-side versus client-side. I do want to optimize the experience for my users, and my server resources are decent.
A new image gets uploaded to my website's public area every few seconds; you could say it's pretty much a staple of the website. Does anyone have experience of a big performance boost from including image widths and heights in their templates? I'd love to hear some advice on it, so I can go ahead with it (or dump the idea).
Thanks in advance.
If you know the precise layout of your site, then you could resize the image on upload and specify the dimensions using CSS.
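As a rough sketch, assuming every thumbnail is resized to a known 200x150 on upload (the class name and dimensions are just examples):

    /* thumbnails are resized server-side to fixed dimensions,
       so the stylesheet can reserve the space before the image arrives */
    .thumb img {
        width: 200px;
        height: 150px;
    }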
If your site is responsive and the images scale according to device dimensions, then I wouldn't worry too much about this optimisation suggestion. Twitter scores an F (0) for this on GTmetrix, and only D and B grades overall, but still has a faster load time than most sites.
An afterthought as suggested by Hassan:
My suggestion is that you focus on the internals of Django (e.g. caching templates, using memcached, etc.), then optimise delivery of static content (e.g. compression in Nginx, say) and configure your web server's caching policies. This is where you'll notice a large performance gain. Lastly, if you have tons of page views, you may want to consider deploying Varnish or Squid in front of your web server.
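A minimal sketch of the Nginx side of that suggestion, assuming static files are served from a /static/ location (paths and values are placeholders):

    # compress text-based responses
    gzip on;
    gzip_types text/css application/javascript application/json image/svg+xml;

    # long browser-cache lifetime for static assets
    location /static/ {
        expires 30d;
        add_header Cache-Control "public";
    }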
I'm developing a very graphical website where images are a very important part of the site. In my first attempts the average page size is between 1.5 MB and 2.5 MB, which I think is too much, but I would like to hear some advice on a reasonable upper limit for a website.
Some considerations:
Our target audience is from all over Spain, which means not everybody will have a fibre connection: many of them will have a 10 Mbps ADSL connection, and some of them will have a 3 Mbps ADSL connection
It's not a primary target, but the owners want the website to also be available on smartphones (although images are more important than smartphone accessibility)
What's your opinion? Thank you
Believe it or not, average page size has exploded over the last few years and now sits pretty squarely in that range.
http://httparchive.org/trends.php
I wouldn't worry too much about it, but definitely do everything you can to minify and optimize whatever code you have on the site. If your page needs a lot of images, it needs a lot of images.
I've found lots of information about measuring page load time and quite a bit about profiling the FPS performance of interactive applications, but this is something slightly different from that.
Say I have a chart rendered in SVG and every click I make causes the chart to render slightly differently. I want to get a sense of the complete time elapsed between the click and the point in time that the pixels on the screen actually change. Is there a way to do this?
Measuring the JavaScript time is straightforward, but that doesn't take into account any of the time the browser spends doing layout, reflow, paint, etc.
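(For reference, by "straightforward" I mean something like the following, where chart and redrawChart stand in for my own code; it only times the script work, since layout and paint happen after the handler returns.)

    chart.addEventListener('click', function (event) {
        var start = performance.now();
        redrawChart(event);  // placeholder for the SVG re-render
        console.log('script time: ' + (performance.now() - start) + ' ms');
    });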
I know that Chrome's Timeline view shows a ton of good information about this, which is great for digging into issues but not so great for taking measurements, because the tool itself affects performance and, more importantly, it's Chrome only. I was hoping there was a browser-independent technique that might work, something akin to how the Navigation Timing API works for page load times.
You may consider using HDMI capture hardware (just Google for it) or a high-speed camera to create a video, which can then be analyzed offline.
http://www.webpagetest.org/ supports capturing using software only, but I guess it would be too slow for what you want to measure.
I've been working on a website, and we've managed to reduce the total content for a page load from 13.7 MiB to 2.4 MiB, but the page still takes forever to load.
It's a Joomla site (ick), it has a lot of redundant DOM elements (2000+ for the home page), and it makes 60+ HTTP requests per page load, counting all the CSS, JS, and image requests. Unlike Drupal, Joomla won't merge them all on the fly, and they have to be kept separate or else the Joomla components will go nuts.
What can I do to improve load time?
Things I've done:
Added colors to DOM elements that have large images as their background, so the color loads first and then the image (see the sketch after this list)
Reduced excessively large images to much smaller file sizes
Reduced DOM elements to ~2000, from ~5000
Loading CSS at the start of the page, and JavaScript at the end
Not totally possible; Joomla injects its own JavaScript and CSS, and it always does so in the header.
Minified most javascript
Set up caching and gzipping on the server
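Roughly what the background-color trick above looks like (the color value and image path are just examples):

    /* the solid color paints immediately; the large image replaces it once downloaded */
    .banner {
        background-color: #2b3a4a;                       /* approximate dominant color of the image */
        background-image: url(/images/banner-large.jpg); /* example path */
    }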
The uncached size is 2.4 MB and the cached size is ~300 KB, but even then, with so many DOM elements the page takes a good bit of time to render.
What more can I do to improve the load time?
Check out this article.
http://www.smashingmagazine.com/2010/01/06/page-performance-what-to-know-and-what-you-can-do/
If the link gets removed or lost, the tools mentioned are:
YSlow (by Yahoo)
Google's PageSpeed
AOL's WebPagetest
Smush.it (image compression tool)
It sounds like you've done a great job of working around the real problem: those giant graphics. You can probably squeeze some more efficiency out of caching, minifying, etc., but there has to be a way to reduce the size of the images. I worked with a team of some of the pickiest designers on Earth, and they never required uncompressed JPEGs. Do you mean images cut out of Photoshop and saved at full quality (10)? If so, the real solution (and I appreciate that you may not be able to accomplish this) is to have a hard conversation where you explain to the design company, "You are not your users." If the purpose of the site is only to impress other visual designers with the fidelity of your imagery, maybe that's OK. If the purpose of the site is to be a portfolio that gains your company work, they need to re-assess who their audience is and what that audience wants. Which, I'm guessing, is not 2-minute load times.
Have you enabled HTTP Compression (gzip) on your web servers? That will reduce the transfer size of all text-based files by 60-90%. And your page will load 6-10x faster.
Search StackOverflow or Google for how to enable it. (It varies per server software: Apache, IIS, etc).
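For example, on Apache with mod_deflate enabled it's roughly the following (a sketch only; IIS and Nginx have their own equivalents):

    # compress the text-based content types
    AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript application/json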
Are all the DOM elements necessary? If they are, is it possible to hide them as the page loads? Essentially, you would have your important, need-to-be-there DOM elements render with the page, and then, once the document has loaded, unhide the rest of the elements as necessary:
$('.hidden').removeClass('hidden')
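A fuller sketch of that idea, assuming jQuery is already on the page and .hidden is a utility class you add to the deferred elements:

    <style>
        /* deferred elements are kept out of the initial render */
        .hidden { display: none; }
    </style>
    <script>
        // once the rest of the page has loaded, reveal them
        $(window).on('load', function () {
            $('.hidden').removeClass('hidden');
        });
    </script>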
Evidently, there are plugins for compressing and combining JS/CSS files:
http://extensions.joomla.org/extensions/site-management/site-performance
http://extensions.joomla.org/extensions/site-management/site-performance/7350
I would say you can't do anything. You're close to the absolute limit.
Once you get to a critical point, you'll have to compare the amount of effort you'd need to put into further compressing the site against the effort of throwing the most bandwidth-expensive components out of the window. For example, you might spend 20 hours compressing your site by a further 1%, or you might spend 40 hours throwing the Joomla theme away and starting again, saving 50% (though probably not).
I hope you find an easy answer. I feel your pain, as I am struggling to finish a Drupal project that has been massacred by a designer who hasn't implemented his own code in years and has become institutionalized in a cubicle somewhere near a tuna salad sandwich.
So I was looking at the Facebook HTML with Firebug, and I chanced upon this image
and came to the conclusion that Facebook uses this large image (with tricky image-positioning code) rather than many small ones for its graphical elements. Is this more efficient than storing many small images?
Can anybody give any clues as to why Facebook would do this?
These are called CSS sprites, and yes, they're more efficient - the user only has to download one file, which reduces the number of HTTP requests to load the page. See this page for more info.
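A minimal sketch of how a sprite is used (icons.png and the offsets are just examples): one image holds every icon, and each class merely shifts the background position.

    .icon        { width: 16px; height: 16px; display: inline-block; background: url(icons.png) no-repeat; }
    .icon-search { background-position: 0 0; }
    .icon-mail   { background-position: -16px 0; }
    .icon-user   { background-position: -32px 0; }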
The problem with the pro-performance viewpoint is that it always seems to present the "Why" (performance), often without the "How", and never "Why Not".
CSS sprites do have a positive impact on performance, for reasons that other posters here have gone into in detail. However, they do have a downside: maintainability. Removing, changing, and particularly resizing images becomes more difficult, mostly because of the alterations that need to be made to the background-position-riddled stylesheet along with every change to the size of a sprite or the shape of the map.
I think it's a minority view, but I firmly believe that maintainability concerns should outweigh performance concerns in the vast majority of cases. If the performance is necessary, then go ahead, but be aware of the cost.
That said, the performance impact is massive, particularly when you're using rollovers and want to avoid the effect you get when you mouse over an image and the browser goes away to request the rollover image. It's appropriate to refactor your images into a sprite map once your requirements have settled down, particularly if your site is going to be under heavy traffic (and certainly the big examples people have been pulling out - Facebook, Amazon, Yahoo - all fit that profile).
There are a few cases where you can use them at basically no cost. Any time you're slicing an image, using a single image and background-position is probably cheaper. Any time you've got a standard set of icons, especially if they're of uniform size and unlikely to change. Plus, of course, any time when performance really matters and you've got the budget to cover the maintenance.
If at all possible, use a tool and document your use of it so that whoever has to maintain your sprites knows about it. http://csssprites.org/ is the only tool I've looked into in any detail, but http://spriteme.org/ looks seriously awesome.
The technique is dubbed "css sprites".
See:
What are the advantages of using CSS Sprites in web applications?
Performance of css sprites
How do CSS sprites speed up a web site?
Since other users have answered this already, here's how to do it, and another way is here.
Opening connections is more costly than simply continuing a transfer. Similarly, the browser only needs to cache one file instead of hundreds.
Yet another resource: http://www.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/
One of the major benefits of CSS sprites is that they add virtually zero server overhead and are handled entirely client-side. A huge gain for no server-side performance hit.
Simple answer: you only have to 'fetch' one image file, and it is 'cut up' for different views. If you used multiple images, that would mean multiple files to download, which simply equates to additional time to download everything.
Cutting the large image up into 'sprites' means one HTTP request, and it also provides a no-flicker approach for 'onmouseover' elements (if you reuse the same large image for a mouseover effect).
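For example, a rollover done with one sprite (the file name and offsets are placeholders): both states live in the same image, stacked vertically, so the hover state is already downloaded and there's no flicker.

    .nav-button       { width: 80px; height: 24px; background: url(buttons.png) no-repeat 0 0; }
    .nav-button:hover { background-position: 0 -24px; }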
The CSS sprites technique is a method for reducing the number of image requests by using background-position.
Best Practices for Speeding Up Your Web Site
CSS Sprites: Image Slicing’s Kiss of Death
Google also does it - I've written a blog post on it here: http://www.stevefenton.co.uk/Content/Blog/Date/200905/Blog/Google-Uses-Image-Sprites/
But the essence of it is that you make a single HTTP request for one big image, rather than 20 small HTTP requests.
If you watch an HTTP request, it spends more time waiting to start downloading than actually downloading, so it's much faster to do it in one hit - chunky, not chatty!