Advice on total page size - html

I'm developing a very graphical website where images are a very important part of the site. In my first attempts the average page size is between 1.5 MB - 2.5 MB, which I think is too much, but I would like to hear some advice on a reasonable upper limit for a website.
Some considerations:
Our target audience is all of Spain, which means not everybody will have a fiber connection; many will have a 10 Mb ADSL connection, and some a 3 Mb ADSL connection
It's not a primary target, but the owners want the website to be available on smartphones as well (although images are more important than smartphone accessibility)
What's your opinion? Thank you

Believe it or not, average page size has exploded over the last few years, and it now sits pretty squarely in that range.
http://httparchive.org/trends.php
I wouldn't worry too much about it, but definitely do everything you can to minify and optimize whatever code you have on the site. If your page needs a lot of images, it needs a lot of images.

Related

How do I get our website to load under 1 second?

We have a responsive business website built with HTML5. Our goal has always been to load the site fast so users can browse and get the required info even if they are in a slow-bandwidth area. Since our users are from every part of the world and the internet connection is not up to speed in every country, the best option we have is to deliver the content as fast as possible. The main purpose is to have the homepage optimized for speed; the rest of the pages can then be served quickly from cache. The steps taken so far:
main document - 10 kb
fonts preloaded in the head
reduced and minified CSS - final size 6 kb
CDN for jquery.slim.min.js - size 25 kb
main image compressed and converted from 600 kb to WebP - final size 20 kb
favicon - final size 940 bytes
CDN for Bootstrap JS, async loaded - size 23 kb
font reduced - final size 20 kb
Font Awesome locally served - final size 1 kb
analytics (preconnected as well) including googletag, async loaded - 50 kb
.htaccess edited for browser caching (images, css, js, ico, webp, jpg etc.) for repeat visits
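For reference, the browser-caching rules in .htaccess look roughly like this (a sketch, not the exact file: the MIME types and lifetimes are illustrative, and mod_expires must be available on the server):

```apache
# Hedged sketch of far-future caching for static assets via mod_expires.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/webp "access plus 1 month"
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/x-icon "access plus 1 year"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```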
Having done all of that, site loading really has gotten faster, and the consistent Lighthouse results for desktop and mobile respectively are provided below:
GTmetrix and Pingdom have the site loading in 1-1.2 seconds consistently. DOMContentLoaded - 910 ms. The results are pretty similar from different regions of the world (tested from 8 different countries). Our site is hosted on HostGator, and they are gzipping content. The sizes above are for the files as compressed and sent to browsers, before deflating. I have been studying all over the web, including Stack Overflow, to see what can be optimized (for the first visit) to get to that sweet spot. I even minified the HTML, which negatively affected performance, so it was put back to the original. The homepage HTML file doesn't have any comments, so valuable bytes are not wasted, and we don't have any videos. What else can be done to shave off 200-300 milliseconds? :) Appreciate any valuable info/advice. Thank you.
Wow, great job! I think you might add only one thing to make it faster: lazy-load content and photos, if any, and fetch them whenever they enter the viewport.
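A minimal sketch of that idea: in the page you would let an IntersectionObserver swap the real source into each image as it approaches the viewport. The decision itself is just a visibility check, expressed below as a pure function (the function name and the 200 px margin are illustrative choices, not part of any API):

```javascript
// A lazy-load decision, expressed as a pure function. In the browser an
// IntersectionObserver with rootMargin: "200px" performs this check for
// you before you copy data-src into src. The 200 px margin is an
// illustrative choice, not a required value.
function shouldLoad(elementTop, elementHeight, scrollY, viewportHeight, margin = 200) {
  // Load once any part of the element is within `margin` px of the viewport.
  return elementTop < scrollY + viewportHeight + margin &&
         elementTop + elementHeight > scrollY - margin;
}

// An image 1500 px down the page, with an 800 px-tall viewport:
shouldLoad(1500, 300, 0, 800);   // false - still well below the fold
shouldLoad(1500, 300, 600, 800); // true - scrolled close enough to fetch it
```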

How do I optimize a very loooong page with TONS of images on it?

I have been tasked with optimizing a marketing landing page for speed. It already does really well, but the problem is it's very graphics-heavy.
The entire thing, I would guesstimate, is 30+ pages long, all on one page (it must be like this, and everything must be images, for conversion reasons).
Here's what I've done so far, but I'm not sure if there's something else I'm missing:
I re-compressed over 140 JPEGs on the page to slightly smaller sizes
I played around with sprites, but most of the images are large (like entire testimonial boxes that are 600px wide). The sprite images ended up being about 2 MB, and that page actually took longer to load for some reason (by almost 2 s), so I took them off
The MOST important thing is that everything immediately at the top loads as fast as humanly possible, and before anything else, so that the sale isn't lost by someone staring at a bunch of images loading out of order. I used some pre-cached images with this method: http://perishablepress.com/press/2009/01/18/css-image-caching/ - it did seem to work quite well on the smaller images, but when I add the background (which is very graphics-intensive) everything seems to slow down at once, as if the browser is only fetching a certain number (5?) of images at a time. Ideally I'd love to group the first 4 smaller images being pre-cached, then follow up by focusing all bandwidth on the background, then the rest of the entire page in standard order.
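The "first these four, then the background, then everything else" ordering can be approximated by starting loads strictly one after another. A hedged sketch of the sequencing logic only: `loadInOrder` and `loadOne` are illustrative names, and in a real page `loadOne` would create an `Image` and resolve on its `onload`; it is injected here so the ordering stands on its own.

```javascript
// Load URLs strictly in sequence, so earlier (more important) assets
// get the bandwidth first. `loadOne` is an injected loader; in a browser
// it would be something like:
//   url => new Promise(resolve => {
//     const img = new Image(); img.onload = resolve; img.src = url;
//   });
async function loadInOrder(urls, loadOne) {
  const finished = [];
  for (const url of urls) {
    await loadOne(url); // do not start the next request until this one is done
    finished.push(url);
  }
  return finished;
}
```

The four small header images, the background, and then the remaining images would simply be concatenated into the `urls` array in that priority order.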
The Google Page Speed score is 90/100. The only thing it's concerned about is unused styles, but I'm not really worried about that because it's about 2 KB total... the overhead from the 7 MB of images is far more important.
I have the option to use a CDN for the images, but I'm not sure how I'd go about doing this, or whether it would actually help.
I feel I've done all I can but then again I could totally be missing something...
A CDN would definitely help, and with 140 pictures I hope it contains more than just one server. Link directly to the IP of the servers to avoid unnecessary DNS lookups.
Minimize HTTP requests. You mention that you've been experimenting with sprites. I still believe sprites are the way to go, but you might not want to create just one huge image; if you have 140 images, you should probably have about 10 sprites.
Make sure that you have set headers to make the client cache all content. Remember ETags.
Gzip/compress content for browsers that support it.
If the source code is lengthy, make sure to flush the buffer early.
Consider lazily loading images below the fold; there are a number of JavaScript plugins to accomplish this.
scrolling pagination + combining images as sprites on a per-page basis.
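For reference, a sprite sheet is consumed with background-position offsets, one HTTP request serving many images. A minimal sketch (the file name, class names, sizes, and offsets are all illustrative):

```css
/* One sheet, many icons: one request instead of several. */
.icon { background-image: url(sprites/group1.png); width: 32px; height: 32px; }
.icon-cart { background-position: 0 0; }
.icon-user { background-position: -32px 0; }
```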

What is maximum size of image that can be used on website?

I.e., in pixels and in MB? I've been wondering this for a while. Thanks!
The maximum size of a JPEG image is 65535 x 65535 pixels.
The maximum size of a PNG image is 2^31-1 x 2^31-1 pixels. You would have great difficulty constructing an image this large due to memory constraints on typical computers.
Some older platforms cannot operate on files that are over two gigabytes in size. Of course, two gigabyte image files would be awkward to work with in most situations, so unless you're doing astronomy with amazing telescopes, I really wouldn't worry about it.
Since most displays are under 1920 x 1200 pixels, it would probably make sense to resize your images down to this size, unless your intention is to allow your clients to make photographic reproductions of your images -- in which case, give your clients as many pixels as you can.
The answer is that there is no such limit, but from a practical point of view, using images larger than 2 MP (1600x1200) on your website won't make sense; it won't be any more useful or easier for a wide audience. With respect to size in MB, if you're seeking a practical limit, an image under 2 MB would likely serve you in any case.
Eventually the host computer will run out of memory and can't load the image - but I think it is safe to say that you'll never make that happen.
Images for webpages can be as big as you like, but you need to think about the convenience (or lack thereof) for users loading very big images on connections that are increasingly mobile and, to say the least, unstable.

Web standards - Page size

I would like to know if there are any web standards regarding web page file size: how much should the total assets on a page weigh, considering that users have an average internet connection?
Thanks
It depends entirely on your users' platform.
Mobile
These are estimates; surprisingly, it is difficult to find good comparison charts.
Normal phone networks: anywhere from 5-20 kbps.
EDGE is pretty slow, at around 40 kbps as far as I can tell.
3G will download at around 80 kbps at most.
4G will be incredibly fast and is the future, but it is not out yet.
It is important though to realise that mobile users have a higher tolerance for slow loads. So a 3 second wait for a mobile page shouldn't put many people off.
Broadband
Depending on your visitors location, can have varying speeds.
http://www.speedtest.net/global.php#0
Africa has an average of 1.93 Mbps, South America 3 Mbps, and the rest of the world is > 6 Mbps. These (as far as I can tell) are averages.
So if your visitor base is weighted towards Africa or South America, you should aim for a lighter page design. For the rest of the world it becomes less of an issue.
As broadband speeds improve world wide, page sizes will become largely irrelevant. But that day is not here yet!
The Importance of Speed
I studied online responsiveness a bit and can't find the resources at the moment to back this up, but as far as I know Google did some research and found users were sensitive to page delays of just 30 ms, meaning they would have a favourable bias towards the faster of two pages with equal content when one loaded >= 30 ms sooner. So speed can give you a leg up on the competition, and the human brain is a lot more sensitive to speed than we might assume. Remember, though: different platform users have different tolerance levels.
Recommendation
The faster the better! A well designed, CSS driven layout will naturally be small and have a lot of other benefits down the line (SEO). If your page is loading slowly, find the bottleneck (server or filesizes) and aim to reduce so that the page load is natural and speedy on your audiences target platforms.
There are no real standards, seeing as connection speeds vary massively across the planet and even from user to user.
For general pointers on keeping sizes down, see Google's Pagespeed rules: Minimize payload size
Don't forget about latency too. If you have to have a lot of data on a page, it's much better to have it be in 1 or 2 large items (i.e. 1 css sprite image instead of 20 small items). For every item, you are going to incur at least 1 round trip latency delay.
Even if your page is "small", by whatever standard you use, if it has a large number of items on it, it's going to load slowly for remote users. This is one of the main benefits of CDNs - moving content closer to users so latency is lower.
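The arithmetic behind that round-trip cost can be sketched quickly. The 100 ms RTT and six parallel connections below are assumptions (six per host was the classic HTTP/1.1 browser limit), and the model deliberately ignores transfer time:

```javascript
// Lower bound on latency cost: each batch of parallel requests pays
// at least one round trip; actual transfer time is ignored entirely.
function latencyCostMs(items, rttMs = 100, parallel = 6) {
  return Math.ceil(items / parallel) * rttMs;
}

latencyCostMs(20); // 400 ms in round trips alone for 20 small items
latencyCostMs(2);  // 100 ms for the same data combined into 2 items
```

Even as a crude lower bound, this shows why combining 20 small files into 1 or 2 large ones pays off for remote users.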
There is no standard that limits the size of a web page to a fixed size, so I can't quote you a source that says "a web page must be no greater than 100kb" as no such source exists.
The size of the page is entirely related to the audience visiting it. If you have an "average" audience, you could confidently send 200 KB (inclusive of all images, includes and HTML). If you are targeting mobile devices, you will want to make your pages much smaller than this. If you are publishing data for consumption by online services, it may be acceptable to send megabyte pages.
So this is where as a web-designer you need to use your judgement to decide how big to make the page and to test the speeds in conditions similar to those of your audience.
Aside from page size, there are various things you can do to make your requests faster, including HTTP compression and minimising the number of requests (i.e. combining your scripts into a single include, placing script includes at the bottom of your page, and so on).
Here is a helpful tool you can use with Firebug inside of Firefox to view the details of each request...
https://developer.yahoo.com/yslow/

Reducing load time, or making the user think the load time is less

I've been working on a website, and we've managed to reduce the total content for a page load from 13.7 MiB to 2.4 MiB, but the page still takes forever to load.
It's a Joomla site (ick), and it has a lot of redundant DOM elements (2000+ for the home page) and makes 60+ HTTP requests per page load, counting all the CSS, JS, and image requests. Unlike Drupal, Joomla won't merge them all on the fly, and they have to be kept separate or else the Joomla components will go nuts.
What can I do to improve load time?
Things I've done:
Added background colors to DOM elements that have large images as their background, so the color loads first, then the image
Reduced excessively large images to much smaller file sizes
Reduced DOM elements to ~2000, from ~5000
Loading CSS at the start of the page and JavaScript at the end
Not totally possible: Joomla injects its own JavaScript and CSS, and it always does so in the header.
Minified most JavaScript
Set up caching and gzipping on the server
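The first item looks something like this in CSS (the selector and color are illustrative): the flat color paints immediately while the image downloads, so the user sees a placeholder instead of a blank region.

```css
/* Paint a matching flat color instantly; the image replaces it on arrival. */
.hero {
  background-color: #2b3a4a;
  background-image: url(images/hero.jpg);
}
```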
Uncached size is 2.4 MB, cached is ~300 KB, but even so, with so many DOM elements the page takes a good bit of time to render.
What more can I do to improve the load time?
Check out this article.
http://www.smashingmagazine.com/2010/01/06/page-performance-what-to-know-and-what-you-can-do/
If the link gets removed or lost the tools mentioned are:
YSlow (by Yahoo)
Google's Page speed
AOL's WebPagetest
Smush.it (Image compression tool)
It sounds like you've done a great job of working around the real problem: those giant graphics. You can probably squeeze some more efficiency out of caching, minifying, etc., but there has to be a way to reduce the size of the images. I worked with a team of some of the pickiest designers on Earth, and they never required uncompressed JPEGs. Do you mean images cut out of Photoshop and saved at full quality (10)? If so, the real solution (and I appreciate that you may not be able to accomplish this) is to have a hard conversation where you explain to the design company, "You are not your users." If the purpose of the site is only to impress other visual designers with the fidelity of your imagery, maybe it's OK. If the purpose of the site is to be a portfolio that gains your company work, they need to re-assess who their audience is and what that audience wants. Which, I'm guessing, is not 2-minute load times.
Have you enabled HTTP Compression (gzip) on your web servers? That will reduce the transfer size of all text-based files by 60-90%. And your page will load 6-10x faster.
Search StackOverflow or Google for how to enable it. (It varies per server software: Apache, IIS, etc).
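On Apache, for example, a minimal sketch looks like this (it assumes mod_deflate is enabled, and the MIME type list is illustrative):

```apache
# Hedged sketch: compress text-based responses with mod_deflate.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```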
Are all the DOM elements necessary? If they are, is it possible to hide them as the page loads? Essentially, you would have your important need-to-be-there DOM elements render with the page, and then, once the document has loaded, unhide the rest of the elements as necessary:
$(function () { $('.hidden').removeClass('hidden'); }); // runs after the DOM is ready
Evidently, there are plugins for compressing and combining JS/CSS files:
http://extensions.joomla.org/extensions/site-management/site-performance
http://extensions.joomla.org/extensions/site-management/site-performance/7350
I would say you can't do anything. You're close to the absolute limit.
Once you get to a critical point, you'll have to weigh the effort of further compressing the site against the effort of throwing the most bandwidth-expensive components out the window. For example, you might spend 20 hours further compressing your site by 1%. You might also spend 40 hours throwing the Joomla theme away and starting again, saving 50% (though probably not).
I hope you find an easy answer. I feel your pain, as I am struggling to finish a Drupal project that has been massacred by a designer who hasn't implemented his own code in years and has become institutionalized in a cubicle somewhere near a tuna salad sandwich.