We have a responsive business website built with HTML5. Our goal has always been to load the site fast so users can browse and get the information they need even on a slow connection. Since our users come from every part of the world, and internet speeds vary widely by country, the best option we have is to deliver the content as fast as possible. The main aim is to have the homepage optimized for speed, with the rest of the pages served quickly from cache. The steps taken so far:
main document - 10 kb
fonts are preloaded in head
reduced and minified css - final size 6 kb
use cdn for jquery.slim.min.js - size 25 kb
main image compressed and converted from 600 kb to webp - final size 20 kb
favicon - final size 940 bytes
use cdn for bootstrap js, async loaded - size 23 kb
font reduced - final size 20 kb
font awesome locally served - final size 1 kb
analytics (preconnected as well) including googletag, async loaded - 50 kb
.htaccess edited for browser caching (images, css, js, ico, webp, jpg, etc.) for repeat visits (a sketch of the kind of rules used is shown after this list)
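A sketch of that kind of .htaccess block, assuming Apache with mod_expires enabled (the MIME types and lifetimes here are illustrative, not necessarily the exact ones in use):

<IfModule mod_expires.c>
  ExpiresActive On
  # long-lived static assets so repeat visits come straight from the browser cache
  ExpiresByType image/webp "access plus 1 year"
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/x-icon "access plus 1 year"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>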
Having done all of that, the site loading has really gotten faster, and the Lighthouse results are consistent for both desktop and mobile.
GTmetrix and Pingdom have the site loading in 1-1.2 seconds consistently, with DOMContentLoaded at 910 ms. The results are pretty similar from different regions of the world (tested from 8 different countries). Our site is hosted on HostGator and they are gzipping content; the sizes above are the compressed sizes sent to the browser, before it deflates them. I have been studying all over the web, including Stack Overflow, to see what else can be optimized (for the first visit) to get to that sweet spot. I even minified the HTML, which negatively affected performance, so it was put back to the original. The homepage HTML file doesn't have any comments, so valuable bytes are not wasted, and we don't have any videos. What else can be done to shave off 200-300 milliseconds? :) Appreciate any valuable info/advice. Thank you.
WOW!! Great job you are doing. I think the only thing you might add to make it faster is lazy-loading for content and photos, if any, fetching them only when they enter the viewport.
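A minimal sketch of that idea, assuming the below-the-fold images are plain img tags with a placeholder data-src attribute (the class name and file names here are hypothetical):

<img class="lazy" data-src="img/gallery-1.webp" alt="" width="600" height="400">

<script>
  // Native lazy loading (loading="lazy" on the img tag) covers most modern browsers.
  // For finer control, an IntersectionObserver swaps in the real source as the image nears the viewport:
  const lazyObserver = new IntersectionObserver(function (entries, observer) {
    entries.forEach(function (entry) {
      if (entry.isIntersecting) {
        entry.target.src = entry.target.dataset.src; // start the download now
        observer.unobserve(entry.target);            // each image only needs this once
      }
    });
  });
  document.querySelectorAll('img.lazy').forEach(function (img) { lazyObserver.observe(img); });
</script>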
I built a website that is currently live at this link: Arema Insurance Agency.
The issue is that it takes too long to load the PNG images I have on there. Most of the body of the website was designed in Adobe Illustrator and imported as a PNG. The homepage PNG is 4.4 MB and is placed with width="1892" height="3082" in my HTML.
Is there a way I can optimize the speed?
There are lots of options here.
Optimise the images. Run them through an online compressor like Squoosh (a markup sketch is at the end of this answer).
Cache the website. How to do this is highly dependent on how the site is set up. Considering the site is made up of just three images (incredibly bad practice, by the way), I'd guess your setup probably isn't equipped for this.
Get better hosting. It shouldn't take 20 seconds to load a 4.4 MB webpage; I suspect your hosting service is terrible. Sticking to the big players such as CrazyDomains or HostGator is your best bet.
There are plenty of other ways detailed on Google, but start with these, as they are the easiest and most likely to have an effect on a site like this.
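To make the first suggestion concrete, one common pattern is to export the artwork at roughly the size it is displayed, compress it, and offer a WebP version with a PNG fallback. A rough markup sketch, with hypothetical file names and dimensions:

<picture>
  <!-- compressed WebP for browsers that support it -->
  <source srcset="img/homepage-1200.webp" type="image/webp">
  <!-- compressed PNG fallback, exported near its display size instead of 1892x3082 -->
  <img src="img/homepage-1200.png" width="1200" height="1955" alt="Homepage artwork">
</picture>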
You can use your browser's developer tools and check the Network panel (similar to where you check console.log statements).
In Chrome, press Ctrl+Shift+I and then open the Network tab. It will open the panel shown in the screenshot linked below.
From what I inspected, your HomePageOfficial_AIA.png is the file taking the most time to load, so you can optimize that image accordingly or check with your hosting support about it.
https://i.stack.imgur.com/1cxR7.png
I'm working on optimizing a site, and I've been able to get all of the pages to load in around 2 seconds just by optimizing the images with Photoshop, but I have one page that doesn't seem to react like the rest. When I open the developer tools in Chrome and watch the waterfall timeline of the loading elements, it shows a lag of over 3 seconds just to load the document. The fastest time I've ever been able to get the page to load is around 7 seconds, which is way longer than it should be since the page is only 895 kB.
Is this a server issue or is there some faulty code in the specific page? It makes me wonder because other pages on the same website load much faster even though their size is greater. Any help from you would be awesome!
Link to webpage: http://egi.utah.edu/corporate-associate-program/corporate-associate-list/
A cache plugin will probably solve your problem, since no query will be sent to the database unless something in the page has changed.
I have used two of them and both sped up sites. (But only use one; you can't use both at the same time.)
WP Rocket: http://wp-rocket.me/
WP Super Cache: https://wordpress.org/plugins/wp-super-cache/
I was able to solve the problem with your input. It turns out that it was an image slider plugin (FlexSlider) that was regenerating the images to fit the size of the element, which was slowing everything down so much. There were 7 large images that it regenerated on the server side each time the page was requested. All I had to do was stop the plugin from resizing the images by leaving it on its default setting, and I wrote some JavaScript to do the same thing locally instead, which sped up the loading time drastically. It's now loading in under 3 seconds when it was loading at around 10.
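A rough sketch of that kind of client-side sizing, assuming the slider images just need to be constrained to their container (treat the selector and the whole snippet as illustrative rather than the exact code used):

<script>
  // once everything has loaded, let the browser scale the full-size images
  // down to the slider's width instead of regenerating them on the server
  $(window).on('load', function () {
    $('.flexslider img').css({ maxWidth: '100%', height: 'auto' });
  });
</script>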
Thanks @drew010
The first background image I have on my website loads slowly.
Now, it's a background image, but it has to fit a large screen, so it's a 900 KB PNG.
What tricks can I use to make it load faster? (It's the first thing you see, so even the 3-4 seconds it takes is really felt.)
There are lots of tricks. The lads above in their comments are right: there are things you can do to compress it and to analyse what is taking the time.
Load times, and improving them, come down to three components:
- Network Latency (the time it takes to pass data from the browser to your host)
- Image Size (physical size of the image that goes down the pipe)
- Connection Cost (Time taken to establish the connection with the host)
Now, if we take your scenario, you have an image that is large, but not huge. So how can we make it faster? Well, we can reduce the time it takes to download by:
a) compressing it - there are many options here; along with the Google one mentioned above, here is a simple one: https://tinypng.com/
b) cutting out repeat downloads by ensuring the host sends a Cache-Control header with a max-age in it. Within that window the browser shows the image straight from its cache, and when it does revalidate, the host can return a 304 instead of the full image.
c) using a CDN such as Cloudflare to do the hard stuff for you. A CDN will handle the compression, set the cache headers, reduce your network latency and ensure your connections are established quickly. Some, like Cloudflare, have free starting tiers, so it's worth a look.
d) lazy loading the image using JavaScript after the page has rendered. It's still good to do all of the above, but if you write some JavaScript that fires on document ready and sets the background (via CSS, an img tag, or whatever mechanism you are currently using), the image download starts at that point, letting your users see and interact with the page before the background has finished loading. A sketch follows below.
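A minimal sketch of option d), assuming the background belongs to a .hero element and the image path is just a placeholder:

<script>
  // wait for the initial render, then kick off the background download
  document.addEventListener('DOMContentLoaded', function () {
    document.querySelector('.hero').style.backgroundImage = "url('img/background-large.png')";
  });
</script>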
Besides improving website performance, I consider it an important courtesy to site visitors to make a point of compressing all images.
(Not to insult anyone's intelligence here but, just to clarify, compressing an image is not the same as "shrinking" its width/height, although both will reduce file size.)
If you have 1000 visitors and your site has 100 images and you save a half second of load time per image, then you just saved mankind 14 hours of page load time! (and it only takes a few seconds to compress an image.)
I compress the occasional "one-off" manually with:
TinyPNG.com for .png and .jpg files (up to 20 images at a time)
ezGIF.com for animated .GIFs.
Batches of images can be compressed programmatically using a library like:
pngquant, or
PHP's Imagick extension (ImageMagick).
As far as I know, these all use the same general approach, lossy compression, which can reduce file size significantly without any noticeable degradation of image quality.
I've been working on a website, and we've managed to reduce the total content for a page load from 13.7 MiB to 2.4 MiB, but the page still takes forever to load.
It's a Joomla site (ick) with a lot of redundant DOM elements (2000+ for the home page), and it makes 60+ HTTP requests per page load, counting all the CSS, JS, and image requests. Unlike Drupal, Joomla won't merge them all on the fly, and they have to be kept separate or else the Joomla components will go nuts.
What can I do to improve load time?
Things I've done:
Added background colors to DOM elements that have large images as their background, so the color paints first and then the image (see the CSS sketch after this list)
Reduced excessively large images to much smaller file sizes
Reduced DOM elements to ~2000, from ~5000
Loading CSS at the start of the page, and JavaScript at the end
Not totally possible; Joomla injects its own JavaScript and CSS, and it always does so in the header.
Minified most JavaScript
Set up caching and gzipping on the server
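A quick sketch of that background-colour trick from the list above (selector, colour, and image path are made up for illustration):

/* the solid colour paints immediately; the large image fills in once it finishes downloading */
.hero-banner {
  background-color: #2b3a4a;
  background-image: url("images/banner-large.jpg");
  background-size: cover;
}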
Uncached size is 2.4 MB, cached is ~300 KB, but even so, with that many DOM elements the page takes a good bit of time to render.
What more can I do to improve the load time?
Check out this article.
http://www.smashingmagazine.com/2010/01/06/page-performance-what-to-know-and-what-you-can-do/
If the link gets removed or lost, the tools mentioned are:
YSlow (by Yahoo)
Google's Page speed
AOLs web page test
Smush.it (Image compression tool)
It sounds like you've done a great job of working around the real problem: those giant graphics. You can probably squeeze some more efficiency out of caching, minifying, etc., but there has to be a way to reduce the size of the images. I worked with a team of some of the pickiest designers on Earth and they never required uncompressed JPEGs. Do you mean images cut out of Photoshop and saved at full quality (10)? If so, the real solution (and I appreciate that you may not be able to accomplish this) is to have a hard conversation where you explain to the design company, "You are not your users." If the purpose of the site is only to impress other visual designers with the fidelity of your imagery, maybe it's OK. If the purpose of the site is to be a portfolio that gains your company work, they need to re-assess who their audience is and what that audience wants. Which, I'm guessing, is not 2-minute load times.
Have you enabled HTTP Compression (gzip) on your web servers? That will reduce the transfer size of all text-based files by 60-90%. And your page will load 6-10x faster.
Search StackOverflow or Google for how to enable it. (It varies per server software: Apache, IIS, etc).
Are all the DOM elements necessary? If they are, is it possible to hide them as the page loads? Essentially, you would have your important, need-to-be-there DOM elements render with the page, and then, once the document is loaded, you unhide the rest of the elements as necessary:
$('.hidden').removeClass('hidden')
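For that to work, the class has to start out hidden in the CSS and only be removed once the document is ready. A minimal sketch (the .hidden class name is just the one used above):

<style>
  /* kept out of the initial render */
  .hidden { display: none; }
</style>
<script>
  // reveal the deferred elements once the DOM is ready
  $(function () {
    $('.hidden').removeClass('hidden');
  });
</script>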
Evidently, there are plugins for compressing and combining JS/CSS files:
http://extensions.joomla.org/extensions/site-management/site-performance
http://extensions.joomla.org/extensions/site-management/site-performance/7350
I would say you can't do anything. You're close to the absolute limit.
Once you get to a critical point, you'll have to compare the amount of effort you'd need to put into further compressing the site against the effort of throwing the most bandwidth-expensive components out the window. For example, you might spend 20 hours compressing your site by a further 1%. You might also spend 40 hours throwing the Joomla theme away and starting again, saving 50% (though probably not).
I hope you find an easy answer. I feel your pain, as I am struggling to finish a Drupal project that has been massacred by a designer who hasn't implemented his own code in years and has become institutionalized in a cubicle somewhere near a tuna salad sandwich.
How much faster is it to use a base64-encoded inline image to display images, as opposed to simply linking to the actual file on the server?
url(data:image/png;base64,.......)
I haven't been able to find any type of performance metrics on this.
I have a few concerns:
You no longer gain the benefit of caching
Isn't a base64 string A LOT larger than the PNG/JPEG file itself?
Let's define "faster" as: the time it takes for a user to see a fully rendered HTML web page
'Faster' is a hard thing to answer because there are many possible interpretations and situations:
Base64 encoding will expand the image by a third, which will increase bandwidth utilization. On the other hand, including it in the file will remove another GET round trip to the server. So, a pipe with great throughput but poor latency (such as a satellite internet connection) will likely load a page with inlined images faster than if you were using distinct image files. Even on my (rural, slow) DSL line, sites that require many round trips take a lot longer to load than those that are just relatively large but require only a few GETs.
If you do the base64 encoding from the source files with each request, you'll be using up more CPU, thrashing your data caches, etc., which might hurt your server's response time. (Of course you can always use memcached or the like to resolve that problem.)
Doing this will of course prevent most forms of caching, which could hurt a lot if the image is viewed often - say, a logo that is displayed on every page, which could normally be cached by the browser (or a proxy cache like squid or whatever) and requested once a month. It will also prevent the many many optimizations web servers have for serving static files using kernel APIs like sendfile(2).
Basically, doing this will help in certain situations, and hurt in others. You need to identify which situations are important to you before you can really figure out if this is a worthwhile trick for you.
I have done a comparison between two HTML pages containing 1800 one-pixel images.
The first page declares the images inline:
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC">
In the second one, images reference an external file:
<img src="img/one-gray-px.png">
I found that when loading the same image multiple times, if it is declared inline, the browser performs a "request" for each copy (I suppose it base64-decodes it once per instance), whereas in the other scenario, the image is requested only once per document.
The document with inline images loads in about 250ms and the document with linked images does it in 30ms.
(Tested with Chromium 34)
The scenario of an HTML document with multiple instances of the same inline image doesn't make much sense a priori. However, I found that the jQuery Lazyload plugin defines an inline placeholder by default, and every "lazy" image's src attribute gets set to it. So, if the document contains lots of lazy images, a situation like the one described above can happen.
You no longer gain the benefit of caching
Whether that matters would vary according to how much you depend on caching.
The other (perhaps more important) thing is that if there are many images, the browser won't get them simultaneously (i.e. in parallel), but only a few at a time -- so the protocol ends up being chatty. If there's some network end-to-end delay, then many images divided by a few images at a time multiplied by the end-to-end delay per image results in a noticeable time before the last image is loaded.
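As a rough worked example with made-up numbers: 60 images fetched over 6 parallel connections is about 10 sequential rounds, and at 200 ms of end-to-end delay per round that is roughly 2 seconds spent on round trips alone before the last image arrives; a single document with the images inlined avoids those rounds entirely.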
Isn't a base64 A LOT larger in size than what a PNG/JPEG file size?
The file format / image compression algorithm is the same, I take it, i.e. it's still PNG.
Using Base64, each 8-bit character carries only 6 bits of data, so the binary data is inflated by a ratio of 8 to 6, i.e. it becomes roughly 33% larger (a 30 kB PNG, for example, becomes about 40 kB of Base64 text).
How much faster is it
Define 'faster'. Do you mean HTTP performance (see below) or rendering performance?
You no longer gain the benefit of caching
Actually, if you're doing this in a CSS file it will still be cached. Of course, any changes to the CSS will invalidate the cache.
In some situations this could be used as a huge performance boost over many HTTP connections. I say some situations because you can likely take advantage of techniques like image sprites for most stuff, but it's always good to have another tool in your arsenal!
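For illustration, inlining a small icon in a cached stylesheet looks something like this (the selector is hypothetical and the Base64 payload is truncated):

/* the icon ships inside the stylesheet, so it is cached along with the CSS and costs no extra request */
.icon-logo {
  width: 16px;
  height: 16px;
  background-image: url("data:image/png;base64,iVBORw0KGgo...");
}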