I recently enabled Brotli compression on one of the CDN platforms that we use. I was expecting performance to improve, since resource sizes are reduced by 15-30%, but to my surprise the performance is still the same.
I checked various metrics and they all look the same, except TTFB, where I see an increase of 10-15 milliseconds per resource.
Has anyone seen this before, and if so, what are the best ways to go about this issue? I also suspect that Chrome might be taking longer to decompress the resources when they are Brotli rather than gzip, but unfortunately I do not have any way to measure that time.
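For what it's worth, the browser never exposes decompression time directly, but the Resource Timing API does show wire size versus decoded size and a per-resource timing breakdown, which at least confirms whether Brotli is shrinking anything. A minimal console sketch, assuming same-origin resources (cross-origin ones need a Timing-Allow-Origin header before these fields are populated):

    // Paste into the devtools console after a full page load.
    performance.getEntriesByType('resource').forEach((r) => {
      const wire = r.transferSize;          // bytes over the network (0 if served from cache)
      const decoded = r.decodedBodySize;    // bytes after the content-encoding is removed
      const ttfb = r.responseStart - r.requestStart;     // waiting time
      const download = r.responseEnd - r.responseStart;  // content download time
      console.log(r.name,
        `wire=${wire}B decoded=${decoded}B`,
        `ttfb=${ttfb.toFixed(1)}ms download=${download.toFixed(1)}ms`);
    });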
There is not enough detail here to answer the question. Performance is relative, and so the gains from Brotli may be getting drowned out by bigger performance issues with your site.
Some questions for you to answer for yourself:
Is Brotli set up correctly and working? Can you see br as the content-encoding in the developer tools network tab? Note you may need to add the content-encoding column (there is also a quick console check sketched after this list).
Are you using HTTPS on your site (required by all browsers to use Brotli)? Did you move to HTTPS as part of this change? Is your HTTPS optimised?
Has the overall size of your site gone down now that you've enabled Brotli? If so, by how much? If you have lots of 10MB print-quality images on your site and you have changed your HTML from 50KB to 45KB, then you may not see much of an overall difference.
How long does your page take to generate? If your page takes 30 seconds to generate because the HTML is dynamic and your backend (app server, data server, whatever) is slow, then going to 29.5 seconds won't seem like much.
Do you have lots of render-blocking CSS and JavaScript? These are text, so they should hopefully be delivered faster now, but if they are massively complex and client-side processing time is large, then the download time may be an insignificant part of the total.
Are you testing from your company office while sitting 50 metres from your data centre with a high-speed 1000Mbps Ethernet connection that is basically talking straight into the web server? If so, download times are going to be negligible no matter what the size of the downloads.
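For the first point, a quick way to verify the encoding without hunting for the devtools column is to request a resource from the console and read the response header. A rough sketch; the URL is just a placeholder for one of your own same-origin assets (cross-origin responses may hide the header):

    // Run in the devtools console on your own site.
    fetch('/css/main.css', { cache: 'no-store' }).then((res) => {
      console.log('content-encoding:', res.headers.get('content-encoding')); // expect "br"
      console.log('content-type:', res.headers.get('content-type'));
    });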
Brotli should compress text smaller. It can take longer/more processing power to do that compression than gzip but the network gains versus the CPU costs are usually worth it.
It is not magic however, and cannot make up for other performance issues on a site.
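If you want to see that size/CPU trade-off on your own assets, Node's built-in zlib module can compress the same file both ways. A rough sketch; the file path is a placeholder, and exact numbers will vary with content and quality settings:

    const fs = require('fs');
    const zlib = require('zlib');

    const input = fs.readFileSync('./dist/app.js'); // placeholder: any text asset

    let t = process.hrtime.bigint();
    const gz = zlib.gzipSync(input, { level: 9 });
    const gzMs = Number(process.hrtime.bigint() - t) / 1e6;

    t = process.hrtime.bigint();
    const br = zlib.brotliCompressSync(input, {
      params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 }, // max quality, common for pre-compressed static files
    });
    const brMs = Number(process.hrtime.bigint() - t) / 1e6;

    console.log(`original : ${input.length} B`);
    console.log(`gzip -9  : ${gz.length} B in ${gzMs.toFixed(1)} ms`);
    console.log(`brotli 11: ${br.length} B in ${brMs.toFixed(1)} ms`);

On typical minified JS or CSS, Brotli at high quality comes out somewhat smaller but takes noticeably longer to compress, which is why CDNs often pre-compress static assets at high quality and use a lower quality for dynamic responses.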
NB: since the answer to this could involve JavaScript or PHP programming, or general networking, or IT systems, I put it here, but if some mod thinks it's better suited for SuperUser or ServerFault, I won't object to it being moved.
I have a landing page to which I'm driving traffic through PPC. I've set up AWS CloudWatch to get RUM data, and the page is performing terribly — an average load time of 9.9s and a max of 21.5s!
I've done all of the "standard" optimization I can think of or research. The site is built with WordPress, running on Apache on an EC2 instance. I've:
Upgraded the EC2 instance to ensure I have enough memory
Written a custom plugin to filter out any other plugins that aren't explicitly used on the landing page in question
Customised my theme code so that it sets proper srcset and creates the correct image sizes on upload
Minified all the JS and CSS that I include through plugins or themes I've written
Put the site itself behind a CloudFront distribution
Installed the WP Super Cache plugin, and created a separate CDN distribution on CloudFront for it
Set appropriate cache control headers on CloudFront and told it to gzip everything
Put a facade in place of any videos
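On the video facade point: the idea is to render a lightweight placeholder and only inject the real embed when someone clicks. A rough sketch, assuming a YouTube-style iframe embed and a placeholder element carrying a data-video-id attribute (both just illustrative names, not how the actual page is built):

    // Swap a lightweight placeholder for the real iframe only on click.
    document.querySelectorAll('.video-facade').forEach((facade) => {
      facade.addEventListener('click', () => {
        const iframe = document.createElement('iframe');
        iframe.src = `https://www.youtube.com/embed/${facade.dataset.videoId}?autoplay=1`;
        iframe.allow = 'autoplay; encrypted-media';
        iframe.setAttribute('allowfullscreen', '');
        iframe.width = '560';
        iframe.height = '315';
        facade.replaceWith(iframe); // the heavy player script only loads at this point
      });
    });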
The site is blazing fast for me — "load" is less than 1 second. But my RUM says that's not the case for my users. So, I dug a little deeper. 70.2% of my visitors use Chrome, and 27.7% show up as "other", of which almost a third is the Android Browser — which, as I understand it, is just some sort of "Chrome Lite" — so nearly 80% of my visitors are using some Chrome variant.
Sure enough — if I load the page on Safari (to ensure nothing has expired on CloudFront), clear my browser cache, and reload the page, the first request shows a waiting time of 21.2ms, TTFB of 22.6ms, and download time of 4.8ms. The whole page shows that it's finished loading in 973ms.
Firefox is slightly slower, with the first request taking 100ms total, and the whole load about 1.75s — not blazing fast, but still within the understood "2 second" limit for good user experience.
On the other hand, for Chrome that same first request takes almost 570ms waiting and 208ms download. So, just the first request (which is 36k in size) takes almost as long to load in Chrome as the whole site takes in Safari. And that repeats for every single request, where both the waiting time and the download time are an order of magnitude slower on Chrome than on Safari (on the same device, on the same network).
I would think "waiting" and "download" times would be primarily network driven, but I can repeat this all day long and the results are the same.
I might just assume that Chrome is not optimized for the Mac on which I'm running it, but, as I said, this all started with RUM data, so it's clearly not that. As much as I might like to, I obviously can't force all of my visitors to swap their Android devices for iPhones. Equally obviously, I can't have an average load time of 10 seconds.
So, why is my site so slow on Chrome? What else can I do to optimize this?
The landing page in question is here: https://www.chrisrichardson.info/lp/prague-b/
Note, a lot of the optimization I've done is for that page in particular, so other pages on the site might perform even worse, but I don't care about that, at least at the moment.
Hahahaha.
OK, just leaving this here for posterity's sake. The 10x latency was because Chrome's devtools had preserved 3G throttling from a previous session. So, if you stumble upon this problem, check your throttling.
That still doesn't address my RUM issue, however, but I'll open that up as a different question.
I have a website that runs just fine on my local server. It's quick, responsive, and overall runs great.
But when I put it on the server of my domain host, it sometimes takes excessively long to load assets. For example, Chrome's Network developer tools show one 1MB PNG file taking 2.31 seconds to load.
So is this likely due to poor implementation of my code, or is it possibly a crappy server? (The company subscribes at the lowest tier possible to host their content.) My internet connection is quick, so I doubt it's that.
I think it is probably a problem with your host. An image is an image :) there aren't a hundred ways to implement one!
Oversized images always take longer to load, so you should keep your images as small as possible. To lower the content download time, you can optimize/compress the image without degrading its visual quality. If you use graphics software to optimize the images, use the "Save for Web" option. This will reduce the size of the images and hence the image load time.
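If you would rather script this than click "Save for Web" by hand, an image library can do the same resize-and-recompress step at build or upload time. A sketch using the sharp package for Node (my choice of tool here, not something the question's setup requires; the file names are placeholders):

    // npm install sharp
    const sharp = require('sharp');

    // Resize a large upload down to display width and recompress it.
    // A 1MB PNG photo usually comes out far smaller as a quality-80 progressive JPEG.
    sharp('uploads/hero-original.png')
      .resize({ width: 1600, withoutEnlargement: true }) // never upscale
      .jpeg({ quality: 80, progressive: true })
      .toFile('public/img/hero.jpg')
      .then((info) => console.log(`wrote ${info.size} bytes`))
      .catch(console.error);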
Furthermore, you can use a CDN to serve your website's static assets, like images, CSS, JS, and videos. A CDN replicates your website files to a geographically distributed network of servers called POPs and serves resources from the location nearest to the visitor, which means your assets will load faster.
Use an SSD-based host. SSDs have far better read/write rates than traditional HDDs, so a solid state drive will serve files noticeably faster than a spinning disk.
Here's a question for you: is the image you're trying to load a CSS background image, or is it an image tag?
I assume that if you put some JavaScript code in an external source (and use the src="" attribute) it's a little slower, because the page then has to download another resource, but I'm wondering whether that's inconsequential.
From testing I've done online (with webpagetest.org), the difference seems quite small (< 5% of the total page load time).
But I'm just wondering what's happening "under the hood": whether the browser downloading that bit separately, rather than it coming across from the server with the rest of the page, is actually just as fast (because it happens in parallel).
Not slow enough to matter.
I think the speed difference question is a red herring. Generally, you should keep your script separate from your HTML:
Separation of concerns: the HTML is the structure of the site, whereas the script is its behavior. Mingling them together mixes those concerns, and it's best practice to keep your script in a separate file.
It might seem counter-intuitive that script served separately from html could be just as fast or even faster, but things like caching proxy servers, content delivery networks, and even new web protocols like SPDY can make the speed question completely moot.
If you test with Firebug in Firefox you'll see that Firefox downloads multiple files at the same time (the number of concurrent downloads differs per browser). But the main reason you should put JS code in external files is that it can be minified and compressed on the server side, and also cached by the browser. Loading it from an external file also has the benefit of letting you serve it from a static (cookie-less) domain and use a CDN to speed up delivery. So, to answer your question: it will be slower to put it in the page, as the browser will need to download it every time it loads the page.
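On the minification point, this is normally done once at build time rather than on every request. A sketch using the terser package (my choice, not something the answer above prescribes; the paths are placeholders):

    // npm install terser
    const fs = require('fs');
    const { minify } = require('terser');

    (async () => {
      const source = fs.readFileSync('src/app.js', 'utf8');
      const result = await minify(source, { compress: true, mangle: true });
      fs.writeFileSync('dist/app.min.js', result.code);
      console.log(`minified ${source.length} -> ${result.code.length} bytes`);
    })();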
I'm the primary developer at a web firm, but often end up doing some sysadmin stuff, and was wondering what resources are available for learning how to troubleshoot slow page load times.
I have almost no experience with sysadmin tools. I'm relatively proficient at the Linux/Unix command line, but have never used any kind of packet-tracking software and only know the basics of using dig for IP lookups. My experience with Apache and MySQL is mostly limited to the initial setup and then using them.
Are there any good books or websites that cover the topics needed for accurately diagnosing website performance and bottlenecks, and if so, what are they? Or is the range of technologies too large, and is hands-on experience with them typically how people get good at this?
Overall, there's no substitute for experience. A problem as broad as "slow web page load time" could have its bottleneck in any number of different places:
Client is slow to resolve IP from domain.
Network between client and domain is slow or congested.
Server is slow to respond to request.
Requested page is large.
Requested resources embedded in page are large.
Page contains server-side code that requires significant processing.
Database is slow to respond.
Page manipulates a lot of data before responding.
Rendered page on client contains a lot of code and runs slowly.
etc.
For any given page, it's a matter of knowing where the bottlenecks could be and determining which one it is in order to address it. Having a full and complete understanding of everything that goes on from end to end in "loading a page" is essential. Identifying patterns of slow load times across multiple disparate requests will help narrow down the potential bottlenecks.
It's very much a case-by-case scenario.
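On the browser side, one way to see which of the buckets above is eating the time is the Navigation Timing API, which splits a page load into DNS, connect, server wait, download, and client-side phases. A quick console sketch:

    // Paste into the devtools console after the page has finished loading.
    const nav = performance.getEntriesByType('navigation')[0];
    console.table({
      'DNS lookup (ms)': nav.domainLookupEnd - nav.domainLookupStart,
      'TCP/TLS connect (ms)': nav.connectEnd - nav.connectStart,
      'Server wait / TTFB (ms)': nav.responseStart - nav.requestStart,
      'HTML download (ms)': nav.responseEnd - nav.responseStart,
      'DOM processing (ms)': nav.domContentLoadedEventEnd - nav.responseEnd,
      'Full load (ms)': nav.loadEventEnd,
    });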
Have you diagnosed exactly where the problems are? Slow load times could be caused by anything, such as:
too much going on in the dom/js
a) not caching js or other resources on the client
b) not minifying/compressing resources
c) making too many requests with ajax/doing silly things in the browser like redrawing dom that doesn't need it.
too much going on in the server
a) no cache on db tables, no indexes
b) not handling long running tasks asynchronously
c) improperly configured proxies/apache servers
network issues -- wish I knew more about this.
Step 1 is always to figure out where the worst slowdown is. Do some metrics on the server to make sure it is doing easy stuff fast. And hard stuff reasonably fast. Look in the browser to see how long loading resources is taking. Look in the chrome/firebug profilers to see how much time the javascript is taking to run.
You will probably find a bunch of things that could be improved. Prioritize and address the issues...
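For the server-side metrics, even a crude per-request timer will show whether the backend is the slow part. A bare-bones sketch in plain Node (the same idea works as middleware in whatever framework you use; the 500 ms budget is arbitrary):

    const http = require('http');

    http.createServer((req, res) => {
      const start = process.hrtime.bigint();
      res.on('finish', () => {
        const ms = Number(process.hrtime.bigint() - start) / 1e6;
        if (ms > 500) {
          console.warn(`SLOW ${req.method} ${req.url} took ${ms.toFixed(1)} ms`);
        }
      });

      // ...real request handling goes here; placeholder response below.
      res.end('ok');
    }).listen(3000);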
I am thinking that, to save server load, I could load common JavaScript files (the jQuery src) and maybe certain images from sites like Google (which are almost never down, and always pretty fast, maybe faster than my server).
Will it save much load?
Thanks!
UPDATE: I am not so much worried about saving bandwidth as I am about reducing server load, because my server struggles when there are a lot of users online, and I think this is because there are too many images/files being served from my single server.
You might consider putting up another server that does nothing but serve your static files using an ultra efficient web server such as lighttpd
This is known as a content delivery network, and it will help, although you should probably make sure you need one before you go about setting it all up. I have heard okay things about Amazon S3 for this (which Twitter, among other sites, uses to host their images and such). Also, you should consider Google's API cloud if you are using any popular JavaScript libraries.
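If you do go the Google-hosted route for something like jQuery, the usual pattern is to load it from their CDN and fall back to your own copy if that fails. A sketch (the local path is a placeholder); this line goes in an inline script right after the Google-hosted jQuery script tag (ajax.googleapis.com/ajax/libs/jquery/...):

    // If the CDN copy did not load, write out a script tag pointing at our own copy.
    window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');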
Well, there are a couple things in principle:
Serving up static resources (.htm files, image files, etc.) rarely even makes a server breathe hard except under the most demanding of circumstances (thousands of requests in a very short period of time)
Google's network is most likely faster than yours, and most everyone else. ;)
So if you are truly not experiencing any bandwidth problems, I don't think offloading your images, etc. will do much for you. However, as you move stuff off to Google, it frees up your server's bandwidth for more concurrent requests and faster transfer on the existing ones. The only tradeoff here is that clients will experience a slight (most likely unnoticeable) initial delay while DNS looks up the other servers and initiates the connection to them.
It really depends on what your server load is like now. Are there lots of small web pages and lots of users? If so, then the 50K taken up by jQuery could mean a lot. If all of your pages are fairly large, and/or you have a small user base, caching jQuery with Google might not help much. Same with the pictures. That said, I have heard anecdotal reports (here on SO) that loading your scripts from Google does indeed provide a noticeable performance improvement. I have also heard that Google does not necessarily have 100% uptime (though it is close), and when it is down it is damned inconvenient.
If you're suffering from speed problems, putting your scripts at the bottom of the web page can help a lot.
I'm assuming you want to save costs by offloading commonly used resources to the web at large.
What you're suggesting is called hotlinking: directly linking to other people's content. While it can work in most cases, you do lose control of the content, which means your website may change without your input. Since images hosted on Google are scoured from other websites, the images may be copyrighted, causing some (potential) concern, or there may be anti-hotlinking measures that block the images from your webpage.
If you're just working on a hobby website, you can consider hosting your resources on a free web account to save bandwidth.