Can somebody point me to a decent tutorial explaining the various HTTP header settings that influence caching (for proxies, the browser, how they interact, and possible quirks in browsers and caching engines)?
I think it is a somewhat neglected feature -- at least I do not make use of it. Caching CSS, JS and images (pretty much everything served from the cookieless domain) should speed things up a bit.
Caching tutorial -- this is a pretty good article describing the HTTP caching process and related headers.
There is a good article about optimizing caching on the Let's make the web faster website. All the articles there are worth checking out.
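To make the header settings concrete, here is a minimal sketch (plain Node.js, no framework; the paths, max-age value and port are just illustrative assumptions) of serving static assets with long-lived caching headers while letting HTML be revalidated:

```js
// Sketch only: long-lived caching for static assets, revalidation for HTML.
// No path sanitization or content-type detection is done here.
const http = require('http');
const fs = require('fs');
const path = require('path');

http.createServer((req, res) => {
  if (req.url.startsWith('/static/')) {
    fs.readFile(path.join(__dirname, req.url), (err, data) => {
      if (err) { res.writeHead(404); res.end(); return; }
      res.writeHead(200, {
        // cache for a year in browsers and shared proxies
        'Cache-Control': 'public, max-age=31536000, immutable',
        'Expires': new Date(Date.now() + 31536000 * 1000).toUTCString()
      });
      res.end(data);
    });
  } else {
    // HTML: allow caching but force revalidation on every request
    res.writeHead(200, { 'Cache-Control': 'no-cache', 'Content-Type': 'text/html' });
    res.end('<p>hello</p>');
  }
}).listen(8080);
```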
To speed things up, you can also:
reduce the number of requests needed to load your page
compress your HTTP responses with gzip (a sketch follows this list)
combine your CSS/JS files and minify them, for example with YUI Compressor or Google Closure Compiler
put small images into one big sprite so that they are loaded with a single request (there are online tools for putting images together)
Implement SPDY support on your server -- that's probably not something you'll do, but it's an interesting idea worth mentioning
you can find the spec here if you want to refer to it.
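For the compression point in the list above, here is a rough sketch of gzip-compressing a response with Node's built-in zlib module (in practice Apache/nginx or a framework middleware usually does this for you):

```js
// Sketch only: compress the body with gzip when the client advertises support for it.
const http = require('http');
const zlib = require('zlib');

http.createServer((req, res) => {
  const body = '<html><body>...lots of repetitive markup...</body></html>';
  const acceptsGzip = /\bgzip\b/.test(req.headers['accept-encoding'] || '');

  if (acceptsGzip) {
    zlib.gzip(body, (err, compressed) => {
      if (err) { res.writeHead(500); res.end(); return; }
      res.writeHead(200, { 'Content-Encoding': 'gzip', 'Content-Type': 'text/html' });
      res.end(compressed);
    });
  } else {
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(body);
  }
}).listen(8080);
```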
I'm trying to reduce the size of my website, but to do that I need a reliable tool to measure the size of my pages.
I used to use Google Lighthouse; its performance audits report the size, but it's not precise and it's inconsistent with the network tab.
I tried several combinations of curl, but I can't make it crawl the website correctly.
I tried several combinations of wget, but it couldn't handle gzip or Brotli encoding correctly.
I came to the conclusion that wget and curl are not the right tools, because they don't evaluate JS, so they can't handle conditional loading of assets.
I'm now trying with Puppeteer and PhantomJS, but I still haven't managed to do it.
Does anyone have a good solution for this?
How to Measure Size
Web browsers make a lot of decisions about what to download based on their particular context (for example, what compression algorithms they support). It's difficult to replicate those conditions in an external tool such as curl. So you'll want to use a tool that thinks like a browser (or is a browser).
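Since the question already mentions Puppeteer, here is a minimal sketch of such a "tool that is a browser" (it assumes an npm-installed puppeteer package and uses example.com as a placeholder URL). It sums the content-length of every response and falls back to the decoded body size when that header is missing, so treat the total as approximate:

```js
// Sketch only: add up the size of every response for a single page load.
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  let totalBytes = 0;

  page.on('response', async (response) => {
    const lengthHeader = response.headers()['content-length'];
    if (lengthHeader) {
      totalBytes += parseInt(lengthHeader, 10);
    } else {
      try {
        totalBytes += (await response.buffer()).length; // decoded size, not on-the-wire size
      } catch (e) { /* some responses (e.g. redirects) have no body */ }
    }
  });

  await page.goto('https://example.com', { waitUntil: 'networkidle0' });
  console.log(`~${(totalBytes / 1024).toFixed(1)} KB transferred`);
  await browser.close();
})();
```

That only gives you one sample from one context, which brings up the next point.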
The server can also choose to send different content based on visitor information (user agent, whether they're logged in, geolocation, etc.) or even completely arbitrary conditions (like a randomized image). So you'll want to look at more than one sample, preferably from many user agents and locations.
Most tools don't provide that kind of power.
The closest thing I can suggest is WebPageTest. It uses an actual web browser to visit your site and reports an analysis of that visit, including total page weight (even broken down by different page events). WebPageTest has an API and can even be run locally. Output is available as JSON, so you can parse it and do custom reporting with CLI apps.
How to Speed Up a Website
The technical question of "weighing" a website aside, there's a broader problem you're trying to tackle: how to speed up your website. There is a lot of information available for performance optimization.
Specifically, there's a lot of discussion about what metrics should be considered when evaluating a page's performance, how much weight should be given to each metric, and how to use that information to prioritize optimizations.
When considering page weight, I would highly recommend breaking it down by how many bytes are necessary to accomplish certain tasks. Google recommends thinking about resources in terms of the critical rendering path - the HTML, blocking JS, and non-deferred CSS necessary to construct a web page.
You may have a 1MB page where render-critical assets only make up 10KB of the page - that's a very fast site. Or you may have a 1MB page where 500KB are required for an initial render - not so fast. WebPageTest helps break down those weights by event for you.
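As a small, generic illustration of keeping bytes off the critical rendering path (this is not specific to WebPageTest, and the stylesheet name is a placeholder): non-critical CSS can be attached after the first render instead of blocking it:

```js
// Sketch only: load below-the-fold styles after the page has rendered.
window.addEventListener('load', function () {
  var link = document.createElement('link');
  link.rel = 'stylesheet';
  link.href = '/css/non-critical.css'; // hypothetical stylesheet for non-critical styles
  document.head.appendChild(link);
});
```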
I wish I could give more technical detail about using WebPageTest with CLI tools. It's something I plan to explore soon. But for now, hopefully this will give you a good start.
Have you tried PageSpeed Insights?
Analyze your website and read the optimization guidelines.
I would like to know which queries are making my Joomla website slow. I know that it is possible to log all the slow queries.
Unfortunately, I only have FTP access to where the website is hosted. Can I see this log via FTP too, or do I have to access the server?
Is there any other way to see this log?
Thanks
As Rinos already said, there can be different reasons for a slow Joomla site.
If you cannot find a DB query that is responsible, check the network tab in your developer tools for resources that slow down your site.
One possible reason can be that you are loading http resources over https (so if you have hardcoded images, script files etc. that load via http while your site runs on https, the developer tools will bark at you about 'mixed content' ;) ). A quick way to spot those is sketched just below.
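If you want to hunt those down quickly, a snippet like this in the browser console will list the usual suspects (the selectors only cover the common cases):

```js
// List elements that still reference hard-coded http:// resources on an https page.
document
  .querySelectorAll('img[src^="http:"], script[src^="http:"], link[href^="http:"], iframe[src^="http:"]')
  .forEach(function (el) {
    console.log(el.tagName, el.src || el.href);
  });
```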
Depending on your Joomla setup there might also be some modules/components/plugins that are not well designed... deactivating them one at a time and refreshing (yep, it might be a lot of work) will give you a hint. BUT: please be careful, since some plugins, like the authentication plugins, are needed, and if you deactivate them you might "lock" yourself out. Normally core components and plugins shouldn't be responsible for this at all.
If you have a look at the queries in the debug console, some of them may perform a full table scan. Maybe you'll find one among these that is performance-hungry.
If not, please check your global configuration in the Joomla backend under System -> Global Configuration and look at the following things:
Is caching enabled?
What kind of caching do you use?
Under the Server tab, check for gzip compression of your page
If you force https, do you have any http resources on your site (as mentioned above)?
Some of the possibilities here might help you gain some performance, but if you still have performance problems, my next look would be at the server config.
There are still other things you might want to check:
Which PHP version are you running? PHP 7.x brings a remarkable performance boost
Have a look at your php.ini file. What about your memory limit and other options? (See the recommended technical requirements here: https://downloads.joomla.org/technical-requirements)
Back on your site, are there any JavaScript errors? (The developer tools of your browser will tell you.)
Well, these are several possibilities you might pay some attention to... Performance issues can have many causes, but hopefully some of the things above will lead you onto the right track ;)
regards
You can activate debug in Joomla from the system configuration; then, at the end of the pages on the live site, you can see all the queries performed, with the time and memory used, and much more that can help you understand what is slowing down your site.
I assume that if you put some JavaScript code in an external source (and use src="") it's a little slower because the page then has to download another file, but I'm wondering whether that's inconsequential.
From testing I've done online (with webpagetest.org), the difference seems quite small (< 5% of the total page load time).
But I'm just wondering what's happening "under the hood", and whether the browser (I assume) spinning up another request to download that bit separately, rather than it coming across from the server with the rest of the page, is actually just as fast (because it's happening in parallel).
Not slow enough to matter.
I think the speed difference question is a red herring. Generally, you should keep your script separate from your html:
Separation of concerns: the html is the structure of the site, whereas the script is its behavior. Mingling them together mixes concerns, and it's best practice to keep your script in a separate file (see the sketch after this list).
It might seem counter-intuitive that script served separately from html could be just as fast or even faster, but things like caching proxy servers, content delivery networks, and even new web protocols like SPDY can make the speed question completely moot.
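To make the separation-of-concerns point concrete, here is a tiny sketch of the external-file style (the element id and file split are hypothetical): the markup stays free of inline onclick attributes, and the behavior lives in a script file the browser can cache and reuse:

```js
// site.js - behavior kept separate from the markup
document.addEventListener('DOMContentLoaded', function () {
  var button = document.getElementById('save-button'); // hypothetical element in the HTML
  if (button) {
    button.addEventListener('click', function () {
      // ...behavior lives here instead of in an onclick="" attribute...
    });
  }
});
```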
If you test with Firebug in Firefox you'll see that Firefox downloads multiple files at the same time (the number of concurrent downloads differs per browser). But the main reason you should put JS code in external files is that it can be minified and compressed on the server side, and also cached by browsers. Loading it from an external file also has the benefit of letting you serve it from a static (cookieless) domain and use a CDN to speed up delivery. So, to answer your question: it will be slower to put it in the page, as the browser will need to download it every time it loads the page.
Do web hosts usually cache your content? You know, like caching proxies.
I was just wondering: is it good practice to always control the cache properly? I have seen a lot of sites without cache control in the head section.
No, web hosts will generally not cache anything.
First, because it's a lot of work and uses their processing power.
Second, because it would reduce bandwidth use, and they would prefer you to go over your bandwidth allowance so they can charge you more. (Even most unlimited bandwidth packages have a limit; read the terms.)
Third, if they controlled the caching process and you didn't want your website cached, it would be a lot of work for them to turn it off for certain websites.
So yes, it would be a good practice to control your cache properly.
Most hosts do not cache anything. Though it would be really wise to set proper cache-control headers anyway, because there may still be proxies between your server and browsers.
Regarding the second part of your question, you really need cache headers if you:
don't want your pages to be cached at all
want to make sure your pages (and/or images, js, css, ...) will be cached
If you don't specify caching directives in your response headers, proxies and browsers can -- and will -- decide on their own whether or not to cache.
It is good practice to avoid being at the mercy of some unknown piece of code ;-)
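A quick way to see what you are currently sending (and therefore what you are leaving up to proxies and browsers to decide) is to inspect your response headers; here is a small sketch using the built-in fetch of Node 18+, with example.com as a placeholder URL:

```js
// Sketch only: print the caching-related headers a URL responds with.
const url = process.argv[2] || 'https://example.com/';

fetch(url, { method: 'HEAD' }).then((res) => {
  console.log('Cache-Control:', res.headers.get('cache-control') || '(none - caches decide on their own)');
  console.log('Expires:      ', res.headers.get('expires') || '(none)');
  console.log('ETag:         ', res.headers.get('etag') || '(none)');
});
```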
I am thinking that, to save server load, I could load common JavaScript files (the jQuery source) and maybe certain images from sites like Google (which are almost never down, and always pretty fast, maybe faster than my server).
Will it save much load?
Thanks!
UPDATE: I am not so much worried about saving bandwidth as I am about reducing server load, because my server struggles when there are a lot of users online, and I think this is because too many images/files are being loaded from my single server.
You might consider putting up another server that does nothing but serve your static files, using an ultra-efficient web server such as lighttpd.
This is known as a content delivery network, and it will help, although you should probably make sure you need one before you go about setting it all up. I have heard okay things about Amazon S3 for this (which Twitter, among other sites, use to host their images and such). Also, you should consider Google's API cloud if you are using any popular javascript libraries.
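Assuming "Google's API cloud" here refers to Google's hosted JavaScript libraries, a common pattern is to load jQuery from Google and fall back to your own copy if that fails. Usually this is done with plain script tags; below is the dynamic-injection variant of the same idea (the version number and the local path are assumptions):

```js
// Sketch only: prefer Google's hosted jQuery, fall back to a local copy on failure.
(function () {
  var cdn = document.createElement('script');
  cdn.src = 'https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js';
  cdn.onerror = function () {
    var local = document.createElement('script');
    local.src = '/js/jquery.min.js'; // hypothetical local copy
    document.head.appendChild(local);
  };
  document.head.appendChild(cdn);
})();
```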
Well, there are a couple of things to consider in principle:
Serving up static resources (.htm files, image files, etc.) rarely even makes a server breathe hard, except under the most demanding circumstances (thousands of requests in a very short period of time)
Google's network is most likely faster than yours, and most everyone else's. ;)
So if you are truly not experiencing any bandwidth problems, I don't think offloading your images, etc. will do much for you. However, as you move stuff off to Google, it frees your server's bandwidth up for more concurrent requests and faster transfers on the existing ones. The only tradeoff is that clients will experience a slight (most likely unnoticeable) initial delay while DNS looks up the other servers and initiates the connection to them.
It really depends on what your server load is like now. Are there lots of small web pages and lots of users? If so, then the 50K taken up by jQuery could mean a lot. If all of your pages are fairly large, and/or you have a small user base, caching jQuery with Google might not help much. Same with the pictures. That said, I have heard anecdotal reports (here on SO) that loading your scripts from Google does indeed provide a noticeable performance improvement. I have also heard that Google does not necessarily have 100% uptime (though it is close), and when it is down it is damned inconvenient.
If you're suffering from speed problems, putting your scripts at the bottom of the web page can help a lot.
I'm assuming you want to save costs by offloading commonly used resources to the web at large.
What you're suggesting is called hotlinking: directly linking to other people's content. While it can work in most cases, you do lose control of the content, which means your website may change without your input. Since images hosted on Google are scoured from other websites, they may be copyrighted, causing some (potential) concern, or the source sites may have anti-hotlinking measures that block the images from your webpage.
If you're just working on a hobby website, you can consider hosting your resources on a free web account to save bandwidth.