Reduce server load by loading image files / JavaScript files from another server? - html

I am thinking that to save server load, I could load common JavaScript files (the jQuery src) and maybe certain images from websites like Google (which are almost never down, and always pretty fast, maybe faster than my server).
Will it save much load?
Thanks!
UPDATE: I am not so much worried about saving bandwidth as I am about reducing server load, because my server struggles when there are a lot of users online, and I think this is because too many images/files are being served from my single server.

You might consider putting up another server that does nothing but serve your static files, using an ultra-efficient web server such as lighttpd.
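As a rough sketch of the same idea (using Node/Express rather than lighttpd, purely for illustration; the document root and port are made up), a dedicated static-file process can be very small:

    // Tiny process that only serves static files, so image/JS requests never hit the main app server.
    // Not lighttpd; just an illustrative Node/Express equivalent with made-up paths.
    const express = require('express');

    const app = express();
    app.use(express.static('/var/www/static', { maxAge: '30d' })); // long cache lifetime for images/JS/CSS
    app.listen(8080, () => console.log('static file server listening on 8080'));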

This is known as a content delivery network, and it will help, although you should probably make sure you need one before you go about setting it all up. I have heard okay things about Amazon S3 for this (which Twitter, among other sites, uses to host its images and such). Also, you should consider Google's hosted libraries if you are using any popular JavaScript libraries.
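For the jQuery case specifically, the usual pattern is a script tag pointing at Google's hosted copy, with a local fallback in case the CDN is unreachable (the version number and fallback path below are just examples):

    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.4/jquery.min.js"></script>
    <script>
      // If the CDN copy failed to load, fall back to a copy on your own server.
      window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
    </script>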

Well, there are a couple of things in principle:
Serving up static resources (.htm files, image files, etc.) rarely even makes a server breathe hard, except under the most demanding of circumstances (thousands of requests in a very short period of time).
Google's network is most likely faster than yours, and most everyone else's. ;)
So if you are truly not experiencing any bandwidth problems, I don't think offloading your images, etc. will do much for you. However, as you move stuff off to Google, it frees up your server's bandwidth for more concurrent requests and faster transfer on the existing ones. The only trade-off here is that clients will experience a slight (most likely unnoticeable) initial delay while DNS looks up the other servers and the connection to them is initiated.
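If that initial lookup/connection delay is a concern, resource hints ask the browser to resolve and connect to the other host before the first request actually needs it (the host below is just an example):

    <!-- Ask the browser to resolve and connect to the CDN host early (example host). -->
    <link rel="dns-prefetch" href="//ajax.googleapis.com">
    <link rel="preconnect" href="https://ajax.googleapis.com">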

It really depends on what your server load is like now. Are there lots of small web pages and lots of users? If so, then the 50K taken up by jQuery could mean a lot. If all of your pages are fairly large, and/or you have a small user base, caching jQuery with Google might not help much. Same with the pictures. That said, I have heard anecdotal reports (here on SO) that loading your scripts from Google does indeed provide a noticeable performance improvement. I have also heard that Google does not necessarily have 100% uptime (though it is close), and when it is down it is damned inconvenient.
If you're suffering from speed problems, putting your scripts at the bottom of the web page can help a lot.
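For example, a structure like this (file names are placeholders) keeps the scripts from blocking the initial render:

    <body>
      <!-- Visible content comes first, so it can render before the scripts download. -->
      <div id="content">...</div>

      <!-- Scripts last (or use the defer attribute) so they don't block rendering. -->
      <script src="/js/jquery.min.js"></script>
      <script src="/js/site.js"></script>
    </body>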

I'm assuming you want to save costs by offloading commonly used resources to the web at large.
What you're suggesting is called hotlinking: directly linking to other people's content. While it can work in most cases, you do lose control of the content, which means your website may change without your input. Since images hosted on Google are scoured from other websites, the images may be copyrighted, causing some (potential) concern, or the sites may have anti-hotlinking measures that block the images from appearing on your webpage.
If you're just working on a hobby website, you can consider hosting your resources on a free web account to save bandwidth.

Related

Brotli for static resources

I recently enabled Brotli compression on one of the CDN platforms that we use. I was expecting performance to improve, since resource sizes get reduced by 15-30%, but to my surprise the performance is still the same.
I checked various metrics and everything looks the same except TTFB, where I see an increase of 10-15 milliseconds per resource.
Has anyone seen this before, and if so, what are the best ways to go about this issue? I also suspect that Chrome might be taking longer to decompress the resources when they are Brotli rather than gzip, but unfortunately I do not have any way to measure that time.
There is not enough detail here to answer the question. Performance is relative, so the gains due to Brotli may be getting drowned out by bigger performance issues with your site.
Some questions for you to answer for yourself:
Is Brotli set up correctly and working? Can you see br as the content-encoding in the developer tools network tab? Note you may need to add the content-encoding column.
Are you using HTTPS on your site (required by all browsers to use Brotli)? Did you move to HTTPS as part of this change? Is your HTTPS optimised?
Has the overall size of your site gone down since you enabled Brotli? If so, by how much? If you have lots of 10 MB print-quality images on your site and your HTML has gone from 50 KB to 45 KB, you may not see much of an overall difference. (One rough way to check transferred versus decoded sizes is sketched after this list.)
How long does your page take to generate? If your page takes 30 seconds to generate because the HTML is dynamic and your backend (app server, data server, whatever) is slow, then going to 29.5 seconds won't feel like much of an improvement.
Do you have lots of render-blocking CSS and JavaScript? These are text, so they should now be delivered faster, but if they are massively complex and client-side processing time is large, then download time may be an insignificant part of the total.
Are you testing from your company office while sitting 50 metres from your data centre, on a high-speed 1,000 Mbps Ethernet connection that is basically talking straight into the web server? If so, download times are going to be negligible no matter what the size of the downloads.
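One rough way to sanity-check the compression questions above is the Resource Timing API in the browser console: transferSize is what actually came over the wire (after Brotli/gzip), and decodedBodySize is the size after decompression. This is just a quick console snippet, not a proper measurement setup:

    // Paste into the browser console on the page you are testing.
    // transferSize = bytes over the wire (compressed); decodedBodySize = uncompressed size.
    performance.getEntriesByType('resource').forEach(function (r) {
      if (r.transferSize > 0) { // 0 usually means it was served from the cache
        console.log(r.name + ': ' + r.transferSize + ' B transferred, ' +
                    r.decodedBodySize + ' B decoded');
      }
    });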
Brotli should compress text smaller. It can take longer/more processing power to do that compression than gzip but the network gains versus the CPU costs are usually worth it.
It is not magic, however, and it cannot make up for other performance issues on a site.

How much slower is it to put .js in an external file rather than directly on the page?

I assume that if you put some JavaScript code in an external file (and reference it with src="") it's a little slower, because the page then has to download another resource, but I'm wondering whether that's inconsequential.
From testing I've done online (with webpagetest.org), the difference seems quite small (< 5% of the total page load time).
But I'm just wondering what's happening "under the hood", and whether the browser (I assume) firing off another request to download that bit separately, rather than it coming across from the server with the rest of the page, is actually just as fast (because it's happening in parallel).
Not slow enough to matter.
I think the speed difference question is a red herring. Generally, you should keep your script separate from your HTML:
Separation of concerns: the html is the structure of the site, whereas the script is its behavior. It mixes concerns to mingle them together, and it's best practice to keep your script in a separate file.
It might seem counter-intuitive that script served separately from the HTML could be just as fast or even faster, but things like caching proxy servers, content delivery networks, and even newer web protocols like SPDY can make the speed question completely moot.
If you test with Firebug in Firefox, you'll see that Firefox downloads multiple files at the same time (the number of concurrent downloads is different for each browser). But the main reason you should put JS code in external files is that it can be minified and compressed on the server side, and also cached by browsers. Loading it from an external file also has the benefit that you can serve it from a static, cookieless domain and use a CDN to speed up delivery. So, to answer your question: it will be slower to put it in the page, because the browser will need to download it every time it loads the page.
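To make the trade-off concrete, here is the difference in plain HTML (the file name is illustrative):

    <!-- Inline: this code is re-sent as part of the HTML on every single page view. -->
    <script>
      /* ...application code... */
    </script>

    <!-- External: downloaded once, then served from the browser cache (or a CDN) on later
         page views, and it can be minified and compressed independently of the HTML. -->
    <script src="/static/app.min.js"></script>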

Create many images on the server or do resizing in the browser?

At the moment I'm designing a new gallery, and I have quite a lot of different image sizes for different ways of presenting things in the frontend. Now I'm wondering if I should create every size on the server, or just create a few and then use browser resizing and HTML/CSS "cropping".
So it's many images on the server (a lot of disk space and a lot of requests from the client) versus fewer requests, less space, and resizing/cropping in the browser.
I tend to think the second solution is more advanced and modern, but is it also more complicated?
Thanks for any advice!
It's really a trade-off as you've said yourself. Performance is increasingly important though, and even affects page ranking, which is a good reason to keep various sizes on the server, i.e. only send thumbnails when that's all the user needs to see.
If you wanted to go to the programming effort, you could use a library like ImageMagick to generate smaller images from large ones. This way, you only need to store the large images on the server. The downside being programming effort and extra server load; since space is cheap, probably not worth it unless you need to serve many different variants.
You could create a handler to serve up your images. When a request is made for an image in a size you don't yet have stored in the file system on your server, you could do the resize, save the file and send it to the client.
That way you only have the files that are needed on the server.
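A minimal sketch of that resize-on-first-request idea, assuming a Node/Express server and the sharp image library (the route, directories, and width whitelist are all made up, and there is no error handling or input sanitising here):

    // Sketch only: generate a missing size on first request, cache it on disk, then serve it.
    const express = require('express');
    const fs = require('fs');
    const path = require('path');
    const sharp = require('sharp');

    const app = express();
    const ORIGINALS = path.resolve('originals'); // full-size source images
    const CACHE = path.resolve('cache');         // generated sizes live here

    app.get('/img/:width/:name', async (req, res) => {
      const width = parseInt(req.params.width, 10);
      if (![160, 320, 640, 1280].includes(width)) return res.sendStatus(400); // whitelist sizes

      const cached = path.join(CACHE, width + '-' + req.params.name);
      if (!fs.existsSync(cached)) {
        // First request for this size: create it once, then reuse the file from then on.
        await sharp(path.join(ORIGINALS, req.params.name)).resize({ width }).toFile(cached);
      }
      res.sendFile(cached);
    });

    app.listen(3000);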

Do web hosting companies cache pages?

Do web hosts usually cache your content? You know, like caching proxies do.
I was just wondering: is it good practice to always control the cache properly? I have seen a lot of sites without cache control in the head section.
No, web hosts will generally not cache anything.
First, because it's a lot of work and uses their processing power.
Second, because it could reduce bandwidth use, and they would prefer you go over your bandwidth allowance so they can charge you more. (Even most unlimited-bandwidth packages have a limit; read the terms.)
Third, if they controlled the caching process and you didn't want your website cached, it would be a lot of work for them to turn it off for certain websites.
So yes, it would be good practice to control your cache properly.
Most hosts do not cache anything. It would still be wise to set proper cache-control headers anyway, because there may be proxies between your server and the browsers.
Regarding the second part of your question, you really need cache headers if you:
don't want your pages to be cached at all
want to make sure your pages (and/or images, JS, CSS, ...) will be cached
If you don't specify caching directives in your response headers, proxies and browsers can (and will) decide whether or not to cache on their own.
It is good practice to try to avoid being at the mercy of some unknown piece of code ;-)
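As a rough example of what "proper" headers can look like (shown Express-style here, but the same Cache-Control values can be set from any server or framework; the routes and max-age values are arbitrary):

    // Express-style example; equivalent Cache-Control headers can be set from any server.
    const express = require('express');
    const app = express();

    // Dynamic pages: tell browsers and intermediate proxies not to store them.
    app.get('/account', (req, res) => {
      res.set('Cache-Control', 'no-store');
      res.send('<html>...</html>'); // placeholder response body
    });

    // Static assets: cache aggressively (safe if file names change when their content changes).
    app.use('/static', express.static('public', { maxAge: '365d', immutable: true }));

    app.listen(3000);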

Troubleshooting slow web page load times, how to do it/get better?

I'm the primary developer at a web firm, but often end up doing some sysadmin stuff, and was wondering what resources are available for learning how to troubleshoot slow page load times.
My experience with sysadmin tools is almost none. I'm relatively proficient at the Linux/Unix command line, but I have never used any kind of packet-capture software and only know the basics of using dig for IP lookups. My experience with Apache and MySQL is mostly limited to the initial setup and configuration, and then using them.
Are there any good books or websites that cover the topics needed for accurately diagnosing website performance/bottlenecks, and if so, what are they? Or is the gamut of technologies too large, and experience/time with the technologies typically how people get good at this stuff?
Overall, there's no substitute for experience. A concept as broad as "slow web page load time" could be hitting a bottleneck in any number of different places:
Client is slow to resolve IP from domain.
Network between client and domain is slow or congested.
Server is slow to respond to request.
Requested page is large.
Requested resources embedded in page are large.
Page contains server-side code that requires significant processing.
Database is slow to respond.
Page manipulates a lot of data before responding.
Rendered page on client contains a lot of code and runs slowly.
etc.
For any given page, it's a matter of knowing where the bottlenecks could be and determining which one it is, in order to address it. Having a full and complete understanding of everything that goes on from end to end in "loading a page" is essential. Identifying patterns of slow load times across multiple disparate requests will help narrow down the potential bottlenecks.
It's very much a case-by-case scenario.
Have you diagnosed exactly where the problems are? Slow load times could be caused by anything, such as:
Too much going on in the DOM/JS:
a) not caching JS or other resources on the client
b) not minifying/compressing resources
c) making too many requests with Ajax / doing silly things in the browser like redrawing DOM that doesn't need it
Too much going on on the server:
a) no caching of DB tables, no indexes
b) not handling long-running tasks asynchronously
c) improperly configured proxies/Apache servers
Network issues -- wish I knew more about this.
Step 1 is always to figure out where the worst slowdown is. Gather some metrics on the server to make sure it is doing easy stuff fast, and hard stuff reasonably fast. Look in the browser to see how long loading resources is taking. Look in the Chrome/Firebug profilers to see how much time the JavaScript takes to run.
You will probably find a bunch of things that could be improved. Prioritize and address the issues...
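For the browser side of step 1, the Navigation Timing API gives a quick breakdown of where the time went for the main document; a rough console snippet like this (run after the page has finished loading) is a reasonable starting point:

    // Rough breakdown of the main document's load; paste into the browser console.
    const [nav] = performance.getEntriesByType('navigation');
    console.log('DNS lookup:         ', (nav.domainLookupEnd - nav.domainLookupStart).toFixed(1), 'ms');
    console.log('TCP connect:        ', (nav.connectEnd - nav.connectStart).toFixed(1), 'ms');
    console.log('Time to first byte: ', (nav.responseStart - nav.requestStart).toFixed(1), 'ms');
    console.log('Download:           ', (nav.responseEnd - nav.responseStart).toFixed(1), 'ms');
    console.log('DOM processing:     ', (nav.domContentLoadedEventEnd - nav.responseEnd).toFixed(1), 'ms');
    console.log('Full load:          ', (nav.loadEventEnd - nav.startTime).toFixed(1), 'ms');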