I have a website that runs just fine on my local server: it's quick and responsive.
But when I put it on my domain host's server, it sometimes takes excessively long to load assets. For example, one 1 MB PNG file took 2.31 seconds to load:
Chrome's Network Developer Tool reveals to me the following:
So is this likely due to poor implementation of my code, or is it possibly a crappy server? (The company is subscribed to the lowest tier possible to host their content.) My internet connection is quick, so I doubt it's that.
I think it is probably a problem with your host. An image is an image :) there aren't a hundred ways to implement one!
Oversized images always take longer to load, so keep your images as small as possible. To reduce the content download time, you can optimize/compress the image without degrading its visual quality. If you are using graphics software to optimize the images, use the "Save for Web" option; this reduces the file size and hence the image load time.
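If you'd rather do this in a build step than in a graphics tool, something along these lines works (a minimal sketch, assuming the sharp npm package is installed; file names are placeholders):

```js
// Minimal sketch: cap the pixel dimensions and recompress an oversized PNG.
// "background.png" and the output name are placeholders.
const sharp = require('sharp');

sharp('background.png')
  .resize({ width: 1600, withoutEnlargement: true }) // don't upscale smaller images
  .png({ compressionLevel: 9, palette: true })       // heavier PNG compression
  .toFile('background.min.png')
  .then(info => console.log(`wrote ${info.size} bytes`))
  .catch(console.error);
```

Converting to WebP (`.webp({ quality: 80 })`) usually shrinks it further still, at the cost of changing the format.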
Furthermore, you can use a CDN to serve your website's static assets: images, CSS, JS, videos, etc. A CDN replicates your website's files across a geographically distributed network of servers called POPs (points of presence) and serves each resource from the location nearest the visitor, which means your assets load faster.
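In practice this often just means pointing asset URLs at the CDN hostname instead of your origin server. A trivial sketch (the CDN domain here is an assumption; use whatever hostname your provider gives you):

```js
// Hypothetical helper: build asset URLs against a CDN hostname rather than
// the origin server. "cdn.example.com" is a placeholder.
const CDN_BASE = 'https://cdn.example.com';

function assetUrl(path) {
  return CDN_BASE + '/' + path.replace(/^\/+/, '');
}

// Usage: images created in JS are requested from the CDN's nearest POP.
const img = document.createElement('img');
img.src = assetUrl('/images/background.png');
document.body.appendChild(img);
```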
Finally, use an SSD-based host. Solid-state drives have far better read/write rates than traditional hard disk drives (and dramatically better random-access performance), so static files are served off disk noticeably faster.
Here's a question for you: are you loading the image as a CSS background image, or via an image tag?
I recently enabled Brotli compression on one of the CDN platforms that we use. I was expecting performance to improve, since resource sizes are reduced by 15-30%, but to my surprise performance is still the same.
I checked various metrics and everything still looks the same, except TTFB, where I see an increase of 10-15 milliseconds per resource.
Has anyone seen this before, and if so, what are the best ways to go about this issue? I also suspect that Chrome might take longer to decompress Brotli resources than gzip ones, but unfortunately I don't have a way to measure that time.
There is not enough detail here to answer the question. Performance is relative, so the gains from Brotli may be getting drowned out by bigger performance issues on your site.
Some questions for you to answer for yourself:
Is Brotli set up correctly and working? Can you see br as the content-encoding in the developer tools Network tab? Note you may need to add the Content-Encoding column. (There is also a quick way to check this outside the browser; see the sketch after this list.)
Are you using HTTPS on your site (required by all browsers to use Brotli)? Did you move to HTTPS as part of this change? Is your HTTPS optimised?
Has the overall size of your site gone down since you enabled Brotli? If so, by how much? If you have lots of 10 MB print-quality images on your site and your HTML has gone from 50 kB to 45 kB, then you may not see much of an overall difference.
How long does your page take to generate? If your page takes 30 seconds to generate because the HTML is dynamic and your backend (app server, data server, whatever) is slow, then going to 29.5 seconds won’t seem that much.
Do you have lots of render-blocking CSS and JavaScript? These are text, so they should be delivered faster now, but if they are massively complex and client-side processing time is large, then download time may be an insignificant part of the total.
Are you testing from your company office while sitting 50 metres from your data centre, with a high-speed 1000 Mbps Ethernet connection that is basically talking straight into the web server? If so, download speeds are going to be negligible no matter what the size of the downloads.
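On the first point, a quick way to confirm what the server is actually sending, outside of DevTools, is to request the page with Brotli in the Accept-Encoding header and print the response headers. A small sketch using only Node's built-in https module (the URL is a placeholder):

```js
// Ask for Brotli explicitly and print what the server actually returns.
// "www.example.com" is a placeholder for your own site.
const https = require('https');

https.get(
  'https://www.example.com/',
  { headers: { 'Accept-Encoding': 'br, gzip' } },
  res => {
    console.log('status          :', res.statusCode);
    console.log('content-encoding:', res.headers['content-encoding']);
    res.resume(); // discard the body; only the headers matter here
  }
);
```

If that prints gzip (or nothing) instead of br, the CDN isn't actually serving Brotli and the rest of the questions are moot.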
Brotli should compress text smaller than gzip does. The compression can take longer and use more processing power than gzip, but the network gains versus the CPU cost are usually worth it.
It is not magic, however, and cannot make up for other performance issues on a site.
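If you want a feel for the size difference on your own assets, Node's built-in zlib module supports both codecs, so a rough comparison takes a few lines (the file name is a placeholder):

```js
// Compare gzip and Brotli output sizes for one text asset.
// "main.js" is a placeholder for any JS/CSS/HTML file of yours.
const fs = require('fs');
const zlib = require('zlib');

const input = fs.readFileSync('main.js');

const gzipped = zlib.gzipSync(input, { level: 9 });
const brotlied = zlib.brotliCompressSync(input, {
  params: { [zlib.constants.BROTLI_PARAM_QUALITY]: 11 },
});

console.log('original  :', input.length, 'bytes');
console.log('gzip -9   :', gzipped.length, 'bytes');
console.log('brotli q11:', brotlied.length, 'bytes');
```

If the Brotli number is only a few percent smaller than the gzip one, a 10-15 ms TTFB increase can easily cancel out the saving on a fast connection.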
I have been trying to host my website using Firebase. I have a folder containing an index.html page and another HTML page, which I have connected to the index through a button link. This other HTML page contains a large image. When I upload these files, two things happen really slowly:
Firebase takes a long time to deploy the site (around 9-10 minutes).
The image inside the other HTML page takes ages to load on my desktop, but is fast on my mobile device.
What should I do to speed things up? For reference I have some images; if you need any more, please do mention it.
P.S. The other HTML file is index1.html.
The time it takes to deploy your site to Firebase Hosting is dependent on:
the size of the site assets you're trying to deploy
the bandwidth that you have available
And to a lesser degree:
the time it takes your local machine to compress the site assets
In my extended usage, deploys typically take seconds, not minutes. But my sites are usually quite small and I tend to deploy when I'm on a connection with good bandwidth.
To make deploys faster, you'll either have to reduce the total size of what you're trying to deploy, or deploy when you're on a faster connection.
I have a single-page web app that currently consists of four files:
index.html
main.js
style.css
sprites.png
This means that every user who loads the site has to request index.html, parse it for the other three files, and then make three more HTTP requests (serially, I believe) to fetch the remaining files.
It seems to me that it might be (a tiny bit) faster to embed the JavaScript, CSS and sprite image (base64-encoded) directly in the index.html file.
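Concretely, I'm imagining a build step along these lines (a rough sketch; the exact tag strings it replaces are assumptions about my own markup):

```js
// Rough sketch of the inlining build step: read the three assets and write a
// single self-contained HTML file. The output name is arbitrary.
const fs = require('fs');

const js  = fs.readFileSync('main.js', 'utf8');
const css = fs.readFileSync('style.css', 'utf8');
const png = fs.readFileSync('sprites.png').toString('base64');

let html = fs.readFileSync('index.html', 'utf8');

html = html
  .replace('<link rel="stylesheet" href="style.css">', '<style>' + css + '</style>')
  .replace('<script src="main.js"></script>', '<script>' + js + '</script>')
  .replace(/sprites\.png/g, 'data:image/png;base64,' + png);

fs.writeFileSync('index.inlined.html', html);
```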
The main reasons I can think not to do this, along with my reasons why I don't think they apply in this case, are as follows:
Would prevent any of these additional files from being cached separately. This is not an issue for me because they will never be loaded from another page (since there is only one HTML page).
If the files were on different CDN servers, they could be downloaded in parallel. (Currently this project is not large enough to merit multiple servers)
I should disclose that this site is a small pet project that is nowhere near large enough to merit this kind of meticulous performance tuning, but I like using pet projects as an avenue to explore problems (and their solutions) that I may face in my day job.
This isn't usually done because you increase the size of the entire HTML page. You'll save a couple requests on the first visit, but you'll force the client to reload everything every time they fetch the HTML file.
It would improve performance for users who visit your site once, and only once. For any kind of long-term strategy, it's unsuitable.
When your page is reloaded, the JS, images, and CSS are already cached on the client and don't need to be downloaded again. Also, the base64 approach means your clients need JavaScript enabled to see your page. Lastly, it may well take a weak client longer to decode your base64 than to download the files.
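For what it's worth, getting that caching behaviour is mostly a matter of sending the right response headers. A sketch, assuming an Express server (the directory name and lifetimes are placeholders):

```js
// Serve the static assets with long cache lifetimes so repeat visits hit the
// browser cache instead of the network. Values here are illustrative only.
const express = require('express');
const app = express();

app.use(express.static('public', {
  maxAge: '30d',     // js/css/images can be cached for a month
  immutable: true,   // safe if file names change whenever contents change
}));

app.listen(3000);
```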
So in short, don't overthink these things.
I've noticed that on mobile Safari, when I deliver my assets via CloudFront they load noticeably slower than when they are served directly from my EC2 instance.
Specifically, my site has a main background image that appears noticeably later than the text delivered by EC2. The loading of this background image doesn't noticeably lag behind the text in Chrome on my laptop, presumably because of the greater performance of Chrome compared to mobile Safari.
I'm at a loss about what to do about this, since the whole point of CloudFront is to serve assets fast and take load off my EC2 instance, but the delay before this background image appears makes for quite an ugly, i.e., unacceptably bad, UX.
NOTE: Please don't reflexively vote to migrate this question to another SE site as the whole point is that it is not clear what approach would be best.
We've done some comparative tests, and it seems that the advantage of using CloudFront depends on the size of the requested file.
For a small file (2 kB), the CloudFront response time is greater than requesting directly from EC2.
For a 15 kB file, the response time is almost the same.
For a 57 kB file (jquery-1.3.2.min.js), CloudFront was 4x to 5x faster than EC2.
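A comparison like this is easy to reproduce with a small timing script (a sketch; both URLs are placeholders, and Node 18+ is assumed for the built-in fetch):

```js
// Time the same file from the CDN and from the origin. The URLs below are
// placeholders; substitute your CloudFront distribution and EC2 hostname.
const { performance } = require('perf_hooks');

async function timeFetch(url) {
  const start = performance.now();
  const res = await fetch(url);
  await res.arrayBuffer(); // ensure the whole body is downloaded
  return performance.now() - start;
}

(async () => {
  const cdn    = await timeFetch('https://dxxxxxxxx.cloudfront.net/js/jquery.min.js');
  const origin = await timeFetch('https://ec2-origin.example.com/js/jquery.min.js');
  console.log(`CloudFront: ${cdn.toFixed(0)} ms, EC2 origin: ${origin.toFixed(0)} ms`);
})();
```

Run it a few times: the first CloudFront request may be a cache miss that gets forwarded to the origin anyway, so only the later, cached responses show the real difference.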
I am thinking that, to save server load, I could load common JavaScript files (the jQuery source) and maybe certain images from sites like Google (which are almost never down, and always pretty fast, maybe faster than my server).
Will it save much load?
Thanks!
UPDATE: I am not so much worried about saving bandwidth as I am about reducing server load, because my server struggles when there are a lot of users online, and I think this is because too many images/files are being loaded from my single server.
You might consider putting up another server that does nothing but serve your static files, using an ultra-efficient web server such as lighttpd.
This is known as a content delivery network, and it will help, although you should probably make sure you need one before you go about setting it all up. I have heard okay things about Amazon S3 for this (which Twitter, among other sites, uses to host their images and such). Also, you should consider Google's hosted libraries (the AJAX Libraries API) if you are using any popular JavaScript libraries.
Well, there are a couple of things in principle:
Serving up static resources (.htm files, image files, etc.) rarely makes a server breathe hard, except under the most demanding circumstances (thousands of requests in a very short period of time).
Google's network is most likely faster than yours, and most everyone else's. ;)
So if you are truly not experiencing any bandwidth problems, I don't think offloading your images, etc. will do much for you. However, as you move stuff off to Google, it frees your server's bandwidth up for more concurrent requests and faster transfers on the existing ones. The only tradeoff here is that clients will experience a slight (most likely unnoticeable) initial delay while DNS looks up the other servers and the connection to them is initiated.
It really depends on what your server load is like now. Are there lots of small web pages and lots of users? If so, then the 50 kB taken up by jQuery could mean a lot. If all of your pages are fairly large, and/or you have a small user base, caching jQuery with Google might not help much. Same with the pictures. That said, I have heard anecdotal reports (here on SO) that loading your scripts from Google does indeed provide a noticeable performance improvement. I have also heard that Google does not necessarily have 100% uptime (though it is close), and when it is down it is damned inconvenient.
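If the downtime worry is what's holding you back, the usual pattern is to try the Google-hosted copy first and fall back to a local copy if it fails to load. A sketch (the jQuery version and the local path are assumptions):

```js
// Load jQuery from Google's hosted libraries, falling back to a locally
// hosted copy if the CDN request fails. Version and paths are placeholders.
(function () {
  var s = document.createElement('script');
  s.src = 'https://ajax.googleapis.com/ajax/libs/jquery/3.7.1/jquery.min.js';
  s.onerror = function () {
    var local = document.createElement('script');
    local.src = '/js/jquery.min.js'; // assumed local fallback path
    document.head.appendChild(local);
  };
  document.head.appendChild(s);
})();
```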
If you're suffering from speed problems, putting your scripts at the bottom of the web page can help a lot.
I'm assuming you want to save costs by offloading commonly used resources to the web at large.
What you're suggesting is called hotlinking: directly linking to other people's content. While it can work in most cases, you do lose control of the content, which means your website may change without your input. Since images hosted on Google are scoured from other websites, they may be copyrighted, causing some (potential) concern, and the source sites may have anti-hotlinking measures that block the images from appearing on your webpage.
If you're just working on a hobby website, you can consider hosting your resources on a free web account to save bandwidth.