first background image on a website loads slowly - html

The first background image I have on my website loads slowly.
It's a background image, but it has to fit a large screen, so it's a 900 KB PNG.
What tricks can I use to make it load faster? (It's the first thing you see, so even the 3-4 seconds it takes is felt...)

There are lots of tricks. The lads above are right in their comments: there are things you can do to compress it, and ways to analyse what is taking the time.
Load times, and improving them, come down to three components:
- Network Latency (the time it takes to pass data from the browser to your host)
- Image Size (physical size of the image that goes down the pipe)
- Connection Cost (Time taken to establish the connection with the host)
Now, if we take your scenario, you have an image that is large, but not huge. So how can we make it faster? Well, we can cut the time it takes to download by:
a) compressing it - there are many options here; along with the Google one mentioned above, here is a simple one: https://tinypng.com/
b) cutting repeat-visit latency by ensuring the host sends a Cache-Control header with a max-age in it. Within that window the browser serves the image straight from its cache, and on revalidation the host returns a 304 instead of the full file (see the header sketch after this list).
c) using a CDN such as Cloudflare to do the hard stuff for you. A CDN will handle the compression, set the cache headers, reduce your network latency, and ensure your connections are established quickly. Some, like Cloudflare, have free starting tiers, so it's worth a look.
d) lazy loading the image with JavaScript after the page has rendered. It's still good to do all of the above, but if you write a little JavaScript that fires on document ready and sets the background (via CSS, an img tag, or whatever mechanism you are currently using), the image download starts at that point, letting your users see and interact with the page before your background has finished loading (see the script after this list).
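To sketch (b): on a Node/Express host (an assumption here - any web server can send the same header), serving static files with a max-age looks like this:

// Serve everything in ./public with a 30-day Cache-Control max-age.
// 'public' is a placeholder for wherever the image lives.
const express = require('express');
const app = express();
app.use(express.static('public', { maxAge: '30d' })); // sends Cache-Control: public, max-age=2592000
app.listen(3000);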
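And a minimal sketch of (d), assuming the background sits on an element with id "hero" and the image lives at /images/hero-bg.png (both placeholders):

document.addEventListener('DOMContentLoaded', function () {
  var img = new Image();
  img.onload = function () {
    // Swap the background in only once the file has fully downloaded.
    document.getElementById('hero').style.backgroundImage = "url('/images/hero-bg.png')";
  };
  img.src = '/images/hero-bg.png'; // kicks off the download after first render
});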

Besides improving website performance, I consider it an important courtesy to site visitors to make a point of compressing all images.
(Not to insult anyone's intelligence here but, just to clarify, compressing an image is not the same as shrinking its width/height — although both will reduce file size.)
If you have 1000 visitors and your site has 100 images and you save a half second of load time per image, then you just saved mankind 14 hours of page load time! (and it only takes a few seconds to compress an image.)
I compress the occasional "one-off" manually with:
TinyPNG.com for .pngs and .jpgs (up to 20 images at a time)
ezGIF.com for animated .GIFs.
Batches of images can be compressed programmatically (a sketch follows below) using a tool like:
pngquant, or
PHP's Imagick extension (a wrapper around ImageMagick).
As far as I know, these all use the same class of compression, known as lossy compression, which can reduce file size significantly without any noticeable degradation of image quality.
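For instance, here is a sketch of the batch approach in Node, shelling out to the pngquant CLI (this assumes pngquant is installed and on your PATH, and 'images' is a placeholder directory):

const { execFileSync } = require('child_process');
const fs = require('fs');
const path = require('path');

for (const name of fs.readdirSync('images')) {
  if (path.extname(name).toLowerCase() === '.png') {
    // Overwrite each PNG in place with a lossy version in the 65-80% quality range.
    execFileSync('pngquant', ['--quality=65-80', '--force', '--ext', '.png', path.join('images', name)]);
  }
}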

Related

How do I get our website to load under 1 second?

We have a responsive business website running on HTML5. Our goal has always been to load the site fast so that users can browse and get the required info even if they are in a slow-bandwidth area. Since our users are from every part of the world and the internet connection is not up to speed in every country, the best option we have is to deliver the content as fast as possible. The main purpose is to have the homepage optimized for speed; the rest of the pages can be served quickly from cache. The steps taken so far:
main document - 10 KB
fonts are preloaded in the head
reduced and minified CSS - final size 6 KB
use CDN for jquery.slim.min.js - size 25 KB
main image compressed and converted to WebP - reduced from 600 KB to a final size of 20 KB
favicon - final size 940 bytes
use CDN for Bootstrap JS, async loaded - size 23 KB
font reduced - final size 20 KB
Font Awesome locally served - final size 1 KB
analytics (preconnected as well) including googletag, async loaded - 50 KB
.htaccess edited for browser caching (images, CSS, JS, ico, webp, jpg, etc.) for repeat visits.
Having done all of that, site loading really has gotten faster, and the consistent Lighthouse results for desktop and mobile respectively are provided below:
GTmetrix and Pingdom have the site loading in 1-1.2 seconds consistently, with DOMContentLoaded at 910 ms. The results are pretty similar from different regions of the world (tested from 8 different countries). Our site is hosted on HostGator and they are gzipping content; the sizes above are for the files as compressed and sent to browsers, before deflating. I have been studying all over the web, including Stack Overflow, to see what can be optimized (for the first visit) to get to that sweet spot. I even minified the HTML, which negatively affected performance, so it was put back to the original. The homepage HTML file doesn't have any comments, so valuable bytes are not wasted, and we don't have any videos. What else can be done to shave off 200-300 milliseconds? :) Appreciate any valuable info/advice. Thank you.
Wow, great job! I think there is only one thing you might add to make it faster: lazy loading for content and photos (if any), fetching them whenever they enter the viewport.
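A minimal sketch of that idea with IntersectionObserver, assuming images are marked up with a data-src attribute holding the real URL (a common placeholder convention, not something from the asker's site):

const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      entry.target.src = entry.target.dataset.src; // start the real download
      obs.unobserve(entry.target);                 // each image only needs this once
    }
  }
});
document.querySelectorAll('img[data-src]').forEach(img => observer.observe(img));

Modern browsers also support loading="lazy" directly on img tags, which covers the simple cases without any script.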

Speeding up the image loading process

How can we speed up the process of loading hundreds of images in AS3? It seems to be taking a very long time to load hundreds of images.
If you don't have the ability to change the image quality or sizes, then there isn't a lot you can do to speed up such a large load, as you are limited by connection speed and simultaneous browser connections.
You may benefit from using a loading manager, like Greensock's LoaderMax. LoaderMax will help make sure you are loading several assets simultaneously - check out the maxConnections parameter.
Also, I suggest not loading everything up front with one preloader. Instead, have placeholders with a spinning loader for each image that then gets replaced by the final image once loaded. This will provide a much smoother user experience and create the illusion of the overall load taking less time.
100s of requests can take quite a long time even if the total size of the images isn't prohibitively large. If you know which images you need in advance, you could embed the images in another swf which you load at runtime.
By the way, why do you need to load hundreds of images right away? Is there any reason you can't load them in the background a little more slowly?
A few suggestions:
Reduce the file size of your images (obvious but most important).
Don't run all the requests at once - queue the images and load them one at a time, most relevant to least relevant (see the sketch below).
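A sketch of such a queue, written here in JavaScript terms (the AS3 version with Loader and Event.COMPLETE is analogous); the URLs are placeholders:

function loadQueue(urls, onEach) {
  if (urls.length === 0) return;
  const img = new Image();
  img.onload = img.onerror = () => {
    onEach(img);
    loadQueue(urls.slice(1), onEach); // only start the next download now
  };
  img.src = urls[0];
}

loadQueue(['a.png', 'b.png', 'c.png'], img => document.body.appendChild(img));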
There is lots of good advice here already, but here are my 2 cents. One day I had to load about 1,000 really small images. It was taking too long, so I used the FZip AS3 library. It worked like a charm.
The question does not provide a lot of specifics, so here are some general guidelines.
If the images are large JPEGs, you might be able to sacrifice some quality and recompress the images to make them smaller.
If the images are PNGs and represent, e.g., lots of small, individual tiles for a game, you could try combining many of them into a single image, then slicing that back into individual images on the client side after one download (see the sketch below).
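A sketch of the slicing step, again in JavaScript terms (AS3's BitmapData.copyPixels does the same job); the sheet URL and the 32-pixel tile size are assumptions:

const sheet = new Image();
sheet.onload = () => {
  const TILE = 32; // assumed tile size in pixels
  const tiles = [];
  for (let y = 0; y < sheet.height; y += TILE) {
    for (let x = 0; x < sheet.width; x += TILE) {
      const canvas = document.createElement('canvas');
      canvas.width = canvas.height = TILE;
      // Copy one tile out of the combined sheet.
      canvas.getContext('2d').drawImage(sheet, x, y, TILE, TILE, 0, 0, TILE, TILE);
      tiles.push(canvas);
    }
  }
};
sheet.src = 'tiles.png'; // placeholder for the combined sheet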
Implement a download queue, for example this simple loader:
https://github.com/hydrotik/QueueLoader
Do not load images that are currently off stage (you can load them in the background).
Use a texture packer if you have plenty of small images.
Optimize picture quality (i.e. size, type).
Use a cache, for instance a Local Shared Object, so next time you'll get your pics ultimately fast from the LSO.
Is there any way to get thumbnails of the images made? If so, you could use the thumbnails while the larger versions load. Load all the thumbnails first (since they'd load really fast), then load the larger versions in prioritized order.
It sounds like you don't have a lot of control over the images, though. If you don't have any control there, there's very little you can do to speed up the process besides using a loading optimizer as others have suggested.

Displaying a ton of images on one page?

Is there any way to do this without bogging down a computer?
I don't know much about the inner workings of computers and browsers, so I don't know what it is about a bunch of images on a page that takes up so much CPU. My first thought was to hide the images that aren't visible on the page anyway - the ones that have been scrolled past or are yet to be scrolled to.
I tried a sample jsfiddle with randomly colored divs instead of pictures, but just scrolling up and down through that makes my computer sound like it's taking off.
What is it about all the pictures that takes up cpu? Can it be avoided?
The question is generally not about the amount of bandwidth it might save. It is more about lowering the number of HTTP requests needed to render a webpage.
What takes time when making lots of requests for small content (images, icons, and the like) is the multiple round trips to the server: you end up spending time waiting for requests to go out and the server to respond, instead of using that time to download data. For example, 100 small images at a 50 ms round trip, fetched six at a time, is roughly 0.8 seconds spent purely on waiting, regardless of file size.
Fewer HTTP requests = faster loading overall.
If we minimize the number of requests, we minimize the number of trips to the server, and use our high-speed connection better (we download one bigger file, instead of waiting for many smaller ones).
That's why CSS sprites are used.
For more information, you can have a look at, for instance: CSS Sprites: Image Slicing’s Kiss of Death
Other than this, you can also use lazy loading.

How much faster is it to use inline/base64 images for a web site than just linking to the hard file?

How much faster is it to use an inline base64 image to display, as opposed to simply linking to the file on the server?
url(data:image/png;base64,.......)
I haven't been able to find any type of performance metrics on this.
I have a few concerns:
You no longer gain the benefit of caching
Isn't base64 A LOT larger in size than the PNG/JPEG file?
Let's define "faster" as: the time it takes for a user to see a fully rendered HTML web page.
'Faster' is a hard thing to answer because there are many possible interpretations and situations:
Base64 encoding will expand the image by a third, which will increase bandwidth utilization. On the other hand, including it in the file will remove another GET round trip to the server. So, a pipe with great throughput but poor latency (such as a satellite internet connection) will likely load a page with inlined images faster than if you were using distinct image files. Even on my (rural, slow) DSL line, sites that require many round trips take a lot longer to load than those that are just relatively large but require only a few GETs.
If you do the base64 encoding from the source files on each request, you'll use more CPU, thrash your data caches, etc., which might hurt your server's response time. (Of course, you can always use memcached or the like to resolve that problem.)
Doing this will of course prevent most forms of caching, which could hurt a lot if the image is viewed often - say, a logo that is displayed on every page, which could normally be cached by the browser (or a proxy cache like Squid) and requested once a month. It will also prevent the many optimizations web servers have for serving static files using kernel APIs like sendfile(2).
Basically, doing this will help in certain situations, and hurt in others. You need to identify which situations are important to you before you can really figure out if this is a worthwhile trick for you.
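For a feel of the size tradeoff, here is a sketch of producing such a data URI in Node (logo.png is a placeholder):

const fs = require('fs');
const b64 = fs.readFileSync('logo.png').toString('base64');
const dataUri = 'data:image/png;base64,' + b64;
// base64 emits 4 characters per 3 input bytes, so this string is
// roughly 4/3 the size of the original file, plus the prefix.
console.log(dataUri.length);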
I have done a comparison between two HTML pages containing 1800 one-pixel images.
The first page declares the images inline:
<img src="data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAYAAAAfFcSJAAAAAXNSR0IArs4c6QAAAARnQU1BAACxjwv8YQUAAAAJcEhZcwAADsQAAA7EAZUrDhsAAAANSURBVBhXYzh8+PB/AAffA0nNPuCLAAAAAElFTkSuQmCC">
In the second one, images reference an external file:
<img src="img/one-gray-px.png">
I found that when loading the same image multiple times, if it is declared inline the browser performs a request for each image (I suppose it base64-decodes it once per image), whereas in the other scenario the image is requested once per document (see the comparison image below).
The document with inline images loads in about 250ms and the document with linked images does it in 30ms.
(Tested with Chromium 34)
A priori, the scenario of an HTML document with multiple instances of the same inline image doesn't make much sense. However, I found that the jQuery Lazyload plugin defines an inline placeholder by default for all the "lazy" images, and their src attribute is set to it. So if the document contains lots of lazy images, a situation like the one described above can happen.
You no longer gain the benefit of caching
Whether that matters would vary according to how much you depend on caching.
The other (perhaps more important) thing is that if there are many images, the browser won't fetch them all simultaneously (i.e. in parallel), but only a few at a time - so the protocol ends up being chatty. If there's some network end-to-end delay, then (many images ÷ a few at a time) × (end-to-end delay per image) adds up to a noticeable time before the last image is loaded.
Isn't a base64 A LOT larger in size than what a PNG/JPEG file size?
The file format / image compression algorithm is the same, I take it, i.e. it's still a PNG.
With base64, each 8-bit output character carries only 6 bits of the original data, so the binary is expanded by a ratio of 8 to 6, i.e. about 33% bigger. Put differently, every 3 bytes of image become 4 characters of text; a 900 KB PNG becomes roughly 1.2 MB of markup.
How much faster is it
Define 'faster'. Do you mean HTTP performance (see below) or rendering performance?
You no longer gain the benefit of caching
Actually, if you're doing this in a CSS file it will still be cached. Of course, any changes to the CSS will invalidate the cache.
In some situations this could be used as a huge performance boost over many HTTP connections. I say some situations because you can likely take advantage of techniques like image sprites for most stuff, but it's always good to have another tool in your arsenal!
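A sketch of that CSS-file case (the base64 payload here is truncated with "..." and stands in for a real encoded image):

/* Inlined logo inside a cacheable stylesheet: one request for the CSS
   brings the image along with it. */
.logo {
  width: 120px;
  height: 40px;
  background-image: url('data:image/png;base64,iVBORw0KGgo...');
}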

sprites vs image slicing

I don't have much experience with the sprite approach to images (http://www.alistapart.com/articles/sprites). Anyone care to share some pros/cons of sprites vs. old-school slices?
The main advantage of sprites is that the browser has to request fewer images from the webserver. That reduces the number of HTTP requests and makes it possible to compress the parts of the design more effectively. These two points are also exactly the disadvantages of sliced images.
Here you can see some good examples how sprites improve the loading speed of web pages:
http://css-tricks.com/css-sprites/
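As a minimal sketch of the technique, suppose a sheet icons.png (a placeholder name) holds two 16×16 icons side by side; background-position picks each one out of the single downloaded file:

.icon        { width: 16px; height: 16px; background: url('icons.png') no-repeat; }
.icon-home   { background-position: 0 0; }      /* left tile */
.icon-search { background-position: -16px 0; }  /* right tile */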
Pros:
It's far easier on the server to serve a single large image than many small ones.
It's (slightly) faster for a web browser to load such an image.
Browsers only load images as they need them - if you are using multiple images in a rollover, the browser would "pause" the first time you roll over the element. Sprites solve this, because there is only one image to load.
Cons:
It's kind of a pain to code (more so than using multiple images at least)
One often overlooked downside of using CSS sprites is memory footprint:
https://web.archive.org/web/20130605000516/http://blog.vlad1.com/2009/06/22/to-sprite-or-not-to-sprite/
Unless the sprite image is carefully constructed, you end up with incredible amounts of wasted space. My favourite example is from WHIT TV’s web site, where this image is used as a sprite. Note that this is a 1299×15,000 PNG. It compresses quite well — the actual download size is around 26K — but browsers don’t render compressed image data. When this image is downloaded and decompressed, it will use almost 75MB in memory (1299 × 15,000 × 4 bytes per pixel).
When sprites get loaded into the browser, they are stored uncompressed. A 26 KB file can uncompress to take up a whopping 75 MB of RAM. You should be mindful of using sprites with very large dimensions.
There's also the issue of what happens in browsers with poor CSS support (legacy browsers). The sprites may end up totally broken.
Sprites
Pros:
Fewer HTTP connections to the server
Faster loading on broadband
Cons:
No encapsulation: if you want to change one image, you have to change the sprite
It is difficult to set up individual images in CSS without a tool
Doesn't degrade: if the browser doesn't support CSS, you are in trouble
CSS Sprites:
Pros:
Graceful degradation in unsupported browsers (text can be shown when background images for links are not allowed)
Fewer HTTP requests
Each image carries separate overhead (such as a color table), so image slicing has more overhead than CSS sprites
Cons:
Poses a problem if images are turned off in the browser (a rare case, though)
Image slicing:
Pros:
The user perceives a faster load, since the page appears piece by piece.
Loads on demand, e.g. when the user hovers over the image
Cons:
The webpage might end up large on the client side even though that might not be the case on the server side.
The main drawback of sprites is it makes it hard to read/maintain/modify your CSS. It can be difficult to remember the exact pixel offsets within the sprite.
Pros of using sprites:
Since one image serves everything, there is less load on the HTTP server.
Cons:
- Harder to code: you must know the coordinates of each image inside the sprite to display it correctly, and once you change the size of an image, you need to adjust everything ...
- A big image can mean a long wait before the page displays; with individual images, a user on a slow internet connection can see them appear one by one.
Best practice: use sprites for things like rollover images (see the sketch below).
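A sketch of that rollover use case: one sheet (button.png, a placeholder) stacks the normal and hover states vertically, so hovering just shifts the background position and never triggers a second download:

.btn       { width: 100px; height: 30px; background: url('button.png') 0 0 no-repeat; }
.btn:hover { background-position: 0 -30px; /* show the lower half of the sheet */ }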
Look into using a CSS sprite generator (we use SmartSprites). That way you can do slices locally, and have your build process generate a spritemap. It's the best of both worlds.
Also, if SmartSprites isn't for you, there are definitely others; however, I like it because it reduces the amount of work up front AND during changes.
Cons
- slower on older browsers, and the hover effect may not work on them (e.g. Opera 6)
- if not used correctly, can get very/too huge (group them adequately!)
- tedious work to set them up
Pros
- fewer bytes transferred, because one big image is smaller than all the individual images combined (one header/color table)
- fewer HTTP requests
I prefer the middle ground of grouping similar images (normal, hover, selected page, the parent page of the selected page) rather than having all the images in one file. To make these, you slice images as normal in Photoshop or Illustrator, open the files, and combine them with a shortcut key. I wrote a Photoshop script that combines images into CSS sprites. You will have multiple HTTP connections, but won't have the load delay on hover.