Displaying a ton of images on one page?

Is there any way to do this without bogging down a computer?
I don't know much about the inner workings of computers and browsers, so I don't know why a bunch of images on a page takes up so much CPU. My first thought was to hide the images that aren't visible on the page anyway: the ones that have been scrolled past or are yet to be scrolled to.
I tried a sample jsfiddle with randomly colored divs instead of pictures, but just scrolling up and down through it makes my computer sound like it's taking off.
What is it about all the pictures that takes up CPU? Can it be avoided?

The question is generally not about the amount of bandwidth it might save; it is more about lowering the number of HTTP requests needed to render a webpage.
What takes time, when making lots of requests for small pieces of content (images, icons, and the like), is the multiple round trips to the server: you end up spending time waiting for each request to travel and for the server to respond, instead of using that time to download data.
Fewer HTTP requests = faster loading overall.
If we minimize the number of requests, we minimize the number of trips to the server and make better use of our high-speed connection (we download one bigger file instead of waiting for many smaller ones).
That's why CSS sprites are used.
For more information, you can have a look at, for instance: CSS Sprites: Image Slicing’s Kiss of Death
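For illustration, here is a minimal sprite sketch. The icons.png file and the class names are hypothetical; the idea is that one combined image (here, a horizontal strip of 32x32 icons) serves many elements, so the browser makes a single request instead of one per icon:

```css
/* One shared image for all icons; each rule reveals only its own slice. */
.icon {
  width: 32px;
  height: 32px;
  background-image: url("icons.png");
  background-repeat: no-repeat;
}
.icon-home { background-position:   0     0; } /* 1st tile in the strip */
.icon-mail { background-position: -32px   0; } /* 2nd tile */
.icon-user { background-position: -64px   0; } /* 3rd tile */
```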
Other than this, you can also lazy-load images that aren't in the viewport yet; a sketch follows.
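Lazy loading can be as simple as the native loading attribute; the file names and the data-src convention below are assumptions for the sketch:

```html
<!-- Modern browsers can defer offscreen images natively: -->
<img src="photo-1.jpg" loading="lazy" alt="photo">
```

A manual version with IntersectionObserver, for finer control:

```js
// Keep the real URL in data-src and assign it only when the image
// scrolls near the viewport, so offscreen images cost nothing.
const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // start the actual download now
      observer.unobserve(img);   // each image only needs this once
    }
  }
}, { rootMargin: "200px" });     // begin loading slightly before visible

document.querySelectorAll("img[data-src]").forEach((img) => observer.observe(img));
```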

Related

First background image on a website loads slowly

The first background image I have on my website loads slowly.
Now, it's a background image, but it has to fit a large screen, so it's a 900 KB PNG.
What tricks can I use to make it load faster? (It's the first thing you see, so even the 3-4 seconds it takes is felt...)
There are lots of tricks. The lads above in their comments are right: there are things you can do to compress it, and to analyse what is taking the time.
Load times, and improving them, come down to three components:
- Network latency (the time it takes to pass data between the browser and your host)
- Image size (the physical size of the image that goes down the pipe)
- Connection cost (the time taken to establish the connection with the host)
Now, if we take your scenario, you have an image that is large, but not huge. So how can we make it faster? Well, we can reduce the time it takes to download by:
a) compressing it - there are many options here; along with the Google one mentioned above, here is a simple one: https://tinypng.com/
b) removing the network latency by ensuring the host sends a Cache-Control header with a max-age. On a repeat visit the browser can then show the image straight from its cache without re-downloading it (or, when it revalidates, the host returns a lightweight 304 instead of the full image).
c) using a CDN such as Cloudflare to do the hard stuff for you. A CDN will handle the compression, set the cache headers, reduce your network latency, and ensure your connections are established quickly. Some, like Cloudflare, have free starting tariffs, so it's worth a look.
d) lazy-loading the image with JavaScript after the page has rendered. It's still good to do all of the above, but if you write some JavaScript that fires on document-ready and sets the background (via CSS, an img tag, or whatever mechanism you are currently using), the download of the image starts at that point, letting your users see and interact with the page before your background has finished loading. A minimal sketch follows this list.
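Here is a sketch of option d), assuming the background is set on the body element and the image lives at a hypothetical bg-large.png:

```js
// Defer the heavy background image: the page renders immediately with a
// plain (or tiny placeholder) background, and the big PNG is swapped in
// only after it has fully downloaded.
document.addEventListener("DOMContentLoaded", () => {
  const bg = new Image();
  bg.onload = () => {
    document.body.style.backgroundImage = "url('bg-large.png')";
  };
  bg.src = "bg-large.png"; // starts the download without blocking render
});
```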
Besides improving website performance, I consider it an important courtesy to site visitors to make a point of compressing all images.
(Not to insult anyone's intelligence here but, just to clarify, compressing an image is not the same as "shrinking" its width/height, although both will reduce file size.)
If you have 1000 visitors and your site has 100 images and you save a half second of load time per image, then you just saved mankind 14 hours of page load time! (and it only takes a few seconds to compress an image.)
I compress the occasional "one-off" manually with:
- TinyPNG.com for PNGs and JPGs (up to 20 images at a time)
- ezGIF.com for animated GIFs
Batches of images can be compressed programmatically using a tool like:
- pngquant, or
- ImageMagick (e.g. via PHP's Imagick extension).
As far as I know, these all rely on lossy compression, which can reduce file size significantly without any noticeable degradation of image quality.
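As a sketch of batch compression, you could drive pngquant from a small Node.js script like this (it assumes pngquant is installed and on your PATH, and the images/ folder is hypothetical):

```js
const { execFileSync } = require("child_process");
const fs = require("fs");
const path = require("path");

const dir = "images"; // hypothetical folder of PNGs to compress

for (const file of fs.readdirSync(dir)) {
  if (path.extname(file).toLowerCase() !== ".png") continue;
  // --quality sets an acceptable quality range; --ext/--force overwrite
  // the original file in place with the compressed version.
  execFileSync("pngquant", [
    "--quality=65-80",
    "--ext", ".png",
    "--force",
    path.join(dir, file),
  ]);
  console.log(`compressed ${file}`);
}
```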

Does a spritesheet use resources for the entire sheet or only what's being loaded from it?

I feel like this is a stupid question, since I'm pretty sure the spritesheet uses memory for the entire sheet, but I just wanted to make 100% sure and can't find the answer anywhere. For an HTML project I wanted to be 'smart' and make as few HTTP requests as possible, so I created a 2048x2048 image with only about 150x150 unused, which is actually great since it leaves room to add other images in the future if I need to.
The problem is, after purchasing and configuring a server, I was afraid that if a lot of people connected at once, the server would use more memory for the bigger spritesheet. I only use 10% of the images in the sprite sheet at any given time, and when they change, the previous ones get replaced. So was it stupid of me to make one big file; will it only use extra memory for no reason? Would the rule of thumb here be: use big sprite sheets for images that are always being loaded, and split the remainder into smaller sprite sheets that are used only at certain times?
To answer your question: yes, the entire sheet is loaded into memory, but into the browser's memory, not the server's. The server just streams the file's bytes; it doesn't keep a decoded copy per visitor.
And the browser downloads it only once.
Each CSS rule displays only a small portion of the sprite.
In theory, as you move from page to page, that one large image is already in the cache and is not requested from the server again, making the next page load that much faster.
There is much more on caching versus the number of requests versus the combined sizes of images that would go into a more complete answer.

Speeding up the image loading process

How can we speed up the process of loading hundreds of images in AS3? It seems to be taking a very long time to load hundreds of images.
If you don't have the ability to change the image quality or sizes, then there isn't a lot you can do to speed up such a large load, as you are limited by connection speed and simultaneous browser connections.
You may benefit from using a loading manager, like GreenSock's LoaderMax. LoaderMax will help make sure you are loading several assets simultaneously; check out the maxConnections parameter.
Also, I suggest not loading everything up front with one preloader. Instead, have placeholders with a spinning loader for each image that then gets replaced by the final image once loaded. This will provide a much smoother user experience and create the illusion of the overall load taking less time.
100s of requests can take quite a long time even if the total size of the images isn't prohibitively large. If you know which images you need in advance, you could embed the images in another swf which you load at runtime.
By the way, what is the reason you need to load hundreds of images right away? Is there any reason you can't load them in the background a little more slowly?
A few suggestions:
- Reduce the file size of your images (obvious, but most important).
- Don't run all the requests at once: queue the images and load them one at a time, from most relevant to least relevant (see the sketch after this list).
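The question is about AS3, but the queueing idea is the same everywhere; here is a sketch of it in browser JavaScript, with a hypothetical list of URLs:

```js
// Load images strictly one at a time, in priority order, so the first
// (most relevant) images appear quickly instead of everything crawling
// along in parallel.
function loadQueue(urls, onEach) {
  let i = 0;
  function next() {
    if (i >= urls.length) return;       // queue drained
    const img = new Image();
    img.onload = () => { onEach(img); next(); };
    img.onerror = next;                 // skip failures, keep the queue moving
    img.src = urls[i++];                // kicks off exactly one download
  }
  next();
}

loadQueue(
  ["hero.jpg", "gallery-1.jpg", "gallery-2.jpg"], // most to least relevant
  (img) => document.body.appendChild(img)
);
```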
There was lots of good advice already, but here are my 2 cents. One day I had to load about a thousand really small images. It was taking too long, so I used the FZip AS3 library. It worked like a charm.
The question does not provide a lot of specifics, so here are some general guidelines.
If the images are large JPEGs, you might be able to sacrifice some quality and recompress the images to make them smaller.
If the images are PNGs and represent, e.g., lots of small, individual tiles for a game, you could try combining many of them into a single image and then parsing them back out into individual images on the client side after loading the combined image over the network (a sketch follows).
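In browser terms, that client-side slicing could look like this sketch (the tiles.png atlas and its 32x32 tile layout are hypothetical):

```js
// Download one combined atlas, then cut it into individual tiles with a
// canvas, avoiding hundreds of separate HTTP requests.
const atlas = new Image();
atlas.onload = () => {
  const TILE = 32;
  const canvas = document.createElement("canvas");
  canvas.width = canvas.height = TILE;
  const ctx = canvas.getContext("2d");

  // Extract the tile at column 2, row 1 of the atlas.
  ctx.drawImage(atlas, 2 * TILE, 1 * TILE, TILE, TILE, 0, 0, TILE, TILE);

  const img = new Image();
  img.src = canvas.toDataURL("image/png"); // standalone image data
  document.body.appendChild(img);
};
atlas.src = "tiles.png";
```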
- Implement a download queue, for example this simple loader: https://github.com/hydrotik/QueueLoader
- Do not load images that are currently off-stage (you can load them in the background).
- Use a texture packer if you have plenty of small images.
- Optimize picture quality (i.e. size, type).
- Use a cache, for instance a Local Shared Object, so next time you'll get your pics ultimately fast from the LSO.
Is there any way to get thumbnails of the images made? If so, you could use the thumbnails while the larger versions load. Load all the thumbnails first (since they'd load really fast), then load the larger versions in prioritized order.
It sounds like you don't have a lot of control over the images, though. If you don't have any control there, there's very little you can do to speed up the process besides using a loading optimizer as others have suggested.

How do I optimize a very loooong page with TONS of images on it?

I have been tasked with optimizing a marketing landing page for speed. It already does really well, but the problem is it's very graphics-heavy.
The entire thing, I would guesstimate, is 30+ pages long, all on one page (it must be like this, and everything must be images, for conversion reasons).
Here's what I've done so far, but I'm not sure if there's something else I'm missing:
I re-compressed over 140 JPGs on the page to slightly smaller sizes.
I played around with sprites, but most of the images are large (like entire testimonial boxes that are 600px wide). The sprite images ended up being around 2 MB, and that page actually took longer to load for some reason (by almost 2s), so I took them off.
The MOST important thing is that everything at the top loads as fast as humanly possible, and before anything else, so that the sale isn't lost by someone staring at a bunch of images loading out of order. I pre-cached some images with this method: http://perishablepress.com/press/2009/01/18/css-image-caching/ . It did seem to work quite well on the smaller images, but when I add the background (which is very graphics-intensive) everything seems to slow down at once, like it's trying to do a certain number (5?) of images at a time. Ideally I'd love to group the first 4 smaller images being pre-cached, and then follow up by focusing all bandwidth on the background, then the rest of the entire page in standard order.
The Google PageSpeed score is 90/100. The only thing it's concerned about is unused styles, but I'm not really worried about that because it's about 2 KB total; the overhead from the 7 MB of images is way more important.
I have the option to use a CDN for the images, but I'm not sure how I'd go about doing this, or whether it would actually help.
I feel I've done all I can but then again I could totally be missing something...
- A CDN would definitely help. And with 140 pictures, I hope it consists of more than one server. Link directly to the IP of the servers to avoid unnecessary DNS lookups.
- Minimize HTTP requests. You mention that you've been experimenting with sprites. I still believe sprites are the way to go, but you might not want to create just one huge image. If you have 140 images, you should probably have about 10 sprites.
- Make sure that you have set headers to make the client cache all content. Remember ETags. (A minimal sketch follows this list.)
- Gzip/compress content for browsers that allow it.
- If the source code is lengthy, make sure to flush the buffer early.
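As a minimal sketch of the caching point, assuming a Node.js server and a hypothetical hero.jpg: it sends a long max-age plus an ETag, and answers conditional requests with a 304 instead of the full image.

```js
const http = require("http");
const fs = require("fs");
const crypto = require("crypto");

http.createServer((req, res) => {
  const body = fs.readFileSync("hero.jpg");           // hypothetical image
  const etag = crypto.createHash("md5").update(body).digest("hex");

  if (req.headers["if-none-match"] === etag) {
    res.writeHead(304);                               // client's copy is fresh
    return res.end();
  }
  res.writeHead(200, {
    "Content-Type": "image/jpeg",
    "Cache-Control": "public, max-age=31536000",      // cache for a year
    "ETag": etag,
  });
  res.end(body);
}).listen(8080);
```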
Consider lazily loading images below the fold; there are a number of JavaScript plugins to accomplish this.
Scrolling pagination + combining images as sprites on a per-page basis.

How to calculate overhead of a web request?

Suppose there is a fancy button to be put on a website. And the design of the button is such that parts of it can be sliced and applied as a repeating background.
I often slice images and apply them as repeating backgrounds this way, so one button image is split into several different images. I do this to reduce the total size of the images used.
My team leader told me not to slice the images: if you slice a button into three parts, there will be three web requests, and this will slow down the site.
I find it hard to believe that the overhead of three requests would be more than using the entire image, so I just want to know how to calculate the number of bytes transferred per web request.
You could use the Net tab in Firebug to see the time taken for each web request; it also breaks down the time it takes to download each component of the response.
If you reuse the three images a lot throughout the site, then you will save requests and bandwidth (your gut reaction is right). However, the three images do need to be retrieved initially, so you must reuse them.
The crux is this: thanks to HTTP pipelining, the overhead is pretty much negligible (especially when you consider that HTTP is a text-based protocol). Retrieving three images probably has about the same latency as retrieving a single image.
-- edit: after comment from Ericlaw --
Pipelining is indeed not widely supported. However, this does not mean you don't stand to gain from three images; just make sure you reuse them A LOT throughout your website.
-- edit --
Browsers also open multiple connections to a server. The norm was 2 connections per host last I checked; however, I believe recent browsers open more.
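To put rough, illustrative numbers on the original question (typical ballpark figures, not measurements from any particular site): an HTTP/1.x request carries roughly 400-800 bytes of request headers, and a response carries roughly 200-500 bytes of headers, so slicing one button into three images adds on the order of 1-2 KB of header overhead in total. The round trips usually matter more than the bytes: on a connection with a 100 ms round-trip time, three images fetched one after another on a single connection cost about 3 x 100 ms = 300 ms in latency alone, versus about 100 ms for one combined image. Parallel connections shrink that gap, but each new connection pays its own TCP handshake (at least one extra round trip). The Net tab mentioned above will show the exact header sizes and timings for your own site.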