Reduce loading time of heavy images in HTML5

I am trying to load heavy PNG images on a website. I have tried Photoshop's Posterize method to reduce the file size of heavy PNGs, followed by some online tools to reduce the PNG file size without majorly compromising the quality of the image.
I have also tried pre-loading images, but they are still taking a considerable amount of time to load.
Can anyone please suggest a way to reduce the loading time of PNG images on a website, so that they load quickly?
Many thanks in advance!

Even if you follow all or some of the previous advice, a nice add-on to your workflow is passing all your PNGs through PNGoptimizer.
It's a simple drag-and-drop program which strips unnecessary information out of your PNGs. Depending on the type of image, the size saving varies from 5-10% to more than 70%.

Try preloading using XHR and some server tricks.
Or use the WebP format.
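A rough sketch of the XHR idea in plain JavaScript - the file paths are placeholders, and the "server tricks" part mostly comes down to sending cache headers so the preloaded response gets reused when the <img> tags later request the same URLs:
// Warm the browser cache before the images are actually displayed.
var urls = ['/images/hero.png', '/images/banner.png']; // placeholder paths

urls.forEach(function (url) {
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url, true); // asynchronous GET; the response lands in the browser cache
    xhr.send();
});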

Your problem must be elsewhere.
At most, 79 KB x 5 is less than half a megabyte. This would only be a slow load on a slow connection.
I suggest you install YSlow ( http://developer.yahoo.com/yslow/ ). It analyzes web pages and suggests ways to improve their performance based on a set of rules for high-performance web pages.

There is a nice JS script to ease the preloading of images: https://github.com/jussi-kalliokoski/html5Preloader.js
Also be aware that proper caching is a MUST for heavy sites. Google has a nice introduction for this purpose: https://developers.google.com/speed/docs/best-practices/caching
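For static files that rarely change, the gist of that guide is to send far-future caching headers from your server, for example (the values are only illustrative):
Cache-Control: public, max-age=31536000
Expires: Thu, 31 Dec 2015 20:00:00 GMT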
To avoid flashing images on your site, load low-res images that load quickly, stretch them to full size, and replace them once the high-res images have loaded. To improve the look you could also place a grid over the low-res image.
This looks pretty neat to the visitor and reduces the perceived wait time.
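A minimal sketch of that low-res/high-res swap in plain JavaScript - the file names and the data-hires attribute are made up for illustration:
<img src="photo-low.jpg" data-hires="photo-high.png" class="progressive" width="800" height="600" alt="">

<script>
// Once the page is ready, fetch each high-res version and swap it in when it has finished loading.
document.addEventListener('DOMContentLoaded', function () {
    var imgs = document.querySelectorAll('img.progressive');
    Array.prototype.forEach.call(imgs, function (img) {
        var hires = new Image();
        hires.onload = function () { img.src = hires.src; }; // swap only after the big file is ready
        hires.src = img.getAttribute('data-hires');
    });
});
</script>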

Depending on what server you have: I use ImageGen for all my .NET sites.
http://our.umbraco.org/projects/website-utilities/imagegen
<img src="/ImageGen.ashx?image=/images/an-image.png&width=50&height=50" width="50" height="50" alt="">

Photoshop's Posterize will not help you; PNGs don't use a palette (unless you choose PNG-8). You can use File > Save for Web to export a smaller PNG (or a GIF). If your image contains alpha information, it will be lost if you export for web, though. PNG-8 and PNG-24 both have index transparency.

@ShrutiJakhete, you can use the Lazy Load jQuery plugin with your HTML5 page. Images outside the viewport will not be loaded until you scroll down the page, which reduces the initial loading time. Check this out: http://www.appelsiini.net/projects/lazyload
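Typical usage looks roughly like this - data-original is the attribute the classic plugin reads, but check the project page for the exact markup your version expects:
<img class="lazy" data-original="/images/heavy-photo.png" width="640" height="480" alt="">

<script src="jquery.js"></script>
<script src="jquery.lazyload.js"></script>
<script>
// Images with the "lazy" class are only requested once they scroll into the viewport.
$(function () {
    $("img.lazy").lazyload();
});
</script>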

Related

Optimize webpage speed with lots of graphics

We just finished this webpage http://marinosactiongrill.com/. As you can see it's HUGE and it has a lot of graphics (almost 7 MB), which makes it download very slowly the first time (approx. 20 to 40 secs).
Can you give me some tips to make it show faster?
I know that it's better to use sprites... but I couldn't find a way to make responsive buttons and animations with sprites that is also compatible with IE and other browsers.
Regards...
To begin with, you might want to consider using the Google PageSpeed plugin - it can help you analyze your performance bottlenecks.
1) By scaling and optimizing images you can save up to ~500 KB of your initial load. You should also think about using JPEG-compressed images instead of the PNG format. JPEG is a lossy format but can save some bandwidth compared to PNGs.
2) Setting up correct server-side caching can also work wonders - instead of loading static content on every request you can just cache it after the first request.
3) There also seems to be something wrong with your server-side setup - loading relatively small and static CSS and JavaScript files takes up to 2 seconds in some cases. Is your server under heavy load?
4) In modern browsers your page doesn't have to be built using just images. Instead of making two sets of images for each button you can just use custom fonts + some fancy new CSS in order to achieve a "glowing" effect on mouseover. Here is a quick demo - http://jsfiddle.net/nc6v4/ - it is done using only CSS, no images:
.some-selector {
    font-family: 'Open Sans', sans-serif;    /* example: name of a custom font loaded, e.g. via the Google Fonts API */
    box-shadow: 0 0 10px #fff;               /* example: use for blurry outlines, glowing effects, etc. */
    background: linear-gradient(#555, #222); /* example: you can use gradients instead of images */
}
5) Do not host jQuery on your own server - use some kind of CDN. jQuery provides its own CDN, and if you are not comfortable with it, use one provided by either Google or Microsoft (see the snippet after this list).
6) Server-side compression can save a lot of bandwidth - in your case ~500 KB of data. Instead of sending your HTML, JavaScript and CSS as plain text, "gzip" it, set the correct HTTP header, and the browser will automatically decompress it.
7) One last but rather important technique - combine your JavaScript files into one big minified JavaScript file. The thing is, when making lots of requests to the server at the same time, those requests start blocking each other. It is MUCH faster to load one file of 20 KB than twenty files of 1 KB each. Use the same technique for CSS.
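A small illustration of points 5 and 7 - the Google-hosted jQuery URL is real, while app.min.js is a made-up name for your single combined, minified file:
<!-- jQuery from a public CDN instead of your own server -->
<script src="//ajax.googleapis.com/ajax/libs/jquery/1.9.1/jquery.min.js"></script>
<!-- all of your own scripts concatenated and minified into one request -->
<script src="/js/app.min.js"></script>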
To sum it all up - there is no magical technique to make this site fast using just a couple of lines of code. When you set up server-side caching and combine + minify your JavaScript and CSS files you should be fine, though. When extra performance is desired, consider using modern CSS + HTML techniques instead of bandwidth-heavy images.
Also, as a quick side note, your page doesn't have to look 100% identical across all browsers. Modern browsers perform seamless auto-updates, keeping up with new standards, and old versions of Internet Explorer are losing their market share at quite a satisfactory rate. Furthermore, since the target audience for your page seems to be gamers, browser compatibility shouldn't be an issue either.
You need to go back and optimize your images, as they are far too big given their dimensions. If you used Photoshop it's just a case of File > Save for Web and then finding the balance between compression artifacts and file size.
It could also be beneficial to use something like Picturefill, which loads different versions of images for certain resolutions, making the site much quicker as you go down in screen sizes. Why bother loading massive images that just get scaled down on smaller screens!
It's as easy as:
<span data-picture data-alt="Image Description">
<span data-src="small.jpg"></span>
<span data-src="medium.jpg" data-media="(min-width: 400px)"></span>
<span data-src="large.jpg" data-media="(min-width: 800px)"></span>
<span data-src="extralarge.jpg" data-media="(min-width: 1000px)"></span>
</span>
This might be too much work, but it is definitely something to consider.
To improve the performance of your website you need Firebug and PageSpeed together (Firefox) or PageSpeed (Google Chrome). These tools can also help you find other bottlenecks that you haven't discovered yet.
*Online Tool
http://developers.google.com/speed/pagespeed/insights/
*Extension for Google Chrome and Firefox
https://developers.google.com/speed/pagespeed/insights_extensions
*Firebug
https://getfirebug.com/downloads/
To use it in Firefox, you need to install Firebug and PageSpeed at the same time.

Creating a moving banner

Currently for my webpage's banner, I'm using http://workshop.rs/projects/coin-slider/ to have 4 images sliding. On the office's slower computers the coin slider seems to be very laggy.
My question is, would animating the 4 images into a single GIF file be better/faster? Are there better options to create a 4-image-moving banner?
Try using a lightweight image gallery; I found some here. The coin slider is lightweight at 8 KB, but there are even lighter ones at around 2 KB.
You could technically use a GIF, but I think there are some mobile compatibility issues, and you can get better functionality with JavaScript/jQuery galleries.
As far as slow computers go, there isn't much you can do about that, so I wouldn't worry too much about it. Just do some extra research on how to minimise files and deliver a faster site.
A GIF is not a wise alternative for sliders. You could reduce the quality of your images and compress them, or if you are using large images and resizing them in the slider, you can resize them with any image editing tool. This will reduce the image size too.
And I hope you are not using direct external links as image paths, because that will automatically increase the loading time.

Speeding up the image loading process

How can we speed up the process of loading hundreds of images in AS3? It seems to be taking a very long time to load hundreds of images.
If you don't have the ability to change the image quality or sizes, then there isn't a lot you can do to speed up such a large load, as you are limited by connection speed and simultaneous browser connections.
You may benefit from using a loading manager, like Greensock's LoaderMax. LoaderMax will help make sure you are loading several assets simultaneously - check out the maxConnections parameter.
Also, I suggest not loading everything up front with one preloader. Instead, have placeholders with a spinning loader for each image that then gets replaced by the final image once loaded. This will provide a much smoother user experience and create the illusion of the overall load taking less time.
100s of requests can take quite a long time even if the total size of the images isn't prohibitively large. If you know which images you need in advance, you could embed the images in another swf which you load at runtime.
By the way, what is the reason you need to load hundreds of images right away? Is there any reason you can't load them in the background a little more slowly?
A few suggestions:
Reduce the filesize of your images (obvious but most important).
Don't run all the requests at once - queue the images and load them one at a time (most relevant to least relevant).
There is lots of good advice here already, but here are my 2 cents. One day I had to load about 1000 really small images. It was taking too long, so I used the FZip AS3 library. Worked like a charm.
The question does not provide a lot of specifics, so here are some general guidelines.
If the images are large JPEGs, you might be able to sacrifice some quality and recompress the images to make them smaller.
If the images are PNGs and represent, e.g., lots of small, individual tiles for a game, you could try combining many of them into a single image and then parsing them out into images on the client side after loading the image over the network.
- Implement a download queue, for example this simple loader: https://github.com/hydrotik/QueueLoader
- Do not load images that are currently off-stage (you can load them in the background).
- Use a texture packer if you have plenty of small images.
- Optimize picture quality (i.e. size, type).
- Use a cache, for instance a Local Shared Object, so next time you'll get your pics extremely fast from the LSO.
Is there any way to get thumbnails of the images made? If so, you could use the thumbnails while the larger versions load. Load all the thumbnails first (since they'd load really fast), then load the larger versions in prioritized order.
It sounds like you don't have a lot of control over the images, though. If you don't have any control there, there's very little you can do to speed up the process besides using a loading optimizer as others have suggested.

Get high quality images in the background of websites without compromising load time?

I am making a website where I want one huge image to cover the complete background. I have seen many websites where the images in the background look to be of very high quality and still the pages load really quickly. How?
For example, look at this link: http://g2geogeske.com/menus
How can I achieve images with quality like these without compromising the load time? Also, if anyone knows of any tutorial or site which explains this, please post the link. Thanks.
Even though it is so big, it's just 170 KB (the background in the link). My images are at about 700 KB and still look lower quality than the one in the link. Am I missing something? Some trick or anything?
You want to know how to compress images? Go download a decent image viewer (IrfanView) or editor (Paint.NET) and go to town! The background for the site you linked is a 171 KB JPEG file; that's peanuts for a half-decent internet connection.
Usually when you save an image as a JPEG, you have the option to specify a 'quality' setting. If you look at the background image on the site you linked, even though it's really big, it's only 170 KB. To me, the magic number for saving a JPEG is 90% quality.
ADD: Also, if you use IrfanView, you can add a plugin called 'RIOT' that compresses images even further (I think you can also get it as a stand-alone tool or something): http://luci.criosweb.ro/riot/

Sprites vs image slicing

I don't have much experience with the sprite approach to images (http://www.alistapart.com/articles/sprites). Anyone care to share some pros/cons of sprites vs. old-school slices?
The main advantage of sprites is that the browser has to request fewer pictures from the web server. That reduces the number of HTTP requests and makes it possible to compress the parts of the design more effectively. These two points also represent the disadvantages of sliced images.
Here you can see some good examples how sprites improve the loading speed of web pages:
http://css-tricks.com/css-sprites/
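As a small illustration (icons.png and the offsets are made-up values), every element shares one background image and simply shifts it into view with background-position:
.icon { background: url(icons.png) no-repeat; width: 16px; height: 16px; }
.icon-home { background-position: 0 0; }     /* first 16x16 tile in the sprite */
.icon-mail { background-position: -16px 0; } /* second tile, one tile to the left */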
Pros:
It's far easier on the server to serve a single large image than many small ones.
It's (slightly) faster for a web browser to load such an image.
Browsers only load images as they need them - if you are using multiple images in a rollover, the browser would "pause" the first time you roll over the element. This can be solved with sprites, because there is only one image to load.
Cons:
It's kind of a pain to code (more so than using multiple images at least)
One often overlooked downside of using CSS sprites is memory footprint:
https://web.archive.org/web/20130605000516/http://blog.vlad1.com/2009/06/22/to-sprite-or-not-to-sprite/
Unless the sprite image is carefully constructed, you end up with incredible amounts of wasted space. My favourite example is from WHIT TV's web site, where this image is used as a sprite. Note that this is a 1299×15,000 PNG. It compresses quite well - the actual download size is around 26K - but browsers don't render compressed image data. When this image is downloaded and decompressed, it will use almost 75MB in memory (1299 * 15000 * 4).
When sprites get loaded into the browser, they are stored uncompressed. A 26 KB file can uncompress to take up a whopping 75 MB of RAM. You should be mindful of using sprites with very large dimensions.
There's also the issue of what happens in browsers with poor CSS support (legacy browsers). The sprites may end up totally broken.
Sprites
Pros:
Fewer HTTP connections to the server
Faster loading on broadband
Cons:
No encapsulation: If you want to change one image, you have to change the sprite
It is difficult to set up individual images in CSS without a tool
Doesn't degrade: if the browser doesn't support CSS, you are in trouble
CSS Sprites:
Pros:
Graceful degradation in unsupported browsers (text can be shown when background images for links are not allowed)
Fewer HTTP requests
Each image carries separate overhead such as a color table, so image slicing has more overhead than CSS sprites
Cons:
Poses a problem if images are turned off in the browser (a rare case though)
Image slicing:
Pros:
The user perceives a faster load since the image is loaded piece by piece.
Images can be loaded on demand, e.g. when the user places the mouse over the image
Cons:
The web pages might have a large size on the client side even though that might not be the case on the server side.
The main drawback of sprites is that they make your CSS hard to read, maintain and modify. It can be difficult to remember the exact pixel offsets within the sprite.
Pros of using sprites:
Since one image is used for everything, it puts less load on the HTTP server.
Cons:
- Hard to code: you must know the coordinates of each image inside the sprite so you can display it correctly. Once you change the size of an image, you need to adjust them all...
- A big image can make the page take a long time to display, while with individual images a user on a slow internet connection can see them appear one by one.
Best practice:
Use them, for example, for rollover images.
Look into using a CSS sprite generator (we use SmartSprites). That way you can do slices locally, and have your build process generate a spritemap. It's the best of both worlds.
Also, if SmartSprites isn't for you, there are definitely others; however, I like it because it reduces the amount of work up front AND during changes.
Cons
- Slower on older browsers / the hover effect may not work on them (Opera 6)
- If not used correctly they can get very/too huge (group them adequately!)
- Tedious work to set them up
Pros
- Fewer bytes transferred, because one big image is smaller than all the individual images combined (one header / color table)
- Fewer HTTP requests
I prefer the middle ground of grouping similar images (normal, hover, selected page, the parent page of the selected page) rather than having all the images in one file. To make these, you slice images as normal in Photoshop or Illustrator, then open the files and combine them with a shortcut key. I wrote a Photoshop script that combines images into CSS sprites. You will have multiple HTTP connections, but won't have the load delay on hover.