If I want to show a thumbnail view of an image and show the actual-size image when the thumbnail is clicked, what is the best-practice/standard way? I mean:
1- Keep two files in local storage (say 800x600 and 120x160) and load them separately
2- Keep only the full-size image (say 800x600) and show the thumbnail by resizing it with HTML attributes:
width="160" height="120"
The best practice is to use two files, i.e.
1- Keep two files in local storage (say 800x600 and 120x160) and load them separately,
since in-browser resizing is computationally expensive and sending the full-size file wastes bandwidth.
Ideally, each thumbnail is compressed to about 10-30 KB (JPEG) and has alt text. That way you can show 20 or so in a table and they stay easy to see, like here: http://www.flickr.com/explore/interesting/7days/
And as you likely know, there are thumbnail creators, like this one.
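If you script the generation yourself instead, a minimal sketch might look like this, assuming Node.js with the sharp library (the file names are made up):

// Hypothetical one-off step: generate a 160x120 thumbnail alongside the original.
import sharp from "sharp";

async function makeThumbnail(src: string, dest: string): Promise<void> {
  await sharp(src)
    .resize(160, 120, { fit: "cover" }) // crop to fill 160x120 exactly
    .jpeg({ quality: 70 })              // keeps most thumbnails around 10-30 KB
    .toFile(dest);
}

makeThumbnail("photos/cat-800x600.jpg", "photos/cat-160x120.jpg")
  .catch((err) => console.error("thumbnail failed:", err));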
Option 1 - you save bandwidth, and you also spare people with download limits (such as on their phones) from eating into their allowance. The only thing you give up (assuming you automate the process) is hard drive space.
I think it is best to keep two files. The user can then load the few big images they actually want rather than all of them, so loading every full-size image up front is unnecessary.
This can be very important on phones and similar devices.
The best approach is to keep the high-resolution version on the server and resize the image dynamically based on what each request needs; if you want, you can cache the resized file on the server side. If you are using C#, this sample shows how to resize dynamically (a rough sketch of the same idea in another stack follows below).
The next-best option is to keep separate files on the server.
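As an illustration only, the resize-and-cache idea sketched in Node.js/TypeScript with Express and sharp might look like this (the paths and the width whitelist are assumptions, and both directories are assumed to exist):

import express from "express";
import sharp from "sharp";
import fs from "fs";
import path from "path";

const app = express();
const SOURCE_DIR = "originals";
const CACHE_DIR = "cache";
const ALLOWED_WIDTHS = [160, 400, 800]; // whitelist so the cache cannot grow without bound

app.get("/img/:width/:name", async (req, res) => {
  const width = Number(req.params.width);
  const name = path.basename(req.params.name); // defuse directory traversal
  if (!ALLOWED_WIDTHS.includes(width)) {
    res.status(400).send("unsupported width");
    return;
  }
  const cached = path.join(CACHE_DIR, `${width}-${name}`);
  try {
    if (!fs.existsSync(cached)) {
      await sharp(path.join(SOURCE_DIR, name)).resize({ width }).toFile(cached);
    }
    res.sendFile(path.resolve(cached)); // later requests hit the cached copy directly
  } catch {
    res.status(404).send("image not found");
  }
});

app.listen(3000);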
I am trying to find the best way to load many (say 100) images onto a web page. Some images are large background images for a parallax effect; others are textures on cubes (all sides of the cubes are made from divs).
What I want to know is which approach is best. So far I have a few ideas:
Method A
1. Lazy-load all the images and background images when they appear.
2. Use a sprite sheet for the smaller images.
3. Compress all images as much as possible and use gzip.
Method B (experimental)
1. Create a separate style sheet with a variable for each image converted to base64 (e.g. :root{--boxImage1:url("data:image/jpeg;base64,/9j/4AAQSkZJRgABA.....)
2. Create a service worker for the CSS, as the file would become rather large.
3. Use minification to shrink the CSS file.
4. Use gzip to transfer it from the server.
Method C
Store the images as binary data in a database.
(I worry this would actually make them slower, as it would mean querying the database rather than reading from the file system.)
Are any of these methods worth doing, or is there a simpler way that I should use?
I'm currently working on a PWA with a lot of user-uploaded images, and I can confidently say that using your Method A did wonders for our app's load time.
Compressing images on upload (and storing them as base64 strings) drastically reduces their size, and combined with lazy-loading 5 images at a time, the app is much faster.
Since you have the service-worker tag on your question, I would also recommend precaching at least the initial batch of images for an even faster homepage load.
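For reference, Method A's lazy loading takes only a few lines of browser-side TypeScript with IntersectionObserver; this sketch assumes markup like <img class="lazy" src="placeholder.gif" data-src="real.jpg">:

const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (!entry.isIntersecting) continue;
    const img = entry.target as HTMLImageElement;
    img.src = img.dataset.src!; // swap in the real image once it scrolls into view
    obs.unobserve(img);         // each image only needs loading once
  }
});

document.querySelectorAll<HTMLImageElement>("img.lazy")
  .forEach((img) => observer.observe(img));

Modern browsers also support loading="lazy" natively on <img>, and the precaching mentioned above is typically a cache.addAll([...]) call in the service worker's install handler.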
I have a web page which loads almost 70-80 images. I have already implemented lazy loading. Should I keep these images on my own server, or on an image site like Picasa and reference them from there? Or is there a better option? What would be the best way to load this many images without losing performance and efficiency?
Please suggest.
One more question - is it a good idea to add CSS animation effects to those images?
There are a number of things you can do.
Use a tool like Smush.it (www.smushit.com) to optimise images and reduce loading times.
Make sure that you are not forcing the browser to resize large images by setting incorrect widths and heights.
Serve images from a subdomain on your server, so the browser can open more parallel connections.
Combine images used as backgrounds/display elements into a single image (known as a sprite), and use CSS to show only specific areas of that single image - it is then downloaded once, reducing HTTP requests.
You can get some advice on using Sprites here: http://css-tricks.com/css-sprites/
You can also get some more info on page loading times, and things you can do to optimise by using GTMetrix tools at http://gtmetrix.com/
I am putting some photos on my website and I do not know which syntax will load them quicker. My photos are usually 4300x3000 or 3000x4300 (which is 7-10 MB per photo). I am using:
#image {
max-height:500px;
max-width:750px;
}
When I viewed my website on a low-end PC, it took a long time to load. I do not want to use a fixed height and width because I could have photos as big as 2500x2500, and that would cause a mess. What should I do to reduce the load time? Is there something like an "autoresize" that will keep the height-to-width ratio?
Compression
You should compress the images using external software (if you are not using any language apart from HTML and CSS). I would recommend Photoshop or GIMP.
Reducing the image weight is the only way to improve the load time: a forced CSS resize still downloads the same amount of data from the server.
Improving the quality of resized images:
If you are interested in improving the quality of the resized images, you can take a look at this article:
http://articles.tutorboy.com/2010/09/13/resize-images-in-same-quality/
Auto-resizable background
Loading a 4,000-pixel image is not common practice even on websites with a full-page background. What is usually done is to load a picture about 1800-2000 pixels wide and then adapt it to bigger or smaller monitors, preferably using CSS.
Here is an article about that:
http://css-tricks.com/perfect-full-page-background-image/
Responsive images:
You can also load a different image depending on predefined resolutions of your choice.
You will need multiple versions of each image, though.
More information about its use.
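As a sketch of that idea, the browser can pick the right version itself via the srcset attribute; here built in TypeScript, where the width variants (480/960/1800) and the file-naming scheme are assumptions:

function responsiveImage(base: string, alt: string): HTMLImageElement {
  const img = document.createElement("img");
  img.srcset = [480, 960, 1800]
    .map((w) => `${base}-${w}.jpg ${w}w`)
    .join(", "); // e.g. "photos/beach-480.jpg 480w, photos/beach-960.jpg 960w, ..."
  img.sizes = "(max-width: 750px) 100vw, 750px"; // matches the 750px max-width above
  img.src = `${base}-960.jpg`; // fallback for browsers without srcset support
  img.alt = alt;
  return img;
}

document.body.appendChild(responsiveImage("photos/beach", "Beach at sunset"));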
My photos are usually 4300x3000 or 3000x4300 (which is 7-10 MB per photo).
It has little or nothing to do with max-height versus height. The performance hit comes from the original size of the image, which forces the browser to:
download a large file
run a scaling algorithm against an enormous number of pixels
What should I do to reduce the load time? Is there something like an autoresize that will keep the height to width ratio?
Create one or more smaller versions of the file when you upload it, and serve the small version. You can optionally open the original version when the user clicks on the small image.
You can create one or more copies of the file manually and upload them with different filenames.
A more elegant solution is to create one or more copies of the file programmatically (you didn't indicate server technology, but there are many options available). For example, Facebook creates six copies of each image you upload so that they can rapidly serve them in different places with the correct size.
Whether or not you do it automatically or manually, you may choose to adjust/crop the image to achieve the desired aspect ratio.
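A rough sketch of that programmatic approach, assuming Node.js with the sharp library (the target widths are made up, not Facebook's actual sizes):

import sharp from "sharp";

const TARGET_WIDTHS = [160, 480, 960, 1800];

// Generate one resized copy per target width when a file is uploaded.
async function makeVariants(original: string, baseName: string): Promise<void> {
  await Promise.all(
    TARGET_WIDTHS.map((w) =>
      sharp(original)
        .resize({ width: w, withoutEnlargement: true }) // never upscale small originals
        .jpeg({ quality: 80 })
        .toFile(`uploads/${baseName}-${w}.jpg`)
    )
  );
}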
You should resize your images and load the resized copies instead if you want quicker loading. You can keep both large and small versions on disk and only load the large image when the user clicks the link.
To resolve the loading time
You have to compress your photos before uploading them to the server. Use the export-to-web feature in Photoshop and make sure the image size is reasonable (I would say never more than 1 MB). You can also use image-optimisation software (on Mac I would recommend JPEGmini).
You can, if you wish, keep your large images in a folder on your site and link to them (so that visitors can download them if you allow this).
To resolve the ratio issue (square vs rectangle)
You can just use one of the properties and CSS will calculate the other. For example, if you set only
#image{
width:750px;
}
This will resolve the matter of things "getting messed up" if you mix rectangular images with square ones. Good luck!
I have a slideshow which will display large 400px x 1000px images. I need thumbnails of each image in the slideshow, so I was wondering: will the page load faster if I simply use a second instance of the image but lower its width and height - and will the page then load the large image twice? Or would the page load faster if I load the HD image and a separate smaller thumbnail?
Generally, I would advise against using images that are larger than the intended use. I would, personally, definitely recommend serving a properly sized image. These recommendations are based less on page load time (as the difference in your case is probably negligible) and more about best practice.
A very handy PHP utility I've used a lot for things like this is TimThumb.
Generally it's bad practice to build an image gallery of any kind from full-size images alone, merely lowering their dimensions when using them as thumbnails. If your case is a small gallery with < 30 images displayed per page, this might do. Otherwise, consider that the page has to preload all of the big images whether or not the user plays the whole slideshow. This slows the page load and generates unwanted data transfer. Timing-wise, it is still faster to generate thumbnails. A good tool for that is phpThumb, which can generate any image size from the original and cache the output, speeding up the page load next time.
I have a web app that lets users upload images. The images then appear as thumbnails on the app, and when a user clicks on the thumbnail it shows the full size image. Pretty standard.
Right now I'm storing the images in a MySQL database, and have the entire app running on JBoss. To display the thumbnails, I have an IMG tag that points to the full-size image, and I just set the width and height to 50px. This loads the image out of the DB, sends the entire thing to the client, and then resizes it in the browser. Obviously this is a big waste of resources and makes the pages take a long time to load, since there could be a lot of thumbnails on the page, so I'm trying to figure out the best way to improve it.
I found an apache module that resizes images on the fly. So I was thinking I could use that to send the thumbnails to the client, reducing the bandwidth and page load time. Since my images are stored in the database, is it possible and efficient for Apache to serve them? Should I be storing the images on the file system instead? Any suggestions on how I should handle this?
I think all the overhead of processing the request and pulling image data from the database is unnecessary.
Just handle the resizing at upload time, and serve both thumbnails and big images as static files directly via the web server.
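Serving then needs no application code beyond static file handlers; a minimal sketch with Express, where the directory names are assumptions and the thumbnails are written once at upload time (e.g. with sharp, as in the earlier sketches):

import express from "express";

const app = express();
app.use("/images", express.static("data/images")); // full-size originals
app.use("/thumbs", express.static("data/thumbs")); // thumbnails written at upload time

app.listen(8080);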
It depends on the number of images and their size - for 1 million 4 MB images I would prefer not to store them in the database, but for 100 images of 100 KB each, yes, the database could handle it.
You need to consider database backup times and data retrieval.
In one of my applications, which relied heavily on keeping database backups small and fast, I did not store images in the database.