Progressive JPEGs seem pretty useful for performance purposes. I've found https://github.com/gruntjs/grunt-contrib-imagemin, which has an option to make your JPEGs progressive. It uses http://libjpeg-turbo.virtualgl.org/ under the hood. This is great; however, my images still seem to load top-down.
Questions:
Do pjpegs have their own file extension?
Do pjpegs have their own mime types?
What is a good way to test that the pjpeg is doing what it needs to do?
Every JPEG library I have seen recently decodes progressive JPEG images. A progressive JPEG breaks each component into two or more scans (as opposed to one scan per component in a sequential JPEG).
In theory, an application can redisplay a progressive JPEG image after each scan. In a web browser, that effect would give you images that start out as 8x8 blocks and then get clearer.
In ye olde days of internet over 1200 baud serial lines that made a lot of sense.
Now, most images can be downloaded so fast that there is little need for a web browser to progressively display an image, even when it is a progressive JPEG.
To use progressive JPEGs, you just need an encoder that creates them.
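For example, with the grunt-contrib-imagemin plugin mentioned in the question, a configuration along these lines should re-encode JPEGs as progressive (a sketch only: the paths are hypothetical and option names can differ between plugin versions):

```js
// Gruntfile.js -- hedged sketch; source/destination paths are hypothetical
module.exports = function (grunt) {
  grunt.initConfig({
    imagemin: {
      dist: {
        options: {
          progressive: true            // re-encode JPEGs as progressive
        },
        files: [{
          expand: true,                // build the file list dynamically
          cwd: 'src/img/',             // source folder
          src: ['**/*.{jpg,jpeg}'],    // every JPEG under src/img/
          dest: 'dist/img/'            // write the optimized copies here
        }]
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-imagemin');
  grunt.registerTask('default', ['imagemin']);
};
```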
No, progressive JPEGs do not have their own extension; they use the same .jpg/.jpeg extensions as sequential JPEGs.
No, they are served with the same image/jpeg MIME type as any other JPEG.
Just display the image. If it is visible in your web browser, it is doing what it needs to do.
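If you want a quicker sanity check than eyeballing the page, one option (a sketch, not part of the original answer) is to look for the progressive start-of-frame marker, SOF2 (bytes FF C2); a baseline JPEG carries SOF0 (FF C0) instead:

```js
// check-progressive.js -- minimal Node.js sketch
// Progressive JPEGs contain an SOF2 marker (0xFF 0xC2);
// baseline JPEGs contain SOF0 (0xFF 0xC0) instead.
const fs = require('fs');

function isProgressiveJpeg(path) {
  const data = fs.readFileSync(path);
  for (let i = 0; i < data.length - 1; i++) {
    if (data[i] === 0xFF && data[i + 1] === 0xC2) return true;   // progressive
    if (data[i] === 0xFF && data[i + 1] === 0xC0) return false;  // baseline
  }
  return false;
}

console.log(isProgressiveJpeg(process.argv[2]) ? 'progressive' : 'baseline/unknown');
```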
I want to add a GIF of an app that I'm working on to the app's website. Previously, I used QuickTime to screen-record my computer and then used EZGIF to convert the video into a GIF. Unfortunately, in order to get the GIF to a reasonable size to embed on my website (~5MB), the quality goes completely down the drain (you can see the bad-quality GIF on the website now).
To show off enough functionality of the app to make it worth going on the homepage, the original video that I'm recording is ~45 seconds.
Are there other, better methods for recording / compressing GIFs for websites?
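One alternative worth considering (not something the original post mentions) is to skip the GIF format entirely and embed the recording as a short, muted, looping video, which typically compresses much better at the same quality; a minimal sketch, with hypothetical file names:

```html
<!-- Behaves like an animated GIF: autoplays, loops, silent, no controls -->
<video autoplay loop muted playsinline>
  <source src="app-demo.webm" type="video/webm">
  <source src="app-demo.mp4" type="video/mp4">
</video>
```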
Is there a way to control how much of the file will be buffered ahead once you click play, much like YouTube once did?
If you use the built-in video support in the browser then there is no way to control the amount of data that's being buffered. It depends on the browser implementation and there is no API to control it.
Browser implementations are quite good and typically a browser buffers just a small portion of the video before playback begins; browsers don't download the whole file. However, if a file gets completely downloaded by the browser before playback begins, then possible causes include:
Maybe the file is very small and the browser has decided to buffer the whole file in memory.
Maybe the header of the file (required to initialize the video decoder) is not at the start of the file but at the end and the browser has to download the whole file until it reaches it. This is uncommon nowadays but old video encoders used to place the mp4 header at the end of the file instead of at the beginning because it simplified the encoder's implementation.
I've noticed that browsers behave oddly if an mp4 file doesn't have a segment index (used for seeking) in its header. Some browsers download the whole file so that they can build a segment index themselves.
If your server is old/misconfigured then the browser may decide that range requests are not supported and download the whole file (or disable seeking).
If you need to control the buffered amount before playback begins, then you have to use a more sophisticated protocol (MPEG-DASH, HLS) and a JavaScript player that allows you to control this parameter. YouTube uses MPEG-DASH and has its own player that it has developed over the years.
I'm pretty sure you can't stop the HTML5 video from buffering the entire video, but here is a useful link for working out how much of the video has been buffered:
https://developer.mozilla.org/en-US/Apps/Build/Audio_and_video_delivery/buffering_seeking_time_ranges
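Building on that link, a minimal sketch of reading the buffered time ranges (the element ID is hypothetical):

```js
// Log the ranges the browser has downloaded so far for a <video> element.
const video = document.getElementById('myVideo');

video.addEventListener('progress', () => {
  const buffered = video.buffered;   // a TimeRanges object
  for (let i = 0; i < buffered.length; i++) {
    console.log(`range ${i}: ${buffered.start(i).toFixed(1)}s - ${buffered.end(i).toFixed(1)}s`);
  }
  if (buffered.length > 0) {
    // Seconds between the playhead and the end of the last buffered range.
    const ahead = buffered.end(buffered.length - 1) - video.currentTime;
    console.log(`~${ahead.toFixed(1)}s buffered ahead`);
  }
});
```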
Like mylescc mentioned, preventing the video element from buffering might not be possible without a workaround (described in a similar posting). However, depending on your use case, you can also make use of existing player implementations which provide this functionality, like the Bitmovin player, dash.js, etc.
On my website, I intend to stream background audio using the HTML5 <audio> tag. However, even after cutting down on the track length, my two files (MP3 and OGG Vorbis, for different browsers) end up at just short of 5MB apiece.
Because of this, it would be nice to conserve loading time and bandwidth by caching the files. What I would like to know, but can't seem to find, is whether it's possible to force the files to be cached, or whether browsers would normally cache the files at all.
Thanks for your input!
You cannot force caching. The browser treats these files as standard resources, so make sure that your server is properly configured to make caching as likely as possible (returns valid ETag, Expires and Cache-Control headers, no cache-busting Pragma header, 304 Not Modified responses, etc.). HTML5 local storage can be used (but is not worth the effort) to cache small items like pictures.
Mobile browsers have such a small cache that even this doesn't help; they will flush the cache pretty soon.
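For reference, a cache-friendly response for one of the audio files might carry headers along these lines (the values are purely illustrative):

```http
HTTP/1.1 200 OK
Content-Type: audio/mpeg
Content-Length: 4980736
Cache-Control: public, max-age=31536000
ETag: "5d8c72a5edda8"
Last-Modified: Tue, 01 Mar 2016 07:28:00 GMT
```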
Fast forward 7 years, and you can now easily do this with a Service Worker. You could even get crafty and cache/combine various Range requests if you wanted.
https://developers.google.com/web/ilt/pwa/caching-files-with-service-worker
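A minimal sketch of that approach (cache name and file paths are hypothetical; note that range requests issued by the audio element can complicate matching cached responses):

```js
// sw.js -- cache the audio files at install time, serve them cache-first afterwards
const CACHE_NAME = 'audio-cache-v1';
const AUDIO_FILES = ['/audio/background.mp3', '/audio/background.ogg'];

self.addEventListener('install', event => {
  event.waitUntil(
    caches.open(CACHE_NAME).then(cache => cache.addAll(AUDIO_FILES))
  );
});

self.addEventListener('fetch', event => {
  event.respondWith(
    caches.match(event.request).then(cached => cached || fetch(event.request))
  );
});
```

The worker would then be registered from the page with navigator.serviceWorker.register('/sw.js').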
I have pages where I need to play dozens of small audio files when the user clicks on things. Responsiveness is very important.
I'm thinking of using one audio element for each file and preloading the audio files. Is this a reasonable approach?
Thanks.
In my experience using SoundManager2 (a JavaScript audio library), neither Chrome nor Firefox has any issues loading and playing multiple (100+) sounds through their HTML5 capabilities
(Firefox must play OGG though)
With IE9 it's a different story. It looks like it has a limit of loading and playing no more than 40 sounds. :-(
As the game we develop constantly requires 50+ sounds to be played within a one-minute period, we have to fall back to Flash for playing sounds on IE9. Luckily, SM2 does that too.
I can also confirm this behaviour in HTML5 mode using jPlayer. I'm only able to create 40 instances of jPlayer; each can preload and play the sound that it defines.
The 41st and following instances will fail with an error on IE9/Windows 7:
Error: "Media URL could not be loaded"
It's reasonable, and probably the correct solution. I recently wrote a demo application (http://www.soundscribe.com) that makes heavy use of individual (and simultaneous) audio clips in HTML5. IE9 and FF3/4 handle it well. Chrome has some issues that seem to be specifically related to simultaneous playback (which probably won't apply to your app). The biggest block I hit was in IE9, which seems to have a mysterious limit on the number of audio objects that can exist at once. The max is about 40, after which IE9 will silently fail to download the file. FF and Chrome both try to support an unlimited number.
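For what it's worth, here is a bare-bones sketch of the per-file approach being discussed (file names and the button ID are hypothetical, and the syntax is modern JavaScript rather than anything IE9-era):

```js
// Preload one Audio object per clip up front, then play instantly on click.
const clipUrls = ['sounds/click.mp3', 'sounds/success.mp3', 'sounds/error.mp3'];
const clips = new Map();

for (const url of clipUrls) {
  const audio = new Audio(url);
  audio.preload = 'auto';   // ask the browser to fetch the whole file now
  audio.load();
  clips.set(url, audio);
}

function playClip(url) {
  const audio = clips.get(url);
  if (!audio) return;
  audio.currentTime = 0;    // rewind so rapid repeat clicks restart the clip
  audio.play();
}

document.getElementById('save-button').addEventListener('click', () => playClip('sounds/click.mp3'));
```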
The alternative approach of putting all the audio in a single file and changing the offset to play is a bad choice for several reasons. It's much more complicated to code, you need to keep up with additional metadata (where does the clip start, how long is it), and it's likely to work slightly differently between browsers. And the worst part: there's really no way to know when your clip is fully loaded. You can only tell when the clip "can play through", which is determined by the browser based on the size of the audio file and the current download rate. This means that even after the browser reports the audio clip is ready, you may not be able to play a clip somewhere near the end.
It seems like a reasonable approach. However, you need to consider a couple of things.
Each sound clip will need to be held in memory. While this will not matter in most cases, users with a lot of tabs open, multiple programs running, or old computers may see their machine slow down, especially if the sound files are large.
From a usability point of view, if I hear a sound every time I click a button on the site, I'll leave immediately.
I have been looking into the HTML5 cache manifest, but I am unclear as to whether or not there is a file size limit for caching using the manifest.
For example, if I wanted to make several audio files available offline, would this be achieved using the manifest? Or is it really only for small images and text?
As far as I know, the spec doesn't specify a maximum size for an object or for the entire cache, but Firefox, for example, has a preference which by default allows a total of 50 MB worth of cached files. That implies that the cache is indeed optimized for small files (HTML, CSS, JS, images) and not for big files (video, audio, ...).
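For reference, a minimal manifest listing the audio files might look like this (the paths are hypothetical); whether the browser actually keeps everything offline is still subject to the quota behaviour described above:

```
CACHE MANIFEST
# v1 - change this comment to force the browser to re-download the cached files

CACHE:
audio/track1.mp3
audio/track1.ogg

NETWORK:
*
```

The page opts in by pointing at the manifest from its root element, e.g. <html manifest="offline.appcache">.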