I've got an HTML5 <video> element whose source is a .m3u8 (HLS stream).
I have an M3U8 playlist with three different renditions: 640x360, 960x540, and 1280x720.
On Desktops I have a Flash Player for playing the video, so the HTML5 fallback is only intended for mobile (iOS and Android) - I am doing all of my testing on an iPad and, once it's working, I will try it out on Android and hope everything works the same.
My goal is to, at any point in time, figure out what rendition the video element is playing. The rendition is subject to change as the user's bandwidth changes.
I tried using the .videoHeight property, but it always returns 480 regardless of the rendition being downloaded - which is particularly odd because 480 isn't even an option.
Does anyone know how I can figure out the rendition being downloaded?
Cleaning up some old questions that never received answers:
Unfortunately this one is just not possible. The HTML5 video spec and the HTML5 video implementations in most browsers are intended to abstract away all of the underlying magic involved in playing videos. You give it a source, it plays. Everything else is completely hidden and you have no access: no access to metadata channels, no access to audio channels, no access to bitrate or resolution information, and so on.
The best I could do was a solution that guesses which rendition is playing. Every 10 seconds a 1 MB file was loaded over AJAX, and I measured how quickly it downloaded to estimate the user's current bandwidth. I know that QuickTime will only play a rendition if you have double the required bandwidth, so if the 960x540 rendition requires 1400 kbit/s, it won't play unless you have 2800 kbit/s available.
It's not very good (and wastes 6 MB of bandwidth per minute) but it's better than nothing.
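A rough sketch of that hack, in case it helps anyone (the probe URL and the rendition bitrates are placeholders for your own values):

    var renditions = [
      { height: 360, kbps: 700 },   // bitrates here are made up - use your playlist's values
      { height: 540, kbps: 1400 },
      { height: 720, kbps: 2500 }
    ];

    function guessRendition(callback) {
      var start = Date.now();
      var xhr = new XMLHttpRequest();
      xhr.open('GET', '/probe-1mb.bin?cachebust=' + start, true);   // any ~1 MB uncacheable file
      xhr.responseType = 'arraybuffer';
      xhr.onload = function () {
        var seconds = (Date.now() - start) / 1000;
        var kbps = (xhr.response.byteLength * 8 / 1000) / seconds;
        // QuickTime only plays a rendition when roughly double its bitrate is available.
        var best = renditions[0];
        renditions.forEach(function (r) {
          if (kbps >= r.kbps * 2) { best = r; }
        });
        callback(best);
      };
      xhr.send();
    }

    setInterval(function () {
      guessRendition(function (r) {
        console.log('Probably playing the ' + r.height + 'p rendition');
      });
    }, 10000);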
Related
I'm trying to ensure (to the extent possible) that an HTML5 video begins playing only when it is able to play through completely without buffering. For context, the MediaStream of the video is then used to mix with another audio source and sent over peer WebRTC connections. The videos are typically 5-10 MB and a few minutes long (i.e. a decent broadband connection should have no trouble loading the entire video well before it's done playing).
To achieve this, my code currently waits for the canplaythrough event on the video element and calls play() when it fires.
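Roughly like this (the element id is a placeholder):

    var video = document.getElementById('myVideo');
    video.addEventListener('canplaythrough', function onCanPlay() {
      video.removeEventListener('canplaythrough', onCanPlay);  // only start playback once
      video.play();
    });
    video.load();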
This "works" in the sense that the video begins playing and, in most cases, buffering is sufficient for the video to play through uninterrupted. But, in a few cases (specifically for two people so far that happen to both have been running Chrome on MacBook Airs and with apparently not incredible but decent broadband Internet connections) the video plays staggered and choppy---which I believe to mean the video has not sufficiently buffered.
Are there better techniques for ensuring that a video is sufficiently buffered on most browsers?
Would using fetch() to buffer the entire video in memory do the trick (see the sketch after these questions)? Or would the resulting blob() also be lazily buffered behind the scenes?
Are there good practices for testing and debugging these sorts of issues given that I can't really replicate this locally?
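Here is roughly what I mean by the fetch() idea (the URL and element id are placeholders):

    fetch('/videos/intro.mp4')
      .then(function (res) { return res.blob(); })      // resolves only once the whole body has downloaded
      .then(function (blob) {
        var video = document.getElementById('myVideo');
        video.src = URL.createObjectURL(blob);          // playback now reads from the local blob
        return video.play();
      })
      .catch(function (err) { console.error('Preloading failed:', err); });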
I have built a tool called Stream or Not that might help on the network side. It will tell you how long the video takes to start, how many stalls there were, and so on. You can also use your browser's DevTools to throttle the network (and, in Chrome, to throttle the CPU).
Honestly, to see whether the network is the issue: as long as the video's bitrate (check it with FFprobe, e.g. https://www.streamclarity.com/probe?url=) is lower than the network speed, you are not network constrained.
There is another possibility. What are the dimensions of your video, and what are the dimensions of the viewport in the browser? If you are asking the device to scale down a lot of pixels, the bottleneck can shift from bandwidth to CPU processing speed. I have seen this happen on mobile devices and on older Macs trying to play 4K videos - there just isn't enough CPU to process that many pixels.
I'd test the network speeds, just to be sure.
Make sure you are not sending more pixels than you need. Underpowered devices will have issues.
I noticed that on YouTube, for the same video (the same video ID) at the same resolution, like 360p, the HTML5 videos are much larger in size than the Flash videos.
Is there any relationship between video size in HTML5 and Flash?
Thanks.
This really depends on what codec you are using and what the settings of the video are (bitrate, frame rate); codecs usually also have their own adjustments. Flash and HTML5 can both play H.264 video with the same profile and same settings. In that case the sizes would be identical (since it would be the same file), but not all browsers and operating systems support all codecs. For instance, H.264 is proprietary, and Firefox doesn't support it.
Theora (with Vorbis audio), the other codec, is supported by Firefox. On average it uses somewhat more conservative compression, which perhaps gives slightly better quality at the cost of size - i.e. the files are larger, and maybe a little sharper or more detailed, though honestly I can't tell the difference just from watching.
So, while the files don't have to differ in size, for the sake of compatibility you would need at least two copies of your video encoded in different formats if you want to use the HTML5 video tag.
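For example, the usual pattern is to offer the same clip in two containers and let the browser pick whichever it can decode (file names here are placeholders):

    var video = document.createElement('video');
    video.controls = true;

    var mp4 = document.createElement('source');
    mp4.src = 'movie.mp4';          // H.264
    mp4.type = 'video/mp4';

    var ogg = document.createElement('source');
    ogg.src = 'movie.ogv';          // Theora/Vorbis for Firefox and Opera
    ogg.type = 'video/ogg';

    video.appendChild(mp4);         // the browser tries the sources in order
    video.appendChild(ogg);
    document.body.appendChild(video);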
I have two different videos, both (as far as I know) captured in generally the same manner, that I'm trying to play using an HTML5 video tag in Chrome. Both videos open and play perfectly in VLC, so I don't think there's any issue with a corrupted file, and both are MP4s encoded with H.264, using the YUV color space. However, when I try to play one in Chrome (Version 21.0.1180.89) it gives me a grayed-out play button, while the other works perfectly. For reference, my OS is Ubuntu 10.10, although I've seen the same problem in newer versions of the OS. This happens whether I'm loading the video into the HTML5 tag or navigating directly to the URL where the video is stored. I'm somewhat at a loss here; does anyone know what direction I should go to find out what the significant differences are between the two videos?
Edit:
This one works: https://dl.dropbox.com/u/100841270/1_G101_20120914_0139PM_Course_101.mp4
This one does not: https://dl.dropbox.com/u/100841270/1_G101_20120914_1156AM_Course_101.mp4
Update:
It appears to have nothing to do with OS, since I've seen the same problem in both Windows and Linux. Chrome 22 beta in Ubuntu didn't seem to work either.
We had this problem and found that encoding the files in accordance with the iPhone webview's requirements produced files that played fine in Chrome. Chrome and the iPhone webview share the same rendering engine, and it appears they have similar HTML5 video requirements.
Not all H.264-encoded MP4 files are supported by Chrome, and slight differences in the encoding process can produce videos that do not work. Even if the EXACT same encoding settings were used, H.264 encoders typically use a variable bitrate, so different source videos may still exceed bitrate limits.
The encoding settings that were successful for us were:
Only use the H.264 Baseline Profile Level 3.0
Resolution below 640 x 480 and framerate up to 30 fps
B frames are not supported in the Baseline profile.
Bitrate limit of 900 kbit/s.
Here is the reference we used to arrive at those settings. Likely not all of these are required for Chrome, but we stuck to these rules and found that all videos worked on both platforms. Further research could likely determine the exact setting that is/was causing Chrome to not play the video.
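If it helps, you can sanity-check codec support from the console; the codec string below is the Baseline profile, level 3.0 variant (adjust it to match your encode), and the error listener will tell you why a given file refuses to play:

    var video = document.createElement('video');
    // Returns "", "maybe" or "probably"
    console.log(video.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"'));

    video.addEventListener('error', function () {
      // 3 = MEDIA_ERR_DECODE, 4 = MEDIA_ERR_SRC_NOT_SUPPORTED
      console.log('MediaError code:', video.error && video.error.code);
    });
    video.src = 'your-video.mp4';   // placeholder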
I am running Windows XP, and Chrome doesn't like the second one either.
My best guess of the cause is that, the working one is only 6.4 MB, but the other one is about 21.7 MB. Chrome might just be refusing to directly play a video that big. Have you tried having YouTube host it, and embedding their player into your site? That may solve the problem. (If you are worried about random strangers watching the videos, why did you post them here? Why would anybody even want to watch them? Though, you can make videos private on YouTube, in case these are just two videos that demonstrate the same problem you are facing with the real videos.)
That may also be compounded by a different problem that exists with both videos, manifested when I try to use Windows' built-in player. Both videos appear distorted when I use my computer's video player, stretched like 300% horizontally.
Are there other videos you have that fail in exactly the same way? Since these are only test videos for the real thing, if this is the only video with that problem, I would not say that it really is a problem unless it recurs. The dysfunctional video may have just run into that one-in-a-million chance that it has just the right contents for it to not work.
Check out this space shooter demo.
The HTML5 audio is perfect on Chrome 18 and Firefox 10. There is no lag in playing sounds and each sample plays perfectly. The last time I tried to play sounds using HTML5 audio and JavaScript I couldn't get a sound to play more than once.
What sorcery is Scirra doing to make this so perfect?
I'm the developer of Construct 2, so I hope I'm sufficiently qualified to answer your question :)
HTML5 audio is indeed a mess, so I've gone to considerable lengths to try and make it bulletproof in Construct 2. Here's an outline of what I've done:
Use the Web Audio API
HTML5 audio appears designed for streaming music, so an HTML5 Audio object is a fairly heavyweight object. Playing 10 sounds a second with it, like Space Blaster does, can easily seize up the browser. On the other hand, the Web Audio API is a high-performance audio engine with routing, effects, and lightweight sound playback. It's perfect for games. Audio buffers and audio playback are separated, so you can have one data buffer and efficiently play it many times simultaneously, whereas some browsers are so buggy that if you play an HTML5 sound a few times, it re-downloads it each time! Since the Web Audio API was actually designed for games and the like, you can happily play back tonnes of sound for ages and it will still hum along nicely. It can also use HTML5 audio as a sound source, although I only use HTML5 audio for things the user has designated as music tracks (since that's where you'd prefer streaming - typically everything else in the Web Audio API is fully downloaded before playing).
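A minimal sketch of that buffer/source split, using the current AudioContext API (the file name is a placeholder; back then you'd have used webkitAudioContext and noteOn() instead of start()):

    var ctx = new AudioContext();
    var laserBuffer = null;

    fetch('laser.wav')
      .then(function (res) { return res.arrayBuffer(); })
      .then(function (data) { return ctx.decodeAudioData(data); })
      .then(function (buffer) { laserBuffer = buffer; });

    function playLaser() {
      if (!laserBuffer) return;      // not decoded yet
      // Sources are cheap, throwaway objects; the decoded buffer is shared between them.
      var src = ctx.createBufferSource();
      src.buffer = laserBuffer;
      src.connect(ctx.destination);
      src.start(0);
    }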
The Web Audio API is supported in Chrome, has also made it into iOS 6+ (although it's muted until you try to play some audio in a touch event), Firefox is working on support, and it should be coming soon to Chrome for Android. So on these platforms audio will be significantly more reliable.
More info on HTML5Rocks and the proposed spec - you'll have to use the spec as the documentation for now, there's not much else out there.
Other browsers: implement an audio recycling system
The Web Audio API isn't yet supported everywhere, notably IE, which means you still need to crowbar HTML5 audio into something that might work for games, for backwards compatibility. The way to do this is to recycle audio objects.
The player's laser in Space Blaster fires 10 times a second - and that's not including any other sound effects! As I mentioned earlier, Audio is kind of a heavyweight object, so if you're doing new Audio() 10+ times a second, lo and behold, the browser eventually dies and audio starts glitching up. However, you can drastically reduce the number of Audio objects created by recycling them.
Basically, for each sound effect, keep a cache of every Audio object you've created with that sound as a source. Then, when playing a new sound, search the cache for any sound effects which have finished playing (the ended property will be true). If you find one, rewind it back to the beginning (currentTime = 0) and play() again. Otherwise, create a new Audio() object in the cache.
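In code, the recycling idea looks roughly like this (names are my own, not Construct 2's):

    var audioCache = {};   // sound URL -> Audio objects created so far

    function playSound(url) {
      var cache = audioCache[url] || (audioCache[url] = []);
      // Reuse any instance that has finished playing.
      for (var i = 0; i < cache.length; i++) {
        if (cache[i].ended) {
          cache[i].currentTime = 0;   // rewind to the start
          cache[i].play();
          return;
        }
      }
      // Nothing free: create a new Audio object and keep it for next time.
      var a = new Audio(url);
      cache.push(a);
      a.play();
    }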
Since the player's laser sound effect is short, instead of creating 600 Audio objects a minute, there will just be 3 or 4 that it keeps cycling round. Some browsers unfortunately will still download it 4 times (Safari did this last I checked!) or have high latency the first time each sound buffer is played, but eventually the browser catches up since the same buffers are always being reused. So basically sound might be a bit weird for a few moments, then it clears up. We also use the HTML5 app cache so next time you play everything loads from disk, so subsequent plays should perform well immediately.
That's basically it. It's still a little dodgy on the first play with HTML5 audio, but every time after that should be fairly solid providing the browser has a sane audio implementation. There are a number of ways to try to clone Audio objects, but I've found that rewinding existing Audios works best.
There's no SoundManager or any Flash/plugin-based fallbacks at all since we make a point of being pure HTML5.
We also support audio APIs provided by PhoneGap and appMobi for mobile, since HTML5 audio on mobile isn't even worth trying. That makes a total of four audio APIs our audio engine wraps, and yes, it does look like a frankenstein mess, but it works.
That's it. I suppose our competitors will read this, but who cares when there's SO rep to be had???!!!1111
I have pages where I need to play dozens of small audio files when the user clicks on things. Responsiveness is very important.
I'm thinking of using one <audio> element for each file and preloading the audio files. Is this a reasonable approach?
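Something along these lines (file names and the button id are just examples):

    var clips = {};
    ['click.mp3', 'pop.mp3', 'ding.mp3'].forEach(function (name) {
      var a = new Audio('sounds/' + name);
      a.preload = 'auto';   // hint the browser to fetch the file up front
      a.load();
      clips[name] = a;
    });

    document.getElementById('someButton').addEventListener('click', function () {
      clips['click.mp3'].currentTime = 0;  // rewind in case it has already played
      clips['click.mp3'].play();
    });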
Thanks.
What I experience using SoundManager2 (an audio JavaScript lib) is that neither Chrome nor Firefox has issues loading and playing multiple (100+) sounds through their HTML5 capabilities
(Firefox must play OGG though).
With IE9 it's a different story. It looks like it has a limit of no more than 40 sounds loaded and played. :-(
As the game we're developing constantly requires 50+ sounds played within a one-minute period, we have to fall back to Flash for playing sounds on IE9.. luckily SM2 does that too.
I can also confirm this behaviour in HTML5 mode using jPlayer. I'm only able to create 40 instances of jPlayer, each of which can preload and play the sound it defines.
The 41st and subsequent instances fail with an error on IE9/Windows 7:
Error: "Media URL could not be loaded"
It's reasonable, and probably the correct solution. I recently wrote a demo application (http://www.soundscribe.com) that makes heavy use of individual (and simultaneous) audio clips in HTML5. IE9 and FF3/4 handle it well. Chrome has some issues that seem to be specifically related to simultaneous playback (which probably won't apply to your app). The biggest block I hit was in IE9, which seems to have a mysterious limit on the number of audio objects that can exist at once. The max is about 40, after which IE9 will silently fail to download the file. FF and Chrome both try to support an unlimited number.
The alternative approach of putting all the audio in a single file and changing the offset to play is a bad choice for several reasons. It's much more complicated to code, you need to keep up with additional metadata (where does the clip start, how long is it), and it's likely to work slightly different between browsers. And the worst part, there's really no way to know when your clip is fully loaded. You can only tell when the clip "can play through", which is determined by the browser based on the size of the audio file and the current download rate. This means that even after the browser reports the audio clip is ready, you may not be able to play a clip somewhere near the end.
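For completeness, here's a rough sketch of the single-file ("audio sprite") approach I'm advising against, just to show the extra bookkeeping involved (all timings are made up):

    var sprite = new Audio('all-sounds.mp3');
    var clips = {
      laser:     { start: 0.0, duration: 0.4 },
      explosion: { start: 0.5, duration: 1.2 }
    };

    function playClip(name) {
      var clip = clips[name];
      sprite.currentTime = clip.start;
      sprite.play();
      // Stop when the slice has played; a timer is the simplest (and imprecise) way.
      setTimeout(function () { sprite.pause(); }, clip.duration * 1000);
    }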
It seems like a reasonable approach. However, you need to consider a couple of things.
Each sound clip will need to be held in memory. While this will not matter in most cases, users with a lot of tabs open, multiple programs running, or old computers may see their machines slow down, especially if the sound files are large.
From a usability point of view, if I hear a sound every time I click a button on the site, I'll leave immediately.