I need to use a video tag to serve over 3GB of video on the web.
When the page is loaded, it takes a long time for the media element to receive the 'loadedmetadata event'.
I've found that the size of the moov box is too large (33MB).
So I re-encoded it with the 'empty_moov+frag_keyframe' option of ffmpeg, but it then also took a long time to fetch all of the fragment information, as seen in the 'Inspector - Network' tab in Chrome.
Is there a way to speed up loading when playing 'fragmented mp4' with html5 video tag?
You don't mention what protocol you are using to deliver the video to the browser, but fragmented MP4 is usually delivered with an ABR (Adaptive Bit Rate) streaming protocol. The most commonly used ABR protocols at the time of writing are probably HLS and DASH.
Using ABR allows the client to start at a lower bit rate and hence speed up initial playback - it can then step up through different quality levels to reach the optimal quality for the particular device and the current network conditions.
You can see this effect with large streaming services, where the video quality will be noticeably lower at start up and then improve after 10-20 seconds. See more info in this answer:
https://stackoverflow.com/a/42365034/334402
Browsers generally don't support ABR natively with the HTML5 video tag - for this reason you will generally use a JavaScript-based player which uses the HTML5 MSE (Media Source Extensions) mechanism to support ABR. You can see open source examples such as:
http://dashif.org/reference/players/javascript/1.3.0/samples/dash-if-reference-player/index.html
https://shaka-player-demo.appspot.com/demo/#asset=//storage.googleapis.com/shaka-demo-assets/angel-one/dash.mpd;lang=en-US
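For illustration, a minimal sketch of wiring an MSE-based player (dash.js here, the same library behind the DASH-IF reference player above) to a plain HTML5 video element could look like this - the manifest URL is just a placeholder for your own DASH manifest:

```html
<!-- Sketch only: the .mpd URL is a placeholder for your own DASH manifest -->
<video id="videoPlayer" controls></video>
<script src="https://cdn.dashjs.org/latest/dash.all.min.js"></script>
<script>
  // dash.js uses Media Source Extensions under the hood and switches
  // between the bit rates declared in the DASH manifest as conditions change.
  var url = "https://example.com/video/manifest.mpd"; // placeholder
  var player = dashjs.MediaPlayer().create();
  // initialize(videoElement, source, autoPlay)
  player.initialize(document.querySelector("#videoPlayer"), url, true);
</script>
```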
Is there any way to make HTML video run off the internet and buffer when the internet connection is weak or slow, like the YouTube player does? The basic idea is to chop the video up and buffer it before the client clicks play. Can this be based on the internet connection and wifi? I will do anything possible to implement this feature!
Thank you.
IMHO, the video element does most of the buffering in the background (depending on the internet connection), and is usually quite good at it. From experience, it is more time-efficient as a developer to try to optimize the size/quality of the streaming video (using tools like Handbrake) rather than try to alter buffering depending on network variables.
That being said, if you are mainly targeting mobile devices, you could use the experimental Network Information API to find out in JavaScript what kind of mobile connection the device is using, and then direct them to a different (lower quality) version of the video if they are using 3G or slower.
Alternatively, if you just want the video to start preloading (not playing) as soon as the page loads, have a look at the preload feature on the html video element.
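As a rough sketch combining both ideas (the Network Information API is experimental and only exposed in some browsers, and the video file names below are placeholders):

```html
<!-- preload="auto" hints that the browser may start buffering immediately -->
<video id="player" controls preload="auto">
  <source src="video-high.mp4" type="video/mp4"> <!-- placeholder file -->
</video>
<script>
  // Experimental Network Information API: navigator.connection is only
  // available in some browsers (mainly Chromium-based / Android).
  var connection = navigator.connection || navigator.mozConnection || navigator.webkitConnection;
  var slowTypes = ["slow-2g", "2g", "3g"];
  if (connection && slowTypes.indexOf(connection.effectiveType) !== -1) {
    // On a slow mobile connection, point the element at a lower-quality file.
    document.querySelector("#player source").src = "video-low.mp4"; // placeholder
    document.querySelector("#player").load();
  }
</script>
```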
I notice (by looking at the Chromium network tab) an annoying phenomenon on many Web sites that include videos: they implement video playback by sending many small video chunks, while the user (behind the Web browser) is not aware of this and can even seek through the video and see the current "position" in the video's playback relative to the start and end.
What is the professional term/jargon used for this?
Some existing questions that talk about or mention such Web videos:
Play multi part video without interrupts (HTML5)
Chunk audio/video files for web
Are html5 streamed videos cacheable?
videojs: Download/stream video in chunks with quality selecton
How does video on demand work in Youtube?
The term you’re probably looking for is “Adaptive Streaming” or “Adaptive Bitrate Streaming”. Or maybe you are looking for the names of implementations like “DASH” or “HTTP Live Streaming”.
Also, it’s not an “annoying phenomenon”; it’s a technique that allows live streaming of media with an undetermined length, can adjust to each user’s internet connection without using expensive media servers, and can leverage existing CDNs and caching infrastructure.
I have big collection of CC music, which I want to stream.
I want adaptive streaming so that users with a slow internet connection don't have to wait for buffering every 5 seconds.
I have read about MPEG-DASH, HLS, etc., and it seems that MPEG-DASH supports only MP4 and TS containers, so I am not sure whether I can do what I want.
I have LC-AAC-320k and HE-AAC-64k files. Quality matters, and I hear the difference between HE-AAC and LC-AAC-320 even on realtek audio.
Is it possible to do adaptive streaming for these formats with support for Chrome, Firefox, Safari? If not, is there any way to detect low bandwidth (often buffering) and switch to HE-AAC?
I'd like to know how YouTube plays long-form videos so quickly, with seeking, on mobile.
This is an example video: https://www.youtube.com/watch?v=eyU3bRy2x44
I can load it just fine on mobile within 5-15 seconds and I can even seek through it.
Are they using HLS? Or are they using any other streaming technology? Are they using MP4 with highly optimized MOOV Atoms placed at the front of the files?
I'd like to know because I want to serve up long-form videos on one of my websites, and they take forever to load even if served from a CDN.
Thank you in advance!
Your videos should not really take a long time to load even with 'normal' HTTP streaming if the CDN is doing its job properly.
One possible problem might be the quality/bit rate of your videos - if they are only available in high quality or high bit rate then this will definitely cause a delay in initial playback.
Many (most?) YouTube videos now support multiple bit rates, which allows the client device to select the bit rate that is most appropriate for the current network conditions. This technique is called adaptive bit rate streaming, as you are likely aware given the reference to HLS above.
MPEG-DASH, as Aquary mentions, is an adaptive bit rate streaming format. It is designed to be an open standard - Apple's HLS, Microsoft's Smooth Streaming and Adobe's HTTP Dynamic Streaming are the other main adaptive bit rate formats.
For videos that support adaptive bit rate streaming the client will usually start up at a low or medium bit rate to ensure quick start up and then 'step up' to the highest bit rate the network will support once the video is playing. This helps fast startup. When you jump to the middle of a video the same approach is used to 'start' again from the point you have selected.
You can quite often see this if you look closely at a video when it starts up - the playback quality will improve after a short while as the video steps up through the bit rates to higher quality streams.
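If you want to observe that step-up behaviour yourself, one option (assuming your stream is packaged as HLS and played through the open-source hls.js library; the manifest URL is a placeholder) is to log each rendition switch:

```html
<video id="video" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  // hls.js implements HLS on top of Media Source Extensions and fires an
  // event every time it switches to a different rendition from the playlist.
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource("https://example.com/video/master.m3u8"); // placeholder
    hls.attachMedia(document.getElementById("video"));
    hls.on(Hls.Events.LEVEL_SWITCHED, function (event, data) {
      var level = hls.levels[data.level];
      console.log("Now playing " + level.height + "p @ " + level.bitrate + " bit/s");
    });
  }
</script>
```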
YouTube uses MPEG-DASH in HTML5 on devices that are capable of it. This allows seeking through the media and starting playback from the moment you select.
Traditional progressive download (AKA pseudo-streaming) is not a good option for long videos because, by default, media players try to download the entire video even if you stop playback. Seeking is also supported in progressive download, but your video must be prepared for it and your media server needs to be able to process seek requests properly.
I've got an HTML5 <video> element whose source is a .m3u8 (HLS stream)
I have an M3U8 with three different renditions: 640x360, 960x540, and 1280x720
On Desktops I have a Flash Player for playing the video, so the HTML5 fallback is only intended for mobile (iOS and Android) - I am doing all of my testing on an iPad and, once it's working, I will try it out on Android and hope everything works the same.
My goal is to, at any point in time, figure out what rendition the video element is playing. The rendition is subject to change as the user's bandwidth changes.
I tried using the .videoHeight property, but it always returns 480 regardless of the rendition being downloaded - which is particularly odd because 480 isn't even an option.
Does anyone know how I can figure out the rendition being downloaded?
Cleaning up some old questions that never received answers:
Unfortunately this one is just not possible. The HTML5 video spec and the HTML5 video implementations in most browsers are intended to abstract away all of the underlying magic involved in playing videos. You give it a source, it plays. Everything else is completely hidden and you have no access. No access to metadata channels, no access to audio channels, no access to bitrate and resolution information, ...
The best I could do was develop a solution to guess which resolution was playing. Every 10 seconds a 1 MB file was loaded over AJAX, and I measured the speed at which it downloaded to estimate the current bandwidth. I know that QuickTime will only play a rendition if you have double the required bandwidth. So if the 960x540 rendition requires 1400 kbit/s, it won't play unless you have 2800 kbit/s of bandwidth.
It's not very good (and wastes 6 MB of bandwidth per minute) but it's better than nothing.
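A rough sketch of that bandwidth-probing idea looks something like the following - the probe URL, the 720p bit rate threshold and the "double the bandwidth" rule are assumptions taken from the description above, not values you should rely on:

```js
// Guess the rendition by measuring the download speed of a ~1 MB probe file.
// PROBE_URL is a hypothetical file you would host yourself.
var PROBE_URL = "/probe-1mb.bin";
var PROBE_BYTES = 1024 * 1024;

function estimateBandwidth(callback) {
  var start = Date.now();
  var xhr = new XMLHttpRequest();
  // Cache-bust so we always measure a real network transfer.
  xhr.open("GET", PROBE_URL + "?cachebust=" + start, true);
  xhr.responseType = "arraybuffer";
  xhr.onload = function () {
    var seconds = (Date.now() - start) / 1000;
    var kbps = (PROBE_BYTES * 8) / 1000 / seconds; // kbit/s
    callback(kbps);
  };
  xhr.send();
}

// Every 10 seconds, guess which rendition QuickTime is likely playing,
// using the rule of thumb that it needs roughly double the rendition bit rate.
setInterval(function () {
  estimateBandwidth(function (kbps) {
    var guess = "640x360";
    if (kbps >= 2 * 1400) guess = "960x540";  // 1400 kbit/s rendition (from above)
    if (kbps >= 2 * 2500) guess = "1280x720"; // 2500 kbit/s is an assumed figure
    console.log("Probably playing " + guess + " (~" + Math.round(kbps) + " kbit/s measured)");
  });
}, 10000);
```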