I took an mp4 video, encoded it for HTTP Live Streaming (HLS) using ffmpeg — resulting in a series of myvideo.ts files and a myvideo.m3u8 playlist — and am attempting to play it using the HTML <video> tag in Safari, with the native HLS capabilities of that browser:
<video id="myvideo" src="myvideo.m3u8" loop="loop"></video>
It plays, once. But despite the "loop" attribute in the video tag, it doesn't loop. It stays frozen on the last frame of the video.
If I try to detect the end of the video using an event listener as described here:
Detect when an HTML5 video finishes
… that event never seems to fire.
The "paused" property in javascript (document.getElementById('myvideo').paused) evaluates to false, even after the video has played once and stopped.
How can I get the video to loop in Safari?
HLS is designed for live streaming, so the video never actually "finishes", which is what would normally trigger the loop. I used a JavaScript timer as a hack to work around this:
var LOOP_WAIT_TIME_MS = 1000,
    vid = document.getElementById("myvideo"),
    loopTimeout;

vid.addEventListener('play', function () {
  if (!/\.m3u8$/.test(vid.currentSrc)) return;
  loopTimeout = window.setTimeout(function () {
    loopTimeout = null;
    vid.pause();
    vid.play();
  }, (vid.duration - vid.currentTime) * 1000 + LOOP_WAIT_TIME_MS);
});

vid.addEventListener('pause', function () {
  if (!/\.m3u8$/.test(vid.currentSrc)) return;
  if (loopTimeout) {
    clearTimeout(loopTimeout);
    loopTimeout = null;
  }
});
I have a simple site with an HLS stream from an m3u8 playlist in an autoplaying video tag. If the stream stops for more than 10 seconds or so, it will not "catch up" and start again when the stream is restarted; I need to manually refresh the page to get it to play again.
Is there a way with JavaScript (or something else) to automatically refresh the page after the video has been buffering for a given amount of time (say, 5 seconds)?
I managed to get this working with some help from VC.One's answer. The code below reloads the page after the video has been buffering for 5 seconds, but only if the video was previously playing; otherwise it would get stuck in a reload loop when the stream never starts. I am still looking for a way to check whether the stream is live without actually reloading the page. video.load() and video.play() are giving me errors, but I will update this post when I figure it out.
var reloadCheck;
var reloadThisTime;
var video = document.getElementById("videotag");
var sourcetag = document.getElementById("sourcetag");

video.addEventListener('waiting', (event) => {
  console.log("No connection");
  reloadCheck = setTimeout(function () {
    if (reloadThisTime) {
      location = '';
    }
  }, 5000);
});

video.addEventListener('playing', (event) => {
  console.log("Connected");
  clearTimeout(reloadCheck);
  reloadThisTime = true;
});
I have an mp4 video inside an HTML video tag. The src attribute points to a Google Drive video.
See my code below:
<video autoplay muted loop id="video" class="" src="https://drive.google.com/uc?export=download&id=1qSC6ySf6ZZldFRpuBx9EXvDsD-mfve1Z" type="video/mp4"></video>
Note: when I serve the mp4 video directly from my own server it does not pause, but when the video is loaded from Google Drive it pauses after a few minutes. Since I am not showing any controls, I need the video to play continuously.
The issue is reproduced in this Codepen: https://codepen.io/5salman/pen/bGRMyKj
There's nothing inherent in Google Drive (that I can think of) that would cancel autoplay. More likely you're hitting an error event. The video will be loaded via "206 Partial Content" byte-range requests, and if it doesn't end up being locally cached, that can make it more susceptible to network issues than a server that doesn't support byte-range requests. Also check the Network tab in DevTools for any network errors.
Mitigation options:
Reduce the size of your video! It's 14 MB for 20 seconds of video. Use HandBrake or something similar to re-encode the video at a lower bitrate/size. That will reduce network traffic and make it less likely to choke the viewing device.
Add an error event handler on the <video> tag. This helps with diagnostics, and you can also use it to trigger the video to (try to) play again.
function reloadVideo(vidElement, preserveTime = true) {
  if (preserveTime) {
    // Wait to set the current time again until the video is playable
    const position = vidElement.currentTime;
    const onCanPlay = () => {
      if (!isNaN(position) && position < vidElement.duration) {
        // Seek just a frame or so ahead, in case there was a decode error on the video
        vidElement.currentTime = position + 1 / 30;
      }
      vidElement.removeEventListener('canplay', onCanPlay);
    };
    vidElement.addEventListener('canplay', onCanPlay);
  }
  const src = vidElement.currentSrc;
  vidElement.src = "";
  vidElement.src = src;
}

var vidElement = document.querySelector('#video');

// Retry on network and decode errors
var retryErrorCodes = [MediaError.MEDIA_ERR_NETWORK, MediaError.MEDIA_ERR_DECODE];
var SECONDS_BEFORE_RETRY = 5;

vidElement.addEventListener('error', evt => {
  if (retryErrorCodes.includes(vidElement.error.code)) {
    setTimeout(reloadVideo, SECONDS_BEFORE_RETRY * 1000, vidElement);
  }
});
(Not recommended) If you need it to keep running, you can add a setInterval function that restarts the video if it's paused. Keep in mind that these "zombie" functions can get annoying if you don't manage them carefully.
// Use setInterval to keep retrying playback
var SECONDS_BETWEEN_PLAYING_CHECKS = 10;
var keepPlayingTimer = setInterval(function () {
  var vidElement = document.querySelector('#video');
  // Make sure the element is still there
  if (vidElement && typeof vidElement.play === 'function') {
    // If it's not playing, start it playing again
    if (vidElement.paused) {
      vidElement.play();
    }
  } else {
    // Don't call this function again if the video element is gone
    clearInterval(keepPlayingTimer);
  }
}, SECONDS_BETWEEN_PLAYING_CHECKS * 1000);
I am using the getUserMedia and MediaRecorder APIs to record a video from the webcam.
I am using Chrome version 80.
How can I use getUserMedia and record the video while mixing in an mp3 with JavaScript? The mp3 should be able to play, pause, and stop.
I don't know how to mix the mp3 into the video stream live.
When I call removeTrack and addTrack, the MediaRecorder fails with:
Error: Failed to execute 'stop' on 'MediaRecorder': The MediaRecorder's state is 'inactive'.
My code is on CodePen: https://codepen.io/zhishaofei3/pen/eYNrYGj
Here is the main code:
// Assumes `context` (an AudioContext), `stream` (the getUserMedia stream being recorded)
// and `dest` are defined elsewhere on the page.
function getFileBuffer(filepath) {
  return fetch(filepath, { method: 'GET' }).then(response => response.arrayBuffer())
}

function mp3play() {
  getFileBuffer('song.mp3')
    .then(buffer => context.decodeAudioData(buffer))
    .then(buffer => {
      console.log(buffer)
      const source = context.createBufferSource()
      source.buffer = buffer
      let volume = context.createGain()
      volume.gain.value = 1
      source.connect(volume)
      dest = context.createMediaStreamDestination()
      volume.connect(dest)
      // volume.connect(context.destination)
      source.start(0)
      const _audioTrack = stream.getAudioTracks();
      if (_audioTrack.length > 0) {
        _audioTrack[0].stop();
        stream.removeTrack(_audioTrack[0]);
      }
      console.log(dest.stream)
      console.log(dest.stream.getAudioTracks()[0])
      stream.addTrack(dest.stream.getAudioTracks()[0])
    })
}
Thank you!
Many containers don't support adding/removing tracks like that, and it's doubtful the MediaRecorder API supports it at all. It's an unusual thing to do.
You need to create the stream you're going to record, with all of the tracks you want, before instantiating the MediaRecorder. Therefore, you need to do things in this order:
1. Set up your AudioContext.
2. Call getUserMedia(). (And while you're at it, set audio: false in your constraints. There is no need to open a microphone if you're not using one.)
3. Call videoStream.getVideoTracks() and dest.stream.getAudioTracks() to get all of the tracks.
4. Create a new MediaStream with those tracks: new MediaStream([audioTrack, videoTrack]).
Now, run your MediaRecorder on this new MediaStream and you'll have what you want.
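A minimal sketch of that order, assuming the same context/gain/dest routing from the question; the mimeType, chunks array, and stop handling here are illustrative, not taken from the original code:

// Sketch only: build the combined stream first, then hand it to MediaRecorder.
const context = new AudioContext();
const dest = context.createMediaStreamDestination();
// ... decode the mp3 and connect source -> gain -> dest here, as in the question ...

navigator.mediaDevices.getUserMedia({ video: true, audio: false })
  .then(videoStream => {
    const videoTrack = videoStream.getVideoTracks()[0];
    const audioTrack = dest.stream.getAudioTracks()[0];
    // Combine the camera video with the Web Audio output in one stream
    const mixedStream = new MediaStream([audioTrack, videoTrack]);
    const recorder = new MediaRecorder(mixedStream, { mimeType: 'video/webm' });
    const chunks = [];
    recorder.ondataavailable = e => chunks.push(e.data);
    recorder.onstop = () => {
      const blob = new Blob(chunks, { type: 'video/webm' });
      console.log('Recorded', blob);
    };
    recorder.start();
  });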
Background
Since Chrome version 66, videos that should autoplay on my site may be prevented from playing if the user hasn't been on my site before.
<video src="..." autoplay></video>
Question
How do I detect if the video autoplay was disabled? And what can I do about it?
The autoplay attribute
According to the web standards, the autoplay attribute is only a hint for what the browser should do with the media element. Neither the W3C nor the WHATWG specification says anything about when to prevent autoplay for media, which means each browser probably has a different implementation.
Autoplay policies
Autoplay policies implemented by each browser now govern whether video should be allowed to autoplay.
Chrome uses something they call the Media Engagement Index; you can read more about that here and about their autoplay policy here.
Safari developers made a post on webkit.org regarding this.
Firefox seems to put it in the hands of the user to choose if it's allowed or not (link).
Best practices
Detecting if autoplay is disabled
Instead of using autoplay on your element, you can use the play() method on the video or audio element to start playing your media. The play() method returns a promise in modern browsers (all according to the spec). If the promise rejects, it can indicate that autoplay is disabled for your site in the current browser.
can-autoplay is a library solely for detecting autoplay features for both video and audio elements.
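A quick check with can-autoplay might look like the following (API as documented in the library's README; verify against the current docs):

import canAutoplay from 'can-autoplay';

// Resolves with { result, error }; result is true if an unmuted video can autoplay
canAutoplay.video().then(({ result }) => {
  if (result) {
    // Safe to autoplay with sound
  } else {
    // Fall back to muted autoplay or show a play button
  }
});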
When autoplay is disabled
The good thing is that, once you know autoplay is disabled, in some browsers you can mute the video and try the play() method again, while showing something in the UI that indicates the video is playing muted.
var video = document.querySelector('video');
var promise = video.play();

if (promise !== undefined) {
  promise.then(_ => {
    // Autoplay started!
  }).catch(error => {
    // Autoplay not allowed!
    // Mute video and try to play again
    video.muted = true;
    video.play();
    // Show something in the UI that the video is muted
  });
}
<video src="https://www.w3schools.com/tags/movie.ogg" controls></video>
For me, the best solution was:
function _callback_onAutoplayBlocked() {
  // Do something, for example show a big play button
}

function isSafari() {
  var chr = window.navigator.userAgent.toLowerCase().indexOf("chrome") > -1;
  var sfri = window.navigator.userAgent.toLowerCase().indexOf("safari") > -1;
  return !chr && sfri;
}

function _checkAutoPlay(p) {
  var s = window['Promise'] ? window['Promise'].toString() : '';
  if (s.indexOf('function Promise()') !== -1 || s.indexOf('function ZoneAwarePromise()') !== -1) {
    p.then(function () {
      console.log("_checkAutoPlay: then");
      // Autoplay started
    }).catch(function (error) {
      console.error("_checkAutoPlay, error:", error);
      if (error.name == "NotAllowedError") { // For Chrome/Firefox
        console.error("_checkAutoPlay: error.name:", "NotAllowedError");
        _callback_onAutoplayBlocked();
      } else if (error.name == "AbortError" && isSafari()) { // Only for Safari
        console.error("_checkAutoPlay: AbortError (Safari)");
        _callback_onAutoplayBlocked();
      } else {
        console.error("_checkAutoPlay: something else happened ", error);
        // throw error; // something else happened
      }
    });
  } else {
    console.error("_checkAutoPlay: Promise is not supported in this browser ", p);
  }
}

var video1 = document.getElementById('video1');
_checkAutoPlay(video1.play());
I'm trying to get hooks for the HTML5 audio tag events through QtWebKit. For that, I created a sample application that just loads an HTML file through QWebView.
The HTML file contains an HTML5 audio tag:
<audio id="audio_with_local_controls" controls>
<source src="nokia-tune.mp3" type="audio/mp3" />
</audio>
On the script side, I'm trying to hook the audio tag's play, pause, and ended events.
/// AUDIO TAG EVENTS
var aid = document.getElementById('audio_with_local_controls');

function onplay_() {
  console.log('onplay');
  alert('onplay');
}

function oncanplay_() {
  console.log('oncanplay');
  alert('oncanplay');
}

function onpause_() {
  console.log('onpause');
  alert('onpause');
}

console.log(aid);

aid.onplay = onplay_;
aid.oncanplay = oncanplay_;
aid.onpause = onpause_;
aid.onprogress = function onprogress_() { alert('onprogress'); }
aid.onended = function onended_() { alert('onended'); }
aid.onabort = function onabort_() { alert('onabort'); }
The code sequence might not make much sense, as I was experimenting back and forth in the code.
Chrome captures these hooks, but QWebView remains silent; nothing gets captured.
Is it that QWebView doesn't support this, or am I writing something wrong?