Currently there is a global file loaded in Firefox, chrome://global/content/bindings/videocontrols.xml, which holds values for things like seek time and playback rate for native HTML5 videos.
The file can be found here: https://github.com/mozilla/gecko-dev/blob/master/toolkit/content/widgets/videocontrols.xml
https://support.mozilla.org/en-US/questions/1063314#answer-732595 has a good post on how to edit this file and save it, but the guide it links to differs from the original posting, so it isn't easy to follow.
The file does contain:
case "accel-rightArrow": /* Seek forward 10% */
oldval = this.video.currentTime;
var maxtime = (this.video.duration || this.maxCurrentTimeSeen / 1000);
if (keystroke == "rightArrow") {
newval = oldval + 15;
} else {
newval = oldval + maxtime / 10;
}
this.video.currentTime = (newval <= maxtime ? newval : maxtime);
This can be edited, but since it lives somewhere under /lib/ inside a jar file, it doesn't seem easily editable.
Is there a simple way, even in about:config to edit the native HTML5 videoplayer keybindings in Firefox?
JavaScript method:
https://developer.mozilla.org/de/docs/Web/HTML/Using_HTML5_audio_and_video
This link explains how to control the video player from script, so a user script is another option if changing videocontrols.xml is not viable.
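For illustration, here is a minimal user-script sketch (assuming something like Greasemonkey injects it into the page, and that intercepting the keys before the built-in controls is acceptable); the 5-second step, the capture-phase listener, and the querySelector call are illustrative choices, not anything taken from videocontrols.xml:

// Hypothetical user script: custom seek step for the first <video> on the page
(function () {
    var SEEK_STEP = 5; // seconds per arrow-key press (pick your own value)
    document.addEventListener('keydown', function (e) {
        var video = document.querySelector('video');
        if (!video) return;
        if (e.key === 'ArrowRight') {
            video.currentTime = Math.min(video.currentTime + SEEK_STEP, video.duration || Infinity);
            e.preventDefault();
            e.stopPropagation();
        } else if (e.key === 'ArrowLeft') {
            video.currentTime = Math.max(video.currentTime - SEEK_STEP, 0);
            e.preventDefault();
            e.stopPropagation();
        }
    }, true); // capture phase, so it runs before the built-in video controls
})();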
Related
I have already made some add-ons for Firefox and extensions for Chrome, but now I have a crazy idea and would like to know whether I can disable Firefox / Chrome components from an add-on / extension.
When I say disable components, I mean something like (mostly FF examples):
Firefox Hello
Pocket (Firefox has now a default integration with Pocket)
History
Favorites
Other installed extensions
Resources like "Print" and "Developer Tools"
Etc.
I've searched through the whole Firefox Add-on Developer Hub and couldn't find out whether I can do something like that. If you know the answer, how can I do it, or why can't I?
You don't need to describe in detail why it is (or isn't) possible and how I can achieve it, but in that case please provide useful and interesting links.
From within Firefox it is very easy to disable other things.
This, for example, looks up an add-on by id and checks whether it is enabled (disabling it is shown just below):
// This requires the AddonManager module:
Components.utils.import("resource://gre/modules/AddonManager.jsm");

// This checks to see if AdBlock Plus is enabled
AddonManager.getAddonsByIDs(["{d10d0bf8-f5b5-c8b4-a8b2-2b9879e08c5d}"], function ([aAddon1]) {
    console.log(aAddon1);
    var isAddonEnabled = aAddon1.isActive;
    alert('AdBlock Plus enabled = ' + isAddonEnabled);
    // For other properties see: https://developer.mozilla.org/en-US/Add-ons/Add-on_Manager/Addon#Required_properties
});
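To actually disable the add-on (not just check it), the same Addon object exposes a settable userDisabled property; a minimal sketch along the same lines:

// Disables the add-on; setting userDisabled back to false re-enables it
AddonManager.getAddonsByIDs(["{d10d0bf8-f5b5-c8b4-a8b2-2b9879e08c5d}"], function ([aAddon1]) {
    if (aAddon1) {
        aAddon1.userDisabled = true;
    }
});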
Components are done a little differently; you would have to use nsICategoryManager. This, for example, disables the default PDF reader:
https://developer.mozilla.org/en-US/docs/Mozilla/Tech/XPCOM/Reference/Interface/nsICategoryManager#Remarks
// Assumes a chrome-privileged scope where Components is available
const { classes: Cc, interfaces: Ci, utils: Cu } = Components;
Cu.import('resource://gre/modules/Services.jsm');

var CONTENT_TYPE = 'application/pdf';

// Update the category manager in case the plugins are already loaded.
let categoryManager = Cc['@mozilla.org/categorymanager;1'];
categoryManager.getService(Ci.nsICategoryManager).deleteCategoryEntry('Gecko-Content-Viewers', CONTENT_TYPE, false);

// Update the pref manager to prevent plugins from loading in the future
var stringTypes = '';
var types = [];
var PREF_DISABLED_PLUGIN_TYPES = 'plugin.disable_full_page_plugin_for_types';
if (Services.prefs.prefHasUserValue(PREF_DISABLED_PLUGIN_TYPES)) {
    stringTypes = Services.prefs.getCharPref(PREF_DISABLED_PLUGIN_TYPES);
}
if (stringTypes !== '') {
    types = stringTypes.split(',');
}
if (types.indexOf(CONTENT_TYPE) === -1) {
    types.push(CONTENT_TYPE);
}
Services.prefs.setCharPref(PREF_DISABLED_PLUGIN_TYPES, types.join(','));
As discussed in a previous question, I have built a prototype (using MVC Web API, NAudio, and NAudio.Lame) that streams live low-quality audio after converting it to MP3. The source stream is PCM: 8 kHz, 16-bit, mono, and I'm making use of HTML5's audio tag.
On both Chrome and IE11 there is a 15-34 second delay (high-latency) before audio is heard from the browser which, I'm told, is unacceptable for our end users. Ideally the latency would be no more than 5 seconds. The delay occurs even when using the preload="none" attribute within my audio tag.
Looking more closely at the issue, it appears as though both browsers will not start playing audio until they have received ~32K of audio data. With that in mind, I can affect the delay by changing Lame's MP3 'bitrate' setting. However, if I reduce the delay (by sending more data to the browser for the same length of audio), I will introduce audio drop-outs later.
Examples:
If I use Lame's V0 encoding the delay is nearly 34 seconds which requires almost 0.5 MB of source audio.
If I use Lame's ABR_32 encoding, I can reduce the delay to 10-15 seconds but I will experience pauses and drop-outs throughout the listening session.
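As a rough back-of-the-envelope sketch of how the encoder bitrate maps onto that ~32 KB buffering threshold (assuming the 32 KB figure above is accurate; real VBR output will vary):

// Approximate audio duration the browser must receive before it starts playing,
// if it waits for ~32 KB of MP3 data (hypothetical helper, not part of the prototype)
var BUFFER_BYTES = 32 * 1024;
[128, 64, 32].forEach(function (kbps) {
    var delaySeconds = (BUFFER_BYTES * 8) / (kbps * 1000);
    console.log(kbps + ' kbps -> ~' + delaySeconds.toFixed(1) + ' s before playback can start');
});
// 128 kbps -> ~2.0 s, 64 kbps -> ~4.1 s, 32 kbps -> ~8.2 s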
Questions:
Any ideas how I can minimize the start-up delay (latency)?
Should I continue investigating various Lame 'presets' in hopes of picking the "right" one?
Could it be that MP3 is not the best format for live streaming?
Would switching to Ogg/Vorbis (or Ogg/OPUS) help?
Do we need to abandon HTML5's audio tag and use Flash or a Java applet?
Thanks.
You cannot reduce the delay, since you have no control over the browser's code or its buffering size. The HTML5 specification does not enforce any constraint here, so I don't see any reason why it would improve.
You can, however, implement a solution with the Web Audio API (it's quite simple), where you handle the streaming yourself.
If you can split your MP3 into fixed-size chunks (so that each MP3 chunk's size is known beforehand, or at least at receive time), then you can have live streaming in 20 lines of code. The chunk size will be your latency.
The key is to use AudioContext::decodeAudioData.
// Fix up prefixing
window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();
var offset = 0;
var byteOffset = 0;
var minDecodeSize = 16384; // This is your chunk size

var request = new XMLHttpRequest();
request.onprogress = function (evt) {
    if (request.response) {
        var size = request.response.length - byteOffset;
        if (size < minDecodeSize) return;
        // In Chrome, XHR stream mode gives text, not an ArrayBuffer.
        // In Firefox, you can get an ArrayBuffer as is.
        var ab;
        if (request.response instanceof ArrayBuffer) {
            ab = request.response;
        } else {
            ab = new ArrayBuffer(size);
            var buf = new Uint8Array(ab);
            for (var i = 0; i < size; i++)
                buf[i] = request.response.charCodeAt(i + byteOffset) & 0xff;
        }
        byteOffset = request.response.length;
        context.decodeAudioData(ab, function (buffer) {
            playSound(buffer);
        }, onError);
    }
};
request.open('GET', url, true);
request.responseType = expectedType; // 'stream' in Chrome, 'moz-chunked-arraybuffer' in Firefox, 'ms-stream' in IE
request.overrideMimeType('text/plain; charset=x-user-defined');
request.send(null);

function onError(err) {
    console.log('decodeAudioData failed', err);
}

function playSound(buffer) {
    var source = context.createBufferSource(); // creates a sound source
    source.buffer = buffer;                    // tell the source which sound to play
    source.connect(context.destination);       // connect the source to the context's destination (the speakers)
    source.start(offset);                      // schedule this chunk right after the previous one
    // note: on older systems, you may have to use the deprecated noteOn(time)
    offset += buffer.duration;
}
I'm building a mobile web app and I am implementing video & audio tags. Apparently not all devices can play all file formats. Modernizr can tell me which codecs are supported, but how can I know whether my file uses a specific codec?
I can identify the file extension or MIME type, but then I'll need to compare the codec with the file extension and maintain an array with this data, which seems like a Sisyphean task.
What do you guys say? Does anyone know of a good library that provides this information? Or maybe my approach is wrong and there's a better way to achieve this.
Please note that Firefox OS, for example, displays a message to the user that a specific file format isn't supported by the OS, but it doesn't show this message when dealing with audio files. This means I need to provide this feedback to the user myself, and I'd also prefer to do it in a way that visually matches my application.
An example:
I have video.mp4. Modernizr reports support for the H.264 codec. These two pieces of information don't relate to each other.
I cannot provide fallbacks so that my video runs in every available video format. The app is a file browser for a cloud file host (like Dropbox), and if the user uploaded a file that cannot play on Firefox OS, they must be able to understand why their file doesn't play; for that I need such a library instead of managing this comparison by myself.
Some references:
Mozilla - Supported media formats
Dive Into HTML5 - detect video formats
Thank you.
If you have access to the MIME type you could simply use the audio or video element's canPlayType() method:
var canPlay = (function () {
    var a = document.createElement('audio'),
        v = document.createElement('video');

    function canPlayAudio(type) {
        var mime = 'audio/';
        if (a && a.canPlayType) {
            if (type.indexOf(mime) === -1) type = mime + type;
            return !!a.canPlayType(type) || !!a.canPlayType(type.replace(/^audio\/(x-)?(.+)$/, mime + 'x-$2'));
        }
        return false;
    }

    function canPlayVideo(type) {
        var mime = 'video/';
        if (v && v.canPlayType) {
            if (type.indexOf(mime) === -1) type = mime + type;
            return !!v.canPlayType(type) || !!v.canPlayType(type.replace(/^video\/(x-)?(.+)$/, mime + 'x-$2'));
        }
        return false;
    }

    return {
        audio: canPlayAudio,
        video: canPlayVideo
    };
})();
Then you can perform the tests like this (note: including the "audio/" or "video/" part is optional):
canPlay.audio('audio/mpeg') // true
canPlay.audio('m4a') // true
canPlay.audio('wav') // true
canPlay.audio('flac') // false
canPlay.audio('audio/ogg') // true
canPlay.video('video/mpeg') // false
canPlay.video('video/mp4') // true
canPlay.video('m4v') // true
canPlay.video('video/webm') // true
canPlay.video('avi') // false
I have captured 3 videos on my mobile, which are by default stored in the phone gallery (Gallery/videos/). I have to play these 3 videos in one of my Flex mobile applications. How can I get the videos into the Flex project? If I need to browse the mobile directory, kindly help me with some code to do so.
I too am looking for an answer to this question. Right now, based on other Stackoverflow discussions, exhaustive perusal of tutorials and Adobe documentation, and comments to both (often the more useful resource), I'm coming to the conclusion that it's not possible.
you can use CameraRoll.browseForImage() and open the iOS gallery of photos to see all entities of MediaType.IMAGE, but it will not show you MediaType.VIDEO
you can use CameraUI to launch the system camera by delegation and that returns a MediaPromise, but as far as I can tell, it does not save the video you capture anywhere, and I cannot find a way to access the captured video using the MediaPromise (at least using the Loader class)
Here's my code as a hint in that direction. The second code block is using the CameraRoll to browseForImage() but there is no browseForVideo() in the API.
if (CameraUI.isSupported)
{
    camera = new CameraUI();
    camera.addEventListener(MediaEvent.COMPLETE, videoMediaEventComplete);
    camera.addEventListener(Event.CANCEL, cameraCanceled);
    camera.addEventListener(ErrorEvent.ERROR, cameraError);
    camera.launch(MediaType.VIDEO);
}
else
{
    statusText.text = "Camera not supported on this device.";
    startTimer();
}

if (CameraRoll.supportsBrowseForImage)
{
    roll = new CameraRoll();
    roll.addEventListener(MediaEvent.SELECT, cameraRollEventComplete);
    roll.addEventListener(Event.CANCEL, cameraCanceled);
    roll.addEventListener(ErrorEvent.ERROR, cameraError);
    roll.browseForImage();
}
else
{
    statusText.text = "Camera roll not supported on this device.";
    startTimer();
}
I've since found that videos captured using the delegated system camera are stored in a temporary storage location that iOS does (!) allow access to. (I was pleasantly shocked.)
The captured video is not added to the device's Camera Roll like other videos captured using the iOS system Camera app, so it's not enough to capture the video and expect to be able to access it later (if, for instance, CameraRoll.browseForVideo() is ever added to the API).
Therefore, you have to 'get while the getting is good' and move the file from the temporary storage location to some non-volatile location such as ApplicationStorageDirectory or the user's Documents directory (the only options in iOS, I think).
The MediaPromise... I think... is completely useless for accessing the video via any direct progressive loader/streamer method, but still provides the location/url/path/filename of the temporary file so you can perform File operations on it.
Ironic that there are tutorials for getting around the lack of a file location/url/path/filename in the MediaPromise when using CameraRoll.browseForImage()... and that method is to use a loader class to load the image content (which you can then write out to a file), but when taking video, the video content is not accessible, and instead a file location/url/path/filename is provided. Ironic that there are nearly no resources I was able to find to help with this also. grumble
I'm going to include some code chunks without really editing them to strip out extraneous bits, because it's way past when I need to be in bed, but I wanted you to have this. I may come back and clean it up later.
This section is in a Spark SkinnablePopUpContainer and I use the same click event for several buttons, thus the below 'case' is in the switch-case in that event handler function.
In case you are not familiar, the 'close(true, data)' is the method to close the SkinnablePopUpContainer, tell the parent/owner that the container was closed purposefully and that it should look for the data object being shared back (i.e., there are changes to be 'commit'ed).
case "cameraVideo":
{
if(CameraUI.isSupported)
{
camera = new CameraUI();
camera.addEventListener(MediaEvent.COMPLETE, videoMediaEventComplete);
camera.addEventListener(Event.CANCEL, cameraCanceled);
camera.addEventListener(ErrorEvent.ERROR, cameraError);
camera.launch(MediaType.VIDEO);
}
else
{
statusText.text = "Camera not supported on this device.";
startTimer();
}
break;
}
protected function cameraCanceled(event:Event):void
{
statusText.text = "Camera access canceled by user.";
startTimer();
}
protected function cameraError(event:ErrorEvent):void
{
statusText.text = "There was an error while trying to use the camera.";
startTimer();
}
protected function videoMediaEventComplete(event:MediaEvent):void
{
statusText.text="Preparing captured video...";
camera.removeEventListener(MediaEvent.COMPLETE, videoMediaEventComplete);
camera.removeEventListener(Event.CANCEL, cameraCanceled);
camera.removeEventListener(ErrorEvent.ERROR, cameraError);
var media:MediaPromise = event.data;
data.MediaType = MediaType.VIDEO;
data.MediaPromise = media;
data.source = "camera video";
close(true,data)
}
This section is the Actionscript in the close handler of the parent/owner of the SkinnablePopUpContainer (truncated once the useful code is included)
private function choosePictureLightboxClosed(event:PopUpEvent):void
{
    imageButtonsActive = false;
    if (event.commit)
    {
        this.data = event.data as Object;
        filters = new Array();
        selection = true;
        switch (data.MediaType)
        {
            case MediaType.VIDEO:
            {
                mediaType = "video";
                trace(data.MediaPromise.file.url + " - " + data.MediaPromise.relativePath + " - " + data.MediaPromise.mediaType);
                var sourceFile:File = new File(data.MediaPromise.file.url);
                var destinationFile:File = File.applicationStorageDirectory.resolvePath("User" + parentApplication.userid);
                if (destinationFile.exists && !destinationFile.isDirectory)
                {
                    destinationFile.deleteFile();
                }
                destinationFile.createDirectory();
                destinationFile = destinationFile.resolvePath("Videos");
                if (destinationFile.exists && !destinationFile.isDirectory)
                {
                    destinationFile.deleteFile();
                }
                destinationFile.createDirectory();
                destinationFile = destinationFile.resolvePath(parentApplication.userid + "Video" + new Date().getTime() + ".mov");
                trace(destinationFile.nativePath);
                sourceFile.moveTo(destinationFile, true);
                break;
            }
I sure do hope this helps. This has been a very frustrating (and costly, in terms of our project being government grant funded and having deadlines we utterly failed to meet) experience, and I very much hope that these hard-won solutions might help others avoid the same.
I have a video as a background to a web page, and I am trying to get it to loop. Here is the code:
<video autoplay='true' loop='true' muted='true'>
    <source src='/admin/wallpapers/linked/4ebc66e899727777b400003c' type='video/mp4'></source>
</video>
Even though I have told the video to loop, it does not. I also tried to get it to loop with the onended attribute (as per this Mozilla support thread, I also tried that bit of jQuery). Nothing has worked so far. Is it an issue with Chrome, or my code?
Edit:
I checked the Network events and HEAD of a working copy (http://fhsclock-labs.heroku.com/no-violence) versus the application I'm trying to get working. The difference is the working copy is serving up the video from a static asset on Heroku (via Varnish, apparently), whilst mine is serving from GridFS (MongoDB).
The Network tab of Chrome's Inspector show that in my application, the video is requested three times. One time the Status is "pending", the second is "canceled", and the final one is 200 OK. The working copy only shows two requests, one's Status is pending and the other is 206 Partial Content. However, after the video plays once, that request changes to "Cancelled" and it makes another request for that video. In my application, that does not happen.
As for Type, in my application, two are "undefined" and the other "video/mp4" (which it is supposed to be). In the working app, all of the requests are "video/mp4".
In addition, I'm getting "Resource interpreted as Other but transferred with MIME type undefined." warnings in the Console.
I'm not really quite sure where to begin on this. It's my belief that the issue is server-side, as serving the file as static assets works fine. It could be that the server isn't sending the correct content type. It could be an issue with GridFS. I do not know.
At any rate, the source is here. Any insight that you can offer is appreciated.
Ah, I just stumbled into this exact problem.
As it turns out, looping (or any sort of seeking, for that matter) in <video> elements on Chrome only works if the video file was served up by a server that understands partial content requests. i.e. the server needs to honor requests that contain a "Range" header with a 206 "Partial Content" response. This is even the case if the video is small enough to be fully buffered by chrome, and no more server-round trips are made: if your server didn't honor chrome's Range request the first time, the video will not be loopable or seekable.
So yes, an issue with GridFS, although arguably Chrome should be more forgiving.
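For illustration only (this is not the poster's GridFS setup; the file path and port are placeholders), a minimal Node.js sketch of a server that honors Range requests with a 206 Partial Content response looks roughly like this:

// Minimal static video server honoring Range requests (placeholder path/port)
var http = require('http');
var fs = require('fs');

http.createServer(function (req, res) {
    var path = 'video.mp4';        // placeholder file
    var size = fs.statSync(path).size;
    var range = req.headers.range; // e.g. "bytes=0-" sent by Chrome

    if (range) {
        var parts = range.replace(/bytes=/, '').split('-');
        var start = parseInt(parts[0], 10);
        var end = parts[1] ? parseInt(parts[1], 10) : size - 1;
        res.writeHead(206, {
            'Content-Range': 'bytes ' + start + '-' + end + '/' + size,
            'Accept-Ranges': 'bytes',
            'Content-Length': end - start + 1,
            'Content-Type': 'video/mp4'
        });
        fs.createReadStream(path, { start: start, end: end }).pipe(res);
    } else {
        res.writeHead(200, {
            'Content-Length': size,
            'Content-Type': 'video/mp4'
        });
        fs.createReadStream(path).pipe(res);
    }
}).listen(8080);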
Simplest workaround:
$('video').on('ended', function () {
    this.load();
    this.play();
});
The 'ended' event fires when the video reaches the end, video.load() resets the video to the beginning, and video.play() starts it playing immediately once loaded.
This works well with Amazon S3 where you don't have as much control over server responses, and also gets around Firefox issues related to video.currentTime not being settable if a video is missing its length metadata.
Similar javascript without jQuery:
document.getElementsByTagName('video')[0].onended = function () {
    this.load();
    this.play();
};
Looks like it's been an issue in the past; there are at least two closed bugs on it, but both state that it was fixed:
http://code.google.com/p/chromium/issues/detail?id=39683
http://code.google.com/p/chromium/issues/detail?id=18846
Since Chrome and Safari are both WebKit-based browsers, you might be able to use some of these workarounds:
http://blog.millermedeiros.com/2011/03/html5-video-issues-on-the-ipad-and-how-to-solve-them/
function restartVideo() {
    vid.currentTime = 0.1; // setting to zero breaks iOS 3.2: the value won't update, and values smaller than 0.1 caused the bug as well
    vid.play();
}
// loop video
vid.addEventListener('ended', restartVideo, false);
Just in case none of the answers above help you, make sure you don't have your inspector running with the Disable cache option checked. Since Chrome grabs the video from cache, it will basically work only once. I just debugged this for 20 minutes before realizing it was the cause. For reference (and so I know I am not the only one), someone else filed a Chromium bug report about this.
My situation:
I have the exact same problem; however, changing the header of the response message alone didn't do the trick. No loop, replay, or seek. Even a plain stop doesn't work, but that could be my configuration.
Answer:
According to some sites (I couldn't find them anymore), it's also possible to trigger the load() method right after the video ends and before the next one is supposed to start. That should reload the source, resulting in a once-again working video/audio element.
@john
Please note that your answers/links concern normal bugs and are not focused on this problem. Using a server/webserver is what causes this problem, whereas the bugs those links describe are of a different kind. That's also why that answer isn't working.
I hope this helps; I am still looking for a solution.
For anyone coming to this page 9 years later, if all the above answers didn't work: I had this issue too and thought the source of the issue was either my browser or the server.
I later noticed that other websites that use looping videos don't have this issue. To troubleshoot, I downloaded a random video from one of those sites, uploaded it to my own server, and was delighted to find it looped fine, so it seemed that the source of the issue was the video file I was using.
Then I fixed my video with an online video converter website (I don't want to publicize any in particular, but the first results from a quick Google search do work) and, lo and behold, this solved the issue.
I'm not sure what the real reason for the issue was. I assume there was a conversion or compression error in the original video that was handed to me by my client.
I know this doesn't pertain exactly to the question asked, but if someone comes across this while having a similar issue, make sure that you have your sources in the proper order.
I was loading an mp4 and a webm file and noticed that the video was not looping in Chrome. It was because the webm file was the first source listed so Chrome was loading the webm file and not the mp4.
Hope that helps someone else that comes across this issue.
<video autoplay loop>
    <source src="/path-to-vid/video.mp4" type="video/mp4">
    <source src="/path-to-vid/video.webm" type="video/webm">
</video>
It is super lame, but Dropbox uses the right status code. So upload the video to Dropbox and replace the www in the URL with dl.
Using a Dropbox URL that way, the video plays fine.
I had the same issue and eventually solved the problem by streaming the content.
For example, this is the Laravel Blade HTML code that requests the streaming route:
<video>
    <source src="{{route('getVideoStream',$videoId)}}" type="video/mp4"/>
</video>
In the controller I stream the video and return it via Laravel's stream response:
public function getVideoStream($videoId)
{
    $path = $pathOfVideo;
    $headers = [
        'Content-Type' => 'video/mp2t',
        'Content-Length' => File::size($path),
        'Content-Disposition' => 'attachment; filename="start.mp4"'
    ];
    $stream = new VideoStream($path);

    return response()->stream(function () use ($stream) {
        $stream->start();
    });
}
and VideoStream is the streaming class I found in a GitHub gist:
class VideoStream
{
    private $path = "";
    private $stream = "";
    private $buffer = 102400;
    private $start = -1;
    private $end = -1;
    private $size = 0;

    function __construct($filePath)
    {
        $this->path = $filePath;
    }

    /**
     * Open stream
     */
    private function open()
    {
        if (!($this->stream = fopen($this->path, 'rb'))) {
            die('Could not open stream for reading');
        }
    }

    /**
     * Set proper header to serve the video content
     */
    private function setHeader()
    {
        ob_get_clean();
        header("Content-Type: video/mp4");
        header("Cache-Control: max-age=2592000, public");
        header("Expires: " . gmdate('D, d M Y H:i:s', time() + 2592000) . ' GMT');
        header("Last-Modified: " . gmdate('D, d M Y H:i:s', @filemtime($this->path)) . ' GMT');
        $this->start = 0;
        $this->size = filesize($this->path);
        $this->end = $this->size - 1;
        header("Accept-Ranges: 0-" . $this->end);

        if (isset($_SERVER['HTTP_RANGE'])) {
            $c_start = $this->start;
            $c_end = $this->end;

            list(, $range) = explode('=', $_SERVER['HTTP_RANGE'], 2);
            if (strpos($range, ',') !== false) {
                header('HTTP/1.1 416 Requested Range Not Satisfiable');
                header("Content-Range: bytes $this->start-$this->end/$this->size");
                exit;
            }
            if ($range == '-') {
                $c_start = $this->size - substr($range, 1);
            } else {
                $range = explode('-', $range);
                $c_start = $range[0];
                $c_end = (isset($range[1]) && is_numeric($range[1])) ? $range[1] : $c_end;
            }
            $c_end = ($c_end > $this->end) ? $this->end : $c_end;
            if ($c_start > $c_end || $c_start > $this->size - 1 || $c_end >= $this->size) {
                header('HTTP/1.1 416 Requested Range Not Satisfiable');
                header("Content-Range: bytes $this->start-$this->end/$this->size");
                exit;
            }
            $this->start = $c_start;
            $this->end = $c_end;
            $length = $this->end - $this->start + 1;
            fseek($this->stream, $this->start);
            header('HTTP/1.1 206 Partial Content');
            header("Content-Length: " . $length);
            header("Content-Range: bytes $this->start-$this->end/" . $this->size);
        } else {
            header("Content-Length: " . $this->size);
        }
    }

    /**
     * Close currently opened stream
     */
    private function end()
    {
        fclose($this->stream);
        exit;
    }

    /**
     * Perform the streaming of calculated range
     */
    private function stream()
    {
        $i = $this->start;
        set_time_limit(0);
        while (!feof($this->stream) && $i <= $this->end) {
            $bytesToRead = $this->buffer;
            if (($i + $bytesToRead) > $this->end) {
                $bytesToRead = $this->end - $i + 1;
            }
            $data = fread($this->stream, $bytesToRead);
            echo $data;
            flush();
            $i += $bytesToRead;
        }
    }

    /**
     * Start streaming video content
     */
    function start()
    {
        $this->open();
        $this->setHeader();
        $this->stream();
        $this->end();
    }
}