HTML5 - how to open more than one camera - html

I have successfully got the video stream of the local camera by using getUserMedia() in HTML5, but only for one camera. I have two cameras; how do I get their video streams at the same time?
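For reference, this is roughly what I have working for a single camera (a sketch; the <video> element lookup is an assumption):

    // Minimal single-camera setup with the promise-based API
    navigator.mediaDevices.getUserMedia({ video: true })
      .then(function (stream) {
        var video = document.querySelector('video'); // assumed: a <video> element on the page
        video.srcObject = stream;
        video.play();
      })
      .catch(function (err) {
        console.error('getUserMedia failed:', err);
      });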

Unfortunately this does not appear to be possible yet. See here and the comments below this article here. Basically the user decides via their browser settings which camera is primary, and no option for a secondary camera exists as of yet.

Related

WebXR and WebRTC don't work simultaneously

I am new to WebXR. I was trying to use WebRTC with WebXR. The user first enters an AR session and then creates a WebRTC peer connection, but ICE candidates are not generated in Chrome for Android while the user is in the AR session. As soon as the user exits the AR session, the ICE candidates are transferred. Is this a bug in Chrome?
The problem is hardware related. Some devices allow the use of both the front and back camera simultaneously; on those devices, the code worked properly. On other devices, the front and back cameras cannot be accessed simultaneously, so the code does not work there. Also, the WebXR Device API does not allow access to the camera feed at the moment; it is a proposed feature. A hypothetical way to test which case a device falls into is sketched below.
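This is an untested sketch, not something verified against any particular hardware: try to open both cameras at once using facingMode constraints and see whether the second open fails.

    // Hypothetical check: on single-pipeline hardware the second open typically fails
    async function canUseBothCameras() {
      var front = null;
      var back = null;
      try {
        front = await navigator.mediaDevices.getUserMedia({ video: { facingMode: 'user' } });
        back = await navigator.mediaDevices.getUserMedia({ video: { facingMode: { exact: 'environment' } } });
        return true; // both cameras opened together
      } catch (err) {
        return false; // e.g. NotReadableError when the camera pipeline is already busy
      } finally {
        // release whatever we managed to open
        [front, back].forEach(function (s) {
          if (s) s.getTracks().forEach(function (t) { t.stop(); });
        });
      }
    }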
I haven't tried it myself, but in theory you could use the canvas captureStream API to stream the WebXR canvas.
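An untested sketch of that idea, assuming the page can reach the canvas the XR content is drawn through (which, as noted below, may not hold for WebXR):

    // Stream whatever is drawn on a canvas over the peer connection
    var canvas = document.querySelector('canvas'); // assumed: the canvas the XR content renders to
    var stream = canvas.captureStream(30);         // capture at roughly 30 fps
    var pc = new RTCPeerConnection();
    stream.getTracks().forEach(function (track) {
      pc.addTrack(track, stream);
    });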
Can you post your code here? You might want to tweak how you pass the stream to the WebRTC connection.
As far as I know, it is not possible to use canvas.captureStream() because WebXR doesn't render to the canvas directly.
I am also looking for a way to stream a WebXR session via WebRTC, so I would be highly interested in your solution, shivamag00!
Hope to hear from you!

Is it possible to check the bitrate of a Twilio video stream?

I am developing a video chat application using Twilio. I would like to check the bitrate of a video stream playing in the browser to study how the bitrate is affected at different bandwidths. How can I do this?
Twilio developer evangelist here.
You can measure various data about the incoming and outgoing streams using the WebRTC getStats API. There's a really good article that walks through the available stats, which you should read to understand them. I could write more about it here, but reading the spec and checking out that article will be more accurate and useful to you.
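As a rough starting point, here's a hedged sketch that estimates the incoming video bitrate from getStats deltas. The variable pc is assumed to be the RTCPeerConnection underlying the Twilio stream, and note that older implementations report mediaType instead of kind:

    // Poll getStats once a second and compute bitrate from byte-count deltas
    var lastBytes = 0;
    var lastTimestamp = 0;

    setInterval(function () {
      pc.getStats().then(function (report) {
        report.forEach(function (stat) {
          if (stat.type === 'inbound-rtp' && stat.kind === 'video') {
            if (lastTimestamp > 0) {
              var bits = (stat.bytesReceived - lastBytes) * 8;
              var seconds = (stat.timestamp - lastTimestamp) / 1000;
              console.log(Math.round(bits / seconds / 1000) + ' kbit/s');
            }
            lastBytes = stat.bytesReceived;
            lastTimestamp = stat.timestamp;
          }
        });
      });
    }, 1000);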
Hope this helps.
Many videos actually have a variable bit rate, so you can either get an average by simply dividing the file size by the duration, or use a tool like VLC player, which will show you the bit rate changing over time (on a Mac it shows the numbers, but I believe on Windows it shows a graph).
If you are more interested in the download bandwidth itself, you can use developer tools in Chrome to see the bit rate.
If you open developer tools and go to the network tab you should see a waterfall column.
Hover over the timeline in a row that corresponds to your video download and you can see all the details about the request and response, including the time it took. The time combined with the size (also shown in the row) gives you the actual achieved download bit rate; for example, a 5 MB download that took 10 seconds works out to roughly 4 Mbit/s.
Here is an example for a YouTube video.

HTML5 Video MP4 Codec Settings

I've recently switched my employer's self-hosted video player from Flash to HTML5 Video. Since the switch, we have had increasing numbers of users report that they are unable to play our videos.
Through a series of error-reporting functions and further tests with users, I can say conclusively that the issue is with the video files themselves. The users are capable of playing the file type, and having compared with the likes of YouTube, Vimeo, etc., I cannot see any clear reason why the way our videos are encoded would cause a failure. The issue largely affects users on older or lower-end devices, but not exclusively.
I have included screen grabs of a typical video's settings for review, in case anyone is able to spot an obvious problem.
Image 1: https://i.stack.imgur.com/A3zOS.png
Image 2: https://i.stack.imgur.com/Hdu6I.png
Edit:
The users cannot initiate playback at all. We have provided them with direct links to the video files, so their browsers can play them in their natural optimised state - and still nothing works.
The error reporting we have implemented shows us a range of data, including connection speed, device info, browser info, and location, and tracks various possible points of error throughout our full player. There doesn't appear to be any notable correlation between users, other than that almost all Chromebook users have issues. But many others do also.

WebRTC Check if camera and microphone are in use on Chrome

Is it possible to check if camera and microphone are in use by another PC application (like Skype)?
The problem is, I'm able to get the stream object and initiate the connection between two computers, but there's no video or sound, because the devices are in use by Skype. What I need is a way to detect whether the devices are busy, in order to tell the user to close the applications that are using the camera and microphone.
I know it's possible to check whether any devices are present on the PC with MediaStreamTrack.getSources(); however, it does not provide any information about device status.
The testrtc project has a test that detects silence from a microphone here, which could be useful in determining whether the audio track returned is silent, and a similar video test that detects frozen or black frames.
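For illustration, a rough sketch of the silence-detection idea using Web Audio (this is not the testrtc code itself, and the threshold is an assumption to tune):

    // Sample the microphone track and flag near-zero signal levels
    function detectSilence(stream, onResult) {
      var ctx = new AudioContext();
      var source = ctx.createMediaStreamSource(stream);
      var analyser = ctx.createAnalyser();
      source.connect(analyser);

      setTimeout(function () {
        var samples = new Uint8Array(analyser.fftSize);
        analyser.getByteTimeDomainData(samples);
        // silence sits at the 128 midpoint; measure the largest deviation from it
        var peak = 0;
        for (var i = 0; i < samples.length; i++) {
          peak = Math.max(peak, Math.abs(samples[i] - 128));
        }
        onResult(peak < 2); // true if essentially silent (threshold is an assumption)
      }, 1000);
    }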
There is no direct way, but getUserMedia should throw errors and/or return fewer audio/video tracks than you would expect from checking whether a device exists.
Devices can be detected either from MediaStreamTrack.getSources or (preferably) the spec-compliant navigator.mediaDevices.enumerateDevices.
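Putting those together, a hedged sketch of the indirect check might look like this. NotReadableError is the spec name for a device held by another application; older Chrome surfaced it as TrackStartError:

    // Enumerate devices first, then try to open them and interpret the failure
    function checkDevicesBusy() {
      return navigator.mediaDevices.enumerateDevices().then(function (devices) {
        console.log('devices present:', devices.map(function (d) { return d.kind; }));
        return navigator.mediaDevices.getUserMedia({ audio: true, video: true });
      }).then(function (stream) {
        stream.getTracks().forEach(function (t) { t.stop(); }); // release the devices again
        return false; // devices opened fine, so not busy
      }).catch(function (err) {
        // busy if another application holds the hardware
        return err.name === 'NotReadableError' || err.name === 'TrackStartError';
      });
    }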

Is there a way for a Chrome App to access a user's files on the hard drive with permission?

I'm developing a Chrome Packaged App with video playback feature.
First of all, I want to allow the user to stream online media (e.g. MP4 video) and, at the same time, save the video to a location chosen by the user. Is there a way to achieve this?
Also, I want to save the locations of media played by the user and allow the user to play them later without locating them again. Does anyone have any ideas on that?
Thank you guys very much.
You should be able to do what you want. Your best bet currently is to use the chrome.fileSystem API, which lets you save files to a location chosen by the user. You can also use retainEntry and restoreEntry to play the files back in later sessions; however, I don't believe that is available in the stable channel yet (it is currently restricted to the dev channel, but should be available for general use in version 31).
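A minimal sketch of that flow, assuming a Packaged App context (the savedVideo storage key is made up for the example):

    // Let the user pick a save location, then keep a handle for later sessions
    chrome.fileSystem.chooseEntry({ type: 'saveFile', suggestedName: 'video.mp4' }, function (entry) {
      // retainEntry returns an id that survives across app sessions
      var id = chrome.fileSystem.retainEntry(entry);
      chrome.storage.local.set({ savedVideo: id });
      // ...write the downloaded media into `entry` here...
    });

    // In a later session, restore the handle without prompting the user again
    chrome.storage.local.get('savedVideo', function (items) {
      chrome.fileSystem.restoreEntry(items.savedVideo, function (entry) {
        // `entry` is a FileEntry the app can read and play back
      });
    });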
Also check out the chrome.mediaGalleries API. It is designed to provide access to media; however, it doesn't yet provide the write capabilities you need.
Streaming can be done using the HTML5 video element.
Please check:
http://html5doctor.com/the-video-element/
Also, you can use players like:
http://www.videojs.com/