WebRTC: Check if camera and microphone are in use on Chrome

Is it possible to check whether the camera and microphone are in use by another PC application (like Skype)?
The problem is that I'm able to get the stream object and initiate the connection between two computers, but there is no video or sound because the devices are in use by Skype. What I need is a way to detect whether the devices are busy, so I can tell the user to close the applications that are using the camera and microphone.
I know it's possible to check whether any devices are present on the PC with MediaStreamTrack.getSources(), but it does not provide any information about device status.

The testrtc project has a test that detects silence from a microphone, which could be useful in determining whether the audio track returned is silent, and a similar video test that detects frozen or black frames.
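For the audio side, a similar silence check can be approximated in the page itself with the Web Audio API. Below is a minimal, hypothetical sketch (the function name and the 0.01 threshold are illustrative, not taken from testrtc), assuming the stream comes from getUserMedia:

    // Hypothetical sketch: report whether an audio track stayed silent,
    // similar in spirit to testrtc's microphone test.
    async function detectSilence(stream, durationMs = 2000) {
      const ctx = new AudioContext();
      const analyser = ctx.createAnalyser();
      ctx.createMediaStreamSource(stream).connect(analyser);

      const samples = new Float32Array(analyser.fftSize);
      let peak = 0;
      const deadline = performance.now() + durationMs;
      while (performance.now() < deadline) {
        analyser.getFloatTimeDomainData(samples);
        for (const s of samples) peak = Math.max(peak, Math.abs(s));
        await new Promise(r => setTimeout(r, 100)); // sample every 100 ms
      }
      ctx.close();
      return peak < 0.01; // illustrative threshold for "silent"
    }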

There is no direct way, but getUserMedia should throw errors and/or return fewer audio/video tracks than the device enumeration would lead you to expect.
Devices can be detected either from MediaStreamTrack.getSources or (preferably) the spec-compliant navigator.mediaDevices.enumerateDevices.
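A minimal sketch of both checks, assuming a browser with navigator.mediaDevices (NotReadableError is the spec-defined error name Chrome raises when the hardware is already grabbed; older Chrome used TrackStartError):

    // Enumerate devices first, then try to open them and inspect any error.
    async function checkDevices() {
      const devices = await navigator.mediaDevices.enumerateDevices();
      const hasCam = devices.some(d => d.kind === 'videoinput');
      const hasMic = devices.some(d => d.kind === 'audioinput');
      if (!hasCam || !hasMic) {
        console.log('Missing device(s): camera?', hasCam, 'mic?', hasMic);
        return;
      }
      try {
        const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
        console.log('Got', stream.getTracks().length, 'tracks');
        stream.getTracks().forEach(t => t.stop()); // release the devices again
      } catch (err) {
        // NotReadableError typically means another application (e.g. Skype)
        // is holding the camera or microphone.
        console.log('getUserMedia failed:', err.name, err.message);
      }
    }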

Related

WebXR and WebRTC don't work simultaneously

I am new to WebXR. I was trying to use WebRTC with WebXR. The user first enters an AR session and then creates a WebRTC peer connection, but ICE candidates are not generated in Chrome for Android while the user is in the AR session. As soon as the user exits the AR session, the ICE candidates are transferred. Is this a bug in Chrome?
The problem is hardware related. Some devices allow the front and back cameras to be used simultaneously; on those devices the code worked properly. On other devices, the front and back cameras cannot be accessed simultaneously, so the code does not work there. Also, the WebXR Device API does not allow access to the camera feed at the moment, although it is a proposed feature.
Although I haven't tried it myself, you can in theory use the canvas captureStream API to stream the WebXR canvas; a hypothetical sketch follows.
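    // Hypothetical sketch: capture a canvas and feed it to a peer connection.
    // Whether WebXR actually draws into a capturable canvas is the open
    // question raised below.
    const canvas = document.querySelector('canvas'); // the canvas the session presumably draws to
    const stream = canvas.captureStream(30);         // capture at 30 fps

    const pc = new RTCPeerConnection();
    stream.getTracks().forEach(track => pc.addTrack(track, stream));
    // ...then continue with the usual offer/answer and ICE exchange.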
Can you post your code here? You might want to tweak how you pass the stream to the WebRTC connection.
As far as I know, it is not possible to use canvas.captureStream() because WebXR doesn't render to the canvas directly.
I am also looking for a way to stream a WebXR session via WebRTC, so I would be highly interested in your solution, shivamag00!
Hope to hear from you!

Can I mirror my Chrome browser to multiple Chromecast devices

We have a dashboard system (Dashing) that we view through a browser (Chrome). We currently have two TVs displaying this dashboard, each with a $350 PC connected to it. I am wondering if we could plug a Chromecast into each TV and have only one PC "displaying" the dashboard. This is relevant because 1) we are looking at adding monitors after we expand our office and 2) the PCs tend to be a pain in the ..., with things like updates pushed from IT, password changes, etc. One PC would definitely be better. Bonus points if I can get rid of all the PCs and just use a Chromecast pointed at a URL.
Thanks,
Although not supported officially, I was successfully able to cast my entire screen to two devices by using both the Google Cast and the Google Cast (Beta) extensions simultaneously. I could only get it to work with one extension in an incognito window and the other in a normal window.
You can cast to multiple Cast devices if you create an additional "person" in Chrome settings for each of your Cast devices, then open a new window (not tab) for each "person".
Each "person"/user can attach to a different Cast device. It helps to match the user name to the device or location to keep it all straight.
Example: casting a movie from a NAS to 3 TVs:
1. Use VLC to stream the movie.
2. Open 3 Chrome/Chromium windows (not tabs), switching each window to a unique user/person.
3. Start streaming.
4. Point each window at the stream, play, and cast each window to a different device.
Performance will depend on the capability of the PC and the router/Wi-Fi. We use the Ethernet power supply for the Chromecasts to reduce load on the Wi-Fi AP. (We had Cat5 Ethernet already installed.)
Since all browser windows are pointed at the same stream, the sync is pretty good; we usually stream to the living room, patio, and game room, and any difference is hard to detect.
Hope this helps someone.
Currently, you can only be connected to a single device at a time, and hence can only mirror to one Cast device. You mentioned you are mirroring your browser. Is the content that you are mirroring a simple page, a video, ...? You may be able to come up with a simple app that would eliminate the need for PCs completely. Tell us more about the content you are mirroring (hopefully it is not Flash :-))
The Chrome browser extension can only cast to one device. A workaround is to install the Chrome browser extension Beta version alongside the non-beta version, and you'll have a second browser extension that can cast to a second device.
My colleague suggests that if you also install Chrome Canary (a separate pre-release channel of Chrome), you can run two Chromes with two extensions each and support four devices.
Hopefully, this limitation will be removed, and the browser extension will support multiple devices.
Our use case is driving information kiosk displays from a single computer. Chromecast is nice in that physical access to the display does not expose the PC driving it.

HTML5: how to open more than one camera

I have successfully gotten the video stream of a local camera by using getUserMedia() in HTML5, but only for one camera. I have two cameras; how do I get both video streams at the same time?
Unfortunately this does not appear to be possible yet. See here, and the comments below this article. Basically the user decides via their browser settings which camera is primary, and no option for a secondary camera exists as of yet.
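That said, on browsers that support the newer navigator.mediaDevices API you can at least request each camera explicitly by deviceId; whether both can be open at once remains hardware-dependent, as noted in the WebXR answer above. A hedged sketch:

    // Hedged sketch: request each camera by deviceId. Opening two at once
    // still depends on the hardware/driver allowing it.
    async function openBothCameras() {
      const devices = await navigator.mediaDevices.enumerateDevices();
      const cams = devices.filter(d => d.kind === 'videoinput');
      if (cams.length < 2) throw new Error('Fewer than two cameras found');

      const streams = [];
      for (const cam of cams.slice(0, 2)) {
        streams.push(await navigator.mediaDevices.getUserMedia({
          video: { deviceId: { exact: cam.deviceId } }
        }));
      }
      return streams; // attach each stream to its own <video> element
    }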

HTML5 Video recording and automatically uploading video on server

I am trying to develop a test-taking website for students. On this website, students should be able to answer the questions (displayed in text format) using their webcam in one go. Currently I have implemented this feature using Flash: it captures the frames and simultaneously sends them to the server. The problem with this technique is that the quality (FPS) of my video is restricted and depends on the bandwidth of the internet connection. Also, I am not in favor of using Flash.
I want that, as soon as the student clicks the start button, a timer starts and the video is recorded. The video should be saved on the client's machine (without asking the client for a path), and on completion it should automatically be uploaded to the server; when the upload completes, the video should be automatically deleted from the client's machine.
In short, can anyone give me a starting point so I can proceed with the work? Any help will be highly appreciated. Thanks!
Here is a good example of how to get the webcam working in HTML5:
http://blog.teamtreehouse.com/accessing-the-device-camera-with-getusermedia
It doesn't explain how to upload the video to the server, though.
Currently I have implemented this feature using Flash: it captures the frames and simultaneously sends them to the server. The problem with this technique is that the quality (FPS) of my video is restricted and depends on the bandwidth of the internet connection.
That is actually incorrect.
The fps you're getting depends 100% on:
the webcam quality
the light available in the room (the more light, the better)
the resolution you're recording at (lower resolution results in higher fps, even with low-quality webcams in low light)
The video should be saved on the client's machine (without asking the client for a path), and on completion it should automatically be uploaded to the server; when the upload completes, the video should be automatically deleted from the client's machine.
Flash records by streaming (through RTMP) the audio/video data to a media server (Red5, AMS, Wowza). After the recording stops, you could move the file to a web server and trigger an HTTP download.
As for HTML, the MediaStream Recording API has been implemented by Firefox and by Chrome starting with version 49; it allows you to record to local RAM and download the file as .webm (the audio/video codecs may differ between browsers).
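A minimal sketch of that API; the /upload endpoint name and the fixed webm MIME type below are assumptions, not part of any particular service:

    // Minimal sketch: record the webcam for a fixed duration, then upload.
    // '/upload' is a placeholder for your own server endpoint.
    async function recordAndUpload(durationMs) {
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
      const recorder = new MediaRecorder(stream, { mimeType: 'video/webm' });
      const chunks = [];
      recorder.ondataavailable = e => chunks.push(e.data);

      const stopped = new Promise(resolve => { recorder.onstop = resolve; });
      recorder.start();
      setTimeout(() => recorder.stop(), durationMs); // the timer from the question

      await stopped;
      stream.getTracks().forEach(t => t.stop());

      const blob = new Blob(chunks, { type: 'video/webm' });
      const form = new FormData();
      form.append('video', blob, 'answer.webm');
      await fetch('/upload', { method: 'POST', body: form });
      // The recording lives only in memory, so there is nothing to delete
      // from the student's disk afterwards.
    }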
Disclaimer: I work at Pipe which handles video recording.

Is it possible to stream live video to Flash Media Server via NetStream byte access?

So, I'm working with a video source that I'm feeding into my Adobe AIR application via some native extension work, with the goal of ultimately getting it to a Flash Media Server. The video is H.264-encoded and muxed into an FLV container, which aligns with the supported Flash Media Server codecs and the NetStream (appendBytes) requirements. I can get the data into AIR just fine.
The mine I stepped onto today, however, is that the documentation for NetStream.appendBytes states I must call NetStream.play(null):
Call this method on a NetStream in "Data Generation Mode". To put a NetStream into Data Generation Mode, call NetStream.play(null) on a NetStream created on a NetConnection connected to null. Calling appendBytes() on a NetStream that isn't in Data Generation Mode is an error and raises an exception.
NetStream.play() called with a null parameter yields local FLV playback. I can't publish the stream to FMS in this mode. But my research into Flash seems to indicate NetStream's byte access is my only real hope here when dealing with non-camera or non-web video data.
Q: Can I latch onto the video playback buffer for publish to a FMS? Can I create a sort of pipeline of NetStreams or NetConnections to achieve this? Or is there an alternate approach here for transmitting H.264/FLV data to FMS? (The source of my video cannot communicate with FMS directly.)
The answer to your question is quite simply no. This is apparently implemented as a security feature, though it is probably less a security issue and more a sales one: Adobe likes to block certain capabilities intentionally in order to create the possibility of, or need for, another product, a.k.a. more revenue.
I tried looking into this for you to see if there was some dirty hack where you could attach a camera or something and override the binary data being sent to the stream, like you can with audio, but unfortunately, to my knowledge, no such hack is possible. More info here: NetStream.appendBytes
Update
You might be able to do something hackish by using ManyCam, which is a virtual webcam driver (from what I understand). This provides a valid camera you can select from Flash, and you can also select a video file as the picture source for ManyCam. See http://manycam.com/user_guide/#HowtoSelectaVideofileasthePictureSourceforManyCam
Update #2
If you're looking for something open source that will do the same thing as ManyCam, check out the following:
http://code.google.com/p/webcamstudio/wiki/VideoSourceMovie (GPL Licensed)