Audio devices plugin and plugout event on chrome browser - google-chrome

I'm building an audio chat web application using WebRTC. I want the application to automatically switch to an external audio device when one is plugged into the system, and to fall back to the system default microphone when that device is unplugged (as Hangouts does).
Is there any event that notifies me when a device is plugged in or unplugged? (For the Chrome browser.)
Among all the devices the browser lists, is there a way to know which one will actually capture audio? (For example, desktop systems often have two microphone jacks, one on the front and one on the back; enumerating media devices in the browser returns both jacks as devices, but how do I tell which jack actually has a microphone plugged into it?)
How does the system choose the default device? Is it good practice to always use the default device?
What is the difference between the default and communications devices that the browser provides?

(1) In the spec, a devicechange event is fired on the navigator.mediaDevices object. That is not yet implemented in Chrome. You can poll navigator.mediaDevices.enumerateDevices(), which has a performance impact, however.
(2) Enumerate the devices and look at their labels? See this sample.
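A minimal sketch combining both points: listen for devicechange where it exists and fall back to polling enumerateDevices() where it doesn't. The diffDevices helper (my own name, not a browser API) is kept pure so the plug-in/plug-out detection logic can be tested on its own.

```javascript
// Pure helper: compare two device lists by deviceId and report
// what was plugged in and what was unplugged.
function diffDevices(oldList, newList) {
  const oldIds = new Set(oldList.map(d => d.deviceId));
  const newIds = new Set(newList.map(d => d.deviceId));
  return {
    added: newList.filter(d => !oldIds.has(d.deviceId)),
    removed: oldList.filter(d => !newIds.has(d.deviceId)),
  };
}

// Browser-only wiring: devicechange when supported, polling otherwise.
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  let known = [];
  const check = async () => {
    const devices = await navigator.mediaDevices.enumerateDevices();
    const mics = devices.filter(d => d.kind === 'audioinput');
    const { added, removed } = diffDevices(known, mics);
    known = mics;
    if (added.length) console.log('microphone plugged in:', added[0].label);
    if (removed.length) console.log('microphone unplugged');
  };
  if ('ondevicechange' in navigator.mediaDevices) {
    navigator.mediaDevices.addEventListener('devicechange', check);
  } else {
    setInterval(check, 2000); // polling fallback; this has a performance cost
  }
  check();
}
```

Note that labels from enumerateDevices() are empty until the user has granted microphone permission, so run this after a successful getUserMedia() call.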

Related

How does an OS know which visual data from a browser is okay to share with a screen capture application, and which should be blacked out?

Data passed to my physical monitors is separate from the data passed to screen captures. I am sure of this, because if I run a screen capture while playing a movie in Chrome (but not Firefox!), most popular services will just show a black screen. This implies that the visual data passed from a browser to the desktop is kept separate from the data going from the desktop to a screen capture application. But how? What are they doing to keep these separate? How does my OS know which data from the browser is fine to show to the screen capture program and which is not?
Another example of this phenomenon is when certain less-than-ethical streamers use video game cheats that show hidden player locations on their monitor, but not to the audience of their live-stream.
When you play video using DRM and encryption, the decrypted video isn't capturable via the usual methods. In fact, if done correctly, the video stays encrypted all the way to the monitor via HDCP. In reality, the whole stack of components used for DRM is unreliable, so it's more common that you'll just get lower-quality video if your system cannot guarantee encryption.
Some resources:
DRM and HDCP
Encrypted Media Extensions API

How to make HTML <video> tag show black during screen capture? [duplicate]

I noticed that netflix employs a method of preventing users from recording or even taking still screenshot images of the video playback in their browser-based app.
If you are watching a video on netflix (in my case Windows 10 and Chrome) the video will turn to a black screen if you begin to record or screenshot.
What technology is at play here. Is there a windows/chrome API for telling content on the screen to hide if an attempted screenshotting is detected?
Is it possible for a web developer to add this feature to their products?
Most streaming media services now make use of EME (Encrypted Media Extensions, https://en.wikipedia.org/wiki/Encrypted_Media_Extensions). The media players built by these services use EME to invoke the underlying DRM (Digital Rights Management) system:
WebBrowser -> HTML5/Javascript -> EME -> DRM
And yes, of course you can build your own solution using EME.
To add to the other answers and comments - the screen capture prevention mechanisms are actually dependent on the DRM security level and the device capabilities and so may be different on different machines.
Browsers using a software-based DRM solution that is not linked into the device's secure media path will actually allow screen capture.
Browsers using a hardware-based DRM, or a software-based one that is linked into the device's secure media path, will prevent screen capture.
Typically streaming services restrict their high resolution content, e.g. 4K, and sometimes even their high value content, e.g. live sports, to devices which support a secure media path. You can see this with popular streaming services where you may be able to stream a video in 4K on one browser/device combination but only in 720p on another, even on the same device.
Content security is an ever changing domain so you may find a particular browser and device combination supports different security levels over time.
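You can probe these security levels from JavaScript via EME. Below is a sketch, assuming the Widevine key system ("com.widevine.alpha") and its documented robustness strings; other key systems (PlayReady, FairPlay) use different identifiers. The buildConfig helper is my own pure function, not a browser API.

```javascript
// Pure helper: build a MediaKeySystemConfiguration list asking for a
// given robustness level for H.264 video in an MP4 container.
function buildConfig(robustness) {
  return [{
    initDataTypes: ['cenc'],
    videoCapabilities: [{
      contentType: 'video/mp4; codecs="avc1.42E01E"',
      robustness: robustness,
    }],
  }];
}

// Browser-only: walk Widevine robustness levels from strongest to
// weakest and return the first one the device grants.
// HW_SECURE_ALL roughly corresponds to a full secure media path;
// SW_SECURE_CRYPTO is the weakest, software-only level.
async function probeWidevine() {
  const levels = ['HW_SECURE_ALL', 'HW_SECURE_DECODE', 'HW_SECURE_CRYPTO',
                  'SW_SECURE_DECODE', 'SW_SECURE_CRYPTO'];
  for (const level of levels) {
    try {
      await navigator.requestMediaKeySystemAccess('com.widevine.alpha',
                                                  buildConfig(level));
      return level; // highest supported robustness
    } catch (e) {
      // not supported at this level, try the next one
    }
  }
  return null; // Widevine not available at all
}
```

A service can use the result to decide which resolution tier to serve, which matches the 4K-on-one-browser, 720p-on-another behaviour described above.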
Hardware acceleration: you need to disable it.
On macOS, you can take screenshots of Netflix after going to Chrome > Settings > System and disabling hardware acceleration.

How to do cross browser/device Audio capture

I would like to clarify certain things with what I found, and raise certain questions about things that I don't know:
Capturing the camera/mic through the browser can be done with getUserMedia().
Is there anything for iOS devices? getUserMedia() doesn't seem to work on them.
How can I capture the actual audio from a web browser application? (E.g., if I play an audio file and seek forward 2 minutes, I would like to capture the actual audio stream from the HTML5 player so that I hold the actual audio data.)
You need to use Flash if you are not going to support mobile devices. One good solution is to use wami-recorder.
From their website:
The Problem
As of this writing, most browsers still do not support WebRTC's getUserMedia(), which promises to give web developers microphone access via Javascript. This project achieves the next best thing for browsers that support Flash. Using the WAMI recorder, you can collect audio on your server without installing any proprietary media server software.
The Solution
The WAMI recorder uses a light-weight Flash app to ship audio from client to server via a standard HTTP POST. Apart from the security settings to allow microphone access, the entire interface can be constructed in HTML and Javascript.
Hope this helps.
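For browsers that do support getUserMedia() (iOS Safari eventually gained it in iOS 11), the same ship-audio-to-the-server-via-HTTP-POST idea can be done without Flash, using getUserMedia and MediaRecorder. A sketch, where the '/upload' endpoint is a hypothetical example and mimeTypeFor is my own helper for picking a container the browser can produce:

```javascript
// Pure helper: given a predicate like MediaRecorder.isTypeSupported,
// return the first audio MIME type the browser can record to.
function mimeTypeFor(supported) {
  const candidates = ['audio/webm;codecs=opus', 'audio/webm', 'audio/mp4'];
  return candidates.find(t => supported(t)) || '';
}

// Browser-only: capture the microphone for a few seconds, then POST
// the recording to the server as a single blob.
async function recordAndUpload(seconds) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const mimeType = mimeTypeFor(t => MediaRecorder.isTypeSupported(t));
  const recorder = new MediaRecorder(stream, mimeType ? { mimeType } : {});
  const chunks = [];
  recorder.ondataavailable = e => chunks.push(e.data);
  recorder.start();
  await new Promise(resolve => setTimeout(resolve, seconds * 1000));
  recorder.stop();
  await new Promise(resolve => (recorder.onstop = resolve));
  const blob = new Blob(chunks, { type: recorder.mimeType });
  await fetch('/upload', { method: 'POST', body: blob }); // hypothetical endpoint
  stream.getTracks().forEach(t => t.stop()); // release the microphone
}
```

This doesn't answer the third question (tapping the decoded output of an HTML5 player); for that, the Web Audio API's MediaElementAudioSourceNode is the usual route.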

Detect HTML5 Media Capture API Support

Is there a way to detect whether a browser supports the HTML5 Media Capture API for a mobile website I'm building? I can only seem to find solutions for detecting getUserMedia() support.
I would like to be able to present mobile users one of two scenarios:
User's browser supports the API, so two upload buttons are displayed, one activating the camera and one activating the image gallery.
User's browser doesn't support the API, so just one upload button is displayed, hopefully activating the gallery if their browser supports the accept parameter.
User's browser supports the API, so two upload buttons are displayed, one activating the camera and one activating the image gallery.
There's no way (at the moment) to create 2 separate HTML buttons, one for (just) the library and one for (just) the camera. (I've covered all the possible HTML Media Capture options in this article.)
Use <input type="file" accept="image/*"> and you'll be prompted to choose between capturing a photo or selecting an existing one.
User's browser doesn't support the API, so just one upload button is displayed, hopefully activating the gallery if their browser supports the accept parameter
Support is as follows:
Android 2.2+ and iOS6.0+ support the above code
Android 3.0+ supports capture and takes the user straight to the camera
iOS6 through 10 do NOT support capture (prompt is always shown)
desktop browsers do NOT support HTML Media Capture
Detecting support comes down to detecting the above browsers.
Support reference: this 2013 O'Reilly book and my testing
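Besides user-agent sniffing, a common feature-detection idiom is to check whether the capture IDL attribute exists on a freshly created input element. A sketch; hasCaptureSupport is my own name, written against any element-like object so the logic can be exercised outside a browser, and note this idiom has historically been unreliable on some browsers:

```javascript
// Pure check: supporting browsers reflect the `capture` content
// attribute as a property on <input> elements.
function hasCaptureSupport(inputEl) {
  return 'capture' in inputEl;
}

// Browser-only usage: pass a real <input> element.
if (typeof document !== 'undefined') {
  const supported = hasCaptureSupport(document.createElement('input'));
  console.log('HTML Media Capture supported:', supported);
}
```

You would then render two buttons when this returns true and a single gallery-backed file input otherwise.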
You can use Modernizr; on its docs page, the table of supported and detected features shows that it detects both HTML Media Capture and getUserMedia.

jQuery Mobile video website, convert to PhoneGap app with videos on SD card instead of streamed

I have a jQuery Mobile website I created for a friend/client of mine. It only has 6 pages or so (2 of them are dialog windows). The site has HTML5 video with fallback for flash support via the videojs library. All videos are encoded properly in mp4, ogv (theora), and webm and so far play on every device I have used.
My problem lies in bandwidth: the project really needs to be an application, because these are informational videos that may need to be viewed at any time, even with no web access (web access is required only for the first login, to verify credentials).
That left me with three options: write native apps for all the platforms myself in their native languages; use Sencha Touch (I am comfortable enough with ExtJS to do so); or take my existing jQuery Mobile app, which is 100% functional, including log-in and some backend package management to assign users a package of videos (there are multiple packages, each with between 8 and 20 videos), and follow the jQuery Mobile tutorial for getting an app ready for PhoneGap. I believe that only involves enabling two settings, both of which allow "cross-domain" requests, since my current web app would be running as localhost and would see my scripts as external pages.
My main question/problem is, for one, that I have never used PhoneGap aside from their Hello World Android tutorial, and I know there are other all-in-one frameworks out there now: PhoneGap, Titanium, Corona, Adobe Flex (which I am installing while writing this, to see what it has to offer, for example whether it can automatically encode videos for the target device (changing video resolution), or whether it has any local video playback features at all that might work).
Does anyone know which of the current frameworks can install a set of videos to the SD card (totaling around 6 MB per install) and play them natively (by that I mean in the device's native player, not inline inside a webview)? On Android phones, at least, my current videojs-based player plays the files natively on everything I have tried it on.
I just need a push in the right direction. If there is a PhoneGap plugin that I don't know about that allows videos to be played from the SD card, that would be terrific. I am not very happy with the speed of the Android and BlackBerry webview controls, so something that uses 100% native controls would be great. I hope you can come up with some ideas; you can see the current app in action at m.yourvideobenefits.com (email: abc#tool.com, password: demo).
You should view it from your phone to see it properly, but if you do not have a smartphone, keep in mind that when viewing this page in certain desktop browsers, the videos become their actual size only after they finish loading. This is because I have autoload="true" in the video tag (which is ignored on most phones). Believe it or not, setting autoload="true" is what actually kept the videos from playing inline on certain devices. A bug on the device, I am sure, but without this attribute the videos played inline on the iPhone 4 with the latest iOS version.
You could do it very easily with PhoneGap; you already have your web page, so it would probably be much less work.
You could ship the videos inside your app bundle on iOS, and then it wouldn't be hard to select the one with the best resolution for the device being used. You could also download the videos in the ideal format and resolution the first time your app runs, fetching them from your server using the File API. Combined with the Storage API, that is also convenient for updates.
There's a plugin I use for Android, because the video tag is sometimes buggy or doesn't work at all in older versions: https://github.com/phonegap/phonegap-plugins/tree/master/Android/VideoPlayer.
It only plays from the web or the SD card, but that's rarely a problem.
I can't help you with BlackBerry, but I'm pretty sure there must be a way of doing it. In any case, Appcelerator doesn't support it yet, so you would probably have to do it natively. Even if there isn't a plugin for BlackBerry, you'd probably have to choose between native development and an HTML5 player inside PhoneGap. I won't give you my opinion about that here, for I'm not the one to give it, and Stack Overflow says I shouldn't give it anyway.
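To make the download-then-play-locally idea concrete, here is a hypothetical sketch. The local path, the remote URL, and the exact plugin call shape (window.plugins.videoPlayer.play, following the Android VideoPlayer plugin linked above) are all assumptions; check the plugin's README for the real signature. The pickVideoSource helper is pure so the fallback logic is testable:

```javascript
// Pure helper: prefer the locally downloaded copy when it exists,
// otherwise fall back to streaming from the server. The existence
// check is injected so this stays testable outside the device.
function pickVideoSource(localPath, remoteUrl, fileExists) {
  return fileExists(localPath) ? localPath : remoteUrl;
}

// Device-only usage (hypothetical plugin API and paths):
if (typeof window !== 'undefined' && window.plugins && window.plugins.videoPlayer) {
  const src = pickVideoSource(
    'file:///sdcard/myapp/intro.mp4',           // hypothetical SD-card path
    'http://m.yourvideobenefits.com/intro.mp4', // hypothetical remote URL
    path => false // replace with a real PhoneGap File API existence check
  );
  window.plugins.videoPlayer.play(src); // launches the native Android player
}
```

The plugin launches the device's native player activity, which sidesteps the slow webview video controls mentioned in the question.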