webRTC: How to get external microphones to work? - google-chrome

In a working webRTC app (voice only) I came across a weird bug: when prompted to select the audio input via getUserMedia(), it seems that no microphone other than the built-in one will work.
Although the selection produces no immediate errors, no signal is transferred once a webRTC connection is established - the line stays silent. If I select the internal microphone, everything works as expected.
I tested this with Chrome and Firefox to no avail.
Does anybody have more information on this behavior?
EDIT, September 13th
More info on the test setup: Chrome 45, with experimental features on. Chrome will list the external audio sources via navigator.mediaDevices.enumerateDevices, but there is no sound at all when anything other than the internal mic is chosen from the gUM input select.
The question: is there ANYBODY who has managed to get an external mic to work with webRTC?
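For reference, the device selection follows the usual pattern - a minimal sketch, not the exact app code (the selection logic is illustrative, and older Chrome builds used a "sourceId" optional constraint instead of deviceId):

    // Minimal sketch: list the audio inputs and open the selected one via the
    // deviceId constraint. The selection logic here is only illustrative.
    navigator.mediaDevices.enumerateDevices().then(function (devices) {
      var inputs = devices.filter(function (d) { return d.kind === 'audioinput'; });
      var selectedId = inputs[0] && inputs[0].deviceId; // would come from the input <select>
      return navigator.mediaDevices.getUserMedia({
        audio: { deviceId: { exact: selectedId } },
        video: false
      });
    }).then(function (stream) {
      // the stream is then added to the RTCPeerConnection as usual
      console.log('audio tracks:', stream.getAudioTracks().length);
    }).catch(function (err) {
      console.error('getUserMedia failed:', err);
    });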

Finally, I found the solution.
The reason why no sound was picked up is rather simple: if you use a microphone connected to your computer through an audio interface, webRTC expects it to be attached to input channel 1 or 2.
I have not found a way to tell my webRTC app to choose a different input channel, so the mic simply has to be on channel 1 or 2.
BTW: The same is true for Skype. Any mic connected through an audio interface needs to be plugged into channel 1 – otherwise it will not be recognized, as Skype seems to use channel 1 as the default, too.

Related

How do you change source in a web audio context

I'm making a game that changes some of its objects depending on what music is playing. After each song has ended I want my audio context to load a new source and analyze it. However, whenever I tried to do that I got the error that an audio object or buffer can't be called twice.
After some research I learned that ctx.createMediaElementSource(MyHTML5AudioEl) lets you create a source node that takes its source from an HTML5 audio element. With this I was able to loop through different songs.
However, for my game I need to play/analyze a 30-second "remote url" that comes out of the Spotify API. I might be wrong, but ctx.createMediaElementSource(MyHTML5AudioEl) does not seem to allow you to analyze a source hosted on a remote site.
Also, the game needs to work on Mobile Chrome, where createMediaElementSource(MyHTML5AudioEl) does not seem to work.
I might be on the completely wrong path here, but my main question is:
How can I switch remote song URLs in the Web Audio API, while staying compatible with mobile Chrome?
Thanks!
First, as you found out, you can't set the buffer again for an AudioBufferSource. The solution is to create a new one. AudioBufferSources are intended to be light-weight objects that you can easily create and use.
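For the first point, a minimal sketch (names are illustrative) of reusing one decoded buffer with a fresh source node per playback:

    // Sketch: create a fresh AudioBufferSourceNode for every playback,
    // reusing the same decoded AudioBuffer and analyser.
    var ctx = new (window.AudioContext || window.webkitAudioContext)();
    var analyser = ctx.createAnalyser();
    analyser.connect(ctx.destination);

    function playBuffer(buffer) {
      var source = ctx.createBufferSource(); // new node each time
      source.buffer = buffer;                // a node's buffer can only be set once
      source.connect(analyser);
      source.start(0);
      return source;                         // keep a reference if you need stop()
    }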
Second, in Chrome 42 and newer, createMediaElementSource requires proper CORS access, so you have to make sure the remote URL sends the right headers and that you set the crossOrigin attribute on the audio element.
Finally, Mobile Chrome currently does not pass the data from an audio element through createMediaElementSource.
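For the element-based approach, a rough sketch (previewUrl is a placeholder for the 30-second Spotify preview URL and assumes its host sends CORS headers; per the last point, this path won't deliver data on Mobile Chrome):

    // Sketch: analysing a remote track through an <audio> element. The host
    // behind previewUrl must send Access-Control-Allow-Origin headers.
    var previewUrl = 'https://example.com/preview.mp3'; // placeholder URL
    var ctx = new (window.AudioContext || window.webkitAudioContext)();
    var analyser = ctx.createAnalyser();

    var audioEl = new Audio();
    audioEl.crossOrigin = 'anonymous';        // needed in Chrome 42+ for CORS sources
    audioEl.src = previewUrl;

    var sourceNode = ctx.createMediaElementSource(audioEl);
    sourceNode.connect(analyser);
    analyser.connect(ctx.destination);
    audioEl.play();

    // Switching songs later only needs a new src; the node stays connected:
    // audioEl.src = nextPreviewUrl; audioEl.play();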

Detect whether or not the camera is in use?

The docs say:
If getCamera() returns null, either the camera is in use by another application, or there are no cameras installed on the system. To determine whether any cameras are installed, use the names.length property.
So to detect that the camera is taken, I should be able to check if (Camera.getCamera() == null && Camera.names.length > 0), right?
I can duplicate having my webcam "taken" by another application by opening Webcam Toy in IE, and then trying to debug my application in Chrome, but Camera.getCamera() still returns a camera object, even when I can't see the feed from my webcam.
If I turn off the IE application and restart my app in Chrome, I can see the feed again.
Is the documentation wrong, or am I wrong?
Hopefully this will help: I wrote a blog post a while back about detecting multiple cameras and setting the default camera in AS3.
Blog post:
http://www.charlesclements.net/blog/flash-as3-setting-default-camera-part-2/
Source files:
http://www.charlesclements.net/blog/swfs/camera_detection/UseDefaultCameras_201203141746.zip

HTTP Live Streaming reader

I have a server providing a live streaming channel over a network via HTTP and UDP (http://..., udp://...), and I've been trying to find a way to display the stream to users in their browser (on at least Windows 7). I have tried setting a video tag's src to "http:// ip of server - ip of channel...", but I get "video format or MIME type is not supported".
So I tried saving this content for one minute and then playing it back as a normal video, and it played normally. So I think my problem is finding a way to display this content in the browser.
Can you please help me find a solution, or the steps and tools required for this?
Best regards,

HTML5 audio recording with getUserMedia and recorder.js produces an empty WAV file

I'm recording audio with the HTML5 getUserMedia function. My code is similar to the example in https://github.com/rokgregoric/html5record/archive/master.zip, and the server receives a correctly formed WAV file. However, all the sample data received are 0.
What could the issue be? I'm using Chrome 23.0.1271.95 on Windows 7.
I've found a similar issue described here: http://www.smartjava.org/content/record-audio-using-webrtc-chrome-and-speech-recognition-websockets# but it doesn't help in my case.
By the way, examples based on record.js are not working for me either. The recording seems to go fine, but during playback there is only silence, the same as for my server-side recording.
Pretty sure you need to be running Chrome Canary for getUserMedia. You'll also need to go to chrome://flags and make sure Web Audio Input is enabled.
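For reference, this is roughly the capture path recorder-style scripts rely on (a sketch using the prefixed APIs of that Chrome generation); with the Web Audio Input flag off, the processing callback just sees zeros, which matches the silent WAV described above:

    // Sketch of the capture path (prefixed APIs, Chrome of that era).
    navigator.webkitGetUserMedia({ audio: true }, function (stream) {
      var ctx = new webkitAudioContext();
      var source = ctx.createMediaStreamSource(stream);
      var processor = ctx.createScriptProcessor ?
          ctx.createScriptProcessor(4096, 1, 1) :
          ctx.createJavaScriptNode(4096, 1, 1);  // older name of the same node
      processor.onaudioprocess = function (e) {
        var samples = e.inputBuffer.getChannelData(0);
        // recorder-style code buffers these samples into the WAV;
        // with the flag disabled they are all zeros
      };
      source.connect(processor);
      processor.connect(ctx.destination);
    }, function (err) {
      console.error('getUserMedia error:', err);
    });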

Returning OS and browser with flash version of uploadify using $_SERVER['HTTP_USER_AGENT']

I am using $_SERVER['HTTP_USER_AGENT'] to return the user's OS and browser when uploads are successfully completed using uploadifive. I have to use a Flash fallback for browsers that don't yet support HTML5, i.e. the script falls back to uploadify. After successful uploads I have a script which submits data to a MySQL database, including the OS and browser. This works fine with uploadifive, but with uploadify 'Adobe Flash Player 11' is returned. I am wondering if there is a workaround to return the OS and browser when using Flash?
Thanks
Nick
Yes and no. If the user is interested in not providing this information to you, you will not get it. But if they don't care / would like to provide it, then you can get it through the Capabilities.os property (http://help.adobe.com/en_US/FlashPlatform/beta/reference/actionscript/3/#!flash/system/Capabilities.html). You can also query JavaScript through ExternalInterface for navigator.
However, in general, Flash may send requests all by itself, i.e. it acts as the user agent, without needing another user agent.
These headers are generated by client-side software, which inherently cannot be trusted. Nothing can prevent the user from sending whatever information they like with the request. So, by handling the particular case of Flash not sending what you expect, you aren't fixing the problem; you should be ready for just any random information in that field, or maybe none at all.
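If you control the page, one pragmatic workaround is to capture the browser's own navigator values in JavaScript and submit them alongside the upload instead of relying on the header Flash sends. A sketch (the endpoint and field names are made-up placeholders, not part of uploadify's API):

    // Sketch: send the browser-reported UA/OS with the upload metadata so the
    // server-side script does not depend on the Flash-generated User-Agent.
    function reportUpload(fileName) {
      var form = new FormData();
      form.append('file', fileName);
      form.append('ua', navigator.userAgent);
      form.append('os', navigator.platform);

      var xhr = new XMLHttpRequest();
      xhr.open('POST', 'save_upload_meta.php'); // placeholder endpoint
      xhr.send(form);
    }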