HTML5 audio recording: getUserMedia with recorder.js produces an empty WAV file - html

I'm recording audio with the HTML5 getUserMedia function. My code is similar to the example in https://github.com/rokgregoric/html5record/archive/master.zip, and the server receives well-formed WAV data. However, all of the sample data received are zeros.
What could the issue be? I'm testing with Chrome 23.0.1271.95 on Windows 7.
I've found a similar issue described here: http://www.smartjava.org/content/record-audio-using-webrtc-chrome-and-speech-recognition-websockets# but it doesn't help in my case.
By the way, examples based on recorder.js aren't working for me either. Recording appears to go fine, but playback is pure silence, the same as with my server-side recording.

Pretty sure you need to be running Chrome Canary for getUserMedia. You'll also need to go to chrome://flags and make sure Web Audio Input is enabled.
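If capture itself is the suspect, a quick sanity check along these lines will show whether any non-zero samples are actually arriving before you blame the WAV encoder (a minimal sketch using the prefixed APIs of that era; note that very old Chrome builds expose createJavaScriptNode instead of createScriptProcessor):

var getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia;
var AudioCtx = window.AudioContext || window.webkitAudioContext;

getUserMedia.call(navigator, { audio: true }, function (stream) {
  var ctx = new AudioCtx();
  var source = ctx.createMediaStreamSource(stream);
  var tap = ctx.createScriptProcessor(4096, 1, 1); // buffer size, input/output channels

  tap.onaudioprocess = function (e) {
    var samples = e.inputBuffer.getChannelData(0);
    var silent = true;
    for (var i = 0; i < samples.length; i++) {
      if (samples[i] !== 0) { silent = false; break; }
    }
    console.log(silent ? 'all zeros so far' : 'non-zero audio data received');
  };

  source.connect(tap);
  tap.connect(ctx.destination); // the node must be connected for onaudioprocess to fire
}, function (err) {
  console.error('getUserMedia failed:', err);
});

If this logs non-zero data but the uploaded WAV is still silent, the problem is in the encoding/upload path rather than in capture.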

Related

webRTC: How to get external microphones to work?

In a working webRTC app (voice only) I came across a weird bug: when selecting the audio input via getUserMedia(), it seems that any microphone other than the built-in one will not work.
Although the selection produces no immediate errors, no signal is transferred once a webRTC connection is established - the line stays silent. If I select the internal microphone, everything works as expected.
I tested this with Chrome and Firefox, to no avail.
Does anybody have more information on this behavior?
EDIT, September 13th
More info on the test setup: Chrome 45, with experimental features on. Chrome will list the external audio sources via navigator.mediaDevices.enumerateDevices, but choosing anything other than the internal mic from the gUM input select results in no sound at all.
The question: has ANYBODY managed to get an external mic to work with webRTC?
Finally, I found the solution.
The reason no sound was picked up is rather simple: webRTC expects the microphone to be attached to input channel 1 or 2 if the microphone is connected to your computer through an audio interface.
I have not found a way to tell my webRTC app to choose a different input channel, so the mic simply has to be on channel 1 or 2.
BTW: the same is true for Skype. Any mic connected through an audio interface needs to be plugged into channel 1 - otherwise it will not be recognized, as Skype seems to use channel 1 as the default, too.
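For completeness, device (as opposed to channel) selection is done through deviceId constraints - a sketch, where the chosen id comes from the enumeration step:

navigator.mediaDevices.enumerateDevices().then(function (devices) {
  devices
    .filter(function (d) { return d.kind === 'audioinput'; })
    .forEach(function (d) { console.log(d.label, d.deviceId); });
});

function startWithDevice(deviceId) { // deviceId: an id picked from the list above
  return navigator.mediaDevices.getUserMedia({
    audio: { deviceId: { exact: deviceId } }
  }).then(function (stream) {
    // hand the stream to your RTCPeerConnection as usual
    return stream;
  });
}

Note that this only picks a device; there is no constraint for selecting input channel 3+ on a multi-channel interface, which is why routing the mic to channel 1 or 2 was the only fix.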

How do you change the source in a Web Audio context

I'm making a game that changes some of its objects depending on what music is playing. After each song has ended I want my audio context to load in a new source and analyze that. However, whenever I tried to do that I got an error saying that an audio object or buffer can't be called twice.
After some research I learned that ctx.createMediaElementSource(MyHTML5AudioEl) lets you create a source node that takes its audio from an HTML5 audio element. With this I was able to loop through different songs.
However, for my game I need to play/analyze a 30-second "remote url" that comes out of the Spotify API. I might be wrong, but ctx.createMediaElementSource(MyHTML5AudioEl) does not seem to allow you to analyze a source that lives on a remote site.
Also, the game needs to work in mobile Chrome, where createMediaElementSource(MyHTML5AudioEl) does not seem to work at all.
I might be on the completely wrong path here, but my main question is:
How can I switch between remote song URLs in the Web Audio API, while staying compatible with mobile Chrome?
Thanks!
First, as you found out, you can't set the buffer again on an AudioBufferSourceNode. The solution is to create a new one: AudioBufferSourceNodes are intended to be lightweight objects that you can easily create and use.
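A sketch of that pattern - decode once, then spin up a fresh source node for each playback (the analyser wiring here is illustrative):

function playBuffer(ctx, audioBuffer, analyser) {
  var source = ctx.createBufferSource(); // a brand-new node every time
  source.buffer = audioBuffer;           // setting .buffer once per node is fine
  source.connect(analyser);
  analyser.connect(ctx.destination);
  source.start(0);
  return source;
}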
Second, in Chrome 42 and newer, createMediaElementSource requires appropriate CORS access, so you have to make sure the remote URL sends the appropriate headers and that you set the crossOrigin attribute on the element.
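Roughly like this - a sketch that assumes the remote host serves Access-Control-Allow-Origin headers (the URL is a placeholder for the 30-second preview URL from the Spotify API):

var previewUrl = 'https://example.com/preview.mp3'; // placeholder
var audioEl = new Audio();
audioEl.crossOrigin = 'anonymous';                  // set before assigning src
audioEl.src = previewUrl;
var ctx = new (window.AudioContext || window.webkitAudioContext)();
var sourceNode = ctx.createMediaElementSource(audioEl);
var analyser = ctx.createAnalyser();
sourceNode.connect(analyser);
analyser.connect(ctx.destination);
audioEl.play();

To switch songs, just assign a new URL to audioEl.src; the media element source node can stay connected.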
Finally, Mobile Chrome currently does not pass the data from an audio element through createMediaElementSource.

Failed to Load Resource, Plugin Handled Load on iOS

Every time I try to view a video file on my server I get this error on iOS, in both Safari and Chrome.
I am serving through a blob server behind an Apache server, so I am not sure where the problem lies. When I use Apache alone, I still get this error, but the video renders anyway.
However, when I serve the video through the blob server it does not work at all. Does anyone know why this is? The videos work fine on other devices and browsers, and also work fine on iOS if accessed through Apache only.
The solution to this problem was just a workaround. The reason is that blob servers aren't streaming servers. iOS devices expect videos to arrive in small chunks (byte ranges), which a streaming server can deliver. A blob server, however, just hands over the video as one blob, which is not what the iOS device expects. Some browsers are smart enough to handle this, but others are not.
The way I solved this was to move the video files out of the blob server into a folder within the project and serve them through the Apache server instead. I hope this helps.
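A quick way to check whether a given server honors byte-range requests (the URL is a placeholder):

curl -I -H "Range: bytes=0-1" https://example.com/video.mp4

A streaming-capable server answers with HTTP/1.1 206 Partial Content and a Content-Range header; a plain 200 OK with the full body is the "one big blob" behavior that iOS refuses to play.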
I was also getting this error for some mp4 videos. It turns out it wasn't a server issue for me; it was a video encoding issue.
Issue
A "moov atom" needs to be placed at the front of the video file. It serves as a table-of-contents for the video. That "moov atom" has to be read first for html streaming or it won't play on some devices.
The Fix
To fix it, I used HandBrake to transcode my video. Turn on 'Web Optimized'. Turning on 'zerolatency' and 'Fast Decode' may also help (found in the video tab).
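If you'd rather stay on the command line, ffmpeg can move the moov atom to the front without re-encoding (file names are placeholders):

ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4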
We were getting a similar error here. I thought it might have been the streaming issue, since our video was hosted in blob storage on Azure, but after setting up a Media Service for streaming, the video still didn't work. It turned out the cause of the bug for us was Safari going through a Service Worker. Below is some further explanation of what we found:
Safari first sends a byte-range request for a video tag and expects a 206 response. However, if you use a Service Worker, the response comes back as a 200, and it appears Safari doesn't know how to handle this. Our solution was to stop using a Service Worker for Safari.
We found this by using the network tab of the Safari debugger on a MacBook to troubleshoot the issue we were seeing on the iPad: by default the request gets the expected 206 response, whereas the same request through a Service Worker comes back as a 200.
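A sketch of that kind of exclusion inside a standard fetch handler - range requests fall through to the network so Safari gets its native 206:

self.addEventListener('fetch', function (event) {
  if (event.request.headers.has('range')) {
    return; // no respondWith(): the browser talks to the network directly
  }
  event.respondWith(
    caches.match(event.request).then(function (cached) {
      return cached || fetch(event.request);
    })
  );
});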
Add the following line of code to your .htaccess (located in the root of your WordPress installation):
SetEnvIfNoCase Request_URI \.(?:mp4)$ no-gzip dont-vary
Reference: https://clickshepherd.com/blog/solved-elegant-themes-divi-and-cloudflare-mp4-media-error-formats-not-supported-or-sources-not-found/
In our case, we created a URL pattern for our blob assets and then set headers in that URL pattern definition page which sent back a mime type of 'video/mp4'. This should instruct the browser to treat the binary stream as chunked, which in turn meant we didn't need to download the whole thing before it started playing.
Google Cloud Platform Solution
This issue caused me a lot of headache, so I just wanted to add my specific solution here, if anyone else encounters this while deploying to Google Cloud Platform.
When trying to load MP4 videos in Safari, I was getting the same error:
"Failed to Load Resource, Plugin Handled Load"
which was preventing the videos from playing.
Still, I wanted to try to keep everything inside Google Cloud, so I created a Storage Bucket for the site, and added the videos there.
Of course, trying to retrieve the videos from the storage URL from the main site resulted in a CORS error.
Fortunately, you can configure CORS pretty easily on storage buckets:
Configuring cross-origin resource sharing (CORS)
Once that configuration was deployed, I was able to retrieve and load the videos on the site in Safari without the "plugin handled load" error.
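For reference, a minimal bucket CORS setup looks roughly like this (the origin and bucket name are placeholders):

cors.json:
[
  {
    "origin": ["https://example.com"],
    "method": ["GET", "HEAD"],
    "responseHeader": ["Content-Type"],
    "maxAgeSeconds": 3600
  }
]

Applied with: gsutil cors set cors.json gs://my-video-bucket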
I saw the error "Failed to Load Resource" and thought that this was the reason my videos were not playing.
It turned out my videos were missing the hvc1 tag; once I added it, they played fine.
In my case the issue was with H.265 HEVC videos, but in your case some other encoding/tagging issue could be the reason.
For me, the issue was fixed with ffmpeg:
ffmpeg -i input.mp4 -tag:v hvc1 -acodec copy -c:v copy -movflags faststart out.mp4

Record Video from Browser using Webcam and Microfone inputs

I need to record a video through the user's browser, using input from the camera and microphone, and send it to my server. Since HTML5 still doesn't make that magic happen, I'm looking for Flash solutions.
Do I really need a Flash media server to do that, or can I do a POST request?
I want to take both inputs (webcam and microphone), put them in an .flv, and send it to my server.
I've seen some implementations that use byte arrays to record and send audio and video separately. The problem is that this creates a series of synchronization problems when you try to compose them into a single file.
If you're still looking for a solution check out:
http://framebase.io
They have an embeddable recording widget that can transcode the videos automatically. Check out the docs; on success, you can run an API call to check the transcoding status and either download the result to your server or just use your own S3 bucket.

Chrome won't play mp3 files?

There's something very weird on my server - Chrome won't play mp3 files hosted on it.
For example, when Chrome is pointed at an mp3 file on the first server: http://tinyurl.com/czqfw5a - it won't play. When I place the same file on my second server: http://tinyurl.com/cju4yg4 - it works fine.
I checked the HTTP response headers; on both servers they look fine - the MIME type is set correctly.
The problem happens only in Chrome. FF/IE work fine.
Anyone have an idea?
Short story: it's this bug: http://code.google.com/p/chromium/issues/detail?id=110309
The long story lies in the way it works: Chrome asks for the MP3 file, cancels that request (because it doesn't want to download the whole file), then sends out another request asking for streaming.
I tested a few times with Wireshark. On one occasion the server didn't close the connection and kept on sending packets, never actually responding to the second request. On other occasions I even got an HTTP/1.1 304 Not Modified.
I had to post here because I had the same problem. I realized that in Chrome, if you have the same mp3 file being served in two different tabs, the second tab will not load the mp3 file. It took me a while to realize that I had two tabs open; when I closed one, the other loaded fine. I hope this helps some people out there who made the same stupid mistake as me lol
Found a solution:
I downgraded Apache to 2.2.17 - and all of a sudden things started to work.
No idea why, but presumably something in the newer version of Apache triggers the mp3-playing bug in Chrome.
Methinks it is a Flash problem, at least on Chrome version 32.0.1700.77, which I believe is still 32-bit, while the latest [required] Flash plugin is 64-bit.