Listen to a live audio stream - HTML

I'm currently streaming my microphone from C++ over a standard socket. Now I want to listen to this stream from the web.
I have tried accessing the stream from the audio tag directly, but this didn't work.
I also tried setting the Content-Type to audio/wav, but that didn't work either.
I have also tried writing to a file and accessing it directly from the audio element, but it wasn't picking up the newly written parts. It also started playback from the beginning of the audio, and I read that this approach has high latency.
Is there a way to stream audio with very low latency?
I have thought about making a GET request every 50 ms and appending the new data to the audio, or recording 50 ms clips and changing the src of the audio element at the end of each clip.

Streaming audio buffers through a WebSocket and then trying to play it with an <audio> element is not the way to go.
The way to stream audio to your browser with low latency is to use WebRTC.
WebRTC is built in the browser to enable peer-to-peer real-time communications.
That said, you can use a WebRTC implementation in C++ such as libwebrtc or GStreamer to stream your mic input, and then use the native JavaScript API in your browser to receive the audio stream and place it in the srcObject property of an <audio> element to listen in the browser.
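A minimal browser-side sketch of that receiving step, assuming signaling (SDP offer/answer and ICE exchange with the C++ sender) is handled elsewhere and that an <audio> element already exists in the page:

// Receive the remote audio track over WebRTC and play it through an <audio> element.
// The signaling channel and the sender setup are assumed to exist elsewhere.
const pc = new RTCPeerConnection();

pc.ontrack = (event) => {
    const audio = document.querySelector('audio'); // assumed existing <audio> element
    audio.srcObject = event.streams[0];            // attach the live MediaStream
    audio.play();
};

// ... exchange SDP offers/answers and ICE candidates with the C++ side here ...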

How to use flash player to encode and push a live stream to media server?

I have a media server (Wowza) that I am pushing a live RTMP stream to, and I am playing it back in the browser (over several protocols, HLS/RTMP) with Flowplayer.
Currently I am encoding the stream with OBS but I would like to also give my users an option to encode from within the browser with Flash.
Do I need to learn ActionScript and write my own SWF to access the camera/mic and push the stream?
Is there a commercial SWF file with parameters I can pass in (such as the RTMP stream URL etc)?
You can use Flash to send video from the webcam to Wowza; the video will be encoded by Flash.
Alternatively, you can use WebRTC to send video to Wowza; the video will be encoded by the web browser.
I have used https://github.com/theintencity/flash-videoio. It is not bad.
Please note that the latest browsers do not enable Flash by default.

Buffered audio in SoundJS

I'm using SoundJS 0.5.2 to play audio in a music player that I'm designing for a client.
According to the documentation, I have to register or preload an audio file before I can use it, using the registerSound method of the Sound class, and the audio cannot be played before it is fully loaded.
But how do I go about buffering the audio while it is being played? Like waiting until 10% of the audio is buffered, then playing the song? Can this be accomplished using PreloadJS?
After Googling about it, I found this thread. It says that the WebAudioPlugin does not support buffering because of the underlying technology, but the HTMLAudioPlugin can play the audio before it is fully loaded.
But it does not mention how to do that. When using HTMLAudioPlugin, do I still have to register the sound using registerSound?
Also, when using FlashPlugin as a fallback, will the buffering still be supported by FlashPlugin?
HTMLAudioPlugin fires the ready event when it receives the canplaythrough event from the audio tag, so it supports buffering by default (no setup required). You have to register sounds for them to start loading, either with PreloadJS or internal SoundJS methods. Buffering will work the same regardless.
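A small sketch of that setup, assuming SoundJS 0.5.x, with a placeholder file path and sound id:

// Prefer the HTMLAudioPlugin so playback can start before the full download finishes.
createjs.Sound.registerPlugins([createjs.HTMLAudioPlugin]);

// "fileload" fires once the plugin considers the sound ready (canplaythrough).
createjs.Sound.addEventListener("fileload", function (event) {
    createjs.Sound.play(event.id);
});

// Registration is still required; the path and id here are placeholders.
createjs.Sound.registerSound("audio/track.mp3", "track");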
Buffering is not supported by the FlashPlugin as far as I know.
Hope that helps.

Video is lagging from Audio in a vLine media session

At first I have a voice call in a vLine session; I hear the audio by getting the audio tag from createAudioElement() on the media stream and appending it to $('body').
Later, when the remote user opens their video, I send this piece of info to the other user using an x-msg; once it is received, I get the video by calling createVideoElement() on the media stream.
After that there is a lag between the audio and the video; the audio always arrives faster than the video. How can I synchronize the audio with the video in this case?
When you call createVideoElement on the stream it will create a <video> element, which plays both audio and video, so at that point there is no need for the <audio> element that you created with createAudioElement.
The browser handles synchronizing the audio and video in a single MediaStream, so if they are consistently out of sync, you may need to file a WebRTC bug.
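A small sketch of that cleanup, where mediaStream is the vLine media stream and audioEl is assumed to be the element created earlier with createAudioElement():

// Replace the separate <audio> element with a single <video> element,
// which carries both tracks and lets the browser keep them in sync.
var videoEl = mediaStream.createVideoElement(); // plays audio + video together
$('body').append(videoEl);
$(audioEl).remove(); // drop the old audio-only element to avoid doubled audio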

Stop download of streaming audio to HTML5 element with Buzz

I am using Buzz to abstract HTML5 audio for internet radio. This works well, but I need a way to stop downloading the stream when audio is stopped.
For example, when I start playing, I can see the network requests for the stream begin. When I stop the audio, the data for the stream is still being transferred, as if it were a static resource. I need to either prevent that from happening, or get it to stop once the audio is stopped.
I believe jPlayer does this by destroying the <audio> element, but I don't see any method for doing this in Buzz. Is it possible? If so, how?
After spending some time with the Buzz source, I don't see any method for doing this currently. Fortunately, the raw audio element is exposed, allowing something like this:
// Clearing the src of the underlying <audio> element makes the browser
// stop downloading the stream.
buzz.sound.prototype.destroy = function () {
    this.set('src', '');
};
This probably messes up some internal state information for Buzz. I'm looking into that right now.
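Hypothetical usage of that destroy() helper, with a placeholder stream URL:

var stream = new buzz.sound('http://example.com/live-stream'); // placeholder URL
stream.play();
// ... later, when the user hits stop ...
stream.stop();
stream.destroy(); // clears src so the browser stops fetching the stream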

RTMP stream of a+v plays only audio, no video

In Flash, AS3, I am using NetConnection to connect to a RTMP server, then I use NetStream to play a video+audio stream.
I attach the stream (attachNetStream) to a flash.media.Video instance that is added to the stage (double-checked that it is ON the stage) and play it, but all I get is the sound of the stream that's being played - no video is displayed.
Note that even though I cannot see the video, when I listen to the stream's onMetaData I get plenty of information about the video, such as width, height, FPS (which changes during playback as if a video were shown), and the number of decoded frames.
Does anybody have an idea how can I make the video work too?
Instead of using "raw" NetConnection and NetStream and attaching them to a flash.media.Video, I'd recommend using a wrapper such as Pyro Player. It's basically a video API, and I've used it many times for RTMP video; it works like a charm (I've always found the Video component from Adobe very buggy, especially when displaying video from an RTMP server). Give it a try!
Thank you guys, but I found out the answer:
Apparently Flash's (CS3) built-in Video class does not support H.264 streams. I tried to compile the exact same code in Flex 3.5 and everything worked!
It is possible that CS4 also supports H.264 streams; I did not try.
JWPlayer is great; I did not try Pyro.
Cheers.