Initially I have a voice-only call in a vLine session. I play the audio by calling createAudioElement() on the media stream and appending the resulting <audio> tag to $('body').
Later, when the remote user turns on their camera, that side notifies the other user with an x-msg; on receiving it, I get the video by calling createVideoElement() on the media stream.
After that I notice a lag between the audio and the video: the audio always arrives ahead of the video. How can I synchronize the audio with the video in this case?
When you call createVideoElement() on the stream, it creates a <video> element that plays both the audio and the video, so at that point the <audio> element you created with createAudioElement() is no longer needed.
The browser handles synchronizing the audio and video tracks within a single MediaStream, so if they are consistently out of sync, you may need to file a WebRTC bug.
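A minimal sketch of that switch, assuming the createAudioElement()/createVideoElement() calls from the question; the element id ('remote-audio') is made up here so the old tag can be found and removed:

```javascript
// Voice-only phase: play audio through an <audio> element created by the
// vLine media stream API (per the question).
function attachAudio(mediaStream) {
  const audioEl = mediaStream.createAudioElement();
  audioEl.id = 'remote-audio'; // hypothetical id, used to find it later
  document.body.appendChild(audioEl);
}

// Video phase: the <video> element plays BOTH tracks of the stream, so
// remove the old <audio> element first to avoid doubled/desynced audio.
function switchToVideo(mediaStream) {
  const oldAudio = document.getElementById('remote-audio');
  if (oldAudio) {
    oldAudio.remove();
  }
  const videoEl = mediaStream.createVideoElement();
  document.body.appendChild(videoEl);
}
```

Removing the old <audio> element also rules out the same audio track playing twice slightly offset, which can sound like a sync problem.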
I'm currently streaming my microphone audio from C++ over a plain socket. Now I want to listen to this stream in the browser.
I tried pointing an <audio> tag directly at the stream, but that didn't work.
I also tried setting the Content-Type to audio/wav, but that didn't work either.
I also tried writing the stream to a file and pointing the audio element at that file, but the newly written parts were never picked up by the element. Playback also started from the beginning of the file, and I've read that this approach has high latency anyway.
Is there a way to stream audio with very low latency?
I have considered making a GET request every 50 ms and appending the new data, or recording 50 ms clips and swapping the src of the audio element when the previous clip ends.
Streaming audio buffers through a WebSocket and then trying to play it with an <audio> element is not the way to go.
The way to stream audio to your browser with low latency is to use WebRTC.
WebRTC is built in the browser to enable peer-to-peer real-time communications.
Concretely, you can use a WebRTC implementation in C++ such as libwebrtc or GStreamer to stream your mic input, then use the browser's native JavaScript API to receive the audio stream and assign it to the srcObject property of an <audio> element to listen in the browser.
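On the receiving side, the browser part can be as small as this sketch. It assumes an RTCPeerConnection whose offer/answer and ICE exchange with the C++ peer is already handled by your own signaling channel (not shown):

```javascript
// Receive-side sketch (browser JavaScript). The C++ side (libwebrtc or
// GStreamer's WebRTC support) is assumed to be the remote peer.
function playRemoteAudio(pc) {
  // `pc` is an RTCPeerConnection already wired to your signaling channel.
  pc.ontrack = (event) => {
    const audioEl = document.createElement('audio');
    audioEl.autoplay = true;
    // srcObject takes the MediaStream directly -- no blob/object URL needed.
    audioEl.srcObject = event.streams[0];
    document.body.appendChild(audioEl);
  };
}
```

Because WebRTC uses RTP over UDP with built-in jitter buffering, this path gives far lower latency than polling an HTTP endpoint or appending to a file.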
I have pre-recorded Opus and VP8 files. In a WebRTC call I am trying to stream the audio/video to Chrome.
The audio plays in Chrome, but no video is shown.
From searching I read that the first video frame I send needs to be an I-frame (keyframe). I'm fairly new to video; can somebody explain how to send an I-frame?
Could there be other potential reasons why Chrome rejects the video frames? Note that the ICE and DTLS handshakes complete successfully.
This MPEG-DASH stream
http://54.241.9.147/new-fandor/vod/21/2157/dark_star_FILM_v11.smil/manifest_mpm4sav_mvlist.mpd
doesn't play in dash.js: it plays the first segment at the lowest bitrate, switches to the next higher bitrate, and then stops after loading the second bitrate's init information. You can reproduce this by pointing Chrome at the dash.js reference player, entering the stream URL in the top box, and hitting Load. The JavaScript console shows that dash.js reported a media error, which means the video element had a .error set.
The same player is able to play this stream in IE11 without error.
These streams, each of which contains only one of the bitrates played in the sequence above, both play without error in Chrome, so the underlying media itself isn't corrupt:
http://54.241.9.147/new-fandor/vod/21/2157/dark_star_FILM_v11_0.smil/manifest_mpm4sav_mvlist.mpd
http://54.241.9.147/new-fandor/vod/21/2157/dark_star_FILM_v11_1.smil/manifest_mpm4sav_mvlist.mpd
Any ideas?
A Chromium developer says this is due to audio sample-rate switching, which Chrome doesn't support yet: https://code.google.com/p/chromium/issues/detail?id=315330
Although every video bitrate in our streams should have used the same audio encoding, some videos had different audio bitrates for different video bitrates. The solution was to re-encode those videos consistently.
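If you hit the same problem, one way to fix it is to re-encode every rendition with identical audio parameters so that only the video differs between bitrates. The file names and bitrates below are made up for illustration:

```shell
# Re-encode each rendition with the SAME audio codec, sample rate (-ar) and
# audio bitrate (-b:a), so the DASH renditions differ only in video and
# Chrome never has to switch audio sample rates mid-stream.
ffmpeg -i input.mp4 -c:v libx264 -b:v 500k  -c:a aac -ar 44100 -b:a 128k out_low.mp4
ffmpeg -i input.mp4 -c:v libx264 -b:v 1500k -c:a aac -ar 44100 -b:a 128k out_high.mp4
```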
We have an Ogg Vorbis stream from an Icecast2 server. When metadata is enabled in the stream, playback stops (though the client stays connected to the server) whenever a new logical stream starts, i.e., when the metadata changes, in both Chrome and Firefox using the native HTML5 <audio> tag. This does not happen when metadata is disabled in the audio stream.
<p><b>Example of audio tag in HTML5.</b></p>
<audio src="http://mysite:port/stream.ogg" controls ></audio>
We need the metadata in the stream, so we can't disable it. Does anyone know a workaround for this? Any insight would be helpful. Thanks.
FLV files apparently can't be used with a Loader, only with NetStream. But I want to play the video only after it has completely downloaded (smooth viewing with no buffering pauses). How can I do that with NetStream? And can I load multiple videos at the same time and play them in an order defined by an array?
You can poll until stream.bytesLoaded == stream.bytesTotal (e.g., on a timer) and only start playback once they are equal; apply the same check to each NetStream before playing the next video in your array.