I have developed a video recorder and player that records video to Flash Media Server.
When I start recording I set the camera quality as follows:
cam.setQuality(65536, 65);
mic.setRate(44);
Recording and saving work fine. The recorded video is moved from the streaming folder to another folder, and on play it is played from that location.
The problem is that when I start playback, the video plays for a few seconds and then stops; after that I can only hear the audio and the picture no longer updates.
What is causing this?
Are some frames being dropped, or is it a bandwidth issue?
Edit:
I initialize the camera and microphone as follows:
cam = Camera.get();
mic = Microphone.get();
mic.setCodec("Speex");
mic.setRate(44);
mic.setUseEchoSuppression(true);
mic.setSilenceLevel(0);
cam.setMode(627,353,24,true);
cam.setLoopback(true);
cam.setQuality(65536, 65);
mic.setRate(44);
To record the FLV:
ns.publish(fileName, "record");
For example, I currently record a 5-minute video; on playback it plays only up to 2:10, after which the video stops and only the audio continues. The same happens when playing the file in VLC.
If I record with the same code against a local streaming server, it works perfectly.
I have pre-recorded Opus and VP8 files. In a WebRTC call I am trying to stream that audio/video to Chrome.
The audio plays in Chrome, but it fails to show the video.
From searching around I understand that the first video frame needs to be sent as an i-frame (keyframe). I am a bit new to video; can somebody help me learn how to send an i-frame?
Could there be other potential reasons why Chrome rejects the video frames? Note that the ICE and DTLS handshakes complete successfully.
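As a rough illustration (my own sketch, not from the post): for VP8, whether a reassembled frame is a keyframe can be read from the least significant bit of its first payload byte (0 means key frame, per RFC 6386). Assuming the depacketized frames are available as byte arrays (the frames variable below is hypothetical):

// Check whether a depacketized VP8 frame is a keyframe:
// bit 0 of the first payload byte is the frame type, 0 = key frame.
function isVp8Keyframe(frameBytes) {
  return (frameBytes[0] & 0x01) === 0;
}

// Hypothetical usage: start forwarding to the browser only from the
// first keyframe found in the pre-recorded file.
var startIndex = frames.findIndex(isVp8Keyframe);
var framesToSend = startIndex >= 0 ? frames.slice(startIndex) : [];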
I have two video players on a single page. On desktop everything works fine and both players play their videos. But when I try to cast this page to a Chromecast (via the Google Chrome extension or via https://demille.github.io/url-cast-receiver/), only the first video player is active and playing; the second video player does not work.
I tried to debug it, and it looks like the second video gets stuck at readyState=1, while the first video reaches readyState=4.
Is there a way to fix this? (I need multiple video players on a single page, so using only one player and switching the video file URLs is not a solution.)
URL: http://iuvomedia.eu/chromecast/
If you want one video to play and the other to be preloaded, you should build a queue of the videos you want to play. Chromecast plays a single video at a time, and when it ends it will automatically load the next one in the queue.
For information on autoplay and queuing you may visit https://developers.google.com/cast/docs/autoplay
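A minimal sketch of the queue idea using a single HTML5 video element (illustrative only, not the Cast receiver API; the playlist URLs are placeholders):

// Play a list of videos one after another in a single <video> element.
var playlist = ['video1.mp4', 'video2.mp4']; // placeholder URLs
var index = 0;
var video = document.querySelector('video');

function playNext() {
  if (index >= playlist.length) return; // queue finished
  video.src = playlist[index++];
  video.play();
}

video.addEventListener('ended', playNext); // advance when one ends
playNext();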
You cannot have more than one active media element.
You can have two media elements where one plays a video that has no audio and the second plays only audio, but you cannot have two active video pipelines or two active audio pipelines at the same time.
Check here: Create multiple instances of html video object
I'm using JW Player to live-stream content on a web page. The player is backed by an open-source library called cine.io.
My issue is that the player falls back to an HTML5 video element on mobile web, both on iPhone and Android. There are some differences between JW Player's Flash solution and HTML5, notably that if a live stream starts, then stops, then restarts, the video element will not pick up the restarted stream.
This is a problem since streams often drop in and out, and the Flash solution does pick up the restarted stream.
I tested a number of event listeners on the video, and the only one that signalled that the stream had ended was a 'timeupdate' listener:
$video.on('timeupdate', function(){
//Do something
});
However none of my attempts to re-open the stream have been effective.
Is this even possible? Can anyone provide pointers?
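To illustrate, a rough sketch of one reconnection approach (STREAM_URL and the 3-second stall threshold are arbitrary placeholders, and this has not been confirmed to work with cine.io streams):

// If timeupdate stops firing for a few seconds, assume the live stream
// dropped and try to re-open it by resetting the source.
var $video = $('video');
var lastUpdate = Date.now();

$video.on('timeupdate', function () {
  lastUpdate = Date.now();
});

setInterval(function () {
  if (Date.now() - lastUpdate > 3000) {
    $video.attr('src', STREAM_URL); // STREAM_URL is a placeholder
    $video[0].load();
    $video[0].play();
    lastUpdate = Date.now();
  }
}, 1000);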
Would an example like this work?
http://support.jwplayer.com/customer/portal/articles/1442607-example-a-custom-error-message
At first I have a voice-only call in a vLine session; I hear the audio by calling createAudioElement() on the media stream and appending the resulting audio tag to $('body').
Later, when the remote user turns on their video, I send this piece of info to the other user using an x-msg; once it is received, I get the video by calling createVideoElement() on the media stream.
After that I see a lag between the audio and the video, with the audio always arriving ahead of the video. How can I synchronize the audio with the video in this case?
When you call createVideoElement on the stream it will create a <video> element, which plays both audio and video, so at that point there is no need for the <audio> element that you created with createAudioElement.
The browser handles synchronizing the audio and video in a single MediaStream, so if they are consistently out of sync, you may need to file a WebRTC bug.
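A minimal sketch of that, reusing the createVideoElement() call from the question (the element ids and the mediaStream variable are illustrative):

// When the remote side enables video, drop the separate <audio> element
// and let a single <video> element play both tracks, keeping them in sync.
$('#remote-audio').remove(); // the element created via createAudioElement()
var videoEl = mediaStream.createVideoElement(); // plays audio and video
$(videoEl).attr('id', 'remote-video');
$('body').append(videoEl);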
I load an external video with the playback component (skin: SkinOverPlayStopSeekMuteVol.swf); it works great, but when I use
fscommand("fullscreen", "true");
on my first scene (the first keyframe in the timeline), then when I reach my video scene I get just a black screen.