Here is my question:
I use the Android API MediaMuxer to combine an H.264 stream and an AAC stream into an MP4 file. When I want to stop recording, I call mMediaMuxer.stop(), and the resulting MP4 file plays fine.
But sometimes something unexpected happens, like the power being killed or going out suddenly, so there is no time to call mMediaMuxer.stop(), and the resulting file cannot be played at all.
Does anybody know how to fix this problem? I want to be able to play the video even when mMediaMuxer.stop() was never called... or is there another API or SDK that can mux an H.264 + AAC stream reliably?
I'm using HLS (hls.js) for streaming audio playlists in m3u8 format.
When I stop the music I run:
hls.stopLoad();
hls.detachMedia();
But I still see network requests for the playlist/chunklist.
Is there a way to stop the requests other than refreshing/closing the window?
The player will continue to refresh a live playlist until it encounters an #EXT-X-ENDLIST tag in the playlist.
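If the goal is to stop all network traffic without refreshing the page, tearing the instance down should do it; a minimal sketch, assuming a standard hls.js setup:
hls.stopLoad();    // stop loading fragments
hls.detachMedia(); // unbind from the media element
hls.destroy();     // tear down the instance, which also cancels playlist refreshes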
I am using SimpleWebRTC to create a video chat room application.
One of the requirements is that a peer machine with no microphone or webcam should at least be able to hear and see the video of the other peers.
Is it possible to do?
I tried this using the constraints {audio: false, video: false} in plain WebRTC, and it works on a machine that has no microphone or webcam.
How can I accomplish this using SimpleWebRTC?
Thanks!
I am not familiar with SimpleWebRTC, but I use PeerJS. To answer a call even when you don't have a webcam or mic, all I do is answer with null instead of a stream.
So in PeerJS, instead of
call.answer(window.localStream);
you can say
call.answer(null);
Try to find the code in SimpleWebRTC where the call is made or answered and try this.
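For context, here is a minimal receive-only sketch in PeerJS; the element selector and variable names are assumptions:
var peer = new Peer();
peer.on('call', function (call) {
  call.answer(null); // answer with no local stream (receive-only)
  call.on('stream', function (remoteStream) {
    // attach the remote peer's stream to a video element (assumed to exist)
    document.querySelector('#remoteVideo').srcObject = remoteStream;
  });
});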
I'm trying to get the raw audio in the getUserMedia() success callback and post it to the server.
The success callback receives the LocalMediaStream object.
var onSuccess = function (s) {
  var m = s.getAudioTracks(); // getAudioTracks() takes no arguments
  // m[0] contains the MediaStreamTrack object for audio
  // get the raw audio and do the stuff
};
But MediaStreamTrack exposes no attribute or method to get the raw audio from its channels.
How can we access the raw audio into this callback which is called on success of getUserMedia()?
I found the Recorder.js library: https://github.com/mattdiamond/Recorderjs
But it records blank audio in Chrome Version 26.0.1410.64 m.
It works fine in Chrome Version 29.0.1507.2 canary SyzyASan.
I think there is an issue with the Web Audio API as used by Recorder.js.
I'm looking for a solution without the Web Audio API that works at least on Chrome's official build.
Check out the MediaStreamAudioSourceNode. You can create one of those (via the AudioContext's createMediaStreamSource method) and then connect the output to RecorderJS or a plain old ScriptProcessorNode (which is what RecorderJS is built on).
Edit: Just realized you're asking if you can access the raw audio samples without the Web Audio API. As far as I know, I don't think that's possible.
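For reference, the Web Audio route described in the first paragraph looks roughly like this; a sketch only, where the buffer size, channel count, and unprefixed method names are assumptions:
var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
var onSuccess = function (stream) {
  var source = audioCtx.createMediaStreamSource(stream);
  var processor = audioCtx.createScriptProcessor(4096, 1, 1);
  processor.onaudioprocess = function (e) {
    var samples = e.inputBuffer.getChannelData(0); // raw Float32 PCM samples
    // ...buffer or POST the samples to the server here
  };
  source.connect(processor);
  processor.connect(audioCtx.destination); // keeps the processor running
};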
I've been playing around with Node and WebSockets and built a small test app that streams audio over WebSockets. The server breaks apart the mp3 using createReadStream, throttles the stream using node-throttle, and sends the binary data using the "ws" module.
On the client side I pick up the chunks on the WebSocket and use decodeAudioData (http://www.html5rocks.com/en/tutorials/webaudio/intro/) to decode and play each chunk. It all works relatively well.
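The client side of that audio test is roughly the following; a sketch, where ws and audioCtx are assumed to exist and each chunk is assumed to be independently decodable:
ws.binaryType = 'arraybuffer';
ws.onmessage = function (event) {
  audioCtx.decodeAudioData(event.data, function (buffer) {
    var src = audioCtx.createBufferSource();
    src.buffer = buffer;
    src.connect(audioCtx.destination);
    src.start(0);
  });
};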
What I was curious to do next was to stream video in the same manner to the HTML5 video tag. But I can't really find any reference material on the web to achieve this in the same manner as my audio test above.
Is there a video equivalent for "decodeAudioData"?
Can I feed chunks of data into a video tag?
I've got a similar sample running that I picked up from...
https://gist.github.com/paolorossi/1993068
But this isn't really what I am looking for. First of all, it doesn't really seem to be streaming to me: the client buffers it all before playing it.
Also, similar to my audio test, I want the stream to be throttled on the server side so that when a new client connects they join the video at whatever point it is currently at, i.e. 30 minutes in or whatever.
Thanks
OK,
I found a solution to this after much searching.
The MediaSource API is what I was looking for...
var mediaSource = new MediaSource();
mediaSource.addEventListener('sourceopen', function () {
  // addSourceBuffer must wait for the 'sourceopen' event
  var sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');
  sourceBuffer.appendBuffer(new Uint8Array(data)); // append() in older prefixed builds
});
This link provided the solution...
http://html5-demos.appspot.com/static/media-source.html
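Tying this back to the WebSocket setup from the question, a rough sketch of feeding incoming chunks into a video element; the variable names are assumptions, and the chunks must arrive as proper WebM initialization/media segments:
var video = document.querySelector('video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', function () {
  var sb = mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');
  var queue = []; // appendBuffer is asynchronous, so queue chunks while busy
  sb.addEventListener('updateend', function () {
    if (queue.length) sb.appendBuffer(queue.shift());
  });
  ws.onmessage = function (event) {
    var chunk = new Uint8Array(event.data);
    if (sb.updating || queue.length) queue.push(chunk);
    else sb.appendBuffer(chunk);
  };
});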
I'm trying to serve up a live stream (i.e. completely buffered in memory, with no access to the past) and am having trouble with Expression Encoder 4.
Ideally, I'd like to just stream a bare H.264 byte stream to the client consumed by:
<video id="mainVideoWindow">
<source src='http://localhost/path/to/my/stream.mp4' type='video/mp4' />
</video>
I figured I could stream it to the client just like any other byte stream over HTTP. However, I'm having trouble figuring out the appropriate code required to do so (it's my first day with Expression Encoder, and I'm not sure how to get at the raw byte stream), nor do I know whether it would work in the first place.
An alternative was to use the IIS live streaming server:
// 'job' is assumed to be a LiveJob instance from the Expression Encoder SDK, created earlier
var source = job.AddDeviceSource(device, null);
job.ActivateSource(source);
// Encode with an IIS Smooth Streaming-compatible preset
job.ApplyPreset(LivePresets.VC1IISSmoothStreaming720pWidescreen);
// Push the broadcast to a publishing point on the local IIS server
var format = new PushBroadcastPublishFormat();
format.PublishingPoint = new Uri("http://localhost/test.isml");
job.PublishFormats.Add(format);
job.StartEncoding();
// Let's listen for a keypress or error message to know when to stop encoding
while (Console.ReadKey(true).Key != ConsoleKey.X) ;
// Stop our encoding
Console.WriteLine("Encoding stopped.");
job.StopEncoding();
However, I'm having trouble getting the client-side markup to display the video in Chrome, and I haven't seen anything to indicate that it would work in Chrome at all (though http://learn.iis.net/page.aspx/854/apple-http-live-streaming-with-iis-media-services indicates how it would work with an iOS device).
Anyone have any insights?
You are trying to consume (in your second example) a Smooth Streaming feed (Microsoft's HTTP adaptive streaming) through HTML5, which is not supported.
This could work on iOS devices if you enable Apple HTTP Live Streaming to transmux the fragments into MPEG-2 Transport Streams. This will also generate an Apple HTTP Live Streaming manifest which can then be referenced by the video tag.
...I saw that you have the IIS link. Apple HTTP Live Streaming needs to be enabled on the IIS server (IIS Media Services). This will work for iOS devices. QuickTime will come into play...
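For illustration, the markup from the question would then point at the generated HLS manifest rather than the raw MP4; the exact manifest URL below (the m3u8-aapl format suffix) is an assumption based on IIS Media Services conventions:
<video id="mainVideoWindow">
    <source src='http://localhost/test.isml/manifest(format=m3u8-aapl)' type='application/vnd.apple.mpegurl' />
</video>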