How to receive an RTP, RTCP, or UDP stream from GStreamer in an HTML5 video element?

I'm trying to receive an RTP/RTCP video stream in HTML5; the stream is generated by GStreamer. Using the GStreamer examples, I can send RTP on port 5000 and RTCP on port 5001, and I can receive the streams fine with GStreamer on the other end. With HTML5, however, I could not receive them. I read up on HTML5 and saw that it can play Theora/Ogg, WebM/VP8, and MP4/AVC, and that the supported protocols supposedly include HTTP, RTP, RTCP, UDP, and others, but in practice I only managed to receive over HTTP; RTP, RTCP, and UDP did not work. I did get a very satisfactory result with the VLC plugin for Mozilla Firefox over UDP. Does anyone have any tips? I don't want to point the player at a file like src="/tmp/test.avi"; it needs to be a live stream over UDP, RTP, or RTCP. Thank you!
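For reference, the only delivery that actually worked for me was a plain HTTP source. A minimal sketch (the URLs are just placeholders for wherever the HTTP stream is served):
<video controls autoplay>
    <!-- placeholder HTTP URLs; rtp://, rtcp:// or udp:// values in src did not work in any browser I tried -->
    <source src="http://localhost:8080/stream.webm" type="video/webm">
    <source src="http://localhost:8080/stream.ogv" type="video/ogg">
</video>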

If you don't need to stream at a low frame rate, you can use GStreamer to transcode your stream to MJPEG and serve it over TCP, then use VLC to pick up that TCP stream and re-stream it over HTTP. It works very well (about 0.5 s of latency), but if you decrease the frame rate (e.g. to 1 fps), VLC introduces around 11 s of latency.
Here are some test commands that should work out of the box, using the GStreamer videotestsrc:
GStreamer:
gst-launch -v videotestsrc horizontal-speed=1 ! deinterlace ! videorate ! videoscale ! video/x-raw-yuv, framerate=15/1, width=256, height=144 ! jpegenc quality=20 ! multipartmux boundary="--videoboundary" ! tcpserversink host=localhost port=3000
VLC:
vlc -vvv -I rc tcp://localhost:3000 --sout '#standard{access=http{mime=multipart/x-mixed-replace;boundary=--7b3cc56e5f51db803f790dad720ed50a},mux=mpjpeg,dst=localhost:8081}'
Then open a browser at http://localhost:8081 (or create an HTML page with an img tag whose "src" attribute is http://localhost:8081, as sketched below).
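A minimal page along those lines (a sketch, assuming the VLC command above is serving the MJPEG stream on port 8081):
<!DOCTYPE html>
<html>
  <body>
    <!-- the browser renders the multipart/x-mixed-replace stream as a continuously updating image -->
    <img src="http://localhost:8081" alt="live MJPEG stream">
  </body>
</html>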

Related

Android: using MediaMuxer to combine H264 + AAC streams, but I have some questions

Here is my question:
I use the Android MediaMuxer API to combine an H264 stream and an AAC stream into an MP4 file. When I want to stop recording, I call mMediaMuxer.stop(), and the resulting MP4 file plays well.
But sometimes something unexpected happens, like the power being cut suddenly, so there is no time to call mMediaMuxer.stop(), and then the file cannot be played at all.
Does anybody know how to fix this problem? I want to be able to play the video even when mMediaMuxer.stop() was never called. Or is there another API or SDK that can combine H264 + AAC streams more robustly?

View h264 from rtmpsrc via gstreamer

I want to play a Flash Media Server stream via GStreamer. My video is published from a camera to FMS with H264 encoding (720x480, Main profile, level 3.0).
My command on Ubuntu is:
gst-launch-1.0 rtmpsrc location="rtmp://192.168.1.153:1935/appname/mp4:cameraFeed44.mp4 live=1" ! decodebin name=decoder decoder. ! queue ! videoconvert ! queue ! xvimagesink
For resolution 720x480 it throws:
ERROR: from element /GstPipeline:pipeline0/GstRTMPSrc:rtmpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2812): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTMPSrc:rtmpsrc0:
streaming task paused, reason error (-5)
ERROR: pipeline doesn't want to preroll.
However, it works fine at low resolutions like 320x240, but I need even more than Full HD.
Thanks,
Stan

AR Drone 2.0, Gstreamer, C++ RTMP Server (streaming without SDK)

This question is a follow-up to this thread: AR Drone 2 and ffserver + ffmpeg streaming
We are trying to get a stream from our AR Drone through a Debian server and into a flash application.
The big picture looks like this:
AR Drone --> Gstreamer --> CRTMPServer --> Flash Application
We are using the PaVEParse plugin for GStreamer found in this thread: https://projects.ardrone.org/boards/1/topics/show/4282
As seen in the thread, the AR Drone uses PaVE (Parrot Video Encapsulation) headers, which most players, like VLC, do not recognize. The PaVEParse plugin strips these headers.
We have used different pipelines and they all yield the same error.
Sample pipeline:
GST_DEBUG=3 gst-launch-0.10 tcpclientsrc host=192.168.1.1 port=5555 ! paveparse ! queue ! ffdec_h264 ! queue ! x264enc ! queue ! flvmux ! queue ! rtmpsink location='rtmp://0.0.0.0/live/drone' --gst-plugin-path=.
The PaVEParse plugin needs to be located in the directory given by --gst-plugin-path for it to work.
A sample error output from Gstreamer located in the ffdec_h264 element can be found at: http://pastebin.com/atK55QTn
The same thing happens if the decoding takes place in the player/dumper, e.g. VLC, FFplay, or RTMPDUMP.
The problem comes down to missing headers: the PPS reference is non-existent. We know that the PaVEParse plugin removes the PaVE headers, but we suspect that once these are removed there are no H264 headers left for the decoder/player to identify the frames by.
Is it possible to "restore" these H264 headers either from scratch or by transforming the PaVE headers?
Can you please share a sample of the traffic between gstreamer and crtmpserver?
You can always use the LiveFLV support built inside crtmpserver. Here are more details:
Re-Stream a MPEG2 TS PAL Stream with crtmpserver

Expression Encoder 4 live stream consumed by HTML 5 <video>

I'm trying to serve up a live stream (i.e. completely buffered in memory; you cannot access the past) and am having trouble with Expression Encoder 4.
Ideally, I'd like to just stream a bare H.264 byte stream to the client consumed by:
<video id="mainVideoWindow">
<source src='http://localhost/path/to/my/stream.mp4' type='video/mp4' />
</video>
I figured I could stream it to the client just like any other byte stream over HTTP. However, I'm having trouble figuring out the code required to do that (it's my first day with Expression Encoder, and I'm not sure how to get at the raw byte stream), nor do I know whether it would work in the first place.
An alternative was to use the IIS live streaming server:
// Create a live source from the capture device and activate it
var source = job.AddDeviceSource(device, null);
job.ActivateSource(source);
// Use the built-in VC-1 preset for 720p IIS Smooth Streaming output
job.ApplyPreset(LivePresets.VC1IISSmoothStreaming720pWidescreen);
// Push the encoded stream to an IIS publishing point
var format = new PushBroadcastPublishFormat();
format.PublishingPoint = new Uri("http://localhost/test.isml");
job.PublishFormats.Add(format);
job.StartEncoding();
// Let's listen for a keypress or error message to know when to stop encoding
while (Console.ReadKey(true).Key != ConsoleKey.X) ;
// Stop our encoding
Console.WriteLine("Encoding stopped.");
job.StopEncoding();
However, I'm having trouble getting the client-side markup to display the video in Chrome, and I haven't seen anything to indicate that it would work in Chrome at all (though http://learn.iis.net/page.aspx/854/apple-http-live-streaming-with-iis-media-services shows how it would work on an iOS device).
Anyone have any insights?
You are trying to consume (with your second example) a Smooth Streaming feed (HTTP adaptive streaming by Microsoft) through HTML5, which is not supported.
This could work on iOS devices if you enable Apple HTTP Live Streaming to transmux the fragments into MPEG-2 Transport Streams. This will also generate an Apple HTTP Live Streaming manifest, which can then be referenced by the video tag.
...I saw that you have the IIS link. Apple HTTP Live Streaming needs to be enabled on the IIS server (IIS Media Services). This will work for iOS devices; QuickTime will also come into play...
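In that case the markup could look something like this (a sketch; the manifest URL format is an assumption based on how IIS Media Services typically exposes Apple HTTP Live Streaming from a publishing point):
<video controls>
  <!-- hypothetical HLS manifest URL; adjust to your own publishing point -->
  <source src="http://localhost/test.isml/manifest(format=m3u8-aapl)" type="application/vnd.apple.mpegurl">
</video>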

How to get the video resolution in a RTMFP stream?

I would like to get the NetStream width/height when receiving an RTMFP stream. This is important because the video component needs different dimensions when, for example, the user receives a 4:3 or a 16:9 stream.
Unfortunately, the onMetaData callback for NetStream does not work as it does for RTMP streams.
Is there a workaround?
You might try using different ports and see whether onMetaData gives you anything different. I believe the three main ones are 1935, 443, and 80.
The following link can give you further documentation on configuring your server:
http://help.adobe.com/en_US/flashmediaserver/configadmin/WSdb9a8c2ed4c02d261d76cb3412a40a490be-8000.html