How to play RTSP video in the browser? - html

We are developing an application where we need to load RTSP video into the browser without using any proxy.
We tried the VLC plugin, but it only supports the IE browser.

I am not aware of any browser which natively supports RTSP streams at this time.
The usual approach is to use a proxy or a streaming server to convert the streams to something like HLS or DASH, but you say you can't do that.
There is a VLC browser plugin which certainly did support this (I have not tested it recently), but it does not support Chrome, which may rule it out if you are looking for cross-browser support:
https://wiki.videolan.org/Documentation%3aWebPlugin/

Related

How to stream RTSP live video in Firefox and Chrome now that the VLC plugin is not supported anymore?

Now that the NPAPI that the VLC plugin uses is being discontinued in Firefox, and Google Chrome discontinued NPAPI long ago, is there any solution to stream RTSP live video inside these browsers?
After a long time digging into and following this topic, I have come to some interesting results.
At this point the best option seems to be an RTSP proxy that changes RTSP in a way that makes it compatible with something supported by web browsers (WebRTC, etc.).
I have collected the following solutions:
https://github.com/Streamedian/html5_rtsp_player RTSP - Proxy - JS Player (Node.js)
https://github.com/lulop-k/kurento-rtsp2webrtc RTSP - WebRTC Proxy - Browser (Node.js)
Others in Node.js: https://www.pincer.io/npm/tags/rtsp
https://wmspanel.com/
https://easyrtc.com/
http://stackoverflow.com/questions/21921790/best-approach-to-real-time-http-streaming-to-html5-video-client
http://www.streamingmedia.com/Articles/Editorial/Featured-Articles/Choosing-a-Video-Player-Features-and-Specs-for-the-Top-Five-94188.aspx
Native browser video player with HTML5 video tag + WebSocket RTSP proxy: https://github.com/SpecForge/html5_rtsp_player/wiki/HTML5-RTSP-Player
For the future I expect:
- Video camera manufacturers will start to implement protocols like WebRTC and MPEG-DASH
- Web browsers SHOULD implement RTSP, but this is probably not going to happen

How does HTML5 video work?

I would like to know how an HTML5-compliant browser plays a video using the <video> tag.
Does it actually call the underlying player APIs using some plugin API?
Can we write a custom plugin in the Chrome browser, such that we could use it to call a video player API like FFmpeg?
Does it actually call the underlying player APIs using some plugin API?
Yes. The video players are implementations written into the browser, but the only API exposed is the one stated in the HTML standard (and the corresponding JavaScript interfaces).
Can we write a custom plugin in the Chrome browser?
No.
such that we could use it to call a video player API like FFmpeg?
No, and the issues of distributing codecs and licensing come into play here. All of the format support provided in modern HTML5 browsers has been licensed so they can decode formats such as MP4 and AAC.
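On the JavaScript side, that standard API is the HTMLMediaElement interface. As a rough illustration (not part of the original answer; the file name is a placeholder), this is the kind of control a page gets over playback, with no hook into the decoder itself:
// Minimal sketch of the HTMLMediaElement API exposed by HTML5.
const video = document.createElement('video');
// Ask the browser whether it can decode a given container/codec combination.
console.log(video.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')); // "probably", "maybe" or ""
video.src = 'movie.mp4';      // hypothetical file served over HTTP
video.controls = true;
document.body.appendChild(video);
video.addEventListener('loadedmetadata', () => {
  console.log('duration: ' + video.duration + 's');
  video.play();               // returns a Promise in modern browsers
});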

cross browser live video streaming

I am trying to find an optimal way to make a live-streaming web client (HTML5 ideally, without any plugins) for a video server which at the moment supports RTSP/RTP H.264 streaming and server push of JPEG.
After some investigating, I realised that the server should be modified. Browsers don't support RTP, but H.264 is enabled out of the box. Server push is also not attractive because only Safari and Firefox support it; Chrome doesn't.
Adding HLS and Smooth Streaming to the server will not solve the problem due to known problems with HLS on Android. The only real cross-browser working solution that I've seen is a JS script on the client which requests a JPEG on a timer. Looks a little bit awkward... Any suggestions?
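For reference, the JPEG-polling approach mentioned above only takes a few lines of JavaScript; the snapshot URL below is a placeholder for whatever endpoint the camera or server actually exposes:
// Minimal sketch of the "request a JPEG on a timer" fallback.
const img = document.getElementById('live');        // an <img> element on the page
setInterval(() => {
  // A changing query string stops the browser from reusing a cached frame.
  img.src = '/snapshot.jpg?_=' + Date.now();        // hypothetical snapshot endpoint
}, 200);                                            // roughly 5 frames per second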

HTML5 live "real-time" streaming audio (not from a file) [duplicate]

I'm building a web app that should play back an RTSP/RTP stream from a server http://lscube.org/projects/feng.
Does the HTML5 video/audio tag support the rtsp or rtp? If not, what would the easiest solution be? Perhaps drop down to a VLC plugin or something like that.
Technically 'Yes'
(but not really...)
HTML 5's <video> tag is protocol agnostic—it does not care. You place the protocol in the src attribute as part of the URL. E.g.:
<video src="rtp://myserver.com/path/to/stream">
Your browser does not support the VIDEO tag and/or RTP streams.
</video>
or maybe
<video src="http://myserver.com:1935/path/to/stream/myPlaylist.m3u8">
Your browser does not support the VIDEO tag and/or RTP streams.
</video>
That said, the implementation of the <video> tag is browser specific. Since it is early days for HTML 5, I expect frequently changing support (or lack of support).
From the W3C's HTML5 spec (The video element):
User agents may support any video and audio codecs and container formats
The spirit of the question, I think, was not truly answered. No, you cannot use a video tag to play RTSP streams as of now. The other answer linking to the Chromium developer's "never" is a bit misleading, as the linked thread / answer is not directly about Chrome playing RTSP via the video tag. Read the entire linked thread, especially the comments at the very bottom and the links to other threads.
The real answer is this: No, you cannot just put a video tag on an HTML5 page and play RTSP. You need to use a JavaScript library of some sort (unless you want to get into playing things with Flash and Silverlight players) to play streaming video. {IMHO} At the rate the HTML5 video discussion and implementation is going, the various vendors of proprietary video standards are not interested in helping this move forward, so don't count on the promised ease of use of the video tag unless the browser makers take it upon themselves to somehow solve the problem... again, not likely.{/IMHO}
This is an old question, but I had to do this myself recently and I got something working (besides, a response like mine would have saved me some time):
Basically, use ffmpeg to change the container to HLS. Most IP cameras stream H.264 and some basic type of PCM, so use something like this:
ffmpeg -v info -i rtsp://ip:port/h264.sdp -c:v copy -c:a copy -bufsize 1835k -pix_fmt yuv420p -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 /var/www/html/test.m3u8
Then use video.js with the HLS plugin. This will play the live stream nicely (there is also a jsfiddle example under the second link).
Note: although this is not native support, it doesn't require anything extra on the user's frontend.
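As a rough sketch of the playback side, assuming the test.m3u8 written by the ffmpeg command above is served from the web root and a video.js build with HLS support (videojs-http-streaming in video.js 7+, or the videojs-contrib-hls plugin in older versions):
<video id="cam" class="video-js vjs-default-skin" controls muted></video>
<script src="video.min.js"></script>
<script>
  // Point the player at the playlist written by ffmpeg above.
  var player = videojs('cam');
  player.src({ src: '/test.m3u8', type: 'application/x-mpegURL' });
  player.play();
</script>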
There are three streaming protocols / technologies in HTML5:
Live streaming, low latency
- WebRTC
- Websocket
VOD and live streaming, high latency
- HLS
1. WebRTC
In fact WebRTC is SRTP (Secure RTP).
Thus we can say that the video tag supports RTP (SRTP) indirectly, via WebRTC.
Therefore, to get an RTP stream into Chrome, Firefox or another HTML5 browser, you need a WebRTC server which will deliver the SRTP stream to the browser.
2. Websocket
It is TCP based, but with lower latency than HLS. Again you need a Websocket server (see the sketch after this answer).
3. HLS
The most popular high-latency streaming protocol for VOD (pre-recorded video).
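The Websocket sketch referenced above: it assumes a hypothetical endpoint that pushes fragmented MP4 (an init segment followed by media segments) with the codec string shown; the details depend entirely on the proxy you put in front of the camera.
// Feed fMP4 segments received over a Websocket into a <video> element via
// Media Source Extensions. The URL and codec string are placeholders.
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);
mediaSource.addEventListener('sourceopen', () => {
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
  const queue = [];
  sb.addEventListener('updateend', () => {
    if (queue.length && !sb.updating) sb.appendBuffer(queue.shift());
  });
  const ws = new WebSocket('wss://example.com/live');   // hypothetical Websocket server
  ws.binaryType = 'arraybuffer';
  ws.onmessage = (event) => {
    if (sb.updating || queue.length) queue.push(event.data);
    else sb.appendBuffer(event.data);
  };
});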
Chrome will never implement support for RTSP streaming.
At least, in the words of a Chromium developer here:
we're never going to add support for this
With VLC I'm able to transcode a live RTSP stream (MPEG-4) to an HTTP stream in OGG format (Vorbis/Theora). The quality is poor, but the video works in Chrome 9.
I have also tested transcoding to WebM (VP8) but it doesn't seem to work (VLC has the option, but I don't know if it's really implemented yet).
The first to have a doc on this should notify us ;)
Chrome does not implement support for RTSP streaming.
An important project to check out is WebRTC.
"WebRTC is a free, open project that provides browsers and mobile applications with Real-Time Communications (RTC) capabilities via simple APIs"
Supported Browsers:
Chrome, Firefox and Opera.
Supported Mobile Platforms:
Android and iOS
http://www.webrtc.org/
Years have passed, and there are some updates about RTSP in H5:
RTSP is not supported in H5, neither on PC nor on mobile.
Flash is disabled in Chrome, see Adobe.
MSE works well except on iOS Safari, so you can use flv.js to play HTTP-FLV in H5, or hls.js to play HLS in H5.
WebRTC is also a possible way to play streams in H5, especially in 0.2~1s latency scenarios.
Note: I think this is because RTSP uses a TCP signaling protocol to exchange SDP, which is not HTTP, so it's really hard for H5 to support it, especially now that there is WebRTC.
So, if you can transcode RTSP to other protocols, like HTTP-FLV/HLS/WebRTC, then you can use H5 to play the stream. I recommend using FFmpeg to do the transcoding:
ffmpeg -i "rtsp://user:password@ip" -c:v libx264 -f flv rtmp://server/live/stream
Start an RTMP server like SRS to accept the RTMP stream and transmux it to HTTP-FLV, HLS and WebRTC:
./objs/srs -c conf/rtmp2rtc.conf
Then it's OK to play the stream by:
HLS by video or hls.js: http://server:8080/live/stream.m3u8
HTTP-FLV by flv.js: http://server:8080/live/stream.flv (see the sketch below)
WebRTC by H5 or native SDK: webrtc://server:1985/live/stream
Note that the latency of HLS is about 5~10s; LL-HLS is better but not by much. HTTP-FLV is about 1~3s, very similar to RTMP. And the WebRTC latency is about 0.2s, while if you convert RTSP to RTMP to WebRTC the latency is about 0.8s.
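A minimal flv.js sketch for the HTTP-FLV URL above (the element id and the flv.js include are assumptions, not something SRS dictates):
<video id="player" controls muted></video>
<script src="flv.min.js"></script>
<script>
  // Play the HTTP-FLV stream exposed by SRS (see the URL list above).
  if (flvjs.isSupported()) {
    var flvPlayer = flvjs.createPlayer({
      type: 'flv',
      isLive: true,
      url: 'http://server:8080/live/stream.flv'
    });
    flvPlayer.attachMediaElement(document.getElementById('player'));
    flvPlayer.load();
    flvPlayer.play();
  }
</script>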
My observation regarding the HTML5 video tag and RTSP (RTP) streams is that it only works with Konqueror (KDE 4.4.1, Phonon backend set to GStreamer). I got only video (no audio) with an H.264/AAC RTSP (RTP) stream.
The streams from http://media.esof2010.org/ didn't work with Konqueror (KDE 4.4.1, Phonon backend set to GStreamer).
To put a conclusion on this as of now:
I have been trying to build a way around it, since RTSP doesn't work out of the box. Without a "manager" handling the streaming so that it behaves the way a video tag expects, it's not possible right now.
I am currently working on an Android+HTML (hybrid) solution to manage this in a rather hacky way. Since it is supposed to play directly from the camera to Android with no intermediary servers, we came up with a solution involving a canvas tag to bridge the non-WebView side with the WebView.

Streaming an audio file vs serving it statically

I have a website where users can upload audio files (of type AAC). The users can play back their audio files through a web browser or on mobile devices such as an iPhone or an Android. For web browsers, I would like to support the latest HTML5 audio tag and have a Flash fallback for older browsers.
I did some research and MP3 looks like the best format for serving audio files to a web browser, because some modern browsers support MP3 natively, and browsers that don't (Firefox) can fall back to Flash. Once a user uploads an AAC file I will create another version of it as an MP3 that can be used for serving.
What is the best way to serve these audio files? Streaming or serving them statically? Are there any advantages or disadvantages? Perhaps there is a flexible server technology. I know about Icecast but I don't think it fits my specific use case.
Also, I have a relational DB which stores a link to each static audio file. I would like to use HTTP streaming and not a proprietary protocol. Most importantly, I would like to do this as efficiently as possible since bandwidth may get expensive.
Note that the streaming protocols supported by iDevices (iPhone, iPad, iPod) and Android phones are not the same. While iDevices support HTTP streaming, Android phones only support the RTSP protocol.
So, if you want to serve multiple devices with a streaming protocol, you will have to use encoders/servers for each type (a segmenter and a web server for iDevices, an RTSP server for Android).
In terms of efficiency I don't think you will improve a lot, but using HTTP streaming you get other benefits, like the possibility to use multi-bitrate files, which allow you to serve differently encoded versions of the same audio at different qualities depending on the user<->server connection speed.
Implementing HTTP streaming is very cheap. In fact, you can use ffmpeg to encode the files and the free segmenter provided by Apple to do it. But remember that won't work for Android devices.
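For the HTML5-audio-with-Flash-fallback part of the question, a minimal detection sketch might look like the following (the file URL and the Flash helper are placeholders, not something the answer above prescribes):
// Use the HTML5 audio element when the browser can decode MP3 natively,
// otherwise hand the same URL to a Flash-based fallback player.
var probe = document.createElement('audio');
var canPlayMp3 = !!probe.canPlayType && probe.canPlayType('audio/mpeg') !== '';
var trackUrl = '/media/track.mp3';              // hypothetical statically served file
if (canPlayMp3) {
  var audio = new Audio(trackUrl);
  audio.controls = true;
  document.getElementById('player').appendChild(audio);
} else {
  embedFlashPlayer('player', trackUrl);         // hypothetical Flash fallback helper
}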