I am looking into implementing live audio streaming from a vxWorks system into an HTML5 audio player. I have been researching for a while but am stuck on one key step.
Workflow:
1. A host has a mic and speaks into it.
2. The audio is received by the vxWorks OS. Here it can be encoded and packaged - anything is possible at this point.
3. ????
4. A user opens a web page to listen to the live audio in an HTML5 player.
I don't know what goes in step 3. Suppose I have now encoded the audio as MP3. What technologies do I need to send it to the browser? I believe I can send it over the HTTP protocol, but I don't understand how that is packaged - i.e., how is audio packaged into HTTP? And what does the HTML5 player want as a source for this data: a URL, or WebSocket data?
Thanks.
The solution for #3 depends on whether you need a low-latency live audio stream at the HTML5 player, or whether a latency of 10-30 seconds is acceptable.
1. Latency can be high
You need a streaming server / HLS packager that streams via HLS, and a webpage hosting Flowplayer or JWPlayer, which will play that HLS stream using the HTML5 video tag. An AAC-encoded (not MP3!) audio stream needs to be pushed to that streaming server / HLS packager (RTMP publishing is the most popular method).
You could go with free nginx; it is portable to vxWorks and can stream out via HLS. You could also try free VLC or ffmpeg for the same.
If you make it work, the stream will be playable on any browser on any device - iOS, Android, Windows OS etc...
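To make the HLS side concrete, here is a minimal sketch (in Python, with hypothetical segment names and durations) of the kind of live media playlist an HLS packager keeps regenerating as new segments are produced; the player just re-fetches this playlist URL to follow the live edge.

```python
# Sketch of a live HLS media playlist, as a packager would regenerate it.
# Segment names and durations below are hypothetical.

def make_live_playlist(first_seq, segments, target_duration=10):
    """segments: list of (filename, duration_seconds) for the live window."""
    lines = [
        "#EXTM3U",
        "#EXT-X-VERSION:3",
        f"#EXT-X-TARGETDURATION:{target_duration}",
        f"#EXT-X-MEDIA-SEQUENCE:{first_seq}",
    ]
    for name, dur in segments:
        lines.append(f"#EXTINF:{dur:.3f},")
        lines.append(name)
    # No #EXT-X-ENDLIST tag: that is what marks the playlist as live,
    # so the player keeps polling for updated versions.
    return "\n".join(lines) + "\n"

playlist = make_live_playlist(42, [("seg42.ts", 9.6), ("seg43.ts", 10.0), ("seg44.ts", 9.8)])
print(playlist)
```

As old segments fall out of the live window, the packager drops them from the front of the list and bumps #EXT-X-MEDIA-SEQUENCE accordingly.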
2. You need low latency
This is much harder. You would need another machine running Linux or Windows OS.
On that machine install Unreal Media Server or EvoStream server - these servers stream to HTML5 players over the WebSocket protocol using ISO BMFF packaging.
Same as before, an AAC-encoded (not MP3!) audio stream needs to be pushed to that streaming server (RTMP publishing is the most popular method).
These streams will play on any browser / any OS EXCEPT iOS!
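For context on the "ISO BMFF packaging" mentioned above: a fragmented MP4 stream is just a sequence of boxes, each starting with a 4-byte big-endian size and a 4-byte type, which is what makes it straightforward to push over a WebSocket in chunks. A minimal sketch of walking those box headers (over synthetic data, not a real stream):

```python
import struct

def box_headers(data):
    """Yield (type, size) for each top-level ISO BMFF box in the buffer."""
    offset = 0
    while offset + 8 <= len(data):
        size, btype = struct.unpack_from(">I4s", data, offset)
        yield btype.decode("ascii"), size
        offset += size

def make_box(btype, payload=b""):
    # Synthetic box for demonstration only.
    size = 8 + len(payload)
    return struct.pack(">I4s", size, btype.encode("ascii")) + payload

# A toy "stream": init-segment boxes followed by one media fragment.
stream = (make_box("ftyp", b"iso5") + make_box("moov")
          + make_box("moof") + make_box("mdat", b"\x00" * 16))
print(list(box_headers(stream)))
# prints [('ftyp', 12), ('moov', 8), ('moof', 8), ('mdat', 24)]
```

In a real stream the moov/moof boxes carry nested metadata boxes rather than being empty, but the outer framing is exactly this simple.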
I have an H.264 real-time video stream issued by gst-rtp-server. Moreover, there is the possibility of using an augmented FEC stream from the server to improve performance in noisy environments (like WiFi). FEC works at the RTP layer, so on the client side these two RTP streams must be combined into a final one.
Using GStreamer on the client side inside a dedicated native app works perfectly. But instead of such a native app, I'm also considering a modern HTML5 web browser to receive and render the video stream.
So, the formal question: is it possible to get a raw RTP video stream into a modern browser somehow? I need to support iOS and Android, as well as the main desktop systems.
Currently, I'm considering GStreamer-based preprocessing on the client side: a tiny standalone GStreamer-based service (a native, GUI-less app) would be launched from the webpage and would perform the RTP- and FEC-based processing, depayloading from RTP and repayloading into something that HTML5 supports. That new stream would then be served from localhost to the HTML5 'video' tag on the webpage.
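Such a local relay could be sketched as a gst-launch pipeline along these lines (port, caps, and output paths are illustrative, and the FEC-combining step is omitted; hlssink is just one HTML5-friendly output among several):

```shell
gst-launch-1.0 udpsrc port=5000 \
    caps="application/x-rtp,media=video,encoding-name=H264,payload=96" \
  ! rtpjitterbuffer ! rtph264depay ! h264parse \
  ! mpegtsmux ! hlssink location=/var/www/seg%05d.ts \
      playlist-location=/var/www/playlist.m3u8
```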
Alternatively, such a GStreamer-based service could be implemented as an NPAPI plugin, but nowadays NPAPI is deprecated and might not be supported at all.
Any other ideas?
Thanks.
Now that Firefox has dropped its support for plugins, I have lost a huge feature on my website, which integrated a VLC plugin for audio and for video from UDP data.
What are the best options (sticking to Firefox only is completely acceptable for my customer base) for getting both audio and video data (as separate modules) into a browser from a UDP stream?
I was hoping Flash was a solution, but it looks like its UDP datagram API is not supported in the browser. VLC was the most convenient, but now it is dead...
I've got a website and I've been looking for ways to embed a 24/7 webcast. I've looked at options such as Ustream and Justin.TV; however, these do not work on mobile devices, which is what I really need.
I don't have much knowledge of how streaming works, but I've read that the Wowza streaming engine is another option. I also found that an HTML5 player works cross-platform and on any mobile device as well.
If I were to use Wowza, would it work with an HTML5 player? And am I even going down the right path here? I also have a dedicated home server for streaming, so a cloud service wouldn't be required.
I'm very much an amateur, just trying to broadcast my television program on my website. Any advice would help here. Thanks
Wowza can packetize video as HTTP Live Streaming (HLS) which, although an Apple invention, works on most HTML5-capable browsers except IE11: http://www.jwplayer.com/html5/hls/ . Many players will fall back to Flash for browsers which don't support native HLS or H.264 decoding. Flash uses HTTP Dynamic Streaming (HDS) rather than HLS, so you would add that as another packetizer in Wowza. (Wowza calls these packetizers "cupertinostreamingpacketizer" and "sanjosestreamingpacketizer", respectively.)
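In Wowza these packetizers are enabled per application in its Application.xml; a fragment along these lines (the "live" application name and element placement follow the defaults, so check against your installed config):

```xml
<!-- conf/live/Application.xml (fragment) -->
<Streams>
    <StreamType>live</StreamType>
    <LiveStreamPacketizers>cupertinostreamingpacketizer,sanjosestreamingpacketizer</LiveStreamPacketizers>
</Streams>
```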
You would then point your preferred HTML5 video player (jwplayer, flowplayer, etc.) at the URL http://your-wowza-server.com:1935/live/yourstreamname/playlist.m3u8 [1]. For Flash fallback in flowplayer you can use the f4m resolver and the http-streaming plugin, as in the first example here, to access the subtly different URL http://your-wowza-server.com:1935/live/yourstreamname/manifest.f4m. I'm sure something similar applies in players like jwplayer and others.
The main problem with Wowza is how much it costs: for your own server you're looking at around $55 per month per channel [2]. At least during testing, you may find it cheaper to get Wowza on Amazon EC2 devpay: $5/month rental plus an extra couple of cents per hour on your normal EC2 instance costs.
[1] Assuming you're using Wowza's default /live/ application on port 1935
[2] A channel is roughly the number of streams you're sending to the server to be re-broadcast
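As a sketch, the simplest possible HTML5 hookup looks like this (server and stream names are placeholders; note that a bare video tag only plays HLS in browsers with native support, notably Safari/iOS - elsewhere you would hand the same URL to jwplayer or flowplayer as described above):

```html
<!-- Illustrative only; replace server and stream names with your own. -->
<video controls autoplay
       src="http://your-wowza-server.com:1935/live/yourstreamname/playlist.m3u8">
  Your browser does not support native HLS playback.
</video>
```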
We developed a custom HTML5 player which we wanted to make compatible with HLS and fragmented MP4 for LIVE events. We started with Zencoder but realized they were not able to generate fragmented MP4.
I would like to explore the Flash fallback solution and Wowza (probably on AWS) for the packaging.
Would you be available to consult on this project?
We use www.bitcodin.com for event-based or 24/7 live transcoding and streaming. It generates MPEG-DASH - which can be played back natively in HTML5 using the bitdash MPEG-DASH players - as well as HLS for iOS devices. You can find an example here: http://www.dash-player.com/demo/live-streaming/
There are solutions in HTML5 for receiving an audio stream using the <audio> or <video> tag. Can I do the reverse? What if I stream to the server using getUserMedia() and a WebSocket?
It seems it is not that simple, as I cannot get the byte stream directly. Is it actually possible? If so, how do I send the audio stream to the server with ws.send()?
Thanks.
WebSocket is not a protocol designed for media streaming. However, you can achieve your task by using WebRTC. Doing WebRTC peer-to-peer is much easier than client-to-server, but it can be done. You can actually stream both audio and video from HTML5 to a server (WebRTC).
Look at the WebRTC specifications and then implement ICE and TURN on your server to get the negotiation running. You will then be able to receive the streams from several browsers on your server.
Not easy... but it can be done :)
What is the best solution for displaying multiple live streams from surveillance cameras (so low latency is a requirement) in a web application (video-wall style)?
Personally I'm thinking about two possible solutions, but I can't choose between them:
1) Develop a custom Firefox plugin that uses ffmpeg to acquire and decode the video streams
2) Rely on HTML5, inserting a layer between the cameras and the web application that transcodes/restreams the video streams (probably using HTTP Live Streaming)
The requirements are compatibility with H.264 over RTSP and MPEG-4 over RTP and, of course, low latency and no loss of video quality.
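The transcoding/restreaming layer in option 2 could be sketched with ffmpeg roughly as follows (camera URL and output paths are placeholders; note that HLS segmenting inherently adds several seconds of latency, which may conflict with the low-latency requirement):

```shell
ffmpeg -rtsp_transport tcp -i rtsp://camera1.local/stream \
    -c:v copy -an \
    -f hls -hls_time 2 -hls_list_size 5 -hls_flags delete_segments \
    /var/www/cam1/playlist.m3u8
```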
Thanks
Andrea