Streaming an audio file vs serving it statically - html

I have a website where users can upload audio files (of type AAC). The users can play back their audio files through a web browser or on mobile devices such as an iPhone or an Android phone. For web browsers, I would like to support the HTML5 audio tag and have a Flash fallback for older browsers.
I did some research, and MP3 looks like the best format for serving audio to web browsers: some modern browsers support MP3 natively, and browsers that don't (such as Firefox) can fall back to Flash. Once the user uploads an AAC file, I will create an MP3 version of it to serve.
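Something like the following is what I have in mind, as a minimal sketch (the file names and the Flash player are placeholders):

```html
<!-- Browsers that understand <audio> try the MP3 source; older browsers
     ignore the tag and render the Flash fallback content instead.
     "track.mp3" and "player.swf" are placeholders. -->
<audio controls preload="none">
  <source src="track.mp3" type="audio/mpeg">
  <object type="application/x-shockwave-flash" data="player.swf">
    <param name="flashvars" value="file=track.mp3">
  </object>
</audio>
```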
What is the best way to serve these audio files: streaming or serving them statically? Are there any advantages or disadvantages to either? Perhaps there is a flexible server technology. I know about Icecast, but I don't think it fits my specific use case.
I also have a relational database which stores a link to each static audio file. I would like to use HTTP streaming rather than a proprietary protocol. Most importantly, I would like to do this as efficiently as possible, since bandwidth may get expensive.

Note that the streaming protocols supported by iDevices (iPhone, iPad, iPod) and Android phones are not the same. While iDevices support HTTP streaming, Android phones only support the RTSP protocol.
So, if you want to serve multiple device types with a streaming protocol, you will have to use encoders/servers for each one (a segmenter and a web server for iDevices, an RTSP server for Android).
In terms of efficiency I don't think you will gain a lot, but HTTP streaming brings other benefits, such as the ability to use multi-bitrate files, which let you serve differently encoded versions of the same audio at different qualities depending on the connection speed between the user and the server.
Implementing HTTP streaming is very cheap. In fact, you can use ffmpeg to encode the files and the free segmenter provided by Apple to split them. But remember that this won't work for Android devices.
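For example, a recent ffmpeg build can encode and segment in one step with its built-in HLS muxer, so a separate segmenter isn't strictly required (the file names and settings below are only illustrative):

```
ffmpeg -i input.aac -c:a aac -b:a 128k \
       -f hls -hls_time 10 -hls_list_size 0 playlist.m3u8
```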

Related

HTML5 video player asks to enable Flash in the browser. How can I play streaming video without enabling Flash?

I made a streaming server and a website to show the video. I have tried many HTML5 players, but the problem is that no player works without enabling Flash in the browser. There is a website, http://jagobd.com, that plays video even when I block Flash on the site. How did they do it, and how can I get this kind of player as open source? Could you please give me a solution?
My streaming link is RTMP.
RTMP is a Flash technology, and only plays in Flash or other players that support it. No browser supports RTMP, and it's unlikely that any will in the future.
If you want to use a regular HTML5 player, you need to use a compatible streaming format. Consider DASH. While it doesn't have native support in-browser, it doesn't need it, as it can be handled with Media Source Extensions (MSE). Most modern browsers support MSE, many encoders can produce DASH output, and you can use whatever static web hosting or CDN you want.
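As a rough illustration, playback with the open-source dash.js reference player comes down to a few lines (the manifest URL below is a placeholder):

```html
<video id="player" controls></video>
<script src="https://cdn.dashjs.org/latest/dash.all.min.js"></script>
<script>
  // dash.js fetches the MPD manifest, picks a bitrate, and feeds the
  // segments to the browser through Media Source Extensions.
  var player = dashjs.MediaPlayer().create();
  player.initialize(document.querySelector('#player'),
                    'https://example.com/stream.mpd', true /* autoplay */);
</script>
```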
There are other options for video distribution as well, if you have special streaming requirements.

How to stream an MKV file using HTML/JavaScript?

Is there any way to stream MKV files on a web page using JavaScript/HTML or any other technology? I found many questions about this, but I really want to know the answer: is this possible in any way? Maybe Ajax, JavaScript, PHP, HTML? Maybe some external library? Anything?
I was also wondering how YouTube works. Is it possible to upload an MKV file? If so, how are those videos streamed to the end user?
I know that browsers don't support MKV natively, but maybe there is some way of forcing HTML to do it?
Any help will be appreciated.
YouTube most probably works using the DASH format. On the server side, the source audio and source video are separately divided into segments of different bitrates/qualities. A manifest keeps an index of all available segments and their locations. This allows the player to switch quality during playback.
On the client side, the DASH manifest (the flow should be the same with the other main technology, HLS) is used by the player to locate the segments to load, in order to feed the content into two separate SourceBuffers, one for audio and one for video, both played back synchronously through the MediaSource. For an example player that handles this, see the Shaka Player developed by Google.
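Stripped of manifest handling and quality switching, the MSE flow looks roughly like this (the segment URL and codec strings are placeholders; a real player such as Shaka does much more, including appending the initialization segment first):

```js
const video = document.querySelector('video');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  // One SourceBuffer per track; the codec strings must match the segments.
  const audioBuf = mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
  const videoBuf = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E"');
  // In a real player, segment URLs come from the DASH manifest.
  const segment = await fetch('video-segment-0.m4s');
  videoBuf.appendBuffer(await segment.arrayBuffer());
});
```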
In conclusion, there is no need for a container like MKV; each channel (video, audio) just needs to point to segments encoded with a browser-supported codec.
You don't need anything special for streaming pre-recorded media files. A normal HTTP/1.1 or HTTP/2 server will work just fine. The browser is generally capable of seeking into the file using range requests.
Matroska (MKV) is a container format, and it actually is widely supported because it's basically the same as WebM. WebM is a subset of Matroska, the key difference being that WebM specifies which codecs may be used. (Matroska itself supports almost anything.)
Your audio and video tracks in the file can use a variety of codecs... the key is to use codecs compatible with browsers. Opus for audio and VP8 for video will take you far.
From there, simply reference your video file in a <video> tag.
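For example (the file name is a placeholder):

```html
<!-- A WebM file with VP8 video and Opus audio plays natively in most
     modern browsers; the server only needs to support range requests. -->
<video controls preload="metadata">
  <source src="movie.webm" type='video/webm; codecs="vp8, opus"'>
</video>
```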

How to embed an RTMP live stream?

I want to embed an RTMP live stream in an HTML document. I want to use HTML5 instead of Flash (so that it works on *nix/OS X/mobile devices).
How can I do this? Do I need to use third-party libraries? If yes, can you recommend one?
I've found an answer on StackOverflow but it wasn't very helpful. Since the answer was from 2011 I guess it's okay to ask this question again.
RTMP was designed for Flash and works with Flash. I'm not aware of a way to embed it in HTML5 without a Flash engine.
Considering the above you could:
write or find a specialized player that can talk to an RTMP server and play the stream without Flash, but this defeats your intention of embedding the video in a web page
or
create two streams from the same source, one for each target device. This can be achieved by transcoding the source material into multiple formats, or by live transcoding and re-streaming the RTMP source. You could use HLS as an alternative protocol, as it is supported on a greater number of platforms, even if it has its hiccups with certain versions of Android (especially 4.4.3 and 4.4.4).
There are paid and freeware solutions for RTMP re-streaming, like Nimble or Wowza Streaming Engine to name a few.
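If you go the HLS route, browsers without native support can still play it through Media Source Extensions using the open-source hls.js library. A sketch, with a placeholder stream URL:

```html
<video id="live" controls></video>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  var video = document.querySelector('#live');
  var src = 'https://example.com/live/stream.m3u8';
  if (Hls.isSupported()) {
    var hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(video);   // hls.js feeds segments through MSE
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    video.src = src;          // Safari and iOS play HLS natively
  }
</script>
```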

Do web-based radio and audio streaming services use the Web Audio API for playback?

I'm trying to figure out whether web-based audio streaming sites use the Web Audio API for playback, or whether they rely on the audio element or something else.
Since the user of an audio streaming service typically doesn't need more functionality than starting and stopping the audio, I guess the audio element is enough. If a VU meter is required, I would guess the Web Audio API would be used, since it has a built-in analyser node. But since IE doesn't support the API, I suppose you'd rather use the audio element and reach the IE users than add fancy extras such as a VU meter.
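For reference, the kind of analyser-based VU meter I have in mind would look something like this (a rough sketch):

```js
const audioCtx = new AudioContext();
const audioEl = document.querySelector('audio');
const source = audioCtx.createMediaElementSource(audioEl);
const analyser = audioCtx.createAnalyser();
source.connect(analyser);
analyser.connect(audioCtx.destination); // keep the audio audible

const levels = new Uint8Array(analyser.frequencyBinCount);
(function draw() {
  analyser.getByteFrequencyData(levels);
  const volume = levels.reduce((a, b) => a + b, 0) / levels.length;
  // ...update the meter UI with `volume` here...
  requestAnimationFrame(draw);
})();
```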
I've been looking at the source code for Spotify's web player, Grooveshark, BBC radio, and Polish public radio, but I find neither audio elements nor use of the Web Audio API. I did find that Swedish public radio (sr.se) makes use of the audio element, though.
I'm not asking for anyone to go through the JavaScript source code for me, but rather if someone who is familiar with the subject could point me in the right direction.
I don't know of any internet radio services playing back their streams with the Web Audio API currently, but I wouldn't be surprised to find one. I've been working on one myself using Audiocogs' excellent Aurora.js library, which enables codecs in-browser that wouldn't normally be available by decoding the audio with JavaScript. However, for compatibility reasons, as you have pointed out, this would be considered a bit experimental today.
Most internet radio stations use progressive HTTP streaming (SHOUTcast/Icecast style) which can be played back within an <audio> element or Flash. This works well but can be hard to get right, especially if you use SHOUTcast servers as they are not quite 100% compatible with HTTP, hurting browser support in some versions of Firefox and a lot of mobile browsers. I ended up writing my own server called AudioPump Server to get better browser and mobile browser support with HTTP progressive.
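Playing such a stream in a browser is as simple as pointing an audio element at the mount point (the URL below is a placeholder):

```html
<audio controls src="http://radio.example.com:8000/mountpoint"></audio>
```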
Depending on your Flash code and ActionScript version available, you might also have to deal with memory leaks in creative ways, since by default Flash will keep all of your stream data in memory indefinitely as it was never built to stream over HTTP. Many use RTMP with Flash (with Wowza or similar on the server), which Flash was built to stream with to get around this problem.
iOS supports HLS, which is basically a collection of static files served by an HTTP server. The encoder writes a chunk of the stream to each file as the encoding occurs, and the client just downloads them and plays them back seamlessly. The benefit here is that the client can choose a bitrate to stream, raising quality up and down as network conditions change. This also means that you can completely switch networks (say, from WiFi to 3G) and still maintain the stream, since chunks are downloaded independently and statelessly. Android "supports" HLS, but its implementation is buggy. Safari is the only browser currently supporting HLS.
Compatibility detection is not something you need to solve yourself. There are many players, such as jPlayer and JW Player which wrangle HTML5 audio support detection, codec support detection, and provide a common API between playback for HTML5 audio and Flash. They also provide an optional UI if you want to get up and running quickly.
Finally, most stations do offer a link to allow you to play the stream in your own media player. This is done by linking to a playlist file (usually M3U or PLS) which is downloaded and often immediately opened (as configured by the user and their browser). The player software loads this playlist and then connects directly to the streaming server to begin playback. On Android, you simply link to the stream URL. It will detect the Content-Type response header, disconnect, and open its configured media player for playback. These days you have to hunt to find these direct links, but they are out there.
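Such a playlist file can be as small as a single line. For example, a station.m3u might contain nothing but the stream URL (a placeholder here):

```
http://stream.example.com:8000/live
```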
If you ever want to know what a station is using without digging around in their compiled and minified source code, simply use a tool like Fiddler or Wireshark and watch the traffic. You will find that it is very straightforward under the hood.
We use Web Audio for streaming via Aurora.js using a protocol very similar to HTTP Live Streaming. We did this because we wanted the same streaming backend to serve iPhone, Android and the web.
It was all a very long and painful process that took over 6 months of effort, but now that it's all finished, it's all good.
Have a look at http://radioflote.com and feel free to shoot questions or clarifications regarding anything. Go ahead and disassemble the code if you want to. Not a problem.

How to do cross-browser/device audio capture

I would like to clarify certain things based on what I found, and raise some questions about things I don't know:
Capturing the camera/mic through the browser can be done with getUserMedia() (see the sketch after this list);
Is there anything for iDevices? getUserMedia() doesn't seem to work on them.
How can I capture the actual audio from a web browser application (e.g. if I play an audio file and skip forward 2 minutes, I would like to capture the actual audio stream from the HTML5 player so that I hold the actual audio data)?
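For browsers where capture does work, the code I'm using looks roughly like this (a sketch; MediaRecorder availability varies by browser version):

```js
navigator.mediaDevices.getUserMedia({ audio: true })
  .then(function (stream) {
    var recorder = new MediaRecorder(stream);
    var chunks = [];
    recorder.ondataavailable = function (e) { chunks.push(e.data); };
    recorder.onstop = function () {
      var blob = new Blob(chunks); // e.g. POST this blob to the server
    };
    recorder.start();
    setTimeout(function () { recorder.stop(); }, 5000); // record 5 seconds
  })
  .catch(function (err) { console.error('Mic access denied:', err); });
```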
If you don't need to support mobile devices, you can use Flash. One good solution is wami-recorder.
From their website:
The Problem
As of this writing, most browsers still do not support WebRTC's getUserMedia(), which promises to give web developers microphone access via Javascript. This project achieves the next best thing for browsers that support Flash. Using the WAMI recorder, you can collect audio on your server without installing any proprietary media server software.
The Solution
The WAMI recorder uses a light-weight Flash app to ship audio from client to server via a standard HTTP POST. Apart from the security settings to allow microphone access, the entire interface can be constructed in HTML and Javascript.
Hope this helps.