How to analyse audio that comes in via HTML5 input? - html5-audio

How could I analyse the pitch of audio that comes in via HTML5 audio input (for example, Chrome can do this)? I'm talking about a user singing into their microphone, with the application analysing the pitch to tell them what note they are singing (or how close they are to hitting that note).
Pocket Full Of HTML5 seems to be able to nicely analyse audio but it doesn't use live input.
Do you have any tips for me?
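A rough sketch of the kind of approach this implies, assuming live input via getUserMedia() feeding a Web Audio AnalyserNode, with a deliberately naive autocorrelation standing in for a real pitch detector (all names below are illustrative, not from any library):

// Minimal sketch, not production pitch detection: capture the microphone,
// read the raw waveform from an AnalyserNode, and estimate pitch with a
// very naive (and slow) autocorrelation.
navigator.mediaDevices.getUserMedia({ audio: true }).then(function (stream) {
  var ctx = new AudioContext();
  var analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  ctx.createMediaStreamSource(stream).connect(analyser);

  var buf = new Float32Array(analyser.fftSize);

  function autoCorrelate(samples, sampleRate) {
    // Find the lag with the strongest self-similarity and convert it to Hz.
    var bestLag = -1, bestCorr = 0;
    for (var lag = 40; lag < samples.length / 2; lag++) {
      var corr = 0;
      for (var i = 0; i < samples.length / 2; i++) {
        corr += samples[i] * samples[i + lag];
      }
      if (corr > bestCorr) { bestCorr = corr; bestLag = lag; }
    }
    return bestLag > 0 ? sampleRate / bestLag : null;
  }

  (function tick() {
    analyser.getFloatTimeDomainData(buf);
    var freq = autoCorrelate(buf, ctx.sampleRate);
    if (freq) console.log('~' + freq.toFixed(1) + ' Hz'); // map to a note name as needed
    requestAnimationFrame(tick);
  })();
});

From the estimated frequency you can map to the nearest note and tell the singer how far off they are.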

Related

System Sound as Input (Web Audio API)

Is there any way to take the system's sound as an input for the Web Audio API?
For example take a user's spotify music that is playing as an input and analyze it to do stuff with it?
There is a specification which almost does what you want. It's the counterpart of the Media Capture and Streams spec (aka getUserMedia()).
navigator.mediaDevices.getDisplayMedia();
The spec is called Screen Capture and as the name implies it is meant for capturing video and not audio. Capturing audio is optional and not yet supported by any browser if I recall correctly.
There is currently an open issue discussing audio-only support. Maybe it's a good idea to share your use case there.
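If a browser does implement the optional audio part, the call would look roughly like the sketch below; treat this as an assumption about eventual support rather than something that works everywhere today:

// Sketch only: audio capture with getDisplayMedia() is optional in the spec,
// so the returned stream may simply have no audio tracks.
navigator.mediaDevices.getDisplayMedia({ video: true, audio: true })
  .then(function (stream) {
    if (stream.getAudioTracks().length === 0) {
      console.warn('This browser did not provide an audio track.');
      return;
    }
    var ctx = new AudioContext();
    var source = ctx.createMediaStreamSource(stream);
    var analyser = ctx.createAnalyser();
    source.connect(analyser);
    // ...read analyser data here to do stuff with the captured audio...
  });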

On play preloading video

How to do:
Preloading video on play
Change resolution
Change language
I see these techniques used on multiple platforms, but I can't understand which API they use to do these things. Does anyone have an idea?
(Sorry for the ignorance, I'm a junior developer.)
If you are using HTML5 in a browser, then there is a set of events which HTML5 video playback generates at different stages. You can register to monitor these and then take whatever action you want, for example changing the language.
There is a really nice illustration of the various events and their values when a video plays (at the time of writing) here:
https://www.w3.org/2010/05/video/mediaevents.html
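As a rough illustration (the element selector and handler bodies are made up for the example), registering for those events looks like this:

// Illustrative only: listen for playback events on an HTML5 <video> element
// and react to them (e.g. swap text tracks or sources).
var video = document.querySelector('video');

video.addEventListener('loadedmetadata', function () {
  console.log('Duration known:', video.duration);
});

video.addEventListener('play', function () {
  console.log('Playback started');
});

video.addEventListener('progress', function () {
  // Fired as data is buffered; video.buffered describes the loaded ranges.
  if (video.buffered.length) {
    console.log('Buffered up to', video.buffered.end(video.buffered.length - 1));
  }
});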
It's worth noting that the types of features you are describing are usually built into an HTML5 video player if it is on a website, or a native player if it is on mobile.
For example, in a browser the open source DASH-IF reference player, https://reference.dashif.org/dash.js/latest/samples/dash-if-reference-player/index.html, will change resolutions for ABR streams and provides the user with controls to change the resolution manually and to set the language.
On Android the open source ExoPlayer, https://github.com/google/ExoPlayer, provides similar functionality, as does the built-in media library on iOS, https://developer.apple.com/documentation/avfoundation?language=objc.
To understand ABR streams and how and why they change resolution, this answer may help: https://stackoverflow.com/a/42365034/334402
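For example, attaching the dash.js player behind that reference page to your own video element is roughly this (the manifest URL is a placeholder, and the dash.js script is assumed to be loaded on the page):

// Rough sketch of initialising dash.js against an ABR (MPEG-DASH) manifest.
// The player then handles resolution switching automatically and exposes
// controls for manual selection.
var url = 'https://example.com/stream/manifest.mpd'; // placeholder manifest
var player = dashjs.MediaPlayer().create();
player.initialize(document.querySelector('video'), url, /* autoplay */ true);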

Tech for navigable audio recording database, from IIS to browser

In short, I want to record presentation audio, create time markers for that audio in a database, and then provide marker-navigation of that audio content from a web page. What technology (e.g. HTML5 Audio, RTMP) can support this?
My requirements in more detail:
quickly navigate to server-side marked points in server-stored audio, from the browser.
avoid any proprietary client-side software, such as Silverlight; although, due to its penetration, Flash is acceptable, and future-looking standards like HTML5 media are acceptable provided they ship with the latest browsers.
prefer to leave the 30-50 minute audio files un-split rather than pre-splitting on the selected markers; so that the markers can be seamlessly changed later.
like to keep licensing costs minimal; although single-purchase licensed server-side technologies are fine.
prefer to do most of this from IIS, where I have the most experience. However, a parallel streaming server such as Adobe's with Windows APIs is acceptable.
Here are my best guesses on a solution so-far:
the presentations will be compressed and stored in mp3 files (but really, any advice on an easily seekable format for speech recording is welcome).
the client will play a unicast stream rather than download file chunks (although TBR, below, challenges this assumption)
HTML5 is not ready, so Flash will be required at the client
IIS Media Services is a no-go as it requires Silverlight for seekable audio
The leading products in this space, such as Adobe Media Server 5, are probably large kits with their focus on video media. I can probably find a more focused tool like Icecast to reliably do what I need.
OK, I'll bite. I went and looked for how people actually address this, and it's as I said in my comment.
Setting HTML5 audio position (it's so close it almost makes this question a dupe)
I've also found this nice blog post from 2009 that describes the further technical possibilities and options. The latter part revolves around advanced video use cases and Ogg, but the first part should apply just fine to <audio>.
http://gingertech.net/2009/08/19/jumping-to-time-offsets-in-videos/
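In practice the jump itself is just a matter of setting currentTime on the element once its metadata is available; a minimal sketch, assuming the markers arrive from the server as label/time pairs (your server must support HTTP range requests for seeking into the un-split files to work):

// Sketch: jump an <audio> element to a server-stored marker (in seconds).
// The markers array is an assumed shape, e.g. fetched from your database.
var audio = document.querySelector('audio');
var markers = [{ label: 'Introduction', time: 0 },
               { label: 'Question 1', time: 754.2 }];

function seekToMarker(marker) {
  // Seeking only works once the metadata (duration etc.) has loaded.
  if (audio.readyState >= 1 /* HAVE_METADATA */) {
    audio.currentTime = marker.time;
    audio.play();
  } else {
    audio.addEventListener('loadedmetadata', function () {
      audio.currentTime = marker.time;
      audio.play();
    }, { once: true });
  }
}

seekToMarker(markers[1]);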

Does web based radio and audio streaming services use the Web Audio API for playback?

I'm trying to figure out if web based audio streaming sites use the Web Audio API for playback or if they rely on the audio element or something else.
Since the user of an audio streaming service typically doesn't need more functionality than starting and stopping the audio, I guess that the audio element is enough. If a VU-meter is required, then I would guess the Web Audio API would be used, since it has a built-in analyser node. But since IE doesn't support the API, I suppose you'd rather use the audio element and reach the IE users than offer fancy extras such as a VU-meter.
I've been looking at the source code for Spotify's web player, Grooveshark, BBC radio and the Polish public radio, but I find neither audio elements nor use of the Web Audio API. I did find that the Swedish public radio (sr.se) makes use of the audio element, though.
I'm not asking for anyone to go through the JavaScript source code for me, but rather if someone who is familiar with the subject could point me in the right direction.
I don't know of any internet radio services playing back their streams with the Web Audio API currently, but I wouldn't be surprised to find one. I've been working on one myself using Audiocog's excellent Aurora.js library, which enables codecs in-browser that wouldn't normally be available, by decoding the audio with JavaScript. However, for compatibility reasons as you have pointed out, this would be considered a bit experimental today.
Most internet radio stations use progressive HTTP streaming (SHOUTcast/Icecast style) which can be played back within an <audio> element or Flash. This works well but can be hard to get right, especially if you use SHOUTcast servers as they are not quite 100% compatible with HTTP, hurting browser support in some versions of Firefox and a lot of mobile browsers. I ended up writing my own server called AudioPump Server to get better browser and mobile browser support with HTTP progressive.
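To tie this back to the VU-meter question: a stream like that can be played in an <audio> element and, where the Web Audio API is available, tapped with an AnalyserNode; a rough sketch (the stream URL is a placeholder):

// Sketch: play a progressive HTTP stream in an <audio> element and, where the
// Web Audio API is available, read levels from an AnalyserNode for a VU-meter.
var audio = new Audio('https://example.com/stream.mp3'); // placeholder URL
audio.crossOrigin = 'anonymous'; // needed if the stream is on another origin
audio.play();

if (window.AudioContext || window.webkitAudioContext) {
  var Ctx = window.AudioContext || window.webkitAudioContext;
  var ctx = new Ctx();
  var source = ctx.createMediaElementSource(audio);
  var analyser = ctx.createAnalyser();
  source.connect(analyser);
  analyser.connect(ctx.destination); // keep the audio audible

  var data = new Uint8Array(analyser.frequencyBinCount);
  (function tick() {
    analyser.getByteFrequencyData(data);
    var level = data.reduce(function (a, b) { return a + b; }, 0) / data.length;
    // ...draw `level` on your VU-meter...
    requestAnimationFrame(tick);
  })();
}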
Depending on your Flash code and the ActionScript version available, you might also have to deal with memory leaks in creative ways, since by default Flash will keep all of your stream data in memory indefinitely, as it was never built to stream over HTTP. To get around this problem, many use RTMP with Flash (with Wowza or similar on the server), since RTMP is what Flash was built to stream with.
iOS supports HLS, which is basically a collection of static files served by an HTTP server. The encoder writes a chunk of the stream to each file as the encoding occurs, and the client just downloads them and plays them back seamlessly. The benefit here is that the client can choose a bitrate to stream, raising quality up and down as network conditions change. This also means that you can completely switch networks (say from WiFi to 3G) and still maintain the stream, since chunks are downloaded independently and statelessly. Android "supports" HLS, but it is buggy. Safari is the only browser currently supporting HLS.
Compatibility detection is not something you need to solve yourself. There are many players, such as jPlayer and JW Player which wrangle HTML5 audio support detection, codec support detection, and provide a common API between playback for HTML5 audio and Flash. They also provide an optional UI if you want to get up and running quickly.
Finally, most stations do offer a link to allow you to play the stream in your own media player. This is done by linking to a playlist file (usually M3U or PLS) which is downloaded and often immediately opened (as configured by the user and their browser). The player software loads this playlist and then connects directly to the streaming server to begin playback. On Android, you simply link to the stream URL. It will detect the Content-Type response header, disconnect, and open its configured media player for playback. These days you have to hunt to find these direct links, but they are out there.
If you ever want to know what a station is using without digging around in their compiled and minified source code, simply use a tool like Fiddler or Wireshark and watch the traffic. You will find that it is very straightforward under the hood.
We use Web Audio for streaming via Aurora.js using a protocol very similar to HTTP Live Streaming. We did this because we wanted the same streaming backend to serve iPhone, Android and the web.
It was all a very long and painful process that took over 6 months of effort, but now that it's all finished, it's all good.
Have a look at http://radioflote.com and feel free to shoot questions or clarifications regarding anything. Go ahead and disassemble the code if you want to. Not a problem.

Capture image from camera into form or html5 canvas

I need to capture an image from a webcam (tethered camera, etc.) into a form or html5 canvas so that I can save the image to the server. Also, I would like to be able to preview the image live in the page.
For example, I have a browser running at a registration check in station. I would like to take a picture of the attendee currently standing in front of the table, and submit that image into the database. Then I can use that image to print the attendee's badge with their picture on it.
I'm using rails and paperclip, though I doubt that matters.
Anyone done this before, or have some ideas how to do it?
There is a plugin for jQuery entitled 'jQuery Webcam Plugin' that provides a friendly and easy way to interact with a webcam. It actually relies on a small Flash component (unfortunately), but it does a great job of making the interaction easy, as well as providing functionality to copy imagery directly into an HTML5 canvas.
Again, it's unfortunate that it relies on Flash, but I think any reliable solution is going to need flash at this point in time.
The plugin is available here: http://www.xarg.org/project/jquery-webcam-plugin/
At present, if you want to interact with a web cam from a web page you need to look at using a plug-in. Flash has a mature interface to web cams, so it would be my first choice of tool.
There used to be a spec for native web cam support in HTML5, but it has been spun out into its own, independent specification. Currently there is no browser support for it outside of experimental Opera builds.
Android >=3.0 (on plenty of tablets and one phone soon) is supposed to support this. Searching for "html media capture" and "device api" will get you a lot more information.
On the not-even-alpha bleeding-edge side, there are things like WebRTC and the Mozilla Rainbow plugin.
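For completeness, in browsers that do eventually ship that capture spec, the native flow looks roughly like the sketch below (the element ids and form field are placeholders for whatever your page uses):

// Sketch of the native capture flow described by the spec above, for browsers
// that implement it: preview the webcam in a <video>, snapshot a frame into a
// <canvas>, and put the resulting image data into a form field for the server.
navigator.mediaDevices.getUserMedia({ video: true }).then(function (stream) {
  var video = document.querySelector('video');
  video.srcObject = stream;
  video.play();

  document.querySelector('#capture').addEventListener('click', function () {
    var canvas = document.querySelector('canvas');
    canvas.width = video.videoWidth;
    canvas.height = video.videoHeight;
    canvas.getContext('2d').drawImage(video, 0, 0);

    // toDataURL() gives a base64 PNG you can drop into a hidden form field
    // and save server-side (e.g. with Rails + Paperclip).
    document.querySelector('#photo_data').value = canvas.toDataURL('image/png');
  });
});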