Buffering Audio Streams in Windows Phone 8

I am working on an Internet radio streaming application for Windows Phone and am researching best practices and different implementations. I am using the following code to read the stream:
private MemoryStream bufferStream;
private Stream stream;
...
byte[] data = new byte[2048];
int read;
// Reuse one writer instead of allocating a new one on every iteration.
BinaryWriter bw = new BinaryWriter(bufferStream);
// Stop when the stream is exhausted instead of looping forever.
while ((read = stream.Read(data, 0, data.Length)) > 0)
{
    // Write only the bytes actually read, not the whole 2048-byte buffer.
    bw.Write(data, 0, read);
    bw.Flush();
}
I am not sure whether this is an efficient way to do it. I have also seen a circular buffer implementation. What is the most efficient way to stream the music without any "hiccups", artifacts, or interruptions?

I found the phonesm project on CodePlex, which provides great examples and functionality for implementing internet audio streaming.
Have a look over here too:
http://www.c-sharpcorner.com/uploadfile/dhananjaycoder/smooth-streaming-on-windows-phone-7/
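Since you mention a circular buffer: for smoothing out network jitter, a single-reader/single-writer ring buffer between the network thread and the playback side is the usual approach. Here is a minimal sketch; the class and method names are illustrative, not from any particular library:
public class RingBuffer
{
    private readonly byte[] buffer;
    private readonly object sync = new object();
    private int head;   // next write position
    private int tail;   // next read position
    private int count;  // bytes currently buffered

    public RingBuffer(int capacity) { buffer = new byte[capacity]; }

    // Called by the network thread; returns how many bytes fit.
    public int Write(byte[] src, int offset, int length)
    {
        lock (sync)
        {
            int n = Math.Min(length, buffer.Length - count);
            for (int i = 0; i < n; i++)
            {
                buffer[head] = src[offset + i];
                head = (head + 1) % buffer.Length;
            }
            count += n;
            return n;
        }
    }

    // Called by the playback side; returns how many bytes were available.
    public int Read(byte[] dst, int offset, int length)
    {
        lock (sync)
        {
            int n = Math.Min(length, count);
            for (int i = 0; i < n; i++)
            {
                dst[offset + i] = buffer[tail];
                tail = (tail + 1) % buffer.Length;
            }
            count -= n;
            return n;
        }
    }
}
The point of the ring is that the network side can run ahead of playback by up to the buffer's capacity, which is what absorbs the hiccups.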
Hope this helps!

Related

Clap sound detection in libgdx

I am just curious. I am making a very simple game and I want to trigger the game logic when the user claps (i.e., makes a loud sound). Is this possible in libGDX?
There is an AudioRecorder interface in libGDX for exactly this. You can access raw PCM data from the microphone on a PC or an Android phone.
int samplingRate = 44100;
boolean isMono = true;
AudioRecorder recorder = Gdx.audio.newAudioRecorder(samplingRate, isMono);
This will create an AudioRecorder with a sampling rate of 44.1 kHz in mono mode. If the recorder couldn't be created, a GdxRuntimeException is thrown.
Samples can be read as 16-bit signed PCM:
int seconds = 5;
final short[] data = new short[samplingRate * seconds];
recorder.readSamples(data, 0, data.length);
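From there, a crude clap detector can simply look for a loud transient in the recorded samples. A minimal sketch follows; the threshold value is an assumption you would need to tune against real claps:
// Crude clap detector: flags any sample louder than a fixed threshold.
// ~60% of full scale is a guess; tune it by testing.
private static final int CLAP_THRESHOLD = (int)(Short.MAX_VALUE * 0.6);

private boolean containsClap(short[] pcm) {
    for (short sample : pcm) {
        if (Math.abs(sample) > CLAP_THRESHOLD) {
            return true;
        }
    }
    return false;
}
Also note that readSamples() blocks until the requested number of samples has been captured, so do the recording on a background thread rather than in the render loop.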
Audio recording is not supported in the JavaScript/WebGL backend.

Unauthorized Access Exception when Creating an instance of SpeechSynthesizer in WP8.1 Emulator

I was trying to recreate the simple text-to-speech example used on the MSDN website. However, whenever the code came to create the instance of the SpeechSynthesizer class, it failed with an UnauthorizedAccessException when running on the WP8.1 emulator. I currently do not have an actual device to test on to see if this makes a difference.
My code was simply:
private async void TTS()
{
// The media object for controlling and playing audio.
MediaElement mediaElement = new MediaElement();
// The object for controlling the speech synthesis engine (voice).
var synth = new Windows.Media.SpeechSynthesis.SpeechSynthesizer();
// Generate the audio stream from plain text.
SpeechSynthesisStream stream = await synth.SynthesizeTextToStreamAsync("Hello World");
// Send the stream to the media object.
mediaElement.SetSource(stream, stream.ContentType);
mediaElement.Play();
}
I know there was an issue with the SpeechSynthesizer in Windows 8.1, and I found solutions to that when looking to fix the problem, but found little about the problem with the WP8.1 SpeechSynthesizer. Has anybody else come across this problem and found a fix?
You should add a DeviceCapability in the Package.appxmanifest file:
In the Capabilities tab, check Microphone, because it provides access to the microphone's audio feed, which allows the app to record audio from connected microphones.
See this reference: App capability declarations (Windows Runtime apps)
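If you prefer to edit the manifest XML directly, the declaration looks like this:
<Capabilities>
  <!-- Grants access to the microphone's audio feed. -->
  <DeviceCapability Name="microphone" />
</Capabilities>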

Play live audio stream

I have a desktop application which streams raw PCM data to my browser over a websocket connection. The stream looks like this: ...\x00\x00\x02\x00\x01\x00\x00\x00\x01\x00\xff\xff\xff\xff\...
The question is simple: can I play such a stream in HTML with the Web Audio API / WebRTC / ...?
Any suggestions are very welcome!
Edit:
This code plays randomly generated noise:
function myPCMSource() {
    // White noise in the [-1, 1] range the Web Audio API expects.
    return Math.random() * 2 - 1;
}

var audioContext;
try {
    window.AudioContext = window.AudioContext || window.webkitAudioContext;
    audioContext = new AudioContext();
} catch(e) {
    alert('Web Audio API is not supported in this browser');
}

var bufferSize = 4096;
var myPCMProcessingNode = audioContext.createScriptProcessor(bufferSize, 1, 1);
myPCMProcessingNode.onaudioprocess = function(e) {
    var output = e.outputBuffer.getChannelData(0);
    for (var i = 0; i < bufferSize; i++) {
        output[i] = myPCMSource();
    }
};
// The node only produces sound once it is connected to the destination.
myPCMProcessingNode.connect(audioContext.destination);
So replacing myPCMSource() with the websocket stream input should make it work somehow. But it doesn't: I don't get any errors, yet the API plays no sound, not even noise.
Use a ScriptProcessorNode, but be aware that if there is too much load on the main thread (the thread that runs your JavaScript, draws the screen, etc.), it will glitch.
Also, your PCM stream is probably int16, while the Web Audio API works in terms of float32. Convert it like so:
output_float[i] = (input_int16[i] / 32767);
that is, go from the signed 16-bit range of [-32768; 32767] to the [-1.0; 1.0] range.
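Putting the two together, a sketch of feeding the websocket data into the ScriptProcessorNode might look like this (it assumes your server sends raw little-endian int16 PCM and that ws, audioContext and myPCMProcessingNode exist as in the question):
ws.binaryType = 'arraybuffer';
var sampleQueue = [];

ws.onmessage = function(event) {
    // Interpret each incoming message as int16 PCM and scale to [-1.0, 1.0].
    var int16 = new Int16Array(event.data);
    for (var i = 0; i < int16.length; i++) {
        sampleQueue.push(int16[i] / 32767);
    }
};

myPCMProcessingNode.onaudioprocess = function(e) {
    var output = e.outputBuffer.getChannelData(0);
    for (var i = 0; i < output.length; i++) {
        // Play queued samples; emit silence if the queue runs dry.
        output[i] = sampleQueue.length > 0 ? sampleQueue.shift() : 0;
    }
};
shift() on a plain array is fine for a sketch but slow at scale; a real implementation would use a ring buffer or a typed array with a read cursor instead.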
EDIT
I was using output_float[i] = (input_int16[i] / 32767 - 1);, but this article shows that you should use output_float[i] = (input_int16[i] / 32767);. Now it's working fine!
Just for the record, the ScriptProcessorNode is deprecated. See the MDN article for details. The feature was replaced by AudioWorklets and the AudioWorkletNode interface.
In short, a ScriptProcessorNode runs outside of the browser's internal audio thread, which creates at least one frame (128 samples) of latency. Worse, if the main thread is busy, the ScriptProcessorNode often fails to respond quickly enough and will randomly drop audio every so often.
Worklets are basically task-specific workers that run in one of the browser's internal threads (paint, layout, audio, etc.). Audio worklets run in the audio thread and implement the guts of custom audio nodes, which are then exposed through the Web Audio API as normal.
Note: You are also able to run WebAssembly inside worklets to handle the processing.
The solution provided above is still useful, as the basic idea holds, but it would ideally use an audio worklet.
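For illustration, a minimal worklet version of the noise generator above might look like this (the file and processor names are arbitrary):
// noise-processor.js — runs on the browser's audio thread.
class NoiseProcessor extends AudioWorkletProcessor {
    process(inputs, outputs) {
        const output = outputs[0][0];
        for (let i = 0; i < output.length; i++) {
            output[i] = Math.random() * 2 - 1; // white noise in [-1, 1]
        }
        return true; // keep the node alive
    }
}
registerProcessor('noise-processor', NoiseProcessor);

// Main thread:
// await audioContext.audioWorklet.addModule('noise-processor.js');
// const node = new AudioWorkletNode(audioContext, 'noise-processor');
// node.connect(audioContext.destination);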

Create webcam capture (like YouTube) on my website

I have a website that works much like YouTube. At the moment I am trying to capture video from the user's webcam.
The video should first be saved on the user's computer (in FLV format); then, if the user is satisfied, he or she can upload it to the server.
I am trying to use ActionScript 3 in Adobe Flash CS5 with Flash Media Server 4.
1. How can I do that?
2. Is Flash Media Server needed?
Please note that we would like to let the user save the video on his/her computer first and then upload it to the server.
Many thanks.
Assuming the computer can take the overhead of doing the encoding on the fly (or has enough memory to buffer the data and then run it through an encoding process), the library mentioned in the SO answer here should work:
Encode video from any format to .flv format in AS3
I believe Flash Media Server would only really be necessary in this case for broadcasting.
Pseudocode example
private var cam:Camera;
private var vid:Video;

public function Whatever()
{
    // In the constructor
    cam = Camera.getCamera();
    if (cam != null)
    {
        vid = new Video(cam.width, cam.height);
        vid.attachCamera(cam);
        addChild(vid);
        // Only start grabbing frames once the camera is attached.
        addEventListener(Event.ENTER_FRAME, grabFrame);
    }
}

private function grabFrame(event:Event):void
{
    var bd:BitmapData = new BitmapData(cam.width, cam.height);
    bd.draw(vid);
    // Now the BitmapData holds a frame of the video; at this point you would
    // also capture the audio, then use the FLV class in the library.
}
You can also check out Red5 as an alternative open-source video stream recorder.
http://distriqt.com/post/493
Cheers

How to load a ByteArray FLV in OSMF?

I'm working on a local application (it's not a website or anything related) and I have various FLVs with a very simple encryption method for now (just adding 10 to each byte).
I can load/play them using NetStream.appendBytes() after decrypting, but that only happens after I have read all the video data; it's not streamed.
What I really need is to stream those videos from a remote URL, decrypting while the data is received, using an OSMF-based player that I have already built.
I'm lost on how OSMF deals with FLV; otherwise I would try to create a plugin or something similar.
I'd be very thankful if someone pointed me to a way to deal with that.
I'd also be happy if someone helped me find a way to load a local file using OSMF, passing a ByteArray value instead of a URL (below), or even gave me directions for creating an OSMF plugin to solve my problem.
videoElement.resource = "video_url/video.flv";
This is my current code, which just plays my decoded FLV byte array:
private function playBytes(bytes:ByteArray):void
{
    // Check the FLV signature in the header.
    if (bytes.readUTFBytes(3) != "FLV")
    {
        _text.appendText("\nFile \"" + file + "\" is not an FLV");
        return;
    }
    bytes.position = 0;
    netConnection.connect(null);
    netStream = new NetStream(netConnection);
    netStream.client = { onMetaData: function(obj:Object):void { } };
    video.attachNetStream(netStream);
    addChild(video);
    // Put the NetStream into Data Generation mode.
    netStream.play(null);
    // Before appending new bytes, reset the position to the beginning.
    netStream.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
    // Append the FLV video bytes.
    netStream.appendBytes(bytes);
}
Interesting post, I'd be interested to see the answer. Looking at something similar myself, though not with a stream, I came across the following.
http://ntt.cc/2008/07/15/bitsreader-read-bits-from-given-bytearray.html
After passing it the byte array you can read the data back with bits.read(8). Perhaps this would send you down the correct path? Otherwise, I'm thinking you'd need to break the data apart and buffer smaller sections in order to concatenate all the buffered data...
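For the streaming-while-decrypting part specifically, URLStream fires ProgressEvents as data arrives, so you can decrypt and append chunk by chunk without touching OSMF. A sketch, assuming your add-10 scheme and that netConnection/netStream are already set up as in the playBytes() code above:
private var urlStream:URLStream = new URLStream();

private function startStreaming(url:String):void
{
    netStream.play(null);
    netStream.appendBytesAction(NetStreamAppendBytesAction.RESET_BEGIN);
    urlStream.addEventListener(ProgressEvent.PROGRESS, onChunk);
    urlStream.load(new URLRequest(url));
}

private function onChunk(e:ProgressEvent):void
{
    var chunk:ByteArray = new ByteArray();
    urlStream.readBytes(chunk, 0, urlStream.bytesAvailable);
    // Reverse the "add 10 to each byte" scheme, then feed the decoder.
    for (var i:int = 0; i < chunk.length; i++)
    {
        chunk[i] = (chunk[i] - 10) & 0xFF;
    }
    netStream.appendBytes(chunk);
}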
Just a thought,