WinRT MediaElement not working with InMemoryRandomAccessStream

We loaded a video as a byte array, created an InMemoryRandomAccessStream over this array, and passed it to MediaElement.SetSource. In the UI the MediaElement shows the message "Invalid Source". We tried saving this stream to a file and reading a new stream from that file, and that works perfectly. Both streams are identical (we checked this using SequenceEqual).
What is the problem?
Part of our code:
var stream = await LoadStream();
mediaElement.SetSource(stream, @"video/mp4");
...
public async Task<IRandomAccessStream> LoadStream()
{
    ...
    var writeStream = part.ParentFile.AccessStream.AsStreamForWrite();
    foreach (var filePart in part.ParentFile.Parts)
    {
        writeStream.Write(filePart.Bytes, 0, filePart.Bytes.Length);
    }
    writeStream.Seek(0, SeekOrigin.Begin);
    return part.ParentFile.AccessStream;
}
P.S. The MIME type is definitely correct.
Thanks!
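One hedged guess, since LoadStream is shown only in part: the .NET Stream returned by AsStreamForWrite() buffers writes, so unless it is flushed, the bytes may never reach the underlying IRandomAccessStream, and Seek on the wrapper does not necessarily reposition the WinRT stream that MediaElement reads. A minimal sketch of that idea, reusing the names from the snippet above:

// Sketch (assumption, not a confirmed fix): flush the .NET wrapper so
// buffered bytes reach the underlying IRandomAccessStream, then rewind
// the WinRT stream itself before handing it to MediaElement.SetSource.
var writeStream = part.ParentFile.AccessStream.AsStreamForWrite();
foreach (var filePart in part.ParentFile.Parts)
{
    writeStream.Write(filePart.Bytes, 0, filePart.Bytes.Length);
}
await writeStream.FlushAsync();       // push buffered data into the WinRT stream
part.ParentFile.AccessStream.Seek(0); // rewind the IRandomAccessStream directly
return part.ParentFile.AccessStream;

Saving to a file and re-reading works because the file round trip flushes and rewinds implicitly, which would be consistent with this theory.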

Related

IO_Error opening file using Adobe Air App on a Mac

I have an Adobe Air application that opens a CSV file on a Mac. I am getting an error which I am not getting with the same app on Windows, so I am thinking something about file locations, etc. is amiss. Here is the code:
var file: File = File.applicationStorageDirectory;
file = file.resolvePath("A&P plans");
file.addEventListener(FileListEvent.SELECT_MULTIPLE, filesSelected);

function filesSelected(event: FileListEvent): void {
    //trace(event.files.length);
    fileList = new Array();
    fileNames = new Array();
    for (var i: uint = 0; i < event.files.length; i++) {
        fileList.push(event.files[i].nativePath);
        trace("name of file loaded is ", event.files[i].name, "where :", event.files[i].nativePath);
    }
}

var urlRequest: URLRequest = new URLRequest(fileList[0]);

function openCSVFile(): void {
    csv = new CSV(urlRequest);
    csv.addEventListener(Event.COMPLETE, onComplete);
    csv.addEventListener(IOErrorEvent.IO_ERROR, onErrorOpening);
    function onComplete(event: Event): void {
        trace("file open successful");
    }
    function onErrorOpening(event: IOErrorEvent): void {
        trace("error opening file");
    }
}
The URLRequest trace shows the location where it should be, so the app knows where to look, and it does find the file. Here is the result of the trace: /Applications/myApp.app/Contents/Resources/A&P plans/majors/NationalLeague.csv. Yet, instead of the complete event showing its trace, the error event is invoked. Any ideas where to look for the issue? The file does not have any weird characters in its name or anything. Tracing the error shows the following: Error #2032: Stream Error. Thanks
Quite possibly the ampersand and the space in 'A&P plans' are troublesome.
Try removing or changing those.

getUserMedia and MediaRecorder in HTML

I record audio via getUserMedia and MediaRecorder:
...
navigator.mediaDevices.getUserMedia(constraints).then(mediaStream => {
    try {
        const mediaRecorder = new MediaRecorder(mediaStream);
        mediaRecorder.ondataavailable = vm.mediaDataAvailable;
        mediaRecorder.start(1000);
...
When I receive the chunks in the callback, I send them to a web API via WebSockets, which simply writes the parts one after another to a file:
mediaDataAvailable(e) {
    if (!event.data || event.data.size === 0) {
        return;
    }
    vm.websocket.SendBlob(e.data);
...
The file created on the web-server side in the web API (let's call it "server.webm") does not work correctly. More exactly: it plays the first n seconds (n is the interval I chose for the start command), then it stops. This means the first chunk is transferred correctly, but appending the 2nd, 3rd, ... chunks to the file does not seem to work. If I instead push the chunks onto an array in the web page and then save them to a file, the resulting file (let's call it "client.webm") plays for the whole recording duration.
Creating the file on the web client side:
mediaDataAvailable(e) {
    if (!event.data || event.data.size === 0) {
        return;
    }
    vm.chuncks.push(e.data);
...

stopCapturing() {
    var blob = new Blob(this.chuncks, { type: 'audio/webm' });
    var url = window.URL.createObjectURL(blob);
    var a = document.createElement('a');
    a.style.display = 'none';
    a.href = url;
    a.download = 'client.webm';
    document.body.appendChild(a);
    a.click();
I compared the files client.webm and server.webm. They are quite similar, but there are certain parts in server.webm that are not in client.webm.
Anybody have any ideas? The server code looks like the following:
private async Task Echo(HttpContext con, WebSocket webSocket)
{
    System.IO.BinaryWriter Writer = new System.IO.BinaryWriter(System.IO.File.OpenWrite(@"server.webm"));
    var buffer = new byte[1024 * 4];
    WebSocketReceiveResult result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
    while (!result.CloseStatus.HasValue)
    {
        Writer.Write(buffer);
        Writer.Flush();
        await webSocket.SendAsync(new ArraySegment<byte>(buffer, 0, result.Count), result.MessageType, result.EndOfMessage, CancellationToken.None);
        result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
    }
    await webSocket.CloseAsync(result.CloseStatus.Value, result.CloseStatusDescription, CancellationToken.None);
    Writer.Close();
}
Resolved: in the server code I was writing every byte of the reserved byte array to the file, not just the count of bytes actually received.
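In other words, the fix is a one-liner in the Echo method above; a sketch of it, using the same variable names, would be:

// Write only the bytes received in this WebSocket frame, not the whole
// 4 KB buffer; padding each chunk with stale bytes corrupts the WebM
// container after the first cluster.
Writer.Write(buffer, 0, result.Count);
Writer.Flush();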

Play audio from an ArrayBuffer of bytes?

I have an ArrayBuffer that comes from a REST endpoint I created with Java, which returns a byte[] array. I managed to get the array over HTTP (I am using Angular), and now I want to play the audio in the browser. I did some research and found the Web Audio API, but the error is that I cannot decode the array.
context = new AudioContext();
audioArray: ArrayBuffer;
buf;

let arrayBuffer = new ArrayBuffer(this.audioArray.byteLength);
let bufferView = new Uint8Array(arrayBuffer);
for (let i = 0; i < this.audioArray.byteLength; i++) {
    bufferView[i] = this.audioArray[i];
}
console.log(arrayBuffer);
// this function should decode the array but the error occurs
// DOMException: Unable to decode audio data
this.context.decodeAudioData(this.audioArray).then((buffer) => {
    this.buf = buffer;
    this.play();
}).catch((error) => {
    console.log(error);
});
console.log(this.audioArray);

play() {
    let source = this.context.createBufferSource();
    source.buffer = this.buf;
    source.connect(this.context.destination);
    source.start(0);
}
And I am getting the array in the ngOnInit function by calling the REST API:
this.radioService.generateVoiceMethode().subscribe(res => {
    console.log(res);
    this.audioArray = new ArrayBuffer(res.byteLength);
    this.audioArray = res;
    console.log(this.audioArray);
});
Though this is an old question, I am sharing an answer that worked for me. It may be useful for someone in the future.
async play(data: ArrayBuffer) {
    const context = new AudioContext();
    const buffer = await context.decodeAudioData(data);
    const source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(context.destination);
    source.start();
}
The code above works in Angular 8.

WebAudio streaming with fetch: DOMException: Unable to decode audio data

I'm trying to play an infinite stream coming from the fetch API using Chrome 51 (a webcam audio stream: Microsoft PCM, 16-bit, mono, 11025 Hz).
The code works almost OK with MP3 files, apart from some glitches, but it does not work at all with WAV files; for some reason I get "DOMException: Unable to decode audio data".
The code is adapted from this answer: Choppy/inaudible playback with chunked audio through Web Audio API
Any idea if it's possible to make it work with WAV streams?
function play(url) {
    var context = new (window.AudioContext || window.webkitAudioContext)();
    var audioStack = [];
    var nextTime = 0;

    fetch(url).then(function(response) {
        var reader = response.body.getReader();
        function read() {
            return reader.read().then(({ value, done }) => {
                context.decodeAudioData(value.buffer, function(buffer) {
                    audioStack.push(buffer);
                    if (audioStack.length) {
                        scheduleBuffers();
                    }
                }, function(err) {
                    console.log("err(decodeAudioData): " + err);
                });
                if (done) {
                    console.log('done');
                    return;
                }
                read();
            });
        }
        read();
    });

    function scheduleBuffers() {
        while (audioStack.length) {
            var buffer = audioStack.shift();
            var source = context.createBufferSource();
            source.buffer = buffer;
            source.connect(context.destination);
            if (nextTime == 0)
                nextTime = context.currentTime + 0.01; // add a small latency to work well across systems - tune this if you like
            source.start(nextTime);
            nextTime += source.buffer.duration; // make the next buffer wait the length of the last buffer before being played
        }
    }
}
Just use play('/path/to/mp3') to test the code. (The server needs to have CORS enabled, or be on the same domain you run the script from.)
AudioContext.decodeAudioData just isn't designed to decode partial files; it's intended for "short" (but complete) files. Due to the chunking design of MP3, it sometimes works on MP3 streams, but wouldn't on WAV files. You'll need to implement your own decoder in this case.
Making the WAV stream sound correct means adding WAV headers to the chunks, as Raymond suggested, plus some Web Audio magic and packet-ordering checks.
Some cool guys helped me set up this module to handle just that, and it works beautifully in Chrome: https://github.com/revolunet/webaudio-wav-stream-player
It now works on Firefox 57+ with some config flags turned on: https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream/getReader#Browser_compatibility
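For reference, the WAV header the chunks are missing is just 44 fixed bytes for the question's format (16-bit mono PCM at 11025 Hz). The sketch below, in C# purely for illustration (it is not part of the linked module), shows the layout a decoder expects to find prepended to the raw PCM data:

using System.IO;
using System.Text;

static byte[] WavHeader(int dataLength, short channels = 1, int sampleRate = 11025, short bitsPerSample = 16)
{
    // Standard 44-byte RIFF/WAVE header for uncompressed PCM;
    // BinaryWriter writes little-endian, which is what WAV requires.
    short blockAlign = (short)(channels * bitsPerSample / 8);
    int byteRate = sampleRate * blockAlign;

    using var ms = new MemoryStream(44);
    using var w = new BinaryWriter(ms);
    w.Write(Encoding.ASCII.GetBytes("RIFF"));
    w.Write(36 + dataLength);                 // total size minus the first 8 bytes
    w.Write(Encoding.ASCII.GetBytes("WAVE"));
    w.Write(Encoding.ASCII.GetBytes("fmt "));
    w.Write(16);                              // PCM "fmt " chunk is 16 bytes long
    w.Write((short)1);                        // audio format 1 = PCM
    w.Write(channels);
    w.Write(sampleRate);
    w.Write(byteRate);
    w.Write(blockAlign);
    w.Write(bitsPerSample);
    w.Write(Encoding.ASCII.GetBytes("data"));
    w.Write(dataLength);                      // size of the PCM payload that follows
    return ms.ToArray();
}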

How to encode IImageProvider as a PNG image?

Assuming I have a Lumia Imaging SDK rendering chain set up, with a final IImageProvider object that I want to render, how do I encode it into a PNG image?
The Lumia Imaging SDK supports PNG images as input; however, there isn't a "PNG renderer" available in the SDK.
Luckily, if you are developing for Windows 8.1 (Store application / universal application / Windows Phone 8.1 project), there is a Windows encoder (Windows.Graphics.Imaging.BitmapEncoder) you can use.
Assuming the IImageProvider you want to render is called "source", here is a code snippet you can use to encode the resulting image as PNG:
using Lumia.Imaging;
using Windows.Graphics.Imaging;
using System.IO;
...
using (var renderer = new BitmapRenderer(source, ColorMode.Bgra8888))
{
    var bitmap = await renderer.RenderAsync();
    byte[] pixelBuffer = bitmap.Buffers[0].Buffer.ToArray();

    using (var stream = new InMemoryRandomAccessStream())
    {
        var encoder = await BitmapEncoder.CreateAsync(BitmapEncoder.PngEncoderId, stream).AsTask().ConfigureAwait(false);
        encoder.SetPixelData(BitmapPixelFormat.Bgra8, BitmapAlphaMode.Straight, (uint)bitmap.Dimensions.Width, (uint)bitmap.Dimensions.Height, 96, 96, pixelBuffer);
        await encoder.FlushAsync().AsTask().ConfigureAwait(false);

        // If InMemoryRandomAccessStream (IRandomAccessStream) works for you, end here.
        // If you need an IBuffer, here is how you get one:
        using (var memoryStream = new MemoryStream())
        {
            memoryStream.Capacity = (int)stream.Size;
            var ibuffer = memoryStream.GetWindowsRuntimeBuffer();
            stream.Seek(0); // rewind the stream before reading the encoded PNG back out
            await stream.ReadAsync(ibuffer, (uint)stream.Size, InputStreamOptions.None).AsTask().ConfigureAwait(false);
        }
    }
}
This will give you bytes in memory as either InMemoryRandomAccessStream (IRandomAccessStream) or an IBuffer depending on what you need. You can then save the buffer to disk or pass it to other parts of your application.
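If saving to disk is the goal, here is a short follow-up sketch. It assumes it runs inside the using block above, while stream is still alive, and "output.png" is a hypothetical file name of your choosing:

using Windows.Storage;
using Windows.Storage.Streams;

// "output.png" is a placeholder name; any StorageFile target works.
var file = await ApplicationData.Current.LocalFolder.CreateFileAsync(
    "output.png", CreationCollisionOption.ReplaceExisting);

using (var fileStream = await file.OpenAsync(FileAccessMode.ReadWrite))
{
    stream.Seek(0); // rewind the in-memory PNG before copying
    await RandomAccessStream.CopyAsync(stream, fileStream);
}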