Play audio from an ArrayBuffer of bytes? - html

I have an ArrayBuffer that comes from a REST endpoint I created with Java, which returns a byte[] array. I managed to get the array over HTTP (I am using Angular), and now I want to play the audio in the browser. I did some research and found the Web Audio API, but the error is that I cannot decode the array.
context = new AudioContext();
audioArray: ArrayBuffer;
buf;

let arrayBuffer = new ArrayBuffer(this.audioArray.byteLength);
let bufferView = new Uint8Array(arrayBuffer);
for (let i = 0; i < this.audioArray.byteLength; i++) {
  bufferView[i] = this.audioArray[i];
}
console.log(arrayBuffer);

// this function should decode the array but the error occurs
// DOMException: Unable to decode audio data
this.context.decodeAudioData(this.audioArray).then((buffer) => {
  this.buf = buffer;
  this.play();
}).catch((error) => {
  console.log(error);
});
console.log(this.audioArray);

play() {
  let source = this.context.createBufferSource();
  source.buffer = this.buf;
  source.connect(this.context.destination);
  source.start(0);
}
And I am getting the array in the ngOnInit function by calling the REST API:
this.radioService.generateVoiceMethode().subscribe(res => {
  console.log(res);
  this.audioArray = new ArrayBuffer(res.byteLength);
  this.audioArray = res;
  console.log(this.audioArray);
});
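Note that decodeAudioData needs the raw response bytes; if the HTTP response is parsed as JSON or text, decoding will fail with exactly this DOMException. A minimal sketch of the service side, assuming Angular's HttpClient and a hypothetical /api/voice endpoint URL:

import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable({ providedIn: 'root' })
export class RadioService {
  constructor(private http: HttpClient) {}

  generateVoiceMethode(): Observable<ArrayBuffer> {
    // responseType: 'arraybuffer' delivers the body as raw bytes,
    // which is what decodeAudioData expects ('/api/voice' is hypothetical)
    return this.http.get('/api/voice', { responseType: 'arraybuffer' });
  }
}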

Though this is an old question, I am sharing an answer that worked for me. It may be useful for someone in the future.
async play(data: ArrayBuffer) {
  const context = new AudioContext();
  const buffer = await context.decodeAudioData(data);
  const source = context.createBufferSource();
  source.buffer = buffer;
  source.connect(context.destination);
  source.start();
}
The code above works in Angular 8.
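For example, wired to the service call from the question, usage could look like this (a sketch, assuming generateVoiceMethode() is set up to deliver an ArrayBuffer as above):

this.radioService.generateVoiceMethode().subscribe((res: ArrayBuffer) => {
  this.play(res); // decode and start playback in one step
});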

Related

getUserMedia and MediaRecorder in html

I record audio via getUserMedia and MediaRecorder:
...
navigator.mediaDevices.getUserMedia(constraints).then(mediaStream => {
  try {
    const mediaRecorder = new MediaRecorder(mediaStream);
    mediaRecorder.ondataavailable = vm.mediaDataAvailable;
    mediaRecorder.start(1000);
    ...
When I receive the chunks in the callback, I send them to a web API via WebSockets, which simply writes the parts one after another to a file:
mediaDataAvailable(e) {
  if (!e.data || e.data.size === 0) {
    return;
  }
  vm.websocket.SendBlob(e.data);
  ...
The file which is created on the web server side in the web API (let's call it "server.webm") does not play correctly. More exactly: it plays the first n seconds (n is the timeslice I chose for the start command), then it stops. This means the first chunk is transferred correctly, but it seems that appending the 2nd, 3rd, ... chunks to the file does not work. If I instead push the chunks onto an array in the web page and then write them to a file, the resulting file (let's call it "client.webm") plays for the whole recording duration.
Creating file on web client side:
mediaDataAvailable(e) {
  if (!e.data || e.data.size === 0) {
    return;
  }
  vm.chuncks.push(e.data);
  ...
stopCapturing() {
  var blob = new Blob(this.chuncks, { type: 'audio/webm' });
  var url = window.URL.createObjectURL(blob);
  var a = document.createElement('a');
  a.style.display = 'none';
  a.href = url;
  a.download = 'client.webm';
  document.body.appendChild(a);
  a.click();
I compared the files client.webm and server.webm. They are quite similar, but there are certain parts in server.webm that are not in client.webm.
Anybody any ideas? The server code looks like the following:
private async Task Echo(HttpContext con, WebSocket webSocket)
{
    System.IO.BinaryWriter Writer = new System.IO.BinaryWriter(System.IO.File.OpenWrite(@"server.webm"));
    var buffer = new byte[1024 * 4];
    WebSocketReceiveResult result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
    while (!result.CloseStatus.HasValue)
    {
        // BUG (see resolution below): this writes the whole 4 KB buffer every time;
        // it should write only the received bytes: Writer.Write(buffer, 0, result.Count);
        Writer.Write(buffer);
        Writer.Flush();
        await webSocket.SendAsync(new ArraySegment<byte>(buffer, 0, result.Count), result.MessageType, result.EndOfMessage, CancellationToken.None);
        result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
    }
    await webSocket.CloseAsync(result.CloseStatus.Value, result.CloseStatusDescription, CancellationToken.None);
    Writer.Close();
}
Resolved: in the server code I was writing all bytes of the reserved byte array to the file instead of only the count of received bytes, i.e. it should be Writer.Write(buffer, 0, result.Count).

Imported generated JSON in JSX causes Webpack build loop

I've got a small postcss plugin I've made that generates a JSON file from a colors.css variable file during the webpack build.
My postcss plugin
const fs = require('fs');
const postcss = require('postcss');

const capitalize = (string) => string.charAt(0).toUpperCase() + string.slice(1);

const getPropName = (string) => {
  let name = clean(string.split('-'));
  name.shift();
  for (let k = 1; k < name.length; k++) { // start at 1 to skip 'color' prefix
    name[k] = capitalize(name[k].toString());
  }
  return name.join('');
};

const clean = (array) => {
  let i = array.length;
  while (i--) {
    if (!array[i]) {
      array.splice(i, 1);
      i++;
    }
  }
  return array;
};
module.exports = postcss.plugin('cssobject', (files, filters, options) =>
  (css) => {
    options = options || {
      destination: ''
    };
    // Processing code will be added here
    const getVariable = (variable) => {
      let result;
      css.walkRules((rules) => {
        rules.walkDecls((decl) => {
          const pointer = variable.replace('var(', '').replace(')', '');
          if (!decl.prop.match(pointer)) return;
          result = decl.value;
        });
      });
      return result;
    };
    css.walkRules((rules) => { // hooks into CSS stream
      let i = files.length;
      let cssObject = {};
      while (i--) {
        if (!rules.source.input.from.match(files[i])) return; // scrubs against requested files
        rules.walkDecls((decl) => {
          let j = filters.length;
          while (j--) {
            if (!decl.prop.match(filters[j])) return; // scrubs against requested rules
            let prop = getPropName(decl.prop);
            cssObject[prop] = (decl.value.match('var')) ? getVariable(decl.value) : decl.value;
          }
        });
      }
      if (options.destination) {
        fs.writeFile(options.destination, JSON.stringify(cssObject), 'utf8');
      }
    });
  }
);
I'm then importing this JSON file into a React component JSX file to parse the JSON data into a visual guide of the project's colors under AA and AAA requirements... anywho.
The problem I'm having is that my webpack-dev-server keeps rebuilding over and over again because it thinks a change has been made to the JSX file, when in fact it's only ever a change to the JSON file being imported.
Is there a standard way of importing generated files into JSX without causing infinite build loops?
I've already tried having the JSON file saved well outside of the webpack dev server's watch location, and the build loop still remains.
Thanks in advance!
You can change your file's timestamps; webpack will then not rebuild after you change the file:
const now = Date.now() / 1000;
const lastModifyTime = now - 11;
const lastAccessTime = now - 11;
// fs.utimesSync(path, atime, mtime): backdating both by 11s keeps the watcher quiet
fs.utimesSync(jsonPath, lastAccessTime, lastModifyTime);
Give it a try; I hope it helps.
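Applied to the plugin above, that would mean backdating the JSON right after writing it. A sketch, where options.destination and cssObject are the names from the plugin:

const fs = require('fs');

// Write the generated JSON, then backdate its timestamps so the
// watcher does not treat our own write as a fresh change.
fs.writeFile(options.destination, JSON.stringify(cssObject), 'utf8', (err) => {
  if (err) throw err;
  const past = Date.now() / 1000 - 11; // 11s mirrors the answer above
  fs.utimesSync(options.destination, past, past); // (path, atime, mtime)
});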

Stream video through socket to html5 video tag

Hello, I've been trying to stream a webm video through a socket.io socket directly to the HTML5 video tag. The client and server code follow below:
Server:
(function() {
  var Alert, Channel, Receiver, Takeover, express, pathLib;
  pathLib = require("path");
  fs = require("fs");
  express = require("express");
  module.exports = function(app, sockets) {
    router = express.Router();
    router.get("/clearAlerts", function(req, res) {
      console.log("reached!");
      return sockets.emit("alert-deleted");
    });
    router.get("/castVideo", function(req, res) {
      // move this to a better place
      console.log("reachedCastVideoss");
      var readStream = fs.createReadStream(pathLib.join(__dirname + "/../../../public/elephants-dream.webm"));
      readStream.addListener('data', function(data) {
        console.log("cast-video emitted");
        sockets.emit('cast-video', data);
      });
    });
    return app.use('/custom/', router);
  };
}).call(this);
Client:
var socket = io.connect('http://localhost:4994');
window.URL = window.URL || window.webkitURL;
window.MediaSource = window.MediaSource || window.WebKitMediaSource;
var mediaSource = new MediaSource();
var video = document.getElementById("video");
var queue = [];
var sourceBuffer;
var firstChunk = true;
video.src = window.URL.createObjectURL(mediaSource);

streamIt = function(e) {
  video.pause();
  mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');
  mediaSource.sourceBuffers[0].addEventListener('updateend', onBufferUpdated);
  socket.on("cast-video", function(data) {
    console.log("appending to buffer");
    var uIntArray = new Uint8Array(data);
    if (firstChunk) {
      mediaSource.sourceBuffers[0].appendBuffer(uIntArray);
      firstChunk = false;
    }
    queue.push(uIntArray);
    if (queue.length === 33) {
      //mediaSource.endOfStream();
    }
  });
  var onBufferUpdated = function() {
    if (queue.length) {
      mediaSource.sourceBuffers[0].appendBuffer(queue.shift());
    }
  };
};

mediaSource.addEventListener('sourceopen', streamIt);
mediaSource.addEventListener('webkitsourceopen', streamIt);
When I try to run this code, it seems that the first chunk of the stream is appended to the SourceBuffer; I can see the first frame (a title and a URL) of the video file I'm trying to play, but that's it. It seems that only the first call to appendBuffer works. I read somewhere about a required initialization segment for the video to play, but I also saw a working example that does not use this initialization segment, so I'm a little confused. (link to the example)
Can anyone clarify whether I really need this initial segment? If I do, how can I retrieve the byte range of this segment? And if I don't need this segment, what is wrong in my code? Thank you.
Trying a little bit more today, I've found that if I use the same file from http://html5-demos.appspot.com/static/media-source.html, this code actually works. When I try with the files from http://www.webmfiles.org/demo-files, the code does not work. I have no idea why.
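For what it's worth, one pitfall in the client code above is that appendBuffer throws if called while the SourceBuffer is still updating, and the first chunk is both appended and pushed onto the queue. A minimal sketch of the usual append queue, assuming the same mediaSource object and 'cast-video' socket event:

var queue = [];
var sourceBuffer; // assigned once 'sourceopen' fires

function appendNext() {
  // Only append when the previous append has finished
  if (queue.length && sourceBuffer && !sourceBuffer.updating) {
    sourceBuffer.appendBuffer(queue.shift());
  }
}

mediaSource.addEventListener('sourceopen', function() {
  sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vorbis,vp8"');
  sourceBuffer.addEventListener('updateend', appendNext);
});

socket.on('cast-video', function(data) {
  queue.push(new Uint8Array(data));
  appendNext();
});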

WinRT MediaElement not working with InMemoryRandomAccessStream

We loaded a video as a byte array, created an InMemoryRandomAccessStream over this array, and tried MediaElement.SetSource. In the UI we get the message "Invalid Source" on the MediaElement. We tried saving this stream to a file and reading a new stream from that file, and that works perfectly. Both streams are identical (we checked using SequenceEqual).
What is the problem?
Part of our code:
var stream = await LoadStream();
mediaElement.SetSource(stream, @"video/mp4");
...
public async Task<IRandomAccessStream> LoadStream()
{
    ...
    var writeStream = part.ParentFile.AccessStream.AsStreamForWrite();
    foreach (var filePart in part.ParentFile.Parts)
    {
        writeStream.Write(filePart.Bytes, 0, filePart.Bytes.Length);
    }
    writeStream.Seek(0, SeekOrigin.Begin);
    return part.ParentFile.AccessStream;
}
P.S. The mime-type is correct for sure.
Thanks!

How to get a media stream object from an HTML5 video element in javascript

Hi all,
I'm working on peer-to-peer communication using WebRTC; we have the MediaStream object from getUserMedia, which is given as the input stream to the PeerConnection. Here I need a video stream from a video file on the local drive, which is playing using the HTML5 video element.
Is it possible to create a MediaStream object from the video tag?
thanks,
suri
For now you can't get a media stream from a video tag, but it should be possible in the future, as explained on MDN:
MediaStream objects have a single input and a single output. A MediaStream object generated by getUserMedia() is called local, and has as its source input one of the user's cameras or microphones. A non-local MediaStream may be representing a media element, like <video> or <audio>, a stream originating over the network and obtained via the WebRTC PeerConnection API, or a stream created using the Web Audio API MediaStreamAudioSourceNode.
But you can use the Media Source Extensions API to do what you want: you have to put the local file into a stream and append it to a MediaSource object. You can learn more about MSE here: http://www.w3.org/TR/media-source/
And you can find a demo and the source of the method above here.
2021 update: it is now possible using the MediaRecorder interface: https://developer.mozilla.org/en-US/docs/Web/API/MediaRecorder
Example from the same page:
if (navigator.mediaDevices) {
  console.log('getUserMedia supported.');

  var constraints = { audio: true };
  var chunks = [];

  navigator.mediaDevices.getUserMedia(constraints)
    .then(function(stream) {
      var mediaRecorder = new MediaRecorder(stream);

      visualize(stream);

      record.onclick = function() {
        mediaRecorder.start();
        console.log(mediaRecorder.state);
        console.log("recorder started");
        record.style.background = "red";
        record.style.color = "black";
      }

      stop.onclick = function() {
        mediaRecorder.stop();
        console.log(mediaRecorder.state);
        console.log("recorder stopped");
        record.style.background = "";
        record.style.color = "";
      }

      mediaRecorder.onstop = function(e) {
        console.log("data available after MediaRecorder.stop() called.");

        var clipName = prompt('Enter a name for your sound clip');

        var clipContainer = document.createElement('article');
        var clipLabel = document.createElement('p');
        var audio = document.createElement('audio');
        var deleteButton = document.createElement('button');

        clipContainer.classList.add('clip');
        audio.setAttribute('controls', '');
        deleteButton.innerHTML = "Delete";
        clipLabel.innerHTML = clipName;

        clipContainer.appendChild(audio);
        clipContainer.appendChild(clipLabel);
        clipContainer.appendChild(deleteButton);
        soundClips.appendChild(clipContainer);

        audio.controls = true;
        var blob = new Blob(chunks, { 'type' : 'audio/ogg; codecs=opus' });
        chunks = [];
        var audioURL = URL.createObjectURL(blob);
        audio.src = audioURL;
        console.log("recorder stopped");

        deleteButton.onclick = function(e) {
          evtTgt = e.target;
          evtTgt.parentNode.parentNode.removeChild(evtTgt.parentNode);
        }
      }

      mediaRecorder.ondataavailable = function(e) {
        chunks.push(e.data);
      }
    })
    .catch(function(err) {
      console.log('The following error occurred: ' + err);
    })
}
MDN also has a detailed mini tutorial: https://developer.mozilla.org/en-US/docs/Web/API/MediaStream_Recording_API/Recording_a_media_element
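That tutorial is built around HTMLMediaElement.captureStream(), which addresses the original question directly. A minimal sketch, assuming a <video> element with id "video" and a browser that supports it (Firefox exposes it as mozCaptureStream):

var video = document.getElementById('video');
// captureStream() returns a MediaStream whose tracks mirror the element's playback
var stream = video.captureStream ? video.captureStream() : video.mozCaptureStream();
// The stream can then be fed to an RTCPeerConnection or a MediaRecorder, e.g.:
// stream.getTracks().forEach(function(track) { pc.addTrack(track, stream); });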