getUserMedia and MediaRecorder in HTML

I record audio via getUserMedia and MediaRecorder:
...
navigator.mediaDevices.getUserMedia(constraints).then(mediaStream => {
    try {
        const mediaRecorder = new MediaRecorder(mediaStream);
        mediaRecorder.ondataavailable = vm.mediaDataAvailable;
        mediaRecorder.start(1000);
        ...
When I receive the chunks in the callback, I send them to a web API via WebSocket, which simply writes the parts one after another to a file:
mediaDataAvailable(e) {
    if (!e.data || e.data.size === 0) {
        return;
    }
    vm.websocket.SendBlob(e.data);
    ...
The file that is created on the web server side in the web API (let's call it "server.webm") does not play correctly. More precisely: it plays the first n seconds (n being the timeslice I passed to the start command), then it stops. This means the first chunk is transferred correctly, but it seems that appending the 2nd, 3rd, ... chunks to the file does not work. If I instead push the chunks onto an array in the web page and then write them to a file, the resulting file (let's call it "client.webm") plays for the whole recording duration.
Creating file on web client side:
mediaDataAvailable(e) {
    if (!e.data || e.data.size === 0) {
        return;
    }
    vm.chunks.push(e.data);
    ...
stopCapturing() {
    var blob = new Blob(this.chunks, { type: 'audio/webm' });
    var url = window.URL.createObjectURL(blob);
    var a = document.createElement('a');
    a.style.display = 'none';
    a.href = url;
    a.download = 'client.webm';
    document.body.appendChild(a);
    a.click();
I compared the files client.webm and server.webm. They are quite similar, but there are certain parts in the server.webm which are not in the client.webm.
Does anybody have any ideas? The server code looks like the following:
private async Task Echo(HttpContext con, WebSocket webSocket)
{
    System.IO.BinaryWriter Writer = new System.IO.BinaryWriter(System.IO.File.OpenWrite(@"server.webm"));
    var buffer = new byte[1024 * 4];
    WebSocketReceiveResult result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
    while (!result.CloseStatus.HasValue)
    {
        Writer.Write(buffer);
        Writer.Flush();
        await webSocket.SendAsync(new ArraySegment<byte>(buffer, 0, result.Count), result.MessageType, result.EndOfMessage, CancellationToken.None);
        result = await webSocket.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
    }
    await webSocket.CloseAsync(result.CloseStatus.Value, result.CloseStatusDescription, CancellationToken.None);
    Writer.Close();
}

Resolved: in the server code I was writing the entire receive buffer to the file instead of only the number of bytes actually received, i.e. Writer.Write(buffer, 0, result.Count) rather than Writer.Write(buffer).

Related

Play audio from an ArrayBuffer of bytes?

I have a buffer array that comes from a REST endpoint I created with Java that returns a byte[] array; I managed to get the array over HTTP (I am using Angular) and now I want to play the audio in the browser. I have done some research and found the Web Audio API, but the error is that I cannot decode the array.
context = new AudioContext();
audioArray: ArrayBuffer;
buf;

let arrayBuffer = new ArrayBuffer(this.audioArray.byteLength);
let bufferView = new Uint8Array(arrayBuffer);
for (let i = 0; i < this.audioArray.byteLength; i++) {
    bufferView[i] = this.audioArray[i];
}
console.log(arrayBuffer);
// this function should decode the array but the error occurs
// DOMException: Unable to decode audio data
this.context.decodeAudioData(this.audioArray).then((buffer) => {
    this.buf = buffer;
    this.play();
}).catch((error) => {
    console.log(error);
});
console.log(this.audioArray);

play() {
    let source = this.context.createBufferSource();
    source.buffer = this.buf;
    source.connect(this.context.destination);
    source.start(0);
}
And I am getting the array in the ngOnInit function by calling the REST API:
this.radioService.generateVoiceMethode().subscribe(res => {
    console.log(res);
    this.audioArray = new ArrayBuffer(res.byteLength);
    this.audioArray = res;
    console.log(this.audioArray);
});
Though this is an old question, I am sharing an answer that worked for me. It may be useful for someone in the future.
async play(data: ArrayBuffer) {
    const context = new AudioContext();
    const buffer = await context.decodeAudioData(data);
    const source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(context.destination);
    source.start();
}
The code above works in Angular 8.
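For the original question's case, the helper above still needs to be handed a real ArrayBuffer. A minimal sketch of wiring it up with Angular's HttpClient, requesting the response as an ArrayBuffer instead of JSON (the '/api/voice' URL and the injected http client are assumptions for illustration, not from the question):

// Sketch: request the bytes as an ArrayBuffer and pass them to play().
// '/api/voice' and the injected HttpClient are assumptions for illustration.
ngOnInit() {
    this.http.get('/api/voice', { responseType: 'arraybuffer' })
        .subscribe(data => this.play(data));
}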

progressive load and play video from base64 pieces

I have many pieces of a video in base64.
All I want is to play the video progressively as I receive the pieces.
var fileInput = document.querySelector('input#theInputFile'); // multiple
fileInput.addEventListener('change', function(e) {
    var files = fileInput.files;
    for (var i = 0; i < files.length; i++) {
        var file = fileInput.files[i];
        fileLoaded(file, 0, 102400, file.size);
    }
    e.preventDefault();
});
videoA = [];
function fileLoaded(file, ini, end, size) {
    if (end > size) { end = size; }
    var fr = new FileReader();
    fr.onloadend = function(e) {
        if (e.target.readyState == FileReader.DONE) {
            var piece = e.target.result;
            display(piece.replace('data:video/mp4;base64,', ''));
        }
    };
    var blob = file.slice(ini, end, file.type);
    fr.readAsDataURL(blob);
    var init = end;
    var endt = init + (end - ini); // next chunk boundary
    if (init < size) {
        fileLoaded(file, init, endt, size);
    }
}
Trying to display the video by chunks:
var a=0;
function display(vid, ini, end) {
videoA.push(vid);
$('#video').attr('src','data:video/mp4;base64,'+videoA[a]);
a++;
}
I know this is not the right way, but I've been searching and no answer quite fits what I'm looking for.
I'm not even sure if it is possible.
Thanks!
EDIT
I've tried to play the chunks one by one; the first one plays well, but the rest of them give the error:
"Uncaught (in promise) DOMException: Failed to load because no supported source was found".
If I could create the base64 chunks correctly, that would be enough for me.
OK, the solution is to figure out how to create, in the browser, base64 pieces from the original uploaded file that an HTML5 player can actually play.
So I've posted another question asking for that:
Chunk video mp4 file into base64 pieces with javascript on browser
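For background on why playing raw slices fails: an HTML5 video element can only be fed sequential pieces through the Media Source Extensions API, and even then the underlying file generally needs to be a fragmented MP4 (or WebM). A rough sketch of that approach, assuming in-order base64 chunks and this particular codec string (both are assumptions, not taken from the question):

// Sketch: feed base64 chunks to a <video> element via Media Source Extensions.
// Assumes a fragmented MP4 and this codec string; both are assumptions here.
var video = document.querySelector('#video');
var mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', function () {
    var sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
    var queue = [];

    // Call this for every base64 piece, in order.
    window.appendBase64Chunk = function (b64) {
        var bytes = Uint8Array.from(atob(b64), function (c) { return c.charCodeAt(0); });
        if (sourceBuffer.updating || queue.length) {
            queue.push(bytes);
        } else {
            sourceBuffer.appendBuffer(bytes);
        }
    };

    sourceBuffer.addEventListener('updateend', function () {
        if (queue.length) {
            sourceBuffer.appendBuffer(queue.shift());
        }
    });
});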

How to send data to the client without the client requesting the WS server in Node.js

I need to convert a CSV file to JSON format and send it to a client connected to a WebSocket server in Node.js.
The file will be updated many times, so I need to send the updated data to the client.
I am able to send data once it is loaded completely (when the app starts it sends all the data in the file to the client), but when I update data in the file, the updated data is printed on the console yet it is not sent to the client. Is there anything wrong in my code?
My Node.js code:
var ts = require('tail-stream');
var Converter = require("csvtojson").Converter;
var converter = new Converter({ constructResult: false }); // for big csv data
var WebSocketServer = require('websocket').server;
var http = require('http');

var server = http.createServer(function(request, response) {
    // process HTTP request. Since we're writing just a WebSockets server
    // we don't have to implement anything.
    response.write('hello');
    console.log('in http server \n');
});
server.listen(1337, function() { });

// create the server
wsServer = new WebSocketServer({
    httpServer: server
});

// WebSocket server
wsServer.on('request', function(request) {
    var connection = request.accept(null, request.origin);
    console.log('wsserver');
    connection.send('ws server');
    converter.on("record_parsed", function (jsonObj) {
        console.log(jsonObj); // here is your result json object
        connection.send(jsonObj);
    });
    var tstream = ts.createReadStream('log.csv', {
        beginAt: 0,
        onMove: 'follow',
        detectTruncate: false,
        onTruncate: 'end',
        endOnError: false
    });
    tstream.pipe(converter);
});
Right now you are creating a new read stream and adding a listener to the converter on every new connection, which will cause trouble once you have more than one client (the same event emitted multiple times, etc.). Instead, you should keep just one reader and notify all open connections when there's a new record.
Also note that the library you are using only accepts UTF-8 strings or binary-type messages; row objects sent the way you're sending them now will be received as an "[object Object]" string after toString() is called on them. You should probably just send the row string, or use JSON.stringify / JSON.parse.
Try this:
var http = require("http");
var tailStream = require("tail-stream");
var Converter = require("csvtojson").Converter;
var WebSocketServer = require("websocket").server;
var server = http.createServer();
var wsServer = new WebSocketServer({ httpServer: server });
var converter = new Converter({constructResult:false});
var logStream = tailStream.createReadStream("log.csv", { detectTruncate : false });
var connections = [];
server.listen(1337);
logStream.pipe(converter);
//----------------------------------------------------
converter.on("record_parsed", function (jsonObj) {
connections.forEach(function(connection){
connection.send(JSON.stringify(jsonObj));
});
});
//----------------------------------------------------
wsServer.on("request", function(request) {
var connection = request.accept(null, request.origin);
connection.on("close", function() {
connections.splice(connections.indexOf(connection), 1);
});
connections.push(connection);
});
The code above works, tested like this on the client side:
var socket = new WebSocket('ws://localhost:1337/');
socket.onmessage = function (event) {
console.log(JSON.parse(event.data));
}
Note: this doesn't send the whole content of the file at the beginning, just the updates, but you can easily achieve that by storing the records and sending them to each new connection, as sketched below.
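A rough sketch of that idea, replacing the two handlers above (the history array is an illustration, not part of the original answer):

// Sketch: keep every parsed record so late-joining clients get the full history.
var history = [];

converter.on("record_parsed", function (jsonObj) {
    history.push(jsonObj); // remember the record
    connections.forEach(function (connection) {
        connection.send(JSON.stringify(jsonObj));
    });
});

wsServer.on("request", function (request) {
    var connection = request.accept(null, request.origin);
    connection.on("close", function () {
        connections.splice(connections.indexOf(connection), 1);
    });
    // replay everything parsed so far to the new client
    history.forEach(function (jsonObj) {
        connection.send(JSON.stringify(jsonObj));
    });
    connections.push(connection);
});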

Receiving only one byte in Serial Port - Chrome App

The situation
I have a scanner that has been working with a compiled application for which I don't have the source. It still works and can be used to verify that the scanner itself is working. I need to move the data entry process to my web-based system.
So I'm building a Chrome App that reads incoming serial port data from the COM port. I first tried setting it up with a COM port emulator and a virtual null modem, which allowed me to test the connection and receive data. I can't find out why I am receiving only 1 byte.
The problem
When I connected to the actual scanner, I was able to connect without any issue, but the dataArray I receive is only one byte long. After receiving the first data, I'm unable to receive any other data until I restart the connection.
The Code
var connectionId = -1;
var e_dtr, e_rts, e_dcd, e_cts, e_ri, e_dsr;
var dtr, rts;

chrome.app.runtime.onLaunched.addListener(function(launchData) {
    chrome.serial.getDevices(function(objs, arg2) {
        chrome.serial.connect(objs[0].path, { ctsFlowControl: true }, onConnect);
    });
});

chrome.serial.onReceive.addListener(function(info) {
    chrome.serial.getInfo(info.connectionId, output);
    var uint8View = new Uint8Array(info.data);
    var value = String.fromCharCode.apply(null, uint8View);
    console.log(value);
});

chrome.serial.onReceiveError.addListener(function(info) {
    var uint8View = new Uint8Array(info.data);
    var value = String.fromCharCode.apply(null, uint8View);
    console.log(value);
});

function readSignals() {
    chrome.serial.getControlSignals(connectionId, onGetControlSignals);
}

function onSetControlSignals(result) {
    console.log("onSetControlSignals: " + result);
}

function changeSignals() {
    chrome.serial.setControlSignals(connectionId, { dtr: dtr, rts: rts }, onSetControlSignals);
}

function onGetControlSignals(signals) {
    console.log(signals);
}

function onConnect(connectionInfo) {
    console.log(connectionInfo);
    if (!connectionInfo) {
        console.log('Could not open');
        return;
    }
    connectionId = connectionInfo.connectionId;
    console.log('Connected');
    dtr = false;
    rts = false;
    changeSignals();
    setInterval(readSignals, 1000);
}
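There is no accepted answer in this thread, but one general pattern worth noting: chrome.serial.onReceive can deliver data in arbitrarily small fragments, so scanner output is usually accumulated until a terminator instead of being handled per event. A sketch of that pattern, assuming a carriage-return-terminated scan (an assumption about this scanner, not something stated above); it would not by itself explain why no further bytes arrive until reconnecting, which may instead relate to the flow-control / DTR / RTS settings:

// Sketch: accumulate incoming serial bytes until a terminator before using them.
// The '\r' terminator is an assumption about this scanner's protocol.
var pending = '';

chrome.serial.onReceive.addListener(function(info) {
    if (info.connectionId !== connectionId) {
        return;
    }
    pending += String.fromCharCode.apply(null, new Uint8Array(info.data));
    var idx;
    while ((idx = pending.indexOf('\r')) !== -1) {
        var scan = pending.slice(0, idx); // one complete scan
        pending = pending.slice(idx + 1);
        console.log('Scanned: ' + scan);
    }
});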

Send Image File via XHR on Chrome

I'm using HTML5 drag&drop to get images from a user's computer and want to upload them to my Rails server (using Carrierwave on that end). I don't know exactly what I'm doing here, but cobbled together this code from these instructions http://code.google.com/p/html5uploader/wiki/HTML5Uploader
This returns a 500 error - can anyone take a look and help me out with what I'm doing wrong?
var files = e.dataTransfer.files;
if (files.length) {
    for (var i = 0; i < files.length; i++) {
        var file = files[i];
        var reader = new FileReader();
        reader.readAsBinaryString(file);
        reader.onload = function() {
            var bin = reader.result;
            var xhr = new XMLHttpRequest();
            var boundary = 'xxxxxxxxx';
            xhr.open('POST', '/images?up=true&base64=true', true);
            xhr.setRequestHeader('content-type', 'multipart/form-data; boundary=' + boundary);
            xhr.setRequestHeader('UP-FILENAME', file.name);
            xhr.setRequestHeader('UP-SIZE', file.size);
            xhr.setRequestHeader('UP-TYPE', file.type);
            xhr.send(window.btoa(bin));
        };
    }
}
There are a couple of things that could be the culprit. You're reading the file as a binary string, then creating a multipart request, then sending a base64 encoded value.
There's no need to read the file or mess with base64 encoding. Instead, just construct a FormData object, append the file, and send that directly using xhr.send(formData). See my response here.
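A minimal sketch of that approach (the /images URL is taken from the question; the 'image' field name is an assumption and has to match what the Rails controller / Carrierwave uploader expects):

// Sketch: upload each dropped file with FormData instead of hand-rolled multipart.
// The 'image' field name is an assumption; adjust it to the server's expectations.
var files = e.dataTransfer.files;
for (var i = 0; i < files.length; i++) {
    var formData = new FormData();
    formData.append('image', files[i], files[i].name);

    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/images', true);
    // Do not set Content-Type yourself; the browser adds the multipart boundary.
    xhr.send(formData);
}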