I have a code base with around 200 JS files, and my task is to minify each one. I am using the Google Closure Compiler for this. Here is my code:
const gulp = require( 'gulp' );
const tap = require( 'gulp-tap' );
const path = require( 'path' );
const { exec } = require( 'child_process' );

gulp.task( 'js-compile', function ()
{
    return gulp.src( 'src/public/src/app/**/*.js' ).pipe( tap( minifyJS ) );
} );

function minifyJS ( file, t )
{
    const fileName = path.basename( file.path );
    const fileList = file.path.split( '/' );
    fileList.splice( 0, 8 );
    let destFilePath = fileList.join( '/' );
    destFilePath = 'src/public/dist/www/src/' + destFilePath;
    const dirList = file.path.split( '/' );
    const baseDir = dirList.slice( 0, 5 );
    const destDirPath = baseDir.join( '/' );
    const command =
        'npx google-closure-compiler --js=' +
        file.path +
        ' --js_output_file=' + destDirPath + '/' + destFilePath;
    console.log( "Execute Command: ", command );
    exec( command, function ( err, stdout, stderr )
    {
        console.log( stdout );
        console.log( stderr );
    } );
    console.log( "Command Execution complete" );
    return t.through( gulp.dest, [ destDirPath ] );
}
The issue: the shell shows the tasks as completed, but the process never exits, minifying the files takes a long time, and my laptop freezes during the operation. What is the best way to minify each file, run the compilations in parallel, and wait until they have all completed?
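For reference, here is a rough sketch of one possible direction (untested, and assuming gulp 4, the glob package, and a mirrored output tree, none of which are confirmed by the code above): promisify exec, run the compilations in small batches, and let the task resolve only after every file has been written.

const { promisify } = require('util');
const execAsync = promisify(require('child_process').exec);
const glob = require('glob'); // assumption: a glob v7-style API is available

gulp.task('js-compile', async function () {
    const files = glob.sync('src/public/src/app/**/*.js');
    const batchSize = 8; // cap concurrency so the machine stays responsive
    for (let i = 0; i < files.length; i += batchSize) {
        await Promise.all(files.slice(i, i + batchSize).map(function (file) {
            // assumption: mirror the source tree under dist (destination folders must already exist)
            const dest = file.replace('src/public/src/app', 'src/public/dist/www/src');
            return execAsync('npx google-closure-compiler --js=' + file +
                ' --js_output_file=' + dest);
        }));
    }
});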
I'm seeing these two errors:
Uncaught TypeError:
Cannot read properties of null (reading 'filter')
along with this - Uncaught ReferenceError: $ is not defined
I've tried running it a few times and it gives the same result each time. I can't figure out where the issue is.
This is the code:
function scrollToBottom(){
bottom = document.body.scrollHeight;
current = window.innerHeight+ document.body.scrollTop;
if((bottom-current) >0){
window.scrollTo(0, bottom);
//wait a few
setTimeout ( 'scrollToBottom()', 1500 );
}
(function() {
var placeholderString,
placeholderWords,
contactName,
companyName,
customNote,
jobDescription,
jobUrl,
jobInfo,
str = '';
$('.job_listings')
.filter(function(_, elem) {
return ($(elem).attr('data-force-note') );
})
.each(function(_, elem) {
// Find the URL for each job listing.
$(elem)
.find('.top a[href]')
.each( function(idx, value) { str += $(value).attr('href') + "\n"; });
// Get the company and contact info
placeholderString = $(elem)
.find('.interested-note').attr('placeholder');
// Split placeholder string into words:
placeholderWords = placeholderString.split(' ');
// Grab name of recruiter/contact
contactName = placeholderWords[4];
// Grab company name
companyName = $(elem).find('.startup-link').text();
// Build personalized note
customNote = "Hi " + contactName + "! Would love to join " + companyName + " using my diverse set of skills. Let's chat!";
// .header-info .tagline (text)
jobDescription = $(elem).find('.tagline').text();
// .header-info .startup-link (href attr)
jobUrl = $(elem).find('.startup-link').attr('href');
// Compile and format job information
jobInfo = companyName + '\n' + jobDescription + '\n' + str + '\n\n';
// Get job data for your own records
console.log(jobInfo);
// Comment this out to verify your customNote
// console.log(customNote + '\n');
// Add your custom note.
// Comment these lines out to debug
$(elem)
.find('.interested-note').text( customNote );
//Comment these lines out to debug
$(elem)
.find('.interested-with-note-button')
.each( function(idx, button) { $(button).click(); });
});
// Print all of the company and job info to the console.
return jobInfo;
})();
};
scrollToBottom();
Is anyone able to help me here?
Thank you!
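As a side note, the 'Uncaught ReferenceError: $ is not defined' error means jQuery is not loaded when this script runs. A purely illustrative guard (not part of the original script) that defers execution until jQuery is available would look something like this:

function whenJQueryReady(fn) {
    // Illustrative only: poll until jQuery has loaded, then run the callback.
    if (typeof window.$ === 'function') {
        fn();
    } else {
        setTimeout(function () { whenJQueryReady(fn); }, 250);
    }
}

whenJQueryReady(scrollToBottom);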
I'm trying to execute some gulp tasks inside a map loop to generate CSS files for each Sass theme I defined.
Finally, I return all the merged tasks with the merge-stream package.
Unfortunately, the callback seems to be called for each iteration/gulp task of my loop, and I'm getting the following message:
Error: write callback called multiple times
@Task('build')
build() {
var self = this;
var tasks = this.themes.map((theme) => {
this.foldersXml = getAttributes('jntFolderWithExternalProvider');
this.filesXml = getAttributes('jntFolder');
this.foldersXml[theme] = []; this.foldersXml[theme].push(getAttributes('jntFolderWithExternalProvider'));
this.filesXml[theme] = []; this.filesXml[theme].push(getAttributes('jntFolderWithExternalProvider'));
fs.readFile(this.themesFolder + '/' + theme + '/' + this.fileName, 'utf8', (err: Error, data: string & Buffer) => {
if (err) {
throw new gutil.PluginError({
plugin: 'build',
message: 'Main file doesn\'t exist for the theme: ' + theme
});
} else {
var vars = data.match(/\$(.*?)\:/g);
this.requiredVars.map(requiredVar => {
if(vars !== null){
if(!vars.contains(requiredVar)){
throw new gutil.PluginError({
plugin: 'build',
message: 'Required variable ' + requiredVar + ' is not defined'
});
};
}
});
}
});
return gulp.src(this.templatesFolder + '/' + this.pattern)
.pipe(header('@import \'' + this.themesFolder + '/' + theme + '/' + this.fileName + '\';'))
.pipe(sass.sync().on('error', gutil.log))
.pipe(rename(function (path: any) {
var file = path.basename + path.extname;
var folderXml = getAttributes('jntFolderWithExternalProvider') as any;
folderXml[file] = [ getAttributes('jntCssFile') ];
self.foldersXml[theme][0][file] = []; self.foldersXml[theme][0][file].push(folderXml);
var fileXml = getAttributes('jntFolderWithExternalProvider') as any;
fileXml['jcr:content'] = [ getAttributes('jntRessource') ];
self.filesXml[theme][0][file] = []; self.filesXml[theme][0][file].push(fileXml);
path.dirname += '/' + file;
}))
.pipe(gulp.dest(this.themesFolder + '/' + theme))
});
return merge(tasks);
}
@SequenceTask()
run(){
return ['build'];
}
I just want to get rid of this message and be sure the callback is invoked only at the end. What's the best approach here?
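For illustration, one common way to avoid this class of error is to do the per-theme file validation synchronously, outside the streams, so nothing throws from an async callback while the pipeline is running. A rough sketch (plain gulp rather than the original gulpclass setup; the helper name and signature are assumptions):

const fs = require('fs');
const merge = require('merge-stream');

function buildThemes(themes, themesFolder, fileName, templatesFolder, pattern) {
    const tasks = themes.map(function (theme) {
        const mainFile = themesFolder + '/' + theme + '/' + fileName;
        // Validate up front; this throws before any stream exists, so the
        // stream's write callback can never be invoked more than once.
        const data = fs.readFileSync(mainFile, 'utf8');
        // ... run the required-variable checks on `data` here ...
        return gulp.src(templatesFolder + '/' + pattern)
            .pipe(header('@import \'' + mainFile + '\';'))
            .pipe(sass.sync().on('error', gutil.log))
            .pipe(gulp.dest(themesFolder + '/' + theme));
    });
    return merge(tasks);
}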
I want to use SSEs to send data from the server to a specific individual client that is logged in to my web application, either when their session timeout 'clock' is about to expire or when there is a message meant for all 'connected' users, similar to a UNIX wall command, such as 'the system will be unavailable in 5 minutes, please complete your transactions and log off.'
The examples that I've seen at w3schools and MSDN automatically/periodically send messages to the connected clients, such as transmitting the server's time.
I want to avoid polling the server from the client with Ajax requests for questions like 'Am I still logged in?' (session timeout expired) or 'What, if any, is the server message?' The first is specific to the user and the second is for all current users.
Can this be done, and is there an example of how to do this?
Thanks
Since posting this question, I have been partially successful in cobbling together a working SSE server/client demo. I say partially successful because I have not been able to get different durations to work using the retry option. If I leave the option out, the client gets a message every three seconds. If I include it, the client only gets one message and then hangs trying to connect to the server. Here is the PHP server-side code:
<?php
header( 'Content-Type: text/event-stream' );
header( 'Cache-Control: no-cache' );
// data: {"msg": "First message"}\ndata: {"msg": "second message"}\n\n
$r = mt_rand( 1, 10 ) * 1000;
$m = "retry interval: ${r}";
$r *= 1000;
$t = date( 'r' );
// echo "retry: ${r}" . PHP_EOL; // <==== With out this, it works!
echo "data: {\"message\" : \"${m}\"}" . PHP_EOL;
echo "data: {\"message\" : \"The server time is: ${t}\"}" . PHP_EOL;
echo PHP_EOL;
ob_flush();
flush();
?>
Here is the client-side code:
<!DOCTYPE html>
<html>
<head>
<meta charset="UTF-8">
<title>Server-Sent Events Example</title>
</head>
<body>
<p>
<button onclick="source.close();">Stop!</button>
</p>
<h1>Getting server updates</h1>
<div id="result"></div>
<script>
// Source: https://www.html5rocks.com/en/tutorials/eventsource/basics/
if( typeof( EventSource ) !== 'undefined' ) {
// 'http://symfony.dev.iambreakinout.com/SSE_Server.php'
var source = new EventSource( 'SSE_Server.php' );
source.addEventListener(
'message',
function( event ) {
if( event.origin != window.location.origin ) {
alert( 'Cross-site scripting: message origin did not match:' +
'\r\n' +
'expected origin: ' + window.location.origin + '\r\n' +
'actual origin: ' + event.origin );
return;
}
else {
var data = [];
for( var i = 0, b = 0, e = 0, d = '';; ++i, b = e + 1 ) {
e = event.data.indexOf( "\n", b );
s = ( ( e > -1 )
? event.data.substr( b, e )
: event.data.substr( b ) );
data[ i ] = JSON.parse( s ).message;
if( e === -1 ) break;
}
document.getElementById( 'result' ).innerHTML +=
data.join( '<br>' ) + '<br>';
}
},
false );
source.addEventListener(
'open',
function( event ) {
// Connection was opened.
document.getElementById( 'result' ).innerHTML += 'Open<br>';
},
false );
source.addEventListener(
'error',
function( event ) {
var readyState;
//
// The closed ready state is seen when an error event occurs, but
// the rest are shown here as a reminder to me of the defined
// ready state constant values.
//
switch( event.currentTarget.readyState ) {
case EventSource.CONNECTING: readyState = 'Connecting'; break;
case EventSource.OPEN: readyState = 'Open'; break;
case EventSource.CLOSED: readyState = 'Closed'; break;
default: readyState = '?'; break;
}
document.getElementById( 'result' ).innerHTML += readyState +
'<br>';
},
false );
}
else {
document.getElementById("result").innerHTML =
'Sorry, your browser does not support server-sent events...';
}
</script>
<button onclick="source.close();">Stop!</button>
</body>
</html>
Without changing the client code, allowing the echo "retry: ${r}" . PHP_EOL; statement to specify a retry duration causes the output to stop after 'Connecting' is shown:
Getting server updates
Open
retry interval: 3000
Connecting
What am I doing wrong or not doing to allow for the retry option to work?
Thanks again
OK, the code as it stands is basically fine, but the retry value really should not have had the first 'times 1000' when the random number for the interval was generated: with both multiplications, the browser is told to wait millions of milliseconds before reconnecting, which is why it appeared to hang. (Duh!)
You need to keep your active users in some data structure to be able to send events to them.
Below is a simple commented example on NodeJS (based on https://github.com/mchaov/simple-sse-nodejs-setup/):
const http = require('http');
const msg = " - Simple SSE server - ";
const port = 5000;
// Create basic server
http.createServer(function (request, response) {
// answer only to event stream requests
if (request.headers.accept && request.headers.accept == 'text/event-stream') {
// check if the resource is what we want
// => http://domain.ext/sse
// store the user into your data structure
if (/sse/gim.test(request.url)) {
sendSSE(request, response);
}
}
else {
// if not just return that you are online and a string of text
response.writeHead(200);
response.write('Welcome to ' + msg + '# :' + port);
response.end();
}
}).listen(port);
// Setup the SSE "service" :)
// You need to cache the "response" object per user, and then you can use the
// functions below to send messages whenever you desire. You can have a
// collection that holds all the references to the response objects and decide
// whether you want to send a message to one user or to all of them.
function sendSSE(request, response) {
// Setup headers
// For ease of use: example for enabled CORS
response.writeHead(200, {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
'Connection': 'keep-alive',
// enabling CORS
'Access-Control-Allow-Origin': "*",
'Access-Control-Allow-Headers': 'Origin, X-Requested-With, Content-Type, Accept'
});
var id = (new Date()).toLocaleTimeString();
// send first message
constructSSE(response, id, (new Date()).toLocaleTimeString());
// send message every second and a half
setInterval(function () {
constructSSE(response, id, (new Date()).toLocaleTimeString());
}, 1500);
}
// setup simple message
// call this function with proper response you took from data structure with active users to send message to the specific user
function constructSSE(response, id, data) {
response.write('id: ' + id + '\n');
// response.write('event: ' + 'logout' + '\n'); // check this
response.write("data: " + msg + port + '\n\n');
}
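Building on the comments above about caching the response object per user, here is a hypothetical sketch (the names and structure are illustrative, not part of the original example) of keeping every connected client in a Map so you can push a message to one user or to all of them, wall-style:

const clients = new Map(); // userId -> response object

function addClient(userId, response) {
    clients.set(userId, response);
    // drop the entry when the browser disconnects
    response.on('close', function () { clients.delete(userId); });
}

function sendTo(userId, eventName, data) {
    const response = clients.get(userId);
    if (!response) return;
    if (eventName) response.write('event: ' + eventName + '\n');
    response.write('data: ' + JSON.stringify(data) + '\n\n');
}

function broadcast(data) {
    for (const userId of clients.keys()) {
        sendTo(userId, null, data);
    }
}

// e.g. broadcast({ message: 'The system will be unavailable in 5 minutes.' });
// or   sendTo(someUserId, 'logout', { message: 'Your session is about to expire.' });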
I have written a gulp file that watches several directories for changes and then concatenates files into multiple specified destinations.
Here is a simplified version of my project structure:
I have 2 site folders:
one/ and two/
Each site has two branch folders:
a/ and b/
Inside each branch, there are three folders:
inner/, outer/ and web/
My task is to grab matching part files from the inner and outer folders and concatenate them into the corresponding web folders. Below is a simple example of the desired output.
-- inner/
|-- color1
|-- color2
|-- fruit1
|-- fruit2
-- outer/
|-- color1
|-- color2
|-- fruit1
|-- fruit2
-- web/
|-- colors.txt
|-- fruits.txt
I have created a config.json file to hold site-specific configuration. Currently I am only using it to customize the site paths. Here is config.json:
{
"sites": {
"one": {
"a": "/path/to/one/a/",
"b": "/path/to/one/b/"
},
"two": {
"a": "/path/to/two/a/",
"b": "/path/to/two/b/"
}
}
}
And finally here is the gulpfile.js
// Include local Gulp
var gulp = require("gulp");
// Get data from config.json
var sites = require("./config.json").sites;
// Include Gulp specific plugins
var gConcat = require("gulp-concat");
var gHeader = require("gulp-header");
var gUtil = require("gulp-util");
var gNotify = require("gulp-notify");
// Setup directories
var outer = "outer/";
var inner = "inner/";
var web = "web/";
// Misc
var alertMessage = "# GENERATED FILE - DO NOT MODIFY\n\n";
// 8 total tasks for concatenation
// Concatenate to colors.txt - 4 tasks
// Color task 1: [ Site => one ] [ Branch => a ]
gulp.task("one_a_color", function() {
return gulp.src([sites.one.a + outer + "color?", sites.one.a + inner + "color?"])
.pipe(gConcat("colors.txt"))
.pipe(gHeader(alertMessage))
.pipe(gulp.dest(sites.one.a + web))
.pipe(gNotify());
});
// Color task 2: [ Site => one ] [ Branch => b ]
gulp.task("one_b_color", function() {
return gulp.src([sites.one.b + outer + "color?", sites.one.b + inner + "color?"])
.pipe(gConcat("colors.txt"))
.pipe(gHeader(alertMessage))
.pipe(gulp.dest(sites.one.b + web))
.pipe(gNotify());
});
// Color task 3: [ Site => two ] [ Branch => a ]
gulp.task("two_a_color", function() {
return gulp.src([sites.two.a + outer + "color?", sites.two.a + inner + "color?"])
.pipe(gConcat("colors.txt"))
.pipe(gHeader(alertMessage))
.pipe(gulp.dest(sites.two.a + web))
.pipe(gNotify());
});
// Color task 4: [ Site => two ] [ Branch => b ]
gulp.task("two_b_color", function() {
return gulp.src([sites.two.b + outer + "color?", sites.two.b + inner + "color?"])
.pipe(gConcat("colors.txt"))
.pipe(gHeader(alertMessage))
.pipe(gulp.dest(sites.two.b + web))
.pipe(gNotify());
});
// Concatenate to fruits.txt - 4 tasks
// Fruit task 1: [ Site => one ] [ Branch => a ]
gulp.task("one_a_fruit", function() {
return gulp.src([sites.one.a + outer + "fruit?", sites.one.a + inner + "fruit?"])
.pipe(gConcat("fruits.txt"))
.pipe(gHeader(alertMessage))
.pipe(gulp.dest(sites.one.a + web))
.pipe(gNotify());
});
// Fruit task 2: [ Site => one ] [ Branch => b ]
gulp.task("one_b_fruit", function() {
return gulp.src([sites.one.b + outer + "fruit?", sites.one.b + inner + "fruit?"])
.pipe(gConcat("fruits.txt"))
.pipe(gHeader(alertMessage))
.pipe(gulp.dest(sites.one.b + web))
.pipe(gNotify());
});
// Fruit task 3: [ Site => two ] [ Branch => a ]
gulp.task("two_a_fruit", function() {
return gulp.src([sites.two.a + outer + "fruit?", sites.two.a + inner + "fruit?"])
.pipe(gConcat("fruits.txt"))
.pipe(gHeader(alertMessage))
.pipe(gulp.dest(sites.two.a + web))
.pipe(gNotify());
});
// Fruit task 4: [ Site => two ] [ Branch => b ]
gulp.task("two_b_fruit", function() {
return gulp.src([sites.two.b + outer + "fruit?", sites.two.b + inner + "fruit?"])
.pipe(gConcat("fruits.txt"))
.pipe(gHeader(alertMessage))
.pipe(gulp.dest(sites.two.b + web))
.pipe(gNotify());
});
// Watch for all events in specified {directories}/{files}, then trigger appropriate task
// 8 total watch jobs
gulp.task("watch", function () {
// Color related watch jobs - Total 4
// Color watch 1: [ Site => one ] [ Branch => a ]
gulp.watch([sites.one.a + outer + "**/color?", sites.one.a + inner + "**/color?"], function(event) {
gUtil.log(event.path.split("/").pop(), "=>", event.type);
gulp.start("one_a_color");
});
// Color watch 2: [ Site => one ] [ Branch => b ]
gulp.watch([sites.one.b + outer + "**/color?", sites.one.b + inner + "**/color?"], function(event) {
gUtil.log(event.path.split("/").pop(), "=>", event.type);
gulp.start("one_b_color");
});
// Color watch 3: [ Site => two ] [ Branch => a ]
gulp.watch([sites.two.a + outer + "**/color?", sites.two.a + inner + "**/color?"], function(event) {
gUtil.log(event.path.split("/").pop(), "=>", event.type);
gulp.start("two_a_color");
});
// Color watch 4: [ Site => two ] [ Branch => b ]
gulp.watch([sites.one.b + outer + "**/color?", sites.one.b + inner + "**/color?"], function(event) {
gUtil.log(event.path.split("/").pop(), "=>", event.type);
gulp.start("two_b_color");
});
// Fruit related watch jobs - Total 4
// Fruit watch 1: [ Site => one ] [ Branch => a ]
gulp.watch([sites.one.a + outer + "**/fruit?", sites.one.a + inner + "**/fruit?"], function(event) {
gUtil.log(event.path.split("/").pop(), "=>", event.type);
gulp.start("one_a_fruit");
});
// Fruit watch 2: [ Site => one ] [ Branch => b ]
gulp.watch([sites.one.b + outer + "**/fruit?", sites.one.b + inner + "**/fruit?"], function(event) {
gUtil.log(event.path.split("/").pop(), "=>", event.type);
gulp.start("one_b_fruit");
});
// Fruit watch 3: [ Site => two ] [ Branch => a ]
gulp.watch([sites.two.a + outer + "**/fruit?", sites.two.a + inner + "**/fruit?"], function(event) {
gUtil.log(event.path.split("/").pop(), "=>", event.type);
gulp.start("two_a_fruit");
});
// Fruit watch 4: [ Site => two ] [ Branch => b ]
gulp.watch([sites.one.b + outer + "**/fruit?", sites.one.b + inner + "**/fruit?"], function(event) {
gUtil.log(event.path.split("/").pop(), "=>", event.type);
gulp.start("two_b_fruit");
});
});
// Run all tasks
gulp.task("background",
[
"one_a_color", "one_b_color", "two_a_color", "two_b_color",
"one_a_fruit", "one_b_fruit", "two_a_fruit", "two_b_fruit",
"watch"
]
);
The above gulp file works and does the job. However, as you can see, most of the code is repeated; the only parts that change are the gulp.src and gulp.dest paths, along with the task names.
My question is: would it be possible to simplify this gulp file so that, instead of repeating the code for every task, similar tasks can be batched together?
Not that easy a task, but let's see if we can optimise it. Gulp and globs deal well with arrays, so the first step is to convert your paths to an array:
var gulp = require('gulp');
var concat = require('gulp-concat');
var es = require('event-stream');
var sites = require('./config.json').sites;
var toArray = function(conf) {
var arr = [];
for(var key in conf) {
if(typeof conf[key] === 'object') {
arr = arr.concat(toArray(conf[key]));
} else {
arr.push(conf[key]);
}
}
return arr;
};
var sites = toArray(sites);
Now that we have the paths, we create the globs for fruits and colors.
var globs = [];
sites.forEach(function(data) {
globs.push(data + '**/color*');
globs.push(data + '**/fruit*');
});
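For the config.json shown in the question, that loop produces eight patterns, roughly like this (illustrative only):

// [ '/path/to/one/a/**/color*', '/path/to/one/a/**/fruit*',
//   '/path/to/one/b/**/color*', '/path/to/one/b/**/fruit*',
//   '/path/to/two/a/**/color*', '/path/to/two/a/**/fruit*',
//   '/path/to/two/b/**/color*', '/path/to/two/b/**/fruit*' ]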
With your current config, you get an array of 8 entries. Next, let us define the concat task. Here is what you mean by "batched" together: we need a so-called stream array (I wrote about that here). It is a simple mapping of an existing array to many gulp streams, which are merged at the end via the event-stream module. With the color/fruit split going on, we need to be a little creative with our concat names and dest names.
Note that I use the changed plugin to prevent useless builds.
gulp.task('concat', function() {
var tasks = globs.map(function(glob) {
var file = glob.indexOf('color') >= 0 ? 'col' : 'fru';
var dest = glob.replace('**/color*','').replace('**/fruit*','') + 'web';
return gulp.src(glob)
.pipe(concat(file + '.txt'))
.pipe(gulp.dest(dest))
});
return es.merge.apply(null, tasks);
});
This task now does everything we need, and incrementally so. So our watch process is rather straightforward.
gulp.task('watch', ['concat'], function() {
gulp.watch(globs, ['concat']);
});
Hope this helps!
Update
Alright, I made some adaptations, which should prevent having your whole project rebuilt.
First, I extracted the concatStream to a function. This is actually the one thing you already did with your own sample:
var concatStream = function(glob) {
var file = glob.indexOf('color') >= 0 ? 'col' : 'fru';
var dest = glob.replace('**/color*','').replace('**/fruit*','') + 'web';
return gulp.src(glob)
.pipe(concat(file + '.txt'))
.pipe(header(alertMessage))
.pipe(notify())
.pipe(gulp.dest(dest))
};
Depending on the glob (the file pattern with which we select either colors or fruits from our directories), we define a new output file name (file is 'col' when 'color' is in our search string, 'fru' otherwise) and a new destination (which is just the old folder without the colors/fruits search pattern).
gulp.task('concat') now does the following:
gulp.task('concat', function() {
var tasks = globs.map(concatStream);
return es.merge.apply(null, tasks);
});
Each of our globs (console.log them, if you want to know what's in there) gets mapped to the concatStream, then the new array of streams gets merged and executed.
The watch task is now new... we do kinda the same as with our 'concat' task:
gulp.task('watch', ['concat'], function() {
globs.map(function(glob) {
gulp.watch(glob, function() {
return concatStream(glob);
})
})
});
For each glob, we create a new watcher, which just calls the concatStream again.
Update
Small change
Inside the glob, changing the wildcard (*) to a single-character match (?) allows us to keep the same base name for the output file (e.g. color and fruit).
var globs = [];
sites.forEach(function(data) {
globs.push(data + '**/color?');
globs.push(data + '**/fruit?');
});
And this as well...
var concatStream = function(glob) {
var file = glob.indexOf('color') >= 0 ? 'color' : 'fruit';
var dest = glob.replace('**/color?','').replace('**/fruit?','') + 'web';
return gulp.src(glob)
.pipe(concat(file + '.txt'))
.pipe(header(alertMessage))
.pipe(notify())
.pipe(gulp.dest(dest))
};
Now I can keep the names color and fruit for my output files, without worrying about the glob matching the generated file and concatenating its existing content back into itself.
I have an audio buffer rendered using webkitOfflineAudioContext. Now, I wish to export it into a WAV file. How do I do it? I tried using recorder.js but couldn't figure out how to use it. Here's my code: http://jsfiddle.net/GBQV8/.
Here's a gist that should help: https://gist.github.com/kevincennis/9754325.
I haven't actually tested this, so there might be a stupid typo or something, but the basic approach will work (I've done it before).
Essentially, you're going to use the web worker from Recorder.js directly so that you can process one big AudioBuffer all in one shot, rather than recording it incrementally in real-time.
I'll paste the code here too, just in case something happens to the gist...
// assuming a var named `buffer` exists and is an AudioBuffer instance
// start a new worker
// we can't use Recorder directly, since it doesn't support what we're trying to do
var worker = new Worker('recorderWorker.js');
// initialize the new worker
worker.postMessage({
command: 'init',
config: {sampleRate: 44100}
});
// callback for `exportWAV`
worker.onmessage = function( e ) {
var blob = e.data;
// this would be your WAV blob
};
// send the channel data from our buffer to the worker
worker.postMessage({
command: 'record',
buffer: [
buffer.getChannelData(0),
buffer.getChannelData(1)
]
});
// ask the worker for a WAV
worker.postMessage({
command: 'exportWAV',
type: 'audio/wav'
});
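As a possible follow-up (not part of the gist), the onmessage callback above could turn the exported WAV blob into a download link, for example:

worker.onmessage = function( e ) {
    var blob = e.data;
    // Hypothetical usage: offer the exported WAV as a download.
    var url = URL.createObjectURL( blob );
    var a = document.createElement( 'a' );
    a.href = url;
    a.download = 'render.wav';
    document.body.appendChild( a );
    a.click();
    URL.revokeObjectURL( url );
};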
I figured I'd share a working solution that I managed to put together from Kevin's answer.
Here's the waveWorker.js script:
self.onmessage = function( e ){
var wavPCM = new WavePCM( e['data']['config'] );
wavPCM.record( e['data']['pcmArrays'] );
wavPCM.requestData();
};
var WavePCM = function( config ){
this.sampleRate = config['sampleRate'] || 48000;
this.bitDepth = config['bitDepth'] || 16;
this.recordedBuffers = [];
this.bytesPerSample = this.bitDepth / 8;
};
WavePCM.prototype.record = function( buffers ){
this.numberOfChannels = this.numberOfChannels || buffers.length;
var bufferLength = buffers[0].length;
var reducedData = new Uint8Array( bufferLength * this.numberOfChannels * this.bytesPerSample );
// Interleave
for ( var i = 0; i < bufferLength; i++ ) {
for ( var channel = 0; channel < this.numberOfChannels; channel++ ) {
var outputIndex = ( i * this.numberOfChannels + channel ) * this.bytesPerSample;
var sample = buffers[ channel ][ i ];
// Check for clipping
if ( sample > 1 ) {
sample = 1;
}
else if ( sample < -1 ) {
sample = -1;
}
// bit reduce and convert to uInt
switch ( this.bytesPerSample ) {
case 4:
sample = sample * 2147483648;
reducedData[ outputIndex ] = sample;
reducedData[ outputIndex + 1 ] = sample >> 8;
reducedData[ outputIndex + 2 ] = sample >> 16;
reducedData[ outputIndex + 3 ] = sample >> 24;
break;
case 3:
sample = sample * 8388608;
reducedData[ outputIndex ] = sample;
reducedData[ outputIndex + 1 ] = sample >> 8;
reducedData[ outputIndex + 2 ] = sample >> 16;
break;
case 2:
sample = sample * 32768;
reducedData[ outputIndex ] = sample;
reducedData[ outputIndex + 1 ] = sample >> 8;
break;
case 1:
reducedData[ outputIndex ] = ( sample + 1 ) * 128;
break;
default:
throw "Only 8, 16, 24 and 32 bits per sample are supported";
}
}
}
this.recordedBuffers.push( reducedData );
};
WavePCM.prototype.requestData = function(){
var bufferLength = this.recordedBuffers[0].length;
var dataLength = this.recordedBuffers.length * bufferLength;
var headerLength = 44;
var wav = new Uint8Array( headerLength + dataLength );
var view = new DataView( wav.buffer );
view.setUint32( 0, 1380533830, false ); // RIFF identifier 'RIFF'
view.setUint32( 4, 36 + dataLength, true ); // file length minus RIFF identifier length and file description length
view.setUint32( 8, 1463899717, false ); // RIFF type 'WAVE'
view.setUint32( 12, 1718449184, false ); // format chunk identifier 'fmt '
view.setUint32( 16, 16, true ); // format chunk length
view.setUint16( 20, 1, true ); // sample format (raw)
view.setUint16( 22, this.numberOfChannels, true ); // channel count
view.setUint32( 24, this.sampleRate, true ); // sample rate
view.setUint32( 28, this.sampleRate * this.bytesPerSample * this.numberOfChannels, true ); // byte rate (sample rate * block align)
view.setUint16( 32, this.bytesPerSample * this.numberOfChannels, true ); // block align (channel count * bytes per sample)
view.setUint16( 34, this.bitDepth, true ); // bits per sample
view.setUint32( 36, 1684108385, false); // data chunk identifier 'data'
view.setUint32( 40, dataLength, true ); // data chunk length
for (var i = 0; i < this.recordedBuffers.length; i++ ) {
wav.set( this.recordedBuffers[i], i * bufferLength + headerLength );
}
self.postMessage( wav, [wav.buffer] );
self.close();
};
And here's how you can use it:
async function audioBufferToWaveBlob(audioBuffer) {
return new Promise(function(resolve, reject) {
var worker = new Worker('./waveWorker.js');
worker.onmessage = function( e ) {
var blob = new Blob([e.data.buffer], {type:"audio/wav"});
resolve(blob);
};
let pcmArrays = [];
for(let i = 0; i < audioBuffer.numberOfChannels; i++) {
pcmArrays.push(audioBuffer.getChannelData(i));
}
worker.postMessage({
pcmArrays,
config: {sampleRate: audioBuffer.sampleRate}
});
});
}
It's pretty quickly hacked together so feel free (of course) to fix it up and post a link to a better version in the comments :)
When using recorder.js, make sure you start by recording a piece of audio and then stop it. After you have stopped the recorder you can call the .exportWAV function. The callback receives a blob in WAV format. Instead of recording the buffer yourself, you are better off using recorder.js's own buffer creation, because exportWAV exports the buffer it previously saved; it creates that buffer from the source object you passed in when creating a new recorder.
var rec = new Recorder(yourSourceObject);
rec.record();
//let it record
rec.stop();
rec.exportWAV(function(blob){
//the generated blob contains the wav file
});
You can also check out the source code of recorderWorker.js and find out how to convert a buffer to a wav file yourself.
For non-real-time processing, take a look at OfflineAudioContext.
It lets you process audio data as if it were a regular AudioContext, but not in real time. If your data does not come from the microphone, you probably want to process it as fast as possible. In that case, you'll need OfflineAudioContext to create a buffer before encoding it to WAV.
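A minimal sketch of that idea, reusing the audioBufferToWaveBlob helper from the earlier answer (the node graph below is illustrative):

async function renderToWav(decodedBuffer) {
    // Render offline, faster than real time, then hand the result to the WAV encoder.
    const offlineCtx = new OfflineAudioContext(
        decodedBuffer.numberOfChannels,
        decodedBuffer.length,
        decodedBuffer.sampleRate
    );
    const source = offlineCtx.createBufferSource();
    source.buffer = decodedBuffer;
    source.connect(offlineCtx.destination);
    source.start();
    const rendered = await offlineCtx.startRendering();
    return audioBufferToWaveBlob(rendered);
}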