Fun with Web Audio and other modern APIs


Of all the new JavaScript APIs becoming available in web browsers today, the Web Audio API has to be one of the most exciting recent additions for web app and HTML5 game developers. This API opens up a whole range of new possibilities, and requires very little knowledge of DSP to achieve some pretty impressive results. In short, it offers the developer far more scope and control than using (and abusing) the poor old HTML5 Audio element.

The Web Audio API

The Web Audio API works a bit like wiring together a modular, programmable synthesizer or audio processor. It is not only capable of playing back pre-recorded sounds with sample accuracy, but it can also create raw sound using oscillators and waveforms - the core elements of any audio synthesizer. It can route sounds, perform frequency analysis and even apply native effects such as convolution reverb, delay and distortion effects.

The Web Audio API is currently supported in Chrome, Safari 6 and Safari on iOS 6. I thought it was about time to start learning how to use the API and have some fun experimenting, so after a few hours I managed to throw together a relatively simple synth effect unit. It’s both mouse and touch friendly, and functions in a similar way to the Korg Kaoss Pad, albeit with a much simpler set of features.

The basic audio components for the demo are pretty simple: a single oscillator, a filter, a delay node, and two gain nodes (one for delay feedback, the other for master output). The demo also uses an audio frequency analyser to create a visual representation of the audio output, which is drawn to the canvas for added real-time visual feedback.

Please note that the prefixed WebKit implementation shown in the code examples below may differ from the final unprefixed version described in the official W3C specification.

var myAudioContext = new webkitAudioContext();
var myAudioAnalyser = myAudioContext.createAnalyser();
var source = myAudioContext.createOscillator();
var nodes = {};

nodes.filter = myAudioContext.createBiquadFilter();
nodes.volume = myAudioContext.createGainNode();
nodes.delay = myAudioContext.createDelayNode();
nodes.feedbackGain = myAudioContext.createGainNode();

The source is then routed through the processor nodes and connected to the audio context destination (i.e. the speakers).

source.connect(nodes.filter);
nodes.filter.connect(nodes.volume);
nodes.filter.connect(nodes.delay);
nodes.delay.connect(nodes.feedbackGain);
nodes.feedbackGain.connect(nodes.volume);
nodes.feedbackGain.connect(nodes.delay);
nodes.volume.connect(myAudioAnalyser);
myAudioAnalyser.connect(myAudioContext.destination);
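One detail worth noting in the routing above is the feedback loop: the delay output feeds the feedbackGain node, which feeds back into the delay. Each trip around the loop scales the echo by the feedback gain, so keeping that gain below 1.0 makes the echoes decay away rather than build up endlessly. A quick sketch of the resulting echo amplitudes (the gain value here is purely illustrative, not taken from the demo):

```javascript
// Successive echoes form a geometric series: each pass around the
// delay -> feedbackGain -> delay loop multiplies by the feedback gain.
function echoAmplitudes(feedbackGain, count) {
    var amps = [];
    for (var i = 1; i <= count; i++) {
        amps.push(Math.pow(feedbackGain, i));
    }
    return amps;
}

// With a feedback gain of 0.5, echoes fall off quickly:
echoAmplitudes(0.5, 3); // [0.5, 0.25, 0.125]
```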

Responding to user input

Touching the user interface surface manipulates the oscillator frequency along the x axis, and the filter cutoff value along the y axis. To start the sound, source.noteOn(0) is called.

source.frequency.value = x;
nodes.filter.frequency.value = y;
source.noteOn(0);
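In case it’s useful, here’s a rough sketch of how the pointer coordinates might be mapped to those two parameters. The function name and the frequency ranges are my own choices, not taken from the demo source:

```javascript
// Map a pointer position on the surface to an oscillator frequency
// (x axis) and a filter cutoff (y axis, inverted so up = brighter).
// The Hz ranges below are illustrative assumptions.
function mapPointerToParams(x, y, width, height) {
    var minFreq = 40, maxFreq = 1200;      // oscillator range in Hz
    var minCutoff = 100, maxCutoff = 8000; // filter cutoff range in Hz
    return {
        frequency: minFreq + (x / width) * (maxFreq - minFreq),
        cutoff: minCutoff + ((height - y) / height) * (maxCutoff - minCutoff)
    };
}

var params = mapPointerToParams(256, 256, 512, 512);
// source.frequency.value = params.frequency;
// nodes.filter.frequency.value = params.cutoff;
```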

There is a range of different waveforms to use in the main oscillator, and several filter types to experiment with. It’s more a musical toy than anything serious, but it can produce some interesting sci-fi effects and spaced-out synth sounds with different delay times and feedback settings. If you haven’t already, have some fun and try it out.
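For reference, the prefixed WebKit implementation identified oscillator waveforms by numeric constants (later revisions of the spec switched to strings like 'square'). A small helper along these lines, with a name-to-constant map of my own devising, makes switching waveforms by name a little friendlier:

```javascript
// Name-to-constant map matching the prefixed WebKit oscillator types.
var WAVEFORMS = { sine: 0, square: 1, sawtooth: 2, triangle: 3 };

// Set an oscillator's waveform by name; throws on unknown names so
// typos fail loudly rather than silently defaulting to sine.
function setWaveform(osc, name) {
    if (!(name in WAVEFORMS)) {
        throw new Error('Unknown waveform: ' + name);
    }
    osc.type = WAVEFORMS[name];
    return osc.type;
}

// setWaveform(source, 'sawtooth'); // source.type is now 2
```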

Because this is a demo that currently runs only in the most modern browsers, I was also free to use some other useful new APIs and CSS effects.

Smoother animations using requestAnimationFrame

Canvas animation frame rate is handled by the browser using requestAnimationFrame. This helps to achieve smoother animation and better browser performance. It is also better for your CPU (and battery), as rAF will automatically pause animation when the browser tab is not visible.

function animateSpectrum() {
    requestAnimationFrame(animateSpectrum);
    drawSpectrum();
}

CSS Filter effects

The red blur effect on the finger tracking element is achieved using CSS filter effects.

.blurred {
    -webkit-filter: blur(5px);
    filter: blur(5px);
}

window.matchMedia

The adaptive user interface switches to a mobile optimized interface when the viewport is smaller than a set threshold. This is done using the matchMedia API to detect CSS Media Queries in JavaScript.

if (window.matchMedia) {
    var breakPoint = window.matchMedia('(max-width: 512px)');
    var isSmallViewport = breakPoint.matches;

    breakPoint.addListener(function (mql) {
        isSmallViewport = mql.matches;
    });
}

classList API

CSS active and pressed states are applied using the handy new classList API. This makes adding and removing multiple class names in CSS much less of a chore.

var finger = document.querySelector('.finger');

finger.classList.add('active');
finger.classList.remove('active');

Page Visibility API

Audio is automatically muted when the browser tab is not active using the Page Visibility API. This is currently supported in Chrome, but not yet in Safari 6.
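A minimal sketch of how that muting might be wired up, assuming the prefixed property and event names Chrome supported at the time (the 0.8 volume is an illustrative value, not the demo’s actual setting):

```javascript
// Pick the master volume based on the page's hidden state.
function volumeForVisibility(hidden, normalVolume) {
    return hidden ? 0 : normalVolume;
}

// In the browser, react to the prefixed visibility change event
// (Chrome-only at the time of writing):
if (typeof document !== 'undefined' && 'webkitHidden' in document) {
    document.addEventListener('webkitvisibilitychange', function () {
        nodes.volume.gain.value = volumeForVisibility(document.webkitHidden, 0.8);
    }, false);
}
```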

Demo

You can check out the demo here. Try it in Chrome, Safari 6 or iOS 6. There are still a few glitches to iron out on iOS 6, but feel free to open an issue if you find a bug or have any new ideas for features.