merging / layering multiple ArrayBuffers into one AudioBuffer using Web Audio API
I need to layer looping .wav tracks, which I ultimately need to be able to turn on and off individually while keeping them in sync.

First I load the tracks and stop BufferLoader from turning the loaded ArrayBuffers into AudioBuffers (hence the false):

        function loadTracks(data) {
            for (var i = 0; i < data.length; i++) {
                trackUrls.push(data[i]['url']);
            };
            bufferLoader = new BufferLoader(context, trackUrls, finishedLoading);
            bufferLoader.load(false);
            return loaderDefered.promise;
        }
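For reference, the same loading step without BufferLoader could look something like the sketch below, where loadRawTracks and onDone are stand-in names rather than anything from the original code; it fetches each URL as a raw ArrayBuffer without decoding:

    // Hypothetical stand-in for BufferLoader.load(false): fetch each URL as a
    // raw ArrayBuffer (no decoding) and hand the list to a completion callback.
    function loadRawTracks(urls, onDone) {
        Promise.all(urls.map(function(url) {
            return fetch(url).then(function(response) {
                return response.arrayBuffer();
            });
        })).then(function(arrayBuffers) {
            onDone(arrayBuffers); // plays the role of finishedLoading(bufferList)
        });
    }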

When you click a button on screen, it calls startStop():

    function startStop(index, name, isPlaying) {
        if(!activeBuffer) {
            activeBuffer = bufferList[index];
        }else{
            activeBuffer = appendBuffer(activeBuffer, bufferList[index]);
        }
        context.decodeAudioData(activeBuffer, function(buffer){
            audioBuffer = buffer;
            play();
        });
    }


    function play() {
        var scheduledTime = 0.015;
        try {
            // stop the previous source if one is already playing
            audioSource.stop(scheduledTime);
        } catch (e) {}

        audioSource = context.createBufferSource();
        audioSource.buffer = audioBuffer;
        audioSource.loop = true;
        audioSource.connect(context.destination);
        var currentTime = context.currentTime + 0.010 || 0;
        audioSource.start(scheduledTime - 0.005, currentTime, audioBuffer.duration - currentTime);
        audioSource.playbackRate.value = 1;
    }

Most of the code I found on this guy's GitHub. In the demo you can hear him layering AudioBuffers.

I have tried the same on my hosting.

Disregarding the AngularJS stuff, the Web Audio work happens in service.js at:

/js/angular/service.js 

If you open the console and click the buttons, you can see that activeBuffer.byteLength (an ArrayBuffer) is incrementing. However, even after being decoded by context.decodeAudioData, it still only plays the first sound you clicked instead of a merged AudioBuffer.

Warsle answered 20/9, 2013 at 14:33 Comment(0)

I'm not sure I totally understand your scenario - don't you want these to be playing simultaneously? (i.e. bass gets layered on top of the drums).

Your current code is trying to concatenate an additional audio file whenever you hit the button for that file. You can't just concatenate audio files (in their ENCODED form) and then run the result through decode - the decodeAudioData method decodes the first complete sound in the ArrayBuffer, then stops (because it's done decoding that sound).

What you should do is change the logic to concatenate the buffer data from the resulting AudioBuffers (see below). Even this logic isn't QUITE what you should do - this is still caching the encoded audio files, and decoding every time you hit the button. Instead, you should cache the decoded audio buffers and just concatenate those (a sketch of that follows the code below).

function startStop(index, name, isPlaying) {

    // Note we're decoding just the new sound
    context.decodeAudioData( bufferList[index], function(buffer){
        // We have a decoded buffer - now we need to concatenate it
        if(!audioBuffer) {
            audioBuffer = buffer;
        }else{
            audioBuffer = concatenateAudioBuffers(audioBuffer, buffer);
        }

        play();
    });
}

function concatenateAudioBuffers(buffer1, buffer2) {
    if (!buffer1 || !buffer2) {
        console.log("no buffers!");
        return null;
    }

    if (buffer1.numberOfChannels != buffer2.numberOfChannels) {
        console.log("number of channels is not the same!");
        return null;
    }

    if (buffer1.sampleRate != buffer2.sampleRate) {
        console.log("sample rates don't match!");
        return null;
    }

    var tmp = context.createBuffer(buffer1.numberOfChannels, buffer1.length + buffer2.length, buffer1.sampleRate);

    for (var i=0; i<tmp.numberOfChannels; i++) {
        var data = tmp.getChannelData(i);
        data.set(buffer1.getChannelData(i));
        data.set(buffer2.getChannelData(i),buffer1.length);
    }
    return tmp;
};
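To follow the advice above about caching the decoded audio buffers, you could decode everything once up front, roughly like this sketch (decodedBuffers and decodeAll are assumed names, not from the original code):

    var decodedBuffers = [];

    // Decode every encoded ArrayBuffer once, up front, so the button handler
    // only ever concatenates already-decoded AudioBuffers.
    function decodeAll(arrayBuffers, onDone) {
        var remaining = arrayBuffers.length;
        arrayBuffers.forEach(function(encoded, i) {
            context.decodeAudioData(encoded, function(buffer) {
                decodedBuffers[i] = buffer;
                remaining--;
                if (remaining === 0) onDone(decodedBuffers);
            });
        });
    }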
Noontide answered 20/9, 2013 at 15:18 Comment(3)
Hi, we use data.set(buffer1.getChannelData(i)); data.set(buffer2.getChannelData(i), buffer1.length); to append two buffers. What if we want to merge them so they play at the same time? – Pyoid
Please answer this question. – Supersonic
If you want to layer them - like, really just play them at the same time - personally, I would keep them as separate AudioBuffers and just make two AudioBufferSourceNodes that start() at the same time. But if you really want to overlay, just go through and sum each value in the buffer arrays (don't use "Buffer.set()"; literally do data[i] = buf[i] + buf2[i];). – Noontide
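Following that last comment, a hypothetical mix helper that overlays two decoded buffers by summing their samples could look like this; it assumes matching channel counts and sample rates, just like concatenateAudioBuffers above:

    function mixAudioBuffers(buffer1, buffer2) {
        // output is as long as the longer input; the shorter one is padded with silence
        var length = Math.max(buffer1.length, buffer2.length);
        var tmp = context.createBuffer(buffer1.numberOfChannels, length, buffer1.sampleRate);

        for (var ch = 0; ch < tmp.numberOfChannels; ch++) {
            var out = tmp.getChannelData(ch);
            var a = buffer1.getChannelData(ch);
            var b = buffer2.getChannelData(ch);
            for (var i = 0; i < length; i++) {
                // sum the samples instead of overwriting them
                out[i] = (i < a.length ? a[i] : 0) + (i < b.length ? b[i] : 0);
            }
        }
        return tmp;
    }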

SOLVED:

This gets multiple loops of the same duration playing at the same time and staying in sync, even when you start and stop them at random.

First, create all your buffer sources, where bufferList is an array of AudioBuffers and the first sound is the one you are going to read from and overwrite with your other sounds.

    function createAllBufferSources() {
        for (var i = 0; i < bufferList.length; i++) {
            var source = context.createBufferSource();
            source.buffer = bufferList[i];
            source.loop = true;
            bufferSources.push(source);
        }
        console.log(bufferSources);
    }

Then:

    function start() {
        var rewrite = bufferSources[0];
        rewrite.connect(context.destination);
        var processNode = context.createScriptProcessor(2048, 2, 2);
        rewrite.connect(processNode);
        processNode.onaudioprocess = function(e) {
            // get the left and right channels of the sound we want to overwrite
            var left = rewrite.buffer.getChannelData(0);
            var right = rewrite.buffer.getChannelData(1);
            var overL = [],
                overR = [],
                i, a, b, l;

            // store the channel data of every loop
            l = bufferList.length;
            for (i = 0; i < l; i++) {
                overL[i] = bufferList[i].getChannelData(0);
                overR[i] = bufferList[i].getChannelData(1);
            }

            // loop through the channel data of the sound we are going to overwrite
            a = 0, b = overL.length, l = left.length;
            for (i = 0; i < l; i++) {
                // blank the sample before we start to write
                left[i] = 0;
                right[i] = 0;
                // loop through all the sounds we want to add and sum their
                // samples into the old sound, at the same position
                for (a = 0; a < b; a++) {
                    left[i] += overL[a][i];
                    right[i] += overR[a][i];
                }
                // average so the mix doesn't clip
                left[i] /= b;
                right[i] /= b;
            }
        };

        processNode.connect(context.destination);
        rewrite.start(0);
    }

If you remove an AudioBuffer from bufferList and add it again at any point, it will always stay in sync.

EDIT:

Keep in mind that:

- the processor node gets garbage collected weirdly
- this is very taxing; you might want to think about using Web Workers somehow

Warsle answered 24/9, 2013 at 13:27 Comment(1)
Aaagh! You don't need to use a ScriptProcessorNode for this; you're causing a ton of inter-thread communication and glitching potential to do this live, when you can easily concatenate the buffers (see above) or just call start() on them at the appropriately-offset time. – Noontide
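For completeness, the approach this comment advocates could look roughly like the sketch below (startAllInSync, setTrackOn, and gainNodes are assumed names): start every loop at one shared context time and toggle tracks by muting their gain nodes rather than stopping the sources, so they stay sample-locked.

    var gainNodes = [];

    // Start one looping source per decoded buffer at the same context time,
    // then "turn tracks on and off" by muting their gain nodes, so every
    // loop stays in sync no matter when you toggle it.
    function startAllInSync(decodedBuffers) {
        var startTime = context.currentTime + 0.1; // small scheduling headroom
        decodedBuffers.forEach(function(buffer, i) {
            var source = context.createBufferSource();
            source.buffer = buffer;
            source.loop = true;

            var gain = context.createGain();
            gain.gain.value = 0; // every track starts muted
            source.connect(gain);
            gain.connect(context.destination);

            gainNodes[i] = gain;
            source.start(startTime);
        });
    }

    function setTrackOn(index, isOn) {
        gainNodes[index].gain.value = isOn ? 1 : 0;
    }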
