How to play multiple sounds from buffer simultaneously using nodes connected to AVAudioEngine's mixer
I am making a basic music app for iOS, where pressing notes causes the corresponding sound to play. I am trying to get multiple sounds stored in buffers to play simultaneously with minimal latency. However, I can only get one sound to play at any time.

I initially set up my sounds using multiple AVAudioPlayer objects, assigning one sound to each player. While this did play multiple sounds simultaneously, it didn't seem capable of starting two sounds at exactly the same time (the second sound would start slightly after the first). Furthermore, if I pressed notes at a very fast rate, the players couldn't keep up, and later sounds would start well after I had pressed their notes.
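
For reference, that AVAudioPlayer setup looked roughly like the sketch below (simplified, and the file names are just placeholders for the actual note samples):

import AVFoundation

// Rough sketch of the earlier per-note AVAudioPlayer approach.
var notePlayers = [AVAudioPlayer]()

func setUpPlayers()
{
    for name in ["note1", "note2", "note3"]
    {
        if let url = Bundle.main.url(forResource: name, withExtension: "wav"),
           let player = try? AVAudioPlayer(contentsOf: url)
        {
            player.prepareToPlay()   // preload to reduce start latency
            notePlayers.append(player)
        }
    }
}

func playNote(_ index: Int)
{
    let player = notePlayers[index]
    player.currentTime = 0   // restart from the beginning if already playing
    player.play()
}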

I am trying to solve this problem, and from the research I have done, it seems like using the AVAudioEngine to play sounds would be the best method, where I can set up the sounds in an array of buffers, and then have them play back from those buffers.

class ViewController: UIViewController
{
    // Main audio engine and its corresponding mixer
    var audioEngine: AVAudioEngine = AVAudioEngine()
    var mainMixer = AVAudioMixerNode()

    // One AVAudioPlayerNode per note
    var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)

    // Array of filepaths
    let noteFilePath: [String] = [
        Bundle.main.path(forResource: "note1", ofType: "wav")!,
        Bundle.main.path(forResource: "note2", ofType: "wav")!,
        Bundle.main.path(forResource: "note3", ofType: "wav")!]

    // Array to store the note URLs
    var noteFileURL = [URL]()

    // One audio file per note
    var noteAudioFile = [AVAudioFile]()

    // One audio buffer per note
    var noteAudioFileBuffer = [AVAudioPCMBuffer]()

    override func viewDidLoad()
    {
        super.viewDidLoad()
        do
        {

            // For each note, read the note URL into an AVAudioFile,
            // setup the AVAudioPCMBuffer using data read from the file,
            // and read the AVAudioFile into the corresponding buffer
            for i in 0...2
            {
                noteFileURL.append(URL(fileURLWithPath: noteFilePath[i]))

                // Read the corresponding url into the audio file
                try noteAudioFile.append(AVAudioFile(forReading: noteFileURL[i]))

                // Read data from the audio file, and store it in the correct buffer
                let noteAudioFormat = noteAudioFile[i].processingFormat

                let noteAudioFrameCount = UInt32(noteAudioFile[i].length)

                noteAudioFileBuffer.append(AVAudioPCMBuffer(pcmFormat: noteAudioFormat, frameCapacity: noteAudioFrameCount)!)

                // Read the audio file into the buffer
                try noteAudioFile[i].read(into: noteAudioFileBuffer[i])
            }

            mainMixer = audioEngine.mainMixerNode

            // For each note, attach the corresponding node to the audioEngine, and connect the node to the audioEngine's mixer.
            for i in 0...2
            {
                audioEngine.attach(audioFilePlayer[i])

                audioEngine.connect(audioFilePlayer[i], to: mainMixer, fromBus: 0, toBus: i, format: noteAudioFileBuffer[i].format)
            }

            // Start the audio engine
            try audioEngine.start()

            // Setup the audio session to play sound in the app, and activate the audio session
            try AVAudioSession.sharedInstance().setCategory(AVAudioSession.Category.soloAmbient)
            try AVAudioSession.sharedInstance().setMode(AVAudioSession.Mode.default)
            try AVAudioSession.sharedInstance().setActive(true)            
        }
        catch let error
        {
            print(error.localizedDescription)
        }
    }

    func playSound(senderTag: Int)
    {
        let sound: Int = senderTag - 1

        // Set up the corresponding audio player to play its sound.
        audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, options: .interrupts, completionHandler: nil)
        audioFilePlayer[sound].play()

    }
}

Each sound should play without interrupting the other sounds, only cutting off its own previous playback when that sound is played again. However, despite setting up multiple buffers and players, and assigning each one to its own bus on the audioEngine's mixer, playing one sound still stops any other sound that is playing.

Furthermore, while leaving out .interrupts does prevent sounds from stopping other sounds, these sounds won't play until the sound that is currently playing completes. This means that if I play note1, then note2, then note3, note1 will play, while note2 will only play after note1 finishes, and note3 will only play after note2 finishes.
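
As far as I can tell, this is just how scheduleBuffer behaves on a single player node: each call appends the buffer to that node's queue unless .interrupts is passed. A minimal sketch of the difference (player, bufferA and bufferB are placeholders):

import AVFoundation

// Sketch: scheduling two buffers on the SAME AVAudioPlayerNode.
func demoScheduling(player: AVAudioPlayerNode, bufferA: AVAudioPCMBuffer, bufferB: AVAudioPCMBuffer)
{
    // Without options, the second buffer is appended to the node's queue
    // and only plays after the first one finishes.
    player.scheduleBuffer(bufferA, at: nil, options: [], completionHandler: nil)
    player.scheduleBuffer(bufferB, at: nil, options: [], completionHandler: nil)

    // With .interrupts, the newly scheduled buffer cuts off whatever the
    // node is currently playing and starts right away.
    player.scheduleBuffer(bufferB, at: nil, options: .interrupts, completionHandler: nil)

    player.play()
}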

Edit: I was able to get the audioFilePlayer to reset to the beginning without using .interrupts, with the following code in the playSound function.

if audioFilePlayer[sound].isPlaying == true
{
    audioFilePlayer[sound].stop()
}
audioFilePlayer[sound].scheduleBuffer(noteAudioFileBuffer[sound], at: nil, completionHandler: nil)
audioFilePlayer[sound].play()

This still leaves me with figuring out how to play these sounds simultaneously, since playing another sound will still stop the currently playing sound.

Edit 2: I found the solution to my problem. My answer is below.

Unappealable answered 4/8, 2019 at 1:25

It turns out that the .interrupts option wasn't the issue (in fact, in my experience it turned out to be the best way to restart a playing sound, as there was no noticeable pause during the restart, unlike with the stop() function). The actual problem that was preventing multiple sounds from playing simultaneously was this particular line of code.

// One AVAudioPlayerNode per note
var audioFilePlayer: [AVAudioPlayerNode] = Array(repeating: AVAudioPlayerNode(), count: 7)

What happened here is that Array(repeating:count:) evaluates its argument once, so every item of the array was a reference to the exact same AVAudioPlayerNode instance; all of the players were effectively one shared node. As a result, calling the AVAudioPlayerNode functions on any item affected all of the items in the array, instead of just the specified one. To fix this and give each item its own AVAudioPlayerNode, I changed the above line so that the array starts out empty instead.

// One AVAudioPlayerNode per note
var audioFilePlayer = [AVAudioPlayerNode]()

I then added a line at the beginning of the second for-loop in viewDidLoad() to append a new AVAudioPlayerNode to this array on each iteration.

// For each note, attach the corresponding node to the audioEngine, and connect the node to the audioEngine's mixer.
for i in 0...6
{
    audioFilePlayer.append(AVAudioPlayerNode())
    // audioEngine code
}

This gave each item in the array its own AVAudioPlayerNode instance. Playing a sound or restarting a sound no longer interrupts the other sounds that are currently playing, and I can now play any of the notes simultaneously without any noticeable latency between a note press and playback.
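
For anyone who wants to verify the reference-semantics behaviour, here is a quick standalone sketch (not part of the app) that shows the difference:

import AVFoundation

// Array(repeating:count:) evaluates its argument once, so every element
// is a reference to the same AVAudioPlayerNode instance.
let shared = Array(repeating: AVAudioPlayerNode(), count: 3)
print(shared[0] === shared[1])   // true: same object

// Appending a freshly created node on each iteration gives distinct instances.
var separate = [AVAudioPlayerNode]()
for _ in 0..<3
{
    separate.append(AVAudioPlayerNode())
}
print(separate[0] === separate[1])   // false: different objects

// Equivalent one-liner: map runs its closure once per element.
let mapped = (0..<3).map { _ in AVAudioPlayerNode() }
print(mapped[0] === mapped[1])   // false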

Unappealable answered 4/8, 2019 at 6:6

Comments:
Remember, in Swift, class instances have reference semantics (structs and enums have value semantics)! – Follow
I get a freeze when there are more audio files, e.g. for i in 0...31 {}. – Granulocyte
