completionHandler of AVAudioPlayerNode.scheduleFile() is called too early
I am trying to use the new AVAudioEngine in iOS 8.

It looks like the completionHandler of player.scheduleFile() is called before the sound file has finished playing.

I am using a sound file with a length of 5 seconds, and the println() message appears about 1 second before the end of the sound.

Am I doing something wrong or do I misunderstand the idea of a completionHandler?

Thanks!


Here is some code:

class SoundHandler {
    let engine:AVAudioEngine
    let player:AVAudioPlayerNode
    let mainMixer:AVAudioMixerNode

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        engine.attachNode(player)
        mainMixer = engine.mainMixerNode

        var error:NSError?
        if !engine.startAndReturnError(&error) {
            if let e = error {
                println("error \(e.localizedDescription)")
            }
        }

        engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))
    }

    func playSound() {
        let soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")
        let soundFile = AVAudioFile(forReading: soundUrl!, error: nil)

        player.scheduleFile(soundFile, atTime: nil, completionHandler: { println("Finished!") })

        player.play()
    }
}
Topcoat answered 3/4, 2015 at 6:26 Comment(0)
The AVAudioEngine docs from back in the iOS 8 days must simply have been wrong. In the meantime, as a workaround, I noticed that if you instead use scheduleBuffer:atTime:options:completionHandler:, the callback is fired as expected (after playback finishes).

Example code:

NSError *error = nil;
AVAudioFile *file = [[AVAudioFile alloc] initForReading:_fileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
[file readIntoBuffer:buffer error:&error];

[_player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
    // reminder: we're not on the main thread in here
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"done playing, as expected!");
    });
}];
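For anyone who wants this in Swift, here is a sketch of the same workaround in modern Swift syntax. It assumes `player` (an AVAudioPlayerNode already attached to a running engine) and `fileURL` exist in the surrounding scope, and omits error handling:

```swift
import AVFoundation

// Load the whole file into a PCM buffer; `player` and `fileURL` are assumed.
let file = try AVAudioFile(forReading: fileURL,
                           commonFormat: .pcmFormatFloat32,
                           interleaved: false)
let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                              frameCapacity: AVAudioFrameCount(file.length))!
try file.read(into: buffer)

player.scheduleBuffer(buffer, at: nil, options: .interrupts) {
    // reminder: we're not on the main thread in here
    DispatchQueue.main.async {
        print("done playing, as expected!")
    }
}
```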
Cumberland answered 14/4, 2015 at 14:32 Comment(10)
Love it. Works like a charm!Cooperage
Actually, after testing this, it turned out that even with the buffer the callback gets called before the player stops. audioPlayer.scheduleBuffer(audioBuffer){ dispatch_async(dispatch_get_main_queue()) { [unowned self] in if (self.audioPlayer.playing == false){ self.stopButton.hidden = true } } } in this example, the condition never gets passedCooperage
What's odd is my AVAudioPlayerNode's produce sound on iOS9 but aren't working on some older devices and devices running iOS8.Morality
Did someone actually file a bug for this? I can do it if needed.Ptolemaist
@Ptolemaist i did, but you can too! the more bug reports they receive on something, the more likely they'll fix it.Cumberland
Good solution but you lose all users with pre-iOS 11 devices.Impotent
@Impotent how so? Looks like this method has been around since iOS 8: developer.apple.com/documentation/avfoundation/…Cumberland
Sorry, confused it with the scheduleFile method with similar parameters.Impotent
how can i use this method in swift any sample code ?Prem
For any future searchers, be sure to read arlomedia's answer below. This answer is incorrect; there's no bug here. The callback is running when it's supposed to. "Called after the player has scheduled the buffer for playback on the render thread or the player is stopped." It is called when the scheduling is done, not when the playing is done. But arlomedia's answer includes what you're probably looking for (AVAudioPlayerNodeCompletionDataPlayedBack).Lisle
P
9

I see the same behavior.

From my experimentation, I believe the callback is called once the buffer/segment/file has been "scheduled", not when it is finished playing.

Although the docs explicitly state: "Called after the buffer has completely played or the player is stopped. May be nil."

So I think it's either a bug or incorrect documentation; no idea which.
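The gap you see lines up with the callback firing when the tail of the file is handed to the render thread rather than when it reaches the output. A back-of-envelope sketch (all numbers are hypothetical, chosen to match the question's 5-second file):

```swift
// Hypothetical numbers: a 5-second file at 44.1 kHz, with the callback
// arriving when the player position is 4 seconds in.
let fileLength: Int64 = 220_500        // 5 s * 44_100 frames/s
let sampleRate = 44_100.0
let playerSampleTime: Int64 = 176_400  // player position when callback fired

// Frames not yet played, converted to seconds.
let remaining = Double(fileLength - playerSampleTime) / sampleRate
print(remaining)  // roughly the 1-second gap the question describes
```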

Pulido answered 6/4, 2015 at 19:35 Comment(1)
In the meantime it has changed to "Called after the player has scheduled the file for playback on the render thread or the player is stopped. May be nil." — not sure if this includes when the player ends naturally.Impotent
You can always compute the future time at which audio playback will complete, using AVAudioTime. The current behavior is actually useful, because it lets you schedule additional buffers/segments/files to play from the callback before the current one finishes, avoiding any gap in audio playback. That makes a simple loop player easy to build. Here's an example:

class Latch {
    var value : Bool = true
}

func loopWholeFile(file : AVAudioFile, player : AVAudioPlayerNode) -> Latch {
    let looping = Latch()
    let frames = file.length

    let sampleRate = file.processingFormat.sampleRate
    var segmentTime : AVAudioFramePosition = 0
    var segmentCompletion : AVAudioNodeCompletionHandler!
    segmentCompletion = {
        if looping.value {
            // From the callback, queue the next pass one file-length later.
            segmentTime += frames
            player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
        }
    }
    // Schedule the first pass, then call the handler once by hand so a second
    // pass is already queued before playback starts.
    player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
    segmentCompletion()
    player.play()

    return looping
}

The code above schedules the entire file twice before calling player.play(). As each segment gets close to finishing, it schedules another whole file in the future, to avoid gaps in playback. To stop looping, you use the return value, a Latch, like this:

let looping = loopWholeFile(file, player)
sleep(1000)
looping.value = false
player.stop()
Stadtholder answered 15/3, 2016 at 2:3 Comment(0)
My bug report for this was closed as "works as intended," but Apple pointed me to new variations of the scheduleFile, scheduleSegment, and scheduleBuffer methods in iOS 11. These add a completionCallbackType argument that lets you request the completion callback when playback has actually completed:

[self.audioUnitPlayer scheduleSegment:self.audioUnitFile
                        startingFrame:sampleTime
                           frameCount:(int)sampleLength
                               atTime:0
               completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
                    completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // do something here
}];

The documentation doesn't say anything about how this works, but I tested it and it works for me.

I've been using this workaround for iOS 8-10:

- (void)playRecording {
    [self.audioUnitPlayer scheduleSegment:self.audioUnitFile startingFrame:sampleTime frameCount:(int)sampleLength atTime:0 completionHandler:^() {
        float totalTime = [self recordingDuration];
        float elapsedTime = [self recordingCurrentTime];
        float remainingTime = totalTime - elapsedTime;
        [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
    }];
}

- (float)recordingDuration {
    float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
    if (isnan(duration)) {
        duration = 0;
    }
    return duration;
}

- (float)recordingCurrentTime {
    AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
    AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
    AVAudioFramePosition sampleTime = playerTime.sampleTime;
    if (sampleTime == 0) { return self.audioUnitLastKnownTime; } // this happens when the player isn't playing
    sampleTime += self.audioUnitStartingFrame; // if we trimmed from the start, or changed the location with the location slider, the time before that point won't be included in the player time, so we have to track it ourselves and add it here
    float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
    self.audioUnitLastKnownTime = time;
    return time;
}
Tonguing answered 18/11, 2017 at 0:7 Comment(0)
As of today, in a project with deployment target 12.4, on a device running 12.4.1, here's the way we found to successfully stop the nodes upon playback completion:

// audioFile and playerNode created here ...

playerNode.scheduleFile(audioFile, at: nil, completionCallbackType: .dataPlayedBack) { _ in
    os_log(.debug, log: self.log, "%@", "Completing playing sound effect: \(filePath) ...")

    DispatchQueue.main.async {
        os_log(.debug, log: self.log, "%@", "... now actually completed: \(filePath)")

        self.engine.disconnectNodeOutput(playerNode)
        self.engine.detach(playerNode)
    }
}

The main difference with respect to previous answers is that the node detaching is postponed to the main thread, instead of being performed on the callback thread (which I guess is the audio render thread?).

Crosswise answered 11/9, 2019 at 10:12 Comment(0)
Yes, it does get called slightly before the file (or buffer) has completed. If you call [myNode stop] from within the completion handler, the file (or buffer) will not fully complete. However, if you call [myEngine stop], the file (or buffer) will play through to the end.

Rights answered 12/3, 2017 at 8:21 Comment(0)
// audioFile here is our original audio

audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: {
    print("scheduleFile complete")

    var delayInSeconds: Double = 0

    if let lastRenderTime = self.audioPlayerNode.lastRenderTime,
       let playerTime = self.audioPlayerNode.playerTime(forNodeTime: lastRenderTime) {

        // remaining time = frames left / sample rate, adjusted for a custom
        // playback rate if one is set
        if let rate = rate {
            delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate) / Double(rate)
        } else {
            delayInSeconds = Double(audioFile.length - playerTime.sampleTime) / Double(audioFile.processingFormat.sampleRate)
        }
    }

    // schedule a stop timer for when audio finishes playing
    DispatchQueue.main.asyncAfter(deadline: .now() + delayInSeconds) {
        audioEngine.mainMixerNode.removeTap(onBus: 0)
        // Playback has completed
    }
})
Consuela answered 8/2, 2018 at 6:21 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.