
I am trying to use the new AVAudioEngine in iOS 8.

It looks like the completionHandler of player.scheduleFile() is called before the sound file has finished playing.

I am using a sound file with a length of 5 seconds, and the println() message appears roughly 1 second before the end of the sound.

Am I doing something wrong or do I misunderstand the idea of a completionHandler?

Thanks!


Here is some code:

import AVFoundation

class SoundHandler {
    let engine: AVAudioEngine
    let player: AVAudioPlayerNode
    let mainMixer: AVAudioMixerNode

    init() {
        engine = AVAudioEngine()
        player = AVAudioPlayerNode()
        engine.attachNode(player)
        mainMixer = engine.mainMixerNode

        // Connect the player to the mixer before starting the engine.
        engine.connect(player, to: mainMixer, format: mainMixer.outputFormatForBus(0))

        var error: NSError?
        if !engine.startAndReturnError(&error) {
            if let e = error {
                println("error \(e.localizedDescription)")
            }
        }
    }

    func playSound() {
        let soundUrl = NSBundle.mainBundle().URLForResource("Test", withExtension: "m4a")!
        let soundFile = AVAudioFile(forReading: soundUrl, error: nil)

        player.scheduleFile(soundFile, atTime: nil, completionHandler: { println("Finished!") })

        player.play()
    }
}
Oliver

7 Answers


I see the same behavior.

From my experimentation, I believe the callback is called once the buffer/segment/file has been "scheduled", not when it is finished playing.

Although the docs explicitly state: "Called after the buffer has completely played or the player is stopped. May be nil."

So I think it's either a bug or incorrect documentation. No idea which.
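
To see the early firing for yourself, here's a minimal sketch in modern Swift (the helper name demonstrateEarlyCallback is mine): log the player's position when the completion handler fires and compare it against the file's duration.

import AVFoundation

// Minimal sketch: if the handler fires at "scheduling finished" rather than
// "playback finished", the logged position falls well short of the duration.
func demonstrateEarlyCallback(file: AVAudioFile, player: AVAudioPlayerNode) {
    let duration = Double(file.length) / file.processingFormat.sampleRate
    player.scheduleFile(file, at: nil) {
        if let nodeTime = player.lastRenderTime,
           let playerTime = player.playerTime(forNodeTime: nodeTime) {
            let elapsed = Double(playerTime.sampleTime) / playerTime.sampleRate
            print("completion handler fired at \(elapsed)s of \(duration)s")
        }
    }
    player.play()
}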

Alan Queen
  • In the meantime it has changed to "Called after the player has scheduled the file for playback on the render thread or the player is stopped. May be nil."; not sure if this includes when the player ends naturally. – bio May 03 '18 at 14:41

You can always compute the future time when audio playback will complete, using AVAudioTime. The current behavior is useful because it lets you schedule additional buffers/segments/files to play from the callback before the current buffer/segment/file finishes, avoiding a gap in audio playback. This lets you create a simple loop player without much work. Here's an example:

// Shared flag used to stop the loop from outside.
class Latch {
    var value: Bool = true
}

func loopWholeFile(file: AVAudioFile, player: AVAudioPlayerNode) -> Latch {
    let looping = Latch()
    let frames = file.length

    let sampleRate = file.processingFormat.sampleRate
    var segmentTime: AVAudioFramePosition = 0
    var segmentCompletion: AVAudioNodeCompletionHandler!
    segmentCompletion = {
        if looping.value {
            // Schedule the next pass exactly one file length later, so playback is gapless.
            segmentTime += frames
            player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
        }
    }
    player.scheduleFile(file, atTime: AVAudioTime(sampleTime: segmentTime, atRate: sampleRate), completionHandler: segmentCompletion)
    segmentCompletion() // prime a second pass so two are always queued
    player.play()

    return looping
}

The code above schedules the entire file twice before calling player.play(). As each segment gets close to finishing, it schedules another whole file in the future, to avoid gaps in playback. To stop looping, you use the return value, a Latch, like this:

let looping = loopWholeFile(file, player)
sleep(1000)
looping.value = false
player.stop()
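
The first sentence above, computing the future completion time, can also stand alone. Here's a minimal sketch in modern Swift (completionDelay is a hypothetical helper name), assuming the player is currently playing the file from frame 0:

import AVFoundation

// Remaining playback time = frames left in the file / sample rate.
func completionDelay(for file: AVAudioFile, on player: AVAudioPlayerNode) -> TimeInterval? {
    guard let nodeTime = player.lastRenderTime,
          let playerTime = player.playerTime(forNodeTime: nodeTime) else { return nil }
    let framesRemaining = file.length - playerTime.sampleTime
    return Double(framesRemaining) / file.processingFormat.sampleRate
}

// Usage: schedule your own "really finished" callback.
// if let delay = completionDelay(for: file, on: player) {
//     DispatchQueue.main.asyncAfter(deadline: .now() + delay) { print("done") }
// }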
Patrick Beard

The AVAudioEngine docs from back in the iOS 8 days must have just been wrong. In the meantime, as a workaround, I noticed that if you instead use scheduleBuffer:atTime:options:completionHandler:, the callback fires as expected (after playback finishes).

Example code:

NSError *error = nil;
AVAudioFile *file = [[AVAudioFile alloc] initForReading:_fileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
[file readIntoBuffer:buffer error:&error];

[_player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
    // reminder: we're not on the main thread in here
    dispatch_async(dispatch_get_main_queue(), ^{
        NSLog(@"done playing, as expected!");
    });
}];
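
For anyone wanting this in Swift (as asked in the comments below), here's a rough modern-Swift equivalent; the wrapper name and the throwing error handling are mine:

import AVFoundation

// Sketch of the same buffer-based approach: read the whole file into one PCM
// buffer and schedule it with the .interrupts option.
func scheduleWholeFileAsBuffer(fileURL: URL, player: AVAudioPlayerNode) throws {
    let file = try AVAudioFile(forReading: fileURL,
                               commonFormat: .pcmFormatFloat32,
                               interleaved: false)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: AVAudioFrameCount(file.length)) else { return }
    try file.read(into: buffer)

    player.scheduleBuffer(buffer, at: nil, options: .interrupts) {
        // reminder: we're not on the main thread in here
        DispatchQueue.main.async {
            print("done playing, as expected!")
        }
    }
}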
taber
  • Love it. Works like a charm! – Next Developer Jan 31 '16 at 21:43
  • Actually, after testing this, it turned out that even with the buffer the callback gets called before the player stops. `audioPlayer.scheduleBuffer(audioBuffer){ dispatch_async(dispatch_get_main_queue()) { [unowned self] in if (self.audioPlayer.playing == false){ self.stopButton.hidden = true } } }` in this example, the condition never gets passed – Next Developer Feb 01 '16 at 20:01
  • What's odd is my AVAudioPlayerNode's produce sound on iOS9 but aren't working on some older devices and devices running iOS8. – vikzilla Feb 20 '16 at 16:29
  • Did someone actually file a bug for this? I can do it if needed. – lancejabr Feb 17 '17 at 05:08
  • @lancejabr i did, but you can too! the more bug reports they receive on something, the more likely they'll fix it. – taber Feb 17 '17 at 13:46
  • Good solution but you lose all users with pre-iOS 11 devices. – bio May 03 '18 at 14:44
  • @bio how so? Looks like this method has been around since iOS 8: https://developer.apple.com/documentation/avfoundation/avaudioplayernode/1388422-schedulebuffer?language=swift – taber May 24 '18 at 16:08
  • Sorry, confused it with the scheduleFile method with similar parameters. – bio May 25 '18 at 16:57
  • How can I use this method in Swift? Any sample code? – RAM Sep 18 '18 at 11:09
  • For any future searchers, be sure to read arlomedia's answer below. This answer is incorrect; there's no bug here. The callback is running when it's supposed to. "Called after the player has scheduled the buffer for playback on the render thread or the player is stopped." It is called when the scheduling is done, not when the playing is done. But arlomedia's answer includes what you're probably looking for (AVAudioPlayerNodeCompletionDataPlayedBack). – Rob Napier Jan 13 '20 at 21:03

My bug report for this was closed as "works as intended," but Apple pointed me to new variations of the scheduleFile, scheduleSegment, and scheduleBuffer methods in iOS 11. These add a completionCallbackType argument that you can use to specify that you want the completion callback when playback has completed:

[self.audioUnitPlayer scheduleSegment:self.audioUnitFile
                        startingFrame:sampleTime
                           frameCount:(AVAudioFrameCount)sampleLength
                               atTime:nil
               completionCallbackType:AVAudioPlayerNodeCompletionDataPlayedBack
                    completionHandler:^(AVAudioPlayerNodeCompletionCallbackType callbackType) {
    // do something here
}];

The documentation doesn't say anything about how this works, but I tested it and it works for me.
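
In Swift the same call looks roughly like this (a sketch; the playSegment wrapper and its parameter names are mine):

import AVFoundation

// Sketch of the iOS 11 completionCallbackType variant in Swift.
// .dataPlayedBack requests the callback after the data has actually
// played back, not merely been scheduled.
func playSegment(of file: AVAudioFile,
                 from startFrame: AVAudioFramePosition,
                 frames: AVAudioFrameCount,
                 on player: AVAudioPlayerNode) {
    player.scheduleSegment(file,
                           startingFrame: startFrame,
                           frameCount: frames,
                           at: nil,
                           completionCallbackType: .dataPlayedBack) { _ in
        // runs once playback of this segment has finished
    }
    player.play()
}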

I've been using this workaround for iOS 8-10:

- (void)playRecording {
    [self.audioUnitPlayer scheduleSegment:self.audioUnitFile startingFrame:sampleTime frameCount:(AVAudioFrameCount)sampleLength atTime:nil completionHandler:^() {
        float totalTime = [self recordingDuration];
        float elapsedTime = [self recordingCurrentTime];
        float remainingTime = totalTime - elapsedTime;
        [self performSelector:@selector(doSomethingHere) withObject:nil afterDelay:remainingTime];
    }];
}

- (float)recordingDuration {
    float duration = self.audioUnitFile.length / self.audioUnitFile.processingFormat.sampleRate;
    if (isnan(duration)) {
        duration = 0;
    }
    return duration;
}

- (float)recordingCurrentTime {
    AVAudioTime *nodeTime = self.audioUnitPlayer.lastRenderTime;
    AVAudioTime *playerTime = [self.audioUnitPlayer playerTimeForNodeTime:nodeTime];
    AVAudioFramePosition sampleTime = playerTime.sampleTime;
    if (sampleTime == 0) { return self.audioUnitLastKnownTime; } // this happens when the player isn't playing
    sampleTime += self.audioUnitStartingFrame; // if we trimmed from the start, or changed the location with the location slider, the time before that point won't be included in the player time, so we have to track it ourselves and add it here
    float time = sampleTime / self.audioUnitFile.processingFormat.sampleRate;
    self.audioUnitLastKnownTime = time;
    return time;
}
arlomedia

As of today, in a project with deployment target 12.4, on a device running 12.4.1, here's the way we found to successfully stop the nodes upon playback completion:

// audioFile and playerNode created here ...

playerNode.scheduleFile(audioFile, at: nil, completionCallbackType: .dataPlayedBack) { _ in
    os_log(.debug, log: self.log, "%@", "Completing playing sound effect: \(filePath) ...")

    DispatchQueue.main.async {
        os_log(.debug, log: self.log, "%@", "... now actually completed: \(filePath)")

        self.engine.disconnectNodeOutput(playerNode)
        self.engine.detach(playerNode)
    }
}

The main difference from the previous answers is that the node detach is postponed to the main thread, instead of being performed on the callback thread (which I guess is the audio render thread?).

superjos

Yes, it does get called slightly before the file (or buffer) has completed. If you call [myNode stop] from within the completion handler, the file (or buffer) will not fully complete. However, if you call [myEngine stop], the file (or buffer) will complete to the end.
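
A sketch of the two variants in modern Swift (the wrapper function and the hop to the main queue are mine, since the handler runs off the main thread); the comments restate this answer's claims:

import AVFoundation

func scheduleAndStop(file: AVAudioFile, player: AVAudioPlayerNode, engine: AVAudioEngine) {
    player.scheduleFile(file, at: nil) {
        DispatchQueue.main.async {
            // player.stop()  // per this answer: cuts the file off before the end
            engine.stop()     // per this answer: the file still completes to the end
        }
    }
    player.play()
}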

Andrew Coad
// audioFile here is our original audio

audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: {
    print("scheduleFile Complete")

    var delayInSeconds: Double = 0

    // Work out how much of the file is still left to play.
    if let lastRenderTime = self.audioPlayerNode.lastRenderTime,
       let playerTime = self.audioPlayerNode.playerTime(forNodeTime: lastRenderTime) {
        let remainingFrames = Double(audioFile.length - playerTime.sampleTime)
        if let rate = rate { // optional playback rate, e.g. from an AVAudioUnitTimePitch
            delayInSeconds = remainingFrames / audioFile.processingFormat.sampleRate / Double(rate)
        } else {
            delayInSeconds = remainingFrames / audioFile.processingFormat.sampleRate
        }
    }

    // Schedule a stop timer for when the audio actually finishes playing.
    DispatchQueue.main.asyncAfter(deadline: .now() + delayInSeconds) {
        audioEngine.mainMixerNode.removeTap(onBus: 0)
        // Playback has completed
    }
})
ZaEeM ZaFaR