
When using an AVAudioPlayerNode to schedule a short buffer to play immediately on a touch event ("Touch Up Inside"), I've noticed audible glitches/artifacts during playback while testing. The audio does not glitch at all in the iOS Simulator, but there is audible distortion when I run the app on an actual iOS device. The distortion occurs randomly: the triggered sound sometimes plays cleanly, while other times it sounds distorted.

I've tried using different audio files and file formats, and preparing the buffer for playback with the prepareWithFrameCount method, but unfortunately the result is always the same and I'm stuck wondering what could be going wrong.

I've stripped the code down to globals for clarity and simplicity. Any help or insight would be greatly appreciated. This is my first attempt at developing an iOS app and my first question posted on Stack Overflow.

let filePath = NSBundle.mainBundle().pathForResource("BD_withSilence", ofType: "caf")!
let fileURL: NSURL = NSURL(fileURLWithPath: filePath)!
var error: NSError?
let file = AVAudioFile(forReading: fileURL, error: &error)

let fileFormat = file.processingFormat
let frameCount = UInt32(file.length)
let buffer = AVAudioPCMBuffer(PCMFormat: fileFormat, frameCapacity: frameCount)

let audioEngine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()

func startEngine() {

    var error: NSError?
    file.readIntoBuffer(buffer, error: &error)
    audioEngine.attachNode(playerNode)
    audioEngine.connect(playerNode, to: audioEngine.mainMixerNode, format: buffer.format)
    audioEngine.prepare()

    func start() {
        var error: NSError?
        audioEngine.startAndReturnError(&error)
    }

    start()

}

startEngine()

let frameCapacity = AVAudioFramePosition(buffer.frameCapacity)
let frameLength = buffer.frameLength
let sampleRate: Double = 44100.0

func play() {

    func scheduleBuffer() {
        playerNode.scheduleBuffer(buffer, atTime: nil, options: AVAudioPlayerNodeBufferOptions.Interrupts, completionHandler: nil)
        playerNode.prepareWithFrameCount(frameLength)
    }

    if playerNode.playing == false {
        scheduleBuffer()
        let time = AVAudioTime(sampleTime: frameCapacity, atRate: sampleRate)
        playerNode.playAtTime(time)
    }
    else {
        scheduleBuffer()
    }
}

// triggered by a "Touch Up Inside" event on a UIButton in my ViewController

@IBAction func triggerPlay(sender: AnyObject) {
   play()
}

Update:

OK, I think I've identified the source of the distortion: the volume of the node(s) is too high at output and causes clipping. After adding these two lines to my startEngine function, the distortion no longer occurred:

playerNode.volume = 0.8
audioEngine.mainMixerNode.volume = 0.8

However, I still don't know why I need to lower the output; my audio file itself does not clip. I'm guessing it might be a result of the way AVAudioPlayerNodeBufferOptions.Interrupts is implemented. When a buffer interrupts another buffer, could there be an increase in output volume as a result of the interruption, causing output clipping? I'm still looking for a solid understanding of why this occurs. If anyone is willing/able to provide any clarification about this, that would be fantastic!
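If that hypothesis is right, the arithmetic behind it is simple: a mixer sums the samples of overlapping sources, and anything beyond full scale is hard-clipped. A minimal sketch of the idea, with made-up sample values (these are not taken from the app or from AVFoundation's actual mixing code):

```swift
// Illustration only: hypothetical sample values, not real mixer internals.
// A mixer sums overlapping sources; results outside [-1.0, 1.0]
// are clamped to full scale, which is audible as distortion.
let tailSample: Float = 0.9   // imagined tail of the interrupted buffer
let headSample: Float = 0.9   // imagined start of the newly scheduled buffer
let mixed = tailSample + headSample           // 1.8, beyond full scale
let clipped = max(-1.0, min(1.0, mixed))      // clamped to 1.0
print(mixed, clipped)
```

Scaling both nodes down by 0.8, as above, keeps a brief overlap of two such peaks under full scale, which would explain why the workaround made the distortion less likely.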

nastynate13
  • I'm having the same exact problem. Works great in simulator but on device (iPhone 6+) it randomly plays the sound distorted. Once it starts I have to quit the entire app. I tried your idea of changing volumes but still having the problem. Wondering if there is another solution or if this is an iOS bug – Amos Oct 05 '15 at 23:57
  • Hi Amos, yes I recognized that this didn't entirely fix the problem after the fact- it just seemed to make it less noticeable / less likely.. so, perhaps a bug. I've stepped away from this project, but will be digging in again soon. Will let you know here if I discover a solution. Perhaps bringing it to Apple's attention in the meantime would be a good idea. GL! – nastynate13 Oct 08 '15 at 16:40
  • This could also be because your engine has too many nodes on it. You need to detach some – vikzilla Feb 21 '16 at 19:04
  • Have you got any solution regarding this? – Kishore Suthar Jun 04 '18 at 09:36

1 Answer


Not sure if this is the problem you experienced in 2015, but it may be the same issue that @suthar experienced in 2018.

I experienced a very similar problem, and it was due to the fact that the sample rate on the device is different from the simulator's: on macOS it is 44100 Hz, while on late-model iOS devices it is 48000 Hz.

So when you fill your buffer with 44100 samples on a 48000 Hz device, you get 3900 samples of silence. When played back, it doesn't sound like silence; it sounds like a glitch.

I used the mainMixer format when connecting my playerNode and also when creating my pcmBuffer. Don't refer to 48000 or 44100 anywhere in the code.

    audioEngine.attach(playerNode)
    audioEngine.connect(playerNode, to: mixerNode, format: mixerNode.outputFormat(forBus: 0))

    let pcmBuffer = AVAudioPCMBuffer(pcmFormat: SynthEngine.shared.audioEngine.mainMixerNode.outputFormat(forBus: 0),
                                     frameCapacity: AVAudioFrameCount(bufferSize))
Brett