
I'm having an issue with downsampling audio taken from the microphone. I'm using AVAudioEngine to take samples from the microphone with the following code:

assert(self.engine.inputNode != nil)
let input = self.engine.inputNode!

let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)    
let mixer = AVAudioMixerNode()
engine.attach(mixer)
engine.connect(input, to: mixer, format: input.inputFormat(forBus: 0))

do {
    try engine.start()

    mixer.installTap(onBus: 0, bufferSize: 1024, format: audioFormat, block: {
            (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
        //some code here
    })

} catch let error {
    print(error.localizedDescription)
}

This code works great on the iPhone 5s since the microphone input is 8000Hz, and the buffer gets filled with data from the microphone.

The problem is that I want to be able to record from the iPhone 6s (and upwards), whose microphone records at 16000Hz. What's weird is that if I connect the mixer node to the engine's main mixer node (with the following code):

engine.connect(mixer, to: mainMixer, format: audioFormat)

this actually works: the buffer I get has the 8000Hz format and the sound comes out perfectly downsampled. The only problem is that the sound also comes out of the speaker, which I don't want (and if I don't connect it, the buffer is empty).
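
For reference, the hardware rate the tap has to deal with can be printed with a small diagnostic sketch like this (not part of my recording code, just to show where the 8000Hz vs. 16000Hz difference comes from):

// Diagnostic only: print the format the microphone hardware actually delivers
// (8000Hz on the iPhone 5s, 16000Hz on the iPhone 6s in my case).
let hwFormat = self.engine.inputNode!.inputFormat(forBus: 0)
print("hardware input: \(hwFormat.sampleRate) Hz, \(hwFormat.channelCount) channel(s)")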

Does anyone know how to resolve this issue?

Any help, input or thought is very much appreciated.

nullforlife

3 Answers


Another way to do it, with AVAudioConverter in Swift 5:

import AVFoundation

let engine = AVAudioEngine()

func setup() {

    let input = engine.inputNode
    let bus = 0
    let inputFormat = input.outputFormat(forBus: bus)

    // Target format: 8000 Hz, mono, Float32. The converter takes care of the resampling.
    guard let outputFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: true),
          let converter = AVAudioConverter(from: inputFormat, to: outputFormat) else {
        return
    }

    // Tap the input node in its native hardware format and convert each buffer by hand.
    input.installTap(onBus: bus, bufferSize: 1024, format: inputFormat) { (buffer, time) -> Void in
        var newBufferAvailable = true

        // Hand the tapped buffer to the converter exactly once, then report .noDataNow
        // so convert(to:error:withInputFrom:) returns instead of waiting for more input.
        let inputCallback: AVAudioConverterInputBlock = { inNumPackets, outStatus in
            if newBufferAvailable {
                outStatus.pointee = .haveData
                newBufferAvailable = false
                return buffer
            } else {
                outStatus.pointee = .noDataNow
                return nil
            }
        }

        // Size the output buffer for the resampled frame count (input frames scaled by the rate ratio).
        if let convertedBuffer = AVAudioPCMBuffer(pcmFormat: outputFormat, frameCapacity: AVAudioFrameCount(outputFormat.sampleRate) * buffer.frameLength / AVAudioFrameCount(buffer.format.sampleRate)) {
            var error: NSError?
            let status = converter.convert(to: convertedBuffer, error: &error, withInputFrom: inputCallback)
            assert(status != .error)

            // 8kHz buffers
            print(convertedBuffer.format)
        }
    }

    do {
        try engine.start()
    } catch { print(error) }
}
dengST30
  • the accepted approach only gave me empty buffers. The solution with the converter worked for me, thanks for that! – bobski Jun 30 '20 at 18:07
  • hi buddy, can u help my problem: https://stackoverflow.com/questions/66971504/i-got-crash-when-record-required-condition-is-false-format-samplerate-hwf – famfamfam Apr 06 '21 at 15:34
  • Just saw it. fixed the crash, then you got 48000 by device. you can get 44100 by converting – dengST30 Apr 09 '21 at 15:45

I solved this issue by simply changing my mixer's volume to 0.

mixer.volume = 0

This lets me take advantage of the engine's main mixer node's great ability to resample any sample rate to my desired sample rate, without the microphone feedback loop coming straight out of the speakers. If anyone needs any clarification about this, please let me know.

This is my code now:

assert(self.engine.inputNode != nil)
let input = self.engine.inputNode!
let mainMixer = engine.mainMixerNode   // the engine's built-in main mixer node

let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)
let mixer = AVAudioMixerNode()
engine.attach(mixer)
engine.connect(input, to: mixer, format: input.inputFormat(forBus: 0))
// Mute the mixer so the microphone audio never reaches the speaker;
// the connection to the main mixer still makes the engine resample to 8000Hz.
mixer.volume = 0
engine.connect(mixer, to: mainMixer, format: audioFormat)

do {
    try engine.start()

    mixer.installTap(onBus: 0, bufferSize: 1024, format: audioFormat, block: {
        (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) -> Void in
        //some code here
    })

} catch let error {
    print(error.localizedDescription)
}
nullforlife
  • where you define the "mainMixer" ? – Osman Jun 20 '17 at 23:12
  • It was quite a while ago I wrote this code but I'm 95% sure that is the AVAudioEngines main mixer node. – nullforlife Jun 21 '17 at 08:13
  • If I use this code it gives me all zeros in the buffer. Do you know what I'm doing wrong? I'm using iPhone 7 which has a microphone input sample rate of 44100Hz. – Robert Veringa Jul 09 '17 at 18:43
  • @RobertVeringa weird, are you running on a real device or on simulator? Because it won't work on the simulator. You could also test the first code in my question and see if you can hear anything through the speakers. Also, you could try to change the AudioFormat to input.inputFormat(forBus: 0) to see if you get any data in the buffers. – nullforlife Jul 10 '17 at 10:50
  • Thanks for your response. I had to configure the audio engine's preferred sample rate to 16000Hz and downsample it to 8000Hz – Robert Veringa Jul 12 '17 at 09:20
  • @nullforlife Why does this not work on the simulator? Do you know how to get it work in real hardware and the simulator? – Simon Hessner Mar 19 '18 at 19:52
  • @SimonH the reason is probably because the simulator doesn't have it's own hardware. It would be using your computers microphone/speakers and I guess Apple don't want that. The code above should work on a real device. – nullforlife Mar 20 '18 at 13:40
  • I am getting `AVAEInternal.h:76 required condition is false: [AVAudioIONodeImpl.mm:1064:SetOutputFormat: (format.sampleRate == hwFormat.sampleRate)]` on this :( – user2161301 Jun 22 '20 at 00:49
  • tried to set mixer but not working, still crash TT^TT – famfamfam Apr 06 '21 at 15:11
  • can someone help my problem? Thanks:https://stackoverflow.com/questions/66971504/i-got-crash-when-record-required-condition-is-false-format-samplerate-hwf – famfamfam Apr 06 '21 at 15:22

The only thing I found that worked to change the sampling rate was

AVAudioSession.sharedInstance().setPreferredSampleRate(...)

Unfortunately, there is no guarantee that you will get the sample rate that you want, although it seems like 8000, 12000, 16000, 22050, 44100 all worked.
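
A minimal sketch of what that looks like (the 16000 value, the .playAndRecord category, and the prints are just illustrative choices; the session may still give you a different rate, so check sampleRate afterwards):

import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // Example values only: request recording + playback and a preferred rate of 16000Hz.
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setPreferredSampleRate(16000)
    try session.setActive(true)
} catch {
    print("audio session setup failed: \(error)")
}
// The preferred rate is only a request; this is the rate the hardware actually uses.
print("actual sample rate: \(session.sampleRate)")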

The following did NOT work:

  1. Setting my custom format in a tap off engine.inputNode. (Exception)
  2. Adding a mixer with my custom format and tapping that. (Exception)
  3. Adding a mixer, connecting it with the inputNode's format, connecting the mixer to the main mixer with my custom format, then removing the input of the outputNode so as not to send the audio to the speaker and get instant feedback. (Worked, but got all zeros)
  4. Not using my custom format at all in the AVAudioEngine, and using AVAudioConverter to convert from the hardware rate in my tap. [Length of the buffer was not set, no way to tell if results were correct]
prewett
  • @matt The answer seemed relevant all four times, and since I happened on all four questions in the process of solving my problem, I figured it would be helpful to save other people the bother. So what should I do instead? Pick one place to answer it and put a link in the others? – prewett Jul 19 '19 at 21:00
  • didnt work to me, can u help? https://stackoverflow.com/questions/66971504/i-got-crash-when-record-required-condition-is-false-format-samplerate-hwf – famfamfam Apr 06 '21 at 15:22