I am using AVAudioEngine to capture the user's voice while applying real-time effects (such as reverb) to it. Here's my code:
    import UIKit
    import AVFoundation

    class ViewController: UIViewController {

        var audioEngine: AVAudioEngine!
        var audioReverb: AVAudioUnitReverb!
        var audioInputNode: AVAudioInputNode!

        override func viewDidLoad() {
            super.viewDidLoad()
            let session = AVAudioSession.sharedInstance()
            session.setCategory(AVAudioSessionCategoryPlayAndRecord, error: nil)
            session.setActive(true, error: nil)
        }

        override func didReceiveMemoryWarning() {
            super.didReceiveMemoryWarning()
            // Dispose of any resources that can be recreated.
        }

        @IBAction func recordWithReverb(sender: AnyObject) {
            audioEngine = AVAudioEngine()
            audioEngine.stop()
            audioEngine.reset()

            audioInputNode = audioEngine.inputNode

            audioReverb = AVAudioUnitReverb()
            audioReverb.loadFactoryPreset(.LargeRoom)
            audioReverb.wetDryMix = 50
            audioEngine.attachNode(audioReverb)

            let inputFormat = audioReverb.inputFormatForBus(0)
            audioEngine.connect(audioInputNode, to: audioReverb, format: inputFormat)
            audioEngine.connect(audioReverb, to: audioEngine.outputNode, format: inputFormat)

            audioEngine.startAndReturnError(nil)
        }
    }
When I test it with headphones that have a built-in mic (like EarPods), everything works well. But when I use headphones without a mic (so the only input source is the iPhone's built-in mic), the right channel of the headphones always sounds quieter than the left channel. How can I fix it?
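For reference, one direction I'm considering (an untested sketch, using the same Swift 1.x APIs as the code above) is to take the connection format from the input node itself, since the built-in mic is mono, and route the reverb into `mainMixerNode` instead of straight into `outputNode`, on the assumption that the main mixer will upmix the mono signal to stereo:

```swift
// Untested sketch: use the input node's own (mono) format for the
// connections, and let mainMixerNode handle mono-to-stereo upmixing.
let micFormat = audioInputNode.inputFormatForBus(0)
audioEngine.connect(audioInputNode, to: audioReverb, format: micFormat)
// mainMixerNode is implicitly connected to outputNode by AVAudioEngine.
audioEngine.connect(audioReverb, to: audioEngine.mainMixerNode, format: micFormat)
```

Would that be the right approach, or is there a better way to balance the channels?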
And there's another question I'd like to ask. When I move these declarations
    var audioEngine: AVAudioEngine!
    var audioReverb: AVAudioUnitReverb!
    var audioInputNode: AVAudioInputNode!
inside the function `recordWithReverb`, the code no longer works: I get no output at all. Why does this happen?
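To be concrete, this is a sketch of the variant that produces no output (same setup as above, just with the nodes declared locally):

```swift
@IBAction func recordWithReverb(sender: AnyObject) {
    // Declared locally instead of as properties on the view controller:
    var audioEngine = AVAudioEngine()
    var audioReverb = AVAudioUnitReverb()
    var audioInputNode = audioEngine.inputNode

    audioReverb.loadFactoryPreset(.LargeRoom)
    audioReverb.wetDryMix = 50
    audioEngine.attachNode(audioReverb)

    let inputFormat = audioReverb.inputFormatForBus(0)
    audioEngine.connect(audioInputNode, to: audioReverb, format: inputFormat)
    audioEngine.connect(audioReverb, to: audioEngine.outputNode, format: inputFormat)
    audioEngine.startAndReturnError(nil)

    // Once this method returns, nothing retains audioEngine anymore,
    // so ARC can deallocate it -- is that why there is no output?
}
```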