I want to process the audio from my device's built-in microphone (AVAudioInputNode) with an audio unit effect (AVAudioUnitEffect). For my example, I'm using AVAudioUnitReverb. Connecting AVAudioUnitReverb is causing the application to crash.
    import UIKit
    import AVFoundation

    class ViewController: UIViewController {

        let audioEngine = AVAudioEngine()
        let unitReverb = AVAudioUnitReverb()
        var inputNode: AVAudioInputNode!

        override func viewDidLoad() {
            super.viewDidLoad()

            inputNode = audioEngine.inputNode
            audioEngine.attachNode(unitReverb)

            let inputFormat = inputNode.inputFormatForBus(0)
            audioEngine.connect(inputNode, to: unitReverb, format: inputFormat)

            // This line is crashing the application!
            // With this error: "AVAudioNode.mm:521: AUSetFormat: error -10868"
            audioEngine.connect(unitReverb, to: audioEngine.outputNode, format: inputFormat)

            audioEngine.startAndReturnError(nil)
        }
    }
I have no issues if I bypass the reverb and connect inputNode directly to audioEngine.outputNode, but then I have no reverb:

    audioEngine.connect(inputNode, to: audioEngine.outputNode, format: inputFormat)
What am I doing wrong?
Update
I inadvertently discovered that the above code only crashes the application when my Apple EarPods with Remote and Mic are connected. When using the device's built-in microphone, I have no issues. So, why does the mic on my headphones crash the application, but only when using an AVAudioUnitEffect?
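To narrow this down, here is a minimal diagnostic sketch (using the same Swift 1.x-era AVFoundation API as the code above) that logs the input node's hardware format. Since error -10868 corresponds to kAudioUnitErr_FormatNotSupported, comparing the sample rate and channel count reported with the built-in mic against those reported with the EarPods attached may reveal a format mismatch; the variable names here are purely illustrative.

    import AVFoundation

    // Diagnostic sketch: run once with the built-in mic and once with the
    // EarPods plugged in, then compare the two printed formats.
    let engine = AVAudioEngine()
    let hwFormat = engine.inputNode.inputFormatForBus(0)
    println("Sample rate: \(hwFormat.sampleRate), channels: \(hwFormat.channelCount)")

If the two routes report different formats, that difference would explain why AUSetFormat rejects the format when the reverb is inserted into the chain.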