Using AudioKit, I'm trying to build an app that analyzes the microphone input and separates the incoming sound into three frequency ranges (low, mid, high), each with its amplitude.
This is the code I have:
import UIKit
import AudioKit

class ViewController: UIViewController {

    var mic: AKMicrophone!
    var amplitude: AKAmplitudeTracker!
    var fftTap: AKFFTTap?
    var timer: Timer!

    override func viewDidLoad() {
        super.viewDidLoad()
        mic = AKMicrophone()
        // Install an FFT tap on the microphone node.
        fftTap = AKFFTTap(mic)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        do {
            try AudioKit.start()
        } catch {
            AKLog("AudioKit did not start!")
        }
        mic.start()
        // Poll the FFT data every 10 ms and print the first 256 bins.
        timer = Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true, block: { _ in
            for i in 0..<256 {
                print(self.fftTap?.fftData[i] ?? 0.0)
            }
        })
    }
}
But I have no idea what this output actually means.
How do I get the maximum amplitude for a given frequency range? I need all three ranges at the same time, so I don't think a plain frequency tracker (AKFrequencyTracker) will do.
From reading documentation about the FFT, I understand that each of the first 256 bins represents the amplitude at a certain frequency. But I've only found MATLAB plotting examples that convert those values into plots, which don't really make sense to me.
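My current understanding of the bin-to-frequency math, sketched out below. This assumes a sample rate of 44.1 kHz and an FFT size of 1024 (so 512 usable bins) — I believe these are AKFFTTap's defaults, but that would need to be checked against AKSettings. The band cutoffs (20/250/4000/20000 Hz) are just illustrative choices, not anything canonical:

```swift
import Foundation

// Assumed values — verify against AKSettings / AKFFTTap in your project.
let sampleRate = 44_100.0
let fftSize = 1_024.0
let binCount = 512                       // usable bins = fftSize / 2
let binWidth = sampleRate / fftSize      // ≈ 43.07 Hz per bin

// Map a frequency band (in Hz) to a range of FFT bin indices,
// clamped to the valid bin range.
func binRange(lowHz: Double, highHz: Double) -> Range<Int> {
    let hi = min(binCount, Int(highHz / binWidth))
    let lo = min(max(0, Int(lowHz / binWidth)), hi)
    return lo..<hi
}

// Peak amplitude within a frequency band of the FFT data.
func maxAmplitude(in fftData: [Double], lowHz: Double, highHz: Double) -> Double {
    let bins = binRange(lowHz: lowHz, highHz: highHz)
    return fftData[bins].max() ?? 0.0
}

// Usage inside the timer block, with illustrative cutoffs:
// let low  = maxAmplitude(in: fftTap.fftData, lowHz: 20,   highHz: 250)
// let mid  = maxAmplitude(in: fftTap.fftData, lowHz: 250,  highHz: 4_000)
// let high = maxAmplitude(in: fftTap.fftData, lowHz: 4_000, highHz: 20_000)
```

Is this the right way to interpret the bins, or am I missing a scaling step?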