Hi. I want to implement a real-time audio application with the new AVAudioEngine in Swift. Does anyone have experience with the new framework? How do real-time applications work?
My first idea was to store the (processed) input data in an AVAudioPCMBuffer object and then let an AVAudioPlayerNode play it back, as you can see in my demo class:
import AVFoundation

class AudioIO {
    let audioEngine = AVAudioEngine()
    let audioPlayerNode = AVAudioPlayerNode()
    let audioMixerNode: AVAudioMixerNode
    let audioInputNode: AVAudioInputNode
    let audioBuffer: AVAudioPCMBuffer

    init() throws {
        audioMixerNode = audioEngine.mainMixerNode
        audioInputNode = audioEngine.inputNode

        // Small buffer that the player node loops while the input tap overwrites it.
        let frameLength: AVAudioFrameCount = 256
        guard let buffer = AVAudioPCMBuffer(pcmFormat: audioPlayerNode.outputFormat(forBus: 0),
                                            frameCapacity: frameLength) else {
            fatalError("Could not allocate PCM buffer")
        }
        buffer.frameLength = frameLength
        audioBuffer = buffer

        // Tap the input node and copy the first channel into the looping buffer.
        audioInputNode.installTap(onBus: 0, bufferSize: frameLength,
                                  format: audioInputNode.outputFormat(forBus: 0)) { [weak self] buffer, _ in
            guard let self = self,
                  let input = buffer.floatChannelData?[0],
                  let output = self.audioBuffer.floatChannelData?[0] else { return }
            let count = min(Int(buffer.frameLength), Int(self.audioBuffer.frameLength))
            for i in 0..<count {
                // doing my real time stuff
                output[i] = input[i]
            }
        }

        // Set up the audio engine.
        audioEngine.attach(audioPlayerNode)
        audioEngine.connect(audioPlayerNode, to: audioMixerNode,
                            format: audioPlayerNode.outputFormat(forBus: 0))
        try audioEngine.start()

        // Play the player and loop the buffer.
        audioPlayerNode.play()
        audioPlayerNode.scheduleBuffer(audioBuffer, at: nil, options: .loops, completionHandler: nil)
    }
}
But this is far from real time and not very efficient. Any ideas or experiences? And it does not matter whether you prefer Objective-C or Swift; I am grateful for all notes, remarks, comments, solutions, etc.
The same realtime rules apply as for IOProcs. For example: no memory allocation, no locks, no Objective-C method calls, etc. See rossbencina.com/code/… I imagine that internally AVAudioEngine uses only C inside the realtime methods, and I also bet that the taps have the same restrictions as IOProcs. – Coactive
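To illustrate the restrictions Coactive mentions, here is a minimal sketch of a tap block written defensively: everything the callback touches is allocated up front, and the block itself only does pointer reads and writes. The 256-frame size and the gain value are illustrative assumptions, not values from the question:

import AVFoundation

final class RealtimeSafeTap {
    private let engine = AVAudioEngine()
    // Scratch storage is allocated once, up front; the tap block never allocates.
    private let scratch = UnsafeMutablePointer<Float>.allocate(capacity: 256)

    func start() throws {
        let input = engine.inputNode
        let scratch = self.scratch   // capture the raw pointer, not self
        let gain: Float = 0.5        // illustrative parameter
        input.installTap(onBus: 0, bufferSize: 256,
                         format: input.outputFormat(forBus: 0)) { buffer, _ in
            // Pointer arithmetic only: no allocation, no locks in this block.
            guard let samples = buffer.floatChannelData?[0] else { return }
            let n = min(Int(buffer.frameLength), 256)
            for i in 0..<n {
                scratch[i] = samples[i] * gain
            }
        }
        try engine.start()
    }

    deinit { scratch.deallocate() }
}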
Try connecting AVAudioEngine.inputNode directly to AVAudioEngine.outputNode. – Vaughn
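For reference, a minimal sketch of what Vaughn suggests: wiring the input node straight to the output node, with no intermediate buffer or player node. This assumes microphone permission is already granted, and on a device with a built-in mic and speaker it will feed back:

import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
// Route the microphone input directly to the hardware output.
engine.connect(input, to: engine.outputNode, format: input.outputFormat(forBus: 0))
try engine.start()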