I am new to iOS programming and I want to port an Android app to iOS using Swift 3. The app's core functionality is to read the byte stream from the microphone and to process this stream live, so it is not sufficient to store the audio to a file and process it after recording has stopped.
I already found the AVAudioRecorder class, which works, but I don't know how to process the data stream live (filtering it, sending it to a server, etc.). The init of AVAudioRecorder looks like this (it is a throwing initializer):

try AVAudioRecorder(url: filename, settings: settings)
What I would need is a class where I can register an event handler (or something like that) which is called every time x bytes have been read, so I can process them.
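To illustrate, the kind of pattern I'm after would look roughly like the sketch below. From what I can tell, AVAudioEngine's installTap(onBus:bufferSize:format:block:) seems close to this, but I'm not sure whether it is the intended approach (the bufferSize of 4096 here is just an arbitrary example value):

```swift
import AVFoundation

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// The block is invoked repeatedly with chunks of captured audio,
// which is the live-processing behavior I'm looking for.
input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, time in
    // buffer.floatChannelData exposes the raw samples for
    // filtering, sending to a server, etc.
}

engine.prepare()
try engine.start()
```

(This is only a sketch; it would also need a configured AVAudioSession with record permission before starting the engine.)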
Is this possible with AVAudioRecorder? If not, is there another class in the iOS SDK that allows me to process audio streams live? On Android I use android.media.AudioRecord, so it would be great if there were an equivalent class on iOS.
Regards