Background - I watched the "AVAudioEngine in Practice" session from Apple's recent WWDC videos (https://developer.apple.com/videos/wwdc/2014/) to learn how to apply sound effects to audio.
After that, I was able to change the pitch of an audio file with the following code:
// audioEngine is a property; it is initialized in viewDidLoad()
audioEngine = AVAudioEngine()

// This action is called when a button is tapped
// (myAudioFile is an AVAudioFile property loaded elsewhere)
@IBAction func chipmunkPlayback(sender: UIButton) {
    let pitchPlayer = AVAudioPlayerNode()
    let timePitch = AVAudioUnitTimePitch()
    timePitch.pitch = 1000 // shift the pitch up by 1000 cents

    // Attach both nodes to the engine, then wire player -> pitch effect -> output
    audioEngine.attachNode(pitchPlayer)
    audioEngine.attachNode(timePitch)
    audioEngine.connect(pitchPlayer, to: timePitch, format: myAudioFile.processingFormat)
    audioEngine.connect(timePitch, to: audioEngine.outputNode, format: myAudioFile.processingFormat)

    // Schedule the file, start the engine, and play
    pitchPlayer.scheduleFile(myAudioFile, atTime: nil, completionHandler: nil)
    var er: NSError?
    audioEngine.startAndReturnError(&er)
    pitchPlayer.play()
}
From what I understand, I used the AVAudioEngine to attach the AVAudioPlayerNode to the effect node (AVAudioUnitTimePitch), and then connected that effect to the output node.
I am now curious about applying multiple sound effects to the same audio, for instance a pitch change AND reverb. How would I go about chaining multiple effect nodes together? My untested guess is the sketch below.
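This is only a guess based on my single-effect code above; I'm assuming an AVAudioUnitReverb (which I haven't actually used yet) can be attached alongside the pitch effect and wired in a chain, player -> pitch -> reverb -> output:

    let reverb = AVAudioUnitReverb()
    reverb.loadFactoryPreset(.LargeHall) // preset chosen just for illustration
    reverb.wetDryMix = 50

    audioEngine.attachNode(pitchPlayer)
    audioEngine.attachNode(timePitch)
    audioEngine.attachNode(reverb)

    // Chain: player -> pitch -> reverb -> output
    audioEngine.connect(pitchPlayer, to: timePitch, format: myAudioFile.processingFormat)
    audioEngine.connect(timePitch, to: reverb, format: myAudioFile.processingFormat)
    audioEngine.connect(reverb, to: audioEngine.outputNode, format: myAudioFile.processingFormat)

Is connecting effect nodes in series like this the right approach?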
Also, would it make sense to attach and connect the nodes in viewDidLoad rather than how I have done it here in an IBAction? Roughly like the sketch below.
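To clarify what I mean, here is a rough, untested sketch where pitchPlayer, timePitch, and myAudioFile are assumed to be properties, the attach/connect setup happens once in viewDidLoad, and the IBAction only schedules and plays:

    override func viewDidLoad() {
        super.viewDidLoad()
        audioEngine = AVAudioEngine()

        // Attach and wire the nodes once, up front
        audioEngine.attachNode(pitchPlayer)
        audioEngine.attachNode(timePitch)
        audioEngine.connect(pitchPlayer, to: timePitch, format: myAudioFile.processingFormat)
        audioEngine.connect(timePitch, to: audioEngine.outputNode, format: myAudioFile.processingFormat)
    }

    @IBAction func chipmunkPlayback(sender: UIButton) {
        // Only schedule the file and play on each button tap
        pitchPlayer.scheduleFile(myAudioFile, atTime: nil, completionHandler: nil)
        var er: NSError?
        audioEngine.startAndReturnError(&er)
        pitchPlayer.play()
    }

Is there any downside to setting up the node graph once like this instead of rebuilding it on every tap?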