In short: how do I do time stretching and pitch shifting in iOS?
In Detail:
What am I trying to do?
The app shows video thumbnails so the user can select a certain range of frames from the video file.
I have to apply the ramp slow-mo only to the selected duration (frames) of the video.
Ramp slow-mo: time stretching + pitch shifting, applied dynamically. I have to vary the time rate and pitch in a for loop, say for some 10 iterations. The shape is nothing but the English letter "U": I will increase the time and pitch values and then decrease them back (see the sketch below).
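To make the ramp concrete, here is a minimal sketch of how I imagine generating the per-step values; the step count, the minimum rate, and the tape-style pitch formula are just assumptions for illustration:

```swift
import Foundation

// Hypothetical ramp generator: produces `count` (rate, pitch) pairs whose
// playback rate dips down and comes back up, like the letter "U".
// rate < 1.0 means slower playback; pitch is in cents (negative = lower).
func rampSteps(count: Int = 10, minRate: Float = 0.5) -> [(rate: Float, pitchCents: Float)] {
    (0..<count).map { i in
        // Map the step index onto 0...pi so sin() rises from 0 to 1 and back to 0.
        let phase = Float(i) / Float(count - 1) * Float.pi
        let depth = sin(phase)                      // 0 -> 1 -> 0
        let rate  = 1.0 - depth * (1.0 - minRate)   // 1.0 -> minRate -> 1.0
        let pitch = 1200.0 * log2(rate)             // tape-style pitch drop matching the rate (assumption)
        return (rate: rate, pitchCents: pitch)
    }
}
```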
How I am trying:
- Finding the start/end times for the user-selected range.
- Splitting the video and audio into separate tracks through an AVFoundation mutable composition.
- For video: applying time stretching only to the duration between the start/end times. *No problem* here; it works as I expect (see the sketch after this list).
- For audio: doing time stretching and pitch shifting dynamically, again only for the selected time range. I am using Dirac for this today; now I would like to use the iOS SDK itself.
- Merging the audio and video after the slow-mo into one file and storing it.
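For bullets 2 and 3, this is roughly what I am doing; a minimal sketch, assuming the asset URL, the selected start/end times, and a fixed slow-down factor are passed in:

```swift
import AVFoundation

// Hypothetical: split the source asset into separate video/audio
// composition tracks and stretch only the selected range of the video.
func buildComposition(assetURL: URL,
                      start: CMTime,
                      end: CMTime,
                      slowFactor: Double = 4.0) throws -> AVMutableComposition {
    let asset = AVURLAsset(url: assetURL)
    let composition = AVMutableComposition()

    let fullRange = CMTimeRange(start: .zero, duration: asset.duration)
    let selected  = CMTimeRange(start: start, end: end)

    // Video track: insert everything, then stretch only the selected range.
    if let srcVideo = asset.tracks(withMediaType: .video).first,
       let dstVideo = composition.addMutableTrack(withMediaType: .video,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) {
        try dstVideo.insertTimeRange(fullRange, of: srcVideo, at: .zero)
        let stretched = CMTimeMultiplyByFloat64(selected.duration, multiplier: slowFactor)
        dstVideo.scaleTimeRange(selected, toDuration: stretched)   // time stretching for video only
    }

    // Audio track: kept separate so it can be processed (time/pitch) on its own.
    if let srcAudio = asset.tracks(withMediaType: .audio).first,
       let dstAudio = composition.addMutableTrack(withMediaType: .audio,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid) {
        try dstAudio.insertTimeRange(fullRange, of: srcAudio, at: .zero)
    }

    return composition
}
```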
Analysis:
So far, I've found that OpenAL or Audio Units (using AUVarispeed alone, or AUTimePitch together with a time rate) can help. After splitting the video and audio, I will have the audio URL to proceed further, either with OpenAL or with Audio Units.
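On the Audio Units side, here is a minimal sketch of what I have in mind, using AVAudioEngine with AVAudioUnitTimePitch (AVFoundation's wrapper around the time/pitch unit); the audio URL and the idea of driving the ramp from a timer are assumptions:

```swift
import AVFoundation

// Hypothetical: play the extracted audio through a time/pitch node
// whose rate and pitch can be changed on the fly (the "ramp").
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let timePitch = AVAudioUnitTimePitch()   // rate in 1/32...32, pitch in cents

func prepare(audioURL: URL) throws {
    let file = try AVAudioFile(forReading: audioURL)

    engine.attach(player)
    engine.attach(timePitch)
    engine.connect(player, to: timePitch, format: file.processingFormat)
    engine.connect(timePitch, to: engine.mainMixerNode, format: file.processingFormat)

    player.scheduleFile(file, at: nil, completionHandler: nil)
    try engine.start()
    player.play()
}

// Called repeatedly (e.g. from a timer) to walk the "U"-shaped ramp.
func applyRampStep(rate: Float, pitchCents: Float) {
    timePitch.rate = rate          // 0.5 = half speed
    timePitch.pitch = pitchCents   // -1200 = one octave down
}
```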
My questions:
- Can anybody help me avoid Dirac (see bullet 4 above)? In other words, how do I do time stretching and pitch shifting dynamically?
- How do I apply it *only to the selected frames* (between the start and end times)?
- I am not sure how to connect *AVFoundation's* asset URL with OpenAL's or an Audio Unit's buffers, and, after the slow-mo effect, how to finally merge the result back with AVFoundation's video reference URL (my rough idea is sketched below).
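For that last point, this is what I imagine the final merge could look like, assuming the processed audio has already been rendered to its own file; all URLs and the export preset are placeholders:

```swift
import AVFoundation

// Hypothetical merge: put the stretched video track and the processed
// audio file back into one composition and export it.
func merge(videoURL: URL, processedAudioURL: URL, outputURL: URL,
           completion: @escaping (Error?) -> Void) {
    let videoAsset = AVURLAsset(url: videoURL)
    let audioAsset = AVURLAsset(url: processedAudioURL)
    let composition = AVMutableComposition()

    do {
        if let srcVideo = videoAsset.tracks(withMediaType: .video).first,
           let dstVideo = composition.addMutableTrack(withMediaType: .video,
                                                      preferredTrackID: kCMPersistentTrackID_Invalid) {
            try dstVideo.insertTimeRange(CMTimeRange(start: .zero, duration: videoAsset.duration),
                                         of: srcVideo, at: .zero)
        }
        if let srcAudio = audioAsset.tracks(withMediaType: .audio).first,
           let dstAudio = composition.addMutableTrack(withMediaType: .audio,
                                                      preferredTrackID: kCMPersistentTrackID_Invalid) {
            try dstAudio.insertTimeRange(CMTimeRange(start: .zero, duration: audioAsset.duration),
                                         of: srcAudio, at: .zero)
        }
    } catch {
        completion(error); return
    }

    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetHighestQuality) else {
        completion(nil); return
    }
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously {
        completion(export.error)
    }
}
```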
Any kind of reference or sample code would help a lot.