Which framework should I use to play an audio file (WAV, MP3, AIFF) in iOS with low latency?

iOS has various audio frameworks, from higher-level ones that let you simply play a specified file, to lower-level ones that let you get at the raw PCM data, and everything in between. For our app, we just need to play external files (WAV, AIFF, MP3), but we need to do so in response to pressing a button, and we need that latency to be as small as possible. (It's for cueing in live productions.)

Now AVAudioPlayer and the like can play simple file assets (via their URLs), but the latency before the sound actually starts is too great. With larger files of more than five minutes in length, the delay before the sound starts can be over a second long, which renders it all but useless for timing in a live performance.

Now I know things like OpenAL can be used for very-low-latency playback, but then you're waist-deep in audio buffers, audio sources, listeners, etc.

That said, does anyone know of any frameworks that work at a higher level (i.e. play 'MyBeddingTrack.mp3') with very low latency? Pre-buffering is fine; it's just the trigger that has to be fast.

Bonus if we can do things like set the start and end points of playback within the file, change the volume, or even perform ducking.

Featly answered 25/1, 2013 at 0:0 Comment(0)

The following SO question contains working code that plays a file using Audio Units, and specifically the AudioFilePlayer. Even though the question states that it is not working, it worked out of the box for me - just add an AUGraphStart(_graph) call at the end.

The ScheduledFilePrime property of the AudioFilePlayer determines how much of the file is loaded before playback starts. You may want to play around with that; a rough sketch of the whole setup follows.
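
Here is a minimal sketch of that approach: an AUGraph with an AudioFilePlayer generator feeding RemoteIO, with the scheduling properties set before AUGraphStart. It assumes a local fileURL and omits all OSStatus checking, so treat it as a starting point rather than drop-in code.

```objc
#import <AudioToolbox/AudioToolbox.h>

// Sketch only: 'fileURL' is assumed to point at a local WAV/MP3/AIFF.
AUGraph graph;
AUNode playerNode, outputNode;
NewAUGraph(&graph);

AudioComponentDescription playerDesc = { kAudioUnitType_Generator,
    kAudioUnitSubType_AudioFilePlayer, kAudioUnitManufacturer_Apple, 0, 0 };
AudioComponentDescription outputDesc = { kAudioUnitType_Output,
    kAudioUnitSubType_RemoteIO, kAudioUnitManufacturer_Apple, 0, 0 };
AUGraphAddNode(graph, &playerDesc, &playerNode);
AUGraphAddNode(graph, &outputDesc, &outputNode);
AUGraphConnectNodeInput(graph, playerNode, 0, outputNode, 0);
AUGraphOpen(graph);
AUGraphInitialize(graph);

AudioUnit playerUnit;
AUGraphNodeInfo(graph, playerNode, NULL, &playerUnit);

// Tell the player which file to play and which region of it to schedule.
AudioFileID audioFile;
AudioFileOpenURL((__bridge CFURLRef)fileURL, kAudioFileReadPermission, 0, &audioFile);
AudioUnitSetProperty(playerUnit, kAudioUnitProperty_ScheduledFileIDs,
                     kAudioUnitScope_Global, 0, &audioFile, sizeof(audioFile));

ScheduledAudioFileRegion region = {0};
region.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
region.mAudioFile = audioFile;
region.mFramesToPlay = (UInt32)-1;   // whole file; use mStartFrame/mFramesToPlay for in/out points
AudioUnitSetProperty(playerUnit, kAudioUnitProperty_ScheduledFileRegion,
                     kAudioUnitScope_Global, 0, &region, sizeof(region));

// 0 = default priming; raise this to pre-buffer more of the file before playback.
UInt32 primeFrames = 0;
AudioUnitSetProperty(playerUnit, kAudioUnitProperty_ScheduledFilePrime,
                     kAudioUnitScope_Global, 0, &primeFrames, sizeof(primeFrames));

// A sample time of -1 means "start as soon as possible" once the graph is running.
AudioTimeStamp startTime = {0};
startTime.mFlags = kAudioTimeStampSampleTimeValid;
startTime.mSampleTime = -1;
AudioUnitSetProperty(playerUnit, kAudioUnitProperty_ScheduleStartTimeStamp,
                     kAudioUnitScope_Global, 0, &startTime, sizeof(startTime));

AUGraphStart(graph);
```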

But as the others note, Audio Units have a steep learning curve.

Bhagavadgita answered 5/7, 2013 at 13:6 Comment(0)

Although the Audio Queue framework is relatively easy to use, it packs a lot of DSP heavy lifting behind the scenes (e.g. if you supply it with VBR/compressed audio, it automatically converts it to PCM before playing it through the speaker, and it handles a lot of the threading issues opaquely for the end user), which is good news for someone writing a lightweight, non-real-time application.

You mentioned that you need it for cueing in live productions. I'm not sure whether that means your app is real-time, because if it is, Audio Queues will struggle to meet your needs. A good article to read about this is Ross Bencina's. The takeaway is that you can't afford to let third-party frameworks or libraries do anything potentially expensive behind the scenes, such as thread locking, mallocing, or deallocing; that's simply too expensive and risky for real-time audio apps.
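
To make that concrete, here's roughly what a render callback looks like when you follow that rule (a sketch, not from Bencina's article; the PlayerState struct and its fields are made up for illustration). The audio is decoded to PCM ahead of time, and the callback does nothing but copy from a preallocated buffer:

```objc
#import <AudioToolbox/AudioToolbox.h>
#include <string.h>

// Hypothetical state struct, allocated and filled on the main thread before playback starts.
typedef struct {
    SInt16 *samples;      // interleaved 16-bit stereo PCM, decoded ahead of time
    UInt32  totalFrames;
    UInt32  readIndex;
} PlayerState;

static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    // Real-time rule: no locks, no malloc/free, no Objective-C messaging, no file I/O here.
    PlayerState *state = (PlayerState *)inRefCon;
    SInt16 *out = (SInt16 *)ioData->mBuffers[0].mData;

    UInt32 framesLeft   = state->totalFrames - state->readIndex;
    UInt32 framesToCopy = (inNumberFrames < framesLeft) ? inNumberFrames : framesLeft;

    // Just copy pre-decoded PCM into the hardware buffer.
    memcpy(out, state->samples + state->readIndex * 2, framesToCopy * 2 * sizeof(SInt16));
    state->readIndex += framesToCopy;

    // Zero-fill any remainder so we never play garbage at the end of the file.
    if (framesToCopy < inNumberFrames) {
        memset(out + framesToCopy * 2, 0,
               (inNumberFrames - framesToCopy) * 2 * sizeof(SInt16));
    }
    return noErr;
}
```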

That's where the Audio Unit framework comes in. Audio Queues are actually built on top of the Audio Unit framework (which automates a lot of its work), but Audio Units bring you as close to the metal as it gets on iOS. They're as responsive as you want them to be and can easily handle a real-time app. Audio Units have a huge learning curve, though. There are some open-source wrappers that simplify them (see Novocaine).

If I were you, I'd at least skim through Learning Core Audio; it's the go-to book for any iOS Core Audio developer. It covers Audio Queues, Audio Units, etc. in detail and has excellent code examples.

From my own experience: I worked on a real-time audio app with some intensive audio requirements. I found the Audio Queue framework and thought it was too good to be true. My app worked when I prototyped it with light restrictions, but it simply choked under stress testing. That's when I had to dive deep into Audio Units and change the architecture, etc. (it wasn't pretty). My advice: work with Audio Queues at least as an introduction to Audio Units. Stick with them if they meet your needs, but don't be afraid to move to Audio Units if it becomes clear that Audio Queues no longer meet your app's demands.

Trinitrophenol answered 3/2, 2013 at 21:8 Comment(0)

The lowest latency you can get is with Audio Units, specifically the RemoteIO unit.

Remote I/O Unit

The Remote I/O unit (subtype kAudioUnitSubType_RemoteIO) connects to device hardware for input, output, or simultaneous input and output. Use it for playback, recording, or low-latency simultaneous input and output where echo cancelation is not needed.

Take a look at these tutorials:

http://atastypixel.com/blog/using-remoteio-audio-unit/

http://atastypixel.com/blog/playing-audio-in-time-using-remote-io/
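For a rough idea of what setting up RemoteIO looks like, here is a sketch assuming 16-bit interleaved stereo output; error checking is omitted, and RenderCallback / playerState stand in for whatever callback and state feed your samples:

```objc
#import <AudioToolbox/AudioToolbox.h>

AudioComponentDescription desc = { kAudioUnitType_Output,
    kAudioUnitSubType_RemoteIO, kAudioUnitManufacturer_Apple, 0, 0 };
AudioComponent comp = AudioComponentFindNext(NULL, &desc);

AudioUnit ioUnit;
AudioComponentInstanceNew(comp, &ioUnit);

// Describe the PCM we will hand to the output bus (bus 0).
AudioStreamBasicDescription fmt = {0};
fmt.mSampleRate       = 44100.0;
fmt.mFormatID         = kAudioFormatLinearPCM;
fmt.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
fmt.mChannelsPerFrame = 2;
fmt.mBitsPerChannel   = 16;
fmt.mFramesPerPacket  = 1;
fmt.mBytesPerFrame    = 4;
fmt.mBytesPerPacket   = 4;
AudioUnitSetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Input, 0, &fmt, sizeof(fmt));

// Install the render callback that supplies samples on demand.
AURenderCallbackStruct cb = { RenderCallback, &playerState };  // your own callback and state
AudioUnitSetProperty(ioUnit, kAudioUnitProperty_SetRenderCallback,
                     kAudioUnitScope_Input, 0, &cb, sizeof(cb));

AudioUnitInitialize(ioUnit);
AudioOutputUnitStart(ioUnit);   // playback begins; latency is essentially one hardware buffer
```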

Accusation answered 29/1, 2013 at 3:24 Comment(1)
RemoteIO seems a bit daunting at first sight, but apparently it is the best choice for low latency. It also allows for setting starting/ending points or similar functions.Boloney

You need the system sound framework. The system sound framework is made for things like user interface sounds or quick, responsive sounds. Take a look here.
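
For completeness, this is roughly all it takes (the file name is made up; System Sound Services is fire-and-forget, limited to short sounds, and offers no volume or position control):

```objc
#import <AudioToolbox/AudioToolbox.h>

NSURL *url = [[NSBundle mainBundle] URLForResource:@"click" withExtension:@"wav"]; // hypothetical asset
SystemSoundID soundID;
AudioServicesCreateSystemSoundID((__bridge CFURLRef)url, &soundID);
AudioServicesPlaySystemSound(soundID);
```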

Felicity answered 25/1, 2013 at 0:3 Comment(1)
No, that's for short sounds (and vibrations). As mentioned in my question, the files can be very long, so the system sound framework isn't what should be used. Still good to know about, so thanks! :)Featly

AVAudioPlayer has a prepareToPlay method to preload its audio buffers. This might speed up response time significantly.
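
Something along these lines, assuming one pre-created player per cue (the file name is just an example):

```objc
#import <AVFoundation/AVFoundation.h>

// At load time: create and prime a player for each cue.
NSURL *url = [[NSBundle mainBundle] URLForResource:@"MyBeddingTrack" withExtension:@"mp3"];
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL];
[player prepareToPlay];   // preloads buffers so -play responds faster

// Later, when the button is pressed:
[player play];

// Note: stopping the player (or letting it finish) undoes the priming,
// so call prepareToPlay again before the next trigger.
```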

Inflated answered 25/1, 2013 at 0:8 Comment(1)
That may work for a single sound, but we actually have several sounds we'd like to trigger. Plus, the documentation states that calling 'Stop' or letting it finish playing undoes this, which means we'd have to constantly be setting it up. Still, this is good for one-off things.Featly

I ran into the same problem as you, but after a bit of research I found a great framework. I am currently using kstenerud's ObjectAL sound framework. It is based on OpenAL and well documented. You are able to play background music and sound effects with multiple layers.

Here is the project on GitHub: https://github.com/kstenerud/ObjectAL-for-iPhone and here is the website: http://kstenerud.github.com/ObjectAL-for-iPhone/index.html
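
If I remember the API correctly, the simple wrapper looks roughly like this (check the project docs for the exact method names; the file names here are placeholders):

```objc
#import "OALSimpleAudio.h"

OALSimpleAudio *audio = [OALSimpleAudio sharedInstance];

// Preload the effect so the trigger itself is fast.
[audio preloadEffect:@"Stinger.wav"];

// Background bed and a triggered effect on separate layers.
[audio playBg:@"MyBeddingTrack.mp3" loop:YES];
[audio playEffect:@"Stinger.wav"];
```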

Ashlaring answered 25/1, 2013 at 1:13 Comment(2)
Ok... from the main page, this looks like it may be damn cool! Thanks! Digging in now. The Audio Queue question was about to get the answer, but now I'm holding off because of this. Will let you know what I find.Featly
I only use it to play background music and sound effects, but it can do some complex sounds.Ashlaring

I would use the Audio Queue framework. https://developer.apple.com/library/mac/ipad/#documentation/MusicAudio/Conceptual/AudioQueueProgrammingGuide/Introduction/Introduction.html
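
For a sense of its shape, here is a condensed playback skeleton (a sketch only: it assumes a constant-bitrate file such as a WAV so packet descriptions can be ignored, assumes a fileURL variable, and skips all error handling):

```objc
#import <AudioToolbox/AudioToolbox.h>

typedef struct {
    AudioFileID                 file;
    AudioStreamBasicDescription format;
    SInt64                      packetIndex;
    UInt32                      bytesPerBuffer;
    UInt32                      packetsPerBuffer;
} QueuePlayer;

// Called by the queue whenever a buffer has finished playing and needs refilling.
static void HandleOutputBuffer(void *inUserData, AudioQueueRef inAQ, AudioQueueBufferRef inBuffer)
{
    QueuePlayer *p = (QueuePlayer *)inUserData;
    UInt32 numBytes   = p->bytesPerBuffer;
    UInt32 numPackets = p->packetsPerBuffer;
    OSStatus err = AudioFileReadPacketData(p->file, false, &numBytes, NULL,
                                           p->packetIndex, &numPackets, inBuffer->mAudioData);
    if (err == noErr && numPackets > 0) {
        inBuffer->mAudioDataByteSize = numBytes;
        AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
        p->packetIndex += numPackets;
    } else {
        AudioQueueStop(inAQ, false);   // out of data: let what's queued drain, then stop
    }
}

// Setup (run once; 'fileURL' is assumed to point at the file you want to cue):
QueuePlayer player = {0};
AudioFileOpenURL((__bridge CFURLRef)fileURL, kAudioFileReadPermission, 0, &player.file);

UInt32 size = sizeof(player.format);
AudioFileGetProperty(player.file, kAudioFilePropertyDataFormat, &size, &player.format);

AudioQueueRef queue;
AudioQueueNewOutput(&player.format, HandleOutputBuffer, &player, NULL, NULL, 0, &queue);

player.bytesPerBuffer   = 32 * 1024;
player.packetsPerBuffer = player.bytesPerBuffer / player.format.mBytesPerPacket; // CBR assumption

// Prime a few buffers up front so AudioQueueStart has data ready to go.
for (int i = 0; i < 3; i++) {
    AudioQueueBufferRef buffer;
    AudioQueueAllocateBuffer(queue, player.bytesPerBuffer, &buffer);
    HandleOutputBuffer(&player, queue, buffer);
}
AudioQueueStart(queue, NULL);
```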

Glide answered 25/1, 2013 at 5:3 Comment(4)
Didn't know that was on iOS! Digging through their SpeakHere example now. Not a fan of them doing full-app samples like that, because you're now digging through a bunch of crap unrelated to the technology (i.e. they have view meters that they render in OpenGL ES. Really?! Do you need that in an audio sample?! That should be in a more advanced sample!) Plus, it's not ARC-enabled, which means I have to deal with a lot of compatibility, etc. Still, this does look to be the most promising so far. May try to wrap this in my own static mini-library so I can just point it at files and go.Featly
(who really should consider changing their username! LOL!)... do you know of a limit to how many Audio Queues your app can have? We can have up to 100 buttons on the screen and all have to be ready at a moment's notice, but I think I read there's a limit of 30 queues. However, I can't find where I read that so I can't confirm.Featly
Not sure on this... As I only got to know Audio Queue a little while doing research for my last project. And I ended up using Audio Units instead. Perhaps someone else can address this.Glide
I'm not sure about the limit on how many audio queues your app can have, but then again there are a lot of things that are opaque in the audio queue world, and you can easily end up in a situation where something simply doesn't work and you can't find out why. See my answer for more details.Trinitrophenol