iOS: Audio Units vs OpenAL vs Core Audio

Could someone explain to me how OpenAL fits in with the schema of sound on the iPhone?

There seem to be APIs at different levels for handling sound. The higher level ones are easy enough to understand.

But my understanding gets murky towards the bottom. There are Core Audio, Audio Units, and OpenAL.

What is the connection between these? Is OpenAL the substratum, upon which rests Core Audio (which contains as one of its lower-level objects Audio Units)?

OpenAL doesn't seem to be documented in Xcode, yet I can run code that uses its functions.

Darryl answered 25/10, 2010 at 12:41 Comment(1)
Great overview here. (Branham)

This is what I have figured out:

The substratum is Core Audio. Specifically, Audio Units.

So Audio Units form the base layer, and higher-level frameworks have been built on top of them. The whole caboodle is termed Core Audio.

OpenAL is a multiplatform API -- the creators are trying to mirror the portability of OpenGL. A few companies are sponsoring OpenAL, including Creative Labs and Apple!

So Apple has provided this API, basically as a thin wrapper over Core Audio. I am guessing this is to allow developers to port over code easily. Be warned: it is an incomplete implementation. If you want OpenAL to do something that Core Audio can do, it will do it; otherwise it won't.

Kind of counterintuitive -- just looking at the source, it looks as if OpenAL is lower level. Not so!
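
To make the layering concrete, here is a minimal, untested sketch of one-shot OpenAL playback on iOS. The calls are the standard OpenAL 1.1 API; the PCM data in samples is assumed to come from you (loaded from a file or synthesized), and cleanup of the source, buffer, context, and device is omitted for brevity:

    // One-shot OpenAL playback sketch (16-bit mono PCM supplied by the caller).
    #include <OpenAL/al.h>
    #include <OpenAL/alc.h>

    void playPCM(const short *samples, int byteCount, int sampleRate) {
        // Open the default output device and make a context current.
        ALCdevice  *device  = alcOpenDevice(NULL);
        ALCcontext *context = alcCreateContext(device, NULL);
        alcMakeContextCurrent(context);

        // A buffer holds the PCM data; a source plays buffers.
        ALuint buffer, source;
        alGenBuffers(1, &buffer);
        alBufferData(buffer, AL_FORMAT_MONO16, samples, byteCount, sampleRate);

        alGenSources(1, &source);
        alSourcei(source, AL_BUFFER, buffer);
        alSourcePlay(source);   // returns immediately; playback is asynchronous
    }

Underneath, every one of those calls is being serviced by Core Audio (specifically, by the 3D Mixer audio unit mentioned in the comments below).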

Darryl answered 26/10, 2010 at 17:29 Comment(3)
+1, nice summary. I'd like to add that in my experience OpenAL is a bit flaky on iOS and best avoided if possible. Core Audio isn't easy, but that's because audio processing in general isn't easy; at least it works very reliably. (Shcherbakov)
I've had good results with OpenAL, and Cocos2D uses it as well. I was unaware that it wasn't the lowest-level API though, and must say I'm a bit miffed! (Tether)
One more potentially useful thing to know: "The 3D Mixer [audio] unit is the foundation upon which OpenAL is built" - from the Apple Docs: developer.apple.com/library/ios/#DOCUMENTATION/MusicAudio/… (Tether)

Core Audio covers a lot of things, such as reading and writing various file formats, converting between encodings, pulling frames out of streams, etc. Much of this functionality is collected as the "Audio Toolbox". Core Audio also offers multiple APIs for processing streams of audio, for playback, capture, or both.

The lowest-level one is Audio Units, which works with uncompressed (PCM) audio and has some nice stuff for applying effects, mixing, etc. Audio Queues, implemented atop Audio Units, are a lot easier because they work with compressed formats (not just PCM) and save you from some threading challenges.

OpenAL is also implemented atop Audio Units; you still have to use PCM, but at least the threading isn't scary. The difference is that, since it's not from Apple, its programming conventions are totally different from Core Audio and the rest of iOS. Most obviously, it's a push API: if you want to stream with OpenAL, you poll your sources to see if they've exhausted their buffers and push in new ones. By contrast, Audio Queues and Audio Units are pull-based: you get a callback when new samples are needed for playback.
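
To make the pull model concrete, here is a hedged sketch against the Audio Queue C API. The calls are real, but the sketch is untested, error handling is omitted, and fillPCM is a hypothetical helper you would write to produce the next chunk of samples:

    // Pull-model playback with Audio Queues: the system asks *you* for samples.
    #include <AudioToolbox/AudioToolbox.h>

    // Hypothetical helper: writes up to `capacity` bytes of PCM into `dest`
    // and returns the number of bytes written.
    extern UInt32 fillPCM(void *dest, UInt32 capacity);

    static void outputCallback(void *userData, AudioQueueRef queue,
                               AudioQueueBufferRef buffer) {
        // Invoked whenever the queue needs more audio to keep playing.
        buffer->mAudioDataByteSize = fillPCM(buffer->mAudioData,
                                             buffer->mAudioDataBytesCapacity);
        AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
    }

    void startPlayback(void) {
        // 16-bit interleaved stereo PCM at 44.1 kHz. Audio Queues also accept
        // compressed formats, unlike Audio Units.
        AudioStreamBasicDescription fmt = {
            .mSampleRate       = 44100.0,
            .mFormatID         = kAudioFormatLinearPCM,
            .mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger
                               | kLinearPCMFormatFlagIsPacked,
            .mChannelsPerFrame = 2,
            .mBitsPerChannel   = 16,
            .mBytesPerFrame    = 4,
            .mFramesPerPacket  = 1,
            .mBytesPerPacket   = 4,
        };

        AudioQueueRef queue;
        AudioQueueNewOutput(&fmt, outputCallback, NULL, NULL, NULL, 0, &queue);

        // Prime a few buffers so the queue has audio on hand, then start.
        for (int i = 0; i < 3; i++) {
            AudioQueueBufferRef buf;
            AudioQueueAllocateBuffer(queue, 16 * 1024, &buf);
            outputCallback(NULL, queue, buf);
        }
        AudioQueueStart(queue, NULL);
    }

Streaming with OpenAL inverts this: you poll AL_BUFFERS_PROCESSED on a source, reclaim spent buffers with alSourceUnqueueBuffers, refill them yourself, and hand them back with alSourceQueueBuffers.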

Higher level, as you've seen, is nice stuff like Media Player and AV Foundation. These are a lot easier if you're just playing a file, but probably aren't going to give you deep enough access if you want to do some kind of effects, signal processing, etc.

Jerkin answered 18/4, 2011 at 17:24 Comment(0)
