How to set AVAudioEngine input and output devices (Swift/macOS)
I've hunted high and low and cannot find a solution to this problem. I am looking for a method to change the input/output devices which an AVAudioEngine will use on macOS.

When simply playing back an audio file the following works as expected:

// outputUnit is the AudioUnit backing the engine's output node
var outputDeviceID: AudioDeviceID = xxx
// The property value is an AudioDeviceID, so the size must match that type
let result: OSStatus = AudioUnitSetProperty(outputUnit, kAudioOutputUnitProperty_CurrentDevice, kAudioUnitScope_Global, 0, &outputDeviceID, UInt32(MemoryLayout<AudioDeviceID>.size))
if result != noErr {
   print("error setting output device \(result)")
   return
}
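For completeness, this is roughly how the outputUnit above can be obtained from the engine. A minimal sketch; the device ID is a placeholder, and in real code would come from an AudioObjectGetPropertyData query:

```swift
import AVFoundation
import CoreAudio

let engine = AVAudioEngine()

// Placeholder: look up a real device ID with AudioObjectGetPropertyData
// (e.g. via kAudioHardwarePropertyDevices on the system object).
var outputDeviceID: AudioDeviceID = 0

// The engine's output node wraps an AUHAL output unit, whose device can be
// changed with kAudioOutputUnitProperty_CurrentDevice.
if let outputUnit = engine.outputNode.audioUnit {
    let status = AudioUnitSetProperty(
        outputUnit,
        kAudioOutputUnitProperty_CurrentDevice,
        kAudioUnitScope_Global,
        0,
        &outputDeviceID,
        UInt32(MemoryLayout<AudioDeviceID>.size))
    if status != noErr {
        print("error setting output device \(status)")
    }
}
```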

However if I initialize the audio input (with let input = engine.inputNode) then I get an error once I attempt to start the engine:

AVAEInternal.h:88 required condition is false: [AVAudioEngine.mm:1055:CheckCanPerformIO: (canPerformIO)]

I know my playback code is OK because, if I don't change the output device, I can hear both the microphone and the audio file; and if I change the output device but don't initialize the inputNode, the file plays to the specified destination.

In addition, I have been trying to change the input device; I understood from various sources that the following should do it:

// inputUnit is the AudioUnit backing the engine's input node;
// again, the size must match the AudioDeviceID being passed
let result1: OSStatus = AudioUnitSetProperty(inputUnit, kAudioOutputUnitProperty_CurrentDevice, kAudioUnitScope_Output, 0, &inputDeviceID, UInt32(MemoryLayout<AudioDeviceID>.size))
if result1 != noErr {
    print("failed with error \(result1)")
    return
}

However, this doesn't work: in most cases it fails with an error (10853), although it succeeds if I select a sound card that has both inputs and outputs. It appears that setting the device on either the output node or the input node actually sets it for both.

I would take this to mean that an AVAudioEngine instance can only deal with one device; however, it is quite happy working with the default devices (microphone plus speakers/headphones), so I am confident that isn't the issue. In some solutions I have seen online, people simply change the system default input, but that isn't a particularly nice solution.
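For reference, the change-the-default-input workaround mentioned above looks roughly like this. A sketch using AudioObjectSetPropertyData on the system object; note that kAudioObjectPropertyElementMain requires macOS 12 or later (on older SDKs the same value is named kAudioObjectPropertyElementMaster):

```swift
import CoreAudio

/// Makes the given device the system default input.
/// Affects every app on the machine, which is why it is not a nice solution.
func setDefaultInputDevice(_ deviceID: AudioDeviceID) -> OSStatus {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultInputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var device = deviceID
    return AudioObjectSetPropertyData(
        AudioObjectID(kAudioObjectSystemObject),
        &address,
        0, nil,                                    // no qualifier data
        UInt32(MemoryLayout<AudioDeviceID>.size),
        &device)
}
```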

Does anyone have any ideas as to whether this is possible?

It's worth noting that kAudioOutputUnitProperty_CurrentDevice is the only such property available; there is no equivalent kAudioInputUnitProperty_CurrentDevice key because, as I understand it, both the inputNode and the outputNode are classed as "output units" (they both emit audio somewhere).

Any ideas would be much appreciated, as this is very frustrating!

Thanks

Miracidium answered 15/5, 2020 at 20:34 Comment(3)
I've also tried this with inputNode.auAudioUnit.setDeviceID(xxx) and seem to hit the same issue. As far as I can see, you can only move away from the default audio device if either (a) you are only outputting audio, or (b) you use one device for both input and output.Miracidium
Did you find a solution? I'm experiencing the exact same problem...Binkley
Had a reply from Apple - will write an answer.Miracidium

So I filed a support request with Apple on this and another issue. The response confirms that an AVAudioEngine can only be assigned to a single aggregate device (that is, a device with both input and output channels); the system default units effectively create an aggregate device internally, which is why they work. I've also found an additional wrinkle: if the input device has output capabilities (and you activate the inputNode), then that device must serve as both input and output, otherwise the output appears not to work.

So the answer is that, I think, there is no answer.
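One escape hatch consistent with this finding is to build your own aggregate device in code from the input and output devices you want, and hand that single device to the engine. A rough sketch using AudioHardwareCreateAggregateDevice; the two sub-device UIDs here are placeholders, and in real code would be read from each device's kAudioDevicePropertyDeviceUID:

```swift
import CoreAudio

// Placeholder UIDs; obtain real ones via kAudioDevicePropertyDeviceUID.
let inputUID = "PLACEHOLDER-INPUT-UID"
let outputUID = "PLACEHOLDER-OUTPUT-UID"

let description: [String: Any] = [
    kAudioAggregateDeviceNameKey: "Engine Aggregate",
    kAudioAggregateDeviceUIDKey: "com.example.engine-aggregate",
    kAudioAggregateDeviceIsPrivateKey: 1,  // hide it from Audio MIDI Setup
    kAudioAggregateDeviceSubDeviceListKey: [
        [kAudioSubDeviceUIDKey: inputUID],
        [kAudioSubDeviceUIDKey: outputUID],
    ],
]

var aggregateID = AudioObjectID(0)
let status = AudioHardwareCreateAggregateDevice(description as CFDictionary,
                                                &aggregateID)
// On success, assign aggregateID to the engine's I/O units via
// kAudioOutputUnitProperty_CurrentDevice, and tear the device down later
// with AudioHardwareDestroyAggregateDevice(aggregateID).
```

The two sub-devices still run off separate clocks, so expect drift unless you also set a master/clock sub-device on the aggregate.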

Miracidium answered 23/5, 2020 at 17:4 Comment(6)
I wonder if it would be possible to wrap an AUAudioUnit for an input device in an AVAudioSourceNode and use that in lieu of inputNode. I wrote some code that seemed promising but didn't quite work; still, the approach could be workable.Plio
I tried that, but it appears you can't get hardware input into an AVAudioEngine that way. I think the solution would be to create a separate AudioUnit, write from it into a buffer, and read the buffer with a player node, but I haven't had a chance to try this yet. Apple's CoreAudio "Play Through" example does this, though without an engine, and appears to have pretty low latency. I tried a tap on the inputNode, but the latency on that was at least 100ms.Miracidium
Thank you for documenting everything that you discovered. I too am struggling with this, trying to take input device A and play it through output device B and have tried many of the things that you described. Did you get any further with this? Do you know how applications like GarageBand can do it? I will try to find that CoreAudio "Play Through" example...Spacious
I believe most do what the Play Through example does, which is to write to a buffer and then read from that into the other device - that's what I ended up having to do.Miracidium
My guess: an input unit requires an output unit on the same oscillator clock to pull from the input. Without the right clock doing the pulling, a copy buffer in between lets your software absorb the tiny differences in clock frequencies (and thus sample rates) with fancy error-concealment strategies (well outside the scope of AVAudioEngine). Otherwise, buffer underflow or overflow is guaranteed by the difference between the two crystal oscillators.Gharry
It was slightly awkward using AUGraphs with multiple devices, so Apple invented AVAudioEngine which made it completely impossible. AVAE being a secret aggregate device explains SO MUCH. You don't have input / output nodes, like the class suggests, you have a single input-output node! To be fair, you can put AVAE into offline mode, effectively cutting the stupid bit off & instead hook it up to an aggregate audio device you do control, regaining something resembling an AUGraph or manually connected series of AudioUnits. AVAE is pure tech debt. Change my mind.Stout
