AVAudioEngine Multichannel mapping
I've converted the Objective-C code from this Apple Developer Forums thread to Swift: https://forums.developer.apple.com/thread/15416. My audio file plays on my iPhone, but when I plug in an audio interface I just get silence. The interface has 4 outputs and plays audio fine in other apps. Here is the converted code:

let audioSession = AVAudioSession.sharedInstance()

 // set the session category
 do
 {
     try audioSession.setCategory(.multiRoute, options: .mixWithOthers)
 }
 catch
 {
     print("unable to set category", error)
     return
 }

 // activate the audio session
 do
 {
     try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
 }
 catch
 {
     print("unable to set active", error)
     return
 }

 engine = AVAudioEngine()
 output = engine.outputNode
 mixer = engine.mainMixerNode

 player = AVAudioPlayerNode()

 engine.attach(player)

 guard let filePath: String = Bundle.main.path(forResource: "audio", ofType: "m4a") else { return }
 let fileURL: URL = URL(fileURLWithPath: filePath)
 let file = try! AVAudioFile(forReading: fileURL)

 let numberOfSourceChannels = file.processingFormat.channelCount
 print("numberOfSourceChannels: ", numberOfSourceChannels)

 let outputNumChannels = output.outputFormat(forBus: 0).channelCount
 print("outputNumChannels:" , outputNumChannels)
 var outputChannelMap: [Int32] = [
     0,
     1,
     -1,
     -1
 ]

 if let au = output.audioUnit
 {
     // The channel map entries must be SInt32, and the property size is the
     // byte count of the array's contents. Note that MemoryLayout.size(ofValue:)
     // on a Swift Array returns the size of the Array struct itself (8 bytes),
     // not the size of its elements.
     let propSize = UInt32(outputChannelMap.count * MemoryLayout<Int32>.stride)
     print("propSize:", propSize)
     AudioUnitSetProperty(au, kAudioOutputUnitProperty_ChannelMap, kAudioUnitScope_Global, 0, &outputChannelMap, propSize)
 }

 let channelLayout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_DiscreteInOrder | UInt32(numberOfSourceChannels))

 let format = AVAudioFormat(streamDescription: file.processingFormat.streamDescription, channelLayout: channelLayout)

 engine.connect(player, to: mixer, format: format)
 engine.connect(mixer, to: output, format: format)

 player.scheduleFile(file, at: nil, completionHandler: nil)

 do
 {
    try engine.start()
 }
 catch
 {
     print("can't start", error)
     return
 }

 player.play()

Here is the output from running it:

/private/var/containers/Bundle/Application/FF7AA751-35A3-4B70-92C3-2D8356DE835A/AVAudioEngineTest.app/audio.m4a
numberOfSourceChannels:  2
outputNumChannels: 4
propSize: 16

My end goal is to be able to choose which output a node will play on. What am I doing wrong? Thanks in advance.

Psychosis answered 29/5, 2020 at 4:37

It turns out the code is fine except for the values in outputChannelMap. I was confused about how it works: each slot corresponds to a hardware output channel, and the value is the index of the source channel to route there (or -1 for silence). Here are a few examples:

//normal output
var outputChannelMap = [
    0,
    1
]

// left audio to left output only
outputChannelMap = [
    0,
    -1
]

// both audio channels to left output
outputChannelMap = [
    0,
    0
]

// both audio channels to right output
outputChannelMap = [
    1,
    1
]

//left audio channel to right output only
outputChannelMap = [
    1,
    -1
]

// swap channels: left audio to right output, right audio to left output
outputChannelMap = [
    1,
    0
]

// right audio to left output only
outputChannelMap = [
    -1,
    0
]

// right audio to right output only
outputChannelMap = [
    -1,
    1
]
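
The map can also be built programmatically. Here is a minimal sketch; the helper name `makeChannelMap` is my own and not part of the original answer. Each slot of the returned array is indexed by hardware output channel, and its value names the source channel to feed it, or -1 to leave that output silent:

```swift
// Hypothetical helper (not from the original answer): build a channel map
// for kAudioOutputUnitProperty_ChannelMap. `routing` maps an output channel
// index to the source channel index that should feed it; unmentioned
// outputs stay silent (-1).
func makeChannelMap(outputChannels: Int, routing: [Int: Int]) -> [Int32] {
    var map = [Int32](repeating: -1, count: outputChannels)  // default: silence
    for (output, source) in routing where map.indices.contains(output) {
        map[output] = Int32(source)
    }
    return map
}

// Route a stereo source to outputs 3 and 4 of a 4-output interface:
let map = makeChannelMap(outputChannels: 4, routing: [2: 0, 3: 1])
// map is [-1, -1, 0, 1]
```

The resulting array can then be passed to AudioUnitSetProperty with a property size of `map.count * MemoryLayout<Int32>.stride`.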
Psychosis answered 3/6, 2020 at 1:57
