Swift 3: Using AVCaptureAudioDataOutput to analyze audio input
I’m trying to use AVCaptureAudioDataOutput to analyze audio input, as described here. This is not something I could figure out on my own, so I’m copying the example, but I’m having difficulty.

Converting to Swift 3, Xcode prompted me to make a couple of changes. I’m now getting a compile error on the line that assigns samples. Xcode says, “Cannot invoke initializer for type ‘UnsafeMutablePointer<_>’ with an argument list of type ‘(UnsafeMutableRawPointer?)’”

Here’s the code as I’ve modified it:

func captureOutput(_ captureOutput: AVCaptureOutput!,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                   from connection: AVCaptureConnection!){
    var buffer: CMBlockBuffer? = nil
    var audioBufferList = AudioBufferList(mNumberBuffers: 1,
                                          mBuffers: AudioBuffer(mNumberChannels: 1, mDataByteSize: 0, mData: nil))
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        nil,
        &audioBufferList,
        MemoryLayout<AudioBufferList>.size,     // changed for Swift 3
        nil,
        nil,
        UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        &buffer
    )
    let abl = UnsafeMutableAudioBufferListPointer(&audioBufferList)
    var sum:Int64 = 0
    var count:Int = 0
    var bufs:Int = 0
    for buf in abl {
        let samples = UnsafeMutableBufferPointer<Int16>(start: UnsafeMutablePointer(buf.mData),  // Error here
                                                        count: Int(buf.mDataByteSize)/sizeof(Int16))
        for sample in samples {
            let s = Int64(sample)
            sum = (sum + s*s)
            count += 1
        }
        bufs += 1
    }
    print( "found \(count) samples in \(bufs) buffers, sum is \(sum)" )
}

Can anyone tell me how to fix this code?

Radman answered 24/1, 2017 at 2:9 Comment(0)
The answer is that I need to wrap buff.mData in an OpaquePointer. That is, in the call to the UnsafeMutableBufferPointer<Int16>(start:count:) initializer, change

start: UnsafeMutablePointer(buff.mData)

to

start: UnsafeMutablePointer(OpaquePointer(buff.mData))

Here is the complete code, updated for Swift 3:

func captureOutput(_ captureOutput: AVCaptureOutput!,
                   didOutputSampleBuffer sampleBuffer: CMSampleBuffer!,
                   from connection: AVCaptureConnection!){
    var buffer: CMBlockBuffer? = nil
    var audioBufferList = AudioBufferList(mNumberBuffers: 1,
                                          mBuffers: AudioBuffer(mNumberChannels: 1, mDataByteSize: 0, mData: nil))
    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
        sampleBuffer,
        nil,
        &audioBufferList,
        MemoryLayout<AudioBufferList>.size,
        nil,
        nil,
        UInt32(kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment),
        &buffer
    )
    let abl = UnsafeMutableAudioBufferListPointer(&audioBufferList)
    var sum:Int64 = 0
    var count:Int = 0
    var bufs:Int = 0
    for buff in abl {
        let samples = UnsafeMutableBufferPointer<Int16>(start: UnsafeMutablePointer(OpaquePointer(buff.mData)),
                                                        count: Int(buff.mDataByteSize)/MemoryLayout<Int16>.size)
        for sample in samples {
            let s = Int64(sample)
            sum = (sum + s*s)
            count += 1
        }
        bufs += 1
    }
    print( "found \(count) samples in \(bufs) buffers, RMS is \(sqrt(Float(sum)/Float(count)))" )
}

This satisfies the compiler, and it seems to generate reasonable numbers.
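As an aside, Swift 3 also offers a way to reinterpret raw memory without the OpaquePointer round-trip: UnsafeMutableRawPointer has an assumingMemoryBound(to:) method, so buff.mData?.assumingMemoryBound(to: Int16.self) can serve as the start pointer directly. A minimal, self-contained sketch of that idiom, using a plain Int16 array as a stand-in for buff.mData so it can run without a live capture session:

```swift
import Foundation

// Stand-in for one AudioBuffer's worth of 16-bit samples.
var storage: [Int16] = [3, -4, 12, 0]
let byteSize = storage.count * MemoryLayout<Int16>.size

storage.withUnsafeMutableBytes { rawBuf in
    // mData in an AudioBuffer is an UnsafeMutableRawPointer?; simulate that here.
    let mData: UnsafeMutableRawPointer? = rawBuf.baseAddress

    // Reinterpret the raw bytes as Int16 samples, no OpaquePointer needed.
    let samples = UnsafeMutableBufferPointer<Int16>(
        start: mData?.assumingMemoryBound(to: Int16.self),
        count: byteSize / MemoryLayout<Int16>.size)

    // Same sum-of-squares accumulation as in the callback above.
    var sum: Int64 = 0
    for s in samples { sum += Int64(s) * Int64(s) }
    print(sum)  // 9 + 16 + 144 + 0 = 169
}
```

Both spellings produce the same typed pointer; assumingMemoryBound(to:) just states the intent (these bytes already hold Int16 values) more explicitly.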

Radman answered 27/1, 2017 at 0:43 Comment(4)
I'm recording video by getting an image from sampleBuffer with CMSampleBufferGetImageBuffer, modifying the image, and then writing it to video. How do I check whether the sampleBuffer is audio and then write it to the assetWriter? – Adze
@ChewieTheChorkie To check whether the sample buffer is audio, you can look at the connection's output property, like this: if connection.output is AVCaptureAudioDataOutput { ... } else { // isVideo } – Billy
Maybe I'll try it, but I "fixed" it by running an AVAudioRecorder and then merging the audio and video at the end. Is that bad? – Adze
@ChewieTheChorkie Really helpful hint! Did you find a better solution? I have the same problem (no audio output when I try to draw multiple contexts during real-time video recording) – Shuler
