Play audio from AVAudioPCMBuffer with AVAudioEngine

I have two classes, MicrophoneHandler and AudioPlayer. I have managed to use AVCaptureSession to tap microphone data using the approved answer here, and have converted the CMSampleBuffer to NSData using this function:

func sendDataToDelegate(buffer: CMSampleBuffer!)
{
    let block = CMSampleBufferGetDataBuffer(buffer)
    var length = 0
    var data: UnsafeMutablePointer<Int8> = nil

    var status = CMBlockBufferGetDataPointer(block!, 0, nil, &length, &data)    // TODO: check for errors

    let result = NSData(bytesNoCopy: data, length: length, freeWhenDone: false)

    self.delegate.handleBuffer(result)
}
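
A quick diagnostic worth doing at this point (not part of the original question) is to log the actual capture format carried by the CMSampleBuffer, since the raw NSData bytes are only meaningful relative to that format. A minimal sketch in the same Swift 2-era syntax as the code above; logCaptureFormat is a hypothetical helper name, and the optionality of these CoreMedia return values varied slightly across Swift versions:

import CoreMedia

func logCaptureFormat(buffer: CMSampleBuffer)
{
    guard let description = CMSampleBufferGetFormatDescription(buffer) else { return }

    let asbdPointer = CMAudioFormatDescriptionGetStreamBasicDescription(description)
    if asbdPointer == nil { return }

    // The microphone typically delivers interleaved 16-bit integer samples at the
    // hardware sample rate; whatever format shows up here is the one the playback
    // side has to agree with.
    let asbd = asbdPointer.memory
    print("sample rate: \(asbd.mSampleRate)")
    print("channels per frame: \(asbd.mChannelsPerFrame)")
    print("bits per channel: \(asbd.mBitsPerChannel)")
    print("bytes per frame: \(asbd.mBytesPerFrame)")
}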

I would now like to play the audio over the speaker by converting the NSData produced above to an AVAudioPCMBuffer and playing it using AVAudioEngine. My AudioPlayer class is as follows:

var engine: AVAudioEngine!
var playerNode: AVAudioPlayerNode!
var mixer: AVAudioMixerNode!

override init()
{
    super.init()

    self.setup()
    self.start()
}

func handleBuffer(data: NSData)
{
    let newBuffer = self.toPCMBuffer(data)
    print(newBuffer)

    self.playerNode.scheduleBuffer(newBuffer, completionHandler: nil)
}

func setup()
{
    self.engine = AVAudioEngine()
    self.playerNode = AVAudioPlayerNode()

    self.engine.attachNode(self.playerNode)
    self.mixer = engine.mainMixerNode

    engine.connect(self.playerNode, to: self.mixer, format: self.mixer.outputFormatForBus(0))
}

func start()
{
    do {
        try self.engine.start()
    }
    catch {
        print("error couldn't start engine")
    }

    self.playerNode.play()
}

func toPCMBuffer(data: NSData) -> AVAudioPCMBuffer
{
    let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: 8000, channels: 2, interleaved: false)  // given NSData audio format
    let PCMBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: UInt32(data.length) / audioFormat.streamDescription.memory.mBytesPerFrame)

    PCMBuffer.frameLength = PCMBuffer.frameCapacity

    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: Int(PCMBuffer.format.channelCount))

    data.getBytes(UnsafeMutablePointer<Void>(channels[0]) , length: data.length)

    return PCMBuffer
}

The buffer reaches the handleBuffer(data:) function when self.delegate.handleBuffer(result) is called in the first snippet above.

I can print(newBuffer) and see the memory addresses of the converted buffers, but nothing comes out of the speakers. I can only imagine that something is inconsistent between the conversion to NSData and the conversion back. Any ideas? Thanks in advance.
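
One way to narrow down where the chain breaks (a suggestion, not something from the original post): feed the playback path a buffer of known content in exactly the format toPCMBuffer assumes. If a generated test tone survives the NSData round trip and is audible, playback and conversion are fine and the mismatch is on the capture side, which typically delivers interleaved 16-bit integer samples at the hardware rate rather than 8 kHz Float32 stereo. A sketch; makeTestData is a hypothetical helper:

import AVFoundation

// Hypothetical test-tone helper: fills channel 0 with a 440 Hz sine in the same
// Float32 non-interleaved format that toPCMBuffer assumes, then serializes it to
// NSData exactly the way that function expects to read it back.
func makeTestData(format: AVAudioFormat, frames: AVAudioFrameCount) -> NSData
{
    let buffer = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: frames)
    buffer.frameLength = frames

    let samples = buffer.floatChannelData[0]
    for frame in 0..<Int(frames) {
        samples[frame] = sinf(2.0 * Float(M_PI) * 440.0 * Float(frame) / Float(format.sampleRate))
    }

    // For a non-interleaved format, mBytesPerFrame describes a single channel,
    // so this copies only channel 0 -- the same channel toPCMBuffer fills in.
    let byteCount = Int(frames) * Int(format.streamDescription.memory.mBytesPerFrame)
    return NSData(bytes: samples, length: byteCount)
}

Scheduling self.toPCMBuffer(makeTestData(audioFormat, frames: 8000)) on the player node (with audioFormat built the same way as inside toPCMBuffer) should then produce roughly one second of tone if the playback half is working.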

Lachrymal answered 25/11, 2015 at 0:47 Comment(0)

Skip the raw NSData format

Why not use AVAudioPlayer all the way? If you positively need NSData, you can always load such data from the soundURL below. In this example, the disk buffer is something like:

let soundURL = documentDirectory.URLByAppendingPathComponent("sound.m4a")

It makes sense to record directly to a file anyway for optimal memory and resource management. You get NSData from your recording this way:

let data = NSFileManager.defaultManager().contentsAtPath(soundURL.path!)

The code below is all you need:

Record

if !audioRecorder.recording {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setActive(true)
        audioRecorder.record()
    } catch {}
}

Play

if (!audioRecorder.recording){
    do {
        try audioPlayer = AVAudioPlayer(contentsOfURL: audioRecorder.url)
        audioPlayer.play()
    } catch {}
}

Setup

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try audioRecorder = AVAudioRecorder(URL: self.directoryURL()!,
        settings: recordSettings)
    audioRecorder.prepareToRecord()
} catch {}

Settings

let recordSettings = [AVSampleRateKey : NSNumber(float: Float(44100.0)),
    AVFormatIDKey : NSNumber(int: Int32(kAudioFormatMPEG4AAC)),
    AVNumberOfChannelsKey : NSNumber(int: 1),
    AVEncoderAudioQualityKey : NSNumber(int: Int32(AVAudioQuality.Medium.rawValue))]
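
If the NSData really does have to travel over a socket (see the comments below), note that AVAudioPlayer can also be created straight from in-memory data, provided the bytes form a complete encoded file such as the m4a recorded above, not raw PCM frames. A minimal sketch; playReceivedData and the playbackPlayer property are assumptions, not part of the original answer:

import AVFoundation

var playbackPlayer: AVAudioPlayer?    // keep a strong reference, or playback stops immediately

func playReceivedData(data: NSData)
{
    do {
        // AVAudioPlayer parses the container and codec itself, so `data` must be a
        // complete encoded file (e.g. m4a), not a stream of raw samples.
        playbackPlayer = try AVAudioPlayer(data: data)
        playbackPlayer?.prepareToPlay()
        playbackPlayer?.play()
    } catch {
        print("could not create player from data: \(error)")
    }
}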

Download Xcode Project:

You can find this very example here. Download the full project, which records and plays on both simulator and device, from Swift Recipes.

Clatter answered 5/12, 2015 at 2:58 Comment(6)
We need the NSData raw format to transmit it over a socket. We are streaming audio. – Lachrymal
File to NSData comes out of the box. Can you think of the file as just a buffer? It happens to be a disk buffer, but it is still just a buffer. – Clatter
How do we clear the buffer every time we send a packet of NSData over the socket? @Clatter – Lachrymal
You need a circular buffer for streaming (a minimal sketch follows these comments): https://mcmap.net/q/1473121/-circular-buffer-audio-recording-ios-possible – Clatter
@Clatter Can we use the above code to transmit real-time data from the mic? I need real-time NSData to transfer over a socket and play on another device, and by saving it into a file I don't think I could live stream. Any suggestions? Thanks. – Eindhoven
This question is quite old; I am in the same situation. @ConnorHicks, have you solved the above issue? – Cyclohexane
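
For the streaming case raised in these comments, the circular-buffer idea can be approximated with a simple thread-safe byte FIFO that decouples socket receives from playback; a true ring buffer would instead use a fixed-capacity array with wrapping read and write indices. A sketch only, not taken from the linked answer; the class and method names are assumptions:

import Foundation

class AudioByteQueue
{
    private var storage = [UInt8]()
    private let lock = NSLock()

    // Called from the socket callback with each received NSData packet.
    func append(data: NSData)
    {
        lock.lock(); defer { lock.unlock() }
        var bytes = [UInt8](count: data.length, repeatedValue: 0)
        data.getBytes(&bytes, length: data.length)
        storage.appendContentsOf(bytes)
    }

    // Called from the playback side; returns nil until a full chunk has accumulated.
    func read(count: Int) -> NSData?
    {
        lock.lock(); defer { lock.unlock() }
        guard storage.count >= count else { return nil }
        let chunk = Array(storage[0..<count])
        storage.removeFirst(count)
        return NSData(bytes: chunk, length: count)
    }
}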
