Trying to stream audio from the microphone to another phone via Multipeer Connectivity

I am trying to stream audio from the microphone to another iPhone via Apple's Multipeer Connectivity framework. For the audio capture and playback I am using AVAudioEngine (many thanks to Rhythmic Fistman's answer here).
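
For context, here is a minimal sketch of the AVAudioEngine setup this question assumes. The engine, input, and audioPlayer variables are used below but never shown, so their names and wiring here are my assumption:

import AVFoundation

let engine = AVAudioEngine()
let audioPlayer = AVAudioPlayerNode()
let input = engine.inputNode

// Attach the player node and route it to the main mixer so that buffers
// scheduled on it are actually rendered to the output.
engine.attach(audioPlayer)
engine.connect(audioPlayer, to: engine.mainMixerNode, format: nil)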

I receive data from the microphone by installing a tap on the input node. From the tap I get an AVAudioPCMBuffer, which I convert to an array of UInt8 and then stream to the other phone.

But when I convert the array back to an AVAudioPCMBuffer, I get an EXC_BAD_ACCESS exception, with the debugger pointing to the method where the byte array is converted back to an AVAudioPCMBuffer.

Here is the code where I tap, convert, and stream the input:

input.installTap(onBus: 0, bufferSize: 2048, format: input.inputFormat(forBus: 0)) {
    (buffer: AVAudioPCMBuffer, time: AVAudioTime) in
    // Serialize the buffer and write the bytes to the output stream.
    let audioBuffer = self.typetobinary(buffer)
    stream.write(audioBuffer, maxLength: audioBuffer.count)
}

Both of my functions for converting the data (taken from Martin.R's answer here):

func binarytotype<T>(_ value: [UInt8], _: T.Type) -> T {
    return value.withUnsafeBufferPointer {
        UnsafeRawPointer($0.baseAddress!).load(as: T.self)
    }
}

func typetobinary<T>(_ value: T) -> [UInt8] {
    // Note: for a class type T, MemoryLayout<T>.size is the size of the
    // reference (8 bytes on 64-bit), not the size of the object's contents.
    var data = [UInt8](repeating: 0, count: MemoryLayout<T>.size)
    data.withUnsafeMutableBufferPointer {
        UnsafeMutableRawPointer($0.baseAddress!).storeBytes(of: value, as: T.self)
    }
    return data
}

And on the receiving end:

func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {
    if streamName == "voice" {
        stream.schedule(in: RunLoop.current, forMode: .defaultRunLoopMode)
        stream.open()

        var bytes = [UInt8](repeating: 0, count: 8)
        stream.read(&bytes, maxLength: bytes.count)

        let audioBuffer = self.binarytotype(bytes, AVAudioPCMBuffer.self) // Here is where the app crashes

        do {
            try engine.start()
            audioPlayer.scheduleBuffer(audioBuffer, completionHandler: nil)
            audioPlayer.play()
        } catch let error {
            print(error.localizedDescription)
        }
    }
}

The thing is that I can convert the byte array back and forth and play sound from it before streaming it (on the same phone), but I cannot recreate the AVAudioPCMBuffer on the receiving end. Does anyone know why the conversion doesn't work there? Is this the right way to go?

Any help or thoughts on this would be much appreciated.

Range answered 15/9, 2016 at 7:59

Comments (2):

Kirkman: Can you give some reference for how to use an audio queue, or any sample project?

Range: No, I cannot, I'm afraid.

Your AVAudioPCMBuffer serialisation/deserialisation is wrong: AVAudioPCMBuffer is a class, so typetobinary copies the buffer's object reference (an 8-byte pointer), not the audio data it manages, and binarytotype on the receiving end reconstructs a pointer that means nothing in that process.

Swift 3's casting has changed a lot and seems to require more copying than Swift 2 did.

Here's how you can convert between [UInt8] and AVAudioPCMBuffer:

N.B.: this code assumes mono float data at 44.1 kHz. You might want to change that.

func copyAudioBufferBytes(_ audioBuffer: AVAudioPCMBuffer) -> [UInt8] {
    let srcLeft = audioBuffer.floatChannelData![0]
    let bytesPerFrame = audioBuffer.format.streamDescription.pointee.mBytesPerFrame
    let numBytes = Int(bytesPerFrame * audioBuffer.frameLength)

    // initialize bytes to 0 (how to avoid?)
    var audioByteArray = [UInt8](repeating: 0, count: numBytes)

    // copy data from buffer
    srcLeft.withMemoryRebound(to: UInt8.self, capacity: numBytes) { srcByteData in
        audioByteArray.withUnsafeMutableBufferPointer {
            $0.baseAddress!.initialize(from: srcByteData, count: numBytes)
        }
    }

    return audioByteArray
}

func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    // format assumption! make this part of your protocol?
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength

    let dstLeft = audioBuffer.floatChannelData![0]
    // for stereo
    // let dstRight = audioBuffer.floatChannelData![1]

    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!).bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }

    return audioBuffer
}
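
To make the swap concrete, here is a hedged sketch of how these helpers would replace typetobinary/binarytotype in the question's code. The stream, input, engine, and audioPlayer variables are the question's; the fixed read size is an assumption, since a real protocol would need to delimit chunks:

// Sending side: serialize the tapped buffer's samples, not the object.
input.installTap(onBus: 0, bufferSize: 2048, format: input.inputFormat(forBus: 0)) { buffer, _ in
    let bytes = self.copyAudioBufferBytes(buffer)
    stream.write(bytes, maxLength: bytes.count)
}

// Receiving side: read a chunk of sample bytes, rebuild the buffer, play it.
var bytes = [UInt8](repeating: 0, count: 2048 * MemoryLayout<Float>.size)
let bytesRead = stream.read(&bytes, maxLength: bytes.count)
if bytesRead > 0 {
    let audioBuffer = self.bytesToAudioBuffer(Array(bytes[0..<bytesRead]))
    audioPlayer.scheduleBuffer(audioBuffer, completionHandler: nil)
    audioPlayer.play()
}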
Carycaryatid answered 16/9, 2016 at 6:31

Comments (8):

Range: Thank you again, Rhythmic Fistman! This seems to work, though I have some issues. When I record on an iPhone 5s the recording is done at 8 kHz, and on an iPhone 6s it is 16 kHz. This makes the program crash when I send a 16 kHz stream to the method (which expects an 8 kHz stream). Do you have any tips on how to convert the stream before I send it? Also, if I send the stream from the 5s to the 6s and set the method to receive an 8 kHz stream, the sound doesn't come through. It only plays white noise for a couple of milliseconds, pauses, and repeats. Any thoughts?

Carycaryatid: I recommend choosing one sample rate and using it everywhere. You can convert to it by adding an AVAudioMixerNode to your engine or by using AVAudioConverter (a sketch follows after these comments).

Hydracid: Hi @RhythmicFistman, I had a question I was hoping you could answer. I'm using your algorithm above, and I'm wondering what the numBytes variable is. Based on this variable, when I try writing this data to the stream, it sends over 17,000 bytes per write, which seems incredibly high. Eventually the write fails because the other device's stream buffer is full...

Carycaryatid: This answer shows you how to [de]serialise an AVAudioPCMBuffer, but the question assumes Multipeer Connectivity has enough bandwidth to send uncompressed audio across it. Now that I think about it, that is probably a bad assumption.

Hydracid: @RhythmicFistman Well, I suppose that is in bits, which is only about 2 kilobytes, which should be manageable. However, the audio I play back is static right now. I'm not sure why that's happening; any ideas?

Carycaryatid: I'm not sure. Does the problem happen if you stream and play back on the same device? If so, can you create a new question with that code?

Hydracid: @RhythmicFistman Yes, that's what I meant. If I stream the audio to another device it's static, but if I play the audio before converting it, it sounds fine. Here is a link to a question I posted (on another account): #42817806

Kirkman: @RhythmicFistman I am working on mic audio streaming using Multipeer Connectivity, and I haven't found a proper tutorial or any sample code yet. Please give me some reference or code for how to use the audio engine and stream voice between two devices using the Multipeer Connectivity framework.
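
As a follow-up to the sample-rate comments above, here is a minimal, hedged sketch of converting a captured buffer to one shared format with AVAudioConverter before streaming. The 44.1 kHz mono Float32 target matches the answer's assumption; the helper name and sizing logic are mine, not from the original thread:

import AVFoundation

// Hypothetical helper: convert one PCM buffer to a shared target format so
// that both peers agree on sample rate and channel count before streaming.
func convertToTargetFormat(_ buffer: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
    guard let target = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                     sampleRate: 44100, channels: 1, interleaved: true),
          let converter = AVAudioConverter(from: buffer.format, to: target) else {
        return nil
    }

    // Size the output buffer for the sample-rate ratio, rounding up.
    let ratio = target.sampleRate / buffer.format.sampleRate
    let capacity = AVAudioFrameCount((Double(buffer.frameLength) * ratio).rounded(.up))
    guard let output = AVAudioPCMBuffer(pcmFormat: target, frameCapacity: capacity) else {
        return nil
    }

    // Feed the converter exactly one input buffer, then report that no
    // more data is available for this call.
    var delivered = false
    var error: NSError?
    let status = converter.convert(to: output, error: &error) { _, inputStatus in
        if delivered {
            inputStatus.pointee = .noDataNow
            return nil
        }
        delivered = true
        inputStatus.pointee = .haveData
        return buffer
    }
    return (status == .error || error != nil) ? nil : output
}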
