iOS 8 AVAudioEngine: how to send microphone data over Multipeer Connectivity?

I want to send microphone audio data over Multipeer Connectivity (iOS 8) and play it through the speaker of the receiving peer. I've set up AVAudioEngine and I can hear the microphone input through the (top) speaker output, but I don't know how to send an AVAudioPCMBuffer over the network. Here's my code snippet:

// Route the mic input into the main mixer so it is audible locally.
AVAudioInputNode *inputNode = [self.engine inputNode];
AVAudioMixerNode *mainMixer = [self.engine mainMixerNode];
[self.engine connect:inputNode to:mainMixer format:[inputNode inputFormatForBus:0]];

[mainMixer installTapOnBus:0 bufferSize:4096 format:[mainMixer outputFormatForBus:0] 
    block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {

    //==== How to send the PCMBuffer ?? ======//
}];

NSError *error = nil;
if (![self.engine startAndReturnError:&error])
{
    NSLog(@"error: %@", [error localizedDescription]);
}

Should I send it as NSData or via an NSStream?

I'd appreciate any help. Thx.

Recycle asked 9/10, 2014 at 4:02

I haven't tried this solution myself, but MCSession gives you a built-in way to open a byte stream to a peer:

- (NSOutputStream *)startStreamWithName:(NSString *)streamName
                                 toPeer:(MCPeerID *)peerID
                                  error:(NSError **)error

You can get the raw samples as a float array via the buffer.floatChannelData property. You can then pack those floats into bytes and write them to the NSOutputStream returned by that method.
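Here is a minimal, untested sketch of the sending side: open the byte stream once, then write the raw samples from the tap block. The self.outputStream property, the stream name, and the mono non-interleaved float format are my assumptions, not part of your code:

// Open a byte stream to the connected peer once (self.session and
// peerID are assumed to exist already).
NSError *streamError = nil;
self.outputStream = [self.session startStreamWithName:@"mic"
                                               toPeer:peerID
                                                error:&streamError];
[self.outputStream scheduleInRunLoop:[NSRunLoop mainRunLoop]
                             forMode:NSDefaultRunLoopMode];
[self.outputStream open];

// In the tap block: serialize the first channel's float samples and
// write them to the stream. Assumes a mono, non-interleaved float
// format, so channel 0 is the only channel.
[mainMixer installTapOnBus:0 bufferSize:4096 format:[mainMixer outputFormatForBus:0]
    block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when) {
        NSUInteger byteCount = buffer.frameLength * sizeof(float);
        const uint8_t *bytes = (const uint8_t *)buffer.floatChannelData[0];
        [self.outputStream write:bytes maxLength:byteCount];
    }];

Note that write:maxLength: can block, and the tap block runs on a real-time audio thread, so in practice you would hand the bytes off to another queue instead of writing directly from the tap.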

On the receiving side, you can accept the stream in the MCSessionDelegate callback:

- (void)session:(MCSession *)session
didReceiveStream:(NSInputStream *)stream
       withName:(NSString *)streamName
       fromPeer:(MCPeerID *)peerID
{
    stream.delegate = self;
    [stream scheduleInRunLoop:[NSRunLoop mainRunLoop] 
                      forMode:NSDefaultRunLoopMode];
    [stream open];
}
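Once bytes start arriving, the NSStreamDelegate callback can copy them back into a fresh AVAudioPCMBuffer and hand it to an AVAudioPlayerNode. Again a rough, untested sketch; self.playerNode (attached to a running engine) is an assumption, and the format must match the sender's exactly:

// NSStreamDelegate callback: read raw float samples from the stream,
// copy them into an AVAudioPCMBuffer, and schedule it for playback.
- (void)stream:(NSStream *)aStream handleEvent:(NSStreamEvent)eventCode
{
    if (eventCode != NSStreamEventHasBytesAvailable) {
        return;
    }

    // Must match the sender: mono, 32-bit float, same sample rate.
    AVAudioFormat *format = [[AVAudioFormat alloc]
        initWithCommonFormat:AVAudioPCMFormatFloat32
                  sampleRate:44100.0
                    channels:1
                 interleaved:NO];

    AVAudioFrameCount capacity = 4096;
    AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc]
        initWithPCMFormat:format frameCapacity:capacity];

    // floatChannelData has no setter, but you can write straight into
    // the memory it points to and then set frameLength accordingly.
    NSInteger bytesRead = [(NSInputStream *)aStream
        read:(uint8_t *)buffer.floatChannelData[0]
        maxLength:capacity * sizeof(float)];

    if (bytesRead > 0) {
        buffer.frameLength = (AVAudioFrameCount)(bytesRead / sizeof(float));
        [self.playerNode scheduleBuffer:buffer completionHandler:nil];
    }
}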

But before you try this, you could send random values (i.e., white noise) instead of the real float array, to make sure the time slot for sending these buffers (we are talking about real time here) is wide enough. For example:
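A quick sketch of that test, using the same single-channel assumption as above, placed inside the tap block before the write:

// Overwrite channel 0 with white noise in [-1, 1]; this keeps the
// data rate identical to real audio but makes dropouts easy to hear.
float *samples = buffer.floatChannelData[0];
for (AVAudioFrameCount i = 0; i < buffer.frameLength; i++) {
    samples[i] = ((float)arc4random() / UINT32_MAX) * 2.0f - 1.0f;
}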

Update 15.10.2014: I found exactly what you need: http://robots.thoughtbot.com/streaming-audio-to-multiple-listeners-via-ios-multipeer-connectivity

Lippmann answered 12/10, 2014 at 9:27
Hi Michael, thx for the answer. MCSessionDelegate has this method: session:didReceiveStream:withName:fromPeer:. Is it possible to send the data as a stream instead, since it's a continuous flow of data? And what does 'time slot is wide enough' mean? Thx. - Recycle
I thought you were asking just about the sender, so I added the receiver part. For the second question, to oversimplify: the problem is if the audio data arrives at the receiver faster than the Bluetooth connection can handle; that is what I meant by 'time slot'. - Lippmann
Hi Michael, thx for the prompt reply. Pardon me, I meant startStreamWithName:toPeer:error:; didReceiveStream is the receiving part. I've tried sending and receiving using NSData, but I have a problem on the receiving side. I send using [NSData dataWithBytes:buffer.floatChannelData length:buffer.frameLength]; how do I convert the received bytes back to an AVAudioPCMBuffer? I'll use AVAudioPlayerNode to play back the buffer, but in the Apple docs there's no 'setter' for buffer.floatChannelData. - Recycle
Here is exactly what you tried as a demo project: robots.thoughtbot.com/… - Lippmann
The example link you provided is indeed very similar to what I want to achieve. However, I feel the implementation is way too complex (especially on the receiving side); there should be a simpler solution using only AVAudioEngine. Btw, I'm able to stream music from one device to another using the code provided, but I haven't been able to stream mic data yet. This: CMSampleBufferRef sampleBuffer = (__bridge CMSampleBufferRef)(self.micPCMBuffer); gives me an error when trying to write to the stream. - Recycle
Not really; real-time (audio signal) processing is always a hard topic. :-) I fear it is not possible to handle it any more "easily". Do you use ARC? And what error? - Lippmann
-_-' Sigh... why doesn't Apple make our lives easier? And what's the point of having AVAudioEngine if we still need to deal with the low-level buffer stuff? Yup, I use ARC. In the example, there's a part that gets a CMSampleBufferRef from AVAssetReaderTrackOutput using [self.assetOutput copyNextSampleBuffer]. In my case, I tried to convert the AVAudioPCMBuffer to a CMSampleBufferRef directly. This is the error I got: The operation couldn’t be completed. (OSStatus error -12710.) And the Apple docs on OSStatus are so helpful that I don't even know what that number means... - Recycle
Btw, here's my source code: dl.dropboxusercontent.com/u/35434256/MultiPeerAudioItunes.zip. I'd be grateful if you could take a look. Many thanks! - Recycle
@MichaelDorner It would be extremely helpful if you could add examples of how to pack and unpack buffer.floatChannelData to NSData in Swift. - Fieldpiece
@JackyCoolheart Did you find a solution to this? - Isacco
@RamsundarShandilya Unfortunately, I abandoned this project a long, long time ago... - Recycle
What do you do with the floatChannelData on the other end, exactly? - Hurlburt
