I have the following C code that allocates an AudioBufferList of the appropriate length:
UInt32 bufferSizeBytes = bufferSizeFrames * sizeof(Float32);

// Size of the list header plus one AudioBuffer per channel.
UInt32 propertySize = offsetof(AudioBufferList, mBuffers[0]) + (sizeof(AudioBuffer) * mRecordSBD.mChannelsPerFrame);
mBufferList = (AudioBufferList *) malloc(propertySize);
mBufferList->mNumberBuffers = mRecordSBD.mChannelsPerFrame;
for (UInt32 i = 0; i < mBufferList->mNumberBuffers; ++i)
{
    mBufferList->mBuffers[i].mNumberChannels = 1;
    mBufferList->mBuffers[i].mDataByteSize = bufferSizeBytes;
    mBufferList->mBuffers[i].mData = malloc(bufferSizeBytes);
}
Most of the time, mChannelsPerFrame is 2, so the above code creates two buffers, one for each channel, and each buffer has bufferSizeBytes of memory reserved for its mData.
How can I replicate the same behaviour in Swift?
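From what I can tell, the CoreAudio Swift overlay provides UnsafeMutableAudioBufferListPointer and a static AudioBufferList.allocate(maximumBuffers:) that seem intended for exactly this, so I sketched the following, but I am not sure it is correct or idiomatic (the values for bufferSizeFrames and mRecordSBD are placeholders for illustration; in my real code they come from elsewhere, as in the C version):

import AudioToolbox

// Placeholder values, standing in for the real bufferSizeFrames
// and mRecordSBD from the C version.
let bufferSizeFrames: UInt32 = 1024
var mRecordSBD = AudioStreamBasicDescription()
mRecordSBD.mChannelsPerFrame = 2

let bufferSizeBytes = bufferSizeFrames * UInt32(MemoryLayout<Float32>.size)
let channelCount = Int(mRecordSBD.mChannelsPerFrame)

// AudioBufferList.allocate(maximumBuffers:) appears to do the
// offsetof-style size computation for me and sets mNumberBuffers,
// returning an UnsafeMutableAudioBufferListPointer wrapper.
let bufferList = AudioBufferList.allocate(maximumBuffers: channelCount)
for i in 0..<channelCount {
    bufferList[i].mNumberChannels = 1
    bufferList[i].mDataByteSize = bufferSizeBytes
    bufferList[i].mData = malloc(Int(bufferSizeBytes))
}

// Cleanup would presumably mirror the C code:
// for buffer in bufferList { free(buffer.mData) }
// free(bufferList.unsafeMutablePointer)

Is this the right way to do it, or is there a more idiomatic approach?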