Read a WAV file and convert it to an array of amplitudes in Swift
I have followed a very good tutorial on Udacity to explore the basics of audio applications with Swift. I would like to extend its current functionality, starting with displaying the waveform of a WAV file. For that purpose, I would need to retrieve the amplitude of each sample from the WAV file. How could I proceed in Swift, given that I already have a recorded file?

Thank you!

Catalyze asked 16/2, 2015 at 18:26

AudioToolbox meets your need.

You can use Audio File Services to read the raw audio samples from the audio file, such as a WAV file.

Then you can compute the amplitude from every sample.

import AudioToolbox
import AVFoundation

public class WavePacketReader {              // wrapper class (name is arbitrary) so the snippet compiles standalone

    // this is your desired amplitude data: one Data per packet
    public internal(set) var packetsX = [Data]()

    private var playbackFile: AudioFileID?   // set on output by AudioFileOpenURL
    private var dataFormatD: AVAudioFormat?

    // Helper: fetch a fixed-size property value from the audio file.
    private func getPropertyValue<T>(val: inout T, file: AudioFileID, prop: AudioFilePropertyID) {
        var size = UInt32(MemoryLayout<T>.size)
        AudioFileGetProperty(file, prop, &size, &val)
    }

    public init(src path: URL) throws {
        let status = AudioFileOpenURL(path as CFURL, .readPermission, 0, &playbackFile)
        guard status == noErr, let file = playbackFile else {
            throw NSError(domain: NSOSStatusErrorDomain, code: Int(status),
                          userInfo: [NSLocalizedDescriptionKey: "AudioFileOpenURL failed"])
        }

        // Both of these properties are UInt64-valued.
        var numPacketsToRead: UInt64 = 0
        getPropertyValue(val: &numPacketsToRead, file: file, prop: kAudioFilePropertyAudioDataPacketCount)

        var byteCount: UInt64 = 0                // total audio data size in bytes (not used below)
        getPropertyValue(val: &byteCount, file: file, prop: kAudioFilePropertyAudioDataByteCount)

        var asbdFormat = AudioStreamBasicDescription()
        getPropertyValue(val: &asbdFormat, file: file, prop: kAudioFilePropertyDataFormat)

        // At this point we should definitely have a data format.
        dataFormatD = AVAudioFormat(streamDescription: &asbdFormat)
        guard let dataFormat = dataFormatD else { return }

        let format = dataFormat.streamDescription.pointee
        let bytesPerPacket = Int(format.mBytesPerPacket)

        // Read the file packet by packet; for linear PCM, one packet is one frame.
        for i in 0 ..< Int(numPacketsToRead) {
            var packetSize = UInt32(bytesPerPacket)
            let packetStart = Int64(i * bytesPerPacket)
            let dataPt = UnsafeMutableRawPointer.allocate(byteCount: bytesPerPacket,
                                                          alignment: MemoryLayout<UInt8>.alignment)
            defer { dataPt.deallocate() }        // don't leak the scratch buffer
            AudioFileReadBytes(file, false, packetStart, &packetSize, dataPt)
            let startPt = dataPt.bindMemory(to: UInt8.self, capacity: bytesPerPacket)
            let buffer = UnsafeBufferPointer(start: startPt, count: Int(packetSize))
            packetsX.append(Data(buffer))
        }
    }

    deinit {
        if let file = playbackFile { AudioFileClose(file) }
    }
}
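A quick usage sketch (WavePacketReader is just the placeholder name given above; the file path is hypothetical):

let url = URL(fileURLWithPath: "recording.wav")   // hypothetical path
let reader = try WavePacketReader(src: url)
print("read \(reader.packetsX.count) packets")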

For example, suppose the WAV file has one channel and a bit depth of 16 bits, so each sample is an Int16.

// each packet then holds two bytes (UInt8), which together encode one Int16 sample
let buffer = UnsafeBufferPointer(start: startPt, count: bytesPerPacket)
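To turn those per-packet bytes into plottable amplitudes, each two-byte packet can be reassembled into a little-endian Int16 (WAV PCM data is little-endian) and normalized to [-1.0, 1.0]. A minimal sketch under the mono/16-bit assumption, using the reader instance from the usage example above:

// Reassemble each 2-byte packet into one Int16 sample, then normalize
// by Int16.max to get a Float amplitude in [-1.0, 1.0] for the waveform.
let amplitudes: [Float] = reader.packetsX.map { packet in
    let raw = UInt16(packet[0]) | (UInt16(packet[1]) << 8)   // little-endian byte order
    return Float(Int16(bitPattern: raw)) / Float(Int16.max)
}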

For more information, you can check my GitHub repo.
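As a side note, separate from the repo above: if you only need normalized Float amplitudes for drawing a waveform, AVFoundation's AVAudioFile can do the decoding for you, since its default processingFormat is deinterleaved Float32 already scaled to [-1.0, 1.0]. A minimal sketch:

import AVFoundation

// Decode an audio file into normalized Float32 samples (first channel only).
func amplitudes(of url: URL) throws -> [Float] {
    let file = try AVAudioFile(forReading: url)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat,
                                        frameCapacity: AVAudioFrameCount(file.length)) else {
        return []
    }
    try file.read(into: buffer)
    guard let channelData = buffer.floatChannelData else { return [] }
    return Array(UnsafeBufferPointer(start: channelData[0], count: Int(buffer.frameLength)))
}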

Mauro answered 11/6, 2021 at 7:46
