Generating a waveform from any music file on iOS
I'm looking for how to draw sound waves according to the music.

I want waves like this image:

(image: stylized waveform with smooth, symmetric peaks)

Here is some discussion about displaying waves from music:

  1. WaveForm on IOS
  2. rendering a waveform on an iphone
  3. audio waveform visualisation with iPhone

Github Example Links

But I'm not getting any idea from these about this type of waveform. Is it possible to draw waves like this image?

East answered 31/10, 2013 at 6:28 Comment(4)
Your image doesn't appear to have any relationship to an actual waveform. Where are you seeing that?Casaubon
I want to display a waveform like this. Please check this image, markhadleyuk.com/wp-content/uploads/2012/01/…East
There are no resources on how to generate a waveform like the ones in your images because they are fake. An audio waveform from a song doesn't look like that. The image in your OP looks like sine waves with a window function. The link in your comment might be real audio data with a low-pass filter, but if you are here asking how to do this, that is way beyond you. Sorry. There is a plethora of information in the links you've posted and on the web. I don't understand what you want for an answer.Dextroamphetamine
You can refer to this: #5033275, and make changes in the image-generating code.Sapling

Disclaimer: a lot of this has been discovered through trial and error, so I may have some serious false assumptions in play here.

You would need to use the Audio Unit framework. When initialising playback you can create an AURenderCallbackStruct. In this struct you can specify a render callback function, which is passed several arguments containing the information you need.

The callback function will have a signature like this:

static OSStatus recordingCallback (void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData) 

In here, `ioData` contains the raw audio sample buffers. From these you can read the amplitude of each sample in the buffer, or run an FFT to get the magnitude or dB value of each frequency bin.
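As a rough sketch of that last step, here is how a buffer's amplitude could be reduced to a single dB value (a minimal sketch; the helper name, and the assumption that one channel of `ioData` has already been copied into a `[Float]`, are mine, not from this answer):

```swift
import Foundation

// Hypothetical helper: given one channel's worth of Float samples from
// ioData, compute the RMS amplitude and convert it to decibels.
// A full-scale square wave (all +/-1.0) comes out at 0 dB; quieter
// buffers come out negative.
func rmsDecibels(_ samples: [Float]) -> Float {
    guard !samples.isEmpty else { return -.infinity }
    let sumOfSquares = samples.reduce(Float(0)) { $0 + $1 * $1 }
    let rms = (sumOfSquares / Float(samples.count)).squareRoot()
    return Float(20 * log10(Double(rms)))
}
```

You could collect one such value per callback invocation to drive a level meter or a scrolling amplitude display.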

I don't know what that graph is showing, but it looks to me like a smoothed display of the amplitudes of each of the sample bins.

Audio Units are not simple, but it's worth playing with them for a while until you get a grip on how they work.

Here is a skeleton of my callback function so you have more of a grasp of what I mean:

EDIT: removed dead link, I've lost this code sorry

Ignominy answered 10/11, 2013 at 12:40 Comment(1)
If you're playing back media in real time, this is the correct answer; you can get the audio data being output through this callback.Ferrara

I, too, have been trying sincerely for the last three months, but I didn't find a solution. For the time being I used static images based on the type of song (static-data songs). I added the images to a UIScrollView and changed the contentOffset based on the current position of the audio.
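The contentOffset update in that approach is just a linear mapping from playback time to horizontal scroll position; a minimal sketch (the function and parameter names are my own, not from this answer):

```swift
import Foundation

// Sketch of the scrolling idea: map the player's current time onto the
// scroll view's horizontal content offset. `contentWidth` would be the
// width in points of the pre-rendered waveform image.
func waveformOffset(currentTime: Double, duration: Double, contentWidth: Double) -> Double {
    guard duration > 0 else { return 0 }
    return (currentTime / duration) * contentWidth
}
```

This would typically be called from a periodic timer while the audio plays, assigning the result to `scrollView.contentOffset.x`.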

Dehydrogenase answered 31/10, 2013 at 6:51 Comment(0)

A little refactoring of the above answers:


import AVFoundation
import CoreGraphics
import Foundation
import UIKit

class WaveGenerator {
    private func readBuffer(_ audioUrl: URL) -> [Float] {
        guard let file = try? AVAudioFile(forReading: audioUrl) else { return [] }

        let audioFormat = file.processingFormat
        let audioFrameCount = UInt32(file.length)
        guard let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: audioFrameCount)
        else { return [] }
        do {
            try file.read(into: buffer)
        } catch {
            print(error)
            return []
        }

        guard let channelData = buffer.floatChannelData else { return [] }
        // Copy the samples into an owned Array; returning an UnsafeBufferPointer
        // into the buffer's memory would dangle once `buffer` is released.
        return Array(UnsafeBufferPointer(start: channelData[0], count: Int(buffer.frameLength)))
    }

    private func generateWaveImage(
        _ samples: [Float],
        _ imageSize: CGSize,
        _ strokeColor: UIColor,
        _ backgroundColor: UIColor
    ) -> UIImage? {
        let drawingRect = CGRect(origin: .zero, size: imageSize)

        UIGraphicsBeginImageContextWithOptions(imageSize, false, 0)

        let middleY = imageSize.height / 2

        guard let context: CGContext = UIGraphicsGetCurrentContext() else { return nil }

        context.setFillColor(backgroundColor.cgColor)
        context.setAlpha(1.0)
        context.fill(drawingRect)
        context.setLineWidth(0.25)

        // Bail out on silence to avoid dividing by zero.
        guard let peak = samples.max(), peak > 0 else { return nil }
        let heightNormalizationFactor = imageSize.height / CGFloat(peak) / 2
        let widthNormalizationFactor = imageSize.width / CGFloat(samples.count)
        context.setStrokeColor(strokeColor.cgColor)
        for index in 0 ..< samples.count {
            let pixel = CGFloat(samples[index]) * heightNormalizationFactor
            let x = CGFloat(index) * widthNormalizationFactor

            // One vertical line per sample, mirrored around the vertical centre.
            context.move(to: CGPoint(x: x, y: middleY - pixel))
            context.addLine(to: CGPoint(x: x, y: middleY + pixel))
        }
        context.strokePath()
        guard let soundWaveImage = UIGraphicsGetImageFromCurrentImageContext() else { return nil }

        UIGraphicsEndImageContext()
        return soundWaveImage
    }

    func generateWaveImage(from audioUrl: URL, in imageSize: CGSize) -> UIImage? {
        let samples = readBuffer(audioUrl)
        let img = generateWaveImage(samples, imageSize, UIColor.blue, UIColor.white)
        return img
    }
}

Usage

let waveGenerator = WaveGenerator()
let url = Bundle.main.url(forResource: "TEST1", withExtension: "mp3")!
let img = waveGenerator.generateWaveImage(from: url, in: CGSize(width: 600, height: 200))
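One caveat worth noting (my addition, not part of the original answer): the drawing loop strokes one vertical line per sample, so a full-length song means millions of strokes. A common mitigation is to reduce the samples to roughly one peak per horizontal pixel before drawing; a sketch, assuming the samples have been copied into a plain `[Float]`:

```swift
import Foundation

// Optional refinement: collapse the raw samples into one peak value per
// output column, so the drawing loop runs once per pixel rather than
// once per sample.
func downsample(_ samples: [Float], to targetCount: Int) -> [Float] {
    guard targetCount > 0, samples.count > targetCount else { return samples }
    let bucketSize = samples.count / targetCount
    return (0 ..< targetCount).map { i in
        let start = i * bucketSize
        let end = min(start + bucketSize, samples.count)
        // Keep the loudest absolute value in each bucket.
        return samples[start ..< end].map { abs($0) }.max() ?? 0
    }
}
```

With an image 600 points wide, `downsample(samples, to: 600)` before drawing would cap the loop at 600 strokes regardless of song length.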
Harwilll answered 1/7, 2021 at 4:19 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.