is there any way to detect shape contours in Swift without having to branch out to OpenCV?
Obviously, we can use Core Image to convert an image to grayscale, denoise it, and blur it, but when it comes to Canny edge detection, contour detection, and drawing a shape's contour, it seems this is not possible using Swift and Apple's frameworks alone. Am I right, or is there something I'm missing here?

Resa answered 12/8, 2018 at 1:56
In June 2020 Apple added a new contour detection feature to its Vision framework in iOS 14 (then in beta), so you can now use Swift (without OpenCV) to detect contours.

You can find more details in this documentation from Apple: https://developer.apple.com/documentation/vision/vndetectcontoursrequest

Also relevant is this WWDC2020 video: https://developer.apple.com/videos/play/wwdc2020/10673/
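As a minimal sketch of the API described above (assumes iOS 14+ and an existing `CGImage`; the function name is just an example):

```swift
import Vision

// Minimal sketch: run Vision contour detection on a CGImage
// and return how many contours were found.
func contourCount(in cgImage: CGImage) throws -> Int {
    let request = VNDetectContoursRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])
    // On iOS 14+, the results of this request are VNContoursObservation values.
    let observation = request.results?.first as? VNContoursObservation
    return observation?.contourCount ?? 0
}
```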

Joacimah answered 30/6, 2020 at 11:23
Now you can use VNDetectContoursRequest to perform contour detection with SwiftUI on iOS 14:

// Requires: import Vision, UIKit, CoreImage
func detectVisionContours() {
    guard let sourceImage = UIImage(named: "coin"),
          let cgImage = sourceImage.cgImage else {
        self.points = "Could not load image"
        return
    }

    let inputImage = CIImage(cgImage: cgImage)

    let contourRequest = VNDetectContoursRequest()
    contourRequest.revision = VNDetectContourRequestRevision1
    contourRequest.contrastAdjustment = 1.0
    contourRequest.detectsDarkOnLight = true
    contourRequest.maximumImageDimension = 512

    let requestHandler = VNImageRequestHandler(ciImage: inputImage, options: [:])

    do {
        try requestHandler.perform([contourRequest])
        if let contoursObservation = contourRequest.results?.first as? VNContoursObservation {
            self.points = String(contoursObservation.contourCount)
            self.contouredImage = drawContours(contoursObservation: contoursObservation,
                                               sourceImage: cgImage)
        }
    } catch {
        self.points = "Contour detection failed: \(error)"
    }
}

And to draw the contours on top of the source image:

public func drawContours(contoursObservation: VNContoursObservation, sourceImage: CGImage) -> UIImage {
    let size = CGSize(width: sourceImage.width, height: sourceImage.height)
    let renderer = UIGraphicsImageRenderer(size: size)

    let renderedImage = renderer.image { (context) in
        let renderingContext = context.cgContext

        // Flip vertically: Core Graphics uses a bottom-left origin,
        // while UIKit uses a top-left origin.
        let flipVertical = CGAffineTransform(a: 1, b: 0, c: 0, d: -1, tx: 0, ty: size.height)
        renderingContext.concatenate(flipVertical)

        renderingContext.draw(sourceImage, in: CGRect(x: 0, y: 0, width: size.width, height: size.height))

        // Vision returns the contour path in normalized coordinates (0...1),
        // so scale the context up to image size before stroking.
        renderingContext.scaleBy(x: size.width, y: size.height)
        renderingContext.setLineWidth(5.0 / CGFloat(size.width))
        renderingContext.setStrokeColor(UIColor.red.cgColor)
        renderingContext.addPath(contoursObservation.normalizedPath)
        renderingContext.strokePath()
    }

    return renderedImage
}
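If you need the individual contours rather than the single combined `normalizedPath` (for example, to pick out or simplify one shape), the observation exposes them as `VNContour` objects. A hedged sketch (the epsilon value is an arbitrary example, tune it for your images):

```swift
import Vision

// Sketch: walk the top-level contours of a VNContoursObservation.
// polygonApproximation(epsilon:) returns a simplified contour.
func describeContours(_ observation: VNContoursObservation) {
    for contour in observation.topLevelContours {
        print("contour: \(contour.pointCount) points, \(contour.childContourCount) children")
        if let simplified = try? contour.polygonApproximation(epsilon: 0.01) {
            print("  simplified to \(simplified.pointCount) points")
        }
    }
}
```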


Cruiser answered 10/9, 2021 at 11:36
