Find the area covered with the same pixel colour and change it

How can I erase the colour of a particular pixel and the surrounding area covered with the same colour? I am able to get the colour of a particular pixel using the Swift code below, but I am not able to find the surrounding area that has the same pixel colour, and I am unable to erase it.

import UIKit

class ColorOfImage: UIImageView {
    
    var lastColor:UIColor? = nil


    
    /*
     // Only override draw() if you perform custom drawing.
     // An empty implementation adversely affects performance during animation.
     override func draw(_ rect: CGRect) {
     // Drawing code
     }
     */
    
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        
        if self.isHidden {
            self.next?.touchesEnded(touches, with: event)
            return
        }

        guard let touch = touches.first else { return }

        let point = touch.location(in: self)
        self.lastColor = self.getPixelColorAtLocation(point: point)
    }
    
    
    public func createARGBBitmapContext(inImage: CGImage) -> CGContext? {

        // Get the image width and height in pixels.
        let pixelsWide = inImage.width
        let pixelsHigh = inImage.height

        // Declare the number of bytes per row. Each pixel in the bitmap is
        // represented by 4 bytes: 8 bits each of alpha, red, green, and blue.
        let bitmapBytesPerRow = pixelsWide * 4
        let bitmapByteCount = bitmapBytesPerRow * pixelsHigh

        // Use the generic RGB color space.
        let colorSpace = CGColorSpaceCreateDeviceRGB()

        // Allocate memory for the image data. This is the destination in memory
        // where any drawing to the bitmap context will be rendered.
        let bitmapData = malloc(bitmapByteCount)

        // Create the bitmap context. We want pre-multiplied ARGB, 8 bits per
        // component. Regardless of what the source image format is (CMYK,
        // grayscale, and so on) it will be converted to the format specified
        // here when it is drawn into the context.
        let context = CGContext(data: bitmapData,
                                width: pixelsWide,
                                height: pixelsHigh,
                                bitsPerComponent: 8,
                                bytesPerRow: bitmapBytesPerRow,
                                space: colorSpace,
                                bitmapInfo: CGImageAlphaInfo.premultipliedFirst.rawValue)
        if context == nil {
            // Context creation failed, so release the buffer we allocated for it.
            free(bitmapData)
        }
        return context
    }
    
    
    
    
    public func getPixelColorAtLocation(point: CGPoint) -> UIColor {
        // Create an off-screen bitmap context to draw the image into.
        // Format ARGB is 4 bytes per pixel: alpha, red, green, blue.
        var point = point

        guard let cgImage = self.image?.cgImage,
              let context = self.createARGBBitmapContext(inImage: cgImage) else {
            return UIColor.white
        }

        let pixelsWide = cgImage.width
        let pixelsHigh = cgImage.height
        let rect = CGRect(x: 0, y: 0, width: pixelsWide, height: pixelsHigh)

        // Convert the touch point from view coordinates to image pixel coordinates.
        let xScale = CGFloat(pixelsWide) / self.frame.size.width
        let yScale = CGFloat(pixelsHigh) / self.frame.size.height
        point.x = point.x * xScale
        point.y = point.y * yScale

        // Clear the context, then draw the image into it. Once we draw, the
        // memory allocated for the context contains the raw image data in the
        // specified color space.
        context.clear(rect)
        context.draw(cgImage, in: rect)

        // Now we can get a pointer to the image data associated with the bitmap context.
        guard let data = context.data else {
            return UIColor.white
        }
        let dataType = data.assumingMemoryBound(to: UInt8.self)

        // Make sure the (scaled) point actually lies inside the bitmap.
        let x = Int(point.x)
        let y = Int(point.y)
        guard x >= 0, x < pixelsWide, y >= 0, y < pixelsHigh else {
            free(data)
            return UIColor.white
        }

        // ARGB layout: alpha first, then red, green, blue.
        let offset = 4 * ((pixelsWide * y) + x)
        let alpha = dataType[offset]
        let red   = dataType[offset + 1]
        let green = dataType[offset + 2]
        let blue  = dataType[offset + 3]
        let color = UIColor(red: CGFloat(red) / 255.0,
                            green: CGFloat(green) / 255.0,
                            blue: CGFloat(blue) / 255.0,
                            alpha: CGFloat(alpha) / 255.0)

        // Free the image data memory that was allocated for the context.
        free(data)
        return color
    }
}

For example, if I touch the UIImage at particular coordinates, it has to get the colour at that point and then erase that pixel and all the surrounding pixels that have the same colour. Can anyone please help me with this?
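To make the goal concrete, the kind of flood-fill erase I have in mind would look roughly like the sketch below. It is untested and only illustrative: the function name, the tolerance parameter and the 4-way fill are my own, and the start point is assumed to already be converted to image pixel coordinates, scaled the same way as in getPixelColorAtLocation above.

import UIKit

// Returns a copy of `image` in which the pixel at `start` and every connected pixel
// of (approximately) the same colour is made transparent.
// `start` is in pixel coordinates of the underlying CGImage; `tolerance` is 0...255 per channel.
func erasingConnectedRegion(in image: UIImage, at start: CGPoint, tolerance: Int = 0) -> UIImage? {
    guard let cgImage = image.cgImage else { return nil }
    let width = cgImage.width
    let height = cgImage.height

    // Draw the image into an RGBA8 bitmap context so the raw pixel bytes can be read and modified.
    guard let context = CGContext(data: nil,
                                  width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: 0,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue),
          let raw = context.data else { return nil }
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))

    let pixels = raw.assumingMemoryBound(to: UInt8.self)
    let bytesPerRow = context.bytesPerRow

    let startX = Int(start.x)
    let startY = Int(start.y)
    guard (0..<width).contains(startX), (0..<height).contains(startY) else { return nil }

    func byteOffset(_ x: Int, _ y: Int) -> Int {
        return y * bytesPerRow + x * 4
    }

    // The colour we are matching: RGBA of the touched pixel.
    let s = byteOffset(startX, startY)
    let target = (pixels[s], pixels[s + 1], pixels[s + 2], pixels[s + 3])

    func matches(_ x: Int, _ y: Int) -> Bool {
        let o = byteOffset(x, y)
        return abs(Int(pixels[o])     - Int(target.0)) <= tolerance &&
               abs(Int(pixels[o + 1]) - Int(target.1)) <= tolerance &&
               abs(Int(pixels[o + 2]) - Int(target.2)) <= tolerance &&
               abs(Int(pixels[o + 3]) - Int(target.3)) <= tolerance
    }

    // 4-way flood fill with an explicit stack (avoids deep recursion on large areas).
    var visited = [Bool](repeating: false, count: width * height)
    var stack = [(startX, startY)]
    visited[startY * width + startX] = true

    while let (x, y) = stack.popLast() {
        // "Erase" the current pixel by making it fully transparent.
        let o = byteOffset(x, y)
        pixels[o] = 0
        pixels[o + 1] = 0
        pixels[o + 2] = 0
        pixels[o + 3] = 0

        // Queue the four neighbours that have (roughly) the same colour.
        for (nx, ny) in [(x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)] {
            guard (0..<width).contains(nx), (0..<height).contains(ny),
                  !visited[ny * width + nx], matches(nx, ny) else { continue }
            visited[ny * width + nx] = true
            stack.append((nx, ny))
        }
    }

    // Rebuild a UIImage from the modified bitmap.
    guard let erased = context.makeImage() else { return nil }
    return UIImage(cgImage: erased, scale: image.scale, orientation: image.imageOrientation)
}

The idea is the usual flood fill, except that matching pixels are made transparent instead of recoloured; the returned image would then replace self.image so the erased area shows through.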

Crat answered 18/1, 2017 at 6:26 Comment(0)

A month ago I worked on an app where you get an image and have to swap colors... I don't know if it could be useful for you. I made a category on UIImage for it.

Anyway, here is the code in Objective-C:

#import <UIKit/UIKit.h>

@interface UIImage (YPKInterface)
+ (UIImage *)swapImage:(UIImage *)image color:(UIColor *)originalColor withColor:(UIColor *)swappedColor andThreshold:(float)threshold;
@end

#import "UIImage+YPKInterface.h"

@implementation UIImage (YPKInterface)

// Release callback for the data provider created below, so the pixel buffer is
// freed once the CGImage no longer needs it.
static void YPKReleaseRawData(void *info, const void *data, size_t size) {
    free((void *)data);
}

+ (UIImage *)swapImage:(UIImage *)image color:(UIColor *)originalColor withColor:(UIColor *)swappedColor andThreshold:(float)threshold {

int count = image.size.width*image.size.height;

// Convert image in raw data
CGImageRef imageRef = [image CGImage];
NSUInteger width = CGImageGetWidth(imageRef);
NSUInteger height = CGImageGetHeight(imageRef);
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
unsigned char *rawData = (unsigned char*) calloc(height * width * 4, sizeof(unsigned char));
NSUInteger bytesPerPixel = 4;
NSUInteger bytesPerRow = bytesPerPixel * width;
NSUInteger bitsPerComponent = 8;
CGContextRef context = CGBitmapContextCreate(rawData, width, height, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);

CGColorSpaceRelease(colorSpace);

CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
CGContextRelease(context);

// Scan all the pixels
NSUInteger byteIndex = 0;
for (int i = 0 ; i < count ; ++i)
{
    CGFloat alpha = ((CGFloat) rawData[byteIndex + 3]) / 255.0f;
    // Skip fully transparent pixels so we do not divide by zero when un-premultiplying.
    if (alpha == 0) {
        byteIndex += bytesPerPixel;
        continue;
    }
    CGFloat red   = ((CGFloat) rawData[byteIndex]    ) / alpha;
    CGFloat green = ((CGFloat) rawData[byteIndex + 1]) / alpha;
    CGFloat blue  = ((CGFloat) rawData[byteIndex + 2]) / alpha;

    // Change the color

    // RGBA of the colors
    float origR = CGColorGetComponents(originalColor.CGColor)[0] * 255;
    float origG = CGColorGetComponents(originalColor.CGColor)[1] * 255;
    float origB = CGColorGetComponents(originalColor.CGColor)[2] * 255;
    //float origA = CGColorGetComponents(originalColor.CGColor)[3];

    float swapR = CGColorGetComponents(swappedColor.CGColor)[0] * 255;
    float swapG = CGColorGetComponents(swappedColor.CGColor)[1] * 255;
    float swapB = CGColorGetComponents(swappedColor.CGColor)[2] * 255;
    //float swapA = CGColorGetComponents(swappedColor.CGColor)[3];

    if (red >= origR - threshold && red <= origR + threshold &&
        green >= origG - threshold && green <= origG + threshold &&
        blue >= origB - threshold && blue <= origB + threshold) {
        rawData[byteIndex + 3] = alpha * 255;
        rawData[byteIndex    ] = swapR * alpha;
        rawData[byteIndex + 1] = swapG * alpha;
        rawData[byteIndex + 2] = swapB * alpha;
    }

    byteIndex += bytesPerPixel;
}

// Wrap the modified bytes in a new CGImage. The buffer is premultiplied RGBA, so
// describe it that way, and hand ownership of rawData to the data provider via the
// release callback defined above.
CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, rawData, width*height*4, YPKReleaseRawData);

CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();

CGBitmapInfo bitmapInfo = kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast;

CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;
CGImageRef imageRef2 = CGImageCreate(width, height, 8, 32, 4*width, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);

UIImage *newImage = [UIImage imageWithCGImage:imageRef2];

// Release the Core Graphics objects we created; the UIImage keeps its own references.
CGDataProviderRelease(provider);
CGColorSpaceRelease(colorSpaceRef);
CGImageRelease(imageRef2);

return newImage;
}

@end
Selfcontradiction answered 30/1, 2017 at 10:16 Comment(3)
This will change the colour starting right from the 0,0 coordinates, over the whole image. Instead of that, I want something like a flood-fill algorithm: get the pixel colour at particular coordinates, find the area covered with the same coloured pixels starting from those coordinates, and then erase it. Crat
Please check the video below: whenever he clicked on the image, the green pixels got selected, and I am looking for the same behaviour. When I click on the image, the colour of the pixel at the touch location and the connected pixels of the same colour have to be selected. youtube.com/watch?v=DG38mAWKE1c Crat
Can anyone please help me with this? Crat
