Detection of sharpness of a photo

I'm looking for a framework that helps detect the sharpness of a photo. I have read this post, which points to a methodology for doing so, but I'd rather work with a library than get my hands dirty.

In the Core Image documentation, Apple says:

Core Image can analyze the quality of an image and provide a set of filters with optimal settings for adjusting such things as hue, contrast, and tone color, and for correcting for flash artifacts such as red eye. It does all this with one method call on your part.

How can I do the 'analyze image quality' part? I'd love to see some example code.

Neutron answered 5/8, 2015 at 11:5 Comment(2)
Take off the gloves and get ready for the dirtiness! Applying a filter to an image is quite easy, go for it! – Weeks
Questions asking us to recommend or find a book, tool, software library, tutorial or other off-site resource are off-topic for Stack Overflow. – Gumbotil

We did it with the GPUImage framework like this (calculating brightness and sharpness); here are some snippets that might help you:

-(BOOL) calculateBrightness:(UIImage *) image {
    float result = 0;
    int i = 0;
    // Walk every pixel and accumulate its luma (Rec. 601 weights).
    // colorAt:atX:andY: is our own helper that reads one pixel's UIColor.
    for (int y = 0; y < image.size.height; y++) {
        for (int x = 0; x < image.size.width; x++) {
            UIColor *color = [self colorAt:image
                                       atX:x
                                      andY:y];
            const CGFloat *colors = CGColorGetComponents(color.CGColor);
            float r = colors[0];
            float g = colors[1];
            float b = colors[2];
            result += 0.299 * r + 0.587 * g + 0.114 * b;
            i++;
        }
    }
    float brightness = result / (float)i;
    NSLog(@"Image Brightness : %f", brightness);
    // Reject images that are clearly over- or under-exposed.
    if (brightness > 0.8 || brightness < 0.3) {
        return NO;
    }
    return YES;
}

-(BOOL) calculateSharpness:(UIImage *) image {
    // Run GPUImage's Canny edge detector, binarise the result, then apply a
    // two-dimensional distance transform. BinaryImageDistanceTransform and
    // getBinaryImageAsArray: are our own class/helper, not part of GPUImage.
    GPUImageCannyEdgeDetectionFilter *filter = [[GPUImageCannyEdgeDetectionFilter alloc] init];
    BinaryImageDistanceTransform *binImagTrans = [[BinaryImageDistanceTransform alloc] init];
    NSArray *resultArray = [binImagTrans twoDimDistanceTransform:[self getBinaryImageAsArray:[filter imageByFilteringImage:image]]];

    if (resultArray == nil) {
        return NO;
    }

    // Sum the per-column maximum of the distance transform.
    int sum = 0;
    for (int x = 0; x < (int)resultArray.count; x++) {
        NSMutableArray *col = resultArray[x];
        sum += [[col valueForKeyPath:@"@max.intValue"] intValue];
    }

    NSLog(@"Image Sharp : %i", sum);
    if (sum < 26250000) { // empirically tested - bad sharpness is below ca. 26250000
        return NO;
    }
    return YES;
}

But it is very slow: it takes about 40 seconds for one image from the iPad camera.
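
Most of that time goes into building a UIColor for every pixel. A rough sketch of a faster brightness pass (illustrative only, not the snippet we actually used): draw the image once into an 8-bit RGBA bitmap with Core Graphics and walk the raw bytes:

#import <UIKit/UIKit.h>

-(float) fastAverageBrightness:(UIImage *) image {
    CGImageRef cgImage = image.CGImage;
    size_t width = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);
    size_t bytesPerRow = width * 4;
    uint8_t *pixels = calloc(height * bytesPerRow, 1);

    // Render the image once into a bitmap we can read directly.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, bytesPerRow,
                                             colorSpace, kCGImageAlphaPremultipliedLast);
    CGContextDrawImage(ctx, CGRectMake(0, 0, width, height), cgImage);

    // Accumulate Rec. 601 luma over all pixels.
    double sum = 0;
    for (size_t p = 0; p < width * height; p++) {
        sum += 0.299 * pixels[p * 4] + 0.587 * pixels[p * 4 + 1] + 0.114 * pixels[p * 4 + 2];
    }

    CGContextRelease(ctx);
    CGColorSpaceRelease(colorSpace);
    free(pixels);

    // Normalise to 0..1 so the same 0.3 / 0.8 thresholds as above apply.
    return (float)(sum / (double)(width * height) / 255.0);
}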

Grieve answered 20/8, 2015 at 8:12 Comment(1)
What is BinaryImageDistanceTransform? I can't find it in the GPUImage library. – Valera

Perhaps the best way to do this is the Polar Edge Coherence metric:

Baroncini, V., et al. "The polar edge coherence: a quasi blind metric for video quality assessment." EUSIPCO 2009, Glasgow (2009): 564-568.

It works just as well for images as for video. This directly measures the sharpness of edges. If you apply a sharpening filter you can compare the before and after values, and if you overdo the sharpening the values will start dropping again. It requires doing a couple of convolutions using complex-valued kernels as described in the paper.
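
A minimal sketch of that before/after comparison (the helper below is hypothetical; the metric block stands in for the polar edge coherence computation from the paper, which is not implemented here):

#import <UIKit/UIKit.h>

// The metric block stands in for the polar edge coherence value from the paper.
typedef CGFloat (^SharpnessMetric)(UIImage *image);

// YES if the metric improved after sharpening; a drop means the
// sharpening was overdone (or the image gained nothing from it).
static BOOL sharpeningImproved(UIImage *before, UIImage *after, SharpnessMetric metric) {
    return metric(after) > metric(before);
}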

Unplaced answered 17/8, 2015 at 8:25 Comment(0)

I don't think Core Image will help you. You could use the auto enhancement feature to get an array of proposed filters and values. There's no sharpness (edge contrast) filter however, just overall image contrast. See the full list here.
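
For completeness, asking Core Image for its proposed adjustments looks roughly like this (a sketch using autoAdjustmentFiltersWithOptions:; the function name and the assumption that you already have a CIImage are mine):

#import <CoreImage/CoreImage.h>

// Ask Core Image which adjustment filters it recommends, then apply them.
// Note: none of the suggested filters measures or corrects sharpness.
static CIImage *autoEnhanced(CIImage *inputImage) {
    CIImage *adjusted = inputImage;
    NSArray *filters = [inputImage autoAdjustmentFiltersWithOptions:nil];
    for (CIFilter *filter in filters) {
        NSLog(@"Suggested filter: %@", filter.name);
        [filter setValue:adjusted forKey:kCIInputImageKey];
        adjusted = filter.outputImage;
    }
    return adjusted;
}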

There's an Apple vDSP API which can do Fast Fourier Transform:

The vDSP API provides mathematical functions for applications such as speech, sound, audio, and video processing, diagnostic medical imaging, radar signal processing, seismic analysis, and scientific data processing.

You should be able to use it to analyze your image.
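
For example, here is a very rough sketch of that idea: FFT one row of grayscale values with vDSP and look at how much energy sits in the upper frequency bins (a blurry image has little). This is only an illustration; a real implementation would window the data and use a 2-D FFT, and `row`/`n` (a power of two) are assumed to come from your own pixel extraction:

#include <Accelerate/Accelerate.h>
#include <math.h>
#include <stdlib.h>

static float highFrequencyEnergyRatio(const float *row, vDSP_Length n) {
    vDSP_Length log2n = (vDSP_Length)log2((double)n);
    FFTSetup setup = vDSP_create_fftsetup(log2n, kFFTRadix2);

    // Pack the real samples into the split-complex layout vDSP expects.
    float *realp = calloc(n / 2, sizeof(float));
    float *imagp = calloc(n / 2, sizeof(float));
    DSPSplitComplex split = { realp, imagp };
    vDSP_ctoz((const DSPComplex *)row, 2, &split, 1, n / 2);

    // In-place real-to-complex forward FFT.
    vDSP_fft_zrip(setup, &split, 1, log2n, FFT_FORWARD);

    // Squared magnitude per frequency bin.
    float *mags = calloc(n / 2, sizeof(float));
    vDSP_zvmags(&split, 1, mags, 1, n / 2);

    // Compare the energy in the upper half of the bins with the total
    // (bin 0 is skipped because zrip packs DC and Nyquist into it).
    float total = 0, high = 0;
    vDSP_sve(mags + 1, 1, &total, n / 2 - 1);
    vDSP_sve(mags + n / 4, 1, &high, n / 4);

    vDSP_destroy_fftsetup(setup);
    free(realp); free(imagp); free(mags);
    return total > 0 ? high / total : 0;
}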

For a conceptual overview, see Using Fourier Transforms, and search for tutorials on vDSP. There are also Q&As about it here on Stack Overflow.

Aga answered 14/8, 2015 at 9:16 Comment(0)