Core Image: after using CICrop, applying a compositing filter doesn't line up

I'm using CICrop to crop an image to a certain size by cutting off the top and bottom of the image.

Afterwards, I apply something like the CIMultiplyCompositing filter, to combine the cropped image with another image.

Both images are the same size, however the result shows that the two images don't line up... one is offset.
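
For context, the setup looks roughly like this (sourceImage and secondImage stand in for my two CIImages, and the crop rectangle is just an example that trims the top and bottom):

CIFilter *crop = [CIFilter filterWithName:@"CICrop"];
[crop setValue:sourceImage forKey:kCIInputImageKey];
[crop setValue:[CIVector vectorWithCGRect:CGRectMake(0, 136, 3264, 2176)]
        forKey:@"inputRectangle"];
CIImage *imageToFilter = crop.outputImage;   // the cropped image

CIFilter *multiply = [CIFilter filterWithName:@"CIMultiplyCompositing"];
[multiply setValue:imageToFilter forKey:kCIInputImageKey];
[multiply setValue:secondImage forKey:kCIInputBackgroundImageKey];
CIImage *result = multiply.outputImage;      // the composite that comes out offset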

So, I checked the following:

NSLog(@"image after crop: %g, %g, %g, %g", imageToFilter.extent.origin.x,
                       imageToFilter.extent.origin.y,
                       imageToFilter.extent.size.width,
                       imageToFilter.extent.size.height);

NSLog(@"second image: %g, %g, %g, %g", secondImage.extent.origin.x,
                       secondImage.extent.origin.y,
                       secondImage.extent.size.width,
                       secondImage.extent.size.height);

That shows that the cropped image's origin.y has the offset I'm seeing (a result of using CICrop):

image after crop: 0, 136, 3264, 2176
second image: 0, 0, 3264, 2176

So, is there any way for me to reset the cropped image's "extent" rect so that origin.y is zero? Checking the docs for CIImage, "extent" is a read-only property.

Or am I going to have to do some horribly inefficient conversion to another image type/raw data and then back to a CIImage?

Thanks for any advice.

Intercollegiate answered 17/11/2011 at 16:05

I figured out the answer to this, and it's an easy one: I just needed to apply a translation transform to the CIImage after cropping it, like so:

imageToFilter = [imageToFilter imageByApplyingTransform:CGAffineTransformMakeTranslation(0, -imageToFilter.extent.origin.y)];

That effectively moves its y origin back to 0.
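
Once the extent is normalized, the compositing step from the question lines up. A rough sketch, reusing the variable names from above (which image goes in which slot depends on the effect you want):

CIFilter *multiply = [CIFilter filterWithName:@"CIMultiplyCompositing"];
[multiply setValue:imageToFilter forKey:kCIInputImageKey];          // cropped + translated image
[multiply setValue:secondImage forKey:kCIInputBackgroundImageKey];  // untouched second image
CIImage *result = multiply.outputImage;                             // both extents now start at y == 0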

Intercollegiate answered 20/12/2011 at 6:21
Thanks for figuring this out, it really helped. There is a small mistake, however: you need to normalize the X extent as well:

imageToFilter = [imageToFilter imageByApplyingTransform:CGAffineTransformMakeTranslation(-imageToFilter.extent.origin.x, -imageToFilter.extent.origin.y)];

Weird commented

You can also wrap this up as an extension in Swift:

import CoreImage

extension CIImage {
  var correctedExtent: CIImage {
    let toTransform = CGAffineTransform(translationX: -self.extent.origin.x, y: -self.extent.origin.y)
    return self.transformed(by: toTransform)
  }
}

And you can use it as:

let corrected = ciImage.correctedExtent
Vary answered 16/7/2020 at 12:21
