I'm using CICrop to crop an image to a certain size by cutting off the top and bottom of the image.
Afterwards, I apply something like the CIMultiplyCompositing filter to combine the cropped image with another image.
Both images are the same size, but the result shows that the two images don't line up: one is offset.
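Roughly, my setup looks like this (the filter keys are the standard ones for these filters; the crop rect values and variable names are just illustrative):

CIFilter *cropFilter = [CIFilter filterWithName:@"CICrop"];
[cropFilter setValue:originalImage forKey:@"inputImage"];
// Crop rect packed into a CIVector as x, y, width, height
[cropFilter setValue:[CIVector vectorWithX:0 Y:136 Z:3264 W:2176]
              forKey:@"inputRectangle"];
CIImage *imageToFilter = [cropFilter valueForKey:@"outputImage"];

CIFilter *multiplyFilter = [CIFilter filterWithName:@"CIMultiplyCompositing"];
[multiplyFilter setValue:imageToFilter forKey:@"inputImage"];
[multiplyFilter setValue:secondImage forKey:@"inputBackgroundImage"];
CIImage *result = [multiplyFilter valueForKey:@"outputImage"];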
So, I checked the following:
NSLog(@"image after crop: %g, %g, %g, %g", imageToFilter.extent.origin.x,
imageToFilter.extent.origin.y,
imageToFilter.extent.size.width,
imageToFilter.extent.size.height);
NSLog(@"second image: %g, %g, %g, %g", secondImage.extent.origin.x,
secondImage.extent.origin.y,
secondImage.extent.size.width,
secondImage.extent.size.height);
This shows that the origin.y of the cropped image has the offset I'm seeing (a result of using CICrop):
image after crop: 0, 136, 3264, 2176
second image: 0, 0, 3264, 2176
So, is there any way for me to reset the cropped image's "extent" rect so that origin.y is zero? Checking the docs for CIImage, "extent" is a read-only property.
Or am I going to have to do some horribly inefficient conversion to another image type/raw data and then back to a CIImage?
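The only other idea I've had is shifting the cropped image back to the origin with a translation transform, along these lines (a rough sketch, reusing imageToFilter from above; I'm not sure whether this is the intended approach or whether it costs an extra render pass):

CGRect cropExtent = imageToFilter.extent;
// Translate so the extent's origin lands at (0, 0)
CGAffineTransform shift = CGAffineTransformMakeTranslation(-cropExtent.origin.x,
                                                           -cropExtent.origin.y);
CIImage *repositioned = [imageToFilter imageByApplyingTransform:shift];
// repositioned.extent should now be (0, 0, 3264, 2176)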
Thanks for any advice.