I am trying to create an image by averaging several other images. To do this, I first darken each image by dividing its pixel values by the number of images being averaged:
    import CoreImage

    extension CIImage {
        func darkenImage(by multiplier: CGFloat) -> CIImage? {
            // Solid color at the multiplier value, cropped to the same extent as the image being darkened
            let divImage = CIImage(color: CIColor(red: multiplier, green: multiplier, blue: multiplier))
            let divImageResized = divImage.cropped(to: self.extent)
            if let divFilter = CIFilter(name: "CIMultiplyBlendMode",
                                        parameters: ["inputImage": self,
                                                     "inputBackgroundImage": divImageResized]) {
                return divFilter.outputImage
            }
            print("Failed to darken image")
            return nil
        }
    }
After this, I add the darkened images together (image 1 plus image 2, then that result plus image 3, and so on):
    extension CIImage {
        func blend(with image: CIImage, blendMode: BlendMode) -> CIImage? {
            // blendMode.format is "CIAdditionCompositing" for the case used here
            if let filter = CIFilter(name: blendMode.format) {
                filter.setDefaults()
                filter.setValue(self, forKey: "inputImage")
                filter.setValue(image, forKey: "inputBackgroundImage")
                return filter.outputImage
            }
            return nil
        }
    }
This code executes and produces a new image, but the more images I average together, the darker the shadows get. The highlights stay at about the same brightness as in each individual image, but the darker parts just get darker and darker. Does anyone know what could be wrong?
To rule out some potential issues, I have also tried darkening the images beforehand in Lightroom and only applying the CIAdditionCompositing filter. This gives the same result, which makes me think that CIAdditionCompositing may not simply be adding up pixel values but using a slightly different algorithm; however, I haven't found any documentation on this. I have also tried changing the darkening multiplier to see if I made a calculation error, but if I darken the images less, the highlights become overexposed when the images are added together again.
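To make my expectation explicit: if CIAdditionCompositing really were a plain per-pixel sum, the averaging should be lossless. For example, with four images darkened by a multiplier of 0.25:

    highlight: 4 × (1.0 × 0.25) = 1.0
    shadow:    4 × (0.2 × 0.25) = 0.2

So a shadow should come back at the same brightness it has in each source frame, yet in practice it comes out noticeably darker.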