HDR images through Core Image?
Is it possible to process (filter) HDR images through Core Image? I couldn't find much documentation on this, so I was wondering if someone had an answer. I do know that it is possible to do the working-space computations in RGBAh when you initialize a CIContext, so I figured that if computations can be done in floating-point image formats, it should be possible.
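For reference, here is a minimal sketch of the half-float working format mentioned above, using Core Image's documented `CIContextOption` keys (the extended linear color space is my own assumption about what you'd pair it with; requires iOS 10+):

```swift
import CoreImage

// Ask Core Image to do its intermediate math in 16-bit float (RGBAh),
// so values above 1.0 are not clipped between filter stages.
let context = CIContext(options: [
    .workingFormat: CIFormat.RGBAh,
    .workingColorSpace: CGColorSpace(name: CGColorSpace.extendedLinearSRGB)!
])
```

This only controls the precision of intermediate computations; it does not by itself solve how to get an HDR file decoded into a CIImage in the first place.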

If it is not possible, what are the alternatives for producing HDR effects on iOS?

EDIT: To be a bit more concise: my understanding is that HDR images can be saved as .jpg, .png, and other image formats by clamping the pixel values. However, I'm more interested in doing tone mapping through Core Image on an HDR image that has not been converted yet. The issue is creating a CIImage from an HDR image, presumably one with the .hdr extension.

EDIT 2: Maybe it would be useful to use CGImageCreate along with CGDataProviderCreateWithFilename?
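A sketch of that idea in Swift: `CGImageCreate` needs the pixel layout spelled out explicitly, so this assumes a file of raw 32-bit-float RGBA pixels with known dimensions (the path and size here are hypothetical, and a real .hdr file would first need decoding into such a buffer):

```swift
import CoreGraphics
import CoreImage

// Hypothetical raw dump of 128-bit float RGBA pixels, 512x512.
let width = 512, height = 512
let provider = CGDataProvider(filename: "/tmp/example_float_rgba.raw")!
let cgImage = CGImage(width: width,
                      height: height,
                      bitsPerComponent: 32,
                      bitsPerPixel: 128,
                      bytesPerRow: width * 16,
                      space: CGColorSpace(name: CGColorSpace.extendedLinearSRGB)!,
                      bitmapInfo: CGBitmapInfo(rawValue: CGBitmapInfo.floatComponents.rawValue |
                                               CGImageAlphaInfo.premultipliedLast.rawValue),
                      provider: provider,
                      decode: nil,
                      shouldInterpolate: false,
                      intent: .defaultIntent)!
// Wrap it for Core Image; float components survive into the filter chain.
let ciImage = CIImage(cgImage: cgImage)
```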

Peppercorn answered 16/8, 2016 at 6:23 Comment(0)
I hope you have a basic understanding of how HDR works. An HDR file is generated by capturing two or more images at different exposures and combining them. So even if there were something like an .HDR file, it would be a container format with more than one JPEG in it. Technically, you cannot give two image files at once as input to a generic CIFilter.

And in iOS, as I remember, it's not possible to access the original set of photos behind an HDR capture, only the processed final output. Even if you could, you'd have to do the HDR merge manually and generate a single HDR PNG/JPEG anyway before feeding it to a CIFilter.

Generalissimo answered 19/8, 2016 at 4:6 Comment(3)
Once combined, though, can't the resulting file be used in any way to supply pixel information? Suppose we have such a file externally, like the .hdr files I've seen used in sample MATLAB code for tone mapping; can't one simply supply this file and encode the resulting image into a CIImage? Maybe tone mapping HDR images just isn't the purpose of Core Image, then? Even though it technically is possible with custom filters.Peppercorn
@DavidSacco The first problem is that in iOS there's no way to access the originally captured array of images with different exposures. And once the final HDR is processed, it doesn't hold the pixel information of the source files, because it's technically impossible to store such data in 3 RGB channels. In iOS, "HDR"-labeled images are nothing but JPEGs that have already been processed from multiple photos with different exposures.Generalissimo
Thanks, yeah, from my own google-fu this is also what I've learned. I'm wondering if any of the new iOS 10 features pertaining to RAW images might help. I also learned that HDR images can be saved in a floating-point .TIFF file, where high intensities are preserved; it's just that the user would have to supply said image if they wanted to use tone mapping. It's also possible to make sure that your context computes in floating-point precision with the option kCIContextWorkingFormat.Peppercorn
Since there are people asking for a Core Image HDR algorithm, I decided to share my code on GitHub:

https://github.com/schulz0r/CoreImage-HDR

It implements the Robertson HDR algorithm, so you cannot use RAW images. Please see the unit tests if you want to know how to get the camera response and obtain the HDR image. Core Image clamps pixel values outside [0.0 ... 1.0], so the HDR result is scaled into that interval. Coding with Metal always leads to messy code for me, so I decided to use MetalKitPlus, which you have to include in your project. You can find it here:

https://github.com/LRH539/MetalKitPlus

I think you have to check out the dev/v2.0.0 branch. I will merge this into master in the future.

edit: Just clone the master branch of MetalKitPlus. Also, I added a more detailed description to my CI-HDR project.
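For readers who just want the gist of the merge step: the repo's details live in its unit tests, but the Robertson estimator for a single pixel channel can be sketched in plain Swift like this (the Gaussian-style weighting function here is an illustrative assumption, not the repo's exact code):

```swift
import Foundation

// One pixel channel across N bracketed shots: 8-bit values z[i] taken
// at exposure times t[i]; g maps a pixel value to linear radiance
// (the camera response recovered by the Robertson iteration).
func mergeHDR(values z: [UInt8], times t: [Double], response g: [Double]) -> Double {
    // Weight mid-range values highest; near-black/near-white pixels least.
    func w(_ v: UInt8) -> Double {
        let d = Double(v) - 127.5
        return exp(-4.0 * d * d / (127.5 * 127.5))
    }
    var num = 0.0, den = 0.0
    for (zi, ti) in zip(z, t) {
        num += w(zi) * ti * g[Int(zi)]   // Robertson numerator: w(z) * t * g(z)
        den += w(zi) * ti * ti           // denominator: w(z) * t^2
    }
    return den > 0 ? num / den : 0
}
```

The resulting radiance estimates are then rescaled into [0.0 ... 1.0] before rendering, as noted above, since Core Image clamps values outside that interval.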

Cranium answered 27/2, 2018 at 14:3 Comment(0)
You can now (iOS 10+) capture RAW images (coded on 12 bits) and then filter them the way you like using a CIFilter. You might not get a dynamic range as wide as the one you get with bracketed captures; nevertheless, it is still wider than capturing 8-bit images.

Check Apple's documentation for capturing and processing RAW images.

I also recommend watching Apple's WWDC 2016 session video on the topic (skip ahead to the RAW processing part).
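A minimal sketch of that RAW pipeline, using Core Image's documented RAW support from iOS 10 (the DNG path is hypothetical; you'd get the file from AVCapturePhotoOutput):

```swift
import CoreImage

// Hypothetical URL to a DNG captured with AVCapturePhotoOutput.
let rawURL = URL(fileURLWithPath: "/path/to/photo.dng")

// CIFilter(imageURL:options:) builds Core Image's RAW-development filter.
if let rawFilter = CIFilter(imageURL: rawURL, options: nil) {
    rawFilter.setValue(1.0, forKey: kCIInputEVKey)  // push exposure +1 EV
    if let developed = rawFilter.outputImage {
        // `developed` can now be chained into further CIFilters or rendered.
        print(developed.extent)
    }
}
```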

Crocket answered 15/11, 2018 at 11:57 Comment(0)
