Loading 4-channel texture data in iOS

I want to load 4-channel texture data from a file in iOS, so I consider the texture as a (continuous) map

[0,1]x[0,1] -> [0,1]x[0,1]x[0,1]x[0,1]

If I use the file format .png, Xcode/iOS treat the file as an image, and so multiply each RGB component by alpha (premultiplied alpha), corrupting my data. How should I solve this? Possible approaches include:

  • use two 3-channel (RGB) textures
  • post-divide by alpha
  • use another file format

Of these, I consider the best solution to be using another file format. The GL compressed texture format (PVRTC?) is not platform independent beyond Apple/PowerVR hardware and seems to offer only low precision (4 bits per pixel) (reference).

EDIT: If my own answer below is true, it is not possible to get the 4-channel data of PNGs in iOS. Since OpenGL is about creating images rather than presenting images, it should be possible to load 4-channel data in some way. PNG is a file format for images (compression depends on all 4 channels, but compression of one channel is independent of the other channels), so one may argue that I should use another file format. So which other compressed file format should I use that is easy to read/integrate in iOS?

UPDATE: "combinatorial" mentioned a way to load 4-channel non-premultiplied textures, so I had to give him the correct answer. However, that solution had some restrictions I didn't like. My next question is then "Access raw 4-channel data from png files in iOS" :)

I think it is bad library design to make it impossible to read 4-channel PNG data. I don't like systems that try to be smarter than I am.

Endamage answered 11/11, 2012 at 22:40 Comment(5)
Out of curiosity, are the 4 channels independent, i.e. unlikely to be correlated? If so, texture compression techniques (be it PVRTC, ETC1, S3TC etc.) are unlikely to be of much help because these assume correlation between the various colour channels. IIRC, for 8888 data, PNG doesn't consider correlation between channels and only looks for correlation within a channel, so having them independent won't really affect the rate of compression. (Of course, unlike texture compression, with PNG you won't get any savings on the GPU side since it has to be decompressed by the CPU first.)Weever
These 4 channels are independent.Endamage
In that case, assuming you can't switch off the automatic premultiplication AND you decide to not use 2 PNGs (either splitting into 2+2 or 3+1 channels) then you will probably still need to use multiple compressed textures and that may depend on the texture format. PVRTC: You might be OK with just 2 textures. I recommend using the R&G as they have the most accuracy. The B channel is slightly less precise. Note that in areas of a texture that aren't fully opaque, some of the RGB bits may be re-assigned in order to represent alpha, so the precision is likely to decrease further....Weever
With ETC, the RGB channels are assumed to be correlated with luma, so you might need to store 1 channel per texture. If you do, replicate the channel into all of RGB, because I'm pretty sure the ETC compressor will do a better job with a greyscale texture than one where, say, R&B are constant and only G varies.Weever
(I corrected an argument about png compression). 2 x (compressed texture) will probably also be a good choice. But I have to provide a backup solution (at run time) if that compression is not present on a device. If so, I could probably use png's...Endamage

Since you considered PVRTC, using GLKit could be an option. It includes GLKTextureLoader, which allows you to load textures without premultiplying alpha. Use, for example:

+ (GLKTextureInfo *)textureWithContentsOfFile:(NSString *)fileName options:(NSDictionary *)textureOperations error:(NSError **)outError

and passing an options dictionary containing:

GLKTextureLoaderApplyPremultiplication = NO
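
For illustration, a minimal sketch of such a call; the file name "data.png", the error handling and the bind afterwards are my assumptions, not part of the original answer:

    NSError *error = nil;
    NSDictionary *options = @{ GLKTextureLoaderApplyPremultiplication : @NO };

    // "data.png" is a placeholder for your own texture file in the app bundle.
    NSString *path = [[NSBundle mainBundle] pathForResource:@"data" ofType:@"png"];
    GLKTextureInfo *info = [GLKTextureLoader textureWithContentsOfFile:path
                                                               options:options
                                                                 error:&error];
    if (info != nil) {
        // The loader creates and returns a new texture object; bind it as usual.
        glBindTexture(info.target, info.name);
    }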
Lily answered 28/11, 2012 at 6:38 Comment(5)
I must anyway provide another format (at run time) if PVRTC is not present on the running platform. But do you think GLKTextureLoaderApplyPremultiplication = NO could be applied to png's too?Endamage
Reading the documentation, this seems to be possible. I shall try this out.Endamage
This did actually work (independently of the build setting COMPRESS_PNG_FILES), and premultiplication=NO is actually the default. However, with GLKTextureLoader it is not possible to load a file into an already bound texture with that texture's settings; it only creates a new texture with settings defined by the target (TEXTURE_2D/TEXTURE_CUBE_MAP). (I remember now that this was the reason I didn't use GLKTextureLoader in the first place...) Anyway, I think this is a solution to the question.Endamage
I saw your update to the question; could you use something like this to get the raw data: render the texture to a framebuffer using glFramebufferTexture2D, then use glReadPixels to get the raw pixel data in RGBA. It's lots of hoops to jump through. There is a more detailed example of rendering the texture to a buffer here... troylawlor.com/tutorial-fbo.htmlLily
That is also a solution :) I was thinking if it was possible in GLES 2.0 to copy texture content. I think this is possible in GL4.0, since that library uses (general) buffers.Endamage

You can simply request that Xcode not 'compress' your PNG files. Click your project in the top left, select the 'Build Settings', find 'Compress PNG Files' and set the option to 'No'.

As to your other options, post-dividing isn't a bad solution, but obviously you'll lose overall precision, and I believe both TIFF and BMP are also supported. PVRTC is PowerVR specific, so it's not Apple-specific, but it's also not entirely platform independent; it is specifically designed to be a lossy compression that's trivial to decompress with little work on the GPU. You'd generally increase your texture resolution to compensate for the low bit-per-pixel count.
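
If you do try post-dividing, the idea is just to undo the premultiplication per pixel after loading. A rough sketch, assuming an RGBA8 buffer; the function and variable names are mine, not from this answer:

    // Undo premultiplication in place on an RGBA8 buffer of width * height pixels.
    void unpremultiply_rgba8(unsigned char *pixels, int width, int height)
    {
        for (int i = 0; i < width * height; ++i) {
            unsigned char *p = pixels + 4 * i;
            unsigned char a = p[3];
            if (a != 0) {   // fully transparent pixels cannot be recovered
                p[0] = (unsigned char)(p[0] * 255 / a);
                p[1] = (unsigned char)(p[1] * 255 / a);
                p[2] = (unsigned char)(p[2] * 255 / a);
            }
        }
    }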

Strafford answered 11/11, 2012 at 22:55 Comment(7)
So setting "Compress PNG Files" to No will not corrupt my png data? I currently load my texture data from a UIImage, using CGImageAlphaPremultipliedLast, like raywenderlich.com/4404/…. I don't think the option kCGImageAlphaLast is implemented on iOS.Endamage
That's certainly my understanding of what that setting does. And both Core Graphics and UIKit can load either real PNGs or Xcode's modified versions interchangeably, so that you can load PNGs from e.g. the Internet.Strafford
If premultiplication is done by Xcode, and can be disabled, does CGImageAlphaPremultipliedLast in CGBitmapContextCreate then have no effect? If not, how should I read non-premultiplied .png files (I don't think libpng is available on iOS)?Endamage
You should use kCGImageAlphaLast rather than kCGImageAlphaPremultipliedLast so that premultiplication doesn't occur at runtime, leaving you with unmodified channels.Strafford
So kCGImageAlphaLast is actually valid on iOS? I have read that it was only valid on MacOSX... developer.apple.com/library/ios/documentation/GraphicsImaging/…Endamage
The link you post directly states "kCGImageAlphaLast The alpha component is stored in the least significant bits of each pixel. For example, non-premultiplied RGBA. Available in iOS 2.0 and later."Strafford
I am not sure if it is possible to access the png data in iOS. I have now updated my question and also tried to answer it.Endamage

You should use libpng to load PNG without premultiplied colors.

It is written in C and should compile for iOS.

I've had similar problems with Android and also had to use a third-party library to load PNG files with non-premultiplied colors.
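
For reference, here is a minimal sketch using libpng's simplified read API (libpng 1.6 and later); the helper name and error handling are mine, not part of this answer:

    #include <png.h>
    #include <stdlib.h>
    #include <string.h>

    // Decode a PNG into a straight (non-premultiplied) RGBA8 buffer; returns NULL on failure.
    unsigned char *load_rgba_png(const char *path, int *width, int *height)
    {
        png_image image;
        memset(&image, 0, sizeof image);
        image.version = PNG_IMAGE_VERSION;

        if (!png_image_begin_read_from_file(&image, path))
            return NULL;

        image.format = PNG_FORMAT_RGBA;               // request straight RGBA output
        unsigned char *buffer = malloc(PNG_IMAGE_SIZE(image));

        if (buffer == NULL || !png_image_finish_read(&image, NULL, buffer, 0, NULL)) {
            png_image_free(&image);
            free(buffer);
            return NULL;
        }

        *width  = (int)image.width;
        *height = (int)image.height;
        return buffer;
    }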

Goose answered 27/11, 2012 at 15:42 Comment(2)
Yes, I have considered libpng. But I could not link to libpng in iOS, so I think I have to compile it into my program. And libpng depends on zlib too... Do you know if it is easy to compile libpng manually, and will this increase my binary a lot?Endamage
As I stated in my answer, I have experience with the similar case for Android apps. Unfortunately, I haven't compiled libpng for iOS and I'm not aware of its dependencies.Goose

This is an attempt to answer my own question.

It is not possible to load non-premultiplied .png files.

The option kCGImageAlphaLast is valid, but it does not give a valid combination for CGBitmapContextCreate (reference). It is, however, a valid option for CGImageRefs.

What the Xcode build setting COMPRESS_PNG_FILES mentioned above does is convert .png files into some other file format and also multiply the RGB channels by alpha (reference). I was hoping that disabling this option would make it possible to reach the channel data in my actual .png files, but I am not sure if this is possible. The following example is an attempt to access the .png data at a low level, as a CGImageRef:

// Prints the alpha info Core Graphics reports for a PNG opened directly as a CGImageRef.
void test_cgimage(const char* path)
{
    CGDataProviderRef dataProvider = CGDataProviderCreateWithFilename(path);
    CGImageRef cg_image = CGImageCreateWithPNGDataProvider(dataProvider, NULL, NO,
                                                           kCGRenderingIntentDefault);
    CGImageAlphaInfo info = CGImageGetAlphaInfo(cg_image);
    switch (info)
    {
        case kCGImageAlphaNone:               printf("kCGImageAlphaNone\n");                break;
        case kCGImageAlphaPremultipliedLast:  printf("kCGImageAlphaPremultipliedLast\n");   break;
        case kCGImageAlphaPremultipliedFirst: printf("kCGImageAlphaPremultipliedFirst\n");  break;
        case kCGImageAlphaLast:               printf("kCGImageAlphaLast\n");                break;
        case kCGImageAlphaFirst:              printf("kCGImageAlphaFirst\n");               break;
        case kCGImageAlphaNoneSkipLast:       printf("kCGImageAlphaNoneSkipLast\n");        break;
        case kCGImageAlphaNoneSkipFirst:      printf("kCGImageAlphaNoneSkipFirst\n");       break;
        default: break;
    }

    CGImageRelease(cg_image);
    CGDataProviderRelease(dataProvider);
}

which gives "kCGImageAlphaPremultipliedLast" with COMPRESS_PNG_FILES disabled. So I think iOS always converts .png files, even at run time.

Endamage answered 18/11, 2012 at 21:14 Comment(0)

There is a better solution, faster (about 3x~5x) and cross-platform:

    #define STB_IMAGE_IMPLEMENTATION
    #include "stb_image.h"

    // Force 4-channel RGBA and flip vertically on load:
    // stb_image's first pixel is top-left, while OpenGL assumes bottom-left.
    bool flipy = true;
    int width, height, nrChannels;
    stbi_set_flip_vertically_on_load(flipy);

    // path is a std::string with the file name; the last argument forces 4 components (RGBA).
    unsigned char *imageData = stbi_load(path.c_str(), &width, &height, &nrChannels, 4);

    // Upload the data to your texture, e.g.:
    // glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, imageData);

    stbi_image_free(imageData);
Bazan answered 28/5, 2020 at 13:28 Comment(0)

Not 100% what you want, but I got around the problem using this approach: put the alpha channel into a separate black & white PNG and save the original PNG without alpha, so the space taken is about the same. Then, in my texture loader, I load both images and combine them into one texture.
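
A rough sketch of the combine step, assuming both PNGs have already been decoded into byte buffers; the function name and buffer layouts are my assumptions, not from this answer:

    // rgb:   width * height * 3 bytes from the colour PNG (no alpha)
    // alpha: width * height bytes from the black & white PNG
    // rgba:  width * height * 4 bytes, ready for glTexImage2D with GL_RGBA
    void combine_rgb_and_alpha(const unsigned char *rgb, const unsigned char *alpha,
                               unsigned char *rgba, int width, int height)
    {
        for (int i = 0; i < width * height; ++i) {
            rgba[4 * i + 0] = rgb[3 * i + 0];
            rgba[4 * i + 1] = rgb[3 * i + 1];
            rgba[4 * i + 2] = rgb[3 * i + 2];
            rgba[4 * i + 3] = alpha[i];
        }
    }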

I know this is only a workaround, but at least it gives the correct result. And yes, it is very annoying that iOS does not allow you to load textures from PNG without premultiplied alpha.

Schnook answered 18/12, 2012 at 13:33 Comment(0)
