Applying CIFilter to OpenGL render-to-texture
I'm trying to apply a Core Image filter to my fullscreen rendering output, but it looks like I'm missing something, because all I get is a black screen.

First I draw the whole scene to a texture. Then I create a CIImage from that texture, which I finally draw and present. But all I get is a black screen. I was following Apple's guidelines on rendering to a texture and on integrating Core Image with OpenGL ES: WWDC 2012 session 511 and https://developer.apple.com/library/ios/documentation/3ddrawing/conceptual/opengles_programmingguide/WorkingwithEAGLContexts/WorkingwithEAGLContexts.html

Here is the relevant code:

Renderer:

@interface Renderer () {
  EAGLContext* _context;
  GLuint _defaultFramebuffer, _drawFramebuffer, _depthRenderbuffer, _colorRenderbuffer, _drawTexture;
  GLint _backingWidth, _backingHeight;
  CIImage *_coreImage;
  CIFilter *_coreFilter;
  CIContext *_coreContext;
}

Initialization method:

- (BOOL)initOpenGL
{
  _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
  if (!_context) return NO;

  [EAGLContext setCurrentContext:_context];

  glGenFramebuffers(1, &_defaultFramebuffer);
  glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer);

  glGenRenderbuffers(1, &_colorRenderbuffer);
  glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer);
  glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, _colorRenderbuffer);

  glGenFramebuffers(1, &_drawFramebuffer);
  glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer);

  glGenTextures(1, &_drawTexture);
  glBindTexture(GL_TEXTURE_2D, _drawTexture);
  glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, _drawTexture, 0);

  glGenRenderbuffers(1, &_depthRenderbuffer);
  glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderbuffer);
  glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, _depthRenderbuffer);

  _coreFilter = [CIFilter filterWithName:@"CIColorInvert"];
  [_coreFilter setDefaults];

  NSDictionary *opts = @{ kCIContextWorkingColorSpace : [NSNull null] };
  _coreContext = [CIContext contextWithEAGLContext:_context options:opts];

  return YES;
}

Allocate memory whenever the layer size changes (on init and on orientation change):

- (void)resizeFromLayer:(CAEAGLLayer *)layer
{
  layer.contentsScale = 1;

  glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer);

  glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer);
  [_context renderbufferStorage:GL_RENDERBUFFER fromDrawable:layer];

  glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &_backingWidth);
  glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &_backingHeight);

  // glCheckFramebufferStatus ... SUCCESS

  glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer);

  glBindTexture(GL_TEXTURE_2D, _drawTexture);
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, _backingWidth, _backingHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

  glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderbuffer);
  glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, _backingWidth, _backingHeight);

  // glCheckFramebufferStatus ... SUCCESS
}

Draw method:

- (void)render:(Scene *)scene
{
  [EAGLContext setCurrentContext:_context];

  glBindFramebuffer(GL_FRAMEBUFFER, _drawFramebuffer);

  // Draw using GLKit, custom shaders, drawArrays, drawElements
  // Now rendered scene is in _drawTexture

  glBindFramebuffer(GL_FRAMEBUFFER, _defaultFramebuffer);
  glBindRenderbuffer(GL_RENDERBUFFER, _colorRenderbuffer);

  // Create CIImage with our render-to-texture texture
  _coreImage = [CIImage imageWithTexture:_drawTexture size:CGSizeMake(_backingWidth, _backingHeight) flipped:NO colorSpace:nil];

  // Ignore filtering for now; Draw CIImage to current render buffer 
  [_coreContext drawImage:_coreImage inRect:CGRectMake(0, 0, _backingWidth, _backingHeight) fromRect:CGRectMake(0, 0, _backingWidth, _backingHeight)];

  // Present
  [_context presentRenderbuffer:GL_RENDERBUFFER];
}

Note that after the scene is drawn, _drawTexture contains the rendered scene. I verified this using Xcode's debug tools (Capture OpenGL ES frame).

EDIT: If I create the CIImage from some texture other than _drawTexture, it displays correctly. My suspicion is that _drawTexture might not be ready, or is somehow locked, when CIContext tries to render it through the CIImage.

EDIT2: I also tried replacing all the drawing code with just a viewport clear:

  glViewport(0, 0, _backingWidth, _backingHeight);
  glClearColor(0, 0.8, 0, 1);
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

The result is still black, which suggests the problem is something with the draw texture or framebuffer.

Naif answered 7/1, 2014 at 15:25 Comment(4)
Why on earth would this be downvoted? — Chorale
That's interesting. Usually, I'd say this is because the Core Image context wasn't created with the same OpenGL ES context as your rendering, but that looks to be set up properly here. If you use a passthrough shader and draw a quad to the screen using your rendered texture, can you verify that the scene is rendering to the texture properly? Finally, if you aren't married to Core Image, I have a little project here: github.com/BradLarson/GPUImage that can also do this style of GPU-side post-processing. See the CubeExample sample application in there, which does what you want. — Idem
OK, I rendered a quad using _drawTexture and it's black. So it looks like something is wrong either with that texture or with rendering to it. Maybe I'm missing something with render-to-texture. The only difference is that I'm attaching a texture as GL_COLOR_ATTACHMENT0 instead of a renderbuffer. — Selfemployed
Oh, and thanks for bringing up GPUImage. I'm actually familiar with your project, and I might fall back to it or even support it alongside Core Image. — Selfemployed

I finally found what's wrong. In OpenGL ES 2.0 on iOS, non-power-of-two textures must use non-mipmapped filtering (the default min filter, GL_NEAREST_MIPMAP_LINEAR, leaves the texture incomplete, so sampling it returns black) and clamp-to-edge wrapping:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

My texture was the same size as the screen, but I hadn't set those four parameters.

For future generations: the code above is a perfectly valid example of combining OpenGL ES and Core Image. Just make sure you initialize your texture properly!
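For reference, a minimal sketch of the corrected texture setup in the question's initOpenGL (same variable names as above; the parameters just need to be set on the bound texture before it is ever sampled):

```objectivec
glGenTextures(1, &_drawTexture);
glBindTexture(GL_TEXTURE_2D, _drawTexture);

// NPOT textures in OpenGL ES 2.0 are "texture complete" only with
// non-mipmapped filtering and clamp-to-edge wrapping; without these,
// sampling the texture returns black.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, _drawTexture, 0);
```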

Naif answered 7/1, 2014 at 21:37 Comment(0)
