Monday, February 17, 2014

Core Animation CAOpenGLLayer with the 3.2 Core Profile

Translucent uses CAOpenGLLayer as one of its primary rendering facilities, and the rendering code needed to be updated to support the OpenGL 3.2 Core Profile and automatic switching between GPUs. There was no directly applicable, clear, concise information out there about how to accomplish both of these things specifically for CAOpenGLLayer; what information did exist really focused on NSOpenGLView. I didn't, couldn't and wouldn't use NSOpenGLView, so a solution was needed for CAOpenGLLayer to use the new Core Profile. This became especially important since the legacy style was deprecated in OpenGL a while ago and all future versions use the new Core Profile style; Apple was just a little behind in pushing this through.

I will mention that using the new Core Profile requires the use of shaders, and that the programmer is now responsible for all of the matrix math the library used to handle. I won't go into depth on those topics; they are quite lengthy themselves and there are plenty of resources available online. If you're targeting 10.8 or later you can use GLKit; if you're targeting 10.7 you can roll your own using GLKit as a reference. Apple also has many pieces of example code covering shader compiling, linking and use if you're in need of that code as well.
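As a quick illustration of the kind of math that now falls on you, here is a minimal sketch of building an orthographic projection with GLKit and handing it to a shader. The uniform name projectionMatrix, the program variable and the width/height values are placeholders for this example, not anything specific to Translucent.

#import <GLKit/GLKMath.h>
#import <OpenGL/gl3.h>

//Hypothetical values: "program" is your linked shader program, width/height are
//the layer's dimensions and "projectionMatrix" is an assumed uniform name.
GLuint program = /* your linked shader program */ 0;
GLfloat width = 800.0f, height = 600.0f;

//Build the projection yourself; the fixed-function matrix stack is gone in
//the Core Profile.
GLKMatrix4 projection = GLKMatrix4MakeOrtho(0.0f, width, 0.0f, height, -1.0f, 1.0f);

GLint projectionPos = glGetUniformLocation(program, "projectionMatrix");
glUseProgram(program);
glUniformMatrix4fv(projectionPos, 1, GL_FALSE, projection.m);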


For those of you who are upgrading existing code, it is worth mentioning that all rendering with a 3.2 OpenGL context has to go through a vertex array object (VAO) to issue the drawing calls. You simply need to use glGenVertexArrays and glBindVertexArray to create a vertex array object when specifying your vertex buffers, and bind that vertex array first when setting up rendering (a sketch follows below). Without the vertex array object, nothing will render. This can be subtle and easily overlooked, since in legacy mode you can draw with vertex buffers and without vertex arrays, but the Core Profile requires both.
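Here is a minimal sketch of that setup, assuming a simple quad in a single position buffer and a shader whose position attribute is bound to location 0; the vertex data and attribute layout are placeholders, not anything specific to Translucent.

#import <OpenGL/gl3.h>

//Placeholder vertex data: a quad, two floats per vertex.
static const GLfloat vertices[] = {
    -1.0f, -1.0f,
     1.0f, -1.0f,
    -1.0f,  1.0f,
     1.0f,  1.0f
};

GLuint vao = 0, vbo = 0;

//Create and bind the VAO first; the buffer and attribute state set up below
//is captured by it.
glGenVertexArrays(1, &vao);
glBindVertexArray(vao);

glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

//Attribute 0 is assumed to be the position attribute in the vertex shader.
glEnableVertexAttribArray(0);
glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, NULL);

//Later, when drawing, binding the VAO is all that is needed before the draw call.
glBindVertexArray(vao);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);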


Now, getting the Core Profile running on the layer was actually rather easy after a plunge through the docs for CGL (Core OpenGL), which is the lowest-level programming interface for OpenGL on OS X. The main thing that has to be done is to build the pixel format that you want the renderer to support, which is done in the copyCGLPixelFormatForDisplayMask: method of CAOpenGLLayer. Usually in this method you'd just return the super class implementation, but to get the new Core Profile you have to opt in and specify the pixel format yourself. It should look something like this:



- (CGLPixelFormatObj)copyCGLPixelFormatForDisplayMask:(uint32_t)mask
{
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFADisplayMask, 0,
        kCGLPFAColorSize, 24,
        kCGLPFAAlphaSize, 8,
        kCGLPFAAccelerated,
        kCGLPFADoubleBuffer,
        kCGLPFAAllowOfflineRenderers,
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        0
    };
    attribs[1] = (CGLPixelFormatAttribute)mask;

    CGLPixelFormatObj pixFormatObj = NULL;
    GLint numPixFormats = 0;
    CGLChoosePixelFormat(attribs, &pixFormatObj, &numPixFormats);

    return pixFormatObj;
}

The important things to notice are that the 3.2 profile is specified as the OpenGL profile to use, and that kCGLPFAAllowOfflineRenderers is specified. Offline renderers are GPUs that are not directly or currently connected to a display, i.e. typically the integrated GPU.


There is one more important thing that has to be done to support the use of offline renderers: in the application's Info.plist you need to add the key NSSupportsAutomaticGraphicsSwitching as a boolean and set it to YES. Both of these things are required to enable switching between graphics cards; otherwise the application will only be allowed to run on the discrete card, which uses much more battery power and makes mobile users reluctant to run the application on battery.
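The raw entry in the Info.plist XML looks like this:

<key>NSSupportsAutomaticGraphicsSwitching</key>
<true/>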


Apple has a much more in-depth Tech Note that addresses the intricacies of using multiple GPUs with OpenGL. If you're doing simple rendering you probably don't have to worry too much about switching GPUs, but it's always best to test it thoroughly.


Another hook into the context setup that is worth overriding is copyCGLContextForPixelFormat:, as this is where the context is created and it is a really good place to set up any OpenGL state required for rendering. It would look something like this:



- (CGLContextObj)copyCGLContextForPixelFormat:(CGLPixelFormatObj)pixelFormat
{
    CGLContextObj context = NULL;
    CGLCreateContext(pixelFormat, NULL, &context);
    if (context || (context = [super copyCGLContextForPixelFormat:pixelFormat])) {
        //Setup any OpenGL state, make sure to set the context before invoking OpenGL
        CGLContextObj currContext = CGLGetCurrentContext();
        CGLSetCurrentContext(context);
        //Issue any calls that require the context here.
        CGLSetCurrentContext(currContext);
    }

    return context;
}


You can use CAOpenGLLayer as it is, but more than likely you did not go to all the work of setting up an OpenGL renderer, rather than simply creating a CALayer with an image, unless you wanted to animate the rendering of the content. So for a complete example of CAOpenGLLayer I'll go over how you would set up the layer for custom property animation and how you would use that to render the animation.


This is just like any other custom animatable property: you declare your properties as @dynamic, provide no implementations for them and override a few methods. The first two methods to override are needsDisplayForKey: and actionForKey:. These tell the layer animation system which properties need to trigger display and what animation to run when a property changes. The third method, initWithLayer:, is used to copy your layer's instance variables.
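The property itself just needs to be declared on the layer subclass and marked @dynamic so that Core Animation provides the accessors. A minimal sketch, assuming a placeholder subclass name of TranslucentGLLayer and fillColor as the animated property:

#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>

//Placeholder subclass name; fillColor is the custom animatable property.
@interface TranslucentGLLayer : CAOpenGLLayer
@property (nonatomic, strong) NSColor *fillColor;
@end

@implementation TranslucentGLLayer
//No accessors are written for fillColor; @dynamic lets CALayer provide them.
@dynamic fillColor;

//needsDisplayForKey:, actionForKey:, initWithLayer: and the drawing method
//shown below go here.
@end

With that in place, this would be an example of animating a color: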



+ (BOOL)needsDisplayForKey:(NSString *)key
{
    if ([key isEqualToString:@"fillColor"]) {
        return YES;
    }

    return [super needsDisplayForKey:key];
}

- (id<CAAction>)actionForKey:(NSString *)key
{
    if ([key isEqualToString:@"fillColor"]) {
        CABasicAnimation *propAnim = [CABasicAnimation animationWithKeyPath:key];

        propAnim.fromValue = [self.modelLayer valueForKey:key];
        propAnim.duration = 0.25f;
        propAnim.removedOnCompletion = YES;
        propAnim.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionLinear];

        return propAnim;
    }

    return [super actionForKey:key];
}

- (id)initWithLayer:(id)layer
{
    if ((self = [super initWithLayer:layer])) {
        //Copy over any variables needed, notice the custom property
        //comes over for free.

        //You can even get fancy here and use direct access of private ivars,
        //as long as you cast the incoming layer to this class first.
        _data = ((__typeof__(self))layer)->_data;
    }

    return self;
}

Notice here that the model layer is being accessed for the fromValue in actionForKey:, and that we create and return an animation object. In initWithLayer: we only copy variables that are not part of the dynamic properties but are part of the layer state, i.e. shaders, the VAO, VBOs, etc. The actual rendering of this is done in drawInCGLContext:. Here is a snippet of what it would look like.



- (void)drawInCGLContext:(CGLContextObj)ctx
             pixelFormat:(CGLPixelFormatObj)pf
            forLayerTime:(CFTimeInterval)t
             displayTime:(const CVTimeStamp *)ts
{
    NSColor *currentFill = [[self.presentationLayer fillColor]
                            colorUsingColorSpaceName:NSCalibratedRGBColorSpace];

    CGFloat red = 0.0f, blue = 0.0f, green = 0.0f, alpha = 0.0f;
    [currentFill getRed:&red green:&green blue:&blue alpha:&alpha];
    //In the real world we would first have to enable the shader with glUseProgram
    //before we could make any calls to set the shader variables
    glUniform4f(colorPos, (GLfloat)red, (GLfloat)green, (GLfloat)blue, (GLfloat)alpha);

    //when we are all finished we need to call the super implementation to flush

    [super drawInCGLContext:ctx pixelFormat:pf forLayerTime:t displayTime:ts];
}

Notice here that you need to access the presentation layer for the current animated display value. Also note that colors need to be converted to NSCalibratedRGBColorSpace before getting the red, green and blue values with getRed:green:blue:alpha:; otherwise that call will throw a wonderful exception and crash if the color space of the NSColor object is anything else, as it will be with something like [NSColor blackColor]. With the red, green, blue and alpha values in hand you simply pass them down to the shader with something like glUniform4f, or put them in your vertex buffer.
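One detail the snippet above glosses over is where colorPos comes from. It is assumed to be a GLint instance variable holding a uniform location, looked up once after the shader program is linked; the program variable and the uniform name "fillColor" below are placeholders for whatever your shader actually declares:

//Hypothetical one-time setup, done after linking the shader program and with
//the layer's context current; assumes the shader declares: uniform vec4 fillColor;
colorPos = glGetUniformLocation(program, "fillColor");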


Hopefully this has been a helpful reference for getting the Core Profile up and running in CAOpenGLLayer. I wanted this information out there because when I went through this process there really was not any information on exactly how to support using the Core Profile on CAOpenGLLayer.


I also hope that someone else will benefit from the consolidation of information here on enabling switching of graphics chipsets, since it was not overly clear at the time that it is a "two step process", because it is split between seemingly unrelated tech notes. For full reference, here is the Tech Note for setting the Info.plist variable.

