Render to a rectangle that is the same size as the source image before applying a contents scale factor.
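The tip above can be sketched as follows. This is a hedged example, not code from the guide; the names `context`, `sourceImage`, and `myLayer` are assumptions standing in for your own CIContext, CIImage, and layer:

```objectivec
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Render at the image's native size; let the layer's contentsScale
// handle Retina scaling rather than rendering a larger rectangle.
CGRect extent = [sourceImage extent];
CGImageRef cgImage = [context createCGImage:sourceImage fromRect:extent];
myLayer.contents = (__bridge id)cgImage;
myLayer.contentsScale = [[UIScreen mainScreen] scale];
CGImageRelease(cgImage);
```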

Consider using simpler filters that can produce results similar to algorithmic filters.

For example, CIColorCube can produce output similar to CISepiaTone, and do so more efficiently.

Take advantage of the support for YUV images in iOS 6.0 and later.

Camera pixel buffers are natively YUV, but most image processing algorithms expect RGBA data, and there is a cost to converting between the two. Core Image supports reading YUV from CVPixelBuffer objects and applying the appropriate color transform.

options = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                 @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };
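For context, an options dictionary like this is typically supplied when configuring the capture output, so the camera delivers YUV buffers that Core Image can read directly. A hedged sketch, assuming an AVCaptureSession is already set up elsewhere:

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

// Ask the capture output for bi-planar full-range YUV pixel buffers.
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                          @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };

// Later, in captureOutput:didOutputSampleBuffer:fromConnection:,
// wrap the YUV buffer in a CIImage; Core Image applies the color transform.
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *image = [CIImage imageWithCVPixelBuffer:pixelBuffer];
```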

Does Your App Need Color Management?

By default, Core Image applies all filters in light-linear color space. This provides the most accurate and consistent results.

The conversion to and from sRGB adds to filter complexity, and requires Core Image to apply these equations: