Core Image is a pixel-accurate, near-realtime, non-destructive image processing technology in Mac OS X. Implemented as part of the QuartzCore framework of Mac OS X 10.4 and later, Core Image provides a plugin-based architecture for applying filters and effects within the Quartz graphics rendering layer.[1] The framework was later added to iOS in iOS 5.[2]


Core Image abstracts the pixel-level manipulation process required when applying a filter to an image, making it simple for applications to implement image transformation capabilities without extensive coding. In a simple implementation, Core Image applies a single Image Filter to the pixel data of a given source to produce the transformed image. Each Image Filter specifies a single transform or effect, either built into Core Image or loaded from a plugin called an Image Unit. Combined with preset or user-defined input parameters, the filter can be applied to the original pixel data without modifying it, thereby providing non-destructive image editing.[3][4]
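The single-filter model described above can be sketched in a few lines of Python. This is a conceptual illustration only, not the Core Image API; the function and parameter names here are hypothetical. The key point is that the filter, combined with an input parameter, produces a new image while leaving the source pixel data untouched:

```python
# Conceptual sketch of non-destructive filtering. The names below are
# illustrative, not the actual Core Image API.

def sepia_filter(pixel, intensity):
    """A single transform: blend an RGB pixel toward sepia tones."""
    r, g, b = pixel
    sr = min(1.0, 0.393 * r + 0.769 * g + 0.189 * b)
    sg = min(1.0, 0.349 * r + 0.686 * g + 0.168 * b)
    sb = min(1.0, 0.272 * r + 0.534 * g + 0.131 * b)
    # Blend between the original pixel and the fully filtered pixel
    # according to the user-supplied intensity parameter.
    return tuple((1 - intensity) * o + intensity * s
                 for o, s in zip((r, g, b), (sr, sg, sb)))

def apply_filter(source, filter_fn, **params):
    """Return a new image; the source pixels are never modified."""
    return [filter_fn(p, **params) for p in source]

original = [(0.2, 0.4, 0.6), (1.0, 1.0, 1.0)]
result = apply_filter(original, sepia_filter, intensity=0.8)

# The source data is unchanged -- the edit is non-destructive.
assert original == [(0.2, 0.4, 0.6), (1.0, 1.0, 1.0)]
```

Because the source is never overwritten, the same filter can be re-applied with different parameters at any time, which is what makes the editing non-destructive.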

Like Photoshop, Core Image can apply multiple filters to the same image source. Rather than applying each filter in a separate pass, Core Image assembles a dynamic instruction pipeline so that only one calculation pass over the pixel data is needed to achieve the cumulative effect; the pixel operations of multiple filters are thus applied together without a significant increase in processing time. Regardless of the number of filters, Core Image assembles the code for this instruction pipeline with a just-in-time compiler, and the result is executed by either the CPU or the GPU, whichever can perform the calculation faster.[5]
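The idea of collapsing a filter chain into a single pass can be illustrated with a small Python sketch (again conceptual, not the Core Image implementation): instead of looping over the pixel data once per filter, the per-pixel functions are fused into one function and the data is traversed once.

```python
# Conceptual sketch of filter-chain fusion; not Core Image code.
from functools import reduce

def brighten(p, amount=0.1):
    """Add a constant to every channel, clamped to 1.0."""
    return tuple(min(1.0, c + amount) for c in p)

def invert(p):
    """Invert every channel."""
    return tuple(1.0 - c for c in p)

def fuse(*filters):
    """Collapse a filter chain into a single per-pixel function,
    so each pixel is read and written only once."""
    def fused(p):
        return reduce(lambda acc, f: f(acc), filters, p)
    return fused

pipeline = fuse(brighten, invert)
image = [(0.5, 0.5, 0.5)]
out = [pipeline(p) for p in image]  # one pass over the pixel data
```

A real implementation fuses the filters at the instruction level rather than by function composition, but the effect is the same: the cost of traversing the pixel data is paid once, not once per filter.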

Filters are written in the Core Image Kernel Language, which shares a subset of commands with the OpenGL Shading Language (GLSL).[6] When a compatible GPU is available, the Core Image compiler emits the instruction pipeline as GLSL, handling buffers and state transparently. Although GPU rendering is preferred[citation needed], the compiler can operate in a CPU fallback mode, generating instructions suited to the current CPU architecture instead.[7] The CPU fallback path uses the vector processing capabilities of the available CPUs and is multiprocessor aware. Core Image performance therefore depends on the GLSL capabilities of the GPU or the processing power of the CPU; with a supported GPU, most effects can be rendered in realtime or near-realtime.[8]
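A minimal kernel routine in the Core Image Kernel Language reads much like GLSL. The sketch below is illustrative (the kernel name and parameter are invented for this example), but it uses the documented CIKL constructs `kernel`, `sampler`, `sample`, and `samplerCoord` to fetch a source pixel and scale its brightness:

```glsl
/* Illustrative Core Image kernel: scale the brightness of each pixel. */
kernel vec4 scaleBrightness(sampler src, float factor)
{
    vec4 pixel = sample(src, samplerCoord(src));  // fetch the source pixel
    pixel.rgb *= factor;                          // alpha is left untouched
    return pixel;
}
```

A kernel of this form is evaluated once per output pixel; Core Image supplies the sampler coordinates and collects the returned `vec4` into the destination image.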

History and implementation

Core Image was introduced with Mac OS X 10.4.[9] Early examples of its use can be found in the ripple effect in Dashboard, and Apple's professional digital photography application, Aperture.[10] Starting with Mac OS X 10.5, any application that implements Image Kit can utilize Core Image.[11] Preview and iPhoto are common examples.

In 2011, Apple added Core Image to iOS in iOS 5.0.[2]

The Xcode Tools include Core Image Fun House and Quartz Composer; both utilize Core Image.

The Core Image plugin architecture was inspired by that of Core Audio.[12]

Pixel accuracy

All pixel processing provided by an Image Unit is performed in a premultiplied-alpha (RGBA) color space, storing four channels: red, green, blue, and transparency (alpha). Each channel is represented by a 32-bit floating-point number, so each pixel is a 128-bit vector; this provides a color depth far greater than the human eye can perceive. Even for images stored at lower bit depths, this floating-point calculation model performs well, which is useful when processing multiple images or video frames.[3][13]
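In a premultiplied-alpha format, each color channel is stored already scaled by the pixel's alpha, which simplifies compositing arithmetic. The Python sketch below illustrates the idea (it is not Core Image code): with premultiplied pixels, the standard Porter-Duff "over" operation reduces to one multiply-add per channel.

```python
# Illustration of premultiplied-alpha storage and compositing;
# conceptual only, not Core Image code.

def premultiply(r, g, b, a):
    """Store color channels pre-scaled by alpha (floating-point RGBA)."""
    return (r * a, g * a, b * a, a)

def over(src, dst):
    """Porter-Duff 'over' compositing; with premultiplied pixels this
    is a single multiply-add per channel."""
    src_alpha = src[3]
    return tuple(s + d * (1.0 - src_alpha) for s, d in zip(src, dst))

# Half-transparent red composited over an opaque blue background:
red = premultiply(1.0, 0.0, 0.0, 0.5)    # -> (0.5, 0.0, 0.0, 0.5)
blue = premultiply(0.0, 0.0, 1.0, 1.0)   # -> (0.0, 0.0, 1.0, 1.0)
result = over(red, blue)                  # -> (0.5, 0.0, 0.5, 1.0)
```

Working in 32-bit floating point throughout means intermediate results in a long filter chain never need to be clamped or quantized until the final image is produced.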

Supported graphics processors

Any programmable GPU that supports the required OpenGL Shading Language (GLSL) commands is Core Image capable. Apple cited the following graphics cards as supporting Core Image GPU processing in Mac OS X 10.4 and Aperture, so the list can be read as an example of minimum requirements:[10][14]

Note that any GPU capable of handling Core Image instructions is also Quartz Extreme capable. The requirements for Core Image are greater than those of Quartz Extreme.[14]

Built-in filters

Wikipedia Logo with "Color Monochrome", "Parallelogram Tile", and "Pinch Distortion" Image Units applied

macOS includes many built-in filters: Mac OS X 10.4 introduced over 100 of them, and Mac OS X 10.5 added more.[3][15]

An open source documentation website for built-in Core Image filters is maintained at

References


  1. ^ "Mac Dev Center - Introduction to Core Image Programming Guide". Retrieved September 20, 2009.
  2. ^ a b "iOS 5.0 API Diffs". Retrieved September 14, 2012.
  3. ^ a b c "Apple - Developer - Developing with Core Image". Retrieved September 20, 2009.
  4. ^ "Mac Dev Center - Introduction to Core Image Programming Guide - Filter Clients and Filter Creators". Retrieved September 20, 2009.
  5. ^ "ArsTechnica - Mac OS X 10.4 Tiger - Page 15". Retrieved September 20, 2009.
  6. ^ "Mac Dev Center - Core Image Kernel Language Reference - Introduction". Retrieved September 20, 2009.
  7. ^ "Mac Dev Center - Core Image Programming Guide - Core Image Concepts - Core Image and the GPU". Retrieved September 20, 2009.
  8. ^ "ArsTechnica - Mac OS X 10.4 Tiger - Page 15". Retrieved April 17, 2007.
  9. ^ "Mac Dev Center - Core Image Programming Guide - Core Image Concepts". Retrieved September 20, 2009.
  10. ^ a b "Apple - Aperture - Tech Specs". Retrieved September 20, 2009.
  11. ^ "Mac Dev Center - Image Kit Programming Guide - Introduction to Image Kit Programming Guide". Retrieved September 20, 2009.
  12. ^ Singh 2006, p. 97.
  13. ^ "Mac Dev Center - Core Image Programming Guide - Core Image Concepts - Color Components and Premultiplied Alpha". Retrieved September 20, 2009.
  14. ^ a b "Mac OS X 10.4 - Requirements for Quartz Extreme and Core Image Graphics". Retrieved September 20, 2009.
  15. ^ "Mac Dev Center - Core Image Filter Reference". Retrieved September 20, 2009.