The Core Image framework provides three main classes that handle image processing in iOS and OS X application development:

1. CIFilter - a mutable object that represents an effect. A filter object has at least one input parameter and produces an output image. We configure a filter by setting values for its input parameters, either keeping their preset defaults or changing them.

2. CIImage - an immutable object that represents an image. This class holds the image data and can be created from a UIImage, from an image file, from pixel data, or from the output of another CIFilter object. We can also synthesize the image data ourselves.

3. CIContext - an object through which Core Image draws the results produced by a filter. All Core Image processing is done in a CIContext. This is somewhat similar to a Core Graphics or OpenGL context.
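The three classes above can be seen working together in a minimal Swift sketch. This is an illustration, not a definitive implementation: the filter name `CISepiaTone` is one of the built-in Core Image filters, while the function name `applySepia` and the intensity value are assumptions chosen for the example.

```swift
import CoreImage
import UIKit

// A minimal sketch: CIImage holds the input, CIFilter applies the effect,
// and CIContext renders the result. `applySepia` is an illustrative name.
func applySepia(to inputImage: UIImage) -> UIImage? {
    // CIImage: immutable object created here from a UIImage
    guard let ciImage = CIImage(image: inputImage) else { return nil }

    // CIFilter: mutable effect object; input parameters are set
    // via key-value coding
    guard let filter = CIFilter(name: "CISepiaTone") else { return nil }
    filter.setValue(ciImage, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)   // assumed intensity

    guard let output = filter.outputImage else { return nil }

    // CIContext: where the actual processing happens when we render
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```

Note that the work is deferred: creating the CIFilter and CIImage is cheap, and the pixels are only processed when the CIContext renders the output image.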