
What's a Core Image? An Image-Processing Framework on iOS and OS X


Flatiron students Steven Zhou and Heidi Hansen explain how Core Image works on iOS and OS X, helping developers process images efficiently without dealing with low-level interactions with the GPU or CPU.




1. Core Image (Steve Zhou & Heidi Hansen)
2. What is a Digital Image?
   • A collection of pixels, where each pixel represents a specific color.
   • Combining hundreds of thousands of pixels together forms a digital image.
   • It is a 2-dimensional array of pixels: X and Y, or rows and columns (see the raw-pixel sketch below).
   • "Digital image" usually refers to a raster image, also called a bitmap image.
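To make the "2-dimensional array of pixels" idea concrete, here is a minimal sketch (not from the slides) that renders a UIImage into a raw RGBA buffer and reads one pixel; the coordinates and asset name are assumed examples.

   #import <UIKit/UIKit.h>

   // Render an image into a raw RGBA byte buffer and read the pixel at (x, y).
   static void ReadOnePixel(UIImage *image, size_t x, size_t y)
   {
       CGImageRef cgImage = image.CGImage;
       size_t width  = CGImageGetWidth(cgImage);
       size_t height = CGImageGetHeight(cgImage);

       // 4 bytes per pixel: red, green, blue, alpha.
       uint8_t *pixels = calloc(width * height * 4, sizeof(uint8_t));
       CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
       CGContextRef context = CGBitmapContextCreate(pixels, width, height, 8, width * 4,
                                                    colorSpace,
                                                    (CGBitmapInfo)kCGImageAlphaPremultipliedLast);
       CGContextDrawImage(context, CGRectMake(0, 0, width, height), cgImage);

       // Row y, column x: the 2-D grid is laid out row by row in memory.
       size_t offset = (y * width + x) * 4;
       NSLog(@"Pixel (%zu, %zu): R=%d G=%d B=%d A=%d", x, y,
             pixels[offset], pixels[offset + 1], pixels[offset + 2], pixels[offset + 3]);

       CGContextRelease(context);
       CGColorSpaceRelease(colorSpace);
       free(pixels);
   }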
3. What is Core Image?
   • An image-processing framework on both iOS and OS X. It was first introduced to iOS in iOS 5 and was previously available only on OS X.
   • Designed for "real-time" image processing at the pixel level, for both still images and video.
   • Operates on image data types from multiple sources, e.g. Core Graphics, Core Video, and Image I/O.
   • Lets developers process images easily and efficiently without dealing with low-level interactions with the GPU or CPU.
   • Provides more than 90 built-in image-processing filters on iOS, feature-detection capability, and support for automatic image enhancement (the sketch below lists the filters at runtime).
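As a hedged sketch (not in the slides), the "more than 90 built-in filters" point can be checked by asking Core Image for its built-in filter names at runtime:

   #import <CoreImage/CoreImage.h>

   // Ask Core Image for every built-in filter name available on this OS version.
   NSArray *filterNames = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
   NSLog(@"%lu built-in filters available", (unsigned long)filterNames.count);
   for (NSString *name in filterNames) {
       NSLog(@"%@", name);
   }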
 

4. Key Core Image Classes (a combined sketch follows this list)
   • CIImage:
     1. An object that represents an image.
     2. You can create a CIImage object from different kinds of input.
   • CIFilter:
     1. An object that represents an effect.
     2. Has at least one input parameter.
     3. Produces a CIImage object as its output.
   • CIContext:
     1. An object that draws the result produced by a CIFilter, using either the CPU or the GPU rendering path.
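Here is a minimal sketch, assuming an image named "waterfall.jpg" in the app bundle, of the different kinds of input a CIImage can be created from:

   #import <CoreImage/CoreImage.h>
   #import <UIKit/UIKit.h>

   // From a UIImage (UIKit adds this initializer to CIImage).
   CIImage *fromUIImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"waterfall.jpg"]];

   // From a CGImage.
   CIImage *fromCGImage = [CIImage imageWithCGImage:[UIImage imageNamed:@"waterfall.jpg"].CGImage];

   // From a file URL.
   NSURL *url = [[NSBundle mainBundle] URLForResource:@"waterfall" withExtension:@"jpg"];
   CIImage *fromURL = [CIImage imageWithContentsOfURL:url];

   // From raw image data.
   CIImage *fromData = [CIImage imageWithData:[NSData dataWithContentsOfURL:url]];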
5. Apply Filter To An Image

   // 1. Create the context that will render the result.
   CIContext *context = [CIContext contextWithOptions:nil];
   // 2. Wrap the source image in a CIImage.
   CIImage *beginImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"waterfall.jpg"]];
   // 3. Create the filter and set its input image and parameter.
   CIFilter *filter = [CIFilter filterWithName:@"CIGammaAdjust"];
   [filter setValue:beginImage forKey:kCIInputImageKey];
   [filter setValue:@2.0 forKey:@"inputPower"];
   // 4. Ask the filter for its output image.
   CIImage *resultImage = [filter outputImage];
   // 5. Render the result into a CGImage.
   CGImageRef cgimage = [context createCGImage:resultImage fromRect:[resultImage extent]];
   // 6. Wrap the CGImage in a UIImage for display.
   UIImage *newImage = [UIImage imageWithCGImage:cgimage];
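Two hedged follow-ups to this example, reusing the resultImage from above: the caller owns the CGImageRef returned by createCGImage:fromRect:, and the context can be pinned to the CPU via an option when GPU rendering is not wanted.

   // Optional: force software (CPU) rendering instead of the GPU path.
   CIContext *cpuContext =
       [CIContext contextWithOptions:@{ kCIContextUseSoftwareRenderer : @YES }];

   CGImageRef cpuImage = [cpuContext createCGImage:resultImage fromRect:[resultImage extent]];
   UIImage *renderedImage = [UIImage imageWithCGImage:cpuImage];
   CGImageRelease(cpuImage);   // we created the CGImage, so we release it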
6. Chaining Multiple Filters

   CIImage *beginImage = [[CIImage alloc] initWithImage:[UIImage imageNamed:@"waterfall.jpg"]];

   // Each filter takes the previous filter's output image as its input image.
   CIImage *firstOutputImage = [CIFilter filterWithName:@"CISepiaTone"
       keysAndValues:kCIInputImageKey, beginImage, @"inputIntensity", @0.8, nil].outputImage;

   CIImage *secondOutputImage = [CIFilter filterWithName:@"CIHueAdjust"
       keysAndValues:kCIInputImageKey, firstOutputImage, @"inputAngle", @0.8, nil].outputImage;

   CIImage *thirdOutputImage = [CIFilter filterWithName:@"CIVibrance"
       keysAndValues:kCIInputImageKey, secondOutputImage, @"inputAmount", @0.8, nil].outputImage;

   Pipeline: beginImage -> CISepiaTone -> CIHueAdjust -> CIVibrance
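The deck also mentions automatic image enhancement; a hedged sketch of that, following the same chaining pattern, asks the image itself for a suggested filter chain:

   // Ask Core Image for auto-enhancement filters and chain them together.
   CIImage *enhanced = beginImage;
   NSArray *autoFilters = [beginImage autoAdjustmentFiltersWithOptions:nil];
   for (CIFilter *filter in autoFilters) {
       [filter setValue:enhanced forKey:kCIInputImageKey];
       enhanced = filter.outputImage;
   }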
7. Core Image Built-In Filters (a selection)
   • CIBoxBlur, CIDiscBlur, CIGaussianBlur, CIMedianFilter, CIMinimumComponent
   • CIPhotoEffectChrome, CIPhotoEffectFade, CIPhotoEffectInstant
   • CICircularScreen, CICMYKHalftone, CIDotScreen, CIHatchedScreen
   • CIAffineClamp, CIAffineTile, CIEightfoldReflectedTile, CIFourfoldReflectedTile, CIFourfoldRotatedTile
   • CIOverlayBlendMode, CISaturationBlendMode, CIScreenBlendMode, CISoftLightBlendMode
   • CIColorMap, CIColorMonochrome, CIColorPosterize, CIFalseColor
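Every filter in this list documents its own inputs; a small hedged sketch of how to discover them at runtime via CIFilter's attributes:

   // Inspect a filter's input keys and full attribute dictionary.
   CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
   NSLog(@"%@ inputs: %@", blur.name, blur.inputKeys);
   NSLog(@"Attributes: %@", [blur attributes]);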
8. Steve Zhou & Heidi Hansen. Thanks! Demo/Questions?
