  •
  • I geek out about Core Image on a regular basis.
    Excited! Please interrupt me with questions.
  • Download links for the source and presentation (on next page).
  • Hi! I’m an iOS dev nerd - love this stuff.
    CEO
    Mobile Dev @ CX
    Hit me on Twitter
  • Experience working with photos and effects...
    Pictwo (in market) - blends two photos together.
    Featured by Apple for 2 weeks in “New and Noteworthy”.
    PicQuick is shipping in a few weeks - curated effects for your photos.
    We’ve created unique effects not seen anywhere else.
    While creating these two apps I’ve tried almost every way to create effects on images in iOS except raw data processing.
    Apple gives you a lot of powerful tools for processing images.
  • Underlying technologies:
    Pictwo uses Quartz 2D.
    PicQuick uses Core Image.
    Pictwo will be migrated to Core Image because it’s superior for photo blending and effects.
    There are many reasons why it’s better for these applications, though at the time of Pictwo’s launch Core Image wasn’t mature enough on iOS.
  • Full disclosure: much of this information is from sessions 510 & 511 of WWDC 2012.
    510 - Getting Started With Core Image: a very good introduction to the basics.
    511 - Core Image Techniques: how to get more power and performance out of Core Image, and its use in games.
    I’m going to take it a bit further with some detailed source code I’m excited to share.
  • Hipstaroid is a real-time camera effects app.
    Good example - shows applying effects to still images and live video.
    Also gives me the chance to show how to encapsulate multiple filters into recipes.
  • (quick demo)
    Shows a live preview of the effect being applied.
    Shows a list of effects, applying each effect to a sample image.
    Had fun designing some UI based on a 35mm camera.
  • Basically, Core Image performs some seriously complicated math on your image data.
    It consists of filters that perform per-pixel operations on image data.
    Can be used for:
    + Images
    + Video
    + Effects in games
    + Other applications (e.g. analysis of NASA image data)
    (because it has flexible inputs and outputs)
  • Games!!!
    OpenGL textures with Core Image.
    From WWDC session 511.
  • With Core Image you can chain filters <-- the most useful part.
    Chaining allows you to create unique effects.
    Core Image optimizes those chains for you by doing complex matrix operations (math I don’t even want to think about).
  • 93 filters on iOS 6 - greatly expanded from iOS 5.
    The smaller filter set was the reason I didn’t use Core Image for Pictwo.
    Something to keep in mind for previous OS support.
    Core Image provides runtime introspection methods to find out what is supported on a device.
  • At runtime you can query Core Image for a ton of information.
    Need to know which filters are available? Use “filterNamesInCategory”.
    Need to know the inputs for a filter you are using? Use “attributes”.
    Attributes show:
    + key for each input
    + data type of inputs
    + default values, value ranges, and some filter-specific information
    This lets you write one code path that adapts to each specific filter type because the interface is standard.
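  The introspection calls mentioned above might look like this in practice (a minimal Objective-C sketch; the `CISepiaTone` lookup is just an illustration):

```objc
#import <CoreImage/CoreImage.h>

// Every built-in filter available on this OS version.
NSArray *names = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
NSLog(@"%lu filters available", (unsigned long)names.count);

// Inspect one filter's inputs: keys, data types, defaults, and ranges.
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
NSDictionary *intensity = [sepia attributes][@"inputIntensity"];
NSLog(@"default %@ (range %@ - %@)",
      intensity[kCIAttributeDefault],
      intensity[kCIAttributeSliderMin],
      intensity[kCIAttributeSliderMax]);
```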
  • The API is interchangeable between OS X and iOS - usually you just need to use the NS equivalent classes.
    OS X has 130 built-in filters plus extendable kernels so you can build your own.
    I’ve found that by combining the 93 filters in iOS, I’ve never needed to create my own.
  • Plenty of source options for your image data:
    + photo library
    + video feed
    + image data in memory - stream straight from the web
  • Can do more than still image and video processing.
    Send to OpenGL textures for effects in games.
    Raw byte output for custom applications.
  • Combines filters when possible, so you don’t have to worry about operations on matrices.
    Pipelines the filter operations through the GPU == speedy.
  • Core Image groups filters into categories.
    Search “Core Image Filter Reference” in Google or use the link.
  • Example of: Color Effects filter
  • Example of: Compositing Operations.
    Takes 2 inputs - one of which is a generator here.
  • Example of: Distortion Effects
  • Example of: Blur and Sharpen Effects.
    These are new in iOS 6 and are certainly the most important filter additions.
    Core Image uses them as the basis for many filters, such as Bloom and Gloom.
    Almost every effect in PicQuick uses some sort of blur or sharpen effect.
    NOTE: they are also the most computationally intense.
  • I like to think of Core Image’s filters as block diagrams.
    All filters provide an output.
    Most will take an input (generators don’t).
    Compositing filters take two inputs.
    Most will have one or more inputs that affect their operation - parameters.
    Think of a filter as a block you can flow your image data through.
  • Source image data will always be a CIImage.
    Creating a filter is easy - just pass in the filter name as a string.
    Filters use key-value coding for all inputs and outputs.
    With the sepia tone filter, we set the input image (our source image) and an intensity value.
    The effect can be varied by changing that intensity value.
    A CIContext is prepared in order to be used for rendering.
    The result is rendered to a CGImage and then converted to a UIImage, which can be presented easily on screen.
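  A rough Objective-C sketch of the steps just described (`sourceImage` is an assumed `UIImage` loaded elsewhere; the intensity value is illustrative):

```objc
#import <UIKit/UIKit.h>
#import <CoreImage/CoreImage.h>

CIImage *input = [CIImage imageWithCGImage:sourceImage.CGImage];

// Configure the filter via key-value coding.
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setValue:input forKey:kCIInputImageKey];
[sepia setValue:@0.8 forKey:kCIInputIntensityKey];

// Render: CIContext -> CGImage -> UIImage.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:sepia.outputImage
                                   fromRect:sepia.outputImage.extent];
UIImage *result = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage); // createCGImage returns +1; ARC does not manage CF types
```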
  • How do we chain filters? Very simple!
    Just hook the output of one filter to the input of another filter.
  • This is an extension of the sepia tone example.
    We add a hue filter to the filter chain.
    Set the input image of the hue filter to the output image of the sepia filter.
    Use the hue filter’s output image when rendering.
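  A sketch of the extra link in the chain, building on a `sepia` filter configured as in the simple sepia example (the angle value is illustrative):

```objc
CIFilter *hue = [CIFilter filterWithName:@"CIHueAdjust"];
[hue setValue:sepia.outputImage forKey:kCIInputImageKey]; // chain: sepia -> hue
[hue setValue:@1.5 forKey:kCIInputAngleKey];              // radians

// When rendering, use the last filter's output:
CIImage *finalImage = hue.outputImage;
```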
  • The CIContext is what does the rendering for Core Image.
    It’s flexible in where it can render:
    Still images - CGImage
    EAGLContext - OpenGL, the fastest way to get your effects on screen
    CVPixelBufferRef - back into a video stream
    data - your own custom implementation
  • A CIContext can run on the GPU or the CPU; it defaults to the GPU.
    The GPU is fast, but not as accurate as the CPU.
    The CPU is slower, but more accurate, and can handle larger image sizes.
    The CPU can also keep running in the background after a user closes your app.
    Use the CPU when you don’t have to present the output to users right away.
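  Choosing the renderer is a one-option change (a minimal sketch):

```objc
// GPU-backed context (the default).
CIContext *gpuContext = [CIContext contextWithOptions:nil];

// Software renderer: slower, full precision, handles larger images,
// and can keep working after the app moves to the background.
CIContext *cpuContext = [CIContext contextWithOptions:
    @{kCIContextUseSoftwareRenderer: @YES}];
```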
  • That’s the basics - enough to build the sample app I showed.
    I think Core Image is pretty neat.
    Still not convinced Core Image is the way to go for image processing on iOS?
    Let’s see it in action...
  • (switch to Xcode)
  • Start with the film selection view.
    Applies the filter stack (recipe) to a still image (puppy).
    Composed of a horizontally scrollable UIScrollView with a background that makes it look like film.
    Each preview is a UIButton with its image set to the rendered output of the recipe.
  • FilmSelectionViewController.m
    Uses NSOperationQueues to keep rendering off the main UI thread.
    Sends a film and source image to the DarkRoom so it can apply the effect.
    (simple here)
  • DarkRoom.m
    Takes a Film object and a source image and generates a UIImage output from them.
    The Film object is simply a CIFilter object (like the sample before).
    We use the source size (extent) to set the output size.
    Don’t forget to use CFRelease or you will leak, even with ARC.
  • Film.h
    What is the Film object?
    A simple subclass of CIFilter that only defines one important piece (line 2).
    All the effects (films) derive from this class.
  • Simplest Film: simply inverts each pixel.
  • To make your own CIFilter you simply override the outputImage method.
    This is how Apple builds their own filters.
    There are other optional methods you can override, but this one must be implemented in order to provide output.
    In this case: creates a CIColorInvert filter, ties the inputImage into it, then returns the output.
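  A sketch of what that override could look like, assuming `Film` exposes an `inputImage` property as described above:

```objc
// Inversion: the simplest Film - inverts every pixel.
@implementation Inversion

- (CIImage *)outputImage {
    CIFilter *invert = [CIFilter filterWithName:@"CIColorInvert"];
    [invert setValue:self.inputImage forKey:kCIInputImageKey];
    return invert.outputImage;
}

@end
```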
  • Sepia Tone + Vignette
    Gives it the look of a photograph that has faded, especially at the edges.
  • This time we have two CIFilter objects, so they need to be connected together (as shown in the previous example).
    The CISepiaTone filter takes the inputImage as its inputImage.
    The CIVignette filter takes the CISepiaTone’s output as its inputImage.
    The vignette filter’s output is the Film’s output - piece of cake!
    Both of these filters also have inputs (explain).
    Radius is a distance attribute, so it needs to be scaled; I usually scale along the width.
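  The two-filter recipe might be sketched like this (the intensity values and width-scaling factor are illustrative, not the app’s exact numbers):

```objc
- (CIImage *)outputImage {
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:self.inputImage forKey:kCIInputImageKey];
    [sepia setValue:@1.0 forKey:kCIInputIntensityKey];

    CIFilter *vignette = [CIFilter filterWithName:@"CIVignette"];
    [vignette setValue:sepia.outputImage forKey:kCIInputImageKey]; // chain
    [vignette setValue:@0.9 forKey:kCIInputIntensityKey];
    // inputRadius is a distance attribute, so scale it to the image width.
    CGFloat radius = self.inputImage.extent.size.width * 0.01;
    [vignette setValue:@(radius) forKey:kCIInputRadiusKey];

    return vignette.outputImage;
}
```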
  • B&W
  •
  •
  • Low Rez
  •
  • (how is the film applied in real time?)
  • Need an AVCaptureSession to bring in the camera feed.
    (was messing with grabbing still images from the feed)
    Cache the Core Image context so it only needs to be created once.
    Store the currently selected film.
  • Set up an AVCaptureSession.
    (This is my first time - not an expert.)
    The important piece: we add a data output in 32BGRA format and set up a callback to our code for each captured image frame.
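  A sketch of that setup (error handling and preset selection omitted; the queue name is arbitrary):

```objc
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
[session addInput:[AVCaptureDeviceInput deviceInputWithDevice:camera error:nil]];

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
// The important piece: ask for frames in 32BGRA, which CIImage accepts directly.
output.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey:
                             @(kCVPixelFormatType_32BGRA)};
// Callback for every captured frame, delivered on a private serial queue.
[output setSampleBufferDelegate:self
                          queue:dispatch_queue_create("camera", DISPATCH_QUEUE_SERIAL)];
[session addOutput:output];
[session startRunning];
```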
  • AVCaptureSession (once it’s set up) will call back into your code for each frame captured: captureOutput:didOutputSampleBuffer:fromConnection:.
    What it gives you is a CVPixelBuffer.
    You always need a CIImage as input to your Core Image filters.
    Here we are converting it and rotating it so it displays correctly.
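  The conversion step could be sketched as follows (the fixed rotation is one simple way to handle orientation; the shipped code may differ):

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Wrap the raw frame in a CIImage; no pixel copy happens here.
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // Rotate from the sensor's landscape orientation for portrait display.
    frame = [frame imageByApplyingTransform:CGAffineTransformMakeRotation(-M_PI_2)];
    // ...filter and display the frame...
}
```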
  • Now for the magic...
    Pass the Core Image frame into the selected film, and simply render its output; the CIContext is used to do this.
    Surprisingly, making a UIImage is fast enough to get near 30fps.
    To speed this up, you can replace imageWithCGImage with some OpenGL code that renders directly to an EAGLContext - I wanted to keep this code sample simple.
    Other than setting up a camera capture session, this is all there is to doing real-time effects on your camera preview.
    Enjoy the source - it’s yours!
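  Applying the film and presenting the result might look like this sketch (`currentFilm`, `ciContext`, and `previewImageView` are assumed names for the cached state described above):

```objc
// Run the captured frame through the selected film...
[self.currentFilm setValue:frame forKey:kCIInputImageKey];
CIImage *output = self.currentFilm.outputImage;

// ...and render it. For more speed, draw into an EAGLContext instead.
CGImageRef cgImage = [self.ciContext createCGImage:output
                                          fromRect:output.extent];
UIImage *uiImage = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);

dispatch_async(dispatch_get_main_queue(), ^{
    self.previewImageView.image = uiImage; // UIKit work belongs on the main thread
});
```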
  • Hi! I’m an iOS dev nerd - love this stuff.
    CEO
    Mobile Dev @ CX
    Hit me on Twitter

Transcript

  • 1. Core Image Making Instagram Like Effects
  • 2. The Goods - github.com/kylestew/Hipstaroid - slidesha.re/WstkjL
  • 3. github.com/kylestew/Hipstaroid - slidesha.re/WstkjL - CEO - inVnt LLC - Mobile Developer @ cx.com - @kyleRstewart
  • 4. Pictwo PicQuick
  • 5. Quartz 2D Core Image
  • 6. WWDC 2012 - Sessions 510 / 511
  • 7. Hipstaroid
  • 8. What is Core Image? Image processing framework for Images / Video
  • 9. Why is it useful? Filter chains / Optimized rendering
  • 10. 93 filters on iOS 6!
  • 11. Runtime Introspection - List available CoreImage filters / List attributes on a filter / Input data types / Default input values
  • 12. iOS <---> OS X - Very similar API between OS X and iOS / OS X has more filters / Extendable kernels in OS X
  • 13. Flexible Inputs
  • 14. Flexible Outputs
  • 15. Powerful - Combine filters / Renders on the GPU
  • 16. The Filters... Color Effects / Stylize Filters / Compositing Operations / Halftone Effects / Geometry Adjustment / Transition Effects / Tile Effects / Generators / Distortion Effects / Blur and Sharpen Effects - https://developer.apple.com/library/mac/#documentation/graphicsimaging/reference/CoreImageFilterReference/Reference/reference.html
  • 17. CISepiaTone
  • 18. CIScreenBlendMode
  • 19. CITwirlDistortion
  • 20. CIGaussianBlur
  • 21. Filter Block Diagram
  • 22. Simple Example - Sepia Tone
  • 23. Filter Chaining
  • 24. Filter Chaining
  • 25. CIContext - CoreImage renderer / Flexible rendering destinations: CGImageRef, EAGLContext, CVPixelBufferRef, data
  • 26. CPU vs GPU Context - GPU: fast, less accurate / CPU: slow, more accurate, larger image sizes / CPU can also be run in the background
  • 27. Hipstaroid - github.com/kylestew/Hipstaroid
  • 28. FilmSelectionViewController.m
  • 29. DarkRoom.m
  • 30. Film.h
  • 31. Inversion.m
  • 32. OldTimey.m
  • 33. BlackWhite.m ...
  • 34. BlackWhite.m
  • 35. LowRez.m
  • 36. CameraViewController.m
  • 37. CameraViewController.m
  • 38. CameraViewController.m ...
  • 39. CameraViewController.m
  • 40. github.com/kylestew/Hipstaroid - slidesha.re/WstkjL - CEO - inVnt LLC - Mobile Developer @ cx.com - @kyleRstewart