gcdtmp: Presentation Transcript

  • Blocks & GCD
  • • History • GCD • Queues • Blocks • Syntax and Examples • Demo
  • Grand Central Dispatch
  • History • Historically, microprocessors gained speed by running at faster and faster clock speeds, and software automatically became faster. • Processor clock speeds began to reach a limit because power consumption and heat became problematic, particularly for mobile systems. • CPU vendors shifted their focus from increasing clock speed to putting multiple processor cores into a single CPU. • Software no longer automatically becomes faster.
  • History • We had to use threads to exploit multiple cores, but NSThread is not that simple to use for each and every small data-processing task, and it requires locks. • NSOperationQueue was an alternative, but it is not as lightweight and requires some boilerplate code. • Or use performSelectorInBackground:withObject: and performSelectorOnMainThread:withObject:waitUntilDone: (a sketch of this pre-GCD pattern follows below).
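A minimal sketch of the performSelector-based approach mentioned above (doSomeWork:, process:, and updateUIWithResult: are made-up names for illustration; they are not from the original slides):

    // Pre-GCD style: push work onto a background thread, then hop back
    // to the main thread to update the UI.
    - (void)startWork {
        // Each call to performSelectorInBackground: spins up a new thread.
        [self performSelectorInBackground:@selector(doSomeWork:) withObject:@"payload"];
    }

    - (void)doSomeWork:(id)data {
        // Runs on a background thread.
        id result = [self process:data];   // hypothetical helper
        [self performSelectorOnMainThread:@selector(updateUIWithResult:)
                               withObject:result
                            waitUntilDone:NO];
    }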
  • Solution • The dominant model for concurrent programming (threads and locks) is too difficult to be worth the effort for most applications. To write an efficient application for multi-core using threads, you need to: – Break each logical task down to a single thread – Lock data that is in danger of being changed by two threads at once – Build a thread manager to run only as many threads as there are available cores – Hope that no other applications running on the system are using the processor cores
  • How? • GCD shifts the responsibility for managing threads and their execution from applications to the operating system. • Units of work are described as blocks in your code, while queues are used to organize blocks based on how you believe they need to be executed. • GCD has a multicore execution engine that reads the queues created by each application and assigns work from the queues to the threads it is managing.
  • Hooray! • Apple introduced Grand Central Dispatch and Blocks in Snow Leopard, and decided to drop support for PPC. • Apple came out with multicore handhelds, and soon after, Blocks and GCD were announced for iOS.
  • Programming Model • Blocks are used as the unit of work • Dispatch objects are reference counted, using dispatch_retain() and dispatch_release() • Queues (four system defined) are used to execute blocks – Serial / Concurrent • Event sources – associate blocks/queues with an asynchronous event source, e.g. a timer or a socket • A thread pool of at most 512 threads can be maintained; old threads are reused
  • Queues
  • Queues • Most of the intelligence behind Grand Central Dispatch is provided by queues. – Global queues – Private queues – Main queue • A queue can execute an operation synchronously or asynchronously
  • Queues
    dispatch_queue_t dispatch_get_global_queue(long priority, unsigned long flags);
      DISPATCH_QUEUE_PRIORITY_HIGH
      DISPATCH_QUEUE_PRIORITY_DEFAULT
      DISPATCH_QUEUE_PRIORITY_LOW
    dispatch_queue_t dispatch_queue_create(const char *label, dispatch_queue_attr_t attr);
    dispatch_queue_t dispatch_get_main_queue(void);
  • Using  Queues  
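The code screenshot from this slide is not in the transcript. A minimal sketch of the typical pattern (imageView and loadImage are made-up names): do the heavy work on a global queue, then hop back to the main queue for UI updates.

    dispatch_queue_t global = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_async(global, ^{
        // Heavy work off the main thread (loadImage is a hypothetical helper).
        UIImage *image = [self loadImage];
        dispatch_async(dispatch_get_main_queue(), ^{
            // UI updates must happen on the main queue.
            self.imageView.image = image;
        });
    });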
  • Using  Queues  
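Another common pattern this slide may have shown (sketch only): create a private serial queue and submit blocks to it; blocks on a serial queue run one at a time, in FIFO order.

    // Private serial queue; the label is just an identifier used in debugging tools.
    dispatch_queue_t queue = dispatch_queue_create("com.punecocoa.demo", NULL);

    dispatch_async(queue, ^{ /* task 1 */ });
    dispatch_async(queue, ^{ /* task 2, runs only after task 1 finishes */ });

    // Dispatch objects are reference counted; release the queue when done with it.
    dispatch_release(queue);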
  • Using Queues • The block submitted with the barrier function does not run concurrently with any other work on that queue • Pointless on a serial queue • Non-functional when used on the global queues (there it behaves like an ordinary dispatch_async) • A sketch follows below
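A minimal reader/writer sketch of dispatch_barrier_async on a private concurrent queue (the cache dictionary is made up for illustration; DISPATCH_QUEUE_CONCURRENT requires iOS 4.3 / OS X 10.7 or later):

    dispatch_queue_t queue = dispatch_queue_create("com.punecocoa.cache", DISPATCH_QUEUE_CONCURRENT);
    NSMutableDictionary *cache = [NSMutableDictionary dictionary];

    // Readers may run concurrently with each other.
    dispatch_async(queue, ^{
        NSLog(@"%@", [cache objectForKey:@"key"]);
    });

    // The barrier block waits for in-flight readers, runs alone, then lets readers resume.
    dispatch_barrier_async(queue, ^{
        [cache setObject:@"value" forKey:@"key"];
    });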
  • Blocks
  • Blocks:What is it? •  Blocks are a nonstandard extension added by Apple Inc. to the C, C++, and Objective-C programming languages that uses a lambda expression-like syntax to create closures within these languages. Blocks are supported for programs developed for Mac OS X 10.6+ and iOS 4.0+. – wikipedia
  • What is it? • A block is an anonymous inline collection of code that: – Has a typed argument list just like a function
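For example (a sketch, not from the original slides), a block with a typed argument list:

    // A block taking two typed int arguments, assigned to a block variable.
    int (^add)(int, int) = ^(int a, int b) {
        return a + b;
    };
    int sum = add(2, 3);   // 5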
  • What is it? • A block is an anonymous inline collection of code that: – Has an inferred or declared return type
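The return type can be written explicitly in the block literal or left for the compiler to infer (sketch):

    // Declared return type:
    NSString *(^greet)(NSString *) = ^NSString *(NSString *name) {
        return [NSString stringWithFormat:@"Hello, %@", name];
    };
    // Inferred return type: the compiler deduces double from the return statement.
    double (^half)(double) = ^(double x) { return x / 2.0; };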
  • What is it? • A block is an anonymous inline collection of code that: – Can capture state from the lexical scope within which it is defined
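Captured variables are copied into the block as read-only values at the point the block is created (sketch):

    int multiplier = 7;
    int (^times)(int) = ^(int value) {
        // multiplier is captured by value; it is read-only inside the block.
        return value * multiplier;
    };
    NSLog(@"%d", times(3));   // 21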
  • What is it? • A block is an anonymous inline collection of code that: – Can optionally modify the state of the lexical scope
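Marking a variable with __block lets the block modify it (sketch):

    __block int counter = 0;
    void (^increment)(void) = ^{
        counter++;          // legal only because counter is declared __block
    };
    increment();
    increment();
    NSLog(@"%d", counter);  // 2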
  • What is it? • A block is an anonymous inline collection of code that: – Can share the potential for modification with other blocks defined within the same lexical scope • Multiple blocks in the same lexical scope share the same instance of a __block variable. • A __block variable is effectively passed by reference, so it can be read and modified by multiple blocks.
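Two blocks defined in the same scope sharing one __block variable (sketch):

    __block int shared = 0;
    void (^producer)(void) = ^{ shared += 10; };
    void (^consumer)(void) = ^{ NSLog(@"shared = %d", shared); };
    producer();
    consumer();   // logs "shared = 10": both blocks see the same storage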
  • What is it? • A block is an anonymous inline collection of code that: – Can continue to share and modify state defined within the lexical scope (the stack frame) after the lexical scope (the stack frame) has been destroyed • A copied block keeps a strong reference to its __block variables, so it can keep working on them even after the current stack frame is destroyed. • The compiler and runtime arrange that all variables referenced from the block are preserved for the life of all copies of the block.
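A sketch of state outliving its stack frame: the returned block is copied to the heap and carries its __block counter with it (makeCounter is a made-up name, not from the slides):

    // Returns a heap block; each call gets its own counter that survives
    // after makeCounter's stack frame is gone.
    int (^makeCounter(void))(void) {
        __block int count = 0;
        int (^counter)(void) = ^{ return ++count; };
        return [counter copy];   // move the block (and count) to the heap
    }

    int (^tick)(void) = makeCounter();
    tick();                  // 1
    NSLog(@"%d", tick());    // 2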
  • What is it? • You can copy a block and even pass it to other threads for deferred execution (or, within its own thread, to a runloop). (From plain C, use Block_copy() instead of -copy.)
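A sketch of copying a block for deferred execution from plain C; Block_copy/Block_release are the C-level equivalents of -copy/-release:

    #include <Block.h>
    #include <stdio.h>

    void (^work)(void) = ^{ printf("deferred\n"); };

    // Copy the stack block to the heap so it stays valid after this scope ends.
    void (^heapWork)(void) = Block_copy(work);
    // ... later, possibly on another thread ...
    heapWork();
    Block_release(heapWork);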
  • Blocks Usage • Blocks are used for lightweight tasks; they typically represent small, self-contained pieces of code. • They are particularly useful as a means of encapsulating units of work that may be executed concurrently, or over items in a collection, or as a callback when another operation has finished.
  • Small,  Self  contained  
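The code screenshot from this slide is not in the transcript. A typical small, self-contained block is a comparator passed to a sort (sketch):

    NSArray *names = [NSArray arrayWithObjects:@"Charlie", @"alice", @"Bob", nil];
    NSArray *sorted = [names sortedArrayUsingComparator:^NSComparisonResult(id a, id b) {
        // Self-contained: everything the block needs arrives through its arguments.
        return [a caseInsensitiveCompare:b];
    }];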
  • Work over items in a collection
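A sketch of block-based enumeration over a collection:

    NSArray *files = [NSArray arrayWithObjects:@"a.png", @"b.png", nil];
    [files enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
        NSLog(@"%lu -> %@", (unsigned long)idx, obj);
        // Set *stop = YES to end enumeration early.
    }];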
  • Callbacks • Asynchronous network tasks • Snippet from MKNetworkEngine.m by @mugunthkumar (the snippet itself is not reproduced in this transcript; a generic sketch follows below)
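Since the original MKNetworkEngine snippet is not included, here is a generic completion-block callback sketch using Foundation's NSURLConnection (available since iOS 5; the URL is made up):

    NSURLRequest *request = [NSURLRequest requestWithURL:
                                [NSURL URLWithString:@"http://example.com/data.json"]];
    [NSURLConnection sendAsynchronousRequest:request
                                       queue:[NSOperationQueue mainQueue]
                           completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
        if (error) {
            NSLog(@"request failed: %@", error);
        } else {
            NSLog(@"received %lu bytes", (unsigned long)[data length]);
        }
    }];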
  • Usage in Cocoa • Asynchronous UI tasks (sketch below)
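A sketch of the block-based UIKit animation API, a common block callback in Cocoa Touch (infoView is a made-up property):

    [UIView animateWithDuration:0.3
                     animations:^{
                         // Animatable property changes go in the animations block.
                         self.infoView.alpha = 0.0;
                     }
                     completion:^(BOOL finished) {
                         // Runs on the main thread when the animation ends.
                         [self.infoView removeFromSuperview];
                     }];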
  • Under the Hood • Extracted from "Pro Multithreading and Memory Management for iOS and OS X with ARC, Grand Central Dispatch, and Blocks" by Kazuki Sakamoto and Tomohiko Furumoto
    void (^blk)(void) = ^{ printf("Block\n"); };
    is translated by the compiler to
    static void __main_block_func_0(struct __main_block_impl_0 *__cself) { printf("Block\n"); }
    along with a lot of other generated boilerplate code that supports invocation of this function and other block operations (sharing a variable, copying, passing by reference).
  • Dispatch  Source
  • Dispatch Source • A dispatch source is an object that monitors for one of the following events: – Mach port send right state changes – Mach port receive right state changes – External process state change – File descriptor ready for read – File descriptor ready for write – Filesystem node event (kqueue) – POSIX signal – Custom timer – Custom event
  • Dispatch  Source  example  
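The example slide's code is not in the transcript; a minimal timer dispatch source sketch (fires once per second on a global queue):

    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_source_t timer = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, queue);

    // Fire every second, with 100 ms of allowed leeway.
    dispatch_source_set_timer(timer,
                              dispatch_time(DISPATCH_TIME_NOW, 0),
                              1ull * NSEC_PER_SEC,
                              100ull * NSEC_PER_MSEC);
    dispatch_source_set_event_handler(timer, ^{
        NSLog(@"tick");
    });
    dispatch_resume(timer);   // sources are created in a suspended state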
  • Thank You! • Sources: – Apple Documentation – libdispatch Wiki – GCD Technology Brief • Prepared for PuneCocoa by Prashant Rane (@the69geeks)