Patterns of parallel programming
Patterns of Parallel Programming, by Yan Drugalya, Software Developer, CloudIt

Presentation Transcript

  • Patterns of Parallel Programming. Prepared by Yan Drugalya, ydrugalya@gmail.com, @ydrugalya
  • Agenda
      – Why parallel?
      – Terms and measures
      – Building blocks
      – Patterns overview: pipeline and data flow, producer-consumer, map-reduce, other
  • Why Moore's law is not working anymore
      – Power consumption
      – Wire delays
      – DRAM access latency
      – Diminishing returns of more instruction-level parallelism
  • Power consumption. [chart: power density (W/cm²) of processors from the 8080 ('70) through Pentium® processors ('10), on a log scale from 1 to 10,000 W/cm² climbing past Hot Plate and Nuclear Reactor toward Rocket Nozzle and Sun's Surface levels]
  • Wire delays
  • Diminishing returns
      – '80s: 10 CPI → 1 CPI
      – '90s: 1 CPI → 0.5 CPI
      – '00s: multicore
  • "No matter how fast processors get, software consistently finds new ways to eat up the extra speed." (Herb Sutter)
  • Survival To scale performance, put many processing cores on the microprocessor chip New Moore’s law edition is about doubling of cores.
  • Terms & Measures
      – Work = T1 (total running time on one processor)
      – Span = T∞ (running time on unlimited processors, the critical path)
      – Work Law: Tp ≥ T1/P
      – Span Law: Tp ≥ T∞
      – Speedup: T1/Tp (linear: Θ(P); perfect: P)
      – Parallelism: T1/T∞
      – Greedy scheduler bound: Tp ≤ (T1 − T∞)/P + T∞
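A quick worked example with invented numbers ties the measures together. Suppose a computation has work T1 = 1000 and span T∞ = 100, running on P = 10 processors:

```latex
\text{Parallelism} = \frac{T_1}{T_\infty} = \frac{1000}{100} = 10
\qquad
\text{Work Law: } T_p \ge \frac{T_1}{P} = \frac{1000}{10} = 100
\qquad
\text{Span Law: } T_p \ge T_\infty = 100
```

```latex
\text{Greedy bound: } T_p \le \frac{T_1 - T_\infty}{P} + T_\infty
= \frac{1000 - 100}{10} + 100 = 190
```

So on 10 processors the running time is somewhere between 100 and 190, and adding processors beyond the parallelism (10) cannot push speedup past 10×.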
  • Definitions
      – Concurrent: several things happening at the same time
      – Multithreaded: multiple execution contexts
      – Parallel: multiple simultaneous computations
      – Asynchronous: not having to wait
  • Dangers
      – Race conditions
      – Starvation
      – Deadlock
      – Livelock
      – Optimizing compilers
      – …
  • Data parallelism: Parallel.ForEach(letters, ch => Capitalize(ch));
  • Task parallelism: Parallel.Invoke(() => Average(), () => Minimum(), …);
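The two one-liners above can be combined into a minimal compilable sketch. The input data and the Average/Minimum bodies are invented here for illustration; only the Parallel.ForEach and Parallel.Invoke calls come from the slides:

```csharp
using System;
using System.Threading.Tasks;

class ParallelBasics
{
    static void Main()
    {
        // Data parallelism: the same operation applied to every element,
        // with iterations distributed across worker threads.
        var letters = new[] { 'a', 'b', 'c', 'd' };
        Parallel.ForEach(letters, ch => Console.WriteLine(char.ToUpper(ch)));

        // Task parallelism: different operations running simultaneously.
        var numbers = new[] { 3, 1, 4, 1, 5 };
        Parallel.Invoke(
            () => Console.WriteLine($"Average: {Average(numbers)}"),
            () => Console.WriteLine($"Minimum: {Minimum(numbers)}"));
    }

    static double Average(int[] xs)
    {
        double sum = 0;
        foreach (var x in xs) sum += x;
        return sum / xs.Length;
    }

    static int Minimum(int[] xs)
    {
        int min = xs[0];
        foreach (var x in xs) if (x < min) min = x;
        return min;
    }
}
```

Note that Parallel.ForEach gives no ordering guarantee: the upper-cased letters may print in any order.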
  • Fork-Join
      – Additional work may be started only when specific subsets of the original elements have completed processing
      – All elements should be given the chance to run even if one invocation fails (Ping)
      [diagram: Fork → ComputeMean / ComputeMedian / ComputeMode → Join]

        Parallel.Invoke(
            () => ComputeMean(),
            () => ComputeMedian(),
            () => ComputeMode());

        static void MyParallelInvoke(params Action[] actions)
        {
            var tasks = new Task[actions.Length];
            for (int i = 0; i < actions.Length; i++)
                tasks[i] = Task.Factory.StartNew(actions[i]);
            Task.WaitAll(tasks);
        }
  • Pipeline pattern
      [diagram: Task 1 → Task 2 → Task 3]

        Task<int> T1 = Task.Factory.StartNew(() =>
            { return result1(); });
        Task<double> T2 = T1.ContinueWith((antecedent) =>
            { return result2(antecedent.Result); });
        Task<double> T3 = T2.ContinueWith((antecedent) =>
            { return result3(antecedent.Result); });
  • Producer/Consumer
      [diagram: Disk/Net → Read 1 / Read 2 / Read 3 → BlockingCollection<T> → Process / Process / Process]
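A minimal sketch of the diagram above, assuming a single producer feeding a BlockingCollection<string> and three consumers; the item strings and console output are invented for illustration:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ProducerConsumer
{
    static void Main()
    {
        // Bounded, so a fast producer cannot outrun slow consumers.
        var queue = new BlockingCollection<string>(boundedCapacity: 100);

        var producer = Task.Run(() =>
        {
            foreach (var block in new[] { "read 1", "read 2", "read 3" })
                queue.Add(block);        // blocks when the queue is full
            queue.CompleteAdding();      // signals "no more items"
        });

        var consumers = new Task[3];
        for (int i = 0; i < consumers.Length; i++)
            consumers[i] = Task.Run(() =>
            {
                // GetConsumingEnumerable ends once CompleteAdding has been
                // called and the queue drains; each item goes to exactly
                // one consumer.
                foreach (var block in queue.GetConsumingEnumerable())
                    Console.WriteLine($"processing {block}");
            });

        producer.Wait();
        Task.WaitAll(consumers);
    }
}
```

BlockingCollection<T> handles the synchronization, so neither side needs explicit locks.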
  • Other patterns
      – Speculative execution
      – APM, the asynchronous programming model (IAsyncResult, Begin/End pairs)
      – EAP, the event-based asynchronous pattern (Operation/Callback pairs)
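The slides do not show code for speculative execution. A minimal sketch, assuming two interchangeable placeholder computations of the same answer: launch both, take whichever finishes first, and cancel the loser:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

class Speculative
{
    static void Main()
    {
        using var cts = new CancellationTokenSource();

        // Two invented candidate computations of the same result; in a real
        // scenario these would be different algorithms or data sources.
        var candidates = new[]
        {
            Task.Run(() => { Thread.Sleep(50);  return 42; }, cts.Token),
            Task.Run(() => { Thread.Sleep(500); return 42; }, cts.Token),
        };

        // Block until any one task completes, then cancel the rest so the
        // losing computation stops consuming resources.
        int winner = Task.WaitAny(candidates);
        cts.Cancel();

        Console.WriteLine($"First result: {candidates[winner].Result}");
    }
}
```

The trade-off is deliberate waste: total work roughly doubles in exchange for the latency of the fastest path.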
  • References
      – Patterns for Parallel Programming: Understanding and Applying Parallel Patterns with the .NET Framework 4
      – Pluralsight: Introduction to Async and Parallel Programming in .NET 4; Async and Parallel Programming: Application Design
      – The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software
      – Chapter 27, "Multithreaded Algorithms", from Introduction to Algorithms, 3rd edition