Patterns of Parallel Programming
Patterns of Parallel Programming, by Yan Drugalya, Software Developer, CloudIt

Transcript

  • 1. Patterns of Parallel Programming. Prepared by Yan Drugalya (ydrugalya@gmail.com, @ydrugalya)
  • 2. Agenda
    – Why parallel?
    – Terms and measures
    – Building blocks
    – Patterns overview: pipeline and data flow, producer-consumer, map-reduce, other
  • 3. Why Moore's law is not working anymore
    – Power consumption
    – Wire delays
    – DRAM access latency
    – Diminishing returns of more instruction-level parallelism
  • 4. Power consumption. (Chart: power density in W/cm², climbing from the 8080 in the '70s through Pentium processors in the '00s, against reference lines for a hot plate, a nuclear reactor, a rocket nozzle, and the Sun's surface.)
  • 5. Wire delays
  • 6. Diminishing returns
    – 80s: 10 CPI → 1 CPI
    – 90s: 1 CPI → 0.5 CPI
    – 00s: multicore
  • 7. "No matter how fast processors get, software consistently finds new ways to eat up the extra speed." (Herb Sutter)
  • 8. Survival: to scale performance, put many processing cores on the microprocessor chip. The new edition of Moore's law is about doubling the number of cores.
  • 9. Terms & measures
    – Work = T1 (total running time on one processor)
    – Span = T∞ (running time on infinitely many processors; the critical path)
    – Work law: Tp >= T1/P
    – Span law: Tp >= T∞
    – Speedup: T1/Tp (linear: Θ(P); perfect: P)
    – Parallelism: T1/T∞
    – Greedy-scheduler bound: Tp <= (T1 - T∞)/P + T∞
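    A quick worked example of these laws (the numbers are illustrative, not from the deck):

        T_1 = 100,\ T_\infty = 10 \;\Rightarrow\; \text{parallelism} = T_1 / T_\infty = 10
        \text{With } P = 4:\quad T_P \le \frac{T_1 - T_\infty}{P} + T_\infty = \frac{100 - 10}{4} + 10 = 32.5
        \text{so speedup} \ge 100 / 32.5 \approx 3.1 \text{ out of a perfect } 4

    The guaranteed 32.5 sits between the work-law floor T1/P = 25 and the span-law floor T∞ = 10.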
  • 10. Definitions
    – Concurrent: several things happening at the same time
    – Multithreaded: multiple execution contexts
    – Parallel: multiple simultaneous computations
    – Asynchronous: not having to wait
  • 11. Dangers (a race-condition sketch follows this list)
    – Race conditions
    – Starvation
    – Deadlocks
    – Livelocks
    – Optimizing compilers
    – …
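    The first danger is easy to reproduce. A minimal C# sketch (the counters and iteration count are illustrative, not from the deck) showing lost updates from an unsynchronized increment, next to the Interlocked fix:

        using System;
        using System.Threading;
        using System.Threading.Tasks;

        class RaceDemo
        {
            static int unsafeCount;
            static int safeCount;

            static void Main()
            {
                // Iterations run concurrently across worker threads.
                Parallel.For(0, 1000000, _ =>
                {
                    unsafeCount++;                         // read-modify-write race: updates get lost
                    Interlocked.Increment(ref safeCount);  // atomic increment, no lost updates
                });
                Console.WriteLine($"unsafe = {unsafeCount}, safe = {safeCount} (expected 1000000)");
            }
        }

    On most runs the unsafe counter comes up short while the atomic one always reaches the expected total.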
  • 12. Data parallelism
    Parallel.ForEach(letters, ch => Capitalize(ch));
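    The slide shows only the call; a self-contained sketch (the letters array and the uppercasing body are stand-ins for the slide's Capitalize) looks like this:

        using System;
        using System.Threading.Tasks;

        class DataParallelismDemo
        {
            static void Main()
            {
                // Each element is processed independently, so the TPL can
                // partition the input across cores; order of output varies.
                string[] letters = { "alpha", "bravo", "charlie", "delta" };
                Parallel.ForEach(letters, ch => Console.WriteLine(ch.ToUpper()));
            }
        }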
  • 13. Task parallelism
    Parallel.Invoke(() => Average(), () => Minimum() …);
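    Here the units of work are distinct operations rather than elements of a collection. A runnable sketch (the data set and the Average/Minimum bodies are assumptions for illustration):

        using System;
        using System.Linq;
        using System.Threading.Tasks;

        class TaskParallelismDemo
        {
            static readonly int[] data = { 3, 1, 4, 1, 5, 9, 2, 6 };

            static void Average() { Console.WriteLine("Average: " + data.Average()); }
            static void Minimum() { Console.WriteLine("Minimum: " + data.Min()); }

            static void Main()
            {
                // Runs the independent delegates concurrently and blocks
                // until every one of them has completed.
                Parallel.Invoke(() => Average(), () => Minimum());
            }
        }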
  • 14. Fork-Join
    – Additional work may be started only when specific subsets of the original elements have completed processing
    – All elements should be given the chance to run even if one invocation fails (Ping)
    (Diagram: fork into ComputeMean, ComputeMedian, and ComputeMode, then join.)

        Parallel.Invoke(
            () => ComputeMean(),
            () => ComputeMedian(),
            () => ComputeMode());

        // Hand-rolled equivalent: fork one task per action, then join on all of them.
        static void MyParallelInvoke(params Action[] actions)
        {
            var tasks = new Task[actions.Length];
            for (int i = 0; i < actions.Length; i++)
                tasks[i] = Task.Factory.StartNew(actions[i]);
            Task.WaitAll(tasks);
        }
  • 15. Pipeline pattern. Each stage is a task chained to its predecessor, so a stage starts as soon as the previous one produces its result:

        Task<int> T1 = Task.Factory.StartNew(() =>
            { return result1(); });
        Task<double> T2 = T1.ContinueWith((antecedent) =>
            { return result2(antecedent.Result); });
        Task<double> T3 = T2.ContinueWith((antecedent) =>
            { return result3(antecedent.Result); });
  • 16. Producer/Consumer. (Diagram: Read 1, Read 2, Read 3 from disk/network feed a BlockingCollection<T>, which several Process stages drain.)
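    A minimal sketch of that diagram in code, with integers standing in for reads (the capacity, counts, and names are illustrative):

        using System;
        using System.Collections.Concurrent;
        using System.Threading.Tasks;

        class ProducerConsumerDemo
        {
            static void Main()
            {
                // Bounded buffer: producers block when it is full, consumers when empty.
                var buffer = new BlockingCollection<int>(boundedCapacity: 8);

                var producer = Task.Factory.StartNew(() =>
                {
                    for (int i = 0; i < 20; i++)
                        buffer.Add(i);        // stands in for a disk/net read
                    buffer.CompleteAdding();  // tell consumers no more items are coming
                });

                var consumers = new Task[3];
                for (int c = 0; c < consumers.Length; c++)
                    consumers[c] = Task.Factory.StartNew(() =>
                    {
                        // Yields items until the collection is marked complete and drained.
                        foreach (var item in buffer.GetConsumingEnumerable())
                            Console.WriteLine("Processed " + item);
                    });

                Task.WaitAll(consumers);
                producer.Wait();
            }
        }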
  • 17. Other patterns (an APM sketch follows this list)
    – Speculative execution
    – APM (Asynchronous Programming Model: IAsyncResult, Begin/End pairs)
    – EAP (Event-based Asynchronous Pattern: operation/callback pairs)
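    To make the APM shape concrete, a small sketch using Stream.BeginRead/EndRead (the buffer contents and the event that keeps the demo alive are illustrative):

        using System;
        using System.IO;
        using System.Text;
        using System.Threading;

        class ApmDemo
        {
            static void Main()
            {
                var stream = new MemoryStream(Encoding.UTF8.GetBytes("hello, APM"));
                var buffer = new byte[32];
                var done = new ManualResetEventSlim();

                // APM: Begin* kicks off the operation and returns immediately;
                // the callback calls the matching End* to harvest the result.
                stream.BeginRead(buffer, 0, buffer.Length, ar =>
                {
                    int bytesRead = stream.EndRead(ar);
                    Console.WriteLine("Read " + bytesRead + " bytes");
                    done.Set();
                }, null);

                done.Wait(); // block only so this demo does not exit early
            }
        }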
  • 18. References
    – Stephen Toub, "Patterns for Parallel Programming: Understanding and Applying Parallel Patterns with the .NET Framework 4"
    – Pluralsight: "Introduction to Async and Parallel Programming in .NET 4"; "Async and Parallel Programming: Application Design"
    – Herb Sutter, "The Free Lunch Is Over: A Fundamental Turn Toward Concurrency in Software"
    – "Multithreaded Algorithms", chapter 27 of Introduction to Algorithms, 3rd edition