Parallel Computing

  1. Parallel Computing
     Vikram Singh Slathia, Dept. of Computer Science, Central University of Rajasthan
  2. Parallel processing is a term used to denote a large class of techniques that provide simultaneous data-processing tasks in order to:
     • Save time and/or money
     • Solve larger problems
     Parallel computing is the simultaneous use of multiple compute resources to solve a computational problem.
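
To make the definition concrete, here is a minimal sketch of "multiple compute resources working on one problem" in C with OpenMP (the language and library suggested by the Quinn reference on the last slide). The array, its size, and the -fopenmp compile flag are illustrative assumptions, not part of the original slides.

```c
#include <stdio.h>
#include <omp.h>

int main(void)
{
    enum { N = 1000000 };
    static double a[N];
    double sum = 0.0;

    for (int i = 0; i < N; i++)
        a[i] = 1.0;                              /* fill with known data */

    /* Each thread sums a slice of the array; reduction(+:sum) merges the
       per-thread partial sums into one result when the loop finishes.   */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < N; i++)
        sum += a[i];

    printf("sum = %.0f (computed by up to %d threads)\n",
           sum, omp_get_max_threads());
    return 0;
}
```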
  3. The Universe is Parallel
     • Galaxy formation
     • Planetary movement
     • Weather and ocean patterns
     • Tectonic plate drift
     • Rush hour traffic
     • Automobile assembly lines
     • Building a jet
     • Ordering a hamburger at the drive-through
  4. Areas of Parallel Computing
     • Physics: applied, nuclear, particle, condensed matter, high pressure, fusion, photonics
     • Bioscience, biotechnology, genetics
     • Chemistry, molecular sciences
     • Geology, seismology
     • Mechanical engineering: from prosthetics to spacecraft
     • Electrical engineering, circuit design, microelectronics
     • Computer science, mathematics
  5. Why Use Parallel Computing?
     • Save time and/or money: in theory, throwing more resources at a task shortens its time to completion, with potential cost savings. Parallel computers can be built from cheap, commodity components.
     • Solve larger problems: many problems are so large and/or complex that it is impractical or impossible to solve them on a single computer, especially given limited memory.
     • Better response times: when a computing task is divided among a group of processors, it completes in less time.
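
The "save time" point can be checked directly. The sketch below is an assumption-laden example (the workload, problem size, and compile command such as gcc -fopenmp -lm are not from the slides): it times the same loop serially and then with OpenMP threads and prints the resulting speedup.

```c
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <omp.h>

#define N 10000000   /* problem size chosen arbitrarily for the demo */

int main(void)
{
    double *x = malloc(N * sizeof *x);
    if (!x) return 1;

    double t0 = omp_get_wtime();
    for (long i = 0; i < N; i++)              /* serial version          */
        x[i] = sin((double)i) * cos((double)i);
    double t_serial = omp_get_wtime() - t0;

    t0 = omp_get_wtime();
    #pragma omp parallel for                  /* same work, many threads */
    for (long i = 0; i < N; i++)
        x[i] = sin((double)i) * cos((double)i);
    double t_parallel = omp_get_wtime() - t0;

    printf("serial %.3fs  parallel %.3fs  speedup %.2fx\n",
           t_serial, t_parallel, t_serial / t_parallel);
    free(x);
    return 0;
}
```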
  6. Ways to classify parallel computers
     One of the most widely used classifications, in use since 1966, is Flynn's Taxonomy. Its four classes are:
     • Single Instruction, Single Data (SISD)
     • Single Instruction, Multiple Data (SIMD)
     • Multiple Instruction, Single Data (MISD)
     • Multiple Instruction, Multiple Data (MIMD)
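
Two of Flynn's classes can be sketched in code. The fragment below is illustrative only (the data and pragmas are assumptions): a SIMD-style loop applies one add instruction to many data elements at once, while a MIMD-style region runs independent instruction streams on different data.

```c
#include <stdio.h>
#include <omp.h>

#define N 8

int main(void)
{
    float a[N] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[N] = {8, 7, 6, 5, 4, 3, 2, 1};
    float c[N];

    /* SIMD: the compiler vectorises the loop so a single add instruction
       processes several elements of a and b per cycle.                  */
    #pragma omp simd
    for (int i = 0; i < N; i++)
        c[i] = a[i] + b[i];

    /* MIMD: two threads execute different instruction streams on
       different data at the same time.                                  */
    #pragma omp parallel sections
    {
        #pragma omp section
        for (int i = 0; i < N; i++) a[i] *= 2.0f;   /* one task scales a */
        #pragma omp section
        for (int i = 0; i < N; i++) b[i] += 1.0f;   /* another shifts b  */
    }

    printf("c[0]=%.0f  a[0]=%.0f  b[0]=%.0f\n", c[0], a[0], b[0]);
    return 0;
}
```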
  7. Some basic requirements for achieving parallel execution
     • An operating system capable of managing multiple processors
     • Computer systems/servers with built-in multiple processors and efficient message passing among them
     • Clustered nodes running application software such as Oracle RAC
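
As a small, hedged illustration of the first two requirements, the standard OpenMP runtime calls below let a program ask how many processors the operating system is managing before starting parallel work (clustering in the Oracle RAC sense is outside the scope of this sketch).

```c
#include <stdio.h>
#include <omp.h>

int main(void)
{
    printf("processors visible to the OS : %d\n", omp_get_num_procs());
    printf("threads OpenMP may create    : %d\n", omp_get_max_threads());

    #pragma omp parallel
    {
        #pragma omp single   /* print once, from whichever thread arrives first */
        printf("threads actually running     : %d\n", omp_get_num_threads());
    }
    return 0;
}
```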
  8. Conclusion
     • Parallel computing is fast.
     • Parallel computing is the future of computing.
  9. References
     Books
     • The New Turing Omnibus, A. K. Dewdney, Henry Holt and Company, 1993
     • Parallel Programming in C with MPI and OpenMP, Michael J. Quinn, McGraw-Hill Higher Education, 2003
     • Introduction to Parallel Computing, 2nd Edition, Ananth Grama, Pearson
     Links
     • Parallel Processing, http://www.dba-oracle.com/real_application_clusters_rac_grid/parallel.html
     • Internet Parallel Computing Archive, wotug.ukc.ac.uk/parallel
     • Introduction to Parallel Computing, www.llnl.gov/computing/tutorials/parallel_comp/#Whatis
  10. Thank you
