BASIC CONCEPTS OF PARALLELIZATION (CD)
by
K.B. Snega, M.Sc. (CS)
NADAR SARASWATHI COLLEGE OF ARTS AND SCIENCE, THENI
PARALLEL PROGRAMMING MODEL
• A programming model is a collection of program abstractions that provides a programmer with a simplified and transparent view of the computer H/W and S/W.
• A parallel programming model is an abstraction of a parallel computer architecture with which it is convenient to express algorithms and their composition in programs.
Five models are designed to exploit parallelism:
• Shared-variable model.
• Message-passing model.
• Data-parallel model.
• Object-oriented model.
• Functional and logic model.
SHARED VARIABLE MODEL
• Variables may be shared or restricted.
• This model can automatically generate the appropriate communication statements based on shared variables for SPMD (Single Program, Multiple Data) execution (see the sketch below).
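A minimal sketch of the shared-variable idea, written in C with OpenMP (an assumed notation, since the slides do not name a specific language): every thread reads and writes the shared array, and the reduction clause supplies the synchronization on the shared variable sum.

#include <stdio.h>
#include <omp.h>

int main(void) {
    double a[1000], sum = 0.0;          /* a and sum are shared variables */
    for (int i = 0; i < 1000; i++)
        a[i] = 0.5 * i;

    /* Each thread works on part of the shared array; the reduction clause
       generates the required communication/synchronization on sum. */
    #pragma omp parallel for reduction(+:sum)
    for (int i = 0; i < 1000; i++)
        sum += a[i];

    printf("sum = %f\n", sum);
    return 0;
}

Compile with an OpenMP-capable compiler (e.g. gcc -fopenmp); without the flag the same code simply runs sequentially.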
MESSAGE PASSING MODEL
• Synchronous message passing –
It must synchronize the sender process and the receiver process in time and space.
• Asynchronous message passing –
It does not require message sending and receiving to be synchronized in time and space; non-blocking communication can be achieved (see the sketch after this list).
• Distributing the computations –
Distribution is handled at the subprogram level rather than at the instruction or fine-grain process level of a tightly coupled multiprocessor.
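A hedged sketch of both styles using MPI in C (MPI is an assumption; the slides do not name a message-passing library): MPI_Ssend blocks until the receiver meets the sender, while MPI_Isend returns immediately and synchronizes later with MPI_Wait.

#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv) {
    int rank, value = 42;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    if (rank == 0) {
        /* Synchronous send: completes only once the receiver has started
           receiving, so sender and receiver meet in time. */
        MPI_Ssend(&value, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);

        /* Asynchronous (non-blocking) send: returns at once; the sender
           may overlap other work before joining at MPI_Wait. */
        MPI_Request req;
        MPI_Isend(&value, 1, MPI_INT, 1, 1, MPI_COMM_WORLD, &req);
        /* ... other computation could run here ... */
        MPI_Wait(&req, MPI_STATUS_IGNORE);
    } else if (rank == 1) {
        int first, second;
        MPI_Recv(&first, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Recv(&second, 1, MPI_INT, 0, 1, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("received %d and %d\n", first, second);
    }

    MPI_Finalize();
    return 0;
}

Run with at least two processes, e.g. mpirun -np 2 ./a.out.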
DATA PARALLEL MODEL
• It is easier to write and to debug because parallelism is explicitly handled by hardware synchronization and flow control.
• It requires the use of pre-distributed data sets.
• Synchronization is done at compile time rather than at run time.
• The following are some of the issues handled (see the sketch after this list):
1. Data parallelism
2. Array language extensions
3. Compiler support
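A minimal data-parallel sketch in C, using an OpenMP SIMD loop as an assumed stand-in for an array-language extension such as Fortran 90 array notation: one operation is applied element-wise across whole arrays.

#include <stdio.h>

#define N 8

int main(void) {
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = i; b[i] = 2.0f * i; }

    /* Data parallelism: the same operation on every element.
       In an array language this would be written simply as c = a + b. */
    #pragma omp simd
    for (int i = 0; i < N; i++)
        c[i] = a[i] + b[i];

    for (int i = 0; i < N; i++)
        printf("%.1f ", c[i]);
    printf("\n");
    return 0;
}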
OBJECT ORIENTED MODEL
• Objects are created and manipulated dynamically.
• Processing is performed using objects.
• Concurrent programming models are built up from low-level objects such as processes, queues and semaphores (see the sketch after this list).
• Concurrent OOP (C-OOP) achieves parallelism using three methods:
1. Pipeline concurrency
2. Divide-and-conquer concurrency
3. Co-operative problem solving
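A hedged sketch of those low-level building blocks (threads, a queue and a semaphore), written with POSIX threads in C as an assumed substrate; a concurrent object-oriented language would wrap the same pieces inside objects.

#include <stdio.h>
#include <pthread.h>
#include <semaphore.h>

#define N 5

/* A tiny queue "object" protected by a semaphore and a mutex. */
static int queue[N], head = 0, tail = 0;
static sem_t items;                         /* counts items in the queue */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *producer(void *arg) {
    for (int i = 0; i < N; i++) {
        pthread_mutex_lock(&lock);
        queue[tail++ % N] = i;              /* enqueue */
        pthread_mutex_unlock(&lock);
        sem_post(&items);                   /* signal: one more item */
    }
    return NULL;
}

static void *consumer(void *arg) {
    for (int i = 0; i < N; i++) {
        sem_wait(&items);                   /* wait until an item exists */
        pthread_mutex_lock(&lock);
        int v = queue[head++ % N];          /* dequeue */
        pthread_mutex_unlock(&lock);
        printf("consumed %d\n", v);
    }
    return NULL;
}

int main(void) {
    pthread_t p, c;
    sem_init(&items, 0, 0);
    pthread_create(&p, NULL, producer, NULL);
    pthread_create(&c, NULL, consumer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}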
FUNCTIONAL AND LOGIC MODEL
• Two language-oriented programming models for parallel processing have been proposed.
• Functional programming models, such as LISP, SISAL and Strand 88.
• Logic programming models, such as Prolog.
• Based on predicate logic, logic programming is suitable for solving large database queries.
PARALLEL CONTROL LANGUAGE
• Special language constructs and data-array expressions are provided for exploiting parallelism in programs.
• The first is the FORTRAN 90 array notation.
• Parallel flow control is achieved using DOACROSS- and DOALL-type keywords used in FORTRAN 90.
• Similarly, the FORK and JOIN method can also be used (a sketch follows this list).
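A minimal sketch of the FORK and JOIN idea using the POSIX fork() and waitpid() calls in C (an assumption; the slides do not tie FORK/JOIN to a particular language): the parent forks a child to run one part of the work and then joins it by waiting for it to finish.

#include <stdio.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    pid_t pid = fork();                 /* FORK: create a parallel child process */

    if (pid == 0) {
        printf("child: doing one part of the work\n");
        return 0;                       /* child finishes its share */
    }

    printf("parent: doing the other part of the work\n");
    waitpid(pid, NULL, 0);              /* JOIN: wait for the child to finish */
    printf("parent: both parts complete\n");
    return 0;
}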
OPTIMIZING COMPILER
• The role of the compiler is to remove the burden of program optimization and code generation from the programmer.
• A parallelizing compiler consists of three major phases:
• Flow analysis.
• Optimization.
• Code generation.
“Compilation phases in parallel code generation”
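A hedged illustration of what these phases aim at, written in C: flow analysis would find no dependence between the loop iterations, optimization would mark the loop as parallelizable, and code generation could then emit a parallel version (shown here by hand, with an OpenMP pragma as an assumed target).

#include <stdio.h>

#define N 1000

int main(void) {
    double x[N], y[N];
    for (int i = 0; i < N; i++) { x[i] = i; y[i] = 0.0; }

    /* Original serial loop (what the programmer wrote):
       for (int i = 0; i < N; i++) y[i] = 2.0 * x[i];
       Flow analysis shows each iteration is independent. */

    /* Code a parallelizing compiler could generate (written here by hand): */
    #pragma omp parallel for
    for (int i = 0; i < N; i++)
        y[i] = 2.0 * x[i];

    printf("y[N-1] = %f\n", y[N - 1]);
    return 0;
}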
ISSUES IN PARALLELIZATION
• Amount of parallelizable CPU-bound work.
• Task granularity (see the sketch below).
• Load balancing.
• Memory allocation and garbage collection.
• Locality issues.
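A small sketch of the granularity and load-balancing issues, using OpenMP scheduling in C as an assumed mechanism: the chunk size sets the task granularity, and dynamic scheduling rebalances uneven work across threads.

#include <stdio.h>
#include <omp.h>

#define N 1000

/* Iterations do very different amounts of work, so a static split would
   leave some threads idle while others are overloaded. */
static double work(int i) {
    double s = 0.0;
    for (int k = 0; k < i * i; k++)
        s += 1.0 / (k + 1);
    return s;
}

int main(void) {
    double total = 0.0;

    /* schedule(dynamic, 16): chunks of 16 iterations (the task granularity)
       are handed out on demand, balancing the load among the threads. */
    #pragma omp parallel for schedule(dynamic, 16) reduction(+:total)
    for (int i = 0; i < N; i++)
        total += work(i);

    printf("total = %f\n", total);
    return 0;
}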