Marathwada Mitramandal’s
COLLEGE OF ENGINEERING
Karvenagar, Pune
Accredited with ‘A’ Grade by NAAC
Presentation
On
Subject: High Performance Computing
by
Ms. A.S. Mane & Ms. Hema Karande
Department of Computer Engineering
Unit 3: Parallel Communication
Session-16/36
Communication using MPI: Scatter, Gather, Broadcast, Blocking and
non-blocking MPI
Session Plan
Outline:
A. Attendance
B. Review of the previous session
C. Learning Outcomes of the session
D. Content
E. Student’s evaluation
F. Preparation for next session
G. Wrap up
B. Review of previous session:
● All-Reduce Operations
● Prefix-Sum Operations
● Collective technique
C. Learning Outcomes of the session:
● To understand communication using MPI
● To learn about the different MPI communication operations
D. Content:
Contents | Learning Methodology | Faculty Approach | Typical Student Activity | Skill/Competency Developed
Communication using MPI: Scatter, Gather | PPT | Explains | Understand | Critical Thinking, Observation
Communication using MPI: Broadcast | PPT | Explains, Guides | Understand | Critical Thinking, Observation
Communication using MPI: Blocking and non-blocking MPI | PPT | Explains, Guides | Understand | Critical Thinking, Observation
The message passing interface (MPI) is a standardized
means of exchanging messages between multiple
computers running a parallel program across distributed
memory.
In parallel computing, multiple computers – or even multiple
processor cores within the same computer – are called
nodes.
Message Passing Interface (MPI)
There are two types of communication in MPI:
point-to-point communication and
collective communication
Communication using MPI
Communication using MPI
Collective communication is defined as communication that
involves a group of processes.
The collective functions provided by MPI include:
● Barrier synchronization across all group members
● Broadcast from one member to all members of a group
● Scatter of data from one member to all members of a group
● Gather of data from all members to one member of a group
Communication using MPI: Broadcast
During a broadcast, one process sends the same data to all processes
in a communicator.
One of the main uses of broadcasting is to send out user input to a
parallel program, or send out configuration parameters to all processes.
Blocking: the call returns only when the buffer is ready to be reused.
Non-blocking: the call returns immediately.
Buffering: data is kept until it is received.
Synchronization: determines when a send is considered complete.
Collective communications in MPI were originally always blocking (MPI-3 added non-blocking collective variants).
Examples:
MPI_Send(): blocking send; does not return until the send buffer can safely be reused.
MPI_Isend(): non-blocking send, but not necessarily asynchronous.
Communication using MPI: Blocking and non-blocking MPI
A non-blocking call returns after minimal delay due to local operations, so that
the caller is not blocked.
A blocking receive returns when a message has been placed in the calling process's buffer, blocking if there is no message to be received from the specified source.
E. Student’s evaluation
How does Communication using MPI work?
What are other communications using MPI?
● All-to-All Personalized Communication
● Circular Shift
F. Preparation for Next Session:
G. Wrap Up
In this lecture we have learned
● Communication using MPI: Scatter, Gather
● Communication using MPI: Broadcast, Blocking and non-blocking MPI
