Concurrency refers to running multiple tasks seemingly simultaneously on a single processing unit through fast context switching; it creates the illusion of parallelism. Parallelism refers to running tasks genuinely simultaneously on multiple processing units to improve computational speed and throughput. Concurrency is achieved by interleaving processes on a CPU and can therefore be done on a single CPU, while parallelism requires multiple CPUs. Concurrency overlaps tasks to increase the amount of work done over time, while parallelism improves speed.
2. What is concurrency in parallel and distributed computing?
1. Concurrency is the task of running and managing multiple computations at the same time, while parallelism is the task of running multiple computations simultaneously.
3. Concurrency
• Concurrency relates to an application that is processing more than one task at the same time. It is an approach used to decrease the response time of the system by using a single processing unit. Concurrency creates the illusion of parallelism: the chunks of a task are not actually processed in parallel, but more than one task is in progress inside the application at a time. The application does not fully finish one task before it begins the next.
• Concurrency is achieved through the interleaving of processes on the central processing unit (CPU), in other words by context switching. That is why it looks like parallel processing. It increases the amount of work finished at a time.
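As a minimal sketch of this interleaving (illustrative, not part of the slides; the task names and timings are made up), the two Python threads below share one interpreter. Each simulated I/O wait lets the scheduler context-switch to the other task, so their steps typically interleave even though nothing runs in parallel:

```python
import threading
import time

results = []

def task(name, steps):
    for i in range(steps):
        results.append(f"{name}:{i}")  # record progress of this task
        time.sleep(0.01)               # simulated I/O wait; the CPU
                                       # context-switches to the other task

# two tasks running and being managed "at the same time" on one interpreter
t1 = threading.Thread(target=task, args=("A", 3))
t2 = threading.Thread(target=task, args=("B", 3))
t1.start(); t2.start()
t1.join(); t2.join()

print(results)  # steps of A and B interleaved; exact order is not deterministic
```

Neither task finishes completely before the other begins, which is exactly the property described above.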
4. Parallelism
• Parallelism relates to an application whose tasks are divided into smaller sub-tasks that are processed genuinely simultaneously, i.e. in parallel. It is used to increase the throughput and computational speed of the system by using multiple processors. Whereas concurrency only lets a single sequential CPU do many things "seemingly" simultaneously, parallelism actually executes them at the same time.
• Parallelism leads to overlapping of the CPU and input-output tasks of one process with the CPU and input-output tasks of another process, whereas in concurrency the speed is increased by overlapping the input-output activities of one process with the CPU activity of another process.
5. Concurrency vs. Parallelism
• Concurrency is the task of running and managing multiple computations at the same time, while parallelism is the task of running multiple computations simultaneously.
• Concurrency is achieved through the interleaving of processes on the central processing unit (CPU), i.e. by context switching, while parallelism is achieved through multiple CPUs.
• Concurrency can be done with a single processing unit, while parallelism cannot: it needs multiple processing units.
• Concurrency increases the amount of work finished at a time, while parallelism improves the throughput and computational speed of the system.
• Concurrency is about dealing with a lot of things at once, while parallelism is about doing a lot of things at once.
• Concurrency is a non-deterministic control-flow approach, while parallelism is a deterministic control-flow approach.
6. • Apart from recent hardware trends towards multi-core and multiprocessor
systems, the use of concurrency in applications is generally motivated by
performance gains.
• The presence of concurrency is an intrinsic property for any kind of
distributed system. Processes running on different machines form a
common system that executes code on multiple machines at the same
time.
• Conceptually, all web applications can be used by various users at the same
time. Thus, a web application is also inherently concurrent. This is not
limited to the web server that must handle multiple client connections in
parallel. The application that executes the associated business logic of
a request, and the backend data storage, are also confronted with concurrency.
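As an illustrative sketch of a server handling multiple client connections concurrently (assumed for this example, not taken from the slides; the handler and its reply body are made up), Python's standard-library ThreadingHTTPServer serves each incoming connection on its own thread:

```python
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from urllib.request import urlopen

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # hypothetical request handling; a real web application would run
        # business logic and query the backend data storage here
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):
        pass  # keep the example quiet

# each incoming connection is handled on its own thread
server = ThreadingHTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

replies = [urlopen(f"http://127.0.0.1:{port}/").read() for _ in range(2)]
server.shutdown()
print(replies)  # [b'ok', b'ok']
```

Even this toy server is concurrent: two clients can be in flight at the same time without the application doing anything special per request.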
7. Describe three fundamental ways in which the concurrent execution of an application can improve its performance:
Reduce latency
• A unit of work is executed in a shorter time by subdividing it into parts that can be executed concurrently.
Hide latency
• Multiple long-running tasks are executed together by the underlying system. This is particularly
effective when the tasks are blocked because of external resources they must wait upon, such as
disk or network I/O operations.
Increase throughput
• By executing multiple tasks concurrently, the general system throughput can be increased. It is important to note that this also speeds up independent sequential tasks that have not been specifically designed for concurrency yet.
8. Concurrent Programming vs. Parallel Programming
• Differentiating concurrent and parallel programming is more tedious, as both target different goals on different conceptual levels.
• The internals of a web server are a typical outcome of concurrent programming, while parallel abstractions such as Google's MapReduce are a typical outcome of parallel programming.