MODULE 3
GREEDY METHOD
CONTENTS
 General method
 Coin Change Problem
 Knapsack Problem
 Job sequencing with deadlines
 Minimum cost spanning trees
 Prim’s Algorithm
 Kruskal’s Algorithm
 Single source shortest paths
 Dijkstra's Algorithm
 Optimal Tree problem
 Huffman Trees and Codes
 Transform and Conquer Approach
 Heaps and Heap Sort
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 2
INTRODUCTION
The General Method
 We are given a problem, together with a set of conditions that its solution must satisfy.
 If the problem has n inputs, we select a subset of those inputs that satisfies the conditions.
 Several subsets may satisfy the conditions.
 Each subset satisfying the problem's conditions is referred to as a Feasible Solution.
 Among all the feasible solutions, the one that best meets the problem's objective is referred to as the Optimal Solution and is the answer to the problem.
 Finding a solution to a problem in this manner is referred to as the Subset Paradigm.
Algorithm Greedy(a, n)
// a[1:n] contains the n inputs
{
    solution := Ø;
    for i := 1 to n do
    {
        x := Select(a);
        if Feasible(solution, x) then
            solution := Union(solution, x);
    }
    return solution;
}
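The abstract schema above can be sketched in Python. This is a minimal illustration, not a library routine: `feasible` is a caller-supplied predicate standing in for the pseudocode's Feasible(), and the items are assumed to arrive already ordered by the greedy selection criterion (the role of Select()).

```python
# Minimal sketch of the abstract greedy schema: consider candidates one
# by one and keep each only if the partial solution stays feasible.

def greedy(items, feasible):
    solution = []
    for x in items:                      # items pre-ordered by the greedy criterion
        if feasible(solution, x):
            solution.append(x)
    return solution

# Toy usage: pick numbers whose running sum stays within 10.
print(greedy([4, 3, 5, 2], lambda s, x: sum(s) + x <= 10))  # [4, 3, 2]
```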
 In simple words, the Greedy Method is the idea of building the best possible solution to a problem with minimal effort.
 It is like getting a high-quality product at minimal cost.
 A classic scenario that represents greedy technique is the
Coin Change Problem.
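The coin change idea can be sketched directly: repeatedly take the largest denomination that still fits. This greedy choice is known to be optimal for canonical coin systems such as 1/2/5/10, though it can fail for arbitrary denominations; the denominations below are illustrative.

```python
# Greedy coin change: always take the largest coin that does not
# overshoot the remaining amount.

def coin_change(amount, denominations):
    coins = []
    for d in sorted(denominations, reverse=True):
        while amount >= d:
            coins.append(d)
            amount -= d
    return coins

print(coin_change(48, [1, 2, 5, 10]))  # [10, 10, 10, 10, 5, 2, 1]
```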
 Another simple example that represents greedy method is
Machine Scheduling, which is presented next.
There are n tasks that must be completed. Each task has a start
and end time. An unlimited number of machines is available to
execute the tasks. The condition is that tasks whose executions
overlap must not be assigned to the same machine.

Task   A  B  C  D   E   F  G
Start  0  3  4  9   7   1  6
End    2  7  7  11  10  5  8
Solution
 One possible solution for this problem is to assign a separate machine to each task.
 There are 7 tasks, so we would use 7 machines, one per task. Although this is a feasible solution, it is definitely not optimal.
 We now see how to get the optimal solution.
 We start by arranging the tasks in the order of their start times,
i.e.,
A, F, B, C, G, E, D
(0,2),(1,5),(3,7),(4,7),(6,8),(7,10),(9,11)
 Next we allocate machines to the tasks as follows,
m1: A (0-2), B (3-7), E (7-10)
m2: F (1-5), G (6-8)
m3: C (4-7), D (9-11)
Only 3 machines are needed instead of 7.
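The allocation above can be computed greedily. This is a sketch, assuming tasks are given as (name, start, end) tuples: process tasks in order of start time and reuse a machine whose last task has already finished; a min-heap of machine end times makes the free-machine check cheap.

```python
import heapq

# Greedy machine scheduling: sort tasks by start time; assign each task
# to an already-free machine if one exists, otherwise open a new machine.

def schedule(tasks):
    ends = []                                # heap of (end_time, machine_id)
    assignment, machines = {}, 0
    for name, start, end in sorted(tasks, key=lambda t: t[1]):
        if ends and ends[0][0] <= start:     # some machine is free by `start`
            _, m = heapq.heappop(ends)
        else:                                # all machines busy: open a new one
            machines += 1
            m = machines
        assignment[name] = m
        heapq.heappush(ends, (end, m))
    return machines, assignment

tasks = [('A', 0, 2), ('B', 3, 7), ('C', 4, 7), ('D', 9, 11),
         ('E', 7, 10), ('F', 1, 5), ('G', 6, 8)]
print(schedule(tasks)[0])  # 3
```

For the seven tasks of the example, three machines suffice, matching the schedule shown above.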
KNAPSACK PROBLEM
 There are n objects given to us.
 Each object has a weight and profit associated with it.
 The weights and profits are represented as 𝒘𝒊 and 𝒑𝒊
respectively, for 1 ≤ i ≤ n.
 We are also given a Knapsack/bag whose capacity is m.
 The problem here is to fill the knapsack by using the objects
such that we get maximum profit and we don’t exceed the
knapsack capacity.
 Let the objects be 𝒙𝟏, 𝒙𝟐, ……., 𝒙𝒏.
 An object 𝒙𝒊 can either be chosen as a whole or a fraction of it
can be chosen.
 This means, 0 ≤ 𝒙𝒊 ≤ 1.
 𝒙𝒊 = 1, means the object 𝒙𝒊 has been chosen as a whole.
 𝒙𝒊 = 1/2, means half of 𝒙𝒊 has been chosen
 𝒙𝒊 = 0, means the object 𝒙𝒊 hasn't been chosen.
 The profit of an object varies based on the fraction of the object
chosen.
 Consider an object 𝒙𝒊, whose profit is 100. Then
 𝒙𝒊 = 1, means the object 𝒙𝒊 has been chosen as a whole.
Hence profit of 𝒙𝒊 is 100.
 𝒙𝒊 = 1/2, means half of 𝒙𝒊 has been chosen. Hence profit of
𝒙𝒊 is 50(100/2).
 𝒙𝒊 = 0, means the object 𝒙𝒊 hasn't been chosen. Hence
profit of 𝒙𝒊 is 0.
 In general, the profit of an object is obtained by 𝒙𝒊*𝒑𝒊.
Similarly, the weight of an object is 𝒘𝒊*𝒙𝒊.
Formal Definition of Knapsack Problem
“Given n objects with weights 𝒘𝒊 and profits 𝒑𝒊,
1 ≤ i ≤ n, and a knapsack with capacity m, the
Knapsack problem can be stated as
maximize Σ(i=1 to n) 𝒑𝒊𝒙𝒊
subject to Σ(i=1 to n) 𝒘𝒊𝒙𝒊 ≤ m
and 0 ≤ 𝒙𝒊 ≤ 1, 1 ≤ i ≤ n”
We now discuss some observations made about the
Knapsack problem. These observations are referred to
as Lemmas.
Lemma 1
If the total weight 𝒘𝟏 + 𝒘𝟐 + … + 𝒘𝒏 ≤ m, then 𝒙𝒊 = 1 for all i, 1 ≤ i ≤ n.
Lemma 2
If the total weight of all objects exceeds m, then every optimal solution fills the knapsack exactly.
Strategies to solve Knapsack problem
Consider 3 objects with weights (18, 15, 10) and profits (25, 24, 15)
respectively. The knapsack capacity is 20. Find the optimal solution.
Solution
The data given in this problem is
 n = 3
 Objects are 𝒙𝟏, 𝒙𝟐, 𝒙𝟑
 𝒘𝟏 = 18, 𝒘𝟐 = 15, 𝒘𝟑 = 10
 𝒑𝟏 = 25, 𝒑𝟐 = 24, 𝒑𝟑 = 15
 m = 20
Following are some solutions available for the given problem.

Solution  𝒙𝟏   𝒙𝟐    𝒙𝟑   Total Weight  Total Profit
A         1/2  1/3   1/4  16.5          24.25
B         1    2/15  0    20            28.20
C         0    2/3   1    20            31
D         0    1     1/2  20            31.5
 Here solution D is the optimal solution.
 We now look at some greedy strategies to solve the
knapsack problem.
Strategy 1 – Choose highest profit/value next
 In this approach, we choose objects in the decreasing order of
their profits.
 In the example, we first select 𝒙𝟏 followed by 2/15th of 𝒙𝟐 as
selecting the whole of 𝒙𝟐 will exceed the knapsack capacity.
 This strategy is represented in solution B.
Strategy 2 – Choose smallest weight next
 In this approach, we choose objects in the increasing order of
their weights.
 This strategy is represented in solution C.
Strategy 3 – Choose highest value-to-weight ratio next
 This is a combination of strategies 1 and 2.
 In this approach, we first obtain the profit to weight ratio of all
objects. For our example,
𝒙𝟏 = 25/18 = 1.388
𝒙𝟐 = 24/15 = 1.6
𝒙𝟑 = 15/10 = 1.5
 We then select the objects in the decreasing order of their
profit to weight ratio.
 This approach is represented by solution D, which is the optimal solution.
Theorem
If 𝒑𝟏/𝒘𝟏 ≥ 𝒑𝟐/𝒘𝟐 ≥ …… ≥ 𝒑𝒏/𝒘𝒏, then GreedyKnapsack
generates an optimal solution to the given instance of the
problem.
Proof
Phase 1
 We need to prove that the greedy method gives the optimal solution for the Knapsack problem.
 From the discussion we have had so far, to get the optimal solution we have used the value-to-weight ratio as the selection criterion.
 In this approach, we first find the value/weight ratio for each object and arrange the objects in descending order of this ratio, i.e.,
𝒑𝟏/𝒘𝟏 ≥ 𝒑𝟐/𝒘𝟐 ≥ …… ≥ 𝒑𝒏/𝒘𝒏
 Let 𝒑𝟏/𝒘𝟏 correspond to object 𝒙𝟏, 𝒑𝟐/𝒘𝟐 to object 𝒙𝟐, and so on.
 This means the order in which we pick the objects is 𝒙𝟏, 𝒙𝟐, ……, 𝒙𝒏.
 We first pick 𝒙𝟏, and we pick it as a whole. We then pick 𝒙𝟐 as a whole, followed by 𝒙𝟑, and so on.
 We keep picking objects as a whole as long as the
knapsack capacity is not exceeded.
 This can be expressed as
𝒙𝟏 = 1, 𝒙𝟐 = 1, 𝒙𝟑 = 1……..
 This continues until we reach a point, say position ‘j’, such
that picking 𝒙𝒋 as a whole will exceed knapsack capacity.
 Hence, we have no other choice but to pick a fraction of
𝒙𝒋, i.e.,
0 < 𝒙𝒋 < 1
 We take a fraction of 𝒙𝒋 such that it is just enough to fill the
knapsack capacity.
 This means, after picking a fraction of 𝒙𝒋, we will not be
able to pick the objects 𝒙𝒋+𝟏, 𝒙𝒋+𝟐,……, 𝒙𝒏.
 This is represented as,
𝒙𝟏 = 𝒙𝟐 = …… = 𝒙𝒋−𝟏 = 1, 0 < 𝒙𝒋 < 1, 𝒙𝒋+𝟏 = 𝒙𝒋+𝟐 = …… = 𝒙𝒏 = 0
 This is the greedy solution to the problem. And we need to
prove that this is the optimal solution.
Phase 2
 Let the greedy solution for the Knapsack problem be X = (𝒙𝟏, 𝒙𝟐, ….., 𝒙𝒏), such that
𝒙𝟏 = 𝒙𝟐 = …… = 𝒙𝒋−𝟏 = 1, 0 < 𝒙𝒋 < 1, 𝒙𝒋+𝟏 = 𝒙𝒋+𝟐 = …… = 𝒙𝒏 = 0
 Let us assume the optimal solution for the same problem
to be
Y = (𝒚𝟏, 𝒚𝟐,….., 𝒚𝒏)
 We are not aware of what fractions of 𝒚𝒊’s have been
considered.
 If Y = X, it means the optimal solution is our greedy solution, and there is nothing to prove.
 For the sake of the proof, we consider Y ≠ X.
 Let ‘k’ be the least index at which the two solutions differ, i.e., 𝒚𝒌 ≠ 𝒙𝒌. We claim that 𝒚𝒌 must be less than 𝒙𝒌 (𝒚𝒌 < 𝒙𝒌).
 We now prove this claim by comparing k with the index ‘j’. We consider three cases:
 Case 1: if k < j
 Case 2: if k = j
 Case 3: if k > j
Case 1: if k < j
 For all index positions less than j, the values of the 𝒙𝒊’s are 1.
 Hence, for this case, if 𝒚𝒌 ≠ 𝒙𝒌, i.e., if 𝒚𝒌 ≠ 1, then 𝒚𝒌 has to be less than 𝒙𝒌.
 This is because 𝒚𝒌 cannot be greater than 1, as 1 is the maximum fraction of an object.
 Therefore, for Case 1, we have proved that if 𝒚𝒌 ≠ 𝒙𝒌, then 𝒚𝒌 < 𝒙𝒌.
Case 2: if k = j
 This means the 𝒚𝒊 values are the same as the 𝒙𝒊 values up to index j−1. At the jth index their values differ.
 At position j, we take only a fraction of object 𝒙𝒋, because taking 𝒙𝒋 as a whole would exceed the knapsack capacity.
 This means that at position j, the value 𝒚𝒋 also cannot be 1.
 Also, 𝒚𝒋 cannot be greater than 𝒙𝒋, as the fraction of 𝒙𝒋 taken fills the knapsack to its capacity.
 So, if 𝒚𝒋 exceeds this fraction, it will definitely exceed the knapsack’s capacity.
 Hence, at index k = j, if 𝒚𝒌 ≠ 𝒙𝒌 then 𝒚𝒌 < 𝒙𝒌.
Case 3: if k > j
 This means the 𝒚𝒊 values are the same as the 𝒙𝒊 values up to index j. After the jth index their values differ.
 The 𝒙𝒊 values up to index j have filled the knapsack to its capacity. This means the 𝒚𝒊 values have also filled the knapsack.
 After index j, all 𝒙𝒊’s are 0.
 At these positions, 𝒚𝒊 cannot be greater than 𝒙𝒊, as the knapsack capacity would be breached.
 Hence we have proved the claim that if 𝒚𝒌 ≠ 𝒙𝒌 then 𝒚𝒌 < 𝒙𝒌.
 This means the assumed optimal solution Y cannot achieve a greater profit than our greedy solution X.
 We now transform the assumed optimal solution Y into the greedy solution X and prove the theorem.
 By transformation, we mean: since 𝒚𝒌 < 𝒙𝒌, we bring 𝒚𝒌 up to 𝒙𝒌.
Phase 3
 We increase 𝒚𝒌 to 𝒙𝒌. We also reduce 𝒚𝒌+𝟏, 𝒚𝒌+𝟐, ……., 𝒚𝒏 accordingly, so that the weight balance holds.
 Let this transformed solution be
Z = (𝒛𝟏, 𝒛𝟐, ……., 𝒛𝒏)
 We can make the following observations on Z,
 For 1 ≤ i ≤ k, 𝒛𝒊 = 𝒙𝒊
 𝒘𝒌(𝒙𝒌 − 𝒚𝒌) = Σ(i=k+1 to n) 𝒘𝒊(𝒚𝒊 − 𝒛𝒊)
 We now go on to prove that Z earns at least the profit of Y.
 Since this transformation can be repeated until Y becomes X, that proves the theorem.
We have,
Σ(i=1 to n) 𝑝𝑖𝑧𝑖 = Σ(i=1 to n) 𝑝𝑖𝑦𝑖 + 𝑝𝑘(𝑥𝑘 − 𝑦𝑘) − Σ(i=k+1 to n) 𝑝𝑖(𝑦𝑖 − 𝑧𝑖)
Let us rewrite 𝑝𝑘 as 𝑤𝑘(𝑝𝑘/𝑤𝑘) and 𝑝𝑖 as 𝑤𝑖(𝑝𝑖/𝑤𝑖). We get,
Σ(i=1 to n) 𝑝𝑖𝑧𝑖 = Σ(i=1 to n) 𝑝𝑖𝑦𝑖 + 𝑤𝑘(𝑝𝑘/𝑤𝑘)(𝑥𝑘 − 𝑦𝑘) − Σ(i=k+1 to n) 𝑤𝑖(𝑝𝑖/𝑤𝑖)(𝑦𝑖 − 𝑧𝑖)
Since 𝑝𝑖/𝑤𝑖 ≤ 𝑝𝑘/𝑤𝑘 for every i > k,
Σ(i=1 to n) 𝑝𝑖𝑧𝑖 ≥ Σ(i=1 to n) 𝑝𝑖𝑦𝑖 + (𝑝𝑘/𝑤𝑘)[𝑤𝑘(𝑥𝑘 − 𝑦𝑘) − Σ(i=k+1 to n) 𝑤𝑖(𝑦𝑖 − 𝑧𝑖)]
Since we already have 𝒘𝒌(𝒙𝒌 − 𝒚𝒌) = Σ(i=k+1 to n) 𝒘𝒊(𝒚𝒊 − 𝒛𝒊), the bracketed term is 0, so
Σ(i=1 to n) 𝑝𝑖𝑧𝑖 ≥ Σ(i=1 to n) 𝑝𝑖𝑦𝑖
As Y is optimal, equality must hold, i.e., profit(Z) = profit(Y). Hence Proved.
Algorithm GreedyKnapsack(m, n)
// The objects are assumed to be arranged so that p[i]/w[i] ≥ p[i+1]/w[i+1]
{
    for i := 1 to n do
        x[i] := 0.0;
    U := m;
    for i := 1 to n do
    {
        if (w[i] > U) then break;
        x[i] := 1.0;
        U := U - w[i];
    }
    if (i ≤ n) then
        x[i] := U / w[i];
}
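The pseudocode above assumes the objects arrive pre-sorted by profit-to-weight ratio. A Python sketch with that sorting step made explicit:

```python
# Fractional (greedy) knapsack: take whole objects in decreasing order
# of p/w; when the next object no longer fits, take the fitting fraction.

def greedy_knapsack(profits, weights, capacity):
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * len(profits)
    remaining = capacity
    for i in order:
        if weights[i] > remaining:
            x[i] = remaining / weights[i]   # take just the fraction that fits
            break
        x[i] = 1.0
        remaining -= weights[i]
    return x, sum(p * xi for p, xi in zip(profits, x))

# The worked example that follows (n = 7, m = 15) yields profit 55.33.
x, profit = greedy_knapsack([10, 5, 15, 7, 6, 18, 3],
                            [2, 3, 5, 7, 1, 4, 1], 15)
print(round(profit, 2))  # 55.33
```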
Example
Find the optimal solution for the given instance of Knapsack
using greedy technique.
n = 7, m = 15,
profits = (10, 5, 15, 7, 6, 18, 3)
Weights = (2, 3, 5, 7, 1, 4, 1)
Solution
 We already know that the optimal solution is obtained by selecting the objects in decreasing order of their value-to-weight ratio.
 We first find this ratio for all the objects.
i        1    2     3    4    5    6    7
𝒑𝒊       10   5     15   7    6    18   3
𝒘𝒊       2    3     5    7    1    4    1
𝒑𝒊/𝒘𝒊    5    1.67  3    1    6    4.5  3

Selecting objects in decreasing order of 𝒑𝒊/𝒘𝒊, with m = 15:
𝒙𝟓 = 1 (m = 14), 𝒙𝟏 = 1 (m = 12), 𝒙𝟔 = 1 (m = 8), 𝒙𝟑 = 1 (m = 3),
𝒙𝟕 = 1 (m = 2), 𝒙𝟐 = 2/3 (m = 0), 𝒙𝟒 = 0
Total Profit = 1*10 + (2/3)*5 + 1*15 + 1*6 + 1*18 + 1*3 = 55.33
JOB SEQUENCING WITH DEADLINES
 Consider there are n jobs/tasks.
 Each job has a deadline 𝒅𝒊 such that 𝒅𝒊 ≥ 0.
 Every job also has a profit/value 𝒑𝒊 such that 𝒑𝒊 > 0.
 For a job, we get its profit only when the job is completed
before its deadline.
 A single machine is provided to execute the jobs.
 There are some conditions defined for the problem:
 Every job takes one time unit to complete execution.
 A job can be taken for execution only when the system clock is within
the deadline of the considered job.
 The objective of the problem is to execute as many jobs as
possible and get maximum profit.
Example
Solve the Job Sequencing problem for the following instance.
n=4
(𝒑𝟏, 𝒑𝟐, 𝒑𝟑, 𝒑𝟒)=(100, 10, 15, 27)
(𝒅𝟏, 𝒅𝟐, 𝒅𝟑, 𝒅𝟒) = ( 2, 1, 2, 1)
 The data given is,
(𝒑𝟏, 𝒑𝟐, 𝒑𝟑, 𝒑𝟒)=(100, 10, 15, 27)
(𝒅𝟏, 𝒅𝟐, 𝒅𝟑, 𝒅𝟒) = ( 2, 1, 2, 1)
 We find all possible solutions and then select the optimal one.
The feasible solutions and their profits are:
(J1, J2) = 110
(J1, J3) = 115
(J1, J4) = 127
(J2, J3) = 25
(J3, J4) = 42
(J1) = 100
(J2) = 10
(J3) = 15
(J4) = 27
 This way of getting the optimal solution is time consuming.
 An efficient approach is to arrange the jobs in the decreasing
order of their profits and then select the jobs from the order for
execution.(Greatest Profit Next strategy)
 We have,
Jobs - J1 J4 J3 J2
Profits - 100 27 15 10
Deadline - 2 1 2 1
Hence the optimal solution is to run J4 in slot [0,1] and J1 in slot [1,2]:
(J4, J1) is the optimal solution, with profit 127.
High-level description of the Job Sequencing algorithm:

Algorithm GreedyJob(d, J, n)
// J is a set of jobs that can be completed by their deadlines
{
    J := {1};
    for i := 2 to n do
    {
        if (all jobs in J υ {i} can be completed by their deadlines)
            then J := J υ {i};
    }
}
Algorithm JS(d, J, n)
// d is an array of deadlines of the n jobs, assumed sorted by decreasing profit.
// s is an array of time slots; J[i] = 1 if job i is scheduled.
{
    k := 0; // counts the number of jobs scheduled
    for i := 1 to n do
        J[i] := s[i] := 0;
    for i := 1 to n do
    {
        if (s[d[i]] == 0)
        {
            s[d[i]] := i; J[i] := 1; k++;
        }
        else
        {
            for x := d[i]-1 to 1 step -1
            {
                if (s[x] == 0)
                {
                    s[x] := i; J[i] := 1; k++;
                    break;
                }
            }
        }
    }
    return k;
}
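A Python sketch of the same slot-based idea: sort jobs by decreasing profit, then place each job in the latest free slot at or before its deadline (jobs are identified by their 0-based index here).

```python
# Job sequencing with deadlines: each job takes one unit of time and
# earns its profit only if scheduled at or before its deadline.

def job_sequencing(profits, deadlines):
    jobs = sorted(range(len(profits)), key=lambda i: -profits[i])
    max_d = max(deadlines)
    slot = [None] * (max_d + 1)          # slot[t] holds the job run in (t-1, t]
    for i in jobs:
        for t in range(deadlines[i], 0, -1):
            if slot[t] is None:          # latest free slot before the deadline
                slot[t] = i
                break
    chosen = [i for i in slot if i is not None]
    return sorted(chosen), sum(profits[i] for i in chosen)

# The earlier example: p = (100, 10, 15, 27), d = (2, 1, 2, 1).
print(job_sequencing([100, 10, 15, 27], [2, 1, 2, 1]))  # ([0, 3], 127)
```

For the 5-job example that follows, this returns jobs {1, 2, 4} (0-based: 0, 1, 3) with profit 40, matching the table.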
Example
n = 5, (𝒑𝟏, 𝒑𝟐, 𝒑𝟑, 𝒑𝟒, 𝒑𝟓) = (20, 15, 10, 5, 1)
(𝒅𝟏, 𝒅𝟐, 𝒅𝟑, 𝒅𝟒, 𝒅𝟓) = (2, 2, 1, 3, 3).
Solution
We solve this problem using the feasibility representation below.

J        Assigned slots      Job considered  Action                Profit so far
Ø        none                1               Assign to slot [1,2]  0
{1}      [1,2]               2               Assign to slot [0,1]  20
{1,2}    [0,1],[1,2]         3               Cannot fit; reject    35
{1,2}    [0,1],[1,2]         4               Assign to slot [2,3]  35
{1,2,4}  [0,1],[1,2],[2,3]   5               Cannot fit; reject    40

Hence, the optimal solution is {1, 2, 4} with a profit of 40.
Exercise Problems
1. n =4, D=(4, 1, 1, 1), P=(20, 10, 40, 30)
2. n =5, D=(2, 1, 2, 1, 3), P=(100, 19, 27, 25, 15)
3. n =7, D=(1, 3, 4, 3, 2, 1, 2), P=(3, 5, 20, 18, 1, 6, 30)
MINIMUM COST SPANNING TREE
 Let G=(V, E) be an undirected graph with V vertices and E
edges. Then,
“A Spanning Tree of an undirected connected graph
is its connected acyclic sub graph (i.e., a tree) that contains
all the vertices of the graph”
 A spanning tree satisfies the property that, for a given graph G, its spanning tree is a minimal subgraph G′ such that
 V(G) = V(G′)
 G′ is connected
 A minimal subgraph is one which has the fewest number of edges.
 Sometimes, edges of a graph are assigned with
some numerical values.
 These values are referred to as Cost of the edges.
 The Cost of a Tree is the sum of cost of all the edges
in the tree.
Hence,
“The Minimum Cost Spanning Tree(MST) for
a graph G is the spanning tree of the given graph
such that its cost is minimal.”
[Figure: a 4-vertex graph on {a, b, c, d} with edge weights 1, 2, 3, and 5, and three of its spanning trees with w(T1) = 6, w(T2) = 9, and w(T3) = 8.]
The figure shows a graph and its spanning trees, with T1 being the minimum spanning tree.
Prim’s Algorithm
 This algorithm constructs the MST edge by edge.
 All the edges that have been chosen to be part of the MST are stored in an edge set A.
 Selection of an edge to be made part of an MST must satisfy the following conditions:
 The edge results in a minimal-cost subgraph.
 The inclusion of the edge ensures that the subgraph remains a tree.
“If A is a set of edges selected so far, then A
forms a tree. The next edge (u, v) to be included in A is
a minimum cost edge not in A with the property that A
U {(u, v)} is also a tree”
Prim’s Algorithm
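The rule above can be sketched in Python with a lazy-deletion priority queue of candidate edges. The adjacency-dict representation is an assumption for illustration; the graph below is the lettered 6-vertex example used later in this section (MST cost 15).

```python
import heapq

# Prim's algorithm (lazy deletion): grow the tree from `start`, always
# popping the cheapest edge that leaves the current tree.

def prim(graph, start):
    visited = {start}
    edges = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(edges)
    mst, cost = [], 0
    while edges and len(visited) < len(graph):
        w, u, v = heapq.heappop(edges)
        if v in visited:
            continue                      # edge would close a cycle: skip it
        visited.add(v)
        mst.append((u, v))
        cost += w
        for x, wx in graph[v]:
            if x not in visited:
                heapq.heappush(edges, (wx, v, x))
    return mst, cost

g = {'a': [('b', 3), ('e', 6), ('f', 5)],
     'b': [('a', 3), ('c', 1), ('f', 4)],
     'c': [('b', 1), ('d', 6), ('f', 4)],
     'd': [('c', 6), ('e', 8), ('f', 5)],
     'e': [('a', 6), ('d', 8), ('f', 2)],
     'f': [('a', 5), ('b', 4), ('c', 4), ('d', 5), ('e', 2)]}
print(prim(g, 'a')[1])  # 15
```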
Example
[Figure: a 7-vertex graph with edge weights 10, 12, 14, 16, 18, 22, 24, 25, and 28, and the MST produced.]
Starting from vertex 1 (|V| = 7), Prim's algorithm grows the tree as follows:

𝑉𝑇 = {1}                    𝐸𝑇 = { }
𝑉𝑇 = {1, 6}                 𝐸𝑇 = {(1,6)}
𝑉𝑇 = {1, 6, 5}              𝐸𝑇 = {(1,6), (6,5)}
𝑉𝑇 = {1, 6, 5, 4}           𝐸𝑇 = {(1,6), (6,5), (5,4)}
𝑉𝑇 = {1, 6, 5, 4, 3}        𝐸𝑇 = {(1,6), (6,5), (5,4), (4,3)}
𝑉𝑇 = {1, 6, 5, 4, 3, 2}     𝐸𝑇 = {(1,6), (6,5), (5,4), (4,3), (3,2)}
𝑉𝑇 = {1, 6, 5, 4, 3, 2, 7}  𝐸𝑇 = {(1,6), (6,5), (5,4), (4,3), (3,2), (2,7)}

This yields a minimal cost spanning tree with cost 99.
Example
[Figure: a 6-vertex graph on {a, b, c, d, e, f} with edge weights ab = 3, bc = 1, cd = 6, de = 8, ae = 6, af = 5, bf = 4, cf = 4, df = 5, ef = 2.]
In the solution for this problem, we use the following notation to represent a node:
node_name(predecessor tree vertex, edge cost)
Tree Vertices  Remaining Vertices
a(-, -)        b(a,3), c(-,∞), d(-,∞), e(a,6), f(a,5)
b(a, 3)        c(b,1), d(-,∞), e(a,6), f(b,4)
c(b, 1)        d(c,6), e(a,6), f(b,4)
f(b, 4)        d(f,5), e(f,2)
e(f, 2)        d(f,5)
d(f, 5)        none remaining

The result is an MST with edges ab, bc, bf, fe, and fd, of cost 3 + 1 + 4 + 2 + 5 = 15.
Kruskal’s Algorithm
 This is another approach to obtain a MST for a given graph.
 The algorithm starts by arranging the edges of the given graph
in ascending order.
 Edges are then selected one at a time from this order, to be
included in the MST.
 An edge is included, if and only if it does not result in a cycle in
the MST.
ALGORITHM Kruskal(G)
// Input: a weighted connected graph G = <V, E>
// Output: 𝑬𝑻, the set of edges composing a minimum spanning tree of G
sort E in nondecreasing order of the edge weights: w(𝒆𝒊𝟏) ≤ . . . ≤ w(𝒆𝒊|𝑬|)
𝑬𝑻 ← ∅; ecounter ← 0   // initialize the set of tree edges and its size
k ← 0                  // initialize the number of processed edges
while ecounter < |V| − 1 do
    k ← k + 1
    if 𝑬𝑻 ∪ {𝑒𝑖𝑘} is acyclic
        𝑬𝑻 ← 𝑬𝑻 ∪ {𝑒𝑖𝑘}
        ecounter ← ecounter + 1
return 𝑬𝑻
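A Python sketch of the pseudocode above. The acyclicity test is done with a small union-find structure (the idea is discussed later in this section); vertices are numbered 0..n-1 and edges given as (weight, u, v) tuples, which is an assumed representation.

```python
# Kruskal's algorithm: scan edges in nondecreasing weight order and
# accept an edge only if its endpoints lie in different components.

def kruskal(n, edges):
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    mst, cost = [], 0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                        # edge joins two components
            parent[ru] = rv
            mst.append((u, v))
            cost += w
            if len(mst) == n - 1:
                break
    return mst, cost

# The lettered example graph (a..f mapped to 0..5); MST cost is 15.
E = [(1, 1, 2), (2, 4, 5), (3, 0, 1), (4, 1, 5), (4, 2, 5),
     (5, 0, 5), (5, 3, 5), (6, 0, 4), (6, 2, 3), (8, 3, 4)]
print(kruskal(6, E)[1])  # 15
```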
Example
Find the MST for the graph using Kruskal’s algorithm.
[Figure: the same 6-vertex graph on {a, b, c, d, e, f} used in the Prim’s example.]
We first arrange the edges of this graph in ascending order of weight.
The resulting order is,

Edge:    bc  ef  ab  bf  cf  af  df  ae  cd  de
Weight:   1   2   3   4   4   5   5   6   6   8
Tree Edges          Edge considered  Action
{ }                 bc (1)           Accept
{bc}                ef (2)           Accept
{bc, ef}            ab (3)           Accept
{bc, ef, ab}        bf (4)           Accept
{bc, ef, ab, bf}    cf (4), af (5)   Reject both; each would form a cycle
{bc, ef, ab, bf}    df (5)           Accept; the tree now has |V| − 1 edges

The result is an MST with edges bc, ef, ab, bf, and df, of cost 1 + 2 + 3 + 4 + 5 = 15.
Practice Examples
Find the MST for the following graphs using Prim’s and Kruskal’s algorithms.
Disjoint Subsets and Union-Find Algorithms
 The Union-Find algorithm is a strategy used to efficiently implement Kruskal’s algorithm.
 This strategy is based on the concept of Disjoint Subsets.
 It divides the vertex set of the graph into disjoint subsets, each initially containing a single element.
 It then combines these subsets one by one, as edges are accepted, until an MST is obtained.
 The Union-Find algorithm performs the following operations:
 makeset(x) - creates a one-element set {x}.
 find(x) - returns a subset containing x.
 union(x, y) - constructs the union of the disjoint subsets 𝑺𝒙 and 𝑺𝒚
containing x and y.
 For example, let S = {1, 2, 3, 4, 5, 6}. Then makeset(i) creates
the set { i } for all elements in the set, i.e.,
{1}, {2}, {3}, {4}, {5}, {6}
 Performing union(1, 4) and union(5, 2) yields
{1, 4}, {5, 2}, {3}, {6}
 Now if we want to combine the subsets {1,4} and {5,2},
how do we call the Union() function?
 There is a problem here because, Union() function
accepts two values as parameters.
 But here we have two subsets to be combined, each with
two elements.
 We cannot pass the whole subset as parameter to the
Union() function as the function accepts single elements
only.
 Hence, we need to have a single element that represents
a subset. We call this element, the Subset
Representative.
 There are several logics followed to assign a representative
for a subset.
 The approach that we adopt is, the smallest element of a
subset is its representative.
 With this, we can comfortably merge the subsets {1,4} and
{5,2} by making the call
Union(1,2)
 Here element 1 represents the subset {1,4} and element 2
represents subset {5,2}.
 We now have a look at various data structures that
are used to implement the Union-Find algorithm.
 We have 2 strategies for Union-Find based on the
type of data structures used.
 The strategies are
 Quick Find
 Quick Union
 Quick Find uses arrays and linked lists, while Quick Union uses trees.
 Quick Find
 This strategy maintains an array that records the representative of each subset.
 The array indices are the elements of the set S.
 The value at each index is the representative of the subset containing that element.
 The strategy also maintains each subset as a linked list
with header node.
 The header node contains pointers to the first and last
element of the list as well as information about the total
number of nodes in the list.
 Consider the set S={1, 2, 3, 4, 5, 6}
 Initially, this set gets divided into the subsets
{1}, {2}, {3}, {4}, {5}, {6}
 The linked lists for the subsets are as follows. (Each header node stores the list’s size and pointers to its first and last nodes.)
List 1: size 1, nodes 1 → null
List 2: size 1, nodes 2 → null
List 3: size 1, nodes 3 → null
List 4: size 1, nodes 4 → null
List 5: size 1, nodes 5 → null
List 6: size 1, nodes 6 → null
 The array of subset representatives is,

Element index:   1  2  3  4  5  6
Representative:  1  2  3  4  5  6
 We will now see what happens to the array and the
linked lists after the calls for union(1,4) and
union(2,5) are made.
List 1: size 2, nodes 1 → 4 → null
List 2: size 2, nodes 2 → 5 → null
List 3: size 1, nodes 3 → null
List 6: size 1, nodes 6 → null

Element index:   1  2  3  4  5  6
Representative:  1  2  3  1  2  6
 Next, the calls union(1,2) and union(3,6) will result in the following scenario,

List 1: size 4, nodes 1 → 4 → 2 → 5 → null
List 3: size 2, nodes 3 → 6 → null

Element index:   1  2  3  4  5  6
Representative:  1  1  3  1  1  3
 This way the process continues in Quick find
strategy.
 To summarize, in Quick Find approach,
 makeset(x) involves creating a representative array in
which the representative for each element is itself
initially.
 This operation also involves creating a linked list of
one node for each subset.
 find(x) involves retrieving x’s representative from the
array.
 Quick Union
 This approach uses Trees to implement the operations.
 makeset(x) involves creating a tree of one node for all
x in the set S.
 The root of a tree is the representative of that subset.
 Edges in the tree are directed from children to parents.
 union(x,y) involves attaching the root of one tree to the
root of the other.
 For a given set S = {1, 2, 3, 4, 5, 6}, makeset(x) creates six one-node trees, rooted at 1, 2, 3, 4, 5, and 6.
 union(1,4) and union(2,5) attach node 4 under root 1 and node 5 under root 2.
 union(1,2) and union(3,6) then attach root 2 (with its child 5) under root 1, and node 6 under root 3.
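The three quick-union operations described above can be sketched in a small Python class; the roots play the role of subset representatives.

```python
# Quick-union disjoint sets: each subset is a tree, edges point from
# child to parent, and the root is the subset's representative.

class DisjointSets:
    def __init__(self):
        self.parent = {}

    def makeset(self, x):
        self.parent[x] = x                # each element starts as its own root

    def find(self, x):
        while self.parent[x] != x:        # climb to the root
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx != ry:
            self.parent[ry] = rx          # attach one root under the other

ds = DisjointSets()
for i in range(1, 7):
    ds.makeset(i)
ds.union(1, 4); ds.union(2, 5); ds.union(1, 2); ds.union(3, 6)
print(ds.find(5) == ds.find(4))  # True: {1, 2, 4, 5} is now one subset
```

Practical implementations usually add union-by-size (as the quick-find lists do with their size fields) or path compression to keep the trees shallow.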
DIJKSTRA’S ALGORITHM
 This algorithm is used to solve the Single Source
Shortest Path problem.
 The problem is defined as,
“For a given vertex called the source in a
weighted connected graph, find shortest paths to all its
other vertices.”
 Dijkstra’s algorithm works only on graphs whose edges have non-negative weights.
 In this algorithm, we will have two categories of vertices:
 Tree vertices
 Fringe vertices
 Tree vertices are those vertices of the given graph that
are part of the shortest path tree.
 Fringe vertices are the remaining vertices that are yet to
be added to the shortest path tree.
 We now get introduced to 2 notations that will be regularly
used in the algorithm:
 𝒅𝒊 - shortest distance from the source vertex to vertex i.
 w(a, b) – weight of the edge between vertices a and b.
The steps involved in finding the single-source shortest paths for
a given graph are:
Step 1 – Identify the fringe vertex 𝒖∗ that is closest to the source, and move 𝒖∗ from the fringe to the set of tree vertices.
Step 2 – For each remaining fringe vertex u that is connected to 𝒖∗ by an edge of weight w(𝒖∗, u) such that 𝒅𝒖∗ + w(𝒖∗, u) < 𝒅𝒖, update the labels of u to 𝒖∗ and 𝒅𝒖∗ + w(𝒖∗, u) respectively.
[Figure: a small 4-vertex graph illustrating the update step.]
 The vertex representation used in this algorithm is,
node_name(predecessor, cost of shortest path from
source)
 We now find the single-source shortest paths for the following graph using Dijkstra’s strategy.
[Figure: a 5-vertex graph on {a, b, c, d, e} with source a and edge weights ab = 3, ad = 7, bc = 4, bd = 2, cd = 5, ce = 6, de = 4.]
Tree Vertices  Fringe Vertices
a(-, 0)        b(a,3), c(-,∞), d(a,7), e(-,∞)
b(a, 3)        c(b,7), d(b,5), e(-,∞)
d(b, 5)        c(b,7), e(d,9)
c(b, 7)        e(d,9)
e(d, 9)        none remaining

The single-source shortest paths are:
a – b of length 3
a – b – c of length 7
a – b – d of length 5
a – b – d – e of length 9
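The fringe-update rule can be sketched with a binary heap in Python. The adjacency-dict representation is an assumption for illustration; the graph below is the worked example above (source a).

```python
import heapq

# Dijkstra's algorithm: repeatedly settle the closest fringe vertex and
# relax its outgoing edges (the d_u* + w(u*, u) < d_u update).

def dijkstra(graph, source):
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue                      # stale heap entry: skip
        for v, w in graph[u]:
            if d + w < dist.get(v, float('inf')):
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {'a': [('b', 3), ('d', 7)],
     'b': [('a', 3), ('c', 4), ('d', 2)],
     'c': [('b', 4), ('d', 5), ('e', 6)],
     'd': [('a', 7), ('b', 2), ('c', 5), ('e', 4)],
     'e': [('c', 6), ('d', 4)]}
print(dijkstra(g, 'a'))  # a:0, b:3, c:7, d:5, e:9 — same as the trace
```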
Algorithm trace (same graph, source a). The arrays 𝒅𝒗 (best distance found so far) and 𝒑𝒗 (predecessor) evolve as follows, indexed by the vertices a, b, c, d, e:

After initialization:     𝒅𝒗 = (0, ∞, ∞, ∞, ∞)   𝒑𝒗 = (Ø, Ø, Ø, Ø, Ø)
After processing 𝒖∗ = a:  𝒅𝒗 = (0, 3, ∞, 7, ∞)   𝒑𝒗 = (Ø, a, Ø, a, Ø)
After processing 𝒖∗ = b:  𝒅𝒗 = (0, 3, 7, 5, ∞)   𝒑𝒗 = (Ø, a, b, b, Ø)
After processing 𝒖∗ = d:  𝒅𝒗 = (0, 3, 7, 5, 9)   𝒑𝒗 = (Ø, a, b, b, d)
Exercise Problems
Find the single-source shortest paths for the following graphs.
[Figure 1: the 5-vertex graph on {a, b, c, d, e} used above, with source a.]
[Figure 2: a 12-vertex graph on {a, b, c, d, e, f, g, h, i, j, k, l}, with source a.]
HUFFMAN TREES
 Huffman trees are used for Encoding.
 Encoding is the process of converting a given
text/message into some other form.
 This is accomplished by converting each character of the
text into a sequence of bits.
 The resulting bit string is called a Codeword.
 There are 2 encoding strategies:
 Fixed length encoding
 Variable length encoding
Fixed Length Encoding
 As the name says, each character in the text is replaced by a bit
sequence of the same length.
 ASCII code is an example(A=65, B=66, …., Z=90).
Variable Length Encoding
 Different characters are assigned with bit sequences of different
length.
 Frequently occurring characters are assigned shorter bit
strings, while rarely occurring characters are assigned longer bit
strings.
HUFFMAN TREES
Problem with Variable Length Encoding
 Consider the encoding scheme where a = 01, e = 00, h = 010,
l = 011, o = 10, i = 11.
 With this, we encode the text hello as 0100001101110.
 The problem lies in decoding this bit string.
 Since it is a variable-length encoding, different characters have
bit strings of different lengths.
 Hence finding where the bit string for one character ends and
the next begins becomes difficult.
 For instance, the above bit string could also be decoded as aeeilo.
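The ambiguity can be checked directly: under this non-prefix code, the texts hello and aeeilo encode to exactly the same bit string. A minimal sketch (the dictionary below just restates the codes from this slide):

```python
# The variable-length (non-prefix) code from the slide.
code = {"a": "01", "e": "00", "h": "010", "l": "011", "o": "10", "i": "11"}

def encode(text):
    """Concatenate the codewords of the characters in text."""
    return "".join(code[ch] for ch in text)

bits_hello = encode("hello")
bits_aeeilo = encode("aeeilo")
print(bits_hello)                   # 0100001101110
print(bits_hello == bits_aeeilo)    # True: two different texts, same bit string
```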
HUFFMAN TREES
Solution
 We now need a solution such that, even with variable-length
encoding, we can clearly identify the beginning and end of each
character's code.
 For this, we adopt a strategy called Prefix Code or Prefix-free
Code.
 As the name suggests, no character's codeword is a prefix of any
other character's codeword.
 This means, while scanning the encoded bit string from left to
right, the first codeword that matches is always the correct one.
 Now the question is, how can such a strategy be implemented?
HUFFMAN TREES
 This is accomplished using Trees, specifically Binary
Trees.
 We construct a binary tree such that,
 Leaf nodes represent characters of the text.
 The edges from the root to a leaf node represent the bit
code for the character in that leaf node.
 All the left edges are labeled 0 and all the right edges are
labeled 1.
This strategy is presented next.
HUFFMAN TREES
 With this scheme, decoding the bit string
011010101011
will only result in HELLO.
[Figure: a binary prefix-code tree with leaves a, e, h, l and o; every left edge is labeled 0 and every right edge is labeled 1]
The codes for the characters are:
A = 00
E = 010
H = 011
L = 10
O = 11
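Decoding with a prefix code can be sketched as a left-to-right scan that emits a character as soon as a complete codeword is matched. The codes are the ones on this slide; the decoder itself is our own illustration.

```python
# Prefix-free codes from the slide.
code = {"A": "00", "E": "010", "H": "011", "L": "10", "O": "11"}
decode_table = {bits: ch for ch, bits in code.items()}  # invert the code

def decode(bits):
    """Scan left to right; a prefix code guarantees the first match is correct."""
    result, buffer = [], ""
    for b in bits:
        buffer += b
        if buffer in decode_table:      # a complete codeword was read
            result.append(decode_table[buffer])
            buffer = ""
    return "".join(result)

print(decode("011010101011"))   # HELLO
```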
HUFFMAN TREES
 A formal approach of constructing such a binary tree was
proposed by Huffman and is presented in the following
algorithm.
Huffman’s Algorithm
Step 1:
 Initialize n one-node trees and label them with the symbols of the
alphabet given.
 Record the frequency of each symbol in its tree’s root to indicate
the tree’s weight. (More generally, the weight of a tree will be
equal to the sum of the frequencies in the tree’s leaves.)
Step 2: Repeat the following steps until a single tree is obtained.
 Find the two trees with the smallest weights.
 Make them the left and right subtrees of a new tree and record
the sum of their weights in the root of the new tree.
HUFFMAN TREES
 A tree constructed in this way is known as a Huffman
tree.
 The code generated by such a tree is known as Huffman
Code.
Example
Consider the five-symbol alphabet {A, B, C, D, _} with the
following occurrence frequencies in a text made up of these
symbols:
symbol      A      B     C     D     _
frequency   0.35   0.1   0.2   0.2   0.15
HUFFMAN TREES
Step 1
 We start by creating n one-node trees.
 Since there are 5 characters, we will have 5 trees:
A      B      C      D      _
 We now assign the frequencies of these characters as their
node weights:
A      B      C      D      _
0.35   0.1    0.2    0.2    0.15
Step 2
 We need to select the 2 trees with the least weights. For this we
first arrange these trees in increasing order of weight.
HUFFMAN TREES
 We now select nodes B and _ to be merged as one tree.
 The total weight of the resulting tree, 0.25, is recorded in the
root.
[Figure: the five trees sorted by weight, B (0.1), _ (0.15), C (0.2), D (0.2), A (0.35); B and _ are merged under a new root of weight 0.25]
HUFFMAN TREES
[Figure: the remaining merge steps. C (0.2) and D (0.2) are merged under a root of weight 0.4; then A (0.35) and the {B, _} tree (0.25) are merged under a root of weight 0.6]
HUFFMAN TREES
[Figure: the final Huffman tree of weight 1.0, obtained by merging the trees of weight 0.4 (C, D) and 0.6 (A with the {B, _} subtree); left edges are labeled 0 and right edges 1]
Huffman Tree and Huffman Code
With left edges labeled 0, the resulting codes are:
A = 11, B = 100, C = 00, D = 01, _ = 101
HUFFMAN TREES
 The expected number of bits per character using the
Huffman approach is given by,
∑ (i = 1 to n) [no. of bits for the ith character × frequency of the ith character]
 For our example,
Expected no. of bits = 2*0.35 + 3*0.1 + 2*0.2 + 2*0.2 + 3*0.15
= 2.25
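Huffman's algorithm maps directly onto a priority queue: repeatedly pop the two lightest trees and push their merge. A sketch for the five-symbol example above (the tie-breaking counter and helper names are our own; exact 0/1 assignments can vary with tie-breaking, but the codeword lengths, and hence the 2.25 expected bits, do not):

```python
import heapq

def huffman_codes(freq):
    """Build a Huffman tree greedily and return {symbol: codeword}."""
    # Each heap entry: (weight, tie_breaker, tree); a tree is a symbol or a pair.
    heap = [(w, i, sym) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)     # the two smallest-weight trees
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, counter, (t1, t2)))
        counter += 1
    _, _, root = heap[0]

    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):         # internal node: 0 left, 1 right
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"     # single-symbol edge case
    walk(root, "")
    return codes

freq = {"A": 0.35, "B": 0.1, "C": 0.2, "D": 0.2, "_": 0.15}
codes = huffman_codes(freq)
expected_bits = sum(len(codes[s]) * f for s, f in freq.items())
print(codes)                      # codeword lengths: A=2, B=3, C=2, D=2, _=3
print(round(expected_bits, 2))    # 2.25
```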
THANK YOU

More Related Content

What's hot

Data fitting in Scilab - Tutorial
Data fitting in Scilab - TutorialData fitting in Scilab - Tutorial
Data fitting in Scilab - TutorialScilab
 
Greedy Algorithm - Knapsack Problem
Greedy Algorithm - Knapsack ProblemGreedy Algorithm - Knapsack Problem
Greedy Algorithm - Knapsack ProblemMadhu Bala
 
Greedy Algorithm - Huffman coding
Greedy Algorithm - Huffman codingGreedy Algorithm - Huffman coding
Greedy Algorithm - Huffman codingMd Monirul Alom
 
Greedy method class 11
Greedy method class 11Greedy method class 11
Greedy method class 11Kumar
 
Fractional knapsack class 13
Fractional knapsack class 13Fractional knapsack class 13
Fractional knapsack class 13Kumar
 
Greedy algorithm activity selection fractional
Greedy algorithm activity selection fractionalGreedy algorithm activity selection fractional
Greedy algorithm activity selection fractionalAmit Kumar Rathi
 
Constraint satisfaction problems (csp)
Constraint satisfaction problems (csp)   Constraint satisfaction problems (csp)
Constraint satisfaction problems (csp) Archana432045
 
Deep Dive on Amazon EC2 instances
Deep Dive on Amazon EC2 instancesDeep Dive on Amazon EC2 instances
Deep Dive on Amazon EC2 instancesAmazon Web Services
 
【材料力学】はり のせん断力と曲げモーメント
【材料力学】はり のせん断力と曲げモーメント【材料力学】はり のせん断力と曲げモーメント
【材料力学】はり のせん断力と曲げモーメントKazuhiro Suga
 
The Traveling Salesman Problem: A Neural Network Perspective
The Traveling Salesman Problem: A Neural Network PerspectiveThe Traveling Salesman Problem: A Neural Network Perspective
The Traveling Salesman Problem: A Neural Network Perspectivemustafa sarac
 
Pose Graph based SLAM
Pose Graph based SLAMPose Graph based SLAM
Pose Graph based SLAMEdwardIm1
 

What's hot (16)

Data fitting in Scilab - Tutorial
Data fitting in Scilab - TutorialData fitting in Scilab - Tutorial
Data fitting in Scilab - Tutorial
 
Greedy Algorithm - Knapsack Problem
Greedy Algorithm - Knapsack ProblemGreedy Algorithm - Knapsack Problem
Greedy Algorithm - Knapsack Problem
 
Tsp branch and-bound
Tsp branch and-boundTsp branch and-bound
Tsp branch and-bound
 
Greedy Algorithm - Huffman coding
Greedy Algorithm - Huffman codingGreedy Algorithm - Huffman coding
Greedy Algorithm - Huffman coding
 
Greedy method class 11
Greedy method class 11Greedy method class 11
Greedy method class 11
 
Minimax
MinimaxMinimax
Minimax
 
Alpha beta pruning
Alpha beta pruningAlpha beta pruning
Alpha beta pruning
 
Fractional knapsack class 13
Fractional knapsack class 13Fractional knapsack class 13
Fractional knapsack class 13
 
Greedy algorithm activity selection fractional
Greedy algorithm activity selection fractionalGreedy algorithm activity selection fractional
Greedy algorithm activity selection fractional
 
Constraint satisfaction problems (csp)
Constraint satisfaction problems (csp)   Constraint satisfaction problems (csp)
Constraint satisfaction problems (csp)
 
Dynamic programming
Dynamic programmingDynamic programming
Dynamic programming
 
Deep Dive on Amazon EC2 instances
Deep Dive on Amazon EC2 instancesDeep Dive on Amazon EC2 instances
Deep Dive on Amazon EC2 instances
 
【材料力学】はり のせん断力と曲げモーメント
【材料力学】はり のせん断力と曲げモーメント【材料力学】はり のせん断力と曲げモーメント
【材料力学】はり のせん断力と曲げモーメント
 
The Traveling Salesman Problem: A Neural Network Perspective
The Traveling Salesman Problem: A Neural Network PerspectiveThe Traveling Salesman Problem: A Neural Network Perspective
The Traveling Salesman Problem: A Neural Network Perspective
 
Pose Graph based SLAM
Pose Graph based SLAMPose Graph based SLAM
Pose Graph based SLAM
 
N queen problem
N queen problemN queen problem
N queen problem
 

Similar to Module 3_Greedy Technique_2021 Scheme.pptx

module3_Greedymethod_2022.pdf
module3_Greedymethod_2022.pdfmodule3_Greedymethod_2022.pdf
module3_Greedymethod_2022.pdfShiwani Gupta
 
Mb0048 operations research (1)
Mb0048 operations research (1)Mb0048 operations research (1)
Mb0048 operations research (1)smumbahelp
 
Mb0048 operations research (1)
Mb0048 operations research (1)Mb0048 operations research (1)
Mb0048 operations research (1)smumbahelp
 
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.pptParallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.pptdakccse
 
Optimization problems
Optimization problemsOptimization problems
Optimization problemsRuchika Sinha
 
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.pptParallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.pptBinayakMukherjee4
 
daa-unit-3-greedy method
daa-unit-3-greedy methoddaa-unit-3-greedy method
daa-unit-3-greedy methodhodcsencet
 
Module 2_Decrease and Conquer_2021 Scheme.pptx
Module 2_Decrease and Conquer_2021 Scheme.pptxModule 2_Decrease and Conquer_2021 Scheme.pptx
Module 2_Decrease and Conquer_2021 Scheme.pptxRITIKKUMAR168218
 
A Survey- Knapsack Problem Using Dynamic Programming
A Survey- Knapsack Problem Using Dynamic ProgrammingA Survey- Knapsack Problem Using Dynamic Programming
A Survey- Knapsack Problem Using Dynamic ProgrammingEditor IJCTER
 
Satisfaction And Its Application To Ai Planning
Satisfaction And Its Application To Ai PlanningSatisfaction And Its Application To Ai Planning
Satisfaction And Its Application To Ai Planningahmad bassiouny
 
AMS_502_13, 14,15,16 (1).pptx
AMS_502_13, 14,15,16 (1).pptxAMS_502_13, 14,15,16 (1).pptx
AMS_502_13, 14,15,16 (1).pptxbhavypatel2228
 
Mba205 operations research
Mba205 operations researchMba205 operations research
Mba205 operations researchsmumbahelp
 
Introduction to Optimization revised.ppt
Introduction to Optimization revised.pptIntroduction to Optimization revised.ppt
Introduction to Optimization revised.pptJahnaviGautam
 
data structures and algorithms Unit 4
data structures and algorithms Unit 4data structures and algorithms Unit 4
data structures and algorithms Unit 4infanciaj
 
Application of Graphic LASSO in Portfolio Optimization_Yixuan Chen & Mengxi J...
Application of Graphic LASSO in Portfolio Optimization_Yixuan Chen & Mengxi J...Application of Graphic LASSO in Portfolio Optimization_Yixuan Chen & Mengxi J...
Application of Graphic LASSO in Portfolio Optimization_Yixuan Chen & Mengxi J...Mengxi Jiang
 

Similar to Module 3_Greedy Technique_2021 Scheme.pptx (20)

module3_Greedymethod_2022.pdf
module3_Greedymethod_2022.pdfmodule3_Greedymethod_2022.pdf
module3_Greedymethod_2022.pdf
 
Mb0048 operations research (1)
Mb0048 operations research (1)Mb0048 operations research (1)
Mb0048 operations research (1)
 
Mb0048 operations research (1)
Mb0048 operations research (1)Mb0048 operations research (1)
Mb0048 operations research (1)
 
Module 3_DAA (2).pptx
Module 3_DAA (2).pptxModule 3_DAA (2).pptx
Module 3_DAA (2).pptx
 
Stochastic Optimization
Stochastic OptimizationStochastic Optimization
Stochastic Optimization
 
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.pptParallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
 
Optimization problems
Optimization problemsOptimization problems
Optimization problems
 
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.pptParallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
Parallel_Algorithms_In_Combinatorial_Optimization_Problems.ppt
 
daa-unit-3-greedy method
daa-unit-3-greedy methoddaa-unit-3-greedy method
daa-unit-3-greedy method
 
Sudoku Solver
Sudoku SolverSudoku Solver
Sudoku Solver
 
Module 2_Decrease and Conquer_2021 Scheme.pptx
Module 2_Decrease and Conquer_2021 Scheme.pptxModule 2_Decrease and Conquer_2021 Scheme.pptx
Module 2_Decrease and Conquer_2021 Scheme.pptx
 
A Survey- Knapsack Problem Using Dynamic Programming
A Survey- Knapsack Problem Using Dynamic ProgrammingA Survey- Knapsack Problem Using Dynamic Programming
A Survey- Knapsack Problem Using Dynamic Programming
 
Ai unit-3
Ai unit-3Ai unit-3
Ai unit-3
 
Satisfaction And Its Application To Ai Planning
Satisfaction And Its Application To Ai PlanningSatisfaction And Its Application To Ai Planning
Satisfaction And Its Application To Ai Planning
 
AMS_502_13, 14,15,16 (1).pptx
AMS_502_13, 14,15,16 (1).pptxAMS_502_13, 14,15,16 (1).pptx
AMS_502_13, 14,15,16 (1).pptx
 
Mba205 operations research
Mba205 operations researchMba205 operations research
Mba205 operations research
 
Introduction to Optimization revised.ppt
Introduction to Optimization revised.pptIntroduction to Optimization revised.ppt
Introduction to Optimization revised.ppt
 
data structures and algorithms Unit 4
data structures and algorithms Unit 4data structures and algorithms Unit 4
data structures and algorithms Unit 4
 
Assignment
AssignmentAssignment
Assignment
 
Application of Graphic LASSO in Portfolio Optimization_Yixuan Chen & Mengxi J...
Application of Graphic LASSO in Portfolio Optimization_Yixuan Chen & Mengxi J...Application of Graphic LASSO in Portfolio Optimization_Yixuan Chen & Mengxi J...
Application of Graphic LASSO in Portfolio Optimization_Yixuan Chen & Mengxi J...
 

Recently uploaded

High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escortsranjana rawat
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
(TARA) Talegaon Dabhade Call Girls Just Call 7001035870 [ Cash on Delivery ] ...
(TARA) Talegaon Dabhade Call Girls Just Call 7001035870 [ Cash on Delivery ] ...(TARA) Talegaon Dabhade Call Girls Just Call 7001035870 [ Cash on Delivery ] ...
(TARA) Talegaon Dabhade Call Girls Just Call 7001035870 [ Cash on Delivery ] ...ranjana rawat
 
UNIT-III FMM. DIMENSIONAL ANALYSIS
UNIT-III FMM.        DIMENSIONAL ANALYSISUNIT-III FMM.        DIMENSIONAL ANALYSIS
UNIT-III FMM. DIMENSIONAL ANALYSISrknatarajan
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130Suhani Kapoor
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...ranjana rawat
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxpranjaldaimarysona
 
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...ranjana rawat
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...Soham Mondal
 
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSHARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSRajkumarAkumalla
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingrakeshbaidya232001
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxJoão Esperancinha
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxupamatechverse
 
UNIT-V FMM.HYDRAULIC TURBINE - Construction and working
UNIT-V FMM.HYDRAULIC TURBINE - Construction and workingUNIT-V FMM.HYDRAULIC TURBINE - Construction and working
UNIT-V FMM.HYDRAULIC TURBINE - Construction and workingrknatarajan
 
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Christo Ananth
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Dr.Costas Sachpazis
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)Suman Mia
 
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...RajaP95
 
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).pptssuser5c9d4b1
 

Recently uploaded (20)

High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
 
(TARA) Talegaon Dabhade Call Girls Just Call 7001035870 [ Cash on Delivery ] ...
(TARA) Talegaon Dabhade Call Girls Just Call 7001035870 [ Cash on Delivery ] ...(TARA) Talegaon Dabhade Call Girls Just Call 7001035870 [ Cash on Delivery ] ...
(TARA) Talegaon Dabhade Call Girls Just Call 7001035870 [ Cash on Delivery ] ...
 
UNIT-III FMM. DIMENSIONAL ANALYSIS
UNIT-III FMM.        DIMENSIONAL ANALYSISUNIT-III FMM.        DIMENSIONAL ANALYSIS
UNIT-III FMM. DIMENSIONAL ANALYSIS
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
 
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
(ANJALI) Dange Chowk Call Girls Just Call 7001035870 [ Cash on Delivery ] Pun...
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptx
 
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
(SHREYA) Chakan Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Esc...
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
 
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCRCall Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
 
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSHARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
 
Porous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writingPorous Ceramics seminar and technical writing
Porous Ceramics seminar and technical writing
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptx
 
UNIT-V FMM.HYDRAULIC TURBINE - Construction and working
UNIT-V FMM.HYDRAULIC TURBINE - Construction and workingUNIT-V FMM.HYDRAULIC TURBINE - Construction and working
UNIT-V FMM.HYDRAULIC TURBINE - Construction and working
 
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
Call for Papers - African Journal of Biological Sciences, E-ISSN: 2663-2187, ...
 
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
Structural Analysis and Design of Foundations: A Comprehensive Handbook for S...
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
 
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
IMPLICATIONS OF THE ABOVE HOLISTIC UNDERSTANDING OF HARMONY ON PROFESSIONAL E...
 
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
247267395-1-Symmetric-and-distributed-shared-memory-architectures-ppt (1).ppt
 

Module 3_Greedy Technique_2021 Scheme.pptx

  • 2. CONTENTS  General method  Coin Change Problem  Knapsack Problem  Job sequencing with deadlines  Minimum cost spanning trees  Prim’s Algorithm  Kruskal’s Algorithm  Single source shortest paths  Dijkstra's Algorithm  Optimal Tree problem  Huffman Trees and Codes  Transform and Conquer Approach  Heaps and Heap Sort 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 2
  • 3. INTRODUCTION The General Method  There will be a problem given. A set of conditions to be satisfied for the problem will also be provided.  Suppose the problem has n inputs, we select a subset of n inputs that satisfy the conditions.  Several subsets may satisfy the conditions.  Each subset satisfying the problem conditions is referred to as a Feasible Solution.  Out of all the Feasible Solutions, the one satisfying the conditions to the fullest is referred to as the Optimal Solution and is the solution to the problem. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 3
  • 4. INTRODUCTION  Finding a solution to a problem in this manner is referred to as Subset Paradigm. Algorithm Greedy(a, n) //a[1:n] contains the n inputs { solution := Ø; for i := 1 to n do { x := Select(a); if Feasible(solution, x) then solution := Union(solution, x); } return solution; } 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 4
  • 5. INTRODUCTION  In simple words, Greedy Method is the concept of getting the best possible solution for a problem with minimal efforts.  It is like getting a high quality product with minimal cost.  A classic scenario that represents greedy technique is the Coin Change Problem.  Another simple example that represents greedy method is Machine Scheduling, which is presented next. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 5
  • 6. INTRODUCTION There are n tasks that must be completed. Each task has a start and end time. Infinite number of machines are provided to complete the task. The condition is that, the tasks must not be assigned to machines such that their execution overlaps. Task A B C D E F G Start 0 3 4 9 7 1 6 End 2 7 7 11 10 5 8 Solution  One possible solution for this problem is assigning individual machines to each of the tasks.  There are 8 tasks. Hence we assign 8 machines, one for each task. Although this is a feasible solution, this is definitely not optimal. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 6
  • 7. INTRODUCTION  We now see how to get the optimal solution.  We start by arranging the tasks in the order of their start times, i.e., A, F, B, C, G, E, D (0,2),(1,5),(3,7),(4,7),(6,8),(7,10),(9,11)  Next we allocate machines to the tasks as follows, 0 1 2 3 4 5 6 7 8 9 10 11 12 13 mc m1 A m2 F B m3 C G E D 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 7
  • 8. KNAPSACK PROBLEM  There are n objects given to us.  Each object has a weight and profit associated with it.  The weights and profits are represented as 𝒘𝒊 and 𝒑𝒊 respectively, for 1 ≤ i ≤ n.  We are also given a Knapsack/bag whose capacity in m.  The problem here is to fill the knapsack by using the objects such that we get maximum profit and we don’t exceed the knapsack capacity. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 8
  • 9. KNAPSACK PROBLEM  Let the objects be 𝒙𝟏, 𝒙𝟐, ……., 𝒙𝒏.  An object 𝒙𝒊 can either be chosen as a whole or a fraction of it can be chosen.  This means, 0 ≤ 𝒙𝒊 ≤ 1.  𝒙𝒊 = 1, means the object 𝒙𝒊 has been chosen as a whole.  𝒙𝒊 = 1/2, means half of 𝒙𝒊 has been chosen  𝒙𝒊 = 0, means the object 𝒙𝒊 hasn't been chosen.  The profit of an object varies based on the fraction of the object chosen. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 9
  • 10. KNAPSACK PROBLEM  Consider an object 𝒙𝒊, whose profit is 100. Then  𝒙𝒊 = 1, means the object 𝒙𝒊 has been chosen as a whole. Hence profit of 𝒙𝒊 is 100.  𝒙𝒊 = 1/2, means half of 𝒙𝒊 has been chosen. Hence profit of 𝒙𝒊 is 50(100/2).  𝒙𝒊 = 0, means the object 𝒙𝒊 hasn't been chosen. Hence profit of 𝒙𝒊 is 0.  In general, the profit of an object is obtained by 𝒙𝒊*𝒑𝒊. Similarly, the weight of an object is 𝒘𝒊*𝒙𝒊. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 10
  • 11. KNAPSACK PROBLEM Formal Definition of Knapsack Problem “Given n objects with weights 𝒘𝒊 and profits 𝒑𝒊 such that 1 ≤ i ≤ n and given a knapsack with capacity m, the Knapsack problem can be stated as maximize 1 ≤ i ≤ n 𝒑𝒊𝒙𝒊 subject to 1 ≤ i ≤ n 𝒘𝒊𝒙𝒊 ≤ m and 0 ≤ 𝒙𝒊≤ 1 , 1 ≤ i ≤ n ” 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 11
  • 12. KNAPSACK PROBLEM We now discuss some observations made about the Knapsack problem. These observations are referred to as Lemmas. Lemma 1 If total weight, i.e., 𝒘𝟏+ 𝒘𝟐+…+ 𝒘𝒏 ≤ M, then, 𝒙𝒊=1 for all i such that 1 ≤ i ≤ n. Lemma 2 All optimal solutions will fill the knapsack exactly. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 12
  • 13. KNAPSACK PROBLEM Strategies to solve Knapsack problem Consider there are 3 objects, each with weights (18, 15, 10) and profits (25, 24, 15) respectively. The knapsack capacity is 20. Find the optimal solution. Solution The data given in this problem is  n = 3  Objects are 𝒙𝟏, 𝒙𝟐, 𝒙𝟑  𝒘𝟏 = 18, 𝒘𝟐 = 15, 𝒘𝟑 = 10  𝒑𝟏 = 25, 𝒑𝟐 = 24, 𝒑𝟑 = 15  m = 20 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 13
  • 14. KNAPSACK PROBLEM Following are some solutions available for the given problem. Solutio n 𝒙𝟏 𝒙𝟐 𝒙𝟑 Total Weight Total Profit A 1/2 1/3 1/4 16.5 24.25 B 1 2/15 0 20 28.20 C 0 2/3 1 20 31 D 0 1 1/2 20 31.5  Here solution D is the optimal solution.  We now look at some greedy strategies to solve the knapsack problem. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 14
  • 15. KNAPSACK PROBLEM Strategy 1 – Choose highest profit/value next  In this approach, we choose objects in the decreasing order of their profits.  In the example, we first select 𝒙𝟏 followed by 2/15th of 𝒙𝟐 as selecting the whole of 𝒙𝟐 will exceed the knapsack capacity.  This strategy is represented in solution B. Strategy 2 – Choose smallest weight next  In this approach, we choose objects in the increasing order of their weights.  This strategy is represented in solution C. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 15
  • 16. KNAPSACK PROBLEM Strategy 3 – Choose highest value to weight ratio next  This is a combination of strategies 1 and 2.  In this approach, we first obtain the profit to weight ratio of all objects. For our example, 𝒙𝟏 = 25/18 = 1.388 𝒙𝟐 = 24/15 = 1.6 𝒙𝟑 = 15/10 = 1.5  We then select the objects in the decreasing order of their profit to weight ratio.  This approach is represented by solution D, which is the 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 16
  • 17. KNAPSACK PROBLEM Theorem If 𝒑𝟏 𝒘𝟏 ≥ 𝒑𝟐 𝒘𝟐 ≥ …… ≥ 𝒑𝒏 𝒘𝒏, then Greedy Knapsack generates an optimal solution to the given instance of the problem. Proof Phase 1 Example  We need to prove that Greedy method gives the optimal solution for the Knapsack problem.  From the discussion that we have had so far, to get the optimal solution, we have used the value to weight ratio as the 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 17
  • 18. KNAPSACK PROBLEM  In this approach, we first find the value/weight ratio for each object and arrange it in descending order, i.e., 𝒑𝟏 𝒘𝟏 ≥ 𝒑𝟐 𝒘𝟐 ≥ …… ≥ 𝒑𝒏 𝒘𝒏  Let 𝒑𝟏 𝒘𝟏 correspond to object 𝒙𝟏, 𝒑𝟐 𝒘𝟐 correspond to object 𝒙𝟐 and so on.  This means, the order in which we pick the objects is, 𝒙𝟏 , 𝒙𝟐 ,……., 𝒙𝒏  We first pick up 𝒙𝟏 and we pick it as a whole. We then pick up 𝒙𝟐 as a whole followed by 𝒙𝟑 and so on. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 18
  • 19. KNAPSACK PROBLEM  We keep picking objects as a whole as long as the knapsack capacity is not exceeded.  This can be expressed as 𝒙𝟏 = 1, 𝒙𝟐 = 1, 𝒙𝟑 = 1……..  This continues until we reach a point, say position ‘j’, such that picking 𝒙𝒋 as a whole will exceed knapsack capacity.  Hence, we have no other choice but to pick a fraction of 𝒙𝒋, i.e., 0 < 𝒙𝒋 < 1 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 19
  • 20. KNAPSACK PROBLEM  We take a fraction of 𝒙𝒋 such that it is just enough to fill the knapsack capacity.  This means, after picking a fraction of 𝒙𝒋, we will not be able to pick the objects 𝒙𝒋+𝟏, 𝒙𝒋+𝟐, ……, 𝒙𝒏.  This is represented as 𝒙𝟏 = 𝒙𝟐 = …… = 𝒙𝒋−𝟏 = 1, 0 < 𝒙𝒋 < 1, 𝒙𝒋+𝟏 = 𝒙𝒋+𝟐 = …… = 𝒙𝒏 = 0  This is the greedy solution to the problem. And we need to prove that this is the optimal solution. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 20
  • 21. KNAPSACK PROBLEM Phase 2  Let the greedy solution for the Knapsack problem be X = (𝒙𝟏, 𝒙𝟐,….., 𝒙𝒏), such that 𝒙𝟏 = 𝒙𝟐 = …… = 𝒙𝒋−𝟏 = 1, 0 < 𝒙𝒋 < 1, 𝒙𝒋+𝟏 = 𝒙𝒋+𝟐 = …… = 𝒙𝒏 = 0  Let us assume the optimal solution for the same problem to be Y = (𝒚𝟏, 𝒚𝟐,….., 𝒚𝒏)  We are not aware of what fractions of 𝒚𝒊’s have been considered. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 21
  • 22. KNAPSACK PROBLEM  If we say Y = X, it means that the optimal solution is our greedy solution, and there is nothing to prove.  For the sake of proof, we consider Y ≠ X.  Let ‘k’ be the least index position at which 𝒚𝒌 ≠ 𝒙𝒌; then 𝒚𝒌 must be less than 𝒙𝒌 (𝒚𝒌 < 𝒙𝒌).  We now prove the above statement w.r.t. index ‘j’. We consider three cases:  Case 1: if k < j  Case 2: if k = j  Case 3: if k > j 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 22
  • 23. KNAPSACK PROBLEM Case 1: if k < j  For all index positions less than j, the values of the 𝒙𝒊’s are 1.  Hence, for this case, if 𝒚𝒌 ≠ 𝒙𝒌, i.e., if 𝒚𝒌 ≠ 1, then 𝒚𝒌 has to be less than 𝒙𝒌.  This is because 𝒚𝒌 cannot be greater than 1, as 1 is the maximum value of an object.  Therefore, for Case 1, we have proved that if 𝒚𝒌 ≠ 𝒙𝒌, then 𝒚𝒌 < 𝒙𝒌. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 23
  • 24. KNAPSACK PROBLEM Case 2: if k = j  This means the 𝒚𝒊 values are the same as the 𝒙𝒊 values till index j−1. At the jth index their values don’t match.  At position j, we consider just a fraction of 𝒙𝒋, because taking 𝒙𝒋 as a whole would exceed the knapsack capacity.  This means at position j, the value 𝒚𝒋 also cannot be 1.  Also, 𝒚𝒋 cannot be greater than 𝒙𝒋, as the fraction of 𝒙𝒋 considered fills the knapsack to its capacity.  So, if 𝒚𝒋 exceeds this fraction, then it will definitely exceed the knapsack’s capacity.  Hence, at index k = j, if 𝒚𝒌 ≠ 𝒙𝒌 then 𝒚𝒌 < 𝒙𝒌. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 24
  • 25. KNAPSACK PROBLEM Case 3: if k > j  This means 𝒚𝒊 values are same as 𝒙𝒊 values till index j. After jth index their values don’t match.  The values of 𝒙𝒊 till index j have filled the knapsack to its capacity. This means even the 𝒚𝒊 values have filled the knapsack.  After index j, all 𝒙𝒊’s are 0s.  At these positions 𝒚𝒊 cannot be greater than 𝒙𝒊 as the knapsack capacity will be breached. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 25
  • 26. KNAPSACK PROBLEM  Hence we have proved the statement that if 𝒚𝒌 ≠ 𝒙𝒌 then 𝒚𝒌 < 𝒙𝒌.  This means the assumed optimal solution Y agrees with our greedy solution X up to index k and then falls behind it.  We now transform the assumed optimal solution Y into the greedy solution X and prove the theorem.  By transformation, we mean, since 𝒚𝒌 < 𝒙𝒌, we bring 𝒚𝒌 up to the value of 𝒙𝒌 and reduce the later 𝒚𝒊’s to compensate. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 26
  • 27. KNAPSACK PROBLEM Phase 3  We increase 𝒚𝒌 to 𝒙𝒌. We also reduce 𝒚𝒌+𝟏, 𝒚𝒌+𝟐, ……, 𝒚𝒏 accordingly, so that the total weight balance holds.  Let this transformed solution be Z = (𝒛𝟏, 𝒛𝟐, ……, 𝒛𝒏)  We can make the following observations on Z:  For 1 ≤ i ≤ k, 𝒛𝒊 = 𝒙𝒊  𝒘𝒌(𝒙𝒌 − 𝒚𝒌) = Σ_{i=k+1..n} 𝒘𝒊(𝒚𝒊 − 𝒛𝒊) 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 27
  • 28. KNAPSACK PROBLEM  We now compare the profits of Z and Y.  We already know that Z agrees with X up to index k.  Hence, if we show that the profit of Z is no less than that of Y, the greedy choices cannot be improved upon, which is the proof of the theorem. We have,
Σ_{i=1..n} p_i z_i = Σ_{i=1..n} p_i y_i + p_k(x_k − y_k) − Σ_{i=k+1..n} p_i(y_i − z_i)
Rewriting p_k as w_k(p_k/w_k) and p_i as w_i(p_i/w_i), we get
Σ_{i=1..n} p_i z_i = Σ_{i=1..n} p_i y_i + (p_k/w_k) w_k(x_k − y_k) − Σ_{i=k+1..n} (p_i/w_i) w_i(y_i − z_i) 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 28
  • 29. KNAPSACK PROBLEM Since p_i/w_i ≤ p_k/w_k for every i > k (and y_i ≥ z_i), the subtracted sum is at most (p_k/w_k) Σ_{i=k+1..n} w_i(y_i − z_i); therefore
Σ_{i=1..n} p_i z_i ≥ Σ_{i=1..n} p_i y_i + (p_k/w_k)[w_k(x_k − y_k) − Σ_{i=k+1..n} w_i(y_i − z_i)]
Since we already have 𝒘𝒌(𝒙𝒌 − 𝒚𝒌) = Σ_{i=k+1..n} 𝒘𝒊(𝒚𝒊 − 𝒛𝒊), the bracketed term vanishes, giving
Σ_{i=1..n} p_i z_i ≥ Σ_{i=1..n} p_i y_i
As Y is optimal, the profit of Z cannot exceed that of Y, so the two profits are equal. Repeating the transformation either turns Y into X or yields a solution of equal profit that agrees with X; hence X is optimal. Hence Proved. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 29
  • 30. KNAPSACK PROBLEM
Algorithm GreedyKnapsack(m, n)
// p[1:n] and w[1:n] hold the profits and weights of the n objects,
// ordered so that p[i]/w[i] ≥ p[i+1]/w[i+1]; m is the knapsack capacity
// and x[1:n] is the solution vector.
{
    for i := 1 to n do x[i] := 0.0;
    U := m;
    for i := 1 to n do
    {
        if (w[i] > U) then break;
        x[i] := 1.0;
        U := U − w[i];
    }
    if (i ≤ n) then x[i] := U / w[i];
}
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 30
  • 31. KNAPSACK PROBLEM Example Find the optimal solution for the given instance of Knapsack using the greedy technique. n = 7, m = 15, profits = (10, 5, 15, 7, 6, 18, 3), weights = (2, 3, 5, 7, 1, 4, 1) Solution  We already know that the optimal solution is obtained by selecting the objects in the decreasing order of their profit to weight ratio.  We first find this ratio for all the objects. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 31
  • 32. KNAPSACK PROBLEM
i        1     2     3    4    5     6    7
𝒑𝒊      10     5    15    7    6    18    3
𝒘𝒊       2     3     5    7    1     4    1
𝒑𝒊/𝒘𝒊    5  1.67     3    1    6   4.5    3
Picking objects in decreasing ratio order (𝒙𝟓, 𝒙𝟏, 𝒙𝟔, 𝒙𝟑, 𝒙𝟕, 𝒙𝟐, 𝒙𝟒), the remaining capacity falls m = 15 → 14 → 12 → 8 → 3 → 2 → 0, giving 𝒙 = (1, 2/3, 1, 0, 1, 1, 1).
Total Profit = 1*10 + 2/3*5 + 1*15 + 1*6 + 1*18 + 1*3 = 55.33 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 32
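The GreedyKnapsack procedure can be sketched in Python as follows (a minimal, hypothetical implementation; unlike the pseudocode above, it sorts the objects by ratio itself rather than assuming pre-sorted input):

```python
def greedy_knapsack(profits, weights, capacity):
    """Fractional knapsack: pick objects in decreasing profit/weight order.

    Returns (fractions, total_profit); fractions follow the input order.
    """
    n = len(profits)
    # Sort object indices by profit-to-weight ratio, highest first.
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * n
    remaining = capacity
    for i in order:
        if weights[i] <= remaining:      # object fits: take it whole
            x[i] = 1.0
            remaining -= weights[i]
        else:                            # take just enough to fill the sack
            x[i] = remaining / weights[i]
            break
    total = sum(p * f for p, f in zip(profits, x))
    return x, total

# Instance from the example above: n = 7, m = 15
x, total = greedy_knapsack([10, 5, 15, 7, 6, 18, 3], [2, 3, 5, 7, 1, 4, 1], 15)
print(round(total, 2))   # 55.33
```

Running it on the worked example reproduces the fractions (1, 2/3, 1, 0, 1, 1, 1) and the profit 55.33 derived above.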
  • 33. JOB SEQUENCING WITH DEADLINES  Consider there are n jobs/tasks.  Each job has a deadline 𝒅𝒊 such that 𝒅𝒊 ≥ 0.  Every job also has a profit/value 𝒑𝒊 such that 𝒑𝒊 > 0.  For a job, we get its profit only when the job is completed before its deadline.  A machine is provided to execute the jobs.  Only one machine is provided. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 33
  • 34. JOB SEQUENCING WITH DEADLINES  There are some conditions defined for the problem:  Every job takes one time unit to complete execution.  A job can be taken for execution only when the system clock is within the deadline of the considered job.  The objective of the problem is to execute as many jobs as possible and get maximum profit. Example Solve the Job Sequencing problem for the following instance. n=4 (𝒑𝟏, 𝒑𝟐, 𝒑𝟑, 𝒑𝟒)=(100, 10, 15, 27) (𝒅𝟏, 𝒅𝟐, 𝒅𝟑, 𝒅𝟒) = ( 2, 1, 2, 1) 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 34
  • 35. JOB SEQUENCING WITH DEADLINES  The data given is, (𝒑𝟏, 𝒑𝟐, 𝒑𝟑, 𝒑𝟒) = (100, 10, 15, 27), (𝒅𝟏, 𝒅𝟐, 𝒅𝟑, 𝒅𝟒) = (2, 1, 2, 1)  We find all possible solutions and then select the optimal one. The feasible schedules and their profits are: (J1, J2) = 110, (J1, J3) = 115, (J1, J4) = 127, (J2, J3) = 25, (J3, J4) = 42, (J1) = 100, (J2) = 10, (J3) = 15, (J4) = 27 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 35
  • 36. JOB SEQUENCING WITH DEADLINES  This way of getting the optimal solution is time consuming.  An efficient approach is to arrange the jobs in the decreasing order of their profits and then select the jobs from this order for execution (Greatest Profit Next strategy).  We have, Jobs – J1, J4, J3, J2; Profits – 100, 27, 15, 10; Deadlines – 2, 1, 2, 1. Hence (J4, J1), with J4 executed in slot [0,1] and J1 in slot [1,2], is the optimal solution with profit 127. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 36
  • 37. JOB SEQUENCING WITH DEADLINES High Level description of the Job Sequencing algorithm
Algorithm GreedyJob(d, J, n)
// J is a set of jobs that can be completed by their deadlines
{
    J := {1};
    for i := 2 to n do
    {
        if (all jobs in J ∪ {i} can be completed by their deadlines) then
            J := J ∪ {i};
    }
}
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 37
  • 38. JOB SEQUENCING WITH DEADLINES
Algorithm JS(d, n)
// d[1:n] holds the deadlines of n jobs, sorted so that p[1] ≥ p[2] ≥ … ≥ p[n].
// s is an array of time slots: s[t] holds the job assigned to slot [t−1, t].
// J[i] = 1 marks job i as scheduled; the function returns k, the number of jobs scheduled.
{
    k := 0;
    for i := 1 to n do J[i] := s[i] := 0;
    for i := 1 to n do
    {
        if (s[d[i]] = 0) then // the slot ending exactly at the deadline is free
        {
            s[d[i]] := i; J[i] := 1; k := k + 1;
        }
        else // otherwise scan for the latest free earlier slot
        {
            for x := d[i] − 1 to 1 step −1 do
            {
                if (s[x] = 0) then
                {
                    s[x] := i; J[i] := 1; k := k + 1; break;
                }
            }
        }
    }
    return k;
}
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 38
  • 39. JOB SEQUENCING WITH DEADLINES Example n = 5, (𝒑𝟏, 𝒑𝟐, 𝒑𝟑, 𝒑𝟒, 𝒑𝟓) = (20, 15, 10, 5, 1), (𝒅𝟏, 𝒅𝟐, 𝒅𝟑, 𝒅𝟒, 𝒅𝟓) = (2, 2, 1, 3, 3). Solution We solve this problem using the feasibility representation.
J        | Assigned slots       | Job considered | Action               | Profit so far
Ø        | none                 | 1              | Assign to slot [1,2] | 0
{1}      | [1,2]                | 2              | Assign to slot [0,1] | 20
{1,2}    | [0,1], [1,2]         | 3              | Cannot fit. Reject   | 35
{1,2}    | [0,1], [1,2]         | 4              | Assign to slot [2,3] | 35
{1,2,4}  | [0,1], [1,2], [2,3]  | 5              | Cannot fit. Reject   | 40
Hence, the optimal solution is {1, 2, 4} with a profit of 40. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 39
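The JS algorithm above can be sketched in Python (a minimal, hypothetical implementation; like the pseudocode, it considers jobs in decreasing profit order and places each in the latest free unit-time slot on or before its deadline):

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing with deadlines.

    Returns (selected job numbers, total profit); jobs are numbered from 1.
    """
    n = len(profits)
    # Consider jobs in decreasing order of profit.
    order = sorted(range(n), key=lambda i: profits[i], reverse=True)
    slots = [0] * (n + 1)            # slots[t] = job occupying slot [t-1, t]
    for i in order:
        # Scan from the deadline backwards for the latest free slot.
        for t in range(min(n, deadlines[i]), 0, -1):
            if slots[t] == 0:
                slots[t] = i + 1
                break
    selected = sorted(j for j in slots if j != 0)
    return selected, sum(profits[j - 1] for j in selected)

jobs, profit = job_sequencing([20, 15, 10, 5, 1], [2, 2, 1, 3, 3])
print(jobs, profit)   # [1, 2, 4] 40
```

On the worked example it schedules exactly the jobs {1, 2, 4} with profit 40, matching the feasibility table.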
  • 40. JOB SEQUENCING WITH DEADLINES Exercise Problems 1. n =4, D=(4, 1, 1, 1), P=(20, 10, 40, 30) 2. n =5, D=(2, 1, 2, 1, 3), P=(100, 19, 27, 25, 15) 3. n =7, D=(1, 3, 4, 3, 2, 1, 2), P=(3, 5, 20, 18, 1, 6, 30) 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 40
  • 41. MINIMUM COST SPANNING TREE  Let G = (V, E) be an undirected graph with vertex set V and edge set E. Then, “A Spanning Tree of an undirected connected graph is its connected acyclic sub graph (i.e., a tree) that contains all the vertices of the graph”  A spanning tree satisfies the property that for a given graph G, its spanning tree is a minimal sub graph G′ such that  V(G) = V(G′)  G′ is connected 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 41
  • 42. MINIMUM COST SPANNING TREE  A Minimal sub graph is one which has the fewest number of edges.  Sometimes, edges of a graph are assigned with some numerical values.  These values are referred to as Cost of the edges.  The Cost of a Tree is the sum of cost of all the edges in the tree. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 42
  • 43. MINIMUM COST SPANNING TREE Hence, “The Minimum Cost Spanning Tree (MST) for a graph G is the spanning tree of the given graph such that its cost is minimal.” Figure: a four-vertex graph on {a, b, c, d} with edge weights 1, 2, 3 and 5, and three of its spanning trees: T1 with w(T1) = 6, T2 with w(T2) = 9, and T3 with w(T3) = 8; T1 is the minimum spanning tree. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 43
  • 44. MINIMUM COST SPANNING TREE Prim’s Algorithm  This algorithm constructs the MST edge by edge.  All the edges that have been chosen to be part of the MST are stored in an edge set A.  Selection of an edge to be made part of a MST must satisfy the following conditions:  This edge results in a minimal cost sub graph.  The inclusion of this edge in the sub graph ensures that the sub graph remains a tree. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 44
  • 45. MINIMUM COST SPANNING TREE “If A is a set of edges selected so far, then A forms a tree. The next edge (u, v) to be included in A is a minimum cost edge not in A with the property that A U {(u, v)} is also a tree” Prim’s Algorithm 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 45
  • 46. MINIMUM COST SPANNING TREE Trace of Prim’s algorithm on a 7-vertex graph (|V| = 7) with edge weights 10, 25, 22, 24, 18, 12, 16, 14 and 28:
𝑉𝑇 = {1}, 𝐸𝑇 = {}
𝑉𝑇 = {1, 6}, 𝐸𝑇 = {(1,6)}
𝑉𝑇 = {1, 6, 5}, 𝐸𝑇 = {(1,6), (6,5)}
𝑉𝑇 = {1, 6, 5, 4}, 𝐸𝑇 = {(1,6), (6,5), (5,4)}
𝑉𝑇 = {1, 6, 5, 4, 3}, 𝐸𝑇 = {(1,6), (6,5), (5,4), (4,3)}
𝑉𝑇 = {1, 6, 5, 4, 3, 2}, 𝐸𝑇 = {(1,6), (6,5), (5,4), (4,3), (3,2)}
𝑉𝑇 = {1, 6, 5, 4, 3, 2, 7}, 𝐸𝑇 = {(1,6), (6,5), (5,4), (4,3), (3,2), (2,7)}
Minimal Cost Spanning Tree with cost = 99 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 46
  • 47. MINIMUM COST SPANNING TREE Example Find the MST of the graph on vertices {a, b, c, d, e, f} with edges bc=1, ef=2, ab=3, bf=4, cf=4, af=5, df=5, ae=6, cd=6 and de=8. In the solution for this problem, we use the following notation to represent a node: node_name(predecessor_tree_vertex, edge_cost) 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 47
  • 48. MINIMUM COST SPANNING TREE
Tree Vertices | Remaining Vertices
a(−, −)       | b(a,3), c(−,∞), d(−,∞), e(a,6), f(a,5)
b(a,3)        | c(b,1), d(−,∞), e(a,6), f(b,4)
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 48
  • 49. MINIMUM COST SPANNING TREE
Tree Vertices | Remaining Vertices
c(b,1)        | d(c,6), e(a,6), f(b,4)
f(b,4)        | d(f,5), e(f,2)
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 49
  • 50. MINIMUM COST SPANNING TREE
Tree Vertices | Remaining Vertices
e(f,2)        | d(f,5)
d(f,5)        | —
MST with cost = 15 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 50
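Prim’s algorithm can be sketched in Python with a priority queue (a hypothetical implementation; the edge list below is the one used in the trace above):

```python
import heapq

def prim(graph, start):
    """Prim's MST: grow the tree from `start`, always adding the cheapest
    edge that connects a tree vertex to a fringe vertex.

    `graph` maps each vertex to a list of (neighbour, weight) pairs.
    Returns (list of tree edges, total cost).
    """
    in_tree = {start}
    tree_edges, cost = [], 0
    # Heap of candidate edges as (weight, tree_vertex, fringe_vertex).
    heap = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(heap)
    while heap and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in in_tree:
            continue                  # stale entry: v already joined the tree
        in_tree.add(v)
        tree_edges.append((u, v))
        cost += w
        for nxt, wn in graph[v]:
            if nxt not in in_tree:
                heapq.heappush(heap, (wn, v, nxt))
    return tree_edges, cost

# Edge list of the example graph: bc=1, ef=2, ab=3, bf=4, cf=4, af=5, df=5, ae=6, cd=6, de=8
raw = [('b','c',1), ('e','f',2), ('a','b',3), ('b','f',4), ('c','f',4),
       ('a','f',5), ('d','f',5), ('a','e',6), ('c','d',6), ('d','e',8)]
graph = {v: [] for v in 'abcdef'}
for u, v, w in raw:
    graph[u].append((v, w))
    graph[v].append((u, w))

edges, cost = prim(graph, 'a')
print(cost)   # 15
```

Starting from a, it picks the same edges as the table (ab, bc, bf, fe, fd) for a total cost of 15.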
  • 51. MINIMUM COST SPANNING TREE Kruskal’s Algorithm  This is another approach to obtain a MST for a given graph.  The algorithm starts by arranging the edges of the given graph in ascending order.  Edges are then selected one at a time from this order, to be included in the MST.  An edge is included, if and only if it does not result in a cycle in the MST. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 51
  • 52. MINIMUM COST SPANNING TREE
ALGORITHM Kruskal(G)
// Input: A weighted connected graph G = ⟨V, E⟩
// Output: 𝑬𝑻, the set of edges composing a minimum spanning tree of G
sort E in nondecreasing order of the edge weights w(𝒆𝒊𝟏) ≤ … ≤ w(𝒆𝒊|𝑬|)
𝑬𝑻 ← ∅; ecounter ← 0 // initialize the set of tree edges and its size
k ← 0 // initialize the number of processed edges
while ecounter < |V| − 1 do
    k ← k + 1
    if 𝑬𝑻 ∪ {𝒆𝒊𝒌} is acyclic
        𝑬𝑻 ← 𝑬𝑻 ∪ {𝒆𝒊𝒌}; ecounter ← ecounter + 1
return 𝑬𝑻
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 52
  • 53. MINIMUM COST SPANNING TREE Example Find the MST for the graph using Kruskal’s algorithm. We first arrange the edges of the graph in ascending order of weight. The resulting order is: bc(1), ef(2), ab(3), bf(4), cf(4), af(5), df(5), ae(6), cd(6), de(8) 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 53
  • 54. MINIMUM COST SPANNING TREE Trace of Kruskal’s algorithm on the sorted edge list bc(1), ef(2), ab(3), bf(4), cf(4), af(5), df(5), ae(6), cd(6), de(8):  bc(1) is accepted.  ef(2) is accepted.  ab(3) is accepted.  bf(4) is accepted, joining the components {a, b, c} and {e, f}.  cf(4) and af(5) are rejected, as each would create a cycle.  df(5) is accepted, which gives |V| − 1 = 5 tree edges, so the algorithm stops. MST with cost = 15 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 54–56
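Kruskal’s algorithm can be sketched in Python (a hypothetical implementation; the acyclicity test uses a small union-find structure, which the following slides discuss in detail):

```python
def kruskal(n_vertices, edge_list):
    """Kruskal's MST: scan edges in nondecreasing weight order and accept an
    edge iff its endpoints lie in different components (no cycle is formed).

    `edge_list` holds (weight, u, v) triples. Returns (tree edges, total cost).
    """
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    tree, cost = [], 0
    for w, u, v in sorted(edge_list):       # nondecreasing edge weights
        ru, rv = find(u), find(v)
        if ru != rv:                        # acyclic: different components
            parent[rv] = ru
            tree.append((u, v))
            cost += w
            if len(tree) == n_vertices - 1:
                break
    return tree, cost

# The example graph's edge list.
edges = [(1,'b','c'), (2,'e','f'), (3,'a','b'), (4,'b','f'), (4,'c','f'),
         (5,'a','f'), (5,'d','f'), (6,'a','e'), (6,'c','d'), (8,'d','e')]
tree, cost = kruskal(6, edges)
print(cost)   # 15
```

On the example it accepts bc, ef, ab, bf and df and rejects cf and af, reproducing the MST of cost 15.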
  • 57. MINIMUM COST SPANNING TREE Practice Examples Find the MST for the given graphs using Prim’s and Kruskal’s algorithms. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 57
  • 58. MINIMUM COST SPANNING TREE Disjoint Subsets and Union-Find Algorithms  Union-Find algorithm is a strategy which is used to efficiently implement the Kruskal’s algorithm.  This strategy is based on the concept of Disjoint Subsets.  The strategy divides the given edge set 𝑬 into disjoint subsets, with each subset having just one edge.  It then combines these subsets one by one until a MST is obtained. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 58
  • 59. MINIMUM COST SPANNING TREE  The Union-Find algorithm performs the following operations:  makeset(x) - creates a one-element set {x}.  find(x) - returns a subset containing x.  union(x, y) - constructs the union of the disjoint subsets 𝑺𝒙 and 𝑺𝒚 containing x and y.  For example, let S = {1, 2, 3, 4, 5, 6}. Then makeset(i) creates the set { i } for all elements in the set, i.e., {1}, {2}, {3}, {4}, {5}, {6}  Performing union(1, 4) and union(5, 2) yields {1, 4}, {5, 2}, {3}, {6} 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 59
  • 60. MINIMUM COST SPANNING TREE  Now if we want to combine the subsets {1,4} and {5,2}, how do we call the Union() function?  There is a problem here because, Union() function accepts two values as parameters.  But here we have two subsets to be combined, each with two elements.  We cannot pass the whole subset as parameter to the Union() function as the function accepts single elements only.  Hence, we need to have a single element that represents a subset. We call this element, the Subset Representative. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 60
  • 61. MINIMUM COST SPANNING TREE  There are several logics followed to assign a representative for a subset.  The approach that we adopt is, the smallest element of a subset is its representative.  With this, we can comfortably merge the subsets {1,4} and {5,2} by making the call Union(1,2)  Here element 1 represents the subset {1,4} and element 2 represents subset {5,2}. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 61
  • 62. MINIMUM COST SPANNING TREE  We now have a look at various data structures that are used to implement the Union-Find algorithm.  We have 2 strategies for Union-Find based on the type of data structures used.  The strategies are  Quick Find  Quick Union  Quick Find uses arrays and linked list while Quick Union uses trees. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 62
  • 63. MINIMUM COST SPANNING TREE  Quick Find  This strategy maintains an array that contains information about the representative of each subsets.  The array indices are the elements of the set S.  The values at the respective indices are the representatives of the subset containing the element.  The strategy also maintains each subset as a linked list with header node.  The header node contains pointers to the first and last element of the list as well as information about the total number of nodes in the list. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 63
  • 64. MINIMUM COST SPANNING TREE  Consider the set S = {1, 2, 3, 4, 5, 6}  Initially, this set gets divided into the subsets {1}, {2}, {3}, {4}, {5}, {6}  Each subset is kept as a linked list: list i holds the single node i, and its header records size = 1 together with first and last pointers to that node. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 64
  • 65. MINIMUM COST SPANNING TREE  The array of subset representatives is,
Element index:   1 2 3 4 5 6
Representative:  1 2 3 4 5 6
 We will now see what happens to the array and the linked lists after the calls union(1,4) and union(2,5) are made. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 65
  • 66. MINIMUM COST SPANNING TREE After union(1,4) and union(2,5): list 1 has size 2 and holds the nodes 1 → 4; list 2 has size 2 and holds the nodes 2 → 5; lists 3 and 6 are unchanged with size 1 each.
Element index:   1 2 3 4 5 6
Representative:  1 2 3 1 2 6
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 66
  • 67. MINIMUM COST SPANNING TREE  Next, the calls union(1,2) and union(3,6) will result in the following scenario: list 1 has size 4 and holds the nodes 1 → 4 → 2 → 5; list 3 has size 2 and holds the nodes 3 → 6.
Element index:   1 2 3 4 5 6
Representative:  1 1 3 1 1 3
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 67
  • 68. MINIMUM COST SPANNING TREE  This way the process continues in Quick find strategy.  To summarize, in Quick Find approach,  makeset(x) involves creating a representative array in which the representative for each element is itself initially.  This operation also involves creating a linked list of one node for each subset.  find(x) involves retrieving x’s representative from the array. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 68
  • 69. MINIMUM COST SPANNING TREE  Quick Union  This approach uses Trees to implement the operations.  makeset(x) involves creating a tree of one node for all x in the set S.  The root of a tree is the representative of that subset.  Edges in the tree are directed from children to parents.  union(x,y) involves attaching the root of one tree to the root of the other. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 69
  • 70. MINIMUM COST SPANNING TREE  For a given set S = {1, 2, 3, 4, 5, 6}, makeset(x) creates six one-node trees: 1, 2, 3, 4, 5, 6.  union(1,4) and union(2,5) result in a tree rooted at 1 with child 4, and a tree rooted at 2 with child 5.  union(1,2) and union(3,6) result in a tree rooted at 1 with children 4 and 2 (where 2 keeps its child 5), and a tree rooted at 3 with child 6. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 70
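The quick-find bookkeeping described above can be sketched in Python (a hypothetical implementation using a dictionary in place of the representative array; as in the slides, the smallest element of a subset serves as its representative):

```python
class QuickFind:
    """Quick-find union-find: find is O(1), union rewrites representatives."""

    def __init__(self, elements):
        # makeset: every element starts as its own representative
        self.rep = {x: x for x in elements}

    def find(self, x):
        return self.rep[x]          # constant-time array lookup

    def union(self, x, y):
        # Merge the subsets containing x and y, keeping the smaller of the
        # two representatives for the combined subset.
        rx, ry = self.find(x), self.find(y)
        keep, drop = min(rx, ry), max(rx, ry)
        for e, r in self.rep.items():
            if r == drop:
                self.rep[e] = keep

uf = QuickFind([1, 2, 3, 4, 5, 6])
uf.union(1, 4)       # subsets: {1,4} {2} {3} {5} {6}
uf.union(5, 2)       # subsets: {1,4} {2,5} {3} {6}
uf.union(1, 2)       # subsets: {1,2,4,5} {3} {6}
print(uf.find(5))    # 1
```

After the three unions, the representative array reads (1, 1, 3, 1, 1, 6), mirroring the slides up to the final union(3,6).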
  • 71. DIJKSTRA’S ALGORITHM  This algorithm is used to solve the Single Source Shortest Path problem.  The problem is defined as, “For a given vertex called the source in a weighted connected graph, find shortest paths to all its other vertices.”  The Dijkstra’s algorithm works only on graphs with edges having non negative weights. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 71
  • 72. DIJKSTRA’S ALGORITHM  In this algorithm, we will have two categories of vertices:  Tree vertices  Fringe vertices  Tree vertices are those vertices of the given graph that are part of the shortest path tree.  Fringe vertices are the remaining vertices that are yet to be added to the shortest path tree.  We now get introduced to 2 notations that will be regularly used in the algorithm:  𝒅𝒊 - shortest distance from the source vertex to vertex i.  w(a, b) – weight of the edge between vertices a and b. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 72
  • 73. DIJKSTRA’S ALGORITHM The steps involved in finding the Single Source Shortest Paths for a given graph are: Step 1 – Identify the fringe vertex 𝒖∗ that is closest to the source. Move 𝒖∗ from the fringe to the set of tree vertices. Step 2 – For each remaining fringe vertex u that is connected to 𝒖∗ by an edge of weight w(𝒖∗, u) such that 𝒅𝒖∗ + w(𝒖∗, u) < 𝒅𝒖, update the labels of u to 𝒖∗ and 𝒅𝒖∗ + w(𝒖∗, u) respectively. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 73
  • 74. DIJKSTRA’S ALGORITHM  The vertex representation used in this algorithm is node_name(predecessor, cost of shortest path from source)  We now find the single source shortest paths for an example graph on the vertices {a, b, c, d, e} with source a; from the trace that follows, its edges include ab=3, ad=7, bc=4, bd=2 and de=4. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 74
  • 75. DIJKSTRA’S ALGORITHM
Tree Vertices | Fringe Vertices
a(−, 0)       | b(a, 3), c(−, ∞), d(a, 7), e(−, ∞)
b(a, 3)       | c(b, 7), d(b, 5), e(−, ∞)
d(b, 5)       | c(b, 7), e(d, 9)
9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 75
  • 76. DIJKSTRA’S ALGORITHM
Tree Vertices | Fringe Vertices
c(b, 7)       | e(d, 9)
e(d, 9)       | —
Single Source Shortest Paths are: a – b of length 3, a – b – c of length 7, a – b – d of length 5, a – b – d – e of length 9 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 76
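Dijkstra’s algorithm can be sketched in Python with a min-heap (a hypothetical implementation; the edge weights below are the ones readable off the trace above, which is an assumption about the original figure):

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's single-source shortest paths (non-negative edge weights).

    `graph` maps a vertex to a list of (neighbour, weight) pairs.
    Returns (distance dict, predecessor dict).
    """
    dist = {source: 0}
    pred = {source: None}
    heap = [(0, source)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue                      # stale queue entry
        done.add(u)
        for v, w in graph.get(u, []):
            if v not in done and d + w < dist.get(v, float('inf')):
                dist[v] = d + w           # relax the edge (u, v)
                pred[v] = u
                heapq.heappush(heap, (dist[v], v))
    return dist, pred

# Edges read off the worked example: ab=3, ad=7, bc=4, bd=2, de=4
raw = [('a','b',3), ('a','d',7), ('b','c',4), ('b','d',2), ('d','e',4)]
graph = {}
for u, v, w in raw:
    graph.setdefault(u, []).append((v, w))
    graph.setdefault(v, []).append((u, w))

dist, pred = dijkstra(graph, 'a')
```

With source a this yields distances b = 3, d = 5, c = 7 and e = 9 with predecessors a, b, b and d, exactly the shortest-path tree in the table.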
  • 77. DIJKSTRA’S ALGORITHM Algorithm trace (figure): for each iteration i = 0…4 on the example graph with source a, the slide tabulates the priority queue Q, the predecessor array 𝒑𝒗 and the distance array 𝒅𝒗 as the vertices a, b, d, c and e are moved into 𝑽𝑻, ending with distances (0, 3, 7, 5, 9) and predecessors (Ø, a, b, b, d) for (a, b, c, d, e). 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 77
  • 78. DIJKSTRA’S ALGORITHM Exercise Problems (figures): (i) the five-vertex example graph above, with source a; (ii) a twelve-vertex graph on {a, b, …, l} with the edge weights shown, with source a. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 78
  • 79. HUFFMAN TREES  Huffman trees are used for Encoding.  Encoding is the process of converting a given text/message into some other form.  This is accomplished by converting each character of the text into a sequence of bits.  The resulting string is called Codeword.  There are 2 encoding strategies:  Fixed length encoding  Variable length encoding 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 79
  • 80. HUFFMAN TREES Fixed Length Encoding  As the name says, each character in the text is replaced by a bit sequence of the same length.  ASCII code is an example(A=65, B=66, …., Z=90). Variable Length Encoding  Different characters are assigned with bit sequences of different length.  Frequently occurring characters are assigned with smaller length bit string while rarely occurring characters are assigned with a longer bit string. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 80
  • 81. HUFFMAN TREES Problem with Variable Length Encoding  Consider the encoding scheme where a=01, e =00, h=010, l=011, o=10, i=11.  With this, we encode the text hello as 0100001101110.  The problem lies in decoding this bit string.  Since it is a variable length encoding, every character will have bit strings of different length.  Hence finding the beginning or end of bit string for a character becomes difficult.  The above bit string could also be decoded as aeeilo. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 81
  • 82. HUFFMAN TREES Solution  We now need a solution such that, even with variable length encoding, we can clearly identify the beginning and end of each character’s code.  For this, we adopt a strategy called Prefix Code or Prefix-free Code.  As the name says, no character’s codeword is a prefix of any other character’s codeword.  This means a codeword can be recognized unambiguously the moment its last bit is read.  Now the question is, how can such a strategy be implemented? 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 82
  • 83. HUFFMAN TREES  This is accomplished using Trees, specifically Binary Trees.  We construct a binary tree such that,  Leaf nodes represent characters of the text.  The edges from the root to a leaf node represent the bit code for the character in that leaf node.  All the left edges are labeled 0 and all the right edges are labeled 1. This strategy is presented next. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 83
  • 84. HUFFMAN TREES  Figure: a binary tree whose leaves are the characters A, E, H, L and O, with every left edge labeled 0 and every right edge labeled 1. The codes for the characters are A = 00, E = 010, H = 011, L = 10, O = 11  With this scheme, decoding the bit string 011010101011 will only result in HELLO. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 84
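Prefix-freeness is exactly what makes this decoding unambiguous; a minimal Python sketch using the code table from the slide:

```python
# Because the code is prefix-free, reading bits left to right and emitting a
# character as soon as the accumulated bits match a codeword can never
# misparse the stream.
code = {'A': '00', 'E': '010', 'H': '011', 'L': '10', 'O': '11'}
decode_table = {bits: ch for ch, bits in code.items()}

def decode(bitstring):
    out, current = [], ''
    for bit in bitstring:
        current += bit
        if current in decode_table:       # a complete codeword has been read
            out.append(decode_table[current])
            current = ''
    return ''.join(out)

print(decode('011010101011'))   # HELLO
```

The same loop on the earlier non-prefix-free code (a=01, e=00, h=010, …) would be ambiguous, which is why the prefix property matters.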
  • 85. HUFFMAN TREES  A formal approach of constructing such a binary tree was proposed by Huffman and is presented in the following algorithm. Huffman’s Algorithm Step 1:  Initialize n one-node trees and label them with the symbols of the alphabet given.  Record the frequency of each symbol in its tree’s root to indicate the tree’s weight. (More generally, the weight of a tree will be equal to the sum of the frequencies in the tree’s leaves.) Step 2: Repeat the following steps until a single tree is obtained.  Find two trees with the smallest weights.  Make them the left and right sub trees of a new tree and record the sum of their weights in the root of the new tree as its weight. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 85
  • 86. HUFFMAN TREES  A tree constructed in this way is known as a Huffman tree.  The code generated by such a tree is known as Huffman Code. Example Consider the five-symbol alphabet {A, B, C, D, _} with the following occurrence frequencies in a text made up of these symbols: symbol A B C D _ frequency 0.35 0.1 0.2 0.2 0.15 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 86
  • 87. HUFFMAN TREES Step 1  We start by creating n one node trees.  Since there are 5 characters, we will have 5 trees, A B C D _  We now assign the frequencies of these characters as their node weights. A B C D _ 0.35 0.1 0.2 0.2 0.15 Step 2  We need to select 2 trees with least weights. For this we first arrange these trees in the increasing order of weights. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 87
  • 88. HUFFMAN TREES  We arrange the trees in increasing order of weight: B(0.1), _(0.15), C(0.2), D(0.2), A(0.35).  We now select the two trees with the least weights, B and _, and merge them into one tree.  The total weight of the resulting tree, 0.25, is recorded in its root, leaving the trees C(0.2), D(0.2), {B, _}(0.25) and A(0.35). 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 88
  • 90. HUFFMAN TREES Final Huffman tree: B(0.1) and _(0.15) merge into a tree of weight 0.25; C(0.2) and D(0.2) merge into a tree of weight 0.4; the 0.25 tree and A(0.35) merge into a tree of weight 0.6; finally the 0.4 and 0.6 trees merge into the root of weight 1.0. Labeling left edges 0 and right edges 1 gives the Huffman code C = 00, D = 01, B = 100, _ = 101, A = 11. 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 90
  • 91. HUFFMAN TREES  The expected number of bits per character using the Huffman approach is given by Σ_{i=1..n} (no. of bits for the i-th character) × (frequency of the i-th character)  For our example, Expected no. of bits = 2*0.35 + 3*0.1 + 2*0.2 + 2*0.2 + 3*0.15 = 2.25 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 91
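Huffman’s two-step algorithm can be sketched in Python with a min-heap (a hypothetical implementation; the tie-break counter just keeps heap entries comparable when weights are equal):

```python
import heapq

def huffman_codes(freq):
    """Build a Huffman tree bottom-up and return the codeword per symbol."""
    # Heap entries are (weight, tiebreak, tree); a tree is either a symbol
    # (leaf) or a (left, right) pair (internal node).
    heap = [(w, i, sym) for i, (sym, w) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)        # two trees of smallest weight
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):            # left edge = 0, right edge = 1
            walk(tree[0], prefix + '0')
            walk(tree[1], prefix + '1')
        else:
            codes[tree] = prefix or '0'        # single-symbol alphabet edge case
    walk(heap[0][2], '')
    return codes

freq = {'A': 0.35, 'B': 0.1, 'C': 0.2, 'D': 0.2, '_': 0.15}
codes = huffman_codes(freq)
avg = sum(len(codes[s]) * f for s, f in freq.items())
print(round(avg, 2))   # 2.25
```

For the five-symbol example it produces codeword lengths (2, 3, 2, 2, 3) for (A, B, C, D, _), so the expected number of bits per character comes out to 2.25, matching the computation above.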
  • 92. THANK YOU 9/24/2023 Dr. K. Balakrishnan, Dept. of CSE, SaIT, 92

Editor's Notes

  1. In the above algorithm, Select() – this function selects a subset from a[1:n] and assign this subset to x. Feasible() – this function determines whether the subset x satisfies the problem conditions. In other words, it checks if x is a feasible solution. If so it returns true, otherwise returns false. Union() – this function combines x with the existing set of solutions.
  2. Lemma 1 This observation is self explanatory. It clearly says that the weights of all the objects put together will not exceed the capacity of the knapsack. If this is the case, then we are free to select all the objects as a whole without the fear of exceeding the knapsack capacity. Hence 𝒙𝒊 = 1 for 1 ≤ i ≤ n is definitely true for this Lemma. Lemma 2 This lemma means that an optimal solution to the knapsack problem is one which fills the knapsack exactly to its capacity.
  3. NOTE: k is the index position which is the point of difference between X and Y. This means both X and Y objects have the same values until index position k-1. They differ only at the kth position.