3. Last Week…
Divide and conquer is an algorithm design paradigm.
• Divide: breaking down a problem into two or more sub-problems of the same (or related) type, until
• Conquer: these sub-problems become simple enough to be solved directly.
• Combine: the solutions to the sub-problems are then combined to give a solution to the original problem.
• The base case for the recursion is sub-problems of constant size.
• Classic examples: searching algorithms (Binary Search), sorting algorithms (Merge Sort, Quick Sort)
4. Binary Search
• Divide: break the problem into sub-problems of the same (or related) type, until
• Conquer: the sub-problems become simple enough to be solved directly.
At each step of the algorithm, we divide the array into two halves.
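The halving step can be sketched as follows (a minimal iterative version; the function name is mine):

```python
def binary_search(arr, target):
    """Search a sorted list by repeatedly halving the search range.
    Each step discards one half, giving O(log n) comparisons."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid          # found at index mid
        elif arr[mid] < target:
            lo = mid + 1        # target can only be in the right half
        else:
            hi = mid - 1        # target can only be in the left half
    return -1                   # not found

print(binary_search([2, 5, 8, 12, 16, 23, 38], 23))  # 5
```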
5. Merge Sort
• Breaking down a list into several sub-lists until each sub-list consists of a single element
• Merging those sub-lists in a manner that results in a sorted list
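These two steps can be sketched as follows (a minimal version; names are mine):

```python
def merge_sort(lst):
    """Divide: split the list in half; conquer: sort each half
    recursively; combine: merge the two sorted halves."""
    if len(lst) <= 1:                 # base case: a single element is sorted
        return lst
    mid = len(lst) // 2
    left = merge_sort(lst[:mid])
    right = merge_sort(lst[mid:])
    # Merge the two sorted halves into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])           # append whatever remains
    merged.extend(right[j:])
    return merged
```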
6. Quick Sort
Selecting a “pivot” element from the array and partitioning the other elements into two sub-arrays, according to whether they are less than or greater than the pivot.
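This partitioning scheme can be sketched as follows (using the first element as the pivot; names are mine):

```python
def quick_sort(lst):
    """Pick a pivot, partition the rest into smaller and larger
    elements, then recursively sort and concatenate the parts."""
    if len(lst) <= 1:                 # base case
        return lst
    pivot = lst[0]
    less = [x for x in lst[1:] if x < pivot]
    greater = [x for x in lst[1:] if x >= pivot]
    return quick_sort(less) + [pivot] + quick_sort(greater)
```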
7. Generic form of Divide and Conquer Recurrences
• Divide and conquer algorithms often conform to the following generic form of recurrence:
T(n) = a·T(n/b) + f(n), where a ≥ 1 and b > 1 are constants
• With respect to divide and conquer, the terms signify the following:
ü n : size of input
ü a : number of sub-problems
ü n/b : size of each sub-problem
ü f(n) : cost of work outside the recursion
• For the Binary Search algorithm, the recurrence relation is T(n) = 1·T(n/2) + Θ(1):
- we divided the array in two halves (b = 2),
- but searched in only one of the halves (a = 1).
Analysis can be done using recurrence equations.
10. Basic Algorithm Design Techniques
• Divide and Conquer
• Greedy
• Dynamic Programming
1. Greedy Method
• Divide and Conquer is useful for problems that can be divided into smaller, independent subproblems.
• Greedy is suitable for problems where making the locally optimal choice at each stage leads to a globally optimal solution.
• Dynamic Programming is effective for problems with overlapping subproblems and optimal substructure, allowing the reuse of previously computed solutions.
The choice of algorithm depends on the specific characteristics of the problem at hand.
11. • A greedy algorithm is a simple, intuitive, and efficient approach to solving optimization problems.
• In a greedy algorithm, we make a series of choices, each choice being the one that appears to be the best at
that moment, without considering the potential consequences of that choice on future decisions.
• The key characteristic of a greedy algorithm is that it makes locally optimal choices, hoping that these
choices will lead to a globally optimal solution.
• However, it is important to note that not all problems can be solved optimally using a greedy approach.
• Greedy algorithms may not always guarantee the best possible solution, and they can sometimes lead to
suboptimal outcomes.
• It is essential to carefully analyze the problem and the chosen criteria to ensure that a greedy algorithm is an
appropriate and effective solution.
12. The general steps involved in designing and implementing a greedy algorithm: Initialization, Selection,
Feasibility, Termination.
1. Initialization: Start with an empty solution or an initial solution.
2. Selection: At each step, choose the best available option based on a specific criterion. This choice
should be locally optimal, meaning it looks good at the current step.
3. Feasibility: Ensure that the selected option satisfies the problem’s constraints.
4. Termination: Continue making choices until a stopping condition is met, such as reaching a desired
state or having no more options to consider.
13. This method is used for solving optimization problems.
Optimization problems: problems that require either a
minimum result or a maximum result.
Problem: Travel from location A to location B.
Constraint (condition given in the problem): complete the
travel within 12 hours.
Objective: To cover this journey at minimum cost.
(minimization problem)
Feasible solutions: solutions that satisfy the constraints.
Optimal solution: a feasible solution that also satisfies
the objective of the problem (either maximum or minimum result).
There is more than one solution for travelling from A to B.
Feasible solutions: s4, s5
Optimal solution: s4
Note: In this example there is exactly one optimal solution.
14. Suppose you want to hire a person in a company;
• Technical test, group discussion, technical interview
• You filter the candidates step by step.
• According to this selection procedure, the one who remains is the best person.
• You got the optimal solution.
• The approach is greedy.
Suppose you want to buy the best (optimal) car;
v In terms of features, the car with the maximum cost may be optimal
One solution:
• How to choose the best car?
- Look at all the brands and all the models of car available in the city.
- After checking all the cars in the city, conclude that this one is the best.
- A time-consuming and costly process.
Other solution:
• Check the features.
• Select based on brand (TOGG), then select among the top models;
a well-known car → the best car.
• Is it the best car in the world? No.
• It is our own method of selection. It is the best car for me.
• We used a greedy approach (without checking all the cars in the city).
15. 1. Coin Changing Problem: Given a set of coin denominations and an amount
to make change for, find the minimum number of coins needed to make that
amount.
2. Fractional Knapsack Problem: Given a set of items, each with a weight
and a value, and a knapsack with a maximum weight capacity, select a
combination of items to maximize the total value while staying within the
weight limit.
3. Huffman Coding: Given a set of characters and their frequencies in a text,
construct a binary tree to encode the characters in a way that minimizes the
total length of the encoded message. (Using shorter binary representations
for characters with high frequencies)
2. Sample Greedy Algorithms
16. 4. Minimum Spanning Tree: Spans all the vertices in a connected, undirected graph with the minimum
possible sum of edge weights.
Prim’s Algorithm
- Find the minimum spanning tree of a connected, undirected graph with weighted edges.
- This algorithm is used to design efficient network layouts.
Kruskal’s Algorithm
- Another approach to finding a minimum spanning tree in a graph.
- It focuses on selecting the smallest weighted edges while avoiding cycles.
5. Shortest Path: Find the shortest path from a source node to all other nodes in a weighted graph. It's
commonly used in routing and network protocols.
Dijkstra's Algorithm
6. Interval Scheduling: Given a set of intervals, find the maximum number of non-overlapping intervals that
can be selected. (Choosing the activities that have the smallest number of conflicts)
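One standard greedy rule for this problem sorts the intervals by finish time and repeatedly takes the earliest-finishing interval compatible with the choices so far; a minimal sketch (names are mine):

```python
def max_non_overlapping(intervals):
    """Greedy interval scheduling: sort by finish time and take each
    interval that starts no earlier than the last chosen finish."""
    chosen = []
    last_finish = float("-inf")
    for start, finish in sorted(intervals, key=lambda iv: iv[1]):
        if start >= last_finish:       # feasible: does not overlap
            chosen.append((start, finish))
            last_finish = finish
    return chosen
```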
17. Suppose we have the following coin denominations: 25 cents, 10 cents, 5 cents, and 1 cent.
Problem: Find the minimum number of coins needed to make change for n cents.
A greedy algorithm
• Construct the solution coin by coin, reducing the amount at each step.
• Greedy choice: at each step, choose the coin of the largest denomination that does not exceed the remaining
amount.
Example: n = 89 cents. What is the optimal solution? Answer: 8 coins (three 25-cent coins, one 10-cent coin, and four 1-cent coins).
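The greedy choice above can be sketched as follows (names are mine; note that this greedy rule is optimal for these denominations, but not for arbitrary coin systems):

```python
def greedy_change(amount, denominations=(25, 10, 5, 1)):
    """Greedy coin changing: at each step take the largest
    denomination that does not exceed the remaining amount."""
    coins = []
    for coin in sorted(denominations, reverse=True):
        while amount >= coin:
            coins.append(coin)
            amount -= coin
    return coins

print(greedy_change(89))  # [25, 25, 25, 10, 1, 1, 1, 1] -> 8 coins
```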
1. Coin Changing Problem
18. 2. Fractional Knapsack Problem
• There is a knapsack that can hold items of total weight at most W.
• There are now n items with weights w1,w2,…, wn.
• Each item also has a value (profit) v1,v2,…,vn.
• The items are infinitely divisible: can put ½ (or any fraction) of an item into the knapsack.
Goal: Select fractions p1, p2,…,pn such that
Capacity constraint: p1w1+p2w2+…+pnwn <= W
Maximum value: p1v1+p2v2+…+pnvn maximized
19. 2. Fractional Knapsack Problem
Greedy choice: pick the item with the highest profit-by-weight ratio first, so that
ü Capacity constraint: p1w1+p2w2+…+pnwn <= W
ü Maximum value: p1v1+p2v2+…+pnvn maximized
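This ratio-based greedy can be sketched as follows (names are mine):

```python
def fractional_knapsack(weights, values, W):
    """Greedy fractional knapsack: take items in decreasing order of
    value-to-weight ratio, splitting the last item if it does not fit."""
    items = sorted(zip(weights, values),
                   key=lambda wv: wv[1] / wv[0], reverse=True)
    total_value = 0.0
    remaining = W
    for w, v in items:
        if remaining <= 0:
            break
        take = min(w, remaining)        # weight of this item we can take
        total_value += v * (take / w)   # proportional share of its value
        remaining -= take
    return total_value
```

Because items are divisible, this greedy choice is provably optimal for the fractional variant (unlike the 0/1 knapsack problem).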
20. 3. Huffman Coding
ü Huffman coding is a data compression algorithm. (Lossless data compression)
ü A binary tree structure is used to perform compression.
ü The primary goal of Huffman coding is to efficiently represent a set of characters or symbols with
variable-length codes.
ü Huffman coding is a technique where
- frequently used characters are encoded with shorter codes,
- less frequently used characters are encoded with longer codes.
• Each character is encoded as 8 bits in ASCII encoding.
• Instead of giving a fixed length to each character, we can give shorter codes to frequently used
characters and longer codes to rarely used characters.
• The overall size of the message can be shortened.
21. • Let’s consider that we want to code this message:
"susie says it is easy.”
• A total of 9 different characters are used.
• Their frequencies of use are as follows:
s: 6, (space): 4, i: 3, e: 2, a: 2, y: 2, u: 1, t: 1, (.): 1
• When this problem is solved with Huffman coding, each character is
assigned a variable-length code, and the entire message is encoded by
concatenating these codes.
Fixed sized vs. Huffman
o 9 different characters in the message.
o With fixed-size codes, a 4-bit code would be necessary for each
character (2^4 = 16 ≥ 9).
Fixed-size coding: 22 × 4 = 88 bits
Huffman coding: 65 bits
Gain: 23 bits
Memory saving: 23/88 ≈ 26%
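The total encoded length can be computed without building the tree explicitly: repeatedly merge the two least frequent subtrees, and each merge adds its combined frequency to the total. A sketch using the frequencies of the example message (names are mine):

```python
import heapq

def huffman_cost(freqs):
    """Total encoded length in bits: greedily merge the two smallest
    frequencies; every merge contributes its sum to the final cost."""
    heap = list(freqs)
    heapq.heapify(heap)
    cost = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)   # two least frequent subtrees
        b = heapq.heappop(heap)
        cost += a + b             # each symbol below gains one more bit
        heapq.heappush(heap, a + b)
    return cost

# "susie says it is easy." : s:6, space:4, i:3, e:2, a:2, y:2, u:1, t:1, '.':1
print(huffman_cost([6, 4, 3, 2, 2, 2, 1, 1, 1]))  # 65
```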
23. • A tree is a connected acyclic graph.
• A spanning tree of an undirected graph G is a subgraph of G that is a tree containing all the vertices of G.
• In a weighted graph, the weight of a subgraph is the sum of the weights of the edges in the subgraph.
• A minimum spanning tree (MST) for a weighted undirected graph is a spanning tree with minimum
weight.
4. Minimum Spanning Tree
(All vertices, connected, no cycles.)
24. 4. Minimum Spanning Tree
G = (V,E)
V = { 1, 2, 3, 4, 5, 6 }
E = {(1,2), (2,3), (3,4), (4, 5), (5,6), (6,1)}
• Spanning tree is a subgraph of a graph.
S ⊆ G
S = (V’, E’)
V’ = V
∣E’∣ = ∣V∣ - 1
How many different spanning trees are possible? 6 (the graph is a 6-cycle; removing any one edge yields a spanning tree)
Sample minimum spanning trees
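The count can be verified by brute force: a spanning tree of this graph is any connected subset of |V| − 1 = 5 edges (a sketch; names are mine, connectivity is checked with a simple union-find):

```python
from itertools import combinations

def count_spanning_trees(n, edges):
    """Count spanning trees by testing every (n-1)-edge subset
    for connectivity over vertices 1..n."""
    def connects_all(subset):
        parent = list(range(n + 1))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x
        for u, v in subset:
            parent[find(u)] = find(v)           # union the components
        return len({find(v) for v in range(1, n + 1)}) == 1
    return sum(1 for s in combinations(edges, n - 1) if connects_all(s))

edges = [(1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 1)]
print(count_spanning_trees(6, edges))  # 6
```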
25. 4. Minimum Spanning Tree
• The solution has to be a minimally connected graph (i.e. a graph without any cycles) or a tree,
• It must cover all the vertices of the graph (so that every house can be reached from every other house) - a spanning tree.
• Further the sum of weights of the edges (the cost of paving) should be the lowest - a Minimum Spanning Tree.
26. • This problem can be represented by a graph.
1. Enough streets must be paved so that it is possible for everyone to travel from their house to anyone
else’s house along paved roads.
2. The paving should cost as little as possible.
3. The number of paving stones between each house represents the cost of paving that route.
Find the best route that connects all the houses, but uses as few paving stones as possible.
Two well-known algorithms for finding a minimum spanning tree are
ü Kruskal’s algorithm
ü Prim’s algorithm
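Kruskal's rule, taking the smallest weighted edges while avoiding cycles, can be sketched with a simple union-find (names are mine; edges are `(weight, u, v)` tuples over vertices 0..n−1):

```python
def kruskal(n, edges):
    """Kruskal's MST: scan edges in increasing weight order and keep
    an edge iff it joins two different components (no cycle formed)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    mst, total = [], 0
    for w, u, v in sorted(edges):           # smallest weight first
        ru, rv = find(u), find(v)
        if ru != rv:                        # different components: safe
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return mst, total
```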
28. 4. Minimum Spanning Tree
Prim’s Algorithm:
• Start with the minimum cost edge.
• At each step, select the minimum cost edge of the graph that connects the already selected vertices to a vertex not yet in the tree.
Consider a graph G = (V, E);
Let T be a tree consisting of only the starting vertex x;
while (T has fewer than ∣V∣ vertices)
{
find the smallest edge connecting T to G−T;
add it to T;
}
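The pseudocode above can be sketched in runnable form with a priority queue (names and the adjacency-list format are mine; `graph` maps each vertex to a list of `(weight, neighbor)` pairs):

```python
import heapq

def prim(graph, start):
    """Prim's MST: grow tree T from `start`, always adding the smallest
    edge that connects T to the rest of the graph (G - T).
    Returns the total weight of the minimum spanning tree."""
    visited = {start}
    heap = list(graph[start])            # candidate edges leaving T
    heapq.heapify(heap)
    total = 0
    while heap and len(visited) < len(graph):
        w, v = heapq.heappop(heap)       # smallest edge leaving T
        if v in visited:                 # stale edge, both ends in T
            continue
        visited.add(v)
        total += w
        for edge in graph[v]:            # new candidate edges
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return total
```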
30. • The minimum cost spanning tree is unique for a connected graph if all edge weights are distinct.
• If there are multiple edges with the same weight, the minimum spanning tree may not be unique.
• Both Kruskal's and Prim's algorithms are guaranteed to find a minimum spanning tree.