Module 3
Greedy technique
Define greedy technique
• The greedy method is one of the strategies, like divide and
conquer, used to solve problems. It is used for solving
optimization problems, i.e., problems that demand either a
maximum or a minimum result.
• The greedy method is the simplest and most straightforward
approach. It is not a single algorithm, but a technique. Its main
feature is that each decision is taken on the basis of the
currently available information, without worrying about the
effect of that decision on future choices.
continued
• This technique is used to determine a feasible solution, which
may or may not be optimal. A feasible solution is any solution
that satisfies the given constraints. If more than one solution
satisfies the constraints, all of them are feasible; the optimal
solution is the best (most favourable) solution among all the
feasible ones.
Applications of Greedy Algorithm
• It is used for finding shortest paths.
• It is used to find the minimum spanning tree, using Prim's
algorithm or Kruskal's algorithm.
• It is used in job sequencing with deadlines.
• It is also used to solve the fractional knapsack
problem.
Example
The Knapsack problem
• Given a knapsack of maximum
capacity W and N items, each
with its own value and weight,
choose items to put into the
knapsack so that the total value
of its contents is maximized.
The fundamental idea behind the whole family of knapsack
problems is the selection of some items, each with a profit and a
weight, to be packed into one or more knapsacks of limited
capacity. The knapsack problem has two versions, as
follows:
 Fractional Knapsack Problem
 0/1 Knapsack Problem
What is Knapsack Problem Using Greedy Method?
In this method, the knapsack is filled so that its capacity is used as fully as possible and the
maximum profit is earned from it. The knapsack problem using the greedy method is stated as:
Given a list of n objects, say {I1, I2, ……, In}, and a knapsack (or bag).
The capacity of the knapsack is M.
Each object Ij has a weight wj and a profit pj.
If a fraction xj (where 0 ≤ xj ≤ 1) of object Ij is placed into the knapsack, then a profit of pj·xj is earned.
The objective is to fill the knapsack (up to its maximum capacity M) so as to maximize the total profit
earned.
Mathematically: maximize Σ pj·xj subject to Σ wj·xj ≤ M and 0 ≤ xj ≤ 1 for all j.
Knapsack Problem Algorithm Using Greedy Method
Knapsack Problem Using Greedy Method Pseudocode
A pseudo-code for solving the fractional knapsack problem using the greedy method is:
greedy fractional-knapsack (P[1...n], W[1...n], X[1...n], M)
/* P[1...n] and W[1...n] contain the profits and weights of the n objects, ordered so that
P[j]/W[j] is non-increasing; X[1...n] is the solution vector and M is the knapsack capacity */
{
For j ← 1 to n do
X[j] ← 0
profit ← 0 // total profit of items placed in the knapsack
weight ← 0 // total weight of items placed in the knapsack
j ← 1
While (weight < M and j ≤ n) do // M is the knapsack capacity
If (weight + W[j] ≤ M) then
X[j] ← 1; weight ← weight + W[j]
Else
X[j] ← (M - weight) / W[j]; weight ← M
profit ← profit + P[j] * X[j]
j ← j + 1
return (X, profit)
}
Fractional Knapsack Problem Using Greedy Method
The fractional knapsack problem is solved using the greedy method in the following steps (a code sketch follows the steps)-
Step-01:
For each item, compute its value / weight ratio.
Step-02:
Arrange all the items in decreasing order of their value / weight ratio
Step-03:
Start putting the items into the knapsack beginning from the item with the highest ratio.
Put as many items as you can into the knapsack.
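The steps above can be sketched in Python as follows. This is a minimal illustration only; the function name fractional_knapsack and the list-based input format are assumptions, not part of the original slides.

def fractional_knapsack(profits, weights, capacity):
    """Greedy fractional knapsack: fill by decreasing profit/weight ratio.

    Returns (total_profit, solution), where solution[j] is the fraction
    x_j of item j placed in the knapsack (0 <= x_j <= 1).
    """
    n = len(profits)
    # Step-01 / Step-02: order item indices by decreasing profit/weight ratio.
    order = sorted(range(n), key=lambda j: profits[j] / weights[j], reverse=True)
    solution = [0.0] * n
    total_profit = 0.0
    remaining = capacity
    # Step-03: take whole items while they fit, then a fraction of the next one.
    for j in order:
        if remaining <= 0:
            break
        take = min(1.0, remaining / weights[j])
        solution[j] = take
        total_profit += take * profits[j]
        remaining -= take * weights[j]
    return total_profit, solution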
Example
Knapsack Problem - 1
Obtain the optimal solution for the knapsack
problem using greedy method given the
following:
M = 15
n = 7
p1,p2,p3,p4,p5,p6,p7 = 10,5,15,7,6,18,3
w1,w2,w3,w4,w5,w6,w7= 2,3,5,7,1,4,1
Method 1 (select items in order of decreasing profit):
Solution Vector = (1, 0, 1, 4/7, 0, 1, 0) = (1, 0, 1, 0.57, 0, 1, 0)
The solution obtained by this method is (x1, x2, x3, x4, x5, x6, x7) = (1, 0, 1, 0.57, 0, 1, 0)
with profit = 47
Method 2 (select items in order of increasing weight):
Solution Vector = (1, 1, 4/5, 0, 1, 1, 1) = (1, 1, 0.8, 0, 1, 1, 1)
The solution obtained by this method is (x1, x2, x3, x4, x5, x6, x7) = (1, 1, 0.8, 0, 1, 1, 1) with profit = 54
An optimal solution is not guaranteed by methods 1 and 2.
Method 3 (select items in order of decreasing profit/weight ratio):
Solution Vector = (1, 2/3, 1, 0, 1, 1, 1) = (1, 0.67, 1, 0, 1, 1, 1)
The optimal solution is (x1, x2, x3, x4, x5, x6, x7) = (1, 0.67, 1, 0, 1, 1, 1) with
profit = [1*10 + (2/3)*5 + 1*15 + 0*7 + 1*6 + 1*18 + 1*3] ≈ 55.33 and
weight = [1*2 + (2/3)*3 + 1*5 + 0*7 + 1*1 + 1*4 + 1*1] = 15.
Selecting items by decreasing profit/weight ratio (method 3) always yields an optimal solution for the fractional knapsack problem.
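Assuming the fractional_knapsack sketch given after the three steps, the instance above can be checked as follows; the result agrees with the hand calculation.

profits = [10, 5, 15, 7, 6, 18, 3]
weights = [2, 3, 5, 7, 1, 4, 1]

profit, x = fractional_knapsack(profits, weights, capacity=15)
print(round(profit, 2))           # 55.33
print([round(f, 2) for f in x])   # [1.0, 0.67, 1.0, 0.0, 1.0, 1.0, 1.0]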
Problem Statement
In the job sequencing problem, the objective is to find a sequence of jobs
that are completed within their deadlines and give maximum profit.
Solution
Let us consider a set of n given jobs, each associated with a deadline; a profit is
earned only if the job is completed by its deadline. These jobs need to be ordered in such a way
that the total profit is maximum.
It may happen that not all of the given jobs can be completed within their deadlines.
Assume the deadline of the ith job Ji is di and the profit received from this job is pi. The
optimal solution of this algorithm is a feasible solution with maximum profit.
Thus, D(i) > 0 for 1 ≤ i ≤ n.
Initially, these jobs are ordered according to profit, i.e. p1 ≥ p2 ≥ p3 ≥ ... ≥ pn.
Algorithm: Job-Sequencing-With-Deadline (D,
J, n, k)
D(0) := J(0) := 0
k := 1
J(1) := 1 // means first job is selected
for i = 2 … n do
r := k
while D(J(r)) > D(i) and D(J(r)) ≠ r do
r := r – 1
if D(J(r)) ≤ D(i) and D(i) > r then
for l = k … r + 1 by -1 do
J(l + 1) := J(l)
J(r + 1) := i
k := k + 1
Analysis
In this algorithm, we use two nested loops. Hence, the
complexity of this algorithm is O(n²).
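As an illustration, here is a direct Python rendering of the pseudocode above. It is a sketch only; the function name job_sequencing and the indexing conventions are assumptions, not from the slides. It assumes the jobs have already been sorted by non-increasing profit, and deadline[i] (1-indexed) is the deadline of the i-th most profitable job.

def job_sequencing(deadline, n):
    """Greedy job sequencing with deadlines (jobs pre-sorted by profit).

    Returns the list of selected job indices (positions in the
    profit-sorted order, 1-indexed).
    """
    d = [0] + list(deadline)      # d[0] = 0 acts as a sentinel deadline
    J = [0] * (n + 2)             # J[0] = 0 sentinel; J[1..k] holds selected jobs
    k = 1
    J[1] = 1                      # the most profitable job is always selected
    for i in range(2, n + 1):
        # Find the position r so that selected jobs stay sorted by deadline.
        r = k
        while d[J[r]] > d[i] and d[J[r]] != r:
            r -= 1
        # Job i is feasible if it can be scheduled after position r.
        if d[J[r]] <= d[i] and d[i] > r:
            for l in range(k, r, -1):   # shift later jobs one slot to the right
                J[l + 1] = J[l]
            J[r + 1] = i
            k += 1
    return J[1:k + 1]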
Example:
We are given the jobs, their deadlines and associated profits as shown-
Jobs J1 J2 J3 J4 J5 J6
Deadlines 5 3 3 2 4 2
Profits 201 181 191 301 121 101
Step-01:
Firstly, we need to sort all the given jobs in decreasing order of their profit as follows.
Jobs J4 J1 J3 J2 J5 J6
Deadlines 2 5 3 3 4 2
Profits 301 201 191 181 121 101
Step-02:
We find the value of the maximum deadline among all the jobs.
Here, the value of the maximum deadline is 5.
So, we draw a Gantt chart, a timeline of 5 unit-time slots numbered 1 to 5, on which the jobs
will be placed.
Now,
We consider each job one by one, in the same order as they appear after Step-01.
Each job is placed on the Gantt chart as far from time 0 as possible, i.e., in the latest free slot on or before its deadline.
Step-03:
We first consider job J4.
Since the deadline of J4 is 2, we place it in the latest empty slot on or before time 2,
i.e., slot 2.
Step-04:
Next, we consider job J1.
Since the deadline of J1 is 5, we place it in the latest empty slot on or before time 5,
i.e., slot 5.
Step-05:
We now consider job J3.
Since the deadline of J3 is 3, we place it in the latest empty slot on or before
time 3, i.e., slot 3.
Step-06:
Next, we consider job J2.
Its deadline is also 3, and slots 2 and 3 are already occupied, so J2 goes into slot 1.
Step-07:
Now, we consider job J5.
Since the deadline of J5 is 4, we place it in the latest empty slot on or before
time 4, i.e., slot 4.
Now,
the only job left is J6, whose deadline is 2.
Since all the slots on or before time 2 are already occupied, J6 cannot be completed.
The results can now be summarized as follows:
Part-01:
The optimal schedule is-
Job2, Job4, Job3, Job5, Job1
This is the order in which the jobs must be completed in order to obtain the maximum
profit.
Part-02:
As we can observe, not all jobs are completed in the optimal schedule.
This is because job J6 could not be completed within its deadline.
Part-03:
Maximum earned profit = Sum of the profit of all the jobs from the optimal
schedule
= Profit of job2 + Profit of job4 + Profit of job3 + Profit of job5 + Profit of job1
= 181 + 301 + 191 + 121 + 201
= 995 units
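Assuming the job_sequencing sketch shown after the algorithm, this result can be checked programmatically; the lists below encode the profit-sorted jobs J4, J1, J3, J2, J5, J6.

# Jobs sorted by decreasing profit: J4, J1, J3, J2, J5, J6
profits   = [301, 201, 191, 181, 121, 101]
deadlines = [2, 5, 3, 3, 4, 2]

selected = job_sequencing(deadlines, n=6)
print(selected)                                # [1, 3, 4, 5, 2] -> J4, J3, J2, J5, J1
print(sum(profits[i - 1] for i in selected))   # 995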
Minimum Spanning Tree
What is a Spanning Tree?
• Given an undirected and connected graph G=(V,E), a spanning tree of
the graph G is a tree that spans G (that is, it includes every vertex of G
) and is a subgraph of G (every edge in the tree belongs to G )
What is a Minimum Spanning Tree?
• The cost of the spanning tree is the sum of the weights of all the edges in the
tree. There can be many spanning trees. Minimum spanning tree is the spanning
tree where the cost is minimum among all the spanning trees. There also can be
many minimum spanning trees.
• Minimum spanning tree has direct application in the design of networks. It is
used in algorithms approximating the travelling salesman problem, multi-terminal
minimum cut problem and minimum-cost weighted perfect matching. Other
practical applications are:
Cluster Analysis
Handwriting recognition
Image segmentation
A spanning tree:
• Contains all the original graph’s vertices.
• Reaches out to (spans) all vertices.
• Is acyclic, i.e., it contains no cycles.
Adjacency Matrix
In graph theory, an adjacency matrix is nothing but a square matrix
utilised to describe a finite graph. The components of the matrix
express whether the pairs of a finite set of vertices (also called nodes)
are adjacent in the graph or not. In graph representation, the networks
are expressed with the help of nodes and edges, where nodes are the
vertices and edges are the finite set of ordered pairs.
Adjacency Matrix Definition
The adjacency matrix, also called the connection matrix, is a matrix of
rows and columns used to represent a simple labelled graph, with 0 or 1
in position (Vi, Vj) according to whether Vi and Vj are adjacent or not.
It is a compact way to represent a finite graph containing n vertices as an
n × n matrix M. The adjacency matrix is sometimes also called the vertex
matrix, and in general it is defined as: M[i][j] = 1 if there is an edge
between Vi and Vj, and M[i][j] = 0 otherwise.
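As a small illustration, the following Python sketch builds the adjacency matrix of a tiny undirected graph; the edge list and variable names are assumptions chosen only for the example.

# Undirected graph on vertices 0..3 with this edge list:
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

# Build the n x n adjacency matrix: M[i][j] = 1 if (i, j) is an edge, else 0.
M = [[0] * n for _ in range(n)]
for u, v in edges:
    M[u][v] = 1
    M[v][u] = 1   # symmetric, because the graph is undirected

for row in M:
    print(row)
# [0, 1, 1, 0]
# [1, 0, 1, 0]
# [1, 1, 0, 1]
# [0, 0, 1, 0]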
Prim’s algorithm
• It is a greedy algorithm that is used to find a minimum
spanning tree of a graph. Prim's algorithm finds the subset of
edges that includes every vertex of the graph such that the sum
of the weights of the edges is minimized.
• Prim's algorithm starts with a single node and, at every step,
examines all the edges connecting the tree built so far to the
remaining vertices. The edge with the minimum weight that
causes no cycle is selected.
How does the prim's algorithm work?
Prim's algorithm is a greedy algorithm that starts from one vertex and
continues to add the edge with the smallest weight until the minimum
spanning tree is complete. The steps to implement Prim's algorithm are
given as follows -
 First, initialize the MST with a randomly chosen
vertex.
 Now, find all the edges that connect the tree built so far with the
new (unvisited) vertices. From the edges found, select the one with
minimum weight and add it to the tree.
 Repeat step 2 until the minimum spanning tree is formed.
The applications of Prim's algorithm are -
Prim's algorithm can be used in network design.
It can be used to plan cabling, e.g., to lay down electrical wiring
at minimum total cost.
Example of prim's algorithm
Now, let's see the working of Prim's algorithm using an example; it is
easier to understand with a concrete graph.
Suppose a weighted graph is given as below -
Step 1 - First, we have to choose a vertex from the above graph. Let's choose B.
Step 2 - Now, we have to choose and add the shortest edge from vertex B. There are two edges from
vertex B that are B to C with weight 10 and edge B to D with weight 4. Among the edges, the edge BD
has the minimum weight. So, add it to the MST.
Step 3 - Now, again choose the edge with the minimum weight among the edges leaving the current
tree. In this case, the candidate edges include DE and CD. Select the edge DE and add it to the MST.
Step 4 - Now, select the edge CD, and add it to the MST.
Step 5 - Now, choose the edge CA. Here, we cannot select the edge CE, as it
would create a cycle in the graph. So, choose the edge CA and add it to the
MST.
So, the graph produced in step 5 is the minimum spanning tree of the
given graph. The cost of the MST is given below -
Cost of MST = 4 + 2 + 1 + 3 = 10 units.
Algorithm
Step 1: Select a starting vertex
Step 2: Repeat Steps 3 and 4 while there are fringe vertices (vertices not yet in the tree)
Step 3: Select an edge 'e' of minimum weight connecting a tree vertex and a
fringe vertex
Step 4: Add the selected edge and the vertex to the minimum
spanning tree T
[END OF LOOP]
Step 5: EXIT
Time complexity of Prim's algorithm, by the data structure used to find the minimum-weight edge:
• Adjacency matrix, linear searching: O(|V|²)
• Adjacency list and binary heap: O(|E| log |V|)
• Adjacency list and Fibonacci heap: O(|E| + |V| log |V|)
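A heap-based Python sketch of Prim's algorithm, corresponding to the adjacency list + binary heap row above. The adjacency-list dictionary format and function name are assumptions; the demo graph reuses the edge weights of the Kruskal example later in this module, for which the MST cost is also 10.

import heapq

def prims_mst(graph, start):
    """Prim's algorithm on an undirected weighted graph.

    graph: dict mapping vertex -> list of (neighbour, weight) pairs.
    Returns (total_cost, mst_edges).
    """
    visited = {start}
    # Heap of candidate edges leaving the current tree: (weight, u, v)
    heap = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(heap)
    total_cost, mst_edges = 0, []
    while heap and len(visited) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue          # this edge would create a cycle, skip it
        visited.add(v)
        total_cost += w
        mst_edges.append((u, v, w))
        for x, wx in graph[v]:
            if x not in visited:
                heapq.heappush(heap, (wx, v, x))
    return total_cost, mst_edges

# Demo graph (same weights as the Kruskal example below):
graph = {
    'A': [('B', 1), ('C', 7), ('D', 10), ('E', 5)],
    'B': [('A', 1), ('C', 3)],
    'C': [('A', 7), ('B', 3), ('D', 4)],
    'D': [('A', 10), ('C', 4), ('E', 2)],
    'E': [('A', 5), ('D', 2)],
}
print(prims_mst(graph, 'A'))
# (10, [('A', 'B', 1), ('B', 'C', 3), ('C', 'D', 4), ('D', 'E', 2)])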
Kruskal's Algorithm
Kruskal's Algorithm is used to find a minimum spanning tree
of a connected weighted graph. The main target of the algorithm
is to find the subset of edges that connects every vertex of the
graph with the minimum total weight. It follows the greedy
approach, making the locally optimal choice at every stage rather
than looking ahead for a global optimum.
How does Kruskal's algorithm work?
In Kruskal's algorithm, we start from edges with the lowest weight
and keep adding the edges until the goal is reached. The steps to
implement Kruskal's algorithm are listed as follows -
 First, sort all the edges from low weight to high.
 Now, take the edge with the lowest weight and add it to the
spanning tree. If the edge to be added creates a cycle, then reject
the edge.
 Continue to add the edges until we reach all vertices, and a
minimum spanning tree is created.
The applications of Kruskal's algorithm are -
Kruskal's algorithm can be used to lay out electrical wiring among cities.
It can be used to lay down LAN connections.
Example of Kruskal's algorithm
Now, let's see the working of Kruskal's algorithm using an example; it is easier to
understand with a concrete graph.
Suppose a weighted graph is given as below -
The weight of the edges of the above graph is given in the below table -
Edge AB AC AD AE BC CD DE
Weight 1 7 10 5 3 4 2
Now, sort the edges given above in the ascending order of their weights.
Edge AB DE BC CD AE AC AD
Weight 1 2 3 4 5 7 10
Step 1 - First, add the edge AB with weight 1 to the MST.
Step 2 - Add the edge DE with weight 2 to the MST, as it does not create
a cycle.
Step 3 - Add the edge BC with weight 3 to the MST, as it does not create any cycle or loop.
Step 4 - Now, add the edge CD with weight 4 to the MST, as it does not
form a cycle.
Step 5 - After that, pick the edge AE with weight 5. Including this edge would
create a cycle, so discard it.
Step 6 - Pick the edge AC with weight 7. Including this edge would create a
cycle, so discard it.
Step 7 - Pick the edge AD with weight 10. Including this edge would also create
a cycle, so discard it.
So, the final minimum spanning tree obtained from the given weighted graph
by using Kruskal's algorithm is -
The cost of the MST = AB + DE + BC + CD = 1 + 2 + 3 + 4 = 10.
Now, the number of edges in the above tree equals the number of vertices minus
1. So, the algorithm stops here.
Algorithm
Step 1: Create a forest F in such a way that every vertex of the
graph is a separate tree.
Step 2: Create a set E that contains all the edges of the graph.
Step 3: Repeat Steps 4 and 5 while E is NOT EMPTY and F is not
spanning
Step 4: Remove an edge from E with minimum weight
Step 5: IF the edge obtained in Step 4 connects two different trees,
then add it to the forest F
(for combining two trees into one tree).
ELSE
Discard the edge
Step 6: END
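A Python sketch of the algorithm above, using a simple union-find (disjoint-set) structure to detect whether an edge connects two different trees. The function name and edge-tuple format are assumptions; the demo reuses the example graph from this module.

def kruskal_mst(vertices, edges):
    """Kruskal's algorithm.

    vertices: iterable of vertex labels.
    edges: list of (weight, u, v) tuples.
    Returns (total_cost, mst_edges).
    """
    parent = {v: v for v in vertices}   # forest F: every vertex is its own tree

    def find(v):
        # Walk up to the root of v's tree, halving the path as we go.
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    total_cost, mst_edges = 0, []
    # Examine edges in increasing order of weight.
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                 # the edge connects two different trees
            parent[ru] = rv          # union: combine the two trees into one
            total_cost += w
            mst_edges.append((u, v, w))
        # else: the edge would close a cycle, so it is discarded
    return total_cost, mst_edges

# The example graph from this module:
edges = [(1, 'A', 'B'), (7, 'A', 'C'), (10, 'A', 'D'),
         (5, 'A', 'E'), (3, 'B', 'C'), (4, 'C', 'D'), (2, 'D', 'E')]
print(kruskal_mst('ABCDE', edges))
# (10, [('A', 'B', 1), ('D', 'E', 2), ('B', 'C', 3), ('C', 'D', 4)])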
Complexity of Kruskal's algorithm
Now, let's see the time complexity of Kruskal's algorithm.
Time Complexity
The time complexity of Kruskal's algorithm is O(E log E), which is
equivalent to O(E log V), where E is the number of edges and V is the
number of vertices.
Shortest paths – Dijkstra’s algorithm
• Dijkstra’s algorithm finds the shortest paths from a given vertex to
all the remaining vertices in a digraph with nonnegative edge weights.
• We have to find the shortest path from a given source vertex ‘S’
to each of the destinations (the other vertices) in the graph.
Dijkstra’s(s)
// Finds shortest paths from the source vertex to
all other vertices
//Input: Weighted connected graph G=<V,E>
with nonnegative weights and a source vertex s
//Output: The length d[v] of a shortest
path from s to each vertex v
{
1. for i = 1 to n do // Initialize
S[i] = 0;
d[i] = a[s][i];
2. S[s] = 1; // the source vertex is placed
in the solved set
d[s] = 0;
3. for i = 1 to n - 1 do
{
Choose a vertex u in V - S such that
d[u] is minimum
S = S ∪ {u}
for each vertex v in V - S do
d[v] = min{ d[v], d[u] + c[u,v] }
}
}
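A runnable Python version of the same idea, using a priority queue (heap). This heap-based variant is a common implementation choice rather than a literal transcription of the array-based pseudocode above; the small demo graph is an assumption for illustration.

import heapq

def dijkstra(graph, source):
    """Single-source shortest paths with nonnegative edge weights.

    graph: dict mapping vertex -> list of (neighbour, weight) pairs.
    Returns a dict of shortest distances from source.
    """
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                      # stale heap entry, skip it
        for v, w in graph[u]:
            # Relax edge (u, v): d[v] = min(d[v], d[u] + c[u, v])
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

# A small assumed directed graph for illustration:
graph = {'s': [('a', 4), ('b', 1)], 'a': [('c', 1)],
         'b': [('a', 2), ('c', 5)], 'c': []}
print(dijkstra(graph, 's'))   # {'s': 0, 'a': 3, 'b': 1, 'c': 4}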
Let’s see an example to understand Dijkstra’s Algorithm.
Consider a directed weighted graph. We find the shortest paths from a chosen source vertex
to all the other vertices using Dijkstra’s Algorithm.
Dijkstra’s Algorithm Applications
• To find the shortest path between source and destination
• In social networking applications to map the connections and
information
• In networking to route the data
• To find the locations on the map
Disadvantage of Dijkstra’s Algorithm
• It performs a blind (uninformed) search, which can waste time before
producing the desired output.
• It cannot handle negative edge weights.
• We need to keep track of the vertices that have been visited.
Optimal Tree : Huffman coding
• Huffman coding assigns codes to characters such that the length of
each code depends on the relative frequency or weight of the
corresponding character. Huffman codes are variable-length and
prefix-free (no code is a prefix of any other). Any
prefix-free binary code can be displayed or visualized as a binary tree
with the encoded characters stored at the leaves.
• A Huffman tree, or Huffman coding tree, is defined as a full binary tree in
which each leaf of the tree corresponds to a letter in the given
alphabet.
• The Huffman tree is the binary tree with
minimum external path weight, that is, the one with
the minimum sum of weighted path lengths for the given set of
leaves. So the goal is to construct a tree with the minimum external
path weight.
Letter frequency table
Letter z k m c u d l e
Frequency 2 7 24 32 37 42 42 120
Huffman code
Letter Freq Code Bits
e 120 0 1
d 42 101 3
l 42 110 3
u 37 100 3
c 32 1110 4
m 24 11111 5
k 7 111101 6
z 2 111100 6
The Huffman tree for the above example is the full binary tree whose leaves are these letters, at the depths given in the Bits column.
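A minimal Python sketch of Huffman tree construction using a priority queue (heapq), applied to the frequency table above. The function name and representation are assumptions; the resulting code lengths agree with the Bits column (the exact bit patterns can vary with tie-breaking).

import heapq

def huffman_codes(freq):
    """Build Huffman codes for a dict {symbol: frequency}."""
    # Each heap entry: (frequency, tie-breaker, {symbol: code-so-far})
    heap = [(f, i, {ch: ''}) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two smallest-weight trees
        f2, _, right = heapq.heappop(heap)
        # Merging the two subtrees prepends one more bit to every code.
        merged = {ch: '0' + code for ch, code in left.items()}
        merged.update({ch: '1' + code for ch, code in right.items()})
        heapq.heappush(heap, (f1 + f2, count, merged))
        count += 1
    return heap[0][2]

freq = {'z': 2, 'k': 7, 'm': 24, 'c': 32, 'u': 37, 'd': 42, 'l': 42, 'e': 120}
codes = huffman_codes(freq)
for ch in sorted(codes, key=lambda c: freq[c], reverse=True):
    print(ch, freq[ch], codes[ch], len(codes[ch]))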
END OF MODULE 3