The document discusses counting trees in graphs and Huffman coding. It provides the following key points:
1. Cayley's theorem states that the number of distinct labelled trees with n vertices is n^(n-2).
2. Huffman coding is a lossless data compression algorithm that uses variable-length codes to encode characters based on their frequency of occurrence.
3. It constructs a Huffman tree from character frequencies and assigns codes to characters by traversing the tree, with more frequent characters getting shorter codes.
1. Counting Trees in Graphs
Sanghita Bhattacharjee
Department of Computer Science and Engineering
NIT Durgapur
2. Counting the Number of Graphs
• Number of simple, labelled graphs on n vertices is 2^(n(n-1)/2)
Let n = 3: the number of simple labelled graphs is 2^3 = 8.
Since n = 3, the maximum number of edges possible in the graph is 3; the minimum number of edges is 0.
For m = 0, 1, 2, 3 edges: total number of graphs = C(3,0) + C(3,1) + C(3,2) + C(3,3) = 8
[Figure: the labelled graphs on vertices a, b, c]
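As a quick sanity check (a sketch of mine, not from the slides), the binomial sum over all possible edge counts must equal 2^(n(n-1)/2); in Python:

from math import comb

n = 3
max_edges = n * (n - 1) // 2                                   # 3 possible edges when n = 3
print(sum(comb(max_edges, m) for m in range(max_edges + 1)))   # 8
print(2 ** max_edges)                                          # 8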
3. Counting the Number of Graphs
Number of distinct labelled graphs is 8
Number of non-isomorphic graphs is 4
4. Counting the Number of Graphs
• Find number of non-isomorphic simple graphs with 4 vertices
Maximum number of edges = 4(4-1)/2 = 6
Total number of simple labelled graphs = 2^6 = 64
0 edges: 1 unique graph; 6 edges: 1 unique graph; 1 edge: 1 unique graph; 2 edges: 2
unique graphs, one where the two edges share a vertex and one where they do not
5. Counting the Number of Graphs
• 3 edges: 3 unique graphs. One is a 3-cycle with one isolated vertex; the other two are
trees: one has a vertex of degree 3 (the star) and the other has two vertices of degree 2 (the path)
• 4 edges: 2 unique graphs: the 4-cycle and the graph containing a 3-cycle with a pendant edge
• 5 edges: 1 unique graph
Number of non-isomorphic graphs: 1 + 1 + 2 + 3 + 2 + 1 + 1 = 11
• Maximum number of simple graphs with n vertices and m edges is C(n(n-1)/2, m)
Let n = 5, m = 3. Maximum number of simple graphs is C(10, 3) = 120
6. Counting the Number of Trees
How many distinct labelled trees are there on n vertices?
• n = 1: a single vertex, T = 1
• n = 2: a single edge, T = 1
• n = 3: T = 3
• n = 4: T = 16 (4 stars and 12 paths)
Cayley's Theorem: The number of distinct labelled trees with
n vertices is n^(n-2)
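A minimal brute-force check of Cayley's formula (my sketch, not from the slides): enumerate every (n-1)-edge subset of the complete graph and count the connected ones, since a connected graph on n vertices with n-1 edges is necessarily a tree.

from itertools import combinations

def count_labelled_trees(n):
    # All possible edges of the complete graph on vertices 0..n-1
    all_edges = list(combinations(range(n), 2))
    count = 0
    for subset in combinations(all_edges, n - 1):
        # Depth-first search from vertex 0 to test connectivity
        adj = {v: [] for v in range(n)}
        for u, v in subset:
            adj[u].append(v)
            adj[v].append(u)
        seen, stack = {0}, [0]
        while stack:
            for w in adj[stack.pop()]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        count += (len(seen) == n)
    return count

for n in range(2, 6):
    print(n, count_labelled_trees(n), n ** (n - 2))  # the two counts match: 1, 3, 16, 125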
27. Kirchhoff's algorithm (Matrix-Tree theorem)
• Step 1: Create the adjacency matrix of the given graph
• Step 2: Replace each diagonal element with the degree of the corresponding node
• Step 3: Replace every non-diagonal 1 with -1 (the result is the Laplacian matrix)
• Step 4: Delete any one row and the corresponding column, and take the determinant of what remains
• Step 5: This determinant is the number of spanning trees of the graph
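The steps above amount to taking a cofactor of the graph's Laplacian. A minimal NumPy sketch (my code, not from the slides):

import numpy as np

def spanning_tree_count(adj):
    # Steps 1-3: Laplacian L = (degree matrix) - (adjacency matrix)
    A = np.array(adj, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    # Step 4: delete one row and the corresponding column, take the determinant
    return round(np.linalg.det(L[1:, 1:]))

# Example: the complete graph K4 has 4^(4-2) = 16 spanning trees, matching Cayley
K4 = [[0, 1, 1, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 0]]
print(spanning_tree_count(K4))  # 16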
29. Edge-disjoint spanning trees
• We have seen that every connected graph has a spanning tree
• Edge-disjoint spanning trees provide alternate routes if the primary tree fails
• The number of edge-disjoint spanning trees in K_n is floor(n/2); this matches the edge
budget, since K_n has n(n-1)/2 edges and each spanning tree uses n-1 of them
30. Huffman Code
• Huffman Coding is a famous greedy algorithm.
• It is used for the lossless compression of data.
• It uses variable-length encoding: it assigns a variable-length code to each character.
• The code length of a character depends on how frequently it occurs in the given text.
• The character which occurs most frequently gets the shortest code.
• The character which occurs least frequently gets the longest code.
• It is also known as Huffman Encoding.
31. Major steps in Huffman Coding
There are two major steps in Huffman Coding:
1. Building a Huffman tree from the input characters.
2. Assigning codes to the characters by traversing the Huffman tree.
32. How Huffman Coding Works
• Huffman coding first creates a tree using the frequencies of the characters and then
generates a code for each character.
• Once the data is encoded, it has to be decoded. Decoding is done using the same tree.
• Huffman coding prevents any ambiguity in the decoding process by using prefix codes:
the code assigned to a character is never a prefix of any other character's code. The
tree created above maintains this property, because characters appear only at the
leaves. A minimal decoding sketch is shown below.
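To illustrate why the prefix property makes decoding unambiguous, here is a decoding sketch (the helper huffman_decode is hypothetical, not from the slides): scan the bit string left to right and emit a character as soon as the accumulated bits match some code.

def huffman_decode(bits, codes):
    # codes maps each character to its bit string, e.g. {'a': '0', 'b': '101'}
    inverse = {code: ch for ch, code in codes.items()}
    out, buf = [], ""
    for b in bits:
        buf += b
        if buf in inverse:       # the prefix property guarantees a unique match
            out.append(inverse[buf])
            buf = ""
    return "".join(out)

print(huffman_decode("0101100", {'a': '0', 'b': '101', 'c': '100'}))  # "abc"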
33. How Huffman Coding Works
1. Calculate the frequency of each character in the string.
2. Sort the characters in increasing order of frequency. These are stored in a
priority queue Q.
3. Make each unique character a leaf node.
4. Create an empty node z. Assign the minimum-frequency node to the left child of z
and the second-minimum-frequency node to the right child of z. Set the value
of z to the sum of the two minimum frequencies.
34. How Huffman Coding Works
5. Remove these two minimum-frequency nodes from Q and add the sum into the list of
frequencies.
6. Insert node z into the tree.
7. Repeat steps 4 to 6 until only one node remains in Q.
8. For each non-leaf node, assign 0 to the left edge and 1 to the right edge.
The time complexity analysis of Huffman coding is as follows:
extractMin() is called 2 × (n - 1) times if there are n nodes.
As extractMin() calls minHeapify(), each call takes O(log n) time.
Thus, the overall time complexity of Huffman coding is O(n log n), where n is the
number of unique characters in the given text. A runnable sketch of steps 1–8 follows.
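A Python sketch of the procedure the slides describe (my implementation, using heapq as the priority queue Q; the name huffman_codes is mine, and a non-empty input is assumed):

import heapq
from collections import Counter

def huffman_codes(text):
    # Steps 1-3: count frequencies and make each character a leaf in a min-heap.
    # Each entry is (frequency, tie_breaker, node); the tie breaker keeps heap
    # comparisons away from the nodes themselves.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)
    # Steps 4-7: repeatedly merge the two minimum-frequency nodes into a new node.
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # minimum frequency
        f2, _, right = heapq.heappop(heap)   # second minimum frequency
        heapq.heappush(heap, (f1 + f2, next_id, (left, right)))
        next_id += 1
    # Step 8: walk the tree, appending 0 on left edges and 1 on right edges.
    codes = {}
    def walk(node, code):
        if isinstance(node, tuple):          # internal node: (left, right)
            walk(node[0], code + "0")
            walk(node[1], code + "1")
        else:                                # leaf: a character
            codes[node] = code or "0"        # lone-character edge case
    walk(heap[0][2], "")
    return codes

print(huffman_codes("aabbbcccc"))  # {'c': '0', 'a': '10', 'b': '11'}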
39. Huffman's algorithm
Huffman(C)
    n ← |C|
    Q ← C
    for i = 1 to n - 1
        allocate a new node z
        left[z] ← x ← Delete-Min(Q)
        right[z] ← y ← Delete-Min(Q)
        f[z] ← f[x] + f[y]
        Insert(Q, z)
    return Delete-Min(Q)
[Figure: initial frequencies f:5, e:9, c:12, b:13, d:16, a:45]
Actual (uncompressed) size = (45+12+13+5+9+16) × 8 = 800 bits
40.–47. Huffman's algorithm: tree construction, step by step
[Figure sequence: the pseudocode above merges the two minimum-frequency nodes at each iteration:
f:5 + e:9 = 14; c:12 + b:13 = 25; 14 + d:16 = 30; 25 + 30 = 55; a:45 + 55 = 100 (the root).]
48. Huffman's algorithm: the final tree
[Figure: the completed Huffman tree with total frequency 100; each left edge is labelled 0 and each right edge 1. Reading the edge labels from the root gives the codes: a = 0, c = 100, b = 101, f = 1100, e = 1101, d = 111.]
49. Size calculation in Huffman's algorithm
Character   Frequency   Code    Size (bits)
a           45          0       45 × 1 = 45
c           12          100     12 × 3 = 36
b           13          101     13 × 3 = 39
f           5           1100    5 × 4 = 20
e           9           1101    9 × 4 = 36
d           16          111     16 × 3 = 48
Total size = 45 + 36 + 39 + 20 + 36 + 48 = 224 bits (an average of 2.24 bits per character over 100 characters)
Actual (uncompressed) size = (45+12+13+5+9+16) × 8 = 800 bits
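As a quick check of the table above (a sketch, not part of the slides), the two sizes can be recomputed directly from the slide's frequencies and codes:

freqs = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}
codes = {'a': '0', 'b': '101', 'c': '100', 'd': '111', 'e': '1101', 'f': '1100'}
compressed = sum(f * len(codes[ch]) for ch, f in freqs.items())
print(compressed)                # 224 bits with the Huffman codes
print(sum(freqs.values()) * 8)   # 800 bits with fixed 8-bit codes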