UNIT IV : ADVANCED TREES
By
Mr.S.Selvaraj
Asst. Professor (SRG) / CSE
Kongu Engineering College
Perundurai, Erode, Tamilnadu, India
Thanks to and Resource from : Data Structures and Algorithm Analysis in C by Mark Allen Weiss & Sumitabha Das, “Computer Fundamentals and C
Programming”, 1st Edition, McGraw Hill, 2018.
20CST32 – Data Structures
Syllabus – Unit Wise
4/6/2022 4.1 _ Splay Trees 2
List of Exercises
Text Book and Reference Book
Unit IV : Contents
1. Splay Trees
2. B tree
3. Red-Black Trees:
– Rotation
– Insertion
– Deletion
4. Priority Queues(Heaps)
– Binary heap
– d-heaps
– Leftist heaps
– Skew heaps
4.1 _ Splay Trees
Splay Tree
• Splay trees are self-balancing (self-adjusting)
binary search trees.
• In other words, splay trees are variants of the
binary search tree.
• The prerequisite for splay trees is that we
should know about binary search trees.
Splay Tree
• As we already know the time complexity of a binary search tree:
the time complexity of a binary search tree in the average case is
O(logn), and the time complexity in the worst case is O(n).
• In a binary search tree, the values in the left subtree are smaller than
the root node, and the values in the right subtree are greater than the
root node; if the tree is balanced, the time complexity is O(logn).
• If the binary tree is left-skewed or right-skewed, then the time
complexity would be O(n).
• To limit the skewness, the AVL and Red-Black tree came into the
picture, having O(logn) time complexity for all the operations in all
the cases.
• This time complexity can be improved further for practical
workloads, so a new tree data structure was designed,
known as the Splay tree.
Splaying
• A splay tree is a self-balancing tree, but AVL and Red-Black
trees are also self-balancing trees.
• What makes the splay tree unique is one extra
operation: splaying.
• A splay tree contains the same operations as a Binary
search tree.
• i.e., Insertion, deletion and searching, but it also contains
one more operation, i.e., splaying.
• So all the operations in the splay tree are followed by
splaying.
• Splay trees are not strictly balanced trees, but they are
roughly balanced trees. Let's understand the search
operation in the splay-tree.
Example
• Suppose we want to search for the element 7 in
the tree, which is shown below:
Rearrangements
• To search any element in the splay tree, first, we will
perform the standard binary search tree operation.
• As 7 is less than 10 so we will come to the left of the
root node.
• After performing the search operation, we need to
perform splaying.
• Here splaying means that the operation that we are
performing on any element should become the root
node after performing some rearrangements.
• The rearrangement of the tree will be done through
the rotations.
Rotations
• There are six types of rotations used for
splaying:
1. Zig rotation (Right rotation)
2. Zag rotation (Left rotation)
3. Zig zag (Zig followed by zag)
4. Zag zig (Zag followed by zig)
5. Zig zig (two right rotations)
6. Zag zag (two left rotations)
Cases for the Rotations
• Case 1: If the node does not have a grandparent: if it is
the right child of the parent, then we carry out the left (zag)
rotation; otherwise, the right (zig) rotation is performed.
• Case 2: If the node has a grandparent, then based on the
following scenarios, the rotation is performed:
– Scenario 1: If the node is the left child of the parent and the parent
is also the left child of its parent, then the zig zig (two right rotations)
case is performed.
– Scenario 2: If the node is the right child of the parent, but the parent
is the left child of its parent, then the zig zag rotation is performed.
– Scenario 3: If the node is the right child of the parent and the parent
is also the right child of its parent, then the zag zag (two left rotations)
case is performed.
– Scenario 4: If the node is the left child of the parent, but the parent
is the right child of its parent, then the zag zig rotation is performed.
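All six cases are built from two primitive BST rotations. A minimal sketch in Python (the `Node` class and function names are illustrative, not from the slides):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_right(y):
    """Zig: promote y's left child x; y becomes x's right child."""
    x = y.left
    y.left = x.right   # x's old right subtree moves under y
    x.right = y
    return x           # new subtree root

def rotate_left(x):
    """Zag: promote x's right child y; x becomes y's left child."""
    y = x.right
    x.right = y.left   # y's old left subtree moves under x
    y.left = x
    return y           # new subtree root
```

Splaying repeatedly applies these primitives (in zig-zig, zig-zag, etc. order) until the accessed node reaches the root.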
Zig Rotation
• Suppose we have to search for the element 7 in the tree
Zag Rotation
• Suppose we have to search for the element 20 in the tree
Zig Zig Rotation
• Suppose we have to search for the element 1 in
the tree
Zig Zag Rotation
• Suppose we want to search for the element 13 in
the tree
Zag Zig Rotation
• Suppose we want to search for the element 9 in
the tree
Zag Zag Rotation
• Suppose we want to search 20 in the below
tree.
Advantages of Splay tree
• In the splay tree, we do not need to store the extra information.
– In contrast, in AVL trees, we need to store the balance factor of each
node that requires extra space, and
– Red-Black trees also require storing one extra bit of information that
denotes the color of each node, either Red or Black.
• It is the fastest type of Binary Search tree for various practical
applications. It is used in Windows NT and GCC compilers.
• It provides better performance as the frequently accessed nodes
will move nearer to the root node, due to which the elements can
be accessed quickly in splay trees.
– It is used in the cache implementation as the recently accessed data is
stored in the cache so that we do not need to go to the memory for
accessing the data, and it takes less time.
Drawback of Splay tree
• The major drawback of the splay tree is
that the trees are not strictly balanced, i.e.,
they are only roughly balanced.
• Sometimes a splay tree becomes linear, so an
operation can take O(n) time.
4.2 _ B Tree
B Tree
• B tree is a self-balancing tree, and it is an m-way tree where m
defines the order of the tree.
• B tree is a generalization of the Binary Search tree in which
– a node can have more than one key and
– more than two children depending upon the value of m.
• In the B tree, the data is specified in a sorted order having
– lower values on the left subtree and
– higher values in the right subtree.
• In the B tree, all the leaf nodes must be at the same level,
whereas, in the case of a binary tree, the leaf nodes can be at
different levels.
B Tree Properties
• If the B tree has an order of m, then
1. Children Case:
– each node can have a maximum of m children
– In the case of minimum children,
• the leaf nodes have zero children,
• the root node has a minimum of two children, and
• the internal nodes have a minimum of ⌈m/2⌉ children.
2. Key Case:
– Each node can have a maximum of (m-1) keys.
• For example, if the value of m is 5, then the maximum number of keys is 4.
– In the case of minimum keys,
• the root node has a minimum of one key, and
• all other nodes have a minimum of ⌈m/2⌉ - 1 keys.
• If we perform insertion in the B tree, then the new key is always
inserted in a leaf node.
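The limits above can be collected in a small helper. A sketch in Python (the function name `btree_limits` is illustrative; the root and leaf nodes have the separate minimums noted above):

```python
import math

def btree_limits(m):
    """Children/key limits for a B tree of order m."""
    return {
        "max_children": m,
        "max_keys": m - 1,
        "min_children_internal": math.ceil(m / 2),  # ceiling of m/2
        "min_keys_non_root": math.ceil(m / 2) - 1,  # ceiling of m/2, minus 1
        "min_keys_root": 1,
    }
```

For m = 5 this gives a maximum of 4 keys and a minimum of 3 children per internal node, matching the example on the slide.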
Example
• Suppose we want to create a B tree of order 3
by inserting values from 1 to 10.
• Step 1: First, we create a node with 1 value
• Step 2: The next element is 2, which comes
after 1
• Step 3: The next element is 3, and it is inserted after 2.
• As we know that each node can have at most 2 keys,
we split this node at the middle element.
• The middle element is 2, so it moves up to its parent.
• Node 2 does not have any parent, so it becomes
the root node.
• Step 4: The next element is 4. Since 4 is
greater than 2 and 3, it will be added after
3
• Step 5: The next element is 5. Since 5 is greater than 2, 3
and 4, it will be added after 4.
• As we know that each node can have at most 2 keys, we
split this node at the middle element.
• The middle element is 4, so it moves up to its parent. The
parent is node 2; therefore, 4 will be added after 2.
• Step 6: The next element is 6. Since 6 is
greater than 2, 4 and 5, so 6 will come after 5
• Step 7: The next element is 7. Since 7 is greater than 2, 4, 5 and 6, 7 will come
after 6.
• As we know that each node can have at most 2 keys, we split this node at the
middle element. The middle element is 6, so it moves up to its parent.
• But 6 cannot be added after 4, because that node too can hold at most 2 keys; we
split it at its middle element as well. The middle element is 4, so it moves up to
its parent. As node 4 does not have any parent, node 4 becomes the
root node.
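The split used in Steps 3, 5 and 7 (split an overfull node at its middle key and push that key up to the parent) can be sketched as a tiny helper; the function name is illustrative:

```python
def split_overfull(keys):
    """Split a sorted, overfull key list at its middle element.
    Returns (left_keys, middle, right_keys); the middle key moves to the parent."""
    mid = len(keys) // 2
    return keys[:mid], keys[mid], keys[mid + 1:]
```

For example, inserting 3 into the node [1, 2] makes it [1, 2, 3]; the split leaves [1] and [3] and pushes 2 up, exactly as in Step 3.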
• Step 8: ----------
• Step 9: ----------
• Step 10: ----------
• Can You Try........
Create a B tree of order 5 by inserting
values from 1 to 20
B + tree
• A B+ tree is used to store records very
efficiently, in an indexed manner, using the
B+ tree index structure.
• Due to the multi-level indexing, data
access becomes faster and easier.
B+ tree Node Structure
• The node structure of the B+ tree contains pointers
and key values, as shown in the figure below:
• As we can observe in the B+ tree node structure,
it contains n-1 key values (k1 to kn-1) and n
pointers (p1 to pn).
• The search key values placed in the node are
kept in sorted order. Thus, if i<j then ki<kj.
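This node layout can be sketched directly. A minimal sketch in Python (the class name and checks are illustrative, assuming n pointers and n-1 sorted keys as described above):

```python
class BPlusNode:
    """Illustrative B+ tree node: n pointers (p1..pn) and n-1 sorted keys (k1..kn-1)."""
    def __init__(self, keys, children=None):
        # keys must already be in sorted order: if i < j then ki < kj
        assert all(keys[i] < keys[i + 1] for i in range(len(keys) - 1)), "keys must be sorted"
        self.keys = keys                 # k1 .. k(n-1)
        self.children = children or []   # p1 .. pn (empty in this sketch for a leaf)
        if self.children:
            assert len(self.children) == len(self.keys) + 1, "need one more pointer than keys"
```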
B+ Tree
• Although all of the search trees we have seen so far are binary, there is a
popular search tree that is not binary. This tree is known as a B-tree.
• B Tree is a specialized m-way tree that can be widely used for disk access.
• A B-Tree of order m can have at most m-1 keys and m children.
• One of the main reasons for using a B tree is its capability to store a
large number of keys in a single node and large key values, while
keeping the height of the tree relatively small.
• A B tree of order m has all the properties of an m-way tree. In
addition, it has the following properties.
– Every node in a B-Tree contains at most m children.
– Every node in a B-Tree except the root node and the leaf nodes contains at least
⌈m/2⌉ children.
– The root node must have at least 2 children.
– All leaf nodes must be at the same level.
– All data is stored at the leaves.
• It is not necessary that all the nodes contain the same number of
children, but each node must have at least ⌈m/2⌉ children.
B+ Tree Properties
• Contained in each interior node are pointers p1, p2, . . . , pm to the
children, and values k1, k2, . . . , km – 1, representing the smallest key
found in the subtrees p2, p3, . . . , pm respectively.
• Of course, some of these pointers might be NULL, and the corresponding
ki would then be undefined.
• For every node, all the keys in subtree p1 are smaller than the keys in
subtree p2, and so on.
• The leaves contain all the actual data, which is either the keys themselves
or pointers to records containing the keys.
• We will assume the former to keep our examples simple.
• There are various definitions of B-trees that change this structure in
mostly minor ways, but this definition is one of the popular forms.
• We will also insist (for now) that the number of keys in a leaf is also
between ⌈m/2⌉ and m.
• A B-tree of order 4 is more popularly known as a 2-3-4 tree, and a B-tree
of order 3 is known as a 2-3 tree.
• We will describe the operation of B-trees by using the special case of 2-3
trees.
• Our starting point is the 2-3 tree that follows.
Example of a B-tree of order 4.
Example of a B-tree of order 3.
• We have drawn interior nodes (nonleaves) in ellipses,
which contain the two pieces of data for each node.
• A dash as the second piece of information in an interior
node indicates that the node has only two children.
• Leaves are drawn in boxes, which contain the keys.
• The keys in the leaves are ordered.
Find
• To perform a find, we start at the root and
branch in one of (at most) three directions,
depending on the relation of the key we are
looking for to the two (possibly one) values
stored at the node.
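The branching decision at an interior node can be sketched as follows. This assumes the node layout above, where the stored keys are the smallest keys of subtrees p2..pm (the first subtree's key is implicitly minus infinity); the function name is illustrative:

```python
import bisect

def branch_index(node_keys, target):
    """0-based index of the child to follow at an interior node.
    node_keys are the smallest keys of subtrees p2..pm, in sorted order;
    we descend into the rightmost subtree whose smallest key is <= target."""
    return bisect.bisect_right(node_keys, target)
```

With an interior node storing [16, 32]: a search for 5 follows child 0, for 18 child 1, and for 40 child 2.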
Insert Case 1
• To perform an insert on a previously unseen key, x, we
follow the path as though we were performing a find.
• When we get to a leaf node, we have found the correct
place to put x.
• Thus, to insert a node with key 18, we can just add it to a
leaf without causing any violations of the 2-3 tree
properties.
• The result is shown in the following figure.
Insert Case 2
• Unfortunately, since a leaf can hold only two or three keys, this
might not always be possible.
• If we now try to insert 1 into the tree, we find that the node where
it belongs is already full.
• Placing our new key into this node would give it a fourth element
which is not allowed.
• This can be solved by making two nodes of two keys each and
adjusting the information in the parent.
Insert Case 3
• Unfortunately, this idea does not always work,
as can be seen by an attempt to insert 19 into
the current tree.
• If we make two nodes of two keys each, we
obtain the following tree.
Case 3 Solution
• This tree has an internal node with four children, but we only allow three
per node.
• The solution is simple. We merely split this node into two nodes with two
children. Of course, this node might be one of three children itself, and
thus splitting it would create a problem for its parent (which would now
have four children), but we can keep on splitting nodes on the way up to
the root until we either get to the root or find a node with only two
children.
• In our case, we can get by with splitting only the first internal node we
see, obtaining the following tree.
Insert Case 4
• If we now insert an element with key 28, we create a leaf with four
children, which is split into two leaves of two children:
Case 4 Solution
• This creates an internal node with four children, which is then split
into two children.
• What we have done here is split the root into two nodes.
• When we do this, we have a special case, which we finish by
creating a new root.
• This is how (the only way) a 2-3 tree gains height.
General Rule – B Tree
• With general B-trees of order m, when a key is inserted,
the only difficulty arises when the node that is to accept
the key already has m keys.
• This key gives the node m + 1 keys, which we can split into
two nodes with ⌈(m + 1) / 2⌉ and ⌊(m + 1) / 2⌋ keys
respectively.
• As this gives the parent an extra node, we have to check
whether this node can be accepted by the parent and split
the parent if it already has m children.
• We repeat this until we find a parent with less than m
children.
• If we split the root, we create a new root with two
children.
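The two split sizes from the rule above can be computed directly; a small sketch (function name illustrative):

```python
import math

def split_sizes(m):
    """An overflowing node with m + 1 keys splits into nodes of
    ceil((m+1)/2) and floor((m+1)/2) keys respectively."""
    return math.ceil((m + 1) / 2), math.floor((m + 1) / 2)
```

For a 2-3 tree (m = 3), an overfull node of 4 keys splits into two nodes of 2 keys each, matching the insert cases above.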
B Tree – Time Complexity
• The depth of a B-tree is at most ⌈log⌈m/2⌉ n⌉.
• At each node on the path, we perform O(log m) work to determine
which branch to take (using a binary search), but an insert or delete
could require O(m) work to fix up all the information at the node.
• The worst-case running time for each of the insert and delete
operations is thus O(m logm n) = O( (m / log m ) log n), but a find
takes only O(log n ).
• The best (legal) choice of m for running time considerations has
been shown empirically to be either m = 3 or m = 4; this agrees
with the bounds above, which show that as m gets larger, the
insertion and deletion times increase.
• If we are only concerned with main memory speed, higher order
B-trees, such as 5-9 trees, are not an advantage.
Real Use of B-Tree
• The real use of B-trees lies in database systems, where the tree is kept on a
physical disk instead of main memory.
• Accessing a disk is typically several orders of magnitude slower than any main
memory operation.
• If we use a B-tree of order m, then the number of disk accesses is O(logm n).
• Although each disk access carries the overhead of O(log m) to determine the
direction to branch, the time to perform this computation is typically much smaller
than the time to read a block of memory and can thus be considered
inconsequential (as long as m is chosen reasonably).
• Even if updates are performed and O(m) computing time is required at each node,
this too is generally not significant.
• The value of m is then chosen to be the largest value that still allows an interior
node to fit into one disk block, and is typically in the range 32 ≤ m ≤ 256.
• The maximum number of elements that are stored in a leaf is chosen so that if the
leaf is full, it fits in one block.
• This means that a record can always be found in very few disk accesses, since a
typical B-tree will have a depth of only 2 or 3, and the root (and possibly the first
level) can be kept in main memory.
B tree Vs B+ Tree
4.3 _ Red Black Tree
Red Black Tree
• The Red-Black tree is a binary search tree.
• The prerequisite for the Red-Black tree is that we should
know about binary search trees.
• In a binary search tree, the values of the nodes in the
– left subtree should be less than the value of the root node, and
– right subtree should be greater than the value of the root
node.
• Each node in the Red-black tree contains an extra bit that
represents a color to ensure that the tree is balanced
during any operations performed on the tree like insertion,
deletion, etc.
• In a binary search tree, the searching, insertion and
deletion take
– O(log2n) time in the average case,
– O(1) in the best case and
– O(n) in the worst case.
BST
• In the above tree, suppose we want to search for 80.
• We will first compare 80 with the root node.
• 80 is greater than the root node, i.e., 10, so searching will be
performed on the right subtree.
• Again, 80 is compared with 15; 80 is greater than 15, so we move
to the right of the 15, i.e., 20.
• Now, we reach the leaf node 20, and 20 is not equal to 80.
• Therefore, it will show that the element is not found in the tree.
• After each comparison, the search space is divided in half. The above
BST will take O(logn) time to search for an element.
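The search walk described above is the standard BST search. A minimal sketch in Python mirroring the example (class and function names are illustrative):

```python
class BSTNode:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def bst_search(root, key):
    """Iterative BST search: each comparison discards one subtree."""
    node = root
    while node is not None:
        if key == node.key:
            return node
        # go left for smaller keys, right for larger keys
        node = node.left if key < node.key else node.right
    return None  # reached a leaf's NIL child: not found
```

On the tree 10 → 15 → 20 from the slide, searching for 80 walks right twice, falls off node 20, and reports "not found".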
Right Skewed BST Tree
• The above tree shows the right-skewed BST.
• If we want to search the 80 in the tree, we will compare 80 with all the
nodes until we find the element or reach the leaf node.
• So, the above right-skewed BST will take O(n) time to search for an element.
• Of the two BSTs above, the first one is the balanced BST, whereas the second
one is the unbalanced BST.
• We conclude from the above two binary search trees that a balanced tree
takes less time than an unbalanced tree for performing any operation on
the tree.
Why Red Black Tree?
• Why do we require a Red-Black tree if the AVL tree is also a height-balanced
tree?
• The Red-Black tree is used because the AVL tree requires many
rotations when the tree is large, whereas the Red-Black tree
requires a maximum of two rotations to balance the tree.
• The main difference between the AVL tree and the Red-Black tree is
that the AVL tree is strictly balanced, while the Red-Black tree is
not completely height-balanced.
• So, the AVL tree is more balanced than the Red-Black tree, but the
Red-Black tree guarantees O(log2n) time for all operations like
insertion, deletion, and searching.
• Insertion is easier in the AVL tree as the AVL tree is strictly
balanced, whereas deletion and searching are easier in the Red-
Black tree as the Red-Black tree requires fewer rotations.
Red Black Tree Properties
1. Self-balancing BST.
2. Every node is either Black or Red (one extra bit).
1. In the case of the AVL Tree, the Balancing Factor (BF) is -1, 0 or 1.
2. Here, 0 means Black and 1 means Red.
3. Root is always Black.
4. Every leaf is a NIL node, which is Black. (The nodes
that have no child are considered the internal nodes, and
these nodes are connected to the NIL nodes that are
always black in color.)
5. If a node is Red, then its children are Black. (No red-red
parent-child relationship.)
6. Every path from a node to any of its descendant NIL
nodes must contain the same number of black nodes.
If node is Red then its children are Black.
(no red-red parent-child relationship)
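Properties 3-6 can be checked mechanically. A sketch in Python that computes the black-height (counting NIL leaves as one black node) and rejects red-red violations; the names are illustrative:

```python
RED, BLACK = "R", "B"

class RBNode:
    def __init__(self, key, color, left=None, right=None):
        self.key, self.color, self.left, self.right = key, color, left, right

def black_height(node):
    """Black-height of the subtree, counting NIL leaves as black.
    Returns -1 if a red-red pair or unequal black counts are found."""
    if node is None:
        return 1  # NIL leaf: black, height 1
    if node.color == RED:
        for child in (node.left, node.right):
            if child is not None and child.color == RED:
                return -1  # property 5 violated: red node with red child
    lh = black_height(node.left)
    rh = black_height(node.right)
    if lh == -1 or rh == -1 or lh != rh:
        return -1  # property 6 violated: unequal black counts
    return lh + (1 if node.color == BLACK else 0)

def is_red_black(root):
    """Check: black root, no red-red pairs, equal black-height on every path."""
    return root is not None and root.color == BLACK and black_height(root) != -1
```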
Can every AVL tree be a Red-Black tree?
• Yes, every AVL tree can be a Red-Black tree if we color
each node either Red or Black.
• But not every Red-Black tree is an AVL tree, because the
AVL tree is strictly height-balanced while the Red-Black
tree is not completely height-balanced.
4/6/2022 4.3 _ Red Black Tree 63
Example – RB Tree
Check is it RB tree or not?
Now , Check is it RB tree or not?
Now , Check is it RB tree or not?
Check RB Tree or Not?
Clue for RB Tree
• Every Perfect Binary Tree, that contains only
Black Nodes is also a Red Black Tree.
One more example ...
Check one last example ...
• If it is not, then tell which property is not
satisfied by this tree.
Rules to insert values in Red Black Tree
• The following are some rules used to create the Red-
Black tree:
1. If the tree is empty, then we create a new node as a root
node with the color black.
2. If the tree is not empty, then we create a new node as a
leaf node with a color red.
3. If the parent of a new node is black, then exit.
4. If the parent of a new node is Red, then we have to check
the color of the parent's sibling of a new node.
4(a) If the color is Black, then we perform rotations and recoloring.
4(b) If the color is Red, then we recolor the node. We will also check
whether the parent's parent of the new node is the root node or
not; if it is not the root node, we will recolor it and recheck.
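The decision between rules 3, 4(a) and 4(b) depends only on the colors of the new node's parent and the parent's sibling (the "uncle"). A tiny sketch (names illustrative; a missing NIL uncle counts as Black):

```python
def insert_fixup_action(parent_color, uncle_color):
    """Which rule applies after inserting a Red node:
    'B' = Black, 'R' = Red; a NIL uncle is passed as 'B'."""
    if parent_color == "B":
        return "exit"                 # rule 3: black parent, nothing to fix
    if uncle_color == "R":
        return "recolor"              # rule 4(b): recolor parent/uncle, recheck upward
    return "rotate_and_recolor"       # rule 4(a): rotations plus recoloring
```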
Example - Insertion
Let's understand the insertion in the
Red-Black tree.
10, 18, 7, 15, 16, 30, 25, 40
• Step 1: Initially, the tree is empty, so we create
a new node having value 10. This is the first
node of the tree, so it would be the root node
of the tree. As we already discussed, that root
node must be black in color, which is shown
below:
• Step 2: The next node is 18. As 18 is greater
than 10 so it will come at the right of 10 as
shown below.
• Step 3: Now, we create the new node having
value 7 with Red color. As 7 is less than 10, so
it will come at the left of 10 as shown below.
• Step 4: The next element is 15, and 15 is greater than 10, but less than 18, so the
new node will be created at the left of node 18. The node 15 would be Red in color
as the tree is not empty.
• The above tree violates the property of the Red-Black tree as it has Red-red parent-
child relationship.
• Now we have to apply some rule to make it a Red-Black tree. Rule 4 says that if
the new node's parent is Red, then we have to check the color of the parent's
sibling of the new node.
• The new node is node 15; the parent of the new node is node 18 and the sibling of
the parent node is node 7.
• As the color of the parent's sibling is Red, we apply rule 4b.
• Rule 4b says that we have to recolor both the parent and the parent's sibling node.
So both the nodes, i.e., 7 and 18, are recolored as shown in the below figure.
• We also have to check whether the parent's parent of the new node is the root
node or not. As we can observe in the above figure, the parent's parent of a new
node is the root node, so we do not need to recolor it.
• Step 5: The next element is 16. As 16 is greater
than 10 but less than 18 and greater than 15, so
node 16 will come at the right of node 15. The
tree is not empty; node 16 would be Red in color,
as shown in the below figure:
• Step 6: The next element is 30. Node 30 is inserted at the
right of node 18. As the tree is not empty, the color of
node 30 is Red.
• There is a red-red conflict between node 30 and its parent, node 18.
The parent's sibling, node 15, is also Red, so rule 4b applies:
nodes 18 and 15 are recolored Black.
• We also have to check whether the parent's parent of the new node
is the root node. The parent's parent of node 30 is node 16, and
node 16 is not the root node, so we recolor node 16 to Red. The
parent of node 16 is node 10, which is not Red, so there is no
red-red conflict.
• Step 7: The next element is 25, which we have to insert in the tree. Since 25
is greater than 10, 16 and 18 but less than 30, it will come at the left of
node 30. As the tree is not empty, node 25 will be Red. Here a
red-red conflict occurs, as the parent of the newly created node is Red.
• Step 8: The next element is 40. Since 40 is greater than
10, 16, 18, 25, and 30, so node 40 will come at the right
of node 30. As the tree is not empty, node 40 would be
Red in color. There is a Red-red conflict between nodes
40 and 30, so rule 4b will be applied.
• You can continue with the last 4 numbers:
• 10, 18, 7, 15, 16, 30, 25, 40, 60, 2, 1, 70
• .................... Take this as practice.
Final Answer is ... Check it out.
Deletion in Red-Black tree
• Let's understand how we can delete the
particular node from the Red-Black tree.
• The following are the rules used to delete the
particular node from the tree:
• Step 1: First, we perform BST rules for the
deletion.
• Step 2:
• Case 1: If the node to be deleted is Red, we
simply delete it.
Case 1 – Example 1
• Suppose we want to delete node 30 from the tree, which is given
below.
• Initially, we are having the address of the root node. First, we will
apply BST to search the node. Since 30 is greater than 10 and 20,
which means that 30 is the right child of node 20. Node 30 is a leaf
node and Red in color, so it is simply deleted from the tree.
Case 1 – Example 2
• If we want to delete an internal node that has one child: first, replace the value
of the internal node with the value of the child node, and then simply delete the
child node.
• Let's take another example in which we want to delete an internal node, i.e.,
node 20.
• We cannot delete an internal node directly; we can only replace the value of that
node with another value. Node 20 is at the right of the root node, and it has only
one child, node 30. So, node 20 is replaced with the value 30, but the color of the
node remains the same, i.e., Black. In the end, the leaf node, i.e., node 30, is
deleted from the tree.
Case 1 – Example 3
• If we want to delete the internal node that has two child nodes.
• In this case, we have to decide from which we have to replace the value of the
internal node (either left subtree or right subtree).
• We have two ways:
– Inorder predecessor: We will replace with the largest value that exists in the left subtree.
– Inorder successor: We will replace with the smallest value that exists in the right subtree.
• Suppose we want to delete node 30 from the tree, which is shown below:
• Node 30 is at the right of the root node. In this case, we will use the inorder
successor. The value 38 is the smallest value in the right subtree, so we will
replace the value 30 with 38, but the color of the node would remain the same, i.e., Red.
After replacement, the leaf node, i.e., 30, would be deleted from the tree.
• Since node 30 is a leaf node and Red in color, we need to delete it (we do not have
to perform any rotations or any recoloring).
Case 2
• Case 2: If the root node is also double black,
then simply remove the double black and
make it a single black.
Case 3
• Case 3: If the double black's sibling is black and
both its children are black.
1. Remove the double black node.
2. Add the color of the node to the parent (P) node.
• If the color of P is red then it becomes black.
• If the color of P is black, then it becomes double black.
3. The color of double black's sibling changes to red.
4. If still double black situation arises, then we will
apply other cases
Case 3 Example
• Let's understand this case through an
example.
• Suppose we want to delete node 15 in the
below tree.
Case 3 Example (Cntd..)
• We cannot simply delete node 15 from the tree as node 15
is Black in color. Node 15 has two children, which are nil.
So, we replace the 15 value with a nil value. As node 15 and
nil node are black in color, the node becomes double black
after replacement, as shown in the below figure.
Case 3 Example (Cntd..)
• In the above tree, we can observe that the double black's sibling is black in color
and its children are nil, which are also black. As the double black's sibling and its
children are all black, the sibling cannot give its black color to any of them. Now,
the double black's parent node is Red, so the double black node adds its black
color to its parent node. The color of node 20 changes to black, while the nil
node changes to a single black, as shown in the figure below.
• After adding the color to its parent node, the color of the double black's sibling,
i.e., node 30, changes to red, as shown in the figure below.
• In the above tree, we can observe that the double black problem no longer
exists, and it is again a Red-Black tree.
Case 4
• Case 4: If double black's sibling is Red.
1. Swap the color of its parent and its sibling.
2. Rotate the parent node in the double black's
direction.
3. Reapply cases.
Case 4 – Example
• Let's understand this case through an example.
• Suppose we want to delete node 15.
Case 4 – Example (Cntd.,)
• Initially, 15 is replaced with a nil value. After replacement, the
node becomes double black. Since the double black's sibling is Red,
the color of node 20 changes to Red and the color of node 30
changes to Black.
• Once the swapping of the colors is complete, the rotation towards
the double black is performed. Node 30 moves
upwards and node 20 moves downwards, as shown in the
figure below.
4/6/2022 4.3 _ Red Black Tree 95
Case 4 – Example (Cntd.,)
• In the above tree, we can observe that the double black situation still exists.
It now satisfies case 3, in which the double black's sibling is black and
both of the sibling's children are black. First, we remove the double black from
the node and add the black color to its parent node. Finally, the color
of the double black's sibling, i.e., node 25, changes to red as shown in the
below figure.
• In the above tree, we can observe that the double black situation has
been resolved. It also satisfies the properties of the Red Black tree.
4/6/2022 4.3 _ Red Black Tree 96
Case 5
• Case 5: If the double black's sibling is black,
the sibling's child that is far from the double black
is black, and the child near the double black is red.
1. Swap the color of double black's sibling and the
sibling child which is nearer to the double black
node.
2. Rotate the sibling in the opposite direction of the
double black.
3. Apply case 6
4/6/2022 4.3 _ Red Black Tree 97
Case 5 Example
• Suppose we want to delete the node 1 in the
below tree.
4/6/2022 4.3 _ Red Black Tree 98
Case 5 Example (Cntd..)
• First, we replace the value 1 with a nil value. The node becomes
double black as both nodes, i.e., 1 and nil, are black. This satisfies
case 3, which applies when the DB's sibling is black and both its children
are black. First, we remove the double black from the nil node. Since
the parent of the DB is black, adding the black color to the
parent makes it double black in turn. After adding the color,
the double black's sibling's color changes to red as shown below.
4/6/2022 4.3 _ Red Black Tree 99
Case 5 Example (Cntd..)
• We can observe in the above figure that the double black problem
still exists in the tree, so we reapply the cases. We apply case 5
because the sibling of node 5 is node 30, which is black; the child
of node 30 that is far from node 5 is black; and the child of node 30
that is near node 5 is red. In this case, we first swap the colors of
node 30 and node 25, so node 30 becomes red and
node 25 becomes black, as shown below.
• As we can observe in the above tree, the double black situation still exists,
so we need case 6. Let's first see what case 6 is.
4/6/2022 4.3 _ Red Black Tree 100
Case 5 Example (Cntd..)
• Once the swapping of the color between the nodes is
completed, we need to rotate the sibling in the
opposite direction of the double black node. In this
rotation, the node 30 moves downwards while the
node 25 moves upwards as shown below.
4/6/2022 4.3 _ Red Black Tree 101
Case 6
• Case 6: If the double black's sibling is black
and its far child is red:
1. Swap the color of Parent and its sibling node.
2. Rotate the parent towards the Double black's
direction
3. Remove Double black
4. Change the Red color to black.
4/6/2022 4.3 _ Red Black Tree 102
Case 6 Example
• Now we will apply case 6 in the above example to solve the double
black's situation.
• In the above example, the double black is node 5, and the sibling of
node 5 is node 25, which is black in color. The far child of the
double black node is node 30, which is Red in color as shown in the
below figure:
4/6/2022 4.3 _ Red Black Tree 103
Case 6 Example (Cntd..)
• First, we will swap the colors of the parent and its sibling. The parent of
node 5 is node 10, and the sibling node is node 25. Both nodes
are already black, so no swapping occurs.
• In the second step, we need to rotate the parent in the double
black's direction. After rotation, node 25 will move upwards,
whereas node 10 will move downwards. Once the rotation is
performed, the tree would look as shown in the below figure:
4/6/2022 4.3 _ Red Black Tree 104
Case 6 Example (Cntd..)
• In the next step, we will remove double black from
node 5 and node 5 will give its black color to the far
child, i.e., node 30. Therefore, the color of node 30
changes to black as shown in the below figure.
4/6/2022 4.3 _ Red Black Tree 105
4.4 _ Priority Queues (Heaps)
Why we need Priority Queue?
Case 1 : Printer Job
• Although jobs sent to a line printer are generally
placed on a queue, this might not always be the best
thing to do.
• For instance, one job might be particularly important,
so that it might be desirable to allow that job to be run
as soon as the printer is available.
• Conversely, if, when the printer becomes available,
there are several one-page jobs and one hundred-
page job, it might be reasonable to make the long job
go last, even if it is not the last job submitted.
• (Unfortunately, most systems do not do this, which
can be particularly annoying at times.)
4/6/2022 4.4 _ Priority Queues (Heaps) 109
Why we need Priority Queue?
Case 2 : Multiuser Operating System Job Scheduler
• Similarly, in a multiuser environment, the operating system scheduler
must decide which of several processes to run.
• Generally a process is only allowed to run for a fixed period of time. One
algorithm uses a queue.
• Jobs are initially placed at the end of the queue.
• The scheduler will repeatedly take the first job on the queue, run it until
either it finishes or its time limit is up, and place it at the end of the queue
if it does not finish.
• This strategy is generally not appropriate, because very short jobs will
seem to take a long time because of the wait involved to run.
• Generally, it is important that short jobs finish as fast as possible, so these
jobs should have preference over jobs that have already been running.
• Furthermore, some jobs that are not short are still very important and
should also have preference.
• This particular application seems to require a special kind of queue,
known as a priority queue.
4/6/2022 4.4 _ Priority Queues (Heaps) 110
Priority Queue - Model
• A priority queue is a data structure that allows at least the following
two operations:
– insert, which does the obvious thing, and
– delete_min, which finds, returns and removes the minimum element
in the heap.
• The insert operation is the equivalent of enqueue, and delete_min
is the priority queue equivalent of the queue's dequeue operation.
• The delete_min function also alters its input.
• Current thinking in the software engineering community suggests
that this is no longer a good idea.
• However, we will continue to use this function because of historical
reasons--many programmers expect delete_min to operate this
way.
4/6/2022 4.4 _ Priority Queues (Heaps) 111
Priority Queue - Positives
• Priority queues have many applications
besides operating systems.
• Priority queues are used for external sorting.
• Priority queues are also important in the
implementation of greedy algorithms, which
operate by repeatedly finding a minimum.
4/6/2022 4.4 _ Priority Queues (Heaps) 112
Priority Queues Implementation using Linked List
• There are several obvious ways to implement a priority
queue.
• We could use a simple linked list,
– performing insertions at the front in O(1), and
– traversing the list to delete the minimum, which
requires O(n) time.
• Alternatively, we could insist that the list be always kept
sorted;
– this makes insertions expensive (O(n)) and
– delete_mins cheap (O(1)).
• The former is probably the better idea of the two, based
on the fact that there are never more delete_mins than
insertions.
4/6/2022 4.4 _ Priority Queues (Heaps) 113
Priority Queues Implementation Using BST
• Another way of implementing priority queues would be to use a binary
search tree.
• This gives an O(log n) average running time for both operations.
• This is true even though the insertions are random
while the deletions are not.
• Recall that the only element we ever delete is the minimum.
• Repeatedly removing a node that is in the left subtree would seem to
hurt the balance of the tree by making the right subtree heavy.
• However, the right subtree is random. In the worst case, where the
delete_mins have depleted the left subtree, the right subtree would have
at most twice as many elements as it should.
• This adds only a small constant to its expected depth. Notice that the
bound can be made into a worst-case bound by using a balanced tree;
this protects one against bad insertion sequences.
• We will then discuss how to implement heaps to support efficient
merging.
4/6/2022 4.4 _ Priority Queues (Heaps) 114
Binary Heap
• The implementation we will use is known as a binary heap.
• Its use is so common for priority queue implementations
that when the word heap is used without a qualifier, it is
generally assumed to be referring to this implementation
of the data structure.
• In this section, we will refer to binary heaps as merely
heaps.
• Like binary search trees, heaps have two properties,
– namely, a structure property and
– a heap order property.
• As with AVL trees, an operation on a heap can destroy one
of the properties, so a heap operation must not terminate
until all heap properties are in order. This turns out to be
simple to do.
4/6/2022 4.4 _ Priority Queues (Heaps) 115
Structure Property
• A heap is a binary tree that is completely filled, with the possible
exception of the bottom level, which is filled from left to right.
• Such a tree is known as a complete binary tree.
• A complete binary tree of height h has between 2^h and 2^(h+1) - 1 nodes.
• This implies that the height of a complete binary tree is ⌊log n⌋, which is
clearly O(log n).
• An important observation is that because a complete binary tree is so
regular, it can be represented in an array and no pointers are necessary.
4/6/2022 4.4 _ Priority Queues (Heaps) 116
Structure Property
• For any element in array position i,
– the left child is in position 2i,
– the right child is in the cell after the left child (2i + 1), and
– the parent is in position ⌊i/2⌋.
4/6/2022 4.4 _ Priority Queues (Heaps) 117
Structure Property
• Thus not only are pointers not required, but the operations
required to traverse the tree are extremely simple and likely to be
very fast on most computers.
• The only problem with this implementation is that an estimate of
the maximum heap size is required in advance, but typically this is
not a problem.
• In the figure above, the limit on the heap size is 13 elements.
• The array has a position 0; more on this later.
• A heap data structure will, then, consist of an array (of whatever
type the key is) and integers representing the maximum and
current heap size.
• we shall draw the heaps as trees, with the implication that an
actual implementation will use simple arrays.
4/6/2022 4.4 _ Priority Queues (Heaps) 118
Declaration of Priority Queue
struct heap_struct
{
/* Maximum # that can fit in the heap */
unsigned int max_heap_size;
/* Current # of elements in the heap */
unsigned int size;
element_type *elements;
};
typedef struct heap_struct *PRIORITY_QUEUE;
4/6/2022 4.4 _ Priority Queues (Heaps) 119
Empty Heap Creation
PRIORITY_QUEUE create_pq( unsigned int max_elements )
{
PRIORITY_QUEUE H;
if( max_elements < MIN_PQ_SIZE )
error("Priority queue size is too small");
H = (PRIORITY_QUEUE) malloc ( sizeof (struct heap_struct) );
if( H == NULL )
fatal_error("Out of space!!!");
/* Allocate the array + one extra for sentinel */
H->elements = (element_type *) malloc ( ( max_elements+1) * sizeof (element_type) );
if( H->elements == NULL )
fatal_error("Out of space!!!");
H->max_heap_size = max_elements;
H->size = 0;
H->elements[0] = MIN_DATA;
return H;
}
4/6/2022 4.4 _ Priority Queues (Heaps) 120
Heap Order Property
• The property that allows operations to be performed
quickly is the heap order property.
• Since we want to be able to find the minimum quickly,
it makes sense that the smallest element should be at
the root.
• If we consider that any subtree should also be a heap,
then any node should be smaller than all of its
descendants.
• Applying this logic, we arrive at the heap order
property.
4/6/2022 4.4 _ Priority Queues (Heaps) 121
Heap Order Property
• In a heap, for every node X, the key in the parent of X is smaller
than (or equal to) the key in X, with the obvious exception of the
root (which has no parent).
• In below Figure the tree on the left is a heap, but the tree on the
right is not (the dashed line shows the violation of heap order).
• By the heap order property, the minimum element can always be
found at the root. Thus, we get the extra operation, find_min, in
constant time.
4/6/2022 4.4 _ Priority Queues (Heaps) 122
Basic Heap Operations
• It is easy (both conceptually and practically) to
perform the two required operations.
• All the work involves ensuring that the heap
order property is maintained.
– Insert
– Delete_min
4/6/2022 4.4 _ Priority Queues (Heaps) 123
Insert
• To insert an element x into the heap, we create a
hole in the next available location, since
otherwise the tree will not be complete.
• If x can be placed in the hole without violating
heap order, then we do so and are done.
• Otherwise we slide the element that is in the
hole's parent node into the hole, thus bubbling
the hole up toward the root.
• We continue this process until x can be placed in
the hole.
4/6/2022 4.4 _ Priority Queues (Heaps) 124
Insert
• Figure shows that to insert 14, we create a hole in the next
available heap location.
• Inserting 14 in the hole would violate the heap order property, so
31 is slid down into the hole. This strategy is continued in next
Figure until the correct location for 14 is found.
• This general strategy is known as a percolate up; the new element
is percolated up the heap until the correct location is found.
4/6/2022 4.4 _ Priority Queues (Heaps) 125
Insert
4/6/2022 4.4 _ Priority Queues (Heaps) 126
Insert
• Insertion is easily implemented with the code shown.
• We could have implemented the percolation in the
insert routine by performing repeated swaps until the
correct order was established, but a swap requires
three assignment statements.
• If an element is percolated up d levels, the number of
assignments performed by the swaps would be 3d.
• Our method uses d + 1 assignments.
4/6/2022 4.4 _ Priority Queues (Heaps) 127
Insertion - Code
/* H->elements[0] is a sentinel */
void insert( element_type x, PRIORITY_QUEUE H )
{
unsigned int i;
if( is_full( H ) )
error("Priority queue is full");
else
{
i = ++H->size;
while( H->elements[i/2] > x )
{
H->elements[i] = H->elements[i/2];
i /= 2;
}
H->elements[i] = x;
}
}
4/6/2022 4.4 _ Priority Queues (Heaps) 128
Sentinel
• If the element to be inserted is the new minimum, it will be pushed all the
way to the top.
• At some point, i will be 1 and we will want to break out of the while loop.
We could do this with an explicit test, but we have chosen to put a very
small value in position 0 in order to make the while loop terminate.
• This value must be guaranteed to be smaller than (or equal to) any
element in the heap; it is known as a sentinel.
• This idea is similar to the use of header nodes in linked lists.
• By adding a dummy piece of information, we avoid a test that is executed
once per loop iteration, thus saving some time.
• The time to do the insertion could be as much as O (log n), if the element
to be inserted is the new minimum and is percolated all the way to the
root.
• On average, the percolation terminates early; it has been shown that
2.607 comparisons are required on average to perform an insert, so the
average insert moves an element up 1.607 levels.
4/6/2022 4.4 _ Priority Queues (Heaps) 129
Delete _ min
• Delete_mins are handled in a similar manner as insertions.
• Finding the minimum is easy; the hard part is removing it.
• When the minimum is removed, a hole is created at the
root.
• Since the heap now becomes one smaller, it follows that
the last element x in the heap must move somewhere in
the heap.
• If x can be placed in the hole, then we are done.
• This is unlikely, so we slide the smaller of the hole's children
into the hole, thus pushing the hole down one level.
• We repeat this step until x can be placed in the hole.
• Thus, our action is to place x in its correct spot along a path
from the root containing minimum children.
4/6/2022 4.4 _ Priority Queues (Heaps) 130
Delete _ min
• The left figure shows a heap prior to the
delete_min.
• After 13 is removed, we must now try to place 31 in the
heap.
• 31 cannot be placed in the hole, because this would
violate heap order.
4/6/2022 4.4 _ Priority Queues (Heaps) 131
Delete _ min
• Thus, we place the smaller child (14) in the hole,
sliding the hole down one level.
• We repeat this again, placing 19 into the hole
and creating a new hole one level deeper.
4/6/2022 4.4 _ Priority Queues (Heaps) 132
Delete _ min
• We then place 26 in the hole and create a new hole
on the bottom level.
• Finally, we are able to place 31 in the hole.
• This general strategy is known as a percolate down.
• We use the same technique as in the insert routine to
avoid the use of swaps in this routine.
4/6/2022 4.4 _ Priority Queues (Heaps) 133
Delete – min : Function
• A frequent implementation error in heaps occurs
– when there are an even number of elements in the heap, and
– the one node that has only one child is encountered.
• You must make sure not to assume that there are always two
children, so this usually involves an extra test.
• In the code, depicted in next slide we've done this test at line 8.
• One extremely tricky solution is always to ensure that your
algorithm thinks every node has two children.
• Do this by placing a sentinel, of value higher than any in the heap,
at the spot after the heap ends, at the start of each percolate down
when the heap size is even.
• You should think very carefully before attempting this, and you
must put in a prominent comment if you do use this technique.
4/6/2022 4.4 _ Priority Queues (Heaps) 134
Delete _ min : function
element_type delete_min( PRIORITY_QUEUE H )
{
    unsigned int i, child;
    element_type min_element, last_element;
    if( is_empty( H ) )
    {
        error("Priority queue is empty");
        return H->elements[0];
    }
    min_element = H->elements[1];
    last_element = H->elements[H->size--];
    for( i=1; i*2 <= H->size; i=child )
    {
        /* find smaller child */
        child = i*2;
        if( ( child != H->size ) && ( H->elements[child+1] < H->elements[child] ) ) /* LINE 8 */
            child++;
        /* percolate one level */
        if( last_element > H->elements[child] )
            H->elements[i] = H->elements[child];
        else
            break;
    }
    H->elements[i] = last_element;
    return min_element;
}
4/6/2022 4.4 _ Priority Queues (Heaps) 135
Heap – Time Complexity
• Although this eliminates the need to test for the
presence of a right child, you cannot eliminate
the requirement that you test when you reach
the bottom because this would require a sentinel
for every leaf.
• The worst-case running time for this operation is
O(log n).
• On average, the element that is placed at the root
is percolated almost to the bottom of the heap
(which is the level it came from), so the average
running time is O (log n).
4/6/2022 4.4 _ Priority Queues (Heaps) 136
Other Heap Operations
• Decrease_key
• Increase_key
• Delete
• Build_heap
4/6/2022 4.4 _ Priority Queues (Heaps) 137
Applications of Priority Queues
• Operating systems design.
• Priority queues are used to implement several
graph algorithms efficiently.
• The Selection Problem.
• Event Simulation.
4/6/2022 4.4 _ Priority Queues (Heaps) 138
d-Heaps
• Binary heaps are so simple that they are almost always
used when priority queues are needed.
• A simple generalization is a d-heap, which is exactly like a
binary heap except that all nodes have d children (thus, a
binary heap is a 2-heap).
• Below Figure shows a 3-heap.
4/6/2022 4.4 _ Priority Queues (Heaps) 139
d - Heaps
• Notice that a d-heap is much shallower than a binary heap, improving the running time of
inserts to O(log_d n).
• However, the delete_min operation is more expensive, because even though the tree is shallower,
the minimum of d children must be found, which takes d - 1 comparisons using a standard
algorithm.
• This raises the time for this operation to O(d log_d n). If d is a constant, both running times are, of
course, O(log n).
• Furthermore, although an array can still be used, the multiplications and divisions to find children
and parents are now by d, which seriously increases the running time, because we can no longer
implement division by a bit shift.
• d-heaps are interesting in theory, because there are many algorithms where the number of
insertions is much greater than the number of delete_mins (and thus a theoretical speedup is
possible).
• They are also of interest when the priority queue is too large to fit entirely in main memory.
• In this case, a d-heap can be advantageous in much the same way as B-trees.
• The most glaring weakness of the heap implementation, aside from the inability to perform finds,
is that combining two heaps into one is a hard operation. This extra operation is known as a merge.
• There are quite a few ways of implementing heaps so that the running time of a merge is O(log n).
• We will now discuss three data structures, of various complexity, that support the merge operation
efficiently, deferring any complicated analysis until later.
4/6/2022 4.4 _ Priority Queues (Heaps) 140
Leftist Heaps
• It is a priority queue implemented with a variant
of a binary heap.
• Every node has an s-value (or rank or
distance) which is the distance to the nearest
leaf.
• In contrast to a binary heap (which is always
a complete binary tree), a leftist tree may be
very unbalanced.
• It is a binary tree with the following properties:
– Normal Min Heap Property: key(i) >= key(parent(i))
– Heavier on Left side: dist(left(i)) >= dist(right(i))
4/6/2022 4.4 _ Priority Queues (Heaps) 141
Time Complexities of Leftist Tree / Heap
4/6/2022 4.4 _ Priority Queues (Heaps) 142
Leftist Heap – NPL or Distance or Rank
• NPL : Null Path Length
– NPL(NULL) = -1
– NPL(leaf or node with a single child) = 0
– NPL(other nodes) = 1 + min(NPL(left), NPL(right))
– NPL(left) >= NPL(right)
4/6/2022 4.4 _ Priority Queues (Heaps) 143
Leftist Heap – Example and Node
Structure
4/6/2022 4.4 _ Priority Queues (Heaps) 144
Leftist Heap - Example
4/6/2022 4.4 _ Priority Queues (Heaps) 145
Not a Leftist Heap - Example
4/6/2022 4.4 _ Priority Queues (Heaps) 146
Leftist Heap - Operation
• The main operation is merge().
• deleteMin() or extractMin() can be done by removing
root and calling merge() for left and right subtrees.
• insert() can be done by creating a leftist tree with a single
key (the key to be inserted) and calling merge() for the given
tree and the single-node tree.
4/6/2022 4.4 _ Priority Queues (Heaps) 147
Idea behind Merging
• Since the right subtree is the shorter one, the idea is to merge
the right subtree of one tree with the other tree. Below are the
abstract steps.
– Put the root with smaller value as the new root.
– Hang its left subtree on the left.
– Recursively merge its right subtree and the other
tree.
– Before returning from recursion:
• Update dist() of merged root.
• Swap left and right subtrees just below root, if needed, to
keep leftist property of merged
result.
4/6/2022 4.4 _ Priority Queues (Heaps) 148
Detailed Steps for Merge:
1. Compare the roots of two heaps.
2. Push the smaller key into an empty stack, and move to
the right child of smaller key.
3. Recursively compare two keys and go on pushing the
smaller key onto the stack and move to its right child.
4. Repeat until a null node is reached.
5. Take the last node processed and make it the right child
of the node at top of the stack, and convert it to leftist
heap if the properties of leftist heap are violated.
6. Recursively go on popping the elements from the stack
and making them the right child of new stack top.
4/6/2022 4.4 _ Priority Queues (Heaps) 149
Merge - Example
• Consider two leftist heaps given below:
4/6/2022 4.4 _ Priority Queues (Heaps) 150
Merge - Example
4/6/2022 4.4 _ Priority Queues (Heaps) 151
Merge - Example
• The subtree at node 7 violates the property of
leftist heap so we swap it with the left child
and retain the property of leftist heap.
4/6/2022 4.4 _ Priority Queues (Heaps) 152
• Convert to a leftist heap. Repeat the process.
• The time complexity of this algorithm is
O(log n) in the worst case, where n is the number of
nodes in the leftist heap.
Merge - Example
4/6/2022 4.4 _ Priority Queues (Heaps) 153
Merge - Example 2
4/6/2022 4.4 _ Priority Queues (Heaps) 154
Skew Heap
• Problems with the leftist heap:
– Extra space for the NPL field.
– Two-pass merge (with a stack).
– Extra complexity / logic to maintain and check the NPL.
• Solution: the skew heap
– A blind adjusting version of the leftist heap.
– Amortized time for merge, insert, and delete_min is O(log n).
– Worst-case time for all three operations is O(n).
– Merge always switches children when fixing the right path.
– The iterative merge method makes only one pass.
4/6/2022 4.4 _ Priority Queues (Heaps) 155
Skew Heap
• A skew heap is a heap data structure implemented as a
binary tree.
• Skew heaps are advantageous because of their ability
to merge more quickly than binary heaps.
• In contrast with binary heaps, there are no structural
constraints, so there is no guarantee that the height of
the tree is logarithmic.
• Only two conditions must be satisfied:
– The general heap order must be enforced
– Every operation (insert, delete_min, merge) on two skew
heaps must be done using a special skew heap merge.
4/6/2022 4.4 _ Priority Queues (Heaps) 156
Skew Heap
• A skew heap is a self-adjusting form of a leftist heap which
attempts to maintain balance by unconditionally swapping all
nodes in the merge path when merging two heaps.
• (The merge operation is also used when inserting and deleting
values.)
• With no structural constraints, it may seem that a skew heap would
be horribly inefficient.
• However, amortized complexity analysis can be used to
demonstrate that all operations on a skew heap can be done in
O(log n).
• Skew heaps may be described with the following recursive
definition:
– A heap with only one element is a skew heap.
– The result of skew merging two skew heaps Sh1 and Sh2 is also a skew
heap.
4/6/2022 4.4 _ Priority Queues (Heaps) 157
Skew Heap – Merge()
• When two skew heaps are to be merged
together, we can use the same process as the
merge of two leftist heaps:
1. Compare roots of two heaps
2. Recursively merge the heap that has the larger
root with the right subtree of the other heap.
3. Make the resulting merge the right subtree of
the heap that has smaller root.
4. Swap the children of the new heap
4/6/2022 4.4 _ Priority Queues (Heaps) 158
Merging Two Skew Heaps
4/6/2022 4.4 _ Priority Queues (Heaps) 159
Example
4/6/2022 4.4 _ Priority Queues (Heaps) 160
Skew Heap Code
/* recursive skew-heap merge, assuming nodes with key, left and right fields */
SkewHeap *merge( SkewHeap *heap1, SkewHeap *heap2 )
{
    if( heap1 == NULL )
        return heap2;
    if( heap2 == NULL )
        return heap1;
    if( heap1->key <= heap2->key )
    {
        /* swap children: the old right subtree is merged with heap2
           and the result becomes the new left subtree */
        SkewHeap *temp = heap1->right;
        heap1->right = heap1->left;
        heap1->left = merge( heap2, temp );
        return heap1;
    }
    /* otherwise heap2 has the smaller root */
    return merge( heap2, heap1 );
}
4/6/2022 4.4 _ Priority Queues (Heaps) 161
Skew Heap – Merge() (Non-Recursive)
• Alternatively, there is a non-recursive approach which
tends to be a little clearer, but does require some sorting
at the outset.
1. Split each heap into subtrees by cutting every rightmost path.
(From the root node, sever the right node and make the right
child its own subtree.) This will result in a set of trees in which
the root either only has a left child or no children at all.
2. Sort the subtrees in ascending order based on the value of
the root node of each subtree.
3. While there are still multiple subtrees, iteratively recombine
the last two (from right to left).
1. If the root of the second-to-last subtree has a left child, swap it to
be the right child.
2. Link the root of the last subtree as the left child of the second-to-
last subtree.
4/6/2022 4.4 _ Priority Queues (Heaps) 162
Skew Heap – Merge() (Non-Recursive)
- Example
4/6/2022 4.4 _ Priority Queues (Heaps) 163
Skew Heap – Merge() (Non-Recursive)
- Example
4/6/2022 4.4 _ Priority Queues (Heaps) 164
Skew Heap – Merge() (Non-Recursive)
- Example
4/6/2022 4.4 _ Priority Queues (Heaps) 165
Skew Heap – Merge() (Non-Recursive)
- Example
4/6/2022 4.4 _ Priority Queues (Heaps) 166
Skew Heap – Merge() (Non-Recursive)
- Example
4/6/2022 4.4 _ Priority Queues (Heaps) 167
Skew Heap - Example 2
4/6/2022 4.4 _ Priority Queues (Heaps) 168
Skew Heap - Example 2
4/6/2022 4.4 _ Priority Queues (Heaps) 169
Skew Heap - Example 2
4/6/2022 4.4 _ Priority Queues (Heaps) 170
Skew Heap Time Complexity
• A skew heap is a self-adjusting version of a leftist heap that is
incredibly simple to implement.
• The relationship of skew heaps to leftist heaps is analogous to the
relation between splay trees and AVL trees.
• Skew heaps are binary trees with heap order, but there is no
structural constraint on these trees.
• Unlike leftist heaps, no information is maintained about the null
path length of any node.
• The right path of a skew heap can be arbitrarily long at any time, so
the worst-case running time of all operations is O(n).
• However, as with splay trees, it can be shown that for any m
consecutive operations, the total worst-case running time is O(m
log n).
• Thus, skew heaps have O(log n) amortized cost per operation.
4/6/2022 4.4 _ Priority Queues (Heaps) 171

Advanced Trees

  • 8. Splaying • A splay tree is a self-balancing tree, but AVL and Red-Black trees are also self-balancing trees. • What makes the splay tree unique among them is one extra operation: splaying. • A splay tree supports the same operations as a binary search tree, i.e., insertion, deletion and searching, but it also performs one more operation, i.e., splaying. • So every operation on a splay tree is followed by splaying. • Splay trees are not strictly balanced trees; they are only roughly balanced. Let's understand the search operation in the splay tree. 4/6/2022 4.1 _ Splay Trees 8
  • 9. Example • Suppose we want to search 7 element in the tree, which is shown below: 4/6/2022 4.1 _ Splay Trees 9
  • 10. Rearrangements • To search for any element in a splay tree, we first perform the standard binary search tree search. • Since 7 is less than 10, we go to the left of the root node. • After performing the search operation, we must perform splaying. • Splaying means that the element on which we perform an operation is moved to the root through a series of rearrangements. • The rearrangement of the tree is done through rotations. 4/6/2022 4.1 _ Splay Trees 10
  • 11. Rotations • There are six types of rotations used for splaying: 1. Zig rotation (Right rotation) 2. Zag rotation (Left rotation) 3. Zig zag (Zig followed by zag) 4. Zag zig (Zag followed by zig) 5. Zig zig (two right rotations) 6. Zag zag (two left rotations) 4/6/2022 4.1 _ Splay Trees 11
  • 12. Cases for the Rotations • Case 1: If the node has no grandparent and it is the right child of its parent, a left (zag) rotation is performed; otherwise, a right (zig) rotation is performed. • Case 2: If the node has a grandparent, the rotation is chosen based on the following scenarios: – Scenario 1: If the node is the left child of its parent and the parent is the left child of the grandparent, a zig zig (two right rotations) is performed. – Scenario 2: If the node is the right child of its parent and the parent is the right child of the grandparent, a zag zag (two left rotations) is performed. – Scenario 3: If the node is the right child of a parent that is the left child of the grandparent, a zag zig rotation is performed. – Scenario 4: If the node is the left child of a parent that is the right child of the grandparent, a zig zag rotation is performed. 4/6/2022 4.1 _ Splay Trees 12
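The two single rotations that all six splaying cases are built from can be sketched in C. The node layout and function names here are assumptions, since the slides give no code; a zig zig then just applies two right rotations and a zag zag two left rotations.

```c
#include <stddef.h>

/* Hypothetical node layout -- the slides do not define one. */
typedef struct Node {
    int key;
    struct Node *left, *right;
} Node;

/* Zig: single right rotation about n; returns the new subtree root. */
Node *zig(Node *n) {
    Node *l = n->left;
    n->left = l->right;   /* l's right subtree becomes n's left subtree */
    l->right = n;         /* n becomes l's right child */
    return l;
}

/* Zag: single left rotation about n; returns the new subtree root. */
Node *zag(Node *n) {
    Node *r = n->right;
    n->right = r->left;   /* r's left subtree becomes n's right subtree */
    r->left = n;          /* n becomes r's left child */
    return r;
}
```

Each rotation preserves the binary-search-tree ordering while moving the child one level closer to the root, which is exactly what repeated splaying exploits.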
  • 13. Zig Rotation • Suppose we have to search for element 7 in the tree. 4/6/2022 4.1 _ Splay Trees 13
  • 14. Zag Rotation • Suppose we have to search for element 20 in the tree. 4/6/2022 4.1 _ Splay Trees 14
  • 15. Zig Zig Rotation • Suppose we have to search for element 1 in the tree. 4/6/2022 4.1 _ Splay Trees 15
  • 16. Zig Zag Rotation • Suppose we want to search for element 13 in the tree. 4/6/2022 4.1 _ Splay Trees 16
  • 17. Zag Zig Rotation • Suppose we want to search for element 9 in the tree. 4/6/2022 4.1 _ Splay Trees 17
  • 18. Zag Zag Rotation • Suppose we want to search for element 20 in the tree below. 4/6/2022 4.1 _ Splay Trees 18
  • 19. Advantages of Splay tree • In the splay tree, we do not need to store any extra information. – In contrast, AVL trees must store the balance factor of each node, which requires extra space, and – Red-Black trees must store one extra bit per node recording its color, either Red or Black. • It is among the fastest binary search trees for various practical applications; it is used in Windows NT and in GCC. • It provides better performance because frequently accessed nodes move nearer to the root, so those elements can be accessed quickly. – It is used in cache implementations: recently accessed data stays near the root, so we do not need to go to memory to access it, and access takes less time. 4/6/2022 4.1 _ Splay Trees 19
  • 20. Drawback of Splay tree • The major drawback of the splay tree is that it is not strictly balanced, only roughly balanced. • Sometimes a splay tree degenerates into a linear chain, so a single operation can take O(n) time. 4/6/2022 4.1 _ Splay Trees 20
  • 21. Thank you 4/6/2022 4.1 _ Splay Trees 21
  • 22. UNIT IV : ADVANCED TREES By Mr.S.Selvaraj Asst. Professor (SRG) / CSE Kongu Engineering College Perundurai, Erode, Tamilnadu, India Thanks to and Resource from : Data Structures and Algorithm Analysis in C by Mark Allen Weiss & Sumitabha Das, “Computer Fundamentals and C Programming”, 1st Edition, McGraw Hill, 2018. 20CST32 – Data Structures
  • 23. Unit IV : Contents 1. Splay Trees 2. B tree 3. Red-Black Trees: – Rotation – Insertion – Deletion 4. Priority Queues(Heaps) – Binary heap – d-heaps – Leftist heaps – Skew heaps 4/6/2022 23 4.2 _ B Tree
  • 24. B Tree • A B tree is a self-balancing m-way tree, where m defines the order of the tree. • A B tree is a generalization of the binary search tree in which – a node can have more than one key and – more than two children, depending upon the value of m. • In a B tree, keys are kept in sorted order, with – lower values in the left subtree and – higher values in the right subtree. • In a B tree, all the leaf nodes must be at the same level, whereas in a binary tree the leaf nodes can be at different levels. 4/6/2022 4.2 _ B Tree 24
  • 25. B Tree Properties • If the B tree has order m, then: 1. Children: – Each node can have at most m children. – For minimum children: • leaf nodes have zero children, • the root node has at least two children, and • each internal node has at least ceiling(m/2) children. 2. Keys: – Each node can have at most (m-1) keys. • For example, if m is 5, then the maximum number of keys is 4. – For minimum keys: • the root node has at least one key, and • every other node has at least (ceiling(m/2) - 1) keys. • When we perform insertion in a B tree, the key is always inserted in a leaf node. 4/6/2022 4.2 _ B Tree 25
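The key and children bounds above can be written down directly as arithmetic on the order m. A minimal sketch, using integer arithmetic for the ceiling (the function names are illustrative, not from the slides):

```c
/* Bounds for a B tree of order m: at most m children and m-1 keys per node. */
int max_children(int m) { return m; }
int max_keys(int m)     { return m - 1; }

/* (m + 1) / 2 computes ceiling(m/2) with integer division. */
int min_children(int m) { return (m + 1) / 2; }     /* internal, non-root nodes */
int min_keys(int m)     { return (m + 1) / 2 - 1; } /* non-root nodes; root needs 1 */
```

For example, with m = 5 the maximum number of keys is 4, each internal node needs at least 3 children, and each non-root node needs at least 2 keys.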
  • 26. Example • Suppose we want to create a B tree of order 3 by inserting values from 1 to 10. 4/6/2022 4.2 _ B Tree 26
  • 27. • Step 1: First, we create a node with 1 value 4/6/2022 4.2 _ B Tree 27
  • 28. • Step 2: The next element is 2, which comes after 1 4/6/2022 4.2 _ B Tree 28
  • 29. • Step 3: The next element is 3, and it is inserted after 2. • Since each node can have at most 2 keys, we split this node at the middle element. • The middle element is 2, so it moves up to its parent. • Node 2 has no parent, so it becomes the root node. 4/6/2022 4.2 _ B Tree 29
  • 30. • Step 4: The next element is 4. Since 4 is greater than 2 and 3, so it will be added after the 3 4/6/2022 4.2 _ B Tree 30
  • 31. • Step 5: The next element is 5. Since 5 is greater than 2, 3 and 4, it is added after 4. • Since each node can have at most 2 keys, we split this node at the middle element. • The middle element is 4, so it moves up to its parent. The parent is node 2; therefore, 4 is added after 2. 4/6/2022 4.2 _ B Tree 31
  • 32. • Step 6: The next element is 6. Since 6 is greater than 2, 4 and 5, so 6 will come after 5 4/6/2022 4.2 _ B Tree 32
  • 33. • Step 7: The next element is 7. Since 7 is greater than 2, 4, 5 and 6, it comes after 6. • Since each node can have at most 2 keys, we split this node at the middle element. The middle element is 6, so it moves up to its parent. • But 6 cannot simply be added after 4, because that node can also have at most 2 keys, so we split it at its middle element too. The middle element is 4, so it moves up to its parent. As node 4 has no parent, it becomes the root node. 4/6/2022 4.2 _ B Tree 33
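The splits in steps 3, 5, and 7 all follow the same pattern for order 3: an overfull node of three sorted keys is split around its middle key, which is promoted to the parent, leaving two one-key nodes. A minimal sketch of that single step on a plain array (the function name is an assumption):

```c
/* Split an overfull order-3 node holding three sorted keys:
   the middle key moves up to the parent, and the outer keys
   become two separate one-key nodes. */
void split_order3(const int keys[3], int *up, int *left_key, int *right_key) {
    *left_key  = keys[0];
    *up        = keys[1];   /* promoted to the parent */
    *right_key = keys[2];
}
```

In step 3, splitting {1, 2, 3} promotes 2; in step 5, splitting {3, 4, 5} promotes 4. When the parent itself overflows, as in step 7, the same split is applied to it in turn.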
  • 34. • Step 8: ---------- • Step 9: ---------- • Step 10: ---------- • Can You Try........ 4/6/2022 4.2 _ B Tree 34
  • 35. Create a B tree of order 5 by inserting values from 1 to 20 4/6/2022 4.2 _ B Tree 35
  • 36. B+ tree • A B+ tree stores records efficiently by keeping them in an indexed manner using the B+ tree index structure. • Due to the multi-level indexing, data access becomes faster and easier. 4/6/2022 4.2 _ B Tree 36
  • 37. B+ tree Node Structure • The node structure of the B+ tree contains pointers and key values, as shown in the figure below. • As we can observe in the B+ tree node structure, it contains n-1 key values (k1 to kn-1) and n pointers (p1 to pn). • The search key values placed in the node are kept in sorted order; thus, if i < j then ki < kj. 4/6/2022 4.2 _ B Tree 37
  • 38. B+ Tree • Although all of the search trees we have seen so far are binary, there is a popular search tree that is not binary. This tree is known as a B-tree. • A B-tree is a specialized m-way tree widely used for disk access. • A B-tree of order m can have at most m-1 keys and m children. • One of the main reasons for using a B-tree is its ability to store a large number of keys in a single node, and large key values, while keeping the height of the tree relatively small. • A B-tree of order m has all the properties of an m-way tree. In addition, it has the following properties: – Every node in a B-tree contains at most m children. – Every node in a B-tree except the root node and the leaf nodes contains at least m/2 children. – The root node must have at least 2 children. – All leaf nodes must be at the same level. – All data is stored at the leaves. • It is not necessary that all nodes contain the same number of children, but each node must have at least m/2 children. 4/6/2022 4.2 _ B Tree 38
  • 39. B+ Tree Properties • Contained in each interior node are pointers p1, p2, . . . , pm to the children, and values k1, k2, . . . , km – 1, representing the smallest key found in the subtrees p2, p3, . . . , pm respectively. • Of course, some of these pointers might be NULL, and the corresponding ki would then be undefined. • For every node, all the keys in subtree p1 are smaller than the keys in subtree p2, and so on. • The leaves contain all the actual data, which is either the keys themselves or pointers to records containing the keys. • We will assume the former to keep our examples simple. • There are various definitions of B-trees that change this structure in mostly minor ways, but this definition is one of the popular forms. • We will also insist (for now) that the number of keys in a leaf is also between m/2 and m. • A B-tree of order 4 is more popularly known as a 2-3-4 tree, and a B-tree of order 3 is known as a 2-3 tree. • We will describe the operation of B-trees by using the special case of 2-3 trees. • Our starting point is the 2-3 tree that follows. 4/6/2022 4.2 _ B Tree 39
  • 40. Example of a B-tree of order 4. 4/6/2022 4.2 _ B Tree 40
  • 41. Example of a B-tree of order 3. 4/6/2022 4.2 _ B Tree 41 • We have drawn interior nodes (nonleaves) in ellipses, which contain the two pieces of data for each node. • A dash line as a second piece of information in an interior node indicates that the node has only two children. • Leaves are drawn in boxes, which contain the keys. • The keys in the leaves are ordered.
  • 42. Find • To perform a find, we start at the root and branch in one of (at most) three directions, depending on the relation of the key we are looking for to the two (possibly one) values stored at the node. 4/6/2022 4.2 _ B Tree 42
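The three-way branch described above can be sketched in C. This is an illustrative layout, not code from the slides: the struct name and fields are assumptions, with k1 the smallest key in subtree p2 and k2 the smallest in subtree p3 (k2 unused when the node has only two children, shown as a dash in the figures).

```c
#include <stddef.h>

/* Illustrative 2-3 tree node: interior nodes carry up to two guide
   values (k2 < 0 marks the "dash" case); leaves hold 2-3 keys. */
struct node23 {
    int is_leaf;
    int keys[3];                /* used only in leaves */
    int nkeys;
    int k1, k2;                 /* guide values in interior nodes */
    struct node23 *p1, *p2, *p3;
};

/* Return nonzero if x is stored in the tree rooted at t. */
int find23(struct node23 *t, int x)
{
    while (t != NULL && !t->is_leaf) {
        if (x < t->k1)
            t = t->p1;                     /* branch left  */
        else if (t->k2 < 0 || x < t->k2)
            t = t->p2;                     /* branch middle */
        else
            t = t->p3;                     /* branch right  */
    }
    if (t == NULL)
        return 0;
    for (int i = 0; i < t->nkeys; i++)     /* scan the leaf's keys */
        if (t->keys[i] == x)
            return 1;
    return 0;
}
```

At each interior node at most two comparisons pick one of (at most) three subtrees, matching the find procedure in the text.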
  • 43. Insert Case 1 • To perform an insert on a previously unseen key, x, we follow the path as though we were performing a find. • When we get to a leaf node, we have found the correct place to put x. • Thus, to insert a node with key 18, we can just add it to a leaf without causing any violations of the 2-3 tree properties. • The result is shown in the following figure. 4/6/2022 4.2 _ B Tree 43
  • 44. Insert Case 2 • Unfortunately, since a leaf can hold only two or three keys, this might not always be possible. • If we now try to insert 1 into the tree, we find that the node where it belongs is already full. • Placing our new key into this node would give it a fourth element which is not allowed. • This can be solved by making two nodes of two keys each and adjusting the information in the parent. 4/6/2022 4.2 _ B Tree 44
  • 45. Insert Case 3 • Unfortunately, this idea does not always work, as can be seen by an attempt to insert 19 into the current tree. • If we make two nodes of two keys each, we obtain the following tree. 4/6/2022 4.2 _ B Tree 45
  • 46. Case 3 Solution • This tree has an internal node with four children, but we only allow three per node. • The solution is simple. We merely split this node into two nodes with two children. Of course, this node might be one of three children itself, and thus splitting it would create a problem for its parent (which would now have four children), but we can keep on splitting nodes on the way up to the root until we either get to the root or find a node with only two children. • In our case, we can get by with splitting only the first internal node we see, obtaining the following tree. 4/6/2022 4.2 _ B Tree 46
  • 47. Insert Case 4 • If we now insert an element with key 28, we create a leaf with four children, which is split into two leaves of two children: 4/6/2022 4.2 _ B Tree 47
  • 48. Case 4 Solution • This creates an internal node with four children, which is then split into two children. • What we have done here is split the root into two nodes. • When we do this, we have a special case, which we finish by creating a new root. • This is how (the only way) a 2-3 tree gains height. 4/6/2022 4.2 _ B Tree 48
• 49. General Rule – B Tree • With general B-trees of order m, when a key is inserted, the only difficulty arises when the node that is to accept the key already has m keys. • This key gives the node m + 1 keys, which we can split into two nodes with ⌈(m + 1) / 2⌉ and ⌊(m + 1) / 2⌋ keys respectively. • As this gives the parent an extra child, we have to check whether this child can be accepted by the parent and split the parent if it already has m children. • We repeat this until we find a parent with fewer than m children. • If we split the root, we create a new root with two children. 4/6/2022 4.2 _ B Tree 49
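The two split sizes (one rounded up, one rounded down, together covering all m + 1 keys) can be sketched in C; the helper names are illustrative, and integer arithmetic stands in for the ceiling and floor:

```c
/* When a node of order m overflows to m + 1 keys, it is split into
   two nodes holding ceil((m+1)/2) and floor((m+1)/2) keys. */
int split_left(int m)  { return (m + 2) / 2; }  /* ceil((m+1)/2)  */
int split_right(int m) { return (m + 1) / 2; }  /* floor((m+1)/2) */
```

For a 2-3 tree (m = 3) this gives the 2-and-2 split used in the examples above.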
• 50. B Tree – Time Complexity • The depth of a B-tree is at most ⌈log_{⌈m/2⌉} n⌉. • At each node on the path, we perform O(log m) work to determine which branch to take (using a binary search), but an insert or delete could require O(m) work to fix up all the information at the node. • The worst-case running time for each of the insert and delete operations is thus O(m log_m n) = O((m / log m) log n), but a find takes only O(log n). • The best (legal) choice of m for running time considerations has been shown empirically to be either m = 3 or m = 4; this agrees with the bounds above, which show that as m gets larger, the insertion and deletion times increase. • If we are only concerned with main memory speed, higher order B-trees, such as 5-9 trees, are not an advantage. 4/6/2022 4.2 _ B Tree 50
• 51. Real Use of B-Tree • The real use of B-trees lies in database systems, where the tree is kept on a physical disk instead of main memory. • Accessing a disk is typically several orders of magnitude slower than any main memory operation. • If we use a B-tree of order m, then the number of disk accesses is O(log_m n). • Although each disk access carries the overhead of O(log m) to determine the direction to branch, the time to perform this computation is typically much smaller than the time to read a block of memory and can thus be considered inconsequential (as long as m is chosen reasonably). • Even if updates are performed and O(m) computing time is required at each node, this too is generally not significant. • The value of m is then chosen to be the largest value that still allows an interior node to fit into one disk block, and is typically in the range 32 ≤ m ≤ 256. • The maximum number of elements that are stored in a leaf is chosen so that if the leaf is full, it fits in one block. • This means that a record can always be found in very few disk accesses, since a typical B-tree will have a depth of only 2 or 3, and the root (and possibly the first level) can be kept in main memory. 4/6/2022 4.2 _ B Tree 51
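Choosing "the largest value that still allows an interior node to fit into one disk block" is a one-line computation; here is a C sketch, where the byte sizes and function name are illustrative assumptions, not values from the slides. An interior node needs m child pointers and m - 1 keys.

```c
/* Largest order m such that m pointers plus m-1 keys fit in one block:
   m*ptr + (m-1)*key <= block  =>  m <= (block + key) / (ptr + key). */
int largest_order(int block_bytes, int key_bytes, int ptr_bytes)
{
    return (block_bytes + key_bytes) / (ptr_bytes + key_bytes);
}
```

With an assumed 8 KB block, 32-byte keys, and 8-byte pointers, this lands in the 32-256 range quoted above.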
  • 52. B tree Vs B+ Tree 4/6/2022 4.2 _ B Tree 52
  • 53. B tree Vs B+ Tree 4/6/2022 4.2 _ B Tree 53
• 56. Red Black Tree • The Red-Black tree is a binary search tree. • The prerequisite of the Red-Black tree is that we should know about the binary search tree. • In a binary search tree, the values of the nodes in the – left subtree should be less than the value of the root node, and – right subtree should be greater than the value of the root node. • Each node in the Red-Black tree contains an extra bit that represents a color, used to ensure that the tree remains balanced during any operations performed on the tree, like insertion, deletion, etc. • In a binary search tree, searching, insertion and deletion take – O(log2n) time in the average case, – O(1) in the best case and – O(n) in the worst case. 4/6/2022 4.3 _ Red Black Tree 57
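The "extra bit" per node can be sketched in C as follows; this is a minimal illustrative layout (names are assumptions, not from the slides), with the NIL leaves represented by NULL pointers and treated as Black:

```c
#include <stddef.h>

/* Illustrative red-black node: a BST node plus one color bit. */
typedef enum { BLACK = 0, RED = 1 } rb_color;

struct rb_node {
    int key;
    rb_color color;
    struct rb_node *left, *right, *parent;
};

/* NIL (NULL) children count as Black, per the tree's properties. */
int is_red(const struct rb_node *n)
{
    return n != NULL && n->color == RED;
}
```

Treating NULL as Black in one helper keeps the later property checks free of special cases.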
• 57. BST • Suppose we want to search for 80 in the above tree. • We will first compare 80 with the root node. • 80 is greater than the root node, i.e., 10, so searching will be performed on the right subtree. • Again, 80 is compared with 15; 80 is greater than 15, so we move to the right of 15, i.e., 20. • Now, we reach the leaf node 20, and 20 is not equal to 80. • Therefore, the element is not found in the tree. • After each comparison, the search space is halved, so the above BST takes O(logn) time to search for an element. 4/6/2022 4.3 _ Red Black Tree 58
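The search walk just described can be sketched in C (the struct and function names are illustrative): each comparison discards one subtree, which is why a balanced tree is searched in O(log n) steps.

```c
#include <stddef.h>

struct bst { int key; struct bst *left, *right; };

/* Walk down from the root, going left or right by comparison;
   returns NULL when the key is absent (as with 80 above). */
struct bst *bst_find(struct bst *t, int x)
{
    while (t != NULL && t->key != x)
        t = (x < t->key) ? t->left : t->right;
    return t;
}
```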
• 58. Right Skewed BST Tree • The above tree shows a right-skewed BST. • If we want to search for 80 in this tree, we will compare 80 with all the nodes until we find the element or reach the leaf node. • So, the above right-skewed BST will take O(n) time to search for the element. • Of the two binary search trees above, the first one is a balanced BST, whereas the second one is an unbalanced BST. • We conclude from these two binary search trees that a balanced tree takes less time than an unbalanced tree for performing any operation on the tree. 4/6/2022 4.3 _ Red Black Tree 59
• 59. Why Red Black Tree? • Why do we require a Red-Black tree if the AVL tree is also height-balanced? • The Red-Black tree is used because the AVL tree requires many rotations when the tree is large, whereas the Red-Black tree requires a maximum of two rotations to balance the tree. • The main difference between the AVL tree and the Red-Black tree is that the AVL tree is strictly balanced, while the Red-Black tree is not completely height-balanced. • So, the AVL tree is more balanced than the Red-Black tree, but the Red-Black tree still guarantees O(log2n) time for all operations like insertion, deletion, and searching. • Insertion is easier in the AVL tree as the AVL tree is strictly balanced, whereas deletion and searching are easier in the Red-Black tree as the Red-Black tree requires fewer rotations. 4/6/2022 4.3 _ Red Black Tree 60
• 60. Red Black Tree Properties 1. Self-balancing BST. 2. Every node is either Black or Red (one extra bit). 1. In the AVL tree, by comparison, the balancing factor (BF) takes the values -1, 0, 1. 2. 0: Black, 1: Red 3. The root is always Black. 4. Every leaf is a NIL node, which is Black. (The nodes that have no child are connected to NIL nodes that are always black in color.) 5. If a node is Red, then its children are Black. (No red-red parent-child relationship.) 6. Every path from a node to any of its descendant NIL nodes must have the same number of black nodes. 4/6/2022 4.3 _ Red Black Tree 61
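Properties 5 and 6 can be checked mechanically. Here is a C sketch (the names and node layout are illustrative assumptions) that returns the black-height of a subtree, counting each NIL as one black node, or -1 when a property is violated:

```c
#include <stddef.h>

enum { BLK, RD };
struct rbn { int color; struct rbn *left, *right; };

/* Black-height of the subtree at t (NIL counts as one black node),
   or -1 if property 5 (no red-red) or 6 (equal black counts) fails. */
int black_height(struct rbn *t)
{
    int lh, rh;
    if (t == NULL)                      /* NIL leaf: black */
        return 1;
    lh = black_height(t->left);
    rh = black_height(t->right);
    if (lh < 0 || lh != rh)
        return -1;                      /* unequal black counts below */
    if (t->color == RD &&
        ((t->left && t->left->color == RD) ||
         (t->right && t->right->color == RD)))
        return -1;                      /* red node with a red child */
    return lh + (t->color == BLK);
}
```

Running this on the "check is it RB tree or not?" examples answers the question for properties 5 and 6 (property 3, a Black root, is a separate one-line test).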
  • 61. If node is Red then its children are Black. (no red-red parent-child relationship) 4/6/2022 4.3 _ Red Black Tree 62
• 62. Can every AVL tree be a Red-Black tree? • Yes, every AVL tree can be a Red-Black tree if we color each node either Red or Black. • But not every Red-Black tree is an AVL tree, because the AVL tree is strictly height-balanced while the Red-Black tree is not completely height-balanced. 4/6/2022 4.3 _ Red Black Tree 63
  • 63. Example – RB Tree 4/6/2022 4.3 _ Red Black Tree 64
  • 64. Check is it RB tree or not? 4/6/2022 4.3 _ Red Black Tree 65
  • 65. Now , Check is it RB tree or not? 4/6/2022 4.3 _ Red Black Tree 66
  • 66. Now , Check is it RB tree or not? 4/6/2022 4.3 _ Red Black Tree 67
  • 67. Check RB Tree or Not? 4/6/2022 4.3 _ Red Black Tree 68
  • 68. Clue for RB Tree • Every Perfect Binary Tree, that contains only Black Nodes is also a Red Black Tree. 4/6/2022 4.3 _ Red Black Tree 69
• 69. One more example.... 4/6/2022 4.3 _ Red Black Tree 70
• 70. Check one last one ... • If it is not an RB tree, then identify which property is not satisfied by this tree. 4/6/2022 4.3 _ Red Black Tree 71
• 71. Rules to insert values in Red Black Tree • The following are some rules used to create the Red-Black tree: 1. If the tree is empty, then we create a new node as the root node with the color black. 2. If the tree is not empty, then we create the new node as a leaf node with the color red. 3. If the parent of the new node is black, then exit. 4. If the parent of the new node is Red, then we have to check the color of the parent's sibling of the new node. 4(a) If the color is Black, then we perform rotations and recoloring. 4(b) If the color is Red, then we recolor the node. We will also check whether the parent's parent of the new node is the root node or not; if it is not a root node, we will recolor it and recheck the node. 4/6/2022 4.3 _ Red Black Tree 72
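The choice in rule 4 depends only on the colors of the parent and the parent's sibling (the "uncle", with NIL counting as Black). A minimal C sketch of that decision, with illustrative names:

```c
/* Which fix-up the insertion rules call for after placing a red leaf. */
typedef enum { DO_NOTHING, ROTATE_AND_RECOLOR, RECOLOR } rb_action;

rb_action insert_fixup_action(int parent_is_red, int uncle_is_red)
{
    if (!parent_is_red)
        return DO_NOTHING;                 /* rule 3: black parent */
    return uncle_is_red ? RECOLOR          /* rule 4(b): red uncle  */
                        : ROTATE_AND_RECOLOR; /* rule 4(a): black uncle */
}
```

In the RECOLOR case the rules then repeat the check at the grandparent, moving the potential conflict up the tree.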
  • 72. Example - Insertion 4/6/2022 4.3 _ Red Black Tree 73 Let's understand the insertion in the Red-Black tree. 10, 18, 7, 15, 16, 30, 25, 40
  • 73. • Step 1: Initially, the tree is empty, so we create a new node having value 10. This is the first node of the tree, so it would be the root node of the tree. As we already discussed, that root node must be black in color, which is shown below: 4/6/2022 4.3 _ Red Black Tree 74
  • 74. • Step 2: The next node is 18. As 18 is greater than 10 so it will come at the right of 10 as shown below. 4/6/2022 4.3 _ Red Black Tree 75
  • 75. • Step 3: Now, we create the new node having value 7 with Red color. As 7 is less than 10, so it will come at the left of 10 as shown below. 4/6/2022 4.3 _ Red Black Tree 76
• 76. • Step 4: The next element is 15, and 15 is greater than 10, but less than 18, so the new node will be created at the left of node 18. The node 15 would be Red in color as the tree is not empty. • The above tree violates the property of the Red-Black tree as it has a Red-red parent-child relationship. • Now we have to apply some rule to make it a Red-Black tree. Rule 4 says that if the new node's parent is Red, then we have to check the color of the parent's sibling of the new node. • The new node is node 15; the parent of the new node is node 18 and the sibling of the parent node is node 7. • As the color of the parent's sibling is Red, we apply rule 4b. • Rule 4b says that we have to recolor both the parent and the parent's sibling node. So, both the nodes, i.e., 7 and 18, would be recolored as shown in the below figure. • We also have to check whether the parent's parent of the new node is the root node or not. As we can observe in the above figure, the parent's parent of the new node is the root node, so we do not need to recolor it. 4/6/2022 4.3 _ Red Black Tree 77
  • 77. • Step 5: The next element is 16. As 16 is greater than 10 but less than 18 and greater than 15, so node 16 will come at the right of node 15. The tree is not empty; node 16 would be Red in color, as shown in the below figure: 4/6/2022 4.3 _ Red Black Tree 78
• 78. • Step 6: The next element is 30. Node 30 is inserted at the right of node 18. As the tree is not empty, the color of node 30 would be Red. • We also have to check the parent's parent of the new node, whether it is a root node or not. The parent's parent of the new node 30 is node 16, and node 16 is not a root node, so we recolor node 16 and change it to Red. The parent of node 16 is node 10, which is not Red, so there is no Red-red conflict. 4/6/2022 4.3 _ Red Black Tree 79
• 79. • Step 7: The next element is 25, which we have to insert in the tree. Since 25 is greater than 10, 16, 18 but less than 30, it will come at the left of node 30. As the tree is not empty, node 25 would be Red in color. Here a Red-red conflict occurs, as the parent of the newly created node is Red. 4/6/2022 4.3 _ Red Black Tree 80
  • 80. • Step 8: The next element is 40. Since 40 is greater than 10, 16, 18, 25, and 30, so node 40 will come at the right of node 30. As the tree is not empty, node 40 would be Red in color. There is a Red-red conflict between nodes 40 and 30, so rule 4b will be applied. 4/6/2022 4.3 _ Red Black Tree 81
• 81. • Now continue with the last 4 numbers: • 10, 18, 7, 15, 16, 30, 25, 40, 60, 2, 1, 70 • .................... Take it as practice. 4/6/2022 4.3 _ Red Black Tree 82
  • 82. Final Answer is ... Check it out. 4/6/2022 4.3 _ Red Black Tree 83
• 83. Deletion in Red Black tree • Let's understand how we can delete a particular node from the Red-Black tree. • The following are the rules used to delete a particular node from the tree: • Step 1: First, we perform the standard BST deletion. • Step 2: • Case 1: if the node to be deleted is Red, we simply delete it. 4/6/2022 4.3 _ Red Black Tree 84
  • 84. Case 1 – Example 1 • Suppose we want to delete node 30 from the tree, which is given below. • Initially, we are having the address of the root node. First, we will apply BST to search the node. Since 30 is greater than 10 and 20, which means that 30 is the right child of node 20. Node 30 is a leaf node and Red in color, so it is simply deleted from the tree. 4/6/2022 4.3 _ Red Black Tree 85
• 85. Case 1 – Example 2 • If we want to delete an internal node that has one child: first, replace the value of the internal node with the value of the child node and then simply delete the child node. • Let's take another example in which we want to delete the internal node, i.e., node 20. • We cannot delete the internal node directly; we can only replace the value of that node with another value. Node 20 is at the right of the root node, and it has only one child, node 30. So, node 20 is replaced with the value 30, but the color of the node remains the same, i.e., Black. In the end, the leaf node is deleted from the tree. 4/6/2022 4.3 _ Red Black Tree 86
• 86. Case 1 – Example 3 • If we want to delete an internal node that has two child nodes. • In this case, we have to decide from which subtree we take the replacement value (either the left subtree or the right subtree). • We have two ways: – Inorder predecessor: We replace with the largest value that exists in the left subtree. – Inorder successor: We replace with the smallest value that exists in the right subtree. • Suppose we want to delete node 30 from the tree, which is shown below: • Node 30 is at the right of the root node. In this case, we will use the inorder successor. The value 38 is the smallest value in the right subtree, so we replace the value 30 with 38, but the color of the node remains the same, i.e., Red. After the replacement, the successor's leaf node is deleted from the tree. • Since that leaf node is Red in color, we simply delete it (we do not have to perform any rotations or any recoloring). 4/6/2022 4.3 _ Red Black Tree 87
• 87. Case 2 • Case 2: If the double black node is the root, then simply remove the double black and make it a single black. 4/6/2022 4.3 _ Red Black Tree 88
  • 88. Case 3 • Case 3: If the double black's sibling is black and both its children are black. 1. Remove the double black node. 2. Add the color of the node to the parent (P) node. • If the color of P is red then it becomes black. • If the color of P is black, then it becomes double black. 3. The color of double black's sibling changes to red. 4. If still double black situation arises, then we will apply other cases 4/6/2022 4.3 _ Red Black Tree 89
  • 89. Case 3 Example • Let's understand this case through an example. • Suppose we want to delete node 15 in the below tree. 4/6/2022 4.3 _ Red Black Tree 90
  • 90. Case 3 Example (Cntd..) • We cannot simply delete node 15 from the tree as node 15 is Black in color. Node 15 has two children, which are nil. So, we replace the 15 value with a nil value. As node 15 and nil node are black in color, the node becomes double black after replacement, as shown in the below figure. 4/6/2022 4.3 _ Red Black Tree 91
• 91. Case 3 Example (Cntd..) • In the above tree, we can observe that the double black's sibling is black in color and its children are nil, which are also black. As the double black's sibling and its children are all black, it cannot give its black color to any of them. Now, the double black's parent node is Red, so the double black node adds its black color to its parent node. The color of node 20 changes to black, while the nil node changes to a single black, as shown in the below figure. • After adding the color to its parent node, the color of the double black's sibling, i.e., node 30, changes to red as shown in the below figure. • In the above tree, we can observe that the double black problem no longer exists, and it is also a Red-Black tree. 4/6/2022 4.3 _ Red Black Tree 92
  • 92. Case 4 • Case 4: If double black's sibling is Red. 1. Swap the color of its parent and its sibling. 2. Rotate the parent node in the double black's direction. 3. Reapply cases. 4/6/2022 4.3 _ Red Black Tree 93
  • 93. Case 4 – Example • Let's understand this case through an example. • Suppose we want to delete node 15. 4/6/2022 4.3 _ Red Black Tree 94
• 94. Case 4 – Example (Cntd.,) • Initially, the 15 is replaced with a nil value. After replacement, the node becomes double black. Since the double black's sibling is Red, the color of node 20 changes to Red and the color of node 30 changes to Black. • Once the swapping of the colors is completed, the rotation towards the double black is performed. Node 30 moves upwards and node 20 moves downwards, as shown in the below figure. 4/6/2022 4.3 _ Red Black Tree 95
• 95. Case 4 – Example (Cntd.,) • In the above tree, we can observe that the double black situation still exists. It satisfies case 3, in which the double black's sibling is black and both its children are black. First, we remove the double black from the node and add the black color to its parent node. At the end, the color of the double black's sibling, i.e., node 25, changes to Red as shown in the below figure. • In the above tree, we can observe that the double black situation has been resolved. It also satisfies the properties of the Red-Black tree. 4/6/2022 4.3 _ Red Black Tree 96
  • 96. Case 5 • Case 5: If double black's sibling is black, sibling's child who is far from the double black is black, but near child to double black is red. 1. Swap the color of double black's sibling and the sibling child which is nearer to the double black node. 2. Rotate the sibling in the opposite direction of the double black. 3. Apply case 6 4/6/2022 4.3 _ Red Black Tree 97
  • 97. Case 5 Example • Suppose we want to delete the node 1 in the below tree. 4/6/2022 4.3 _ Red Black Tree 98
• 98. Case 5 Example (Cntd..) • First, we replace the value 1 with the nil value. The node becomes double black as both the nodes, i.e., 1 and nil, are black. It satisfies case 3, which applies when the DB's sibling is black and both its children are black. First, we remove the double black of the nil node. Since the parent of the DB is Black, when the black color is added to the parent node it becomes double black. After adding the color, the double black's sibling's color changes to Red as shown below. 4/6/2022 4.3 _ Red Black Tree 99
• 99. Case 5 Example (Cntd..) • We can observe in the above figure that the double black problem still exists in the tree. So, we will reapply the cases. We will apply case 5 because the sibling of node 5 is node 30, which is black in color; the child of node 30 which is far from node 5 is black; and the child of node 30 which is near to node 5 is Red. In this case, first we will swap the colors of node 30 and node 25, so the color of node 30 changes to Red and the color of node 25 changes to Black as shown below. • As we can observe in the above tree, the double black situation still exists. So, we need to apply case 6. Let's first see what case 6 is. 4/6/2022 4.3 _ Red Black Tree 100
  • 100. Case 5 Example (Cntd..) • Once the swapping of the color between the nodes is completed, we need to rotate the sibling in the opposite direction of the double black node. In this rotation, the node 30 moves downwards while the node 25 moves upwards as shown below. 4/6/2022 4.3 _ Red Black Tree 101
  • 101. Case 6 • Case 6: If double black's sibling is black, far child is Red 1. Swap the color of Parent and its sibling node. 2. Rotate the parent towards the Double black's direction 3. Remove Double black 4. Change the Red color to black. 4/6/2022 4.3 _ Red Black Tree 102
  • 102. Case 6 Example • Now we will apply case 6 in the above example to solve the double black's situation. • In the above example, the double black is node 5, and the sibling of node 5 is node 25, which is black in color. The far child of the double black node is node 30, which is Red in color as shown in the below figure: 4/6/2022 4.3 _ Red Black Tree 103
• 103. Case 6 Example (Cntd..) • First, we will swap the colors of the parent and its sibling. The parent of node 5 is node 10, and the sibling node is node 25. The colors of both nodes are black, so no swapping occurs. • In the second step, we need to rotate the parent in the double black's direction. After the rotation, node 25 will move upwards, whereas node 10 will move downwards. Once the rotation is performed, the tree would look as shown in the below figure: 4/6/2022 4.3 _ Red Black Tree 104
  • 104. Case 6 Example (Cntd..) • In the next step, we will remove double black from node 5 and node 5 will give its black color to the far child, i.e., node 30. Therefore, the color of node 30 changes to black as shown in the below figure. 4/6/2022 4.3 _ Red Black Tree 105
• 107. Why we need Priority Queue? Case 1 : Printer Job • Although jobs sent to a line printer are generally placed on a queue, this might not always be the best thing to do. • For instance, one job might be particularly important, so that it might be desirable to allow that job to be run as soon as the printer is available. • Conversely, if, when the printer becomes available, there are several one-page jobs and one hundred-page job, it might be reasonable to make the long job go last, even if it is not the last job submitted. • (Unfortunately, most systems do not do this, which can be particularly annoying at times.) 4/6/2022 4.4 _ Priority Queues (Heaps) 109
  • 108. Why we need Priority Queue? Case 2 : Multiuser Operating System Job Scheduler • Similarly, in a multiuser environment, the operating system scheduler must decide which of several processes to run. • Generally a process is only allowed to run for a fixed period of time. One algorithm uses a queue. • Jobs are initially placed at the end of the queue. • The scheduler will repeatedly take the first job on the queue, run it until either it finishes or its time limit is up, and place it at the end of the queue if it does not finish. • This strategy is generally not appropriate, because very short jobs will seem to take a long time because of the wait involved to run. • Generally, it is important that short jobs finish as fast as possible, so these jobs should have preference over jobs that have already been running. • Furthermore, some jobs that are not short are still very important and should also have preference. • This particular application seems to require a special kind of queue, known as a priority queue. 4/6/2022 4.4 _ Priority Queues (Heaps) 110
  • 109. Priority Queue - Model • A priority queue is a data structure that allows at least the following two operations: – insert, which does the obvious thing, and – delete_min, which finds, returns and removes the minimum element in the heap. • The insert operation is the equivalent of enqueue, and delete_min is the priority queue equivalent of the queue's dequeue operation. • The delete_min function also alters its input. • Current thinking in the software engineering community suggests that this is no longer a good idea. • However, we will continue to use this function because of historical reasons--many programmers expect delete_min to operate this way. 4/6/2022 4.4 _ Priority Queues (Heaps) 111
  • 110. Priority Queue - Positives • Priority queues have many applications besides operating systems. • priority queues are used for external sorting. • Priority queues are also important in the implementation of greedy algorithms, which operate by repeatedly finding a minimum. 4/6/2022 4.4 _ Priority Queues (Heaps) 112
• 111. Priority Queues Implementation using Linked List • There are several obvious ways to implement a priority queue. • We could use a simple linked list, – performing insertions at the front in O(1) and – traversing the list, which requires O(n) time, to delete the minimum. • Alternatively, we could insist that the list always be kept sorted; – this makes insertions expensive (O(n)) and – delete_mins cheap (O(1)). • The former is probably the better idea of the two, based on the fact that there are never more delete_mins than insertions. 4/6/2022 4.4 _ Priority Queues (Heaps) 113
• 112. Priority Queues Implementation Using BST • Another way of implementing priority queues would be to use a binary search tree. • This gives an O(log n) average running time for both operations. • This is true even though the insertions are random while the deletions are not. • Recall that the only element we ever delete is the minimum. • Repeatedly removing a node that is in the left subtree would seem to hurt the balance of the tree by making the right subtree heavy. • However, the right subtree is random. In the worst case, where the delete_mins have depleted the left subtree, the right subtree would have at most twice as many elements as it should. • This adds only a small constant to its expected depth. Notice that the bound can be made into a worst-case bound by using a balanced tree; this protects one against bad insertion sequences. • We will then discuss how to implement heaps to support efficient merging. 4/6/2022 4.4 _ Priority Queues (Heaps) 114
  • 113. Binary Heap • The implementation we will use is known as a binary heap. • Its use is so common for priority queue implementations that when the word heap is used without a qualifier, it is generally assumed to be referring to this implementation of the data structure. • In this section, we will refer to binary heaps as merely heaps. • Like binary search trees, heaps have two properties, – namely, a structure property and – a heap order property. • As with AVL trees, an operation on a heap can destroy one of the properties, so a heap operation must not terminate until all heap properties are in order. This turns out to be simple to do. 4/6/2022 4.4 _ Priority Queues (Heaps) 115
• 114. Structure Property • A heap is a binary tree that is completely filled, with the possible exception of the bottom level, which is filled from left to right. • Such a tree is known as a complete binary tree. • A complete binary tree of height h has between 2^h and 2^(h+1) - 1 nodes. • This implies that the height of a complete binary tree is ⌊log n⌋, which is clearly O(log n). • An important observation is that because a complete binary tree is so regular, it can be represented in an array and no pointers are necessary. 4/6/2022 4.4 _ Priority Queues (Heaps) 116
• 115. Structure Property • For any element in array position i, – the left child is in position 2i, – the right child is in the cell after the left child (2i + 1), and – the parent is in position ⌊i/2⌋. 4/6/2022 4.4 _ Priority Queues (Heaps) 117
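In C these index computations are one-liners, assuming the root-at-index-1 layout above (integer division supplies the floor):

```c
/* Array-heap navigation with the root stored at index 1. */
#define LEFT(i)   (2 * (i))
#define RIGHT(i)  (2 * (i) + 1)
#define PARENT(i) ((i) / 2)   /* integer division = floor(i/2) */
```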
• 116. Structure Property • Thus not only are pointers not required, but the operations required to traverse the tree are extremely simple and likely to be very fast on most computers. • The only problem with this implementation is that an estimate of the maximum heap size is required in advance, but typically this is not a problem. • In the figure above, the limit on the heap size is 13 elements. • The array has a position 0; more on this later. • A heap data structure will, then, consist of an array (of whatever type the key is) and integers representing the maximum and current heap size. • We shall draw the heaps as trees, with the implication that an actual implementation will use simple arrays. 4/6/2022 4.4 _ Priority Queues (Heaps) 118
• 117. Declaration of Priority Queue

struct heap_struct
{
    /* Maximum # that can fit in the heap */
    unsigned int max_heap_size;
    /* Current # of elements in the heap */
    unsigned int size;
    element_type *elements;
};
typedef struct heap_struct *PRIORITY_QUEUE;

4/6/2022 4.4 _ Priority Queues (Heaps) 119
• 118. Empty Heap Creation

PRIORITY_QUEUE create_pq( unsigned int max_elements )
{
    PRIORITY_QUEUE H;

    if( max_elements < MIN_PQ_SIZE )
        error("Priority queue size is too small");

    H = (PRIORITY_QUEUE) malloc( sizeof (struct heap_struct) );
    if( H == NULL )
        fatal_error("Out of space!!!");

    /* Allocate the array + one extra for sentinel */
    H->elements = (element_type *) malloc( (max_elements+1) * sizeof (element_type) );
    if( H->elements == NULL )
        fatal_error("Out of space!!!");

    H->max_heap_size = max_elements;
    H->size = 0;
    H->elements[0] = MIN_DATA;
    return H;
}

4/6/2022 4.4 _ Priority Queues (Heaps) 120
  • 119. Heap Order Property • The property that allows operations to be performed quickly is the heap order property. • Since we want to be able to find the minimum quickly, it makes sense that the smallest element should be at the root. • If we consider that any subtree should also be a heap, then any node should be smaller than all of its descendants. • Applying this logic, we arrive at the heap order property. 4/6/2022 4.4 _ Priority Queues (Heaps) 121
• 120. Heap Order Property • In a heap, for every node X, the key in the parent of X is smaller than (or equal to) the key in X, with the obvious exception of the root (which has no parent). • In the figure below, the tree on the left is a heap, but the tree on the right is not (the dashed line shows the violation of heap order). • By the heap order property, the minimum element can always be found at the root. Thus, we get the extra operation, find_min, in constant time. 4/6/2022 4.4 _ Priority Queues (Heaps) 122
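The heap order property stated above can be verified mechanically on the 1-based array representation. A small illustrative sketch, not from the slides; `is_min_heap` is an assumed name:

```c
#include <assert.h>

/* Check the heap order property: every non-root element must be
   >= its parent (position i/2) in the 1-based array layout.
   elements[0] is the unused/sentinel slot. */
static int is_min_heap(const int elements[], unsigned int size)
{
    for (unsigned int i = 2; i <= size; i++)
        if (elements[i] < elements[i / 2])
            return 0;   /* heap order violated */
    return 1;
}
```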
  • 121. Basic Heap Operations • It is easy (both conceptually and practically) to perform the two required operations. • All the work involves ensuring that the heap order property is maintained. – Insert – Delete_min 4/6/2022 4.4 _ Priority Queues (Heaps) 123
  • 122. Insert • To insert an element x into the heap, we create a hole in the next available location, since otherwise the tree will not be complete. • If x can be placed in the hole without violating heap order, then we do so and are done. • Otherwise we slide the element that is in the hole's parent node into the hole, thus bubbling the hole up toward the root. • We continue this process until x can be placed in the hole. 4/6/2022 4.4 _ Priority Queues (Heaps) 124
• 123. Insert • The figure shows that to insert 14, we create a hole in the next available heap location. • Inserting 14 in the hole would violate the heap order property, so 31 is slid down into the hole. This strategy is continued in the next figure until the correct location for 14 is found. • This general strategy is known as a percolate up; the new element is percolated up the heap until the correct location is found. 4/6/2022 4.4 _ Priority Queues (Heaps) 125
  • 124. Insert 4/6/2022 4.4 _ Priority Queues (Heaps) 126
  • 125. Insert • Insertion is easily implemented with the code shown. • We could have implemented the percolation in the insert routine by performing repeated swaps until the correct order was established, but a swap requires three assignment statements. • If an element is percolated up d levels, the number of assignments performed by the swaps would be 3d. • Our method uses d + 1 assignments. 4/6/2022 4.4 _ Priority Queues (Heaps) 127
• 126. Insertion - Code
/* H->elements[0] is a sentinel */
void insert( element_type x, PRIORITY_QUEUE H )
{
    unsigned int i;
    if( is_full( H ) )
        error("Priority queue is full");
    else
    {
        i = ++H->size;
        while( H->elements[i/2] > x )
        {
            H->elements[i] = H->elements[i/2];
            i /= 2;
        }
        H->elements[i] = x;
    }
}
4/6/2022 4.4 _ Priority Queues (Heaps) 128
• 127. Sentinel • If the element to be inserted is the new minimum, it will be pushed all the way to the top. • At some point, i will be 1 and we will want to break out of the while loop. We could do this with an explicit test, but we have chosen to put a very small value in position 0 in order to make the while loop terminate. • This value must be guaranteed to be smaller than (or equal to) any element in the heap; it is known as a sentinel. • This idea is similar to the use of header nodes in linked lists. • By adding a dummy piece of information, we avoid a test that is executed once per loop iteration, thus saving some time. • The time to do the insertion could be as much as O(log n), if the element to be inserted is the new minimum and is percolated all the way to the root. • On average, the percolation terminates early; it has been shown that 2.607 comparisons are required on average to perform an insert, so the average insert moves an element up 1.607 levels. 4/6/2022 4.4 _ Priority Queues (Heaps) 129
• 128. Delete_min • Delete_mins are handled in a similar manner as insertions. • Finding the minimum is easy; the hard part is removing it. • When the minimum is removed, a hole is created at the root. • Since the heap now becomes one smaller, it follows that the last element x in the heap must move somewhere in the heap. • If x can be placed in the hole, then we are done. • This is unlikely, so we slide the smaller of the hole's children into the hole, thus pushing the hole down one level. • We repeat this step until x can be placed in the hole. • Thus, our action is to place x in its correct spot along a path from the root containing minimum children. 4/6/2022 4.4 _ Priority Queues (Heaps) 130
• 129. Delete_min • The left figure shows a heap prior to the delete_min. • After 13 is removed, we must now try to place 31 in the heap. • 31 cannot be placed in the hole, because this would violate heap order. 4/6/2022 4.4 _ Priority Queues (Heaps) 131
• 130. Delete_min • Thus, we place the smaller child (14) in the hole, sliding the hole down one level. • We repeat this again, placing 19 into the hole and creating a new hole one level deeper. 4/6/2022 4.4 _ Priority Queues (Heaps) 132
• 131. Delete_min • We then place 26 in the hole and create a new hole on the bottom level. • Finally, we are able to place 31 in the hole. • This general strategy is known as a percolate down. • We use the same technique as in the insert routine to avoid the use of swaps in this routine. 4/6/2022 4.4 _ Priority Queues (Heaps) 133
• 132. Delete_min : Function • A frequent implementation error in heaps occurs – when there are an even number of elements in the heap, and – the one node that has only one child is encountered. • You must make sure not to assume that there are always two children, so this usually involves an extra test. • In the code depicted on the next slide, we've done this test at line 8. • One extremely tricky solution is always to ensure that your algorithm thinks every node has two children. • Do this by placing a sentinel, of value higher than any in the heap, at the spot after the heap ends, at the start of each percolate down when the heap size is even. • You should think very carefully before attempting this, and you must put in a prominent comment if you do use this technique. 4/6/2022 4.4 _ Priority Queues (Heaps) 134
• 133. Delete_min : function
element_type delete_min( PRIORITY_QUEUE H )
{
    unsigned int i, child;
    element_type min_element, last_element;
    if( is_empty( H ) )
    {
        error("Priority queue is empty");
        return H->elements[0];
    }
    min_element = H->elements[1];
    last_element = H->elements[H->size--];
    for( i=1; i*2 <= H->size; i=child )
    {
        /* find smaller child */
        child = i*2;
        if( ( child != H->size ) && ( H->elements[child+1] < H->elements[child] ) )  /* line 8 */
            child++;
        /* percolate one level */
        if( last_element > H->elements[child] )
            H->elements[i] = H->elements[child];
        else
            break;
    }
    H->elements[i] = last_element;
    return min_element;
}
4/6/2022 4.4 _ Priority Queues (Heaps) 135
• 134. Heap – Time Complexity • Although this eliminates the need to test for the presence of a right child, you cannot eliminate the requirement that you test when you reach the bottom because this would require a sentinel for every leaf. • The worst-case running time for this operation is O(log n). • On average, the element that is placed at the root is percolated almost to the bottom of the heap (which is the level it came from), so the average running time is O(log n). 4/6/2022 4.4 _ Priority Queues (Heaps) 136
  • 135. Other Heap Operations • Decrease_key • Increase_key • Delete • Build_heap 4/6/2022 4.4 _ Priority Queues (Heaps) 137
  • 136. Applications of Priority Queues • Operating systems design. • Priority queues are used to implement several graph algorithms efficiently. • The Selection Problem. • Event Simulation. 4/6/2022 4.4 _ Priority Queues (Heaps) 138
  • 137. d-Heaps • Binary heaps are so simple that they are almost always used when priority queues are needed. • A simple generalization is a d-heap, which is exactly like a binary heap except that all nodes have d children (thus, a binary heap is a 2-heap). • Below Figure shows a 3-heap. 4/6/2022 4.4 _ Priority Queues (Heaps) 139
• 138. d - Heaps • Notice that a d-heap is much shallower than a binary heap, improving the running time of inserts to O(log_d n). • However, the delete_min operation is more expensive, because even though the tree is shallower, the minimum of d children must be found, which takes d - 1 comparisons using a standard algorithm. • This raises the time for this operation to O(d log_d n). If d is a constant, both running times are, of course, O(log n). • Furthermore, although an array can still be used, the multiplications and divisions to find children and parents are now by d, which seriously increases the running time, because we can no longer implement division by a bit shift. • d-heaps are interesting in theory, because there are many algorithms where the number of insertions is much greater than the number of delete_mins (and thus a theoretical speedup is possible). • They are also of interest when the priority queue is too large to fit entirely in main memory. • In this case, a d-heap can be advantageous in much the same way as B-trees. • The most glaring weakness of the heap implementation, aside from the inability to perform finds, is that combining two heaps into one is a hard operation. This extra operation is known as a merge. • There are quite a few ways of implementing heaps so that the running time of a merge is O(log n). • We will now discuss three data structures, of various complexity, that support the merge operation efficiently. We will defer any complicated analysis until later. 4/6/2022 4.4 _ Priority Queues (Heaps) 140
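The child and parent arithmetic for a d-heap can be sketched in the same 1-based array style as the binary heap. The slides do not fix a convention; the formulas and names below are one common choice, shown here as an illustration (note that for d = 2 they reduce to the binary-heap formulas):

```c
#include <assert.h>

/* 1-based array d-heap index arithmetic (illustrative convention):
   the children of node i are d*(i-1)+2 .. d*i+1,
   and the parent of node i (i > 1) is (i-2)/d + 1.
   For d = 2 this matches the binary-heap 2i / i/2 formulas. */
static unsigned int d_first_child(unsigned int i, unsigned int d)
{
    return d * (i - 1) + 2;
}

static unsigned int d_parent(unsigned int i, unsigned int d)
{
    return (i - 2) / d + 1;
}
```

Because d is generally not a power of two, these multiplications and divisions cannot be replaced by bit shifts, which is the slowdown the slide refers to.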
• 139. Leftist Heaps • It is a priority queue implemented with a variant of a binary heap. • Every node has an s-value (or rank or distance) which is the distance to the nearest leaf. • In contrast to a binary heap (which is always a complete binary tree), a leftist tree may be very unbalanced. • It is a binary tree with the following properties: – Normal Min Heap Property: Key(i) >= Key(Parent(i)) – Heavier on Left side: dist(left(i)) >= dist(right(i)) 4/6/2022 4.4 _ Priority Queues (Heaps) 141
  • 140. Time Complexities of Leftist Tree / Heap 4/6/2022 4.4 _ Priority Queues (Heaps) 142
  • 141. Leftist Heap – NPL or Distance or Rank • NPL : NULL Path Length – NPL(NULL) = -1 – NPL(Leaf/Single Child) = 0 – NPL (other nodes) = 1 + min(NPL(Left/Right)) – NPL(Left)>=NPL(Right) 4/6/2022 4.4 _ Priority Queues (Heaps) 143
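The NPL rules above translate directly into a recursive C function. A minimal sketch; the node layout and names are illustrative assumptions, not taken from the slides:

```c
#include <assert.h>
#include <stddef.h>

/* Illustrative leftist-heap node layout. */
struct lnode {
    int key;
    struct lnode *left, *right;
};

/* Null path length per the definitions above:
   npl(NULL) = -1, otherwise 1 + min(npl(left), npl(right)).
   A leaf gets 1 + min(-1, -1) = 0, as does a one-child node. */
static int npl(const struct lnode *t)
{
    if (t == NULL)
        return -1;
    int l = npl(t->left), r = npl(t->right);
    return 1 + (l < r ? l : r);
}
```

In a real implementation the NPL is cached in each node and updated during merge rather than recomputed recursively.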
  • 142. Leftist Heap – Example and Node Structure 4/6/2022 4.4 _ Priority Queues (Heaps) 144
  • 143. Leftist Heap - Example 4/6/2022 4.4 _ Priority Queues (Heaps) 145
  • 144. Not a Leftist Heap - Example 4/6/2022 4.4 _ Priority Queues (Heaps) 146
• 145. Leftist Heap - Operation • The main operation is merge(). • deleteMin() or extractMin() can be done by removing the root and calling merge() for the left and right subtrees. • insert() can be done by creating a leftist tree with a single key (the key to be inserted) and calling merge() for the given tree and the single-node tree. 4/6/2022 4.4 _ Priority Queues (Heaps) 147
  • 146. Idea behind Merging • Since right subtree is smaller, the idea is to merge right subtree of a tree with other tree. Below are abstract steps. – Put the root with smaller value as the new root. – Hang its left subtree on the left. – Recursively merge its right subtree and the other tree. – Before returning from recursion: • Update dist() of merged root. • Swap left and right subtrees just below root, if needed, to keep leftist property of merged result. 4/6/2022 4.4 _ Priority Queues (Heaps) 148
  • 147. Detailed Steps for Merge: 1. Compare the roots of two heaps. 2. Push the smaller key into an empty stack, and move to the right child of smaller key. 3. Recursively compare two keys and go on pushing the smaller key onto the stack and move to its right child. 4. Repeat until a null node is reached. 5. Take the last node processed and make it the right child of the node at top of the stack, and convert it to leftist heap if the properties of leftist heap are violated. 6. Recursively go on popping the elements from the stack and making them the right child of new stack top. 4/6/2022 4.4 _ Priority Queues (Heaps) 149
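The abstract steps above can also be written as a short recursive routine; the stack-based procedure on this slide is an iterative unrolling of the same recursion. A hedged sketch, with node layout and names that are illustrative assumptions:

```c
#include <assert.h>
#include <stddef.h>

/* Illustrative leftist-heap node with a cached null path length. */
struct lnode {
    int key;
    int npl;                        /* null path length of this node */
    struct lnode *left, *right;
};

static int node_npl(const struct lnode *t) { return t ? t->npl : -1; }

/* Merge two leftist min-heaps: smaller root becomes the new root,
   its right subtree is merged recursively with the other heap, and
   children are swapped if the leftist property is violated. */
static struct lnode *merge(struct lnode *h1, struct lnode *h2)
{
    if (h1 == NULL) return h2;
    if (h2 == NULL) return h1;
    if (h2->key < h1->key) {        /* keep the smaller root in h1 */
        struct lnode *t = h1; h1 = h2; h2 = t;
    }
    h1->right = merge(h1->right, h2);
    if (node_npl(h1->left) < node_npl(h1->right)) {
        struct lnode *t = h1->left; /* restore leftist property */
        h1->left = h1->right;
        h1->right = t;
    }
    h1->npl = node_npl(h1->right) + 1;
    return h1;
}
```

As the earlier slide on leftist heap operations notes, insert() is then a merge with a single-node heap, and deleteMin() is a merge of the root's two subtrees.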
  • 148. Merge - Example • Consider two leftist heaps given below: 4/6/2022 4.4 _ Priority Queues (Heaps) 150
  • 149. Merge - Example 4/6/2022 4.4 _ Priority Queues (Heaps) 151
• 150. Merge - Example • The subtree at node 7 violates the property of a leftist heap, so we swap it with the left child and restore the leftist heap property. 4/6/2022 4.4 _ Priority Queues (Heaps) 152
• 151. Merge - Example • Convert to a leftist heap. Repeat the process. • The worst-case time complexity of this algorithm is O(log n), where n is the number of nodes in the leftist heap. 4/6/2022 4.4 _ Priority Queues (Heaps) 153
  • 152. Merge - Example 2 4/6/2022 4.4 _ Priority Queues (Heaps) 154
• 153. Skew Heap • Problems with the Leftist Heap – Extra space for NPL. – Two-pass merge (with a stack). – Extra complexity/logic to maintain and check NPL. • Solutions: – Skew Heap – a blind adjusting version of the leftist heap. – Amortized time for merge, insert, delete_min is O(log n). – Worst-case time for all three operations is O(n). – Merge always switches children when fixing the right path. – In the iterative method, a skew heap needs only one pass. 4/6/2022 4.4 _ Priority Queues (Heaps) 155
  • 154. Skew Heap • A skew heap is a heap data structure implemented as a binary tree. • Skew heaps are advantageous because of their ability to merge more quickly than binary heaps. • In contrast with binary heaps, there are no structural constraints, so there is no guarantee that the height of the tree is logarithmic. • Only two conditions must be satisfied: – The general heap order must be enforced – Every operation (insert, delete_min, merge) on two skew heaps must be done using a special skew heap merge. 4/6/2022 4.4 _ Priority Queues (Heaps) 156
  • 155. Skew Heap • A skew heap is a self-adjusting form of a leftist heap which attempts to maintain balance by unconditionally swapping all nodes in the merge path when merging two heaps. • (The merge operation is also used when inserting and deleting values.) • With no structural constraints, it may seem that a skew heap would be horribly inefficient. • However, amortized complexity analysis can be used to demonstrate that all operations on a skew heap can be done in O(log n). • Skew heaps may be described with the following recursive definition: – A heap with only one element is a skew heap. – The result of skew merging two skew heaps Sh1 and Sh2 is also a skew heap. 4/6/2022 4.4 _ Priority Queues (Heaps) 157
  • 156. Skew Heap – Merge() • When two skew heaps are to be merged together, we can use the same process as the merge of two leftist heaps: 1. Compare roots of two heaps 2. Recursively merge the heap that has the larger root with the right subtree of the other heap. 3. Make the resulting merge the right subtree of the heap that has smaller root. 4. Swap the children of the new heap 4/6/2022 4.4 _ Priority Queues (Heaps) 158
  • 157. Merging Two Skew Heaps 4/6/2022 4.4 _ Priority Queues (Heaps) 159
  • 158. Example 4/6/2022 4.4 _ Priority Queues (Heaps) 160
• 159. Skew Heap Code
SkewHeap merge( SkewHeap heap1, SkewHeap heap2 )
{
    SkewHeap temp;
    if( heap1 == NULL )
        return heap2;
    if( heap2 == NULL )
        return heap1;
    if( heap1->key <= heap2->key )   /* heap1 has the smaller root */
    {
        temp = heap1->right;
        heap1->right = heap1->left;
        heap1->left = merge( heap2, temp );
        return heap1;
    }
    else
        return merge( heap2, heap1 );
}
4/6/2022 4.4 _ Priority Queues (Heaps) 161
• 160. Skew Heap – Merge() (Non-Recursive) • Alternatively, there is a non-recursive approach which tends to be a little clearer, but does require some sorting at the outset. 1. Split each heap into subtrees by cutting every rightmost path. (From the root node, sever the right node and make the right child its own subtree.) This will result in a set of trees in which the root either has only a left child or no children at all. 2. Sort the subtrees in ascending order based on the value of the root node of each subtree. 3. While there are still multiple subtrees, iteratively recombine the last two (from right to left). 1. If the root of the second-to-last subtree has a left child, swap it to be the right child. 2. Link the root of the last subtree as the left child of the second-to-last subtree. 4/6/2022 4.4 _ Priority Queues (Heaps) 162
  • 161. Skew Heap – Merge() (Non-Recursive) - Example 4/6/2022 4.4 _ Priority Queues (Heaps) 163
  • 162. Skew Heap – Merge() (Non-Recursive) - Example 4/6/2022 4.4 _ Priority Queues (Heaps) 164
  • 163. Skew Heap – Merge() (Non-Recursive) - Example 4/6/2022 4.4 _ Priority Queues (Heaps) 165
  • 164. Skew Heap – Merge() (Non-Recursive) - Example 4/6/2022 4.4 _ Priority Queues (Heaps) 166
  • 165. Skew Heap – Merge() (Non-Recursive) - Example 4/6/2022 4.4 _ Priority Queues (Heaps) 167
  • 166. Skew Heap - Example 2 4/6/2022 4.4 _ Priority Queues (Heaps) 168
  • 167. Skew Heap - Example 2 4/6/2022 4.4 _ Priority Queues (Heaps) 169
  • 168. Skew Heap - Example 2 4/6/2022 4.4 _ Priority Queues (Heaps) 170
  • 169. Skew Heap Time Complexity • A skew heap is a self-adjusting version of a leftist heap that is incredibly simple to implement. • The relationship of skew heaps to leftist heaps is analogous to the relation between splay trees and AVL trees. • Skew heaps are binary trees with heap order, but there is no structural constraint on these trees. • Unlike leftist heaps, no information is maintained about the null path length of any node. • The right path of a skew heap can be arbitrarily long at any time, so the worst-case running time of all operations is O(n). • However, as with splay trees, it can be shown that for any m consecutive operations, the total worst-case running time is O(m log n). • Thus, skew heaps have O(log n) amortized cost per operation. 4/6/2022 4.4 _ Priority Queues (Heaps) 171