Application of Data Structures



  1. 1. Application of Data Structures
  2. 2. Overview <ul><li>Priority Queue structures </li></ul><ul><ul><li>Heaps </li></ul></ul><ul><ul><li>Application: Dijkstra’s algorithm </li></ul></ul><ul><li>Cumulative Sum Data Structures on Intervals </li></ul><ul><li>Augmenting data structures with extra info to solve questions </li></ul>
  3. 3. Priority Queue (PQ) Structures <ul><li>Stores elements in a list by comparing a key field </li></ul><ul><ul><li>Often has other satellite data </li></ul></ul><ul><ul><li>For example, when sorting pixels by their R value, we consider the R as the key field and GB as satellite data </li></ul></ul><ul><li>Priority queues allow us to sort elements by their key field. </li></ul>
  4. 4. Common PQ operations <ul><li>Create() </li></ul><ul><ul><li>Creates an empty priority queue </li></ul></ul><ul><li>Find_Min() </li></ul><ul><ul><li>Returns the smallest element (by key field) </li></ul></ul><ul><li>Insert(x) </li></ul><ul><ul><li>Insert element x (with predefined key field) </li></ul></ul><ul><li>Delete(x) </li></ul><ul><ul><li>Delete position x from the queue </li></ul></ul><ul><li>Change(x, k) </li></ul><ul><ul><li>Change key field of position x to k </li></ul></ul>
  5. 5. Optional PQ operations <ul><li>Union (a,b) </li></ul><ul><ul><li>Combines two PQs a and b </li></ul></ul><ul><li>Search (k) </li></ul><ul><ul><li>Returns the position of the element in the heap with key value k </li></ul></ul>
  6. 6. Considerations when implementing a PQ in competition <ul><li>How complicated is it? </li></ul><ul><ul><li>Is the code likely to be buggy? </li></ul></ul><ul><li>How fast does it need to be? </li></ul><ul><ul><li>Does a constant factor also come into the equation? </li></ul></ul><ul><li>Do I need to store extra data to do a Search? </li></ul><ul><ul><li>For the rest of this presentation, we shall assume that extra data exists which allows us to do a Search in O(1) time. The maintenance of this auxiliary data will be assumed and not covered. </li></ul></ul>
  7. 7. Linear Array <ul><li>Unsorted Array </li></ul><ul><ul><li>Create, Insert, Change in O(1) time </li></ul></ul><ul><ul><li>Find_min, Delete in O(n) time </li></ul></ul><ul><li>Sorted Array </li></ul><ul><ul><li>Create, Find_min in O(1) time </li></ul></ul><ul><ul><li>Insert, Delete, Change in O(n + log n) = O(n) time </li></ul></ul>
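As a concrete illustration of the unsorted-array trade-off above, here is a minimal Python sketch (the class and method names are ours, not from the slides):

```python
# Minimal unsorted-array PQ: O(1) Insert, O(n) Find_min / Delete.
class UnsortedArrayPQ:
    def __init__(self):            # Create: O(1)
        self.data = []

    def insert(self, key):         # Insert: append at the end, O(1)
        self.data.append(key)

    def find_min(self):            # Find_min: linear scan, O(n)
        return min(self.data)

    def delete_min(self):          # Delete: scan + remove, O(n)
        self.data.remove(min(self.data))
```

The sorted-array variant simply swaps which operations pay the O(n) cost.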
  8. 8. Binary Heaps <ul><li>The most common structure implemented in a competition setting </li></ul><ul><ul><li>Efficient for most applications </li></ul></ul><ul><ul><li>Easy to implement </li></ul></ul><ul><li>A heap is a structure where the value of a node is less than or equal to the values of all of its children </li></ul><ul><li>A binary heap is a heap where the maximum number of children for each node is 2. </li></ul>
  9. 9. Array implementation <ul><li>Consider a heap of size nheap in an array BHeap[1 ..nheap] (Define BHeap[nheap+1 .. (nheap*2)+1] to be INFINITY for practical reasons) </li></ul><ul><ul><li>The children of BHeap[x] are BHeap[x*2] and BHeap[x*2+1] </li></ul></ul><ul><ul><li>The parent of BHeap[x] is BHeap[x/2] </li></ul></ul><ul><ul><li>This gives a near uniform binary heap where we can ensure that the number of levels in the heap is O(log n) </li></ul></ul><ul><ul><li>Some properties wrt key values: BHeap[x] >= BHeap[x/2], BHeap[x] <= BHeap[x*2], BHeap[x] <= BHeap[x*2+1]; note that there is no fixed order between the siblings BHeap[x*2] and BHeap[x*2+1] </li></ul></ul>
  10. 10. PQ Operations on a BHeap <ul><li>We define BTree(x) to be the Binary Tree rooted at BHeap[x] </li></ul><ul><li>We define Heapify(x) to be an operation that does the following: </li></ul><ul><ul><li>Assume: BTree(x*2) and BTree(x*2+1) are binary heaps but BTree(x) is not necessarily a binary heap </li></ul></ul><ul><ul><li>Produce: BTree(x) binary heap </li></ul></ul><ul><ul><li>Details of Heapify in later slides – but for now, we assume Heapify is O(log n) </li></ul></ul><ul><li>For the rest of the presentation, we assume the variable n refers to nheap </li></ul>
  11. 11. Operations on a BHeap <ul><li>Create is trivial – O(1) time </li></ul><ul><li>Find_min: </li></ul><ul><ul><li>Return BHeap[1] </li></ul></ul><ul><ul><li>O(1) time </li></ul></ul><ul><li>Insert (element with key value x) </li></ul><ul><ul><li>nheap++ </li></ul></ul><ul><ul><li>BHeap[nheap] = x </li></ul></ul><ul><ul><li>T = nheap </li></ul></ul><ul><ul><li>While (T != 1 && BHeap[T] < BHeap[T/2]) </li></ul></ul><ul><ul><ul><li>Swap (BHeap[T], BHeap[T/2]) </li></ul></ul></ul><ul><ul><ul><li>T = T / 2 </li></ul></ul></ul><ul><ul><li>O(log n) time as the number of levels is O(log n) </li></ul></ul>
  12. 12. Operations on a BHeap <ul><li>ChangeDown (position x, new key value k) </li></ul><ul><ul><li>Assume: k < existing BHeap[x] </li></ul></ul><ul><ul><li>BHeap[x] = k </li></ul></ul><ul><ul><li>T = x </li></ul></ul><ul><ul><li>While (T != 1 && BHeap[T] < BHeap[T/2]) </li></ul></ul><ul><ul><ul><li>Swap (BHeap[T], BHeap[T/2]) </li></ul></ul></ul><ul><ul><ul><li>T = T/2 </li></ul></ul></ul><ul><ul><li>Complexity: O(log n) </li></ul></ul><ul><ul><li>This procedure is known as “bubbling up” the heap </li></ul></ul>
  13. 13. Operations on a BHeap <ul><li>ChangeUp (position x, new key value k) </li></ul><ul><ul><li>Assume: k > existing BHeap[x] </li></ul></ul><ul><ul><li>BHeap[x] = k </li></ul></ul><ul><ul><li>Heapify(x) </li></ul></ul><ul><ul><li>O(log n) as complexity of Heapify is O(log n) </li></ul></ul>
  14. 14. Operations on a BHeap <ul><li>Delete (position x on the heap) </li></ul><ul><ul><li>BHeap[x] = BHeap[nheap] </li></ul></ul><ul><ul><li>nheap-- </li></ul></ul><ul><ul><li>Heapify(x) </li></ul></ul><ul><ul><li>T = x </li></ul></ul><ul><ul><li>While (T != 1 && BHeap[T] < BHeap[T/2]) </li></ul></ul><ul><ul><ul><li>Swap (BHeap[T], BHeap[T/2]) </li></ul></ul></ul><ul><ul><ul><li>T = T / 2 </li></ul></ul></ul><ul><ul><li>Complexity is O(log n) </li></ul></ul><ul><ul><li>Why must I do both Heapify and “bubble up”? (Hint: the replacement key BHeap[nheap] may be either larger or smaller than the deleted key) </li></ul></ul>
  15. 15. Operations on a BHeap <ul><li>Heapify (position x on the heap) </li></ul><ul><ul><li>T = min(BHeap[x], BHeap[x*2], BHeap[x*2+1]) </li></ul></ul><ul><ul><li>If (T == BHeap[x]) return; </li></ul></ul><ul><ul><li>K = position where BHeap[K] = T </li></ul></ul><ul><ul><li>Swap(BHeap[x], BHeap[K]) </li></ul></ul><ul><ul><li>Heapify(K) </li></ul></ul><ul><ul><li>O(log n) as the maximum number of levels in the heap is O(log n) and Heapify only goes through each level at most once </li></ul></ul>
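Slides 9 to 15 condense into a short Python sketch of a 1-indexed binary min-heap. This is our own rendering of the pseudocode, using a growable list instead of INFINITY sentinels; names like `_bubble_up` are ours:

```python
# 1-indexed binary min-heap: a[0] is unused so children of x are 2x and 2x+1.
class BinaryHeap:
    def __init__(self):              # Create: O(1)
        self.a = [None]

    def find_min(self):              # Find_min: root holds the minimum, O(1)
        return self.a[1]

    def _bubble_up(self, t):         # "bubbling up" from slides 11-12
        a = self.a
        while t != 1 and a[t] < a[t // 2]:
            a[t], a[t // 2] = a[t // 2], a[t]
            t //= 2

    def heapify(self, x):            # sift down; subtrees of x must be heaps
        a, n = self.a, len(self.a) - 1
        smallest = x
        for c in (2 * x, 2 * x + 1):
            if c <= n and a[c] < a[smallest]:
                smallest = c
        if smallest != x:
            a[x], a[smallest] = a[smallest], a[x]
            self.heapify(smallest)

    def insert(self, key):           # Insert: O(log n)
        self.a.append(key)
        self._bubble_up(len(self.a) - 1)

    def delete(self, x):             # Delete position x: O(log n)
        a = self.a
        a[x] = a[-1]                 # move last element into the hole
        a.pop()
        if x < len(a):
            self.heapify(x)          # replacement may need to go down...
            self._bubble_up(x)       # ...or up (slide 14's question)
```

ChangeDown corresponds to writing a smaller key at x then `_bubble_up(x)`; ChangeUp to writing a larger key then `heapify(x)`.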
  16. 16. BHeap Operations: Summary <ul><li>Create, Find_min in O(1) time </li></ul><ul><li>Change (includes both ChangeUp and ChangeDown), Insert, and Delete are O(log n) time </li></ul><ul><li>How long does a Union of two heaps take? </li></ul><ul><ul><li>By repeated Insertion: O(n log n) </li></ul></ul><ul><ul><li>By rebuilding with Heapify: O(n) </li></ul></ul>
  17. 17. Corollary: Heapsort <ul><li>We can convert an unsorted array to a heap using Heapify (why does this work?): </li></ul><ul><ul><li>For (i = n/2; i >= 1; i--) </li></ul></ul><ul><ul><ul><li>Heapify(i) </li></ul></ul></ul><ul><li>We can then return a sorted list (list initially empty): </li></ul><ul><ul><li>For (i = 1; i <= n; i++) </li></ul></ul><ul><ul><ul><li>Append the value of find_min to the list </li></ul></ul></ul><ul><ul><ul><li>Delete(1) </li></ul></ul></ul><ul><li>Complexity is O(n log n) </li></ul>
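Slide 17's two loops can be sketched as a self-contained heapsort (our own code, 1-indexed like the slides):

```python
def heapify(a, n, x):
    # Sift a[x] down within the heap a[1..n] (1-indexed, a[0] unused).
    smallest = x
    for c in (2 * x, 2 * x + 1):
        if c <= n and a[c] < a[smallest]:
            smallest = c
    if smallest != x:
        a[x], a[smallest] = a[smallest], a[x]
        heapify(a, n, smallest)

def heapsort(values):
    a = [None] + list(values)        # shift to 1-indexing
    n = len(values)
    for i in range(n // 2, 0, -1):   # build the heap bottom-up: O(n)
        heapify(a, n, i)
    out = []
    while n > 0:
        out.append(a[1])             # Find_min
        a[1] = a[n]                  # Delete(1): move last element to root
        n -= 1
        heapify(a, n, 1)
    return out
```

Building bottom-up works because by the time `heapify(i)` runs, both subtrees of `i` are already heaps.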
  18. 18. Binomial Trees <ul><li>Define Binomial Tree B(k) as follows: </li></ul><ul><ul><li>B(0) is a single node </li></ul></ul><ul><ul><li>B(n), n != 0, is formed by merging two B(n-1) trees in the following way: </li></ul></ul><ul><ul><ul><li>The root of the B(n) tree is the root of one of the B(n-1) trees, and the (new) leftmost child of this root is the root of the other B(n-1) tree. </li></ul></ul></ul><ul><ul><li>Within the tree, the heap property holds i.e. the key field of any node is less than or equal to the key fields of all its children. </li></ul></ul>
  19. 19. Properties of Binomial Trees <ul><li>The number of nodes in B(k) is exactly 2^k. </li></ul><ul><li>B(k) has exactly (k + 1) levels, i.e. height k </li></ul><ul><li>For any tree B(k) </li></ul><ul><ul><li>The root of B(k) has exactly k children </li></ul></ul><ul><ul><li>If we take the children of the root of B(k) from left to right, they form the roots of a B(k-1), B(k-2), …, B(0) tree in that order </li></ul></ul>
  20. 20. Binomial Heaps <ul><li>Binomial Heaps are a forest of binomial trees with the following properties: </li></ul><ul><ul><li>All the binomial trees are of different sizes </li></ul></ul><ul><ul><li>The binomial trees are ordered (from left to right) by increasing size </li></ul></ul><ul><li>If we consider the fact that the size of B(k) is 2^k, the binomial tree B(k) exists in a binomial heap of n nodes iff the bit representing 2^k is “1” in the binary representation of n </li></ul><ul><ul><li>For example: 13 (decimal) = 1101 (binary), so the binomial heap with 13 nodes consists of the binomial trees B(0), B(2), and B(3). </li></ul></ul>
  21. 21. Binomial Heap Implementation <ul><li>Each node will store the following data: </li></ul><ul><ul><li>Key field </li></ul></ul><ul><ul><li>Pointers (if non-existent, points to NIL) to </li></ul></ul><ul><ul><ul><li>Parent </li></ul></ul></ul><ul><ul><ul><li>Next Sibling (ordered left to right; a sibling must have the same parent); For roots of binomial trees, next sibling points to the root of the next binomial tree </li></ul></ul></ul><ul><ul><ul><li>Leftmost child </li></ul></ul></ul><ul><ul><li>Number of children in field degree </li></ul></ul><ul><ul><li>Any other data that might be useful for the program </li></ul></ul><ul><li>The binomial heap is represented by a head pointer that points to the root of the smallest binomial tree (which is the leftmost binomial tree) </li></ul>
  22. 22. Operations on Binomial Trees <ul><li>Link (h1, h2) </li></ul><ul><ul><li>Links two binomial trees with roots h1 and h2 of the same order k to form a new binomial tree of order (k+1) </li></ul></ul><ul><ul><li>We assume h1->key < h2->key, which implies that h1 is the root of the new tree </li></ul></ul><ul><ul><li>T = h1->leftchild </li></ul></ul><ul><ul><li>h1->leftchild = h2 </li></ul></ul><ul><ul><li>h2->parent = h1 </li></ul></ul><ul><ul><li>h2->next_sibling = T </li></ul></ul><ul><ul><li>h1->degree++ </li></ul></ul><ul><ul><li>O(1) time </li></ul></ul>
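Link translates almost directly into code. A sketch using the node layout of slide 21 (field names are ours, mirroring the slide's pointers; we also bump the degree field, which keeps the child count accurate):

```python
# Node layout from slide 21: key, parent, leftmost child, next sibling, degree.
class Node:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.leftchild = None
        self.next_sibling = None
        self.degree = 0              # number of children

def link(h1, h2):
    """Link two binomial trees of equal order; caller ensures h1.key < h2.key.
    h2 becomes the new leftmost child of h1, giving a tree of order k+1."""
    h2.parent = h1
    h2.next_sibling = h1.leftchild   # old leftmost child becomes h2's sibling
    h1.leftchild = h2
    h1.degree += 1
    return h1                        # root of the combined tree, O(1)
```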
  23. 23. Operations on binomial heaps <ul><li>Create – Create a new binomial heap with one node ( key field set) </li></ul><ul><ul><li>Set Parent, Leftchild, Next sibling to NIL </li></ul></ul><ul><ul><li>O(1) time </li></ul></ul><ul><li>Find_min </li></ul><ul><ul><li>X = head, min = INFINITY </li></ul></ul><ul><ul><li>While (X != nil) </li></ul></ul><ul><ul><ul><li>If (X->key < min) min = X->key </li></ul></ul></ul><ul><ul><ul><li>X = X->next_sibling </li></ul></ul></ul><ul><ul><li>Return min </li></ul></ul><ul><ul><li>O(log n) time as there are at most log n binomial trees (log n bits) </li></ul></ul>
  24. 24. More Operations <ul><li>Merge (h1, h2, L) </li></ul><ul><ul><li>Given binomial heaps with head pointers h1 and h2, create a list L of all the binomial trees of h1 U h2 arranged in ascending order of size </li></ul></ul><ul><ul><li>For any order k, there may be zero, one, or two binomial trees of order k in this list. </li></ul></ul>
  25. 25. More Operations <ul><li>Merge (h1, h2, L) </li></ul><ul><ul><li>Assume that NIL is a node of infinitely large degree, so the non-NIL list is always picked first </li></ul></ul><ul><ul><li>L = empty </li></ul></ul><ul><ul><li>While (h1 != NIL || h2 != NIL) </li></ul></ul><ul><ul><ul><li>If (h1->degree < h2->degree) </li></ul></ul></ul><ul><ul><ul><ul><li>Append the (binomial) tree with root h1 to L </li></ul></ul></ul></ul><ul><ul><ul><ul><li>h1 = h1->next_sibling </li></ul></ul></ul></ul><ul><ul><ul><li>Else </li></ul></ul></ul><ul><ul><ul><ul><li>Apply above steps to h2 instead </li></ul></ul></ul></ul>
  26. 26. More Operations <ul><li>Union (h1, h2) </li></ul><ul><ul><li>The fundamental operation involving binomial heaps </li></ul></ul><ul><ul><li>Takes two binomial heaps with head pointers h1 and h2 and creates a new binomial heap of the union of h1 and h2 </li></ul></ul>
  27. 27. More Operations <ul><li>Union (h1, h2) </li></ul><ul><ul><li>Start with empty binomial heap </li></ul></ul><ul><ul><li>Merge (h1, h2, L) </li></ul></ul><ul><ul><li>Go by increasing k in the list L until L is empty </li></ul></ul><ul><ul><ul><li>If there is exactly one or exactly three (how can this happen?) binomial trees of order k in L, append one binomial tree of order k to the binomial heap and remove that tree from L </li></ul></ul></ul><ul><ul><ul><li>If there are two trees of order k, remove both trees, use Link to form a tree of order (k+1) and pre-pend this tree to L </li></ul></ul></ul><ul><li>Union is O(log n) </li></ul>
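A compact way to sketch Union is to treat linking as binary addition with carries. The sketch below is ours and deviates from the pointer layout above: each tree is a dict with an explicit children list, purely for brevity.

```python
def link_trees(t1, t2):
    # Combine two binomial trees of equal order; smaller key becomes the root.
    if t2['key'] < t1['key']:
        t1, t2 = t2, t1
    t1['children'].append(t2)
    t1['degree'] += 1
    return t1

def union(h1, h2):
    # A heap is a list of tree roots. Link pairs of equal degree like carries
    # in binary addition, then return the roots in ascending degree order.
    buckets, pending = {}, h1 + h2
    while pending:
        t = pending.pop()
        d = t['degree']
        if d in buckets:                      # two trees of order d: Link them
            pending.append(link_trees(buckets.pop(d), t))
        else:
            buckets[d] = t
    return [buckets[d] for d in sorted(buckets)]

def singleton(key):                           # Create: a one-node heap
    return [{'key': key, 'degree': 0, 'children': []}]

def find_min(heap):                           # scan the O(log n) roots
    return min(t['key'] for t in heap)
```

Insert is then just `union(heap, singleton(key))`, exactly as slide 28 describes.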
  28. 28. More Operations <ul><li>Inserting a new node with key field set </li></ul><ul><ul><li>Create a new binomial heap with that one node </li></ul></ul><ul><ul><li>Union (existing heap with head h, new heap) </li></ul></ul><ul><ul><li>O (log n) time </li></ul></ul><ul><li>ChangeDown (node at position x, new value) </li></ul><ul><ul><li>Decreasing the key value of a node </li></ul></ul><ul><ul><li>Same idea as binary heap: “Bubble” up the binomial tree containing this node (exchange only key fields and satellite data! What’s the complexity if you physically change the node?) </li></ul></ul><ul><ul><li>O (log n) time </li></ul></ul>
  29. 29. More Operations <ul><li>Delete (node at position x) </li></ul><ul><ul><li>Deleting position x from the heap </li></ul></ul><ul><ul><li>ChangeDown(x, -INFINITY) </li></ul></ul><ul><ul><li>Now x is at the root of its binomial tree </li></ul></ul><ul><ul><li>Suppose that this binomial tree is of order k </li></ul></ul><ul><ul><li>Recall that the children of the root, from right to left, are the roots of binomial trees of order 0, 1, 2, …, k-1 </li></ul></ul><ul><ul><li>Form a new binomial heap whose trees are precisely these children, ordered from smallest to largest </li></ul></ul><ul><ul><li>Remove the original binomial tree from the original binomial heap </li></ul></ul><ul><ul><li>Union (original heap, new heap) </li></ul></ul><ul><li>O(log n) complexity </li></ul>
  30. 30. More Operations <ul><li>ChangeUp (node at position X, new value) </li></ul><ul><ul><li>Delete (X) </li></ul></ul><ul><ul><li>Insert (new value) </li></ul></ul><ul><ul><li>O (log n) time </li></ul></ul>
  31. 31. Summary – Binomial Heaps <ul><li>Create in O(1) time </li></ul><ul><li>Union, Find_min, Delete, Insert, and Change operations take O(log n) time </li></ul><ul><li>In general, because they are more complicated, in competition it is far more prudent (saves time coding and debugging) to use a binary heap instead </li></ul><ul><ul><li>Unless there are MANY Union operations </li></ul></ul>
  32. 32. Application of heaps: Dijkstra <ul><li>The following describes how Dijkstra’s algorithm can be coded with a binary heap </li></ul><ul><li>Initializing phase: </li></ul><ul><li>Let n be the number of nodes </li></ul><ul><li>Create a heap of size n, all key fields initialized to INFINITY </li></ul><ul><li>ChangeDown (s, 0) where s is the source node </li></ul>
  33. 33. Running of Dijkstra’s algorithm <ul><li>While (heap is not empty) </li></ul><ul><ul><li>X = node corresponding to find_min value </li></ul></ul><ul><ul><li>Delete (position of X in heap = 1) </li></ul></ul><ul><ul><li>For all nodes k that are adjacent to X </li></ul></ul><ul><ul><ul><li>If (cost[X] + distance[X][k] < cost[k]) </li></ul></ul></ul><ul><ul><ul><ul><li>ChangeDown (position of k in heap, cost[X] + distance[X][k]) </li></ul></ul></ul></ul>
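The loop above can be sketched with Python's built-in heapq. heapq has no ChangeDown, so a common substitute (used here) is to push a fresh entry on each relaxation and discard stale entries on pop; the bound is still O((m+n) log n) up to constants:

```python
import heapq

def dijkstra(adj, s):
    """adj maps a node to a list of (neighbour, edge weight) pairs."""
    cost = {s: 0}
    pq = [(0, s)]                    # (distance, node), keyed by distance
    done = set()
    while pq:
        d, x = heapq.heappop(pq)     # Find_min + Delete
        if x in done:
            continue                 # stale entry: a cheaper copy was popped
        done.add(x)
        for k, w in adj.get(x, []):
            if d + w < cost.get(k, float('inf')):
                cost[k] = d + w      # relax edge (x, k)
                heapq.heappush(pq, (cost[k], k))
    return cost
```

For example, on the hypothetical graph `{'s': [('a', 1), ('b', 4)], 'a': [('b', 2), ('t', 6)], 'b': [('t', 3)]}` the shortest s-to-t path goes s, a, b, t with cost 6.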
  34. 34. Analysis of running time <ul><li>At most n nodes are deleted </li></ul><ul><ul><li>O(n log n) </li></ul></ul><ul><li>Let m be the number of edges. Each edge is relaxed at most once. </li></ul><ul><ul><li>O(m log n) </li></ul></ul><ul><li>Total running time O([m+n] log n) </li></ul><ul><li>This is faster than using a basic array list unless the graph is very dense, in which case m is about O(n^2) which leads to a running time of O(n^2 log n) </li></ul>
  35. 35. Cumulative Sum on Intervals <ul><li>Problem: We have a line that runs from x coordinate 1 to x coordinate N. At x coordinate X [X an integer between 1 and N], there is g(X) gold. Given an interval [a,b], how much gold is there between a and b? </li></ul><ul><li>How efficiently can this be done if we dynamically change the amount of gold and the interval [a,b] keeps changing? </li></ul>
  36. 36. Cumulative Sum Array <ul><li>Let us define C(0) = 0, and C(x) = C(x-1) + g(x) where g(x) is the amount of gold at position x </li></ul><ul><li>C(x) then defines the total amount of gold from position 1 to position x </li></ul><ul><li>The amount of gold in interval [a,b] is simply C(b) – C(a-1) </li></ul><ul><ul><li>For any change in a or b, we can perform the update in O(1) time </li></ul></ul><ul><li>However, if we change g(x), we will have to change C(x), C(x+1), C(x+2), …, C(N) </li></ul><ul><ul><li>Any change in gold results in an update in O(N) time </li></ul></ul>
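The cumulative sum array is only a few lines in practice; a sketch (input list is 0-indexed, C is 1-indexed as in the slide):

```python
def build_prefix(g):
    # C[x] = g(1) + ... + g(x), with C[0] = 0. Build: O(N).
    C = [0]
    for x in g:
        C.append(C[-1] + x)
    return C

def interval_sum(C, a, b):
    # Gold in [a, b] = C(b) - C(a-1). Query: O(1).
    return C[b] - C[a - 1]
```

Any change to a single g(x) still forces rebuilding C(x..N), which is the O(N) weakness the next slides address.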
  37. 37. Cumulative Sum Tree <ul><li>We can use the binary representation of any number to come up with a cumulative sum tree </li></ul><ul><li>For example, let say we take 13 (decimal) = 1101 (binary) </li></ul><ul><ul><li>The cumulative sum of g(1) + g(2) + … g(13) can be represented as the sum of: </li></ul></ul><ul><ul><ul><li>g(1) + g(2) + … + g(8) [ 8 elements ] </li></ul></ul></ul><ul><ul><ul><li>g(9) + g(10) + … + g(12) [ 4 elements ] </li></ul></ul></ul><ul><ul><ul><li>g(13) [ 1 element ] </li></ul></ul></ul><ul><ul><li>Notice that the number of elements in each case represents a bit that is “1” in the binary representation of the number </li></ul></ul>
  38. 38. Cumulative Sum Tree <ul><li>Another example: C(19) </li></ul><ul><ul><li>19 (decimal) is 10011 (binary) </li></ul></ul><ul><ul><ul><li>C(19) is the sum of the following: </li></ul></ul></ul><ul><ul><ul><li>g(1) + g(2) + … + g(16) [ 16 elements ] </li></ul></ul></ul><ul><ul><ul><li>g(17) + g(18) [ 2 elements ] </li></ul></ul></ul><ul><ul><ul><li>g(19) [ 1 element ] </li></ul></ul></ul>
  39. 39. Cumulative Sum Tree <ul><li>Let us define C2(x) to be the sum of g(x) + g(x-1) + … + g(p + 1) where p is a number with the same binary representation as x except the least significant bit of x (the rightmost bit of x that is “1”) is “0” </li></ul><ul><li>Examples of x and the corresponding p: </li></ul><ul><ul><li>x = 6 [110], p = 4 [100] </li></ul></ul><ul><ul><li>x = 13 [1101], p = 12 [1100] </li></ul></ul><ul><ul><li>x = 16 [10000], p = 0 [00000] </li></ul></ul>
  40. 40. Cumulative Sum Tree <ul><li>If we want to find the cumulative sum C(x) = g(1) + g(2) + … + g(x), we can trace through the values of C2 using the binary representation of x </li></ul><ul><ul><li>Examples: </li></ul></ul><ul><ul><li>C(13) = C2(8) + C2(8+4) + C2(8+4+1) </li></ul></ul><ul><ul><li>C(16) = C2(16) </li></ul></ul><ul><ul><li>C(21) = C2(16) + C2(16+4) + C2(16+4+1) </li></ul></ul><ul><ul><li>C(99) = C2(64) + C2(64+32) + C2(64+32+2) + C2(64+32+2+1) </li></ul></ul><ul><li>This allows us to find C(x) in log x time </li></ul><ul><ul><li>Hence the amount of gold in interval [a,b] = C(b) – C(a-1) can be found in log N time, which implies updates of a and b can be done in O(log N) </li></ul></ul>
  41. 41. Cumulative Sum Tree <ul><li>What happens when we change g(x)? </li></ul><ul><ul><li>If g(x) is changed, we only need to update C2(y) where C2(y) covers g(x) </li></ul></ul><ul><ul><li>We can go through all necessary C2(y) in the following way: </li></ul></ul><ul><ul><ul><li>While (x <= N) </li></ul></ul></ul><ul><ul><ul><ul><li>Update C2(x) </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Add the value of the least significant bit of x to x </li></ul></ul></ul></ul><ul><ul><li>This runs in O(log N) time </li></ul></ul><ul><ul><li>Hence updates to g can also be done in O(log n) time, which is a great improvement over the O(N) needed for an array. </li></ul></ul>
  42. 42. Cumulative Sum Tree <ul><li>Examples [binary representation in brackets] </li></ul><ul><ul><li>Change to g(5) [ 101 ] : Update C2(5), C2(6), C2(8), C2(16) and all C2(power of 2 > 16) </li></ul></ul><ul><ul><li>Change to g(13) [ 1101 ]: Update C2(13), C2(14), C2(16), and all C2(power of 2 > 16) </li></ul></ul><ul><ul><li>Change to g(35) [ 100011 ]: Update C2(35), C2(36), C2(40), C2(48), C2(64), and all C2(power of 2 > 64) </li></ul></ul><ul><li>We can implement a cumulative sum tree very simply: By simply using a linear array to store the values of C2. </li></ul><ul><li>Can we extend a cumulative sum tree to 2 or more dimensions? </li></ul><ul><ul><li>See IOI 2001 Day 1 Question 1 </li></ul></ul>
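The C2 array is known in the literature as a Fenwick (binary indexed) tree, and the bit trick `x & (-x)` extracts the value of the least significant bit. A sketch consistent with slides 39 to 42 (class and method names are ours):

```python
class Fenwick:
    def __init__(self, n):
        self.n = n
        self.c2 = [0] * (n + 1)      # c2[x] = g(p+1) + ... + g(x)

    def update(self, x, delta):
        # Add delta to g(x): touch every C2(y) covering x, O(log N).
        while x <= self.n:
            self.c2[x] += delta
            x += x & (-x)            # add value of least significant bit

    def cum(self, x):
        # C(x) = g(1) + ... + g(x): O(log N).
        total = 0
        while x > 0:
            total += self.c2[x]
            x -= x & (-x)            # strip least significant bit
        return total

    def interval(self, a, b):        # gold in [a, b]
        return self.cum(b) - self.cum(a - 1)
```

Note the implementation really is just a linear array, as the slide says; the "tree" lives entirely in the index arithmetic.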
  43. 43. Sum of Intervals Tree <ul><li>Another way to solve the question is to use a “Sum of Intervals” Binary Tree </li></ul><ul><li>Each node in the tree is represented by (L, R) and the value of (L,R) is g(L) + g(L+1) + … + g(R) </li></ul><ul><li>The root of the tree has L = 1 and R = N </li></ul><ul><li>Every leaf has L = R </li></ul><ul><li>Every non-leaf has children (L, [L+R]/2) [left child] and ([L+R]/2+1, R) [right child] </li></ul><ul><li>The number of nodes in the tree is at most 2N - 1 = O(N) [ why? ] </li></ul><ul><li>In an implementation, every node should have pointers to its children and its parent </li></ul>
  44. 44. Sum of Intervals Tree <ul><li>How to find C(x) = g(1) + g(2) + … + g(x)? </li></ul><ul><ul><li>We trace from the root downwards </li></ul></ul><ul><ul><li>L = 1, R = N, C = 0 </li></ul></ul><ul><ul><li>While (L != R) </li></ul></ul><ul><ul><ul><li>M = (L + R) / 2 </li></ul></ul></ul><ul><ul><ul><li>If (M < x) </li></ul></ul></ul><ul><ul><ul><ul><li>C += value of the left child (L, M), which lies entirely within [1, x] </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Set L and R to the right child of the current node </li></ul></ul></ul></ul><ul><ul><ul><li>Else </li></ul></ul></ul><ul><ul><ul><ul><li>Set L and R to the left child of the current node </li></ul></ul></ul></ul><ul><ul><li>C += value at (L,R) [ now L = R = x ] </li></ul></ul><ul><ul><li>Time complexity: O(log N) </li></ul></ul>
  45. 45. Sum of Intervals Tree <ul><li>What happens when g(x) is changed? </li></ul><ul><ul><li>Trace from (x,x) upwards to the root </li></ul></ul><ul><ul><li>Let L = R = x </li></ul></ul><ul><ul><li>While (L,R) is not the root </li></ul></ul><ul><ul><ul><li>Update the value of (L,R) </li></ul></ul></ul><ul><ul><ul><li>Set (L,R) to the parent of (L,R) </li></ul></ul></ul><ul><ul><li>Update the root </li></ul></ul><ul><li>Complexity of O(log N) </li></ul><ul><li>Hence all updates of interval [a,b] and g(x) can be done in O(log N) time </li></ul>
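For completeness, here is one compact way to realise the sum-of-intervals tree: an iterative, array-backed variant of our own (leaves stored at positions N..2N-1) rather than the pointer-based nodes the slide describes, so parent/child pointers become index arithmetic:

```python
class IntervalTree:
    def __init__(self, g):
        # g is the list g(1)..g(N); node i has children 2i and 2i+1.
        self.n = len(g)
        self.t = [0] * (2 * self.n)
        self.t[self.n:] = g                  # leaves at positions n..2n-1
        for i in range(self.n - 1, 0, -1):   # fill internal nodes bottom-up
            self.t[i] = self.t[2 * i] + self.t[2 * i + 1]

    def update(self, x, value):
        # Change g(x), then walk up to the root: O(log N).
        i = self.n + x - 1
        self.t[i] = value
        i //= 2
        while i >= 1:
            self.t[i] = self.t[2 * i] + self.t[2 * i + 1]
            i //= 2

    def query(self, a, b):
        # Sum of g(a..b): climb from both ends, O(log N).
        l, r, s = self.n + a - 1, self.n + b - 1, 0
        while l <= r:
            if l % 2 == 1:       # l is a right child: take it and step past
                s += self.t[l]
                l += 1
            if r % 2 == 0:       # r is a left child: take it and step past
                s += self.t[r]
                r -= 1
            l //= 2
            r //= 2
        return s
```

The query here returns any interval sum directly, so C(x) is simply `query(1, x)`.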
  46. 46. Augmenting Data Structures <ul><li>It is often useful to change a data structure in some way, by adding additional data to each node or changing what each node represents. </li></ul><ul><li>This allows us to reuse the same underlying structure to solve a wider range of problems </li></ul><ul><li>For example, we can use so-called “interval trees” to solve more than just cumulative sum problems </li></ul><ul><ul><li>We can store in each node any property of the elements in the interval (L,R) that can be combined from its children’s intervals. </li></ul></ul>
  47. 47. Other data structures <ul><li>Balanced (and unbalanced) binary trees </li></ul><ul><ul><li>Red-Black trees </li></ul></ul><ul><ul><li>2-3-4 trees </li></ul></ul><ul><ul><li>Splay trees </li></ul></ul><ul><li>Suffix Trees </li></ul><ul><li>Fibonacci Heaps </li></ul>