Soft Heaps
(a data structure created by Bernard Chazelle)
“Be soft, even if you stand to get squashed.”
-- E. M. Forster
Outline
- Soft heaps
  - What are they?
  - Advantages vs. regular heaps
- Soft heap operations:
  - sift, combine, updatesufmin, insert, deletemin
- Analysis of running time
- Proof of correctness
- Applications
Why soft heaps?
- (Because they are used in Chazelle’s MST algorithm)
- But what’s wrong with regular heaps?
- The “problem”: they can be used to sort
  - Sorting takes Ω(n log n) time
  - So either insert(x) or delete-min() takes Ω(log n)
How can we improve this LB?
- Inaccuracy:
  - Soft heaps have an error parameter ε
  - After n inserts, at most εn elements may be corrupted
- But nice amortized run times:
  - delete-min() – O(1)
  - insert(x) – O(log 1/ε)
- Note: with ε < 1/n, this works just like an ordinary heap.
- So… how do they work?
Overview of soft heaps
- New construction (Kaplan & Zwick, 2009)
- Linked list of heap-ordered binary trees: each tree has a rank, and there is one tree per rank
- Each tree also has a suffix-min pointer (black)

[Figure: a root list of trees with ranks 0, 1, and 3 and root keys 3, 12, and 8; one node carries the element list 5, 7, 8 and another carries 2, 10, 11. Trees are not necessarily balanced; each node has a list of elements.]
Corruption
- Each node x has a list of elements, list[x]
- All elements in the list are indexed by the node’s key
- If their own key is less, then they are corrupted
- (Note that corruption can only increase a key)
- The soft heap has at most εn elements corrupted after n inserts
- These elements move together (“carpooling”), which allows faster operations

[Figure: a node with key 8 carrying the element list 5, 7, 8; the elements 5 and 7 are corrupted.]
Some more notation
- For a node x:
  - left[x] is the left child of x, null if none exists
  - right[x] is the right child of x, null if none exists
  - key[x] is the key value of x, or ∞ if x = null
  - size[x] is the target size of list[x] (defined later)
- For a root node x:
  - suf-min[x] is the suffix-min pointer of x
  - next[x] is the next root in the linked list
  - prev[x] is the previous root in the linked list
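The notation above can be collected into a small record. This is an illustrative sketch of the node layout, not Kaplan & Zwick's actual code; the field names simply mirror the slide.

```python
import math
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative record of the slide's notation (an assumed layout).
@dataclass
class Node:
    key: float                        # key[x]
    rank: int = 0                     # rank of the tree rooted at x
    size: int = 1                     # target size of list[x], defined later
    list: List[float] = field(default_factory=list)  # list[x]
    left: Optional["Node"] = None     # left[x], None if no child
    right: Optional["Node"] = None    # right[x]
    # Root-only fields:
    next: Optional["Node"] = None     # next root in the linked list
    prev: Optional["Node"] = None     # previous root
    suf_min: Optional["Node"] = None  # suffix-min pointer

def key_of(x: Optional[Node]) -> float:
    """key[x], with key[null] = infinity as on the slide."""
    return math.inf if x is None else x.key
```

The `key_of` helper lets later operations compare children without null checks, since key[null] = ∞.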
Soft heap operations: sift
- For each node x, we want list[x] to have “enough” elements so that operations are fast
- The sift(x) operation increases the number of elements in list[x]:
  - If size[x] > |list[x]|:
    - Let y be the child of x with the smaller key
    - append(list[x], list[y])
    - Set key[x] = key[y]
    - If y is a leaf, delete it, else call sift(y)

[Figure: a node with key 4 whose child has key 9 and list 5, 7, 9; after sift, the parent has key 9 and list 4, 5, 7, 9.]
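The steps above can be sketched directly. This is a minimal illustration assuming a simple node layout (key, size, list, left, right), not the paper's exact code:

```python
import math

# Minimal illustrative node; field names follow the slides.
class Node:
    def __init__(self, key, size=1):
        self.key, self.size = key, size
        self.list = [key]
        self.left = self.right = None

def key_of(x):
    return math.inf if x is None else x.key  # key[null] = infinity

def sift(x):
    """Refill list[x] from the child with the smaller key, per the slide."""
    if x.size > len(x.list):
        y = x.left if key_of(x.left) <= key_of(x.right) else x.right
        if y is None:
            return                        # nothing left to pull up
        x.list.extend(y.list)             # append(list[x], list[y])
        y.list = []
        x.key = y.key                     # x inherits y's (larger) key
        if y.left is None and y.right is None:
            if x.left is y:               # y is a leaf: delete it
                x.left = None
            else:
                x.right = None
        else:
            sift(y)                       # else refill y in turn
```

Running it on the slide's figure (parent 4, child 9 with list 5, 7, 9) leaves the parent with key 9 and list 4, 5, 7, 9.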
Operations: combine
- If we have two trees x, y of the same rank, we can combine(x, y) them into one tree:
  - Create a new node z
  - Set left[z] = x, right[z] = y
  - Set rank[z] = rank[x] + 1
  - Set size[z] appropriately (defined later)
  - Call sift(z)

[Figure: two rank-1 trees with roots 12 (child 25) and 5 (child 18) are combined under a new root; after sift, the new root has key 5.]
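A sketch of this recipe, reusing the minimal node and sift from the previous slide (an illustration under assumed field names, with size[z] = 1 as a stand-in for the rule defined later in the deck):

```python
import math

class Node:
    def __init__(self, key, rank=0, size=1):
        self.key, self.rank, self.size = key, rank, size
        self.list = [key]
        self.left = self.right = None

def key_of(x):
    return math.inf if x is None else x.key

def sift(x):  # the sift from the previous slide, abbreviated
    if x.size > len(x.list):
        y = x.left if key_of(x.left) <= key_of(x.right) else x.right
        if y is None:
            return
        x.list.extend(y.list); y.list = []
        x.key = y.key
        if y.left is None and y.right is None:
            if x.left is y: x.left = None
            else: x.right = None
        else:
            sift(y)

def combine(x, y):
    """Link two same-rank roots under a fresh node, then sift (slide recipe)."""
    z = Node(key=math.inf, rank=x.rank + 1)
    z.list = []                  # z starts empty; sift fills it
    z.left, z.right = x, y
    z.size = 1                   # stand-in for the size rule defined later
    sift(z)
    return z
```

Combining roots 5 and 12 as in the figure yields a rank-1 root carrying key 5.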
Operations: updatesufmin
- Recall: each root node x has a suffix-min pointer
  - Points to the smallest root from x to the end of the list
- These may become invalid: updatesufmin(x) updates suf-min[x] and suf-min[y] for every root y that precedes x
- We do this when we change the key of x (during sift), add a new root, or delete next[x]:
  - If key[x] ≤ key[suf-min[next[x]]]:
    - Set suf-min[x] = x
  - Else set suf-min[x] = suf-min[next[x]]
  - updatesufmin(prev[x])
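The rule above can be sketched with minimal root records (an illustration with assumed field names mirroring the slide's next/prev/suf-min notation):

```python
class Root:
    def __init__(self, key):
        self.key = key
        self.next = self.prev = None
        self.suf_min = self

def link(*roots):
    """Chain roots into a doubly linked list, in the given order."""
    for a, b in zip(roots, roots[1:]):
        a.next, b.prev = b, a

def updatesufmin(x):
    """Repair suffix-min pointers from x back to the head of the root list."""
    while x is not None:
        nxt = x.next
        if nxt is None or x.key <= nxt.suf_min.key:
            x.suf_min = x                 # x itself is the suffix minimum
        else:
            x.suf_min = nxt.suf_min       # inherit the suffix minimum
        x = x.prev  # iterative form of the tail call updatesufmin(prev[x])
```

For the root list 3, 12, 8, the middle root's suffix-min points at 8, while the first root points at itself.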
Operations: insert, deletemin
- To insert:
  - We create a new rank-0 tree and add it to the list of roots
  - Then we call combine until there is only one root of each rank
  - Finally, if x was the last root we modified, we call updatesufmin(x)
- To deletemin:
  - Let z be the first root of the linked list
  - Then set y = suf-min[z], and return (and remove) an element of list[y]
  - If |list[y]| < size[y]/2, call sift(y)
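The insert carry loop (combine until ranks are unique, exactly as in a binomial heap) can be sketched as follows. The rank-to-root dict and the stripped-down combine are assumptions for this illustration; the real structure is a linked list of roots and the real combine also sifts.

```python
class Node:
    def __init__(self, key, rank=0):
        self.key, self.rank = key, rank

def combine(x, y):
    # Stand-in for the real combine + sift from the earlier slide:
    # the new root keeps the smaller key and grows one rank.
    return Node(min(x.key, y.key), x.rank + 1)

def insert(roots_by_rank, key):
    """Add a rank-0 root, then combine while two roots share a rank."""
    x = Node(key)
    while x.rank in roots_by_rank:
        x = combine(roots_by_rank.pop(x.rank), x)
    roots_by_rank[x.rank] = x
    # (a full implementation would now call updatesufmin on x)
    return x
```

After inserting 5, 3, 8 into an empty heap, the first two combine into a rank-1 root and 8 becomes the rank-0 root, leaving one root per rank.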
Example: insert

[Figures, slides 12–17: a new rank-0 root with key 1 joins the root list (ranks 0, 1, 3). Two rank-0 roots now exist, so they are combined into a rank-1 tree; the two rank-1 trees are then combined into a rank-2 tree, leaving one root per rank (ranks 2 and 3), and the suffix-min pointers are updated.]
Example: deletemin

[Figures, slides 18–22: deletemin returns 7 from the list 5, 7, 8 of the root selected by the suffix-min pointer. The remaining list is too small (“Let’s say 2 < size[8]/2…”), so sift refills it from the child with key 9, yielding the list 5, 8, 9; note that 8 is now corrupted. The sift recurses down the tree, and the emptied child is refilled in turn from a leaf with list 19, 34, whose key 34 moves up.]
Amortized Analysis
- Potential analysis:
  - Let M be the max rank of the heap
  - Each internal node has a potential of 1
  - Each root node x has a potential of rank[x] + 5
  - The heap itself has potential M
- For each tree T, let del(T) be the number of deletions from T since the last sift
  - We also give T a potential of (r + 2)·del(T)
  - (r will be defined later)
Analysis of combine(x, y)
- x, y are root nodes of rank k
- The potentials of x, y decrease from 2k + 10 to 2 (they become internal nodes)
- Creating a new node z increases the potential by k + 6
- The potential may increase by 1 if M increases
- We charge 1 cost for the operation
- After doing combines, we may have to do an updatesufmin, which costs k
- This gives an amortized cost of 2 − 2k − 10 + k + 6 + 2 + k = 0
Analysis of sift(x)
- Let x be a node of rank k and y its child, and we move the elements of list[y] to list[x]
- If |list[y]| < size[y]/2, then y is a leaf and it is deleted, so the potential decreases by 1, which pays for this operation
- Otherwise…
  - We have |list[y]| ≥ size[y]/2, so we charge each element 2/size[y] for the operation
  - Each element can be charged at most once per rank (in its whole history of existence)
- So the max cost of sift, per element, is:

    ∑_{k ≥ 0} 2 / s_k,   where s_k is the size of a node of rank k
What is this size[y] function?
- Consider: if size[y] = 1, then the cost of sift is just O(M), which will be O(log n) (also, no elements will be corrupted)
- So having size[y] > 1 is what makes soft heaps “soft”
- We want the sum from the previous slide to be O(log 1/ε)
- So we will make size[y] = 1 for ranks up to O(log 1/ε), and then exponentially increasing
- So let r = C + log 1/ε; then we can define (for 0 < ε < 1):

    s_k = 1                       if k ≤ r
    s_k = ⌈(1 + ε) · s_{k−1}⌉     if k > r

- With this definition, the previous sum indeed gives O(log 1/ε)
Analysis of insert
- We create a new root node (potential increased by 5)
- Every combine pays for itself, and for the final updatesufmin
- Finally, the sift cost of the new element and the eventual delete cost will be O(log 1/ε)
  - (see next slide)
- So insert takes O(log 1/ε) amortized
Analysis of deletemin
- Finding the key takes 1 operation
- Deleting raises the potential by r + 2 (so a cost of r + 3) – we have charged this to nodes when they were inserted
- If we have a leaf with more elements in its list, or |list[x]| > size[x]/2, we do nothing
  - (If it is a leaf and we delete it, then the potential is reduced by k + 5)
- Otherwise: if we do sift, del(x) ≥ size[x]/2
  - So the potential of the tree was at least (r + 2)·s_k/2
  - Can show: (r + 2)·s_k/2 > k + 1
- Either way we have k + 1 potential, which pays for the updatesufmin operation, so O(1) amortized
Proof of correctness
- Lemma 1: The number of nodes of rank k is at most n/2^k.
- Lemma 2: If x has rank k, size[x] ≤ 2(1 + ε)^{k−r}
  - Proof of both: easy induction.
- Lemma 3: |list[x]| ≤ (1 + 1/ε)·size[x]
  - Proof: by induction on the rank of x. If we move elements from a list[y] of lower rank, we have |list[y]| ≤ ((1 + ε)/ε)·size[y] = size[x]/ε
  - So the new size is at most size[x] + size[x]/ε = (1 + 1/ε)·size[x]
Proof of correctness (cont’d)
- Theorem: After n insertions, at most εn elements are corrupted.
- Proof:
  - Elements can only be corrupted in nodes of rank > r
  - For a node x of rank k, |list[x]| ≤ (1 + 1/ε)·size[x] ≤ 2(1 + 1/ε)(1 + ε)^{k−r}
  - Since r = C + log 1/ε, the number of corrupted elements is at most:

        ∑_{k > r} 2(1 + 1/ε)(1 + ε)^{k−r} · n/2^k
          = 2(1 + 1/ε)(n/2^r) · ∑_{i ≥ 1} ((1 + ε)/2)^i
          ≤ 4(1 + 1/ε) · n / ((1 − ε) 2^r)
          = 4(1 + 1/ε) · εn / ((1 − ε) 2^C)

  - So we choose C such that this is < εn
  - (We needed ε < 1 for the sum to converge)
Applications of Soft Heaps
- Selection in O(n) time:
  - Make a soft heap with ε = 1/3:
  - Insert all the elements, then remove n/3 + 1 elements
  - Let x be the largest element removed
  - x is greater than at least n/3 of the elements, and it is less than at least n/3 of the elements (because only εn = n/3 could be corrupted)
  - So partition the elements around x; in the worst case you are left with (2/3)n elements, then we continue recursively
  - Running time: O(n + (2/3)n + (4/9)n + …) = O(n)
- More interesting application:
  - Finding the MST in O(α(m, n)·m) time
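The selection recursion above can be sketched as follows. This is only an illustration of the recursion: Python's exact `heapq` stands in for a soft heap with ε = 1/3, so this version costs O(n log n); the point of the soft heap is that the same recursion then runs in O(n) total.

```python
import heapq

def select(items, k):
    """Return the k-th smallest element (0-indexed) via the slide's recursion."""
    if len(items) <= 3:
        return sorted(items)[k]
    h = list(items)
    heapq.heapify(h)
    # Remove n/3 + 1 elements; x is the largest removed.
    removed = [heapq.heappop(h) for _ in range(len(items) // 3 + 1)]
    x = removed[-1]
    lo = [v for v in items if v < x]        # partition around x
    hi = [v for v in items if v > x]
    if k < len(lo):
        return select(lo, k)                # answer is below x
    if k < len(items) - len(hi):
        return x                            # answer is x (or an equal element)
    return select(hi, k - (len(items) - len(hi)))
```

Each call discards at least a third of the elements, giving the geometric sum O(n + (2/3)n + (4/9)n + …) from the slide.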
(I hope you have enjoyed soft heaps.)