Iris Hui-Ru Jiang Fall 2014
CHAPTER 2
ALGORITHM ANALYSIS
IRIS H.-R. JIANG
Outline
- Content:
  - Computational tractability
  - Asymptotic order of growth
  - Implementing the Gale-Shapley algorithm
  - Survey of common running times (self-study)
  - Priority queues (postponed to Section 4.5)
- Reading:
  - Chapter 2
Complexity
2
Computational Efficiency
- Q: What is a good algorithm?
- A:
  - Correct: backed by proofs
  - Efficient: runs quickly and consumes little memory
    - Efficiency ⇔ resource requirements ⇔ complexity
- Q: How do the resource requirements scale with increasing input size?
  - Time: running time
  - Space: memory usage
Running Time

Analytical Engine
"As soon as an Analytical Engine exists, it will necessarily guide the future
course of the science. Whenever any result is sought by its aid, the question
will then arise — by what course of calculation can these results be arrived at
by the machine in the shortest time?"
-- Charles Babbage (1864)

How many times do you have to turn the crank?
[Portrait: Charles Babbage]
Defining efficiency
Computational Tractability
Proposed Definition 1 (1/2)
- Q: How should we turn the fuzzy notion of an "efficient" algorithm into something more concrete?
- A: Proposed definition 1: An algorithm is efficient if, when implemented, it runs quickly on real input instances.
- Q: What's wrong?
- A: Crucial things are missing…
  - Where, and how well, do we implement the algorithm?
    - Bad algorithms can run quickly on small cases on fast machines.
  - What is a real input instance?
    - We don't know the full range of input instances in practice.
  - How well, or badly, does the algorithm scale as problem sizes grow to unexpected levels?
    - Two algorithms may perform comparably on inputs of size 100; multiply the input size tenfold, and one will still run quickly while the other consumes a huge amount of time.
Proposed Definition 1 (2/2)
- Q: What are we asking for?
- A: A concrete definition of efficiency that is
  - Platform-independent
  - Instance-independent
  - Of predictive value with respect to increasing input sizes
- E.g., the stable matching problem
  - Input size: N = the total size of the preference lists
    - n men and n women: 2n lists, each of length n
    - N = 2n²
  - Describe an algorithm at a high level
  - Analyze its running time mathematically as a function of N
Proposed Definition 2 (1/2)
- We will focus on the worst-case running time
  - A bound on the largest possible running time the algorithm could have over all inputs of a given size N.
- Q: How about average-case analysis?
  - Performance averaged over random instances
- A: But how do we generate a random input?
  - We don't know the full range of input instances
    - Good on one distribution; poor on another
- Q: Against what baseline can we say a running-time bound is good?
- A: Compare with brute-force search over the solution space
  - Try all possibilities and see if any one of them works.
  - One thing in common: a compact representation implicitly specifies a giant search space ⇒ brute force is too slow!
    - Typically takes 2^N time or worse for inputs of size N.
    - Too naïve to be acceptable in practice.
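As an illustration (a minimal sketch, not from the slides): for stable matching, the compact input of 2n preference lists implicitly defines a search space of all n! perfect matchings, and brute force must examine them one by one.

```python
from itertools import permutations

def is_stable(matching, men_pref, women_pref):
    """Return False if some (man, woman) pair prefers each other to their partners."""
    n = len(matching)
    # rank[w][m] = position of man m in woman w's list (lower = more preferred)
    rank = [{m: i for i, m in enumerate(women_pref[w])} for w in range(n)]
    partner_of_woman = {w: m for m, w in enumerate(matching)}
    for m in range(n):
        w_m = matching[m]
        # every woman that m ranks above his current partner is a candidate instability
        for w in men_pref[m][:men_pref[m].index(w_m)]:
            if rank[w][m] < rank[w][partner_of_woman[w]]:
                return False  # (m, w) is an instability
    return True

def brute_force_stable_matching(men_pref, women_pref):
    """Try all n! matchings -- exponential in the (compact) input size."""
    n = len(men_pref)
    for matching in permutations(range(n)):  # matching[m] = woman paired with man m
        if is_stable(matching, men_pref, women_pref):
            return matching
    return None

# Tiny instance, n = 3: the input is 2n = 6 lists of length 3 (N = 2n^2 = 18),
# yet the search space already holds 3! = 6 candidate matchings.
men = [[0, 1, 2], [1, 0, 2], [0, 1, 2]]
women = [[1, 0, 2], [0, 1, 2], [0, 1, 2]]
print(brute_force_stable_matching(men, women))  # → (0, 1, 2)
```

Already at n = 20 this loop would face 20! ≈ 2.4×10¹⁸ candidates, which is why the slides call brute force too naïve to be acceptable.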
Proposed Definition 2 (2/2)
- Proposed definition 2: An algorithm is efficient if it achieves qualitatively better worst-case performance, at an analytical level, than brute-force search.
- Q: What counts as qualitatively better performance?
- A: We consider the actual running time of algorithms more carefully, and try to quantify what a reasonable running time would be.
Proposed Definition 3 (1/3)
- We expect a good algorithm to have a good scaling property.
  - E.g., when the input size doubles, the algorithm should slow down by only some constant factor C.
- Polynomial running time: There exist constants c > 0 and d > 0 such that on every input of size N, the running time is bounded by cN^d primitive computational steps.
  - Proportional to N^d; the smaller d is, the better the scalability.
  - The algorithm has a polynomial running time if this bound holds for some c and d.
- Q: What is a primitive computational step?
- A: Each step corresponds to…
  - A single assembly-language instruction on a standard processor
  - One line of a standard programming language
  - Simple enough, and independent of the input size N
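The doubling property above can be made concrete: if T(N) = cN^d, then T(2N)/T(N) = 2^d, a constant independent of N. A quick sketch (the values c = 1.62 and d = 2 are arbitrary illustrative choices, not from the slides):

```python
# Doubling property of a polynomial bound T(N) = c * N**d:
# doubling the input multiplies the bound by the constant 2**d, for any N.

def T(N, c=1.62, d=2):
    """A hypothetical polynomial running-time bound."""
    return c * N ** d

for N in [100, 1000, 10_000]:
    print(N, T(2 * N) / T(N))  # ratio is always 2**d = 4.0
```

By contrast, an exponential bound like 2^N grows by the factor 2^N itself when N doubles, so no constant C works.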
Proposed Definition 3 (2/3)
- Proposed definition 3: An algorithm is efficient if it has a polynomial running time.
- Justification: It really works in practice!
  - In practice, the polynomial-time algorithms that people develop almost always have low constants and low exponents.
  - Breaking through the exponential barrier of brute force typically exposes some crucial structure of the problem.
- Exceptions
  - Some polynomial-time algorithms have high constants and/or exponents and are useless in practice.
    - Although 6.02×10²³ × N²⁰ is technically polynomial time, it would be useless in practice.
  - Some exponential-time (or worse) algorithms are widely used because worst-case instances seem to be rare (e.g., the simplex method, Unix grep).
    - Such algorithms may run quickly in the average case.
Proposed Definition 3 (3/3)
- An algorithm is efficient if it has a polynomial running time.
  - It really works in practice.
  - The gulf between the growth rates of polynomial and exponential functions is enormous.
  - It allows us to ask about the existence or nonexistence of efficient algorithms as a well-defined question.

              n        n log₂ n  n²        n³         1.5^n       2^n        n!
n = 10        < 1 sec  < 1 sec   < 1 sec   < 1 sec    < 1 sec     < 1 sec    4 sec
n = 30        < 1 sec  < 1 sec   < 1 sec   < 1 sec    < 1 sec     18 min     10^25 yrs
n = 50        < 1 sec  < 1 sec   < 1 sec   < 1 sec    11 min      36 yrs     very long
n = 100       < 1 sec  < 1 sec   < 1 sec   1 sec      12,892 yrs  10^17 yrs  very long
n = 1,000     < 1 sec  < 1 sec   1 sec     18 min     very long   very long  very long
n = 10,000    < 1 sec  < 1 sec   2 min     12 days    very long   very long  very long
n = 100,000   < 1 sec  2 sec     3 hrs     32 yrs     very long   very long  very long
n = 1,000,000 1 sec    20 sec    12 days   31,710 yrs very long   very long  very long

(n through n³: polynomial time; 1.5^n, 2^n, n!: non-polynomial time)
The running times (rounded up) of different algorithms on inputs of increasing
sizes, for a processor performing a million high-level instructions per second.
(very long: > 10^25 yrs)
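A few of the table's entries can be reproduced directly from its stated assumption of 10⁶ instructions per second (a rough sketch; the table rounds up, so only orders of magnitude match):

```python
import math

STEPS_PER_SEC = 1_000_000  # the table assumes 10^6 high-level instructions/sec

def seconds(steps):
    """Wall-clock time to execute the given number of primitive steps."""
    return steps / STEPS_PER_SEC

print(seconds(1_000_000 ** 1))               # n = 1,000,000, f = n: 1.0 second
print(seconds(10_000 ** 2) / 60)             # n = 10,000, f = n^2: ~1.7 min ("2 min" rounded up)
print(seconds(math.factorial(10)))           # n = 10, f = n!: ~3.6 s ("4 sec" rounded up)
print(seconds(2 ** 50) / (3600 * 24 * 365))  # n = 50, f = 2^n: ~36 years
```

The same two-line helper reproduces any cell; the "very long" entries simply overflow any humanly meaningful unit.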
Intrinsic computational tractability
O, Ω, Θ
Asymptotic Order of Growth
Intrinsic Computational Tractability
- Intrinsic computational tractability: an algorithm's worst-case running time on inputs of size n grows at a rate that is at most proportional to some function f(n).
  - f(n): an upper bound on the running time of the algorithm
- Q: What's wrong with "1.62n² + 3.5n + 8 steps"?
- A: We'd like to say it grows like n², up to constant factors
  - Too detailed
  - Meaningless precision
  - Hard to classify its efficiency
    - Our ultimate goal is to identify broad classes of algorithms that have similar behavior.
    - We'd actually like to classify running times at a coarser level of granularity so that similarities among different algorithms, and among different problems, show up more clearly.
(Insensitive to constant factors and low-order terms)
O, Ω, Θ
- Let T(n) be a function describing the worst-case running time of a certain
  algorithm on an input of size n.
- Asymptotic upper bound: T(n) = O(f(n)) if there exist constants c > 0 and
  n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≤ c·f(n).
- Asymptotic lower bound: T(n) = Ω(f(n)) if there exist constants c > 0 and
  n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≥ c·f(n).
- Asymptotic tight bound: T(n) = Θ(f(n)) if T(n) is both O(f(n)) and Ω(f(n)).
  - Also, we can prove: let f and g be two functions such that
    lim_{n→∞} f(n)/g(n) exists and is equal to some number c > 0.
    Then f(n) = Θ(g(n)).
Example: O, Ω, Θ
- Q: T(n) = 1.62n² + 3.5n + 8. True or false?
  1. T(n) = O(n)
  2. T(n) = O(n²)
  3. T(n) = O(n³)
  4. T(n) = Ω(n)
  5. T(n) = Ω(n²)
  6. T(n) = Ω(n³)
  7. T(n) = Θ(n)
  8. T(n) = Θ(n²)
  9. T(n) = Θ(n³)
- A: True: 2, 3, 4, 5, 8. False: 1, 6, 7, 9. (T(n) = Θ(n²): n² and anything
  growing faster is a valid upper bound; n² and anything growing slower is a
  valid lower bound.)
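The upper- and lower-bound claims can be checked numerically against the definitions (a sketch; the witness constants c and n₀ below are my own choices, not from the slides):

```python
def T(n):
    return 1.62 * n**2 + 3.5 * n + 8

# T(n) = O(n^2): witnesses c = 2, n0 = 12 work, since
# T(n) <= 2n^2 iff 0.38 n^2 >= 3.5 n + 8, which holds from n = 12 on.
assert all(T(n) <= 2 * n**2 for n in range(12, 10_000))

# T(n) = Omega(n^2): witnesses c = 1.62, n0 = 0 work,
# because the lower-order terms 3.5n + 8 only push T(n) higher.
assert all(T(n) >= 1.62 * n**2 for n in range(10_000))

# T(n) is NOT O(n): for any fixed c, T(n)/n ~ 1.62 n eventually exceeds it.
c = 1000
assert T(10_000) > c * 10_000
print("all bound checks pass")
```

A finite scan is of course not a proof, but it mirrors how the (c, n₀) witnesses in the definition are used.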
(In practice, we want an O(·) bound as tight as possible, because we seek the
intrinsic running time.)
Intrinsic computational tractability
O, W, Q
Asymptotic Order of Growth
Complexity
13
IRIS H.-R. JIANG
Intrinsic Computational Tractability
¨ Intrinsic computational tractability: An algorithm’s worst-case
running time on inputs of size n grows at a rate that is at most
proportional to some function f(n).
¤ f(n): an upper bound of the running time of the algorithm
¨ Q: What’s wrong with 1.62n2 + 3.5n + 8 steps?
¨ A: We’d like to say it grows like n2 up to constant factors
¤ Too detailed
¤ Meaningless
¤ Hard to classify its efficiency
n Our ultimate goal is to identify broad classes of algorithms that
have similar behavior.
n We’d actually like to classify running times at a coarser level of
granularity so that similarities among different algorithms, and
among different problems, show up more clearly.
Complexity
14
Insensitive to constant factors
and low-order terms
IRIS H.-R. JIANG
O, W, Q
¨ Let T(n) be a function to describe the worst-case running time
of a certain algorithm on an input of size n.
¨ Asymptotic upper bound: T(n) = O(f(n)) if there exist constants
c > 0 and n0 ³ 0 such that for all n ³ n0 we have T(n) £ cf(n).
¨ Asymptotic lower bound: T(n) = W(f(n)) if there exist constants
c > 0 and n0 ³ 0 such that for all n ³ n0 we have T(n) ³ cf(n).
¨ Asymptotic tight bound: T(n) = Q(f(n)) if T(n) is both O(f(n)) and
W(f(n)).
¤ Also, we can prove:
Let f and g be two functions that
exists and is equal to some number c > 0.
Then f(n) = Q(g(n)).
Complexity
15
)
(
)
(
lim
n
g
n
f
n ¥
®
IRIS H.-R. JIANG
Example: O, W, Q
¨ Q: T(n) = 1.62n2 + 3.5n + 8, true or false?
1. T(n) = O(n)
2. T(n) = O(n2)
3. T(n) = O(n3)
4. T(n) = W(n)
5. T(n) = W(n2)
6. T(n) = W(n3)
7. T(n) = Q(n)
8. T(n) = Q(n2)
9. T(n) = Q(n3)
¨ A:
Complexity
16
In practice, we
expect an O(.) as
tight as possible
because we seek an
intrinsic running time.
Intrinsic computational tractability
O, W, Q
Asymptotic Order of Growth
Complexity
13
IRIS H.-R. JIANG
Intrinsic Computational Tractability
¨ Intrinsic computational tractability: An algorithm’s worst-case
running time on inputs of size n grows at a rate that is at most
proportional to some function f(n).
¤ f(n): an upper bound of the running time of the algorithm
¨ Q: What’s wrong with 1.62n2 + 3.5n + 8 steps?
¨ A: We’d like to say it grows like n2 up to constant factors
¤ Too detailed
¤ Meaningless
¤ Hard to classify its efficiency
n Our ultimate goal is to identify broad classes of algorithms that
have similar behavior.
n We’d actually like to classify running times at a coarser level of
granularity so that similarities among different algorithms, and
among different problems, show up more clearly.
Complexity
14
Insensitive to constant factors
and low-order terms
IRIS H.-R. JIANG
O, W, Q
¨ Let T(n) be a function to describe the worst-case running time
of a certain algorithm on an input of size n.
¨ Asymptotic upper bound: T(n) = O(f(n)) if there exist constants
c > 0 and n0 ³ 0 such that for all n ³ n0 we have T(n) £ cf(n).
¨ Asymptotic lower bound: T(n) = W(f(n)) if there exist constants
c > 0 and n0 ³ 0 such that for all n ³ n0 we have T(n) ³ cf(n).
¨ Asymptotic tight bound: T(n) = Q(f(n)) if T(n) is both O(f(n)) and
W(f(n)).
¤ Also, we can prove:
Let f and g be two functions that
exists and is equal to some number c > 0.
Then f(n) = Q(g(n)).
Complexity
15
)
(
)
(
lim
n
g
n
f
n ¥
®
IRIS H.-R. JIANG
Example: O, W, Q
¨ Q: T(n) = 1.62n2 + 3.5n + 8, true or false?
1. T(n) = O(n)
2. T(n) = O(n2)
3. T(n) = O(n3)
4. T(n) = W(n)
5. T(n) = W(n2)
6. T(n) = W(n3)
7. T(n) = Q(n)
8. T(n) = Q(n2)
9. T(n) = Q(n3)
¨ A:
Complexity
16
In practice, we
expect an O(.) as
tight as possible
because we seek an
intrinsic running time.
Intrinsic computational tractability
O, W, Q
Asymptotic Order of Growth
Complexity
13
IRIS H.-R. JIANG
Intrinsic Computational Tractability
¨ Intrinsic computational tractability: An algorithm’s worst-case
running time on inputs of size n grows at a rate that is at most
proportional to some function f(n).
¤ f(n): an upper bound of the running time of the algorithm
¨ Q: What’s wrong with 1.62n2 + 3.5n + 8 steps?
¨ A: We’d like to say it grows like n2 up to constant factors
¤ Too detailed
¤ Meaningless
¤ Hard to classify its efficiency
n Our ultimate goal is to identify broad classes of algorithms that
have similar behavior.
n We’d actually like to classify running times at a coarser level of
granularity so that similarities among different algorithms, and
among different problems, show up more clearly.
Complexity
14
Insensitive to constant factors
and low-order terms
IRIS H.-R. JIANG
O, W, Q
¨ Let T(n) be a function describing the worst-case running time
of a certain algorithm on an input of size n.
¨ Asymptotic upper bound: T(n) = O(f(n)) if there exist constants
c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≤ c·f(n).
¨ Asymptotic lower bound: T(n) = Ω(f(n)) if there exist constants
c > 0 and n₀ ≥ 0 such that for all n ≥ n₀ we have T(n) ≥ c·f(n).
¨ Asymptotic tight bound: T(n) = Θ(f(n)) if T(n) is both O(f(n)) and
Ω(f(n)).
¤ Also, we can prove:
Let f and g be two functions such that lim(n→∞) f(n)/g(n)
exists and is equal to some number c > 0.
Then f(n) = Θ(g(n)).
Complexity
15
IRIS H.-R. JIANG
Example: O, Ω, Θ
¨ Q: T(n) = 1.62n² + 3.5n + 8, true or false?
1. T(n) = O(n)
2. T(n) = O(n²)
3. T(n) = O(n³)
4. T(n) = Ω(n)
5. T(n) = Ω(n²)
6. T(n) = Ω(n³)
7. T(n) = Θ(n)
8. T(n) = Θ(n²)
9. T(n) = Θ(n³)
¨ A: 2, 3, 4, 5, and 8 are true; the rest are false.
Complexity
16
In practice, we expect an O(·) bound as tight as possible, because we seek the intrinsic running time.
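The answer above can also be checked numerically; a minimal Python sketch (the witnesses c = 2, n₀ = 12 are one valid choice, not the only one) verifying both T(n) = O(n²) and T(n) = Ω(n²):

```python
def T(n):
    # The running time from the example: T(n) = 1.62n^2 + 3.5n + 8
    return 1.62 * n**2 + 3.5 * n + 8

# Upper-bound witnesses: c = 2, n0 = 12 give T(n) <= c*n^2 for all n >= n0.
# Lower-bound witnesses: c = 1.62, n0 = 1 give T(n) >= c*n^2 for all n >= 1.
for n in range(12, 10_000):
    assert T(n) <= 2.0 * n**2    # T(n) = O(n^2)
    assert T(n) >= 1.62 * n**2   # T(n) = Omega(n^2)
```

Of course, a finite check is only a sanity test; the actual proof is the algebra 0.38n² ≥ 3.5n + 8 for all n ≥ 12.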
IRIS H.-R. JIANG
Abuse of Notation
¨ Q: Why use equality in T(n) = O(f(n))?
¤ The relation is asymmetric:
n f(n) = 5n³; g(n) = 3n²
n f(n) = O(n³) = g(n),
n but f(n) ≠ g(n).
¤ Better notation: T(n) ∈ O(f(n))
n O(f(n)) forms a set
¨ Cf. “is” in English:
¤ Aristotle is a man, but a man isn’t necessarily Aristotle.
Complexity
17
IRIS H.-R. JIANG
Properties
¨ Transitivity:
If f(n) = P(g(n)) and g(n) = P(h(n)), then f(n) = P(h(n)),
where P = O, Ω, or Θ.
¨ Rule of sums:
f(n) + g(n) = P(max{f(n), g(n)}), where P = O, Ω, or Θ.
¨ Rule of products:
If f₁(n) = P(g₁(n)) and f₂(n) = P(g₂(n)),
then f₁(n)f₂(n) = P(g₁(n)g₂(n)), where P = O, Ω, or Θ.
¨ Transpose symmetry:
f(n) = O(g(n)) iff g(n) = Ω(f(n)).
¨ Reflexivity:
f(n) = P(f(n)), where P = O, Ω, or Θ.
¨ Symmetry:
f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).
Complexity
18
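The rule of sums can be sanity-checked numerically; a small Python sketch (illustrative only, with f(n) = n² and g(n) = n·lg n, where the constant factor 2 is one valid witness):

```python
import math

f = lambda n: n * n               # f(n) = n^2
g = lambda n: n * math.log2(n)    # g(n) = n lg n

for n in range(2, 100_000, 997):
    s = f(n) + g(n)
    m = max(f(n), g(n))
    # Rule of sums: m <= f(n)+g(n) <= 2m, so f(n)+g(n) = Theta(max{f(n), g(n)})
    assert m <= s <= 2 * m
```

The bound m ≤ f + g ≤ 2m holds for any nonnegative f and g, which is exactly why the slower-growing term can be dropped.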
IRIS H.-R. JIANG
Summary: Polynomial-Time Complexity
¨ Polynomial running time: O(p(n))
¤ p(n) is a polynomial function of the input size n (p(n) = n^O(1))
¤ a.k.a. polynomial-time complexity
¨ Order (the last three are not polynomial):
¤ O(1): constant
¤ O(log n): logarithmic
¤ O(n^0.5): sublinear
¤ O(n): linear
¤ O(n log n): loglinear
¤ O(n²): quadratic
¤ O(n³): cubic
¤ O(n⁴): quartic
¤ O(2ⁿ): exponential
¤ O(n!): factorial
¤ O(nⁿ)
Complexity
19
Faster
Slower
Arrays and lists
Implementing Gale-Shapley Algorithm
Complexity
20
IRIS H.-R. JIANG
Recap: Gale-Shapley Algorithm
¨ Correctness:
¤ Termination: G-S terminates after at most n² iterations.
¤ Perfection: Everyone gets married.
¤ Stability: The marriages are stable.
¨ Male-optimal and female-pessimal
¨ All executions yield the same matching
Complexity
21
O(n²)?
Gale-Shapley
1. initialize each person to be free
2. while (some man m is free and hasn't proposed to every woman) do
3. w = highest ranked woman in m's list to whom m has not yet proposed
4. if (w is free) then
5. (m, w) become engaged
6. else if (w prefers m to her fiancé m') then
7. (m, w) become engaged
8. m' becomes free
9. return the set S of engaged pairs
IRIS H.-R. JIANG
What to Do?
¨ Goal: Each iteration takes O(1) time, hence O(n²) in total.
¤ Line 2: Identify a free man.
¤ Line 3: For a man m, identify the highest-ranked woman to whom
he has not yet proposed.
¤ Line 4: For a woman w, decide if w is currently engaged, and if
so, identify her current partner.
¤ Line 6: For a woman w and two men m and m', decide which of m
and m' is preferred by w.
Complexity
22
Gale-Shapley
1. initialize each person to be free
2. while (some man m is free and hasn't proposed to every woman) do
3. w = highest ranked woman in m's list to whom m has not yet proposed
4. if (w is free) then
5. (m, w) become engaged
6. else if (w prefers m to her fiancé m') then
7. (m, w) become engaged
8. m' becomes free
9. return the set S of engaged pairs
IRIS H.-R. JIANG
Representing Men and Women
¨ Q: How to represent men and women?
¨ A: Assume the sets of men and women are both {1, …, n}.
¤ Record them in two arrays
¨ Q: How to store men’s preference lists?
¨ A: Record them in an n×n array (or in n arrays, each of length n)
Complexity
23
Women (ID 1, 2, 3): Amy, Bertha, Clare
Men (ID 1, 2, 3): Xavier, Yancey, Zeus
[Tables: Men’s Preference Profile and Women’s Preference Profile, each an n×n grid of names]
ManPref[m, i]: ith woman in m’s list
IRIS H.-R. JIANG
Identifying a Free Man
¨ Q: Line 2: How to identify a free man in O(1) time?
¨ A:
¤ Since the set of free men is dynamic, a static array is not good for
insertion/deletion.
¤ How about a linked list? Its head can be accessed in O(1) time,
and it grows and shrinks as needed.
n Read/insert/delete from first
Complexity
24
[Diagram: a singly linked list first → 1 → 2 → 3 → ∅, each node holding data and a link (pointer); “delete 1” and “insert 1” both update first in O(1)]
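The “delete 1 / insert 1” picture above can be sketched in Python; here a deque stands in for the linked list (illustrative only — any structure with O(1) access at the front works):

```python
from collections import deque

# Free men kept with O(1) access at the front, mirroring the slide's
# linked list first -> 1 -> 2 -> 3: read/insert/delete at `first`.
free_men = deque([1, 2, 3])

m = free_men.popleft()     # "delete 1": man 1 leaves the free list in O(1)
assert m == 1 and list(free_men) == [2, 3]

free_men.appendleft(m)     # "insert 1": a bumped man becomes free again in O(1)
assert list(free_men) == [1, 2, 3]
```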
IRIS H.-R. JIANG
Whom to Propose?
¨ Q: Line 3: For a man m, how to identify the highest-ranked
woman to whom he has not yet proposed, in O(1) time?
¨ A:
¤ An extra array Next[] indicates for each man m the position of the
next woman he will propose to on his list.
n m will propose to w = ManPref[m, Next[m]]
n Next[m]++ after proposal, regardless of rejection/acceptance
Complexity
25
[Tables: Next[] for men Xavier, Yancey, Zeus (ID 1, 2, 3), shown beside ManPref[ , ]: initial value (1, 1, 1); in progress (2, 2, 3)]
IRIS H.-R. JIANG
Identifying Her Fiancé
¨ Q: Line 4: For a woman w, how to decide if w is currently
engaged? If so, how to identify her current fiancé in O(1) time?
¨ A:
¤ An array Current[] of length n.
n w’s current fiancé = Current[w]
n Initially, Current[w] = 0 (unmatched)
Complexity
26
[Tables: Current[] for women Amy, Bertha, Clare (ID 1, 2, 3), shown beside WomanPref[ , ] and the men Xavier, Yancey, Zeus (ID 1, 2, 3): initial value (0, 0, 0); in progress (1, 3, 0)]
IRIS H.-R. JIANG
Women’s Preferences (1/2)
¨ Q: Line 6: For a woman w and two men m and m', how to
decide whether w prefers m or m', in O(1) time?
¨ A:
¤ An array WomanPref[w, i] is analogous to ManPref[m, i]
¨ Q: Does it work? Why?
¨ A:
¤ No. We would have to scan WomanPref[w, ] to extract a man’s rank, which takes O(n) time
¤ When man 1 proposes to woman 1 …
Complexity
27
[Example: woman Amy (ID 1), Current fiancé = 6, with her WomanPref[ , ] row listing men in positions 1st–6th]
IRIS H.-R. JIANG
Women’s Preferences (2/2)
¨ Q: Can we do better?
¨ A: Yes.
¤ Create an n×n array Ranking[ , ] at the beginning.
¤ Ranking[w, m] = the rank of m in w’s preference list (the inverse list)
¤ When man 1 proposes to woman 1, compare the two men in O(1) time.
¤ Construct Ranking[ , ] while parsing the inputs, in O(n²) time.
Complexity
28
[Example: woman Amy (ID 1), Current fiancé = 6; her WomanPref[ , ] row (1st–6th) and the corresponding Ranking[ , ] row indexed by men 1–6]
for (i = 1 to n) do
Ranking[w, WomanPref[w, i]] = i
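Putting the pieces together (the free-man list, Next[], Current[], and Ranking[ , ]), the whole algorithm runs in O(n²). A Python sketch under these conventions, using 0-based indices; the 3×3 preference lists at the bottom are invented for illustration, not taken from the slides:

```python
from collections import deque

def gale_shapley(man_pref, woman_pref):
    """O(n^2) stable matching; man_pref[m][i] = i-th choice of man m."""
    n = len(man_pref)
    # Ranking[w][m] = rank of man m in w's list, built upfront in O(n^2)
    ranking = [[0] * n for _ in range(n)]
    for w in range(n):
        for i, m in enumerate(woman_pref[w]):
            ranking[w][m] = i
    nxt = [0] * n               # Next[m]: position of next woman on m's list
    current = [None] * n        # Current[w]: w's fiance, None if unmatched
    free_men = deque(range(n))  # O(1) access at the front, like a linked list
    while free_men:
        m = free_men[0]
        w = man_pref[m][nxt[m]]  # highest-ranked woman not yet proposed to
        nxt[m] += 1              # advance regardless of the outcome
        if current[w] is None:
            current[w] = m
            free_men.popleft()
        elif ranking[w][m] < ranking[w][current[w]]:
            free_men.popleft()
            free_men.appendleft(current[w])  # bumped fiance is free again
            current[w] = m
        # else: w rejects m, who stays free and tries his next choice
    return [(current[w], w) for w in range(n)]

# Illustrative 3x3 instance (preferences invented for the example):
man_pref = [[0, 1, 2], [1, 0, 2], [0, 2, 1]]
woman_pref = [[1, 0, 2], [0, 1, 2], [2, 1, 0]]
print(gale_shapley(man_pref, woman_pref))  # -> [(0, 0), (1, 1), (2, 2)]
```

Every operation inside the while loop is O(1), and the loop body runs at most n² times (each man advances nxt at most n times), matching the slides’ goal.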
IRIS H.-R. JIANG
What Did We Learn?
¨ The meaning of efficiency
¤ Efficiency ⇔ polynomial-time complexity
¨ The analysis of an algorithm
¤ We try to figure out the intrinsic running time before
implementing.
¤ Data structures play an important role; if necessary, preprocess
the data into a proper form.
Complexity
29
IRIS H.-R. JIANG
Running Times Comparison
¨ Q: Arrange the following list of functions in ascending order of
growth rate.
¤ f1(n) = n^(1/3)
¤ f2(n) = lg n (lg = log₂)
¤ f3(n) = 2^√(lg n)
¨ A: Convert them into a common form:
¤ Take the logarithm!
¤ Let z = lg n
¤ lg f1(n) = lg n^(1/3) = (1/3) lg n = (1/3)z
¤ lg f2(n) = lg (lg n) = lg z
¤ lg f3(n) = lg (2^√(lg n)) = √(lg n) = √z
¤ Since lg z < √z < (1/3)z for large z: f2 < f3 < f1
¤ DIY the rest! (see Solved Exercise 1)
Complexity
30
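A numeric spot check (Python) of the ordering that falls out, lg n < 2^√(lg n) < n^(1/3); the choice n = 2⁶⁴ is arbitrary, any sufficiently large n shows the gaps:

```python
import math

n = 2 ** 64                         # arbitrary large n
f1 = n ** (1 / 3)                   # n^(1/3)  ~ 2.6e6
f2 = math.log2(n)                   # lg n     = 64
f3 = 2 ** math.sqrt(math.log2(n))   # 2^sqrt(lg n) = 2^8 = 256

# Ascending order of growth: f2 < f3 < f1
assert f2 < f3 < f1
```

Note that the gaps keep widening as n grows, which is what the logarithm comparison (1/3)z vs √z vs lg z predicts.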
%in kaalfontein+277-882-255-28 abortion pills for sale in kaalfontein%in kaalfontein+277-882-255-28 abortion pills for sale in kaalfontein
%in kaalfontein+277-882-255-28 abortion pills for sale in kaalfontein
 
%in Midrand+277-882-255-28 abortion pills for sale in midrand
%in Midrand+277-882-255-28 abortion pills for sale in midrand%in Midrand+277-882-255-28 abortion pills for sale in midrand
%in Midrand+277-882-255-28 abortion pills for sale in midrand
 
+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...
+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...
+971565801893>>SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHAB...
 
Payment Gateway Testing Simplified_ A Step-by-Step Guide for Beginners.pdf
Payment Gateway Testing Simplified_ A Step-by-Step Guide for Beginners.pdfPayment Gateway Testing Simplified_ A Step-by-Step Guide for Beginners.pdf
Payment Gateway Testing Simplified_ A Step-by-Step Guide for Beginners.pdf
 
Azure_Native_Qumulo_High_Performance_Compute_Benchmarks.pdf
Azure_Native_Qumulo_High_Performance_Compute_Benchmarks.pdfAzure_Native_Qumulo_High_Performance_Compute_Benchmarks.pdf
Azure_Native_Qumulo_High_Performance_Compute_Benchmarks.pdf
 
call girls in Vaishali (Ghaziabad) 🔝 >༒8448380779 🔝 genuine Escort Service 🔝✔️✔️
call girls in Vaishali (Ghaziabad) 🔝 >༒8448380779 🔝 genuine Escort Service 🔝✔️✔️call girls in Vaishali (Ghaziabad) 🔝 >༒8448380779 🔝 genuine Escort Service 🔝✔️✔️
call girls in Vaishali (Ghaziabad) 🔝 >༒8448380779 🔝 genuine Escort Service 🔝✔️✔️
 
Exploring the Best Video Editing App.pdf
Exploring the Best Video Editing App.pdfExploring the Best Video Editing App.pdf
Exploring the Best Video Editing App.pdf
 
The Guide to Integrating Generative AI into Unified Continuous Testing Platfo...
The Guide to Integrating Generative AI into Unified Continuous Testing Platfo...The Guide to Integrating Generative AI into Unified Continuous Testing Platfo...
The Guide to Integrating Generative AI into Unified Continuous Testing Platfo...
 
Unlocking the Future of AI Agents with Large Language Models
Unlocking the Future of AI Agents with Large Language ModelsUnlocking the Future of AI Agents with Large Language Models
Unlocking the Future of AI Agents with Large Language Models
 
Direct Style Effect Systems - The Print[A] Example - A Comprehension Aid
Direct Style Effect Systems -The Print[A] Example- A Comprehension AidDirect Style Effect Systems -The Print[A] Example- A Comprehension Aid
Direct Style Effect Systems - The Print[A] Example - A Comprehension Aid
 
%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein
%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein
%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein
 
How To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected WorkerHow To Troubleshoot Collaboration Apps for the Modern Connected Worker
How To Troubleshoot Collaboration Apps for the Modern Connected Worker
 
The title is not connected to what is inside
The title is not connected to what is insideThe title is not connected to what is inside
The title is not connected to what is inside
 
Software Quality Assurance Interview Questions
Software Quality Assurance Interview QuestionsSoftware Quality Assurance Interview Questions
Software Quality Assurance Interview Questions
 
Shapes for Sharing between Graph Data Spaces - and Epistemic Querying of RDF-...
Shapes for Sharing between Graph Data Spaces - and Epistemic Querying of RDF-...Shapes for Sharing between Graph Data Spaces - and Epistemic Querying of RDF-...
Shapes for Sharing between Graph Data Spaces - and Epistemic Querying of RDF-...
 
%in kempton park+277-882-255-28 abortion pills for sale in kempton park
%in kempton park+277-882-255-28 abortion pills for sale in kempton park %in kempton park+277-882-255-28 abortion pills for sale in kempton park
%in kempton park+277-882-255-28 abortion pills for sale in kempton park
 
Define the academic and professional writing..pdf
Define the academic and professional writing..pdfDefine the academic and professional writing..pdf
Define the academic and professional writing..pdf
 
Introducing Microsoft’s new Enterprise Work Management (EWM) Solution
Introducing Microsoft’s new Enterprise Work Management (EWM) SolutionIntroducing Microsoft’s new Enterprise Work Management (EWM) Solution
Introducing Microsoft’s new Enterprise Work Management (EWM) Solution
 

algorithm_2algorithm_analysis.pdf

• 1–4. Iris Hui-Ru Jiang, Fall 2014 — CHAPTER 2: ALGORITHM ANALYSIS

Outline
- Content:
  - Computational tractability
  - Asymptotic order of growth
  - Implementing the Gale-Shapley algorithm
  - Survey of common running times (self-study)
  - Priority queues (postponed to Section 4.5)
- Reading: Chapter 2

Computational Efficiency
- Q: What is a good algorithm?
- A:
  - Correct: proofs
  - Efficient: runs quickly and consumes little memory
    - Efficiency ⇔ resource requirements ⇔ complexity
- Q: How do the resource requirements scale with increasing input size?
  - Time: running time
  - Space: memory usage

Running Time
"As soon as an Analytical Engine exists, it will necessarily guide the future course of the science. Whenever any result is sought by its aid, the question will then arise — by what course of calculation can these results be arrived at by the machine in the shortest time?" -- Charles Babbage (1864)
In other words: how many times do you have to turn the crank?
(Images: Charles Babbage and the Analytical Engine)
• 5–8. Computational Tractability (defining efficiency)

Proposed Definition 1 (1/2)
- Q: How should we turn the fuzzy notion of an "efficient" algorithm into something more concrete?
- A: Proposed definition 1: An algorithm is efficient if, when implemented, it runs quickly on real input instances.
- Q: What's wrong?
- A: Crucial things are missing:
  - Where, and how well, do we implement the algorithm?
    - Bad algorithms can run quickly on small cases on fast machines.
  - What is a real input instance?
    - We don't know the full range of input instances that arise in practice.
  - How well, or badly, does the algorithm scale as problem sizes grow to unexpected levels?
    - Two algorithms may perform comparably on inputs of size 100; multiply the input size tenfold, and one will still run quickly while the other consumes a huge amount of time.

Proposed Definition 1 (2/2)
- Q: What are we asking for?
- A: A concrete definition of efficiency that is
  - platform-independent,
  - instance-independent, and
  - of predictive value with respect to increasing input sizes.
- E.g., the stable matching problem:
  - Input size: N = the total size of the preference lists
    - n men and n women: 2n lists, each of length n
    - N = 2n²
  - Describe an algorithm at a high level.
  - Analyze its running time mathematically as a function of N.

Proposed Definition 2 (1/2)
- We will focus on the worst-case running time:
  - a bound on the largest possible running time the algorithm could have over all inputs of a given size N.
- Q: How about average-case analysis?
  - The performance averaged over random instances.
- A: But how do we generate a random input?
  - We don't know the full range of input instances.
    - Good on one; poor on another.
- Q: On what basis can we say a running-time bound is good?
- A: Compare with brute-force search over the solution space:
  - Try all possibilities and see if any one of them works.
  - One thing these problems have in common: a compact representation implicitly specifies a giant search space ⇒ brute force is too slow!
    - It typically takes 2^N time or worse for inputs of size N.
    - Too naive to be acceptable in practice.
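To make the brute-force baseline concrete, here is a minimal sketch (an illustration, not code from the slides) of brute-force search for the stable matching problem: the input is a compact set of preference lists of total size N = 2n², yet the search space holds all n! perfect matchings. The function names and the tiny n = 2 instance below are invented for the example.

```python
# Brute-force stable matching: enumerate all n! perfect matchings and
# return the first one with no instability. Illustrative sketch only.
from itertools import permutations
from math import factorial

def is_stable(matching, men_pref, women_pref):
    """matching[m] = w. Return False if some man m and woman w2 prefer
    each other over their assigned partners (an instability)."""
    partner_of_w = {w: m for m, w in matching.items()}
    for m, w in matching.items():
        # Women that m ranks strictly above his assigned partner w:
        for w2 in men_pref[m][:men_pref[m].index(w)]:
            m2 = partner_of_w[w2]
            # Does w2 also prefer m over her current partner m2?
            if women_pref[w2].index(m) < women_pref[w2].index(m2):
                return False
    return True

def brute_force_stable(men_pref, women_pref):
    men = list(men_pref)
    women = list(women_pref)
    for perm in permutations(women):            # n! candidate matchings
        matching = dict(zip(men, perm))
        if is_stable(matching, men_pref, women_pref):
            return matching
    return None

# A tiny hypothetical instance with n = 2:
men_pref = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
women_pref = {"w1": ["m2", "m1"], "w2": ["m2", "m1"]}
print(brute_force_stable(men_pref, women_pref))
print(factorial(10))  # already 3,628,800 candidates at n = 10
```

Even this toy search illustrates the slide's point: the number of candidates grows as n!, far faster than any polynomial in the input size N, which is why Gale-Shapley's polynomial bound counts as qualitatively better.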
• 9–12. Proposed Definition 2 (2/2)
- Proposed definition 2: An algorithm is efficient if it achieves qualitatively better worst-case performance, at an analytical level, than brute-force search.
- Q: What is qualitatively better performance?
- A: We consider the actual running time of algorithms more carefully, and try to quantify what a reasonable running time would be.

Proposed Definition 3 (1/3)
- We expect a good algorithm to have a good scaling property:
  - e.g., when the input size doubles, the algorithm should slow down only by some constant factor C.
- Polynomial running time: there exist constants c > 0 and d > 0 such that on every input of size N, the running time is bounded by cN^d primitive computational steps.
  - Proportional to N^d; the smaller d, the better the scalability.
  - The algorithm has a polynomial running time if this bound holds for some c and d.
- Q: What is a primitive computational step?
- A: Each step corresponds to:
  - a single assembly-language instruction on a standard processor, or
  - one line of a standard programming language;
  - simple enough, and independent of the input size N.

Proposed Definition 3 (2/3)
- Proposed definition 3: An algorithm is efficient if it has a polynomial running time.
- Justification: it really works in practice!
  - In practice, the polynomial-time algorithms that people develop almost always have low constants and low exponents.
  - Breaking through the exponential barrier of brute force typically exposes some crucial structure of the problem.
- Exceptions:
  - Some polynomial-time algorithms do have high constants and/or exponents, and are useless in practice.
    - Although 6.02×10²³ × N²⁰ is technically polynomial time, it would be useless in practice.
  - Some exponential-time (or worse) algorithms are widely used because the worst-case instances seem to be rare (e.g., the simplex method, Unix grep).
    - This kind of algorithm may run quickly on average cases.

Proposed Definition 3 (3/3)
- An algorithm is efficient if it has a polynomial running time.
  - It really works in practice.
  - The gulf between the growth rates of polynomial and exponential functions is enormous.
  - It allows us to ask about the existence or nonexistence of efficient algorithms as a well-defined question.
- The running times (rounded up) of different algorithms on inputs of increasing size, for a processor performing a million high-level instructions per second (very long: > 10²⁵ yrs). The columns n through n³ are polynomial time; 1.5ⁿ, 2ⁿ, and n! are non-polynomial:

  input         | n       | n log₂ n | n²      | n³         | 1.5ⁿ       | 2ⁿ        | n!
  n = 10        | < 1 sec | < 1 sec  | < 1 sec | < 1 sec    | < 1 sec    | < 1 sec   | 4 sec
  n = 30        | < 1 sec | < 1 sec  | < 1 sec | < 1 sec    | < 1 sec    | 18 min    | 10²⁵ yrs
  n = 50        | < 1 sec | < 1 sec  | < 1 sec | < 1 sec    | 11 min     | 36 yrs    | very long
  n = 100       | < 1 sec | < 1 sec  | < 1 sec | 1 sec      | 12,892 yrs | 10¹⁷ yrs  | very long
  n = 1,000     | < 1 sec | < 1 sec  | 1 sec   | 18 min     | very long  | very long | very long
  n = 10,000    | < 1 sec | < 1 sec  | 2 min   | 12 days    | very long  | very long | very long
  n = 100,000   | < 1 sec | 2 sec    | 3 hrs   | 32 yrs     | very long  | very long | very long
  n = 1,000,000 | 1 sec   | 20 sec   | 12 days | 31,710 yrs | very long  | very long | very long
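A few entries of the table above, and the doubling argument behind the scaling property, can be recomputed directly. This is a small sketch assuming the slide's model of a processor doing one million high-level instructions per second; the helper names are invented for the example.

```python
# Recompute entries of the slide's running-time table, assuming a machine
# executing one million high-level instructions per second.
from math import log2, factorial

OPS_PER_SEC = 1_000_000

def seconds(steps):
    """Wall-clock seconds needed to execute `steps` primitive steps."""
    return steps / OPS_PER_SEC

# n = 10, f(n) = n!: 3,628,800 steps -> about 4 seconds, as in the table.
print(round(seconds(factorial(10))))              # 4

# n = 1,000,000, f(n) = n log2 n: ~2*10^7 steps -> about 20 seconds.
print(round(seconds(1_000_000 * log2(1_000_000))))  # 20

# Scaling property of polynomial time: doubling n multiplies c*n^d by
# exactly 2^d, a constant factor C independent of n.
c, d = 3, 2
for n in (100, 1_000, 10_000):
    assert (c * (2 * n) ** d) / (c * n ** d) == 2 ** d  # always 4
```

By contrast, for an exponential bound like 2ⁿ, doubling n squares the step count, so no constant factor C can describe the slowdown; that gap is exactly what the table's "very long" entries show.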
  • 13. Intrinsic computational tractability; O, Ω, Θ. Asymptotic Order of Growth
Complexity 13

IRIS H.-R. JIANG Intrinsic Computational Tractability
¨ Intrinsic computational tractability: An algorithm's worst-case running time on inputs of size n grows at a rate that is at most proportional to some function f(n).
¤ f(n): an upper bound on the running time of the algorithm
¨ Q: What's wrong with 1.62n^2 + 3.5n + 8 steps?
¨ A: We'd like to say it grows like n^2 up to constant factors
¤ Too detailed
¤ Meaningless
¤ Hard to classify its efficiency
n Our ultimate goal is to identify broad classes of algorithms that have similar behavior.
n We'd actually like to classify running times at a coarser level of granularity so that similarities among different algorithms, and among different problems, show up more clearly.
Complexity 14 (Insensitive to constant factors and low-order terms)

IRIS H.-R. JIANG O, Ω, Θ
¨ Let T(n) be a function describing the worst-case running time of a certain algorithm on an input of size n.
¨ Asymptotic upper bound: T(n) = O(f(n)) if there exist constants c > 0 and n0 ≥ 0 such that for all n ≥ n0 we have T(n) ≤ cf(n).
¨ Asymptotic lower bound: T(n) = Ω(f(n)) if there exist constants c > 0 and n0 ≥ 0 such that for all n ≥ n0 we have T(n) ≥ cf(n).
¨ Asymptotic tight bound: T(n) = Θ(f(n)) if T(n) is both O(f(n)) and Ω(f(n)).
¤ Also, we can prove: Let f and g be two functions such that lim_{n→∞} f(n)/g(n) exists and is equal to some number c > 0. Then f(n) = Θ(g(n)).
Complexity 15

IRIS H.-R. JIANG Example: O, Ω, Θ
¨ Q: T(n) = 1.62n^2 + 3.5n + 8, true or false?
1. T(n) = O(n) 2. T(n) = O(n^2) 3. T(n) = O(n^3)
4. T(n) = Ω(n) 5. T(n) = Ω(n^2) 6. T(n) = Ω(n^3)
7. T(n) = Θ(n) 8. T(n) = Θ(n^2) 9. T(n) = Θ(n^3)
¨ A:
Complexity 16 (In practice, we expect an O(.) as tight as possible because we seek an intrinsic running time.)
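The definitions above can be made concrete for the running example T(n) = 1.62n^2 + 3.5n + 8 by exhibiting witnesses. A quick numerical sanity check (an illustrative sketch; the constants c = 2 and n0 = 12 are my choice, one pair among many that work):

```python
def T(n):
    return 1.62 * n**2 + 3.5 * n + 8

# Upper bound: T(n) <= 2*n^2 for all n >= 12, witnessing T(n) = O(n^2).
assert all(T(n) <= 2 * n**2 for n in range(12, 5000))

# Lower bound: T(n) >= 1*n^2 for all n >= 1, witnessing T(n) = Omega(n^2).
assert all(T(n) >= n**2 for n in range(1, 5000))

# Both bounds together give T(n) = Theta(n^2).
# (A finite check only illustrates the idea; the real proof is algebraic:
#  1.62n^2 + 3.5n + 8 <= 2n^2 once 0.38n^2 >= 3.5n + 8, i.e., n >= 12.)
```

Note how the same c and n0 would fail for O(n): no constant c makes 1.62n^2 ≤ cn hold for all large n.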
  • 17. IRIS H.-R. JIANG Abuse of Notation
¨ Q: Why use equality in T(n) = O(f(n))?
¤ Asymmetric:
n f(n) = 5n^3; g(n) = 3n^2
n f(n) = O(n^3) = g(n)
n but f(n) ≠ g(n).
¤ Better notation: T(n) ∈ O(f(n))
n O(f(n)) forms a set
¨ Cf. "Is" in English
¤ Aristotle is a man, but a man isn't necessarily Aristotle.
Complexity 17

IRIS H.-R. JIANG Properties
¨ Transitivity: If f(n) = P(g(n)) and g(n) = P(h(n)), then f(n) = P(h(n)), where P = O, Ω, or Θ.
¨ Rule of sums: f(n) + g(n) = P(max{f(n), g(n)}), where P = O, Ω, or Θ.
¨ Rule of products: If f1(n) = P(g1(n)) and f2(n) = P(g2(n)), then f1(n)f2(n) = P(g1(n)g2(n)), where P = O, Ω, or Θ.
¨ Transpose symmetry: f(n) = O(g(n)) iff g(n) = Ω(f(n)).
¨ Reflexivity: f(n) = P(f(n)), where P = O, Ω, or Θ.
¨ Symmetry: f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).
Complexity 18

IRIS H.-R. JIANG Summary: Polynomial-Time Complexity
¨ Polynomial running time: O(p(n))
¤ p(n) is a polynomial function of the input size n (p(n) = n^O(1))
¤ a.k.a. polynomial-time complexity
¨ Order, from faster to slower:
¤ O(1): constant
¤ O(log n): logarithmic
¤ O(n^0.5): sublinear
¤ O(n): linear
¤ O(n log n): loglinear
¤ O(n^2): quadratic
¤ O(n^3): cubic
¤ O(n^4): quartic
¤ O(2^n): exponential
¤ O(n!): factorial
¤ O(n^n)
Complexity 19

Arrays and lists: Implementing Gale-Shapley Algorithm
Complexity 20
  • 21. IRIS H.-R. JIANG Recap: Gale-Shapley Algorithm
¨ Correctness:
¤ Termination: G-S terminates after at most n^2 iterations.
¤ Perfection: Everyone gets married.
¤ Stability: The marriages are stable.
¨ Male-optimal and female-pessimal
¨ All executions yield the same matching
Complexity 21 (O(n^2)?)

Gale-Shapley
1. initialize each person to be free
2. while (some man m is free and hasn't proposed to every woman) do
3.   w = highest-ranked woman in m's list to whom m has not yet proposed
4.   if (w is free) then
5.     (m, w) become engaged
6.   else if (w prefers m to her fiancé m') then
7.     (m, w) become engaged
8.     m' becomes free
9. return the set S of engaged pairs

IRIS H.-R. JIANG What to Do?
¨ Goal: Each iteration takes O(1) time, hence O(n^2) in total.
¤ Line 2: Identify a free man.
¤ Line 3: For a man m, identify the highest-ranked woman to whom he hasn't yet proposed.
¤ Line 4: For a woman w, decide if w is currently engaged, and if so, identify her current partner.
¤ Line 6: For a woman w and two men m and m', decide which of m and m' is preferred by w.
Complexity 22

IRIS H.-R. JIANG Representing Men and Women
¨ Q: How to represent men and women?
¨ A: Assume the sets of men and women are both {1, …, n}.
¤ Record in two arrays
¨ Q: How to store men's preference lists?
¨ A: Record in an n×n array (or n arrays, each of length n)
Complexity 23
[Figure: men (Xavier, Yancey, Zeus) and women (Amy, Bertha, Clare), each numbered 1-3; the men's and women's preference profiles; ManPref[m, i] = the ith woman in m's list]

IRIS H.-R. JIANG Identifying a Free Man
¨ Q: Line 2: How to identify a free man in O(1) time?
¨ A:
¤ Since the set of free men is dynamic, a static array is not good for insertion/deletion.
¤ How about a linked list? It can be accessed in O(1) time and allows varying sizes.
n Read/insert/delete at the front (first)
Complexity 24
[Figure: a singly linked list 1 → 2 → 3 → 0 with a first pointer and data/link (pointer) fields, showing delete and insert of node 1 at the front]
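The linked list sketched on the slide can be realized with a link array and a first pointer, giving O(1) insert and delete at the front. A hedged Python sketch under the slide's conventions (men are 1..n, 0 marks the end of the list; the class and method names are mine):

```python
class FreeList:
    """Pool of free men 1..n as a singly linked list over an array."""
    def __init__(self, n):
        self.link = [0] * (n + 1)   # link[m] = next free man after m; 0 = end
        self.first = 0
        for m in range(n, 0, -1):   # initially every man is free
            self.insert(m)

    def insert(self, m):
        """O(1): push m at the front of the list."""
        self.link[m] = self.first
        self.first = m

    def delete_first(self):
        """O(1): pop and return a free man, or 0 if none remains."""
        m = self.first
        if m != 0:
            self.first = self.link[m]
        return m
```

Usage: `FreeList(3)` yields men 1, 2, 3 in order via `delete_first()`; a rejected man is returned to the pool with `insert(m)`.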
  • 25. IRIS H.-R. JIANG Whom to Propose?
¨ Q: Line 3: For a man m, how to identify the highest-ranked woman to whom he hasn't yet proposed, in O(1) time?
¨ A:
¤ An extra array Next[] indicates, for each man m, the position of the next woman he will propose to on his list.
n m will propose to w = ManPref[m, Next[m]]
n Next[m]++ after the proposal, regardless of rejection/acceptance
Complexity 25
[Figure: Next[] initialized to all 1s alongside ManPref[ , ]; later values such as 2, 2, 3 as the algorithm progresses]

IRIS H.-R. JIANG Identifying Her Fiancé
¨ Q: Line 4: For a woman w, how to decide if w is currently engaged? If so, how to identify her current fiancé in O(1) time?
¨ A:
¤ An array Current[] of length n.
n w's current fiancé = Current[w]
n Initially, Current[w] = 0 (unmatched)
Complexity 26
[Figure: Current[] initialized to all 0s; later values such as 1, 3, 0 as the algorithm progresses]

IRIS H.-R. JIANG Women's Preferences (1/2)
¨ Q: Line 6: For a woman w and two men m and m', how to decide whether w prefers m or m', in O(1) time?
¨ A:
¤ An array WomanPref[w, i], analogous to ManPref[m, i]
¨ Q: Does it work? Why?
¨ A:
¤ No. We must scan WomanPref[ , ] to extract a rank, which takes O(n) time.
¤ E.g., when man 1 proposes to woman 1, whose list is 5, 6, 2, 3, 1, 4 and whose current fiancé is 6 …
Complexity 27

IRIS H.-R. JIANG Women's Preferences (2/2)
¨ Q: Can we do better?
¨ A: Yes.
¤ Create an n×n array Ranking[ , ] at the beginning.
¤ Ranking[w, m] = the rank of m in w's preference list (the inverse list)
¤ When man 1 proposes to woman 1, compare them in O(1) time.
¤ Construct Ranking[ , ] while parsing the inputs, in O(n^2) time:
for (i = 1 to n) do Ranking[w, WomanPref[w, i]] = i
Complexity 28
[Figure: woman 1's preference list 5, 6, 2, 3, 1, 4 and the corresponding inverted Ranking row for men 1-6]
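Putting the pieces from slides 22-28 together (ManPref, WomanPref, Next, Current, Ranking, and a free-man pool), every iteration costs O(1) and the whole run is O(n^2). A sketch in Python; arrays are 1-indexed with index 0 unused, and a Python list stands in for the linked list of free men since push/pop at its end is also O(1). The function and variable names are mine:

```python
def gale_shapley(man_pref, woman_pref):
    """man_pref[m][i] / woman_pref[w][i]: ith choice (1-indexed; index 0 unused)."""
    n = len(man_pref) - 1
    nxt = [1] * (n + 1)           # Next[m]: position of next woman on m's list
    current = [0] * (n + 1)       # Current[w]: w's fiance; 0 means free
    ranking = [[0] * (n + 1) for _ in range(n + 1)]
    for w in range(1, n + 1):     # invert the preference lists once: O(n^2)
        for i in range(1, n + 1):
            ranking[w][woman_pref[w][i]] = i
    free = list(range(n, 0, -1))  # free men; each push/pop is O(1)
    while free:
        m = free[-1]
        w = man_pref[m][nxt[m]]   # highest-ranked woman m hasn't proposed to
        nxt[m] += 1               # advance regardless of the outcome
        m2 = current[w]
        if m2 == 0:                           # w is free: (m, w) become engaged
            current[w] = m
            free.pop()
        elif ranking[w][m] < ranking[w][m2]:  # w prefers m: m2 becomes free
            current[w] = m
            free.pop()
            free.append(m2)
        # otherwise w rejects m, who stays free and proposes again
    return {current[w]: w for w in range(1, n + 1)}   # man -> woman
```

For example, with both men ranking woman 1 first and woman 1 preferring man 1, `gale_shapley([0, [0, 1, 2], [0, 1, 2]], [0, [0, 1, 2], [0, 2, 1]])` returns `{1: 1, 2: 2}`.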
  • 29. IRIS H.-R. JIANG What Did We Learn?
¨ The meaning of efficiency
¤ Efficiency ⇔ polynomial-time complexity
¨ The analysis of an algorithm
¤ We try to figure out the intrinsic running time before implementing.
¤ Data structures play an important role; if necessary, preprocess the data into a proper form.
Complexity 29

IRIS H.-R. JIANG Running Times Comparison
¨ Q: Arrange the following list of functions in ascending order of growth rate.
¤ f1(n) = n^(1/3)
¤ f2(n) = lg n (lg = log2)
¤ f3(n) = 2^√(lg n)
¨ A: Convert them into a common form:
¤ Take the logarithm!
¤ Let z = lg n
¤ lg f1(n) = lg n^(1/3) = (1/3) lg n = (1/3)z
¤ lg f2(n) = lg (lg n) = lg z
¤ lg f3(n) = lg (2^√(lg n)) = √(lg n) = √z
¤ DIY the rest! (see Solved Exercise 1)
Complexity 30
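Finishing the log trick: with z = lg n, the transformed values satisfy lg z < √z < z/3 for all sufficiently large z, so the ascending order of growth is f2(n) = lg n, then f3(n) = 2^√(lg n), then f1(n) = n^(1/3). A small numerical check of that conclusion (my sketch of the exercise's answer, evaluated at one large point):

```python
from math import log2, sqrt

# Compare the logs of the three functions at n = 2^64, i.e., z = lg n = 64.
z = 64
lg_f2 = log2(z)    # lg f2(n) = lg z
lg_f3 = sqrt(z)    # lg f3(n) = sqrt(z)
lg_f1 = z / 3      # lg f1(n) = (1/3) z

# lg z < sqrt(z) < z/3 here, hence f2 < f3 < f1 for large n.
assert lg_f2 < lg_f3 < lg_f1
```

Since lg is increasing, ordering the logs orders the original functions; the strict inequalities hold for every z beyond a modest threshold, matching the asymptotic claim.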