3. Dynamic Programming
• Dynamic programming, like divide-and-conquer, is a general strategy for
solving computational problems.
4. Dynamic Programming
• Divide-and-conquer paradigm:
• Break the problem into several
disjoint subproblems that are
similar to the original one but
smaller in size.
• Solve the subproblems
recursively.
• Combine these solutions to
create a solution to the original
problem.
• Dynamic programming:
• Break the problem into several
overlapping subproblems that
are similar to the original one
but smaller in size.
• Solve each subproblem
recursively, but only once.
• Combine these solutions to
create a solution to the original
problem.
6. Dynamic Programming
• Let's look at the rod-cutting problem.
• Given a rod of length n inches and a table of prices pi for i = 1, 2, …, n, determine
the maximum revenue rn obtainable by cutting up the rod and selling the
pieces.
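To make the problem concrete, here is a small brute-force sketch that tries every possible cut pattern. The price table below is an assumption chosen for illustration; it is not given in the slides.

```python
from itertools import combinations

# Hypothetical price table for illustration (price[i] = price of a piece
# of length i); these values are NOT from the slides.
price = {1: 1, 2: 5, 3: 8, 4: 9}

def brute_force_revenue(n):
    """Try every way to cut a rod of length n; return the best revenue."""
    best = 0
    # A cut pattern is a choice of cut positions among the n-1 internal points.
    for k in range(n):
        for cuts in combinations(range(1, n), k):
            points = [0, *cuts, n]
            pieces = [b - a for a, b in zip(points, points[1:])]
            best = max(best, sum(price[p] for p in pieces))
    return best

print(brute_force_revenue(4))  # two pieces of length 2 earn 5 + 5 = 10
```

With these prices, cutting a length-4 rod into two length-2 pieces beats selling it whole (revenue 10 versus 9). The number of cut patterns is 2^(n-1), which is why we will want something better than brute force.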
8. Dynamic Programming
pn: the revenue from making no cut at all
ri + rn-i: the revenue from cutting at the i-th place and selling both parts optimally
Together these give the recurrence rn = max(pn, r1 + rn-1, r2 + rn-2, …, rn-1 + r1).
This is called the optimal substructure property:
Optimal solutions to a problem incorporate optimal solutions to
related subproblems, which we may solve independently.
9. Dynamic Programming
• To make it simpler, we require that only the right-hand side of a cut at the
i-th place will be cut further.
• This simplifies the recurrence to rn = max over 1 ≤ i ≤ n of (pi + rn-i),
where r0 = 0 and the case i = n corresponds to no cut at all.
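The simplified recurrence translates directly into a naive top-down recursion. The price table is again an assumed example, not from the slides.

```python
# Hypothetical price table (p[i] = price of a piece of length i).
p = {1: 1, 2: 5, 3: 8, 4: 9, 5: 10, 6: 17, 7: 17, 8: 20}

def cut_rod(n):
    """Naive recursion: r_n = max over 1 <= i <= n of (p_i + r_{n-i}).

    Only the right-hand remainder of length n - i is cut further, so each
    call tries n first-piece lengths; the total work is exponential because
    the same subproblems are recomputed over and over.
    """
    if n == 0:
        return 0
    return max(p[i] + cut_rod(n - i) for i in range(1, n + 1))

print(cut_rod(4))  # -> 10 (two pieces of length 2)
```

This version is correct but slow: computing cut_rod(n) recomputes cut_rod(n-1), cut_rod(n-2), … many times, which motivates the memoization below.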
12. Dynamic Programming
• We do not want to repeatedly calculate r2, r3, …, rn-1.
• In dynamic programming, we calculate each of them only once.
• The first time we calculate ri, we store it in a table.
• The next time we need ri, we simply retrieve the stored value.
14. Dynamic Programming
Key idea:
1. We use an array r[0 … n] to store the maximum revenues.
2. In line 1 of MEMOIZED-CUT-ROD-AUX(p, n, r),
if we find that rn has been calculated before, we
return it directly;
otherwise, we calculate it and store it.
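The memoized scheme above can be sketched as follows. The structure mirrors the MEMOIZED-CUT-ROD-AUX routine named in the slides; the price table is an assumed example.

```python
# Hypothetical price table (p[i] = price of a piece of length i).
p = {1: 1, 2: 5, 3: 8, 4: 9, 5: 10, 6: 17, 7: 17, 8: 20}

def memoized_cut_rod(n):
    # r[0 .. n] stores maximum revenues; -1 marks "not yet computed".
    r = [-1] * (n + 1)
    return memoized_cut_rod_aux(n, r)

def memoized_cut_rod_aux(n, r):
    if r[n] >= 0:        # computed before: return the stored value directly
        return r[n]
    if n == 0:
        q = 0
    else:
        q = max(p[i] + memoized_cut_rod_aux(n - i, r)
                for i in range(1, n + 1))
    r[n] = q             # store the result the first time we compute it
    return q

print(memoized_cut_rod(8))  # -> 22
```

Because each r[j] is filled in at most once and each fill does O(n) work, the running time drops from exponential to O(n^2).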
15. Dynamic Programming
• MEMOIZED-CUT-ROD-AUX(p, n, r) calculates rn in a top-down fashion
by recursion.
• Most often, we prefer a bottom-up solution, for two reasons:
• Logically, it is easier to follow.
• It is faster in practice, as it avoids the overhead of recursion.
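A bottom-up version fills the table in increasing order of subproblem size, so every value it needs is already available and no recursion is required. The price table is again an assumed example.

```python
# Hypothetical price table (p[i] = price of a piece of length i).
p = {1: 1, 2: 5, 3: 8, 4: 9, 5: 10, 6: 17, 7: 17, 8: 20}

def bottom_up_cut_rod(n):
    """Fill r[0 .. n] from small to large; each r[j] is computed exactly once."""
    r = [0] * (n + 1)                 # r[0] = 0: a rod of length 0 earns nothing
    for j in range(1, n + 1):
        # r[j - i] is already final for every i, since j - i < j.
        r[j] = max(p[i] + r[j - i] for i in range(1, j + 1))
    return r[n]

print(bottom_up_cut_rod(8))  # -> 22
```

It performs the same O(n^2) table fills as the memoized version, but with a plain double loop instead of recursive calls.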