Lesson 28: Lagrange Multipliers II

more justifications, more examples, more dimensions, more constraints, more better!

  1. Lesson 28 (Sections 18.2–5): Lagrange Multipliers II. Math 20, November 28, 2007. Announcements: Problem Set 11 assigned today, due December 5. Next OH: today, 1–3 (SC 323). Midterm II review: Tuesday 12/4, 7:30–9:00pm in Hall E. Midterm II: Thursday 12/6, 7–8:30pm in Hall A.
  2. Outline: A homework problem; Restating the Method of Lagrange Multipliers (Statement; Justifications); Second order conditions (Compact feasibility sets; Ad hoc arguments; Analytic conditions); Example: More than two variables; More than one constraint.
  3. Problem 17.1.10. Maximize the quantity $f(x, y, z) = Ax^a y^b z^c$ subject to the constraint that $px + qy + rz = m$. (Here $A, a, b, c, p, q, r, m$ are positive constants.)
  4. Problem 17.1.10. Maximize the quantity $f(x, y, z) = Ax^a y^b z^c$ subject to the constraint that $px + qy + rz = m$. (Here $A, a, b, c, p, q, r, m$ are positive constants.) Solution (by elimination): Solving the constraint for $z$ in terms of $x$ and $y$, we get $z = \frac{m - px - qy}{r}$. So we optimize the unconstrained function $f(x, y) = \frac{A}{r^c}\, x^a y^b (m - px - qy)^c$.
  5. We have $\dfrac{\partial f}{\partial x}(x, y) = \dfrac{A}{r^c}\left[a x^{a-1} y^b (m - px - qy)^c + x^a y^b\, c (m - px - qy)^{c-1}(-p)\right] = \dfrac{A}{r^c}\, x^{a-1} y^b (m - px - qy)^{c-1}\bigl[a(m - px - qy) - cpx\bigr]$. Likewise $\dfrac{\partial f}{\partial y}(x, y) = \dfrac{A}{r^c}\, x^a y^{b-1} (m - px - qy)^{c-1}\bigl[b(m - px - qy) - cqy\bigr]$. So, throwing out the critical points where $x = 0$, $y = 0$, or $z = 0$ (these give minimal values of $f$, not maximal), we get $(a+c)px + aqy = am$ and $bpx + (b+c)qy = bm$.
  6. This is a fun exercise in Cramer’s Rule: $x = \dfrac{\begin{vmatrix} am & aq \\ bm & (b+c)q \end{vmatrix}}{\begin{vmatrix} (a+c)p & aq \\ bp & (b+c)q \end{vmatrix}} = \dfrac{amqc}{pq(ac + bc + c^2)} = \dfrac{m}{p}\,\dfrac{a}{a+b+c}$. It follows that $y = \dfrac{m}{q}\,\dfrac{b}{a+b+c}$ and $z = \dfrac{m}{r}\,\dfrac{c}{a+b+c}$. If this is a utility-maximization problem subject to a fixed budget, the portion of the budget spent on each good ($\frac{px}{m}$, for instance) is the relative degree to which that good multiplies utility ($\frac{a}{a+b+c}$).
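A quick symbolic cross-check of this closed form (not part of the original slides): the sketch below hands the two first-order conditions above to sympy and solves for $x$ and $y$.

```python
# Sketch: solve the two first-order conditions symbolically and compare
# with the Cramer's-rule answer x = (m/p)*a/(a+b+c), y = (m/q)*b/(a+b+c).
import sympy as sp

a, b, c, p, q, m, x, y = sp.symbols('a b c p q m x y', positive=True)

eqs = [sp.Eq((a + c)*p*x + a*q*y, a*m),
       sp.Eq(b*p*x + (b + c)*q*y, b*m)]
sol = sp.solve(eqs, [x, y], dict=True)[0]

print(sp.simplify(sol[x]))  # a*m/(p*(a + b + c))
print(sp.simplify(sol[y]))  # b*m/(q*(a + b + c))
```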
  7. Outline: A homework problem; Restating the Method of Lagrange Multipliers (Statement; Justifications); Second order conditions (Compact feasibility sets; Ad hoc arguments; Analytic conditions); Example: More than two variables; More than one constraint.
  8. Theorem (The Method of Lagrange Multipliers). Let $f(x_1, x_2, \ldots, x_n)$ and $g(x_1, x_2, \ldots, x_n)$ be functions of several variables. The critical points of the function $f$ restricted to the set $g = 0$ are solutions to the equations $\frac{\partial f}{\partial x_i}(x_1, x_2, \ldots, x_n) = \lambda \frac{\partial g}{\partial x_i}(x_1, x_2, \ldots, x_n)$ for each $i = 1, \ldots, n$, and $g(x_1, x_2, \ldots, x_n) = 0$. Note that this is $n + 1$ equations in the $n + 1$ variables $x_1, \ldots, x_n, \lambda$.
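As a concrete illustration of the theorem (a toy example of my own, not from the slides), the sketch below sets up the $n + 1$ equations for $f(x, y) = xy$ with the constraint $g(x, y) = x + y - 2 = 0$ and solves them with sympy.

```python
# Sketch: the n + 1 Lagrange equations for f(x, y) = x*y on g(x, y) = x + y - 2 = 0.
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f = x*y
g = x + y - 2

eqs = [sp.Eq(sp.diff(f, x), lam*sp.diff(g, x)),   # f_x = lambda * g_x
       sp.Eq(sp.diff(f, y), lam*sp.diff(g, y)),   # f_y = lambda * g_y
       sp.Eq(g, 0)]                               # the constraint itself
print(sp.solve(eqs, [x, y, lam], dict=True))      # [{x: 1, y: 1, lambda: 1}]
```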
  9. Graphical Justification. In two variables, the critical points of $f$ restricted to the level curve $g = 0$ are found where the tangent to the level curve of $f$ is parallel to the tangent to the level curve $g = 0$.
  10. These tangents have slopes $\left.\dfrac{dy}{dx}\right|_f = -\dfrac{f_x}{f_y}$ and $\left.\dfrac{dy}{dx}\right|_g = -\dfrac{g_x}{g_y}$.
  11. These tangents have slopes $\left.\dfrac{dy}{dx}\right|_f = -\dfrac{f_x}{f_y}$ and $\left.\dfrac{dy}{dx}\right|_g = -\dfrac{g_x}{g_y}$. So they are equal when $\dfrac{f_x}{f_y} = \dfrac{g_x}{g_y} \implies \dfrac{f_x}{g_x} = \dfrac{f_y}{g_y}$, or $f_x = \lambda g_x$, $f_y = \lambda g_y$.
  12. Symbolic Justification. Suppose that we can use the relation $g(x_1, \ldots, x_n) = 0$ to solve for $x_n$ in terms of the other variables $x_1, \ldots, x_{n-1}$, after making some choices. Then the critical points of $f(x_1, \ldots, x_n)$ are unconstrained critical points of $f(x_1, \ldots, x_{n-1}, x_n(x_1, \ldots, x_{n-1}))$. [Diagram: dependency tree in which $f$ depends on $x_1, x_2, \ldots, x_n$, and $x_n$ in turn depends on $x_1, x_2, \ldots, x_{n-1}$.]
  13. Now for any $i = 1, \ldots, n - 1$, $\left.\dfrac{\partial f}{\partial x_i}\right|_g = \dfrac{\partial f}{\partial x_i} + \dfrac{\partial f}{\partial x_n}\,\dfrac{\partial x_n}{\partial x_i} = \dfrac{\partial f}{\partial x_i} - \dfrac{\partial f}{\partial x_n}\,\dfrac{\partial g/\partial x_i}{\partial g/\partial x_n}$. If $\left.\dfrac{\partial f}{\partial x_i}\right|_g = 0$, then $\dfrac{\partial f/\partial x_i}{\partial f/\partial x_n} = \dfrac{\partial g/\partial x_i}{\partial g/\partial x_n} \iff \dfrac{\partial f/\partial x_i}{\partial g/\partial x_i} = \dfrac{\partial f/\partial x_n}{\partial g/\partial x_n}$. So as before, $\dfrac{\partial f}{\partial x_i} = \lambda\,\dfrac{\partial g}{\partial x_i}$ for all $i$.
  14. Another perspective. To find the critical points of $f$ subject to the constraint that $g = 0$, create the Lagrangian function $\mathcal{L} = f(x_1, x_2, \ldots, x_n) - \lambda g(x_1, x_2, \ldots, x_n)$. If $\mathcal{L}$ is restricted to the set $g = 0$, then $\mathcal{L} = f$, and so the constrained critical points are unconstrained critical points of $\mathcal{L}$. So for each $i$, $\dfrac{\partial \mathcal{L}}{\partial x_i} = 0 \implies \dfrac{\partial f}{\partial x_i} = \lambda\,\dfrac{\partial g}{\partial x_i}$. But also, $\dfrac{\partial \mathcal{L}}{\partial \lambda} = 0 \implies g(x_1, x_2, \ldots, x_n) = 0$.
  15. Outline: A homework problem; Restating the Method of Lagrange Multipliers (Statement; Justifications); Second order conditions (Compact feasibility sets; Ad hoc arguments; Analytic conditions); Example: More than two variables; More than one constraint.
  16. Second order conditions. The Method of Lagrange Multipliers finds the constrained critical points, but doesn’t determine their “type” (max, min, neither). So what then?
  17. A dash of topology (cf. Sections 17.2–3). Definition: A subset of $\mathbb{R}^n$ is called closed if it includes its boundary.
  18. A dash of topology (cf. Sections 17.2–3). Definition: A subset of $\mathbb{R}^n$ is called closed if it includes its boundary. Examples: $x^2 + y^2 < 1$ is not closed; $y \ge 0$ is closed; $x^2 + y^2 \le 1$ is closed. Basically, if a subset is described by $\le$ or $\ge$ inequalities, it is closed.
  19. Definition: A subset of $\mathbb{R}^n$ is called bounded if it is contained within some ball centered at the origin. Examples: $x^2 + y^2 < 1$ is bounded; $y \ge 0$ is not bounded; $x^2 + y^2 \le 1$ is bounded.
  20. Definition: A subset of $\mathbb{R}^n$ is called compact if it is closed and bounded. Examples: $x^2 + y^2 < 1$ is not compact; $y \ge 0$ is not compact; $x^2 + y^2 \le 1$ is compact.
  21. Optimizing over compact sets. Theorem (Compact Set Method): To find the extreme values of a function $f$ on a compact set $D$ of $\mathbb{R}^n$, it suffices to find the (unconstrained) critical points of $f$ “inside” $D$ and the (constrained) critical points of $f$ on the “boundary” of $D$, and compare the values of $f$ at those points.
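A minimal sketch of the Compact Set Method in practice (an example of my own, not from the slides): extremize $f(x, y) = x + y$ over the closed unit disk $x^2 + y^2 \le 1$ by checking the interior first and then the boundary circle with a multiplier.

```python
# Sketch: Compact Set Method for f(x, y) = x + y on the closed unit disk.
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f = x + y

# Step 1: unconstrained critical points "inside" the disk.
print(sp.solve([sp.diff(f, x), sp.diff(f, y)], [x, y], dict=True))  # [] -- none

# Step 2: constrained critical points on the boundary g = x^2 + y^2 - 1 = 0.
g = x**2 + y**2 - 1
boundary = sp.solve([sp.Eq(sp.diff(f, x), lam*sp.diff(g, x)),
                     sp.Eq(sp.diff(f, y), lam*sp.diff(g, y)),
                     sp.Eq(g, 0)], [x, y, lam], dict=True)
for s in boundary:
    print(s[x], s[y], f.subs(s))   # (+-1/sqrt(2), +-1/sqrt(2)), f = +-sqrt(2)
```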
  22. Ad hoc arguments. If $D$ is not compact, sometimes it’s still easy to argue that as $x$ gets farther away, $f$ becomes larger (or smaller), so the critical points are “obviously” maxes (or mins).
  23. Ad hoc arguments. If $D$ is not compact, sometimes it’s still easy to argue that as $x$ gets farther away, $f$ becomes larger (or smaller), so the critical points are “obviously” maxes (or mins). (Example later.)
  24. Analytic conditions. Recall Equation 16.13 (cf. Section 18.4). For the two-variable constrained optimization problem we have (look in the book if you want the gory details) $\dfrac{d^2 f}{dx^2} = -\dfrac{1}{g_y^2}\begin{vmatrix} \mathcal{L}_{\lambda\lambda} & \mathcal{L}_{\lambda x} & \mathcal{L}_{\lambda y} \\ \mathcal{L}_{x\lambda} & \mathcal{L}_{xx} & \mathcal{L}_{xy} \\ \mathcal{L}_{y\lambda} & \mathcal{L}_{yx} & \mathcal{L}_{yy} \end{vmatrix} = -\dfrac{1}{g_y^2}\begin{vmatrix} 0 & g_x & g_y \\ g_x & f_{xx} - \lambda g_{xx} & f_{xy} - \lambda g_{xy} \\ g_y & f_{yx} - \lambda g_{yx} & f_{yy} - \lambda g_{yy} \end{vmatrix}$, where $\frac{d^2 f}{dx^2}$ is the second derivative of $f$ along the constraint. The critical point is a local max if this second derivative is negative (the determinant is positive), and a local min if it is positive (the determinant is negative). The matrix on the right is the bordered Hessian of the Lagrangian. But there is still a distinction between this and the unconstrained case: the constrained extrema are critical points of the Lagrangian, not extrema of it. Don’t worry too much about this!
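To make the sign convention concrete, here is a small sympy check (an example of my own, not from the slides): for $f(x, y) = xy$ on $x + y = 2$, the bordered-Hessian determinant is positive while the second derivative of $f$ along the constraint is negative, consistent with a constrained maximum at $(1, 1)$.

```python
# Sketch: bordered Hessian vs. the second derivative of f along the constraint,
# for f(x, y) = x*y subject to g(x, y) = x + y - 2 = 0 at the critical point (1, 1).
import sympy as sp

x, y, lam = sp.symbols('x y lambda', real=True)
f, g = x*y, x + y - 2
L = f - lam*g
pt = {x: 1, y: 1, lam: 1}          # from f_x = lam*g_x, f_y = lam*g_y, g = 0

bordered = sp.Matrix([[0,             sp.diff(g, x),    sp.diff(g, y)],
                      [sp.diff(g, x), sp.diff(L, x, x), sp.diff(L, x, y)],
                      [sp.diff(g, y), sp.diff(L, y, x), sp.diff(L, y, y)]])
print(bordered.det().subs(pt))     # 2 > 0  =>  constrained local max

# Direct check: along the constraint y = 2 - x, f becomes x*(2 - x),
# whose second derivative -2 is negative (a maximum at x = 1).
print(sp.diff(f.subs(y, 2 - x), x, 2))   # -2
```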
  25. Outline: A homework problem; Restating the Method of Lagrange Multipliers (Statement; Justifications); Second order conditions (Compact feasibility sets; Ad hoc arguments; Analytic conditions); Example: More than two variables; More than one constraint.
  26. Problem 17.1.10. Maximize the quantity $f(x, y, z) = Ax^a y^b z^c$ subject to the constraint that $px + qy + rz = m$. (Here $A, a, b, c, p, q, r, m$ are positive constants.)
  27. Problem 17.1.10. Maximize the quantity $f(x, y, z) = Ax^a y^b z^c$ subject to the constraint that $px + qy + rz = m$. (Here $A, a, b, c, p, q, r, m$ are positive constants.) Solution: The Lagrange equations are $Aax^{a-1}y^b z^c = \lambda p$, $Abx^a y^{b-1}z^c = \lambda q$, and $Acx^a y^b z^{c-1} = \lambda r$. We rule out any solution with $x$, $y$, $z$, or $\lambda$ equal to 0 (they will minimize $f$, not maximize it).
  28. Dividing the first two equations gives $\dfrac{ay}{bx} = \dfrac{p}{q} \implies y = \dfrac{bp}{aq}\,x$. Dividing the first and last equations gives $\dfrac{az}{cx} = \dfrac{p}{r} \implies z = \dfrac{cp}{ar}\,x$. Plugging these into the equation of constraint gives $px + \dfrac{bp}{a}x + \dfrac{cp}{a}x = m \implies x = \dfrac{m}{p}\,\dfrac{a}{a+b+c}$.
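As a numerical sanity check of this closed form (not part of the original slides), one can plug in concrete values and let scipy do the constrained maximization. The values $A = 1$, $a = 1$, $b = 2$, $c = 3$, $p = 1$, $q = 2$, $r = 3$, $m = 12$ below are assumptions made only for illustration; for them the formula predicts $x = y = z = 2$.

```python
# Sketch: numerical check with assumed values A=1, a=1, b=2, c=3, p=1, q=2, r=3, m=12,
# for which the closed form gives x = y = z = 2.
import numpy as np
from scipy.optimize import minimize

A, a, b, c = 1.0, 1.0, 2.0, 3.0
p, q, r, m = 1.0, 2.0, 3.0, 12.0

def neg_f(w):
    x, y, z = w
    return -A * x**a * y**b * z**c           # maximize f by minimizing -f

cons = [{'type': 'eq', 'fun': lambda w: p*w[0] + q*w[1] + r*w[2] - m}]
bounds = [(1e-6, None)] * 3                  # keep x, y, z positive

res = minimize(neg_f, x0=[1.0, 1.0, 1.0], method='SLSQP',
               constraints=cons, bounds=bounds)
print(res.x)                                 # approximately [2. 2. 2.]
```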
  29. Outline: A homework problem; Restating the Method of Lagrange Multipliers (Statement; Justifications); Second order conditions (Compact feasibility sets; Ad hoc arguments; Analytic conditions); Example: More than two variables; More than one constraint.
  30. General method for more than one constraint. If we are optimizing $f(x_1, \ldots, x_n)$ subject to $g_j(x_1, \ldots, x_n) \equiv 0$, $j = 1, \ldots, m$, we need multiple $\lambda$’s, one for each constraint. The new Lagrangian is $\mathcal{L} = f(x_1, \ldots, x_n) - \sum_{j=1}^{m} \lambda_j g_j(x_1, \ldots, x_n)$. The conditions are that $\dfrac{\partial \mathcal{L}}{\partial x_i} = 0$ and $\dfrac{\partial \mathcal{L}}{\partial \lambda_j} = 0$ for all $i$ and $j$. In other words, $\dfrac{\partial f}{\partial x_i} = \lambda_1 \dfrac{\partial g_1}{\partial x_i} + \cdots + \lambda_m \dfrac{\partial g_m}{\partial x_i}$ (all $i$) and $g_j(x_1, \ldots, x_n) = 0$ (all $j$).
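Here is a minimal sympy sketch of this recipe with two constraints (an example of my own, not from the slides): minimize $x^2 + y^2 + z^2$ subject to $x + y + z = 1$ and $x - y = 0$, using one multiplier per constraint.

```python
# Sketch: two constraints, two multipliers.
# Minimize x^2 + y^2 + z^2 subject to x + y + z = 1 and x - y = 0.
import sympy as sp

x, y, z, l1, l2 = sp.symbols('x y z lambda1 lambda2', real=True)
f = x**2 + y**2 + z**2
g1, g2 = x + y + z - 1, x - y

L = f - l1*g1 - l2*g2
eqs = [sp.diff(L, v) for v in (x, y, z, l1, l2)]  # dL/dx_i = 0 and dL/dlambda_j = 0
print(sp.solve(eqs, [x, y, z, l1, l2], dict=True))
# [{x: 1/3, y: 1/3, z: 1/3, lambda1: 2/3, lambda2: 0}]
```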
  31. Example. Find the minimum distance between the curves $xy = 1$ and $x + 2y = 1$.
  32. Example. Find the minimum distance between the curves $xy = 1$ and $x + 2y = 1$. Reframing this, we can minimize $f(x, y, u, v) = (x - u)^2 + (y - v)^2$ subject to the constraints $xy - 1 = 0$ and $u + 2v = 1$.
  33. [Figure: the hyperbola $xy = 1$ and the line $x + 2y = 1$, with sample points marked on each curve.]
  34. The Lagrangian is $\mathcal{L} = (x - u)^2 + (y - v)^2 - \lambda(xy - 1) - \mu(u + 2v - 1)$. So the Lagrangian equations are $2(x - u) = \lambda y$, $2(y - v) = \lambda x$, $-2(x - u) = \mu$, and $-2(y - v) = 2\mu$. Dividing the two $\lambda$ equations and the two $\mu$ equations gives $\dfrac{x - u}{y - v} = \dfrac{y}{x}$ and $\dfrac{x - u}{y - v} = \dfrac{1}{2}$. Since the left-hand sides are the same, we have $2y = x$. Since $xy = 1$, we can say either $x = \sqrt{2}$, $y = \frac{1}{\sqrt{2}}$, or $x = -\sqrt{2}$, $y = -\frac{1}{\sqrt{2}}$.
  35. Suppose $x = \sqrt{2}$, $y = \frac{1}{\sqrt{2}}$. Then $\dfrac{\sqrt{2} - u}{\frac{1}{\sqrt{2}} - v} = \dfrac{1}{2} \implies 2u - v = \dfrac{3\sqrt{2}}{2}$. This along with $u + 2v = 1$ gives $u = \frac{1}{5}\left(1 + 3\sqrt{2}\right)$ and $v = \frac{1}{10}\left(4 - 3\sqrt{2}\right)$. If we instead choose $x = -\sqrt{2}$, $y = -\frac{1}{\sqrt{2}}$, we get $u = \frac{1}{5}\left(1 - 3\sqrt{2}\right)$ and $v = \frac{1}{10}\left(4 + 3\sqrt{2}\right)$.
  36. [Figure: the curves $xy = 1$ and $x + 2y = 1$ with the two pairs of critical points marked, at distances $\sqrt{\frac{1}{5}\left(9 - 4\sqrt{2}\right)}$ and $\sqrt{\frac{1}{5}\left(9 + 4\sqrt{2}\right)}$.]
  37. Because $f$ gets larger as $x$, $y$, $u$, and $v$ get larger, the constrained minimum must occur at a critical point, and the absolute minimum of $f$ is the smaller of the two critical values. So the minimum distance is $\frac{1}{\sqrt{5}}\sqrt{9 - 4\sqrt{2}}$.
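As a numerical cross-check of this answer (not part of the original slides), one can minimize the squared distance directly with scipy; the feasible starting point below is an arbitrary assumption, chosen on the first-quadrant branch of the hyperbola so the solver lands on the closer of the two critical pairs.

```python
# Sketch: numerical cross-check of the minimum distance between xy = 1 and x + 2y = 1.
import numpy as np
from scipy.optimize import minimize

def sq_dist(w):
    x, y, u, v = w
    return (x - u)**2 + (y - v)**2

cons = [{'type': 'eq', 'fun': lambda w: w[0]*w[1] - 1},       # (x, y) on xy = 1
        {'type': 'eq', 'fun': lambda w: w[2] + 2*w[3] - 1}]   # (u, v) on x + 2y = 1

# Assumed starting point: (1, 1) on the hyperbola, (1, 0) on the line.
res = minimize(sq_dist, x0=[1.0, 1.0, 1.0, 0.0], method='SLSQP', constraints=cons)
print(np.sqrt(res.fun))                  # approximately 0.8177
print(np.sqrt((9 - 4*np.sqrt(2)) / 5))   # the slides' closed form, same value
```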