# Lesson31 Higher Dimensional First Order Difference Equations Slides

matrix techniques in difference equations



**Slide 1: Title**

Lesson 31: First Order, Higher Dimensional Difference Equations
Math 20, April 30, 2007

Announcements:

- PS 12 due Wednesday, May 2
- MT III Friday, May 4 in SC Hall A
- Final Exam: Friday, May 25 at 9:15am, Boylston 110 (Fong Auditorium)
**Slide 2: Outline**

- Recap
- Higher dimensional linear systems
  - Examples: Markov Chains, Population Dynamics
  - Solution
  - Qualitative Analysis
- Diagonal systems
  - Examples
- Higher dimensional nonlinear systems
**Slide 3: One-dimensional linear difference equations**

Fact. The solution to the inhomogeneous difference equation y_{k+1} = a·y_k + b (with a ≠ 1) is

    y_k = a^k (y_0 − b/(1−a)) + b/(1−a)

Please try not to memorize this. When a and b have actual values, it's easier to follow this process:

1. Start with a^k times an undetermined parameter c (this satisfies the homogenized equation).
2. Find the equilibrium value y* = b/(1−a).
3. Add the two and pick c to match y_0 when k = 0.
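The three-step recipe is easy to check numerically. A minimal sketch (the values a = 0.5, b = 3, y_0 = 10 are illustrative, not from the slides):

```python
# Solve y_{k+1} = a*y_k + b (a != 1) via the three-step recipe.
a, b, y0 = 0.5, 3.0, 10.0

y_star = b / (1 - a)           # step 2: equilibrium value y* = b/(1-a)
c = y0 - y_star                # step 3: choose c so that y(0) = y0

def y(k):
    # step 1: c*a^k solves the homogenized equation; then add the equilibrium
    return c * a**k + y_star

# Sanity check against direct iteration of the recurrence.
yk = y0
for k in range(10):
    assert abs(y(k) - yk) < 1e-12
    yk = a * yk + b

print(y_star, y(3))  # equilibrium 6.0; y(3) = 4*(0.5)^3 + 6 = 6.5
```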
**Slide 4: Nonlinear equations**

[Figure: cobweb plot with reference lines of slope 1 and −1 and the curve y = g(y), showing iterates y_0, y_1, y_2 approaching the equilibrium y*.]

Fact. The equilibrium point y* of the nonlinear difference equation y_{k+1} = g(y_k) is stable if |g′(y*)| < 1.
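To illustrate the criterion, a quick numerical check with a logistic-style map (the map g(y) = 2.5·y·(1−y) is my example, not from the slides):

```python
# Stability test for y_{k+1} = g(y_k) at an equilibrium y*.
def g(y):
    return 2.5 * y * (1 - y)

def dg(y):
    # derivative g'(y) = 2.5 - 5y
    return 2.5 - 5.0 * y

y_star = 1 - 1 / 2.5            # nonzero equilibrium: g(y*) = y*  =>  y* = 0.6
assert abs(g(y_star) - y_star) < 1e-12

# |g'(y*)| = 0.5 < 1, so iterates started nearby should converge to y*.
y = 0.45
for _ in range(100):
    y = g(y)
print(abs(dg(y_star)), y)       # 0.5, and y is very close to 0.6
```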
**Slide 6:** Let's kick it up a notch and look at the multivariable, linear, homogeneous difference equation

    y(k+1) = A y(k)

(We move the index into parentheses so that y(k) can have coordinates and we avoid writing y_{k,i}.)
**Slide 7: Skipping class**

Example. This example was a Markov chain with transition matrix

    A = [ 0.7  0.8 ]
        [ 0.3  0.2 ]

Then the probability of going to or skipping class on day k satisfies the equation p(k+1) = A p(k).
**Slide 8: Lobsters**

Example. Female lobsters have more eggs each season the longer they live. For this reason, it is illegal to keep a lobster that has laid eggs. Let y_i be the number of lobsters in a fishery which are i years old. Then the difference equation might have the simplified form y(k+1) = A y(k) with

    A = [  0   100  400  700 ]
        [ 0.1    0    0    0 ]
        [  0   0.3    0    0 ]
        [  0    0   0.9    0 ]
**Slide 9:** Mmmm... Lobster
**Slide 11: Formal solution**

    y(1) = A y(0)
    y(2) = A y(1) = A² y(0)
    y(3) = A y(2) = A³ y(0)

So:

Fact. The solution to the homogeneous system of linear difference equations y(k+1) = A y(k) is y(k) = A^k y(0).
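With NumPy the formal solution can be compared against direct iteration (the matrix is the skipping-class transition matrix from Slide 7; the initial vector is illustrative):

```python
import numpy as np

A = np.array([[0.7, 0.8],
              [0.3, 0.2]])
y0 = np.array([1.0, 0.0])

# Direct iteration: y(k+1) = A y(k), eight times.
y = y0.copy()
for _ in range(8):
    y = A @ y

# Formal solution: y(8) = A^8 y(0).
y_formal = np.linalg.matrix_power(A, 8) @ y0
assert np.allclose(y, y_formal)
```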
**Slide 13: Flop count**

To multiply two n × n matrices takes n³ multiplications and n²(n − 1) additions (a flop = floating-point operation), so about 2n³ flops.

So finding A^k by repeated multiplication takes about kn³ flops!
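The count can be verified by instrumenting the schoolbook multiplication (a sketch; the flop-counting helper is mine, not from the slides):

```python
import numpy as np

def matmul_count(A, B):
    """Schoolbook n x n product, counting scalar multiplications and additions."""
    n = A.shape[0]
    C = np.zeros((n, n))
    mults = adds = 0
    for i in range(n):
        for j in range(n):
            s = A[i, 0] * B[0, j]
            mults += 1
            for t in range(1, n):
                s += A[i, t] * B[t, j]
                mults += 1
                adds += 1
            C[i, j] = s
    return C, mults, adds

n = 5
A = np.arange(n * n, dtype=float).reshape(n, n)
C, mults, adds = matmul_count(A, A)
assert np.allclose(C, A @ A)
assert mults == n**3 and adds == n**2 * (n - 1)   # ~2n^3 flops total
```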
**Slide 18: Now what?**

Suppose v is an eigenvector of A with eigenvalue λ. Then the solution to the problem y(k+1) = A y(k), y(0) = v is

    y(k) = λ^k v

Suppose y(0) = c_1 v_1 + c_2 v_2 + · · · + c_m v_m. Then

    A y(0)  = c_1 λ_1 v_1 + c_2 λ_2 v_2 + · · · + c_m λ_m v_m
    A² y(0) = c_1 λ_1² v_1 + c_2 λ_2² v_2 + · · · + c_m λ_m² v_m

If A is diagonalizable, we can take m = n and write any initial vector as a linear combination of eigenvectors.
**Slide 19: The big picture**

Fact. Let A have a complete system of eigenvalues λ_1, λ_2, …, λ_n and eigenvectors v_1, v_2, …, v_n. Then the solution to the difference equation y(k+1) = A y(k) is

    y(k) = A^k y(0) = c_1 λ_1^k v_1 + c_2 λ_2^k v_2 + · · · + c_n λ_n^k v_n

where c_1, c_2, …, c_n are chosen to make y(0) = c_1 v_1 + c_2 v_2 + · · · + c_n v_n.
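This Fact can be checked numerically for the skipping-class matrix: expand y(0) in eigenvectors, weight each component by λ_i^k, and compare with A^k y(0) (the initial vector and k are illustrative):

```python
import numpy as np

A = np.array([[0.7, 0.8],
              [0.3, 0.2]])
y0 = np.array([0.2, 0.8])
k = 6

lam, V = np.linalg.eig(A)        # columns of V are the eigenvectors v_i
c = np.linalg.solve(V, y0)       # coefficients: y(0) = c_1 v_1 + c_2 v_2

y_eig = V @ (c * lam**k)         # sum of c_i * lam_i^k * v_i
y_pow = np.linalg.matrix_power(A, k) @ y0
assert np.allclose(y_eig, y_pow)
```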
**Slide 21: Iterating diagonal systems**

Consider a 2 × 2 matrix of the form

    D = [ λ_1   0  ]
        [  0   λ_2 ]

Then the λ's tell the behavior of the system.
**Slide 26: Picture in terms of eigenvalues**

- λ_1 > λ_2 > 1: repulsion away from the origin
- 1 > λ_1 > λ_2 > 0: attraction to the origin
- λ_1 > 1 > λ_2: saddle point

For negative eigenvalues, just square them and use the above results.
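A short sketch of the saddle case (the values λ_1 = 2 > 1 > λ_2 = 0.5 are illustrative):

```python
import numpy as np

D = np.diag([2.0, 0.5])          # lam_1 > 1 > lam_2: a saddle point
y = np.array([1.0, 1.0])
for _ in range(10):
    y = D @ y

# First coordinate is repelled from 0, second is attracted to 0.
print(y)   # [2^10, 0.5^10] = [1024.0, ~0.00098]
```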
**Slide 29: Back to skipping class**

Example. If

    A = [ 0.7  0.8 ]
        [ 0.3  0.2 ]

the eigenvectors (in decreasing order of the eigenvalues' absolute values) are (8/11, 3/11) with eigenvalue 1, and (−1/2, 1/2) with eigenvalue −1/10. So the system converges to a multiple of (8/11, 3/11).
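A numerical check of the convergence claim (the starting probability vector is illustrative):

```python
import numpy as np

A = np.array([[0.7, 0.8],
              [0.3, 0.2]])

# Iterate any probability vector: it should approach (8/11, 3/11).
p = np.array([0.5, 0.5])
for _ in range(50):
    p = A @ p
assert np.allclose(p, [8/11, 3/11])

# The other mode decays like (-1/10)^k, which is why convergence is fast.
lam = np.linalg.eigvals(A)
assert np.allclose(sorted(lam), [-0.1, 1.0])
```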
**Slide 31: Back to the lobsters**

We had

    A = [  0   100  400  700 ]
        [ 0.1    0    0    0 ]
        [  0   0.3    0    0 ]
        [  0    0   0.9    0 ]

The eigenvalues are 3.80293, −2.84895, and −0.476993 ± 1.23164i, and the first eigenvector is (0.999716, 0.0233099, 0.00489153)^T.

The population will grow despite the increased harvesting!
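The growth conclusion follows from the dominant eigenvalue, which we can recompute directly:

```python
import numpy as np

# Leslie matrix for the lobster population model from the slides.
A = np.array([[0.0, 100.0, 400.0, 700.0],
              [0.1,   0.0,   0.0,   0.0],
              [0.0,   0.3,   0.0,   0.0],
              [0.0,   0.0,   0.9,   0.0]])

lam = np.linalg.eigvals(A)
growth = max(abs(lam))           # modulus of the dominant eigenvalue
print(growth)                    # ~3.80293 > 1, so the population grows
```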
**Slide 33: The nonlinear case**

Consider now the nonlinear system y(k+1) = g(y(k)). The process is the same as in the one-dimensional nonlinear case:

1. Look for equilibria y* with g(y*) = y*.
2. Linearize about the equilibrium using the Jacobian matrix

       A = Dg(y*) = ( ∂g_i / ∂y_j )

3. The eigenvalues of A determine the stability of y*.
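The three steps carry over directly to code; a sketch on a made-up two-dimensional map (the map g and its equilibrium are illustrative, not from the slides):

```python
import numpy as np

def g(y):
    """A sample nonlinear map with an equilibrium at the origin."""
    x1, x2 = y
    return np.array([0.5 * x1 + 0.1 * x1 * x2,
                     0.8 * x2 - 0.1 * x1 * x2])

y_star = np.array([0.0, 0.0])          # step 1: g(y*) = y*
assert np.allclose(g(y_star), y_star)

# Step 2: Jacobian Dg(y*) by forward differences, one column per coordinate.
h = 1e-6
A = np.column_stack([(g(y_star + h * e) - g(y_star)) / h
                     for e in np.eye(2)])

# Step 3: all eigenvalues inside the unit circle  =>  y* is stable.
lam = np.linalg.eigvals(A)
assert max(abs(lam)) < 1
```

At the origin the Jacobian is diag(0.5, 0.8), so both eigenvalues have modulus below 1 and the equilibrium is stable.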