# Lesson 9: Gaussian Elimination

1. Lesson 9: Gaussian Elimination (KH, Section 1.6). Math 20, October 10, 2007.
   Announcements: Problem Set 4 will be on the course web site today, due 10/17. Problem sessions: Sundays 6–7 (SC B-10), Tuesdays 1–2 (SC 116). Office hours: Mon 1–2, Tues 3–4, Weds 1–3 (SC 323). Midterm I: 10/18, Hall A, 7–8:30pm. Review session (ML): 10/16, 7:30–9:30, Hall E.
2–4. Systems of linear equations: any set (system) of equations involving one or more variables, in which each equation involves a linear combination of the variables.
   Example: here is a single linear equation in one variable: 4x + 2 = 6.
   Solution: subtract 2 from each side and you get 4x = 4. Divide both sides by 4 and you get x = 1.
5–6. Two equations in two variables. Example: solve 2x + y = 3, x + 2y = 0. Solution: x = 2, y = −1.
7–8. Three equations in three variables. Example: solve the system of linear equations

       2x2 − 3x3 = 4
       −2x1 + x2 + 2x3 = −6
       2x1 + x3 = 0

   The more variables you get, the bigger the need for a systematic way of solving systems of linear equations.
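As a quick numerical check on examples like this one (and a preview of where the lesson is headed), a library solver can confirm the answer. This is only a sketch using NumPy, which is not part of the course material; `np.linalg.solve` happens to use an LU factorization, i.e. Gaussian elimination, internally:

```python
import numpy as np

# The 3x3 system above, written as Ax = b
A = np.array([[ 0., 2., -3.],
              [-2., 1.,  2.],
              [ 2., 0.,  1.]])
b = np.array([4., -6., 0.])

x = np.linalg.solve(A, b)
print(x)  # approximately 8/9, -2/3, -16/9
```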
9. The matrix viewpoint on SLEs. A system of m equations in n variables looks like:

       a11·x1 + a12·x2 + ... + a1n·xn = b1
       a21·x1 + a22·x2 + ... + a2n·xn = b2
       ...
       am1·x1 + am2·x2 + ... + amn·xn = bm

   The operative data are the coefficients and the right-hand sides. We can summarize it like this:

       [ a11 a12 ... a1n ] [ x1 ]   [ b1 ]
       [ a21 a22 ... a2n ] [ x2 ]   [ b2 ]
       [  .   .       .  ] [  . ] = [  . ]
       [ am1 am2 ... amn ] [ xn ]   [ bm ]

   or Ax = b.
10. The augmented matrix. In fact, we can express the whole system of linear equations in a single matrix, called the augmented matrix:

       [ a11 a12 ... a1n | b1 ]
       [ a21 a22 ... a2n | b2 ]
       [  .   .       .  |  . ]
       [ am1 am2 ... amn | bm ]
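In an array library, forming the augmented matrix is a single concatenation. A sketch with NumPy (not course code), using a concrete 3×3 system for illustration:

```python
import numpy as np

A = np.array([[ 0., 2., -3.],
              [-2., 1.,  2.],
              [ 2., 0.,  1.]])
b = np.array([[4.], [-6.], [0.]])  # right-hand sides as a column

# The augmented matrix [A | b]: glue the column b onto A
M = np.hstack([A, b])
print(M.shape)  # 3 rows, 4 columns
```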
11–12. Operations on systems of equations. Here are some facts about systems of equations.
   1. Transposing equations doesn't change their solution.
   2. Scaling an equation doesn't change its solution.
   3. If a set of numbers satisfies two equations, then it also satisfies the equation which is one plus a scalar multiple of the other.
   A simpler form might be:
   3′. If a set of numbers satisfies two equations, it satisfies the sum of the two equations.
13. Row operations. The operations on systems of linear equations are reflected in the augmented matrix, too.
   1. Transposing (switching) rows in an augmented matrix does not change the solution.
   2. Scaling any row in an augmented matrix does not change the solution.
   3. Adding to any row in an augmented matrix any multiple of any other row in the matrix does not change the solution.
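The three row operations are easy to mimic on an array. A minimal sketch (NumPy assumed; the matrix is the example system solved on the following slides, and the particular scale factor is only for illustration):

```python
import numpy as np

M = np.array([[ 0., 2., -3.,  4.],
              [-2., 1.,  2., -6.],
              [ 2., 0.,  1.,  0.]])

# 1. Transpose (swap) rows 1 and 3
M[[0, 2]] = M[[2, 0]]
# 2. Scale a row: multiply the new first row by 1/2
M[0] = M[0] / 2
# 3. Add a multiple of one row to another: R2 <- R2 + 2*R1
M[1] = M[1] + 2 * M[0]
```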
14. The process of Gaussian elimination. We'll solve the system of linear equations

       2x2 − 3x3 = 4
       −2x1 + x2 + 2x3 = −6
       2x1 + x3 = 0

   The augmented matrix is

       [  0  2 -3 |  4 ]
       [ -2  1  2 | -6 ]
       [  2  0  1 |  0 ]
15–16. Transpose the first and third equations:

       [  0  2 -3 |  4 ]      [  2  0  1 |  0 ]
       [ -2  1  2 | -6 ]  →   [ -2  1  2 | -6 ]
       [  2  0  1 |  0 ]      [  0  2 -3 |  4 ]

   Now we can add the first row to the second and get another zero in that column:

       [  2  0  1 |  0 ]      [  2  0  1 |  0 ]
       [ -2  1  2 | -6 ]  →   [  0  1  3 | -6 ]
       [  0  2 -3 |  4 ]      [  0  2 -3 |  4 ]
17–18. We add (−2) times the second row to the third row:

       [ 2  0  1 |  0 ]      [ 2  0  1 |  0 ]
       [ 0  1  3 | -6 ]  →   [ 0  1  3 | -6 ]
       [ 0  2 -3 |  4 ]      [ 0  0 -9 | 16 ]

   This matrix is in row echelon form. The corresponding SLE can be solved by back-substitution.
19. From

       [ 2  0  1 |  0 ]
       [ 0  1  3 | -6 ]
       [ 0  0 -9 | 16 ]

   Since −9x3 = 16, we have x3 = −16/9. Substituting this into the second equation gives

       x2 − 48/9 = −6, so x2 = 48/9 − 54/9 = −6/9 = −2/3.

   Finally, we have

       2x1 − 16/9 = 0, so x1 = 8/9.
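The back-substitution above can be sketched in a few lines; exact fractions are used so that 8/9 and −16/9 come out exactly rather than as rounded decimals (a sketch, not course code):

```python
from fractions import Fraction

# Row echelon form from the slide, split as U x = c
U = [[2, 0, 1],
     [0, 1, 3],
     [0, 0, -9]]
c = [0, -6, 16]

n = len(U)
x = [Fraction(0)] * n
for i in reversed(range(n)):
    # subtract the terms involving already-known unknowns,
    # then divide by the diagonal entry
    known = sum((U[i][j] * x[j] for j in range(i + 1, n)), Fraction(0))
    x[i] = (c[i] - known) / U[i][i]

print(x)  # x1 = 8/9, x2 = -2/3, x3 = -16/9
```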
20–21. More Gaussian elimination: the “backward pass”. Starting with the last matrix above, we scale the last row by −1/9:

       [ 2  0  1 |  0 ]      [ 2  0  1 |     0 ]
       [ 0  1  3 | -6 ]  →   [ 0  1  3 |    -6 ]
       [ 0  0 -9 | 16 ]      [ 0  0  1 | -16/9 ]

   Now we can zero out the third column above that bottom entry, by adding (−3) times the third row to the second row, then adding (−1) times the third row to the first row:

       [ 2  0  1 |     0 ]      [ 2  0  0 |  16/9 ]
       [ 0  1  3 |    -6 ]  →   [ 0  1  0 |  -6/9 ]
       [ 0  0  1 | -16/9 ]      [ 0  0  1 | -16/9 ]
22–23. The top row can be scaled by 1/2, and we finally have

       [ 1  0  0 |   8/9 ]
       [ 0  1  0 |  -6/9 ]
       [ 0  0  1 | -16/9 ]

   This matrix is said to be in reduced row echelon form. And there you go; the solutions are staring you in the face!
24. Gaussian elimination.
   1. Locate the first nonzero column. This is a pivot column, and the top row in this column is called a pivot position. Transpose rows to make sure this position has a nonzero entry. If you like, scale the row to make this position equal to one.
   2. Use row operations to make all entries below the pivot position zero.
   3. Repeat Steps 1 and 2 on the submatrix below the first row and to the right of the first column. Eventually you arrive at a matrix in row echelon form. (Up to here is called the forward pass.)
   4. Scale the bottom row to make its leading entry one.
   5. Use row operations to make all entries above this entry zero.
   6. Repeat Steps 4 and 5 on the submatrix above and to the left of this entry. (These steps are called the backward pass.)
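The steps above can be sketched as one routine. This version folds the backward pass into the same loop as the forward pass (the Gauss–Jordan variant, which clears each pivot column both above and below in one sweep) and uses exact fractions; the function name `rref` is my label, not the book's:

```python
from fractions import Fraction

def rref(rows):
    """Reduce an augmented matrix to reduced row echelon form."""
    M = [[Fraction(v) for v in row] for row in rows]
    m, n = len(M), len(M[0])
    pivot_row = 0
    for col in range(n):
        # Step 1: find a row at or below pivot_row with a nonzero entry here
        pr = next((r for r in range(pivot_row, m) if M[r][col] != 0), None)
        if pr is None:
            continue  # no pivot in this column; move right
        M[pivot_row], M[pr] = M[pr], M[pivot_row]       # transpose rows
        piv = M[pivot_row][col]
        M[pivot_row] = [v / piv for v in M[pivot_row]]  # scale pivot to 1
        # Steps 2 and 5 combined: zero out the rest of the pivot column
        for r in range(m):
            if r != pivot_row and M[r][col] != 0:
                f = M[r][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[pivot_row])]
        pivot_row += 1
        if pivot_row == m:
            break
    return M

# The example system from the earlier slides
result = rref([[0, 2, -3, 4], [-2, 1, 2, -6], [2, 0, 1, 0]])
```

Running this on the lesson's example reproduces the RREF found by hand, with x1 = 8/9, x2 = −2/3, x3 = −16/9 in the last column.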
25. So, to solve a SLE: form the augmented matrix, reduce this matrix to (reduced) row echelon form, and read off the solution.
26. Meet the mathematician: Carl Friedrich Gauss (1777–1855). German; called “the prince of mathematicians.” Proved the Fundamental Theorem of Algebra four times, invented the least-squares method, and predicted the motion of planets.
27–28. Questions. Suppose the matrix

       [ 0  3  -6  6  4 | -5 ]
       [ 3 -7   8 -5  8 |  9 ]
       [ 3 -9  12 -9  6 | 15 ]

   is given as the augmented matrix to a system of linear equations. How do we interpret the solution from the RREF?

       [ 1  0 -2  3  0 | -24 ]
       [ 0  1 -2  2  0 |  -7 ]
       [ 0  0  0  0  1 |   4 ]
29–31. The system of linear equations is

       x1 − 2x3 + 3x4 = −24
       x2 − 2x3 + 2x4 = −7
                   x5 = 4

   or

       x1 = −24 + 2s − 3t
       x2 = −7 + 2s − 2t
       x3 = s
       x4 = t
       x5 = 4

   Here s and t can be anything we want, and we can construct a solution out of them. x3 and x4 are known as free variables; they can take any value. We see the free variables in the RREF as the columns with no leading entry.
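Reading off the free variables is mechanical: they are the unknowns whose columns contain no leading entry. A sketch of that bookkeeping on the RREF above, with a spot check that one choice of s and t really solves the original (pre-RREF) system:

```python
# RREF from the slide: [A | b] with n = 5 unknowns
R = [[1, 0, -2, 3, 0, -24],
     [0, 1, -2, 2, 0,  -7],
     [0, 0,  0, 0, 1,   4]]
n = 5

# A pivot column is where a row's leading (first nonzero) entry sits
pivot_cols = []
for row in R:
    lead = next((j for j in range(n) if row[j] != 0), None)
    if lead is not None:
        pivot_cols.append(lead)
free_cols = [j for j in range(n) if j not in pivot_cols]
print(free_cols)  # columns for x3 and x4 (0-indexed)

# Spot check: s = 1, t = 0 gives x = (-22, -5, 1, 0, 4);
# verify it against the original augmented matrix from slide 27
x = [-24 + 2*1 - 3*0, -7 + 2*1 - 2*0, 1, 0, 4]
orig = [[0,  3, -6,  6, 4, -5],
        [3, -7,  8, -5, 8,  9],
        [3, -9, 12, -9, 6, 15]]
for row in orig:
    assert sum(a * v for a, v in zip(row[:n], x)) == row[n]
```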
32–33. Question: what if the RREF of the matrix were

       [ 1  0 -2  3  0 | -24 ]
       [ 0  1 -2  2  0 |  -7 ]
       [ 0  0  0  0  0 |   1 ]

   What would be the solutions to the associated system now?
   Answer: the bottom row represents the equation 0 = 1, which has no solution. This system of equations is inconsistent.
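Inconsistency is just as mechanical to detect in an RREF: look for a row whose coefficient part is all zeros but whose right-hand side is not. A sketch (the helper name is mine):

```python
def is_inconsistent(R, n):
    """True if some row of the augmented matrix [A | b] reads 0 = nonzero."""
    return any(all(row[j] == 0 for j in range(n)) and row[n] != 0
               for row in R)

# The two RREFs from the question slides
consistent = [[1, 0, -2, 3, 0, -24],
              [0, 1, -2, 2, 0,  -7],
              [0, 0,  0, 0, 1,   4]]
inconsistent = [[1, 0, -2, 3, 0, -24],
                [0, 1, -2, 2, 0,  -7],
                [0, 0,  0, 0, 0,   1]]

print(is_inconsistent(consistent, 5))    # False
print(is_inconsistent(inconsistent, 5))  # True
```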