> with(linalg):
 

 

MATH 244 -- Linear Algebra Section 1 

Problem Set 9 Solutions 

April 20, 2007 

 

Section 5.2 

 

6.  The characteristic polynomial is $\det(A - \lambda I) = \lambda^2 - 11\lambda + 40$.

    The eigenvalues are the roots of $0 = \lambda^2 - 11\lambda + 40$, so by the quadratic
    formula:

        $\lambda = \dfrac{11 \pm \sqrt{11^2 - 4(40)}}{2} = \dfrac{11 \pm \sqrt{-39}}{2}$

    Neither root is real, so there are no real eigenvalues.
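
    As a quick check in Maple (using only the characteristic polynomial, since the
    matrix itself did not survive this export), we can ask for the roots directly:

> solve(lambda^2 - 11*lambda + 40 = 0, lambda);

    which returns the two complex roots $\frac{11}{2} \pm \frac{\sqrt{39}}{2}\,i$.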

 

10.  The characteristic polynomial is

       $\det(A - \lambda I) = -\lambda^3 + 14\lambda + 12$   (after expanding out and simplifying).
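
     This cubic has no rational roots, so a numerical root-finder is the quickest
     way to see the eigenvalues.  A sketch in Maple (again using only the
     polynomial, since the matrix was lost in this export):

> fsolve(-lambda^3 + 14*lambda + 12 = 0, lambda);

     which returns three real roots, approximately -3.20, -0.91, and 4.11.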

 

14.  The characteristic polynomial is (expanding the determinant along row 2, whose
     only nonzero entry is $1 - \lambda$):

       $\det\begin{pmatrix} 5-\lambda & -2 & 3 \\ 0 & 1-\lambda & 0 \\ 6 & 7 & -2-\lambda \end{pmatrix}
        = (1-\lambda)\bigl[(5-\lambda)(-2-\lambda) - 18\bigr]
        = (1-\lambda)(\lambda^2 - 3\lambda - 28)
        = -(\lambda - 1)(\lambda - 7)(\lambda + 4)$

     (This factorization agrees with the Maple check at the end of this worksheet.)
 

18.  The eigenspace for $\lambda = 5$ is $\mathrm{Nul}(A - 5I)$.  If we consider reducing
     $A - 5I$ to row-reduced echelon form, then there is only one free variable in
     the system $(A - 5I)\mathbf{x} = \mathbf{0}$ unless $h = 6$.  For $h = 6$ there are two free
     variables, so the eigenspace is spanned by two linearly independent eigenvectors
     and has dimension 2.  In all other cases $\dim \mathrm{Nul}(A - 5I) = 1$.
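
     A sketch of this computation in Maple.  The printed matrix did not survive this
     export; the block below assumes the matrix in the exercise is
     $A = \begin{pmatrix} 5 & -2 & 6 & -1 \\ 0 & 3 & h & 0 \\ 0 & 0 & 5 & 4 \\ 0 & 0 & 0 & 1 \end{pmatrix}$
     (an assumption), specialized to $h = 6$:

> B := matrix([[0, -2, 6, -1], [0, -2, 6, 0], [0, 0, 0, 4], [0, 0, 0, -4]]):   # assumed A - 5I, with h = 6
> gaussjord(B);
> nullspace(B);

     gaussjord returns the row-reduced echelon form (two nonzero rows, so two free
     variables), and nullspace returns a basis of two eigenvectors for the eigenspace.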

 

23.   If $A = QR$ with $Q$ invertible, then $R = Q^{-1}A$, so $A_1 = RQ = Q^{-1}AQ$.
      This shows that $A_1 = RQ$ and $A$ are similar, since the invertible matrix
      $P = Q^{-1}$ satisfies $A = P^{-1}A_1 P$
      (see the definition at the bottom of page 314 in the text).
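
      A sketch of this fact on a small example in Maple (the matrices here are
      arbitrary choices, not from the text):

> Q := matrix([[1, 2], [0, 1]]):  R := matrix([[3, 0], [1, 1]]):
> A := multiply(Q, R):  A1 := multiply(R, Q):
> multiply(multiply(inverse(Q), A), Q);   # same matrix as A1, so A and A1 are similar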

 

25.   a)  By direct calculation, $A\mathbf{v}_1 = \mathbf{v}_1$, so the given vector is an
           eigenvector with eigenvalue $\lambda = 1$.  The characteristic polynomial is
           $(\lambda - 1)(\lambda - .3)$, so the other eigenvalue is $\lambda = .3$, with
           eigenvector $\mathbf{v}_2$.  Since the eigenvalues 1 and .3 are distinct, the
           set $\{\mathbf{v}_1, \mathbf{v}_2\}$ is linearly independent (Theorem 2 on
           page 307 of the text, or direct check), hence a basis for $\mathbb{R}^2$.
 

       b)  Writing $\mathbf{x}_0$ in terms of this eigenvector basis (the weights can be
           found by row reduction) gives  $\mathbf{x}_0 = \mathbf{v}_1 - \tfrac{1}{14}\mathbf{v}_2$,
           which is the decomposition used in part (c) below.

 

       c)  This is similar to the computations you did on Discussion 3:

             $\mathbf{x}_1 = A\mathbf{x}_0 = A\bigl(\mathbf{v}_1 - \tfrac{1}{14}\mathbf{v}_2\bigr)
                           = A\mathbf{v}_1 - \tfrac{1}{14}A\mathbf{v}_2
                           = \mathbf{v}_1 - \tfrac{1}{14}(.3)\,\mathbf{v}_2$

             $\mathbf{x}_2 = A\mathbf{x}_1 = A\bigl(\mathbf{v}_1 - \tfrac{1}{14}(.3)\,\mathbf{v}_2\bigr)
                           = \mathbf{v}_1 - \tfrac{1}{14}(.3)^2\,\mathbf{v}_2$
 

 

            Similarly, for all $k \ge 1$, by mathematical induction,

              $\mathbf{x}_k = \mathbf{v}_1 - \tfrac{1}{14}(.3)^k\,\mathbf{v}_2$

            In the limit, $(.3)^k \to 0$ as $k \to \infty$, so $\mathbf{x}_k \to \mathbf{v}_1$.
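
            A sketch of this iteration in Maple.  The matrix and starting vector were
            lost in this export; the block below assumes the data in the exercise is
            $A = \begin{pmatrix} .6 & .3 \\ .4 & .7 \end{pmatrix}$ and $\mathbf{x}_0 = (.5, .5)$
            (an assumption, consistent with the eigenvalues 1 and .3 and the weight
            $-\tfrac{1}{14}$ above):

> A := matrix([[.6, .3], [.4, .7]]):  x0 := vector([.5, .5]):   # assumed data
> x1 := multiply(A, x0);
> x2 := multiply(A, x1);

            Each step damps the $\mathbf{v}_2$-component by another factor of .3, so the
            iterates approach $\mathbf{v}_1$.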

             

 

Section 5.3 

 

12.   Given that the eigenvalues are $\lambda = 2, 2, 8$, we have

        $\mathrm{Nul}(A - 2I) = \mathrm{Span}\left\{ \begin{pmatrix} -1 \\ 1 \\ 0 \end{pmatrix},
         \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix} \right\}$

        $\mathrm{Nul}(A - 8I) = \mathrm{Span}\left\{ \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} \right\}$

       Then $\mathcal{B} = \left\{ \begin{pmatrix} -1 \\ 1 \\ 0 \end{pmatrix},
        \begin{pmatrix} -1 \\ 0 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} \right\}$
       is a basis for $\mathbb{R}^3$ consisting of eigenvectors of $A$.  This shows that
       $A$ is diagonalizable, with
       $P^{-1}AP = \begin{pmatrix} 2 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 8 \end{pmatrix}$ for
       $P = \begin{pmatrix} -1 & -1 & 1 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \end{pmatrix}$.
       Check:

> multiply(multiply(inverse(matrix([[-1, -1, 1], [1, 0, 1], [0, 1, 1]])), matrix([[4, 2, 2], [2, 4, 2], [2, 2, 4]])), matrix([[-1, -1, 1], [1, 0, 1], [0, 1, 1]]));
 

                                 [2  0  0]
                                 [0  2  0]
                                 [0  0  8]
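
       (As an extra check, not part of the printed solution, linalg's eigenvects
       command returns each eigenvalue with its multiplicity and a basis for its
       eigenspace in one step:)

> eigenvects(matrix([[4, 2, 2], [2, 4, 2], [2, 2, 4]]));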

 

14.   Given the eigenvalues are $\lambda = 4, 5$ (the characteristic polynomial factors as
       $(4 - \lambda)(5 - \lambda)^2$), Theorem 7 on page 326 implies that in order for
       $A$ to be diagonalizable, we must have $\dim \mathrm{Nul}(A - 5I) = 2$.

       $A - 5I = \begin{pmatrix} -1 & 0 & -2 \\ 2 & 0 & 4 \\ 0 & 0 & 0 \end{pmatrix}$, which is
       row-equivalent to $\begin{pmatrix} 1 & 0 & 2 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{pmatrix}$, so there are 2 free
       variables in the homogeneous system $(A - 5I)\mathbf{x} = \mathbf{0}$.

       Then $\mathrm{Nul}(A - 5I) = \mathrm{Span}\left\{ \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix},
        \begin{pmatrix} -2 \\ 0 \\ 1 \end{pmatrix} \right\}$ and
       $\mathrm{Nul}(A - 4I) = \mathrm{Span}\left\{ \begin{pmatrix} -1/2 \\ 1 \\ 0 \end{pmatrix} \right\}$.

       This shows that $A$ is diagonalizable with
       $P^{-1}AP = \begin{pmatrix} 5 & 0 & 0 \\ 0 & 5 & 0 \\ 0 & 0 & 4 \end{pmatrix}$:

> P := matrix([[0, -2, -1/2], [1, 0, 1], [0, 1, 0]]);
 

                               [0  -2  -1/2]
                               [1   0     1]
                               [0   1     0]

> A := matrix([[4, 0, -2], [2, 5, 4], [0, 0, 5]]);
 

                                 [4  0  -2]
                                 [2  5   4]
                                 [0  0   5]

> multiply(multiply(inverse(P), A), P);
 

                                 [5  0  0]
                                 [0  5  0]
                                 [0  0  4]
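
       (As an extra check, not part of the printed solution, the eigenspaces can be
       computed directly; A here is the matrix defined just above:)

> nullspace(evalm(A - 5*diag(1, 1, 1)));
> nullspace(evalm(A - 4*diag(1, 1, 1)));

       The first command returns a basis of two vectors and the second a basis of
       one vector, matching the dimensions found above.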

 

24.   No -- the hypotheses of Theorem 7 on page 324 cannot be satisfied in
       this case.  The largest linearly independent set of eigenvectors will have
       size 2, so there is no basis of $\mathbb{R}^3$ consisting of eigenvectors of $A$.

 

26.   Yes -- if the remaining eigenspace has dimension only 1, then the matrix
       is not diagonalizable (see Theorem 7).  For example, take a $7 \times 7$ upper
       triangular matrix with diagonal entries $1, 1, 2, 2, 2, 3, 3$ and a single
       nonzero entry above the diagonal coupling the two $\lambda = 3$ columns.

       The eigenvalues are $\lambda = 1$ (with multiplicity 2), $\lambda = 2$ (with
       multiplicity 3), and $\lambda = 3$ (with multiplicity 2).  By the form of the
       matrix, $\mathrm{Nul}(A - I) = \mathrm{Span}\{\mathbf{e}_1, \mathbf{e}_2\}$ has dimension 2,
       $\mathrm{Nul}(A - 2I) = \mathrm{Span}\{\mathbf{e}_3, \mathbf{e}_4, \mathbf{e}_5\}$
       has dimension 3, but $\mathrm{Nul}(A - 3I)$ has dimension only 1, so $A$ is not
       diagonalizable.
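
       A sketch of such a matrix in Maple (this particular matrix is one illustrative
       choice; any matrix with these eigenspace dimensions works):

> A := diag(1, 1, 2, 2, 2, 3, 3):
> A[6, 7] := 1:                     # this entry makes the lambda = 3 eigenspace 1-dimensional
> nullspace(evalm(A - 3*diag(1, 1, 1, 1, 1, 1, 1)));

       The last command returns a single basis vector, so $\dim \mathrm{Nul}(A - 3I) = 1 < 2$.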

 

32.  Since $A$ is not invertible, $\lambda = 0$ must be one of the eigenvalues (see the
      theorem on page 312 -- the continuation of the Invertible Matrix Theorem).  By
      Theorem 6 on page 323, if the other eigenvalue is $\lambda = 1$ (or anything other
      than 0), then $A$ will be diagonalizable.

      This means that $A$ should be similar to the diagonal matrix
      $D = \begin{pmatrix} 0 & 0 \\ 0 & 1 \end{pmatrix}$.  To get $A$ not diagonal we can
      try taking an arbitrary invertible matrix like
      $P = \begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}$ and setting $A = P^{-1}DP$.
      Then

> A := multiply(multiply(matrix([[1/2, -1/2], [1/2, 1/2]]), matrix([[0, 0], [0, 1]])), matrix([[1, 1], [-1, 1]]));
 

                                [ 1/2  -1/2]
                                [-1/2   1/2]

     is a matrix that satisfies all of the properties we want.   
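
      (As a check, not part of the printed solution, the eigenvalues of this A can be
      computed directly:)

> eigenvals(A);

      which returns 0 and 1, as required.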

 

 

Section 6.2 

 

6.   Call the three vectors $\mathbf{v}_1 = (5, -4, 0, 3)$, $\mathbf{v}_2 = (-4, 1, -3, 8)$,
     and $\mathbf{v}_3 = (3, 3, 5, -1)$.  We have

     $\mathbf{v}_1 \cdot \mathbf{v}_2 = (5)(-4) + (-4)(1) + (0)(-3) + (3)(8) = -20 - 4 + 0 + 24 = 0$

     $\mathbf{v}_1 \cdot \mathbf{v}_3 = (5)(3) + (-4)(3) + (0)(5) + (3)(-1) = 15 - 12 + 0 - 3 = 0$

     $\mathbf{v}_2 \cdot \mathbf{v}_3 = (-4)(3) + (1)(3) + (-3)(5) + (8)(-1) = -12 + 3 - 15 - 8 = -32 \ne 0$

 

     So the set is not orthogonal.  (It would be enough just to show
     $\mathbf{v}_2 \cdot \mathbf{v}_3 \ne 0$, of course.)
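
     (These dot products can be checked with linalg's dotprod, using the vectors read
     off from the computations above:)

> v1 := vector([5, -4, 0, 3]):  v2 := vector([-4, 1, -3, 8]):  v3 := vector([3, 3, 5, -1]):
> dotprod(v2, v3);

     which returns -32, confirming that the set is not orthogonal.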

 

10.  It is clear that $\mathbf{u}_1 \cdot \mathbf{u}_2 = \mathbf{u}_1 \cdot \mathbf{u}_3 = \mathbf{u}_2 \cdot \mathbf{u}_3 = 0$ here.  This implies that
     $\mathcal{S} = \{\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3\}$ is an orthogonal set.  By Theorem 4 in this section, it
     follows that $\mathcal{S}$ is linearly independent.  (This can also be checked
     directly:  If $c_1\mathbf{u}_1 + c_2\mathbf{u}_2 + c_3\mathbf{u}_3 = \mathbf{0}$ for some scalars $c_1, c_2, c_3$, then
     we can take dot products with $\mathbf{u}_1, \mathbf{u}_2, \mathbf{u}_3$:

 

          $0 = \mathbf{0} \cdot \mathbf{u}_1 = c_1(\mathbf{u}_1 \cdot \mathbf{u}_1) + c_2(\mathbf{u}_2 \cdot \mathbf{u}_1) + c_3(\mathbf{u}_3 \cdot \mathbf{u}_1) = c_1(\mathbf{u}_1 \cdot \mathbf{u}_1)$

 

     Since $\mathbf{u}_1 \cdot \mathbf{u}_1 \ne 0$, $c_1 = 0$.  Similarly, dotting with $\mathbf{u}_2$ shows $c_2 = 0$ and
     dotting with $\mathbf{u}_3$ shows $c_3 = 0$.)  It follows that $\mathcal{S}$ is a basis for $\mathbb{R}^3$.

 

     Then, because of the orthogonality, there is a shortcut we can use for
     computing the weights in a linear combination:

               $\mathbf{x} = c_1\mathbf{u}_1 + c_2\mathbf{u}_2 + c_3\mathbf{u}_3$

 

     For each $i = 1, 2, 3$, we have

              $\mathbf{x} \cdot \mathbf{u}_i = c_i\,(\mathbf{u}_i \cdot \mathbf{u}_i)$   (since the dot product of $\mathbf{u}_i$ with the other two
                                         terms is zero).

 

     Hence $c_i = \dfrac{\mathbf{x} \cdot \mathbf{u}_i}{\mathbf{u}_i \cdot \mathbf{u}_i}$ for each $i = 1, 2, 3$.
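
     A sketch of this shortcut in Maple (the orthogonal vectors here are illustrative
     choices, not the ones printed in the text):

> u1 := vector([3, -3, 0]):  u2 := vector([2, 2, -1]):  u3 := vector([1, 1, 4]):
> x := vector([5, -3, 1]):
> c1 := dotprod(x, u1)/dotprod(u1, u1);
> c2 := dotprod(x, u2)/dotprod(u2, u2);
> c3 := dotprod(x, u3)/dotprod(u3, u3);

     One can then verify that evalm(c1*u1 + c2*u2 + c3*u3) reproduces x.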

 

       

 

12.   The orthogonal projection onto the line spanned by $\mathbf{u}$ is defined by

         $\mathrm{proj}_{\mathbf{u}}\,\mathbf{x} = \dfrac{\mathbf{x} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}\,\mathbf{u}$

       With the $\mathbf{x}$ and $\mathbf{u}$ given in the text, we substitute into this formula.
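
       A sketch of this formula in Maple (the vectors here are illustrative choices,
       since the ones printed in the text were lost in this export):

> u := vector([4, 2]):  x := vector([1, 3]):
> evalm((dotprod(x, u)/dotprod(u, u)) * u);   # the projection of x onto Span{u}

       Here $\mathbf{x} \cdot \mathbf{u} = 10$ and $\mathbf{u} \cdot \mathbf{u} = 20$, so the projection is $\tfrac{1}{2}\mathbf{u} = (2, 1)$.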

 

 

Check of the characteristic polynomial from Section 5.2, #14:

> factor(det(matrix([[5-lambda, -2, 3], [0, 1-lambda, 0], [6, 7, -2-lambda]])));
 

-(lambda-1)*(lambda-7)*(lambda+4) 

