MATH 244 -- Linear Algebra Section 1
Problem Set 9 Solutions
April 20, 2007
Section 5.2
6. The characteristic polynomial is $\det(A - \lambda I)$, a quadratic in $\lambda$.
The eigenvalues are the roots of the equation $\det(A - \lambda I) = 0$, found by the
quadratic formula. Neither root is real, so there are no real eigenvalues.
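The quadratic-formula computation can be sketched numerically. The matrix below is a hypothetical stand-in (a 90-degree rotation), not the matrix from the problem; it illustrates how a negative discriminant produces a non-real pair of eigenvalues.

```python
import cmath

# Hypothetical 2x2 matrix (not the textbook's): a 90-degree rotation.
a, b, c, d = 0, -1, 1, 0

# Characteristic polynomial: lambda^2 - (a + d)*lambda + (a*d - b*c).
tr, det = a + d, a * d - b * c
disc = tr * tr - 4 * det                     # discriminant; here it is -4
roots = ((tr + cmath.sqrt(disc)) / 2,
         (tr - cmath.sqrt(disc)) / 2)
print(roots)  # (1j, -1j): neither root is real, so no real eigenvalues
```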
10. The characteristic polynomial is found by expanding out $\det(A - \lambda I)$ and
simplifying.
14. The characteristic polynomial is computed by a cofactor expansion of the determinant
$\det(A - \lambda I)$ along row 2.
18. The eigenspace for the given eigenvalue is the null space of $A - \lambda I$. If we
reduce the corresponding matrix to reduced row echelon form, then there is only one free
variable in the system unless $h = 6$. For $h = 6$, the reduced row echelon form has two
free variables, so the eigenspace has dimension 2. In all other cases the eigenspace has
dimension 1.
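The dependence of the eigenspace dimension on a parameter can be checked numerically. The family below is a hypothetical stand-in for $A - \lambda I$ (not the problem's matrix), chosen so that the rank drops exactly at $h = 6$:

```python
import numpy as np

# Hypothetical family (not the textbook's matrix): M(h) plays the role of
# A - lambda*I, and its null space is the eigenspace.
def eigenspace_dim(h):
    M = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, float(h)],   # proportional to row 1 iff h = 6
                  [0.0, 0.0, 0.0]])
    return 3 - np.linalg.matrix_rank(M)   # number of free variables

print(eigenspace_dim(6), eigenspace_dim(5))  # 2 1
```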
23. If $A = PBP^{-1}$ for some invertible matrix $P$, then $A$ and $B$ are similar by
definition. Here the given invertible matrix $P$ satisfies exactly this relation, which
shows that the two matrices in question are similar (see the definition at the bottom of
page 314 in the text).
25. a) By direct calculation, $Av = v$, so the given vector is an eigenvector with
eigenvalue $\lambda = 1$. The characteristic polynomial is quadratic, and its other root
gives the other eigenvalue, $\lambda = .3$. Since the eigenvalues 1 and .3 are distinct,
the set of corresponding eigenvectors is linearly independent (Theorem 2 on page 307 of
the text, or a direct check), hence a basis for $\mathbb{R}^2$.
b)
c) This is similar to the computations you did on Discussion 3. Write the initial vector
in terms of the eigenvector basis: $x_0 = c_1 v_1 + c_2 v_2$. Then
$A x_0 = c_1 v_1 + (.3)\, c_2 v_2$, and similarly, for all $k$, by mathematical induction,
$A^k x_0 = c_1 v_1 + (.3)^k c_2 v_2$. In the limit as $k \to \infty$, $(.3)^k \to 0$, so
$A^k x_0 \to c_1 v_1$.
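The convergence of the iterates can be seen numerically. The matrix below is a hypothetical example (not the problem's matrix) built to have eigenvalues 1 and 0.3, matching the structure above:

```python
# Hypothetical 2x2 matrix (not the textbook's) with eigenvalues 1 and 0.3:
# for A = [[a, b], [b, a]] the eigenvalues are a + b and a - b.
A = [[0.65, 0.35], [0.35, 0.65]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

x = [1.0, 0.0]
for _ in range(50):          # after 50 steps the (0.3)^k term is negligible
    x = matvec(A, x)

# The iterates converge to the eigenvalue-1 component, here (0.5, 0.5).
print(x)
```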
Section 5.3
12. Given that the eigenvalues are known, we find an eigenvector for each one by solving
$(A - \lambda I)x = 0$. Then $\mathcal{B}$, the set of these eigenvectors, is a basis for
$\mathbb{R}^n$ consisting of eigenvectors of $A$. This shows that $A$ is diagonalizable,
with $A = PDP^{-1}$ for $P$ the matrix whose columns are the eigenvectors in $\mathcal{B}$
and $D$ the diagonal matrix of the corresponding eigenvalues.
Check: multiplying out $PDP^{-1}$ returns the original matrix $A$.
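This kind of check can be carried out numerically. The matrix below is a hypothetical example (not the problem's matrix), with its eigenvectors found by hand as in the solution:

```python
import numpy as np

# Hypothetical example (not the textbook's matrix): eigenvalues 4 and 2.
A = np.array([[4.0, 0.0],
              [1.0, 2.0]])
# Eigenvectors found by solving (A - lambda*I)x = 0 for each eigenvalue:
P = np.array([[2.0, 0.0],    # columns: eigenvectors for lambda = 4 and 2
              [1.0, 1.0]])
D = np.diag([4.0, 2.0])

# Check: P D P^{-1} reproduces A.
print(np.allclose(A, P @ D @ np.linalg.inv(P)))  # True
```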
14. Given the eigenvalues (the characteristic polynomial factors, with one eigenvalue
repeated), Theorem 7 on page 326 implies that in order for $A$ to be diagonalizable,
the eigenspace of the repeated eigenvalue must have dimension equal to its multiplicity.
Row-reducing $A - \lambda I$ for that eigenvalue, there are 2 free variables in the
homogeneous system $(A - \lambda I)x = 0$, so that eigenspace is 2-dimensional. Together
with an eigenvector for the remaining eigenvalue, this gives a basis of eigenvectors.
This shows that $A$ is diagonalizable, with $A = PDP^{-1}$.
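The dimension count behind Theorem 7 can be verified numerically. The matrix below is a hypothetical example (not the problem's matrix) with a repeated eigenvalue whose eigenspace is 2-dimensional:

```python
import numpy as np

# Hypothetical lower-triangular matrix (not the textbook's):
# eigenvalues 2, 2, 1. For diagonalizability, the eigenspace for
# lambda = 2 must be 2-dimensional.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [1.0, 1.0, 1.0]])

dim2 = 3 - np.linalg.matrix_rank(A - 2 * np.eye(3))
dim1 = 3 - np.linalg.matrix_rank(A - 1 * np.eye(3))
print(dim2, dim1)  # 2 1 -> 3 independent eigenvectors, so A is diagonalizable
```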
24. No -- the hypotheses of Theorem 7 on page 324 cannot be satisfied in
this case. The largest linearly independent set of eigenvectors will have
size 2, so there is no basis consisting of eigenvectors of $A$.
26. Yes -- if the remaining eigenspace has dimension only 1, then the matrix is not
diagonalizable (see Theorem 7). The following matrix gives an example. Its eigenvalues
are three distinct numbers, with multiplicities 2, 3, and 2 respectively. By the form of
the matrix, the first eigenspace has dimension 2 and the second has dimension 3, but the
third eigenspace has dimension only 1, so there is no basis of eigenvectors.
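A matrix of this shape can be built and checked numerically. The $7 \times 7$ example below is hypothetical (not necessarily the one in the solution), with the multiplicities 2, 3, 2 stated above:

```python
import numpy as np

# Hypothetical 7x7 example (not necessarily the text's matrix):
# eigenvalues 1, 2, 3 with algebraic multiplicities 2, 3, 2.
A = np.diag([1.0, 1, 2, 2, 2, 3, 3])
A[5, 6] = 1.0   # a single off-diagonal entry shrinks the eigenspace for 3

# Eigenspace dimension = 7 - rank(A - lambda*I).
dims = [7 - np.linalg.matrix_rank(A - lam * np.eye(7)) for lam in (1, 2, 3)]
print(dims)  # [2, 3, 1]: only 6 independent eigenvectors, not diagonalizable
```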
32. Since $A$ is not invertible, $\lambda = 0$ must be one of the eigenvalues (see the
theorem on page 312 -- the continuation of the Invertible Matrix Theorem). By Theorem 6
on page 323, if the other eigenvalue is anything other than 0, then $A$ will be
diagonalizable. This means that $A$ should be similar to the diagonal matrix $D$ with
diagonal entries 0 and the nonzero eigenvalue. To get $A$ not diagonal, we can try
taking an arbitrary invertible matrix $P$ that is not itself diagonal. Then $A = PDP^{-1}$
is a matrix that satisfies all of the properties we want.
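This construction can be sketched numerically. The nonzero eigenvalue and the matrix $P$ below are hypothetical choices (the problem allows many):

```python
import numpy as np

# Hypothetical choices: nonzero eigenvalue 2, D = diag(0, 2), and an
# arbitrary invertible P that is not diagonal.
D = np.diag([0.0, 2.0])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])
A = P @ D @ np.linalg.inv(P)

print(A)                      # [[0. 2.] [0. 2.]] -- not diagonal
print(np.linalg.det(A))       # 0.0 -- A is not invertible, as required
```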
Section 6.2
6. Call the three vectors $u_1$, $u_2$, $u_3$. Computing the pairwise dot products, at
least one of them is nonzero, so the set is not orthogonal. (It would be enough just to
show a single nonzero dot product, of course.)
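The pairwise test can be sketched in code. The vectors below are hypothetical stand-ins (not the problem's vectors):

```python
from itertools import combinations

# Hypothetical vectors (not the textbook's). A set is orthogonal iff every
# pairwise dot product is zero; one nonzero pair disqualifies it.
vecs = [(1, 2, 1), (1, 0, -1), (2, 1, 0)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

pairs = {(i, j): dot(vecs[i], vecs[j]) for i, j in combinations(range(3), 2)}
print(pairs)  # (0,1) -> 0, (0,2) -> 4, (1,2) -> 2: not an orthogonal set
```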
10. It is clear that $u_1 \cdot u_2 = u_1 \cdot u_3 = u_2 \cdot u_3 = 0$ here. This
implies that $\{u_1, u_2, u_3\}$ is an orthogonal set. By Theorem 4 in this section, it
follows that $\{u_1, u_2, u_3\}$ is linearly independent. (This can also be checked
directly: if $c_1 u_1 + c_2 u_2 + c_3 u_3 = 0$ for some scalars $c_1, c_2, c_3$, then
we can take dot products with $u_1$:
$0 = (c_1 u_1 + c_2 u_2 + c_3 u_3) \cdot u_1 = c_1 (u_1 \cdot u_1)$.
Since $u_1 \cdot u_1 \neq 0$, this forces $c_1 = 0$. Similarly, dotting with $u_2$
shows $c_2 = 0$, and dotting with $u_3$ shows $c_3 = 0$.) It follows that
$\{u_1, u_2, u_3\}$ is a basis for $\mathbb{R}^3$.
Then, because of the orthogonality, there is a shortcut we can use for computing the
weights in a linear combination $y = c_1 u_1 + c_2 u_2 + c_3 u_3$: for each
$i = 1, 2, 3$, we have $y \cdot u_i = c_i (u_i \cdot u_i)$ (since the dot product of
$u_i$ with the other two terms is zero). Hence
$c_i = \dfrac{y \cdot u_i}{u_i \cdot u_i}$ for each $i = 1, 2, 3$.
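The weight shortcut can be sketched with a hypothetical orthogonal set (not the problem's vectors):

```python
# Hypothetical orthogonal set in R^3 (not the textbook's vectors):
u = [(1, 1, 0), (1, -1, 0), (0, 0, 1)]
y = (3, 1, 4)

def dot(a, b):
    return sum(x * z for x, z in zip(a, b))

# Shortcut: c_i = (y . u_i) / (u_i . u_i); no linear system to solve.
c = [dot(y, ui) / dot(ui, ui) for ui in u]
print(c)  # [2.0, 1.0, 4.0]

# Check that the linear combination reproduces y:
recon = tuple(sum(ci * ui[k] for ci, ui in zip(c, u)) for k in range(3))
print(recon)  # (3.0, 1.0, 4.0)
```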
12. The orthogonal projection onto the line spanned by $u$ is defined by
$\operatorname{proj}_u y = \dfrac{y \cdot u}{u \cdot u}\, u$. With the given $y$ and
$u$, we substitute into this formula to get the projection.
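The projection formula can be sketched numerically; the vectors below are hypothetical stand-ins (not the problem's):

```python
# Hypothetical vectors (not the textbook's): project y onto the line
# through u using proj_u(y) = ((y . u) / (u . u)) * u.
y = (7, 6)
u = (4, 2)

def dot(a, b):
    return sum(x * z for x, z in zip(a, b))

t = dot(y, u) / dot(u, u)            # scalar weight: 40 / 20 = 2.0
proj = tuple(t * uk for uk in u)     # (8.0, 4.0)

# The residual y - proj is orthogonal to u, as a projection should be.
resid = tuple(yk - pk for yk, pk in zip(y, proj))
print(proj, dot(resid, u))  # (8.0, 4.0) 0.0
```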