MATH 244 -- Linear Algebra
Problem Set 7 Solutions
March 30, 2007
Section 4.3
24. Let A be the n x n matrix with columns b₁, …, bₙ, the vectors in ℬ.
Since ℬ is linearly independent, by the Invertible Matrix Theorem (Section 2.3)
A is invertible. Hence for all b in ℝⁿ, the
system of linear equations Ax = b
has a solution.
But this implies Span(ℬ) = ℝⁿ. Therefore, ℬ
is a basis for ℝⁿ.
(Note: this also follows from the reasoning given
in the proof of Section 4.5/26 below.)
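As a numerical sketch of this argument (the vectors below are invented for illustration, not the ones from the exercise), NumPy confirms that a matrix with linearly independent columns is invertible, so Ax = b is solvable for every b:

```python
import numpy as np

# Hypothetical linearly independent set B = {b1, b2, b3} in R^3
# (not the vectors from the exercise).
A = np.array([[1.0, 0.0, 2.0],
              [1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])

# Independent columns <=> nonzero determinant <=> A invertible (IMT).
assert abs(np.linalg.det(A)) > 1e-12

# Hence Ax = b has a solution for every b, i.e. Span(B) = R^3.
b = np.array([4.0, -1.0, 2.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```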
26. The set S = {sin t, sin 2t, sin t cos t} is linearly
dependent because of a trigonometric identity: sin 2t = 2 sin t cos t
for all t. Hence Span(S) = Span{sin t, sin 2t}.
The set {sin t, sin 2t} is linearly independent,
since if c₁ sin t + c₂ sin 2t = 0 for all t, then
setting t = π/2 shows c₁ = 0, hence c₂ sin 2t = 0 for all t.
But then c₂ = 0 as well, since sin 2t is not the zero
function. Therefore {sin t, sin 2t} is a basis for Span(S).
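A quick numerical check (sampling the functions at a few points, an approach not in the printed solution) supports both claims:

```python
import numpy as np

t = np.linspace(0.0, 3.0, 7)  # a handful of sample points

# Dependence: sin 2t = 2 sin t cos t holds at every t.
assert np.allclose(np.sin(2 * t), 2 * np.sin(t) * np.cos(t))

# Independence of {sin t, sin 2t}: the matrix of samples has rank 2,
# so no nontrivial combination c1 sin t + c2 sin 2t vanishes even at
# these few points (let alone for all t).
M = np.column_stack([np.sin(t), np.sin(2 * t)])
assert np.linalg.matrix_rank(M) == 2
```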
31. If {v₁, …, vₚ} is linearly dependent, then there exist
scalars c₁, …, cₚ, not all equal to zero, such that 0 =
c₁v₁ + ··· + cₚvₚ.
Since T is assumed to be a linear mapping, applying it
to both sides of the last equation shows
T(0) = c₁T(v₁) + ··· + cₚT(vₚ).
But T(0) = 0
for all linear mappings, so
0 = c₁T(v₁) + ··· + cₚT(vₚ).
Since the cᵢ
are not all zero (they are the same scalars as
in the original linear dependence on the vᵢ),
this shows that {T(v₁), …, T(vₚ)}
is also linearly dependent.
32. Assume that there are scalars c₁, …, cₚ,
not all equal to zero, such that
0 = c₁T(v₁) + ··· + cₚT(vₚ).
Since T
is linear, this implies
T(c₁v₁ + ··· + cₚvₚ) = 0 = T(0).
Since T is one-to-one,
this implies that
c₁v₁ + ··· + cₚvₚ = 0.
Since c₁, …, cₚ are not all
equal to zero, this implies that {v₁, …, vₚ}
is linearly dependent.
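A small numerical illustration of both directions (the matrix and vectors are invented): a linear map carries a dependence relation to the images, and a one-to-one map cannot create new dependences:

```python
import numpy as np

# Hypothetical dependent set in R^3: v3 = v1 + v2.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + v2

# A linear map T (here an invertible, hence one-to-one, matrix).
T = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])

# Problem 31: the same relation holds among the images.
assert np.allclose(T @ v3, T @ v1 + T @ v2)

# Problem 32 (contrapositive): a one-to-one T preserves independence,
# so the images of the independent pair {v1, v2} stay independent.
assert np.linalg.matrix_rank(np.column_stack([T @ v1, T @ v2])) == 2
```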
34. ``By inspection'', p₁ + p₂ = p₃, since
(1 + t) + (1 − t) = 2 for all t. As in 26 above, this
relation shows Span{p₁, p₂, p₃} = Span{p₁, p₂} (or the span of
any of the other two-element subsets). The set {p₁, p₂}
is linearly independent, since
c₁(1 + t) + c₂(1 − t) = 0 as a polynomial
implies c₁ + c₂ = 0 and c₁ − c₂ = 0.
It is easy to check that
the only solution of this 2 x 2 homogeneous system of linear
equations is c₁ = c₂ = 0.
Thus, {p₁, p₂}
is one basis for Span{p₁, p₂, p₃}.
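In coordinates relative to the monomial basis {1, t}, the same computation can be checked mechanically (this encoding is an illustration, not part of the printed solution):

```python
import numpy as np

# Coefficient vectors relative to {1, t}: p1 = 1 + t, p2 = 1 - t, p3 = 2.
p1 = np.array([1.0, 1.0])
p2 = np.array([1.0, -1.0])
p3 = np.array([2.0, 0.0])

# The dependence relation p1 + p2 = p3.
assert np.allclose(p1 + p2, p3)

# {p1, p2} is independent: the 2 x 2 coefficient matrix has full rank,
# so c1*p1 + c2*p2 = 0 forces c1 = c2 = 0.
C = np.column_stack([p1, p2])
assert np.linalg.matrix_rank(C) == 2
```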
Section 4.4
10. By the standard ``recipe'', the change of coordinates matrix is
P_ℬ = [b₁ b₂ ··· bₙ].
(The columns of this matrix are the coordinate vectors of the
basis ℬ with respect to the standard basis.)
12. x = P_ℬ[x]_ℬ,
so [x]_ℬ = P_ℬ⁻¹x.
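With a made-up basis of ℝ² (not the one in the exercise), the recipe looks like this:

```python
import numpy as np

# Hypothetical basis B = {(2, 1), (-1, 1)} of R^2 and a vector x.
P_B = np.array([[2.0, -1.0],
                [1.0,  1.0]])   # columns are the basis vectors
x = np.array([4.0, 5.0])

# [x]_B = P_B^{-1} x; solving the system avoids forming the inverse.
x_B = np.linalg.solve(P_B, x)   # x_B is [3., 2.]

# Check: the coordinates reproduce x as a combination of basis vectors.
assert np.allclose(P_B @ x_B, x)
```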
14. We can follow the method used in problem 12. Take
the "standard basis" ε = {1, t, t², …} of monomials.
Then form the matrix P
(columns are the coordinate vectors of the
polynomials in ℬ relative to ε).
Then [p]_ℬ = P⁻¹[p]_ε.
Check: using the entries of [p]_ℬ as weights in a linear combination
of the polynomials in ℬ gives back the original polynomial p.
OK!
18. Follow the standard ``recipe'' for getting the coordinate
vectors. The unique linear combination of the vectors in ℬ
that equals bᵢ
is bᵢ = 0·b₁ + ··· + 1·bᵢ + ··· + 0·bₙ.
Picking off the scalars here and placing them into a vector
in ℝⁿ,
we get [bᵢ]_ℬ = eᵢ,
the ith standard basis vector, which is the
same as the ith column of the n x n identity matrix.
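The fact that [bᵢ]_ℬ = eᵢ is easy to confirm numerically for a sample basis (again invented for illustration):

```python
import numpy as np

# Any invertible matrix works; its columns are a basis b1, ..., bn of R^n.
P_B = np.array([[1.0, 2.0, 0.0],
                [0.0, 1.0, 3.0],
                [1.0, 0.0, 1.0]])
n = P_B.shape[1]

# Coordinates of b_i relative to B solve P_B c = b_i, and equal e_i.
for i in range(n):
    b_i = P_B[:, i]
    c = np.linalg.solve(P_B, b_i)
    assert np.allclose(c, np.eye(n)[:, i])
```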
22. The matrix that does this is the change of coordinates matrix
P_ℬ⁻¹. This is the inverse of the matrix whose
columns are the (standard coordinates of the) vectors in the basis
ℬ.
28. With respect to the basis {1, t, t², t³} for ℙ₃, the coordinate
vectors of the polynomials are the columns of a 4 x 3 matrix.
The mapping p ↦ [p]_ℬ
is an isomorphism of vector spaces
(linear, 1-1, and onto). This implies that the polynomials are
linearly independent if and only if the coordinate vectors are.
To check linear dependence or independence, we apply our
standard methods. Reducing the 4 x 3 matrix with these
columns to echelon form, we find a column without a pivot.
Since there is a free variable in the corresponding homogeneous
system, there is a nontrivial solution, so the vectors (and polynomials)
are linearly dependent.
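The same test can be run in NumPy; the three polynomials below are stand-ins with an obvious dependence (p₃ = p₁ + p₂), not the ones from the text:

```python
import numpy as np

# Coordinate vectors relative to {1, t, t^2, t^3} for three hypothetical
# polynomials: p1 = 1 + t, p2 = t^2 - t, p3 = 1 + t^2 (= p1 + p2).
M = np.array([[1.0,  0.0, 1.0],
              [1.0, -1.0, 0.0],
              [0.0,  1.0, 1.0],
              [0.0,  0.0, 0.0]])

# Rank < number of columns <=> a free variable in Mx = 0 <=> dependence.
assert np.linalg.matrix_rank(M) < M.shape[1]
```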
Section 4.5
26. Let ℬ = {b₁, …, bₙ}
be a basis for H, so that n = dim H = dim V.
We will argue by contradiction. Suppose
H is strictly contained in V. Then there is some
x in V such that x ∉ H.
We claim that the set ℬ′ = {b₁, …, bₙ, x}
must be linearly independent.
To see this, consider a linear combination
c₁b₁ + ··· + cₙbₙ + cx = 0.
If c ≠ 0,
then we could use this equation to solve for
x, and that would say x ∈ Span(ℬ) = H.
So c = 0.
But then the remaining terms only involve vectors in
ℬ, and we are assuming ℬ is a basis, hence linearly
independent. Therefore cᵢ = 0
for all i. But this
leads to a contradiction, because ℬ′ is then a linearly
independent set of n + 1 vectors in V, while dim V = n
also.
In an n-dimensional vector space, any set of n + 1
vectors is linearly dependent (Theorem 9 in this section
of the text). Therefore H = V.
28. The subset {1, t, t², …} = {tⁿ
for all n ≥ 0}
spans the subspace of the space of all real-valued functions
consisting of all polynomial
functions. This set is linearly independent because
any nonzero polynomial has only finitely many real roots.
(That is, the only polynomial that is the zero function
is the zero polynomial, with all coefficients equal to zero.)
This implies that this space of functions
is not a finite-dimensional vector
space (Theorem 9 again -- in a vector space of dimension
n, the largest linearly independent sets have n elements).
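One way to see the independence of 1, t, …, tⁿ concretely (a sketch, not part of the printed solution): sampling these n + 1 monomials at n + 1 distinct points gives an invertible Vandermonde matrix, so no nontrivial combination can even vanish at all the sample points:

```python
import numpy as np

n = 5
t = np.arange(n + 1, dtype=float)        # n + 1 distinct sample points

# Column j holds t^j sampled at the points: a Vandermonde matrix.
V = np.column_stack([t ** j for j in range(n + 1)])

# Full rank: the only combination c0 + c1 t + ... + cn t^n vanishing
# at all the points is c = 0, so {1, t, ..., t^n} is independent.
assert np.linalg.matrix_rank(V) == n + 1
```

Since this works for every n, no finite linearly independent set can be maximal, matching the Theorem 9 argument above.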
32. To prove dim T(H) = dim H,
it suffices to show that if ℬ = {b₁, …, bₚ}
is a basis for H,
then {T(b₁), …, T(bₚ)}
is a basis for T(H)
(do you see why?)
First, note that the results of problems 31 and 32 from section
4.3 above say that when T is 1-1 (as assumed here),
{b₁, …, bₚ} is linearly dependent if and only if
{T(b₁), …, T(bₚ)}
is linearly dependent. Thus, it follows by logic that
{b₁, …, bₚ} is linearly independent if and only if
{T(b₁), …, T(bₚ)}
is linearly independent too.
Next, since ℬ spans H,
for all h in H
we have h = c₁b₁ + ··· + cₚbₚ
for some scalars c₁, …, cₚ.
Since T is linear this implies
T(h) = c₁T(b₁) + ··· + cₚT(bₚ).
This shows T(H)
is contained in Span{T(b₁), …, T(bₚ)}.
On the other hand, we can
take any linear combination as on the right of the last equation,
use the linearity of T ``in reverse'', and see that it equals
T(h) for some h in H.
This shows that Span{T(b₁), …, T(bₚ)}
is equal to T(H).
Since {T(b₁), …, T(bₚ)}
spans T(H)
and
is linearly independent, it is a basis for T(H).
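For a concrete (invented) instance: take H spanned by two vectors in ℝ³ and an injective linear map T into ℝ⁴; the images of the basis stay independent, so dim T(H) = dim H = 2:

```python
import numpy as np

# Hypothetical basis of a 2-dimensional subspace H of R^3.
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])           # columns b1, b2

# Hypothetical injective linear map T : R^3 -> R^4 (full column rank).
T = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
assert np.linalg.matrix_rank(T) == 3   # T is one-to-one

# The images T(b1), T(b2) are independent, hence a basis of T(H).
TB = T @ B
assert np.linalg.matrix_rank(TB) == 2  # dim T(H) = dim H = 2
```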