Mathematics 244, section 1 -- Linear Algebra
Review Sheet for Final Exam
May 3, 2004
General Information
As announced in the course syllabus, the final exam of the semester will
be given at 8:30 am on Monday, May 10, in our regular classroom, SW 302.
You will have the full 3-hour period to work on this exam.
The exam will be roughly twice the length of each of the three in-class
exams, though, so if you are well prepared and work steadily, you should
be able to complete it in two hours without time pressure. The questions
will be similar to those from the in-class exams.
Again, one question may (i.e. will!) consist of several "true-false"
questions, where you must either give a short proof (if you think the
statement is true) or a counterexample (if you think the statement is
false).
Topics to be Covered
This will be a comprehensive examination, covering all the topics we
have studied since the start of the semester. This includes:
1. The axioms for vector spaces -- showing that a set is or is not
   a vector space using the axioms
2. The key examples of vector spaces: R^k, P_k(R), various function
   spaces, and their properties
3. Subspaces of a vector space -- know how to show that a subset
   of a vector space is or is not a subspace using the definition.
4. Linear combinations and the linear span of a set of vectors
5. Linear dependence and independence (know the definitions and how
   to show whether a set is linearly independent or not)
6. Reduction to echelon form as a method for solving systems
   of linear equations and inverting matrices.
7. Bases of a vector space and the dimension of a vector space
8. Linear mappings, kernels, images, etc.
9. The dimension theorem: if T : V -> W is linear and V is
   finite-dimensional, then
   dim(V) = dim(Ker(T)) + dim(Im(T)),
   and its consequences for injectivity, surjectivity, isomorphisms, etc.
10. Matrix representations of a linear mapping
11. Basic theorems about sums and compositions of linear mappings
    and the corresponding facts about matrices
12. Change of basis
13. Eigenvalues and eigenvectors
14. Determinants and the characteristic polynomial of a mapping or
    matrix. Know how to use the characteristic polynomial to
    find all the eigenvalues and eigenvectors of a linear
    mapping T : V -> V. I will stick to small dimensions here:
    dim(V) <= 4. In any example you will need to work out, the
    characteristic polynomial will be set up to factor "nicely".
15. Diagonalizability; the characterization of diagonalizable linear
    mappings T : V -> V.
16. The dot product on R^n and its properties.
17. The Spectral Theorem for real symmetric matrices (symmetric
    mappings) -- both forms: the statement that there exists an
    orthonormal basis consisting of eigenvectors of the symmetric
    mapping, and the statement giving the spectral decomposition of
    A and I in terms of the orthogonal projections onto the eigenspaces.
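Two of the topics above lend themselves to a quick concrete check. The
sketch below (a 2x2 symmetric matrix of my own choosing, not an example
from class) finds the eigenvalues from the characteristic polynomial and
then verifies both forms of the spectral decomposition:

```python
# A small worked check on a 2x2 symmetric matrix (an illustration only).
import math

A = [[2, 1],
     [1, 2]]

# Characteristic polynomial of a 2x2 matrix:
#   det(A - t*I) = t^2 - tr(A)*t + det(A) = t^2 - 4t + 3 = (t - 1)(t - 3)
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
disc = math.sqrt(tr * tr - 4 * det)
lam1, lam2 = (tr - disc) / 2, (tr + disc) / 2   # eigenvalues 1.0 and 3.0

# Spectral Theorem, both forms: with eigenvectors u1 = (1, -1) for lam1
# and u2 = (1, 1) for lam2, the orthogonal projections
# P_i = u_i u_i^T / (u_i . u_i) satisfy
#   A = lam1*P1 + lam2*P2   and   I = P1 + P2.
def proj(u):
    """Orthogonal projection matrix onto span{u}."""
    n = u[0] * u[0] + u[1] * u[1]
    return [[u[i] * u[j] / n for j in range(2)] for i in range(2)]

P1, P2 = proj([1, -1]), proj([1, 1])
A_rebuilt = [[lam1 * P1[i][j] + lam2 * P2[i][j] for j in range(2)]
             for i in range(2)]
I_rebuilt = [[P1[i][j] + P2[i][j] for j in range(2)] for i in range(2)]

print(lam1, lam2)   # 1.0 3.0
print(A_rebuilt)    # [[2.0, 1.0], [1.0, 2.0]]
print(I_rebuilt)    # [[1.0, 0.0], [0.0, 1.0]]
```

On the exam you would do this by hand, of course; the point of the sketch
is only that the two decompositions really do reassemble A and I.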
Proofs to Know
See the review sheets for midterm exams 1 - 3, plus the following
additional proof:
- If A is a symmetric matrix, and x, y are eigenvectors of A with
  distinct eigenvalues, then x and y are orthogonal vectors.
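This claim is easy to sanity-check numerically (the exam asks for the
proof, not a check). A minimal sketch with a 2x2 symmetric matrix of my
own choosing:

```python
# Numeric illustration only: for the symmetric matrix A, x and y are
# eigenvectors with the distinct eigenvalues 1 and 3, and their dot
# product is 0.
A = [[2, 1],
     [1, 2]]

def matvec(M, v):
    """Matrix-vector product for 2x2 M."""
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

x = [1, -1]
y = [1, 1]
print(matvec(A, x))              # [1, -1]  -> A x = 1 * x
print(matvec(A, y))              # [3, 3]   -> A y = 3 * y
print(x[0]*y[0] + x[1]*y[1])     # 0        -> x and y are orthogonal
```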
Review Problems
Consult the review sheets for the three in-class exams for review problems
on topics 1-16 above.
For topic 17, from Chapter 4, section 6: 1(c,d,e), 4, 5, 8, 9, 10.
Several parts of these questions use two facts that we have not seen
explicitly in this class, but that you may use without proof:
- The transpose of a product of matrices satisfies a "reverse order law"
  like the matrix inverse: (AB)^T = B^T A^T.
- The determinant of a product of square matrices is the product of the
  determinants: det(AB) = det(A) det(B).
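Both facts are easy to sanity-check on small cases before you lean on
them; a minimal sketch with 2x2 matrices of my own choosing:

```python
# Checking (AB)^T = B^T A^T and det(AB) = det(A) det(B) on 2x2 matrices.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

def det2(X):
    return X[0][0] * X[1][1] - X[0][1] * X[1][0]

A = [[1, 2], [3, 4]]
B = [[0, 1], [5, 6]]

# Reverse order law for the transpose:
print(transpose(matmul(A, B)) == matmul(transpose(B), transpose(A)))  # True
# Multiplicativity of the determinant:
print(det2(matmul(A, B)) == det2(A) * det2(B))                        # True
```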
Note: I will treat the problems from Chapter 4, section 6
above as an Extra Credit assignment for the course. Solutions
to some or all of these may be submitted any time up to the time
of the final exam for Extra Credit on the semester Problem Set average.
(If you don't get all of these, it will still be to your advantage
to submit solutions for the ones you can solve.)
Some true-false review problems:
1. The set W = {f in F(R) : f(4) = 3f(1)} is a vector subspace of F(R).
2. If E and F are two sets of vectors in a vector space V, then
   Span(E intersect F) = Span(E) intersect Span(F).
3. If W is a vector subspace of V, then W = Span(W).
4. There exists a linearly independent set of 5 vectors in the vector
   space V = P_3(R).
5. A set of vectors E = {x_1, ..., x_n} is linearly independent if and
   only if dim(Span(E)) = n = |E|.
6. If E = {x_1, ..., x_n} is linearly independent in V, and T : V -> W
   is a linear mapping, then T(E) = {T(x_1), ..., T(x_n)} is linearly
   independent.
7. A linear mapping T : V -> V is said to be an involution if T^2 = I.
   If T is an involution, then T is an isomorphism.
8. If T is an involution (see 7 above) and x is a vector in V, then
   x - T(x) is an eigenvector of T.
9. If E = {x_1, ..., x_n} is linearly independent in V, and
   y_1, ..., y_n are arbitrary vectors in W, then there exists a linear
   mapping T : V -> W such that T(x_i) = y_i for i = 1, ..., n.
10. If T : R^4 -> R^4 has characteristic polynomial
    (3 - lambda)(2 - lambda)^2 (1 - lambda), then T is an isomorphism.
11. Every homogeneous system of 3 linear equations in 4 variables has a
    non-zero solution.
12. If M = (m_ij) is an upper-triangular n x n matrix (that is, if
    m_ij = 0 whenever i > j), then the eigenvalues of M are the diagonal
    entries m_ii, i = 1, ..., n.
13. If E = {x_1, ..., x_n} is an orthonormal basis of R^n, and y, z are
    two vectors in R^n, then
    <y,z> = sum_{i=1,...,n} <y,x_i> <z,x_i>.
14. The vector subspace of M_{2x2}(R) consisting of matrices M
    satisfying Tr(M) = 0 has dimension 2.
15. Let S : V -> W and T : U -> V be linear mappings. Then Ker(ST) is
    equal to Ker(T).
16. If p(lambda) is the characteristic polynomial of M, then
    p(0) = det(M).
Review Session
If there is interest, we can schedule a day or evening review session
during study week.