Mathematics 44, section 1 -- Linear Algebra
Review Sheet for Final Exam
General Information
As announced in the course syllabus, the final exam of the semester will
be given at 2:30 pm on Tuesday, May 11, in our regular classroom, HA 412.
You will have the full 3-hour period to work on this exam.
The exam will be only about twice the length of each of the three in-class exams, though,
so if you work steadily you should be able to complete it in two hours.
The questions will be similar to those from the in-class exams.
Again, one question may (i.e. will!)
consist of several "true-false" questions where you must either
give a short proof (if you think the statement is true) or a counterexample
(if you think the statement is false).
Topics to be Covered
This will be a comprehensive examination, covering all the topics we
have studied since the start of the semester. This includes:
1. The axioms for vector spaces -- showing that a set is or is not
a vector space using the axioms
2. The key examples of vector spaces: R^k,
P_k(R), Fun(S), l^infinity, and their properties
3. Subspaces of a vector space -- know how to show that a subset
of a vector space is or is not a subspace using the definition (see page 35 in
Smith)
4. Linear combinations and the linear span of a set of vectors
5. Linear dependence and independence (know the definitions and how
to show whether a set is linearly independent or not)
6. Bases of a vector space and the dimension of a vector space
7. Linear mappings, kernels, images, etc.
8. The dimension theorem: If T : V -> W
is linear and V is finite-dimensional, then
dim(V) = dim(Ker(T)) + dim(Im(T)),
and its consequences for injectivity, surjectivity, isomorphisms, etc.
9. Matrix representations of a linear mapping
10. Basic theorems about sums and compositions of linear mappings
and the corresponding facts about matrices
11. Change of basis
12. Gaussian and Gauss-Jordan elimination for solving systems
of linear equations and inverting matrices, and their matrix interpretations
via the elementary matrices E_ij(c), E_i(c), P_ij
13. Eigenvalues and eigenvectors
14. Determinants and the characteristic polynomial of a mapping or
matrix. Know how to use the characteristic polynomial to
find all the eigenvalues and eigenvectors of a linear
mapping T : V -> V. I will stick to small dimensions here:
dim(V) <= 5. In any example
you will need to work out, the characteristic polynomial will be set
up to factor "nicely". (A short numerical sketch of the computations in
topics 12-15 follows this list.)
15. Diagonalizability; the characterization of diagonalizable linear
mappings T : V -> V.
16. Inner product spaces, the adjoint of a linear mapping, self-adjoint
mappings, the Spectral Theorem for 2 x 2 and 3 x 3
symmetric matrices (self-adjoint mappings).
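If you like to check your hand computations while reviewing, here is a short numerical
sketch of the calculations behind topics 12-15, written in Python with numpy. This is
purely optional and is not part of the course or the exam; it is only a way to verify
answers you have already worked out by hand.

    import numpy as np

    # Topic 12: solve Ax = b (the computation that Gauss-Jordan elimination
    # carries out by hand).
    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([3.0, 5.0])
    print("solution of Ax = b:", np.linalg.solve(A, b))

    # Topics 13-14: the characteristic polynomial and its roots.
    # np.poly(A) returns the coefficients of det(lambda*I - A); the roots of
    # that polynomial are the eigenvalues of A.
    coeffs = np.poly(A)
    print("characteristic polynomial coefficients:", coeffs)
    print("its roots (the eigenvalues of A):", np.roots(coeffs))

    # Topic 15: A is diagonalizable exactly when there is a basis of
    # eigenvectors; with those eigenvectors as the columns of P, the matrix
    # P^(-1) A P is diagonal, with the eigenvalues on the diagonal.
    eigenvalues, P = np.linalg.eig(A)
    D = np.linalg.inv(P) @ A @ P
    print("P^(-1) A P =")
    print(np.round(D, 10))

On the exam itself these computations are done by hand, of course, and the
characteristic polynomials will factor nicely, as promised above.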
Proofs to Know
- If V is a vector space with some finite spanning set,
then every two bases of V have the same number of elements.
(You may state without proof any facts about solutions of systems
of linear equations that you need here.)
- Proof of the dimension theorem.
- Let T : V -> W and S : W -> U be linear
mappings between finite dimensional vector spaces. Then the
composition ST : V -> U is linear. Moreover, if
E,F,G are bases for V,W,U respectively,
the matrix of T with respect to E,F
is A, and the matrix of S with respect to F,G
is B, then the matrix of ST with respect to E,G
is the matrix product BA.
- The cofactor matrix A_cof satisfies
A A_cof = det(A) I. (A numerical check of this identity follows this list.)
- The proof of the Spectral Theorem for 2 x 2 matrices.
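If you want to check the cofactor identity and the composition formula numerically,
here is an optional numpy sketch. The helper cofactor_matrix below is only an
illustration written for this sheet; it is arranged so that its output matches the
identity A A_cof = det(A) I exactly as stated above.

    import numpy as np

    def cofactor_matrix(A):
        # Entry (i, j) is the (j, i) cofactor of A, arranged so that
        # A @ cofactor_matrix(A) = det(A) * I, as in the identity above.
        n = A.shape[0]
        C = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                minor = np.delete(np.delete(A, j, axis=0), i, axis=1)
                C[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
        return C

    A = np.array([[1.0, 2.0, 0.0],
                  [3.0, 1.0, 4.0],
                  [0.0, 2.0, 1.0]])
    print(np.round(A @ cofactor_matrix(A), 10))    # equals det(A) * I
    print("det(A) =", np.linalg.det(A))

    # The matrix of a composition: if T has matrix A and S has matrix B (with
    # respect to fixed bases), then S(T(x)) = B(Ax) = (BA)x, so ST has matrix BA.
    B = np.array([[2.0, 0.0, 1.0],
                  [1.0, 1.0, 0.0]])
    x = np.array([1.0, -1.0, 2.0])
    print(B @ (A @ x))
    print((B @ A) @ x)                             # the same vector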
Review Problems
Consult the review sheets for the three in-class exams for review problems
on topics 1 - 15 above.
For topic 16, see Smith, Chapter 16: problems 3(d), (e), (f), 11, 12, and 14.
Some true-false review problems (a short note on testing such statements numerically appears after the list):
- The set W = {f in Fun(R) : f(4) = 3f(1)} is a vector
subspace of Fun(R).
- If E and F are two sets of vectors in a vector
space V, then L(E intersect F) = L(E) intersect L(F).
- If W is a vector subspace of V, then W = L(W).
- There exists a linearly independent set of 5 vectors in the vector space
V = P_3(R).
- A set of vectors E = {A_1, ... , A_n}
is linearly independent if and only if dim(L(E)) = n = |E|.
- If E = {A_1, ... , A_n} is linearly
independent in V, and T : V -> W is a linear mapping, then
T(E) = {T(A_1), ... , T(A_n)} is
linearly independent.
- A linear mapping T : V -> V is said to be an involution if
T^2 = I. If T is an involution, then T is
an isomorphism.
- If T is an involution and A is a vector in V,
then A - T(A) is an eigenvector of T. What if
A - T(A) is not the zero vector?
- If E = {A_1, ... , A_n} is linearly
independent in V, and B_1, ... , B_n
are arbitrary vectors in W, then there exists a linear mapping
T : V -> W such that T(A_i) = B_i
for i = 1, ... , n.
- If T : R^4 -> R^4 has
characteristic polynomial (3 - lambda)(2 - lambda)^2(1 - lambda),
then T is an isomorphism.
- Every homogeneous system of 3 linear equations in 4 variables has
a non-zero solution.
- If M = (m_ij) is an upper-triangular
n x n matrix
(that is, if m_ij = 0 whenever i > j), then the eigenvalues
of M are the diagonal entries m_ii, i = 1, ..., n.
- If E = {A_1, ... , A_n} is an orthonormal
basis in an inner product space V, and B, C are
two vectors in V, then
<B, C> = sum_{i=1,...,n} <B, A_i> <C, A_i>.
- The vector subspace of M_2x2(R) consisting of
matrices M satisfying Trace(M) = 0 has dimension 2.
- Let S : V -> W and T : U -> V be linear mappings.
Then Ker(ST) is equal to Ker(T).
- If p(lambda) is the characteristic polynomial of M,
then p(0) = det(M).
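One last suggestion, which is mine and not a course requirement: before deciding
whether one of the statements above is true or false, it often helps to test it on a
few concrete examples. So as not to give any answers away, the optional numpy sketch
below applies this idea to a statement that is not on the list: "every 2 x 2 real
matrix is diagonalizable."

    import numpy as np

    # The standard test case for this claim.
    M = np.array([[1.0, 1.0],
                  [0.0, 1.0]])
    print("eigenvalues:", np.linalg.eigvals(M))            # 1 is the only eigenvalue
    print("rank of (M - I):", np.linalg.matrix_rank(M - np.eye(2)))
    # rank(M - I) = 1, so the eigenspace of the eigenvalue 1 has dimension
    # 2 - 1 = 1.  There is no basis of R^2 consisting of eigenvectors of M,
    # so M is not diagonalizable and the statement is false.

A check like this never replaces a proof, but it is a quick way to decide which of the
two tasks -- a short proof or a counterexample -- you should be attempting.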