Mathematics 44, section 1 -- Linear Algebra

Review Sheet for Final Exam

General Information

As announced in the course syllabus, the final exam will be given at 2:30 pm on Tuesday, May 11, in our regular classroom, HA 412. You will have the full 3-hour period to work on this exam. The exam will be roughly twice the length of the three in-class exams, so if you work steadily, you should be able to complete it in about two hours. The questions will be similar to those from the in-class exams. As before, one question may (i.e. will!) consist of several "true/false" questions, where you must either give a short proof (if you think the statement is true) or a counterexample (if you think the statement is false).

Topics to be Covered

This will be a comprehensive examination, covering all the topics we have studied since the start of the semester. This includes:

  1. The axioms for vector spaces -- showing that a set is or is not a vector space using the axioms
  2. The key examples of vector spaces: R^k, P_k(R), Fun(S), l^infinity, and their properties
  3. Subspaces of a vector space -- know how to show that a subset of a vector space is or is not a subspace using the definition (see page 35 in Smith)
  4. Linear combinations and the linear span of a set of vectors
  5. Linear dependence and independence (know the definitions and how to show whether a set is linearly independent or not)
  6. Bases of a vector space and the dimension of a vector space
  7. Linear mappings, kernels, images, etc.
  8. The dimension theorem: If T : V -> W is linear and V is finite dimensional, then

    dim(V) = dim(Ker(T)) + dim(Im(T))

    and its consequences for injectivity, surjectivity, isomorphisms, etc.

  9. Matrix representations of a linear mapping
  10. Basic theorems about sums and compositions of linear mappings and the corresponding facts about matrices
  11. Change of basis
  12. Gaussian and Gauss-Jordan elimination for solving systems of linear equations and inverting matrices, and their matrix interpretations via the elementary matrices E_ij(c), E_i(c), and P_ij
  13. Eigenvalues and eigenvectors
  14. Determinants and the characteristic polynomial of a mapping or matrix. Know how to use the characteristic polynomial to find all the eigenvalues and eigenvectors of a linear mapping T : V -> V. I will stick to small dimensions here: dim(V) <= 5. In any example you will need to work out, the characteristic polynomial will be set up to factor "nicely".
  15. Diagonalizability; the characterization of diagonalizable linear mappings T : V -> V.
  16. Inner product spaces, the adjoint of a linear mapping, self-adjoint mappings, the Spectral Theorem for 2 x 2 and 3 x 3 symmetric matrices (self-adjoint mappings).
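
As a quick worked illustration of topics 13-16 (my own example, not part of the assigned material), here is a Python sketch that finds the eigenvalues of a 2 x 2 symmetric matrix from its characteristic polynomial; the matrix [[2, 1], [1, 2]] is just an example chosen to factor nicely, in the spirit of the exam problems.

```python
import math

def eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of the 2 x 2 matrix [[a, b], [c, d]].

    The characteristic polynomial is
        det(A - lambda*I) = lambda^2 - (a + d)*lambda + (a*d - b*c),
    so the eigenvalues come from the quadratic formula applied to
    the trace and determinant.
    """
    trace = a + d
    det = a * d - b * c
    disc = trace * trace - 4 * det  # >= 0 whenever the matrix is symmetric (b == c)
    root = math.sqrt(disc)
    return (trace - root) / 2, (trace + root) / 2

# Example: the symmetric matrix [[2, 1], [1, 2]] has
# characteristic polynomial (lambda - 1)(lambda - 3).
lo, hi = eigenvalues_2x2(2, 1, 1, 2)
print(lo, hi)  # 1.0 3.0
```

For a symmetric matrix the discriminant (a - d)^2 + 4b^2 is never negative, which is the 2 x 2 case of the Spectral Theorem's claim that the eigenvalues are real.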

Proofs to Know

  1. If V is a vector space with some finite spanning set, then every two bases of V have the same number of elements. (You may state without proof any facts about solutions of systems of linear equations that you need here.)
  2. Proof of the dimension theorem.
  3. Let T : V -> W and S : W -> U be linear mappings between finite dimensional vector spaces. Then the composition ST : V -> U is linear. Moreover, if E, F, G are bases for V, W, U respectively, the matrix of T with respect to E, F is A, and the matrix of S with respect to F, G is B, then the matrix of ST with respect to E, G is the matrix product BA.
  4. The cofactor matrix A_cof satisfies A A_cof = det(A) I.
  5. The proof of the Spectral Theorem for 2 x 2 matrices.
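
For proof 3, a small numeric sanity check may help you keep the order of the product straight (this is my own illustration, not from Smith): with T and S given by matrices A and B in the standard bases, applying T and then S agrees with applying the single matrix BA.

```python
def mat_mul(B, A):
    """Product of matrices given as lists of rows: (BA)_ij = sum_k B_ik * A_kj."""
    return [[sum(B[i][k] * A[k][j] for k in range(len(A)))
             for j in range(len(A[0]))] for i in range(len(B))]

def apply(M, v):
    """Apply the linear map with matrix M to the column vector v."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# T : R^2 -> R^3 with matrix A, and S : R^3 -> R^2 with matrix B
# (both with respect to the standard bases).
A = [[1, 2], [3, 4], [5, 6]]
B = [[1, 0, 1], [0, 1, 1]]

v = [7, -2]
# S(T(v)) agrees with the single matrix BA applied to v -- not AB.
print(apply(B, apply(A, v)) == apply(mat_mul(B, A), v))  # True
```

Note that the product is BA, with the matrix of the second mapping on the left, matching the order in which the mappings are applied.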

Review Problems

Consult the review sheets for the three in-class exams for review problems on topics 1-15 above.

For topic 16, see the problems in Smith, Chapter 16: 3 d, e, f, 11, 12, and 14.

Some true-false review problems:

  1. The set W = {f in Fun(R) : f(4) = 3f(1)} is a vector subspace of Fun(R).
  2. If E and F are two sets of vectors in a vector space V, then L(E intersect F) = L(E) intersect L(F).
  3. If W is a vector subspace of V, then W = L(W).
  4. There exists a linearly independent set of 5 vectors in the vector space V = P_3(R).
  5. A set of vectors E = {A_1, ..., A_n} is linearly independent if and only if dim(L(E)) = n = |E|.
  6. If E = {A_1, ..., A_n} is linearly independent in V, and T : V -> W is a linear mapping, then T(E) = {T(A_1), ..., T(A_n)} is linearly independent.
  7. A linear mapping T : V -> V is said to be an involution if T^2 = I. If T is an involution, then T is an isomorphism.
  8. If T is an involution and A is a vector in V, then A - T(A) is an eigenvector of T. What if A - T(A) is nonzero?
  9. If E = {A_1, ..., A_n} is linearly independent in V, and B_1, ..., B_n are arbitrary vectors in W, then there exists a linear mapping T : V -> W such that T(A_i) = B_i for i = 1, ..., n.
  10. If T : R^4 -> R^4 has characteristic polynomial (3 - lambda)(2 - lambda)^2(1 - lambda), then T is an isomorphism.
  11. Every homogeneous system of 3 linear equations in 4 variables has a non-zero solution.
  12. If M = (m_ij) is an upper-triangular n x n matrix (that is, if m_ij = 0 whenever i > j), then the eigenvalues of M are the diagonal entries m_ii, i = 1, ..., n.
  13. If E = {A_1, ..., A_n} is an orthonormal basis of an inner product space V, and B, C are two vectors in V, then <B, C> = sum_{i=1,...,n} <B, A_i> <C, A_i>.
  14. The vector subspace of M_{2x2}(R) consisting of matrices M satisfying Trace(M) = 0 has dimension 2.
  15. Let S : V -> W and T : U -> V be linear mappings. Then Ker(ST) is equal to Ker(T).
  16. If p(lambda) is the characteristic polynomial of M, then p(0) = det(M).