Mathematics 244, section 1 -- Linear Algebra

Review Sheet for Final Exam

May 3, 2004

General Information

As announced in the course syllabus, the final exam of the semester will be given at 8:30 am on Monday, May 10, in our regular classroom, SW 302. You will have the full 3-hour period to work on this exam. The exam will be roughly twice the length of each of the three in-class exams, though, so if you are well-prepared and work steadily, you should be able to complete it in two hours without time pressure. The questions will be similar to those from the in-class exams. Again, one question may (i.e. will!) consist of several "true-false" questions where you must either give a short proof (if you think the statement is true) or a counterexample (if you think the statement is false).

Topics to be Covered

This will be a comprehensive examination, covering all the topics we have studied since the start of the semester. This includes:

  1. The axioms for vector spaces -- showing that a set is or is not a vector space using the axioms
  2. The key examples of vector spaces: R^k, P_k(R), various function spaces, and their properties
  3. Subspaces of a vector space -- know how to show that a subset of a vector space is or is not a subspace using the definition.
  4. Linear combinations and the linear span of a set of vectors
  5. Linear dependence and independence (know the definitions and how to show whether a set is linearly independent or not)
  6. Reduction to echelon form as a method for solving systems of linear equations and inverting matrices (see the first sketch after this list).
  7. Bases of a vector space and the dimension of a vector space
  8. Linear mappings, kernels, images, etc.
  9. The dimension theorem: If T : V -> W is linear and V is finite dimensional, then

    dim(V) = dim(Ker(T)) + dim(Im(T))

    and its consequences for injectivity, surjectivity, isomorphisms, etc. (see the second sketch after this list)

  10. Matrix representations of a linear mapping
  11. Basic theorems about sums and compositions of linear mappings and the corresponding facts about matrices
  12. Change of basis
  13. Eigenvalues and eigenvectors
  14. Determinants and the characteristic polynomial of a mapping or matrix. Know how to use the characteristic polynomial to find all the eigenvalues and eigenvectors of a linear mapping T : V -> V (see the third sketch after this list). I will stick to small dimensions here: dim(V) <= 4. In any example you are asked to work out, the characteristic polynomial will be set up to factor "nicely".
  15. Diagonalizability; the characterization of diagonalizable linear mappings T : V -> V.
  16. The dot product on R^n and its properties.
  17. The Spectral Theorem for real symmetric matrices (symmetric mappings) -- both forms: the statement that there exists an orthonormal basis consisting of eigenvectors of the symmetric mapping, and the statement giving the spectral decomposition of A and I in terms of the orthogonal projections onto the eigenspaces (see the fourth sketch after this list).
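
The following four sketches are optional illustrations only: they use Python with NumPy (which is not part of this course and will not appear on the exam), and all of the matrices in them are made-up examples. This first sketch row-reduces a made-up augmented matrix to reduced row echelon form (topic 6), which solves the corresponding system; appending the identity matrix instead of the right-hand side produces the inverse.

    import numpy as np

    def rref(M, tol=1e-12):
        """Reduce M to reduced row echelon form by Gaussian elimination."""
        M = M.astype(float)
        rows, cols = M.shape
        pivot_row = 0
        for col in range(cols):
            if pivot_row >= rows:
                break
            # pick the largest entry in this column as the pivot (partial pivoting)
            p = pivot_row + int(np.argmax(np.abs(M[pivot_row:, col])))
            if abs(M[p, col]) < tol:
                continue                              # no pivot in this column
            M[[pivot_row, p]] = M[[p, pivot_row]]     # swap rows
            M[pivot_row] /= M[pivot_row, col]         # scale the pivot to 1
            for r in range(rows):                     # clear the rest of the column
                if r != pivot_row:
                    M[r] -= M[r, col] * M[pivot_row]
            pivot_row += 1
        return M

    A = np.array([[ 2.,  1., -1.],
                  [-3., -1.,  2.],
                  [-2.,  1.,  2.]])
    b = np.array([[8.], [-11.], [-3.]])

    print(rref(np.hstack([A, b])))          # last column is the solution (2, 3, -1)
    print(rref(np.hstack([A, np.eye(3)])))  # right-hand 3 x 3 block is A inverse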
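
This second sketch is a numerical check of the dimension theorem (topic 9) for the mapping T : R^4 -> R^3 given by a made-up matrix; the singular value decomposition is used here only as a convenient way to read off dim(Im(T)) and a basis of Ker(T).

    import numpy as np

    A = np.array([[1., 2., 0., 1.],
                  [0., 1., 1., 1.],
                  [1., 3., 1., 2.]])     # third row = first row + second row

    U, s, Vt = np.linalg.svd(A)
    rank = int((s > 1e-12).sum())        # dim(Im(T))
    kernel_basis = Vt[rank:]             # the last rows of V^T span Ker(T)
    nullity = kernel_basis.shape[0]      # dim(Ker(T))

    assert rank + nullity == A.shape[1]  # dim(Ker(T)) + dim(Im(T)) = dim(R^4) = 4
    print(rank, nullity)                 # 2 2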
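
This third sketch illustrates topic 14 on a made-up 2 x 2 matrix: the characteristic polynomial is computed, its roots give the eigenvalues, and the answer is compared with a direct eigenvalue computation.

    import numpy as np

    A = np.array([[4., 1.],
                  [2., 3.]])

    # np.poly(A) gives the coefficients of the characteristic polynomial
    # det(lambda*I - A); here [1, -7, 10], i.e. lambda^2 - 7*lambda + 10,
    # which factors as (lambda - 2)(lambda - 5).
    coeffs = np.poly(A)
    print(np.roots(coeffs))     # eigenvalues 5 and 2 (in some order)

    # Eigenvectors span Ker(A - lambda*I); np.linalg.eig returns both at once.
    vals, vecs = np.linalg.eig(A)
    print(vals)                 # the same eigenvalues
    print(vecs)                 # columns are corresponding eigenvectors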
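
This fourth sketch checks both forms of the Spectral Theorem (topic 17) on a made-up 2 x 2 symmetric matrix; for a symmetric matrix, np.linalg.eigh returns an orthonormal basis of eigenvectors as the columns of Q.

    import numpy as np

    A = np.array([[2., 1.],
                  [1., 2.]])             # symmetric, with eigenvalues 1 and 3

    lams, Q = np.linalg.eigh(A)          # columns of Q form an orthonormal eigenbasis

    # First form: an orthonormal eigenbasis diagonalizes A, i.e. A = Q D Q^T.
    assert np.allclose(Q @ np.diag(lams) @ Q.T, A)

    # Second form: with P_i = q_i q_i^T the orthogonal projection onto the
    # eigenspace spanned by the unit eigenvector q_i,
    #     A = sum_i lambda_i P_i   and   I = sum_i P_i.
    projections = [np.outer(Q[:, i], Q[:, i]) for i in range(Q.shape[1])]
    assert np.allclose(sum(lam * P for lam, P in zip(lams, projections)), A)
    assert np.allclose(sum(projections), np.eye(2))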

Proofs to Know

See the review sheets for midterm exams 1 - 3, plus the following additional proof:

  1. If A is a symmetric matrix, and x,y are eigenvectors of A with distinct eigenvalues, then x and y are orthogonal vectors.

Review Problems

Consult the review sheets for the three in-class exams for review problems on topics 1 - 16 above.

For topic 17, from Chapter 4, section 6: problems 1(c,d,e), 4, 5, 8, 9, 10.
Several parts of these questions use 2 facts that we have not seen explicitly in this class, but that you may use without proof:

Note: I will treat the problems from Chapter 4, section 6 above as an Extra Credit assignment for the course. Solutions to some or all of these may be submitted any time up to the time of the final exam for Extra Credit on the semester Problem Set average. (If you don't get all of these, it will still be to your advantage to submit solutions for the ones you can solve.)

Some true-false review problems:

  1. The set W = {f in F(R) : f(4) = 3f(1)} is a vector subspace of F(R).
  2. If E and F are two sets of vectors in a vector space V, then Span(E intersect F) = Span(E) intersect Span(F).
  3. If W is a vector subspace of V, then W = Span(W).
  4. There exists a linearly independent set of 5 vectors in the vector space V = P_3(R).
  5. A set of vectors E = {x_1, ..., x_n} is linearly independent if and only if dim(Span(E)) = n = |E|.
  6. If E = {x_1, ..., x_n} is linearly independent in V, and T : V -> W is a linear mapping, then T(E) = {T(x_1), ..., T(x_n)} is linearly independent.
  7. A linear mapping T : V -> V is said to be an involution if T^2 = I. If T is an involution, then T is an isomorphism.
  8. If T is an involution (see 7 above) and x is a vector in V, then x - T(x) is an eigenvector of T.
  9. If E = {x_1, ..., x_n} is linearly independent in V, and y_1, ..., y_n are arbitrary vectors in W, then there exists a linear mapping T : V -> W such that T(x_i) = y_i for i = 1, ..., n.
  10. If T : R^4 -> R^4 has characteristic polynomial (3 - lambda)(2 - lambda)^2(1 - lambda), then T is an isomorphism.
  11. Every homogeneous system of 3 linear equations in 4 variables has a non-zero solution.
  12. If M = (m_ij) is an upper-triangular n x n matrix (that is, m_ij = 0 whenever i > j), then the eigenvalues of M are the diagonal entries m_ii, i = 1, ..., n.
  13. If E = {x_1, ..., x_n} is an orthonormal basis of R^n, and y, z are two vectors in R^n, then
    <y, z> = sum_{i=1,...,n} <y, x_i> <z, x_i>

  14. The vector subspace of M_{2x2}(R) consisting of matrices M satisfying Tr(M) = 0 has dimension 2.
  15. Let S : V -> W and T : U -> V be linear mappings. Then Ker(ST) is equal to Ker(T).
  16. If p(lambda) is the characteristic polynomial of M, then p(0) = det(M).

Review Session

If there is interest, we can schedule a day or evening review session during study week.