Pool problems in linear algebra

  1. Give an explicit example (with proof) showing that the union of two subspaces (of a given vector space) is not necessarily a subspace.
  2. Suppose $U_1$ and $U_2$ are subspaces of a vector space $V$. Recall that their sum is defined to be the set $U_1+U_2=\{u_1+u_2 \mid u_1\in U_1,\ u_2\in U_2\}$. Prove $U_1+U_2$ is a subspace of $V$ containing $U_1$ and $U_2$.

Suppose $F$ is a field and $A$ is an $n\times n$ matrix over $F$. Suppose further that $A$ possesses distinct eigenvalues $\lambda_1$ and $\lambda_2$ with $\dim\operatorname{Null}(A-\lambda_1 I_n)=n-1$. Prove $A$ is diagonalizable.


Let $\phi\colon V\to W$ be a surjective linear transformation of finite-dimensional linear spaces. Show that there is a $U\subseteq V$ such that $V=(\ker(\phi))\oplus U$ and $\phi|_U\colon U\to W$ is an isomorphism. (Note that $V$ is not assumed to be an inner-product space; also note that $\ker(\phi)$ is sometimes referred to as the null space of $\phi$; finally, $\phi|_U$ denotes the restriction of $\phi$ to $U$.)


Suppose $V$ is a finite-dimensional real vector space and $T\colon V\to V$ is a linear transformation. Prove that $T$ has at most $\dim(\operatorname{range} T)$ distinct nonzero eigenvalues.


Let $a$ and $b$ be real numbers and let $A\in\mathbb{R}^{3\times 3}$ be the matrix with each diagonal entry equal to $a$ and each off-diagonal entry equal to $b$.

  1. Determine all eigenvalues and representative eigenvectors of $A$ together with their algebraic multiplicities. (Hint: $A=(a-b)I+bJ$, where $J$ is the $3\times 3$ matrix each of whose entries equals $1$; a numerical sanity check is sketched after this problem.)
  2. Is $A$ diagonalizable? Justify your answer.
  3. Determine the minimal polynomial of $A$.
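
The sketch referenced in the hint: a short Python check (illustration only, not part of the problem) that builds $A=(a-b)I+bJ$ for one arbitrary choice of $a$ and $b$ and prints its eigenvalues, for comparison with the eigenvalues and multiplicities asked for in part 1.

```python
import numpy as np

# Arbitrary sample values of a and b, chosen only for illustration.
a, b = 5.0, 2.0

# A = (a - b) I + b J, with J the 3x3 all-ones matrix (as in the hint).
A = (a - b) * np.eye(3) + b * np.ones((3, 3))

# A is symmetric, so eigvalsh applies; compare the output (and how often each
# value repeats) with the answer to part 1.
print(np.linalg.eigvalsh(A))
```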

Let $T\colon V\to V$ be a linear transformation on a finite-dimensional vector space. Prove that if $T^2=T$, then

$$V=\ker(T)\oplus\operatorname{im}(T).$$


Let $\mathbb{R}^3$ denote the $3$-dimensional real vector space, and let $v=(a,b,c)$ be a fixed nonzero vector. The maps $C\colon\mathbb{R}^3\to\mathbb{R}^3$ and $D\colon\mathbb{R}^3\to\mathbb{R}^3$ defined by $C(w)=v\times w$ and $D(w)=(v\cdot w)\,v$ are linear transformations.

  1. Determine the eigenvalues of $C$ and $D$.
  2. Determine the eigenspaces of $C$ and $D$ as subspaces of $\mathbb{R}^3$, in terms of $a,b,c$.
  3. Find a matrix for $C$ with respect to the standard basis.

Show all work and explain reasoning.
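
As a sanity check (not part of the problem), the following Python sketch builds the matrices of $C$ and $D$ column by column from their values on the standard basis vectors, for one arbitrary sample vector $v$, and prints their eigenvalues for comparison with part 1.

```python
import numpy as np

v = np.array([1.0, 2.0, 3.0])   # arbitrary sample values for (a, b, c)

# Column i of each matrix holds the image of the standard basis vector e_i.
C = np.column_stack([np.cross(v, e) for e in np.eye(3)])    # C(w) = v x w
D = np.column_stack([np.dot(v, e) * v for e in np.eye(3)])  # D(w) = (v . w) v

print(np.linalg.eigvals(C))   # compare with part 1 (note the complex pair)
print(np.linalg.eigvals(D))   # compare with part 1
```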


Suppose $A$ is a real $n\times n$ matrix that satisfies $A^2v=2Av$ for every $v\in\mathbb{R}^n$.

  1. Show that the only possible eigenvalues of $A$ are $0$ and $2$.
  2. For each $\lambda\in\mathbb{R}$, let $E_\lambda$ denote the $\lambda$-eigenspace of $A$, i.e., $E_\lambda=\{v\in\mathbb{R}^n \mid Av=\lambda v\}$. Prove that $\mathbb{R}^n=E_0\oplus E_2$. (Hint: For every vector $v$ one can write $v=(v-\tfrac{1}{2}Av)+\tfrac{1}{2}Av$.)
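
To see the hint in action numerically, here is a minimal Python sketch (illustration only). It uses one concrete matrix satisfying $A^2=2A$, namely $A=2P$ for a projection $P$, and checks that the two pieces of the suggested decomposition land in $E_0$ and $E_2$.

```python
import numpy as np

# One concrete matrix with A^2 = 2A: twice a projection P (P^2 = P).
P = np.array([[1.0, 1.0],
              [0.0, 0.0]])       # P^2 = P
A = 2 * P                        # so A^2 = 4 P^2 = 4 P = 2 A

v = np.array([3.0, -1.0])        # arbitrary test vector
u0 = v - 0.5 * (A @ v)           # the piece the hint puts in E_0
u2 = 0.5 * (A @ v)               # the piece the hint puts in E_2

print(np.allclose(A @ u0, 0))        # True: u0 lies in E_0
print(np.allclose(A @ u2, 2 * u2))   # True: u2 lies in E_2
print(np.allclose(u0 + u2, v))       # True: the pieces sum back to v
```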

Suppose $T\colon\mathbb{R}^n\to\mathbb{R}^n$ is a linear transformation with distinct eigenvalues $\lambda_1,\lambda_2,\dots,\lambda_m$, and let $v_1,v_2,\dots,v_m$ be corresponding eigenvectors. Prove $v_1,v_2,\dots,v_m$ are linearly independent.


Let $S\colon V\to V$ and $T\colon V\to V$ be linear transformations that commute, i.e. $ST=TS$. Let $v\in V$ be an eigenvector of $S$ such that $T(v)\neq 0$. Prove that $T(v)$ is also an eigenvector of $S$.


Suppose $A$ is a $5\times 5$ matrix and $v_1,v_2,v_3$ are eigenvectors of $A$ with distinct eigenvalues. Prove $\{v_1,v_2,v_3\}$ is a linearly independent set. Hint: Consider a minimal linear dependence relation.


Suppose $V$ is a vector space, and $v_1,v_2,\dots,v_n$ are in $V$. Prove that either $v_1,\dots,v_n$ are linearly independent, or there exists a number $k\le n$ such that $v_k$ is a linear combination of $v_1,\dots,v_{k-1}$.


Let $M_4(\mathbb{R})$ denote the $16$-dimensional real vector space of $4\times 4$ matrices with real entries, in which the vectors are represented as matrices. Let $T\colon M_4(\mathbb{R})\to M_4(\mathbb{R})$ be the linear transformation defined by $T(A)=A-A^{\top}$.

  1. Determine the dimension of $\ker(T)$.
  2. Determine the dimension of $\operatorname{im}(T)$.
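
Assuming the defining formula is $T(A)=A-A^{\top}$ (as written above), the following Python sketch represents $T$ as a $16\times 16$ matrix acting on vectorized $4\times 4$ matrices and reads off its rank and nullity numerically. It is only a sanity check, not a substitute for the dimension count done by hand.

```python
import numpy as np

n = 4
columns = []
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0                          # standard basis matrix e_ij
        columns.append((E - E.T).flatten())    # T(e_ij) = e_ij - e_ji, vectorized
M = np.column_stack(columns)                   # 16x16 matrix representing T

rank = np.linalg.matrix_rank(M)
print("dim im(T)  =", rank)                    # compare with part 2
print("dim ker(T) =", n * n - rank)            # compare with part 1
```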

Let $V$ be a vector space with basis $v_0,\dots,v_n$ and let $a_0,\dots,a_n$ be scalars. Define a linear transformation $T\colon V\to V$ by the rules $T(v_i)=v_{i+1}$ if $i<n$, and $T(v_n)=a_0v_0+a_1v_1+\cdots+a_nv_n$. You don't have to prove this defines a linear transformation. Determine the matrix for $T$ with respect to the basis $v_0,\dots,v_n$, and determine the characteristic polynomial of $T$.
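
For a numerical comparison with the formula obtained by hand, here is a short Python sketch (illustration only) that builds the matrix of $T$ for one arbitrary choice of $n$ and scalars $a_0,\dots,a_n$, using the fact that column $i$ holds the coordinates of $T(v_i)$, and prints the coefficients of its characteristic polynomial.

```python
import numpy as np

# Arbitrary sample data: n = 3, so the basis is v_0, v_1, v_2, v_3.
n = 3
a = np.array([2.0, -1.0, 0.5, 4.0])   # sample scalars a_0, ..., a_n

M = np.zeros((n + 1, n + 1))
for i in range(n):
    M[i + 1, i] = 1.0                 # T(v_i) = v_{i+1} for i < n
M[:, n] = a                           # T(v_n) = a_0 v_0 + ... + a_n v_n

# Coefficients of the (monic) characteristic polynomial, leading term first;
# compare with the general formula in terms of the a_i.
print(np.poly(M))
```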


Suppose $T$ is a linear transformation on a finite-dimensional complex inner-product space $V$. Let $I$ denote the identity transformation on $V$. The numerical range of $T$ is the subset of $\mathbb{C}$ defined by

$$W(T)=\{\langle T(x),x\rangle \mid x\in V,\ \|x\|=1\}.$$

  1. Show that $W(T+cI)=W(T)+c$ for every $c\in\mathbb{C}$.
  2. Show that $W(cT)=cW(T)$ for every $c\in\mathbb{C}$.
  3. Show that the eigenvalues of $T$ are contained in $W(T)$.
  4. Let $B$ be an orthonormal basis for $V$. Show that the diagonal entries of $[T]_B$ are contained in $W(T)$.
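
The following Python sketch (illustration only, with a random complex matrix standing in for $T$ and the standard basis playing the role of $B$) checks parts 3 and 4 numerically: a unit eigenvector $x$ gives $\langle T(x),x\rangle$ equal to its eigenvalue, and the $j$-th diagonal entry of the matrix equals $\langle T(e_j),e_j\rangle$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
T = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Part 3: for a unit eigenvector x, <T(x), x> is the corresponding eigenvalue.
eigvals, eigvecs = np.linalg.eig(T)
for k in range(n):
    x = eigvecs[:, k] / np.linalg.norm(eigvecs[:, k])
    print(np.allclose(np.vdot(x, T @ x), eigvals[k]))   # vdot conjugates its first argument

# Part 4: with B the standard (orthonormal) basis, [T]_B is T itself and its
# diagonal entries are <T(e_j), e_j>.
for j in range(n):
    e = np.eye(n)[:, j]
    print(np.allclose(np.vdot(e, T @ e), T[j, j]))
```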

Let $V$ be a vector space and $T\colon V\to V$ be a linear transformation.

  1. Prove that if $T$ is a projection (i.e., $T^2=T$), then $V$ can be decomposed into the internal direct sum $V=\operatorname{null}(T)\oplus\operatorname{range}(T)$.
  2. Suppose $V$ is an inner product space and $T^*$ is the adjoint of $T$ with respect to the inner product. Show that $\operatorname{null}(T^*)$ is the orthogonal complement of $\operatorname{range}(T)$.
  3. Suppose $V$ is an inner product space and $T$ is an orthogonal projection, i.e., a projection for which the null space and range are orthogonal. Show that $T$ is self-adjoint.

Let $A$ be a real $n\times n$ matrix and let $A^{\top}$ denote its transpose.

  1. Prove that $(Av)\cdot w=v\cdot(A^{\top}w)$ for all vectors $v,w\in\mathbb{R}^n$. Hint: Recall that the dot product $u\cdot v$ equals the matrix product $u^{\top}v$.
  2. Suppose now $A$ is also symmetric, i.e., that $A^{\top}=A$. Also suppose $v$ and $w$ are eigenvectors of $A$ with different eigenvalues. Prove that $v$ and $w$ are orthogonal.
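
A quick numerical check of both parts (illustration only), using a random matrix for part 1 and a random symmetric matrix for part 2:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Part 1: (Av) . w equals v . (A^T w) for arbitrary A, v, w.
A = rng.standard_normal((n, n))
v, w = rng.standard_normal(n), rng.standard_normal(n)
print(np.isclose(np.dot(A @ v, w), np.dot(v, A.T @ w)))

# Part 2: for a symmetric matrix, eigenvectors with different eigenvalues are
# orthogonal.  A generic S = A + A^T has distinct eigenvalues, so all pairwise
# dot products of the computed eigenvectors should vanish.
S = A + A.T
_, eigvecs = np.linalg.eig(S)
G = eigvecs.T @ eigvecs                     # Gram matrix of the eigenvectors
print(np.allclose(G - np.diag(np.diag(G)), 0))
```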

A real $n\times n$ matrix $A$ is called skew-symmetric if $A^{\top}=-A$. Let $V_n$ be the set of all skew-symmetric matrices in $M_n(\mathbb{R})$. Recall that $M_n(\mathbb{R})$ is an $n^2$-dimensional $\mathbb{R}$-vector space with standard basis $\{e_{ij} \mid 1\le i,j\le n\}$, where $e_{ij}$ is the $n\times n$ matrix with a $1$ in the $(i,j)$-position and zeros everywhere else.

  1. Show $V_n$ is a subspace of $M_n(\mathbb{R})$.
  2. Find an ordered basis $B$ for the space $V_3$ of all skew-symmetric $3\times 3$ matrices.
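
As a cross-check on part 2 (illustration only, and not a substitute for exhibiting a basis), the sketch below imposes the skew-symmetry condition $A^{\top}=-A$, equivalently $A+A^{\top}=0$, as a linear system on vectorized $3\times 3$ matrices and computes the dimension of its solution space, which should match the size of the basis found by hand.

```python
import numpy as np

n = 3
columns = []
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0                          # standard basis matrix e_ij
        columns.append((E + E.T).flatten())    # the constraint map A |-> A + A^T
L = np.column_stack(columns)                   # 9x9 matrix of the constraint map

dim_V3 = n * n - np.linalg.matrix_rank(L)      # nullity = dim of V_3
print("dim V_3 =", dim_V3)
```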
