Pool problems in linear algebra
\begin{enumerate}[label=\alph*)]
\item Give an explicit example (with proof) showing that the union of two subspaces (of a given vector space) is not necessarily a subspace.
\item Suppose $U_1$ and $U_2$ are subspaces of a vector space $V$. Recall that their {\bfseries sum} is defined to be the set $U_1+U_2 =\left\{u_1+u_2\,\mid \, u_1\in U_1, u_2\in U_2\right\}$. Prove $U_1+U_2$ is a subspace of $V$ containing $U_1$ and $U_2$.
\end{enumerate}
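({\itshape Example}, for illustration only and not part of the problem: in ${\bf R}^3$, take $U_1=\operatorname{span}\{(1,0,0)\}$ and $U_2=\operatorname{span}\{(0,1,0)\}$. Then
\[
U_1+U_2=\left\{(s,t,0)\,\mid\, s,t\in {\bf R}\right\},
\]
the $xy$-plane.)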
Suppose $F$ is a field and $A$ is an $n\times n$ matrix over $F$. Suppose further that $A$ possesses distinct eigenvalues $\lambda_1$ and $\lambda_2$ with $\dim \operatorname{Null}(A-\lambda_1 I_n)=n-1$. Prove $A$ is diagonalizable.
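({\itshape Example}, for illustration only: the hypotheses are satisfied by
\[
A=\begin{pmatrix}1&1\\0&2\end{pmatrix}\in {\bf R}^{2\times 2},
\]
with $\lambda_1=1$, $\lambda_2=2$, and $\dim\operatorname{Null}(A-I_2)=1=n-1$; here $A$ is indeed diagonalizable, with eigenvectors $(1,0)$ and $(1,1)$.)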
Let $\phi: V\to W$ be a surjective linear transformation of finite-dimensional linear spaces. Show that there is a $U\subseteq V$ such that $V=(\ker(\phi))\oplus U$ and $\phi|_U:U\to W$ is an isomorphism. (Note that $V$ is not assumed to be an inner-product space; also note that $\ker(\phi)$ is sometimes referred to as the {\bfseries null space} of $\phi$; finally, $\phi|_U$ denotes the restriction of $\phi$ to $U$.)
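({\itshape Example}, for illustration only: for the projection $\phi:{\bf R}^3\to {\bf R}^2$, $\phi(x,y,z)=(x,y)$, we have $\ker(\phi)=\operatorname{span}\{(0,0,1)\}$, and both $U=\operatorname{span}\{(1,0,0),(0,1,0)\}$ and $U'=\operatorname{span}\{(1,0,1),(0,1,0)\}$ satisfy the conclusion; in particular, the complement $U$ is not unique.)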
Suppose $V$ is a finite-dimensional real vector space and $T:V\to V$ is a linear transformation. Prove that $T$ has at most $\dim(\operatorname{range} \,T)$ distinct nonzero eigenvalues.
Let $a$ and $b$ be real numbers and let $A\in {\bf R}^{3\times 3}$ be the matrix with each diagonal entry equal to $a$ and each off-diagonal entry equal to $b$.
\begin{enumerate}[label=\alph*)]
\item Determine all eigenvalues and representative eigenvectors of $A$ together with their algebraic multiplicities.
\medskip
\noindent ({\itshape Hint:} $A=(a-b)I+bJ$ where $J$ is the $3\times 3$ matrix each of whose entries equals $1$.)
\item Is $A$ diagonalizable? Justify your answer.
\item Determine the minimal polynomial of $A$.
\end{enumerate}
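For reference, the matrix in question and the decomposition from the hint are
\[
A=\begin{pmatrix} a&b&b\\ b&a&b\\ b&b&a\end{pmatrix}=(a-b)\begin{pmatrix} 1&0&0\\ 0&1&0\\ 0&0&1\end{pmatrix}+b\begin{pmatrix} 1&1&1\\ 1&1&1\\ 1&1&1\end{pmatrix}.
\]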
Let $T:V\to V$ be a linear transformation on a finite-dimensional vector space. Prove that if $T^2=T$, then
\[
V=\ker(T)\oplus \operatorname{im}(T).
\]
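({\itshape Example}, for illustration only: in ${\bf R}^2$, the map $T(x,y)=(x,0)$ satisfies $T^2=T$, with $\ker(T)=\operatorname{span}\{(0,1)\}$ and $\operatorname{im}(T)=\operatorname{span}\{(1,0)\}$, and indeed ${\bf R}^2=\ker(T)\oplus \operatorname{im}(T)$.)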
Let ${\bf R}^3$ denote $3$-dimensional real Euclidean space, and let ${\bf v}=(a,b,c)$ be a fixed nonzero vector. The maps $C:{\bf R}^3\to {\bf R}^3$ and $D:{\bf R}^3\to {\bf R}^3$ defined by $C({\bf w})={\bf v}\times {\bf w}$ and $D({\bf w})=({\bf v}\cdot {\bf w}){\bf v}$ are linear transformations.
\begin{enumerate}[label=\alph*)]
\item Determine the eigenvalues of $C$ and $D$.
\item Determine the eigenspaces of $C$ and $D$ as subspaces of ${\bf R}^3$, in terms of $a, b, c$.
\item Find a matrix for $C$ with respect to the standard basis.
\end{enumerate}
Show all work and explain reasoning.
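({\itshape Sanity check}, a special case only: for ${\bf v}=(0,0,1)$ one has $C({\bf w})=(-w_2,w_1,0)$ and $D({\bf w})=w_3\,(0,0,1)$, so with respect to the standard basis
\[
[C]=\begin{pmatrix} 0&-1&0\\ 1&0&0\\ 0&0&0\end{pmatrix},\qquad [D]=\begin{pmatrix} 0&0&0\\ 0&0&0\\ 0&0&1\end{pmatrix}.
\]
Any general answer should reduce to these matrices in this case.)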
Suppose $A$ is a real $n\times n$ matrix that satisfies $A^2 {\bf v} = 2A{\bf v}$ for every ${\bf v}\in {\bf R}^n$.
\begin{enumerate}[label=\alph*)]
\item Show that the only possible eigenvalues of $A$ are 0 and 2.
\item For each $\lambda\in {\bf R}$, let $E_{\lambda}$ denote the $\lambda$-eigenspace of $A$, i.e., $E_{\lambda} = \{{\bf v}\in {\bf R}^n\mid A{\bf v}=\lambda {\bf v}\}$. Prove that ${\bf R}^n = E_0\oplus E_2$. ({\itshape Hint:} For every vector ${\bf v}$ one can write ${\bf v}=({\bf v}-\frac{1}{2}A{\bf v})+\frac{1}{2}A{\bf v}$.)
\end{enumerate}
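({\itshape Example}, for illustration only: the diagonal matrix $A=\begin{pmatrix}2&0\\0&0\end{pmatrix}$ satisfies $A^2=2A$, and here $E_2=\operatorname{span}\{(1,0)\}$, $E_0=\operatorname{span}\{(0,1)\}$, and ${\bf R}^2=E_0\oplus E_2$; the hint decomposes $(x,y)$ as $(0,y)+(x,0)$.)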
Suppose $T:{\bf R}^n\to {\bf R}^n$ is a linear transformation with distinct eigenvalues $\lambda_1, \lambda_2,\ldots, \lambda_m$, and let ${\bf v}_1,{\bf v}_2,\ldots, {\bf v}_m$ be corresponding eigenvectors. Prove ${\bf v}_1,{\bf v}_2,\ldots, {\bf v}_m$ are linearly independent.
Let $S : V \to V$ and $T : V \to V$ be linear transformations that commute, i.e. $S \circ T = T \circ S$. Let $v \in V$ be an eigenvector of $S$ such that $T(v) \ne 0$. Prove that $T(v)$ is also an eigenvector of $S$.
Suppose $A$ is a $5\times 5$ matrix and $v_1,v_2, v_3$ are eigenvectors of $A$ with distinct eigenvalues. Prove $\{v_1,v_2,v_3\}$ is a linearly independent set. {\itshape Hint:} Consider a minimal linear dependence relation.
Suppose $V$ is a vector space, and ${\bf v}_1, {\bf v}_2, \ldots, {\bf v}_n$ are in $V$. Prove that either ${\bf v}_1, \ldots, {\bf v}_n$ are linearly independent, or there exists a number $k\leq n$ such that ${\bf v}_k$ is a linear combination of ${\bf v}_1,\ldots, {\bf v}_{k-1}$.
Let $M_4({\bf R})$ denote the $16$-dimensional real vector space of $4\times 4$ matrices with real entries (so the ``vectors'' in this space are themselves matrices). Let $T:M_4({\bf R})\to M_4({\bf R})$ be the linear transformation defined by $T(A)=A-A^{\top}$.
\begin{enumerate}[label=\alph*)]
\item Determine the dimension of $\operatorname{ker}(T)$.
\item Determine the dimension of $\operatorname{im}(T)$.
\end{enumerate}
Let $V$ be a vector space with basis ${\bf v}_0,\ldots, {\bf v}_n$ and let $a_0,\ldots, a_n$ be scalars. Define a linear transformation $T:V\to V$ by the rules $T({\bf v}_i)={\bf v}_{i+1}$ if $i<n$, and $T({\bf v}_n)=a_0{\bf v}_0+a_1{\bf v}_1+\cdots +a_n {\bf v}_n$. You don't have to prove this defines a linear transformation. Determine the matrix for $T$ with respect to the basis ${\bf v}_0,\ldots, {\bf v}_n$, and determine the characteristic polynomial of $T$.
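({\itshape Sanity check}, small case only: for $n=2$ the defining rules give $T({\bf v}_0)={\bf v}_1$, $T({\bf v}_1)={\bf v}_2$, and $T({\bf v}_2)=a_0{\bf v}_0+a_1{\bf v}_1+a_2{\bf v}_2$, so the matrix is
\[
\begin{pmatrix} 0&0&a_0\\ 1&0&a_1\\ 0&1&a_2\end{pmatrix}.
\]
A general answer should reduce to this when $n=2$.)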
Suppose $T$ is a linear transformation on a finite-dimensional complex inner-product space $V$. Let $I$ denote the identity transformation on $V$. The {\bfseries numerical range} of $T$ is the subset of ${\bf C}$ defined by
\[
\operatorname{W}(T)=\left\{\langle T(x),x\rangle \,\mid\, x\in V,\ \|x\|=1\right\}.
\]
\medskip
\begin{enumerate}[label=(\alph*)]
\item Show that $\operatorname{W}(T+cI)=\operatorname{W}(T)+c$ for every $c\in {\bf C}$.
\item Show that $\operatorname{W}(cT)=c\operatorname{W}(T)$ for every $c\in {\bf C}$.
\item Show that the eigenvalues of $T$ are contained in $\operatorname{W}(T)$.
\item Let $\mathcal{B}$ be an orthonormal basis for $V$. Show that the diagonal entries of $[T]_{\mathcal{B}}$ are contained in $\operatorname{W}(T)$.
\end{enumerate}
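({\itshape Example}, for illustration only: for $T$ given on ${\bf C}^2$ by the diagonal matrix $\operatorname{diag}(0,1)$, one computes $\langle T(x),x\rangle=|x_2|^2$, so $\operatorname{W}(T)=[0,1]$, which indeed contains both eigenvalues $0$ and $1$.)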
Let $V$ be a vector space and $T:V\to V$ be a linear transformation.
\begin{enumerate}[label=\alph*)]
\item Prove that if $T$ is a projection (i.e., $T^2=T$), then $V$ can be decomposed into the internal direct sum $V=\operatorname{null}(T)\oplus \operatorname{range}(T)$.
\item Suppose $V$ is an inner product space and $T^*$ is the adjoint of $T$ with respect to the inner product. Show that $\operatorname{null}(T^*)$ is the orthogonal complement of $\operatorname{range}(T)$.
\item Suppose $V$ is an inner product space and $T$ is an orthogonal projection, i.e., a projection whose null space and range are orthogonal. Show that $T$ is self-adjoint.
\end{enumerate}
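({\itshape Example}, for illustration only: on ${\bf R}^2$ the map $T(x,y)=(x+y,0)$ is a projection with $\operatorname{null}(T)=\operatorname{span}\{(1,-1)\}$ and $\operatorname{range}(T)=\operatorname{span}\{(1,0)\}$; these are not orthogonal, and correspondingly $T$ is not self-adjoint, consistent with part c).)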
Let $A$ be a real $n\times n$ matrix and let $A^{\top}$ denote its transpose.
\begin{enumerate}[label=\alph*)]
\item Prove that $(A{\bf v})\cdot {\bf w}= {\bf v}\cdot (A^{\top}{\bf w})$ for all vectors ${\bf v},{\bf w}\in {\bf R}^n$. {\itshape Hint:} Recall that the dot product ${\bf u}\cdot {\bf v}$ equals the matrix product ${\bf u}^{\top}{\bf v}$.
\item Suppose now $A$ is also symmetric, i.e., that $A^{\top} = A$. Also suppose ${\bf v}$ and ${\bf w}$ are eigenvectors of $A$ with different eigenvalues. Prove that ${\bf v}$ and ${\bf w}$ are orthogonal.
\end{enumerate}
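({\itshape Example}, for illustration only: for the symmetric matrix $A=\begin{pmatrix}2&1\\1&2\end{pmatrix}$, the eigenvectors $(1,1)$ (eigenvalue $3$) and $(1,-1)$ (eigenvalue $1$) satisfy $(1,1)\cdot (1,-1)=0$, as part b) predicts.)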
A real $n\times n$ matrix $A$ is called {\bfseries skew-symmetric} if $A^{\top}=-A$. Let $V_n$ be the set of all skew-symmetric matrices in $\operatorname{M}_n({\bf R})$. Recall that $\operatorname{M}_n({\bf R})$ is an $n^2$-dimensional ${\bf R}$-vector space with standard basis $\left\{e_{ij}\,|\, 1\leq i,j\leq n\right\}$, where $e_{ij}$ is the $n\times n$ matrix with a 1 in the $(i,j)$-position and zeros everywhere else.
\begin{enumerate}[label=\alph*)]
\item Show $V_n$ is a subspace of $\operatorname{M}_n({\bf R})$.
\item Find an ordered basis $\mathcal{B}$ for the space $V_3$ of all skew-symmetric $3\times 3$ matrices.
\end{enumerate}
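({\itshape Example}, for illustration only: for $n=2$, every skew-symmetric matrix has the form $\begin{pmatrix}0&c\\-c&0\end{pmatrix}=c\,(e_{12}-e_{21})$, so $V_2$ is one-dimensional with ordered basis $\{e_{12}-e_{21}\}$.)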