Homework Assignments

Homework is collected at the beginning of lecture on the specified day. No late homework will be accepted. Please write your name and the homework number on each assignment. If you would like to try writing your homework with LaTeX, here is a template that produces this output.

Homework 12, due Thursday, December 5th (Sections 5.3 and 5.4) Solutions
  • 1. Let $V$ be a finite-dimensional inner product space, let $E\subset V$ be a subspace, and let $P_E$ be the orthogonal projection onto $E$.
    • (a) Determine the spectrum $\sigma(P_E)$.
    • (b) Prove that $P_E$ is diagonalizable.
    • (c) Find a diagonalization of $P_E$.
  • 2. Let $A\in M_{m\times n}$. For $\mathbf{v}\in \mathbb{F}^n$ and $\mathbf{w}\in \mathbb{F}^m$, show that \[ \langle A\mathbf{v},\mathbf{w}\rangle = \langle \mathbf{v} , A^*\mathbf{w} \rangle. \]
  • 3. Let $\mathbf{v}_1,\ldots, \mathbf{v}_n\in \mathbb{F}^n$. Show that the system $\mathbf{v}_1,\ldots, \mathbf{v}_n$ is orthonormal if and only if the matrix \[ A=\left(\begin{array}{ccc} \mathbf{v}_1 & \cdots & \mathbf{v}_n \end{array}\right) \] is unitary.
  • 4. Let $A\in M_{n\times n}$ be a self-adjoint matrix: $A=A^*$.
    • (a) Show that if $\lambda,\mu$ are distinct eigenvalues of $A$, then the eigenspaces $\text{Ker}(A-\lambda I)$ and $\text{Ker}(A-\mu I)$ are orthogonal.
    • (b) Show that there exists a unitary matrix $U\in M_{n\times n}$ so that \[ A= U\left(\begin{array}{ccc} \lambda_1 & & 0 \\ &\ddots &\\ 0 & & \lambda_n \end{array}\right) U^*, \] where $\lambda_1,\ldots, \lambda_n$ are the eigenvalues of $A$ counting multiplicities.
  • 5. Let $C[0,1]$ be the vector space of continuous functions $f\colon [0,1]\to \mathbb{R}$, which we endow with the following inner product: \[ \langle f,g\rangle_2= \int_0^1 f(t)g(t)\ dt \qquad f,g\in C[0,1]. \]
    • (a) Show that the following functions in $C[0,1]$ are linearly independent: \[ f_1(t) =t-1\qquad f_2(t) = \sqrt{t} \qquad f_3(t) = 2. \]
    • (b) Find an orthonormal basis for the subspace $E:=\text{span}\{f_1,f_2,f_3\}$.
  • 6. Find the equation for the line $y=ax+b$ that best fits the following data: \[ (-1,2),\ (0,1),\ (1,4),\ (2,5),\ (3,8). \] Plot the data points and the line. (A numerical sketch of the least-squares method appears after this assignment.)
  • 7. Consider the matrix \[ A=\left(\begin{array}{rr} 3 & 2 \\ 2 & 3 \\ 2 & -2 \end{array}\right). \]
    • (a) Compute the singular values of $A$.
    • (b) Compute the singular values of $A^*$.
    • (c) Compute the absolute value $|A|$ of $A$.
Exercises 1 and 4 were graded and the rest were checked for completion.
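If you would like to check your by-hand work on Exercise 6 numerically, here is a minimal sketch of the normal-equations approach to least squares in Python with numpy (neither is required for this course). The data below is made up for illustration and is not the data from the exercise.

    # Least-squares line y = a*x + b via the normal equations (X^T X)(a, b)^T = X^T y.
    # Illustrative data only, not the data from Exercise 6.
    import numpy as np

    points = [(-1.0, 1.0), (0.0, 0.5), (1.0, 2.0), (2.0, 2.5)]
    X = np.array([[x, 1.0] for x, _ in points])   # each row is (x_i, 1) for the model y = a*x + b
    y = np.array([yi for _, yi in points])

    a, b = np.linalg.solve(X.T @ X, X.T @ y)      # solve the normal equations
    print(f"best-fit line: y = {a:.3f} x + {b:.3f}")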
Homework 11, due Tuesday, November 26th (Sections 5.2 and 5.3) Solutions
  • 1. Let $V$ be an inner product space with an orthonormal basis $\mathcal{B}:=\{\mathbf{v}_1,\ldots, \mathbf{v}_n\}$.
    • (a) Prove that for any $\mathbf{x},\mathbf{y}\in V$ \[ \langle \mathbf{x},\mathbf{y} \rangle = \langle [\mathbf{x}]_\mathcal{B}, [\mathbf{y}]_\mathcal{B} \rangle, \] where the first inner product is from $V$ and the second is from $\mathbb{F}^n$.
    • (b) Prove Parseval's identity: \[ \langle \mathbf{x},\mathbf{y} \rangle = \sum_{j=1}^n \langle \mathbf{x},\mathbf{v}_j \rangle \langle \mathbf{v}_j, \mathbf{y}\rangle \qquad \forall \mathbf{x},\mathbf{y}\in V. \]
  • 2. Let $V$ be an inner product space and let $P_E$ be the orthogonal projection onto a subspace $E\subset V$. (A numerical illustration of the properties below appears after this assignment.)
    • (a) Prove that $P_E(\mathbf{v})=\mathbf{v}$ if and only if $\mathbf{v}\in E$.
    • (b) Prove that $P_E(\mathbf{w})=\mathbf{0}$ if and only if $\mathbf{w}\perp E$.
    • (c) Show that $P_E\circ P_E= P_E$ and $P_E\circ (I-P_E)=O$.
  • 3. Let $V$ be an inner product space and let $S\subset V$ be a subset. Define \[ S^\perp := \{\mathbf{v}\in V\colon \mathbf{v}\perp \mathbf{x}\ \forall \mathbf{x}\in S\}. \]
    • (a) Prove that $S^\perp$ is a subspace (even when $S$ is not).
    • (b) Show that $S\subset (S^\perp)^\perp$.
    • (c) Prove that $S= (S^\perp)^\perp$ if and only if $S$ is a subspace.
    • (d) Prove that $(S^\perp)^\perp = \text{span}\,S$.
Exercises 2 and 3 were graded and the rest were checked for completion.
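For Exercise 2, it can help to see the projection identities numerically before proving them. The sketch below (Python with numpy, purely optional) builds the orthogonal projection onto the column space of a matrix with linearly independent columns via the standard formula $P=A(A^TA)^{-1}A^T$ and checks parts (a)-(c); the matrix and vectors are illustrative choices.

    # Orthogonal projection onto E = column space of A (columns assumed independent).
    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [0.0, 2.0]])              # E is a 2-dimensional subspace of R^3
    P = A @ np.linalg.inv(A.T @ A) @ A.T    # P_E = A (A^T A)^{-1} A^T

    v = A @ np.array([2.0, -1.0])           # a vector in E
    w = np.cross(A[:, 0], A[:, 1])          # a vector orthogonal to E (specific to R^3)

    print(np.allclose(P @ P, P))            # P_E o P_E = P_E
    print(np.allclose(P @ v, v))            # P_E(v) = v for v in E
    print(np.allclose(P @ w, 0))            # P_E(w) = 0 for w perpendicular to E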
Homework 10, due Thursday, November 21st (Section 5.1) Solutions
  • 1. Verify that $\langle A,B \rangle_{2}:=\text{Tr}(B^*A)$ defines an inner product on $M_{m\times n}$.
  • 2. Let $V$ be an inner product space with a generating system $\mathbf{v}_1,\ldots, \mathbf{v}_n$. Prove that for $\mathbf{x}\in V$, $\langle \mathbf{x},\mathbf{v}_k\rangle=0$ for each $k=1,\ldots, n$ if and only if $\mathbf{x}=\mathbf{0}$.
  • 3. Let $V$ be an inner product space. Prove that equality holds in the Cauchy--Schwarz inequality if and only if one of the vectors is a scalar multiple of the other. That is, for $\mathbf{x},\mathbf{y}\in V$, prove that $|\langle \mathbf{x},\mathbf{y}\rangle| = \|\mathbf{x}\| \|\mathbf{y}\|$ if and only if $\mathbf{x}=\alpha \mathbf{y}$ or $\mathbf{y}=\alpha\mathbf{x}$ for some scalar $\alpha$.
  • 4. Let $V$ be a real inner product space and let $\mathbf{x},\mathbf{y} \in V$.
    • (a) For $t\in \mathbb{R}$, define $p(t):=\|\mathbf{x} - t \mathbf{y}\|^2$. Show that $p(t)$ is a quadratic polynomial with real coefficients.
    • (b) Using the quadratic formula, determine when $p(t)$ has at most one real root.
    • (c) Use the previous part to give an alternate proof of the Cauchy--Schwarz inequality.
  • 5. Let $A\in M_{n\times n}$ be diagonalizable with eigenvalues $\lambda_1,\ldots,\lambda_n$ counting multiplicities. Assume that in the diagonalization $A=QDQ^{-1}$, one can choose $Q$ to be a unitary matrix. Prove that \[ \|A\|_p = \left(\sum_{i=1}^n |\lambda_i|^p \right)^{1/p}. \] Use this to determine how $\|A\|_\infty$ should be defined.
  • 6. Let $V$ be a real vector space with norm $\|\cdot\|$. Assume the parallelogram identity holds: \[ \|\mathbf{x} + \mathbf{y}\|^2 + \| \mathbf{x} - \mathbf{y} \|^2 = 2\|\mathbf{x}\|^2 + 2\|\mathbf{y}\|^2 \] for all $\mathbf{x},\mathbf{y}\in V$. Define \[ \langle\mathbf{x}, \mathbf{y}\rangle := \frac14 \|\mathbf{x} + \mathbf{y}\|^2 - \frac14 \| \mathbf{x} - \mathbf{y}\|^2. \] In this exercise you will prove that the above map is an inner product. (A numerical spot check of this formula appears after this assignment.)
    • (a) Show that $\langle \mathbf{x},\mathbf{x} \rangle = \|\mathbf{x}\|^2$ for all $\mathbf{x}\in V$. Use this to prove non-negativity and non-degeneracy.
    • (b) Prove symmetry; that is, show that $\langle \mathbf{y},\mathbf{x} \rangle = \langle \mathbf{x},\mathbf{y}\rangle$ for all $\mathbf{x},\mathbf{y}\in V$.
    • (c) Show that $\langle 2\mathbf{x}, \mathbf{y} \rangle = 2\langle \mathbf{x},\mathbf{y} \rangle$ and $\langle\frac{1}{2}\mathbf{x}, \mathbf{y}\rangle = \frac{1}{2} \langle \mathbf{x}, \mathbf{y} \rangle$ for all $\mathbf{x},\mathbf{y}\in V$.
    • (d) Show that $\langle \mathbf{w} +\mathbf{x}, \mathbf{y} \rangle = \langle \mathbf{w},\mathbf{y} \rangle + \langle \mathbf{x}, \mathbf{y} \rangle$ for all $\mathbf{w},\mathbf{x},\mathbf{y}\in V$.
      [Hint: use $\mathbf{w}+\mathbf{x} \pm \mathbf{y} = \mathbf{w} \pm \frac12 \mathbf{y} + \mathbf{x} \pm \frac12 \mathbf{y}$.]
    • (e) Show that $\langle - \mathbf{x}, \mathbf{y} \rangle = -\langle \mathbf{x}, \mathbf{y} \rangle$ for all $\mathbf{x},\mathbf{y}\in V$.
    • (f) Show that $\langle n \mathbf{x}, \mathbf{y} \rangle = n \langle \mathbf{x}, \mathbf{y} \rangle$ and $\langle \frac1n \mathbf{x}, \mathbf{y} \rangle = \frac1n \langle \mathbf{x}, \mathbf{y} \rangle$ for all nonzero $n\in \mathbb{Z}$ and all $\mathbf{x},\mathbf{y}\in V$.
    • (g) Show that $\langle q \mathbf{x}, \mathbf{y} \rangle = q \langle \mathbf{x}, \mathbf{y} \rangle$ for all $q\in \mathbb{Q}$ and all $\mathbf{x},\mathbf{y}\in V$.
    • (h) Use the previous parts to prove rational linearity; that is, \[ \langle p \mathbf{w} + q\mathbf{x},\mathbf{y}\rangle = p \langle \mathbf{w},\mathbf{y}\rangle + q \langle \mathbf{x},\mathbf{y}\rangle, \] for all $p,q\in \mathbb{Q}$ and all $\mathbf{w},\mathbf{x},\mathbf{y}\in V$.
    The following parts will not be collected as part of Homework 10.
    • (i) Use the parallelogram identity and the triangle inequality to show $|\langle \mathbf{x},\mathbf{y} \rangle| \leq \| \mathbf{x}\| \|\mathbf{y}\|$ for all $\mathbf{x},\mathbf{y}\in V$.
    • (j) Show that \[ | \langle \alpha \mathbf{x},\mathbf{y} \rangle - \alpha \langle \mathbf{x},\mathbf{y}\rangle | \leq 2 |\alpha - q | \|\mathbf{x}\| \|\mathbf{y}\| \] for any $\alpha\in\mathbb{R}$, $q\in \mathbb{Q}$, and $\mathbf{x},\mathbf{y}\in V$.
    • (k) Use the fact that $\mathbb{Q}$ is dense in $\mathbb{R}$ (that is, any real number has a sequence of rational numbers converging to it) to prove $ \langle \alpha \mathbf{x}, \mathbf{y}\rangle = \alpha \langle \mathbf{x},\mathbf{y}\rangle$ for all $\alpha\in \mathbb{R}$ and all $\mathbf{x},\mathbf{y}\in V$.
    • (l) Use the previous parts to prove linearity.
Exercises 1 and 6 were graded and the rest were checked for completion.
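As a sanity check on Exercise 6, note that in $\mathbb{R}^n$ with the Euclidean norm the polarization formula should recover the usual dot product. Here is a quick numerical spot check in Python with numpy (optional; the vectors are random and purely illustrative).

    # Polarization identity check: 1/4 ||x+y||^2 - 1/4 ||x-y||^2 equals the dot product.
    import numpy as np

    rng = np.random.default_rng(0)
    x, y = rng.standard_normal(5), rng.standard_normal(5)

    polar = 0.25 * np.linalg.norm(x + y) ** 2 - 0.25 * np.linalg.norm(x - y) ** 2
    print(np.isclose(polar, x @ y))   # True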
Homework 9, due Thursday, November 7th (Section 4.1) Solutions
  • 1. For each of the following matrices find: (i) the characteristic polynomial; (ii) the spectrum; and (iii) a basis for each eigenspace. (A software spot check for this kind of computation appears after this assignment.)
       (a) $\left(\begin{array}{rr} 2 & -1 \\ -2 & 3\end{array}\right)$    (b) $\left(\begin{array}{rr} 1 & 1 \\ - 1 & 3 \end{array}\right)$    (c) $\left(\begin{array}{rr} 1 & 1 \\ 1 & 0 \end{array}\right)$    (d) $\left(\begin{array}{rrr} 1 & 3 & 3 \\ -3 & -5 & -3 \\ 3 & 3 & 1 \end{array}\right)$.
  • 2. Let $\theta\in [0,2\pi)$. Find the spectrum $\sigma(R_\theta)$ for the rotation matrix \[ R_\theta = \left(\begin{array}{rr} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{array}\right). \] Determine which values of $\theta$ yield only real eigenvalues.
  • 3. Let $A\in M_{n\times n}$ be a nilpotent matrix. Prove that $\sigma(A)=\{0\}$.
  • 4. Fix $m,n\in\mathbb{N}$. Let $A\in M_{m\times m}$, $B\in M_{m\times n}$, and $C\in M_{n\times n}$. Prove that the characteristic polynomial of \[ \left(\begin{array}{cc} A & B \\ \mathbf{0} & C\end{array}\right) \] is the product of the characteristic polynomials of $A$ and $C$.
  • 5. Let $T\colon V\to V$ be a linear transformation on a finite-dimensional vector space. Suppose there is a basis $\mathcal{B}=\{\mathbf{v}_1,\ldots, \mathbf{v}_n\}$ such that for some $k\leq n$, the first $k$ basis vectors $\mathbf{v}_1,\ldots, \mathbf{v}_k$ are all eigenvectors of $T$ with the same eigenvalue $\lambda$. Show that \[ [T]_{\mathcal{B}}^{\mathcal{B}} = \left(\begin{array}{cc} \lambda I_k & B \\ \mathbf{0} & C\end{array}\right) \] for some $B\in M_{k\times(n-k)}$ and $C\in M_{(n-k)\times (n-k)}$.
  • 6. Let $T\colon V\to V$ be a linear transformation on a finite-dimensional vector space. Suppose $\lambda$ is an eigenvalue of $T$. Prove that \[ \dim(\text{Ker}(T-\lambda I_V)) \leq m_{\lambda}(T), \] i.e. the geometric multiplicity is always less than or equal to the algebraic multiplicity.
    [Hint: use the two previous exercises.]
  • 7. Let $A\in M_{n\times n}$ with eigenvalues $\lambda_1,\ldots, \lambda_n$ counting multiplicities. Use the characteristic polynomial to prove each of the following formulas:
    • (a) $\text{Tr}(A)=\lambda_1+\cdots +\lambda_n$.
    • (b) $\det(A) = \lambda_1\cdots \lambda_n$.
Exercises 3 and 6 were graded and the rest were checked for completion.
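If you want to check hand computations like those in Exercises 1 and 7 against software, here is a minimal sketch in Python with numpy (optional). The matrix is an illustrative example, not one of the matrices from Exercise 1.

    # Characteristic polynomial, spectrum, and the trace/determinant formulas of Exercise 7.
    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    eigvals, eigvecs = np.linalg.eig(A)    # spectrum and eigenvectors
    char_poly = np.poly(A)                 # coefficients of det(lambda*I - A): [1, -7, 10]

    print(eigvals)                                        # 5 and 2, in some order
    print(np.isclose(np.trace(A), eigvals.sum()))         # Tr(A) = lambda_1 + ... + lambda_n
    print(np.isclose(np.linalg.det(A), eigvals.prod()))   # det(A) = lambda_1 * ... * lambda_n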
Homework 8, due Thursday, October 31st (Sections 3.3, 3.4, and 3.5) Solutions
  • 1. Consider the following matrices \[ A=\left(\begin{array}{rrr} 0 & 2 & -1 \\ 1 & -1 & 3 \\ 0 & 0 & 4 \end{array}\right),\qquad B=\left(\begin{array}{rrr} 5 & 0 & -2 \\ 1 & 1 & 1 \\ -3 & -1 & 0 \end{array}\right),\qquad C=\left(\begin{array}{rrrr} 1 & 1 & -2 & 1 \\ 0 & 3 & 0 &3 \\ -2 & 1 & 1& 2 \\ 5 & -5 & 0 & -1 \end{array}\right). \]
    • (a) Compute $\det(A)$ and $\det(B)$ using cofactor expansion along any row. (A sketch of cofactor expansion appears after this assignment.)
    • (b) Compute $\det(A)$ and $\det(B)$ using cofactor expansion along any column.
    • (c) Compute $\det(C)$ using cofactor expansion along any row or column.
    • (d) Compute $\det(C)$ using row reduction.
  • 2. Let $A\in M_{n\times m}$.
    • (a) Suppose $n=m$. Prove that $A^TA$ is invertible if and only if $AA^T$ is invertible.
    • (b) Suppose $n\neq m$. Find a counterexample to the previous statement.
  • 3. For $A\in M_{n\times m}$, its adjoint (or conjugate transpose) is the matrix $A^*\in M_{m\times n}$ with entries $(A^*)_{ij} = \overline{(A)_{ji}}$, where $\bar{z}$ is the complex conjugate of $z\in \mathbb{C}$. Prove that $\det(A^*) = \overline{\det(A)}$.
  • 4. In this exercise, you will examine the behavior of the determinant on a variety of special matrices.
    • (a) Suppose $A\in M_{n\times n}$ is invertible. Prove that $\det(A^{-1}) = \frac{1}{\det(A)}$.
    • (b) Suppose $A,B\in M_{n\times n}$ are similar. Prove that $\det(A)=\det(B)$.
    • (c) A matrix $A\in M_{n\times n}$ is called nilpotent if $A^k=O$ for some $k\in \mathbb{N}$. Show that if $A$ is nilpotent, then $\det(A)=0$.
    • (d) A matrix $A\in M_{n\times n}(\mathbb{R})$ is called orthogonal if $A^TA=AA^T=I_n$. Show that if $A$ is orthogonal, then $\det(A)=\pm 1$.
    • (e) A matrix $A\in M_{n\times n}(\mathbb{C})$ is called unitary if $A^*A=AA^*=I_n$. Show that if $A$ is unitary, then $|\det(A)|=1$.
  • 5. Fix $m,n\in \mathbb{N}$.
    • (a) Show that $\det\left(\begin{array}{cc} E & \mathbf{0} \\ \mathbf{0} & I_n\end{array}\right)=\det(E)$ for an elementary matrix $E\in M_{m\times m}$.
    • (b) Show that $\det\left(\begin{array}{cc} I_m & \mathbf{0} \\ \mathbf{0} & E \end{array}\right)=\det(E)$ for an elementary matrix $E\in M_{n\times n}$.
    • (c) Show that $\det\left(\begin{array}{cc} A & B \\ \mathbf{0} & C \end{array}\right) = \det(A)\det(C)$ for $A\in M_{m\times m}$, $B\in M_{m\times n}$, and $C\in M_{n\times n}$.
      [Hint: use a product and the first two parts.]
  • 6. Consider the matrix \[ A=\left(\begin{array}{rrr} 2 & 0 & -2 \\ 0 & -1 & 0 \\ 1 & 1 & 1 \end{array}\right). \]
    • (a) Use the determinant to show that $A$ is invertible.
    • (b) Use Cramer's rule to solve $A\mathbf{x} = (-4,0, 8)^T$.
    • (c) Use the cofactor inversion formula to compute $A^{-1}$.
Exercises 4 and 5 were graded and the rest checked for completion.
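Exercise 1 asks for determinants by cofactor expansion. The sketch below (Python with numpy, optional) implements Laplace expansion along the first row and compares it to numpy's determinant on a small illustrative matrix; it is only meant for checking hand work, since this method takes exponentially many operations.

    # Determinant by cofactor (Laplace) expansion along row 0.
    import numpy as np

    def det_cofactor(M):
        n = M.shape[0]
        if n == 1:
            return M[0, 0]
        total = 0.0
        for j in range(n):
            minor = np.delete(np.delete(M, 0, axis=0), j, axis=1)   # delete row 0 and column j
            total += (-1) ** j * M[0, j] * det_cofactor(minor)
        return total

    M = np.array([[2.0, 0.0, 1.0],
                  [1.0, -1.0, 3.0],
                  [0.0, 2.0, 4.0]])
    print(det_cofactor(M), np.linalg.det(M))   # both are -18 (up to rounding)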
Homework 7, due Thursday, October 24th (Sections 2.8, 3.1, and 3.2) Solutions
  • 1. Consider the two following bases for $\mathbb{P}_2$: \[ \mathcal{S}:=\{1,x,x^2\}\qquad \text{ and } \qquad \mathcal{A}=\{x-1, x^2+x, 2x\}. \]
    • (a) Compute $[I]_{\mathcal{A}}^{\mathcal{S}}$, the change of coordinate matrix from $\mathcal{A}$ to $\mathcal{S}$. (The general recipe is sketched after this assignment.)
    • (b) Compute $[I]_{\mathcal{S}}^{\mathcal{A}}$, the change of coordinate matrix from $\mathcal{S}$ to $\mathcal{A}$.
    • (c) Compute the following coordinate vectors:
      • $[3x^2 - x + 2]_{\mathcal{A}}$
      • $[x^2 +x - 3]_{\mathcal{A}}$
      • $[x^2+x]_{\mathcal{A}}$
    • (d) For $T\colon \mathbb{P}_2 \to \mathbb{P}_2$ defined by $T(p(x)) = p'(x)$, compute the following matrix representations:
      • $[T]_{\mathcal{S}}^{\mathcal{S}}$
      • $[T]_{\mathcal{A}}^{\mathcal{S}}$
      • $[T]_{\mathcal{S}}^{\mathcal{A}}$
      • $[T]_{\mathcal{A}}^{\mathcal{A}}$
  • 2. Define a linear transformation $T\colon \mathbb{R}^2 \to \mathbb{R}^2$ by letting $T(\mathbf{v})$ be the reflection of $\mathbf{v}$ over the line $y=-\frac13 x$. For the standard basis $\mathcal{S}=\{\mathbf{e}_1,\mathbf{e}_2\}$, compute $[T]_{\mathcal{S}}^{\mathcal{S}}$.
  • 3. Show that if $A,B\in M_{n\times n}$ are similar, then $\text{Tr}(A)=\text{Tr}(B)$.
  • 4. Prove whether or not the following matrices are similar \[ A=\left(\begin{array}{rr} 1 & 3 \\ 2 & 2 \end{array}\right)\qquad \qquad B= \left(\begin{array}{rr} 0 & 2 \\ 4 & 2 \end{array}\right). \]
  • 5. Consider the matrix \[ A=\left(\begin{array}{cc} a & b \\ c & d \end{array}\right), \] and let $\mathbf{v}_1,\mathbf{v}_2$ be its column vectors. Prove that the area of the parallelogram determined by $\mathbf{v}_1,\mathbf{v}_2$ is always $|ad - bc|$.
    [Hint: find a rotation matrix $R_\theta$ such that $R_\theta \mathbf{v}_1 = \alpha \mathbf{e}_1$ for some scalar $\alpha$.]
  • 6. For a matrix $A\in M_{n\times n}$, suppose the RREF of $A^T$ is $I_n$. Prove that $\det(A)\neq 0$.
Exercises 1 and 2 were graded and the rest checked for completion.
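The sketch below (Python with numpy, optional) illustrates the general recipe behind Exercise 1 on a pair of bases of $\mathbb{R}^2$ rather than the bases from the exercise: the columns of $[I]_{\mathcal{A}}^{\mathcal{S}}$ are the $\mathcal{S}$-coordinates of the $\mathcal{A}$-basis vectors, and $[I]_{\mathcal{S}}^{\mathcal{A}}$ is its inverse.

    # Change of coordinate matrices for illustrative bases of R^2.
    import numpy as np

    # A-basis vectors written in S-coordinates (S = standard basis of R^2).
    a1 = np.array([1.0, 1.0])
    a2 = np.array([1.0, -2.0])

    I_A_to_S = np.column_stack([a1, a2])   # [I]_A^S
    I_S_to_A = np.linalg.inv(I_A_to_S)     # [I]_S^A

    v = np.array([3.0, 0.0])               # a vector in S-coordinates
    print(I_S_to_A @ v)                    # its A-coordinates: [2., 1.]
    print(I_A_to_S @ (I_S_to_A @ v))       # back to S-coordinates: [3., 0.]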
Homework 6, due Thursday, October 17th (Section 2.7) Solutions
  • 1. For the following matrix, compute its rank and find bases for each of its four fundamental subspaces: \[ A=\left(\begin{array}{rrrrr} 1 & 2 & 3 & 1 & 1 \\ 1 & 4 & 0 & 1 & 2 \\ 0 & 2 & -3 & 0 & 1 \\ 1 & 0 & 0 & 0 & 0 \end{array}\right). \] (A software check for this kind of computation is sketched after this assignment.)
  • 2. Let $S\colon U\to V$ and $T\colon V\to W$ be linear transformations between finite-dimensional vector spaces.
    • (a) Prove that if $V_0\subset V$ is a subspace, then \[ T(V_0) = \{\mathbf{w}\in W\colon \mathbf{w}= T(\mathbf{v}) \text{ for some }\mathbf{v}\in V_0\} \] is a subspace.
    • (b) Prove that $\dim(T(V_0)) \leq \min\{\text{rank}(T), \dim(V_0)\}$.
    • (c) Prove that $\text{rank}(T\circ S) \leq \min\{\text{rank}(T), \text{rank}(S)\}$.
  • 3. Let $V$ be a finite-dimensional vector space and let $X,Y\subset V$ be subspaces. The goal of this exercise is to prove the following formula: \[ \dim(X+Y) = \dim(X)+\dim(Y) - \dim(X\cap Y). \] Here, $X+Y:=\{\mathbf{v}=\mathbf{x}+\mathbf{y}\colon \mathbf{x}\in X,\ \mathbf{y}\in Y\}$.
    • (a) Prove that $X+Y$ is a subspace of $V$.
    • (b) The direct sum of $X$ and $Y$ is the following set \[ X\oplus Y:=\{(x,y)\colon x\in X,\ y\in Y\}. \] This can be made into a vector space with operations of addition \[ (x_1,y_1)+(x_2,y_2)= (x_1+x_2,y_1+y_2), \] and scalar multiplication \[ \alpha(x,y) = (\alpha x, \alpha y). \] You do not need to prove that $X\oplus Y$ is a vector space, but do prove that $\dim(X\oplus Y)=\dim(X)+\dim(Y)$.
    • (c) Consider the transformation $T\colon X\oplus Y\to V$ defined by $T(x,y) = x-y$. Prove that $T$ is linear.
    • (d) Show that $\text{Ran}(T)=X+Y$.
    • (e) Show that $\text{Ker}(T)\cong X\cap Y$.
    • (f) Use the rank-nullity theorem on $T$ to prove the claimed formula.
  • 4. Let $A\in M_{m\times n}$. Prove that $A\mathbf{x}=\mathbf{b}$ is consistent for all $\mathbf{b}\in \mathbb{F}^m$ if and only if $A^T\mathbf{x}=\mathbf{0}$ has a unique solution.
  • 5. Complete the following set of vectors to a basis for $\mathbb{R}^5$: \[ \mathbf{v}_1=\left(\begin{array}{r} 1 \\ 2 \\ -1 \\ 2 \\ 3 \end{array}\right),\qquad \mathbf{v}_2 = \left(\begin{array}{r} 2 \\ 2\\ 1 \\5 \\5 \end{array}\right),\qquad \mathbf{v}_3= \left(\begin{array}{r} -1 \\ -4 \\ 4\\ 7 \\-8 \end{array}\right) \]
Exercises 2 and 3 were graded and the rest were checked for completion.
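If you want to double-check the row reduction in Exercise 1, the sketch below uses sympy, a Python library for exact arithmetic (not required for this course), on an illustrative matrix rather than the $A$ from the exercise.

    # Rank and bases for the four fundamental subspaces of an illustrative matrix.
    from sympy import Matrix

    B = Matrix([[1, 2, 0],
                [2, 4, 1],
                [3, 6, 1]])

    print(B.rank())             # rank
    print(B.columnspace())      # basis for the column space Ran(B)
    print(B.nullspace())        # basis for the null space Ker(B)
    print(B.T.columnspace())    # basis for the row space Ran(B^T)
    print(B.T.nullspace())      # basis for the left null space Ker(B^T)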
Homework 5, due Thursday, October 10th (Sections 2.2 - 2.6) Solutions
  • 1. The reduced row echelon form of a matrix $A$ is \[ \left(\begin{array}{rrrrr} 1 & 0 & 2 & 0 & -2 \\ 0 & 1 & -5 & 0 & -3\\ 0 & 0 & 0 & 1 & 6 \end{array}\right). \] If the first, second, and fourth columns of $A$ are \[ \left(\begin{array}{r} 1 \\ - 1\\ 3 \end{array}\right),\qquad \left(\begin{array}{r} 0 \\ -1 \\ 1\end{array}\right),\qquad \left(\begin{array}{r} 1\\ -2 \\ 0\end{array}\right), \] respectively, find the original matrix $A$.
  • 2. Prove whether or not the system of polynomials \[ p_1(x)=x^3 + 2x,\qquad p_2(x) = x^2+x+1,\qquad p_3(x) = x^3+5, \] generates $\mathbb{P}_3$, the vector space of polynomials with degree at most three.
  • 3. Suppose $A\mathbf{x}=\mathbf{0}$ has a unique solution. Prove that $A$ is left invertible.
  • 4. Compute the inverses of the following matrices (if they exist). Show your work. (A sketch of the augmented-matrix method appears after this assignment.)
    • (a) $\left(\begin{array}{rrr} 1 & 2 & 1 \\ 3 & 7 & 3 \\ 2 & 3 & 4 \end{array}\right)$
    • (b) $\left(\begin{array}{rrr} 1 & -1 & 1 \\ 1 & 1 & -2 \\ 1 & 1 & 4 \end{array}\right)$
    • (c) $\left(\begin{array}{rrr} 1 & 0 & 3 \\ 3 & -1 & 0 \\ 4 & -1 & 3\end{array}\right)$
  • 5. Let $V$ be a finite-dimensional vector space with $\text{dim}(V)=n$. Show that a system of vectors $\mathbf{v}_1,\ldots, \mathbf{v}_n\in V$ is linearly independent if and only if it is generating in $V$.
  • 6. Define a linear transformation $T\colon \mathbb{P}_2\to \mathbb{P}_2$ by $T(p(x)) = p(x) - p'(x)$. Determine whether or not $T$ is invertible. If it is, write down a formula for its inverse. If it is not, provide a reason.
  • 7. Find a $2\times 3$ linear system of equations whose general solution is \[ \left(\begin{array}{r} 1\\ 1\\ 0\end{array}\right) + t\left(\begin{array}{r} 1 \\ 2 \\ 1 \end{array}\right),\qquad t\in \mathbb{R}. \]
Exercises 3 and 5 were graded and the rest were checked for completion.
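For Exercise 4, one way to check an answer is the augmented-matrix method: row reduce $[\,A \mid I\,]$ to $[\,I \mid A^{-1}\,]$. Here is a minimal sketch with sympy (optional) on an illustrative $2\times 2$ matrix rather than the matrices from the exercise.

    # Inverse via row reduction of the augmented matrix [A | I].
    from sympy import Matrix, eye

    A = Matrix([[2, 1],
                [5, 3]])

    augmented = A.row_join(eye(2))    # [A | I]
    rref, _ = augmented.rref()        # row reduce
    A_inv = rref[:, 2:]               # right half is A^{-1} when A is invertible

    print(A_inv)                      # Matrix([[3, -1], [-5, 2]])
    print(A * A_inv == eye(2))        # True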
Homework 4, due Thursday, September 26th (Sections 1.6 and 1.7) Solutions
  • 1. Let $T\colon V\to W$ be an isomorphism, and let $\mathbf{v}_1,\ldots, \mathbf{v}_n\in V$.
    • (a) Prove that if $\mathbf{v}_1,\ldots, \mathbf{v}_n$ is a generating system in $V$, then $T(\mathbf{v}_1),\ldots, T(\mathbf{v}_n)$ is a generating system in $W$.
    • (b) Prove that if $\mathbf{v}_1,\ldots, \mathbf{v}_n$ is a linearly independent system in $V$, then $T(\mathbf{v}_1),\ldots, T(\mathbf{v}_n)$ is a linearly independent system in $W$.
    • (c) Prove that if $\mathbf{v}_1,\ldots, \mathbf{v}_n$ is a basis for $V$, then $T(\mathbf{v}_1),\ldots, T(\mathbf{v}_n)$ is a basis for $W$.
  • 2. Find all right inverses of the matrix $A=(1, 1)\in M_{1\times 2}$. Use this to prove that $A$ is not left invertible.
  • 3. Suppose $A\colon V\to W$ and $B\colon U\to V$ are linear transformations such that $A\circ B$ is invertible.
    • (a) Prove that $A$ is right invertible and $B$ is left invertible.
    • (b) Find an example of such an $A$ and $B$ so that $A$ is not left invertible and $B$ is not right invertible.
  • 4. Let $T\colon V\to W$ be a linear transformation.
    • (a) Show that \[ \text{Null}(T)=\{\mathbf{v}\in V\colon T(\mathbf{v})=\mathbf{0}_W\} \] is a subspace of $V$.
    • (b) Show that \[ \text{Ran}(T)=\{\mathbf{w}\in W\colon \text{there exists $\mathbf{v}\in V$ such that }T(\mathbf{v})=\mathbf{w}\} \] is a subspace of $W$.
    • (c) Prove that $T$ is an isomorphism if and only if $\text{Null}(T)=\{\mathbf{0}_V\}$ (the trivial subspace) and $\text{Ran}(T)=W$.
  • 5. Let $X,Y\subset V$ be subspaces of $V$.
    • (a) Show that $X\cap Y$ is a subspace of $V$.
    • (b) Show that $X\cup Y$ is a subspace of $V$ if and only if either $X\subset Y$ or $Y\subset X$.
  • 6. Recall that $\mathcal{L}(V,W)$ denotes the space of linear transformations from $V$ to $W$. Consider the following subset \[ \mathcal{IL}(V,W):=\{T\in \mathcal{L}(V,W)\colon T\text{ is invertible}\}. \] Show that $\mathcal{IL}(V,W)$ is a subspace if and only if $O\in \mathcal{IL}(V,W)$ where $O\colon V\to W$ is the trivial linear transformation defined by $O(\mathbf{v})=\mathbf{0}_W$ for all $\mathbf{v}\in V$.
    [Hint: for the "if" direction think about what it means for $V$ and $W$ if $O$ is invertible.]
Exercises 1 and 5 were graded and the rest were checked for completion.
Homework 3, due Thursday, September 19th (Sections 1.3 - 1.5) Solutions
  • 1. Compute the following products:
    • (a) $\displaystyle \left(\begin{array}{rrr} 1 & 1 & 2 \\ -3 & 0 & 1 \end{array}\right) \left(\begin{array}{c} 2 \\ 1\\ 3\end{array}\right)$
    • (b) $\displaystyle \left(\begin{array}{rr} -2 & 1 \\ 0 & 1\\ 3 & 2 \end{array}\right) \left(\begin{array}{c} 2 \\ 1\end{array}\right)$
    • (c) $\displaystyle \left(\begin{array}{rrr} 1 & 1 & 2 \\ -3 & 0 & 1 \\ 4 & 0 & 0 \\ 0 & -1 & 5 \end{array}\right) \left(\begin{array}{c} 5 \\ 0 \\ 1\\ 3\end{array}\right)$
  • 2. For each of the following linear transformations $T$, find their matrix representations $[T]$.
    • (a) $T\colon \mathbb{R}^2 \to \mathbb{R}^3$ is the linear transformation defined by $T(x,y)^T = (2x - y, y-3x, 4x)^T$.
    • (b) $T\colon \mathbb{R}^2 \to \mathbb{R}^2$ is the linear transformation that sends a vector $\mathbf{v}\in \mathbb{R}^2$ to its reflection over the line $y=x$.
    • (c) $T\colon \mathbb{R}^3\to\mathbb{R}^3$ projects every vector onto the $x$-$y$ plane.
    • (d) $T\colon \mathbb{R}^3\to\mathbb{R}^3$ reflects every vector through the $x$-$y$ plane.
    • (e) $T\colon \mathbb{R}^3\to\mathbb{R}^3$ rotates the $x$-$y$ plane $\frac{\pi}{6}$ radians counterclockwise, but leaves the $z$-axis fixed.
  • 3. Consider the following matrices \[ A=\left(\begin{array}{rr} 1 & 2 \\ 3 & 1 \end{array}\right),\ B=\left(\begin{array}{rrr} 1 & 0 & 2 \\ 3 & 1 & -2 \end{array}\right),\ C=\left(\begin{array}{rrr} 1 &-2 &3 \\ -2 &1 &-1 \end{array}\right),\ D=\left(\begin{array}{r} -2 \\ 2 \\1 \end{array}\right). \]
    • (a) Determine which of the following products are defined and give the dimension of the result: $AB$, $BA$, $ABC$, $ABD$, $BC$, $BC^T$, $B^TC$, $DC$, and $D^T C^T$.
    • (b) Compute the following matrices: $AB$, $A(3B+C)$, $B^T A$, and $A^TB$.
  • 4. Recall that for an angle $\theta$, the linear transformation $R_\theta\colon\mathbb{R}^2\to\mathbb{R}^2$ that rotates the plane by $\theta$ radians counterclockwise is given by \[ R_\theta = \left(\begin{array}{rr} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta)\end{array}\right). \] Let $\phi$ be another angle. Use the fact that $R_\theta R_\phi =R_{\theta+\phi}$ to derive the well-known trigonometric formulas for $\sin(\theta+\phi)$ and $\cos(\theta+\phi)$. (A numerical spot check of this identity appears after this assignment.)
  • 5. Find linear transformations $A,B\colon \mathbb{R}^2 \to \mathbb{R}^2$ such that $AB=O$ but $BA\neq O$.
  • 6. Let $A\in M_{n\times m}$ be a matrix. Define transformations $S,T\colon M_{m\times n}\to \mathbb{F}$ by $S(B)=\text{Tr}(AB)$ and $T(B)=\text{Tr}(BA)$. Prove that $S$ and $T$ are linear transformations, and that in fact $S=T$. Finally, use this to conclude that $\text{Tr}(BC)=\text{Tr}(CB)$ for any matrices $B\in M_{m\times n}$ and $C\in M_{n\times m}$.
  • 7. The following are True/False. Prove the True statements and provide counterexamples for the False statements.
    • (a) If $T\colon V\to W$ is a linear transformation and $\mathbf{v}_1,\ldots,\mathbf{v}_n$ are linearly independent in $V$, then $T(\mathbf{v}_1),\ldots, T(\mathbf{v}_n)$ are linearly independent in $W$.
    • (b) If $T\colon V\to W$ is a linear transformation and $\mathbf{v}_1,\ldots,\mathbf{v}_n\in V$ are such that $T(\mathbf{v}_1),\ldots, T(\mathbf{v}_n)$ are linearly independent in $W$, then $\mathbf{v}_1,\ldots, \mathbf{v}_n$ are linearly independent in $V$.
    • (c) Given $\mathbf{v}_1,\mathbf{v}_2\in \mathbb{R}^2$ and $\mathbf{w}_1,\mathbf{w}_2\in \mathbb{R}^2$ such that $\mathbf{v}_1\neq \mathbf{v}_2$, there exists a linear transformation $T\colon \mathbb{R}^2\to \mathbb{R}^2$ such that $T(\mathbf{v}_1)=\mathbf{w}_1$ and $T(\mathbf{v}_2)=\mathbf{w}_2$.
Exercises 6 and 7 were graded and the rest were checked for completion.
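The identity $R_\theta R_\phi = R_{\theta+\phi}$ used in Exercise 4 is easy to spot-check numerically before proving anything. A minimal sketch in Python with numpy (optional; the angles are arbitrary choices):

    # Numerical spot check that R_theta R_phi = R_{theta + phi}.
    import numpy as np

    def rotation(theta):
        return np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])

    theta, phi = 0.7, 1.9
    print(np.allclose(rotation(theta) @ rotation(phi), rotation(theta + phi)))   # True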
Homework 2, due Thursday, September 12th (Sections 1.2 and 1.3) Solutions
  • 1. Does the following system of vectors form a basis in $\mathbb{R}^3$? Justify your answer. (One way to check such a claim numerically is sketched after this assignment.) \[ \mathbf{v}_1=\left(\begin{array}{r} 1\\ -2\\ 1\end{array}\right) \qquad \mathbf{v}_2=\left(\begin{array}{r} 0\\4 \\ -1\end{array}\right) \qquad \mathbf{v}_3=\left(\begin{array}{r} 3\\ 2\\ 1\end{array}\right) \]
  • 2. Let $\mathbb{P}_2(\mathbb{R})$ be the vector space of polynomials with real coefficients and degree at most $2$. Show that the vectors \[ p_0(x)=1 \qquad p_1(x)=x \qquad p_2(x)=\frac13(2x^2 -1) \] form a basis in $\mathbb{P}_2(\mathbb{R})$.
  • 3. The following are True/False. Prove the True statements and provide counterexamples for the False statements.
    • (a) Any set containing a zero vector is linearly dependent.
    • (b) A basis must contain $\mathbf{0}$.
    • (c) Subsets of linearly dependent sets are linearly dependent.
    • (d) Subsets of linearly independent sets are linearly independent.
    • (e) If $\alpha_1\mathbf{v}_1+\cdots +\alpha_n \mathbf{v}_n=\mathbf{0}$ for vectors $\mathbf{v}_1,\ldots, \mathbf{v}_n\in V$, then all the scalars $\alpha_1,\ldots,\alpha_n$ are zero.
  • 4. We say a matrix $A$ is symmetric if $A^T=A$. Find a basis for the space of symmetric $2\times 2$ matrices (and prove it is in fact a basis). Make note of the number of elements in your basis.
  • 5. Let $\mathbf{v}_1,\ldots, \mathbf{v}_p\in V$ be a system of vectors that is linearly independent but not generating. Show that one can find a vector $\mathbf{v}_{p+1}\in V$ so that the larger system $\mathbf{v}_1,\ldots, \mathbf{v}_p,\mathbf{v}_{p+1}$ is still linearly independent.
  • 6. Suppose $\mathbf{v}_1,\mathbf{v}_2$ form a basis in a vector space $V$. Define $\mathbf{w}_1:=\mathbf{v}_1+\mathbf{v}_2$ and $\mathbf{w}_2:=\mathbf{v}_1 - \mathbf{v}_2$. Prove that $\mathbf{w}_1,\mathbf{w}_2$ is also a basis in $V$.
  • 7. In each of the following, prove whether or not the given transformation is a linear transformation.
    • (a) $T\colon \mathbb{R}^3 \to \mathbb{R}^2$ defined by $ T\left((x,y,z)^T\right) = (x+3y , -z)^T$.
    • (b) $T\colon \mathbb{R}^3 \to \mathbb{R}$ defined by $ T\left((x,y,z)^T\right) = x+4$.
    • (c) Let $V$ be the space of functions of the form $f\colon \mathbb{R}\to \mathbb{R}$ with the usual addition and scalar multiplication. Define $T\colon V\to V$ by $T(f) = f^2$. That is, \[ [T(f)](x)=f(x)^2 \qquad x\in \mathbb{R}. \]
Exercises 3 and 5 were graded and the rest were checked for completion.
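For Exercise 1, one numerical way to test whether three vectors form a basis of $\mathbb{R}^3$ (using tools that appear later in the course) is to check that the matrix with those vectors as columns is invertible. The sketch below (Python with numpy, optional) runs this test on illustrative vectors, not the ones from the exercise.

    # Basis test: three vectors form a basis of R^3 iff the matrix with them as columns is invertible.
    import numpy as np

    v1, v2, v3 = np.array([1.0, 0.0, 2.0]), np.array([0.0, 1.0, 1.0]), np.array([1.0, 1.0, 0.0])
    M = np.column_stack([v1, v2, v3])

    print(np.linalg.det(M))                 # -3.0, nonzero, so these vectors form a basis
    print(np.linalg.matrix_rank(M) == 3)    # equivalent check via rank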
Homework 1, due Thursday, September 5th (Section 1.1) Solutions
  • 1. For each of the following, decide whether the objects and operations described form a vector space. If they do, show that they satisfy the axioms. If not, show that they fail to satisfy at least one axiom.
    • (a) The set $\mathbb{R}^3$ of 3-dimensional columns of real numbers, with addition defined by \[ \left( \begin{array}{c} v_1 \\ v_2 \\ v_3 \end{array} \right) + \left( \begin{array}{c} w_1 \\ w_2 \\ w_3 \end{array} \right) =\left( \begin{array}{c}4v_1+4w_1 \\ 4v_2+4w_2 \\ 4v_3 +4w_3 \end{array} \right), \] and the usual scalar multiplication.
    • (b) The set of real polynomials of degree exactly $n$: \[ p(x)=a_n x^n+ a_{n-1} x^{n-1}+\cdots + a_1 x + a_0, \] with $a_n\neq 0$, and with addition and scalar multiplication defined the same way we did in class for polynomials of degree at most $n$.
    • (c) The subset of $\mathbb{R}^3$ given by: \[ V=\left\{ \left( \begin{array}{c} x \\ y \\ z \end{array} \right)\in \mathbb{R}^3\ |\ x+2y-z=0\right\}, \] with the usual addition and scalar multiplication.
    • (d) The subset of $\mathbb{R}^3$ given by: \[ V=\left\{ \left( \begin{array}{c} x \\ y \\ z \end{array} \right)\in \mathbb{R}^3\ |\ x+2y-z=3 \right\}, \] with the usual addition and scalar multiplication.
    • (e) The subset of $\mathbb{R}^3$ given by: \[ V=\left\{ \left( \begin{array}{c} x \\ y \\ z \end{array} \right)\in \mathbb{R}^3\ |\ x^6+y^2+z^4=0 \right\}, \] with the usual addition and scalar multiplication.
    • (f) The set of functions \[ V=\{ f:\mathbb{R}\rightarrow \mathbb{R}\ | \ f(5)=0\}, \] with addition given by $(f+g)(x)=f(x)+g(x)$ and scalar multiplication by $(\alpha f)(x)=\alpha(f(x))$.
  • 2. Let $V$ be a general vector space.
    • (a) For $\mathbf{v}\in V$, prove that $\mathbf{v}$ has a unique additive inverse.
    • (b) For $\mathbf{v}\in V$ and a scalar $\alpha$, prove that $(-\alpha)\mathbf{v}$ is the additive inverse of $\alpha\mathbf{v}$.
    • (c) Prove that the additive inverse of $\mathbf{0}$ is $\mathbf{0}$.
  • 3. Let $V$ be a general vector space with zero vector $\mathbf{0}$. Prove that $\alpha\mathbf{0}=\mathbf{0}$ for any scalar $\alpha$.
    [Hint: treat the cases $\alpha=0$ and $\alpha\neq 0$ separately.]
Exercises 1(c), 1(d), 2, and 3 were graded and the rest checked for completion.

Midterm Exams

Midterm 1 is in class on Thursday, October 3rd. This covers Sections 1.1 - 1.7 and 2.1 - 2.3 in the textbook. Solutions.
Midterm 1 Correction Presentations:

  • Choose one question from Midterm 1 and prepare a solution for it or any of its parts.
  • Schedule a time to meet with me and present your solution (no notes allowed).
  • You can earn up to the full credit of the question back.
  • The deadline for presentations is Friday, October 25th.
(If you choose Question 6 or any parts of it, then please prepare a counterexample showing why the original statement is false and a proof that your corrected version is true.)

Midterm 2 is in class on Thursday, November 14th. This covers Sections 2.4 - 2.8, 3.1 - 3.5, and 4.1 - 4.2 in the textbook. Solutions.
Midterm 2 Correction Presentations:

  • Choose one question from Midterm 2 and prepare a solution for it or any of its parts.
  • Schedule a time to meet with me and present your solution (no notes allowed).
  • You can earn up to the full credit of the question back.
  • The deadline for presentations is Tuesday, November 26th.
(If you choose Question 6 or any parts of it, then please prepare a counterexample showing why the original statement is false and a proof that your corrected version is true.)

Final Exam

The final exam is on Wednesday, December 11th, from 10:00am to 12:00pm in A116 Wells Hall. The exam is cumulative and will therefore cover:

  • Sections 1.1 - 1.7
  • Sections 2.1 - 2.8
  • Sections 3.1 - 3.5
  • Sections 4.1 - 4.2
  • Sections 5.1 - 5.4