For real, symmetric matrices $A \in \mathbb{R}^{n \times n}$, the spectral decomposition theorem states that there exist eigenvalues $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$ with corresponding eigenvectors $v_1, \dots, v_n$ such that $\{v_1, \dots, v_n\}$ is an orthonormal basis for $\mathbb{R}^n$. In this post, we will provide a proof of the spectral theorem for real symmetric matrices and continue to develop a variational characterization of the eigenvalues $\lambda_k$. In our exposition, we will freely use the isomorphism between linear maps on $\mathbb{R}^n$ and $n \times n$ real matrices, depending on which representation is more convenient.

**Lemma 1.** Let $A$ be an $n \times n$ matrix. The eigenvalues of $A$ are then precisely the roots of the characteristic polynomial $p_A(\lambda) = \det(\lambda I - A)$.

*Proof.* By definition, $\lambda$ is an eigenvalue of $A$ if and only if there exists a non-zero vector $v$ such that $Av = \lambda v$. Thus, $(\lambda I - A)v = 0$. There is a nonzero $v$ that satisfies this relation if and only if $\det(\lambda I - A) = 0$. Thus, the eigenvalues of $A$ precisely coincide with the roots of the characteristic polynomial $p_A(\lambda) = \det(\lambda I - A)$. For an $n \times n$ matrix, the characteristic polynomial has degree $n$, so $A$ has exactly $n$ eigenvalues (not necessarily distinct) over the complex numbers. $\blacksquare$
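As a quick numerical sanity check (not part of the proof), we can compare the roots of the characteristic polynomial against the eigenvalues `numpy` computes directly. For a $2 \times 2$ matrix the characteristic polynomial is simply $\lambda^2 - \operatorname{tr}(A)\lambda + \det(A)$; the matrix below is an arbitrary choice for illustration.

```python
import numpy as np

# An arbitrary 2x2 symmetric matrix for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# For a 2x2 matrix, det(lambda*I - A) = lambda^2 - tr(A)*lambda + det(A).
char_poly = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.sort(np.roots(char_poly))

# Lemma 1: the roots coincide with the eigenvalues computed directly.
eigs = np.sort(np.linalg.eigvals(A))
print(np.allclose(roots, eigs))  # True
```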

**Lemma 2.** Let $A \in \mathbb{R}^{n \times n}$ be a symmetric matrix. Then, all eigenvalues of $A$ are real.

*Proof.* Let $\lambda$ be an eigenvalue of $A$ with associated eigenvector $v$. Then, we have

$$\bar{v}^\top A v = \bar{v}^\top (\lambda v) = \lambda \, \bar{v}^\top v = \lambda \|v\|^2,$$

where $\bar{v}$ denotes the entrywise complex conjugate of $v$. Next, since $A$ is real and symmetric, and thus Hermitian, we have

$$\bar{v}^\top A v = (A \bar{v})^\top v = \overline{(A v)}^\top v = \overline{\lambda v}^\top v = \bar{\lambda} \, \bar{v}^\top v = \bar{\lambda} \|v\|^2.$$

Thus, $\lambda \|v\|^2 = \bar{\lambda} \|v\|^2$, and since $\|v\|^2 \neq 0$, it follows that $\lambda = \bar{\lambda}$, so $\lambda$ is real. $\blacksquare$
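Lemma 2 is easy to check numerically: `np.linalg.eigvals` makes no symmetry assumption, yet for a symmetric input the eigenvalues it returns have (numerically) zero imaginary part. The random matrix below is an arbitrary example; $B + B^\top$ is symmetric for any square $B$.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random symmetric matrix: B + B^T is symmetric for any square B.
B = rng.standard_normal((5, 5))
A = B + B.T

# Lemma 2: every eigenvalue of a real symmetric matrix is real.
eigs = np.linalg.eigvals(A)
print(np.allclose(eigs.imag, 0.0))  # True
```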

**Lemma 3.** Let $A \in \mathbb{R}^{n \times n}$ be symmetric, and let $\lambda$ be an eigenvalue of $A$ with associated (normalized) eigenvector $v$. Then, $Aw \perp v$ for all $w \perp v$.

*Proof.* We have

$$\langle Aw, v \rangle = (Aw)^\top v = w^\top A^\top v = w^\top A v = \lambda \, w^\top v = 0. \qquad \blacksquare$$
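The invariance in Lemma 3 can also be seen numerically: take an eigenvector $v$ of a symmetric matrix, build any $w \perp v$, and check that $Aw \perp v$ as well. All choices below (seed, dimension) are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

# A random symmetric matrix and one of its (normalized) eigenvectors.
B = rng.standard_normal((4, 4))
A = B + B.T
_, V = np.linalg.eigh(A)
v = V[:, 0]

# Build w orthogonal to v by removing the v-component of a random vector.
w = rng.standard_normal(4)
w = w - (w @ v) * v

# Lemma 3: A maps the orthogonal complement of v into itself.
print(abs((A @ w) @ v) < 1e-10)  # True
```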

**Theorem 4 (Spectral Theorem).** Let $A \in \mathbb{R}^{n \times n}$ be a symmetric matrix. Then, there exist eigenvalues $\lambda_1 \geq \cdots \geq \lambda_n$ of $A$ with corresponding eigenvectors $v_1, \dots, v_n$ such that $\{v_1, \dots, v_n\}$ constitutes an orthonormal basis for $\mathbb{R}^n$.

*Proof.* It will be easier to work with linear transformations. Let $T : \mathbb{R}^n \to \mathbb{R}^n$ be the linear transformation that corresponds to $A$. Take $\lambda_1$ to be an eigenvalue of $T$ (one exists by Lemma 1, and it is real by Lemma 2), and let $v_1$ be the associated (normalized) eigenvector. Let $W_1$ be the orthogonal complement of $\operatorname{span}\{v_1\}$ in $\mathbb{R}^n$. Consider the restricted map $T|_{W_1} : W_1 \to W_1$. By Lemma 3, $T|_{W_1}$ is a well-defined linear operator on $W_1$. Since $T|_{W_1}$ is a restriction of $T$, eigenvalues of $T|_{W_1}$ are also eigenvalues of $T$. Thus, let $\lambda_2$ be an eigenvalue of $T|_{W_1}$ with associated (normalized) eigenvector $v_2$. By construction $v_2 \in W_1$, so $v_2 \perp v_1$. Now, we can repeat this process by defining $W_2$ to be the orthogonal complement of $\operatorname{span}\{v_1, v_2\}$ and considering $T|_{W_2}$. Repeating this process yields $n$ eigenvalues $\lambda_1, \dots, \lambda_n$ with an orthonormal set of eigenvectors $\{v_1, \dots, v_n\}$, which must necessarily constitute a basis for $\mathbb{R}^n$. $\blacksquare$
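The construction in this proof can be sketched numerically. Below, power iteration (with a positive shift so the target eigenvalue dominates in magnitude) plays the role of "pick an eigenvector of the restricted map," and explicit projection enforces the restriction to the orthogonal complement of the eigenvectors found so far. This is an illustration of the argument under arbitrary choices of seed, shift, and iteration count, not a robust eigensolver, and the function name is my own.

```python
import numpy as np

def eigenbasis_by_restriction(A, iters=3000, seed=0):
    """Sketch of Theorem 4's proof: find a normalized eigenvector,
    restrict to the orthogonal complement of those found so far, repeat."""
    n = A.shape[0]
    c = np.abs(A).sum() + 1.0   # crude shift so A + c*I has positive spectrum
    rng = np.random.default_rng(seed)
    basis = []
    for _ in range(n):
        x = rng.standard_normal(n)
        for _ in range(iters):
            # Project onto W, the orthogonal complement of span(basis);
            # by Lemma 3, A (and hence A + c*I) maps W into itself.
            for v in basis:
                x -= (x @ v) * v
            x = (A + c * np.eye(n)) @ x
            x /= np.linalg.norm(x)
        basis.append(x)
    return np.column_stack(basis)

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = B + B.T                     # arbitrary symmetric test matrix
V = eigenbasis_by_restriction(A)
D = V.T @ A @ V
print(np.round(D, 4))           # approximately diagonal: the eigenvalues
```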

**Theorem 5 (Variational Characterization).** Let $A \in \mathbb{R}^{n \times n}$ be a symmetric matrix with eigenvalues $\lambda_1 \geq \cdots \geq \lambda_n$. Then, for all $k \in \{1, \dots, n\}$,

$$\lambda_k = \max_{\substack{S \subseteq \mathbb{R}^n \\ \dim S = k}} \; \min_{\substack{x \in S \\ \|x\| = 1}} x^\top A x.$$

*Proof.* By the spectral theorem, there exists an orthonormal set of eigenvectors $v_1, \dots, v_n$ corresponding to $\lambda_1 \geq \cdots \geq \lambda_n$. Given any $x \in \mathbb{R}^n$, write $x$ as a linear combination of $v_1, \dots, v_n$: $x = \sum_{i=1}^n \alpha_i v_i$. Then,

$$x^\top A x = \left( \sum_{i=1}^n \alpha_i v_i \right)^\top A \left( \sum_{j=1}^n \alpha_j v_j \right) = \sum_{i=1}^n \sum_{j=1}^n \alpha_i \alpha_j \lambda_j \, v_i^\top v_j = \sum_{i=1}^n \lambda_i \alpha_i^2,$$

since $v_i^\top v_j = 0$ for all $i \neq j$ and $v_i^\top v_i = 1$. It is not hard to see that

$$\|x\|^2 = x^\top x = \sum_{i=1}^n \alpha_i^2,$$

with the constraint $\|x\| = 1$ thus equivalent to $\sum_{i=1}^n \alpha_i^2 = 1$. We first show that $\lambda_k$ is upper-bounded by the objective. To see this, take $S = \operatorname{span}\{v_1, \dots, v_k\}$. By construction, for all unit $x \in S$, $x = \sum_{i=1}^k \alpha_i v_i$ with $\sum_{i=1}^k \alpha_i^2 = 1$. Since $\lambda_1 \geq \cdots \geq \lambda_k$, to minimize $x^\top A x = \sum_{i=1}^k \lambda_i \alpha_i^2$ subject to $\sum_{i=1}^k \alpha_i^2 = 1$, we would simply choose $\alpha_k = 1$ and $\alpha_1 = \cdots = \alpha_{k-1} = 0$. In this case, the objective value is precisely $\lambda_k$. Since we are maximizing over all subspaces with dimension $k$, $\lambda_k$ is upper-bounded by the objective. Next, we show that $\lambda_k$ is lower-bounded by the objective. To see this, take any subspace $S$ of dimension $k$. Since $v_k, \dots, v_n$ span a subspace of dimension $n - k + 1$, and $k + (n - k + 1) > n$, there is some unit vector $x \in S \cap \operatorname{span}\{v_k, \dots, v_n\}$. Thus, when we minimize over $x \in S$, there is some unit $x = \sum_{i=k}^n \alpha_i v_i$ such that

$$x^\top A x = \sum_{i=k}^n \lambda_i \alpha_i^2 \leq \lambda_k \sum_{i=k}^n \alpha_i^2 = \lambda_k \|x\|^2 = \lambda_k,$$

and so the objective is upper-bounded by $\lambda_k$. Thus, we conclude that

$$\lambda_k = \max_{\substack{S \subseteq \mathbb{R}^n \\ \dim S = k}} \; \min_{\substack{x \in S \\ \|x\| = 1}} x^\top A x. \qquad \blacksquare$$
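Both directions of the argument can be illustrated numerically. The sketch below uses one additional fact: for a subspace with orthonormal basis given by the columns of $Q$, the minimum of $x^\top A x$ over unit vectors in that subspace equals the smallest eigenvalue of $Q^\top A Q$. The matrix, seed, and choice of $k$ are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
B = rng.standard_normal((5, 5))
A = B + B.T

# eigh returns ascending eigenvalues; flip to lambda_1 >= ... >= lambda_n.
w, V = np.linalg.eigh(A)
lam = w[::-1]
V = V[:, ::-1]

k = 3
# Over S = span{v_1, ..., v_k}, the minimum of x^T A x over unit x is
# lambda_k: this is the subspace achieving the max in Theorem 5.
S = V[:, :k]
restricted_min = np.linalg.eigvalsh(S.T @ A @ S).min()
print(np.isclose(restricted_min, lam[k - 1]))  # True

# For any other k-dimensional subspace, the inner minimum can only be smaller.
Q, _ = np.linalg.qr(rng.standard_normal((5, k)))
other_min = np.linalg.eigvalsh(Q.T @ A @ Q).min()
print(other_min <= lam[k - 1] + 1e-10)  # True
```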