Andrew Thangaraj

Aug-Nov 2020

- Vector space \(V\) over a scalar field \(F= \mathbb{R}\) or \(\mathbb{C}\)
- \(m\times n\) matrix \(A\) represents a linear map \(T:F^n\to F^m\)
- dim null \(T+\) dim range \(T=\) dim \(V\)
- Solution to \(Ax=b\) (if it exists): \(u+\) null\((A)\)
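A minimal numpy sketch of the solution set \(u+\) null\((A)\) for an underdetermined system (the matrix and right-hand side are made-up values for illustration):

```python
import numpy as np

# Hypothetical 2x3 system: any solution of Ax = b is u + (null-space vector)
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
b = np.array([6.0, 15.0])

# One particular solution u (least squares returns one since b is in range(A))
u, *_ = np.linalg.lstsq(A, b, rcond=None)

# Basis of null(A): right-singular vectors for the zero singular values
_, s, Vh = np.linalg.svd(A)
r = np.sum(s > 1e-10)             # numerical rank
null_basis = Vh[r:].T             # columns span null(A)

# Shifting u by any null-space vector gives another solution
x = u + null_basis @ np.array([2.5])
assert np.allclose(A @ x, b)
```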

- Four fundamental subspaces of a matrix
- Column space, row space, null space, left null space

- Eigenvalue \(\lambda\) and eigenvector \(v\): \(Tv=\lambda v\)
- There is a basis w.r.t. which a linear map is upper-triangular
- If there is a basis of eigenvectors, linear map is diagonal w.r.t. it
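A quick numpy check of the last point, using a hypothetical symmetric matrix (which is guaranteed to have a basis of eigenvectors):

```python
import numpy as np

# Made-up symmetric 2x2 matrix for illustration
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

w, P = np.linalg.eigh(A)          # eigenvalues w, eigenvector columns in P

# Changing basis to the eigenvector basis diagonalizes the map
D = P.T @ A @ P
assert np.allclose(D, np.diag(w))

# Each column of P satisfies Av = lambda v
for lam, v in zip(w, P.T):
    assert np.allclose(A @ v, lam * v)
```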

- Inner products, norms, orthogonality and orthonormal basis
- There is an orthonormal basis w.r.t. which a linear map is upper-triangular
- Orthogonal projection: distance from a subspace
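A short numpy sketch of orthogonal projection as the distance-minimizing map, with a made-up subspace and vector:

```python
import numpy as np

# Hypothetical subspace spanned by the columns of B, and a vector v to project
B = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
v = np.array([1.0, 2.0, 0.0])

# Orthonormal basis for span(B) via QR, then project: Pv = Q (Q^T v)
Q, _ = np.linalg.qr(B)
proj = Q @ (Q.T @ v)

# The residual is orthogonal to the subspace; its norm is the distance
residual = v - proj
assert np.allclose(B.T @ residual, 0)
dist = np.linalg.norm(residual)
```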

- Adjoint of a linear map: \(\langle Tv,w\rangle=\langle v,T^*w\rangle\)
- null \(T = (\text{range } T^*)^{\perp}\)
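This identity can be checked numerically; a sketch with a made-up rank-deficient matrix:

```python
import numpy as np

# Hypothetical rank-1 matrix, so null(A) is 2-dimensional
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])

# null(A) from the SVD: right-singular vectors for the zero singular values
_, s, Vh = np.linalg.svd(A)
r = np.sum(s > 1e-10)             # numerical rank
null_A = Vh[r:].T                 # columns span null(A)

# range(A*) is spanned by the columns of A* (conjugate transpose)
range_Astar = A.conj().T

# Every null-space vector is orthogonal to range(A*)
assert np.allclose(null_A.conj().T @ range_Astar, 0)
```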

- Self-adjoint: \(T=T^*\), Normal: \(TT^*=T^*T\)
- Complex/real spectral theorem: \(T\) is normal/self-adjoint \(\leftrightarrow\) orthonormal basis of eigenvectors
- Positive operators: self-adjoint with non-negative eigenvalues
- Isometries: normal with eigenvalues of absolute value \(1\)
- Singular values/vectors of \(T\): square roots of eigenvalues/eigenvectors of \(T^*T\)
- \(A=UDV^*\), diagonal w.r.t. singular vector basis

\(A\): \(m\times n\) matrix

\[A=UDV^*\]

\(U\): \(m\times m\) unitary

- columns are left-singular vectors or eigenvectors of \(AA^*\)

\(V\): \(n\times n\) unitary

- columns are right-singular vectors or eigenvectors of \(A^*A\)

\(D\): \(m\times n\) diagonal

singular values on the diagonal, usually in decreasing order

singular values: square roots of the eigenvalues of \(A^*A\) or \(AA^*\)
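These facts can be verified directly with numpy; a sketch on a made-up complex matrix:

```python
import numpy as np

# Hypothetical 3x2 complex matrix for illustration
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 2)) + 1j * rng.standard_normal((3, 2))

# full_matrices=True gives U (m x m), Vh = V* (n x n); s holds singular values
U, s, Vh = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n "diagonal" D and verify A = U D V*
D = np.zeros((3, 2), dtype=complex)
np.fill_diagonal(D, s)
assert np.allclose(A, U @ D @ Vh)

# Singular values squared are the eigenvalues of A*A
eigvals = np.linalg.eigvalsh(A.conj().T @ A)   # ascending order
assert np.allclose(np.sort(s**2), eigvals)
```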

\(A\): \(n\times n\) matrix representing \(T\)

\[A=UDV^*=(UV^*)(VDV^*)\]

\(UV^*\): unitary, represents an isometry

\(VDV^*\): self-adjoint, represents \(\sqrt{T^*T}\)

*Polar decomposition*

For any operator \(T:V\to V\), there exists an isometry \(S\) such that \(T=S\sqrt{T^*T}\)

- analogous to \(z=e^{i\arg(z)}\lvert z\rvert\) for a complex number \(z\)
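The factorization \(A=(UV^*)(VDV^*)\) above can be computed from the SVD; a numpy sketch with a made-up square matrix:

```python
import numpy as np

# Hypothetical 3x3 complex matrix representing T
rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

U, s, Vh = np.linalg.svd(A)
V = Vh.conj().T

S = U @ Vh                    # UV*: unitary, the isometry
P = V @ np.diag(s) @ Vh       # VDV*: self-adjoint, equals sqrt(A*A)

assert np.allclose(S.conj().T @ S, np.eye(3))   # S is unitary
assert np.allclose(P, P.conj().T)               # P is self-adjoint
assert np.allclose(P @ P, A.conj().T @ A)       # P^2 = A*A
assert np.allclose(A, S @ P)                    # polar decomposition
```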

Transmitter with \(n\) antennas and receiver with \(m\) antennas

Transmit \(x=(x_1,\ldots,x_n)\in\mathbb{C}^n\)

Gain from transmit antenna \(j\) to receive antenna \(i\) is \(h_{ij}\in\mathbb{C}\)

Channel \(H=\begin{bmatrix}h_{11}&\cdots&h_{1n}\\ \vdots&\ddots&\vdots\\ h_{m1}&\cdots&h_{mn}\end{bmatrix}\)

- Receive \(y=Hx+z\), where \(z\) is random noise

Goal: recover \(x\) from \(y\)

*SVD of channel matrix*

\[H=UDV^*\]

Transmit \(Vx\)

\(V\) is unitary, \(\lVert Vx\rVert=\lVert x\rVert\)

\(y=H(Vx)+z=UDV^*Vx+z=UDx+z\), since \(V^*V=I\)

At receiver, compute \(U^*y\)

\(U^*y=Dx+U^*z\)

\(U^*\) is unitary, \(\lVert U^*z\rVert=\lVert z\rVert\)

Divide by the (nonzero) singular values in \(D\) to estimate \(x\)

SVD plays an important role in practical cellular systems!
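A minimal numpy simulation of this scheme; the channel \(H\), symbols \(x\), and noise level are made-up values for illustration:

```python
import numpy as np

# Hypothetical 2x2 channel with well-separated singular values
H = np.array([[1.0 + 1.0j, 0.5 + 0.0j],
              [0.0 + 0.2j, 2.0 - 1.0j]])
x = np.array([1 + 1j, -1 + 0j])         # symbols to send

U, s, Vh = np.linalg.svd(H)
V = Vh.conj().T

# Transmit Vx; the channel adds small random noise z
rng = np.random.default_rng(2)
z = 0.01 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
y = H @ (V @ x) + z

# Receiver: rotate by U* (norm-preserving), then divide by singular values
x_hat = (U.conj().T @ y) / s

# x_hat is close to x: the channel decouples into parallel scalar channels
```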

\(m\) movies, \(n\) users

\(A\): \(m\times n\) matrix with \(a_{ij}=\) rating of movie \(i\) by user \(j\)

*Matrix factorization and latent factors*

\[A\approx QP\]

\(Q\): \(m\times k\), \(P\): \(k\times n\), \(k\ll m,n\)

\(k\) “latent” factors (movie genres, say)

\(Q[i,:]\): \(i\)-th row of \(Q\) or \(i\)-th movie expressed as vector over factors

\(P[:,j]\): \(j\)-th column of \(P\) or \(j\)-th user expressed as vector over factors

\(a_{ij} \approx \langle Q[i,:],P[:,j]\rangle\)

**Goal**: find “best” factorization that can predict missing \(a_{ij}\)’s

*SVD of \(A\)*

\[A=UDV^*\]

can be used for approximations such as

\[A\approx QP\]

\(Q\): \(m\times k\), \(P\): \(k\times n\), \(k\ll m,n\)

Recommender systems use SVD as a starting point to optimize the factorization!
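A numpy sketch of the truncated-SVD factorization \(A\approx QP\), on a made-up rating matrix (real systems would fit the factors to the observed entries only):

```python
import numpy as np

# Hypothetical 6x5 rating matrix with k = 2 latent factors
rng = np.random.default_rng(3)
m, n, k = 6, 5, 2
A = rng.integers(1, 6, size=(m, n)).astype(float)   # ratings 1..5

U, s, Vh = np.linalg.svd(A, full_matrices=False)

# Keep the k largest singular values; fold D into the movie factor Q
Q = U[:, :k] * s[:k]          # m x k: movies over latent factors
P = Vh[:k, :]                 # k x n: users over latent factors

A_k = Q @ P                   # best rank-k approximation of A
err = np.linalg.norm(A - A_k)
```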