# Positive Operators

Aug-Nov 2020

## Recap

• Vector space $V$ over a scalar field $F= \mathbb{R}$ or $\mathbb{C}$
• $m\times n$ matrix $A$ represents a linear map $T:F^n\to F^m$
• dim null $T+$ dim range $T=$ dim $V$
• Solution to $Ax=b$ (if it exists): $u+$ null$(A)$
• Four fundamental subspaces of a matrix
• Column space, row space, null space, left null space
• Eigenvalue $\lambda$ and Eigenvector $v$: $Tv=\lambda v$
• There is a basis w.r.t. which a linear map is upper-triangular
• If there is a basis of eigenvectors, linear map is diagonal w.r.t. it
• Inner products, norms, orthogonality and orthonormal basis
• There is an orthonormal basis w.r.t. which a linear map is upper-triangular
• Orthogonal projection: distance from a subspace
• Adjoint of a linear map: $\langle Tv,w\rangle=\langle v,T^*w\rangle$
• null $T=$ $($range $T^*)^{\perp}$
• Self-adjoint: $T=T^*$, Normal: $TT^*=T^*T$
• Eigenvectors corresponding to different eigenvalues are orthogonal
• Complex spectral theorem: $T$ is normal $\leftrightarrow$ orthonormal basis of eigenvectors
• Real spectral theorem: $T$ is self-adjoint $\leftrightarrow$ orthonormal basis of eigenvectors

## Definition of positive operators

$V$: inner product space over $F= \mathbb{R}$ or $\mathbb{C}$

An operator $T:V\to V$ is said to be positive if $T$ is self-adjoint and $\langle Tv,v\rangle\ge 0$ for all $v\in V$

A necessary condition for positivity:

$\langle Tv,v\rangle\in\mathbb{R}$ for all $v\in V$

$F=\mathbb{C}$: this condition alone implies $T$ is self-adjoint

$F=\mathbb{R}$: this condition does not imply $T$ is self-adjoint

## In terms of matrices

$A$: $n\times n$ matrix with, possibly, complex entries

$A$ is positive if $A=A^H$ and $x^HAx\ge0$ for all $x\in F^n$

Notation: $x^H$ denotes conjugate-transpose

Written as $A\succeq0$

1. $A=\begin{bmatrix}\lambda_1&0\\0&\lambda_2\end{bmatrix}$: $A\succeq0$ if and only if $\lambda_1,\lambda_2\ge0$

$x^HAx=\begin{bmatrix}\overline{x_1}&\overline{x_2}\end{bmatrix}\begin{bmatrix}\lambda_1x_1\\\lambda_2x_2\end{bmatrix}=\lambda_1\lvert x_1\rvert^2+\lambda_2\lvert x_2\rvert^2$

2. $A=\begin{bmatrix}2&i\\-i&2\end{bmatrix}$: $A\succeq0$ (exercise)

$x^HAx=2\lvert x_1\rvert^2+i\overline{x_1}x_2-ix_1\overline{x_2}+2\lvert x_2\rvert^2$

3. $A=\begin{bmatrix}1&0\\0&-1\end{bmatrix}$: not positive
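The three examples above can be checked numerically. A sketch using NumPy: `is_psd` is a helper defined here (not a library function) that tests the definition via the Hermitian eigenvalue solver.

```python
import numpy as np

def is_psd(A, tol=1e-10):
    """Check A = A^H and all eigenvalues >= 0 (up to a tolerance)."""
    if not np.allclose(A, A.conj().T):
        return False
    return bool(np.all(np.linalg.eigvalsh(A) >= -tol))

A1 = np.diag([2.0, 5.0])              # example 1 with lambda1, lambda2 >= 0
A2 = np.array([[2, 1j], [-1j, 2]])    # example 2: eigenvalues 1 and 3
A3 = np.diag([1.0, -1.0])             # example 3: indefinite

print(is_psd(A1), is_psd(A2), is_psd(A3))  # True True False
```

Checking non-negative eigenvalues is equivalent to checking $x^HAx\ge0$ for all $x$, by the characterisation proved later in these notes.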

## Is self-adjoint necessary in real spaces?

In real spaces, there are some interesting examples.

$A=\begin{bmatrix}1&1\\-1&1\end{bmatrix}$

Not symmetric

$x^TAx=x_1^2+x_2^2\ge0$

$x^TAx=x^T\dfrac{A+A^T}{2}x$

So, w.r.t. positivity, it is sufficient to consider symmetric matrices.
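A quick numerical illustration of this reduction (a NumPy sketch): for the matrix above, the antisymmetric part of $A$ contributes nothing to $x^TAx$, so the quadratic form agrees with that of the symmetric part $(A+A^T)/2$, which here is the identity.

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[1.0, 1.0], [-1.0, 1.0]])  # not symmetric
S = (A + A.T) / 2                        # symmetric part; here the identity

# x^T A x = x^T S x for every x: the antisymmetric part cancels
for _ in range(5):
    x = rng.standard_normal(2)
    assert np.isclose(x @ A @ x, x @ S @ x)
print(S)
```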

## Orthogonal Projections and Positivity

$U\subseteq V$ subspace, $P_U$: orthogonal projection onto $U$

$P_U$ is positive.

Proof

Write $v=u+w$ with $u\in U$ and $w\in U^{\perp}$; then $P_Uv=u$

$\langle v,u\rangle=\langle u,u\rangle+\langle w,u\rangle=\langle u,u\rangle\ge0$

So, $\langle P_Uv,v\rangle=\langle P_Uv,P_Uv\rangle\ge0$

Is $P_U$ self-adjoint?

$0=\langle P_Ux,y-P_Uy\rangle=\langle P_Ux,y\rangle - \langle P_Ux,P_Uy\rangle$ (since $P_Ux\in U$ and $y-P_Uy\in U^{\perp}$)

$0=\langle x-P_Ux,P_Uy\rangle=\langle x,P_Uy\rangle - \langle P_Ux,P_Uy\rangle$

So, $\langle P_Ux,y\rangle=\langle x,P_Uy\rangle$, i.e., $P_U=P_U^*$
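The argument above can be verified numerically. A NumPy sketch, using the standard formula $P=M(M^TM)^{-1}M^T$ for the orthogonal projection onto the column space of a full-rank matrix $M$ (the particular $M$ below is an illustrative choice):

```python
import numpy as np

# Orthogonal projection onto U = column space of M (M assumed full rank):
# P = M (M^T M)^{-1} M^T
M = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])  # U is a plane in R^3
P = M @ np.linalg.inv(M.T @ M) @ M.T

print(np.allclose(P, P.T))                      # self-adjoint
print(np.allclose(P @ P, P))                    # idempotent
print(np.all(np.linalg.eigvalsh(P) >= -1e-10))  # eigenvalues are 0 or 1
```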

## Square root of an operator

$R$ is said to be a square root of $T$ if $T=R^2$.

A square root is said to be positive if $R\succeq0$.

Examples

1. $A=\begin{bmatrix}0&0&1\\ 0&0&0\\ 0&0&0\end{bmatrix}$, $B=\begin{bmatrix}0&1&0\\ 0&0&1\\ 0&0&0\end{bmatrix}$, $B^2=A$

2. $P=\begin{bmatrix}0&0\\ \alpha&1\end{bmatrix}$, $P^2=P$ (is $P$ an orthogonal projection?)

3. $A=\begin{bmatrix}4&0\\ 0&9\end{bmatrix}$, $B=\begin{bmatrix}2&0\\ 0&3\end{bmatrix}$, $B$: positive square root of $A$
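The three examples can be checked directly (a NumPy sketch). Note that example 1 gives a square root of a matrix that is not positive, and example 2 gives an idempotent that is not symmetric, hence an oblique rather than orthogonal projection when $\alpha\ne0$.

```python
import numpy as np

# Example 1: B^2 = A for nilpotent A and B
A = np.zeros((3, 3)); A[0, 2] = 1.0
B = np.zeros((3, 3)); B[0, 1] = 1.0; B[1, 2] = 1.0
assert np.allclose(B @ B, A)

# Example 2: P^2 = P, but P is not symmetric for alpha != 0,
# so it is an oblique (not orthogonal) projection
alpha = 3.0
P = np.array([[0.0, 0.0], [alpha, 1.0]])
assert np.allclose(P @ P, P)

# Example 3: B = diag(2, 3) is the positive square root of A = diag(4, 9)
A = np.diag([4.0, 9.0]); B = np.diag([2.0, 3.0])
assert np.allclose(B @ B, A)
print("all square-root identities hold")
```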

$T:V\to W$, $T^*:W\to V$

$TT^*$ and $T^*T$ are positive operators.

Proof

$T^*T$ and $TT^*$ are self-adjoint: $(T^*T)^*=T^*(T^*)^*=T^*T$, and similarly for $TT^*$

$\langle T^*Tv,v\rangle=\langle Tv,Tv\rangle=\lVert Tv\rVert^2\ge0$

$\langle TT^*v,v\rangle=\langle T^*v,T^*v\rangle=\lVert T^*v\rVert^2\ge0$
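A numerical illustration of this fact (a NumPy sketch): for a random complex matrix $T$, both Gram products have real non-negative spectrum.

```python
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))  # T: C^4 -> C^3

G = T.conj().T @ T   # T^* T, a 4x4 operator on the domain
H = T @ T.conj().T   # T T^*, a 3x3 operator on the codomain

print(np.allclose(G, G.conj().T))               # self-adjoint
print(np.all(np.linalg.eigvalsh(G) >= -1e-10))  # non-negative spectrum
print(np.all(np.linalg.eigvalsh(H) >= -1e-10))
```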

## Characterising positive operators

The following are equivalent:

1. $T$ is positive
2. $T$ is self-adjoint with non-negative eigenvalues
3. $T$ has a positive square root
4. $T$ has a self-adjoint square root
5. There is an operator $R$ such that $T=RR^*$

Proof

(1) implies (2)

$\lambda$: eigenvalue of $T$ with eigenvector $v$

$0\le\langle Tv,v\rangle=\langle\lambda v,v\rangle=\lambda\langle v,v\rangle$

Since $\langle v,v\rangle>0$ for an eigenvector $v$, it follows that $\lambda\ge0$

## Proof (continued)

(2) implies (3), (4), (5)

$\{e_1,\ldots,e_n\}$: orthonormal eigenvector basis of $T$ (coordinates in standard basis)

$T=\lambda_1e_1\overline{e^T_1}+\cdots+\lambda_ne_n\overline{e^T_n}$, $\lambda_i\ge0$

Let $R=\sqrt{\lambda_1}e_1\overline{e^T_1}+\cdots+\sqrt{\lambda_n}e_n\overline{e^T_n}$

Verify $T=R^2$, $R=R^*$
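The construction of $R$ can be carried out numerically via the eigendecomposition (a NumPy sketch for the real symmetric case; `eigh` returns the orthonormal eigenvectors $e_i$ as columns of $Q$, so $R=Q\,\mathrm{diag}(\sqrt{\lambda_i})\,Q^T$ is exactly the sum $\sum_i\sqrt{\lambda_i}\,e_ie_i^T$):

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3))
T = M @ M.T                      # a positive (symmetric psd) matrix

lam, Q = np.linalg.eigh(T)       # columns of Q: orthonormal eigenvectors e_i
lam = np.clip(lam, 0.0, None)    # guard against tiny negative round-off

# R = sum_i sqrt(lambda_i) e_i e_i^T, assembled as Q diag(sqrt(lam)) Q^T
R = Q @ np.diag(np.sqrt(lam)) @ Q.T

print(np.allclose(R @ R, T))     # T = R^2
print(np.allclose(R, R.T))       # R = R^*
```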

(5) implies (1)

Any operator of the form $RR^*$ is positive, as shown above.

## Partial ordering of operators

| Condition | Notation | Terminology |
|---|---|---|
| $x^HAx>0$ for all $x\ne0$ | $A\succ0$ | positive definite (pd) |
| $x^HAx\ge0$ for all $x$ | $A\succeq0$ | positive semidefinite (psd) |
| $x^HAx<0$ for all $x\ne0$ | $A\prec0$ | negative definite (nd) |
| $x^HAx\le0$ for all $x$ | $A\preceq0$ | negative semidefinite (nsd) |

$A\succ B$ if $A-B\succ0$
(and similarly for other relations)

Why only a partial ordering? Because there are matrices that are neither positive nor negative semidefinite.

Example: $\begin{bmatrix}1&0\\0&-1\end{bmatrix}$

So, $A$ and $B$ may not be comparable using $\succ$ or $\prec$

Positive: short for positive semidefinite
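This partial order (often called the Loewner order) can be tested numerically by examining the spectrum of the difference. A NumPy sketch; `loewner_compare` is a helper defined here, not a library function.

```python
import numpy as np

def loewner_compare(A, B, tol=1e-10):
    """Compare symmetric A, B in the semidefinite (Loewner) order."""
    lam = np.linalg.eigvalsh(A - B)
    if np.all(lam >= -tol):
        return "A >= B"
    if np.all(lam <= tol):
        return "A <= B"
    return "incomparable"

I = np.eye(2)
D = np.diag([1.0, -1.0])
print(loewner_compare(2 * I, I))   # 2I - I = I is psd, so 2I >= I
print(loewner_compare(D, 0 * I))   # D has eigenvalues 1 and -1: incomparable
```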