Quantum entropies
# Von Neumann entropy
Definition: Von Neumann entropy
The von Neumann entropy of a density matrix $\rho$ is given by $$S(\rho) = -\text{tr}(\rho\ln\rho)$$
With the spectral decomposition $\rho = \sum_i p_i\ket{\phi_i}\bra{\phi_i}$ (eigenvalues $p_i$, orthonormal eigenstates $\ket{\phi_i}$), the von Neumann entropy equals the Shannon entropy of the distribution $p_i$: $S(\rho) = -\sum_i p_i\ln p_i$.
Therefore $S(\rho)$ quantifies the uncertainty about which eigenstate $\ket{\phi_i}$ is realized.
Remarks:
- We have $S(\rho)=0$ for pure states (no uncertainty in the realization) and $S(\rho)>0$ otherwise
- The von Neumann entropy is invariant under unitary transformations of the Hilbert space: $S(U\rho U^\dagger) = S(\rho)$
- For a composite system, $S(\rho)\leq S(\rho^{(1)}) + S(\rho^{(2)})$ (subadditivity), with equality if and only if the system is uncorrelated, $\rho = \rho^{(1)}\otimes\rho^{(2)}$. The composite system carries less uncertainty than its parts combined: tracing out a subsystem discards the information contained in the correlations between the subsystems
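The definition above can be evaluated numerically from the eigenvalues of $\rho$. A minimal sketch using NumPy (the function name is illustrative), checking the pure and maximally mixed qubit cases:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)           # rho is Hermitian
    evals = evals[evals > 1e-12]              # 0 ln 0 = 0 by convention
    return float(-np.sum(evals * np.log(evals)))

pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])                 # |0><0|, a pure state
mixed = np.eye(2) / 2                         # maximally mixed qubit

print(von_neumann_entropy(pure))              # 0 (no uncertainty)
print(von_neumann_entropy(mixed))             # ln 2 ~ 0.693 (maximal uncertainty)
```

Diagonalizing first reduces the matrix expression to the Shannon entropy of the eigenvalue distribution, exactly as in the spectral-decomposition remark.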
# Relative entropy
Definition: Relative entropy
For two density matrices $\rho$ and $\sigma$ the relative entropy is defined as $$S(\rho||\sigma) = \text{tr}(\rho \ln \rho) - \text{tr}(\rho \ln \sigma)$$
Remarks:
- Change of von Neumann entropy when tracing: $S(\rho||\rho^{(1)}\otimes\rho^{(2)})= S(\rho^{(1)}) + S(\rho^{(2)}) - S(\rho)$, i.e. the mutual information between the subsystems
- Klein inequality: $S(\rho||\sigma) \geq 0$, with equality if and only if $\rho = \sigma$
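Both remarks can be verified numerically. A sketch assuming a correlated two-qubit state (a Bell state mixed with white noise, chosen so that all matrices are full rank; all function names are illustrative):

```python
import numpy as np

def entropy(rho):
    """Von Neumann entropy from the eigenvalues of rho."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log(p)))

def matrix_log(a):
    """Logarithm of a Hermitian positive-definite matrix via eigendecomposition."""
    evals, vecs = np.linalg.eigh(a)
    return vecs @ np.diag(np.log(evals)) @ vecs.conj().T

def relative_entropy(rho, sigma):
    """S(rho||sigma) = tr(rho ln rho) - tr(rho ln sigma); needs full-rank states."""
    return float(np.real(np.trace(rho @ (matrix_log(rho) - matrix_log(sigma)))))

# Correlated two-qubit state: Bell state mixed with the identity (full rank)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)              # (|00> + |11>)/sqrt(2)
rho = 0.5 * np.outer(phi, phi) + 0.5 * np.eye(4) / 4

# Reduced states via partial trace
r = rho.reshape(2, 2, 2, 2)
rho1 = np.trace(r, axis1=1, axis2=3)          # trace out subsystem 2
rho2 = np.trace(r, axis1=0, axis2=2)          # trace out subsystem 1

lhs = relative_entropy(rho, np.kron(rho1, rho2))
rhs = entropy(rho1) + entropy(rho2) - entropy(rho)
print(lhs, rhs)   # equal, and non-negative as the Klein inequality demands
```

The agreement of `lhs` and `rhs` is exactly the tracing identity above, and its non-negativity is the subadditivity of the von Neumann entropy seen through the Klein inequality.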
# Linear entropy
Definition: Linear entropy
The linear entropy of a density matrix $\rho$ is given by $$S_l(\rho) = \text{tr}(\rho - \rho^2) = 1 - \text{tr}(\rho^2)$$
Remarks:
- $0\leq S_l(\rho)\leq 1-\frac{1}{D}$ with $D=\dim\mathcal{H}$; the lower bound is attained by pure states, the upper bound by the maximally mixed state $\rho = \mathbb{1}/D$
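The two bounds can be checked directly, since $S_l$ needs no diagonalization, only a matrix product. A minimal sketch (function name illustrative), assuming $D = 4$:

```python
import numpy as np

def linear_entropy(rho):
    """S_l(rho) = tr(rho - rho^2) = 1 - tr(rho^2)."""
    return float(np.real(1.0 - np.trace(rho @ rho)))

D = 4
pure = np.zeros((D, D))
pure[0, 0] = 1.0                  # pure state: S_l = 0
mixed = np.eye(D) / D             # maximally mixed: S_l = 1 - 1/D

print(linear_entropy(pure))       # 0.0
print(linear_entropy(mixed))      # 0.75
```

Avoiding the matrix logarithm is what makes the linear entropy a cheap proxy for the von Neumann entropy in numerics.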