There are two main types of quantum information: discrete (digital) and analogue (continuous-variable).

In this talk we will focus on analogue quantum information, and in particular on a special class of states in this setting called Gaussian states. Gaussian states occur naturally as ground states or thermal equilibrium states in a “small-oscillations” limit. This is reminiscent of the Central Limit Theorem in probability, which states the following:

The normalized average of a sequence of independent and identically distributed random variables with finite variances converges in distribution to the standard normal distribution $N(0,1)$.
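To make the statement concrete, here is a small simulation sketch using only the Python standard library (the choice of Uniform$(0,1)$ summands, the sample sizes, and the checkpoints are illustrative choices, not part of the talk): normalized averages of iid uniform variables are compared against the standard normal CDF.

```python
import math
import random

random.seed(0)

n = 500          # summands per normalized average
samples = 2000   # number of normalized averages drawn
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)  # mean and std of Uniform(0, 1)

def normalized_average(n):
    """sqrt(n) * (sample mean - mu) / sigma; by the CLT this is approximately N(0, 1)."""
    s = sum(random.random() for _ in range(n))
    return math.sqrt(n) * (s / n - mu) / sigma

zs = [normalized_average(n) for _ in range(samples)]

# Empirical CDF of the normalized averages at 0 and 1; the CLT predicts
# values close to Phi(0) = 0.5 and Phi(1) ~ 0.8413.
ecdf_0 = sum(z <= 0.0 for z in zs) / samples
ecdf_1 = sum(z <= 1.0 for z in zs) / samples
```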

In fact, there exists a quantum analogue of the Central Limit Theorem, which shows that Gaussian states arise as limits in an analogous way.

First, recall the classical Gaussian function, i.e. the density $\varphi_{\mu, \sigma^2}(x)= \frac{1}{\sigma \sqrt{2\pi}} e^{-\frac{1}{2} \left( \frac{x-\mu}{\sigma}\right)^2}$, $x \in \mathbb{R}$, of the normal distribution $N(\mu, \sigma^2)$ on $\mathbb{R}$.
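As a quick numerical sanity check on this density (a sketch using only the Python standard library; the function name and the crude Riemann sum are my own choices): the density peaks at the mean with value $\frac{1}{\sigma\sqrt{2\pi}}$ and integrates to $1$.

```python
import math

def gaussian_density(x, mu=0.0, sigma=1.0):
    """Density phi_{mu, sigma^2}(x) of the normal distribution N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

# The standard normal density peaks at 0 with value 1/sqrt(2*pi) ~ 0.3989,
# and a crude Riemann sum over [-8, 8] recovers total mass ~ 1.
peak = gaussian_density(0.0)
h = 0.001
total = sum(gaussian_density(-8.0 + k * h) * h for k in range(16000))
```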

[Figure: densities of normal distributions]

Similarly, the multivariate Gaussian function is equal to

$$ \varphi(x_1,\ldots , x_n)= \frac{1}{\sqrt{(2\pi)^n \text{det}(\Sigma)}} \text{exp}\left(-\frac{1}{2} (\bm{x} -\bm{\mu})^\intercal \Sigma^{-1} (\bm{x}-\bm{\mu})\right) $$

where $\bm{x}^\intercal=(x_1, \ldots , x_n)$, $\bm{\mu} \in \mathbb{R}^n$ is the mean vector, and $\Sigma$ is a positive definite $n \times n$ covariance matrix, i.e. $\langle \bm{x} | \Sigma \bm{x} \rangle > 0$ for every $\bm{x} \in \mathbb{R}^n \setminus \{ \bm{0} \}$. In particular, the density of the standard normal distribution on $\mathbb{R}^n$ is equal to

$$ \varphi(\bm{x})= \frac{1}{\sqrt{(2 \pi )^n}} e^{- \frac{1}{2} | \bm{x} |^2}, \quad \bm{x} \in \mathbb{R}^n. $$
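A small sketch of these two formulas in code (the function names and the restriction to $n = 2$, where $\Sigma^{-1}$ can be written out by hand, are my own choices): with $\Sigma = I$ and $\bm{\mu} = \bm{0}$ the multivariate density reduces to the standard normal density, which factorizes as a product of univariate standard normal densities.

```python
import math

def gaussian_density_1d(x, mu=0.0, sigma=1.0):
    """Univariate density phi_{mu, sigma^2}(x)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def mvn_density_2d(x, mu, Sigma):
    """Density of N(mu, Sigma) on R^2; the 2x2 inverse is written out by hand."""
    (a, b), (c, d) = Sigma
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]   # Sigma^{-1}
    v = [x[0] - mu[0], x[1] - mu[1]]
    quad = sum(v[i] * inv[i][j] * v[j] for i in range(2) for j in range(2))
    return math.exp(-0.5 * quad) / math.sqrt((2.0 * math.pi) ** 2 * det)

# Standard case: Sigma = I, mu = 0, so the joint density is the product
# of the two univariate standard normal densities.
x = (0.3, -0.7)
joint = mvn_density_2d(x, (0.0, 0.0), [[1.0, 0.0], [0.0, 1.0]])
product = gaussian_density_1d(x[0]) * gaussian_density_1d(x[1])
```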

[Figure: multivariate Gaussian density]

Characteristic function

Recall that the characteristic function of a real valued random variable $X$ is defined by

$$ \varphi (t)= E (e^{itX}), \quad t \in \mathbb{R}. $$
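As a hedged numerical check of this definition (the integration range and step count are ad hoc choices): for $X \sim N(0,1)$, approximating $E(e^{itX}) = \int e^{itx} \varphi(x)\, dx$ by a Riemann sum should reproduce the known characteristic function $e^{-t^2/2}$ of the standard normal distribution.

```python
import cmath
import math

def std_normal_density(x):
    """Density of N(0, 1)."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def char_function(t, density, lo=-8.0, hi=8.0, steps=16000):
    """Approximate E(e^{itX}) by a Riemann sum of e^{itx} * density(x) over [lo, hi]."""
    h = (hi - lo) / steps
    return sum(cmath.exp(1j * t * (lo + k * h)) * density(lo + k * h) * h
               for k in range(steps))

t = 1.3
approx = char_function(t, std_normal_density)
exact = math.exp(-0.5 * t * t)   # known characteristic function of N(0, 1)
```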

Similarly, the characteristic function of a random variable $X$ which takes values in $\mathbb{R}^n$ is defined by

$$ \varphi (t) = E(e^{it^\intercal X}), \quad t \in \mathbb{R}^n. $$

In particular, the characteristic function of the vector random variable $X\sim N(\mu, \Sigma)$ is given by

$$ \varphi (t)= \text{exp}\, ( i \mu^\intercal t - \frac{1}{2} t^\intercal \Sigma t ), \quad t \in \mathbb{R}^n. $$
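When $\Sigma$ is diagonal the components of $N(\mu, \Sigma)$ are independent, so this characteristic function should factorize into a product of univariate factors. A short sketch verifying that (the function name and the specific $t$, $\mu$, $\Sigma$ are illustrative):

```python
import cmath

def mvn_char_function(t, mu, Sigma):
    """phi(t) = exp(i mu^T t - (1/2) t^T Sigma t) for N(mu, Sigma) on R^n."""
    n = len(t)
    mu_t = sum(mu[i] * t[i] for i in range(n))
    quad = sum(t[i] * Sigma[i][j] * t[j] for i in range(n) for j in range(n))
    return cmath.exp(1j * mu_t - 0.5 * quad)

# Diagonal Sigma: the joint characteristic function equals the product of
# the univariate characteristic functions of the components.
t = [0.4, -1.1]
mu = [1.0, 2.0]
Sigma = [[2.0, 0.0], [0.0, 0.5]]
joint = mvn_char_function(t, mu, Sigma)
product = (mvn_char_function([t[0]], [mu[0]], [[Sigma[0][0]]])
           * mvn_char_function([t[1]], [mu[1]], [[Sigma[1][1]]]))
```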