∫ e^(−x²) dx = √π,
and thus a normalized distribution (probability density function) is
e^(−x²)/√π.
The expectation of x² can be computed from
∫ x² e^(−x²) dx = √π/2.
(Run this.)
Dividing by the normalizing constant √π, E(x²) = (√π/2)/√π = 1/2.
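Both integrals above can be checked numerically. Here is a quick sketch in Python; the trapezoid-rule routine and the truncation to [−10, 10] are my own choices, not part of the text (the integrands decay so fast that the truncation error is negligible):

```python
import math

def integrate(f, a=-10.0, b=10.0, n=200_000):
    # Composite trapezoid rule on [a, b]; both integrands below are
    # essentially zero outside [-10, 10], so this approximates the
    # integral over the whole real line.
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

gauss = integrate(lambda x: math.exp(-x * x))                   # ≈ √π
second_moment = integrate(lambda x: x * x * math.exp(-x * x))   # ≈ √π / 2
```

Comparing `gauss` with `math.sqrt(math.pi)` and `second_moment` with half that confirms both formulas to many decimal places.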
Statisticians like a distribution with 1 as the expected square deviation
from the mean.
When the mean is 0 they call E(x²) the variance and
√E(x²) the standard deviation.
If we stretch the x-axis by √2 (replacing x by x/√2) and renormalize, we
get the standard normal distribution, sometimes written N(0, 1):
e^(−x²/2)/√(2π).
Its variance is 1.
In general N(m, σ²) refers to the normal
distribution whose
mean is m and whose variance is σ², with density
(2πσ²)^(−1/2) e^(−(x − m)²/(2σ²)).
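The N(m, σ²) density translates directly into a small function; this sketch is illustrative (the name `normal_pdf` and its defaults are mine, not from the text):

```python
import math

def normal_pdf(x, m=0.0, sigma2=1.0):
    # Density of N(m, sigma^2): (2*pi*sigma2)^(-1/2) * exp(-(x - m)^2 / (2*sigma2))
    return math.exp(-(x - m) ** 2 / (2 * sigma2)) / math.sqrt(2 * math.pi * sigma2)
```

With the defaults this is the standard normal density, whose peak at x = 0 is 1/√(2π) ≈ 0.3989, and the density is symmetric about its mean m.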
Variance is mathematically convenient and has some pretty properties. The variance of the sum of two independent random variables is the sum of their variances, whatever their means. The mean of the sum of two random variables is the sum of their means even if they are dependent.
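Both properties are easy to see by simulation. A sketch (the particular means, standard deviations, and sample size are arbitrary choices for illustration):

```python
import random
import statistics

random.seed(0)
n = 100_000

# Two independent random variables with different means and variances.
xs = [random.gauss(2.0, 1.5) for _ in range(n)]   # variance 1.5^2 = 2.25
ys = [random.gauss(-1.0, 0.5) for _ in range(n)]  # variance 0.5^2 = 0.25
sums = [x + y for x, y in zip(xs, ys)]

# For independent X and Y: Var(X + Y) = Var(X) + Var(Y) = 2.5,
# and (dependence or not): E(X + Y) = E(X) + E(Y) = 1.0.
```

The sample variance of `sums` comes out near 2.5 and the sample mean near 1.0, matching the two identities.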