Definition: Variance

Variance is a measure of dispersion that describes how the values of a distribution are spread around the mean. The standard deviation, which is also often used to measure the dispersion of a distribution, is the square root of the variance. The variance is calculated by dividing the sum of the squared deviations of all measured values from the mean by the number of measured values. The variance of a random variable is usually denoted by σ².
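Written as a formula, this definition reads as follows, where n is the number of measured values, the x_i are the individual values, and μ is their mean:

```latex
\sigma^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \mu)^2
```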

 
To provide an example, consider the variable 'age' in a sample of 5 people. The measured values are 14, 17, 20, 24, and 25 years. The mean is therefore 100/5 = 20 years. Now we can calculate each measured value's deviation from the mean: (14-20) = -6, (17-20) = -3, (20-20) = 0, (24-20) = 4, (25-20) = 5. The squared deviations are 36, 9, 0, 16, and 25; their sum is 86. The variance is therefore 86/5 = 17.2 years².
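The same calculation can be reproduced with a minimal Python sketch (the variable names are illustrative, not from the example itself):

```python
# Minimal sketch of the worked example above (names are illustrative).
ages = [14, 17, 20, 24, 25]

mean = sum(ages) / len(ages)                  # 100 / 5 = 20.0
deviations = [x - mean for x in ages]         # [-6.0, -3.0, 0.0, 4.0, 5.0]
variance = sum(d ** 2 for d in deviations) / len(ages)  # 86 / 5 = 17.2

print(mean, variance)  # 20.0 17.2
```

Python's standard library returns the same value via statistics.pvariance(ages). Note that dividing by n, as in this example, yields the population variance; for samples, the sum of squared deviations is often divided by n - 1 instead (statistics.variance in Python).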
 
As the example shows, a disadvantage of the variance is that its unit differs from the unit of the measured values (years² instead of years). At first glance, it therefore gives no concrete information on how far the values are actually scattered. For easier interpretation, we often use the standard deviation instead, which is the square root of the variance.
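Continuing the sketch above, taking the square root brings the result back to the original unit of the measured values:

```python
import math

variance = 17.2  # years², from the worked example above
std_dev = math.sqrt(variance)

print(round(std_dev, 2))  # 4.15 years, same unit as the measured ages
```

For the population standard deviation, statistics.pstdev(ages) gives the same result directly.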

Please note that the definitions in our statistics encyclopedia are simplified explanations of terms. Our goal is to make the definitions accessible to a broad audience; it is therefore possible that some definitions do not adhere entirely to scientific standards.