How do you interpret variance?

Definition of variance

The variance is a measure of dispersion that characterizes the distribution of values around the mean. It is the square of the standard deviation. The variance is calculated by dividing the sum of the squared deviations of all measured values from the arithmetic mean by the number of measured values. The symbol for the variance of a random variable is “σ²”; the symbol for the empirical variance of a sample is “s²”.
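
Expressed as a formula, with x₁, …, xₙ denoting the n measured values and x̄ their arithmetic mean, this definition reads:

s² = (1/n) · [(x₁ − x̄)² + (x₂ − x̄)² + … + (xₙ − x̄)²]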

Example: The age characteristic is examined in a sample of 5 people. The measured values are 14, 17, 20, 24 and 25 years. The mean value is therefore 100/5 = 20 years. Now the deviations of the individual measured values from the mean are calculated: (14 − 20) = −6, (17 − 20) = −3, (20 − 20) = 0, (24 − 20) = 4 and (25 − 20) = 5. The squared deviations are 36, 9, 0, 16 and 25, which add up to 86. The variance is thus 86/5 = 17.2 years².
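
A minimal sketch of this calculation in Python (using the values from the example; note that statistics.pvariance likewise divides by the number of measured values, matching the definition used here):

from statistics import mean, pvariance

ages = [14, 17, 20, 24, 25]                        # measured values in years
m = mean(ages)                                     # 100 / 5 = 20 years
squared_deviations = [(x - m) ** 2 for x in ages]  # 36, 9, 0, 16, 25
variance = sum(squared_deviations) / len(ages)     # 86 / 5 = 17.2 years²

print(variance)         # 17.2
print(pvariance(ages))  # same result via the standard library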

As the example shows, the variance has the disadvantage that, due to the squaring, it has a different unit than the observed measured values. At first glance, no concrete statement can be made about the range of variation. In practice, therefore, the standard deviation, which is the square root of the variance, is often used for interpretation.

Example: Consider the result of the example above. The variance is 17.2 years². Years² is not a common unit, and the spread cannot be interpreted directly from it. If, however, the standard deviation is calculated by taking the square root, a value of about 4.15 years is obtained. For normally distributed characteristics, this value can then be interpreted more easily (see standard deviation).
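
Continuing the sketch above, the standard deviation follows directly by taking the square root of the variance; statistics.pstdev gives the same value from the raw measured values:

from math import sqrt
from statistics import pstdev

standard_deviation = sqrt(17.2)                # square root of the variance
print(round(standard_deviation, 2))            # 4.15 years
print(round(pstdev([14, 17, 20, 24, 25]), 2))  # 4.15, computed directly from the data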

Please note that the individual definitions in our statistics lexicon are simplified explanations. The aim is to make the individual terms accessible to the widest possible audience. Accordingly, some definitions may not fully meet scientific standards.