Dispersion Measures Disregarding Signs | CFA® Level 1 Exam Prep


Question

Which measure of dispersion disregards the algebraic signs (plus and minus) of each difference between X and the mean?

Answers

A. Standard deviation
B. Variance
C. Mean deviation
D. Mean

Correct answer: C

Explanation

The measure of dispersion that disregards the algebraic signs (plus and minus) of each difference between X and the mean is the mean deviation: the mean of the absolute values of the deviations from the mean.

Mean deviation, also known as the average deviation or mean absolute deviation, measures the average amount by which data points deviate from the mean, regardless of their direction (positive or negative). It quantifies the spread or dispersion of data points around the mean.
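In symbols (a standard formulation, not quoted from the question itself), for n observations with mean x-bar:

```latex
\mathrm{MD} = \frac{1}{n} \sum_{i=1}^{n} \left| x_i - \bar{x} \right|
```

The absolute-value bars are what make each deviation count as a positive distance, whatever its sign.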

To calculate the mean deviation, you follow these steps:

  1. Calculate the mean (average) of the data set.
  2. Find the difference between each data point and the mean.
  3. Take the absolute value of each difference (this is the step that disregards the signs).
  4. Calculate the mean of these absolute values.
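The steps above can be sketched in a few lines of Python (the data set here is made up purely for illustration):

```python
def mean_deviation(data):
    """Mean absolute deviation: the average distance of each point from the mean."""
    mean = sum(data) / len(data)                 # step 1: the mean
    abs_diffs = [abs(x - mean) for x in data]    # steps 2-3: absolute differences
    return sum(abs_diffs) / len(abs_diffs)       # step 4: average the absolute values

# The deviations from the mean (6) are -4, -2, 0, +2, +4;
# their absolute values 4, 2, 0, 2, 4 average to 2.4.
print(mean_deviation([2, 4, 6, 8, 10]))  # 2.4
```

Note that the negative and positive deviations would cancel to zero if the signs were kept, which is why the absolute value is essential.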

By disregarding the signs, the mean deviation treats deviations from the mean as positive quantities. This means that the mean deviation is always a positive value or zero, representing the average absolute distance of the data points from the mean.

In the provided answer choices:

  • Standard deviation (option A) and variance (option B) also produce non-negative measures, but they handle the signs by squaring the differences from the mean rather than by dropping them. Squaring additionally weights larger deviations more heavily, so in the textbook terminology of this question they are not the measures that "disregard" the signs.
  • Mean (option D) is not a measure of dispersion but rather a measure of central tendency.
  • The correct answer is option C, mean deviation, as it is the measure that disregards the signs of the differences between the data points and the mean when calculating dispersion.
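To see the contrast numerically, the following sketch computes both the mean deviation and the population standard deviation on the same made-up data set:

```python
import math

data = [2, 4, 6, 8, 10]
mean = sum(data) / len(data)

# Mean deviation: average of the absolute differences from the mean
md = sum(abs(x - mean) for x in data) / len(data)

# Population standard deviation: square root of the average squared difference
var = sum((x - mean) ** 2 for x in data) / len(data)
sd = math.sqrt(var)

print(md)  # 2.4
print(sd)  # ~2.828 -- larger, because squaring weights big deviations more heavily
```

Both statistics are non-negative, but they get there by different routes: absolute values for the mean deviation, squaring for the variance and standard deviation.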