Now that we have assigned a number to the outcome of an event, we can
define an ``average'' value for the r.v. over the possible events.
This average value is called the expectation value of the random
variable, and has the following definition:
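In a commonly used notation (the symbols $E[x]$, $x_i$, and $P(x_i)$ are assumed here), for a discrete r.v. $x$ taking the values $x_i$ with probabilities $P(x_i)$,
\[
E[x] \;\equiv\; \sum_i x_i \, P(x_i) ,
\]
with the sum replaced by an integral over the probability density in the case of a continuous r.v.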
One can define a single-valued, real-valued function of a r.v., which will also be a
r.v.  That is, given a r.v., any such real-valued function of it is also a
r.v., and we can define its expectation value:
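Writing the function as $f(x)$ (the symbol is assumed here) and using the discrete notation above,
\[
E[f(x)] \;=\; \sum_i f(x_i)\, P(x_i) .
\]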
The expectation value of a linear combination of r.v.'s is simply the linear combination of their respective expectation values;
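for example, for constants $a$ and $b$ and r.v.'s $x$ and $y$ (symbols assumed here),
\[
E[ax + by] \;=\; a\,E[x] + b\,E[y] .
\]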
The expectation value is simply the ``first moment'' of the r.v., meaning that
one is finding the average of the r.v. itself, rather than its square or cube
or square root.
Thus the mean is the first moment of the r.v.,
and one might ask whether or not averages of higher powers of the r.v.
(the higher moments) have any significance.
In fact, the average of the square of the r.v. does lead to an
important quantity, the variance, and we will now define the higher
moments of a r.v.
as follows:
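In one common convention (the symbol $\alpha_n$ is assumed here), the $n$-th moment is the expectation value of the $n$-th power of the r.v.,
\[
\alpha_n \;\equiv\; E[x^n] \;=\; \sum_i x_i^{\,n}\, P(x_i) ,
\]
so that the mean is $\mu \equiv \alpha_1 = E[x]$.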
We also define ``central'' moments that express the variation of a r.v. about its mean, hence ``corrected for the mean'':
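Denoting the mean by $\mu \equiv E[x]$ and the $n$-th central moment by $m_n$ (both symbols assumed here), these are
\[
m_n \;\equiv\; E\!\left[(x - \mu)^n\right] .
\]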
The first central moment is zero. The second central moment is the variance:
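In the notation assumed above,
\[
V(x) \;\equiv\; m_2 \;=\; E\!\left[(x - \mu)^2\right] .
\]
Indeed, the first central moment vanishes because $m_1 = E[x - \mu] = E[x] - \mu = 0$, by the linearity of the expectation value.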
It is straightforward to show the following important identity:
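Expanding the square and using the linearity of the expectation value (notation as above),
\[
V(x) \;=\; E\!\left[x^2 - 2\mu x + \mu^2\right] \;=\; E[x^2] - 2\mu\,E[x] + \mu^2 \;=\; E[x^2] - \mu^2 ,
\]
i.e.\ the variance is the mean of the square minus the square of the mean.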
We will also find it useful to take the square root of the variance, which is the standard deviation,
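written in the notation assumed above as
\[
\sigma \;\equiv\; \sqrt{V(x)} .
\]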