The following problem appeared in Volume 131, Issue 9 (2024) of The American Mathematical Monthly.
Let $X$ and $Y$ be independent normally distributed random variables, each with its own mean and variance. Show that the variance of $Y$ conditioned on the event $X < Y$ is smaller than the variance of $Y$ alone.
We suppose that $E[X] = \mu_X$, $\operatorname{Var}(X) = \sigma_X^2$, $E[Y] = \mu_Y$, and $\operatorname{Var}(Y) = \sigma_Y^2$. With these definitions, we may write $X = \mu_X + \sigma_X Z_1$ and $Y = \mu_Y + \sigma_Y Z_2$, where $Z_1$ and $Z_2$ are independent standard normal random variables.
The goal is to show that $\operatorname{Var}(Y \mid X < Y) < \operatorname{Var}(Y) = \sigma_Y^2$. In the previous two posts, we showed that it will be sufficient to show that $E[Y^2 \mid X < Y] - \left(E[Y \mid X < Y]\right)^2 < \sigma_Y^2$, where $E[Y \mid X < Y]$ and $E[Y^2 \mid X < Y]$ denote the first and second moments of $Y$ conditioned on the event $X < Y$. We also showed that $P(X < Y) = \Phi(c)$, where $c = \dfrac{\mu_Y - \mu_X}{\sqrt{\sigma_X^2 + \sigma_Y^2}}$ and $\Phi$ is the cumulative distribution function of the standard normal distribution.
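As a quick sanity check of this formula, here is a short Python sketch comparing a Monte Carlo estimate of $P(X < Y)$ with $\Phi(c)$. The particular parameter values are arbitrarily chosen for illustration, and NumPy/SciPy are just one convenient way to do the comparison.

```python
import numpy as np
from scipy.stats import norm

# Arbitrary example parameters (chosen only for this check).
mu_X, sigma_X = 1.0, 2.0
mu_Y, sigma_Y = 0.5, 1.5

rng = np.random.default_rng(0)
n = 10**6
X = rng.normal(mu_X, sigma_X, n)
Y = rng.normal(mu_Y, sigma_Y, n)

c = (mu_Y - mu_X) / np.sqrt(sigma_X**2 + sigma_Y**2)
print("Monte Carlo P(X < Y):", np.mean(X < Y))  # about 0.42 for these parameters
print("Phi(c):              ", norm.cdf(c))     # about 0.42
```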
To compute $E[Y \mid X < Y]$, we begin with
$$E[Y \mid X < Y] = \frac{E\!\left[Y\,\mathbf{1}_{X<Y}\right]}{P(X < Y)},$$
where $\mathbf{1}_{X<Y}$ denotes the indicator of the event $X < Y$.
The expected value in the numerator is a double integral:
$$E\!\left[Y\,\mathbf{1}_{X<Y}\right] = \iint_{x < y} y\, f(x,y)\, dy\, dx,$$
where $f(x,y)$ is the joint probability density function of $X$ and $Y$. Since $X$ and $Y$ are independent, $f(x,y)$ is the product of the individual probability density functions:
$$f(x,y) = f_X(x)\, f_Y(y).$$
Therefore, we must compute
$$E\!\left[Y\,\mathbf{1}_A\right] = \int_{-\infty}^{\infty} \int_{x}^{\infty} y\, f_X(x)\, f_Y(y)\, dy\, dx,$$
where I wrote $A$ for the event $X < Y$.
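Here is a quick numerical check of this setup in Python. The parameter values are arbitrary, and SciPy's `dblquad` is just one convenient way to evaluate the iterated integral.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import dblquad

# Arbitrary example parameters.
mu_X, sigma_X = 1.0, 2.0
mu_Y, sigma_Y = 0.5, 1.5

# E[Y * 1_A] as the iterated integral over {y > x}; wide finite limits
# stand in for infinity, since the Gaussian tails are negligible out there.
integrand = lambda y, x: y * norm.pdf(x, mu_X, sigma_X) * norm.pdf(y, mu_Y, sigma_Y)
lo, hi = -20.0, 20.0
numerator, _ = dblquad(integrand, lo, hi, lambda x: x, lambda x: hi)

# Monte Carlo estimate of the same expectation.
rng = np.random.default_rng(1)
X = rng.normal(mu_X, sigma_X, 10**6)
Y = rng.normal(mu_Y, sigma_Y, 10**6)
print("dblquad:    ", numerator)
print("Monte Carlo:", np.mean(Y * (X < Y)))
```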
I’m not above admitting that I first stuck this into Mathematica to make sure that this was doable. To begin, we compute the inner integral. Writing $y = \mu_Y + (y - \mu_Y)$ and using the fact that $(y - \mu_Y)\, f_Y(y) = -\sigma_Y^2\, f_Y'(y)$, we have
$$\int_{x}^{\infty} y\, f_Y(y)\, dy = \mu_Y \int_{x}^{\infty} f_Y(y)\, dy + \int_{x}^{\infty} (y - \mu_Y)\, f_Y(y)\, dy = \mu_Y\, \Phi\!\left(\frac{\mu_Y - x}{\sigma_Y}\right) + \sigma_Y^2\, f_Y(x).$$
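Before substituting this back in, here is a quick numerical check of that closed form for the inner integral, again with arbitrarily chosen values of $\mu_Y$, $\sigma_Y$, and the lower limit $x$:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

mu_Y, sigma_Y = 0.5, 1.5   # arbitrary example parameters
x = 0.8                    # arbitrary lower limit of integration

# Left side: integrate y * f_Y(y) from x out to (effectively) infinity.
lhs, _ = quad(lambda y: y * norm.pdf(y, mu_Y, sigma_Y), x, mu_Y + 15 * sigma_Y)

# Right side: mu_Y * Phi((mu_Y - x)/sigma_Y) + sigma_Y^2 * f_Y(x).
rhs = mu_Y * norm.cdf((mu_Y - x) / sigma_Y) + sigma_Y**2 * norm.pdf(x, mu_Y, sigma_Y)
print(lhs, rhs)  # the two values agree to quadrature accuracy
```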
Substituting this back into the double integral, the first of the two resulting integrals is
$$\int_{-\infty}^{\infty} f_X(x)\, \Phi\!\left(\frac{\mu_Y - x}{\sigma_Y}\right) dx = \int_{-\infty}^{\infty} f_X(x)\, P(Y > x)\, dx = P(X < Y) = \Phi(c),$$
so that
$$E\!\left[Y\,\mathbf{1}_A\right] = \mu_Y\, \Phi(c) + \sigma_Y^2 \int_{-\infty}^{\infty} f_X(x)\, f_Y(x)\, dx.$$
At this point, I used a standard technique/trick of completing the square to rewrite the remaining integrand as a common pdf. Completing the square in $x$ in the exponent gives
$$f_X(x)\, f_Y(x) = \frac{1}{2\pi \sigma_X \sigma_Y} \exp\!\left(-\frac{(x-\mu_X)^2}{2\sigma_X^2} - \frac{(x-\mu_Y)^2}{2\sigma_Y^2}\right) = \frac{1}{2\pi \sigma_X \sigma_Y} \exp\!\left(-\frac{(\mu_X-\mu_Y)^2}{2(\sigma_X^2+\sigma_Y^2)}\right) \exp\!\left(-\frac{\sigma_X^2+\sigma_Y^2}{2\sigma_X^2\sigma_Y^2}\left(x - \frac{\mu_X\sigma_Y^2 + \mu_Y\sigma_X^2}{\sigma_X^2+\sigma_Y^2}\right)^{\!2}\right).$$
I now rewrite the integrand so that it has the form of the probability density function of a normal distribution, writing $m = \dfrac{\mu_X\sigma_Y^2 + \mu_Y\sigma_X^2}{\sigma_X^2+\sigma_Y^2}$ and $\tau^2 = \dfrac{\sigma_X^2\sigma_Y^2}{\sigma_X^2+\sigma_Y^2}$ and multiplying and dividing by $\sqrt{\sigma_X^2+\sigma_Y^2}$ in the denominator:
$$f_X(x)\, f_Y(x) = \frac{1}{\sqrt{2\pi}\,\sqrt{\sigma_X^2+\sigma_Y^2}} \exp\!\left(-\frac{(\mu_X-\mu_Y)^2}{2(\sigma_X^2+\sigma_Y^2)}\right) \cdot \frac{1}{\tau\sqrt{2\pi}} \exp\!\left(-\frac{(x-m)^2}{2\tau^2}\right).$$
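A quick pointwise spot check of this identity (arbitrary parameters again; any value of $x$ should do):

```python
import numpy as np
from scipy.stats import norm

mu_X, sigma_X = 1.0, 2.0   # arbitrary example parameters
mu_Y, sigma_Y = 0.5, 1.5

s2 = sigma_X**2 + sigma_Y**2
m = (mu_X * sigma_Y**2 + mu_Y * sigma_X**2) / s2   # mean from completing the square
tau = sigma_X * sigma_Y / np.sqrt(s2)              # standard deviation from completing the square

for x in (-2.0, 0.0, 0.7, 3.0):
    lhs = norm.pdf(x, mu_X, sigma_X) * norm.pdf(x, mu_Y, sigma_Y)
    rhs = (np.exp(-(mu_X - mu_Y)**2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)
           * norm.pdf(x, m, tau))
    print(x, lhs, rhs)  # lhs and rhs match at every x
```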
This is an example of making a problem easier by apparently making it harder. The integrand is equal to $\dfrac{\varphi(c)}{\sqrt{\sigma_X^2+\sigma_Y^2}}\, f_W(x)$, where $\varphi$ is the probability density function of the standard normal distribution (the first factor is $\varphi$ evaluated at $(\mu_X - \mu_Y)/\sqrt{\sigma_X^2+\sigma_Y^2} = -c$, and $\varphi$ is even) and $W$ is a normally distributed random variable with mean $m$ and variance $\tau^2$ as defined above. Since $f_W$ is a probability density function and therefore integrates to $1$, we have
$$\int_{-\infty}^{\infty} f_X(x)\, f_Y(x)\, dx = \frac{\varphi(c)}{\sqrt{\sigma_X^2+\sigma_Y^2}},$$
and so
$$E[Y \mid A] = \frac{E\!\left[Y\,\mathbf{1}_A\right]}{P(A)} = \frac{\mu_Y\, \Phi(c) + \sigma_Y^2\, \varphi(c)/\sqrt{\sigma_X^2+\sigma_Y^2}}{\Phi(c)} = \mu_Y + \frac{\sigma_Y^2}{\sqrt{\sigma_X^2+\sigma_Y^2}} \cdot \frac{\varphi(c)}{\Phi(c)}.$$
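A final check of this formula for $E[Y \mid A]$ against simulation, using the same arbitrary parameters as above:

```python
import numpy as np
from scipy.stats import norm

mu_X, sigma_X = 1.0, 2.0   # arbitrary example parameters
mu_Y, sigma_Y = 0.5, 1.5

c = (mu_Y - mu_X) / np.sqrt(sigma_X**2 + sigma_Y**2)
closed_form = mu_Y + sigma_Y**2 / np.sqrt(sigma_X**2 + sigma_Y**2) * norm.pdf(c) / norm.cdf(c)

rng = np.random.default_rng(2)
X = rng.normal(mu_X, sigma_X, 10**6)
Y = rng.normal(mu_Y, sigma_Y, 10**6)

print("closed form E[Y | X < Y]:", closed_form)
print("Monte Carlo E[Y | X < Y]:", Y[X < Y].mean())  # agrees to Monte Carlo error
```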
We note that this reduces to what we found in the second special case: if $\mu_X = 0$, $\mu_Y = 0$, and $\sigma_X = \sigma_Y = 1$, then $c = 0$, $\Phi(c) = \dfrac{1}{2}$, and $\varphi(c) = \dfrac{1}{\sqrt{2\pi}}$. Since $\sqrt{\sigma_X^2 + \sigma_Y^2} = \sqrt{2}$, we have
$$E[Y \mid A] = 0 + \frac{1}{\sqrt{2}} \cdot \frac{1/\sqrt{2\pi}}{1/2} = \frac{1}{\sqrt{\pi}},$$
matching what we found earlier.
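The same kind of check works for this special case of two independent standard normal variables, where the formula gives $1/\sqrt{\pi} \approx 0.5642$:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
X = rng.normal(size=10**6)   # standard normal
Y = rng.normal(size=10**6)   # standard normal

# c = 0, Phi(0) = 1/2, phi(0) = 1/sqrt(2*pi), so the formula reduces to 1/sqrt(pi).
formula = 0 + 1 / np.sqrt(2) * norm.pdf(0) / norm.cdf(0)
print("formula:    ", formula)              # 0.5641...
print("1/sqrt(pi): ", 1 / np.sqrt(np.pi))   # 0.5641...
print("Monte Carlo:", Y[X < Y].mean())
```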
In the next post, we consider the calculation of $E[Y^2 \mid A]$.