The following problem appeared in Volume 131, Issue 9 (2024) of The American Mathematical Monthly.
Let $X$ and $Y$ be independent normally distributed random variables, each with its own mean and variance. Show that the variance of $X$ conditioned on the event $X > Y$ is smaller than the variance of $X$ alone.
We suppose that $E[X] = \mu_X$, $\mbox{Var}(X) = \sigma_X^2$, $E[Y] = \mu_Y$, and $\mbox{Var}(Y) = \sigma_Y^2$. With these definitions, we may write $X = \mu_X + \sigma_X Z_1$ and $Y = \mu_Y + \sigma_Y Z_2$, where $Z_1$ and $Z_2$ are independent standard normal random variables.
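Before doing any algebra, it can be reassuring to simulate the claim. The short Python snippet below (the parameter values are arbitrary illustrations, and NumPy is assumed) builds $X$ and $Y$ from $Z_1$ and $Z_2$ exactly as above and compares the conditional and unconditional variances.

```python
import numpy as np

# Simulate X = mu_X + sigma_X * Z_1 and Y = mu_Y + sigma_Y * Z_2 and compare
# Var(X | X > Y) with Var(X).  Parameter values are arbitrary illustrations.
rng = np.random.default_rng(0)
mu_x, sigma_x, mu_y, sigma_y = 1.0, 2.0, -0.5, 1.5

z1 = rng.standard_normal(2_000_000)
z2 = rng.standard_normal(2_000_000)
x = mu_x + sigma_x * z1
y = mu_y + sigma_y * z2

print(np.var(x[x > y]))   # estimated Var(X | X > Y)
print(sigma_x**2)         # Var(X); the conditional variance should come out smaller
```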
The goal is to show that $\mbox{Var}(X \mid X > Y) < \mbox{Var}(X) = \sigma_X^2$. In previous posts, we showed that it will be sufficient to show that $E[Z_1^2 \mid A] - \left( E[Z_1 \mid A] \right)^2 < 1$, where $A$ denotes the event $X > Y$ and, since $X = \mu_X + \sigma_X Z_1$, we have $\mbox{Var}(X \mid A) = \sigma_X^2 \left( E[Z_1^2 \mid A] - \left( E[Z_1 \mid A] \right)^2 \right)$. We also showed that $P(A) = \Phi(h)$,
where $h = \dfrac{\mu_X - \mu_Y}{\sqrt{\sigma_X^2 + \sigma_Y^2}}$ and $\Phi$ is the cumulative distribution function of the standard normal distribution.
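This formula is easy to check numerically. The following sketch (again with arbitrary parameter values, using SciPy's `norm` for $\Phi$) compares it against a Monte Carlo estimate of $P(X > Y)$.

```python
import numpy as np
from scipy.stats import norm

# Compare P(X > Y) with Phi(h), h = (mu_X - mu_Y) / sqrt(sigma_X^2 + sigma_Y^2).
# Parameter values are arbitrary illustrations.
rng = np.random.default_rng(1)
mu_x, sigma_x, mu_y, sigma_y = 1.0, 2.0, -0.5, 1.5
h = (mu_x - mu_y) / np.hypot(sigma_x, sigma_y)

x = rng.normal(mu_x, sigma_x, 2_000_000)
y = rng.normal(mu_y, sigma_y, 2_000_000)

print(np.mean(x > y))   # Monte Carlo estimate of P(A)
print(norm.cdf(h))      # Phi(h)
```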
To compute $E[Z_1 \mid A]$, we showed in the previous post that

$E[Z_1 \mid A] = \dfrac{\sigma_X \, \phi(h)}{\sqrt{\sigma_X^2 + \sigma_Y^2} \, \Phi(h)}$,

where $\phi$ is the probability density function of the standard normal distribution.
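As with $P(A)$, this can be sanity-checked by simulation; the sketch below uses the same arbitrary parameter values and SciPy's implementations of $\phi$ and $\Phi$.

```python
import numpy as np
from scipy.stats import norm

# Compare a Monte Carlo estimate of E[Z_1 | A] with the closed form quoted above.
# Parameter values are arbitrary illustrations.
rng = np.random.default_rng(2)
mu_x, sigma_x, mu_y, sigma_y = 1.0, 2.0, -0.5, 1.5
s = np.hypot(sigma_x, sigma_y)     # sqrt(sigma_X^2 + sigma_Y^2)
h = (mu_x - mu_y) / s

z1 = rng.standard_normal(2_000_000)
z2 = rng.standard_normal(2_000_000)
cond = mu_x + sigma_x * z1 > mu_y + sigma_y * z2   # the event A = {X > Y}

print(np.mean(z1[cond]))                            # Monte Carlo E[Z_1 | A]
print(sigma_x * norm.pdf(h) / (s * norm.cdf(h)))    # closed form
```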
We now turn to the second conditional expectation:

$E[Z_1^2 \mid A] = \dfrac{E[Z_1^2 \cdot 1_A]}{P(A)}$,

where $1_A$ is the indicator random variable of the event $A$.
The expected value in the numerator is a double integral:

$E[Z_1^2 \cdot 1_A] = \displaystyle \iint_A z_1^2 \, f(z_1, z_2) \, dz_1 \, dz_2$,

where $f$ is the joint probability density function of $Z_1$ and $Z_2$. Since $Z_1$ and $Z_2$ are independent, $f$ is the product of the individual probability density functions:

$f(z_1, z_2) = \phi(z_1) \, \phi(z_2) = \dfrac{1}{2\pi} e^{-(z_1^2 + z_2^2)/2}$.
Therefore, we must compute

$E[Z_1^2 \cdot 1_A] = \displaystyle \int_{-\infty}^{\infty} \int_{(\sigma_Y z_2 + \mu_Y - \mu_X)/\sigma_X}^{\infty} z_1^2 \, \phi(z_1) \, \phi(z_2) \, dz_1 \, dz_2$,

where I wrote $Z_1 > \dfrac{\sigma_Y Z_2 + \mu_Y - \mu_X}{\sigma_X}$ for the event $X > Y$ (that is, $\mu_X + \sigma_X Z_1 > \mu_Y + \sigma_Y Z_2$) to obtain the limits of integration.
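For the skeptical, this double integral can be evaluated by numerical quadrature and compared against a direct Monte Carlo estimate of $E[Z_1^2 \cdot 1_A]$. A sketch with the same arbitrary parameter values (the outer range is truncated, which is numerically harmless here):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import dblquad

# Evaluate the double integral for E[Z_1^2 * 1_A] by quadrature and compare it
# with a Monte Carlo estimate.  Parameter values are arbitrary illustrations;
# the outer range is truncated to [-12, 12], which is numerically harmless.
mu_x, sigma_x, mu_y, sigma_y = 1.0, 2.0, -0.5, 1.5

val, _ = dblquad(
    lambda z1, z2: z1**2 * norm.pdf(z1) * norm.pdf(z2),   # integrand (inner variable first)
    -12, 12,                                              # outer limits, for z2
    lambda z2: (sigma_y * z2 + mu_y - mu_x) / sigma_x,    # inner lower limit, for z1
    lambda z2: np.inf,                                    # inner upper limit
)

rng = np.random.default_rng(5)
z1 = rng.standard_normal(2_000_000)
z2 = rng.standard_normal(2_000_000)
cond = mu_x + sigma_x * z1 > mu_y + sigma_y * z2

print(val)                      # quadrature value of the double integral
print(np.mean(z1**2 * cond))    # Monte Carlo estimate of E[Z_1^2 * 1_A]
```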
I’m not above admitting that I first stuck this into Mathematica to make sure that this was doable. To begin, we compute the inner integral using integration by parts, with $u = z_1$ and $dv = z_1 \phi(z_1) \, dz_1$, so that $du = dz_1$ and $v = -\phi(z_1)$ (recall that $\phi'(z) = -z \, \phi(z)$):

$\displaystyle \int_{(\sigma_Y z_2 + \mu_Y - \mu_X)/\sigma_X}^{\infty} z_1^2 \, \phi(z_1) \, dz_1 = \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \, \phi\!\left( \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \right) + \int_{(\sigma_Y z_2 + \mu_Y - \mu_X)/\sigma_X}^{\infty} \phi(z_1) \, dz_1$.
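In one-variable form, the identity behind this step is $\displaystyle \int_c^{\infty} z^2 \phi(z) \, dz = c \, \phi(c) + 1 - \Phi(c)$, which is easy to spot-check numerically:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Spot-check the integration-by-parts identity
#   integral from c to infinity of z^2 phi(z) dz = c phi(c) + 1 - Phi(c)
# at a few arbitrarily chosen values of c.
for c in (-1.7, 0.0, 0.8, 2.3):
    lhs, _ = quad(lambda z: z**2 * norm.pdf(z), c, np.inf)
    rhs = c * norm.pdf(c) + 1.0 - norm.cdf(c)
    print(c, lhs, rhs)
```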
Therefore,

$E[Z_1^2 \cdot 1_A] = \displaystyle \int_{-\infty}^{\infty} \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \, \phi\!\left( \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \right) \phi(z_2) \, dz_2 + \int_{-\infty}^{\infty} \int_{(\sigma_Y z_2 + \mu_Y - \mu_X)/\sigma_X}^{\infty} \phi(z_1) \, \phi(z_2) \, dz_1 \, dz_2$.
The second term is equal to $\Phi(h)$ since the double integral is just the integral of the joint probability density function of $Z_1$ and $Z_2$ over the event $A$, which is $P(A) = \Phi(h)$.
For the first integral, we complete the square as before:

$\phi\!\left( \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \right) \phi(z_2) = \dfrac{1}{2\pi} \exp\left( -\dfrac{z_2^2}{2} - \dfrac{(\sigma_Y z_2 + \mu_Y - \mu_X)^2}{2\sigma_X^2} \right) = \dfrac{1}{2\pi} \, e^{-h^2/2} \exp\left( -\dfrac{\sigma_X^2 + \sigma_Y^2}{2\sigma_X^2} \left( z_2 - \dfrac{\sigma_Y (\mu_X - \mu_Y)}{\sigma_X^2 + \sigma_Y^2} \right)^2 \right)$.
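Completing the square is exactly the kind of algebra that is easy to get wrong, so here is a quick numerical spot-check of the identity above at a few points, with the same arbitrary parameter values:

```python
import numpy as np
from scipy.stats import norm

# Spot-check the completed-square identity for
#   phi((sigma_Y z + mu_Y - mu_X)/sigma_X) * phi(z).
# Parameter values are arbitrary illustrations.
mu_x, sigma_x, mu_y, sigma_y = 1.0, 2.0, -0.5, 1.5
s2 = sigma_x**2 + sigma_y**2
h = (mu_x - mu_y) / np.sqrt(s2)
m = sigma_y * (mu_x - mu_y) / s2     # center of the completed square

for z in (-2.0, -0.3, 0.0, 1.1, 2.5):
    lhs = norm.pdf((sigma_y * z + mu_y - mu_x) / sigma_x) * norm.pdf(z)
    rhs = np.exp(-h**2 / 2) / (2 * np.pi) * np.exp(-s2 / (2 * sigma_x**2) * (z - m)**2)
    print(z, lhs, rhs)
```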
I now rewrite the integrand so that it has the form of the probability density function of a normal distribution, writing $\dfrac{1}{2\pi} e^{-h^2/2} = \dfrac{\phi(h)}{\sqrt{2\pi}}$ and multiplying and dividing by $\dfrac{\sigma_X}{\sqrt{\sigma_X^2 + \sigma_Y^2}}$ in the denominator:

$\displaystyle \int_{-\infty}^{\infty} \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \, \phi\!\left( \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \right) \phi(z_2) \, dz_2 = \dfrac{\sigma_X \, \phi(h)}{\sqrt{\sigma_X^2 + \sigma_Y^2}} \int_{-\infty}^{\infty} \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \cdot \dfrac{1}{\sqrt{2\pi} \cdot \frac{\sigma_X}{\sqrt{\sigma_X^2 + \sigma_Y^2}}} \exp\left( -\dfrac{\sigma_X^2 + \sigma_Y^2}{2\sigma_X^2} \left( z_2 - \dfrac{\sigma_Y (\mu_X - \mu_Y)}{\sigma_X^2 + \sigma_Y^2} \right)^2 \right) dz_2$.
This is an example of making a problem easier by apparently making it harder. The integrand now contains the probability density function of a normally distributed random variable with mean $\dfrac{\sigma_Y (\mu_X - \mu_Y)}{\sigma_X^2 + \sigma_Y^2}$ and standard deviation $\dfrac{\sigma_X}{\sqrt{\sigma_X^2 + \sigma_Y^2}}$. Therefore, the integral is equal to the expected value of the linear function $\dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X}$ under this normal distribution, which by linearity is the value of this function at the mean:

$\dfrac{\sigma_Y \cdot \frac{\sigma_Y (\mu_X - \mu_Y)}{\sigma_X^2 + \sigma_Y^2} + \mu_Y - \mu_X}{\sigma_X} = -\dfrac{\sigma_X (\mu_X - \mu_Y)}{\sigma_X^2 + \sigma_Y^2}$,
so that

$\displaystyle \int_{-\infty}^{\infty} \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \, \phi\!\left( \dfrac{\sigma_Y z_2 + \mu_Y - \mu_X}{\sigma_X} \right) \phi(z_2) \, dz_2 = -\dfrac{\sigma_X \, \phi(h)}{\sqrt{\sigma_X^2 + \sigma_Y^2}} \cdot \dfrac{\sigma_X (\mu_X - \mu_Y)}{\sigma_X^2 + \sigma_Y^2} = -\dfrac{h \, \sigma_X^2 \, \phi(h)}{\sigma_X^2 + \sigma_Y^2}$

and

$E[Z_1^2 \cdot 1_A] = \Phi(h) - \dfrac{h \, \sigma_X^2 \, \phi(h)}{\sigma_X^2 + \sigma_Y^2}$.
Therefore,

$E[Z_1^2 \mid A] = \dfrac{E[Z_1^2 \cdot 1_A]}{P(A)} = 1 - \dfrac{h \, \sigma_X^2 \, \phi(h)}{(\sigma_X^2 + \sigma_Y^2) \, \Phi(h)}$.
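The same kind of Monte Carlo check used earlier also confirms this closed form, again with arbitrary parameter values:

```python
import numpy as np
from scipy.stats import norm

# Compare a Monte Carlo estimate of E[Z_1^2 | A] with the closed form
#   1 - h * sigma_X^2 * phi(h) / ((sigma_X^2 + sigma_Y^2) * Phi(h)).
# Parameter values are arbitrary illustrations.
rng = np.random.default_rng(3)
mu_x, sigma_x, mu_y, sigma_y = 1.0, 2.0, -0.5, 1.5
s2 = sigma_x**2 + sigma_y**2
h = (mu_x - mu_y) / np.sqrt(s2)

z1 = rng.standard_normal(2_000_000)
z2 = rng.standard_normal(2_000_000)
cond = mu_x + sigma_x * z1 > mu_y + sigma_y * z2

print(np.mean(z1[cond]**2))                                    # Monte Carlo E[Z_1^2 | A]
print(1 - h * sigma_x**2 * norm.pdf(h) / (s2 * norm.cdf(h)))   # closed form
```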
We note that this reduces to what we found in the second special case: if $\mu_X = \mu_Y$, then $h = 0$ and the second term vanishes, so that $E[Z_1^2 \mid A] = 1$, matching what we found earlier.
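A one-line check of this special case, with equal (arbitrarily chosen) means and unequal variances:

```python
import numpy as np

# With mu_X = mu_Y, the formula predicts E[Z_1^2 | A] = 1.
# The mean and variances below are arbitrary illustrations.
rng = np.random.default_rng(4)
mu, sigma_x, sigma_y = 0.7, 2.0, 1.5

z1 = rng.standard_normal(2_000_000)
z2 = rng.standard_normal(2_000_000)
cond = mu + sigma_x * z1 > mu + sigma_y * z2

print(np.mean(z1[cond]**2))   # should be close to 1
```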
In the next post, we consider the calculation of $E[Z_1^2 \mid A] - \left( E[Z_1 \mid A] \right)^2$.