The following problem appeared in Volume 131, Issue 9 (2024) of The American Mathematical Monthly.
Let $X$ and $Y$ be independent normally distributed random variables, each with its own mean and variance. Show that the variance of $X$ conditioned on the event $X < Y$ is smaller than the variance of $X$ alone.
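Before starting on the proof, a quick Monte Carlo sanity check of the claim can't hurt. This is only a sketch, not part of the argument, and the particular means and variances below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Arbitrary illustrative parameters; any choice should exhibit the claim.
mu_x, sigma_x = 1.0, 2.0
mu_y, sigma_y = -0.5, 1.5

x = rng.normal(mu_x, sigma_x, n)
y = rng.normal(mu_y, sigma_y, n)

cond = x < y  # the conditioning event X < Y
print("Var(X)         =", x.var())        # close to sigma_x**2 = 4
print("Var(X | X < Y) =", x[cond].var())  # comes out smaller, as claimed
```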
We suppose that $E[X] = \mu_X$, $\operatorname{Var}(X) = \sigma_X^2$, $E[Y] = \mu_Y$, and $\operatorname{Var}(Y) = \sigma_Y^2$. With these definitions, we may write $X = \mu_X + \sigma_X Z_1$ and $Y = \mu_Y + \sigma_Y Z_2$, where $Z_1$ and $Z_2$ are independent standard normal random variables.
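As a quick sanity check, this representation recovers the intended moments:
\[
E[X] = \mu_X + \sigma_X E[Z_1] = \mu_X, \qquad \operatorname{Var}(X) = \sigma_X^2 \operatorname{Var}(Z_1) = \sigma_X^2,
\]
and similarly for $Y$.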
Based on our experience with the special cases, it seems likely that I’ll eventually need to integrate over the joint probability density function of $Z_1$ and $Z_2$. However, it’s a bit easier to work with standard normal random variables than general ones, and so I’d like to rewrite $\operatorname{Var}(X \mid X < Y)$ in terms of $Z_1$ and $Z_2$ to whatever extent is possible.
As it turns out, the usual scaling and shifting properties of variance apply to a conditional variance on any event $A$. The event that we have in mind, of course, is $X < Y$. As discussed in the previous post, this event can be rewritten as $Z_1 < aZ_2 + b$, where $a = \sigma_Y/\sigma_X$ and $b = (\mu_Y - \mu_X)/\sigma_X$.
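The algebra behind that rewriting is quick; since $\sigma_X > 0$, we may divide through by it without flipping the inequality:
\[
X < Y \iff \mu_X + \sigma_X Z_1 < \mu_Y + \sigma_Y Z_2 \iff Z_1 < \frac{\sigma_Y}{\sigma_X}\, Z_2 + \frac{\mu_Y - \mu_X}{\sigma_X} = aZ_2 + b.
\]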
We are now ready to derive the scaling and shift properties for $\operatorname{Var}(X \mid A)$. We begin by using the definition
\[
\operatorname{Var}(X \mid A) = E[X^2 \mid A] - \bigl(E[X \mid A]\bigr)^2.
\]
Let’s examine the conditional expectations $E[X \mid A]$ and $E[X^2 \mid A]$. First,
\[
X = \mu_X + \sigma_X Z_1,
\]
and so
\[
E[X \mid A] = \mu_X + \sigma_X E[Z_1 \mid A].
\]
Next,
\[
X^2 = \mu_X^2 + 2\mu_X \sigma_X Z_1 + \sigma_X^2 Z_1^2,
\]
and so
\[
E[X^2 \mid A] = \mu_X^2 + 2\mu_X \sigma_X E[Z_1 \mid A] + \sigma_X^2 E[Z_1^2 \mid A].
\]
Therefore, after the $\mu_X^2$ and cross terms cancel,
\[
\operatorname{Var}(X \mid A) = E[X^2 \mid A] - \bigl(E[X \mid A]\bigr)^2 = \sigma_X^2 \Bigl( E[Z_1^2 \mid A] - \bigl(E[Z_1 \mid A]\bigr)^2 \Bigr).
\]
So, not surprisingly, $\operatorname{Var}(X \mid A) = \sigma_X^2 \operatorname{Var}(Z_1 \mid A)$.
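Nothing in this computation used anything special about $X$, $Z_1$, or $A$, so the same argument gives the general scaling and shift property: for any constants $c$ and $d$, any random variable $Z$ with finite variance, and any event $A$ with $P(A) > 0$,
\[
\operatorname{Var}(cZ + d \mid A) = c^2 \operatorname{Var}(Z \mid A).
\]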
Also, the ultimate goal is to show that $\operatorname{Var}(X \mid A)$ is less than $\operatorname{Var}(X) = \sigma_X^2$, where $A$ is the event $X < Y$ or, equivalently, $Z_1 < aZ_2 + b$. We see that it will be sufficient to show that
\[
\operatorname{Var}(Z_1 \mid Z_1 < aZ_2 + b) < 1.
\]
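As a preview of what those calculations should produce, here is the earlier Monte Carlo check redone in standardized form (again with arbitrarily chosen parameters): the conditional variance of $Z_1$ comes out below 1, and multiplying by $\sigma_X^2$ reproduces the conditional variance of $X$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Same arbitrary parameters as before, now standardized.
mu_x, sigma_x = 1.0, 2.0
mu_y, sigma_y = -0.5, 1.5
a = sigma_y / sigma_x
b = (mu_y - mu_x) / sigma_x

z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)

cond = z1 < a * z2 + b  # the event A in standardized form
v = z1[cond].var()
print("Var(Z1 | Z1 < a*Z2 + b) =", v)               # comes out below 1
print("sigma_x**2 * that       =", sigma_x**2 * v)  # matches Var(X | X < Y)
```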
We start the calculations of this conditional variance in the next post.