Solving Problems Submitted to MAA Journals (Part 7h)

The following problem appeared in Volume 131, Issue 9 (2024) of The American Mathematical Monthly.

Let X and Y be independent normally distributed random variables, each with its own mean and variance. Show that the variance of X conditioned on the event X>Y is smaller than the variance of X alone.

In previous posts, we reduced the problem to showing that if g(x) = \sqrt{2\pi} x e^{x^2/2} \Phi(x), then f(x) = 1 + g(x) is always positive, where

\Phi(z) = \displaystyle \frac{1}{\sqrt{2\pi}} \int_{-\infty}^z e^{-t^2/2} \, dt

is the cumulative distribution function of the standard normal distribution. If we can prove this, then the original claim follows.

When I was solving this problem for the first time, my progress through the first few steps was hindered by algebra mistakes and the like, but I didn’t doubt that I was progressing toward the answer. At this point in the solution, however, I was genuinely stuck: nothing immediately came to mind for showing that g(x) must be greater than -1.

So I turned to Mathematica, just to make sure I was on the right track. Based on the graph, the function f(x) certainly looks positive.
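The same sanity check is easy to reproduce outside Mathematica. Here is a minimal sketch in Python (my own translation, not the original notebook), which computes \Phi via math.erfc for numerical stability and samples f on a grid:

```python
import math

def Phi(x):
    # Standard normal CDF via the complementary error function (numerically stable)
    return 0.5 * math.erfc(-x / math.sqrt(2))

def f(x):
    # f(x) = 1 + sqrt(2*pi) * x * exp(x^2/2) * Phi(x)
    return 1 + math.sqrt(2 * math.pi) * x * math.exp(x * x / 2) * Phi(x)

# Sample f on a grid; exp(x^2/2) overflows in double precision once |x| is
# much past 26, so we stay well inside that range.
xs = [-8 + 0.01 * k for k in range(1301)]   # -8 <= x <= 5
print(min(f(x) for x in xs) > 0)            # prints True: f > 0 at every sample
```

Of course, a positive minimum over a finite grid is evidence, not a proof.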

What’s more, the graph suggests attempting to prove a couple of things: f is an increasing function, and \displaystyle \lim_{x \to -\infty} f(x) = 0 or, equivalently, \displaystyle \lim_{x \to -\infty} g(x) = -1. If I could prove both of these claims, then that would prove that f must always be positive.
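Both suggested claims can also be spot-checked numerically before attempting a proof. A quick Python sketch (again only a sanity check, with an arbitrary window of sample points): every consecutive difference of f along a grid should be positive, and f should already be close to 0 at the left edge.

```python
import math

def f(x):
    # f(x) = 1 + sqrt(2*pi) * x * exp(x^2/2) * Phi(x), with Phi via erfc
    Phi = 0.5 * math.erfc(-x / math.sqrt(2))
    return 1 + math.sqrt(2 * math.pi) * x * math.exp(x * x / 2) * Phi

xs = [-8 + 0.01 * k for k in range(1301)]   # -8 <= x <= 5
ys = [f(x) for x in xs]

# f appears to be increasing: every consecutive difference is positive
print(all(b > a for a, b in zip(ys, ys[1:])))   # prints True

# and f is already small at the left edge of the window
print(ys[0])
```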

I started by trying to show

\displaystyle \lim_{x \to -\infty} g(x) = \lim_{x \to -\infty}  x e^{x^2/2} \int_{-\infty}^x e^{-t^2/2} \, dt = -1.

I vaguely remembered something about the asymptotic expansion of the above integral from a course decades ago, and so I consulted that course’s textbook, by Bender and Orszag, to refresh my memory. To derive the behavior of g(x) as x \to -\infty, we integrate by parts. (This is permissible: the integrands below are well-behaved if x<0, so that 0 is not in the range of integration.)

g(x) = \displaystyle x e^{x^2/2} \int_{-\infty}^x e^{-t^2/2} \, dt

= \displaystyle x e^{x^2/2} \int_{-\infty}^x \frac{1}{t} \frac{d}{dt} \left(-e^{-t^2/2}\right) \, dt

= \displaystyle  x e^{x^2/2} \left[ -\frac{1}{t} e^{-t^2/2} \right]_{-\infty}^x - x e^{x^2/2} \int_{-\infty}^x \frac{d}{dt} \left(\frac{1}{t} \right) \left( -e^{-t^2/2} \right) \, dt

= \displaystyle  x e^{x^2/2} \left[ -\frac{1}{x} e^{-x^2/2} - 0 \right] + |x| e^{x^2/2} \int_{-\infty}^x \frac{1}{t^2} e^{-t^2/2} \, dt

\displaystyle = -1 + |x| e^{x^2/2} \int_{-\infty}^x \frac{1}{t^2} e^{-t^2/2} \, dt.

This is agonizingly close: the leading term is -1 as expected. However, I was stuck for the longest time trying to show that the second term goes to zero as x \to -\infty.
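Numerically, at least, the claim is plausible. The computation above shows that the second term is exactly g(x) + 1, so we can watch it shrink as x decreases. A quick Python sketch (the sample points are arbitrary):

```python
import math

def g(x):
    # g(x) = sqrt(2*pi) * x * exp(x^2/2) * Phi(x), with Phi via erfc for stability
    Phi = 0.5 * math.erfc(-x / math.sqrt(2))
    return math.sqrt(2 * math.pi) * x * math.exp(x * x / 2) * Phi

# The second term |x| e^{x^2/2} * (integral) equals g(x) + 1, so it should
# shrink toward 0 as x -> -infinity, roughly like 1/x^2
for x in (-2, -5, -10, -20):
    print(x, g(x) + 1)
```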

So, once again, I consulted Bender and Orszag, which outlined how to show this. We note that

\left|g(x) + 1\right| = \displaystyle |x| e^{x^2/2} \int_{-\infty}^x \frac{1}{t^2} e^{-t^2/2} \, dt < \displaystyle |x| e^{x^2/2} \int_{-\infty}^x \frac{1}{x^2} e^{-t^2/2} \, dt = \displaystyle \frac{|g(x)|}{x^2},

since \frac{1}{t^2} < \frac{1}{x^2} for t < x < 0, and |x| e^{x^2/2} \int_{-\infty}^x e^{-t^2/2} \, dt = -g(x) = |g(x)| because g(x) < 0 when x < 0.

Therefore, dividing through by |g(x)| and taking limits,

\displaystyle \lim_{x \to -\infty} \left| \frac{g(x)+1}{g(x)} \right| \le \lim_{x \to -\infty} \frac{1}{x^2} = 0,

so that

\displaystyle \lim_{x \to -\infty} \left| \frac{g(x)+1}{g(x)} \right| =\displaystyle \lim_{x \to -\infty} \left| 1 + \frac{1}{g(x)} \right| = 0.

Therefore,

\displaystyle \lim_{x \to -\infty} \frac{1}{g(x)} = -1,

or

\displaystyle \lim_{x \to -\infty} g(x) = -1.
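As a final sanity check on the squeeze above, here is a short Python sketch verifying the key bound |g(x) + 1| < |g(x)|/x^2 at a few arbitrary sample points (note the absolute value on g, since g(x) < 0 for x < 0):

```python
import math

def g(x):
    # g(x) = sqrt(2*pi) * x * exp(x^2/2) * Phi(x)
    Phi = 0.5 * math.erfc(-x / math.sqrt(2))
    return math.sqrt(2 * math.pi) * x * math.exp(x * x / 2) * Phi

# Check the bound |g(x) + 1| < |g(x)| / x^2 at a few points x < 0,
# which is what forces g(x) -> -1 as x -> -infinity
for x in (-3, -6, -12):
    print(x, abs(g(x) + 1) < abs(g(x)) / x**2)   # prints True for each x
```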

So (I thought) I was halfway home with the solution, and all that remained was to show that f was an increasing function.

And I was completely stuck at this point for a long time.

Until I realized — much to my utter embarrassment — that showing f was increasing was completely unnecessary, as discussed in the next post.
