Predicate Logic and Popular Culture: Index

I’m doing something that I should have done a long time ago: collecting a series of posts into one single post. The following links comprised my series on using examples from popular culture to illustrate principles of predicate logic. My experiences teaching these ideas to my discrete mathematics students led to my recent publication (John Quintanilla, “Name That Tune: Teaching Predicate Logic with Popular Culture,” MAA Focus, Vol. 36, No. 4, pp. 27-28, August/September 2016).

Unlike other series that I’ve made, this series didn’t have a natural chronological order. So I’ll list them by the logical concept that each example from popular culture illustrates.

Logical and \land:

  • Part 1: “You Belong With Me,” by Taylor Swift
  • Part 21: “Do You Hear What I Hear,” covered by Whitney Houston
  • Part 31: The Godfather (1972)
  • Part 45: The Blues Brothers (1980)
  • Part 53: “What Does The Fox Say,” by Ylvis
  • Part 54: “Billie Jean,” by Michael Jackson
  • Part 98: “Call Me Maybe,” by Carly Rae Jepsen.

Logical or \lor:

  • Part 1: The Shawshank Redemption (1994)

Logical negation \lnot:

  • Part 1: Richard Nixon
  • Part 32: “Satisfaction!”, by the Rolling Stones
  • Part 39: “We Are Never Ever Getting Back Together,” by Taylor Swift

Logical implication \Rightarrow:

  • Part 1: Field of Dreams (1989), and also “Roam,” by the B-52s
  • Part 2: “Word Crimes,” by Weird Al Yankovic
  • Part 7: “I’ll Be There For You,” by The Rembrandts (Theme Song from Friends)
  • Part 43: “Kiss,” by Prince
  • Part 50: “I’m Still A Guy,” by Brad Paisley
  • Part 76: “You’re Never Fully Dressed Without A Smile,” from Annie.
  • Part 109: “Dancing in the Dark,” by Bruce Springsteen.
  • Part 122: “Keep Yourself Alive,” by Queen.

For all \forall:

  • Part 3: Casablanca (1942)
  • Part 4: A Streetcar Named Desire (1951)
  • Part 34: “California Girls,” by The Beach Boys
  • Part 37: The Fellowship of the Ring, by J. R. R. Tolkien
  • Part 49: “Buy Me A Boat,” by Chris Janson
  • Part 57: “Let It Go,” by Idina Menzel and from Frozen (2013)
  • Part 65: “Stars and Stripes Forever,” by John Philip Sousa.
  • Part 68: “Love Yourself,” by Justin Bieber.
  • Part 69: “I Will Always Love You,” by Dolly Parton (covered by Whitney Houston).
  • Part 74: “Faithfully,” by Journey.
  • Part 79: “We’re Not Gonna Take It Anymore,” by Twisted Sister.
  • Part 87: “Hungry Heart,” by Bruce Springsteen.
  • Part 99: “It’s the End of the World,” by R.E.M.
  • Part 100: “Hold the Line,” by Toto.
  • Part 101: “Break My Stride,” by Matthew Wilder.
  • Part 102: “Try Everything,” by Shakira.
  • Part 108: “BO$$,” by Fifth Harmony.
  • Part 113: “Sweet Caroline,” by Neil Diamond.
  • Part 114: “You Know Nothing, Jon Snow,” from Game of Thrones.
  • Part 118: “The Lazy Song,” by Bruno Mars.
  • Part 120: “Cold,” by Crossfade.
  • Part 123: “Always on My Mind,” by Willie Nelson.

For all and implication:

  • Part 8 and Part 9: “What Makes You Beautiful,” by One Direction
  • Part 13: “Safety Dance,” by Men Without Hats
  • Part 16: The Fellowship of the Ring, by J. R. R. Tolkien
  • Part 24: “The Chipmunk Song,” by The Chipmunks
  • Part 55: The Quiet Man (1952)
  • Part 62: “All My Exes Live In Texas,” by George Strait.
  • Part 70: “Wannabe,” by the Spice Girls.
  • Part 72: “You Shook Me All Night Long,” by AC/DC.
  • Part 81: “Ascot Gavotte,” from My Fair Lady
  • Part 82: “Sharp Dressed Man,” by ZZ Top.
  • Part 86: “I Could Have Danced All Night,” from My Fair Lady.
  • Part 95: “Every Breath You Take,” by The Police.
  • Part 96: “Only the Lonely,” by Roy Orbison.
  • Part 97: “I Still Haven’t Found What I’m Looking For,” by U2.
  • Part 105: “Every Rose Has Its Thorn,” by Poison.
  • Part 107: “Party in the U.S.A.,” by Miley Cyrus.
  • Part 112: “Winners Aren’t Losers,” by Donald J. Trump and Jimmy Kimmel.
  • Part 115: “Every Time We Touch,” by Cascada.
  • Part 117: “Stronger,” by Kelly Clarkson.

There exists \exists:

  • Part 10: “Unanswered Prayers,” by Garth Brooks
  • Part 15: “Stand by Your Man,” by Tammy Wynette (also from The Blues Brothers)
  • Part 36: Hamlet, by William Shakespeare
  • Part 57: “Let It Go,” by Idina Menzel and from Frozen (2013)
  • Part 93: “There’s No Business Like Show Business,” from Annie Get Your Gun (1946).
  • Part 94: “Not While I’m Around,” from Sweeney Todd (1979).
  • Part 104: “Wild Blue Yonder” (US Air Force)
  • Part 106: “No One,” by Alicia Keys.
  • Part 116: “Ocean Front Property,” by George Strait.

Existence and uniqueness:

  • Part 14: “Girls Just Want To Have Fun,” by Cyndi Lauper
  • Part 20: “All I Want for Christmas Is You,” by Mariah Carey
  • Part 23: “All I Want for Christmas Is My Two Front Teeth,” covered by The Chipmunks
  • Part 29: “You’re The One That I Want,” from Grease
  • Part 30: “Only You,” by The Platters
  • Part 35: “Hound Dog,” by Elvis Presley
  • Part 73: “Dust In The Wind,” by Kansas.
  • Part 75: “Happy Together,” by The Turtles.
  • Part 77: “All She Wants To Do Is Dance,” by Don Henley.
  • Part 90: “All You Need Is Love,” by The Beatles.

DeMorgan’s Laws:

  • Part 5: “Never Gonna Give You Up,” by Rick Astley
  • Part 28: “We’re Breaking Free,” from High School Musical (2006)

Simple nested predicates:

  • Part 6: “Everybody Loves Somebody Sometime,” by Dean Martin
  • Part 25: “Every Valley Shall Be Exalted,” from Handel’s Messiah
  • Part 33: “Heartache Tonight,” by The Eagles
  • Part 38: “Everybody Needs Somebody To Love,” by Wilson Pickett and covered in The Blues Brothers (1980)
  • Part 46: “Mean,” by Taylor Swift
  • Part 56: “Turn! Turn! Turn!” by The Byrds
  • Part 63: P. T. Barnum.
  • Part 64: Abraham Lincoln.
  • Part 66: “Somewhere,” from West Side Story.
  • Part 71: “Hold On,” by Wilson Phillips.
  • Part 80: Liverpool FC.
  • Part 84: “If You Leave,” by OMD.
  • Part 103: “The Caisson Song” (US Army).
  • Part 111: “Always Something There To Remind Me,” by Naked Eyes.
  • Part 121: “All the Right Moves,” by OneRepublic.

Maximum or minimum of a function:

  • Part 12: “For the First Time in Forever,” by Kristen Bell and Idina Menzel and from Frozen (2013)
  • Part 19: “Tennessee Christmas,” by Amy Grant
  • Part 22: “The Most Wonderful Time of the Year,” by Andy Williams
  • Part 48: “I Got The Boy,” by Jana Kramer
  • Part 60: “I Loved Her First,” by Heartland
  • Part 92: “Anything You Can Do,” from Annie Get Your Gun.
  • Part 119: “Uptown Girl,” by Billy Joel.

Somewhat complicated examples:

  • Part 11: “Friends in Low Places,” by Garth Brooks
  • Part 27: “There is a Castle on a Cloud,” from Les Misérables
  • Part 41: Winston Churchill
  • Part 44: Casablanca (1942)
  • Part 51: “Everybody Wants to Rule the World,” by Tears For Fears
  • Part 58: “Fifteen,” by Taylor Swift
  • Part 59: “We Are Never Ever Getting Back Together,” by Taylor Swift
  • Part 61: “Style,” by Taylor Swift
  • Part 67: “When I Think Of You,” by Janet Jackson.
  • Part 78: “Nothing’s Gonna Stop Us Now,” by Starship.
  • Part 89: “No One Is Alone,” from Into The Woods.
  • Part 110: “Everybody Loves My Baby,” by Louis Armstrong.

Fairly complicated examples:

  • Part 17: Richard Nixon
  • Part 47: “Homegrown,” by Zac Brown Band
  • Part 52: “If Ever You’re In My Arms Again,” by Peabo Bryson
  • Part 83: “Something Good,” from The Sound of Music.
  • Part 85: “Joy To The World,” by Three Dog Night.
  • Part 88: “Like A Rolling Stone,” by Bob Dylan.
  • Part 91: “Into the Fire,” from The Scarlet Pimpernel.

Really complicated examples:

  • Part 18: “Sleigh Ride,” covered by Pentatonix
  • Part 26: “All the Gold in California,” by the Gatlin Brothers
  • Part 40: “One of These Things Is Not Like the Others,” from Sesame Street
  • Part 42: “Take It Easy,” by The Eagles

A Long-Sought Proof, Found and Almost Lost

I enjoyed this article from Quanta Magazine, both for its mathematical content and for the human interest story.

A Long-Sought Proof, Found and Almost Lost

From the opening paragraphs:

Known as the Gaussian correlation inequality (GCI), the conjecture originated in the 1950s, was posed in its most elegant form in 1972 and has held mathematicians in its thrall ever since. “I know of people who worked on it for 40 years,” said Donald Richards, a statistician at Pennsylvania State University. “I myself worked on it for 30 years.”

[Thomas] Royen hadn’t given the Gaussian correlation inequality much thought before the “raw idea” for how to prove it came to him over the bathroom sink… In July 2014, still at work on his formulas as a 67-year-old retiree, Royen found that the GCI could be extended into a statement about statistical distributions he had long specialized in. On the morning of the 17th, he saw how to calculate a key derivative for this extended GCI that unlocked the proof. “The evening of this day, my first draft of the proof was written,” he said.

Not knowing LaTeX, the word processor of choice in mathematics, he typed up his calculations in Microsoft Word, and the following month he posted his paper to the academic preprint site arxiv.org. He also sent it to Richards, who had briefly circulated his own failed attempt at a proof of the GCI a year and a half earlier. “I got this article by email from him,” Richards said. “And when I looked at it I knew instantly that it was solved” …

Proofs of obscure provenance are sometimes overlooked at first, but usually not for long: A major paper like Royen’s would normally get submitted and published somewhere like the Annals of Statistics, experts said, and then everybody would hear about it. But Royen, not having a career to advance, chose to skip the slow and often demanding peer-review process typical of top journals. He opted instead for quick publication in the Far East Journal of Theoretical Statistics, a periodical based in Allahabad, India, that was largely unknown to experts and which, on its website, rather suspiciously listed Royen as an editor. (He had agreed to join the editorial board the year before.)

With this red flag emblazoned on it, the proof continued to be ignored… No one is quite sure how, in the 21st century, news of Royen’s proof managed to travel so slowly. “It was clearly a lack of communication in an age where it’s very easy to communicate,” Klartag said.

A nice article on recent progress on solving the twin prime conjecture

The twin prime conjecture (see here, here and here for more information) asserts that there are infinitely many pairs of primes that differ by 2. For example:

  • 3 and 5 are twin primes;
  • 5 and 7 are twin primes;
  • 11 and 13 are twin primes;
  • 17 and 19 are twin primes;
  • 29 and 31 are twin primes; etc.
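If you want to generate more examples for a class, a minimal Python sketch does the job (plain trial division, so only suitable for small bounds):

    # A minimal sketch: list the twin prime pairs below a given bound.
    # Trial division is slow but fine for small bounds like this.
    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    def twin_primes(bound):
        return [(p, p + 2) for p in range(2, bound - 1) if is_prime(p) and is_prime(p + 2)]

    print(twin_primes(100))
    # [(3, 5), (5, 7), (11, 13), (17, 19), (29, 31), (41, 43), (59, 61), (71, 73)]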

While most mathematicians believe the twin prime conjecture is correct, an explicit proof has not been found. Indeed, this has been one of the most popular unsolved problems in mathematics — not necessarily because it’s important, but for the curiosity that a conjecture so simply stated has eluded conquest by the world’s best mathematicians.

Still, research continues, and some major progress has been made in the past few years. (I like sharing this story with my students to convince them that not everything that can be known about mathematics has been figured out yet — a misconception encouraged by the structure of the secondary curriculum — and that research continues to this day.) Specifically, it was recently shown that, for some integer N that is less than 70 million, there are infinitely many pairs of primes that differ by N.

http://video.newyorker.com/watch/annals-of-ideas-yitang-zhang-s-discovery-2015-01-28

http://www.newyorker.com/magazine/2015/02/02/pursuit-beauty


My Favorite One-Liners: Part 100

In this series, I’m compiling some of the quips and one-liners that I’ll use with my students to hopefully make my lessons more memorable for them.

Today’s quip is one that I’ll use surprisingly often:

If you ever meet a mathematician at a bar, ask him or her, “What is your favorite application of the Cauchy-Schwarz inequality?”

The point is that the Cauchy-Schwarz inequality arises surprisingly often in the undergraduate mathematics curriculum, and so I make a point to highlight it when I use it. For example, off the top of my head:

1. In trigonometry, the Cauchy-Schwarz inequality states that

|{\bf u} \cdot {\bf v}| \le \; \parallel \!\! {\bf u} \!\! \parallel \cdot \parallel \!\! {\bf v} \!\! \parallel

for all vectors {\bf u} and {\bf v}. Consequently,

-1 \le \displaystyle \frac{ {\bf u} \cdot {\bf v} } {\parallel \!\! {\bf u} \!\! \parallel \cdot \parallel \!\! {\bf v} \!\! \parallel} \le 1,

which means that the angle

\theta = \cos^{-1} \left( \displaystyle \frac{ {\bf u} \cdot {\bf v} } {\parallel \!\! {\bf u} \!\! \parallel \cdot \parallel \!\! {\bf v} \!\! \parallel} \right)

is defined. This is the measure of the angle between the two vectors {\bf u} and {\bf v}.

2. In probability and statistics, the standard deviation of a random variable X is defined as

\hbox{SD}(X) = \sqrt{E(X^2) - [E(X)]^2}.

The Cauchy-Schwarz inequality assures that the quantity under the square root is nonnegative, so that the standard deviation is actually defined. Also, the Cauchy-Schwarz inequality can be used to show that \hbox{SD}(X) = 0 implies that X is a constant almost surely.

3. Also in probability and statistics, the correlation between two random variables X and Y must satisfy

-1 \le \hbox{Corr}(X,Y) \le 1.

Furthermore, if \hbox{Corr}(X,Y)=1, then Y= aX +b for some constants a and b, where a > 0. On the other hand, if \hbox{Corr}(X,Y)=-1, then Y= aX +b for some constants a and b, where a < 0.
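To make items 1 and 3 concrete for students, here is a small numerical check (a sketch in Python; the vectors and simulated data below are arbitrary choices of mine):

    import math
    import random

    # Item 1: the Cauchy-Schwarz ratio lies in [-1, 1], so acos is defined.
    def norm(w):
        return math.sqrt(sum(c * c for c in w))

    u = [1.0, 2.0, 2.0]                        # arbitrary example vectors
    v = [3.0, 0.0, 4.0]
    ratio = sum(a * b for a, b in zip(u, v)) / (norm(u) * norm(v))
    print(-1 <= ratio <= 1)                    # True
    print(math.degrees(math.acos(ratio)))      # the angle between u and v, in degrees

    # Item 3: sample correlations land in [-1, 1] for the same reason.
    def corr(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        syy = sum((y - my) ** 2 for y in ys)
        return sxy / math.sqrt(sxx * syy)

    random.seed(0)
    xs = [random.gauss(0, 1) for _ in range(10_000)]
    ys = [2 * x + random.gauss(0, 1) for x in xs]
    print(-1 <= corr(xs, ys) <= 1)             # True
    print(corr(xs, [3 * x + 7 for x in xs]))   # perfectly linear data: 1.0 (up to rounding)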

Since I’m a mathematician, I guess my favorite application of the Cauchy-Schwarz inequality appears in my first professional article, where the inequality was used to confirm some new bounds that I derived with my graduate adviser.

My Favorite One-Liners: Part 99

In this series, I’m compiling some of the quips and one-liners that I’ll use with my students to hopefully make my lessons more memorable for them.

Today’s quip is a light-hearted one-liner that I’ll use to lighten the mood when in the middle of a complex calculation, like the following limit problem from calculus:

Let f(x) = 11-4x. Find \delta so that |f(x) - 3| < \epsilon whenever |x-2| < \delta.

The solution of this problem requires isolating x in the above inequality:

|(11-4x) - 3| < \epsilon

|8-4x| < \epsilon

-\epsilon < 8 - 4x < \epsilon

-8-\epsilon < -4x < -8 + \epsilon

At this point, the next step is dividing by -4. So, I’ll ask my class,

When we divide by -4, what happens to the crocodiles?

This usually gets the desired laugh, recalling the middle-school rule that the insatiable “crocodiles” of an inequality always open toward the larger quantity, leading to the next step:

2 + \displaystyle \frac{\epsilon}{4} > x > 2 - \displaystyle \frac{\epsilon}{4},

so that

\delta = \min \left( \left[ 2 + \displaystyle \frac{\epsilon}{4} \right] - 2, 2 - \left[2 - \displaystyle \frac{\epsilon}{4} \right] \right) = \displaystyle \frac{\epsilon}{4}.

Formally completing the proof requires starting with |x-2| < \displaystyle \frac{\epsilon}{4} and ending with |f(x) - 3| < \epsilon.
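For the skeptical, this choice of \delta is also easy to spot-check numerically (a quick Python sketch, for reassurance only; it is of course not a proof):

    # Spot-check that delta = epsilon/4 works for f(x) = 11 - 4x near x = 2:
    # whenever |x - 2| < delta, we should find |f(x) - 3| < epsilon.
    def f(x):
        return 11 - 4 * x

    epsilon = 0.01
    delta = epsilon / 4

    xs = [2 + delta * k / 1000 for k in range(-999, 1000)]   # sample points with |x - 2| < delta
    print(all(abs(f(x) - 3) < epsilon for x in xs))          # True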

My Favorite One-Liners: Part 88

In this series, I’m compiling some of the quips and one-liners that I’ll use with my students to hopefully make my lessons more memorable for them.

In the first few weeks of my calculus class, after introducing the definition of a derivative,

\displaystyle \frac{dy}{dx} = y' = f'(x) = \lim_{h \to 0} \displaystyle \frac{f(x+h) - f(x)}{h},

I’ll use the following steps to guide my students to find the derivatives of polynomials.

  1. If f(x) = c, a constant, then \displaystyle \frac{d}{dx} (c) = 0.
  2. If f(x) and g(x) are both differentiable, then (f+g)'(x) = f'(x) + g'(x).
  3.  If f(x) is differentiable and c is a constant, then (cf)'(x) = c f'(x).
  4. If f(x) = x^n, where n is a nonnegative integer, then f'(x) = n x^{n-1}.
  5. If f(x) = a_n x^n + a_{n-1} x^{n-1} + \dots + a_1 x + a_0 is a polynomial, then f'(x) = n a_n x^{n-1} + (n-1) a_{n-1} x^{n-2} + \dots + a_1.
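These rules are easy to spot-check symbolically before class (a quick sketch using SymPy; the sample polynomial is simply my own choice):

    import sympy as sp

    x = sp.symbols('x')

    # Rule 4: the power rule.
    print(sp.diff(x**7, x))               # 7*x**6

    # Rule 5: differentiate a polynomial term by term.
    p = 4*x**3 - 5*x**2 + 2*x - 9         # an arbitrary example polynomial
    print(sp.diff(p, x))                  # 12*x**2 - 10*x + 2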

After doing a few examples to help these concepts sink in, I’ll show the following two examples with about 3-4 minutes left in class.

Example 1. Let A(r) = \pi r^2. Notice I’ve changed the variable from x to r, but that’s OK. Does this remind you of anything? (Students answer: the area of a circle.)

What’s the derivative? Remember, \pi is just a constant. So A'(r) = \pi \cdot 2r = 2\pi r.

Does this remind you of anything? (Students answer: Whoa… the circumference of a circle.)

Generally, students start waking up even though it’s near the end of class. I continue:

Example 2. Now let’s try V(r) = \displaystyle \frac{4}{3} \pi r^3. Does this remind you of anything? (Students answer: the volume of a sphere.)

What’s the derivative? Again, \displaystyle \frac{4}{3} \pi is just a constant. So V'(r) = \displaystyle \frac{4}{3} \pi \cdot 3r^2 = 4\pi r^2.

Does this remind you of anything? (Students answer: Whoa… the surface area of a sphere.)

By now, I’ve really got my students’ attention with this unexpected connection between these formulas from high school geometry. If I’ve timed things right, I’ll say the following with about 30-60 seconds left in class:

Hmmm. That’s interesting. The derivative of the area of a circle is the circumference of the circle, and the derivative of the volume of a sphere is the surface area of the sphere. I wonder why this works. Any ideas? (Students: stunned silence.)

This is what’s known as a cliff-hanger, and I’ll give you the answer at the start of class tomorrow. (Students groan, as they really want to know the answer immediately.) Class is dismissed.

If you’d like to see the answer, see my previous post on this topic.
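(For the record, the computations in both examples are easy to confirm symbolically; here is a quick SymPy sketch, with the setup being mine.)

    import sympy as sp

    r = sp.symbols('r', positive=True)

    A = sp.pi * r**2                       # area of a circle
    V = sp.Rational(4, 3) * sp.pi * r**3   # volume of a sphere

    print(sp.diff(A, r))                   # 2*pi*r, the circumference
    print(sp.diff(V, r))                   # 4*pi*r**2, the surface area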

My Favorite One-Liners: Part 50

In this series, I’m compiling some of the quips and one-liners that I’ll use with my students to hopefully make my lessons more memorable for them.

Here’s today’s one-liner: “To prove that two things are equal, show that the difference is zero.” This principle is surprisingly handy in the secondary mathematics curriculum. For example, it is the basis for the proof of the Mean Value Theorem, one of the most important theorems in calculus that serves as the basis for curve sketching and the uniqueness of antiderivatives (up to a constant).

And I have a great story that goes along with this principle, from 30 years ago.

I forget the exact question out of Apostol’s calculus, but there was some equation that I had to prove on my weekly homework assignment that, for the life of me, I just couldn’t get. And for no good reason, I had a flash of insight: subtract the left- and right-hand sides. While it was very difficult to turn the left side into the right side, it turned out that, for this particular problem, it was very easy to show that the difference was zero. (Again, I wish I could remember exactly which question this was so that I could show this technique and this particular example to my own students.)

So I finished my homework, and I went outside to a local basketball court and worked on my jump shot.

Later that week, I went to class, and there was a great buzz in the air. It took ten seconds to realize that everyone was up in arms about how to do this particular problem. Despite the intervening 30 years, I remember the scene as clear as a bell. I can still hear one of my classmates ask me, “Quintanilla, did you get that one?”

I said with great pride, “Yeah, I got it.” And I showed them my work.

And, neither before then nor since, have I heard cussing as intense as what followed.

Truth be told, probably the only reason that I remember this story from my adolescence is that I usually was the one who had to ask for help on the hardest homework problems in that Honors Calculus class. This may have been the one time in that entire two-year calculus sequence that I actually figured out a homework problem that had stumped everybody else.

My Favorite One-Liners: Part 46

In this series, I’m compiling some of the quips and one-liners that I’ll use with my students to hopefully make my lessons more memorable for them. Today’s one-liner is something I’ll use after completing some monumental calculation. For example, if z, w \in \mathbb{C}, the proof of the triangle inequality is no joke, as it requires the following as lemmas:

  • \overline{z + w} = \overline{z} + \overline{w}
  • \overline{zw} = \overline{z} \cdot \overline{w}
  • z + \overline{z} = 2 \hbox{Re}(z)
  • |\hbox{Re}(z)| \le |z|
  • |z|^2 = z \cdot \overline{z}
  • \overline{~\overline{z}~} = z
  • |\overline{z}| = |z|
  • |z \cdot w| = |z| \cdot |w|

With all that as prelude, we have

|z+w|^2 = (z + w) \cdot \overline{z+w}

= (z+w) (\overline{z} + \overline{w})

= z \cdot \overline{z} + z \cdot \overline{w} + \overline{z} \cdot w + w \cdot \overline{w}

= |z|^2 + z \cdot \overline{w} + \overline{z} \cdot w + |w|^2

= |z|^2  + z \cdot \overline{w} + \overline{z} \cdot \overline{~\overline{w}~} + |w|^2

= |z|^2 + z \cdot \overline{w} + \overline{z \cdot \overline{w}} + |w|^2

= |z|^2 + 2 \hbox{Re}(z \cdot \overline{w}) + |w|^2

\le |z|^2 + 2 |z \cdot \overline{w}| + |w|^2

= |z|^2 + 2 |z| \cdot |\overline{w}| + |w|^2

= |z|^2 + 2 |z| \cdot |w| + |w|^2

= (|z| + |w|)^2

In other words,

|z+w|^2 \le (|z| + |w|)^2.

Since |z+w| and |z| + |w| are both nonnegative, we can conclude that

|z+w| \le |z| + |w|.

QED

In my experience, that’s a lot for students to absorb all at once when seeing it for the first time. So I try to celebrate this accomplishment:

Anybody ever watch “Home Improvement”? This is a Binford 6100 “more power” mathematical proof. Grunt with me: RUH-RUH-RUH-RUH!!!
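For the skeptics in the room, the final inequality is also easy to spot-check numerically with Python’s built-in complex numbers (a sketch; the random sampling is my own addition, not part of the proof):

    import random

    random.seed(1)

    def random_complex():
        return complex(random.uniform(-10, 10), random.uniform(-10, 10))

    # Check |z + w| <= |z| + |w| on random samples (with a tiny slack for rounding).
    pairs = [(random_complex(), random_complex()) for _ in range(100_000)]
    print(all(abs(z + w) <= abs(z) + abs(w) + 1e-12 for z, w in pairs))   # True

    # One of the lemmas, |z * w| = |z| * |w|, up to floating-point roundoff:
    z, w = pairs[0]
    print(abs(abs(z * w) - abs(z) * abs(w)) < 1e-9)                       # True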

My Favorite One-Liners: Part 43

In this series, I’m compiling some of the quips and one-liners that I’ll use with my students to hopefully make my lessons more memorable for them.

Years ago, my first class of students decided to call me “Dr. Q” instead of “Dr. Quintanilla,” and the name has stuck ever since. And I’ll occasionally use this to my advantage when choosing names of variables. For example, here’s a typical proof by induction involving divisibility.

Theorem: If n \ge 1 is a positive integer, then 5^n - 1 is a multiple of 4.

Proof. By induction on n.

n = 1: 5^1 - 1 = 4, which is clearly a multiple of 4.

n: Assume that 5^n - 1 is a multiple of 4.

At this point in the calculation, I ask how I can write this statement as an equation. Eventually, somebody will volunteer that if 5^n-1 is a multiple of 4, then 5^n-1 is equal to 4 times something. At which point, I’ll volunteer:

Yes, so let’s name that something with a variable. Naturally, we should choose something important, something regal, something majestic… so let’s choose the letter q. (Groans and laughter.) It’s good to be the king.

So the proof continues:

n: Assume that 5^n - 1 = 4q, where q is an integer.

n+1: We wish to show that 5^{n+1} - 1 is also a multiple of 4.

At this point, I’ll ask my class how we should write this. Naturally, I give them no choice in the matter:

We wish to show that 5^{n+1} - 1 = 4Q, where Q is some (possibly different) integer.

Then we continue the proof:

5^{n+1} - 1 = 5^n 5^1 - 1

= 5 \times 5^n - 1

= 5 \times (4q + 1) - 1 by the induction hypothesis

= 20q + 5 - 1

= 20q + 4

= 4(5q + 1).

So if we let Q = 5q +1, then 5^{n+1} - 1 = 4Q, where Q is an integer because q is also an integer.

QED
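The theorem, and the relationship Q = 5q + 1, can also be spot-checked quickly for the first several exponents (a minimal Python sketch):

    # Check that 5**n - 1 is a multiple of 4, and that Q = 5q + 1 is indeed
    # the corresponding quotient for the next exponent.
    for n in range(1, 11):
        q = (5**n - 1) // 4
        Q = (5**(n + 1) - 1) // 4
        assert (5**n - 1) % 4 == 0
        assert Q == 5 * q + 1
    print("verified for n = 1 through 10")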


On the flip side of braggadocio, the formula for the binomial distribution is

P(X = k) = \displaystyle {n \choose k} p^k q^{n-k},

where X is the number of successes in n independent and identically distributed trials, where p represents the probability of success on any one trial, and (to my shame) q = 1 - p is the probability of failure.
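In code form, the formula looks like this (a sketch using only Python’s standard library; the function name is mine):

    from math import comb

    def binomial_pmf(k, n, p):
        """P(X = k) for X ~ Binomial(n, p)."""
        q = 1 - p                       # the probability of failure
        return comb(n, k) * p**k * q**(n - k)

    print(binomial_pmf(3, 10, 0.5))     # 0.1171875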

 

 

My Favorite One-Liners: Part 13

In this series, I’m compiling some of the quips and one-liners that I’ll use with my students to hopefully make my lessons more memorable for them.

Here’s a story that I’ll tell my students when, for the first time in a semester, I’m about to use a previous theorem to make a major step in proving a theorem. For example, I may have just finished the proof of

\hbox{Var}(X+Y) = \hbox{Var}(X) + \hbox{Var}(Y),

where X and Y are independent random variables, and I’m about to prove that

\hbox{Var}(X-Y) = \hbox{Var}(X) + \hbox{Var}(Y).

While this can be done by starting from scratch and using the definition of variance, the easiest thing to do is to write

\hbox{Var}(X-Y) = \hbox{Var}(X+[-Y]) = \hbox{Var}(X) + \hbox{Var}(-Y) = \hbox{Var}(X) + \hbox{Var}(Y),

since X and -Y are independent whenever X and Y are, and \hbox{Var}(-Y) = (-1)^2 \hbox{Var}(Y) = \hbox{Var}(Y). This uses the result of the first theorem to prove the next theorem.
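A quick simulation makes the identity concrete for students (a Python sketch; the particular distributions are arbitrary choices of mine):

    import random

    random.seed(0)
    N = 200_000

    def var(samples):
        m = sum(samples) / len(samples)
        return sum((s - m) ** 2 for s in samples) / len(samples)

    # Independent X and Y with different variances.
    xs = [random.gauss(0, 3) for _ in range(N)]     # Var(X) = 9
    ys = [random.gauss(5, 2) for _ in range(N)]     # Var(Y) = 4

    print(var([x - y for x, y in zip(xs, ys)]))     # close to 13 = Var(X) + Var(Y)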

And so I have a little story that I tell students about this principle. I think I was 13 when I first heard this one, and obviously it’s stuck with me over the years.

At MIT, there’s a two-part entrance exam to determine who will be the engineers and who will be the mathematicians. For the first part of the exam, students are led one at a time into a kitchen. There’s an empty pot on the floor, a sink, and a stove. The assignment is to boil water. Everyone does exactly the same thing: they fill the pot with water, place it on the stove, and then turn the stove on. Everyone passes.

For the second part of the exam, students are led one at a time again into the kitchen. This time, there’s a pot full of water sitting on the stove. The assignment, once again, is to boil water. Nearly everyone simply turns on the stove. These students are led off to become engineers. The mathematicians are ones who take the pot off the stove, dump the water into the sink, and place the empty pot on the floor… thereby reducing to the original problem, which had already been solved.