## Wednesday, April 02, 2008

### First Draft to Publication, Fifteen Years

I had hoped to be back to regular posting by now, a week and a half after my return from China, but things are still pretty chaotic!

I posted about a year ago on the Two Envelope Paradox, and my paper with Josh Dever on the topic has finally been published (as of Monday, in Sorites).

In 1993, when Josh and I were both graduate students in Berkeley, he introduced me to the paradox, which is very simple to formulate:
You are presented with a choice between two envelopes, Envelope A and Envelope B. You know that one envelope has half as much money as the other, but you don't know which has more. Arbitrarily, you choose Envelope A. Then you think to yourself: should I switch to Envelope B instead? There's a 50-50 chance it has twice as much money and a 50-50 chance it has half as much. And since double or nothing is a fair bet, double or half should be more than fair! Using the tools of formal decision theory, you might call "X" the amount of money in Envelope A and then calculate the expectation of switching as (0.5)(0.5X) + (0.5)(2X) = (5/4)X. So you switch. (But of course that's absurd.)
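A quick way to feel the absurdity is to simulate the game. Here's a minimal sketch; the uniform dollar range and trial count are my own illustrative assumptions, not part of the problem. If the (5/4)X reasoning were sound, switching would pay about 25% more on average:

```python
import random

# Monte Carlo sketch of the two-envelope game (illustrative setup).
# "Stay" keeps the envelope you first picked; "switch" takes the other.
random.seed(0)
trials = 200_000
stay_total = switch_total = 0.0
for _ in range(trials):
    x = random.uniform(1, 100)      # the smaller amount (assumed distribution)
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)       # Envelope A is chosen arbitrarily
    stay_total += envelopes[0]      # value of keeping Envelope A
    switch_total += envelopes[1]    # value of switching to Envelope B
stay_avg = stay_total / trials
switch_avg = switch_total / trials
# The two averages agree to within sampling noise: switching gains nothing.
```

On any run, the two averages come out essentially equal (both near 1.5 times the mean of the smaller amount), not in a 5-to-4 ratio.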
For some reason, the problem completely took hold of me. I found myself waking in the middle of the night and writing equations. Josh and I bothered just about every graduate student at Berkeley and about half the faculty with the problem. It seemed to me, to us, that the core problem was in the use of a variable with different expectations in different terms of the expected value equation (in the first term, where Envelope A has more, the expectation of X, the value in Envelope A, is higher than it is in the second term, which represents the possibility that Envelope A has less). Just about everyone we spoke to was eventually won over by our reasoning on this, and I presented a paper on it at a graduate student conference later that year.
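That diagnosis can itself be checked numerically. In a sketch like the one below (same hypothetical uniform setup as above, again my assumption), the expected value of X, the amount in your envelope, really does differ between the two terms of the equation:

```python
import random

# Conditional expectations of X, the amount in the envelope you hold.
# The paradoxical equation uses one X for both terms, but X averages
# about twice as much in the "A has more" cases as in the "A has less"
# cases. (Uniform setup is an illustrative assumption.)
random.seed(1)
when_more, when_less = [], []
for _ in range(100_000):
    x = random.uniform(1, 100)
    amounts = [x, 2 * x]
    random.shuffle(amounts)
    a, b = amounts                  # a is the amount in Envelope A, your envelope
    if a > b:
        when_more.append(a)         # first term: Envelope A has more
    else:
        when_less.append(a)         # second term: Envelope A has less
mean_when_more = sum(when_more) / len(when_more)
mean_when_less = sum(when_less) / len(when_less)
# mean_when_more is roughly double mean_when_less, so the single-X
# equation silently equivocates between two different expectations.
```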

For a while, I flirted with the idea of writing my dissertation on decision theory, but when I decided to work on connections between philosophy and developmental psychology instead, it seemed the practical decision to set the essay aside. (Berkeley had at the time, and maybe still has, a culture of discouraging graduate students from attempting to publish essays based on anything other than a virtually completed dissertation.)

A couple years later, one of our professors, Charles Chihara, published a paper on the problem (in which he generously thanks me) with a solution similar to ours but also in some important ways different -- and not, it seemed to me, very mathematically precise. Other approaches to the problem came out through the mid- and late 1990s, when it was briefly trendy, but all of them seemed to me to miss the point.

In 2002, I had a long conversation about the problem with Terry Horgan, who had published a couple of papers on it, and I felt myself almost convincing him that my solution was better than his own. (He might not agree with this description of our conversation!) He advised that I seek publication again, so I teamed up with Josh and wrote a new version of the essay.

Anyone out there with a more convoluted publication story?

1. Splintered mind... hey, you have a lot in your head. It would have been interesting to meet you and have a chat.

2. I am honestly confused by this paradox. The expectation equation should be written differently. You don't stand a chance of losing half or doubling in that way. You have to account for the probability that you have the high prize as well. So you have p(v) -- the chance that you have the valuable prize. Then to calculate your average return, you take p(v)*2a + p(v)*a (not a/2 -- you are double counting if you do that). This happens to be 1.5a, but you have to divide by the mean return -- 1.5a. Thus, no paradox.

This is more sensible if you normalize your return, and say that the prizes are 1/3 and 2/3. Thus, the odds of making more money off a switch are .5*1/3+.5*2/3, or 50/50.

I am pretty sure this makes sense because I did it in stats class a few years ago (statistics for work). Is there something I am missing?

3. Right, that's the sensible way to do it -- set your variable equal to the amount in the envelope with less. But what's wrong with setting the variable equal to the amount in the envelope you have? Why is one variable choice okay and the other not? That's the tough question (in our view). But you're right that this way of putting the paradox doesn't highlight that question.

4. As you say, variable choices are the key. In this case, the variable choice is actively wrong. You have 2 envelopes, one with amount A, one with 2A. The math needs to reflect the sum. You don't _know_ what you have, and you are actually miswriting the expectation. Write out the equations as if you knew you had the larger or the smaller -- that should clarify this.

This is more interesting in the inverse -- why do people commonly pick the version of the equation you did, and not see an error? Something about our minds, math, and understanding makes otherwise simple questions much more difficult. That is the most interesting issue, and it is absolutely common in statistics.

5. Thanks, Brennan! Most mathematicians I speak to, like you, don't seem to feel the pull of the wrong way to put things or see why anyone would be tempted by it -- that's good intuitive math! But few can back up their intuition with convincing reasons, and I don't think you have (yet) done so here.

Yes, one envelope has X and one has 2X. But equally -- one might say -- one has Y and one has either half Y or 2Y. I don't think you've articulated yet what's wrong with thinking of things in the second way -- the way that leads to the paradox!

6. I recently became interested in this paradox through the Wikipedia entry and after reading just about every "solution" linked from that page, and some others not linked, I have to say that I found yours to be the clearest and most persuasive. I was wondering if you've looked at the non-probabilistic variant proposed by Raymond Smullyan, and the solution to it offered by James Chase. It seems to have some similarities to your solution, despite not explicitly dealing with probability or expected value.

7. Thanks, Nathan! The Smullyan sounds vaguely familiar. Perhaps I looked at it and decided it wasn't relevant? Do you have a citation for the Smullyan and the Chase?

8. I've recently become fascinated by this problem and spent hours (if not days) scouring the web for information on it, and I have to say that your analysis is crystal clear compared with most of the information on this problem.

I still believe that perhaps there is an even more intuitive or simpler explanation of where the misstep is that spawns the paradox (e.g., as simple as the Monty Hall Problem solution), but your explanation certainly has the ring of truth to it. I believe that one might be able to further distill your solution into a simpler one that convinces on a more intuitive level.

Then again, simple problems don't always have simple solutions.

9. Thanks for the kind comment, Dave!