Comments on The Splintered Mind: "The Two Envelope Paradox" (28 comments)

Eric Schwitzgebel (2011-06-21):

Odatafan: Nicely put. I think that makes a lot of sense, which is why I find the "closed envelope" version more interesting and puzzling than the more traditional "open envelope" version. In the closed envelope version, you don't get to see what is in Envelope A. Clearly, there's no point in switching. And yet there seems to be a compelling argument that you should switch. Figuring out what is wrong with that argument is the challenge!

Odatafan (2011-06-21):

I'd like to know if this is "satisfactory"...

One <a href="http://wnio.blogspot.com/2011/06/two-envelope-problem.html" rel="nofollow">solution</a> is this: switching when you find a very low value is obviously a good plan, but if you see a value so large that it surprises you, it might be a bad idea to switch. If the amount X in the envelope surprises you because it is very large, that's because you guessed that the average envelope contains much less. The fact that you saw X is consistent with either of the following:

> The envelopes are better than I thought, and the prizes are X/2 and X.

> The envelopes are much better than I thought, and the prizes are X and 2X.

If your "surprise curve" says the odds of those two conditions are, say, 9:4 in favor of the first case, then you should not switch: the cost of switching down exceeds the benefit of switching up. Now suppose you are surprised to see X but figure, "I would be more surprised to see twice as much, though not *twice* as surprised." Then you are assuming that the distribution of what the value in an envelope could be is very flat -- the chance of "between 2X and 3X" exceeds half the chance of "between X and 1.5X". But then we can add up the missing intervals, and "the chance of being somewhere" diverges. I think people do, psychologically, cling to their ignorance, and this is part of the fun of the paradox: to behave rationally, a person must guess a distribution of what the value could be, and that distribution has to be tight -- gathering most of its support in a small area -- in order to be integrable. If at this moment you rebel and say, "I'd rather keep my distribution wide than keep it integrable," let me ask whether the probability that the prize is between 1 and a million coins is greater than 0.00001. If you believe that, or any similar statement with the parameters "1", "a million", and "0.00001" changed to any values at all, then you believe in integrability. Not believing in integrability is tantamount to concentrating your distribution at infinity: for any finite number, you really think the prize is probably bigger than that.

Another reason the paradox is compelling is that you should definitely switch <a href="http://wnio.blogspot.com/2011/06/wallet-wrongfully-returned.html" rel="nofollow">from a median value</a>.
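Odatafan's 9:4 calculation can be made explicit. A minimal sketch (the function name and the use of exact fractions are my own, not from the comment):

```python
from fractions import Fraction

def switch_gain(odds_down, odds_up, x=Fraction(1)):
    """Expected gain from switching when the posterior odds are
    odds_down : odds_up that the pair is (x/2, x) versus (x, 2x)."""
    total = odds_down + odds_up
    p_down = Fraction(odds_down, total)   # other envelope holds x/2: lose x/2
    p_up = Fraction(odds_up, total)       # other envelope holds 2x: gain x
    return p_down * (-x / 2) + p_up * x

# The 9:4 case from the comment: switching has negative expected value.
print(switch_gain(9, 4))   # -1/26
# Break-even is at odds 2:1; flatter "surprise curves" favor switching.
print(switch_gain(2, 1))   # 0
```

At any odds flatter than 2:1 against the larger pair, `switch_gain` is positive, which is exactly the condition the comment says cannot hold for every X under an integrable prior.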
This is balanced in the expected-value calculation by the great sums you would lose by switching from a high value.

Eric Schwitzgebel (2009-06-22):

Ah, I get it, Benjamin -- sorry for being dense. I'm inclined to agree, although some utility measurement may be even better than "credit" as you describe it.

However, I do not think that this (closed envelope) version of the two envelope paradox assumes infinite money or credit. The open-envelope version does, if you assume a distribution on which, whatever value you see, you think there's a 50% likelihood that the other envelope has more. Mathematically, I think the open-envelope and closed-envelope versions are very different.

dfhwze (2009-06-22):

Eric,

I hope to clarify my previous post with an example.

Let's call the unit of credit a goldpiece and the unit of money a dollar. Assume there are 5,000 goldpieces in the world, represented by 1,000 dollars, so each dollar is worth 5 goldpieces. Assume also that a car costs 100 goldpieces.

Suppose I have 100 dollars, worth 500 goldpieces. I can buy 5 cars with this amount of money.

The national banks now start to print more money. We now have 2,000 dollars representing our 5,000 goldpieces, so each dollar now represents 2.5 goldpieces.

Suppose I have 100 dollars, now worth 250 goldpieces. I can buy 2.5 cars with this amount of money.

Where in the latter case I can buy 2.5 cars, in the former case I can buy 5 cars with the SAME amount of money (100 dollars). This insight is necessary to realize that credit, not money, is what we should examine in order to say how rich we are.

In the two-envelope problem, any amount of money could be in the envelopes. This means there is an infinite amount of money representing a finite amount of credit, so the credit value per monetary unit is zero. The value of one dollar, like the value of one million dollars, is zero. So losing an amount of money, or gaining double that amount, would both be zero-profit events.

In reality, though, we will never have an infinite amount of money in the world, which makes the problem purely theoretical. However, from time to time we have witnessed massive printing of money, causing very high devaluation. For instance, in Germany after the first world war, an egg cost a few hundred thousand marks.

When dealing with money in decision-making problems, we must always question the economic background of the situation in order to determine the value of money.
<b>Expected utility should be based on gain or loss in credit, not in money.</b>

Eric Schwitzgebel (2009-06-21):

Benjamin, you're losing me. How did we end up with "money is worthless" at the end of that? Feel free to mail any spare cash to me! ;)

dfhwze (2009-06-21):

We have been disregarding the value of money with respect to the credit it corresponds to. If a finite amount of credit is represented by an infinite amount of money (which is the case in this paradox), the value of money is the reciprocal of infinity, or plain zero. This means that losing or gaining any finite amount of money would result in no loss or gain of credit, and it is the expected utility in credit that makes us richer or poorer. In most mathematical problems we ignore this, because we assume the value of money is a good indicator of the value of credit. This paradox manifests itself in a way where there is a world of difference between money and credit. Taking this into account, one should not even be playing this game, because the money is worthless anyway.

dfhwze (2009-06-17):

It's not always easy to distinguish a condition that is not met from an uncertainty.

If I throw a coin and heads falls, then tails is a condition that is not met. Before the throw you are uncertain whether you will throw heads or tails, but this is your a priori distribution rather than an uncertainty.

If you throw a coin, keep the outcome secret from me, and then present me a subjective problem where I need the outcome of your throw as part of my a priori distribution, that is an uncertainty for me. I have two possible counterfacts -- either you threw heads, or tails -- but only one of them is the real situation.

In this envelope problem you write down x and 2x in some envelopes and present me an envelope at random. It is uncertain to me what you have written down, so I'm faced with two counterfacts, (a/2, a) and (a, 2a), of which I know one is reality.

I hope it is clearer now what uncertainty really is, how it affects subjective probabilistics, and how it can even keep you from making any rational decision.

Eric Schwitzgebel (2009-06-17):

Interesting observation about 3 and 4, Benjamin.

On your first comment, I'm not sure why you want to say that either a/2 or 2a in the other envelope is "fictional". Of course we're dealing with uncertainties, so in general anything but the actual amount is in some sense fictional....

dfhwze (2009-06-17):

My previous comment accidentally got published too soon, for which I apologize.

I'll provide some clarification about statements 3 and 4. Remember the value in my envelope is 'a'.

3. Suppose I gained. This means I gained 'a' and now have '2a'. For my opponent this means he lost 'a', whereas if he had won, he would have gained '2a', bringing him from '2a' to '4a'. My gain is thus smaller than his would-have-been gain (a < 2a).

4. Suppose I lost. This means I lost 'a/2' and end up with 'a/2'. For my opponent this means he gained 'a/2', whereas if he had lost, he would have lost 'a/4', bringing him from 'a/2' to 'a/4'. My loss is thus bigger than his would-have-been loss (a/2 > a/4).

dfhwze (2009-06-17):

Let me elaborate a bit on this problem by making some statements:

1. If we were to gain, we would gain more than if we were to lose.
2. If we were to lose, we would lose less than if we were to gain.
3. If we were to gain, we would gain less than if the other person were to gain.
4. If we were to lose, we would lose more than if the other person were to lose.
5. If we were to gain, we would gain exactly what the other person would lose.
6. If we were to lose, we would lose exactly what the other person would gain.

Statements 1, 2, 5, and 6 are commonly understood. Statements 1 and 2 make us think we should switch, while statements 5 and 6 make us think switching doesn't matter. But statements 3 and 4 are overlooked! I believe these last statements are the other side of the coin represented by the first two: where those lead us to switch, these lead us to stay. In fact, I'm not sure whether statements 1-4 complement each other or should all be regarded as counterfacts, which do not influence our expected utility.

dfhwze (2009-06-17):

Let's call the amounts in the envelopes {'x', '2x'}, and the amount in the envelope we pick 'a'.

If you ask me whether to switch or not in order to end up with the higher of the two envelopes, I reason in a Bayesian way: <i>50% chance I picked 'x'; switching would result in '2x', so I gain 'x'. 50% chance I picked '2x'; switching would result in 'x', so I lose 'x'.</i> There is no expected gain in switching, so my answer would be: <b>whether I switch or not, and however many times I switch, doesn't influence the result.</b>

But you didn't ask me the above question. You asked whether I should switch in order to go home with as much profit as possible. Again I go Bayesian on this: <i>100% sure I picked 'a' (given); 50% chance 'a' = 'x'; 50% chance 'a' = '2x'. This influences my a priori distribution. Where I knew the amounts in the envelopes were {x, 2x}, I now also know the amounts are either {a/2, a} or {a, 2a}, with each situation equally likely. This changes my a priori distribution from well known and indicative to subjunctive.
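dfhwze's first calculation -- that with the amounts fixed at {x, 2x}, switching has no expected benefit -- is easy to confirm by simulation. A sketch (the trial count and the value x = 100 are illustrative assumptions of mine):

```python
import random

def average_payoff(n_trials, always_switch, x=100, seed=0):
    """Average winnings when the envelopes hold x and 2x and we pick
    one uniformly at random, then either keep it or switch."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        envelopes = [x, 2 * x]
        rng.shuffle(envelopes)
        picked, other = envelopes
        total += other if always_switch else picked
    return total / n_trials

# Both strategies converge on the same long-run average, 1.5 * x.
print(average_payoff(100_000, always_switch=False))
print(average_payoff(100_000, always_switch=True))
```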
That is to say, the amount in the other envelope is either 'a/2' or '2a', depending on the a priori distribution, but in reality one of the two amounts is fictional; the other is real.</i>

If the a priori distribution is not entirely indicative, we do not have enough facts to solve the puzzle in a Bayesian way. The flaw we make is calculating the expected gain over two subjunctive a priori distributions, when we know that only one would in retrospect turn out to be indicative. My answer to the above question would be: <b>I do not have enough factual information to make a decision.</b>

Eric Schwitzgebel (2008-04-18):

Hey, Chris, thanks for the comment! I don't disagree as much as you think. *If* you open the envelope and see an amount, then depending on the amount it may be the case that you think the probability of the other envelope having less equals the probability of its having more -- and then you should switch. But only very weird prior probability distributions allow you to grant those probabilities no matter what you see.

That's why Josh and I focus on the "closed envelope" case, where the issue is in some sense simpler (and thus in another sense more difficult).

Anonymous (2008-04-17):

Yes, it is really sad to leave comments after such a long time, but I only discovered the site recently, plus I really like this paradox and it's one I've not met before.

Also, sorry for the 'loser length' post.

I'm taking it as read that other correct ways to address the question of whether to swap are easy to find, and that the issue here is to show why the approach taken goes wrong.

Well...

The paradoxical formula is correct! Good one, eh? The thing is, it doesn't tell you what you think it is telling you.

You make the statement "you might call X the amount of money in Envelope A..." and indeed you might. But from then on the expectation formula only works for situations in which Envelope A contains (the fixed) X. And indeed, for all the situations in which Envelope A contains X, it <b>is</b> better to swap. Unexpected but true.

To make this obvious, let's do a couple of things. Firstly, never mind this namby-pamby doubling; we will put 100 times more in one envelope than the other. Secondly, we will actually open Envelope A and find out what's inside. Oooh look, X is $1 (one dollar).

So you now have the choice of swapping, where you would get $100 with probability 0.5 or 1 cent with probability 0.5. Sticking will leave you with $1. Even without the math it is obvious that over multiple trials (where Envelope A contains $1) the best gain is by swapping. This might be counterintuitive, but it is true. The moment X is fixed at <b>any one</b> named value, it really is better to swap in the long run.

If X is $10 it is better to swap, because overall, in $10 situations we come out ahead. If X is $50 it is better to swap, because overall, in $50 situations we come out ahead. If X is any named amount, it is better to swap.

The formula is correctly telling us that for all situations where Envelope A contains X we should swap. In the original problem we don't know what X is.
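The $1 opened-envelope example can be checked by simulation. A sketch, taking the comment's assumed 50/50 posterior at face value (the function name and trial count are mine; the 100x multiplier is from the example):

```python
import random

def average_swap_payoff(n_trials, a=1.0, factor=100.0, seed=1):
    """Average payoff from always swapping, restricted to trials in which
    Envelope A holds `a`, assuming a 50/50 chance it is the smaller amount."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        other = a * factor if rng.random() < 0.5 else a / factor
        total += other
    return total / n_trials

# Sticking always pays a = $1; swapping averages roughly (100 + 0.01) / 2.
print(average_swap_payoff(100_000))
```

The catch, as the surrounding comments note, is that this conditional calculation holds for each fixed value of X, while no proper prior makes the 50/50 posterior hold for every X at once.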
So when we find that the expected gain of swapping is 5X/4, we can't actually put a dollar value on that number, but whatever X is, the formula is true.

Now, we feel instinctively that in the original situation it cannot actually make any mathematical difference whether we swap or not -- and this is correct too.

To marry the two parts together (formula: swap -> gain, versus common sense: swap -> no gain), I think we have to understand this: the expectation formula is based on an idea of multiple trials. Since we have mentally fixed X at the start of this one trial, it only covers the trials where X is the same as in our current trial. It doesn't apply to all the other trials. But to get the 'common sense' answer, what we need is a formula based on this trial, the next trial, the trial after that, and so on forever, without missing any trials out (or another method entirely).

If we want the common-sense answer, we just can't use the formula as it stands, because X is (potentially) different in every trial, but the formula keeps it always the same.

Anonymous (2007-09-10):

Thanks Eric.

That's an amazing proposition. It should be fairly easy for someone to come up with a formal proof one way or another.

My contention is that if the probability distribution has a finite expected value, then:

1. There are some situations where, upon opening your envelope, you will decide not to switch (hence there is no paradox that you should switch unopened envelopes).

2. More generally, in this case switching unopened envelopes always has an expected value of zero.

Only where the probability distribution has an infinite expected value can you apparently always increase the expected value by switching.

Anyway, if that's wrong it should be fairly easy to show formally.

Eric Schwitzgebel (2007-09-09):

I think one can still get the paradox with finite expectations and without money -- so I guess I disagree with you!

Suppose it's just a rational number in the envelopes (with one envelope twice the other), but your expected value for the numbers is still finite. Switching still doesn't increase the expected value.

It's a problem, I argue, with the use of variables within the expectation formula. I have an essay on this (with Josh Dever) on my homepage.

Thanks!

Anonymous (2007-09-03):

Thanks Eric.

You've actually hit the nail on the head. When you change the rules to eliminate money and just make it a game about numbers, the paradox disappears.

This is because the paradox contains a normative element from rational decision theory: you "should" always switch envelopes, because in doing so you increase your expected wealth.
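The contention above -- that with a finite-expected-value prior, blindly switching has expected value zero -- can be verified exactly for a toy prior (a sketch; the uniform prior on x is my assumption, not from the thread):

```python
from fractions import Fraction

def expected_switching_gain(max_x=50):
    """Exact expected gain of always switching when x is uniform on
    1..max_x, the envelopes hold (x, 2x), and we pick one fairly."""
    gain = Fraction(0)
    for x in range(1, max_x + 1):
        p = Fraction(1, 2 * max_x)      # joint prob of (this x, this envelope)
        gain += p * x                   # picked x: switching gains x
        gain += p * (-x)                # picked 2x: switching loses x
    return gain

print(expected_switching_gain())  # 0
```

The two terms cancel pair by pair, so the result is exactly zero for any finite prior of this shape.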
In fact symmetry tells us that you can't make money by switching, and common sense tells us that constantly switching and re-switching can't keep making you more money. It must be a zero-sum game.

We cure the paradox in the money version of the game by replacing money in the rational-decision-theory framework with percentage of the total finite wealth of the world. This fairly obvious refinement makes the expected value of the distribution finite again, which cures the problem.

With this refinement, while you will USUALLY swap envelopes once you have opened yours, sometimes you won't swap, because the number in yours is so huge that you can already command such a large part of the wealth in existence that there's more downside than upside to swapping.

Since you don't always swap after opening your envelope, you can't assert that you don't need to open your envelope to know whether to swap; hence there is no paradox.

Assuming we all agree that is right, why doesn't the paradox re-emerge once we eliminate money and just make it a game about numbers?

As a preliminary point, I note the paradox is not usually expressed this way. It's usually viewed as a challenge to rational decision theory, which inherently involves money or some equivalent store of value. Without money (or an equivalent) the paradox can't challenge rational decision theory.

But is it still a paradox at all?

Let's say we create the right sort of probability distribution, one which sums to one but has an expected value of infinity. We take two adjacent terms from that distribution and write them, randomly, each inside one of two envelopes.

Now we need a rule of the game which says, "switch envelopes if the expected size of the number in the other envelope is larger than the expected size of the number in your envelope." There is no reason for that rule other than that we made it up.
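One standard way to build such a distribution can be sketched concretely (this particular choice, probability 2^-n with prize 3^n, is my example, not the commenter's): the probabilities sum to 1, yet the partial sums of the expected value grow without bound.

```python
from fractions import Fraction

# Probabilities p(n) = (1/2)^n for n = 1, 2, ... sum to 1.
total_prob = sum(Fraction(1, 2) ** n for n in range(1, 61))

# With prize 3^n in the n-th case, the partial expected values are
# partial sums of (3/2)^n, which diverge.
partial_means = [
    float(sum(Fraction(3, 2) ** n for n in range(1, k + 1)))
    for k in (5, 10, 20)
]
print(float(total_prob))  # essentially 1
print(partial_means)      # strictly increasing, unbounded
```

Because each prize is more than double the last while each probability only halves, the expectation diverges even though the distribution itself is perfectly proper.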
If you open your envelope, the expected size of the number in the other envelope will indeed be larger. So you will always switch if you open your envelope. But since you will always switch, you don't need to open the envelope.

So what?

So you can just go ahead and switch anyway, and re-switch again and again, thus endlessly increasing the expected value of your envelope?

Again, so what?

This does admittedly sound a bit paradoxical -- unless, of course, the expected value of each envelope is infinite to start with, in which case it's completely unexceptional. It's just another of those slightly weird-sounding results of dealing with infinity. It's meaningless to increase infinity: infinity plus anything is still just infinity, so you can go ahead and increase it all day without changing it.

This may challenge our minds to grasp the concept of infinity, but it doesn't challenge any useful real-world decision-making process.

To reiterate, the paradox is only a paradox because it appears to challenge modern rational decision theory. It's a great paradox, because it does indeed show that a small, obvious refinement to the theory is required when dealing with the possibility of really huge numbers. As a simple numbers game it's useful for showing how infinity creates counterintuitive mathematical results, but it's hardly one of the world's great paradoxes.

Eric Schwitzgebel (2007-08-29):

Thanks, Kim, for the thoughtful comment. What if we do the same problem, but not with money? Suppose, for example, the envelopes simply contain the numeric expressions, in some standard format, for two rational numbers, one of which is exactly half the other? Wouldn't the same problem arise in calculating the expectation -- the expected numerical value -- of one envelope relative to the other? But now there's no issue about money.

Anonymous (2007-08-27):

To me the problem lies with Bayes's theorem whenever we attempt to apply it to the possibility of earning really large sums of money.

Assume we cast the problem so that we have a valid probability distribution in which Bayes's theorem says it's always rational to switch -- i.e., the expected value of switching is always positive.

The paradox arises because we assume that the value of money is linear, but this is just a simplification that works with normal amounts of money. It breaks down when we consider very large amounts.

This has NOTHING to do with the diminishing marginal utility of money to an individual. That is irrelevant in an efficient market, which we implicitly assume exists, because we assume we can (for example) securitize the game so that everyone is in the linear-marginal-utility region.

Rather, it arises because money merely represents real wealth, which is finite in quantity. We use money in decision theory as a proxy for wealth or utility. This works perfectly if the amount of money is also fixed. However, when considering really large amounts of money in our paradox, we must assume that, if required, the envelope-stuffer can simply issue more currency to satisfy the required probability distribution. If we don't make this assumption there is no paradox, since the distribution is then bounded on the upside.

But at some point issuing the required trillions of dollars will be inflationary.
That is, the value of the money will decline as we issue more of it.

If we replace the amount of money in the decision-theoretic framework with the value of the world's underlying wealth that it represents, it's easy to show that the paradox disappears.

There is nothing wrong with the mathematics, and really nothing wrong with the decision theory, once we realize we have made a small simplifying assumption: that money is proportional to value. This assumption is fine so long as the quantity of money is fixed or the amounts involved are small.

If we substitute the percentage of the wealth of the whole world for the quantity of money in the decision theory then, with small amounts of money, we get the same results as with standard decision theory. As the amounts become a meaningful percentage of the world's total wealth, the modified framework starts to produce different results, which are intuitively more correct.

Even so, there are some slight wrinkles with this approach to the two-envelope paradox, largely around whether you are allowed to burn the unopened envelope. But I think it provides a route to a mathematically rigorous solution to the paradox -- one that requires no complex math and doesn't challenge our rational decision-making approach in any important way. It merely shows the difficulty we sometimes have in uncovering implicit simplifying assumptions.

Eric Schwitzgebel (2007-06-04):

That seems right. And yet, once you've made that choice, why can't you take X = the amount in my envelope as a new starting point?

Anonymous (2007-05-30):

I think the key to understanding this is to recognize that the results are determined by your first choice of envelope -- any subsequent switching of envelopes yields results that are 100% determined by the outcome of that first choice. This is very different from halving or doubling a set amount.

Eric Schwitzgebel (2007-05-18):

Neat strip, Jonathan. I confess I hadn't seen it before!

Eric Schwitzgebel (2007-05-18):

Thanks for all the posts, folks!

Xurxo: Your suggested solution is very much like Frank Jackson's. Though appealing, I think it is an unnecessarily stringent requirement on the use of variables in expectations. For example, depending on one's priors, it might be violated in my gift case without (I'd suggest) impeding the merit of that calculation.

Joseph: I find your analysis interesting. There's no doubt that your "X" is a well-functioning variable and the original "X" is not. I'm quite sympathetic with the suggestion that the problem with the original X is that it is "not the same" between the two terms. But what does it mean to be "not the same" in the first and second terms of the equation?
That, I think, is the key!

Tanasije: I agree with your proof that the fallacious reasoning <i>is</i> fallacious. Of course (as you point out) the trick is to explain clearly <i>why</i> it is so!

Anibal: If you can get me an fMRI machine, I'll gladly do it. Just ship to: Philosophy Dept, UC Riverside, Riverside CA 92521! ;)

Michael: I think the question, though, is why the envelope case is problematic and the gift case isn't (or doesn't seem to be). Both have the same abstraction-reality structure, no? The Monty Hall problem, I'm inclined to think (with most mathematicians and decision theorists), is just surprising, not paradoxical; and (despite superficial similarities) it is very different in logical structure.

Anonymous (2007-05-17):

I agree, this is a fun and vexing puzzle.

By coincidence, I have been presenting it in my webcomic, <a href="http://lumpofclay.jollyutter.net/" rel="nofollow">Lump of Clay</a>, this week.

Anonymous (2007-05-17):

Hi Eric,

In the spirit of the puzzle I'm not going to look at any of the proposed solutions until I send this off. I put this paradox in the same category as Zeno's. They are generated by blending the existential and the mathematical: you have the opening of the envelope/decision, the firing of an arrow, the winning of a race, and the mathematical divisibility of that event by discrete moments or probability -- real time and mathematical time, so to speak.

Apply Oscar rules. There is no decision until the envelope is opened. This is a single event. It doesn't have a beginning and an end that can be 'divided', except notionally. Well, you might say, is that not to rewrite the script? No, it is to point out the obvious: that mathematics and the real world are different, and that you have created infinite divisibility where there is none. Not that that's not a useful thing to be able to do.

Did you ever come across Mark Haddon's book about the boy with Asperger's (a maths savant), 'The Curious Incident of the Dog in the Night-Time'? There is a discussion of the Monty Hall problem in it. Is it a veridical paradox? No, I believe, along with the irate professors, that it is just a plain old paradox.

Anibal Monasterio Astobiza (2007-05-17):

That paradoxical case between competing alternatives -- probabilities known and unknown, rational models -- resonates to my ears as "avant la lettre" neuroeconomics performed by Schwitzgebel and Dever.

That being the case, an interesting reading of the case (and progress in understanding rational choice) would be to scan (neuroimage) the subjects when they choose their options, and to characterize their personal aversion to loss and willingness to gain. That could inform us philosophically as well. Go ahead, Eric and Josh, and tell us what you find.