• Jeremiah
    1.5k


    You are saying that if you switch then you have a 50% chance of 1/2X and a 50% chance of 2X, and your argument is centered around expected gain, which you claim is in favor of switching.

    Algebraically your expected value can be expressed as such:

    1/2(X/2) + 1/2(2X) = (5X)/4

    Now let's try thinking about this when we know the contents of both envelopes and we are just flipping a fair coin to determine the envelope selection.

    Let's say envelope A contains 5 bucks and envelope B contains 10 bucks.

    Under your claim, if we get A initially then the other envelope has either $2.50 or $10.

    Under your claim, if we get B initially then the other envelope has either $5 or $20.

    See how that can't work: $2.50 and $20 are not even possible outcomes.
  • Jeremiah
    1.5k
    What we have to do is average the expected returns between the two cases.

    1/2(5) + 1/2(10) = 7.50, the midpoint between 5 and 10. Therefore there is no difference in switching.
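
    A minimal R sketch of that real-world setup (the 5 and 10 are just the example amounts above):

    # Fixed pair of envelopes containing 5 and 10; pick one at random each round.
    set.seed(1)
    n     <- 10000
    first <- sample(c(5, 10), n, replace = TRUE)  # the randomly chosen envelope
    stay  <- first                                # never-switch strategy
    swap  <- ifelse(first == 5, 10, 5)            # always-switch strategy
    mean(stay)  # approximately 7.5
    mean(swap)  # approximately 7.5, so switching makes no difference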
  • Michael
    15.8k
    Knowing more information changes the probabilities.

    If you toss a coin and I see that it landed heads then I will say that the probability that it landed heads is 100%.
    If you toss a coin and I don't see the result then I will say that the probability that it landed heads is 50%.

    So your example of knowing the contents of both envelopes isn't at all analogous to the example of only knowing what's in the one we have.
  • Jeremiah
    1.5k


    The algebra already showed that you are wrong in more ways than one, and so does a real-world example. You are not averaging your expected returns over the two possible cases.
  • Michael
    15.8k
    You are not averaging your expected returns over the two possible cases.Jeremiah

    That's exactly what I did here and here. If in each game we see that we have £10 we win more by switching than by not switching.
  • Jeremiah
    1.5k


    Your math and code are wrong, and they have been shown to be wrong several times. Pointing to your incorrect math does not prove you right.
  • Jeremiah
    1.5k
    One can only explain things so many different ways. I could keep rephrasing the same thing over and over, but as much fun as that is, it gets tiring.
  • tim wood
    9.3k
    You then have a choice: walk away with the $Y or return
    the envelope to the table and walk away with whatever is in the other
    envelope. What should you do?

    The "should" is not well-defined. Presumably it means, "How do I get the most." The only to get the most is to take a chance to get the most, with the risk of not getting the most. That argues always switching.

    Or, the first envelope in hand, I can make the following bet: change and either double my take, or see it halved. Again, change.

    But there is in this no opportunity cost. Can I play forever? Do I have but one opportunity to play? For most folks, there is an amount such that, if it is in the first envelope, they'll be happy to keep it - a bird in the hand, so to speak.
  • BlueBanana
    873
    The intuitive solution is that the bigger the amount in your envelope is, the more likely it is to be the one with 2X.
  • unenlightened
    9.3k
    There's more to gain than there is to lose by switching.Michael

    The intuitive solution is that the bigger the amount in your envelope is, the more likely it is to be the one with 2X.BlueBanana

    So the paradox arises because in practice there is a range of X. The lower bound is the smallest denomination of currency, and the upper bound is something less than half of all the money in the world.

    In practice the bounds are tighter, because the psychology department, or the entertainment industry, has a budget, and also needs to make the bet a bit interesting. If the envelope contains less than £5, who really gives a fuck. If it is happening on a regular basis, the upper bound is hundreds, maybe thousands, depending. So intuition has accessed extra information that mathematics is not privy to.

    If there were no upper limit, it would always be a good bet to switch, but there must be a limit.

    Imagine a slightly different game. You get an envelope with X, and then you can choose another envelope from 2, with 'double it' and 'halve it'. Then it is clear that whatever X is, it's a good bet to take a second envelope. No paradox.
  • Andrew M
    1.6k
    It looks like you're assuming the player's utility function is the identity function, which would be unusual and unrealistic. Even if we assume that, the quoted calculation doesn't take account of all the available information. There is new information, which is the known dollar value in the opened envelope. That changes the utility calculation.andrewk

    Yes it does. But considering myself, my utility function would be roughly linear for the amounts being talked about here (around $20), which is to say I would accept an offer of $1 to either switch or keep, since my expected gain from doing so would be $1.
  • Andrew M
    1.6k
    That's exactly what I did here and here. If in each game we see that we have £10 we win more by switching than by not switching.Michael

    As Jeremiah points out, your code doesn't reflect the problem in the OP. Before an envelope is picked, the expected value of each envelope is the same. That doesn't change when you choose an envelope and find it has $10 in it. The expected value of the other envelope is also $10 and so the expected gain from switching is $0.

    To see this, suppose you do the experiment 100 times where you always switch. 50 times (on average) you choose the $2X envelope and switch to $1X. So you earn $50X. 50 times you choose the $X envelope and switch to $2X. So you earn $100X. In total, you earn $150X.

    Now you do the same experiment 100 times where you never switch. 50 times you choose the $2X envelope and don't switch. So you earn $100X. 50 times you choose the $1X envelope and don't switch. So you earn $50X. In total, you earn $150X.

    So the expected value in both sets of experiments is the same.

    The experiment you're modeling with your code is where the second envelope is emptied and randomly filled with either half or twice the amount of the first chosen envelope amount. In which case you should switch.
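
    A rough R sketch of that variant (fixing the chosen envelope at $10 purely for illustration):

    # Variant: the second envelope is refilled with half or double the first amount.
    set.seed(1)
    n     <- 10000
    first <- 10                                            # amount in the chosen envelope
    other <- ifelse(runif(n) < 0.5, first / 2, first * 2)  # half or double, 50/50
    mean(other)  # approximately 12.5 = 1.25 * 10, so in this variant switching pays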
  • Michael
    15.8k
    To see this, suppose you do the experiment 100 times where you always switch. 50 times (on average) you choose the $2X envelope and switch to $1X. So you earn $50X. 50 times you choose the $X envelope and switch to $2X. So you earn $100X. In total, you earn $150X.

    Now you do the same experiment 100 times where you never switch. 50 times you choose the $2X envelope and don't switch. So you earn $100X. 50 times you choose the $1X envelope and don't switch. So you earn $50X. In total, you earn $150X.
    Andrew M

    Like Baden you're conflating different values of X.

    Given a starting envelope of $10, if 50 times I have the 2X envelope and don't switch then I earn $500 and if 50 times I have the X envelope and don't switch then I earn $500. In total I earn $1,000.

    Given a starting envelope of $10, if 50 times I have the 2X envelope and switch then I earn $250, and if 50 times I have the X envelope and switch then I earn $1,000. In total I earn $1,250.

    In your example you're adding 50X to 100X and getting 150X despite the fact that the X in 50X has a different value to the X in 100X, and so such an addition is wrong (or at least misleading as it generates a third value of X).
  • Michael
    15.8k
    To make this clearer:

    50 times (on average) you choose the $2X envelope and switch to $1X. So you earn $50X.

    Here, X = 5.

    50 times you choose the $X envelope and switch to $2X. So you earn $100X.

    Here, X = 10.

    In total, you earn $150X.

    The only way to make sense of this, given that you've added 50X where X = 5 to 100X where X = 10 is to say that X = 8.333...

    50 times you choose the $2X envelope and don't switch. So you earn $100X.

    Here, X = 5.

    50 times you choose the $1X envelope and don't switch. So you earn $50X.

    Here, X = 10.

    In total, you earn $150X.

    The only way to make sense of this, given that you've added 100X where X = 5 to 50X where X = 10 is to say that X = 6.666...

    So, again, the amount you win is 1.25 times as much when you switch, which is what my program showed.
  • andrewk
    2.1k
    As Jeremiah points out, your code doesn't reflect the problem in the OP. Before an envelope is picked, the expected value of each envelope is the same.Andrew M
    It's necessary to distinguish between two cases: the one where we know the distribution of X (i.e. the distribution of the lower of the two values), and the one where we don't.

    If we don't know the distribution then we don't even know whether there is an expected value of the amount in either envelope. For instance, if the amount put in the envelope is a random draw from the Cauchy distribution, there is no expected value (the relevant integral for the Cauchy distribution does not converge).

    If we do know the distribution, and it has an expected value, then it is correct that our expected value is the same for both envelopes. But that changes when we see the amount in the envelope, because that tells us where we are in the distribution. Say we know the distribution of the smaller amount X is uniform on [1,2]. Then if the amount we see is in [1,2] we know it is the smaller amount and we should switch. On the other hand if the amount is in [2,4] we know it is the larger amount and should not switch.
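
    Here is a minimal R sketch of that informed strategy, assuming X is uniform on [1,2]; since the two ranges only touch at 2, the observed amount almost always reveals which envelope we hold:

    set.seed(1)
    n     <- 10000
    x     <- runif(n, 1, 2)            # the smaller amount X, uniform on [1,2]
    got2x <- runif(n) < 0.5            # TRUE if we happened to pick the 2X envelope
    seen  <- ifelse(got2x, 2 * x, x)   # the amount we observe on opening
    # Switch exactly when the observed amount is below 2 (it must then be X).
    final <- ifelse(seen < 2, ifelse(got2x, x, 2 * x), seen)
    mean(seen)   # always keeping: approximately 2.25
    mean(final)  # informed strategy: approximately 3, i.e. we always end with 2X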

    In the absence of knowing the distribution of X, any calculations based on expected values prior to opening the envelope are meaningless and wrong. Since the claimed paradox relies on such calculations, it dissolves.
  • Andrew M
    1.6k
    Like Baden you're conflating different values of X.Michael

    No, but I can make the same argument using concrete amounts (see below).

    Given a starting envelope of $10Michael

    You can't assume you have a starting envelope of, for example, $10, and that the other envelope has either $5 or $20. That doesn't reflect the problem as specified in the OP.

    The OP instead assumes that you have two envelopes of, for example, $10 and $20, and that you randomly choose one of them. So half the time, the starting envelope would have $10 and half the time the starting envelope would have $20.

    Over 100 runs with a switching strategy, you would switch from the $20 to the $10 envelope 50 times (earning $500) and switch from the $10 to the $20 envelope 50 times (earning $1000) for a total of $1500.

    Over 100 runs with a keeping strategy, you would keep the $20 envelope 50 times (earning $1000) and keep the $10 envelope 50 times (earning $500) for a total of $1500.

    So you earn $1500 (on average) on either strategy.
  • Andrew M
    1.6k
    In the absence of knowing the distribution of X, any calculations based on expected values prior to opening the envelope are meaningless and wrong.andrewk

    The Wikipedia entry uses the expected value being the same for both envelopes as their simple resolution. Do they have that wrong, in your view?
  • andrewk
    2.1k
    Interesting. The problem is different. Unlike in the OP, in the Wiki case the envelope has not been opened before the option to swap, so the player has no new information there. I think their analysis in the Simple Case is wrong, because it assumes the existence of an expected value that may not exist. But the conclusion that there is no reason to switch may, in spite of that error, be correct in that case, though not in this one.
  • Michael
    15.8k
    You can't assume you have a starting envelope of, for example, $10, and that the other envelope has either $5 or $20.Andrew M

    Why not? I've opened the envelope and seen that I have $10. That's in the rules as specified in the OP. And knowing the rules of the game I know there's a 50% chance that the other envelope contains $5 and a 50% chance that the other envelope contains $20.

    The OP instead assumes that you have two envelopes of, for example, $10 and $20, and that you randomly choose one of them. So half the time, the starting envelope would have $10 and half the time the starting envelope would have $20.

    Over 100 runs with a switching strategy, you would switch from the $20 to the $10 envelope 50 times (earning $500) and switch from the $10 to the $20 envelope 50 times (earning $1000) for a total of $1500.

    Over 100 runs with a keeping strategy, you would keep the $20 envelope 50 times (earning $1000) and keep the $10 envelope 50 times (earning $500) for a total of $1500.

    So you earn $1500 (on average) on either strategy.
    Andrew M

    So in your example you've considered repeating the game using the same pair of envelopes, saying that half the time $10 is picked and half the time $20 is picked, whereas in my example I've considered repeating the game using the same starting envelope, saying that half the time the other envelope contains $5 and half the time it contains $20.

    Is there some rule of statistics that says that one or the other is the proper way to assess the best strategy for a single instance of the game (where you know that there's $10 in your envelope)?

    I would have thought that if we want to know the best strategy given the information we have then the repeated games we consider require us to have that same information, and the variations are in the possible unknowns – which is what my example does.
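
    A minimal R sketch of the repetition I have in mind (note that the equal prior over the pairs {5, 10} and {10, 20} is an extra assumption, not something stated in the OP):

    # Hold the information fixed: we always see $10 in our envelope, and we
    # assume (50/50) that the unknown pair is either {5, 10} or {10, 20}.
    set.seed(1)
    n     <- 10000
    other <- ifelse(runif(n) < 0.5, 5, 20)  # the unopened envelope under that prior
    mean(other)  # approximately 12.5 > 10, so under this prior switching looks better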
  • andrewk
    2.1k
    It occurs to me that, on my understanding of the two main interpretations of probability, and assuming no knowledge of the prior distribution of the lower amount, a Bayesian interpretation can take the approach that, on seeing $10 in the envelope, the expected winnings from switching envelopes are $2.50. A Frequentist interpretation, by contrast, provides no approach at all - it denies the correctness of a probabilistic calculation of the winnings: the gain is either $10 or -$5, and whichever one it is, is certain - no partial probabilities.

    Sounds like a good reason to be a Bayesian.
  • Jeremiah
    1.5k
    So your example of knowing the contents of both envelopes isn't at all analogous to the example of only knowing what's in the one we have.Michael

    What you are overlooking with the real-world example is that my algebraic model reflected it exactly, using only the same information you confined your model to. The real-world example showed that my algebra and probability model correctly predicted the actual outcome. However, your approach created a model that does not reflect reality at all. The goal of probability is to accurately model reality. We test our probability models by either simulating a real situation or actually using them. In this test my model passed while your model failed.
  • Jeremiah
    1.5k


    This has nothing to do with Classical vs. Bayesian. This is about treating X as an unknown, even after you see the contents of the first envelope. If you trust the algebra, it leads you to a model that correctly reflects a real-world case. The fallacy is an algebraic one.

    Besides, you shouldn't be a Bayesian or a Frequentist; you should use both.
  • Jeremiah
    1.5k


    I picked this version because I think it forces more conflict of ideas, which in turn generates more discussion.
  • Jeremiah
    1.5k
    In the absence of knowing the distribution of X, any calculations based on expected values prior to opening the envelope are meaningless and wrong. Since the claimed paradox relies on such calculations, it dissolves.andrewk

    Not true.

    Even after seeing the first envelope you still don't know which case you are in: you could have 2X or X. You also don't know which case the other envelope is in: it could be 2X or X. This is also true before you take your first peek. The fact is that opening the envelope does not give us the value of X, so if you switch, you switch to the same unknown. The error is in treating X as a known after you open the first envelope.

    In both cases our expected value is then:

    1/2(X) + 1/2(2X) = (3/2)X,

    which is the midpoint between X and 2X.
  • Jeremiah
    1.5k
    If you follow the algebra, it leads you to a solution which reflects a real-world situation, and when it comes to modeling probability that should be the goal.
  • Srap Tasmaner
    5k

    Is there a sort of de dicto/de re problem here in how we think about the probabilities? That is, is there a difference between these?

    (1) In repeated trials, about half the time participants pick the larger envelope.
    (2) In repeated trials, about half the time the envelope participants pick is the larger.

    Suppose the experiment in fact only has envelopes containing $2 and $4. Participants will pick each about half the time, averaging a take of $3.

    Your approach is to reason from (2) once the envelope is open. Thus, seeing $2, you reason that half the time this value is the smaller and half the time it is the larger. But this is just false. About half the time participants pick $2 and half the time they pick $4; there is no case in which $2 is the larger value.

    You're right that seeing $2 tells you the possibilities are {1,2} and {2,4}. But on what basis would you conclude that about half the time a participant sees $2 they are in {1,2}, and half the time they are in {2,4}? That is the step that needs to be justified. I think you're imagining a table something like this:
           {1,2}  {2,4}
    Big     1/4    1/4
    Small   1/4    1/4
    
    and those are your chances of being in each situation. But there are not two random events or choices here; there is only one. You are re-using the choice between larger and smaller as the relative frequency of {1,2} and {2,4}. For all you know, {1,2} could be a hundred times more likely than {2,4}. In my version here, {1,2} has a chance of 0, and {2,4} a chance of 1.
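
    To make that dependence explicit, here is a small R sketch: writing p for the probability, given that you see $2, that the pair is {1,2}, the expected gain from switching is p(-1) + (1 - p)(+2) = 2 - 3p:

    # Expected gain from switching after seeing $2, as a function of the
    # unknown probability p = P(pair is {1,2} | we see $2).
    gain <- function(p) p * (-1) + (1 - p) * 2
    gain(0)    #  2.0 : pair is surely {2,4}, as in my version; switching wins $2
    gain(0.5)  #  0.5 : the 50/50 assumption is exactly what makes switching look good
    gain(2/3)  #  0.0 : the break-even point
    gain(1)    # -1.0 : pair is surely {1,2}; switching loses $1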
  • Michael
    15.8k
    But there are not two random events or choices here; there is only one. You are re-using the choice between larger and smaller as the relative frequency of {1,2} and {2,4}. For all you know, {1,2} could be a hundred times more likely than {2,4}. In my version here, {1,2} has a chance of 0, and {2,4} a chance of 1.Srap Tasmaner

    But what probabilities does the person in the experiment give to {1, 2} and {2, 4}, given his envelope of $2? He can give them probabilities of 0 and 1 respectively if he knows that one envelope contains $2 and the other contains $4, but he doesn't know this. The only information he has is that 1) one envelope contains twice as much as the other and 2) his envelope contains $2. Surely, in the absence of any evidence to suggest that one of {1, 2} and {2, 4} is more likely, he should consider their probabilities equal?

    Or does it make a difference if he knows beforehand that the experimenter has flipped a fair coin to determine which of {1, 2} and {2, 4} is to be used?
  • Srap Tasmaner
    5k
    Or does it make a difference if he knows beforehand that the experimenter has flipped a fair coin to determine which of {1, 2} and {2, 4} is to be used?Michael

    If you know this is the procedure, then you know the distribution. You don't, so you don't.

    Surely, in the absence of any evidence to suggest that one of {1, 2} and {2, 4} is more likely, he should consider their probabilities equal?Michael

    I wish I could answer this question. Does the principle of indifference justify the use of uninformative priors like this? I have a hunch that someone really good at this sort of thing could show why even the uninformative prior does not lead directly to your conclusion. I am not that person.

    All I can say is that in this case it leads to a mistaken belief that switching is better.
  • Jeremiah
    1.5k
    could show why even the uninformative prior does not lead directly to your conclusion.Srap Tasmaner

    I already did. Recall that my initial algebraic model was based on all the same information as Michael's, and I used an uninformative prior; my approach leads to an expected value of 1/2(X) + 1/2(2X) = (3/2)X. My initial approach modeled the real-world example accurately, all based on the same information and situation as Michael's.

    You could have X or 2X. If you have X and you switch then you get 2X but lose X so you gain X; so you get a +1 X. However, if you have 2X and switch then you gain X and lose 2X; so you get a -1 X.Jeremiah


    Recall that event L is when you start with 2X and event K is when you start with X.

    Since we don't know which it is upon seeing the money, we treat this as an uninformative prior and give each a fair 50% likelihood. Then our sample space is [K, L].

    In the event of L our gain/loss sample space is [-X, 0].

    In the event of K our gain/loss sample space is [X, 0].

    That is the same even if you go the 1/2 route.

    Let's try running a simulation on that structure.

    K <- c("X",0)
    L <- c("-X",0)
    q <- c(K,L)
    w <- sample(q, 10000, replace = TRUE)
    sum(w == "X")
    sum(w == "-X")
    sum(w == 0)

    The results are:

    X: 2528
    -X: 2510
    0: 4962
    Jeremiah
  • Jeremiah
    1.5k
    The trick to this problem is trusting the algebra over your eyes, which makes it less than intuitive. In this case the new information you receive can actually mislead you, but if you follow the math it leads you down the right path.