• Michael
    15.6k
    Your approach to my example of the red ball would be to say that a coin toss determines the value of some X to be put in the other envelope and so that the expected value is:

    E(U) = P(H)(X) + P(T)(X) = 0.5X + 0.5X = X

    Which I think is a completely misleading way to phrase it and conflates what actually are different values of X. The expected return is the probability of event 1 being the case multiplied by the amount if event 1 is the case plus the probability of event 2 being the case multiplied by the amount if event 2 is the case. The amount (X) is conditional on the event, not some independent constant.
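
    For concreteness, here is the form Michael is arguing for, written out with a different amount in each branch (an editorial illustration using the figures from his red-ball example below). If the coin toss fixes the smaller amount at £5 on heads or £10 on tails, and we hold £10, then the other envelope holds £5 on heads and £20 on tails, so

        E(U) = P(heads) × (amount if heads) + P(tails) × (amount if tails)
             = 0.5 × £5 + 0.5 × £20
             = £12.50

    with the amount conditional on the event, rather than a single constant X in both branches.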
  • Srap Tasmaner
    4.9k

    The coins and colored balls thing is different because there are four possible outcomes, you're just choosing in two steps, maybe because you don't have a four-sided coin. Probability of each is 1/4, just takes two steps to get there.

    I am still thinking about the math.

    This sequence converges: Y, Y/2, Y/4, Y/8, ...

    This one doesn't: Y, 2Y, 4Y, 8Y, ...

  • andrewk
    2.1k
    I would say, given the information available to me, that it's possible that there's a blue ball in the second box and that it's possible that there's a green ball in the second box, with a 50% probability of each. The sample space for the other box is [blue, green].
    Michael

    Probability is used to model uncertainty, and nearly all uncertainty in our world is epistemological. If one is a Hard Determinist (a term that I think is not well-defined, but let's leave that aside for now) then ALL uncertainty is epistemological.

    One constructs a probability space based on one's knowledge, so there is no absolute probability space that models a game but rather a probability space that models a particular stakeholder's perspective on the game.

    In the above case the probability space you describe that is blue and green, each having 50% probability, is an appropriate probability space for the player of the game. For the game host however, who knows whether the second box has blue or green, the probabilities are either 0 and 1, or 1 and 0.

    A minor technical point: the sample space is the set of all conceivable outcomes (called 'events'). I deliberately say 'conceivable' rather than 'possible' because it can contain events that the person knows to be impossible. A probability space is a sample space together with an assignment of probabilities to each event, as well as some other technical stuff (sigma algebras) that we needn't go into here. A probability space can assign zero probabilities to some events, as long as the sum of all assigned probabilities of events in the sample space is 1. Events that the space's 'owner' knows to be impossible will be assigned probability zero.

    So the game host and the player can have the same sample space. But they will assign different probabilities to events in it, so they have different probability spaces. If the second box holds green, the host will assign 0 probability to the event 'blue' while the player will assign 0.5 to it.

    Does the answer depend on whether or not one is a Bayesian?
    Michael
    I'm not sure. I feel the answer may be 'perhaps', but the definition of the Bayesian vs Frequentist divide seems to be very fuzzy. I think a hard-line Frequentist may reject the epistemological interpretation, but that would seem to render them unable to use most methods of modern statistics. Either I've misunderstood what frequentism is, or there are very few hard-line Frequentists in the world.
  • Michael
    15.6k
    The coins and colored balls thing is different because there are four possible outcomes, you're just choosing in two steps, maybe because you don't have a four-sided coin. Probability of each is 1/4, just takes two steps to get there.
    Srap Tasmaner

    Using a coin toss to determine if it's red ball + £5 or red ball + £20 is no different to using it to determine if it's £10 + £5 or £10 + £20, which is no different to using it to determine if it's X = 5 or X = 10. So you seem to be tacitly agreeing that if we know that X = 5 or X = 10, as determined by a coin toss, before we start, then if we find £10 in our envelope the expected value of the other envelope is £12.50.

    But then what's the difference between using a coin toss to determine if it's X = 5 or X = 10 and using a die to determine if it's X = 5 or X = 10 or X = 20 or X = 40 or X = 80 or X = 160? It'll still work out as above that if there's £10 in our envelope then the expected value of the other envelope is £12.50. And what's the difference between using a die to determine if X is one of those six values and using a random number generator to pick any X within some range? Surely the math behind the expected value of the other envelope works out exactly the same as it does in the case of the red ball and either £5 or £20.
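
    A quick Monte Carlo sketch of the setup described here (an editorial illustration, assuming the coin toss picks X = 5 or X = 10 with equal probability, the envelopes hold X and 2X, and we keep only the trials where our randomly chosen envelope holds £10):

        import random

        trials, others = 1_000_000, []
        for _ in range(trials):
            x = random.choice([5, 10])    # the coin toss fixes the smaller amount X
            pair = [x, 2 * x]             # the two envelopes hold X and 2X
            random.shuffle(pair)
            mine, other = pair
            if mine == 10:                # keep only the trials where we find £10
                others.append(other)

        print(sum(others) / len(others))  # comes out near 12.5

    On this reading of the setup the conditional expectation really is about £12.50; the dispute further down the thread is over whether the actual game fixes X this way.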
  • Srap Tasmaner
    4.9k

    There are two ways to look at this:

    (1) Knowing the rules of the game, when you get the $10 envelope, you use your amazing Math Powers to deduce that the other envelope contains $5 or $20. (This is like kids' educational TV.)

    (2) Upon finding $10 in the first envelope, you start making up fairy tales that convince you to trade your bird in the hand for two in the bush.

    I more and more see the soundness of doing the math as you do, as a matter of fact, which bothers me a bit because the answer it produces is patently wrong. It may not be a question of whether the math is being done right, but whether this particular tool is appropriate for the job at hand.
  • Srap Tasmaner
    4.9k
    The other big picture issue that has gotten short shrift in this thread, by focusing on the open envelope, is the paradox of trading an unopened envelope (or even picking one).

    If you are offered the trade after picking, but without opening, you can conclude readily that the value of the other envelope is 5/4 the value of yours, whatever that is. But if you trade, and still don't open, you can conclude that the value of the first envelope is 5/4 the value of the one you traded for.

    And you can reason this way even before picking, so that the other envelope is always better. How can you even pick one?
  • Srap Tasmaner
    4.9k
    P(A1)Y + P(B1)2X > P(A2)Y + P(B2)X, or more importantly P(A1)Y + P(B1)2X is not equal to P(A2)Y + P(B2)X
    Jeremiah

    Which is to say that mean[Y, 2Y] > mean[Y/2, Y].

    I've been thinking some about how this works. If you tried, as the player, to broaden your view of the situation, it might go something like this:

    1. Here's 10.
    2. The other envelope is 5 or 20.
    3. If it's 5, I'm just as likely to have picked 5.
    4. Then I'd think the other envelope has 5/2 or 10.
    5. If it's 20, I'm just as likely to have picked 20.
    6. Then I'd think the other envelope has 10 or 40.

    You could go on, which is why I got to thinking about how going smaller converges, but going bigger doesn't. (The space does have a lower but not an upper bound, so far as you know.) Point being there's no mean value for the space as a whole. Jumping in at any point Y shows you ever increasing gains to your right and ever diminishing losses to your left. You get just a little taste of that when you try to calculate your expectation for the other envelope.
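
    A small sketch of the ladder of values this reasoning generates (an editorial illustration, taking the observed amount as Y = 10 and iterating steps 1-6 outward):

        # Halvings to the "left" of Y, doublings to the "right". The left side
        # stays bounded however far you go; the right side grows without bound,
        # so the space as a whole has no mean value.
        Y, steps = 10.0, 8
        left = [Y / 2**n for n in range(1, steps + 1)]    # 5, 2.5, 1.25, ...
        right = [Y * 2**n for n in range(1, steps + 1)]   # 20, 40, 80, ...
        print(left, right)
        print(sum(left))    # never exceeds Y, no matter how many steps
        print(sum(right))   # keeps growing as steps increases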
  • Srap Tasmaner
    4.9k
    The rest of the point being that envelopes worth less than yours, yours being worth Y, have an average value of 0 (not Y/2, as I first wrote), as a matter of fact. The envelopes worth more than Y have no average value.

    Edit: dumbness.
  • Benkei
    7.7k
    I thought this was solved eons ago? The total amount of the envelopes is fixed; switching or keeping results in the same thing. The total amount is always 3x. The expected amount (E) in an envelope is therefore 50% × x + 50% × 2x, i.e. 3x/2. That's the expected amount any way you cut that cookie.
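
    A minimal simulation of this point (an editorial illustration with the smaller amount fixed at x = 10): whether you keep or switch, the long-run average is 3x/2.

        import random

        x, trials = 10, 1_000_000
        keep = switch = 0
        for _ in range(trials):
            pair = [x, 2 * x]      # the fixed pair of amounts
            random.shuffle(pair)
            keep += pair[0]        # the envelope you picked
            switch += pair[1]      # the envelope you would trade for
        print(keep / trials, switch / trials)   # both approach 15.0, i.e. 3x/2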
  • Jeremiah
    1.5k


    I posted the solution in the 6th post. However, without the stubborn refusal to accept reality, I doubt many of the threads on these philosophy forums would go much of anywhere.
  • Srap Tasmaner
    4.9k

    There are two natural and apparently sound approaches, one of which, the one you mention, produces the correct result. The puzzle is figuring out what's wrong with the other one. (Our efforts have been hampered somewhat by some people thinking the other answer is actually right.)
  • Benkei
    7.7k
    Ah... I'd opt for the reductio ad absurdum. If there's no limit to switching, you'd have to switch indefinitely based on the other, faulty interpretation, which is of course ridiculous.
  • Srap Tasmaner
    4.9k

    Agreed. But it would be nice, knowing that the argument leads to absurdity and is therefore false, to pinpoint the step we should disallow. Like figuring out where you divided by 0 in the 'proof' we learned as kids that 2 = 1.
  • Benkei
    7.7k
    Ok. I'll have a stab at that (I have to admit I only read the first few pages here so might be repeating things).

    Ok, so we have a wrong approach:

    If the envelope I'm holding is X then switching gives you either 2X or X/2. Either you win X or you lose X/2, so switching is a winning proposition.

    One avenue I'm thinking about is that this falsely suggests there are three possible values for the envelopes: X, 2X and X/2. But we know there are only two: X and 2X.

    If the envelope I'm holding is X then switching gives me 2X but if it's 2X then switching gives me X. Profit and loss are equal.

    The mistake could also be found in the assumption that the envelope I hold has a determinate amount X from which the value of the other envelope is derived.
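
    Written out with the pair fixed as {X, 2X}, the contrast being drawn here looks like this (an editorial illustration):

        Faulty:   E(switch) = 0.5(2X) + 0.5(X/2) = 1.25X
                  (treats X as whatever you happen to hold, letting three
                  amounts X/2, X and 2X into play)

        Per pair: E(gain from switching) = 0.5(+X) + 0.5(-X) = 0
                  (you either hold X and gain X, or hold 2X and lose X)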
  • Srap Tasmaner
    4.9k
    The mistake could also be found in the assumption that the envelope I hold has a determinate amount X from which the value of the other envelope is derived.
    Benkei

    If only the amount in the first envelope, the envelope you chose and perhaps are even allowed to open, is fixed, and the second envelope is then loaded with either half or twice the amount in yours, then switching is the correct strategy. This is the variant Barry Nalebuff calls the Ali Baba problem.
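
    A quick sketch of that variant (an editorial illustration with hypothetical numbers: your envelope is fixed at 10, and the second envelope is then loaded with half or double that on a coin flip):

        import random

        a, trials = 10, 1_000_000   # a = the amount fixed in your envelope
        total_other = 0
        for _ in range(trials):
            # the second envelope is loaded with a/2 or 2a on a coin flip
            total_other += random.choice([a / 2, 2 * a])
        print(total_other / trials)  # approaches 12.5, i.e. 5a/4, so switching pays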
  • Michael
    15.6k
    One avenue I'm thinking about is that this falsely suggests there are three possible values for the envelopes: X, 2X and X/2. But we know there are only two: X and 2X.

    If the envelope I'm holding is X then switching gives me 2X but if it's 2X then switching gives me X. Profit and loss are equal.

    The mistake could also be found in the assumption that the envelope I hold has a determinate amount X from which the value of the other envelope is derived.
    Benkei

    We're dealing with a situation where we know that there's £10 in our envelope. What's the value of the other envelope? It's possible that it's £5, as the envelope set could be £5 and £10, and it's possible that it's £20, as the envelope set could be £10 and £20. It's not possible that it's £1 as the envelope set can't be £1 and £10.

    So this strikes me as a conceptual disagreement over what it means for an outcome to be possible, and I think the disagreement is one between the Bayesian and the frequentist. I recall an earlier discussion where another poster (who I also clashed with here on a similar issue) said that once a coin had been tossed it would be wrong to say that it's equally likely to be heads as tails (even if we haven't looked); instead if it's actually heads then it's not possible that it's tails and if it's actually tails then it's not possible that it's heads. I believe Jeremiah and Srap (and perhaps you?) would take this same reasoning and say that if it's actually £5 in the other envelope then it's not possible that it's £20 and if it's actually £20 in the other envelope then it's not possible that it's £5.

    I don't share this view on probability. We can still talk about the probable outcome of an event that's already happened. If you've flipped a coin and hidden the result from me then I will say that it's equally likely to be heads as tails (and it must be one of these). If you've chosen X = 5 or X = 10 and placed the amounts in the envelopes then I will say that it's equally likely to be X = 5 as X = 10 and so that if there's £10 in my envelope then it's equally likely that the other envelope contains £5 as £20 (and it must be one of these).
  • Benkei
    7.7k
    If only the amount in the first envelope, the envelope you chose and perhaps are even allowed to open, is fixed, and the second envelope is then loaded with either half or twice the amount in yours, then switching is the correct strategy. This is the variant Barry Nalebuff calls the Ali Baba problem.
    Srap Tasmaner

    Well, uhmm... no...? The first envelope has an amount that is either X or 2X, and the other is either X or 2X. The other is not half or twice the amount of X. The other envelope is only half as much iff the opened envelope contains 2X, and it's only twice as much iff the opened envelope contains X.
  • Benkei
    7.7k
    I recall an earlier discussion where another poster (a frequentist) said that once a coin had been tossed it would be wrong to say that it's equally likely to be heads as tails (even if we haven't looked); instead if it's actually heads then it's not possible that it's tails and if it's actually tails then it's not possible that it's heads. I believe Jeremiah and Srap (and perhaps you?) would take this same reasoning and say that if it's actually £5 in the other envelope then it's not possible that it's £20 and if it's actually £20 in the other envelope then it's not possible that it's £5.
    Michael

    No, I don't think I would agree with that either.

    We're dealing with a situation where we know that there's £10 in our envelope. What's the value of the other envelope? It's possible that it's £5, as the envelope set could be £5 and £10, and it's possible that it's £20, as the envelope set could be £10 and £20. It's not possible that it's £1 as the envelope set can't be £1 and £10.
    Michael

    The total sum possible for both envelopes in the above, assuming one envelope contains 10 GBP, is either 3x = 30 or 3x = 15, but we know it's one of the two; it cannot be both. Your expression, however, allows for both and therefore has to be wrong by necessity.

    I also refer to my earlier comment that the above expression, if we allow for unlimited switching of envelopes, would entail having to switch indefinitely, which is absurd.
  • Michael
    15.6k
    Your expression, however, allows for both and therefore has to be wrong by necessity.
    Benkei

    It doesn't.

    The total sum possible for both envelopes in the above, assuming one envelope contains 10 GBP, is either 3x = 30 or 3x = 15, but we know it's one of the two; it cannot be both.

    I know it cannot be both. It's one or the other. If 3X = 30 then there's £20 in the other envelope. If 3X = 15 then there's £5 in the other envelope. So the value of the other envelope is either £20 or £5. These are the (only) two possible values. And either because we know that the value of X was chosen at random or because we have no reason to believe that one is more likely than the other, we assign a probability of 0.5 to each being the case.
  • Michael
    15.6k
    I also refer to my earlier comment that the above expression, if we allow for unlimited switching of envelopes, would entail having to switch indefinitely, which is absurd.
    Benkei

    We wouldn't, because we've opened an envelope in this example. I know that there's £10 in my envelope. If from this we can deduce an expected value of £12.50 in the other envelope then once we switch we have no reason to switch back. Instead we have a reason to stick.
  • Jeremiah
    1.5k
    The error is in making new assumptions based on Y. Before you see Y you know that envelope A, the one you were given, has the possibility to be X or 2X. Which remains true even after seeing Y, as you don't know whether Y equals X or 2X. So as to the uncertainty about whether you have X or 2X, which determines what envelope B is, Y provides no useful information; therefore it is not appropriate to change your existing uncertainty based on Y. If you change your assumptions based on Y then you introduce false information into the solution, and that is why you get misleading results.
  • Benkei
    7.7k
    Would you agree that your expression allows the envelopes to carry values of either X, 2X or X/2?

    We wouldn't, because we've opened an envelope in this example. I know that there's £10 in my envelope. If from this we can deduce an expected value of £12.50 in the other envelope then once we switch we have no reason to switch back.
    Michael

    Fair enough. That earlier comment was a reply to the original OP so I suppose with this amendment it doesn't hold water any more (although I haven't worked through it so I'm just assuming you're right).
  • Michael
    15.6k
    Would you agree that your expression allows the envelopes to carry values of either X, 2X or X/2?
    Benkei

    No. One envelope has £10 and the other envelope has either £5 or £20. All this talk of X and 2X just confuses matters. It is just the case that there's a 50% chance that the other envelope contains £20 and a 50% chance that the other envelope contains £5.
  • Srap Tasmaner
    4.9k
    We wouldn't, because we've opened an envelope in this example. I know that there's £10 in my envelope. If from this we can deduce an expected value of £12.50 in the other envelope then once we switch we have no reason to switch back. Instead we have a reason to stick.
    Michael

    Here's a proof (which you won't accept) that opening the envelope is irrelevant, and that your reasoning should be symmetrical.

    Suppose you choose an envelope and then the facilitator tells you the other envelope has $10 in it. Then you would choose not to switch because yours has an expected value of $12.50.

    Eventually you recognize that you would reason the same way whichever envelope you had chosen.
  • Michael
    15.6k
    Suppose you choose an envelope and then the facilitator tells you the other envelope has $10 in it. Then you would choose not to switch because yours has an expected value of $12.50.
    Srap Tasmaner

    I agree. I would choose not to switch.
  • Srap Tasmaner
    4.9k

    And it doesn't bother you that if you know the value of A you want B, but if you know the value of B you want A?
  • Srap Tasmaner
    4.9k

    How did you choose an envelope in the first place?

    Suppose you have chosen, perhaps by flipping a coin. If the facilitator then offers to tell you the value of either, how will you choose which value to learn? By flipping a coin?
  • Michael
    15.6k
    If one guarantees me £10 and the other could be £5 or £20 then I'm going to want the one that could be £20. It doesn't make a difference to me if it's my starting envelope or the other envelope.
  • Michael
    15.6k
    I wonder if this is anything like Deal or No Deal. We're down to the last two boxes. Yours either has £20,000 or it has £5,000. The Banker offers you £10,000 to walk away. Do you accept the offer or do you take a risk and hope that your box has £20,000?

    The sample space that describes your box and the Banker's offer could be written as [X, 2X], correct? So it seems comparable. But wouldn't you agree that the expected value of your box is greater than the offer, and so that, assuming you can stomach a loss of £5,000, you should decline?
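
    Working that comparison out: E(box) = 0.5 × £5,000 + 0.5 × £20,000 = £12,500, which is more than the £10,000 offer.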
  • Srap Tasmaner
    4.9k

    For DOND you accept any offer in the neighborhood of the expected payout, because the banker usually low-balls you. (There was extensive discussion among math types about whether it's a Monty Hall variant.)

    Note, yet again, that all the values that could be in the cases are known from the start. There is no speculation about possible outcomes.