• Mathematical Conundrum or Not? Number Six
    That's because I disagree with your interpretation of probability. Your reasoning would seem to suggest that there's a 50% chance of a coin flip landing heads, but that after a flip, before looking, we can't say that there's a 50% chance that it is heads. I think that we can say that.Michael

    Admittedly, in strident moments I have said things like this.

    But look at my last post. It's not about interpretations of probability. It's about how conditional probability works, and it can be a little counter-intuitive.
  • Mathematical Conundrum or Not? Number Six
    There's a 50% chance of picking the lower-value envelope, and so after having picked an envelope it's in an "unknown state" that has a 50% chance of being either the lower- or the higher-value envelope?Michael

    Let's leave the envelopes aside for a moment.

    Imagine an interval [0, L] for some positive real number L. Now let u and v be unequal real numbers in that interval. What is the chance that u < v? Intuitively, that's just v/L. Given a choice between u and v, what is the chance of picking u or v? 1/2. Given that you picked one of u and v, what is the chance that the one you picked is less than the other? We'll call your choice S and the other T (and abuse that notation):

    P(S < T | S = u) = v/L, P(S < T | S = v) = u/L

    Not only do we not know that those are equal, we know that they aren't, because u and v are unequal. But we can say

    P(S < T | S ∊ {u, v}) = (u + v)/2L

    because the chances of picking u and picking v are equal. Clearly,

    * P(S < T | S ∊ {u, v}) = 1/2

    if and only if

    * (u + v)/2L = 1/2
    * u + v = L

    But there's no reason at all to think that u + v = L. All we know is that u + v < 2L. (We could go on to ask what P(u + v = L), but I'm not sure what the point of that would be.)

    So the answer to this question, "What is the chance that the one you picked is smaller?" may or may not be 1/2, even though your chance of picking the smaller is 1/2. (And if it turns out u + v = L, that's sheerest coincidence and has nothing to do with your choice.)

    "My chance of picking the smaller" is just not the same as "the chance of what I picked being smaller", as I've been saying ineffectually for like 3 weeks.
  • Mathematical Conundrum or Not? Number Six

    1. For a single trial, the player cannot calculate an expected value for the other envelope, and therefore either (a) they cannot make a rational decision to switch or stick, or (b) they must use some criterion other than expected value.

    2. For multiple trials, the Always Switch and Always Stick strategies are, in the long run, indistinguishable.

    If I am the player, and I know that (2) is the case, that does not entail, in the case before me, either that I should switch or that I should stick. For any particular trial, either switching or sticking is the right thing to do. But once I know that (1) is the case, I can return to (2) and conclude either that there is no way for me to steer the outcome toward gain, or that I should try some other strategy. (Even if I only get the one chance, I should, if I can, use a method that improves my chances of gain, even if it's a small improvement, unless the disutility of using such a method outweighs the expected gain.) Or I could return to (1) and see if there is anything besides expected value out there.

    I believe both (1) and (2), but I am not clear on the relation between them.
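    A minimal simulation of (2), under an assumed (and arbitrary) distribution for the host's X:

```python
import random

# Always Switch vs Always Stick over many trials. The uniform choice
# of X here is an arbitrary assumption; the comparison doesn't depend on it.
random.seed(5)
N = 500_000
stick = switch = 0
for _ in range(N):
    x = random.randint(1, 100)
    mine, other = (x, 2 * x) if random.random() < 0.5 else (2 * x, x)
    stick += mine
    switch += other
print(stick / N, switch / N)   # both near 1.5 * E[X] = 75.75
```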
  • Mathematical Conundrum or Not? Number Six

    It's only the difference between describing your expectation conditionally and unconditionally. By describing your expectation conditionally, you leave room for the future evidence you would rely on to update.
  • Mathematical Conundrum or Not? Number Six
    If I'm told that one envelope contains twice as much as the other, and if I pick one at random, am I right in saying before I open it that there's a 50% chance that my envelope contains the smaller amount? If so, I must also be right in saying after I open it and see the amount that there's a 50% chance that my envelope contains the smaller amount.Michael

    I share your frustration, Michael.

    If I offer you a choice between envelopes containing $5 and $10, you have a 50% chance of picking the envelope that has $5 in it. Having chosen an envelope, you no longer have a chance of picking the $5 envelope -- you either did or didn't.

    You still have to express your uncertainty about whether the envelope you did pick was the smaller or the larger, since you don't know the contents of both envelopes, and seeing $10 doesn't tell you whether you got the smaller or the larger. But you cannot say that the amount you observe has a 50% chance of being the smaller. Given more complete knowledge, you would find you had been saying that $10 has a 50% chance of being smaller than $5, which is absurd.

    The only safe way to express your uncertainty -- that is, the only way to formulate it so that increasing your knowledge wouldn't render your beliefs absurd -- is conditionally. And this makes sense. If the larger amount is $10, then picking $10 is necessarily picking the larger amount; if the larger amount is $20, then picking $10 is necessarily picking the smaller amount.

    When you work through expressing the probabilities conditionally, you find that the 50% associated with your choosing an envelope cancels out. And in a sense, it should -- we're now working out the consequences of your choice. The uncertainty that remains does not derive from your choosing at all -- we're past that. The uncertainty that remains is all down to the host's choice of envelope values.
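    A toy case makes this concrete. Suppose (my assumption, just for illustration) the host always uses the pair {$5, $10}:

```python
import random

# Assumed setup: the host's pair is always {5, 10}. Your chance of
# *picking* the smaller is 50%, but among trials where you see $10,
# the envelope you picked is never the smaller.
random.seed(1)
N = 100_000
picked_smaller = saw_10 = saw_10_smaller = 0
for _ in range(N):
    mine = random.choice((5, 10))
    picked_smaller += (mine == 5)
    if mine == 10:
        saw_10 += 1
        saw_10_smaller += (mine < 5)   # never true in this setup
print(picked_smaller / N)        # about 0.5
print(saw_10_smaller / saw_10)   # exactly 0.0
```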

    After that, I'm still a bit murky, I'm sorry to say. Part of it I can see as the sometimes counter-intuitive nature of conditional probabilities, but part of it still eludes me.
  • Mathematical Conundrum or Not? Number Six
    The result that Pr(X=$10) must be zero is not an issue then, because F(X=$10) might not be.JeffJo

    Right, right. (I am actually studying in my spare time, I swear.)

    You can't simply treat a random variable as an unknown. You can consider a set of unknown values from its range, but only if you couple that with their probabilities.JeffJo

    If I may take advantage of your patience a bit more ...

    Suppose I naively approach the problem this way: I think I'm solving for an unknown; I observe the value of y, and I write down my equations:

    (1) y + u = 3x
    (2) y = 10

    I can then get as far as

    u = 3x - 10

    but lacking, say,

    *(3) 3x = 30

    I'm stuck with an equation that still has two unknowns, so I am forced to treat u and x as variables rather than simply unknown values. As you note, instead of using x I could also write

    (4) y = ru

    where r ∊ {½, 2} but that leaves me with two sets of equations:

    {y = 10, u = 2y}, {y = 10, u = y/2}

    Either of those can be solved, but not knowing the value of r, it still amounts to an equation

    u = 10/r

    with two unknowns. Since I can't solve that, I'm back to variables instead of unknowns.
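    For what it's worth, the mechanics can be shown in a couple of lines (using exact fractions so the two cases of r stay exact):

```python
from fractions import Fraction

# From (4), y = ru, so u = y/r. With y = 10 and r ranging over {1/2, 2},
# u stays a two-element set of candidates, not a solved value.
candidates = sorted(int(10 / r) for r in (Fraction(1, 2), Fraction(2)))
print(candidates)   # [5, 20]
```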
  • Mathematical Conundrum or Not? Number Six
    To remove that option, I recast the problem with the envelopes containing IOUs rather than cashandrewk

    I don't think we really need to agonize over the amounts supposedly being money. We could use real numbers and play competitively. The winner is just whoever ends up with the highest number.
  • Mathematical Conundrum or Not? Number Six

    Thanks. You've told me this before -- and I appreciate your patience. I'll mull it over some more.

    I think I'm just reluctant to see the simple situation of choosing between two envelopes of different values in terms of the strange behavior of infinity.

    I keep thinking of switching as just being a positive or negative change, but the switching argument accepts that!

    Every time I think I've got a handle on this, it slips away.
  • Mathematical Conundrum or Not? Number Six
    the equiprobability condition that alone grounds the derivation of the unconditional 1.25X expectation from switching.Pierre-Normand

    I'm still confused. This makes it sound like the switching argument isn't fallacious -- it just makes an unwarranted assumption. So if every value of X were equally probable, then it would be true that you can expect a .25X gain from switching? I see how the math works, but if that's true, isn't it true whether you know the value of your envelope or not? And if that's true, isn't it also true for the other envelope as well?

    If that's how the calculation goes, then something's wrong because in any pair of envelopes, trading is a gain for one side and a loss for the other.
  • Mathematical Conundrum or Not? Number Six

    I'm still working on it.

    We can also say that

    P(X = a) + P(X = a/2) ≤ 1

    but other than that, their values can range freely.*** (It is in some sense a coincidence that their sum can also be described as the event Y = a.)

    Do we need to assume that X is not continuous? If it is, all these probabilities are just 0, aren't they?

    *** Urk. Forgetting that at least one of them has to be non-zero.
  • Mathematical Conundrum or Not? Number Six
    you can't define/calculate the prior distribution, and that it was a misguided effort to even try (as you did)JeffJo

    FWIW, my memory is that @Jeremiah only got into the sims & distributions business because everyone was talking about these things and it was his intention to put an end to all the speculation and get the discussion back on track. It seemed to me he did this reluctantly with the intention of showing that even if you made some assumptions you shouldn't -- this has always been his view -- it might not help in the way you think it does.

    Your post suggests you read those old posts as showing that Jeremiah is invested in some of these statistical models of the problem, but he never has been.
  • Mathematical Conundrum or Not? Number Six
    Nothing new here, just checking my understanding. (Or, rather, whether I have shed all my misunderstandings, even recent ones.) Check my math.

    If I understand it correctly, our situation is something like this:
    [image: 2env_e.png]

    The host will choose a value for X before the player chooses an envelope. For any value the host chooses, the player's chance of choosing the smaller envelope is

    P(Y = X | X = x) = 1/2
    and you can also sum across all those to get the unconditional probability

    P(Y = X) = Σ_x P(Y = X | X = x) P(X = x) = 1/2
    And similarly the chance of picking the 2X envelope is 1/2 for any value of X or for all of them together.

    Also, for any given value a we can say that the probability of picking a is

    P(Y = a) = (P(X = a) + P(X = a/2))/2
    And naturally P(X = a | Y = a) is just

    P(X = a | Y = a) = P(X = a) / (P(X = a) + P(X = a/2))
    In considering the chance of choosing the smaller envelope, the choice of X drops out, but here we have the opposite: the equiprobability of choices made the chance of picking some value a a simple mean of the probabilities of X being a and X being a/2; now those chances of choosing cancel out, and we're only comparing probabilities of X values.

    Our expectation for the unpicked envelope:

    E(U | Y = a) = (2a·P(X = a) + (a/2)·P(X = a/2)) / (P(X = a) + P(X = a/2))
    And again, choice has dropped out completely.

    What do we know about P(X = a) and P(X = a/2)? We know that

    P(X = a | Y = a) + P(X = a/2 | Y = a) = 1
    nearly by definition, although really there are choices canceling out here.

    Do we know that both P(X = a) and P(X = a/2) are non-zero? We know that at least one of P(X = a | Y = a) and P(X = a/2 | Y = a) is non-zero, but we do not know that both are. Without Y = a, we wouldn't even know that at least one of P(X = a) and P(X = a/2) is non-zero. Without knowing that both are non-zero, we can't even safely talk about the odds P(X = a):P(X = a/2).
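    The algebra above can be spot-checked by simulation. The prior over X below is a pure assumption, chosen only so the formula has something to chew on:

```python
import random

# Hypothetical prior P(X = x); any values would do for the check.
prior = {5: 0.3, 10: 0.5, 20: 0.2}
random.seed(1)
a = 10
hits = x_eq_a = 0
for _ in range(200_000):
    x = random.choices(list(prior), weights=list(prior.values()))[0]
    y = x if random.random() < 0.5 else 2 * x   # player's envelope value
    if y == a:
        hits += 1
        x_eq_a += (x == a)
print(x_eq_a / hits)  # simulated P(X = a | Y = a)
# formula: P(X = a) / (P(X = a) + P(X = a/2)) = 0.5 / (0.5 + 0.3) = 0.625
print(prior[a] / (prior[a] + prior[a // 2]))
```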
  • Mathematical Conundrum or Not? Number Six

    Before looking in your envelope, do you have an expectation of gain from swapping?
  • Mathematical Conundrum or Not? Number Six
    To what purpose? It doesn't help you to answer any of the questions.JeffJo

    I thought putting our ignorance front and center could be a feature rather than a bug.

    Also if we do attempt to estimate the shape of the problem as a whole, it will be in terms of X.

    For instance we could ask a simplish question like, what is P(3X - a > a)?

    We'll end up doing exactly the same things and not doing exactly the same things.
  • Mathematical Conundrum or Not? Number Six

    One more question:

    What if we just say that, having observed the value of our envelope to be a, then the expected value of the other is 3X - a for some unknown X? That formula, unlike the expected value formula, doesn't require any probabilities to be filled in. It's uninformative, but so what?
  • Mathematical Conundrum or Not? Number Six
    If you don't look, the two envelopes have the same expected value. If you do, there is not enough information to say how the probabilities split between having the higher, or lower, value.JeffJo

    Okay -- this is what I keep forgetting.

    Before you look, you could say both envelopes have an expected value of m=3X/2 for some X. Once you've looked in your envelope, its expected value is no longer m and therefore the expected value of the other is no longer m.

    So we are, contrary to my wishes, forced to consider the expected value of the other envelope, and that leads directly to considering more than one possible value for the other envelope, but with no knowledge of the probabilities that attach to those values.

    Thanks for repeating yourself until I get it. I will ask that less of you as time marches on.
  • Mathematical Conundrum or Not? Number Six
    You can't know the odds when you look in an envelope and see a value. You can choose to play, not knowing the odds, but your calculation of the expectation is wrong.JeffJo

    This is the point of the odds calculation I posted before, right? The observed value of the envelope provides no information that could help you decide whether you're in [a/2, a] or [a, 2a], because your choice is always a coin flip:

    P(X = a | Y = a) : P(X = a/2 | Y = a) = P(X = a) : P(X = a/2)
    (Which raises puzzles about how switching strategies work, and I'd have to study more to be clear on that. If there is an upper bound, then you'd like to be able to look at a and determine that X=a/2 is more likely than X=a -- that is, that you're getting close to the upper bound and ought not switch. But that's all to one side.)
  • Mathematical Conundrum or Not? Number Six
    Srap Tasmaner is saying that, to someone who knows what is in *both* envelopes, the possibility of gaining or losing is determined. Michael is saying that, to someone who doesn't see both, the two cases should be treated with probabilities that are >=0, and that add up to 1.

    The error is thinking that both must be 50%. Your chance of High or Low is 50% if you don't know the value in the one you chose, but it can't be determined if you do.
    JeffJo

    I do see that. From the player's point of view her uncertainty might as well be modeled as the outcome not yet having been determined and still subject to chance.

    On the other hand, I think the right way to look at it is what I've been saying lately:

    1. there are two choices;
    2. the host's choice determines how much can be gained or lost by switching;
    3. the player's choice determines whether they gain or lose.

    The player's choice actually happens in two steps, picking and then switching or not, but the effect is the same. You could pick A and switch to B, or you could pick B and stick. The player gets to determine whether they end up with A or B by whatever method they enjoy, but that's all they get to do. More reason to think switching is pointless.

    What's frustrating about the whole expected value calculation is that the point of doing it at all is not to tinker with the chances of getting the bigger of the two envelopes the host has offered -- it has to include tinkering with the amounts in those envelopes. There's nothing to be done with the choice because whether you choose in one step or two, it's just a coin flip. (Thus different from Monty Hall, where the key is understanding how your choice works, and the odds of having chosen correctly.)

    So I've been a bit strident about this because to bother with the expected value calculation here means including in your calculation events known to be counterfactual. Is this actually unusual, or am I being stupid? Events that haven't happened yet may become counterfactual by not happening and of course we set odds for those. But I keep thinking that the player cannot choose until the host chooses -- that's the nature of the game -- and thus before there is an opportunity for the player to choose, some events have definitely become counterfactual already. I've thought that one way to describe this is to say that "the other envelope" is not well-defined until both choices have been made, and they are always made in order, (1) host, (2) player.

    So I agree with your point about the player modeling their uncertainty, the same point @andrewk and @Michael have made. But there are nuances here that puzzle me. If it turns out my concerns are misplaced, that's cool. The whole point of these sorts of puzzles is to sharpen your understanding of probability, which I am eager to do.

    ** ADDENDUM **

    I think the "counterfactual" stuff is wrong.

    It's perfectly intelligible to talk about the chances of something having happened or not happened in the past, and that's putting odds on a counterfactual. It's intelligible both in cases where you know what happened ("He just made a one in a hundred shot!") and in cases where you don't ("What's more likely? That I forgot my own name or that you typed it wrong?").

    So that's not it.

    That leaves two acceptable options:

    1. Ignore the values and only consider the odds that you picked the larger of the two offered; those odds are even.
    2. Consider the values but recognize that you do not know the probability of any particular value being in the other envelope -- in which case your calculation cannot be completed.
  • Mathematical Conundrum or Not? Number Six
    If I know that the odds are even then I will play. If I know that the odds aren't even then I might be willing to play, depending on the odds. If I don't know the odds then I will play.Michael

    This particular quandary isn't supposed to arise in real life. A bookmaker first estimates the odds, and then the payouts are simply those odds minus the vigorish (his profit). If you know that a wager is paying off at 2:1, then you know the odds of it paying off are around 2:1 against.

    If this were not true then (a) bookmaking would not be a thing; more to the point (b) you should not be gambling. Placing a wager based only on the payout without considering the odds of winning is crazy. If the principle of indifference tells you to assume there's even odds whenever you don't know the odds, then the principle of indifference shouldn't be gambling either.

    As you've pointed out, in essence what happens here is that on finding $10, you pocket $5, and then place a $5 wager at 2:1 that the other envelope has $20. So the bookmaker in this case is paying you to play, which is no way to run a book. Suppose instead you had to pay that $5 for the opportunity to switch. That is, you give back the $10 plus pay $5 out of pocket. Would you still play? If the other envelope has $5, you've lost nothing, but if it has $20, you've made $15. That seems alright. A free chance to get $15 with no risk. But then you could have walked away with $10 for no risk.

    Still not sure how to make this into a proper wager. We need to zero in on the expectation of $12.50 somehow.
  • Mathematical Conundrum or Not? Number Six

    Thanks. (There is lots I have yet to learn, so some of this is going right by me -- for now.)

    I did wonder -- maybe a week ago? it's somewhere in the thread -- if there isn't an inherent bias in the problem toward switching because of the space being bounded to the left, where the potential losses are also getting smaller and smaller, but unbounded to the right, where the potential gains keep getting bigger and bigger.

    It's just that in the single instance, this is all an illusion. There is this decreasing-left-increasing-right image projected onto the single pair of envelopes in front of you, but you can't trust it. It even occurred to me that, since there are not one but two bits of folk wisdom warning of this error -- "the grass is always greener ..." and "a bird in the hand ..." -- this might be a widely recognized (but of course easily forgotten) cognitive bias.
  • Mathematical Conundrum or Not? Number Six

    Yes, and the cutoff can be entirely arbitrary, but the effect will often be tiny. (I spent a few minutes trying to get a feel for how this works and was seeing lots of 1.000417.... sorts of numbers. The argument is sound, so I probably won't spend any more time trying to figure out how to simulate knowing nothing about the sample space and its PDF.)
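    Here's the kind of thing I was fiddling with (the cutoff and the distribution of X are both arbitrary assumptions): switch only when the observed amount is below some cutoff, and compare with always switching.

```python
import random

# Arbitrary-cutoff strategy: switch only if the observed value is small.
# X's distribution and the cutoff are both assumptions; the edge over
# always-switching is real but tiny, as the 1.000... ratios suggested.
random.seed(2)
N = 500_000
cutoff = 15
total_cutoff = total_always = 0
for _ in range(N):
    x = random.randint(1, 100)
    mine, other = (x, 2 * x) if random.random() < 0.5 else (2 * x, x)
    total_always += other
    total_cutoff += other if mine < cutoff else mine
print(total_cutoff / total_always)   # a hair above 1
```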
  • Lying to yourself

    Right? And it's a solid piece of philosophy, to my mind.
  • Lying to yourself

    Read the article if you haven't before. I just reread it and it is as good as I remember.
  • Lying to yourself

    It's very hard to judge which politicians are lying to themselves and which are soul-less tools.

    I'd rather not do more politics, but I wholeheartedly recommend this excellent piece of popular philosophy by the estimable John Scalzi: The Cinemax Theory of Racism. I think it's on point.
  • Lying to yourself

    @jkg20 is arguing the same as I did that self-deception is a violation of our norms of rationality, often related to the treatment of evidence, sometimes related to inference or other elements of reasoning.

    What you're still missing is that reasoning is a process without a pre-selected outcome. You can choose to futz with the process in various ways, and this is quite intentional, but it needn't ever lead to direct confrontation with the outcome -- that can be endlessly pushed aside and never arrived at. So there's no issue of at once assenting to and not assenting to some proposition. You just make sure that proposition never makes it to the floor for a vote. You do this deliberately. You find ways to rationalize doing it, reasons that have nothing to do with your real motivation, reasons that allow you to give what you're doing the color of rationality. No doubt when confronted you will be able to defend yourself at length and explain how every step you took or didn't take was thoroughly justified.

    The natural competitor for describing such a process is simply "being mistaken". But this sort of thing doesn't look much like being mistaken to me.
  • Mathematical Conundrum or Not? Number Six

    I agree completely and have so argued. All you really have to do to get the ball rolling is designate the value in the envelope. It's the innocent "Let Y = ..." This is what I love about that paper Jeremiah linked. I have repeatedly voiced my bafflement that just assigning variables a certain way can lead to so much trouble, and the paper addresses that directly.
  • Mathematical Conundrum or Not? Number Six
    between the X and 2X envelope amountsAndrew M

    Right. The paper Jeremiah linked talks about this too. I was thinking about this on a 6-hour drive a few days ago, and I agree that in general we're talking vanishing smallness. However -- one neat thing about how the game works is that the numbers, and thus the intervals get bigger and bigger faster and faster. Or, rather: all of these strategies are mainly designed to avoid big losses, so we don't really care if small intervals are hard to hit; we only care about the really big intervals and those are if not easy, at least easier to hit. Any big loss avoided will end up pushing you over breaking even.

    This practical reasoning stuff I find endlessly cool -- but it's only barely related to the "puzzle" aspect here.
  • Mathematical Conundrum or Not? Number Six

    Switching is not objectively worse than sticking. It's also not objectively better. Half the time switching is a mistake. Half the time sticking is a mistake. But that's because of your choice, not because of the values in the envelopes.

    But it is still false that you have an expected gain of 1/4 the value of your envelope. You really don't. All these justifications for assigning 50% to more possibilities than two envelopes can hold are mistaken. You picked from one pair of envelopes. This is the only pair that matters. You either have the bigger or the smaller. Trading the bigger is a loss, trading the smaller is a gain, and it's the same amount each way.

    (Maybe one day I'll figure out how to make this into a proper wager -- something with paying each other the value of the envelopes each ends up with. As it is, you make money either way.)
  • Mathematical Conundrum or Not? Number Six

    If X = 10 and your envelope is worth 10, you have the X envelope. By trading, you gain X. This is the X that matters. For any pair of envelopes, there is a single value of X. (If your envelope was worth 20, you would have the 2X envelope and would lose X by trading.)

    If X = 5 and your envelope is worth 10, you have the 2X envelope. By trading, you lose X. (If your envelope was worth 5, you would gain X by trading.)

    Every pair of envelopes has one larger and one smaller. You have an even chance of picking the larger or the smaller. If you picked the larger, you can only lose by trading. If you picked the smaller, you can only gain by trading. There is never any case, no matter what envelope you picked from whatever pair, in which you have a non-zero chance of gaining and a non-zero chance of losing. It's always one or the other and never both.
  • Mathematical Conundrum or Not? Number Six
    I accept all of them. I reject the implicit conclusion that the gain and loss are symmetric. If my envelope contains £10 then 3. and 5. are:

    3. If I trade the 2X = 10 envelope, I lose X = 5.
    5. If I trade the X = 10 envelope, I gain X = 10.
    Michael

    Then you reject 1, because those are two different values of X.
  • Mathematical Conundrum or Not? Number Six

    What I don't understand is what your argument is against the alternative analysis. Which of these do you not accept?

    1. The envelopes are valued at X and 2X, for some unknown X.
    2. You have a 1/2 chance of picking the 2X envelope.
    3. If you trade the 2X envelope, you lose X.
    4. You have a 1/2 chance of picking the X envelope.
    5. If you trade the X envelope, you gain X.
  • Mathematical Conundrum or Not? Number Six
    If there's £10 in my envelope and I know that the other envelope contains either £5 or £20 because I know that one envelope contains twice as much as the other then I have a reason to switch; I want an extra £10 and am willing to risk losing £5 to get it.Michael

    I think you're making two assumptions you shouldn't:

    1. there is non-zero chance C that the other envelope contains twice the value of your envelope;
    2. the odds Not-C:C are no greater than 2:1, so that you can expect at least to break even.

    There is no basis whatsoever for either of these assumptions.

    In my example, if I choose B, you stand to gain 2 or to lose 2, depending on which envelope you chose, and whether you switch. The amount you gain or lose was determined by me in choosing B. If I choose E, you stand to gain 6 or to lose 6, depending on which envelope you chose, and whether you switch. The amount you gain or lose was determined by me in choosing E.
  • Mathematical Conundrum or Not? Number Six

    You may entertain yourself by switching and call that a reason, but there is no expected gain from switching.
  • Mathematical Conundrum or Not? Number Six
    And if I see £10 then I stand to gain £10 and I stand to lose £5.Michael

    If you see £10 then either you stand to gain £10 or you stand to lose £5, but not both.

    I have two pairs of envelopes A = {5, 10} and B = {10, 20}. I'm going to offer you a choice from one pair or the other. What is your chance of getting 10?

    P(10 | A) = P(10 | B) = 1/2

    if we assume you're an average chooser without "10 seeking" talent. Clear enough.

    But what is P(10)?

    P(10) = P(10 | A)P(A) + P(10 | B)P(B) = P(10 | A)P(A) + P(10 | B)(1 - P(A)) = P(A)/2 + 1/2 - P(A)/2 = 1/2.

    So your chance of picking 10 is 1/2, no matter what P(A) is. P(A) drops out.

    What's your chance of picking 5?

    P(5) = P(5 | A)P(A) = P(A)/2

    What's your chance of picking 20?

    P(20) = P(20 | B)(1 - P(A)) = (1 - P(A))/2

    No idea in either case, because P(A) does not drop out.
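    These drop-out claims are easy to check numerically (the values of P(A) below are arbitrary):

```python
import random

# Simulate the {5, 10} / {10, 20} setup for two different values of
# P(A). P(10) stays at 1/2; P(5) and P(20) move with P(A).
def rates(p_A, n=200_000, seed=3):
    rng = random.Random(seed)
    counts = {5: 0, 10: 0, 20: 0}
    for _ in range(n):
        pair = (5, 10) if rng.random() < p_A else (10, 20)
        counts[rng.choice(pair)] += 1
    return {k: v / n for k, v in counts.items()}

print(rates(0.2))  # P(10) ~ 0.5, P(5) ~ 0.1,  P(20) ~ 0.4
print(rates(0.9))  # P(10) ~ 0.5, P(5) ~ 0.45, P(20) ~ 0.05
```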

    If you got a 10, what's your expected value for the other envelope U? You can go two ways here. You could say

    E(U | 10) = 5 * P(A)/2 + 20 * (1 - P(A))/2

    and that would be true, but it somewhat misses the point. I choose before you choose. "The other envelope" is not well-defined until I have chosen A or B, at which point you can say P(A) = 1 or P(A) = 0. You never get to pick from all four envelopes; you only get to pick from the pair I have chosen. We ignored this when calculating P(10) because my choice didn't matter. Now it does.

    E(U | A, 10) = 5 and E(U | B, 10) = 20.

    You'll still want to do this

    E(U | 10) = E(U | A, 10)P(A) + E(U | B, 10)(1 - P(A))

    and then say that since you know nothing about P(A), you can only apply the principle of indifference and assume P(A) = 1/2. You might be wrong; I may be unaccountably inclined toward A to the tune of 1000:1 but you have no way of knowing that and the rational thing to do is go with indifference.

    But this only makes sense because I've told you that I was choosing from two sets of envelopes in the first place. What if I didn't tell you that? What if there only ever was one pair? What if there were thousands? Maybe some of those have duplicate amounts, maybe not. Maybe there's only a single pair with 10 in it. (This is @JeffJo's thing, and it's worth thinking about. You can't really even count on using 1 - P(A), much less assume P(A) = 1/2.)


    Here's a real life version. Suppose I have some cash and two envelopes, and I'm going to split my cash in such a way that one envelope has twice as much as the other. Suppose I have one $10, one $5, and six $1's. What are my options?

    A = {1, 2}
    B = {2, 4}
    C = {3, 6}
    D = {5, 10}
    E = {6, 12}
    F = {7, 14}

    There are some combinations I can't make because I don't have enough of the right denominations.

    We could talk about this table as we talked about the collection {{5, 10}, {10,20}}. If you knew how much money I had and in what denominations, there are several cases in which, upon opening an envelope, you'd already know whether you have the larger or the smaller.
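    The possible pairs can be enumerated by brute force (a sketch; it just checks which {x, 2x} splits the bills allow):

```python
from itertools import combinations

# Brute-force check of which {x, 2x} pairs the bills allow
# (one $10, one $5, six $1's).
bills = [10, 5] + [1] * 6

def splittable(x):
    # Can we pick disjoint subsets of the bills summing to x and 2x?
    idx = range(len(bills))
    for r in range(1, len(bills) + 1):
        for low in combinations(idx, r):
            if sum(bills[i] for i in low) != x:
                continue
            rest = [bills[i] for i in idx if i not in low]
            # subset-sum for 2x over the remaining bills
            sums = {0}
            for b in rest:
                sums |= {s + b for s in sums}
            if 2 * x in sums:
                return True
    return False

possible = [x for x in range(1, 11) if splittable(x)]
print(possible)   # [1, 2, 3, 5, 6, 7]
```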

    But let's suppose you don't know any of that. You could also figure that if you got an odd number it must be the smaller (because I'm only using bills, no coins), so I'll cleverly not choose one of those; I'll choose only from

    B = {2, 4}
    E = {6, 12}
    If I choose B, and you draw the 2, you can reason that I would have excluded {1, 2}, so the other must be 4. Similarly, if I choose E, and you draw the 6, then you can reason that I would have excluded {3, 6} and so the other must be 12. Ah well. I'd have to have more money to make the game perfect.

    But what if I choose B and you draw the 4? 8 is mathematically possible, but there's no {4, 8} here. Similarly, if I choose E and you draw the 12; 24 is mathematically possible, but there's no {12, 24} here.

    So what is your expectation before getting an envelope? Unknown. Something less than half the total cash I have on me, which you don't know, but there are other constraints based on the denominations and some gamesmanship.

    Again, there's no game until I choose. Say I choose B. You don't know it, but the average value of B is 3. If you draw the 2, trading gains you 2; if you choose the 4, trading costs you 2. Say I choose E. You don't know it, but the average value of E is 9. If you choose 6, trading gains you 6; if you choose 12, trading costs you 6.

    Once I have chosen, what you stand to gain or lose by switching is always a fixed amount, without regard to how you choose. Even taking the god's-eye-view of all the possibilities, as we did above with {{5, 10}, {10, 20}}, there is no case in which you stand both to gain and to lose.
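    That fixedness is easy to see in simulation: once a pair is chosen, the amount at stake in a switch never varies, only its sign. A minimal sketch:

```python
import random

def switch_gain(pair):
    """Deal the pair into 'yours' and 'other' at random; return what switching changes."""
    yours, other = random.sample(pair, 2)
    return other - yours

random.seed(1)
# With pair B = (2, 4) fixed, every trial gains or loses exactly 2:
gains = {switch_gain((2, 4)) for _ in range(10_000)}
print(sorted(gains))  # [-2, 2]
```

    There is no run of this game in which the magnitude at stake is anything other than the smaller value of the chosen pair.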

    You may still think it's rational to assume there is. That is, on drawing 4, to assume the envelopes might very well be {4, 8} rather than {2, 4}, and even to assume the chances of {2, 4} and {4, 8} are equal.

    That's a lot of assuming. (And it will convince you to trade your 4, which is a mistake.) You could instead recognize that all of your choices are conditional on my choice: my choice determines how much is to be gained or lost; your choice determines whether you gain or lose. There are some cases where you can guess whether you have the lower value or the higher, but that's just a guess. (If you draw a 6, do you know for a fact that the envelopes aren't {3, 6}? Of course not. I may have chosen {3, 6} just on the chance that you wouldn't expect me to include any odd numbers.)

    So what is the rational expectation for the other envelope, given that I have chosen and given that you have chosen? There is no chance left once we've both chosen, though there is knowledge on my side and ignorance on yours. Does the other envelope contain either half or twice the amount in yours? Yes, of course. Are there non-zero chances of both? No. Should you assume there are anyway? No. You should recognize that I have fixed the amount you will gain or lose by switching; you cannot know whether you chose the larger or the smaller, so you cannot know whether you will gain or lose that fixed amount by switching, so there is no reason either to switch or to stick.

    (Note also that we get here without assigning P(A) or P(B) or P(any letter) a value. I choose, then you choose. That's it.)

    EDIT: Table typos.
  • Ongoing Tractatus Logico-Philosophicus reading group.
    It's just that I would go further than I think he does, and reject the notion of a picture as a model that is distinct from reality.Banno

    A picture is a fact, and thus part of reality, part of the world.
  • Mathematical Conundrum or Not? Number Six
    It still feels to me like we're circling around the difference between

    P(picking larger)

    and

    P(I picked larger | I picked)

    All of us agree the first is just 1/2.** But the second is troublesome. Once you've picked, you definitely have the larger or the smaller, but you don't know which. It might be safe to continue to treat this the same as just the chance of picking larger, so long as you don't use the observed value of what you picked. But if you want to use the observed value, you have to be very careful to avoid saying things that amount to there being a 1/2 chance that 10 > 20.


    ** Although maybe it needn't be. Suppose you actually had data on individuals picking, and one individual is consistently "lucky". We don't need to know why or how, but we could still say this individual's chance of picking the larger is better than average.
  • Epistemic justification
    knowledge is (ontologically) mental phenomena based upon experiencenumberjohnny5

    But a super special kind of mental phenomena. If you want to pick out some of your beliefs and call them "knowledge", you do that by saying something about the connection between those beliefs, the mental phenomena, and the content of those beliefs, what the beliefs are about, and what the beliefs are about is not (necessarily) mental.
  • Mathematical Conundrum or Not? Number Six
    The point is that, in the correct version for your calculation E=($V/2)*P1 + ($2V)*P2, the probability P1 is not the probability of picking the larger value. It is the probability of picking the larger value, given that the larger value is $10. In my example, that's 90%. In the OP, you do not have information that will allow you to say what it is.JeffJo

    This is absolutely right. I think the confusion comes when you switch from

    E(other) = (larger)P(picked smaller) + (smaller)P(picked larger)

    where the probabilities of picking smaller and larger are equal, to

    E(other | picked = a) = (2a)P(picked smaller | picked = a) + (a/2)P(picked larger | picked = a)

    because it's tempting to think these conditional probabilities are equal, just like the unconditional probabilities above, but this we do not know.
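    JeffJo's 90% figure comes from a particular prior he constructed; his exact setup isn't reproduced here, but any lopsided prior over pairs shows the effect. In this hypothetical, {5, 10} is nine times as likely as {10, 20}, so someone holding 10 almost certainly holds the larger envelope, even though the unconditional chance of picking the larger is still 1/2:

```python
import random

random.seed(0)
# Hypothetical prior: {5, 10} drawn 9 times in 10, {10, 20} once in 10.
PAIRS = [(5, 10)] * 9 + [(10, 20)]

picked_larger = 0            # unconditional tally
saw_10 = saw_10_smaller = 0  # tallies conditional on observing 10

TRIALS = 100_000
for _ in range(TRIALS):
    pair = random.choice(PAIRS)
    picked = random.choice(pair)
    picked_larger += (picked == max(pair))
    if picked == 10:
        saw_10 += 1
        saw_10_smaller += (picked == min(pair))

print(round(picked_larger / TRIALS, 2))   # ≈ 0.5: P(picked smaller) is 1/2
print(round(saw_10_smaller / saw_10, 2))  # ≈ 0.1: P(picked smaller | picked = 10) is not
```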

    (Philosophical aside: I think this is close to my concern that there is a difference between "There's a 1/2 chance of my picking the smaller envelope" and "There's a 1/2 chance that the value of the envelope I picked is the smaller.")

    What is true is that

    P(picked smaller | smaller = c) = P(picked larger | smaller = c) = 1/2

    but that's completely different.

    But averaged over all possible values of V, there will be no expected gain.JeffJo

    I would still like to know more about how this works, though it may be over my head.
  • Mathematical Conundrum or Not? Number Six
    If the initial set up calls for randomly assigning values for the two envelopes in the finite range ((1,2),(2,4),(4,8)) for instance, then, in that case, assuming the player knows this to be the initial set up (and hence uses it as his prior) then the posterior probability conditionally on observing any value of M that isn't either 1 or 8 (that is, conditionally on values 2 or 4 being observed) p will indeed be 1/2.Pierre-Normand

    In one sense, yes, because we can say E(N | M=a) = a(3E(p) + 1)/2, where p = P(S=a | M=a).
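    With p the chance that the observed value a is the smaller of the pair, the other envelope N is 2a with probability p and a/2 with probability 1 − p, which works out to a(3p + 1)/2. A quick arithmetic check (function name mine):

```python
def e_other(a, p):
    """Expected other-envelope value, given observed a and P(a is smaller) = p."""
    return 2 * a * p + (a / 2) * (1 - p)

# Agrees with a(3p + 1)/2:
assert e_other(8, 0.5) == 8 * (3 * 0.5 + 1) / 2 == 10.0

# At p = 1/2 this is the familiar 5a/4 "always switch" expectation;
# at p = 0 (you surely hold the larger) it is a/2, and at p = 1 it is 2a.
print(e_other(8, 0.0), e_other(8, 0.5), e_other(8, 1.0))  # 4.0 10.0 16.0
```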

    But how do we calculate E(p)? I think the player in your example can, but can a player with a lot less information than yours?