• Mathematical Conundrum or Not? Number Six

    You just telescoped the step of multiplying by the chance of picking that number.

    Could put & where you have |.
  • Mathematical Conundrum or Not? Number Six
    P(lower) = P(lower|5) + P(lower|10) + P(lower|20) = 4/10 + 1/10 + 0/10 = 1/2.Andrew M

    This isn't what you mean, is it?

    P(lower | 5) = 4/4 = 1, P(lower | 10) = 1/5, P(lower | 20) = 0/1.
  • Mathematical Conundrum or Not? Number Six
    Reinventing math step-by-step is interesting, and I'm gaining insight by making every possible mistake, and doing so in public, but it would be far more efficient just to study more.
  • Mathematical Conundrum or Not? Number Six
    Imagine u around .75L and v around .9L. They're just randomly selected values in [0, L]. We can't say at the same time that P(u < v) = .9 and P(v < u) = .75. Oops.

    Instead you have to say something like P(u < V | V = v, u in [0, L]) = v/L. And then P(u > V | V = v, u in [0, L]) = 1 - v/L. Anyway that's closer.

    (The other way, you might get, as with the values above, P(u > v) + P(v > u) > 1 or you demand that u + v = L, which is silly.)

    I feel bad about how messy I'm still being with variables.
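
    Still, here's a quick numerical check of that conditional form. It's just a sketch that assumes u is uniform on [0, L] (which I take to be the intent), with an arbitrary L and v:

    # Sketch: check P(u < V | V = v, u in [0, L]) = v/L by drawing u uniformly on [0, L].
    # The particular L and v are arbitrary choices for illustration.
    import random

    L, v, trials = 4.0, 3.1, 200_000
    hits = sum(random.uniform(0, L) < v for _ in range(trials))
    print(hits / trials, v / L)   # the two numbers should be close (about 0.775)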
  • Mathematical Conundrum or Not? Number Six
    It's also slightly more complicated than I wanted because of the "reference" problem. If you don't designate either u or v as the reference variable, it all goes to hell.
  • Mathematical Conundrum or Not? Number Six

    Coin flips and coin flips with colored envelopes are just the wrong kind of example to look at, because (a) you have categorical instead of numeric data, which means you're going to be tempted to substitute the probability of an event for the event, and (b) coin flips have magic numbers built in, magic numbers that happen to match up to the magic numbers you're trying to distinguish (the chances associated with choosing). This is just bad methodology. When you're trying to figure out some bit of math, you should go out of your way to avoid these magic numbers, and only bring them back in as easy-to-solve special cases of the general problem.

    I gave you an example about as general as I could think of. Look at how that example works.
  • Mathematical Conundrum or Not? Number Six
    That's because I disagree with your interpretation of probability. Your reasoning would seem to suggest that there's a 50% chance of a coin flip landing heads, but that after a flip, but before looking, we can't say that there's a 50% chance that it is heads. I think that we can say that.Michael

    Admittedly, in strident moments I have said things like this.

    But look at my last post. It's not about interpretations of probability. It's about how conditional probability works, and it can be a little counter-intuitive.
  • Mathematical Conundrum or Not? Number Six
    There's a 50% chance of picking the lower-value envelope, and so after having picked an envelope it's in an "unknown state" that has a 50% chance of being either the lower- or the higher-value envelope?Michael

    Let's leave the envelopes aside for a moment.

    Imagine an interval [0, L] for some positive real number L. Now let u and v be unequal real numbers in that interval. What is the chance that u < v? Intuitively, that's just v/L. Given a choice between u and v, what is the chance of picking u or v? 1/2. Given that you picked one of u and v, what is the chance that the one you picked is less than the other? We'll call your choice S and the other T (and abuse that notation):

    P(S < T | S = u) = v/L, P(S < T | S = v) = u/L

    Not only do we not know that those are equal, we know that they aren't, because u and v are unequal. But we can say

    P(S < T | S ∊ {u, v}) = (u + v)/2L

    because the chances of picking u and picking v are equal. Clearly,

    * P(S < T | S ∊ {u, v}) = 1/2

    if and only if

    * (u + v)/2L = 1/2
    * u + v = L

    But there's no reason at all to think that u + v = L. All we know is that u + v ≤ 2L. (We could go on to ask what P(u + v = L), but I'm not sure what the point of that would be.)

    So the answer to this question, "What is the chance that the one you picked is smaller?" may or may not be 1/2, even though your chance of picking the smaller is 1/2. (And if it turns out u + v = L, that's sheerest coincidence and has nothing to do with your choice.)

    "My chance of picking the smaller" is just not the same as "the chance of what I picked being smaller", as I've been saying ineffectually for like 3 weeks.
  • Mathematical Conundrum or Not? Number Six

    1. For a single trial, the player cannot calculate an expected value for the other envelope, and therefore either (a) they cannot make a rational decision to switch or stick, or (b) they must use some criterion other than expected value.

    2. For multiple trials, the Always Switch and Always Stick strategies are, in the long run, indistinguishable.

    If I am the player, and I know that (2) is the case, that does not entail, in the case before me, either that I should switch or that I should stick. For any particular trial, either switching or sticking is the right thing to do. But once I know that (1) is the case, I can return to (2) and conclude either that there is no way for me to steer the outcome toward gain, or that I should try some other strategy. (Even if I only get the one chance, I should, if I can, use a method that improves my chances of gain, even if it's a small improvement, unless the disutility of using such a method outweighs the expected gain.) Or I could return to (1) and see if there is anything besides expected value out there.

    I believe both (1) and (2), but I am not clear on the relation between them.
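
    (A quick simulation sketch of (2), just to reassure myself. The host's distribution over X is an arbitrary assumption here, which is exactly what the single-trial problem denies us:)

    # Sketch: Always Stick vs Always Switch over many trials.
    # The host's distribution over X is an arbitrary assumption for illustration.
    import random

    def long_run_average(strategy, trials=200_000):
        total = 0
        for _ in range(trials):
            x = random.choice([1, 2, 4, 8, 16])   # assumed host distribution
            envelopes = [x, 2 * x]
            random.shuffle(envelopes)
            picked, other = envelopes
            total += other if strategy == "switch" else picked
        return total / trials

    print(long_run_average("stick"), long_run_average("switch"))   # both come out around 9.3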
  • Mathematical Conundrum or Not? Number Six

    It's only the difference between describing your expectation conditionally and unconditionally. By describing your expectation conditionally, you leave room for the future evidence you would rely on to update.
  • Mathematical Conundrum or Not? Number Six
    If I'm told that one envelope contains twice as much as the other, and if I pick one at random, am I right in saying before I open it that there's a 50% chance that my envelope contains the smaller amount? If so, I must also be right in saying after I open it and see the amount that there's a 50% chance that my envelope contains the smaller amount.Michael

    I share your frustration, Michael.

    If I offer you a choice between envelopes containing $5 and $10, you have a 50% chance of picking the envelope that has $5 in it. Having chosen an envelope, you no longer have a chance of picking the $5 envelope -- you either did or didn't.

    You still have to express your uncertainty about whether the envelope you did pick was the smaller or the larger, since you don't know the contents of both envelopes, and seeing $10 doesn't tell you whether you got the smaller or the larger. But you cannot say that the amount you observe has a 50% chance of being the smaller. Given more complete knowledge, you would find you had been saying that $10 has a 50% chance of being smaller than $5, which is absurd.

    The only safe way to express your uncertainty -- that is, the only way to formulate it so that increasing your knowledge wouldn't render your beliefs absurd -- is conditionally. And this makes sense. If the larger amount is $10, then picking $10 is necessarily picking the larger amount; if the larger amount is $20, then picking $10 is necessarily picking the smaller amount.

    When you work through expressing the probabilities conditionally, you find that the 50% associated with your choosing an envelope cancels out. And in a sense, it should -- we're now working out the consequences of your choice. The uncertainty that remains does not derive from your choosing at all -- we're past that. The uncertainty that remains is all down to the host's choice of envelope values.

    After that, I'm still a bit murky, I'm sorry to say. Part of it I can see as the sometimes counter-intuitive nature of conditional probabilities, but part of it still eludes me.
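
    Here's a rough simulation of the part I do see, with a completely made-up host distribution over the two possible pairs -- that distribution is precisely the thing the player doesn't know. Conditional on seeing $10, how often it is the smaller amount depends entirely on how often the host offered ($10, $20) rather than ($5, $10), and not at all on the 50/50 pick:

    # Sketch: how often an observed $10 turns out to be the smaller amount.
    # The host's weights over the two pairs are a pure assumption for illustration.
    import random

    host_pairs = [(5, 10), (10, 20)]
    host_weights = [0.8, 0.2]   # made up; the player has no way of knowing these

    seen_10 = 0
    ten_was_smaller = 0
    for _ in range(200_000):
        lo, hi = random.choices(host_pairs, weights=host_weights)[0]
        picked = random.choice([lo, hi])   # the player's 50/50 choice
        if picked == 10:
            seen_10 += 1
            ten_was_smaller += (picked == lo)

    print(ten_was_smaller / seen_10)   # about 0.2 with these weights, not 0.5

    Change the weights and that conditional number moves around freely; the only thing that stays fixed at 1/2 is the chance of having picked the smaller envelope before you see anything.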
  • Mathematical Conundrum or Not? Number Six
    The result that Pr(X=$10) must be zero is not an issue then, because F(X=$10) might not be.JeffJo

    Right, right. (I am actually studying in my spare time, I swear.)

    You can't simply treat a random variable as an unknown. You can consider a set of unknown values from its range, but only if you couple that with their probabilities.JeffJo

    If I may take advantage of your patience a bit more ...

    Suppose I naively approach the problem this way: I think I'm solving for an unknown; I observe the value of y, and I write down my equations:

    (1) y + u = 3x
    (2) y = 10

    I can then get as far as

    u = 3x - 10

    but lacking, say,

    *(3) 3x = 30

    I'm stuck with an equation that still has two unknowns, so I am forced to treat u and x as variables rather than simply unknown values. As you note, instead of using x I could also write

    (4) y = ru

    where r ∊ {½, 2} but that leaves me with two sets of equations:

    {y = 10, u = 2y}, {y = 10, u = y/2}

    Either of those can be solved, but not knowing the value of r, it still amounts to an equation

    u = 10/r

    with two unknowns. Since I can't solve that, I'm back to variables instead of unknowns.
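
    For what it's worth, here's a quick sympy sketch of that underdetermination -- nothing the algebra above doesn't already say:

    # Sketch: the system y + u = 3x, y = 10 only expresses u in terms of the
    # remaining unknown x; likewise y = ru, y = 10 only gives u in terms of r.
    from sympy import symbols, Eq, solve

    y, u, x, r = symbols('y u x r')

    print(solve([Eq(y + u, 3 * x), Eq(y, 10)], [y, u]))   # {y: 10, u: 3*x - 10}
    print(solve([Eq(y, r * u), Eq(y, 10)], [y, u]))       # {y: 10, u: 10/r}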
  • Mathematical Conundrum or Not? Number Six
    To remove that option, I recast the problem with the envelopes containing IOUs rather than cashandrewk

    I don't think we really need to agonize over the amounts supposedly being money. We could use real numbers and play competitively. The winner is just whoever ends up with the highest number.
  • Mathematical Conundrum or Not? Number Six

    Thanks. You've told me this before -- and I appreciate your patience. I'll mull it over some more.

    I think I'm just reluctant to see the simple situation of choosing between two envelopes of different values in terms of the strange behavior of infinity.

    I keep thinking of switching as just being a positive or negative change, but the switching argument accepts that!

    Every time I think I've got a handle on this, it slips away.
  • Mathematical Conundrum or Not? Number Six
    the equiprobability condition that alone grounds the derivation of the unconditional 1.25X expectation from switching.Pierre-Normand

    I'm still confused. This makes it sound like the switching argument isn't fallacious -- it just makes an unwarranted assumption. So if every value of X were equally probable, then it would be true that you can expect a .25 gain from switching? I see how the math works, but if that's true, isn't it true whether you know the value of your envelope or not? And if that's true, isn't it also true for the other envelope as well?

    If that's how the calculation goes, then something's wrong because in any pair of envelopes, trading is a gain for one side and a loss for the other.
  • Mathematical Conundrum or Not? Number Six

    I'm still working on it.

    We can also say that

    P(X = a) + P(X = a/2) <= 1

    but other than that, their values can range freely.*** (It is in some sense a coincidence that their sum can also be described as the event Y = a.)

    Do we need to assume that X is not continuous? If it is, all these probabilities are just 0, aren't they?

    *** Urk. Forgetting that at least one of them has to be non-zero.
  • Mathematical Conundrum or Not? Number Six
    you can't define/calculate the prior distribution, and that it was a misguided effort to even try (as you did)JeffJo

    FWIW, my memory is that @Jeremiah only got into the sims & distributions business because everyone was talking about these things and it was his intention to put an end to all the speculation and get the discussion back on track. It seemed to me he did this reluctantly with the intention of showing that even if you made some assumptions you shouldn't -- this has always been his view -- it might not help in the way you think it does.

    Your post suggests you read those old posts as showing that Jeremiah is invested in some of these statistical models of the problem, but he never has been.
  • Mathematical Conundrum or Not? Number Six
    Nothing new here, just checking my understanding. (Or, rather, whether I have shed all my misunderstandings, even recent ones.) Check my math.

    If I understand it correctly, our situation is something like this:
    [image: 2env_e.png]

    The host will choose a value for X before the player chooses an envelope. For any value the host chooses, the player's chance of choosing the smaller envelope is

    P(Y = X | X = x) = 1/2

    and you can also sum across all those to get the unconditional probability

    P(Y = X) = Σx P(Y = X | X = x) P(X = x) = (1/2) Σx P(X = x) = 1/2

    And similarly the chance of picking the 2X envelope is 1/2 for any value of X or for all of them together.

    Also, for any given value a we can say that the probability of picking a is

    P(Y = a) = P(Y = a | X = a) P(X = a) + P(Y = a | X = a/2) P(X = a/2) = (1/2) P(X = a) + (1/2) P(X = a/2)

    And naturally P(X = a | Y = a) is just

    P(X = a | Y = a) = (1/2) P(X = a) / P(Y = a) = P(X = a) / (P(X = a) + P(X = a/2))

    In considering the chance of choosing the smaller envelope, the choice of X drops out, but here we have the opposite: the equiprobability of choices made the chance of picking some value a a simple mean of the probabilities of X being a and X being a/2; now those chances of choosing cancel out, and we're only comparing probabilities of X values.

    Our expectation for the unpicked envelope:

    E(other | Y = a) = 2a P(X = a | Y = a) + (a/2) P(X = a/2 | Y = a) = (2a P(X = a) + (a/2) P(X = a/2)) / (P(X = a) + P(X = a/2))

    And again, choice has dropped out completely.

    What do we know about P(X = a) and P(X = a/2)? We know that

    P(X = a | Y = a) + P(X = a/2 | Y = a) = 1

    nearly by definition, although really there are choices canceling out here.

    Do we know that both P(X = a) and P(X = a/2) are non-zero? We know that at least one of P(X = a | Y = a) and P(X = a/2 | Y = a) is non-zero, but we do not know that both are. Without Y = a, we wouldn't even know that at least one of P(X = a) and P(X = a/2) is non-zero. Without knowing that both are non-zero, we can't even safely talk about the odds P(X = a):P(X = a/2).
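
    To check my own algebra, here's a small numerical sketch. The prior over X is entirely made up; the only point is that the player's 1/2s cancel and the answer depends on nothing but P(X = a) and P(X = a/2):

    # Sketch: E(other | Y = a) computed from an assumed, made-up prior over X.
    # The player's choice probabilities cancel; only P(X = a) and P(X = a/2) matter.

    prior = {5: 0.3, 10: 0.5, 20: 0.2}   # assumed P(X = x), purely for illustration

    def expected_other(a):
        p_small = prior.get(a, 0.0)      # P(X = a): the observed a is the smaller amount
        p_large = prior.get(a / 2, 0.0)  # P(X = a/2): the observed a is the larger amount
        if p_small + p_large == 0:
            return None                  # a could not have been observed at all
        return (2 * a * p_small + (a / 2) * p_large) / (p_small + p_large)

    print(expected_other(10))   # 14.375 here, not 1.25 * 10 = 12.5
    print(expected_other(40))   # 20.0: with this prior an observed 40 must be the 2X envelope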
  • Mathematical Conundrum or Not? Number Six

    Before looking in your envelope, do you have an expectation of gain from swapping?
  • Mathematical Conundrum or Not? Number Six
    To what purpose? It doesn't help you to answer any of the questions.JeffJo

    I thought putting our ignorance front and center could be a feature rather than a bug.

    Also if we do attempt to estimate the shape of the problem as a whole, it will be in terms of X.

    For instance we could ask a simplish question like, what is P(3X - a > a)?

    We'll end up doing exactly the same things and not doing exactly the same things.
  • Mathematical Conundrum or Not? Number Six

    One more question:

    What if we just say that, having observed the value of our envelope to be a, then the expected value of the other is 3X - a for some unknown X? That formula, unlike the expected value formula, doesn't require any probabilities to be filled in. It's uninformative, but so what?
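
    (That formula is really just bookkeeping: whichever envelope you picked, the two amounts sum to 3X, so the other holds 3X minus yours. A trivial sketch, with arbitrary X values:)

    # Sketch: for any pair (X, 2X), picked + other = 3X, so other = 3X - picked,
    # whichever envelope was picked. The X values are arbitrary.
    import random

    for X in (1, 7, 50):
        picked = random.choice([X, 2 * X])
        other = 3 * X - picked
        assert {picked, other} == {X, 2 * X}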
  • Mathematical Conundrum or Not? Number Six
    If you don't look, the two envelopes have the same expected value. If you do, there is not enough information to say how the probabilities split between having the higher, or lower, value.JeffJo

    Okay -- this is what I keep forgetting.

    Before you look, you could say both envelopes have an expected value of m=3X/2 for some X. Once you've looked in your envelope, its expected value is no longer m and therefore the expected value of the other is no longer m.

    So we are, contrary to my wishes, forced to consider the expected value of the other envelope, and that leads directly to considering more than one possible value for the other envelope, but with no knowledge of the probabilities that attach to those values.

    Thanks for repeating yourself until I get it. I will ask that less of you as time marches on.
  • Mathematical Conundrum or Not? Number Six
    You can't know the odds when you look in an envelope and see a value. You can choose to play, not knowing the odds, but your calculation of the expectation is wrong.JeffJo

    This is the point of the odds calculation I posted before, right? The observed value of the envelope provides no information that could help you decide whether you're in [a/2, a] or [a, 2a], because your choice is always a coin flip:

    P(Y = a | X = a/2) = P(Y = a | X = a) = 1/2, so P(X = a | Y = a) : P(X = a/2 | Y = a) = P(X = a) : P(X = a/2)

    (Which raises puzzles about how switching strategies work, and I'd have to study more to be clear on that. If there is an upper bound, then you'd like to be able to look at a and determine that X=a/2 is more likely than X=a -- that is, that you're getting close to the upper bound and ought not switch. But that's all to one side.)
  • Mathematical Conundrum or Not? Number Six
    Srap Tasmaner is saying that, to someone who knows what is in *both* envelopes, the possibility of gaining or losing is determined. Michael is saying that, to someone who doesn't see both, the two cases should be treated with probabilities that are >=0, and that add up to 1.

    The error is thinking that both must be 50%. Your chance of High or Low is 50% if you don't know the value in the one you chose, but it can't be determined if you do.
    JeffJo

    I do see that. From the player's point of view her uncertainty might as well be modeled as the outcome not yet having been determined and still subject to chance.

    On the other hand, I think the right way to look at it is what I've been saying lately:

    1. there are two choices;
    2. the host's choice determines how much can be gained or lost by switching;
    3. the player's choice determines whether they gain or lose.

    The player's choice actually happens in two steps, picking and then switching or not, but the effect is the same. You could pick A and switch to B, or you could pick B and stick. The player gets to determine whether they end up with A or B by whatever method they enjoy, but that's all they get to do. More reason to think switching is pointless.

    What's frustrating about the whole expected value calculation is that the point of doing it at all is not to tinker with the chances of getting the bigger of the two envelopes the host has offered -- it has to include tinkering with the amounts in those envelopes. There's nothing to be done with the choice because whether you choose in one step or two, it's just a coin flip. (Thus different from Monty Hall, where the key is understanding how your choice works, and the odds of having chosen correctly.)

    So I've been a bit strident about this because to bother with the expected value calculation here means including in your calculation events known to be counterfactual. Is this actually unusual, or am I being stupid? Events that haven't happened yet may become counterfactual by not happening, and of course we set odds for those. But I keep thinking that the player cannot choose until the host chooses -- that's the nature of the game -- and thus before there is an opportunity for the player to choose, some events have definitely become counterfactual already. I've thought that one way to describe this is to say that "the other envelope" is not well-defined until both choices have been made, and they are always made in order, (1) host, (2) player.

    So I agree with your point about the player modeling their uncertainty, the same point @andrewk and @Michael have made. But there are nuances here that puzzle me. If it turns out my concerns are misplaced, that's cool. The whole point of these sorts of puzzles is to sharpen your understanding of probability, which I am eager to do.

    ** ADDENDUM **

    I think the "counterfactual" stuff is wrong.

    It's perfectly intelligible to talk about the chances of something having happened or not happened in the past, and that's putting odds on a counterfactual. It's intelligible both in cases where you know what happened ("He just made a one in a hundred shot!") and in cases where you don't ("What's more likely? That I forgot my own name or that you typed it wrong?").

    So that's not it.

    That leaves two acceptable options:

    1. Ignore the values and only consider the odds that you picked the larger of the two offered; those odds are even.
    2. Consider the values but recognize that you do not know the probability of any particular value being in the other envelope -- in which case your calculation cannot be completed.
  • Mathematical Conundrum or Not? Number Six
    If I know that the odds are even then I will play. If I know that the odds aren't even then I might be willing to play, depending on the odds. If I don't know the odds then I will play.Michael

    This particular quandary isn't supposed to arise in real life. A bookmaker first estimates the odds, and then the payouts are simply those odds minus the vigorish (his profit). If you know that a wager is paying off at 2:1, then you know the odds of it paying off are around 2:1 against.

    If this were not true then (a) bookmaking would not be a thing; more to the point (b) you should not be gambling. Placing a wager based only on the payout without considering the odds of winning is crazy. If the principle of indifference tells you to assume there's even odds whenever you don't know the odds, then the principle of indifference shouldn't be gambling either.

    As you've pointed out, in essence what happens here is that on finding $10, you pocket $5, and then place a $5 wager at 2:1 that the other envelope has $20. So the bookmaker in this case is paying you to play, which is no way to run a book. Suppose instead you had to pay that $5 for the opportunity to switch. That is, you give back the $10 plus pay $5 out of pocket. Would you still play? If the other envelope has $5, you've lost nothing, but if it has $20, you've made $15. That seems alright. A free chance to get $15 with no risk. But then you could have walked away with $10 for no risk.

    Still not sure how to make this into a proper wager. We need to zero in on the expectation of $12.50 somehow.
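
    Just to keep the bookkeeping of that reframing straight -- the dollar figures are the ones from the example above:

    # Sketch: switching away from a $10 envelope is the same as pocketing $5 and
    # staking the other $5 at 2:1 on the other envelope holding $20.

    def switch_outcome(other):
        return other                     # you simply end up with the other envelope

    def pocket_and_bet_outcome(other):
        pocketed, stake = 5, 5
        if other == 20:                  # win: stake returned plus 2:1 winnings
            return pocketed + stake + 2 * stake
        return pocketed                  # lose: the stake is gone

    for other in (5, 20):
        print(other, switch_outcome(other), pocket_and_bet_outcome(other))
    # other = 5 -> $5 either way; other = 20 -> $20 either way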
  • Mathematical Conundrum or Not? Number Six

    Thanks. (There is lots I have yet to learn, so some of this is going right by me -- for now.)

    I did wonder -- maybe a week ago? it's somewhere in the thread -- if there isn't an inherent bias in the problem toward switching because of the space being bounded to the left, where the potential losses are also getting smaller and smaller, but unbounded to the right, where the potential gains keep getting bigger and bigger.

    It's just that in the single instance, this is all an illusion. There is this decreasing-left-increasing-right image projected onto the single pair of envelopes in front of you, but you can't trust it. It even occurred to me that since there are not one but two bits of folk wisdom warning of this error -- "the grass is always greener ..." and "a bird in the hand ..." -- this might be a widely recognized (but of course easily forgotten) cognitive bias.
  • Mathematical Conundrum or Not? Number Six

    Yes, and the cutoff can be entirely arbitrary, but the effect will often be tiny. (I spent a few minutes trying to get a feel for how this works and was seeing lots of 1.000417.... sorts of numbers. The argument is sound, so I probably won't spend any more time trying to figure out how to simulate knowing nothing about the sample space and its PDF.)
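
    For anyone curious, here's roughly the sort of thing I was fooling around with. Both the host's distribution and the cutoff distribution below are arbitrary assumptions -- which is exactly the problem:

    # Sketch of the cutoff strategy: switch only when the observed amount falls below
    # a randomly drawn threshold. The host and threshold distributions are both made
    # up; the edge only appears when the threshold lands between X and 2X.
    import random

    def simulate(trials=200_000):
        stick_total = cutoff_total = 0.0
        for _ in range(trials):
            x = random.uniform(1, 100)          # assumed host distribution
            envelopes = [x, 2 * x]
            random.shuffle(envelopes)
            picked, other = envelopes
            stick_total += picked
            t = random.expovariate(1 / 50)      # assumed cutoff distribution
            cutoff_total += other if picked < t else picked
        return cutoff_total / stick_total

    print(simulate())   # a few percent above 1 with these made-up distributions; spread
                        # the cutoff over a much wider range and it creeps back toward
                        # the 1.0004-ish numbers mentioned above

    The gap only exists when the cutoff happens to land between the two amounts, and with no idea where the host's values live you can't arrange for that on purpose -- which is the point.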
  • Lying to yourself

    Right? And it's a solid piece of philosophy, to my mind.
  • Lying to yourself

    Read the article if you haven't before. I just reread it and it is as good as I remember.
  • Lying to yourself

    It's very hard to judge which politicians are lying to themselves and which are soul-less tools.

    I'd rather not do more politics, but I wholeheartedly recommend this excellent piece of popular philosophy by the estimable John Scalzi: The Cinemax Theory of Racism. I think it's on point.
  • Lying to yourself

    @jkg20 is arguing, as I did, that self-deception is a violation of our norms of rationality, often related to the treatment of evidence, sometimes to inference or other elements of reasoning.

    What you're still missing is that reasoning is a process without a pre-selected outcome. You can choose to futz with the process in various ways, and this is quite intentional, but it needn't ever lead to direct confrontation with the outcome -- that can be endlessly pushed aside and never arrived at. So there's no issue of at once assenting to and not assenting to some proposition. You just make sure that proposition never makes it to the floor for a vote. You do this deliberately. You find ways to rationalize doing it, reasons that have nothing to do with your real motivation, reasons that allow you to give what you're doing the color of rationality. No doubt when confronted you will be able to defend yourself at length and explain how every step you took or didn't take was thoroughly justified.

    The natural competitor for describing such a process is simply "being mistaken". But this sort of thing doesn't look much like being mistaken to me.
  • Mathematical Conundrum or Not? Number Six

    I agree completely and have so argued. All you really have to do to get the ball rolling is designate the value in the envelope. It's the innocent "Let Y = ..." This is what I love about that paper Jeremiah linked. I have repeatedly voiced my bafflement that just assigning variables a certain way can lead to so much trouble, and the paper addresses that directly.
  • Mathematical Conundrum or Not? Number Six
    between the X and 2X envelope amountsAndrew M

    Right. The paper Jeremiah linked talks about this too. I was thinking about this on a 6-hour drive a few days ago, and I agree that in general we're talking vanishing smallness. However -- one neat thing about how the game works is that the numbers, and thus the intervals, get bigger and bigger faster and faster. Or, rather: all of these strategies are mainly designed to avoid big losses, so we don't really care if small intervals are hard to hit; we only care about the really big intervals, and those are, if not easy, at least easier to hit. Any big loss avoided will end up pushing you over breaking even.

    This practical reasoning stuff I find endlessly cool -- but it's only barely related to the "puzzle" aspect here.
  • Mathematical Conundrum or Not? Number Six

    Switching is not objectively worse than sticking. It's also not objectively better. Half the time switching is a mistake. Half the time sticking is a mistake. But that's because of your choice, not because of the values in the envelopes.

    But it is still false that you have an expected gain of 1/4 the value of your envelope. You really don't. All these justifications for assigning 50% to more possibilities than two envelopes can hold are mistaken. You picked from one pair of envelopes. This is the only pair that matters. You either have the bigger or the smaller. Trading the bigger is a loss, trading the smaller is a gain, and it's the same amount each way.

    (Maybe one day I'll figure out how to make this into a proper wager -- something with paying each other the value of the envelopes each ends up with. As it is, you make money either way.)
  • Mathematical Conundrum or Not? Number Six

    If X = 10 and your envelope is worth 10, you have the X envelope. By trading, you gain X. This is the X that matters. For any pair of envelopes, there is a single value of X. (If your envelope was worth 20, you would have the 2X envelope and would lose X by trading.)

    If X = 5 and your envelope is worth 10, you have the 2X envelope. By trading, you lose X. (If your envelope was worth 5, you would gain X by trading.)

    Every pair of envelopes has one larger and one smaller. You have an even chance of picking the larger or the smaller. If you picked the larger, you can only lose by trading. If you picked the smaller, you can only gain by trading. There is never any case, no matter what envelope you picked from whatever pair, in which you have a non-zero chance of gaining and a non-zero chance of losing. It's always one or the other and never both.
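
    A tiny sketch of that accounting, for a single fixed pair (the particular X is arbitrary):

    # Sketch: for one fixed pair (X, 2X), switching gains X on half the trials and
    # loses X on the other half -- never both in the same trial.
    import random

    X = 10   # arbitrary fixed value for this pair
    gains = []
    for _ in range(100_000):
        picked = random.choice([X, 2 * X])
        other = 3 * X - picked
        gains.append(other - picked)   # always +X or -X, nothing in between

    print(set(gains), sum(gains) / len(gains))   # {10, -10}, average near 0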
  • Mathematical Conundrum or Not? Number Six
    I accept all of them. I reject the implicit conclusion that the gain and loss are symmetric. If my envelope contains £10 then 3. and 5. are:

    3. If I trade the 2X = 10 envelope, I lose X = 5.
    5. If I trade the X = 10 envelope, I gain X = 10.
    Michael

    Then you reject 1, because those are two different values of X.
  • Mathematical Conundrum or Not? Number Six

    What I don't understand is what your argument is against the alternative analysis. Which of these do you not accept?

    1. The envelopes are valued at X and 2X, for some unknown X.
    2. You have a 1/2 chance of picking the 2X envelope.
    3. If you trade the 2X envelope, you lose X.
    4. You have a 1/2 chance of picking the X envelope.
    5. If you trade the X envelope, you gain X.
  • Mathematical Conundrum or Not? Number Six
    If there's £10 in my envelope and I know that the other envelope contains either £5 or £20 because I know that one envelope contains twice as much as the other then I have a reason to switch; I want an extra £10 and am willing to risk losing £5 to get it.Michael

    I think you're making two assumptions you shouldn't:

    1. there is a non-zero chance C that the other envelope contains twice the value of your envelope;
    2. the odds Not-C:C are no greater than 2:1, so that you can expect at least to break even.

    There is no basis whatsoever for either of these assumptions.

    In my example, if I choose B, you stand to gain 2 or to lose 2, depending on which envelope you chose, and whether you switch. The amount you gain or lose was determined by me in choosing B. If I choose E, you stand to gain 6 or to lose 6, depending on which envelope you chose, and whether you switch. The amount you gain or lose was determined by me in choosing E.