Comments

  • Mathematical Conundrum or Not? Number Six
    We're given a choice between envelopes valued unequally at a and b. We won't know which one we picked. The expected value of switching is

    E = (1/2)(b - a) + (1/2)(a - b) = 0

    Since a and b are both greater than 0, a/b and b/a are both defined. We can, if we like, also say

    E = (1/2)a(b/a - 1) + (1/2)b(a/b - 1)

    or

    E = (1/2)b(1 - a/b) + (1/2)a(1 - b/a)

    for whatever reason.

    For instance, if a/b = 1/2, then b = 2a.

    Then

    E = (1/2)a(2 - 1) + (1/2)b(1/2 - 1) = a/2 - b/4 = 0

    or

    E = (1/2)b(1 - 1/2) + (1/2)a(1 - 2) = b/4 - a/2 = 0

    Isn't all of this true whichever of a and b is larger, and whatever their ratio?
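    The point can also be checked numerically. A quick sketch (mine, not part of the argument above): pick either envelope with probability 1/2 and tally the gain from switching.

```python
import random

random.seed(0)

def expected_switch_gain(a, b, trials=100_000):
    """Estimate the expected gain from switching, when each of two
    envelopes worth a and b is picked with probability 1/2."""
    total = 0
    for _ in range(trials):
        picked, other = random.sample([a, b], 2)
        total += other - picked  # gain (or loss) from switching
    return total / trials

# The estimate hovers near 0 whichever of a and b is larger,
# and whatever their ratio.
print(expected_switch_gain(5, 10))
print(expected_switch_gain(10, 5))
print(expected_switch_gain(3, 100))
```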
  • Mathematical Conundrum or Not? Number Six
    When I was looking for a way to describe "success", I picked the average value as a cutoff; more than that is success. The curious thing is that there is a whole range of values between k and 2k that would work just as well, would pick out "having the larger envelope" in exactly the same cases. (And we could also talk about which other values of k they would work for.) That interval is bounded at the one end by k and at the other by 2k. In a way, the "arbitrary cutoff" switching strategy is built into the problem, because the goal of that strategy is to land on a value somewhere in that interval. Any such value would work as a criterion for switching because it would also work as a criterion of success -- that is for not switching.

    In a way, my description was too general: a description of the case where the higher and lower values are separated by an arbitrary interval rather than a prescribed one. Since that interval is in fact determined, there is this "bucket" effect, where every amount above k is normalized as, or equivalent to, 2k. And it works in the other direction too. Anyway, there is this pressure to empty the space of all values except k and 2k: it's impossible to be less than k or more than 2k. But in between there's conflict: on the one hand, k wants to treat every value greater than itself as 2k; on the other, 2k wants to treat every value less than itself as k, which means they cannot agree on how to treat the interval between them. The only way to maintain consistency is to empty it completely, so the space becomes as discrete as it possibly can while still presenting two choices: it narrows to a set of 2.

    It occurred to me when trying to pick a criterion for success, that the other envelope is it -- and that creates some weirdness about what picking is -- but I shrugged that off on the grounds that knowing its value would merely allow you to deduce the average value. But there's an element of truth to the idea. And I want to say there is something strangely not number-like, or very minimally number-like, going on. In effect, there are only the two elements, and an order defined on the set that says one is greater than the other. But that turns out to be arbitrary. Could be "red" and "blue", or "heads" and "tails", or "0" and "1", so long as one of them is designated as better (and the other worse).

    When I was trying to think of analogies for the "criterion of success" argument I thought of things like picking red and blue marbles, but labeling them in a language you don't know. With my playing cards, there is a standard order which I just tossed out: instead of saying you need to get the high card, you're supposed to get whatever I say. And I want now to say there is something in the problem that encourages this, because when you see the value of your choice you have to make lots of assumptions even to guess whether that's a good value or a bad, much less know it's good or bad. Numbers don't usually act like that. We usually have some context for saying whether a number is big or small. (Why I had to toss the standard order for playing cards.)

    All of which makes doing calculations of any kind quite strange, because there is so much about the problem that makes it impossible to treat what are manifestly numbers as numbers. Once you have complete knowledge, everything becomes normal again and you can say "10 < 20" or "10 > 5". But with anything less than complete knowledge, the game acts like something else entirely.

    Just some musing on waking up in the middle of the night. Does this make any sense to anyone else?
  • Mathematical Conundrum or Not? Number Six

    It sounds to me like you're trying to figure out what would be a good prior for what amounts to "Pick a number." I mean, you could do research, see what people tend to pick and in what range, but what can you do with just math here?
  • Mathematical Conundrum or Not? Number Six
    "I don't know" is not the same as "There is a 50% chance of each."Srap Tasmaner

    This is the part I'm still struggling with a bit.

    Even if I were to convince Michael that he had mistakenly assumed the chances for each criterion of success were equal, his response will still be something like this:

    There are still exactly two possibilities, and by stipulation I don't know anything about the odds for each, so by the principle of indifference I will still treat them as having even odds.

    So there's this backdoor 50% tries to sneak back in through.

    I know we have the argument about the sample space for X being uniform and unbounded. I just think there must be something else.
  • Mathematical Conundrum or Not? Number Six

    Yes, I believe it is entirely consistent with criticizing the conclusion of the faulty inference. I think we would like to believe that invalid inferences can always be shown to have undesired consequences, but that also requires agreement on (a) the further inference, and (b) the undesirability of the consequence. I suggested at one point in this thread that if told the value of the other envelope instead of your own, then you would want not to switch; I found this conclusion absurd but my interlocutor did not. Go figure.
  • Mathematical Conundrum or Not? Number Six
    @Pierre-Normand, @JeffJo

    What is the source of the paradox in your view?Andrew M

    I believe there is not a paradox here but a fallacy.

    Outside of being told by reliable authority "You were successful!" you need to know two things to know whether you have been successful:

    • what the criterion of success is; and
    • whether you have met that criterion.

    That these are different, and thus that your uncertainty about one is not the same as your uncertainty about the other -- although both contribute to your overall uncertainty that you were successful -- can be readily seen in cases where probabilities attach to each and they differ.

    Here are two versions of a game with cards. In both, I will have in my hand a Jack and a Queen, and there will be two Jacks and a Queen on the table. You win if you pick the same card as I do.

    Game 1: the cards on the table are face down. I select a card from my hand and show it to you. If I show you the Jack, then your chances of winning are 2 in 3.

    Game 2: the cards on the table are face up. I select a card from my hand but don't show it to you. You select a card. If you select a Jack, your chances of winning are 1 in 2.

    In both cases, winning means both of us selecting a Jack, but the odds of my choosing a Jack and of your choosing one are different. In game 1, you know the criterion of success, but until you know what you picked, you don't know whether you met it; in game 2, you know what you picked meets the criterion "Jack", but you don't know whether the winning criterion is "Jack" or "Queen".
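    Here's a quick simulation of the two games as I read them (the blind picks are my assumption where the setup leaves them open): in Game 1 you pick blindly from the table after I show the Jack; in Game 2 you hold a Jack and I pick blindly from my hand.

```python
import random

random.seed(1)

def game1(trials=100_000):
    # I have shown you the Jack from my hand; you pick one of the
    # three face-down table cards (two Jacks, one Queen) at random.
    table = ['J', 'J', 'Q']
    return sum(random.choice(table) == 'J' for _ in range(trials)) / trials

def game2(trials=100_000):
    # The table is face up, so you have picked a Jack; I pick from
    # my hand (one Jack, one Queen) at random.
    hand = ['J', 'Q']
    return sum(random.choice(hand) == 'J' for _ in range(trials)) / trials

print(game1())  # near 2/3
print(game2())  # near 1/2
```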

    (If you buy your kid a pack of Pokemon cards, before he rips the pack open neither of you knows whether he got anything "good". If he opens it and shows you what he got, he'll know whether he got anything good but you still won't until he explains it to you at length.)


    Let's define success as picking the larger-valued envelope. There is a fixed amount of money distributed between the two envelopes, so half that amount is the cutoff. Greater than that is success. One envelope has less than half and one envelope has more, so your chances of meeting that criterion, though its value is unknown to you, are 1 in 2. After you've chosen but before opening the envelope, you could reasonably judge your chances to be 1 in 2.

    You open your envelope to discover 10. Were you successful? Observing 10 is consistent with two possibilities: an average value of 7.5 and an average value of 15. 10 meets the criterion "> 7.5", but you don't know whether that's the criterion.

    What are the chances that "> 7.5" is the criterion for success? Here is one answer:

    We know that our chance of success was 1 in 2. Since we still don't know whether we were successful, our chance of success must still be 1 in 2. Therefore the chance that "> 7.5" is the criterion of success must be 1 in 2.

    This is the fallacy. You reason from the fact that, given the criterion of success, you would have a 1 in 2 chance of picking the envelope that meets that criterion, to a 1 in 2 chance that the unknown criterion of success is the one your chosen envelope meets.

    (No doubt the temptation arises because any given value is consistent with exactly two possible situations, and you are given a choice between two envelopes.)

    You can criticize the conclusion that, for any value you find in your envelope, the two situations consistent with that value must be equally likely, but my criticism is of the inference.

    Now since we do not know anything at all about how the amounts in the envelopes were determined, we're not in a position to say something like "Oh, no, the odds are actually 2 in 7 that '> 7.5' is the criterion." So I contend the right thing to say now is "I don't know whether I was successful" and not attach a probability to your answer at all. "I don't know" is not the same as "There is a 50% chance of each."

    You can reason further that one of the two possible criteria, "> 7.5" and "> 15", must be the situation you are in, and the other the situation you are not in. Then you can look at each case separately and conclude that since the value in the unopened envelope is the same as it's always been, your choice to stick or switch is the same choice you faced at the beginning of the game.

    If you switch, you will turn out to be in the lucky-unlucky or the unlucky-lucky track. If you don't, you will turn out to be in the lucky-lucky or the unlucky-unlucky track.
  • Mathematical Conundrum or Not? Number Six
    Whereas, on my view, it is the source of the paradox ;-)Pierre-Normand

    Yes! This is exactly what we disagree on.

    -- For a change, I'm going to take a little time and think through my response. --
  • Mathematical Conundrum or Not? Number Six
    A further perspective is held by those who know the chosen amount and also know the initial distribution but not which envelope pair was initially selected.Andrew M

    Yes, absolutely, and this is specifically beyond the OP. The distributions we've been talking about have almost always been (or should have been) unknown to the player. The player doesn't even know that there is some selection process. There are values in envelopes. How they got there can be discussed, and that can be interesting when the player has, say, partial knowledge of that process, but it is not the source of the paradox, in my opinion.
  • Mathematical Conundrum or Not? Number Six
    Suppose the initial distribution is, unbeknownst to you, ((5,10), (10,20), (20,40)). In that case, if you are being dealt 5, the expected value of sticking is 5. You don't know what the expected gain of switching is. But it's not the case that it is therefore zero. That would only be zero if you knew for a fact that (5, 10) is half as likely as (5, 2.5) in the prior distribution.Pierre-Normand

    This makes no sense to me. Initial distribution of what? If these are pairs of envelopes from which will be chosen the pair that the player confronts, then not only is this sample space unknown to the player, she never interacts with it. She will face the pair chosen and no other.

    If that pair is {5, 10}, and she draws the 5, then she has the low value and can only gain by switching, with no risk of loss. The only case in which switching does not produce an actual gain or loss is when, contrary to the rules, the envelopes have the same value. By switching she gets the 10; but the 10 was there all along and she might have chosen it at the start. What doesn't change is her overall expectation from playing. Always Stick and Always Switch are not strategies that increase or decrease the expected take from playing.
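    That last claim is easy to check by simulation. A sketch using Pierre-Normand's pairs, assuming each pair is equally likely to be dealt:

```python
import random

random.seed(2)

PAIRS = [(5, 10), (10, 20), (20, 40)]

def expected_take(always_switch, trials=100_000):
    """Average winnings for Always Stick or Always Switch."""
    total = 0
    for _ in range(trials):
        pair = random.choice(PAIRS)
        picked, other = random.sample(list(pair), 2)
        total += other if always_switch else picked
    return total / trials

# Both come out near (7.5 + 15 + 30) / 3 = 17.5: neither strategy
# changes the overall expectation from playing.
print(expected_take(always_switch=False))
print(expected_take(always_switch=True))
```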

    I am not sure why you are saying that I am facing a choice rather than saying that I simply don't know whether my envelope is smallest or largest (within the pair that was picked).Pierre-Normand

    Why would your ignorance preclude you from facing a choice and making a decision? In the OP, you make at least two choices: which envelope to claim, and whether to keep it or trade it for the other. Whether you end up with the most or the least you could get depends on those two decisions and nothing else. What the amounts are depends on someone else.
  • Mathematical Conundrum or Not? Number Six

    I have come, in broad terms, to see probability as a generalization of logic, or logic as a special case of probability, take your pick. I would credit Frank Ramsey for convincing me to begin thinking this way. As my principal interest is the nature of rationality, what's of interest here -- decision theory, broadly -- is still the nature of inference.

    What we have been arguing about for fifty pages is what inferences are justified and what aren't.
  • Mathematical Conundrum or Not? Number Six

    I'm not sure which of @JeffJo's examples you're referring to.

    As for my "tree" and what it predicts -- You face a choice at the beginning between two values, and the same choice at the end between those same two values. If you flip a coin each time, then your expectation is the average of those two values both times and it is unchanged.

    Opening an envelope changes things somewhat, but only somewhat. It gives more substance to the word "switch", because having opened one envelope you will never be allowed to open another. You are now choosing not between two envelopes that can be treated as having equal values (although they do not), but between one that has a known value and another that cannot be treated as its equal in value.

    But it is, for all that, exactly the same choice over again, and however many steps there are between beginning and end, your expected take is the average of the values of the two envelopes. If there's an example in which that is not true, I would be surprised.
  • Mathematical Conundrum or Not? Number Six
    If I hadn't gone inside for coffee, I would have had the 1000th post. I feel bad now, but I have coffee.
  • Mathematical Conundrum or Not? Number Six
    I'll admit I probably would not understand the philosophical significance of probability distributions even if I had read the relevant posts.Janus

    Some might unkindly note that it hasn't stopped me.
  • Mathematical Conundrum or Not? Number Six
    Yours isn't really a decision tree that the player must make use of since there is no decision for the player to make at the first node.Pierre-Normand

    I might also have pointed out that when I first started doing this a couple days ago I said

    This is in fact only a "tree" in a charitable sense.Srap Tasmaner

    The point of the tree is to show that the last decision you make is, in every case, exactly the same as the first decision you made, and as whatever decisions you were offered in between.
  • Mathematical Conundrum or Not? Number Six
    Once again, @Jeremiah, @JeffJo, @Pierre-Normand, and @andrewk, I'm terribly grateful for the patience you've shown me as I try to learn something about probability.
  • Mathematical Conundrum or Not? Number Six

    https://www.urbandictionary.com/define.php?term=smile%20when%20you%20say%20that

    ((Evidently nearly coined by Owen Wister, author of The Virginian, the basis for one of my favorite TV shows when I was a kid.))
  • Mathematical Conundrum or Not? Number Six
    no philosophical significance or interestJanus

    Smile when you say that.
  • Mathematical Conundrum or Not? Number Six
    Where JeffJo approach seems to me to be superior to yours is that it doesn't yield an incorrect verdict for the specific cases where the prior distribution is such as to yield envelope pairs where, conditionally on being dealt either the smaller or the larger amount from this pair, the expected gain from switching isn't zero. Your own approach seems to yield an incorrect result, in that case, it seems to me.Pierre-Normand

    Sorry, I'm not following this. This sounds like you think I said your expected gain when you have the smaller envelope is zero, which is insane.

    Yours isn't really a decision tree that the player must make use of since there is no decision for the player to make at the first node.Pierre-Normand

    Well now that's an interesting thing.

    Is it a decision? You may not immediately know the consequences of your decision, and you may have no rational basis for choosing one way or the other, but which way you decide will have consequences, and you will have the opportunity to know what they are.

    I've always thought the odd thing about the Always Switch argument is precisely that the game could well begin "You are handed an envelope ..." because the analysis takes hold whether your choice puts you in possession of the smaller or the larger envelope. That strikes me as fallacious. Your primary interest should be in picking the larger envelope, and, having picked, in figuring out whether you have the larger envelope. In my little real-world example, it turns out gamesmanship played a far larger role than any probability.
  • Mathematical Conundrum or Not? Number Six
    If you are given v, and so have two x's, you have to consider the relative probabilities of those two x's.JeffJo

    Except that you cannot, and you know that you cannot.

    Suppose the sample space for X is simply {5}, one sole value. All the probabilities of assignments of values to X must add up to 1, so the assignment of the value 5 gets 1. Suppose the sample space for X is {1, 2, 3, 4, 5, 6, 7, 8, 9, 10} and the probability of each assignment is {0, 0, 0, 0, 1, 0, 0, 0, 0, 0}. What is the difference? Could I ever tell which of these was the case and would it matter to me if I could?

    I appreciate the rest of your comments, and may address some of them.
  • Mathematical Conundrum or Not? Number Six
    Here's my decision tree again, fleshed out in terms of @Michael's £10.
    envelope_tree_d.png
    The value of k is either 5 or 10, and you have no way of knowing which.

    k = 5

    If you observe 10, you are in the blue branch: you have picked the higher amount and only stand to lose k by switching with no chance of gaining. You could not have observed 10 in the red branch. At the end, the choice you face between sticking (10) and switching (5) is the exact same choice you faced at the beginning.

    For you, the blue branch is true and the red branch, in which you have the lower amount, and face a choice between sticking (10) and switching (20), is false.

    k = 10

    If you observe 10, you are in the red branch: you have picked the lower amount and only stand to gain k by switching with no chance of losing. You could not have observed 10 in the blue branch. At the end, the choice you face between sticking (10) and switching (20) is the exact same choice you faced at the beginning.

    For you, the red branch is true and the blue branch, in which you have the higher amount, and face a choice between sticking (10) and switching (5), is false.


    There are always only two envelopes with fixed values in play. Any choice you make is always between those same two envelopes with those same two values.


    ((The case descriptions sound like the lamest D&D campaign ever.))
  • Mathematical Conundrum or Not? Number Six

    My current, and I think "final", position is that this isn't really a probability puzzle at all. Here are my arguments for my view and against yours.

    1. The only probability anyone has ever managed to assign any value to is the probability of choosing the larger (or the smaller) envelope -- and even that is only the simplest noninformative prior.
    2. All other probabilities used in solutions such as yours are introduced only to immediately say that we do not and cannot know what their values are.
    3. The same is true for the sample space for X. Many have been used in examples and solutions but always with the caveat that we do not and cannot know what the sample space for X is.
    4. Much less the PDF on that space.
    5. By the time the player chooses, a value for X has been determined. Whatever happened before that is unknown, unknowable, and therefore irrelevant. As far as the player is concerned, the PDF that matters assigns a value of 1 to some unknown value of X and 0 to everything else.
    6. We might also describe that as the host's choice of a value for X. Whatever the host had to choose from (for instance in the real-world example of cash found in my wallet), and whatever issues of probability and gamesmanship were involved, the host makes a choice before the player makes a choice. (In your multiple-and-duplicate envelopes analysis, which I found very helpful, you allow the player to choose the envelope pair and then choose an envelope from that pair. We needn't.)
    7. That choice is the very first step of the game and yet it appears nowhere in the probabilistic solutions, which in effect treat X as a function of the player's choices and what the player observes.
    8. The exact opposite is the case. The player makes choices, but the consequences of those choices, what she will observe, are determined by the prior choice of the value of X.
    9. The values of the envelopes any player will see are fixed and unknown. We have only chosen to model them as variables as a way to express our uncertainty.
    10. The probabilistic model can safely be abandoned once it's determined that there will never be any evidence upon which to base a prior much less update one.

    Here's my question for you: what is the advantage of saying that the variable X takes a value from an unknown and unknowable sample space, with an unknown and unknowable PDF, rather than saying X is not a variable but simply an unknown?

    In the now canonical example of @Michael's £10, he could say either:

    (a) the other envelope must contain £20 or £5, but he doesn't know which; or
    (b) there's a "50:50 chance" the other envelope contains £20 or £5, and thus the other envelope is worth £12.50.

    I say (a) is true and (b) is false. The other envelope is worth £20 or £5, and he will gain or lose by switching, but we have no reason to think there is anything probabilistic about it, no reason to think that over many rounds Michael would see £20 about half the time and £5 about half the time, or even £20 some other fraction of the time and £5 the rest. What compels us to say that it is probabilistic but Michael assumes a probability he oughtn't, if we're only going to say that the actual probability is unknown and unknowable? Why not just say (a)?
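    To see why the £12.50 in (b) does no work, note that it is just a function of the assumed probability. A sketch (the alternative prior here is purely hypothetical, for contrast):

```python
def expected_other(p_lower, observed=10):
    """Expected value of the other envelope, assuming the observed
    amount is the smaller of the pair with probability p_lower."""
    return p_lower * 2 * observed + (1 - p_lower) * observed / 2

print(expected_other(0.5))  # 12.5 -- only under the 50:50 assumption
print(expected_other(0.2))  # 8.0  -- a different prior, a different number
```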
  • Mathematical Conundrum or Not? Number Six
    Here's a straightforward revision of the decision tree:
    envelope_tree_d.png
    Opening an envelope "breaks the symmetry," as the kids say, so that's the point of no return. Colors represent paths you can be on for each case.

    This one also uses 'c' for both branches, which the previous tree deliberately avoided. This time the intention is that either the red or the blue branch is probability 1 and the other 0. They conflict by design.
  • Mathematical Conundrum or Not? Number Six
    What if we did say that all of the player's choices are conditional on the host's choice? That is, suppose we had X = k, where k is some unknown constant. Then, using 'c' for our observed value,

    p = P(Y=X | Y=c, X=k) = P(c=k)

    Now whatever the value k is, the only permissible values for p are 0 and 1.

    The expectation for the unpicked envelope is then

    E(U) = P(c=k)2c + P(c=2k)c/2

    Once you've observed c, you know that either c=k or c=2k, but you don't know which. That is, observing c tells you one of c=k and c=2k is true (and one false), which is more specific than just

    P(X=c | Y=c) + P(X=c/2 | Y=c) = 1
  • Ongoing Tractatus Logico-Philosophicus reading group.

    Post something when you feel like it. I'll keep the thread bookmarked.
  • Mathematical Conundrum or Not? Number Six
    @Jeremiah, @JeffJo
    This pissing contest is detracting from the thread. Both of you quit it.
  • Mathematical Conundrum or Not? Number Six
    The fallacious premise of the switching argument is that you could observe a given value, whichever envelope you chose and opened. If the envelopes are {5, 10}, you cannot observe 10 by selecting the smaller envelope; if the envelopes are {10, 20}, you cannot observe 10 by selecting the larger envelope. For each round of the game: X has a single value; there is a single pair of envelopes offered; they are valued at X and 2X; when you select an envelope, you select one valued at X or at 2X.
  • Mathematical Conundrum or Not? Number Six
    There is a 50% chance that you observed a, because you chose and opened the X envelope; there is a 50% chance that you observed b, because you chose and opened the 2X envelope. The expected value of switching is:
    E = a/2 - b/4
    
    and since 2a = b
    E = a/2 - a/2 = b/4 - b/4 = 0.
    
  • Mathematical Conundrum or Not? Number Six
    Here's a reasonable way to fill out the rest of the decision tree.
    envelope_tree_b2_1.png

    Either you observed value a, and then you stand to gain a by switching, or you observed b, and you stand to lose b/2 by switching.

    If you chose the X envelope, and observed its value, you stand to gain X by switching; if you chose the 2X envelope, and observed its value, you stand to lose X by switching.
  • Mathematical Conundrum or Not? Number Six
    Here's the OP:

    Problem A
    1. You are given a choice between two envelopes, one worth twice the other.
    2. Having chosen and opened your envelope, you are offered the opportunity to switch.
    3. You get whichever envelope you chose last.

    Here's a slight variation on Problem A:

    Problem B
    1. You are given a choice between two envelopes, one worth twice the other.
    1.5 Having chosen but before opening your envelope, you are offered the opportunity to switch.
    2. Having chosen and opened your envelope, you are offered the opportunity to switch.
    3. You get whichever envelope you chose last.

    Here's a slight variation on Problem B:

    Problem C
    1. You are given a choice between two envelopes, one worth twice the other.
    1.25 Having chosen but before the envelope being designated "yours" for the next step, you are offered the opportunity to switch.
    1.5 Having chosen but before opening your envelope, you are offered the opportunity to switch.
    2. Having chosen and opened your envelope, you are offered the opportunity to switch.
    3. You get whichever envelope you chose last.

    And we can keep going. Wherever you have been offered the opportunity to switch, we can add a step in which you are offered the opportunity to switch back before moving on to the next step. The number of steps between being offered the first choice and getting the contents of (or receiving a score based on) your last choice can be multiplied without end. At some points, there may be a change in labeling the envelopes (now we call this one "your envelope" and that one "the other envelope"); and at one point, you are allowed to learn the contents of an envelope.

    Suppose you wanted to make a decision tree for the OP, Problem A. You'd have to label things somehow to get started.
    envelope_tree_a.png
    How to label the nodes after the first choice? We could switch to "YOURS" and "OTHER"; later, once a value has been observed, we could switch to, say, "Y ∊ {a}" and "U ∊ {a/2, 2a}" for labels.

    But of course all we're doing is relabeling. This is in fact only a "tree" in a charitable sense. There is one decision, labeled here as a choice between "LEFT" and "RIGHT", and there are never any new decisions after that -- no envelopes added or removed, for instance -- and each step, however described, is just the same choice between "LEFT" and "RIGHT" repeated with new labels. You can add as many steps and as many new labels between the first choice and the last as you like, but there is really only one branching point and two possibilities; it is the same choice between the same two envelopes, at the end as it was at the beginning.
  • Mathematical Conundrum or Not? Number Six
    The OP never called for a solution based on expected gainsJeremiah

    Sure. But I'm not trying to figure out whether I should switch. I'm trying to figure out where the fallacy in the 5/4 argument is, and that's an expected gain argument.
  • Mathematical Conundrum or Not? Number Six

    Yes.

    I accept that the expectation of gain would apply whether you looked in the envelope or not, and thus that there are symmetrical expectations that each envelope is worth more than the other. I also believe that always switching is equivalent to always sticking over multiple trials. From these two premises, I conclude either:

    1. You cannot talk about expectations here at all (which I find troubling); or
    2. The argument is fallacious.
  • Mathematical Conundrum or Not? Number Six
    You have not really proven he can't.Jeremiah

    I know! It's why I'm still here.
  • Mathematical Conundrum or Not? Number Six
    I messed up the mathAndrew M

    You're not in my league at messing up the math!

    It is a nice clear argument, using @JeffJo's multiple sets of envelopes, and makes the point I keep failing to make. P(lower)=1/2, but P(lower | 10) = 1/5.

    I have a feeling though that @Michael will still think that absent knowledge of the distribution, he can turn back to 50% as an assumption.
  • Mathematical Conundrum or Not? Number Six

    You just telescoped the step of multiplying by the chance of picking that number.

    Could put & where you have |.
  • Mathematical Conundrum or Not? Number Six
    P(lower) = P(lower|5) + P(lower|10) + P(lower|20) = 4/10 + 1/10 + 0/10 = 1/2.Andrew M

    This isn't what you mean, is it?

    P(lower | 5) = 4/4 = 1, P(lower | 10) = 1/5, P(lower | 20) = 0/1.
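    For concreteness, here is one envelope population consistent with those counts (my guess at the setup, not necessarily AndrewM's): four {5, 10} pairs and one {10, 20} pair, ten envelopes in all, one drawn uniformly.

```python
from fractions import Fraction

# Hypothetical population matching the counts: four (5, 10) pairs
# and one (10, 20) pair; an envelope is drawn uniformly from the ten.
pairs = [(5, 10)] * 4 + [(10, 20)]
envelopes = [(v, v == min(pair)) for pair in pairs for v in pair]

def p_lower_given(value):
    """P(holding the lower envelope of its pair | observed value)."""
    flags = [is_lower for v, is_lower in envelopes if v == value]
    return Fraction(sum(flags), len(flags))

print(p_lower_given(5))    # 1
print(p_lower_given(10))   # 1/5
print(p_lower_given(20))   # 0
```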
  • Mathematical Conundrum or Not? Number Six
    Reinventing math step-by-step is interesting, and I'm gaining insight by making every possible mistake, and doing so in public, but it would be far more efficient just to study more.
  • Mathematical Conundrum or Not? Number Six
    Imagine u around .75L and v around .9L. They're just randomly selected values in [0, L]. We can't say at the same time that P(u < v) = .9 and P(v < u) = .75. Oops.

    Instead you have to say something like P(u < V | V = v, u in [0, L]) = v/L. And then P(u > V | V = v, u in [0, L]) = 1 - v/L. Anyway that's closer.

    (The other way, you might get, as with the values above, P(u > v) + P(v > u) > 1 or you demand that u + v = L, which is silly.)
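    A quick check of that form (my sketch): fix v, draw u uniformly from [0, L], and the frequency of u < v comes out near v/L, with u > v as its complement.

```python
import random

random.seed(3)

L = 1.0

def p_u_below(v, trials=100_000):
    """Estimate P(u < v) for u drawn uniformly from [0, L]."""
    return sum(random.uniform(0, L) < v for _ in range(trials)) / trials

# Near 0.9 and 0.1: with v as the fixed reference value, the two
# probabilities sum to 1 instead of exceeding it.
print(p_u_below(0.9 * L))
print(1 - p_u_below(0.9 * L))
```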

    I feel bad about how messy I'm still being with variables.
  • Mathematical Conundrum or Not? Number Six
    It's also slightly more complicated than I wanted because of the "reference" problem. If you don't designate either u or v as the reference variable, it all goes to hell.
  • Mathematical Conundrum or Not? Number Six

    Coin flips and coin flips with colored envelopes are just the wrong kind of example to look at, because (a) you have categorical instead of numeric data, which means you're going to be tempted to substitute the probability of an event for the event, and (b) coin flips have magic numbers built in, magic numbers that happen to match up to the magic numbers you're trying to distinguish (the chances associated with choosing). This is just bad methodology. When you're trying to figure out some bit of math, you should go out of your way to avoid these magic numbers, and only bring them back in as easy-to-solve special cases of the general problem.

    I gave you an example about as general as I could think of. Look at how that example works.