• Mathematical Conundrum or Not? Number Six
    If it would be your own money at stake here, you shouldn't be playing at all.Srap Tasmaner

    One way to adjust the game so that your own money is at stake would be to merely write down the two amounts in the envelopes. The cost for playing is the value v that's written in your envelope. If you choose to play, and switch, then you must pay this amount v upfront and the game master must give you back the amount that's written in the second envelope. On the assumption that you gain no knowledge at all (not even probabilistic knowledge) about the probability that your cost is smaller than the potential reward, the paradox ensues: if we make no assumption at all regarding the prior distribution being either bounded, or unbounded and uniform, then the arguments that the expected value of switching is v and that it is 1.25v seem equally valid.
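
    Here is a minimal Python sketch of this pay-upfront variant. The bounded prior used below (pairs (x, 2x) with the smaller amount drawn uniformly from {5, 10, 20, 40}) is an arbitrary illustrative choice of mine, not something given in the problem; the point is only to contrast the naive indifference calculation with what repeated plays of such a game actually average out to.

        import random

        # A sketch of the pay-v-upfront variant, under an assumed bounded prior:
        # the smaller amount x is drawn uniformly from {5, 10, 20, 40} and the
        # pair is (x, 2x). The player pays the value v written in her envelope
        # and receives whatever is written in the other one.
        random.seed(0)
        SMALLER_AMOUNTS = [5, 10, 20, 40]

        def play_once():
            x = random.choice(SMALLER_AMOUNTS)
            pair = (x, 2 * x)
            v = random.choice(pair)                       # your cost
            other = pair[0] if v == pair[1] else pair[1]
            return other - v                              # net result of playing

        trials = 200_000
        avg_gain = sum(play_once() for _ in range(trials)) / trials

        # The naive indifference argument claims a gain of 0.25 * v whatever v is;
        # e.g. for v = 10: 0.5 * 5 + 0.5 * 20 - 10 = 2.5.
        naive_gain = 0.5 * (10 / 2) + 0.5 * (2 * 10) - 10

        print(f"average net gain over repeated plays: {avg_gain:.3f}")  # close to 0
        print(f"naive indifference gain for v = 10:   {naive_gain}")    # 2.5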
  • For the third millennium, Aristotle: dogma, science, or description?
    I don't believe that for a moment. I think that why science is currently embroiled in what Jim Baggott calls 'fairytale physics' is precisely the complete and total absence of an 'immanent unity'.Wayfarer

    There is an abundant contemporary literature on what's called scientific practice. The focus on scientific practice is a focus on what productive scientists actually do. More emphasis seems to be placed on the work of physicists and biologists than on that of cognitive and social scientists. Thomas Kuhn and Joseph Rouse are two philosophers of science who focus on scientific practice.

    There appears to be a significant disconnect between the metaphysical assumptions that scientists make, as manifested in their practice, and the sorts of metaphysical pronouncements that they make while trying to articulate what it is that they view as "the scientific method", or their views of the nature of the "physical" world. I often refer to the latter as the modern scientific (and scientistic, foundationalist or reductionistic) view of the world. Maybe what @apokrisis is saying applies to current (and old) scientific practice as manifested in most of the productive natural-scientific fields of inquiry, whereas what you say applies to the crudely materialistic world view that permeates much contemporary scientific thinking: that is, to what scientists say rather than what they do.
  • Mathematical Conundrum or Not? Number Six
    So to make this a better analogy, let's say that some third party asks us both to play the game. He will roll two dice, and if I win then you give me £10 and if I lose then I give you £5. He doesn't tell us what result counts as a win for me and what counts as a win for you. It could be that 1-11 is a win for you, or it could be that 1-11 is a win for me, or it could be that 1-6 is a win for me.

    I would be willing to play, as I have more to gain than I have to lose. You, presumably, wouldn't be willing to play, as you have more to lose than you have to gain.
    Michael

    Indeed, if you assume it to be equally likely that the odds of winning (irrespective of the amount) are stacked in your favor as that they are stacked in my favor, then, with this specific and asymmetrical payoff ratio, your overall expected value is positive while mine is negative. But this problem is importantly disanalogous to the two envelope problem.

    To make this example more relevantly analogous, the game master would need to hand each of us an envelope while only informing us that one envelope contains twice the amount of the other. She would then roll two dice, both players would reveal their envelope contents, and whoever wins would be entitled to switch envelopes just in case she doesn't already have the larger amount. The odds of winning, as before, are unknown. In this version of the game, each player, who initially knows only the content of her own envelope, still stands to win twice as much as she stands to lose. So, you would still seem to be committed to concluding that it is rationally mandated for each of them to choose to throw the dice (after having been dealt their envelopes and seen their contents). And this is true for both of them. Does that make sense?
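
    Here is a rough Python sketch of this two-player variant. The amount x = 10 and the candidate dice odds are arbitrary choices of mine (the players themselves never learn the odds); whatever those odds are, the exchange is zero-sum, so both players cannot both have a positive expected gain from agreeing to roll.

        import random

        # Two-player variant: amounts x and 2x are dealt at random; the dice are
        # rolled with some win probability p for player A (unknown to the players);
        # the winner of the roll switches envelopes only if she doesn't already
        # hold the larger amount.
        random.seed(1)

        def simulate(p_a_wins, x=10, trials=200_000):
            gain_a = 0.0
            for _ in range(trials):
                a_holds_smaller = random.random() < 0.5   # envelopes dealt at random
                a_wins = random.random() < p_a_wins       # dice roll with unknown odds
                if a_wins and a_holds_smaller:
                    gain_a += x                           # A trades up, B trades down
                elif (not a_wins) and (not a_holds_smaller):
                    gain_a -= x                           # B trades up, A trades down
            return gain_a / trials

        for p in (1 / 36, 0.5, 35 / 36):
            g = simulate(p)
            print(f"P(A wins roll) = {p:.3f}: A's average gain {g:+.3f}, B's average gain {-g:+.3f}")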
  • Mathematical Conundrum or Not? Number Six
    Imagine you're given £100 and are offered the choice to pay £100 to play a game with a 5/6 chance of winning (say a dice roll). If you win then you win £1,000,000 and if you lose then you lose all the money you've won up to that point.

    The average return for repeated games is 0, as you're almost certain to lose at some point. But playing it just once? That's worth it.

    This is why I think talking about average returns over repeated games is a red herring.
    Michael

    This is a nice example, but you seem to be offering it as a purported counterexample to the principle that what makes it rational to play a game (with a given strategy) only once is that the game (and that specific strategy) doesn't yield a negative expected value, and hence doesn't tend to produce negative average gains when played repeatedly.

    But your example is defective. If you are offered the options of "playing just once" or playing until you lose everything, then what you are comparing are two strategies applied to a single game, and it is quite clear that the first strategy has a very large expected value (namely, £5,000,500/6) while the second strategy has a null expected value. Those are the amounts that you can expect to gain, on average, were you to play the game repeatedly while applying those strategies. The best strategy, in order to maximize your expected value, would be to play until (and if) your total earnings exceed £5,000,000, and then stop. Past that point, your expected gain from rolling the die once more becomes negative. What dictates your choice of strategy is still its average return, or expected value, even if the game is only being played once.
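
    Here is a small Python check of the stopping argument, assuming (as in the reasoning above) that each additional roll costs nothing further, pays £1,000,000 with probability 5/6, and otherwise forfeits the current total:

        # Expected change in wealth from one more roll, given a current total T:
        # (5/6) * 1,000,000 - (1/6) * T. It is positive below T = £5,000,000 and
        # negative past that point, which is where the optimal strategy stops.

        def marginal_gain(total):
            return (5 / 6) * 1_000_000 - (1 / 6) * total

        for total in (0, 1_000_000, 4_000_000, 5_000_000, 6_000_000):
            print(f"total £{total:>9,}: expected change from one more roll £{marginal_gain(total):>12,.2f}")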
  • Mathematical Conundrum or Not? Number Six
    It cannot be equally likely without postulating a benefactor with (A) an infinite supply of money, (B) the capability to give you an arbitrarily-small amount of money, and (C) a way to select a random number uniformly from the set of all integers from -inf to inf.

    All three of which are impossible.

    But the reason you should reject the solution you use is because it is not a correctly-formed expectation. You are using the probability of picking the smaller value, where you should use the probability that the pair of values is (v,2v) *AND* you picked the smaller, given that you picked v.
    JeffJo

    I just want to note that we seem to be in agreement on everything. The only reason why we seemingly disagreed in our recent exchange is that you objected to my stated requirement that the game be conceived as a "real world" problem, and hence that the possibility of a uniform and unbounded prior distribution be precluded, in order that the switching strategy could be shown to yield a zero conditional expected gain rather than a 0.25*v conditional expected gain. I was thus merely expressing the caveat that you are now making explicit in your first paragraph. The literature on the two-envelope paradox abounds with discussions of the ideal and impractical case where this "real world" constraint doesn't apply, and it is this case which, unlike the "real world" case where the prior distribution is well behaved, is still controversial. (See the Chalmers paper mentioned earlier in the thread.)
  • Mathematical Conundrum or Not? Number Six
    How is this any different to saying that I'm equally likely to win as lose?Michael

    It is obviously different: on the assumption that you are equally likely to win as lose, it follows that the expected gain from switching is 0.25*v, whereas saying that the odds neither are nor aren't in your favor is equivalent to saying that the average expected gain from switching is zero (that is, that the average expected value of the other envelope is v rather than 1.25*v).

    (I'll come back to this conversation tomorrow)
  • Mathematical Conundrum or Not? Number Six
    No, because I know the probabilities aren't in my favour. If I know that they're not in my favour then I won't play. If I know that they're in my favour then I will play. If I don't know the odds then I will play.Michael

    In the two envelope case, you don't know the odds of winning. But you do know (or ought to be able to deduce) that the odds are neither in your favor nor against you. The expected gains from switching and from sticking are both zero. That is the case, anyway, on the assumption that the game master doesn't have access to infinite funds.
  • Mathematical Conundrum or Not? Number Six
    But you're also not saying that sticking is a winning strategy. If sticking isn't preferred then I am going to switch, because I am willing to risk losing £5 for the chance to win £10. I have more to gain than I have to lose. That neither strategy gains over the other after repeated games doesn't change this.Michael

    That only makes sense if you favor taking the chance of gaining a larger amount A over risking the loss of a smaller amount B, irrespective of their relative probabilities and, hence, irrespective of the expected value of the outcome.

    Suppose I offer you the chance to play a game with two dice. You throw them once and sum them up. If you roll any total from 2 to 11, you must give me £5. If you roll 12, then I must give you £10. Let us assume that we are only going to play this game once. Would you also say, in this case, that you are willing to risk losing £5 for the chance to win £10?
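
    For what it's worth, here is the expected value of that offer, computed in Python:

        from fractions import Fraction

        # The sum of two fair dice is 12 with probability 1/36; any other total
        # (2 through 11) costs you £5, while a 12 wins you £10.
        p_twelve = Fraction(1, 36)
        expected = p_twelve * 10 + (1 - p_twelve) * (-5)
        print(expected, float(expected))   # -55/12, i.e. about -£4.58 per play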
  • Mathematical Conundrum or Not? Number Six
    If there's no reason to believe that we're more likely to lose than win on switching, i.e. if there's no reason to prefer sticking, and if we can afford to lose, then switching is a good gamble for a single game, even if not a winning strategy over many games. I either lose £5 or I gain £10. That's a bet worth making for me, and so if I were to play this game and find £10 in my envelope then I would switch.Michael

    I would say that, if it's not a winning strategy over many games of the same kind, then it's not a rational strategy over just one game of that kind, unless you aren't averse to playing money games with negative or null expected value. Playing the game only once merely increases the variance; it doesn't change the expected value. (Although, as andrewk earlier noted, what choice you ought to make also depends on what your personal utility valuations are; here I am simply assuming that the player's only goal is to maximize expected value.)

    What would make the switching choice worth making is the chance of your losing £5 not being at least twice as large as your chance of winning £10. But you don't know that to be the case either. If you naively rely on the principle of indifference, this will lead you to make a mistake in every case where you are playing this game, are dealt an envelope with value v, and, conditional on the amount in your envelope being v, the chance of your losing £5 is more than twice as large as your chance of winning £10. In the other cases, your choice to switch yields a null or positive expected value. The only thing that you know for sure is that, over the long run, such mistakes would exactly cancel out your gains. So, the expected gain from switching, when you have no clue at all where the value v that you are holding is located within the probability distribution of the possible envelope values, is exactly zero; the expected value of the other envelope is v, not 1.25v.

    Lastly, if you have no clue at all what the distribution is, and you take every possible distribution to be equally likely, with no constraint at all on the maximum or minimum amounts that could possibly (and plausibly) be contained in the envelopes, then, yes, the expected value of the other envelope, conditionally on v being the value of your envelope, is 1.25v. But that can only happen in a world where the game master has an infinite fund of money.
  • Reviews of new book, Neo-Aristotelian Perspectives in the Natural Sciences
    The Transactional Interpretation of Quantum Mechanics, 2012, p.33Wayfarer

    Thanks. Since I am not familiar with the transactional interpretation of quantum mechanics, I downloaded the new paper by Kastner, Kauffman and Epperson. At first glance, it seems to me that the best features of this approach are shared by the relational/pragmatist approaches favored by Heelan, Rovelli and Bitbol. The relational/pragmatist interpretations, though, appear to me to comport better with Aristotelian metaphysics, and to jettison more radically the foundationalist and reductionist prejudices of modern scientific thinking, than the transactional interpretation appears to do. But I'll have to read the paper more carefully to see whether my worries are warranted before expressing more precise objections.
  • Reviews of new book, Neo-Aristotelian Perspectives in the Natural Sciences
    What interests me about that article, however, is the idea of 'potentia' as 'real but not actually existing'. 'The unmanifest' was tacked on by me at the end, it might be misleading - that's not the main point of the article.Wayfarer

    It is to be applauded that some physicists will grant existence to pure potentialities, but it seems to rub against the spirit of Aristotelian metaphysics to suggest that what is actual takes place in spacetime (or in the phenomenal world of ordinary experience) whereas what exists as pure potential is outside of spacetime (or in some Platonic intelligible world). This idea doesn't mesh with Aristotle's idea of there being first and second actualities, since first actualities, themselves being kinds of potentialities, would have to exist both within spacetime and outside of it. A person's property of being sighted, or of being able to speak French, for instance, is a first actuality, while the exercise of sight, or the act of speaking French, is a second actuality. When a doctor restores sight to a formerly blind person, it would be weird to say that this restored ability is something that exists both outside of spacetime (qua potentiality to see) and inside of it (qua first actuality).

    Maybe those physicists would hold that only very special and fundamental sorts of potentials, namely, quantum potentials, exist outside of spacetime. But now the objectionable dualism is being replaced by a crude reductionism. What are we to make of the ontological status of the unactualized potentialities of ordinary things, and of the unactualized powers of objects of sciences other than those of fundamental particle physics?
  • Mathematical Conundrum or Not? Number Six
    So we’re assuming that the other envelope is equally likely to contain either £20 or £5, and that’s a reason to switch. We either lose £5 or gain £10. That, to me, is a reasonable gamble.Michael

    It's not necessarily equally likely. It may be equally likely, conditionally on £10 being the value of the first envelope, but it may also be more likely that the other envelope contains £20, or more likely that it contains £5. In the first two cases, you fare better, in expectation, if you switch. In the third case, you may fare worse. (It ought to be twice as likely that the other envelope contains £5 rather than £20 for the switching strategy to break even.) On (weighted) average, over all the possible values that you can be dealt initially, you don't fare any better by switching. Only in the case where the prior distribution of possible envelope values is uniform and unbounded do you have a positive expected gain from switching conditionally on any value v being seen in your envelope.
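
    To make the break-even point explicit, here is a small Python check (the probabilities tried below are arbitrary):

        # If the other envelope holds £5 with probability p and £20 with
        # probability 1 - p, the expected gain from switching on a seen £10 is
        # -5p + 10(1 - p), which is zero exactly at p = 2/3, i.e. when £5 is
        # twice as likely as £20.

        def switch_gain(p_other_is_5):
            return -5 * p_other_is_5 + 10 * (1 - p_other_is_5)

        for p in (1 / 2, 2 / 3, 3 / 4):
            print(f"P(other = £5) = {p:.3f}: expected gain from switching {switch_gain(p):+.2f}")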
  • Mathematical Conundrum or Not? Number Six
    Isn't all of this true whichever of a and b is larger, and whatever their ratio?Srap Tasmaner

    Yes, this is true of the unconditional expected values of sticking or switching. But those are not the same as the conditional values of sticking or switching (conditional, that is, on the specific value of one of the two envelopes). In the case where the prior distribution isn't uniform and unbounded, the unconditional values of sticking and switching are equal to each other, and they are finite. In the case where the prior distribution is uniform and unbounded, the unconditional values are still equal to each other, since they are both infinite. And, conditional on v being the value of either one of the two envelopes, the expected value of the other envelope is 1.25v, so that switching (assessed from the first envelope) and sticking (assessed from the second) can each appear to be worth 1.25 times the seen value. (The trick, of course, is to see why this doesn't lead to a contradiction in the case where the prior distribution is uniform and unbounded. It's the same reason why the guests of the Hilbert Rational Hotel are rationally justified in blindly moving to new rooms and, if they have already moved, are rationally justified in blindly moving back to their previous rooms.)
  • Mathematical Conundrum or Not? Number Six
    But it isn't logically consistent. With anything. That's what I keep trying to say over and over.

    1.25v is based on the demonstrably-false assumption that Pr(X=v/2)=Pr(X=v) regardless of what v is. It's like saying that the hypotenuse of every right triangle is 5 because, if the legs were 3 and 4, the hypotenuse would be 5.
    JeffJo

    What you are saying is correct in any case (most cases?) where the prior probability distribution of the envelope values isn't unbounded and uniform. In the case where it is, there is no inconsistency between the expected value of the unseen envelope being 1.25v conditionally on the value of the seen envelope being v, and this being the case regardless of which one of the two envelopes has been seen.

    Exp(other) = (v/2)*Pr(picked higher) + (2v)*Pr(picked lower) is a mathematically incorrect formula, because it uses the probabilities of the wrong events.

    Actually, the formula is correct in the special case where the prior distribution is uniform and unbounded, since, in that special case, the conditional probabilities Pr(picked higher|V=v) and Pr(picked lower|V=v) remain exactly 1/2 whatever v might be. In the more general case, the formula should rather be written:

    Exp(other) = (v/2)*Pr(picked higher|V=v) + (2v)*Pr(picked lower|V=v)

    Exp(other) = (v/2)*Pr(V=v|picked higher) + (2v)*Pr(V=v|picked lower) is the mathematically correct formula, because it uses the probabilities of the correct events.

    Are you sure you didn't rather mean to write "Exp(other) = (v/2)*Pr(picked higher|V=v) + (2v)*Pr(picked lower|V=v)"?
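
    For concreteness, here is a small Python sketch of the conditional formula as I wrote it above. The prior used (the smaller amount uniform over {5, 10, 20, 40}) is an arbitrary illustrative choice; the point is just how Pr(picked lower|V=v) and the conditional expectation of the other envelope come out once some definite, bounded prior is in place.

        # Prior pmf of the smaller amount x; the pair is (x, 2x), and either
        # member of the dealt pair is handed out with probability 1/2 (the 1/2
        # cancels out of the conditional probabilities below).
        PRIOR = {5: 0.25, 10: 0.25, 20: 0.25, 40: 0.25}

        def expected_other(v):
            w_lower = PRIOR.get(v, 0.0)        # weight of the pair (v, 2v)
            w_higher = PRIOR.get(v / 2, 0.0)   # weight of the pair (v/2, v)
            if w_lower + w_higher == 0:
                return None                    # v impossible under this prior
            p_lower = w_lower / (w_lower + w_higher)    # Pr(picked lower | V=v)
            p_higher = 1 - p_lower                      # Pr(picked higher | V=v)
            return (2 * v) * p_lower + (v / 2) * p_higher

        for v in (5, 10, 20, 40, 80):
            # 2v at the bottom of the support, 1.25v in the middle, 0.5v at the top
            print(f"V = {v:>2}: E[other | V=v] = {expected_other(v)}")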
  • Mathematical Conundrum or Not? Number Six
    A pedant would insist you need to include one probability from the probability distribution of whichever you choose. But it divides out so it isn't necessary in practice.JeffJo

    Edited: I had posted an objection that doesn't apply to what you said since I overlooked that you were only here considering the case where both envelopes remain sealed. I agree with your post.
  • Mathematical Conundrum or Not? Number Six
    And since the OP does not include information relating to this, it does not reside in this "real world."JeffJo

    That's fine with me. In that case, one must be open to embracing both horns of the dilemma, and realize that there being an expectation of 1.25v for switching, conditional on whatever value v one might find in the first envelope, isn't logically inconsistent with there being an expectation of 1.25w for sticking, conditional on whatever value w one might be shown in the other envelope (as @Srap Tasmaner had suggested, by way of reductio of the unconditional switching strategy). The situation would therefore be analogous to the one that I illustrated with my Hilbert's Infinite Hotel thought experiment.
  • Mathematical Conundrum or Not? Number Six
    I'm not sure what "real world" has to do with anything. But...JeffJo

    It has to do with the sorts of inferences that are warranted on the grounds of the assumption that the player still "doesn't know" whether her envelope is or isn't the largest one, whatever value she finds in it. And this, in turn, depends on how "doesn't know" is meant to be interpreted. Is it meant to imply that the player is entitled to apply the principle of indifference and therefore assign a 50% probability (exactly) to each one of the two possibilities, irrespective of the value that she finds in her envelope? This is what I would take to entail that the prior distribution is uniform, and not to be consistent with a prior belief that the amount of money in the universe is finite.

    When I say that the prior distribution is uniform, I mean this to represent the player's prior expectation (or credence) that, whatever real positive value v she finds in her envelope, it remains equally likely, from her point of view, that the other envelope contains v/2 or 2*v. This would also entail that the prior probability, for the player, of finding a value v in her first envelope that is lower than some upper bound M, however large M might be, is infinitesimal. That such uniform and unbounded prior expectations don't apply to rational agents faced with "real world" problems is what I meant.
  • Mathematical Conundrum or Not? Number Six
    It'll be fine once we use a MCMC and get the HDI.Jeremiah

    Yes, everything will be fine and dandy. How did I not think of that...
  • Mathematical Conundrum or Not? Number Six
    A normal distribution does not have to have a mid of 0, nor do they need negative values.Jeremiah

    I did not say that it has to be centered on zero. Normal distributions are unbounded on both sides, however. They assign positive probability densities to all real values.
  • Mathematical Conundrum or Not? Number Six
    A normal prior would actually make more sense, as empirical investigations have shown it robust against possible skewness.Jeremiah

    I am not sure how you would define a normal prior for this problem since it is being assumed, is it not, that the amount of money that can be found in the envelopes is positive? If negative values are allowed, and the player can incur a debt, then, of course, a normal prior centered on zero yields no expected gain from switching. Maybe a Poisson distribution would be a better fit for the player's prior credence in a real world situation where negative amounts are precluded but no more information is explicitly given. But such a prior credence would also fail to satisfy the principle of indifference as applied to the 'v is lowest' and 'v is highest' possibilities, conditionally on most values v being observed in the first envelope.
  • Mathematical Conundrum or Not? Number Six
    A random variable is defined by a real world functionJeremiah

    That's a bit like saying that a geometrical circle is defined by a real world cheese wheel.
  • Mathematical Conundrum or Not? Number Six
    (...) That distribution is an unknown function F1(x). After picking high/low with 50:50 probability, the value in our envelope is a new random variable V. Its distribution is another unknown function F2(v), but we do know something about it. Probability theory tells us that F2(v) = [F1(v)+F1(2v)]/2. But it also tells us that the distribution of the "other" envelope, random variable Y, is F3(y) = [F1(y)+F1(2y)]/2. Y is, of course, not independent of V. The point is that it isn't F3(v/2)=F3(v)=1/2, either.JeffJo

    I agree with this, and with much of what you wrote before. I commend you for the clarity and rigor of your explanation.

    Looking in the envelope does change our role from that of the game player, to the gamemaster. Just like seeing the color of your die does not. Simply "knowing" v (and I use quotes because "treat it as an unknown" really means "treat it as if you know the value is v, where v can be any *single* value in the range of V") does not change increase our knowledge in any way.

    I am not so sure about that. The game master does have prior exact knowledge of the function F1(x), which I earlier (possibly misleadingly) called the "initial distribution". According to the OP specification, the player doesn't necessarily know what this function is (although, under one interpretation of the problem, it must be assumed to be uniform and unbounded). When the player opens her envelope, her epistemic position remains distinct from that of the gamemaster, since she is still ignorant of F1(x).

    My earlier point, which is a concession to some of the things @Jeremiah and @Michael have said, is that the nature of the "system" whereby some probability distribution is being generated, and which F1(x) represents, is only relevant to a player who plays this specific game, whether once or several times. But what I may not have conveyed sufficiently clearly is that when the player not only plays this game just once, but also isn't seeking to maximize her expected value with respect to the specific probability space defined by this "game" or "system", but rather with respect to the more general probability space generated by the distribution of all the possible games that satisfy the OP's general specification, then the specific F1(x) known to the game master is irrelevant to the player at any step. The mutual dependences of F1, F2 and F3, though, as you correctly point out, are indeed relevant to the player's decision. They are relevant to constraining the player's relevant probability space. This is the point that @Michael may have missed.
  • Mathematical Conundrum or Not? Number Six
    Claiming this case is "ideal" is an entirely subjective standard pumped full of observational bias.Jeremiah

    The OP doesn't specify that the thought experiment must have physically possible instantiations. The only idealization, here, consists in strictly pursuing the mathematical/logical consequences of the principle of indifference even when it is interpreted so as to entail that the relevant distribution of possible pairs of envelope contents must be assumed by the player to be uniform and unbounded.
  • Mathematical Conundrum or Not? Number Six
    I suggested at one point in this thread that if told the value of the other envelope instead of your own, then you would want not to switch; I found this conclusion absurd but my interlocutor did not. Go figure.Srap Tasmaner

    That was a very pointed challenge, and I think it has force in the case where the analysis is assumed to apply to a game that can be instantiated in the real physical world and played by finite creatures such as us. But in the ideal case where the prior distribution is assumed to be uniform and unbounded, the conclusion, although seemingly absurd, would be warranted.
  • Mathematical Conundrum or Not? Number Six
    (...) This is the fallacy. You reason from the fact that, given the criterion of success, you would have a 1 in 2 chance of picking the envelope that meets that criterion, to a 1 to 2 chance that the unknown criterion of success is the one your chosen envelope meets. (...)Srap Tasmaner

    This is a very neat analysis and I quite agree. One way to solve a paradox, of course, is to disclose the hidden fallacy in the argument that yields one of the two inconsistent conclusions. Your explanation indeed shows that, in the case where the criterion of success is unknown, and we don't have any reason to judge that the two possible success criteria are equiprobable, it is fallacious to infer that the expected value of switching is 1.25v (where v is the value observed in the first envelope).

    This is consistent with what I (and some others) have been arguing, although I have also highlighted another source of the paradox, whereby the equiprobability assumption (regarding the two possible albeit unknown success criteria) would appear to be warranted without any reliance on the fallacy that you have diagnosed. That is what occurs in the case where the prior distribution over the envelope pairs that the player is entitled to judge possible is assumed to be uniform and unbounded. This is the ideal case, physically impossible to realize in the real world, that I have illustrated by means of my Hilbert Rational Hotel analogy.
  • Mathematical Conundrum or Not? Number Six
    No, it isn't fair to say that. No more than saying that the probability of heads is different for a single flip of a random coin, than for the flips of 100 random coinsJeffJo

    Well, that's missing the point ;-) My purpose was to highlight a point that @Srap Tasmaner, yourself and I now seem to be agreeing on. Suppose, again, that a die-throwing game will be played only once (i.e., there will be only one throw) with a die chosen at random from the two oppositely biased dice described earlier. The point was that, in the special case where the die that has been picked at random is biased towards 6, and hence the "initial distribution" is likewise biased, this fact is irrelevant to the "problem space" that the player is entitled to rely on. The initial picking of the die occurred within the black box of the player's ignorance, as it were, such that the whole situation can still be treated (by her) as the single throw of a fair die.

    This is what I took @Srap Tasmaner to mean when he said that "the player" doesn't "interact" with the "sample space", which is a statement that I glossed as the player's "problem space" (maybe not the best expression) not being (fully) constrained by the "initial distribution" of envelope pairs, which must be treated as being unknown but, as you've pointed out repeatedly, still belongs to a range that is subject to definite mathematical constraints that can't be ignored.
  • Mathematical Conundrum or Not? Number Six
    Hilbert's Grand Hotel Revisited

    Here is an improvement on my earlier Hilbert Grand Hotel analogy to the two-envelope paradox. The present modification makes it into a much closer analogy.

    Rather than considering an Infinite Hotel where the countably infinitely many rooms are numbered with the natural numbers, this time they will be numbered with the (also countably infinitely many) strictly positive rational numbers. Hence, for instance, room 3/2 will be located midway between room 1 and room 2, and will be small enough to fit between them. Guests of the Hilbert Rational Hotel who want to go into rooms corresponding to rational numbers that have very large denominators will have to swallow very many magic shrinking pills, let us assume (and may have to leave their luggage behind). Initially, infinitely many guests are randomly distributed such that there are, on average, about one hundred guests in each room. Every morning, each guest is awarded $Q, where Q is their rational room number. Every night, each guest is given the opportunity to flip a coin to determine whether she will move from room Q to room Q/2 (if the coin lands tails up) or to room 2*Q (if it lands heads up). Let us suppose that all the guests are greedy and rational and, hence, all choose to flip the coin (rather than stay) in order to increase their expected earnings to 1.25*Q on the next morning. (If we take a random sample of guests within any given bounded segment of Hilbert's Rational Hotel, it is clear that they fare better, on average, than they would have if they had all chosen to stay. On average, they fare 1.25 times better.)

    The first thing to notice is that, after the daily reshuffling of guests, the average population of the rooms stays the same, since each room is accessible from exactly two other rooms and, for any Q, as many guests, on average, move from room Q to room 2Q as move the other way. So, although any finite random sample of guests improves its take by a factor of 1.25, on average, the average room occupancy doesn't change and the overall population density (measured in guests per room) on any segment of the hotel doesn't vary at all.

    Now, suppose that on some particular nights -- on Sunday nights, say -- all the guests swallow a pill that makes them forget which room they had moved from on the previous night. They are then offered the opportunity to blindly move back to whatever room they previously occupied. If they are rational and greedy, they ought to choose to move back since, if they now are in room Q, it is equally likely that they came from room Q/2 as that they came from room 2*Q. Hence, on average, the guests who move back to the room they came from fare 1.25 times better than they would if they stayed.

    This explains why, likewise, in the two-envelope game with an unbounded and uniform distribution (if such a thing is intelligible at all), it appears rational to switch envelopes in order to increase one's expected value by a factor of 1.25 and, nevertheless, after one has switched blindly, it appears equally rational to switch back in order to increase one's expected value by a factor of 1.25 again. Such distributions, though, can no more be instantiated in real games than Hilbert's Grand Hotel can be built.
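
    Here is a finite-sample Python sketch of the two averages this story turns on (the sample of rational room numbers is generated arbitrarily): moving multiplies a sampled guest's room number, and hence her next payout, by 1.25 on average, and blindly moving back does so as well.

        import random
        from fractions import Fraction

        random.seed(2)
        # An arbitrary finite sample of guests, identified by rational room numbers.
        guests = [Fraction(random.randint(1, 100), random.randint(1, 100)) for _ in range(50_000)]

        # Each guest flips a fair coin and moves to room 2Q (heads) or Q/2 (tails).
        moves = [(q, q * 2 if random.random() < 0.5 else q / 2) for q in guests]

        forward = sum(float(new / old) for old, new in moves) / len(moves)   # new/old is 2 or 1/2
        backward = sum(float(old / new) for old, new in moves) / len(moves)  # old/new is 1/2 or 2

        print(f"average factor gained by moving:      {forward:.3f}")   # about 1.25
        print(f"average factor gained by moving back: {backward:.3f}")  # also about 1.25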
  • Mathematical Conundrum or Not? Number Six
    Why would your ignorance preclude you from facing a choice and making a decision? In the OP, you make at least two choices: which envelope to claim, and whether to keep it or trade it for the other. Whether you end up with the most or the least you could get depends on those two decisions and nothing else. What the amounts are depends on someone else.Srap Tasmaner

    Your ignorance doesn't preclude you from facing a choice and making a decision. What it precludes you from doing is basing your decision to switch (if you decide to) on a determinate expected gain, since the value of such an expected gain (conditionally on your having seen v in the first envelope) is unknown (and, in particular, is not known to be zero).

    Only in the case where the distribution is uniform and unbounded can you know what the expected value of switching is, and know it to be precisely 1.25v. (And that's because, in the unbounded case, it is precluded that you might ever hit the top of the distribution(*) and thereby lose, in one fell swoop, all the expected gains that would otherwise have accrued from the other cases in the average time it takes for this large loss to randomly occur.) In that unbounded case, you would know that switching will earn you either v/2 or 2*v with equal probability. It merely appears paradoxical, in that unbounded case, that you are always entitled to switch but that, nevertheless, if you were to switch blindly the first time, and only open the second envelope to find some value w, then you would still expect 1.25w from switching back. It is this apparent paradox, occurring with infinite and uniform distributions, that my Hilbert Grand Hotel analogy was meant to illustrate.

    (*) I am only considering uniform probability distributions over elements of a single discrete doubling sequence, for simplicity. We can also assume it to be truncated below, at $1, say.
  • Mathematical Conundrum or Not? Number Six
    Conversely, the expected gain that the player calculates will still be the unconditional gain of zero since she doesn't know the initial distribution or both amounts in the selected envelope pair.Andrew M

    On the assumption, of course, that the player takes this initial distribution to be bounded above by M for some (possibly unknown) M; or that, if unbounded, its sum or integral converges.
  • Mathematical Conundrum or Not? Number Six
    There are values in envelopes. How they got there can be discussed, and that can be interesting when the player has, say, partial knowledge of that process, but it is not the source of the paradox, in my opinion.Srap Tasmaner

    Whereas, on my view, it is the source of the paradox ;-)
  • Mathematical Conundrum or Not? Number Six
    This makes no sense to me. Initial distribution of what? If these are pairs of envelopes from which will be chosen the pair that the player confronts, then not only is this sample space unknown to the player, she never interacts with it. She will face the pair chosen and no other.Srap Tasmaner

    Yes, it is tempting to say that if the game is only being played once then the shape of the initial distribution(*) isn't relevant to the definition of the player's known problem space. That's a fair point, and I may not have been sufficiently sensitive to it. This is broadly the argument @Michael and @Jeremiah are making, although they are reaching opposite conclusions regarding the expected gain from switching. (And they're, in two different ways, both right; hence the paradox.) The player doesn't know what this distribution is, and, since she is only playing once, and hence is only being dealt one single envelope pair, how is it relevant what the other unrealized envelope pairs (and their probabilistic frequencies) were within this specific distribution? But it actually is relevant what the distribution might look like, since there are logical constraints on the shape of that distribution that can be inferred from the assumptions that ground either strategy (i.e. switching or being indifferent to switching).

    The advocate of the switching strategy is actually right in saying that if, even after she has seen that her envelope contains v, she still has no reason to assign a probability at least twice as high to the other envelope containing v/2 rather than 2v, then she is justified in switching. The switching strategy can only be justified because, precisely, the prior probability distribution of this game, which is only being played once, doesn't interact with this player's problem space. When she gets to play such a game again, if ever, the prior distribution might be different.

    What rather defines the problem space of this player, in accordance with the general specification of the problem that is provided to her (as described in the OP), is the range of possible envelope pairs (and their probabilistic weights) that is merely consistent with that general (and abstract) specification. There are, however, (roughly) two ways to generate such a range: bounded or unbounded. If it's unbounded, then the switching strategy is justified, since the expected value of the other envelope is 1.25v conditionally on any v being found in the first envelope. If the range is bounded, because the player assumes that there is some finite amount of money M in the universe, however big M might be, then her problem space is such that the average expected gain from switching is zero. This is because her problem space is built up from some arbitrary weighting of all the possible bounded prior distributions, and each of those individually yields an average expected gain from switching of zero, as does any weighted sum of them.

    (*) I am defining this initial distribution with reference to the method, unknown to the player, by which the initial contents of the two envelopes are effectively determined: a pseudo-random number generator, a quantum device, or whatever.
  • Mathematical Conundrum or Not? Number Six
    I'm not sure which of JeffJo's examples you're referring to.Srap Tasmaner

    I was not referring to a specific example but rather to his general resolution of the apparent problem. It both justifies the zero expected gain for the unconditional switching strategy and explains why the indifference principle can't be applied for inferring that the expected utility of switching is 1.25v, in the case where your envelope contains v.

    As for my "tree" and what it predicts -- You face a choice at the beginning between two values, and the same choice at the end between those same two values. If you flip a coin each time, then your expectation is the average of those two values both times and it is unchanged.

    I am not sure why you are saying that I am facing a choice rather than saying that I simply don't know whether my envelope is the smallest or the largest (within the pair that was picked). I am not facing a choice between two values. I am holding v, and I am offered the chance to trade v for another envelope which, for all I know, might contain v/2 or 2v. One wrong inference that one might make is that, just because I don't know whether the other envelope contains v/2 or 2v, and don't have any determinate means to estimate which, the indifference principle applies and I can assign a probability of 1/2 to either case. That would indeed yield an expected value of 1.25v for the second envelope. But that is a wrong inference from the fact that I don't know either P(v, v/2) or P(v, 2v). I only know that those two probabilities add up to one and that their exact values depend on the prior distribution of possible envelope pairs. The only way for an initial distribution to guarantee that P(v, v/2) = P(v, 2v) = 1/2 for any v would be for it to be uniform and unbounded. For any other feasible way to randomly determine the pairs of amounts in accordance with some well behaved probability distribution, all the player can infer is that, on average, after repeated trials, the gain from switching tends towards zero.

    Opening an envelope changes things somewhat, but only somewhat. It gives more substance to the word "switch", because having opened one envelope you will never be allowed to open another. You are now choosing not between two envelopes that can be treated as having equal values inside, although they do not, but between one that has a known value and another that cannot be treated as equal in value.

    But it is, for all that, exactly the same choice over again, and however many steps there are between beginning and end, your expected take is the average of the values of the two envelopes. If there's an example in which that is not true, I would be surprised.

    I disagree. Suppose the initial distribution is, unbeknownst to you, ((5,10), (10,20), (20,40)). In that case, if you are dealt 5, the expected value of sticking is 5. You don't know what the expected gain of switching is. But it's not the case that it is therefore zero. That would only be zero if you knew for a fact that (5, 10) is half as likely as (2.5, 5) in the prior distribution.
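
    Here is an exact Python calculation for that three-pair example, assuming (since the example leaves it open) that each pair is equally likely and that either member of the dealt pair is handed out with probability 1/2:

        from collections import defaultdict
        from fractions import Fraction

        PAIRS = [(5, 10), (10, 20), (20, 40)]

        # weights[(held, other)] = probability of holding `held` with `other` unseen
        weights = defaultdict(Fraction)
        for a, b in PAIRS:
            weights[(a, b)] += Fraction(1, len(PAIRS)) / 2
            weights[(b, a)] += Fraction(1, len(PAIRS)) / 2

        overall_gain = Fraction(0)
        for v in sorted({held for held, _ in weights}):
            cases = {other: w for (held, other), w in weights.items() if held == v}
            total = sum(cases.values())
            gain = sum(w * (other - v) for other, w in cases.items()) / total
            overall_gain += total * gain
            print(f"dealt {v:>2}: expected gain from switching = {gain}")

        print(f"unconditional expected gain from always switching = {overall_gain}")  # 0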
  • Mathematical Conundrum or Not? Number Six
    The OP asks, "What should you do?"

    Think of the problem as being in the same family as Pascal's Wager, involving decision theory and epistemology.
    Andrew M

    Indeed. And just as is the case with Newcomb's problem, with the two-envelope paradox also, the dominance principle and the (maximum) expected utility principle appear to recommend inconsistent strategies when carelessly applied. Newcomb's problem is more controversial, even, than the two-envelope paradox. It is also quite rich in philosophical implications.
  • Mathematical Conundrum or Not? Number Six
    What area of philosophy do you think the significance would obtain?Janus

    Probability is a big philosophical topic. It is quite tightly enmeshed with both metaphysics and epistemology. Michael Ayers wrote a lovely book, The Refutation of Determinism, which explores some of the philosophical problems associated with the concepts of probability, necessity and possibility, and also pursues some implications for the problem of free will and determinism. There is also a close connection with epistemology, Gettier examples, and some of the most puzzling paradoxes of Barn Facade County, which arise, it seems to me, from assumptions regarding the grounding of knowledge that are closely related to some of the assumptions that give rise to the two-envelope paradox. Maybe I'll create a new topic about this when time permits.

    A first step, which is being pursued in this thread, is to get clear on (what should be) the uncontroversial steps in the mathematical reasoning.
  • Mathematical Conundrum or Not? Number Six
    Sorry, I'm not following this. This sounds like you think I said your expected gain when you have the smaller envelope is zero, which is insane.Srap Tasmaner

    No, that's not what I was saying. I was rather suggesting that, assuming there is some determinate albeit unknown probability distribution of possible envelope pairs, then, conditional on some one specific pair having been selected from this initial range, and consistently with $5 being one of the two amounts within this pair (because $5 is the amount that you have seen in your envelope, say), the expected gain of switching appears to be zero, according to your decision tree analysis. According to @JeffJo, though, the expected value of the other envelope could be $2.5, $6.25, $10, or something else, so the expected gain from switching isn't necessarily zero. What it actually is, is unknown to the player. All that is known to the player is the average gain from the unconditional switching strategy. And that is zero.
  • Mathematical Conundrum or Not? Number Six
    It's truly remarkable that a question which is of no philosophical significance or interest could generate so many responses on a philosophy forum!Janus

    It does have some philosophical implications. Some of @andrewk's replies raised good philosophical points regarding the status and significance of the probability distributions that are involved in the analysis of this apparent paradox.
  • Mathematical Conundrum or Not? Number Six
    Here's my decision tree again (...)Srap Tasmaner

    Yours isn't really a decision tree that the player must make use of, since there is no decision for the player to make at the first node. Imagine a game where there are two dice, one of which is biased towards six (and hence against one) while the other is equally biased towards one (and hence against six). Neither die is biased with respect to any of the other possible results, 2, 3, 4 and 5, which therefore still have a 1/6 chance of occurring. Suppose the game involves two steps. In the first step, the player is dealt one of the two dice at random. In the second step, the player rolls this die and is awarded the result in dollars. What is the player's expected reward? It is $3.5, of course, and upon playing the game increasingly many times, the player can expect the exact same uniform random distribution of rewards ($1,$2,$3,$4,$5,$6) as she would expect from repeatedly throwing one single unbiased die. Indeed, so long as the two dice look the same, and thus can't be reidentified from one iteration of the game to the next, she would have no way to tell that the dice are biased. There just isn't any point in distinguishing two steps of the "decision" procedure, since the first "step" isn't acted upon, yields no information to the player, and can thus be enclosed in a black box, as it were. Either this game, played with two dice biased in opposite directions, or the same game played with one single unbiased die, can be simulated with the very same pseudo-random number generator. Those two games really are just two different implementations of the very same game, and both call for exactly the same strategies for achieving any given goal.

    In the case of the two-envelope paradox, the situation is similar. The player never has the opportunity to choose which branch to take at the first node. So, the player must treat this bifurcation as occurring within a black box, as it were, and assign each branch some probability. But, unlike in my example with the two oppositely biased dice, those probabilities are unknown. @JeffJo indeed treats them as unknown, but he demonstrates that, whatever they are, over the whole range of possible dealings of two envelopes that may occur at the first step of the game, they simply divide out in the calculation of the expected gain of the switching strategy, which is zero for all possible (bounded, or, at least, convergent) initial distributions. Where @JeffJo's approach seems to me to be superior to yours is that it doesn't yield an incorrect verdict for the specific cases where the prior distribution is such as to yield envelope pairs for which, conditionally on being dealt either the smaller or the larger amount from the pair, the expected gain from switching isn't zero. Your own approach seems to yield an incorrect result in those cases.
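
    Here is a quick Python sketch of that two-dice point (the size of the bias is an arbitrary choice of mine): the player's reward distribution is indistinguishable from that of a single fair die.

        import random
        from collections import Counter

        random.seed(3)
        BIAS = 0.10   # arbitrary bias size
        TOWARD_SIX = [1/6 - BIAS, 1/6, 1/6, 1/6, 1/6, 1/6 + BIAS]
        TOWARD_ONE = [1/6 + BIAS, 1/6, 1/6, 1/6, 1/6, 1/6 - BIAS]
        FACES = [1, 2, 3, 4, 5, 6]

        def play_round():
            # Step one, hidden from the player: one of the two biased dice is dealt.
            die = TOWARD_SIX if random.random() < 0.5 else TOWARD_ONE
            # Step two: the die is rolled and the face value is paid out in dollars.
            return random.choices(FACES, weights=die)[0]

        trials = 120_000
        rewards = Counter(play_round() for _ in range(trials))
        for face in FACES:
            print(f"reward ${face}: frequency {rewards[face] / trials:.3f}")             # each about 1/6
        print(f"average reward: {sum(f * c for f, c in rewards.items()) / trials:.3f}")  # about 3.5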
  • Mathematical Conundrum or Not? Number Six
    This still looks like you're considering what would happen if we always stick or always switch over a number of repeated games. I'm just talking about playing one game. There's £10 in my envelope. If it's the lower bound then I'm guaranteed to gain £10 by switching. If it's the upper bound then I'm guaranteed to lose £5 by switching. If it's in the middle then there's an expected gain of £2.50 for switching. I don't know the distribution and so I treat each case as equally likely, as per the principle of indifference. There's an expected gain of £2.50 for switching, and so it is rational to switch.Michael

    This works if you are treating all the possible lower and upper bounds of the initial distribution as being equally likely, which is effectively the same as assuming a game where the distribution is uniform and unbounded. In that case, your expected value for switching is indeed 1.25 * v, conditionally on whatever value v you have found in your envelope, because there is no upper bound to the distribution. The paradox arises.

    If we then play repeated games then I can use the information from each subsequent game and switch conditionally, as per this strategy (or in R), to realize the .25 gain.

    If there is an upper bound M to the distribution, and you are allowed to play the game repeatedly, then you will eventually realize that the losses incurred whenever you switch after being initially dealt the maximum value M tend to wipe out your cumulative gains from the other situations. If you play the game x times, your average gain per game (your cumulative gain divided by x) will tend, as x grows larger, towards the expected gain from switching in a single play of the game; and since the former tends towards zero, so does the latter. To repeat, conditionally on where v is situated in the bounded distribution, the expected value of switching could be either 2*v, 1.25*v or 0.5*v. On average, it will be v.
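
    Here is a small Python simulation of that, under an assumed bounded doubling prior (the pairs and their equal weighting are arbitrary choices of mine): the always-switch strategy gains roughly nothing per game, because the losses suffered whenever the maximum value is dealt offset the gains accrued everywhere else.

        import random
        from collections import defaultdict

        random.seed(4)
        SMALLER = [10, 20, 40, 80]   # pairs (x, 2x); the largest envelope value is M = 160

        gains_by_dealt = defaultdict(float)
        trials = 300_000
        for _ in range(trials):
            x = random.choice(SMALLER)
            pair = (x, 2 * x)
            v = random.choice(pair)
            other = pair[0] if v == pair[1] else pair[1]
            gains_by_dealt[v] += other - v     # gain from always switching

        for v in sorted(gains_by_dealt):
            print(f"dealt {v:>3}: cumulative switching gain {gains_by_dealt[v]:>12,.0f}")
        print(f"average gain per game: {sum(gains_by_dealt.values()) / trials:.3f}")  # close to 0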
  • Mathematical Conundrum or Not? Number Six
    And some of the time it will be 2v, because it could also be the lower bound. So given that v = 10, the expected value is one of 20, 12.5, or 5. We can be indifferent about this too, in which case we have 1/3 * 20 + 1/3 * 12.5 + 1/3 * 5 = 12.5.Michael

    No. The expected values of switching conditional on v = 10, and on 10 being either at the top, at the bottom, or in the middle of the distribution, aren't merely such that we are indifferent between the three of them. They have known dependency relations between them. Although we don't know what those three possible conditional expected values are, for some given v (such as v = 10), we nevertheless know that, averaged over all the possible values of v in the bounded distribution, their appropriately weighted sum is v rather than 1.25v.
  • Mathematical Conundrum or Not? Number Six
    But I don't know if my envelope contains the upper bound. Why would I play as if it is, if I have no reason to believe so?Michael

    Which is why you have no reason to switch, or not to switch. It may be that your value v is at the top of the distribution, or it may be that it isn't. The only thing that you can deduce for certain, provided that the distribution is bounded, and given that you are entirely ignorant of the probability that v is at the top of the distribution, is that, whatever this probability might be, the average expected value of switching, conditional on having been dealt some random envelope from the distribution, is the same as the average expected value of sticking. Sure, most of the time, the conditional expected value will be 1.25v. But some of the time it will be 0.5v.

Pierre-Normand
