Isn't all of this true whichever of a and b is larger, and whatever their ratio? — Srap Tasmaner
But it isn't logically consistent. With anything. That's what I keep trying to say over and over.
1.25v is based on the demonstrably-false assumption that Pr(X=v/2)=Pr(X=v) regardless of what v is. It's like saying that the hypotenuse of every right triangle is 5 because, if the legs were 3 and 4, the hypotenuse would be 5. — JeffJo
Exp(other) = (v/2)*Pr(picked higher) + (2v)*Pr(picked lower) is a mathematically incorrect formula, because it uses the probabilities of the wrong events.
Exp(other) = (v/2)*Pr(V=v|picked higher) + (2v)*Pr(V=v|picked lower) is the mathematically correct formula, because it uses the probabilities of the correct events.
A pedant would insist you need to include one probability from the probability distribution of whichever you choose. But it divides out so it isn't necessary in practice. — JeffJo
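The difference between the two formulas can be made concrete with a toy prior. The prior below is a made-up assumption purely for illustration; the point is only that Pr(picked higher | V=v) need not equal 1/2, so 1.25v is not the general answer.

```python
from fractions import Fraction

# Hypothetical prior on the smaller amount X of the pair (an assumption
# for illustration only): X = 10 with prob 2/3, X = 20 with prob 1/3.
prior = {10: Fraction(2, 3), 20: Fraction(1, 3)}

def p_picked_higher(v):
    """Pr(picked higher | V = v) by Bayes' theorem, for even v."""
    joint_high = Fraction(1, 2) * prior.get(v // 2, Fraction(0))  # V = 2X
    joint_low = Fraction(1, 2) * prior.get(v, Fraction(0))        # V = X
    return joint_high / (joint_high + joint_low)

def expected_other(v):
    """E[other envelope | V = v], conditioning on the correct events."""
    p = p_picked_higher(v)
    return (v // 2) * p + (2 * v) * (1 - p)

print(p_picked_higher(20))  # 2/3, not 1/2
print(expected_other(20))   # 20, not 1.25 * 20 = 25
```

Here the halved probability of the prior at 20 exactly cancels the doubled payoff, so switching gains nothing; under a different prior the answer would differ again, which is the sense in which 1.25v rests on a hidden assumption.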
And since the OP does not include information relating to this, it does not reside in this "real world." — JeffJo
I'm not sure what "real world" has to do with anything. But... — JeffJo
It'll be fine once we use a MCMC and get the HDI. — Jeremiah
A normal distribution does not have to have a mean of 0, nor does it need to include negative values. — Jeremiah
A normal prior would actually make more sense, as empirical investigations have shown it robust against possible skewness. — Jeremiah
A random variable is defined by a real world function — Jeremiah
(...) That distribution is an unknown function F1(x). After picking high/low with 50:50 probability, the value in our envelope is a new random variable V. Its distribution is another unknown function F2(v), but we do know something about it. Probability theory tells us that F2(v) = [F1(v)+F1(2v)]/2. But it also tells us that the distribution of the "other" envelope, random variable Y, is F3(y) = [F1(y)+F1(2y)]/2. Y is, of course, not independent of V. The point is that it isn't F3(v/2)=F3(v)=1/2, either. — JeffJo
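JeffJo's identity can be checked by enumeration. His definition of F1 is elided above; the check below reads F1 as the probability function of the *larger* amount of the pair, one convention under which F2(v) = [F1(v) + F1(2v)]/2 comes out exactly. The two-point prior is an illustrative assumption.

```python
from collections import defaultdict
from fractions import Fraction

# Hypothetical prior f1 on the LARGER amount x of the pair (x/2, x).
f1 = {20: Fraction(1, 2), 40: Fraction(1, 2)}

# Enumerate: the pair is (x/2, x); we hold high or low with prob 1/2 each.
f2 = defaultdict(Fraction)
for x, p in f1.items():
    f2[x] += p * Fraction(1, 2)       # picked the higher envelope: V = x
    f2[x // 2] += p * Fraction(1, 2)  # picked the lower envelope: V = x/2

# Check the identity f2(v) = [f1(v) + f1(2v)] / 2 pointwise.
for v in f2:
    assert f2[v] == (f1.get(v, 0) + f1.get(2 * v, 0)) * Fraction(1, 2)

print(dict(f2))  # {20: 1/2, 10: 1/4, 40: 1/4}
```

Note that f2 is not uniform even though f1 is, and in particular f2(v/2) and f2(v) are not both 1/2 for any v, which is JeffJo's point about F3 as well.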
Looking in the envelope does not change our role from that of the game player to that of the gamemaster, just like seeing the color of your die does not. Simply "knowing" v (and I use quotes because "treat it as an unknown" really means "treat it as if you know the value is v, where v can be any *single* value in the range of V") does not increase our knowledge in any way.
Claiming this case is "ideal" is an entirely subjective standard pumped full of observational bias. — Jeremiah
I suggested at one point in this thread that if told the value of the other envelope instead of your own, then you would want not to switch; I found this conclusion absurd but my interlocutor did not. Go figure. — Srap Tasmaner
(...) This is the fallacy. You reason from the fact that, given the criterion of success, you would have a 1 in 2 chance of picking the envelope that meets that criterion, to a 1 to 2 chance that the unknown criterion of success is the one your chosen envelope meets. (...) — Srap Tasmaner
No, it isn't fair to say that. No more than saying that the probability of heads is different for a single flip of a random coin, than for the flips of 100 random coins — JeffJo
Why would your ignorance preclude you from facing a choice and making a decision? In the OP, you make at least two choices: which envelope to claim, and whether to keep it or trade it for the other. Whether you end up with the most or the least you could get depends on those two decisions and nothing else. What the amounts are depends on someone else. — Srap Tasmaner
Conversely, the expected gain that the player calculates will still be the unconditional gain of zero since she doesn't know the initial distribution or both amounts in the selected envelope pair. — Andrew M
There are values in envelopes. How they got there can be discussed, and that can be interesting when the player has, say, partial knowledge of that process, but it is not the source of the paradox, in my opinion. — Srap Tasmaner
This makes no sense to me. Initial distribution of what? If these are pairs of envelopes from which will be chosen the pair that the player confronts, then not only is this sample space unknown to the player, she never interacts with it. She will face the pair chosen and no other. — Srap Tasmaner
I'm not sure which of JeffJo's examples you're referring to. — Srap Tasmaner
As for my "tree" and what it predicts -- You face a choice at the beginning between two values, and the same choice at the end between those same two values. If you flip a coin each time, then your expectation is the average of those two values both times and it is unchanged.
Opening an envelope changes things somewhat, but only somewhat. It gives more substance to the word "switch", because having opened one envelope you will never be allowed to open another. You are now choosing not between two envelopes that can be treated as having equal values inside, although they do not, but between one that has a known value and another that cannot be treated as equal in value.
But it is, for all that, exactly the same choice over again, and however many steps there are between beginning and end, your expected take is the average of the values of the two envelopes. If there's an example in which that is not true, I would be surprised.
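The "tree" claim is easy to simulate. With amounts a and 2a (a = 10 here, an arbitrary choice for illustration), a coin flip at the first choice and another at the second leave the expected take at the midpoint of the two values.

```python
import random

random.seed(1)

def play_once(a):
    """One game with amounts a and 2a: a coin flip picks an envelope,
    a second coin flip decides whether to switch."""
    envelopes = [a, 2 * a]
    held = random.choice([0, 1])
    if random.choice([True, False]):  # second flip: switch?
        held = 1 - held
    return envelopes[held]

trials = 100_000
avg = sum(play_once(10) for _ in range(trials)) / trials
print(avg)  # close to (10 + 20) / 2 = 15
```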
The OP asks, "What should you do?"
Think of the problem as being in the same family as Pascal's Wager, involving decision theory and epistemology. — Andrew M
What area of philosophy do you think the significance would obtain? — Janus
Sorry, I'm not following this. This sounds like you think I said your expected gain when you have the smaller envelope is zero, which is insane. — Srap Tasmaner
It's truly remarkable that a question which is of no philosophical significance or interest could generate so many responses on a philosophy forum! — Janus
Here's my decision tree again (...) — Srap Tasmaner
This still looks like you're considering what would happen if we always stick or always switch over a number of repeated games. I'm just talking about playing one game. There's £10 in my envelope. If it's the lower bound then I'm guaranteed to gain £10 by switching. If it's the upper bound then I'm guaranteed to lose £5 by switching. If it's in the middle then there's an expected gain of £2.50 for switching. I don't know the distribution and so I treat each case as equally likely, as per the principle of indifference. There's an expected gain of £2.50 for switching, and so it is rational to switch. — Michael
If we then play repeated games then I can use the information from each subsequent game and switch conditionally, as per this strategy (or in R), to realize the .25 gain.
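The strategy Michael links is elided here, but a standard way to realize a gain from conditional switching is Cover's randomized-threshold rule: switch exactly when the observed value falls below a threshold drawn at random. The sketch below assumes a uniform prior on the smaller amount; the function names and parameters are illustrative, not necessarily those of the linked strategy.

```python
import random

random.seed(0)

def cover_switch(v, threshold):
    """Cover-style rule: switch iff the observed value is below a
    randomly drawn threshold."""
    return v < threshold

def simulate(n=200_000, low=1, high=100):
    """Compare always-stick against the threshold rule on the same games."""
    total_stick = total_cover = 0.0
    for _ in range(n):
        a = random.uniform(low, high)      # assumed prior on smaller amount
        pair = (a, 2 * a)
        held = random.choice([0, 1])
        v = pair[held]
        total_stick += v
        t = random.uniform(low, 2 * high)  # threshold spans possible values
        total_cover += pair[1 - held] if cover_switch(v, t) else v
    return total_stick / n, total_cover / n

stick, cover = simulate()
print(stick, cover)  # the threshold rule averages strictly more
```

The gain comes from the cases where the threshold happens to fall between a and 2a, in which the rule keeps the larger and trades away the smaller; everywhere else it matches a coin flip.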
And some of the time it will be 2v, because it could also be the lower bound. So given that v = 10, the expected value is one of 20, 12.5, or 5. We can be indifferent about this too, in which case we have 1/3 * 20 + 1/3 * 12.5 + 1/3 * 5 = 12.5. — Michael
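Michael's indifference arithmetic checks out exactly; note that the equal 1/3 weights are his assumption via the principle of indifference, not derived probabilities.

```python
from fractions import Fraction

v = 10
# Three exhaustive cases, each weighted 1/3 by assumption:
lower_bound = 2 * v                # v is the bottom: other is certainly 2v = 20
upper_bound = Fraction(v, 2)       # v is the top: other is certainly v/2 = 5
middle = Fraction(1, 2) * (2 * v) + Fraction(1, 2) * Fraction(v, 2)  # 12.5

expected = Fraction(1, 3) * (lower_bound + upper_bound + middle)
print(expected)  # 25/2, i.e. 12.5: an expected gain of 2.5 over sticking
```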
But I don't know if my envelope contains the upper bound. Why would I play as if it is, if I have no reason to believe so? — Michael
No, because the expected gain is $333,334, which is more than my 333,334X. — Michael
But we're just talking about a single game, so whether or not there is a cumulative expected gain for this strategy is irrelevant. — Michael
If it's more likely that the expected gain for my single game is > X than < X then it is rational to switch.
Or if I have no reason to believe that it's more likely that the other envelope contains the smaller amount then it is rational to switch, as I am effectively treating a gain (of X) as at least as likely as a loss (of 0.5X).
The other way to phrase the difference is that my solution uses the same value for the chosen envelope (10) and your solution uses different values for the chosen envelope (sometimes 10 and sometimes 20 (or 5)). — Michael
then I am effectively treating both cases as being equally likely, and if I am treating both cases as being equally likely then it is rational to switch. — Michael
There's also the possibility that £10 is the bottom of the distribution, in which case the expected value for switching is £20. — Michael
Assuming your goal is merely to maximise your expected value, you have no reason to favor switching over sticking. — Pierre-Normand
Which, as I said before, is equivalent to treating it as equally likely that the other envelope contains the smaller amount as the larger amount, and so it is rational to switch.
If there's £10 in my envelope then the expected value for switching is £12.50, and the expected value for switching back is £10. — Michael
I have no way of knowing that my value is "average". Perhaps the 10^100 in my envelope is a puny value because the upper bound is Graham's number. — Michael
My argument is that given how arbitrarily large the numbers in the envelopes can be (using points rather than money), there isn't really a point at which one would consider it more likely that your envelope has the larger value. If my envelope is 10 then it's rational to switch. If it's 1,000 then it's rational to switch. If it's 10^100 then it's rational to switch. — Michael
Sure, the practical limitations of real life play a role, but I wonder if such limitations go against the spirit of the problem. What if instead of money it's points, and the goal of the game is to earn the most points? There isn't really a limit, except as to what can be written on paper, but with such things as Knuth's up-arrow notation, unfathomably large numbers like Graham's number aren't a problem. — Michael
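Knuth's up-arrow notation is easy to define recursively, though only the tiniest arguments are actually computable; Graham's number is far beyond any evaluation.

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a ↑ⁿ b: n = 1 is exponentiation, and each
    higher level iterates the level below it."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

print(up_arrow(3, 1, 3))  # 3^3 = 27
print(up_arrow(3, 2, 3))  # 3↑↑3 = 3^3^3 = 7,625,597,484,987
print(up_arrow(2, 3, 3))  # 2↑↑↑3 = 2↑↑4 = 65,536
```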
So my takeaway is that if it isn't rational to stick then it's rational to switch. — Michael
But this just seems to be saying that there's no reason to believe that it's more likely that the other envelope contains the smaller amount and no reason to believe that it's more likely that the other envelope contains the larger amount and so you're effectively treating each case as equally likely, in which case it would be rational to switch. — Michael
So what's the rational decision if you know that the prior distribution isn't uniform and unbounded? There's £10 in your envelope. Should you stick or switch? — Michael
