If it were your own money at stake here, you shouldn't be playing at all. — Srap Tasmaner
I don't believe that for a moment. I think that the reason science is currently embroiled in what Jim Baggott calls 'fairytale physics' is precisely the complete and total absence of an 'immanent unity'. — Wayfarer
So to make this a better analogy, let's say that some third party asks us both to play the game. He will roll two dice, and if I win then you give me £10 and if I lose then I give you £5. He doesn't tell us what result counts as a win for me and what counts as a win for you. It could be that 2-11 is a win for you, or it could be that 2-11 is a win for me, or it could be that 2-6 is a win for me.
I would be willing to play, as I have more to gain than I have to lose. You, presumably, wouldn't be willing to play, as you have more to lose than you have to gain. — Michael
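Michael's reasoning here reduces to a one-line expected-value calculation: for an unknown win probability p, the bet pays +£10 with probability p and -£5 otherwise, so it is favourable whenever p exceeds 1/3. A minimal sketch (the function name is mine, not from the thread):

```python
# Expected gain of the asymmetric bet: win £10 with probability p, lose £5 otherwise.
def expected_gain(p: float) -> float:
    return 10 * p - 5 * (1 - p)

# Break-even: the bet is favourable whenever p > 1/3.
break_even = 5 / (10 + 5)
print(break_even)           # 0.3333...
print(expected_gain(0.5))   # 2.5 — positive under a 50:50 assumption
```

This is why Michael has "more to gain than to lose": the bet only needs a win probability above one third, not above one half.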
Imagine you're given £100 and are offered the choice to pay £100 to play a game with a 5/6 chance of winning (say a dice roll). If you win then you win £1,000,000 and if you lose then you lose all the money you've won up to that point.
The average return for repeated games is 0, as you're almost certain to lose at some point. But playing it just once? That's worth it.
This is why I think talking about average returns over repeated games is a red herring. — Michael
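The repeated-game claim above can be checked by simulation. The sketch below uses the numbers from the quote; the 200-round cap, the seed, and the trial count are arbitrary choices of mine. Since the chance of surviving n rounds is (5/6)^n, ruin is effectively certain well before 200 rounds.

```python
import random

def play_until_ruin(max_rounds: int, rng: random.Random) -> int:
    """Play the 5/6 game repeatedly; return the bankroll when we stop."""
    bankroll = 100
    for _ in range(max_rounds):
        if rng.random() < 5 / 6:       # win: +£1,000,000
            bankroll += 1_000_000
        else:                          # lose: forfeit everything won so far
            return 0
    return bankroll

rng = random.Random(0)
results = [play_until_ruin(200, rng) for _ in range(1000)]
# Pr(surviving 200 rounds) = (5/6)^200, which is astronomically small:
print(sum(r == 0 for r in results) / len(results))  # ≈ 1.0, ruin in essentially every trial
```

A single play, by contrast, wins with probability 5/6 — which is the asymmetry Michael is pointing at.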
It cannot be equally likely without postulating a benefactor with (A) an infinite supply of money, (B) the capability to give you an arbitrarily-small amount of money, and (C) a way to select a random number uniformly from the set of all integers from -inf to inf.
All three of which are impossible.
But the reason you should reject the solution you use is because it is not a correctly-formed expectation. You are using the probability of picking the smaller value, where you should use the probability that the pair of values is (v,2v) *AND* you picked the smaller, given that you picked v. — JeffJo
How is this any different to saying that I'm equally likely to win as lose? — Michael
No, because I know the probabilities aren't in my favour. If I know that they're not in my favour then I won't play. If I know that they're in my favour then I will play. If I don't know the odds then I will play. — Michael
But you're also not saying that sticking is a winning strategy. If sticking isn't preferred then I am going to switch, because I am willing to risk losing £5 for the chance to win £10. I have more to gain than I have to lose. That neither strategy gains over the other after repeated games doesn't change this. — Michael
If there's no reason to believe that we're more likely to lose than win on switching, i.e. if there's no reason to prefer sticking, and if we can afford to lose, then switching is a good gamble for a single game, even if not a winning strategy over many games. I either lose £5 or I gain £10. That's a bet worth making for me, and so if I were to play this game and find £10 in my envelope then I would switch. — Michael
The Transactional Interpretation of Quantum Mechanics, 2012, p.33 — Wayfarer
What interests me about that article, however, is the idea of 'potentia' as 'real but not actually existing'. 'The unmanifest' was tacked on by me at the end, it might be misleading - that's not the main point of the article. — Wayfarer
So we’re assuming that the other envelope is equally likely to contain either £20 or £5, and that’s a reason to switch. We either lose £5 or gain £10. That, to me, is a reasonable gamble. — Michael
Isn't all of this true whichever of a and b is larger, and whatever their ratio? — Srap Tasmaner
But it isn't logically consistent. With anything. That's what I keep trying to say over and over.
1.25v is based on the demonstrably-false assumption that Pr(X=v/2)=Pr(X=v) regardless of what v is. It's like saying that the hypotenuse of every right triangle is 5 because, if the legs were 3 and 4, the hypotenuse would be 5. — JeffJo
Exp(other) = (v/2)*Pr(picked higher) + (2v)*Pr(picked lower) is a mathematically incorrect formula, because it uses the probabilities of the wrong events.
Exp(other) = (v/2)*Pr(V=v|picked higher) + (2v)*Pr(V=v|picked lower) is the mathematically correct formula, because it uses the probabilities of the correct events.
A pedant would insist you need to include one probability from the probability distribution of whichever you choose. But it divides out so it isn't necessary in practice. — JeffJo
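JeffJo's point can be made concrete with a toy prior. The prior below is my own illustration (not from the thread): the smaller amount is uniform on {5, 10, 20}, so the pairs are (5,10), (10,20), (20,40). The correctly-formed conditional expectation gives 1.25v only for the interior value; at the boundaries the "equally likely" assumption fails, exactly as the 3-4-5 triangle analogy suggests.

```python
from fractions import Fraction

# Hypothetical prior: the smaller amount X is uniform on {5, 10, 20}.
prior = {5: Fraction(1, 3), 10: Fraction(1, 3), 20: Fraction(1, 3)}

def exp_other_given(v):
    """Conditional expectation of the other envelope given that we observe V = v."""
    # Joint weights: pair (v/2, v) and we picked the higher, or pair (v, 2v) and we picked the lower.
    w_higher = prior.get(Fraction(v, 2), Fraction(0)) * Fraction(1, 2)
    w_lower = prior.get(Fraction(v), Fraction(0)) * Fraction(1, 2)
    total = w_higher + w_lower  # the normaliser that, as the pedant notes, divides out
    return (Fraction(v, 2) * w_higher + 2 * v * w_lower) / total

print(exp_other_given(10))  # 25/2 — 1.25v does hold for this interior value
print(exp_other_given(5))   # 10   — must be the smaller; switching surely doubles
print(exp_other_given(40))  # 20   — must be the larger; switching surely halves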
And since the OP does not include information relating to this, it does not reside in this "real world." — JeffJo
I'm not sure what "real world" has to do with anything. But... — JeffJo
It'll be fine once we run an MCMC and get the HDI. — Jeremiah
A normal distribution does not have to have a mean of 0, nor does it need negative values. — Jeremiah
A normal prior would actually make more sense, as empirical investigations have shown it to be robust against possible skewness. — Jeremiah
A random variable is defined by a real world function — Jeremiah
(...) That distribution is an unknown function F1(x). After picking high/low with 50:50 probability, the value in our envelope is a new random variable V. Its distribution is another unknown function F2(v), but we do know something about it. Probability theory tells us that F2(v) = [F1(v)+F1(2v)]/2. But it also tells us that the distribution of the "other" envelope, random variable Y, is F3(y) = [F1(y)+F1(2y)]/2. Y is, of course, not independent of V. The point is that it isn't F3(v/2)=F3(v)=1/2, either. — JeffJo
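JeffJo's identity F2(v) = [F1(v) + F1(2v)]/2 comes out exactly if F1 is read as the distribution of the pair's *larger* amount (that reading is my assumption; the quote leaves it implicit). A sketch with a hypothetical F1, showing both that F2 sums to 1 and that the probability of holding the smaller amount, given v, is not 1/2 in general:

```python
from fractions import Fraction

# Hypothetical distribution for the pair's LARGER amount.
f1 = {10: Fraction(1, 2), 20: Fraction(1, 4), 40: Fraction(1, 4)}

def f2(v):
    """Distribution of V, the amount in our envelope: F2(v) = [F1(v) + F1(2v)]/2."""
    half = Fraction(1, 2)
    return half * f1.get(Fraction(v), Fraction(0)) + half * f1.get(Fraction(2 * v), Fraction(0))

support = [5, 10, 20, 40]
print([(v, f2(v)) for v in support])
print(sum(f2(v) for v in support))  # 1 — a proper distribution

def p_smaller_given(v):
    """Pr(we hold the smaller amount | V = v)."""
    w_small = f1.get(Fraction(2 * v), Fraction(0)) * Fraction(1, 2)  # pair (v, 2v), picked lower
    return w_small / f2(v)

print(p_smaller_given(10))  # 1/3 under this prior — not 1/2
```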
Looking in the envelope does not change our role from that of the game player to that of the gamemaster, just as seeing the color of your die does not. Simply "knowing" v (and I use quotes because "treat it as an unknown" really means "treat it as if you know the value is v, where v can be any *single* value in the range of V") does not increase our knowledge in any way.
Claiming this case is "ideal" is an entirely subjective standard pumped full of observational bias. — Jeremiah
I suggested at one point in this thread that if told the value of the other envelope instead of your own, then you would want not to switch; I found this conclusion absurd but my interlocutor did not. Go figure. — Srap Tasmaner
(...) This is the fallacy. You reason from the fact that, given the criterion of success, you would have a 1 in 2 chance of picking the envelope that meets that criterion, to a 1 in 2 chance that the unknown criterion of success is the one your chosen envelope meets. (...) — Srap Tasmaner
No, it isn't fair to say that. No more than saying that the probability of heads is different for a single flip of a random coin, than for the flips of 100 random coins — JeffJo
Why would your ignorance preclude you from facing a choice and making a decision? In the OP, you make at least two choices: which envelope to claim, and whether to keep it or trade it for the other. Whether you end up with the most or the least you could get depends on those two decisions and nothing else. What the amounts are depends on someone else. — Srap Tasmaner
Conversely, the expected gain that the player calculates will still be the unconditional gain of zero since she doesn't know the initial distribution or both amounts in the selected envelope pair. — Andrew M
There are values in envelopes. How they got there can be discussed, and that can be interesting when the player has, say, partial knowledge of that process, but it is not the source of the paradox, in my opinion. — Srap Tasmaner
This makes no sense to me. Initial distribution of what? If these are pairs of envelopes from which will be chosen the pair that the player confronts, then not only is this sample space unknown to the player, she never interacts with it. She will face the pair chosen and no other. — Srap Tasmaner
I'm not sure which of JeffJo's examples you're referring to. — Srap Tasmaner
As for my "tree" and what it predicts -- You face a choice at the beginning between two values, and the same choice at the end between those same two values. If you flip a coin each time, then your expectation is the average of those two values both times and it is unchanged.
Opening an envelope changes things somewhat, but only somewhat. It gives more substance to the word "switch", because having opened one envelope you will never be allowed to open another. You are no longer choosing between two envelopes that can be treated as having equal values inside (although they do not), but between one that has a known value and another whose value cannot be treated as equal to it.
But it is, for all that, exactly the same choice over again, and however many steps there are between beginning and end, your expected take is the average of the values of the two envelopes. If there's an example in which that is not true, I would be surprised.
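Srap Tasmaner's tree claim is easy to verify algebraically: for a fixed pair (a, b), any strategy that switches with some probability q chosen independently of the contents has expected take (a + b)/2, whatever q is. A minimal sketch:

```python
# For a fixed pair (a, b), a value-blind strategy — stick (q=0), switch (q=1),
# or switch with any probability q — has expected take (a + b) / 2.
def expected_take(a: float, b: float, q: float) -> float:
    """q is the probability of switching, independent of what's inside."""
    # First pick is a or b with probability 1/2 each; switching swaps them.
    return 0.5 * ((1 - q) * a + q * b) + 0.5 * ((1 - q) * b + q * a)

for q in (0.0, 0.3, 1.0):
    print(expected_take(5, 10, q))  # 7.5 every time
```

The terms rearrange to 0.5(a + b)(1 - q) + 0.5(a + b)q = (a + b)/2, so no value-blind rule can beat any other.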
The OP asks, "What should you do?"
Think of the problem as being in the same family as Pascal's Wager, involving decision theory and epistemology. — Andrew M
What area of philosophy do you think the significance would obtain? — Janus
Sorry, I'm not following this. This sounds like you think I said your expected gain when you have the smaller envelope is zero, which is insane. — Srap Tasmaner
It's truly remarkable that a question which is of no philosophical significance or interest could generate so many responses on a philosophy forum! — Janus
Here's my decision tree again (...) — Srap Tasmaner
This still looks like you're considering what would happen if we always stick or always switch over a number of repeated games. I'm just talking about playing one game. There's £10 in my envelope. If it's the lower bound then I'm guaranteed to gain £10 by switching. If it's the upper bound then I'm guaranteed to lose £5 by switching. If it's in the middle then there's an expected gain of £2.50 for switching. I don't know the distribution and so I treat each case as equally likely, as per the principle of indifference. There's an expected gain of £2.50 for switching, and so it is rational to switch. — Michael
If we then play repeated games then I can use the information from each subsequent game and switch conditionally, as per this strategy (or in R), to realize the 0.25 gain.
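The linked strategy code did not survive the quote. One well-known conditional-switching rule it may resemble is Cover's randomized threshold: look in the envelope and switch exactly when the observed amount falls below a randomly drawn threshold. Because the threshold sometimes lands between the two amounts — in which case you always end with the larger — this beats the blind average. A sketch (the exponential threshold and its scale are arbitrary choices of mine; any positive density works):

```python
import random

def cover_switch_trial(a: float, b: float, rng: random.Random) -> float:
    """One game: pick at random, look inside, switch iff below a random threshold."""
    first, other = (a, b) if rng.random() < 0.5 else (b, a)
    threshold = rng.expovariate(1 / 15)  # mean-15 exponential; the scale is arbitrary
    return other if first < threshold else first

rng = random.Random(1)
n = 100_000
take = sum(cover_switch_trial(5, 10, rng) for _ in range(n)) / n
print(take)  # strictly above the blind average of 7.5
```

With amounts 5 and 10 the threshold falls between them often enough that the average take comes out near 8 rather than 7.5.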
And some of the time it will be 2v, because it could also be the lower bound. So given that v = 10, the expected value is one of 20, 12.5, or 5. We can be indifferent about this too, in which case we have 1/3 * 20 + 1/3 * 12.5 + 1/3 * 5 = 12.5. — Michael
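Michael's two indifference calculations, written out as arithmetic (case labels are mine):

```python
# Principle-of-indifference arithmetic with £10 in hand, each case weighted 1/3.
v = 10

cases = {
    "v is the lower bound": 2 * v,                       # other is certainly 20
    "v is in the middle":   0.5 * (v / 2) + 0.5 * (2 * v),  # 12.5
    "v is the upper bound": v / 2,                       # other is certainly 5
}
exp_other = sum(cases.values()) / 3
print(exp_other)        # 12.5

# Equivalently, the expected gain from switching:
gains = [2 * v - v, 2.5, v / 2 - v]  # +10, +2.50, -5
print(sum(gains) / 3)   # 2.5
```

Both routes agree: under indifference across the three cases, switching looks worth £2.50 in expectation.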
But I don't know if my envelope contains the upper bound. Why would I play as if it is, if I have no reason to believe so? — Michael