Then you reject 1, because those are two different values of X. — Srap Tasmaner
If X = 10 and your envelope is worth 10, you have the X envelope. By trading, you gain X. This is the X that matters. For any pair of envelopes, there is a single value of X. (If your envelope was worth 20, you would have the 2X envelope and would lose X by trading.)
If X = 5 and your envelope is worth 10, you have the 2X envelope. By trading, you lose X. (If your envelope was worth 5, you would gain X by trading.)
Every pair of envelopes has one larger and one smaller. You have an even chance of picking the larger or the smaller. If you picked the larger, you can only lose by trading. If you picked the smaller, you can only gain by trading. There is never any case, no matter what envelope you picked from whatever pair, in which you have a non-zero chance of gaining and a non-zero chance of losing. It's always one or the other and never both. — Srap Tasmaner
But it is still false that you have an expected gain of 1/4 the value of your envelope. You really don't. All these justifications for assigning 50% to more possibilities than two envelopes can hold are mistaken. You picked from one pair of envelopes. This is the only pair that matters. You either have the bigger or the smaller. Trading the bigger is a loss, trading the smaller is a gain, and it's the same amount each way. — Srap Tasmaner
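This is easy to check numerically. The sketch below is only an illustration, not anything posted in the thread; it assumes, purely for concreteness, that the smaller amount X is drawn uniformly from {1, ..., 10} (the example Michael uses later on):

```python
import random

def play(always_switch):
    x = random.randint(1, 10)        # smaller amount; uniform prior assumed only for illustration
    envelopes = [x, 2 * x]
    pick = random.randrange(2)       # choose one envelope at random
    if always_switch:
        pick = 1 - pick              # blindly trade for the other envelope
    return envelopes[pick]

n = 1_000_000
keep = sum(play(False) for _ in range(n)) / n
switch = sum(play(True) for _ in range(n)) / n
print(keep, switch)                  # both converge to 1.5 * E[X] = 8.25
```

Both averages settle on the same number: blind switching neither gains nor loses anything in expectation, exactly as argued above.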
*** You might have (3) and (4) a little wrong but I can't judge. The McDonnell & Abbott paper makes noises about the player using Cover's strategy having no knowledge of the PDF of X. — Srap Tasmaner
between the X and 2X envelope amounts — Andrew M
That paper appears to put forward the same position as mine: that always-switching delivers no expected gain, even if the envelope has been opened, but that a strategy based on switching only if the observed amount is less than some pre-selected value delivers a positive expected gain.

Right. The paper Jeremiah linked talks about this too.
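The conditional strategy is just as easy to demonstrate. A rough sketch, again assuming (only to have something concrete to simulate) that the smaller amount X is uniform on {1, ..., 10}, with the player switching exactly when the opened amount falls below a pre-selected cutoff of 10:

```python
import random

def play(cutoff):
    x = random.randint(1, 10)                    # smaller amount; assumed prior
    opened, other = (x, 2 * x) if random.random() < 0.5 else (2 * x, x)
    return other if opened < cutoff else opened  # switch only below the cutoff

n = 1_000_000
print(sum(play(10) for _ in range(n)) / n)       # ~10.0, versus 8.25 for always or never switching
```

Under this prior the cutoff strategy averages about 10 rather than 8.25, so switching conditionally on the observed amount does deliver a positive expected gain even though unconditional switching does not.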
But suppose you don't do this. Suppose you just select some X at random from {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}, put it in one envelope, and then put 2X in the other envelope. I select one at random and open it to see £10. Am I right in saying that {£5, £10} and {£10, £20} are equally likely? — Michael
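For this specific setup the question can be settled by enumeration. A small sketch using exactly those numbers:

```python
from fractions import Fraction

# For each observable amount, accumulate P(holding the smaller envelope) and P(seeing that amount).
cases = {}
for x in range(1, 11):                                   # X uniform on {1, ..., 10}
    weight = Fraction(1, 10) * Fraction(1, 2)            # P(X = x) * P(picking either envelope)
    for amount, holding_smaller in ((x, 1), (2 * x, 0)):
        p_small, p_seen = cases.get(amount, (Fraction(0), Fraction(0)))
        cases[amount] = (p_small + weight * holding_smaller, p_seen + weight)

for amount in sorted(cases):
    p_small, p_seen = cases[amount]
    print(amount, p_small / p_seen)                      # P(hold the smaller | see this amount)
```

For an observed £10 the answer is 1/2, so {£5, £10} and {£10, £20} are indeed equally likely in that case. The same table shows the split is not 50/50 everywhere: any odd amount, or anything above £10, settles which envelope you hold.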
If we don't have this information then we should apply the principle of indifference, which will have us say that {£5, £10} and {£10, £20} are equally likely. — Michael
The argument is sound, so I probably won't spend any more time trying to figure out how to simulate knowing nothing about the sample space and its PDF. — Srap Tasmaner
I did wonder -- maybe a week ago? it's somewhere in the thread -- if there isn't an inherent bias in the problem toward switching because of the space being bounded to the left, where the potential losses are also getting smaller and smaller, but unbounded to the right, where the potential gains keep getting bigger and bigger. — Srap Tasmaner
So, you are willing to risk losing $5 for the chance to gain $10? Regardless of the odds behind that risk?

If there's £10 in my envelope and I know that the other envelope contains either £5 or £20 because I know that one envelope contains twice as much as the other then I have a reason to switch; I want an extra £10 and am willing to risk losing £5 to get it. — Michael
The highlighted assertion is incorrect. First off, "objective probability" means the "likelihood of a specific occurrence, based on repeated random experiments and measurements (subject to verification by independent experts), instead of on subjective evaluations." We have no such repeated measurements, so any assessment of Pr(X=a∣A=a) is subjective.

If he selects it at random from a distribution that includes a/2 and a then the objective probability of X=a∣A=a is 0.5 and the objective probability of 2X=a∣A=a is 0.5. So there would be an objective expected gain. — Michael
The highlighted assertion is incorrect. First off, "objective probability" means the "likelihood of a specific occurrence, based on repeated random experiments and measurements (subject to verification by independent experts), instead of on subjective evaluations." We have no such repeated measurements, so any assessment of Pr(X=a∣A=a) is subjective. — JeffJo
So, you are willing to risk losing $5 for the chance to gain $10? Regardless of the odds behind that risk? — JeffJo
In the OP, you have no way of knowing whether your benefactor was willing to part with more than $10. If all he had was a $5 bill and a $10 bill, then he can't. Your chances of picking Low=$5 or High=$10 were indeed 50% each, but your chances of picking Low=$10 were nil. — JeffJo
Well, I can't address your disagreement unless you explain why you feel that way. That characterization is correct. There may be different ways people express their uncertainty, but it still boils down to the same concept.

In broad terms I do not disagree with that characterisation. — andrewk
What kind of differences are you talking about? There is no single way to express a sample space, and in fact what constitutes an "outcome" is undefined. We've experienced that here: some will use a random variable V that means the value in your envelope, while others represent the same property of the process by the difference D (which is the low value as well).

But there is often more than one way to represent uncertainty, and these lead to different probability spaces. I have referred previously to the observation that in finance many different, mutually incompatible probability spaces can be used to assign a value to a portfolio of derivatives.
Maybe I was mixing Andrews up. I apologize.

And what you seem to be avoiding with that attitude, is that the expectation formula (v/2)/2 + (2v)/2 is already assuming: — JeffJo
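The formula being referred to is the naive conditional expectation for the unopened envelope, given an amount v in the opened one:

$$E[\text{other} \mid v] \;=\; \tfrac{1}{2}\cdot\tfrac{v}{2} \;+\; \tfrac{1}{2}\cdot 2v \;=\; \tfrac{5v}{4} \;>\; v,$$

which is what makes switching look profitable no matter what v is; the two factors of 1/2 are exactly the assumption in dispute.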
I am not an advocate for that expectation formula, so I don't see why you'd think I am avoiding those objections to it.
There is never any case, no matter what envelope you picked from whatever pair, in which you have a non-zero chance of gaining and a non-zero chance of losing — Srap Tasmaner
If I know that the odds are even then I will play. If I know that the odds aren't even then I might be willing to play, depending on the odds. If I don't know the odds then I will play. — Michael
Certainly. It isn't "objective." I thought I made that pretty clear.

So if I pick a number at random from {1, 2, 3, 4, 5, 6, 7, 8, 9, 10} you disagree that the objective probability that I will pick 5 is 0.1? — Michael
And that's the point. You cannot know this in the two envelope problem, when you know what value you are switching from. Unless, of course, you know how the amounts were determined. Do you?

If I know that the odds are even then I will play.
I'm assuming, based on the first sentence here, that you left a "not" out of the second?

If I know that the odds aren't even then I might be willing to play, depending on the odds. If I don't know the odds then I will play.
Yep. Now, do you know how the odds of having only a $5 and a $10 compare to only having a $10 and a $20? No? Then you can't say that the chances of gaining are the same as the chances of losing.

And if all he had was a $10 bill and a $20 bill then my chances of picking High = $10 is nil.
Say "I don't know how to compare the chances of gaining to the chance of losing."So what am I supposed to do if I don't know how the values are selected?
Nope. The PoI applies only if you can establish that there is a causal equivalence between every member of the set to which you apply it. It is specifically inapplicable here, because there cannot be a strategy for filling the envelopes where it is true for an arbitrary value you see.

As I said here, if I don't have any reason to believe that X = 5 is more likely than X = 10 then why wouldn't I switch? I think the principle of indifference is entirely appropriate in this circumstance.
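It is worth spelling out why indifference cannot hold for every observed amount at once, whatever the filling strategy. In the thread's notation, even odds on seeing an amount a require P(X = a) = P(X = a/2), and if that held for every observable amount it would force

$$P(X = a) \;=\; P(X = 2a) \;=\; P(X = 4a) \;=\; \cdots$$

so one and the same positive probability would attach to infinitely many values of X and the total would exceed 1. Any proper prior therefore has amounts at which the odds are not even.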
That is a different question. The point is that the risk is unknowable, and probably not 50%. Whether you think that a $5 loss is acceptable regardless of the risk is a subjective decision only you can make.

There's more to gain than there is to lose, and a loss of £5 is an acceptable risk.
Srap Tasmaner is saying that, to someone who knows what is in *both* envelopes, the possibility of gaining or losing is determined. Michael is saying that, to someone who doesn't see both, the two cases should be treated with probabilities that are >=0, and that add up to 1.
The error is thinking that both must be 50%. Your chance of High or Low is 50% if you don't know the value in the one you chose, but it can't be determined if you do. — JeffJo
You can't know the odds when you look in an envelope and see a value. You can choose to play, not knowing the odds, but your calculation of the expectation is wrong. — JeffJo
On the other hand, I think the right way to look at it is what I've been saying lately:
there are two choices;
the host's choice determines how much can be gained or lost by switching;
the player's choice determines whether they gain or lose. — Srap Tasmaner
If you don't look, the two envelopes have the same expected value. If you do, there is not enough information to say how the probabilities split between having the higher, or lower, value. — JeffJo