What's the probability that the two envelopes are filled with {5,10} and the probability that the two envelopes are filled with {10,20}? You know that you're in one of these states given that you just saw 10. — fdrake
If you did, that statement was completely obfuscated by your sloppy techniques. And it isn't the argument that is flawed; it is the assumption that you know the distribution, as shown by the points you have ignored in my posts.

I specifically said that my simulation is not about finding expected results, as that entire argument is flawed.
The thing to notice here is that you can't use your "X" - the lower value - as a condition unless you see both envelopes, which renders analysis pointless.

The thing to notice here is that in all cases the absolute value of the difference between column one and column two is always equal to the lesser of the two (save rounding errors). The lesser of the two is X.
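The simulation itself isn't shown in the thread, but the two-column observation above can be sketched like this (function and parameter names are mine, and integer amounts are assumed for simplicity):

```python
import random

def fill_envelopes(n_trials=10, max_x=100):
    """Simulate envelope pairs: one holds X, the other 2X,
    shuffled into column one and column two."""
    rows = []
    for _ in range(n_trials):
        x = random.randint(1, max_x)   # the lower value X
        pair = [x, 2 * x]
        random.shuffle(pair)           # random assignment to columns
        rows.append(pair)
    return rows

# |column one - column two| = |X - 2X| = X, the lesser of the two.
for a, b in fill_envelopes():
    assert abs(a - b) == min(a, b)
```

With integer amounts there are no rounding errors, so the identity holds exactly on every row.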
I pointed out a few times that the sample space of event R and the sample space of event S are equal subsets of each other, which means mathematically we can treat them the same.
This is unintelligible.

R and S are the events that you have X or 2X when you get A. By definition of A and B, if A = X then B = 2X, and if A = 2X then B = X. So by definition, if A equals X then B cannot also equal X.
This may be what you were referring to when talking about "Y." It still refers to ill-defined sets, not distributions.

I also pointed out that as soon as you introduce Y they are no longer equal subsets and therefore mathematically cannot be treated the same.
But if the host selects the value any other way then the objective probabilities will differ. However, assuming the participant doesn't know how the values are selected, he'll just apply the principle of indifference and assume a probability of 0.5 each. — Michael
Right. So there are two cases.
A: One envelope is filled with X and the other 2X.
B: One envelope is filled with X and the other X/2.
Knowing that your envelope has X=10 doesn't let you distinguish between the two cases, right? Same for any value of X. — fdrake
In case A you're assigned either X or 2X.
In case B you're assigned either X or X/2. — fdrake
Informally: you have X, yes, but you don't know whether it really is X or 2X.
You see U, you don't know whether it is [X or 2X] or [X/2 or X], so you don't know if you gain by switching.
I hope I'm not talking down; that isn't my point. But probability really needs great care with terminology, and it has been noticeably lacking in this thread. This question is a result of that lack.

You're doing it again. Is X the value of my envelope or the value of the smallest envelope? You're switching between both and it doesn't make any sense, especially where you say "if you have the lower value envelope X/2".
Now, if we don't see a value in an envelope, we know that v-w must be either +x or -x, with a 50% chance of either. So switching can't help. The point to note is that we don't know what value we might end up with; it could be anything in the full range of V.
We can ask whether a probability space appears to help towards achieving its aim, but it makes no sense to ask whether a probability space is correct. — andrewk
Even in that more general case, the Bayesian approach can give a switching strategy with a positive expected net gain. Based on our knowledge of the world - eg how much money is likely to be available for putting in envelopes - we adopt Bayesian priors for U and V that are iid. We can use the priors to calculate the expected gain from switching as a function of the observed amount Y. That gain function will be a function that starts at 0, increases, reaches a maximum then decreases, going negative at some critical point and staying negative thereafter. The strategy is to switch if the observed amount Y is less than that critical point. — andrewk
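andrewk's gain function can be estimated by Monte Carlo. The prior here (X uniform on the integers 1 to 50) is purely illustrative and not anything specified in the thread; the function name is mine:

```python
import random
from collections import defaultdict

def gain_by_observed(n=200_000, seed=0):
    """Estimate E[gain from switching | observed amount Y] under an
    assumed prior: X uniform on the integers 1..50 (illustrative)."""
    random.seed(seed)
    gains = defaultdict(list)
    for _ in range(n):
        x = random.randint(1, 50)
        envelopes = [x, 2 * x]
        y = random.choice(envelopes)           # amount we observe
        other = envelopes[0] + envelopes[1] - y
        gains[y].append(other - y)             # gain if we switch
    return {y: sum(g) / len(g) for y, g in gains.items()}

g = gain_by_observed()
# Amounts above 50 can only be 2X under this prior, so switching
# there is a guaranteed loss; smaller amounts favour switching.
```

Under this uniform prior the estimated gain is positive for every observable amount up to 50 and negative beyond it, so the critical point sits at the top of the prior's support; a prior that decays gradually instead would shift and smooth that crossover.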
Indeed, and that's where utility curves come in. If a parent has a child who will die unless she can get medicine costing M, and the parent can only access amount F, the parent should switch if the observed amount is less than M-F and not switch otherwise.

You could also use different loss functions rather than raw expected loss to leverage other contextual information, but I don't see any useful way of doing that here. — fdrake
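That decision rule is simple enough to state directly as a toy function (names are mine, not from the thread):

```python
def should_switch(observed, medicine_cost, accessible_funds):
    """Rule from the quote above: switch only while the observed
    amount still leaves the parent short of the medicine cost."""
    return observed < medicine_cost - accessible_funds

# With M = 100 and F = 50 the threshold is 50:
assert should_switch(30, 100, 50)       # 30 < 50: still short, so switch
assert not should_switch(60, 100, 50)   # 60 covers the shortfall, so keep
```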
Yes, if we assume a uniform distribution for X on the interval [1,M]. If we assume a more shaped distribution that decays gradually to the right then it will be something different. A gradually changing distribution would be more realistic, because it would be strange to say that the probability density of choosing X=x is constant until we reach M and then suddenly plunges to zero. The calculations get messier and harder to discuss without long equations if we use fancy distributions (such as beta distributions or truncated lognormals) rather than a simple uniform distribution. But they can be done.

That critical point is going to be the highest X that can (will?) be selected by the host, correct? — Michael
Each probability space serves a different purpose. We can ask whether a probability space appears to help towards achieving its aim, but it makes no sense to ask whether a probability space is correct. — andrewk
Indeed, and that's where utility curves come in. If a parent has a child who will die unless she can get medicine costing M, and the parent can only access amount F, the parent should switch if the observed amount is less than M-F and not switch otherwise. — andrewk
It absolutely makes sense to ask if it is correct, and that should be the first question you ask yourself whenever you model something. — Jeremiah
By this do you just mean that if we know that the value of X is to be chosen from a distribution of 1 - 100 then if we open our envelope to find 150 then we know not to switch? — Michael
Unfortunately, we don't (and can't) know the probabilities that remain. For some values of v, it may be that you gain by switching; but then for some others, you must lose. The average over all possible values of v is no gain or loss.
What you did was assume Pr(X=v/2) = Pr(X=v) for every value of v. That can never be true. — JeffJo
You're right that seeing $2 tells you the possibilities are {1,2} and {2,4}. But on what basis would you conclude that about half the time a participant sees $2 they are in {1,2}, and half the time they are in {2,4}? That is the step that needs to be justified. — Srap Tasmaner
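A quick check of why that step needs justifying: the half-and-half split after seeing $2 only holds if the prior over the lower value happens to make it hold. A sketch, with prior choices that are mine rather than the thread's:

```python
import random

def posterior_low_pair(prior, n=100_000, seed=1):
    """Estimate P(pair is {1,2} | participant sees $2), given a
    prior over the lower value X (a dict mapping value -> probability)."""
    random.seed(seed)
    values, weights = zip(*prior.items())
    low = saw2 = 0
    for _ in range(n):
        x = random.choices(values, weights=weights)[0]
        y = random.choice([x, 2 * x])      # amount the participant sees
        if y == 2:
            saw2 += 1
            low += (x == 1)                # pair was {1,2}
    return low / saw2

# A uniform prior over X in {1, 2} gives roughly a 50/50 split,
# but a skewed prior moves the posterior with it.
p_uniform = posterior_low_pair({1: 0.5, 2: 0.5})
p_skewed = posterior_low_pair({1: 0.9, 2: 0.1})
```

Since seeing $2 is equally likely from either pair, the posterior just tracks the prior, which is exactly the quantity the Always Switch argument assumes without justification.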
Essentially, we include the impossible values that may come up in calculations in the range, and make them impossible in the probability distribution. — JeffJo
Unfortunately, we don't (and can't) know the probabilities that remain. For some values of v, it may be that you gain by switching; but then for some others, you must lose. The average over all possible values of v is no gain or loss. — JeffJo
So then the puzzle is what to do about the Always Switch argument, which appears to show that given any value for an envelope you can expect the other envelope to be worth 1/4 more, so over a large number of trials you should realize a gain by always switching. This is patently false, so the puzzle is to figure out what's wrong with the argument. — Srap Tasmaner
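That the Always Switch conclusion is patently false is easy to confirm by simulation (a minimal sketch, with an arbitrary uniform choice of X; names are mine):

```python
import random

def always_switch_trials(n=100_000, seed=2):
    """Compare the average value of keeping versus always switching
    over many independent envelope trials."""
    random.seed(seed)
    kept = switched = 0
    for _ in range(n):
        x = random.randint(1, 100)
        envelopes = [x, 2 * x]
        mine = random.choice([0, 1])       # envelope initially assigned
        kept += envelopes[mine]
        switched += envelopes[1 - mine]    # what always switching yields
    return kept / n, switched / n

keep_avg, switch_avg = always_switch_trials()
# Both averages converge on 1.5 * E[X]; always switching gains nothing.
```

The two averages agree to within sampling noise, so the puzzle is indeed to locate the flaw in the argument rather than in the outcome.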
And I know I am just speaking in the wind at this point — Jeremiah