You then have a choice: walk away with the $Y or return
the envelope to the table and walk away with whatever is in the other
envelope. What should you do?
There's more to gain than there is to lose by switching. — Michael
The intuitive solution is that the bigger the amount in your envelope is, the more likely it is to be the one with 2X. — BlueBanana
It looks like you're assuming the player's utility function is the identity function, which would be unusual and unrealistic. Even if we assume that, the quoted calculation doesn't take account of all the available information. There is new information, which is the known dollar value in the opened envelope. That changes the utility calculation. — andrewk
That's exactly what I did here and here. If in each game we see that we have £10, we win more by switching than by not switching. — Michael
To see this, suppose you do the experiment 100 times where you always switch. 50 times (on average) you choose the $2X envelope and switch to $1X. So you earn $50X. 50 times you choose the $X envelope and switch to $2X. So you earn $100X. In total, you earn $150X.
Now you do the same experiment 100 times where you never switch. 50 times you choose the $2X envelope and don't switch. So you earn $100X. 50 times you choose the $X envelope and don't switch. So you earn $50X. In total, you earn $150X. — Andrew M
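Andrew M's counting argument can be checked directly by simulation. A minimal sketch in Python (the thread's later code uses R; the trial count, seed, and `x = 1` here are arbitrary choices of mine):

```python
import random

def total_earnings(trials, switch, x=1, seed=0):
    """Total winnings over repeated plays, always or never switching."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        envelopes = [x, 2 * x]   # one envelope holds X, the other 2X
        rng.shuffle(envelopes)   # the player picks one at random
        chosen, other = envelopes
        total += other if switch else chosen
    return total

trials = 100_000
always_switch = total_earnings(trials, switch=True)
never_switch = total_earnings(trials, switch=False)
# Both averages sit near 1.5X per game, matching the $150X-per-100-games total.
print(always_switch / trials, never_switch / trials)
```

Both strategies average about 1.5X per game, as the 50/50 counting predicts.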
It's necessary to distinguish between two cases: whether or not we know the distribution of X, i.e. the distribution of the lower of the two values.

As Jeremiah points out, your code doesn't reflect the problem in the OP. Before an envelope is picked, the expected value of each envelope is the same. — Andrew M
Like Baden, you're conflating different values of X. — Michael
Given a starting envelope of $10 — Michael
In the absence of knowing the distribution of X, any calculations based on expected values prior to opening the envelope are meaningless and wrong. — andrewk
You can't assume you have a starting envelope of, for example, $10, and that the other envelope has either $5 or $20. — Andrew M
The OP instead assumes that you have two envelopes of, for example, $10 and $20, and that you randomly choose one of them. So half the time, the starting envelope would have $10 and half the time the starting envelope would have $20.
Over 100 runs with a switching strategy, you would switch from the $20 to the $10 envelope 50 times (earning $500) and switch from the $10 to the $20 envelope 50 times (earning $1000) for a total of $1500.
Over 100 runs with a keeping strategy, you would keep the $20 envelope 50 times (earning $1000) and keep the $10 envelope 50 times (earning $500) for a total of $1500.
So you earn $1500 (on average) on either strategy. — Andrew M
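The counts in this example can be written out as a one-line check (a trivial Python sketch of the arithmetic above; the variable names are mine):

```python
# Exact bookkeeping for the 100-run example: envelopes of $10 and $20,
# with each envelope picked first in 50 of the runs.
runs_picked_20 = 50
runs_picked_10 = 50

# Always switch: trade the $20 down to $10, or the $10 up to $20.
switch_total = runs_picked_20 * 10 + runs_picked_10 * 20

# Never switch: keep whatever was picked.
keep_total = runs_picked_20 * 20 + runs_picked_10 * 10

print(switch_total, keep_total)  # 1500 1500
```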
So your example of knowing the contents of both envelopes isn't at all analogous to the example of only knowing what's in the one we have. — Michael
In the absence of knowing the distribution of X, any calculations based on expected values prior to opening the envelope are meaningless and wrong. Since the claimed paradox relies on such calculations, it dissolves. — andrewk
```
       {1,2}   {2,4}
Big     1/4     1/4
Small   1/4     1/4
```
But there are not two random events or choices here; there is only one. You are re-using the choice between larger and smaller as the relative frequency of {1,2} and {2,4}. For all you know, {1,2} could be a hundred times more likely than {2,4}. In my version here, {1,2} has a chance of 0, and {2,4} a chance of 1. — Srap Tasmaner
Or does it make a difference if he knows beforehand that the experimenter has flipped a fair coin to determine which of {1, 2} and {2, 4} is to be used? — Michael
Surely in lieu of any evidence to suggest that one of {1, 2} and {2, 4} is more likely he should consider their probabilities equal? — Michael
…could show why even the uninformative prior does not lead directly to your conclusion. — Srap Tasmaner
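One way to unpack this is to condition on the amount actually observed, under an explicit prior over the two pairs. A Python sketch (the prior parameter `p` for the pair {1,2} is my notation, not the thread's):

```python
from fractions import Fraction

def expected_gain_given(y, p):
    """Conditional expected gain from switching, given the observed amount y.

    The pair {1,2} is used with prior probability p, {2,4} with 1 - p;
    within a pair, each envelope is held first with probability 1/2.
    """
    p = Fraction(p)
    cases = []  # (probability weight, gain if we switch), for cases showing y
    for pair, prob in (((1, 2), p), ((2, 4), 1 - p)):
        for held in pair:
            if held == y:
                other = pair[0] + pair[1] - held
                cases.append((prob * Fraction(1, 2), other - held))
    total = sum(w for w, _ in cases)
    return sum(w * g for w, g in cases) / total

half = Fraction(1, 2)
print(expected_gain_given(1, half))  # +1: 1 only occurs as the smaller of {1,2}
print(expected_gain_given(2, half))  # +1/2: the ambiguous amount
print(expected_gain_given(4, half))  # -2: 4 only occurs as the larger of {2,4}
```

With p = 1/2 the player should indeed switch on seeing 2, but seeing 1 or 4 settles the matter, and averaging over everything the player might see gives p/2·(+1) + p/2·(−1) + (1−p)/2·(+2) + (1−p)/2·(−2) = 0, so the always-switch strategy gains nothing overall.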
You could have X or 2X. If you have X and you switch then you get 2X but lose X, so you gain X: a net +X. However, if you have 2X and switch then you gain X but lose 2X: a net -X. — Jeremiah
Recall that event L is when you start with 2X and event K is when you start with X.
Since we don't know which it is upon seeing the money, we will treat this as an uninformative prior and give each a fair 50% likelihood. Then our sample space is [K, L].
In the event of L, our gain/loss sample space is [-X, 0].
In the event of K, our gain/loss sample space is [X, 0].
That is the same even if you go the 1/2 route.
Let's try running a simulation on that structure.
```r
K <- c("X", 0)
L <- c("-X", 0)
q <- c(K, L)
w <- sample(q, 10000, replace = TRUE)
sum(w == "X")
sum(w == "-X")
sum(w == 0)
```

The results are:

```
X: 2528
-X: 2510
0: 4962
```
— Jeremiah
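Jeremiah's simulated frequencies line up with the exact expectation over his four equally likely outcomes (a quick check in Python, with gains measured in units of X):

```python
from fractions import Fraction

# Jeremiah's combined sample space: K = [X, 0] and L = [-X, 0],
# giving four equally likely outcomes. Gains are in units of X.
outcomes = [1, 0, -1, 0]
expected_gain = sum(Fraction(1, 4) * g for g in outcomes)
print(expected_gain)  # 0
```

The simulation's net gain, 2528 − 2510 = 18 units over 10,000 draws, is consistent with this zero expectation.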