So you're saying that before I look I can say that there's a 50% chance that my envelope is envelope X but after looking I can't? — Michael
Suppose X = 1.
You pick an envelope. On opening it, you find $2. You have chosen the 2X envelope but you don't know it.
Does that mean there is a 1/2 chance that X = 2?
No. X = 1. It's just not true that half the time 2 = 1 and half the time it doesn't.
Your not knowing whether you have the X or the 2X envelope doesn't change anything. — Srap Tasmaner
If you could prove that always switching is the best strategy over the long term, doesn't that amount to proving that you are more likely to have chosen the smaller envelope? Why doesn't that bother you? — Srap Tasmaner
That's a 2:1 payout. — Michael
1. We pick an envelope at random
2. There's a 50% chance that my envelope is the X envelope and a 50% chance that my envelope is the 2X envelope.
3. I open my envelope and see £10
4. From 2 and 3, there's a 50% chance that my £10 envelope is the X envelope and a 50% chance that my £10 envelope is the 2X envelope. — Michael
I'm just doing this:
1. We pick an envelope at random
2. There's a 50% chance that my envelope is the X envelope and a 50% chance that my envelope is the 2X envelope.
3. I open my envelope and see £10
4. From 2 and 3, there's a 50% chance that my £10 envelope is the X envelope and a 50% chance that my £10 envelope is the 2X envelope.
5. From 4, there's a 50% chance that the other envelope contains £20 and a 50% chance that the other envelope contains £5. — Michael
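To make the arithmetic behind step 5 explicit, here is a minimal PHP sketch of the accounting Michael is using, taking the 50/50 assumption at face value (whether that assumption is legitimate is exactly what the replies below dispute):

<?php

// Michael's accounting: the opened envelope holds £10, and the other is
// assumed to hold £5 or £20 with equal probability.
$mine     = 10;
$expected = 0.5 * ($mine / 2) + 0.5 * (2 * $mine);   // 0.5 * £5 + 0.5 * £20

echo $expected;   // 12.5, which is why switching looks favourable on this accounting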
Check my previous post – I explained why this reasoning is fallacious. — Snakes Alive
There is no step that's wrong. — Michael
No, your conclusion doesn't establish what you want it to (that switching is more profitable). — Snakes Alive
What I am saying is that as a strategy, switching does not increase your chances of earning more money regardless of how many times the game is played. — Snakes Alive
Your average earnings are the same regardless of whether you switch or not, and regardless of how many times the game is played.
But take one game in isolation? You have a 2:1 payout with an even chance of winning. That's a bet worth making.
So in our case, I have £10 and the other envelope contains either £5 or £20. There's an even chance of winning, but the payout is 2:1. — Michael
This reasoning is fallacious. Did you read the post? — Snakes Alive
The fallacy is this: there is some value, X, determined in each case, yet you are acting as if there is an independent variable Y, viz. what you drew first, and as if switching will get you either 0.5Y or 2Y, so that on average the "bet" is worth taking, since your average payout is then 1.25Y.
This is fallacious, because there is no such variable: Y is defined in terms of X (either it is X, or 2X), and across the two scenarios you're averaging, the value of Y changes. Hence, there is no single value of Y across the two situations, and the value 1.25Y is a chimera. — Snakes Alive
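A small sketch of the point about Y being defined in terms of X: the two branches of the "£5 or £20" average come from two different envelope pairs, so X is not the same value in both (the £10 figure is just the amount from the running example):

<?php

$seen = 10;   // the amount in the opened envelope

// Branch 1: the £10 envelope is the X envelope, so the pair is (X, 2X) = (10, 20).
$pair_if_smaller = [$seen, 2 * $seen];

// Branch 2: the £10 envelope is the 2X envelope, so the pair is (X, 2X) = (5, 10).
$pair_if_larger = [$seen / 2, $seen];

print_r($pair_if_smaller);   // 10 and 20: here X = 10
print_r($pair_if_larger);    // 5 and 10: here X = 5

// Averaging "0.5Y" from one branch with "2Y" from the other mixes payouts
// drawn from these two different setups.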
On average you will break even, but if we just consider a single game then a 2:1 payout with an even chance of winning is a bet worth making. — Michael
This cannot be. If you define Y in terms of X, this illusion disappears. Thus, as you said, you either drew X or 2X. Therefore, there's a 50% chance that switching gets you X, and a 50% chance that it gets you 2X. Likewise if you stay. The average payout for either is 1.5X. Switching does not offer favorable odds. — Snakes Alive
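Here is the accounting Snakes Alive proposes instead, sketched in PHP with the pair fixed as (X, 2X) and only the draw left random (X = 10 is just an illustrative value):

<?php

$X = 10;   // the smaller amount, fixed before you choose

// The two equally likely draws, and what keeping or switching pays in each.
$keep_if_drew_small = $X;        $switch_if_drew_small = 2 * $X;
$keep_if_drew_large = 2 * $X;    $switch_if_drew_large = $X;

$avg_keep   = 0.5 * $keep_if_drew_small   + 0.5 * $keep_if_drew_large;    // 1.5 * X
$avg_switch = 0.5 * $switch_if_drew_small + 0.5 * $switch_if_drew_large;  // 1.5 * X

echo $avg_keep . ' vs ' . $avg_switch;   // 15 vs 15: no advantage either way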
<?php

$switch = $no_switch = 0;

for ($i = 1; $i <= 1000000; ++$i) {
    // set X
    $X = random_int(1, 100);

    // randomly select whether you get the X or the 2X
    $choice = random_int(0, 1) ? 1 : 2;

    // if you swap, you get the other one, duh
    $swap = 1 + $choice % 2;

    // If we switch
    $switch += $X * $swap;

    // If we don't switch
    $no_switch += $X * $choice;
}

echo 'Switch: £' . number_format($switch) . PHP_EOL;
echo 'No Switch: £' . number_format($no_switch);
Your reasoning switches the value of X across the two scenarios. — Snakes Alive
The very definition of probability is the frequency of occurrences over repeated random events. — Jeremiah
So in your example you've considered repeating the game with the same pair of envelopes: half the time the £10 envelope is picked and half the time the £20 envelope is picked. In my example I've considered repeating the game with the same starting envelope: half the time the other envelope contains £5 and half the time it contains £20.
Is there some rule of statistics that says that one or the other is the proper way to assess the best strategy for a single instance of the game (where you know that there's £10 in your envelope)?
I would have thought that if we want to know the best strategy given the information we have, then the repeated games we consider must give us that same information, with the variation being in the possible unknowns – which is what my example does.
That doesn't model the case we're considering, which is that we know we have £10. — Michael
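For comparison, here is a sketch of the repetition Michael describes: the opened envelope always contains £10, and the other envelope holds £5 or £20 with equal probability. Whether this, rather than the earlier simulation, is the right way to model the single game is precisely what is in dispute.

<?php

$switch = $no_switch = 0;

for ($i = 1; $i <= 1000000; ++$i) {
    // the opened envelope always contains £10
    $mine = 10;

    // half the time the other envelope contains £5, half the time £20
    $other = random_int(0, 1) ? 5 : 20;

    $switch    += $other;
    $no_switch += $mine;
}

echo 'Switch: £' . number_format($switch) . PHP_EOL;
echo 'No Switch: £' . number_format($no_switch);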