P(Tuesday|Awoken) = (P(Awoken|Tuesday) / P(Awoken)) * P(Tuesday)
Sleeping Beauty is awoken with probability 3/4 on an average day (Monday or Tuesday). On Tuesdays, she is awoken with P = 1/2. Therefore, P(Awoken|Tuesday) / P(Awoken) = (1/2)/(3/4) = 2/3.
This (2/3) is the Bayesian updating factor. The unconditioned (prior) probability that it is Tuesday is 1/2. The updated probability therefore is P(Tuesday|Awoken) = (2/3)*(1/2) = 1/3, as expected. — Pierre-Normand
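That update can be checked with a quick Monte Carlo sketch. It assumes the setup as quoted: Monday awakenings are certain and Tuesday awakenings happen with probability 1/2, so the average awakening probability is 3/4:

```python
import random

random.seed(0)

tuesday_awakenings = 0
total_awakenings = 0

for _ in range(100_000):
    # Each trial is one "day", Monday or Tuesday with equal probability.
    day = random.choice(["Monday", "Tuesday"])
    # She is always awoken on Monday; on Tuesday only with probability 1/2,
    # giving the 3/4 average awakening probability from the quote.
    awoken = (day == "Monday") or (random.random() < 0.5)
    if awoken:
        total_awakenings += 1
        if day == "Tuesday":
            tuesday_awakenings += 1

print(tuesday_awakenings / total_awakenings)  # ≈ 1/3
```

The fraction of awakenings that fall on Tuesday comes out near 1/3, matching the Bayesian calculation above.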
This is not a probability. It's a ratio of probabilities. The updated probability P(Heads|Awoken) is 2/3. The quoted ratio being larger than one just reflects the fact that Bayesian updating results in a probability increase in this case. — Pierre-Normand
It makes it twice as likely that individual bets are winning bets. Right? Likewise in Sleeping Beauty's problem, the fact that she is being awoken twice when the coin lands heads makes it more likely that a randomly selected awakening is the result of a coin having landed heads. — Pierre-Normand
After being woken up, which of these is the most rational consideration?
1. The coin almost certainly didn't land heads 100 times, and so this is most certainly my first and only interview, or
2. If this experiment was repeated 2^100 times then the total number of interviews after the coin landed heads 100 times is greater than the total number of interviews after it didn't, and so if I was to pick an interview at random from that set then there is a greater probability that that interview would have followed the coin landing heads 100 times.
I think the first is the most (and only) rational consideration.
[Although] the second is true ... given that the experiment isn't conducted by picking an interview at random from that set and dropping Sleeping Beauty into it, it's also irrelevant.
P(Selected|Heads) / P(Selected) is 2/1.5 = 4/3. — Pierre-Normand
You're inviting us to imagine ourselves in Sleeping Beauty's shoes to support the halfer position. — Pierre-Normand
To make this scenario more directly analogous to the original problem, let's imagine that Sleeping Beauty, upon each awakening, can not only express her belief about the coin toss but also place a bet on it. In the long run, she would profit from taking the bet as a thirder, further reinforcing this position. — Pierre-Normand
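That betting claim can be simulated. This is a sketch assuming the variant under discussion in this thread: two awakenings when the coin lands heads, one when it lands tails, and an even-odds bet on heads placed at every awakening:

```python
import random

random.seed(1)

bets_won = 0
bets_placed = 0

for _ in range(100_000):
    heads = random.random() < 0.5
    # Variant under discussion: two awakenings on heads, one on tails.
    for _ in range(2 if heads else 1):
        bets_placed += 1          # each awakening places an even-odds bet on heads
        if heads:
            bets_won += 1

print(bets_won / bets_placed)  # ≈ 2/3 of the bets win
```

About two thirds of the per-awakening bets win, so betting on heads at even odds is profitable in the long run, as the thirder position predicts for this variant.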
Your two definitions of E[z] aren't equivalent. — sime
Yes, I see that. So why are you redefining y? — sime
No covert redefinitions of y are happening — sime
Its contradictory expectation values don't appeal to faulty reasoning given acceptance of the premises. — sime
Rather the switching argument is unsound, for among its premises is an improper prior distribution over x, the smallest amount of money in an envelope. And this premise isn't possible in a finite universe.
Intuitively, its contradictory conclusions make sense; if the smallest amount of money in an envelope could be any amount of money, and if the prior distribution over the smallest amount of money is sufficiently uniform, then whatever value is revealed in your envelope, the value of the other envelope is likelier to be higher. — sime
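One way to see the role of the improper prior is to replace it with a proper one and simulate. This sketch assumes a uniform prior on £1–£100 for the smaller amount; under any such bounded, normalisable prior, blindly switching envelopes yields no long-run advantage:

```python
import random

random.seed(2)

keep_total = 0
switch_total = 0

for _ in range(100_000):
    # Proper (bounded, normalisable) prior on the smaller amount x.
    x = random.randint(1, 100)
    envelopes = [x, 2 * x]
    random.shuffle(envelopes)   # pick one envelope at random
    mine, other = envelopes
    keep_total += mine
    switch_total += other

print(keep_total / switch_total)  # ≈ 1.0: blind switching gains nothing
```

The long-run totals for keeping and switching agree, so the paradoxical "always switch" conclusion depends on the improper prior, not on the switching itself.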
You can be sure that the expected value for the other envelope is 5/4 of that of the one you have. — SolarWind
If I understand right, if the coin is heads 100 times, she wakes up on Monday and is not woken up on Tuesday. If the coin is not heads 100 times, she wakes up on Monday and Tuesday? Then the experiment ends. — fdrake
Michael - you ruined my mind again god damnit. — fdrake
I'll analyse that case if you can describe it very specifically. Like in the OP. — fdrake
if it’s heads 100 times in a row I wake you up 2^101 times, otherwise I wake you up once
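Under that setup the thirder-style fraction can be computed exactly rather than simulated. The numbers below simply restate the quoted parameters (2^101 awakenings after 100 heads in a row, one awakening otherwise):

```python
from fractions import Fraction

p_rare = Fraction(1, 2) ** 100    # probability of 100 heads in a row
rare_awakenings = 2 ** 101        # awakenings if that happens
common_awakenings = 1             # awakenings otherwise

# Expected number of awakenings of each kind, per run of the experiment.
rare = p_rare * rare_awakenings
common = (1 - p_rare) * common_awakenings

share = rare / (rare + common)    # fraction of awakenings preceded by 100 heads
print(float(share))               # just over 2/3
```

Despite the 100-heads outcome having probability 2^-100, it accounts for roughly two thirds of all awakenings, because 2^101 awakenings exactly outweigh the ~2^100 single awakenings from the other outcomes.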
Because SB wakes up more on tails, a given wake up event is more likely to be caused by a tail flip than a head flip. — PhilosophyRunner
the probability of you seeing heads when you wake up is conditional on how often you wake up for heads and how often for tails — PhilosophyRunner
I was talking about frequency not probability. — PhilosophyRunner
In the SB problem it is 1 for heads and 2 for tails. — PhilosophyRunner
Where have I gone wrong? — Srap Tasmaner
In that case it is more likely that given an instance I wake up I will see the coin has been flipped heads 100 times in a row. — PhilosophyRunner
I flip a coin and if it lands heads I wake you up tomorrow, if it lands tails I never wake you up. If you wake up and are asked the probability the coin landed heads, what would you say? — PhilosophyRunner
She is more likely to wake up and see a coin showing tails, as she will wake up more often if the coin lands on tails. — PhilosophyRunner
Given a set {10, 20}, the expected value of a number selected from that set is 15. There's nothing wrong with your first set of equations, and it gives the right answer. You don't have to go through all that; you just need the average. — Srap Tasmaner
(1) What are the chances that y = x and the chances that y = 2x if y is chosen randomly from a set {x, 2x}? (You may, if you like, write it backwards as x = y and x = y/2.)
(2) What are the chances that a y chosen randomly from a set {x, 2x} was chosen from a set {y, 2y} and the chances it was chosen from a set {y/2, y}? — Srap Tasmaner
We can only consider this from the perspective of the participant, who only knows that one envelope contains twice as much as the other and that he picked one at random. His assessment of P(Smaller) and P(Larger) can only use that information.
Is it correct that, given what he knows, P(Smaller | £10) = 1/2?
Is it correct that, given what he knows, P(Larger | £10) = 1/2?
If so then, given what he knows, E[Other | £10] = 1/2 × £5 + 1/2 × £20 = £12.50.
Perhaps this is clearer if we understand that P(Smaller | £10) means "a rational person's credence that his envelope contains the smaller amount given that he knows that his envelope contains £10".
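Those two 1/2 credences can be made concrete by an assumed prior. This sketch supposes that the pairs (£5, £10) and (£10, £20) are equally likely a priori, which is one prior that makes both answers 1/2; conditioning on seeing £10 then yields the 5/4 expectation quoted above:

```python
import random

random.seed(3)

others = []

for _ in range(200_000):
    # Assumed prior: the pairs (£5, £10) and (£10, £20) are equally likely.
    pair = random.choice([(5, 10), (10, 20)])
    envelopes = list(pair)
    random.shuffle(envelopes)     # pick one envelope at random
    mine, other = envelopes
    if mine == 10:                # condition on the envelope showing £10
        others.append(other)

print(sum(others) / len(others))  # ≈ 12.5, i.e. (5/4) × £10
```

Whether such a prior is reasonable for every possible observed amount at once is exactly what the improper-prior objection above disputes.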
Whereas you know it does, as the "sampling day" of SB's report depends upon the coin flip. — fdrake
Eh, probability modelling also includes assigning random variables. It has a lot to do with what random variables you put in play. — fdrake
I could imagine using it for teaching probability modelling. Get students to analyse the problem. Then do it IRL with both sampling mechanisms. Should be a cool demonstration of "physical" differences between what's seen as a merely "epistemic" probability assignment! — fdrake
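Such a classroom demonstration could be sketched directly. Assuming the standard problem (one awakening on heads, two on tails), the two sampling mechanisms give visibly different frequencies:

```python
import random

random.seed(4)

N = 100_000
runs = []
for _ in range(N):
    heads = random.random() < 0.5
    runs.append((heads, 1 if heads else 2))  # standard problem: two awakenings on tails

# Mechanism 1: sample a run of the experiment uniformly.
by_run = sum(h for h, _ in runs) / N

# Mechanism 2: sample an awakening uniformly (runs weighted by awakening count).
awakenings = [h for h, n in runs for _ in range(n)]
by_awakening = sum(awakenings) / len(awakenings)

print(by_run)        # ≈ 1/2: the halfer's sampling mechanism
print(by_awakening)  # ≈ 1/3: the thirder's sampling mechanism
```

The same coin, tallied per-run versus per-awakening, produces 1/2 and 1/3 respectively, which is the "physical" difference between the two sampling mechanisms that the demonstration would exhibit.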
