|
On January 04 2013 16:51 EtherealDeath wrote: The explicit assumption is that the host flips a fair coin, and that coin flip, which you do not observe, determines what the host does. And Bayesian analysis says it is 1/2. But betting 1/2 would lose you money. You can see it more easily by modifying the problem such that heads results in a wake-amnesia-sleep-wake etc. cycle for an arbitrarily large number of days, and that on the last day, you walk free after tea. Then the problem becomes betting on whether or not you walk free after tea, and if you bet 1/2 you sure as hell are going to be losing lots of money.
Ok, let's backtrack. Clearly we aren't talking about Monty Hall anymore but some version of OP's game.
Either way you didn't answer whether the philosopher asks the question every time. That's the key assumption, not whether he wakes up Sleeping Beauty or not.
Under the assumption that he does ask the question every time he wakes Sleeping Beauty up, the answer is 1/3, not 1/2. Or 0 for the arbitrarily large case. Note that 0 is a nonsensical answer, but the assumption that the philosopher lives forever is nonsense too, so that's expected. Under different assumptions the answer is different. E.g. if we're betting for money and we expect the philosopher to try to maximize his EV, the right play is to guess heads 50% and tails 50%. That's not a Bayesian answer. It turns out that simply saying that the philosopher tries to maximize his EV gives no new information; it's exactly the same as asking what the philosopher's strategy is while specifying that we know nothing about it. *
*It turns out that if we play the Nash equilibrium (guess tails 50% of the time) then the philosopher really is indifferent between all of his strategies. Since we are playing a game where he can observe our strategy but we can't observe his, he can play anything he damn well pleases and still get an EV of 0. So assuming rationality on his part gives no new information, and we're back to the original problem of asking about something we explicitly said we know nothing about.
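To make the per-awakening claim concrete, here is a rough Monte Carlo sketch in Python. It assumes the waking schedule modelled later in the thread (heads: woken and asked on both days; tails: woken and asked only on the second day); the function name and run count are just placeholders.

import random

def simulate(n_runs=100_000, seed=0):
    # Assumed schedule: heads -> Beauty is woken (and asked) on day 1 and day 2,
    # tails -> she is woken (and asked) only on day 2.
    rng = random.Random(seed)
    tails_runs = 0
    awakenings = 0
    tails_awakenings = 0
    for _ in range(n_runs):
        tails = rng.random() < 0.5      # fair coin
        wakes = 1 if tails else 2       # number of times the question is asked
        awakenings += wakes
        if tails:
            tails_runs += 1
            tails_awakenings += wakes
    print("tails frequency per experiment:", tails_runs / n_runs)            # ~1/2
    print("tails frequency per awakening :", tails_awakenings / awakenings)  # ~1/3

simulate()

Under that schedule, betting even money on tails at every awakening loses in the long run even though the coin itself is fair, which is the sense in which 1/3 is the betting answer.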
|
let's make it a game theory problem and say you don't know his strategy :D
edit: wait, if you guess 1/2 and the mad philosopher always asks, you lose....
edit: keep in mind that he's a philosopher so he knows what you're thinking
edit: I think I'm too drunk to talk about this goodnight
|
On January 04 2013 17:24 sam!zdat wrote: let's make it a game theory problem and say you don't know his strategy :D
edit: wait, if you guess 1/2 and the mad philosopher always asks, you lose....
edit: keep in mind that he's a philosopher so he knows what you're thinking
What's the exact rule? If he says: "Did the coin land heads or tails?" and we bet, I won't lose by answering randomly. How could I?
If he asks: "What's the probability it's Monday?" of course I lose. edit: Or equivalently he asks what's the probability the coin landed tails. /edit He's asking me to guess a real number between 0 and 1 that he determined beforehand.
|
Sorry, one last clarification. For the case when you can only take the side that the coin landed tails:
Clearly this is a bet you should not take. The rational strategy for the philosopher is to only ask this question when the coin actually landed heads. So, assuming a rational (if mad) philosopher, the probability of the coin having landed tails after the philosopher offers the bet is 0. So no odds are fair and we should just refuse the bet.
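A minimal EV sketch of that point, assuming a stake of 1 that pays out `odds` if the coin was tails; the payoff convention and function name are made up for illustration:

def ev_of_accepting(odds, p_offer_given_heads=1.0, p_offer_given_tails=0.0):
    # EV per offered bet of backing tails, when the philosopher offers the bet
    # with the given probabilities after heads and after tails.
    p_heads = p_tails = 0.5
    p_offer = p_heads * p_offer_given_heads + p_tails * p_offer_given_tails
    p_tails_given_offer = p_tails * p_offer_given_tails / p_offer
    return p_tails_given_offer * odds - (1 - p_tails_given_offer)

print(ev_of_accepting(odds=100))                          # -1.0: you lose your stake every time
print(ev_of_accepting(odds=2, p_offer_given_tails=1.0))   #  0.5: fine if he always offers it

With the tails-never-offered philosopher, the conditional probability of tails given an offer is 0, so no finite odds make the bet fair.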
|
On January 04 2013 17:20 hypercube wrote: On January 04 2013 16:51 EtherealDeath wrote: The explicit is that the host flips a fair coin, and that coin flip which you do not observe determines what the host does. And Bayesian analysis says it is 1/2. But betting 1/2 would lose you money. You can see it easier by modifying the problem such that heads results in a wake amnesia sleep wake etc cycle for an arbitarily large number of days, and that on the last day, you walk free after tea. Then the problem becomes bettering on whether or not you walk free after tea, and if you bet 1/2 you sure as hell are going to be losing lots of money. Ok, let's backtrack. Clearly we aren't talking about Monty Hall anymore but some version of OP's game. Either way you didn't answer if the philosopher asks the question every time. That's the key assumption not whether he wakes up Sleeping Beauty or not. Under the assumption that he does ask the question every time he wakes Sleeping Beauty up the answer is 1/3 not 1/2. Or 0 for the arbitrarily large case. Note 0 is a nonsensical answer but the assumption that the philosopher lives forever is nonsense so that's expected. Under different assumptions the answer is different. I.e. if we're betting for money and we expect the philosopher to try to maximize his EV the right play is to guess heads 50% and tails 50%. That's not a Bayesian answer. It turns out that simply saying that the philosopher tries to maximize his EV gives no new information, it's exactly the same as asking what the philosopher's strategy is while specifying that we know nothing about it. * *It turns out that if we play the Nash equilibrium (guess tails 50% of the time) then the philosopher really is indifferent between any strategies. Since we are playing a game where he can observe our strategy but we can't observe his, he can play anything he damn well pleases and still get an EV of 0. So assuming rationality on his part gives no new information and we're back to the original problem of asking about something we explicitly said we know nothing about. You are over analyzing and sticking in extra assumptions/strategies where none exist. The only point is that yeah, it should be 1/3, but if you calculate it using Bayes' theorem it is 1/2.
And the EV is yours not his. Pretend he is a robot programmed to not fuck with you but follow the rules precisely as written.
|
man i got 1/3
I simplified the problem statement a lot. The problem I solved looks like this:
Day0: A single state of sleep S0
Day1: 2 states, a wake state W1, and a sleep state S1
Day2: 1 state, a wake state W2
S0 transitions to W1 with probability 1/2 (heads)
S0 transitions to S1 with probability 1/2 (tails)
Both W1 and S1 transition to W2 with probability 1
I made a blatant assumption that P(Day0) = P(Day1) = P(Day2) = 1/3; this lets me simplify some expressions. Basically it lets us condition some events on which day it is. I feel this is a bit ridiculous, but Bayesians put bullshit priors on random stuff anyway hehehe
Now we are after this query: P(T | Wake), i.e. what's the probability of tails given that the princess is awake at the moment of the question from the wizard.
P(T | Wake) = P(Wake | T) * P(T) / P(Wake)   // Bayes' rule
We now compute the components
## Computing P(Wake):
P(Wake) = P(W1) + P(W2)   // if you're awake, you're in one of the 2 wake states
        = P(Day0)P(W1 | Day0) + P(Day1)P(W1 | Day1) + P(Day2)P(W1 | Day2)
          + P(Day0)P(W2 | Day0) + P(Day1)P(W2 | Day1) + P(Day2)P(W2 | Day2)   // decompose based on which day it could be
        = 0 + P(Day1)P(W1 | Day1) + 0 + 0 + 0 + P(Day2)P(W2 | Day2)   // most of these are 0
        = 1/3 * P(W1 | Day1) + 1/3 * P(W2 | Day2)
P(W1 | Day1) = 1/2   // because P(W1 | Day1) = P(S1 | Day1) = 1/2, depending on the coin toss
P(W2 | Day2) = 1     // because you're awake no matter what on Day2
Thus: P(Wake) = 1/3 * 1/2 + 1/3 * 1 = 3/6
## Computing P(Wake | T):
P(Wake | T) = P(Day0)*P(Wake | T, Day0) + P(Day1)*P(Wake | T, Day1) + P(Day2)*P(Wake | T, Day2)
            = 0 + 0 + P(Day2)*P(Wake | T, Day2)
P(Wake | T, Day2) = P(W1 | T, Day2) + P(W2 | T, Day2) = 0 + 1 = 1
Thus: P(Wake | T) = 1/3 * 1 = 1/3
## Finally:
P(T | Wake) = P(Wake | T) * P(T) / P(Wake) = (1/3 * 1/2) / (3/6) = (1/6) / (3/6) = 1/3
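For anyone who wants to check the arithmetic, here is a direct transcription of the calculation above in Python with exact fractions. The uniform 1/3 prior over days is the poster's own assumption, carried over as-is:

from fractions import Fraction as F

p_day = {0: F(1, 3), 1: F(1, 3), 2: F(1, 3)}      # assumed uniform prior over days

# On Day1 she is awake only on heads (prob 1/2); on Day2 she is awake regardless.
p_wake_given_day = {0: F(0), 1: F(1, 2), 2: F(1)}
p_wake = sum(p_day[d] * p_wake_given_day[d] for d in p_day)                     # 1/2 (= 3/6)

# Under tails she sleeps through Day1 and only wakes on Day2.
p_wake_given_tails_day = {0: F(0), 1: F(0), 2: F(1)}
p_wake_given_tails = sum(p_day[d] * p_wake_given_tails_day[d] for d in p_day)   # 1/3

p_tails = F(1, 2)
print(p_wake_given_tails * p_tails / p_wake)      # 1/3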
|
Wouldn't the probability of waking on Monday be 25% and the probability of waking on Tuesday 75%?
If the coin flips tails you wake on Tuesday, and there's a 50% chance of that happening. If the coin flips heads you have a 25% chance of waking on Monday and a 25% chance of waking on Tuesday.
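For comparison, counting the awakenings under the schedule modelled earlier in the thread (heads: awake Monday and Tuesday; tails: awake Tuesday only), where each awakening carries the probability of the coin branch that produces it, gives 1/3 and 2/3 rather than 25% and 75%. A sketch of that count, under those assumptions:

from fractions import Fraction as F

# Awakenings under the assumed schedule, each weighted by its coin branch (prob 1/2).
weights = {
    ("heads", "Monday"): F(1, 2),
    ("heads", "Tuesday"): F(1, 2),
    ("tails", "Tuesday"): F(1, 2),
}
total = sum(weights.values())
p_monday = weights[("heads", "Monday")] / total
p_tuesday = (weights[("heads", "Tuesday")] + weights[("tails", "Tuesday")]) / total
print(p_monday, p_tuesday)   # 1/3 2/3 of awakenings fall on Monday / Tuesday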
|
There's no philosophy in this, just math. I might solve it later if I have time.
|
On January 04 2013 19:21 kusto wrote: There's no philosophy in this, just math. I might solve it later if i have time.
The problem is about giving credence to a certain proposition in a particular thought-experiment situation. Of course it is philosophy (epistemology). Academic philosophy is (or rather: can be) way closer to math than you might expect.
|
On January 04 2013 19:04 evanthebouncy! wrote: man i got 1/3
I simplified the problem statement a lot. The problem I did look like this:
Day0: A single state of sleep S0
Day1: 2 states, a wake state W1, and a sleep state S1
Day2: 1 state, a wake state W2
S0 transition to W1 with probability 1/2 (head)
S0 transition to S1 with probability 1/2 (tail)
Both W1 and S1 transition to W2 with probability 1
I made a blatant assumption that P(Day0) = P(Day1) = P(Day2) = 1/3, this let me simplify some expressions. Basically this let us condition some events on which day it is. I feel this is a bit ridiculous, but Bayesians put bullshit priors on random stuff anyways hehehe
Now we are after this query: P(T | Wake), i.e. what's the probability of tail given the princess is awake at the moment (of the question from the wizard)
P(T | Wake) = P(Wake | T) * P(T) / P(Wake) //bayues rule
We now compute the components
## Computing P(Wake):
P(Wake) = P(W1) + P(W2) //If ur wake, ur in one of the 2 wake states
        = P(Day0)P(W1 | Day0) + P(Day1)P(W1 | Day1) + P(Day2)P(W1 | Day2) + P(Day0)P(W2 | Day0) + P(Day1)P(W2 | Day1) + P(Day2)P(W2 | Day2) //decompose based on which day it could be
        = 0 + P(Day1)P(W1 | Day1) + 0 + 0 + 0 + P(Day2)P(W2 | Day2) //most of these are 0...
        = 1/3 * P(W1 | Day1) + 1/3 * P(W2 | Day2)
P(W1 | Day1) = 1/2 // because P(W1 | Day1) = P(S1 | Day1) = 1/2 depending on your coin toss
P(W2 | Day2) = 1 // because ur wake no matter what on day2
Thus: P(Wake) = 1/3 * 1/2 + 1/3 * 1 = 3/6
## Computing P(Wake | T):
P(Wake | T) = P(Day0)*P(Wake | T, Day0) + P(Day1)*P(Wake | T, Day1) + P(Day2)*P(Wake | T, Day2) = 0 + 0 + P(Day2)*P(Wake | T, Day2)
P(Wake | T, Day2) = P(W1 | T, Day2) + P(W2 | T, Day2) = 0 + 1 = 1
Thus: P(Wake | T) = 1/3 * 1 = 1/3
## Finally:
P(T | Wake) = P(Wake | T) * P(T) / P(Wake) = (1/3 * 1/2) / (3/6) = 1/6 / 3/6 = 1/3
You must have made a mistake somewhere - your computations are quite complicated. I did it similarly to EtherealDeath and 3/4 is the correct answer. What you're searching for is p(T | waking up on Monday) + p(T | waking up on Tuesday) = p(T | waking up on an unknown day).
|
On January 04 2013 20:26 Prog wrote: On January 04 2013 19:21 kusto wrote: There's no philosophy in this, just math. I might solve it later if i have time. The problem is about giving credence to a certain proposition in a particular thought-experiment situation. Of course it is philosophy (epistemology). Academic philosophy is (or rather: can be) way closer to math than you might expect.
OK, then the problem is the word "credence", which might have inherited some impractical definitions. For me, it's just the probability p(coin was Tails | I have woken up on any day) - with this premise, the problem is perfectly solvable.
|
On January 04 2013 18:41 EtherealDeath wrote: On January 04 2013 17:20 hypercube wrote: On January 04 2013 16:51 EtherealDeath wrote: The explicit is that the host flips a fair coin, and that coin flip which you do not observe determines what the host does. And Bayesian analysis says it is 1/2. But betting 1/2 would lose you money. You can see it easier by modifying the problem such that heads results in a wake amnesia sleep wake etc cycle for an arbitarily large number of days, and that on the last day, you walk free after tea. Then the problem becomes bettering on whether or not you walk free after tea, and if you bet 1/2 you sure as hell are going to be losing lots of money. Ok, let's backtrack. Clearly we aren't talking about Monty Hall anymore but some version of OP's game. Either way you didn't answer if the philosopher asks the question every time. That's the key assumption not whether he wakes up Sleeping Beauty or not. Under the assumption that he does ask the question every time he wakes Sleeping Beauty up the answer is 1/3 not 1/2. Or 0 for the arbitrarily large case. Note 0 is a nonsensical answer but the assumption that the philosopher lives forever is nonsense so that's expected. Under different assumptions the answer is different. I.e. if we're betting for money and we expect the philosopher to try to maximize his EV the right play is to guess heads 50% and tails 50%. That's not a Bayesian answer. It turns out that simply saying that the philosopher tries to maximize his EV gives no new information, it's exactly the same as asking what the philosopher's strategy is while specifying that we know nothing about it. * *It turns out that if we play the Nash equilibrium (guess tails 50% of the time) then the philosopher really is indifferent between any strategies. Since we are playing a game where he can observe our strategy but we can't observe his, he can play anything he damn well pleases and still get an EV of 0. So assuming rationality on his part gives no new information and we're back to the original problem of asking about something we explicitly said we know nothing about. You are over analyzing and sticking in extra assumptions/strategies where non exist. The only point is that yea it should be 1/3 but it you calculate it using Bayes theorem it is 1/2.
Nope, you just miscalculated.
|
What you're looking for is P(Tails | Woken up).
By Bayes' theorem
P(Tails | Woken up) = P(Woken up | Tails) * P(Tails, a priori) / P(Woken up) = 0.5 * 0.5 / 0.75 = 1/3
I could go back and check your math, but in these cases Bayes' theorem always gives the same result as the traditional way of counting elementary cases. If you get a different result you messed up somewhere, since the two ways are mathematically equivalent.
edit:
On January 04 2013 15:58 EtherealDeath wrote: Hopefully I didnt fuck something up in my head. Ok so we want Pr(Monday | tea), so Pr(tea | Monday) * Pr(Monday) / Pr(Tea).
Pr(tea | Monday) = Pr(Heads) = 1/2 if we do it Bayesian.
Pr(Monday) = 1/2 (pick a day at random, unbiased manner)
Pr(tea) = 3/4
fuck it never mind lololol I inserted a 1 randomly in my head.
Which is exactly the same as you got, so I don't understand the problem.
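Plugging those numbers into exact fractions, assuming the same schedule (heads: woken both days; tails: woken only on the second day) and a uniform prior over the two days — just a check of the arithmetic:

from fractions import Fraction as F

p_tails = F(1, 2)
p_day = F(1, 2)                                   # uniform over Monday / Tuesday

p_woken_given_tails = p_day * 0 + p_day * 1       # asleep Monday, awake Tuesday -> 1/2
p_woken_given_heads = p_day * 1 + p_day * 1       # awake both days -> 1
p_woken = p_tails * p_woken_given_tails + (1 - p_tails) * p_woken_given_heads

print(p_woken)                                    # 3/4
print(p_woken_given_tails * p_tails / p_woken)    # 1/3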
|
It's 1/3 for tails and 2/3 for heads, amirite?
So she is wrong. edit: didn't read the OP properly, it's the mad scientist who is talking. Anyway it is just a point-of-view problem. The correct answer (if she hasn't forgotten the rules of the game) should be that it is more likely to be heads than tails.
|
On January 05 2013 00:37 Boblion wrote: Its 1/3 for tails and 2/3 head imrite ?
So she is wrong.
Reading comprehension fail on my part
She's actually right numerically: P(Monday | tea) does equal 1/3 and the calculation is right. Checking by counting cases, there are two cases of Tuesday AND tea and only one of Monday AND tea, so 1/3 is correct.
But it's not obvious how you get P(Tails | tea) = 1/3 from that. So it's really a different calculation that doesn't say anything about P(Tails | tea).
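Counting the cases with exact fractions, under the reading that tea is served whenever she is awake (an assumption about the OP's setup): the tea cases are (heads, Monday), (heads, Tuesday) and (tails, Tuesday), each with prior weight 1/4, and the two conditionals really are computed from different events even though both come out to 1/3. A sketch:

from fractions import Fraction as F

# Each (coin, day) cell has prior probability 1/4; tea in every cell except (tails, Monday).
tea_cells = [("heads", "Monday"), ("heads", "Tuesday"), ("tails", "Tuesday")]
p_cell = F(1, 4)
p_tea = p_cell * len(tea_cells)                                       # 3/4

p_monday_given_tea = sum(p_cell for _, d in tea_cells if d == "Monday") / p_tea
p_tails_given_tea = sum(p_cell for c, _ in tea_cells if c == "tails") / p_tea
print(p_monday_given_tea, p_tails_given_tea)   # 1/3 1/3 -- same value, different events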
|
On January 04 2013 21:09 kusto wrote: On January 04 2013 20:26 Prog wrote: On January 04 2013 19:21 kusto wrote: There's no philosophy in this, just math. I might solve it later if i have time. The problem is about giving credence to a certain proposition in a particular thought-experiment situation. Of course it is philosophy (epistemology). Academic philosophy is (or rather: can be) way closer to math than you might expect. OK, then the problem is the word "credence", which might have inherited some unpractical definitions. For me, it's just the probability p(coin was Tails | i have woken up on any day) - with this premise, the problem is perfectly solvable.
What's the probability of "I have woken up on any day"?
that's a hard one
edit: remember that propositions should tell you what possible world you are in
edit: anybody who answers 1/3 must then account for the arbitrarily large case
|
Several people have mentioned they got 3/4; that is not a commonly argued answer. How did you get it?
edit: you also have to consider: say he asks on Sunday
"what is the probability that the coin will be tails"
ezpz 1/2
then she goes to sleep and wakes up and gains no information at all
then he asks
"what is the probability that the coin was tails"
ermmmmmmmm now it's harder but no information was gained?!?
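One way to put numbers on the "no information was gained" puzzle, assuming the schedule used earlier in the thread generalised so that heads produces n awakenings and tails produces one: the per-experiment probability of tails stays 1/2 no matter what, while the per-awakening frequency is 1/(n+1), which is 1/3 for n = 2 and tends to 0 in the arbitrarily large case mentioned above. A small sketch (it does not settle which of the two numbers "credence" should track):

from fractions import Fraction as F

def per_awakening_tails(n_heads_awakenings):
    # Fraction of awakenings at which the coin is tails, when heads produces
    # n awakenings and tails produces exactly one.  The coin itself stays fair.
    p = F(1, 2)
    return p * 1 / (p * 1 + p * n_heads_awakenings)

for n in (1, 2, 10, 1000):
    print(n, per_awakening_tails(n))   # 1/2, 1/3, 1/11, 1/1001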
|
Now that I'm thinking about it, the main problem is that the mad guy's statement is incredibly vague. And the OP isn't really clear about what we are discussing here. It seems that we have to take the girl's point of view.
The problem is that she lost her memory, and the girl can't really answer anything other than "well, if you say so, it was tails and it is Tuesday". If she remembers the rules she can add a bit more and describe the whole experiment (like several people tried to do in this thread), but I have no idea what she is supposed to answer other than that. I mean she could always say "no, you're a liar", but eh, I guess she has to believe him...
|
I'll jump on the occasion to ask a question about probabilities: when you say that there's a 1/2 chance that it's tails, is that in the limit, as in if you threw a coin ad infinitum it would split between equal occurrences of tails and heads? If so, isn't it theoretically possible to throw a coin every minute for a whole century and only obtain tails? And if so, doesn't it mean that probabilities are, strictly speaking, empirical observations?
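On the only-tails-for-a-century part: the probability of that streak is positive but astronomically small. A quick back-of-the-envelope in Python, assuming one flip per minute for 100 years (the flip count is a rough approximation):

import math

flips = 60 * 24 * 365 * 100             # ~52.6 million flips, one per minute for a century
log10_p = flips * math.log10(0.5)       # log10 of (1/2)**flips
print(flips, log10_p)                   # probability is about 10**(-1.58e7)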
|
^ Every time I have to read one of your posts I'm like wtf is wrong with you. You are probably the most confused and confusing guy on TL. I mean there are some really dumb guys and smart guys on TL, but even if their stupidity (or intelligence) is sometimes hard to understand, at least it makes sense. On the other hand you are always playing with words, which is something extremely annoying. You also seem to always confuse theory, concepts and real life, and this is unhealthy imo. You are STERILE and always babbling.
1- Yes, tails and heads are the only outcomes as described by the OP (although you could argue you can get the "edge" of the coin irl).
2- Yeah, you can obtain tails or heads forever, it's called luck.
3- Probabilities are RULES. The outcome (irl) is the empirical observation.
Wtf is so hard about this ?
User was warned for this post
|