|
On January 04 2013 15:49 sam!zdat wrote: you can also ask the question, which I originally had and edited out because I'm dumb, "what is the probability that it is Monday?"
but that involves an indexical so it's harder
Well, in a Bayesian sense it would seem to be 1/4, but from a betting standpoint I suppose I'd bet 1/3.
|
ah, maybe that was what happened.
i was thinking it was 75% that it was tuesday. if it's tuesday, there's a 50% chance of tails. 25% that it's monday, and then a 0% chance of tails.
i'd always put my money on heads.
|
|
so part of the problem is that Heads(monday) and Heads(tuesday) are in the same possible world, but at different times.
I mean, what Bayesian credence can you give to the claim "it's 5:00"? idk man. either you know, or you have no fucking idea. at which point, as Double Reed tells me, you give equal credence to all possible things, but what are all the possible times??? my head explode
|
I love threads like these. Thanks for making it; I still like one third though. I think Wikipedia says that if you run this on a computer over many trials it comes out to one third, so while expressing the probability as one third may not represent the problem fully, I like this answer because it's consistent over time.
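A minimal sketch of the kind of many-trials run being described, assuming the variant discussed in this thread (heads: Beauty is woken for tea on both Monday and Tuesday; tails: only on Tuesday); the fraction of awakenings where the coin shows tails comes out near one third:

```python
import random

# Rough simulation sketch, assuming the variant in this thread:
# heads -> woken (for tea) on Monday and Tuesday, tails -> Tuesday only.
# We count how often the coin shows tails across all awakenings.
def tails_fraction(trials=1_000_000):
    tails_awakenings = 0
    total_awakenings = 0
    for _ in range(trials):
        heads = random.random() < 0.5      # fair coin
        awakenings = 2 if heads else 1     # heads: Mon + Tue, tails: Tue only
        total_awakenings += awakenings
        if not heads:
            tails_awakenings += 1          # the single tails awakening
    return tails_awakenings / total_awakenings

print(tails_fraction())  # ~0.333: one third of awakenings happen under tails
```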
|
running it over many trials is what we mean when we talk about EV
|
The answer is 1/3 if we know the philosopher will ask the question. Otherwise the philosopher can manipulate the probability to be any amount he wishes.
There's an analogous situation in the Monty Hall problem. If the game show host has the choice of offering or not offering the switch, he can manipulate probabilities to the point where switching offers no benefits (and this can't be exploited by the contestant).
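A quick sketch of that kind of manipulation, under one assumed host policy (always offer the switch when the contestant has the car, offer it half the time otherwise); conditional on the switch being offered, switching wins only about half the time:

```python
import random

# Sketch of a host who chooses whether to offer the switch.
# Assumed policy: always offer when the contestant picked the car,
# offer with probability 1/2 when they picked a goat.
def switch_win_rate(trials=1_000_000):
    offered = 0
    switch_wins = 0
    for _ in range(trials):
        picked_car = random.random() < 1 / 3     # contestant's first pick
        offer_prob = 1.0 if picked_car else 0.5  # host's manipulation
        if random.random() < offer_prob:
            offered += 1
            if not picked_car:                   # switching wins iff the first pick was a goat
                switch_wins += 1
    return switch_wins / offered

print(switch_win_rate())  # ~0.5: switching offers no benefit, and nothing to exploit
```

Other mixes of offer probabilities push the conditional win rate anywhere from 0 to 1, which is the sense in which the host can manipulate it.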
|
On January 04 2013 15:53 sam!zdat wrote: how do you get 1/4?
Hopefully I didn't fuck something up in my head.
Ok, so we want Pr(Monday | tea), which is
Pr(tea | Monday) * Pr(Monday) / Pr(tea).
Pr(tea | Monday) = Pr(Heads) = 1/2 if we do it Bayesian. Pr(Monday) = 1/2 (pick a day at random, in an unbiased manner). Pr(tea) = 3/4.
fuck it never mind lololol I inserted a 1 randomly in my head.
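For what it's worth (my own arithmetic, not part of the original post), plugging those numbers into Bayes' rule gives the 1/3 figure rather than 1/4:

\[
\Pr(\text{Monday}\mid\text{tea}) = \frac{\Pr(\text{tea}\mid\text{Monday})\,\Pr(\text{Monday})}{\Pr(\text{tea})} = \frac{(1/2)(1/2)}{3/4} = \frac{1}{3}.
\]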
|
assume the philosopher explains the whole thing to sleeping beauty. she knows as much as we know.
but say, if you think 1/3, consider an arbitrarily large number of days. either sleeping beauty escapes, or she is trapped forever in groundhog day + amnesia. is there an arbitrarily small possibility she will see prince charming again??
|
On January 04 2013 15:54 sam!zdat wrote: so part of the problem is that Heads(monday) and Heads(tuesday) are in the same possible world, but at different times.
I mean, what Bayesian credence can you give to the claim "it's 5:00"? idk man. either you know, or you have no fucking idea. at which point, as Double Reed tells me, you give equal credence to all possible things, but what are all the possible times??? my head explode
The possible times are whatever you design them to be. If you limit your resolution to a minute, then Pr(5:00) = 1/1440 :D
Yeah, I'm not really sure how to approach this, either.
|
On January 04 2013 15:58 hypercube wrote: The answer is 1/3 if we know the philosopher will ask the question. Otherwise the philosopher can manipulate the probability to be any amount he wishes.
There's an analogous situation in the Monty Hall problem. If the game show host has the choice of offering or not offering the switch, he can manipulate probabilities to the point where switching offers no benefits (and this can't be exploited by the contestant).
Except, if I recall correctly, there is no conflict there between what we would like the answer to be and what it turns out to be from a Bayesian analysis.
|
three possible states of waking up.
after the coin toss is heads -> 50%. you can either be on monday or tuesday; it seems fair to assume that there is an equal probability of both, since nothing 'happens' in between, and the events are already determined at this point. hence 25% on each (half of 50%).
after the coin toss is tails -> 50%. it is 100% that it is tuesday.
so when you wake up in one of these three states there's no point in going for tails.
|
Monty Hall is not problematic like this is; it's much simpler because it doesn't involve memory loss.
I should say that the reason I made this thread is that I wish to defend the claim:
"It is not the case that Bayesian reasoning can be applied to all types of beliefs"
|
On January 04 2013 16:02 sam!zdat wrote: Monty Hall is not problematic like this is; it's much simpler because it doesn't involve memory loss.
I should say that the reason I made this thread is that I wish to defend the claim:
"It is not the case that Bayesian reasoning can be applied to all types of beliefs"
Lol, that's a fun way of putting it. A problematic result from Bayesian reasoning when amnesia is applied.
|
On January 04 2013 16:01 nunez wrote: three possible states of waking up.
after the coin toss is heads -> 50%. you can either be on monday or tuesday; it seems fair to assume that there is an equal probability of both, since nothing 'happens' in between, and the events are already determined at this point. hence 25% on each (half of 50%).
after the coin toss is tails -> 50%. it is 100% that it is tuesday.
so when you wake up in one of these three states there's no point in going for tails.
but the point is you're asked to say at what odds you'd bet that it's tails. the payoff doesn't have to be even; YOU set the payoff
|
On January 04 2013 16:04 sam!zdat wrote:
On January 04 2013 16:01 nunez wrote: three possible states of waking up.
after the coin toss is heads -> 50%. you can either be on monday or tuesday; it seems fair to assume that there is an equal probability of both, since nothing 'happens' in between, and the events are already determined at this point. hence 25% on each (half of 50%).
after the coin toss is tails -> 50%. it is 100% that it is tuesday.
so when you wake up in one of these three states there's no point in going for tails.
but the point is you're asked to say at what odds you'd bet that it's tails. the payoff doesn't have to be even; YOU set the payoff
a valid point.
i can't think of a way to set the odds; i can only get as far as saying p(heads) > p(tails) when you wake up.
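One way to get at the odds, sketched under the same assumed variant as above (heads: asked on both Monday and Tuesday; tails: asked only on Tuesday). If she stakes one unit on tails at every awakening, a 2:1 payout is roughly the break-even point, which corresponds to a per-awakening probability of tails of 1/3:

```python
import random

# Hypothetical betting sketch, assuming the variant discussed above:
# heads -> woken and asked on Monday and Tuesday, tails -> Tuesday only.
# At every awakening she stakes 1 unit on tails at odds k:1
# (win k if tails, lose the 1-unit stake if heads).
def ev_per_flip(k, trials=1_000_000):
    total = 0.0
    for _ in range(trials):
        if random.random() < 0.5:  # heads: two awakenings, two lost stakes
            total -= 2
        else:                      # tails: one awakening, one win of k
            total += k
    return total / trials

for k in (1, 2, 3):
    print(k, round(ev_per_flip(k), 3))
# k = 2 lands near zero: fair odds of about 2:1, i.e. an implied
# per-awakening probability of tails of 1/3.
```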
|
On January 04 2013 16:00 EtherealDeath wrote:
On January 04 2013 15:58 hypercube wrote: The answer is 1/3 if we know the philosopher will ask the question. Otherwise the philosopher can manipulate the probability to be any amount he wishes.
There's an analogous situation in the Monty Hall problem. If the game show host has the choice of offering or not offering the switch, he can manipulate probabilities to the point where switching offers no benefits (and this can't be exploited by the contestant).
Except, if I recall correctly, there is no conflict there between what we would like the answer to be and what it turns out to be from a Bayesian analysis.
Can you rephrase that? I don't understand what you mean.
|
On January 04 2013 15:59 sam!zdat wrote: assume the philosopher explains the whole thing to sleeping beauty. she knows as much as we know.
but say, if you think 1/3, consider an arbitrarily large number of days. either sleeping beauty escapes, or she is trapped forever in groundhog day + amnesia. is there an arbitrarily small possibility she will see prince charming again??
Yea, that is curious. If we changed the Heads condition instead to some arbitrarily large number of days, with her woken for tea each day, and say that after Tuesday (the last day) she's free to go home, what we are actually saying is that upon being woken up there is a 1/2 chance we are good to go home after tea.
Which sure as fuck is not the case. What a pain.
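To put numbers on that tension (my own arithmetic, not from the thread): with N heads-awakenings and a single tails-awakening, counting awakenings the thirder way gives

\[
\Pr(\text{tails}\mid\text{awake}) = \frac{1/2}{1/2 + N\cdot(1/2)} = \frac{1}{N+1},
\]

which shrinks toward zero as N grows, while the halfer answer stays at 1/2 no matter how large N gets.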
|
Would it be a cop-out to claim that agents with amnesia don't qualify as rational any more?
|
On January 04 2013 16:06 hypercube wrote:
On January 04 2013 16:00 EtherealDeath wrote:
On January 04 2013 15:58 hypercube wrote: The answer is 1/3 if we know the philosopher will ask the question. Otherwise the philosopher can manipulate the probability to be any amount he wishes.
There's an analogous situation in the Monty Hall problem. If the game show host has the choice of offering or not offering the switch, he can manipulate probabilities to the point where switching offers no benefits (and this can't be exploited by the contestant).
Except, if I recall correctly, there is no conflict there between what we would like the answer to be and what it turns out to be from a Bayesian analysis.
Can you rephrase that? I don't understand what you mean.
From a Bayesian standpoint, you repick, so there's no apparently false conclusion. This, however, makes the Bayesian answer look stupid as fuck.
|