|
ok i need someone to break this down and explain it to me because i am retarded
- 3 guys go to a hotel which is 30 bucks a night, so they each pay 10
- the clerk forgets that its a special night or some bullshit and its only $25 a night
- he gives the bellboy $5 to give it to the 3 dudes (which r prob renting a room for hot gay sex but thats irrelevant)
- the bellboy rips them off and pockets 2$ and gives 1$ to each one
OK so they each payed 10$ at first but got 1$ back each ($9 x 3 = $27). the bellboy pocketed 2$
27 + 2 = 29! o noez wheres the missing dollar?
anyway i kno theres some kind of faulty calculation in there, but i just dont remember where and since im an idiot i cant figure it out. dont try to explain this to me by solving it backwards. I KNOW THERES NO MISSING DOLLAR, i just want to understand where and how the riddle tricks the reader
i know tl.net is here for my problems
|
So old, yet I don't remember it either. lol
|
Simple. He pockets 2 dollars, and gives them each back 1$, but that is not equivalent to saying they each payed 9$, because of the 2$ the he still has, you'd have to factor in the fact that they are still technically paying that.
5/3 = 1.666 = 1 2/3. Therefore they each payed 10 - 1 2/3 which = 8 and 1/3.
8 1/3 x 3 = 25$.
|
If the 3 guys pay $10 each, they total $30, right? So when the clerk gives the $5 to the bellboy, they're down to $25. He takes $2, leaving the total at $28. That way, they each get $1 back ($28 - $3 = $25). You follow?
Man that was badly explained. Let me think about it some more and I'll get back to you!
|
On July 22 2007 13:58 Element)LoGiC wrote: Simple. He pockets 2 dollars, and gives them each back 1$, but that is not equivalent to saying they each payed 9$, because of the 2$ the he still has, you'd have to factor in the fact that they are still technically paying that.
5/3 = 1.666 = 1 2/3. Therefore they each payed 10 - 1 2/3 which = 8 and 1/3.
8 1/3 x 3 = 25$.
i dont understand a word u just said
|
It's not 27$ + 2$ = 29$, but
27$ (what they pay) - 2$ (what was stolen) = 25$ (the real price of the room)
|
hotel: +25$
guys: -27$ (each -9$ as you pointed out)
bellboy: +2$
------------------
0$
this is no riddle. it's just a way to show how easy it is to confuse ppl and to get them to believe bullshit.
|
its $25 plus 3 dollar rebate. The bellboy snaked $2 but couldnt do math.
|
my bad i should have named this thread "help me figure out this way to show how easy it is to confuse people and to get them to believe bullshit"
|
dude that's obvious :D the hotel kept 25 dollars, so that's something like 8,33$ each, plus the 0,67$ each that the bellboy pocketed makes the 9$ each of them is out
but it somehow manages to fuck your logic though
|
element logic doesn't live up to his name lol [edit] asta explained it best, read his post
|
Beyonder
Netherlands15103 Posts
first math class of the year they asked this question.. and I was the only one in the class who got it. I explained it three times, and in the end some still didnt get it.
=[
|
MyLostTemple
United States2921 Posts
i suck at math : [
|
Canada7170 Posts
If all else fails you can just chart where the money goes. Trumped my entire physics class with this one.
Then again, they got fooled by the Monty Hall problem too.
EDIT: Easy explanation of the original problem, bellboy's got $2, the guys have $3, the hotel has $25.
|
On July 22 2007 13:53 zizou21 wrote: OK so they each payed 10$ at first but got 1$ back each ($9 x 3 = $27). the bellboy pocketed 2$
27 + 2 = 29! o noez wheres the missing dollar?
Actually, Element)Logic explained it the easiest way, since the entire riddle is based on the quoted statement. The guys weren't owed just $1 each back, they were owed $5 total; the bellboy jacked $2 of it. $9x3 = $27 assumes they paid the hotel $27, which they obviously didn't.
|
OK so they each payed 10$ at first but got 1$ back each ($9 x 3 = $27)
Wrong but understandable conclusion :p
27 is what the guys paid in the end, but you should review what exactly you've calculated: that number isn't the one you want for this sum. What matters is: they paid 30, got 3 back, and the bellboy took 2. So 30 - 3 - 2 = 25, and all the money is accounted for (the guys have 3 dollars, the bellboy has 2, the hotel has 25).
|
umm.. there is no riddle?
|
They paid $30 - 3*$1 = $27. The bellboy got $2. The hotel got $25.
|
It tricks the reader by using 27 + 2, which is irrelevant.
It should be 27 - 2, or 30 - 3 - 2 to explain the cost of the room, which is $25.
|
ok so the problem lies in the 27+2 when it should be 27-2 altho im still kind of confused as to how this works
so 27-2, why subtract 2 from the amount they paid? is the 2$ the bellboy jacked INCLUDED in the 9$ they paid? yes i know im an idiot
|
On July 22 2007 14:18 MyLostTemple wrote: i suck at math : [
doesnt have much to do with math, more of your ability to problem solve
|
btw does anyone have any other sweet ass riddles like this (not necessarily math) cuz at dinner i like to pop these up, but the other nyte i popped this one up but couldnt give them the answer LOL
|
On July 23 2007 00:11 zizou21 wrote: ok so the problem lies in the 27+2 when it should be 27-2 altho im still kind of confused as to how this works
so 27-2, why subtract 2 from the amount they paid? is the 2$ the bellboy jacked INCLUDED in the 9$ they paid? yes i know im an idiot
I cannot believe this question got to the second page.
You need to ask yourself questions. For example:
HOW MUCH DID THE GUYS PAY? $9x3 = $27
HOW MUCH DID THE HOTEL KEEP? $25
Okay... so, bellboy keeps $2 (the amount the guys are ripped off). Right? No questions now.
Just like OldBoy tells us, ask the wrong questions and you'll get the wrong answer.
|
On July 23 2007 00:22 zizou21 wrote: btw does anyone have any other sweet ass riddles like this (not necessarily math) cuz at dinner i like to pop these up, but the other nyte i popped this one up but couldnt give them the answer LOL
Use google search?
|
On July 23 2007 00:24 WhatisProtoss wrote: I cannot believe this question got to the second page. You need to ask yourself questions. ... Just like OldBoy tells us, ask the wrong questions and you'll get the wrong answer.
ok word
|
OK so they each payed 10$ at first but got 1$ back each ($9 x 3 = $27). the bellboy pocketed 2$
This tells you all the information you need.
The total amount of money that left the guys' pockets by the end is $27 (they paid $30 and got $3 back, so now they are out $27 total)
Where did that $27 they spent go? $25 went to pay for their hotel room, and $2 went into the bellboy's pocket.
Think about what you're doing from here. The $2 the bellboy has is part of the $27, so adding another $2 to the $27 makes no sense; that money is already accounted for. The "missing" three dollars to get from 27 to 30 that you're looking for are the three extra dollars the guys paid and then got back.
Bonus explanation with more equations (because equations are cool). All positive numbers are sources of money, all negative numbers are drains:
GOOD: +$30 (TOTAL MONEY PAID) - $3 (refund) - $2 (bellboy) - $25 (cost of room) = 0 (all accounted for)
What the problem has you do is subtract their refund from what they paid first, and then add the amount the bellboy took to come up with a meaningless number of 29:
BAD: +$30 (total money) - $3 (refund) + $2 (wtf are you doing the bellboy took money away he didn't add it) = $29
So we can see that $29 comes out of nowhere and makes no sense, unless the bellboy put in $2 of his own money.
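To see that bookkeeping spelled out, here is a minimal Python sketch of the good and bad calculations above (the variable names are just illustrative):

# Minimal sketch of the accounting above (illustrative names only).
paid_up_front = 30   # $10 from each of the 3 guys
refund = 3           # $1 handed back to each guy
bellboy = 2          # pocketed by the bellboy
room = 25            # what the hotel actually keeps

# GOOD: sources minus drains comes out to zero, so every dollar is accounted for.
assert paid_up_front - refund - bellboy - room == 0

# What the guys are actually out of pocket, and where it went:
out_of_pocket = paid_up_front - refund      # 27
assert out_of_pocket == room + bellboy      # 27 = 25 + 2

# BAD: the riddle adds the bellboy's $2 on top of the $27 even though that $2
# is already inside the $27, producing the meaningless 29.
meaningless = out_of_pocket + bellboy       # 29, a number that measures nothing
print(out_of_pocket, meaningless)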
|
sorry to be rude but this was a riddle?
|
its more of a trick than a riddle ; ). its like the one where you count your fingers backward from 10 on one hand, folding a finger after each count (10, 9, 8, 7, 6), then stop, add the 5 fingers still up on the other hand and announce 6 + 5 = 11 fingers.
|
almost as good as how much change can you have without being able to break a dollar =)
|
On July 23 2007 00:22 zizou21 wrote: btw does anyone have any other sweet ass riddles like this (not necessarily math) ...
monty hall problem. I remember when someone looked it up and posted it here when the map came out.
|
A nice cute "riddle" showing that statistics and probabilities can be confusing as well. It is based on an old swedish TV-show (bingolotto):
You get to choose one of three boxes. One of them will give you a nice prize (car or something) the other two will give you nothing. You first choose one. The game leader then opens one of the OTHER two boxes (the one you chose is not opened) which turns out to be empty. You are now presented with the choice to either stick with the one you picked in the first step, or to switch box to the other not opened one. Should you?
|
On July 23 2007 03:23 Cascade wrote: A nice cute "riddle" showing that statistics and probabilities can be confusing as well. ... Should you?
Haha yeah. That one is good. One should always switch because the probability of winning is greater when switching.
Haha yeah. That one is good. One should always switch because the probability of winning is greater when switching.
|
Simple. They paid 27: 25 to the hotel and 2 to the bellboy. They gave 30 and got back 3. It's all in the wording.
|
On July 22 2007 14:03 zizou21 wrote: my bad i should have named this thread "help me figure out this way to show how easy it is to confuse people and to get them to believe bullshit"
you see, there never was a riddle. as ppl pointed out, the calculation is 30 - 3 - 2 = 25 which is obviously correct. in your text it says 27 + 2 = 29 (!= 30) which is an equation that has _nothing_ to do with the whole situation. but some people (including you) never even start to think what the real equation looks like so they don't see the mistake in the text. they simply believe the bs that is fed to them. they even find it amusing or interesting or whatever.
learn to use your head to reflect upon the stuff you read. it might not always be true and sometimes it might even be easy to see that it isn't. for example in this case i'm sure, if it wasn't for that misleading sentence in the text, you would have seen that there is no riddle at all and you would have calculated it right yourself (30 - 2 - 3 = 25). it's not like the math behind it was difficult. not even for someone who sucks at math.
|
he got $5 from the hotel clerk, kept $2 and gave $1 to each of the three guys. maybe it makes more sense written out as 2 + 1 + 1 + 1 = 5.
|
You are paying 25 for a hotel not 30. You falsely assumed that you are still paying for 30 dollars, and added 2 to 27 to make 29 and become confused. You should subtract 2 from 27 to make 25, which is the true price of the hotel.
|
On July 23 2007 03:31 Jumbalumba wrote: Haha yeah. That one is good. One should always switch because the probability of winning is greater when switching.
explain this please?
|
On July 23 2007 03:31 Jumbalumba wrote: Haha yeah. That one is good. One should always switch because the probability of winning is greater when switching.
Ninja edit!
Nope! :p You got tricked.
The probability is exactly the same. The host always knows which box is empty, because if he opens the one with the prize the show is over.
Now, imagine you have 3 boxes [x] [1] [2], may the "x" be the prize, and 1, 2 be the empty boxes.
There are some possibilities we can discuss:
You chose box [x], the host opens empty box 1 or 2. You are left with 2 boxes, one with the prize, one without, so the chances are 1/2.
You choose box [1], the host opens empty box 2. You are left with 2 boxes, one with the prize, one without, so the chances are 1/2.
You choose box [2], the host opens empty box 1. You are left with 2 boxes, one with the prize, one without, so the chances are 1/2.
There is a similar trick where people try to guess the bean under the 3 cups and the host always flips over an empty cup. It does not change the outcome of switching or not switching the cups.
|
it should be 27-2 because the 2 dollars is already accounted for in the 27 dollars that they paid....
|
You should switch. It is 33% that you pick the right one at first, so 67% to get the prize if you switch.
You are falling into the trap of this riddle. One is tempted to say "two boxes, we do not know which one contains the prize, so 50%". It is however not two boxes of which we know nothing: as you said, the host knows where the prize is, and since he always takes away an empty one, he is "moving" the probability of finding the prize in both boxes into just one of them. The 50% argument is only valid if the situation is completely symmetric, which it is not: the host has declined to open the other box, thereby increasing the probability that it contains the prize.
EDIT: also note that in two of the three possible situations you list, you will win if you switch.
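To make that case count concrete, here is a small Python sketch (purely illustrative, not anything from the thread) that enumerates every combination of prize location and first pick and tallies how often switching wins:

# Exhaustively enumerate the 3x3 grid of (prize box, first pick).
# The host then opens an empty box that is not the player's pick.
switch_wins = stay_wins = 0
for prize in range(3):
    for pick in range(3):
        # Host opens some empty box other than the player's pick.
        opened = next(b for b in range(3) if b != pick and b != prize)
        # Switching means taking the remaining unopened box.
        switched = next(b for b in range(3) if b != pick and b != opened)
        switch_wins += (switched == prize)
        stay_wins += (pick == prize)

print(switch_wins, "of 9 cases win by switching")  # 6 of 9 = 2/3
print(stay_wins, "of 9 cases win by staying")      # 3 of 9 = 1/3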
|
On July 23 2007 09:09 Jathin wrote: This is also known as the famous "Monty Hall" problem. Hence the map name. There are 3 paths you can choose to expo to. http://en.wikipedia.org/wiki/Monty_hall_problem
omg, is THAT the monty hall problem!?! I always thought it was just that stupid TV-show..... I just lost a lot of respect for the fancy-sounding Monty hall problem. And for the record we now know that there is a fourth viable option on where to place your first expo on that map. Change map name?
|
Calgary25980 Posts
There's still three doors. Same as in the problem.
|
Calgary25980 Posts
Before the Monty Hall problem was "famous" to us in university, two of my friends tried to test it out. They put a shot behind a door, and a neutral third person opened an empty door and then decided if he wanted to switch or not. If he got the shot, he'd have to drink it; if not, the other guy would drink it. It proved nothing but a lot of arguing, and they both got fucked up.
|
On July 23 2007 08:51 Cascade wrote: You should switch. It is 33% that you pick the right one at first, so 67% to get the prize if you switch. ...
Hmmmmm after some wikipedia you are right! Amazing riddle I'm glad I heard it.
After re-reading your post just now you explained it perfectly.
So let me re-reason for abit...
[x] [1] [2]
3 boxes, when you pick one and stick to it, ignore w/e the host does, your chance is 1/3 and remains 1/3.
However, when you let the host remove 1 first then choose, your chance is 1/2
Now, you choose first, getting a 1/3 chance, the host removes an empty, leaving the chances improved.
The host cannot touch your door, so the improvement all goes to the doors you didn't pick; with the host identifying the false one, it's concentrated in the single remaining door, and therefore you should switch.
Brejkalh weird logic hahaha`
|
The monty hall problem is a very simple riddle.
Choice 1 will always stay 33,3% no matter what they do, and there's sure to be a prize. So the odds of the remaining doors together are 100%. So if you have 2 choices, and 1 is 33,3%, the other one has 66,6% chance of being the prize. (100-33,3)
The case in which it wouldn't matter which door you took is when they take away one of the choices without knowing if that was the prize or not. (In which case it would be 33-33.)
Everyone here should be able to understand that.
|
Calgary25980 Posts
On July 23 2007 11:43 Frits wrote: The monty hall problem is a very simple riddle.
Choice 1 will always stay 33,3% no matter what they do, and there's sure to be a prize. So the odds of the remaining doors together are 100%. So if you have 2 choices, and 1 is 33,3%, the other one has 66,6% chance of being the prize. (100-33,3)
The case in which it wouldn't matter which door you took is when they take away one of the choices without knowing if that was the prize or not. (In which case it would be 33-33.)
Everyone here should be able to understand that.
Counter example: Monty Hall with 10 doors. You can switch or not switch. Don't switch = 10% chance to win. By your explanation above, switch should = 90%.
|
On July 23 2007 10:09 evanthebouncy! wrote: Hmmmmm after some wikipedia you are right! Amazing riddle I'm glad I heard it. ...
just think of it like this.
2/3 chance of picking an empty box, 1/3 chance of picking the box containing the prize.
If you pick an empty box, the host will take away the other empty box. You switch now and win.
If you pick the prize box, you miss out when you switch.
The trick is that you are counting on the higher probability of picking the empty box at the beginning.
|
Calgary25980 Posts
On July 23 2007 12:02 gameguard wrote: just think of it like this. 2/3 chance of picking an empty box, 1/3 chance of picking the box containing the prize. ...
This is the simplest way I've ever thought of it. Thanks.
|
On July 23 2007 12:08 Chill wrote: This is the simplest way I've ever thought of it. Thanks.
Cascade's idea was better for me, basically...
Pick one w/ .33 chance; the other 2 contain .66 chance. The host takes away an empty box, condensing the .66 chance into 1 box, and you pick that box.
|
Calgary25980 Posts
On July 23 2007 12:16 evanthebouncy! wrote: Cascade's idea was better for me ...
To each his own.
|
The argument above is really simple and goes straight to 33%. It is actually more difficult to understand why the 50% argument is not valid. Just presenting the two arguments:
1) When you pick the first box you have a 33% chance to pick right, so you should switch.
2) You don't know which box the prize is in, so you have 50%, and it does not matter which you take.
At first sight it is not trivial to tell which argument is flawed; both seem quite intuitive. It is a common example to show that one has to be careful with where your intuition takes you in statistics.
Spoiler: As I said earlier, the reason the 50% argument is flawed is that you do not have symmetry between the two boxes. Monty has actually said something about the other box by not picking it to open.
Spoiler: We are really approaching the limit where spoilers start to become annoying...
And my explanation and gameguard's are exactly the same, so I really don't see your argument. For once someone maybe has learned something from a thread like this. I now know what Monty Hall is, for example!
|
I don't get it. I did the math a different way and I got 30... I see nothing missing..
25 in total, the guy takes 2, gives 3 back.
2+3 = 5, and 5+25 = 30.
no need to do any other complicated math bs... and I'm not sure why people are responding with 2 paragraph replies..
|
if anyone needs more convincing, you can just write a program to run this 1,000 times.
in the first group, you make the person not switch.
in the second group, you make the person switch.
and keep track of how many times each person wins and see who wins more. this avoids all the thinking that makes my head hurt.
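Something like this quick Python sketch does the experiment described above (just an illustration; the names are made up):

import random

def play(switch, trials=100_000):
    # Simulate the 3-door game `trials` times and return the win rate.
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Host opens one empty door that is neither the pick nor the prize.
        opened = random.choice([d for d in range(3) if d not in (pick, prize)])
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d not in (pick, opened))
        wins += (pick == prize)
    return wins / trials

print("stay:  ", play(switch=False))   # comes out around 0.33
print("switch:", play(switch=True))    # comes out around 0.67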
|
United States7166 Posts
oh i get it now
the only reason why you would do 9 * 3 calculation for this is to find out how much money the hotel ends up with at the end, which should equal 25.
each of the 3 tenants effectively ends up paying out 9 dollars, 9 * 3 = 27, but 2 of those dollars ended up with a greedy bellboy, so the hotel keeps 27 - 2 = 25.
all the money is accounted for, lets see how much money each party ends up with at the end:
hotel: 25
bellboy: 2
tenants: 3
25 + 2 + 3 = 30
|
On July 23 2007 12:40 geometryb wrote: if anyone needs more convincing, you can just write a program to run this 1,000 times. ...
Or you can think this way:
You have 100,000,000,000,000,000 boxes. You pick one, knowing you have a 1/100,000,000,000,000,000 chance. The host eliminates 99,999,999,999,999,998 boxes; 2 remain.
You better switch.
|
On July 23 2007 12:45 Zelniq wrote: oh i get it now ... 25 + 2 + 3 = 30
9*3 makes perfect sense. It's 27+2 that is nonsensical.
lol ninja edit :0
|
On July 22 2007 14:18 MyLostTemple wrote: i suck at math : [
That's why you have a brother.
|
On July 22 2007 14:00 Gatsu wrote: It's not 27$ + 2$ = 29$, but
27$ (what they pay) - 2$ (what was stolen) = 25$ (the real price of the room)
this is about right. You don't add the 2 because it was stolen, i.e. you subtract it.
|
At first that monty hall problem was kinda weird to me, but now that I've thought about it, here's my explanation: You have 3 boxes, one of them is the right box. Still, you have only 2 options:
- Pick the right box: 33% chance.
- Pick the wrong box: 66% chance.
Let's now analyze the odds for the switching strategy. If you pick the right box, the host will remove one of the false boxes, you will switch and pick the remaining wrong box, so you lose. As mentioned, this happens whenever you pick the right box at the start.
However, if you initially pick the wrong box (66% chance), 2 boxes remain, one of them being the right box. Out of these 2 boxes, the host will remove the wrong one, and when you switch, you score the right box. This scenario occurs whenever your first choice is the wrong box, which happens in 66% of the cases.
|
On July 23 2007 11:56 Chill wrote: Counter example: Monty Hall with 10 doors. You can switch or not switch. Don't switch = 10% chance to win. By your explanation above, switch should = 90%.
Switching would be 90%, if the host eliminates all but one of the doors you didn't pick.
|
Calgary25980 Posts
On July 23 2007 16:52 Sr18 wrote: Switching would be 90%, if the host eliminates all but one of the doors you didn't pick.
Right.
|
If your odds start at 33.3333% to fail on 3 doors, they obviously get better on 2 doors (50%) but if you add them both together its 83.333% chance to fail.
|
Canada7170 Posts
On July 23 2007 17:17 CharlieMurphy wrote: If your odds start at 33.3333% to fail on 3 doors, they obviously get better on 2 doors (50%) but if you add them both together its 83.333% chance to fail.
...wrong.
It's the ORDER that the things happen in.
You pick a door. Let's say it's the wrong door.
There are two left, one of which is the correct one; the other, he eliminates. So after picking the wrong door, switching is 100% to land on the correct one.
So if you have a 2/3 of picking the wrong door initially, you have 2/3 of a chance to win after switching.
|
Cayman Islands24199 Posts
monty hall is not that special if the selection bias of the host is made clear. a lot of it comes down to presentation. for ppl used to precisely stated conditions, the vagueness will be a pain
|
i hated explaining the monty hall problem to others, but i've found a really simple solution that everyone can clearly understand
you only lose by switching if you picked the right box to begin with. what are the odds of picking the right box to begin with?
that's usually enough =p
|
On July 23 2007 17:17 CharlieMurphy wrote: If your odds start at 33.3333% to fail on 3 doors, they obviously get better on 2 doors (50%) but if you add them both together its 83.333% chance to fail.
LOL
|
On July 23 2007 11:56 Chill wrote: Counter example: Monty Hall with 10 doors. You can switch or not switch. Don't switch = 10% chance to win. By your explanation above, switch should = 90%.
Nope it still counts, what you forgot is when adding doors, obviously you have to divide the chance by that added amount.
Choice 1 will be 10% no matter what they do like I said, right? And there's still sure to be a prize, so:
100 - 10 = 90 % chance of ONE of the 8 doors left being the right one.
90/8 = 11,25% of getting it right when switching.
On July 23 2007 17:17 CharlieMurphy wrote: If your odds start at 33.3333% to fail on 3 doors, they obviously get better on 2 doors (50%) but if you add them both together its 83.333% chance to fail.
What are you even calculating here? The question is should he switch or not. I don't understand what you did here at all. :p
|
Maybe I'm not sure how the show works but what I meant is: The first choice you make between 3 doors is 66% chance to win and a 33% chance to lose out of 3 doors. If you get a winner the first time and try to get the other winner which is now a 50/50 shot you've just made your odds worse because essentially its like you picked 2 doors at once, Which is going to be a fail at 83% of the time, correct? or is it 66%?
|
On July 23 2007 20:36 CharlieMurphy wrote: Maybe I'm not sure how the show works but what I meant is ...
I still have no idea what you mean.
Here's how the show works:
- 3 doors, 1 prize (so a 33% chance of getting it right picking at random).
- You pick 1 door.
- The host removes 1 of the remaining 2 doors, and you know he removed one that doesn't have the prize.
Now the question is: does he have a better chance of winning if he switches?
Answer: yes, he should switch. There's a 66% chance the other one is right, and only 33% if he keeps the original pick.
|
On July 23 2007 20:17 Frits wrote: Nope it still counts, what you forgot is when adding doors, obviously you have to divide the chance by that added amount. Choice 1 will be 10% no matter what they do like I said, right? And there's still sure to be a prize, so: 100 - 10 = 90 % chance of ONE of the 8 doors left being the right one. 90/8 = 11,25% of getting it right when switching.
Wrong!
Look at it this way: 10 doors.
You choose one door ----> 1/10 chance of getting it right no matter what!
Out of the 9 other doors the host reveals 8.
So essentially you are either choosing 1 of two groups: The group of one or the group of 9 (of which 8 have been revealed)
Therefore it is a 90% prob of getting it right if you switch! There is no "dividing the chance by that added amount"!
To further illustrate my point, let's make the problem even bigger: say you have to choose a number between 1 and a million. There is 1 random correct number.
Then someone (who obviously knows what the number is) says it is either your number or it is a different one they tell you. You would have to be absolutely insane to stick with your number, because yours is a 1/1000000 shot no matter what, whereas the other number is 999999/1000000.
|
Cayman Islands24199 Posts
well if the host only reveals 1 door, then it is as frits said. that the host always opens an empty door is the key to this problem.
|
On July 22 2007 13:53 zizou21 wrote: ok i need someone to break this down and explain it to me ... 27 + 2 = 29! o noez wheres the missing dollar?
I don't get why this is supposed to be confusing...
Ok, they paid 9$ each, so they are out 27 dollars.
The bellboy takes 2$.
27-2=25, which is how much the room costs...
|
On July 23 2007 20:43 Frits wrote: I still have no idea what you mean. Here's how the show works: ...
Nice how this turned into a Monty Hall problem.
I didn't really understand this problem until I read "The Curious Incident of the Dog in the Night-Time" (or something like that).
IIRC, they used pictures to illustrate the problem, and it really cleared things up.
Like someone already mentioned above, if you switch, you'll win 66% of the time.
For instance, the boxes are:
L L W
You choose 1st box, host removes 2nd box, you switch, you win.
You choose 2nd box, host removes 1st box, you switch, you win.
You choose 3rd box, host removes either 1st or 2nd box, you switch, you lose.
So you win 2/3 of the time, and you lose 1/3 of the time.
Whereas if you don't switch
L L W
You choose 1st box, host removes 2nd box, you lose.
You choose 2nd box, host removes 1st box, you lose.
You choose 3rd box, host removes 1st or 2nd box, you win.
You lost 2/3 of the time, and you win 1/3 of the time.
By switching, you double your chance of winning.
|
Braavos36374 Posts
it's easier if you visualize it with 100 doors
you pick door #1
host eliminates doors #2-99
would you switch to door #100 or would you stick with #1?
|
On July 23 2007 12:45 evanthebouncy! wrote: Or you can think this way: You have 100,000,000,000,000,000 boxes. You pick one, knowing you have a 1/100,000,000,000,000,000 chance. The host eliminates 99,999,999,999,999,998 boxes; 2 remain. You better switch.
if you want more experimental results then i guess.
|
On July 23 2007 20:56 [r]h_probe wrote: Wrong! Look at it this way: 10 doors. You choose one door ----> 1/10 chance of getting it right no matter what! Out of the 9 other doors the host reveals 8. So essentially you are either choosing 1 of two groups: the group of one or the group of 9 (of which 8 have been revealed). Therefore it is a 90% prob of getting it right if you switch! ...
I don't understand the problem, do you guys reveal one more door or reveal eight more (say we have ten doors).
If you only reveal one more, it doesn't affect the total outcome whether you switch or you don't switch.
Say you have ten doors, nine goats and one car.
GGGGGGGGGC
If you don't switch, you have 10% chance of winning.
If you switch: If you pick door 1-9, you switch, each case generates 9 more subcases, one where you win and eight where you lose. If you pick door 10, you switch, you have 9 more subcases, all of which you lose.
So... 9/90 = 10%
Either way, switch or no switch, you have 10% chance of winning.
nvm, it still improves the probability slightly.
It's still 1/10 if you don't switch.
If you switch, since he kills a door, You get a 1/8 probability of getting it right.
So...
9/80 which is 11.25% vs the original 10%
Same goes for 100 obviously
1% vs ( 99 / 9800 = ) ~1.0102 %
So for 10 doors, what if we remove 2?
If we don't switch, still 10%
If we switch, we now have a 1/7 chance when we chose the wrong door first (9/10 of the time), but we lose (0%) if we had the right one at first (1/10 of the time). What's the probability? 9/70 = 0.129
What about three doors?
Still 10% if we don't switch, but 9/60 = 15% if we do.
What about removing eight doors? Still 10% if we don't switch. If we do, after removing eight doors we have only two doors left! If our first door has a goat (9/10), then we win by switching (100%). If our first door has the car (1/10), then we lose (0%).
So, this gives us 90% chance to win ( 9/10 * 1 + 1/10 * 0 ).
In other words, if we don't switch we are not taking advantage of the "removing a door" rule; every door the host removes naturally increases our likelihood of winning when we switch.
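A quick way to double-check those variant numbers (my own sketch, not from the thread): with n doors and the host opening k empty doors you didn't pick, switching to a random remaining door wins with probability (n-1)/n * 1/(n-1-k).

from fractions import Fraction

def switch_win_prob(n, k):
    # n doors, the host opens k empty doors you didn't pick, and you then
    # switch to a uniformly random remaining unopened door.
    # You were wrong initially with probability (n-1)/n, and the prize is then
    # among the n-1-k unopened doors you can switch to.
    return Fraction(n - 1, n) * Fraction(1, n - 1 - k)

for k in (1, 2, 3, 8):
    print("10 doors, host opens", k, "-> stay = 1/10, switch =", switch_win_prob(10, k))
# prints 9/80 (11.25%), 9/70 (~12.9%), 9/60 (15%), and 9/10 (90%), matching the numbers above.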
|
On July 23 2007 21:27 geometryb wrote: if you want more experimental results then i guess.
Huh no lol. It's merely there to help ppl visualize things better by exaggerating the situation. Who's gonna experiment? We're talking theory here!
|
On July 23 2007 17:44 JeeJee wrote: i hated explaining the monty hall problem to others, but i've found a really simple solution that everyone can clearly understand
you only lose by switching if you picked the right box to begin with. what are the odds of picking the right box to begin with?
that's usually enough =p
Genius
|
On July 23 2007 21:47 evanthebouncy! wrote: Huh no lol. It's merely there to help ppl visualize things better by exaggerating the situation. Who's gonna experiment? We're talking theory here!
i was talking about my thing about writing a program to repeat the game thousands of times. :p
theory and people's thinking often lead to a lot of wrong answers because you never know if you missed something, made a mistake, or were just wrong to begin with. it's like when I say "theoretically, it should work": most of the time it doesn't.
so the best way to do it is: come up with a hypothesis based on your theory, and then test it with an experiment. if you're right, yippee. if you're wrong, oh no.
|
On July 23 2007 20:56 [r]h_probe wrote: Wrong! Look at it this way: 10 doors. You choose one door ----> 1/10 chance of getting it right no matter what! Out of the 9 other doors the host reveals 8 ... Therefore it is a 90% prob of getting it right if you switch!
What I did with more doors was assume he still only reveals one, not all but one.
There's a 90% prob if he eliminates 8 doors, yes, but you have to divide the chance by the number of doors if there's more than one door left to pick besides the original.
|
|