Is the mind all chemical and electricity? - Page 91

Reason
Profile Blog Joined June 2006
United Kingdom2770 Posts
July 18 2013 14:32 GMT
#1801
The problem here is that "0 exactly" or "1 exactly" doesn't tell you whether the event is impossible or certain, so using that language really doesn't help. That's why I was making the point of writing 1 (almost sure) or 1 (sure) before, so people actually understand what you're saying.

The probability of randomly selecting any particular real number in [0,1] is 0 (but not impossible).
The probability of eventually randomly selecting 0.548 is 1 exactly (is it almost sure, or sure?).

The last part there I'm not sure about, it's basically the same question as before anyway.
Speak properly, and in as few words as you can, but always plainly; for the end of speech is not ostentation, but to be understood.
uzyszkodnik
Profile Joined April 2010
Poland64 Posts
Last Edited: 2013-07-18 14:47:13
July 18 2013 14:46 GMT
#1802
I think we can denote 1 (almost sure) as p(x) = 1 - epsilon and 1 (sure) as p(x) = epsilon, where epsilon is an infinitely small number; it would be a bit more mathematical. As for the question involving the probability of selecting a particular number a from the range [0,1] in an infinite number of tries (let's say continuum), just take a 2D uniform distribution and calculate the integrals -_-', so it'd be int int f(x,y) dx dy, where x = a to a, y = 0 to 1, and a is our number.
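Writing out the integral described above, with the constant density f(x,y) = 1 on the unit square as assumed, gives (a quick LaTeX sketch):

\int_0^1 \int_a^a 1 \, dx \, dy \;=\; \int_0^1 0 \, dy \;=\; 0,

since the inner integral runs over the degenerate interval [a, a] and therefore vanishes.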
Tobberoth
Profile Joined August 2010
Sweden6375 Posts
July 18 2013 14:46 GMT
#1803
On July 18 2013 23:32 Reason wrote:
The problem here is that "0 exactly" or "1 exactly" doesn't tell you whether the event is impossible or certain, so using that language really doesn't help. That's why I was making the point of writing 1 (almost sure) or 1 (sure) before, so people actually understand what you're saying.

The probability of randomly selecting any particular real number in [0,1] is 0 (but not impossible).
The probability of eventually randomly selecting 0.548 is 1 exactly (is it almost sure, or sure?).

The last part there I'm not sure about, it's basically the same question as before anyway.

I'd say the probability of eventually randomly selecting 0.548 is 1 (almost sure), because just like before, it's conceivable that it will never select that number, since it's random.
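To get a feel for how elusive an exact hit is, here is a toy Python sketch; it is purely illustrative, since Python floats form a finite set (random.random() draws from roughly 2^53 values) rather than the real numbers, so it demonstrates the intuition without proving anything about the continuous case:

import random

# Purely illustrative: random.random() returns one of about 2^53 equally
# likely float values, so an exact hit on 0.548 is astronomically unlikely
# even over many draws. This is intuition, not a proof.
random.seed(0)
draws = 1_000_000
hits = sum(1 for _ in range(draws) if random.random() == 0.548)
print(f"exact hits of 0.548 in {draws:,} draws: {hits}")  # prints 0 in practice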
Shiori
Profile Blog Joined July 2011
3815 Posts
July 18 2013 14:48 GMT
#1804
On July 18 2013 23:46 uzyszkodnik wrote:
I think we can denote 1 (almost sure) as p(x) = 1 - epsilon and 1 (sure) as p(x) = epsilon, where epsilon is an infinitely small number; it would be a bit more mathematical. As for the question involving the probability of selecting a particular number a from the range [0,1], just take a 2D uniform distribution and calculate the integrals -_-'.

My understanding is that the only infinitely small number (i.e. infinitesimal) over the real numbers is zero, which means that epsilon is exactly zero.

Also, I think you mean for the second equation to be p(x)=1, not p(x)=epsilon, since that would imply that 1(sure) has p(x) ~=~ 0.
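For reference, one standard way to make "the only infinitesimal real number is zero" precise is the Archimedean property of the reals (a one-line LaTeX sketch):

\varepsilon \in \mathbb{R},\ \varepsilon \ge 0,\ \text{and}\ \varepsilon < \tfrac{1}{n}\ \text{for every } n \in \mathbb{N} \;\implies\; \varepsilon = 0.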
Reason
Profile Blog Joined June 2006
United Kingdom2770 Posts
Last Edited: 2013-07-18 14:51:58
July 18 2013 14:51 GMT
#1805
On July 18 2013 23:46 Tobberoth wrote:
I'd say the probability of eventually randomly selecting 0.548 is 1 (almost sure), because just like before, it's conceivable that it will never select that number, since it's random.

mmmm that was my guess too... although...

...was this guy onto anything or not? (about the flipping heads thing)
On July 18 2013 20:58 uzyszkodnik wrote:
@Rassy
Strong law of large numbers says that such a trial won't happen even once.

Speak properly, and in as few words as you can, but always plainly; for the end of speech is not ostentation, but to be understood.
uzyszkodnik
Profile Joined April 2010
Poland64 Posts
Last Edited: 2013-07-18 15:00:48
July 18 2013 14:54 GMT
#1806
On July 18 2013 23:48 Shiori wrote:
My understanding is that the only infinitely small number (i.e. infinitesimal) over the real numbers is zero, which means that epsilon is exactly zero.

Also, I think you mean for the second equation to be p(x)=1, not p(x)=epsilon, since that would imply that 1(sure) has p(x) ~=~ 0.


yeah sorry

sure: p(x) = 1, almost sure: p(x) = 1 - epsilon, impossible: p(x) = 0, almost impossible: p(x) = epsilon

as for the second part,

http://mathb.in/8653

hence the probability of selecting the number a in a continuum of infinitely many tries = 0;
I assumed that the density is f(x,y) = 1.

If you think that's wrong, please talk the math to me.
Shiori
Profile Blog Joined July 2011
3815 Posts
Last Edited: 2013-07-18 14:56:17
July 18 2013 14:55 GMT
#1807
On July 18 2013 22:53 paralleluniverse wrote:
On July 18 2013 21:07 Rassy wrote:
On July 18 2013 19:42 paralleluniverse wrote:
On July 18 2013 09:10 DoubleReed wrote:
On July 18 2013 08:43 yOngKIN wrote:
On July 18 2013 07:42 DoubleReed wrote:
What are you guys talking about?

Finite vs Infinite shouldn't matter for those problems. Uncountable and countable are the only restrictions on such sets. There is no additivity of an uncountable number of sets.

If A and B are disjoint (and measurable), Measure(A) + Measure(B) = Measure(A U B)
You can also do this for many sets. Sum(Measure(An)) = Measure(Union(An)) where n is finite (just stretching the previous statement to multiple sets). So {An} is a finite sequence of disjoint, measurable sets.
You can also do this if n is a countably infinite set, so {An} is a countably infinite sequence of disjoint, measurable sets.
But you can't do that if n is an uncountably infinite set. That doesn't work. You can't pretend that it does, and there are plenty of easy exceptions.

I'm confused because it seems like you should be differentiating between countable and uncountable, rather than finite and infinite. This is measure theory. So just use measure theory.

measurable only in terms of our limited human knowledge of math and physics


Uhh... no. Measurable as in Lebesgue Measure.

Yep. You are completely correct.

One thing I've learned is that people like to debate math on the internet and get it all wrong (like this guy), when in fact math is almost never up for debate, especially when it's well-established, hundreds-of-years-old math like measure theory. Math debates are often the most futile because people substitute their intuition, with unrelenting zeal, in place of mathematical rigor, particularly in topics like measure theory, which produces counter-intuitive results for those who haven't learned the subject.

In a math "debate", a general rule that I observe is the following: If you're debating about technical details, then you're talking to a crank (and probably getting nowhere). If you're debating about philosophy, it's not necessarily apparent that you're talking to a crank.

Appeals to ignorance are also very common, as you've just experienced. For example, people love saying that we don't understand infinity. There are some things in math that we don't understand; infinity is not one of them. Infinity is a rigorously defined and well-understood concept. With knowledge from a high school or 1st- or 2nd-year math course, pretty much any perceived problem or hole in our human knowledge of math that one might think of (other than famous unsolved problems) isn't actually a problem or a hole, but rather a personal lack of knowledge of math.

On July 18 2013 07:42 DoubleReed wrote:
What are you guys talking about?

Finite vs Infinite shouldn't matter for those problems. Uncountable and countable are the only restrictions on such sets. There is no additivity of an uncountable number of sets.

If A and B are disjoint (and measurable), Measure(A) + Measure(B) = Measure(A U B)
You can also do this for many sets. Sum(Measure(An)) = Measure(Union(An)) where n is finite (just stretching the previous statement to multiple sets). So {An} is a finite sequence of disjoint, measurable sets.
You can also do this if n is a countably infinite set, so {An} is a countably infinite sequence of disjoint, measurable sets.
But you can't do that if n is an uncountably infinite set. That doesn't work. You can't pretend that it does, and there are plenty of easy exceptions.

I'm confused because it seems like you should be differentiating between countable and uncountable, rather than finite and infinite. This is measure theory. So just use measure theory.

This is correct. The distinction is between countable and uncountable. As an example to back up your fact that "Sum(Measure(An)) = Measure(Union(An))" doesn't work when n is an element of an uncountable set, we can use the uncountable set [0,infinity) and set An as the independent events "a Brownian motion hits 3 at time n". Then the LHS = 0 and the RHS = 1.

Also, no I don't understand what they're arguing about either. But in a probably futile attempt to resolve it, let me state the following fact:
If every real number in [0,1] has equal probability of selection, then the probability of randomly selecting any particular number between [0,1] (e.g. 0.548 exactly), is 0 exactly.

Not "approximately 0", or "infinitesimally close to 0", or "1/infinity", or "approaches 0", or "0 in the limit", or whatever. It's simply 0.



Now I have a question for paralleluniverse, who seems to be very sure in his statements.

If every real number in [0,1] has equal probability of selection, then the probability of randomly selecting any particular number between [0,1] (e.g. 0.548 exactly), is 0 exactly.

Not "approximately 0", or "infinitesimally close to 0", or "1/infinity", or "approaches 0", or "0 in the limit", or whatever. It's simply 0.


Now you pick a number between 0 and 1 an infinite number of times; what are the odds of picking 0.548 exactly at least once?
If it is 0 exactly, as he says, then the answer should be 0.
However, if it is infinitesimally close to 0, then the odds of picking this number at least once would be 1.

No?

I'll think about it.

I'd say that it'd still be zero, but I'm not entirely sure. I kinda feel like this (maybe) touches on transfinite arithmetic, but division of aleph_0 by aleph_1 or vice versa is undefined, so I'm stuck. I guess you could define it as zero, but I have no idea what that would do.

http://math.stackexchange.com/questions/146844/how-to-divide-aleph-numbers

These guys seem to be having a tonne of trouble making sense of cardinal division, even when it's uncountable and countable rather than uncountable and uncountable.

Fucking transfinite how does it work
paralleluniverse
Profile Joined July 2010
4065 Posts
Last Edited: 2013-07-19 11:55:54
July 18 2013 15:23 GMT
#1808
On July 18 2013 21:07 Rassy wrote:

Now I have a question for paralleluniverse, who seems to be very sure in his statements.

If every real number in [0,1] has equal probability of selection, then the probability of randomly selecting any particular number between [0,1] (e.g. 0.548 exactly), is 0 exactly.

Not "approximately 0", or "infinitesimally close to 0", or "1/infinity", or "approaches 0", or "0 in the limit", or whatever. It's simply 0.[/QUOTE

Now you pick a number between 0 and 1 an infinite number of times; what are the odds of picking 0.548 exactly at least once?
If it is 0 exactly, as he says, then the answer should be 0.
However, if it is infinitesimally close to 0, then the odds of picking this number at least once would be 1.

No?

For those of you quick enough, you may have noticed that I edited out my first response to this question. That's because it was incorrect. The correct solution is 0.

If U_n are independent uniform random variables on [0,1] for n = 1, 2, ..., infinity, then the probability that U_n is eventually 0.548 is equal to 0. This is because P(U_n = 0.548 eventually) = 1 - P(U_n != 0.548 infinitely often), and by the 2nd Borel-Cantelli lemma, P(U_n != 0.548 infinitely often) = 1.

So even if you select (countably many) infinite random numbers that are uniform on [0,1], you still won't get 0.548 exactly. Interestingly, if you have a Brownian motion, which is in some sense like randomly selecting a normal random variable at every time instant in [0,1], then the probability that it's equal to 0.548 at any particular time t is 0, but it will equal 0.548 exactly, infinitely often, with probability 1. Note that this doesn't contradict the solution above, because here we are on an uncountable set.

The Borel-Cantelli lemma also says that if we let E_n = "getting all heads in infinite flips on trial n" (a trial is one string of infinite flips), then P(E_n infinitely often) = 0. This is contrary to claims above saying that if you have infinite trials of infinite flips, then you'll get infinitely many trials of all heads. Those claims are wrong.
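Rassy's question was phrased as "at least once", i.e. the event that U_n = 0.548 for some n, which is a slightly different event from "eventually" (for all sufficiently large n); it also has probability 0. A sketch of a direct route in LaTeX, needing only countable subadditivity and the assumption that each U_n is uniform on [0,1]:

P\Big(\bigcup_{n=1}^{\infty} \{U_n = 0.548\}\Big) \;\le\; \sum_{n=1}^{\infty} P(U_n = 0.548) \;=\; \sum_{n=1}^{\infty} 0 \;=\; 0.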
paralleluniverse
Profile Joined July 2010
4065 Posts
Last Edited: 2013-07-18 15:29:20
July 18 2013 15:28 GMT
#1809
On July 18 2013 22:55 Shiori wrote:
Also, no I don't understand what they're arguing about either. But in a probably futile attempt to resolve it, let me state the following fact:
If every real number in [0,1] has equal probability of selection, then the probability of randomly selecting any particular number between [0,1] (e.g. 0.548 exactly), is 0 exactly.

Not "approximately 0", or "infinitesimally close to 0", or "1/infinity", or "approaches 0", or "0 in the limit", or whatever. It's simply 0.


This is really what I was trying to say, in a roundabout way of using examples. I was very poor at communicating it, because probability isn't really my focus in math, and because I'm nothing more than a (competent, I like to think) undergraduate, so thank you very much for making this post (and same with DoubleReed).

While I probably didn't know enough to attempt to convince wherebugsgo in a precise fashion, I find these sorts of debates really helpful for learning aspects of math that I don't usually work with, because there's the opportunity to have someone criticize perceived weaknesses in an argument.

I am very relieved to know that I wasn't wrong about the probability actually being 0 over uncountably infinite possibilities. Thanks muchly.

Are you a mathematician, by the way? May I ask what your specialty is?

Yes, in the sense that I have a math degree and a job in math (not academic). I specialize in probability theory and statistics.
Shiori
Profile Blog Joined July 2011
3815 Posts
July 18 2013 15:34 GMT
#1810
On July 19 2013 00:23 paralleluniverse wrote:
For those of you quick enough, you may have noticed that I edited out my first response to this question. That's because it was incorrect. The correct solution is 0.

If U_n are independent uniform random variables on [0,1] for n = 1, 2, ..., infinity, then the probability that U_n is eventually 0.548 is equal to 0. This is because P(U_n = 0.548 eventually) = 1 - P(U_n != 0.548 infinitely often), and by the 2nd Borel-Cantelli lemma, P(U_n != 0.548 infinitely often) = 1.

So even if you select (countably many) infinite random numbers that are uniform on [0,1], you still won't get 0.548 exactly. Interestingly, if you replace "uniform" with "normal", and you randomly selected for every time instance in [0,1] (which is uncountable), then you will get 0.548 exactly, infinitely often with probability 1. That's because it becomes a Brownian motion, which is known to have this property. So the countability (or uncountability) in the number of draws is very important. I don't know without having to do some work what happens if you select uniform random variables on [0,1] for each time instance in [0,1].

Awesome! Thanks very much for your aid in resolving this dispute ><.
The Borel-Cantelli lemma also says that if we let E_n = "getting all heads in infinite flips on trial n" (a trial is one string of infinite flips), then P(E_n infinitely often) = 0. This is contrary to claims above, saying that if you have infinite trials of infinite flips, then you'll get infinitely many trials of all heads. Those claims are wrong.

Holy shit my intuition was actually correct. I redacted the argument in an edit because I thought my reasoning was wrong (and I'm pretty sure it was).

I have a related question, though. If we do a countably infinite number of trials (a trial is one string of infinite flips) what is the probability that we will get a string of all heads at least once? Maybe that's a silly question but I just want to be absolutely sure, haha, since a lot of this stuff is kinda mind-boggling in ways.
paralleluniverse
Profile Joined July 2010
4065 Posts
Last Edited: 2013-07-18 15:44:23
July 18 2013 15:40 GMT
#1811
On July 19 2013 00:34 Shiori wrote:
I have a related question, though. If we do a countably infinite number of trials (a trial is one string of infinite flips) what is the probability that we will get a string of all heads at least once? Maybe that's a silly question but I just want to be absolutely sure, haha, since a lot of this stuff is kinda mind-boggling in ways.

This is the same question as the one I answered, but with the events "U_n = 0.548" replaced with the events "trial n is all heads". Since both these events have probability 0, the answer is 0, by the same argument as above.

Lesson: You can't get zero probability events (like selecting a particular real number at random, or flipping infinite heads) occurring for sure, even if you repeat it (countably) infinitely many times.
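One step worth spelling out is why "trial n is all heads" has probability 0 in the first place; a sketch in LaTeX, using continuity of probability along the decreasing events "the first k flips are heads":

P(\text{trial } n \text{ is all heads}) \;=\; P\Big(\bigcap_{k=1}^{\infty} \{\text{flip } k \text{ is heads}\}\Big) \;=\; \lim_{k \to \infty} 2^{-k} \;=\; 0.

The union bound over countably many trials then gives probability 0 for "all heads in at least one trial", exactly as in the 0.548 case.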
Shiori
Profile Blog Joined July 2011
3815 Posts
July 18 2013 15:44 GMT
#1812
On July 19 2013 00:40 paralleluniverse wrote:
Okay thanks very much. That's what I initially thought but I was kinda getting confused since probability/statistics have been, thus far, basically pushed to the side during my degree for the sake of satiating my desire to sample different fields than just math. After this thread, though, I'm definitely looking forward to taking the required statistics/probability courses in the future a lot more than I was beforehand, haha.

Thanks ^.^
Myrddraal
Profile Joined December 2010
Australia937 Posts
Last Edited: 2013-07-19 01:32:52
July 19 2013 01:30 GMT
#1813
On July 18 2013 19:42 paralleluniverse wrote:
Also, no I don't understand what they're arguing about either. But in a probably futile attempt to resolve it, let me state the following fact:
If every real number in [0,1] has equal probability of selection, then the probability of randomly selecting any particular number between [0,1] (e.g. 0.548 exactly), is 0 exactly.

Not "approximately 0", or "infinitesimally close to 0", or "1/infinity", or "approaches 0", or "0 in the limit", or whatever. It's simply 0.


I'm going to ignore the fact that you started this post off by being an ass, because I really want to understand this assertion that both you and Shiori have made.

Let me start this by saying that I am a programmer, not a mathematician, so I am approaching this from a logical perspective with limited mathematical knowledge.

Logically, if you have an equal probability of selecting any number from an infinite selection, selecting one number should have a probability of 1/infinity. I am not denying that it is mathematically provable to be 0, but in order to believe that the probability is 0 and not 1/infinity, and hence invalidate my intuition, I need to see solid evidence that it is not 1/infinity.

On July 19 2013 00:23 paralleluniverse wrote:
For those of you quick enough, you may have noticed that I edited out my first response to this question. That's because it was incorrect. The correct solution is 0.

If U_n are independent uniform random variables on [0,1] for n = 1, 2, ..., infinity, then the probability that U_n is eventually 0.548 is equal to 0. This is because P(U_n = 0.548 eventually) = 1 - P(U_n != 0.548 infinitely often), and by the 2nd Borel-Cantelli lemma, P(U_n != 0.548 infinitely often) = 1.


As far as I can tell, your solution does not disprove P(U_n = 0.548 eventually) = 1/infinity, since the 2nd Borel-Cantelli lemma obtains a probability of 1 using a limit, so it could equally be represented as 1 - 1/infinity (or, if this is the case http://en.wikipedia.org/wiki/0.999.. as Tobberoth posted earlier, 1 could always be represented as 1 - 1/infinity). Therefore it would be equally true to say P(U_n != 0.548 infinitely often) = 1 - 1/infinity, hence P(U_n = 0.548 eventually) = 1 - (1 - 1/infinity) = 1/infinity.

If the above is correct, you could say that the answer is 0 and (1/infinity, infinitesimally small, approaching 0, whatever), but you could not say it is 0 and not infinitesimally small, and you must both concede that you made an incorrect assertion. If it's not correct, please tell me what I am missing or what I have done incorrectly in my reasoning.

[stranded]: http://www.indiedb.com/games/stranded
DoubleReed
Profile Blog Joined September 2010
United States4130 Posts
Last Edited: 2013-07-19 03:36:18
July 19 2013 03:11 GMT
#1814
Myrddraal: allow me to help. This stuff is pretty counter-intuitive but also really cool.

"Infinitesimally small" is kind of a weird and vague intuitive idea. It's not a rigorous thing as far as I know. Like when people say "approaching 0," this doesn't actually make sense. Try writing something like this down. It's just not the way limits work. Limits equal things.

Another reason why this doesn't work is because of the countable/uncountable thing that I was trying to get people to understand. This is not a problem of finite and infinite. This is a problem of countable and uncountable.

Example:

Rather than look at the probability of picking a single number between 0 and 1, let's look at the probability of picking a rational number between 0 and 1. There are certainly infinite rational numbers, each with 1/infinity chance of picking it. So obviously we have absolutely no idea, if you look at it this way. It just doesn't make sense. There's infinite rational numbers and infinite irrational numbers. What do we do?

This is why you can't think of it this way. Here's the correct way: the rational numbers are countable (you can align the rational numbers with the natural numbers in one-to-one correspondence). All countable sets have measure zero. So the probability is zero. The irrational numbers are uncountable. There's way way way more irrational numbers than rational numbers.

If you were to actually pick a random real number between 0 and 1, it's going to be an irrational number.

Hope that helps and blows your mind.
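The step "all countable sets have measure zero" can be spelled out with a standard covering argument; a sketch in LaTeX, enumerating the rationals in [0,1] as q_1, q_2, ... and covering the n-th one by an interval of length \varepsilon / 2^n:

\mu(\{q_1, q_2, \dots\}) \;\le\; \sum_{n=1}^{\infty} \mu\Big(\big[q_n - \tfrac{\varepsilon}{2^{n+1}},\, q_n + \tfrac{\varepsilon}{2^{n+1}}\big]\Big) \;=\; \sum_{n=1}^{\infty} \frac{\varepsilon}{2^n} \;=\; \varepsilon,

and since this holds for every \varepsilon > 0, the measure (and hence the probability of drawing a rational) is 0.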
Shiori
Profile Blog Joined July 2011
3815 Posts
Last Edited: 2013-07-19 03:22:25
July 19 2013 03:18 GMT
#1815
DoubleReed explained it better.

The point about limits is good too. lim f(x) = L as x --> a doesn't mean that the limit approaches L. It means that f(x) approaches L as x approaches a. To say that a limit approaches something is like saying the limit of the limit :p.
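For the curious, the formal definition behind "limits equal things", written in LaTeX for the sequence case relevant to 1/infinity:

\lim_{n \to \infty} \frac{1}{n} = 0 \quad\Longleftrightarrow\quad \forall \varepsilon > 0\ \exists N \in \mathbb{N}\ \text{such that}\ \Big|\frac{1}{n} - 0\Big| < \varepsilon\ \text{for all}\ n > N.

The limit itself is the single number 0; only the index n "approaches" anything.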
Myrddraal
Profile Joined December 2010
Australia937 Posts
Last Edited: 2013-07-19 04:08:02
July 19 2013 04:07 GMT
#1816
On July 19 2013 12:11 DoubleReed wrote:
Myrddraal: allow me to help. This stuff is pretty counter-intuitive but also really cool.

"Infinitesimally small" is kind of a weird and vague intuitive idea. It's not a rigorous thing as far as I know. Like when people say "approaching 0," this doesn't actually make sense. Try writing something like this down. It's just not the way limits work. Limits equal things.

I think you may need to elaborate on this; Wikipedia seems to disagree: "In mathematics, a limit is the value that a function or sequence "approaches" as the input or index approaches some value." http://en.wikipedia.org/wiki/Limit_(mathematics)

I'm not trying to deny that a limit with an input approaching infinity can be considered equal to a value, but saying that the value does not "approach 0" seems wrong given the above definition. Does the definition change when the input is infinite?

On July 19 2013 12:11 DoubleReed wrote:
Another reason why this doesn't work is because of the countable/uncountable thing that I was trying to get people to understand. This is not a problem of finite and infinite. This is a problem of countable and uncountable.

Example:

Rather than look at the probability of picking a single number between 0 and 1, let's look at the probability of picking a rational number between 0 and 1. There are certainly infinitely many rational numbers, each with a "1/infinity" chance of being picked. So, looked at this way, we have absolutely no idea; it just doesn't make sense. There are infinitely many rational numbers and infinitely many irrational numbers. What do we do?

This is why you can't think of it this way. Here's the correct way: the rational numbers are countable (you can align the rational numbers with the natural numbers in one-to-one correspondence). All countable sets have measure zero. So the probability is zero. The irrational numbers are uncountable. There are way, way, way more irrational numbers than rational numbers.

If you were to actually pick a random real number between 0 and 1, it's going to be an irrational number.

Hope that helps and blows your mind.


I'm pretty sure I understand what you are saying here, but I still don't understand how it disproves a 1/infinity probability.

Are you saying that 1/infinity cannot be represented by rational numbers, since they must be countable, and hence the probability is 0?

In which case I would argue that saying the probability is 1/infinity is still conceptually correct, even if it cannot be represented by a real number other than 0.

Or are you saying that, since there are uncountably many possibilities, 1/infinity does not represent the probability?

In which case I would ask what prevents the probability from being 1/(uncountable infinity)?
[stranded]: http://www.indiedb.com/games/stranded
Djzapz
Profile Blog Joined August 2009
Canada10681 Posts
July 19 2013 04:21 GMT
#1817
I check this thread from time to time just to see if somebody managed to revive it.
"My incompetence with power tools had been increasing exponentially over the course of 20 years spent inhaling experimental oven cleaners"
DoubleReed
Profile Blog Joined September 2010
United States4130 Posts
Last Edited: 2013-07-19 04:47:44
July 19 2013 04:29 GMT
#1818
No. I'm saying I have no idea what 1/infinity means. It may be intuitive but it doesn't actually mean anything.

[Edit: If you mean Lim 1/x as x -> infinity then this equals zero. Try writing it down. It doesn't approach zero. It equals zero. Limits don't approach things. Limits equal things. The x approaches infinity in the limit, but the limits themselves don't approach stuff.]

But I forgot something. There's actually a super duper easy way to see that the probability is exactly zero.

Let's look at the probability that a random number picked on [0,1] lands in the interval [0.47,0.53]. Well, it's 6%, right? Because the length of the interval is 0.06. So let's forget that whole measure thing and just look at lengths of intervals. What's the generalized way to find the length of an interval?

Length[a,b] = b - a. Simple.

Okay. How does this relate to the probability of picking a single number? Well, a single number can be expressed as a closed interval! What's the probability of picking a number on the interval [0.47,0.47]? Well it's just the length of the interval. Which is 0.47 - 0.47 = 0.

No calculus. No infinity. No countability. Just subtraction. That's how we like it.
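A rough simulation along the same lines (the sample size is arbitrary, and floats are only finite-precision stand-ins for real numbers, so this is illustration, not proof):

import random

N = 1_000_000
draws = [random.random() for _ in range(N)]

in_interval = sum(0.47 <= u <= 0.53 for u in draws)
exactly_047 = sum(u == 0.47 for u in draws)

print(in_interval / N)  # roughly 0.06, the length of [0.47, 0.53]
print(exactly_047 / N)  # 0 in practice: a single point has zero length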
Myrddraal
Profile Joined December 2010
Australia937 Posts
July 19 2013 07:11 GMT
#1819
On July 19 2013 13:29 DoubleReed wrote:
No. I'm saying I have no idea what 1/infinity means. It may be intuitive but it doesn't actually mean anything.

[Edit: If you mean Lim 1/x as x -> infinity then this equals zero. Try writing it down. It doesn't approach zero. It equals zero. Limits don't approach things. Limits equal things. The x approaches infinity in the limit, but the limits themselves don't approach stuff.]

Sorry if I am being difficult, but it seems like you are getting caught up on the language I am using rather than what I am actually intending. I am assuming we can agree on the language used in the first statement on limits from Wikipedia: "In mathematics, a limit is the value that a function or sequence 'approaches' as the input or index approaches some value." I will do my best not to deviate from this language in describing what I am trying to say.

Yes, a limit equals something, but when calculating the limit of 1/x as the input x approaches infinity, the value of the limit L will approach 0. Now, my understanding (perhaps here is where you can fill me in and it will make sense to me) is that it is technically impossible for x to actually "reach" infinity, so it is technically impossible for the limit to "reach" 0; instead it gets so close as to make practically no difference. I would argue that while it makes practically no difference, and mathematically we don't run into problems treating them as equal (they are provably equal), if we were to define the difference, the clearest way to define it would be as infinitesimal.

I think what I am trying to say, pretty much, is that to me the concept of 1/infinity, or something infinitesimal, is effectively equal to 0 in almost every way. Except that I think a separate definition would be useful in theoretical probability, to be able to describe the difference between something that is impossible and something that is almost impossible.
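To pin down the "approaches" in that Wikipedia wording, the formal version says: for every eps > 0 there is an N such that 1/x is within eps of 0 whenever x > N. A rough Python sketch (the epsilon values are arbitrary):

import math

def N_for(eps):
    # N = ceil(1/eps) works: for any x > N we have 1/x < 1/N <= eps.
    return math.ceil(1 / eps)

for eps in (0.1, 1e-6, 1e-12):
    N = N_for(eps)
    assert 1 / (N + 1) < eps
    print(eps, N)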


On July 19 2013 13:29 DoubleReed wrote:
But I forgot something. There's actually a super duper easy way to see that the probability is exactly zero.

Let's look at the probability that a random number picked on [0,1] lands in the interval [0.47,0.53]. Well, it's 6%, right? Because the length of the interval is 0.06. So let's forget that whole measure thing and just look at lengths of intervals. What's the generalized way to find the length of an interval?

Length[a,b] = b - a. Simple.

Okay. How does this relate to the probability of picking a single number? Well, a single number can be expressed as a closed interval! What's the probability of picking a number on the interval [0.47,0.47]? Well it's just the length of the interval. Which is 0.47 - 0.47 = 0.

No calculus. No infinity. No countability. Just subtraction. That's how we like it.


That is a nice, concise solution, though as I have said multiple times, I don't doubt that it is mathematically provable to be 0; I am interested in how it is provable not to be 1/infinity or infinitesimal.
[stranded]: http://www.indiedb.com/games/stranded
Tobberoth
Profile Joined August 2010
Sweden6375 Posts
July 19 2013 07:11 GMT
#1820
On July 19 2013 13:29 DoubleReed wrote:
No. I'm saying I have no idea what 1/infinity means. It may be intuitive but it doesn't actually mean anything.

[Edit: If you mean Lim 1/x as x -> infinity then this equals zero. Try writing it down. It doesn't approach zero. It equals zero. Limits don't approach things. Limits equal things. The x approaches infinity in the limit, but the limits themselves don't approach stuff.]

But I forgot something. There's actually a super duper easy way to see that the probability is exactly zero.

Let's look at the probability that a random number picked on [0,1] lands in the interval [0.47,0.53]. Well, it's 6%, right? Because the length of the interval is 0.06. So let's forget that whole measure thing and just look at lengths of intervals. What's the generalized way to find the length of an interval?

Length[a,b] = b - a. Simple.

Okay. How does this relate to the probability of picking a single number? Well, a single number can be expressed as a closed interval! What's the probability of picking a number on the interval [0.47,0.47]? Well it's just the length of the interval. Which is 0.47 - 0.47 = 0.

No calculus. No infinity. No countability. Just subtraction. That's how we like it.

So if you have a theoretical one-sided die, the odds of getting 1 are 0, because 1 - 1 = 0? I don't think this closed-interval thing works.
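For comparison, a die is a counting problem rather than a length problem. A rough sketch (the "die" here is just a Python list):

from fractions import Fraction

# A one-sided die has one face, so counting gives probability 1, not 0.
faces = [1]
print(Fraction(faces.count(1), len(faces)))  # 1

# The length rule b - a is for picking a point uniformly from a continuum,
# where any single exact value has length (and probability) 0.
a, b = 0.47, 0.47
print(b - a)  # 0.0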