|
Note from micronesia: please read the thread before making comments about how we have just turned physics on its head. |
On January 09 2013 00:04 radscorpion9 wrote:Show nested quote +On January 08 2013 23:26 Douillos wrote:On January 08 2013 22:51 Evangelist wrote: Newtonian mechanics are wrong. They aren't wrong, they just apply at a different scale. Comparing Quantum and Newtonian mecanics is just plain stupid to start with. I think the whole point is that Newtonian mechanics actually doesn't apply at *any* scale unless you want approximations of an answer. If you want an exact answer then you can't use Newtonian mechanics, because...why else? Its wrong. Einstein's theory of general relativity is something that does apply at a different scale, as it gives extremely precise answers on the macroscopic scale. It just doesn't work on the "quantum scale" obviously.
You can't measure anything with infinite accuracy, therefore all calculations and results always bear some uncertainty born in the initial measurement. The only real difference between Einstein's theory and the Newtonian one is the level of accuracy when calculating something. Newtonian physics is just fine for most calculations. That is, unless you are living in the delusional world of a theoretical physicist.
|
On January 09 2013 00:14 Silvanel wrote:Show nested quote +On January 09 2013 00:04 radscorpion9 wrote:On January 08 2013 23:26 Douillos wrote:On January 08 2013 22:51 Evangelist wrote: Newtonian mechanics are wrong. They aren't wrong, they just apply at a different scale. Comparing Quantum and Newtonian mecanics is just plain stupid to start with. I think the whole point is that Newtonian mechanics actually doesn't apply at *any* scale unless you want approximations of an answer. If you want an exact answer then you can't use Newtonian mechanics, because...why else? Its wrong. Einstein's theory of general relativity is something that does apply at a different scale, as it gives extremely precise answers on the macroscopic scale. It just doesn't work on the "quantum scale" obviously. You cant measure anything with infinite accuracy, therefore all calculations and results always bear some uncertainty born in intial measerment. The only real difference between Einsteins theory and Newtonian is level of accuracy when calculating something. Newtonian physics is just fine for most calculations. That is unless You are living in delusional world of theoretical physicist. It's not about accuracy, it's about Newtonian models not taking into account relativity. I wouldn't say that Newtonian physics are wrong, but they're certainly incomplete.
|
On January 09 2013 00:11 Mauldo wrote: So what the article says about dark energy, is that a big deal, or are all the physicists in the thread not touching on that? It seems rather awesome, like a significant step forward in figuring out what exactly dark energy is and how it works.
I'm assuming this isn't as awesome as finding the Higgs boson, but still pretty awesome. Is that a fairly accurate assessment?
At a glance, it isn't what you make it out to be. The guy just points out that one phenomenon they have observed (negative pressure) is also a property that dark energy needs in order to explain the accelerating rate of expansion of the universe that we see. It's interesting, but I wouldn't call it a significant step right now. Maybe you can call it that in hindsight later, but for now it doesn't stand out.
|
Another case of bad science reporting.
Unfortunately it is common practice in science today to use shiny terminology to convince the layman, who has no idea what is going on, that the work is revolutionary. It's the same principle that is applied in advertising: "We didn't lie, we just took the risk of being misunderstood."
|
Hover boards, here we come!
|
On January 09 2013 00:20 synapse wrote:Show nested quote +On January 09 2013 00:14 Silvanel wrote:On January 09 2013 00:04 radscorpion9 wrote:On January 08 2013 23:26 Douillos wrote:On January 08 2013 22:51 Evangelist wrote: Newtonian mechanics are wrong. They aren't wrong, they just apply at a different scale. Comparing Quantum and Newtonian mecanics is just plain stupid to start with. I think the whole point is that Newtonian mechanics actually doesn't apply at *any* scale unless you want approximations of an answer. If you want an exact answer then you can't use Newtonian mechanics, because...why else? Its wrong. Einstein's theory of general relativity is something that does apply at a different scale, as it gives extremely precise answers on the macroscopic scale. It just doesn't work on the "quantum scale" obviously. You cant measure anything with infinite accuracy, therefore all calculations and results always bear some uncertainty born in intial measerment. The only real difference between Einsteins theory and Newtonian is level of accuracy when calculating something. Newtonian physics is just fine for most calculations. That is unless You are living in delusional world of theoretical physicist. It's not about accuracy, it's about Newtonian models not taking into account relativity. I wouldn't say that Newtonian physics are wrong, but they're certainly incomplete.
Which results in calculations based upon the Newtonian model being less accurate in some circumstances than those based upon Einstein's model. Relativistic physics is also incomplete.
|
United States24665 Posts
On January 09 2013 00:03 et wrote: No, that's not the case. There's nothing in nature sticking the label 'Temperature' to the partial derivative of entropy wrt. energy. You could call that something else, and define temperature as something else, and physics would still work. Definitions are really just names. Yes, but there are actual answers when it comes to how science defines temperature. Many people think that science defines temperature as a measure of the vibrational speed of molecules, which is inaccurate. Of course, you could argue that that is one definition of temperature, and the one I subscribe to is another, but you'll be hard pressed to find a justification for this that stands up to the rigors of science, today (this may not have been the case many years ago).
On January 09 2013 00:06 Silvanel wrote:Show nested quote +On January 08 2013 23:52 micronesia wrote:On January 08 2013 23:50 et wrote: That's not really better. A definition is just giving a name to something. A set of definitions can be inconsistent, but calling a definition wrong is weird. I disagree, since we are in the realm of science rather than language You are always within language. Right, but giving something a scientific definition is different than giving something a more general definition.
|
cool. I look forward to hearing more about this over time, and to seeing if they can use this to invent some awesome new tech as it becomes more developed. Scientists are always coming up with crazy new things and weird physics to bypass limitations.
|
On January 09 2013 00:39 micronesia wrote:Show nested quote +On January 09 2013 00:03 et wrote: No, that's not the case. There's nothing in nature sticking the label 'Temperature' to the partial derivative of entropy wrt. energy. You could call that something else, and define temperature as something else, and physics would still work. Definitions are really just names. Yes, but there are actual answers when it comes to how science defines temperature. Many people think that science defines temperature as a measure of the vibrational speed of molecules, which is inaccurate. Of course, you could argue that that is one definition of temperature, and the one I subscribe to is another, but you'll be hard pressed to find a justification for this that stands up to the rigors of science, today (this may not have been the case many years ago). On January 08 2013 23:52 micronesia wrote:Show nested quote +On January 08 2013 23:50 et wrote: That's not really better. A definition is just giving a name to something. A set of definitions can be inconsistent, but calling a definition wrong is weird. I disagree, since we are in the realm of science rather than language It sounds to me that you're saying that if you're speaking in a thermodynamic context, it's wrong to use the popular definition, but that's still a matter of language—scientific language. When you say, "Many people think that science defines temperature as a measure of the vibrational speed of molecules, which is inaccurate," I can go along with that—I only learnt about the entropy definition from this thread—but to say that that's an inherently wrong definition of temperature in general is going too far, IMO.
This is starting to sound like hair-splitting so I'll try to be clear. IMO, the "essence" of the word temperature—the place where all the definitions start—is "the property that leads to what we experience as 'hot' and 'cold'." Now, it turns out that there is more than one property that fits that description—in the ordinary run of things they all coincide, but in certain extraordinary cases there are ramifications between them. There may be good reasons why the scientific definition is most useful, but as long as one's definition of temperature satisfies the 'ur-definition' that I put in bold, it's a valid way to use the word "temperature", unless he's speaking in a specifically thermodynamic context. Could you agree with that?
|
On January 09 2013 04:43 qrs wrote:Show nested quote +On January 09 2013 00:39 micronesia wrote:On January 09 2013 00:03 et wrote: No, that's not the case. There's nothing in nature sticking the label 'Temperature' to the partial derivative of entropy wrt. energy. You could call that something else, and define temperature as something else, and physics would still work. Definitions are really just names. Yes, but there are actual answers when it comes to how science defines temperature. Many people think that science defines temperature as a measure of the vibrational speed of molecules, which is inaccurate. Of course, you could argue that that is one definition of temperature, and the one I subscribe to is another, but you'll be hard pressed to find a justification for this that stands up to the rigors of science, today (this may not have been the case many years ago). Show nested quote +On January 08 2013 23:52 micronesia wrote:On January 08 2013 23:50 et wrote: That's not really better. A definition is just giving a name to something. A set of definitions can be inconsistent, but calling a definition wrong is weird. I disagree, since we are in the realm of science rather than language It sounds to me that you're saying that if you're speaking in a thermodynamic context, it's wrong to use the popular definition, but that's still a matter of language—scientific language. When you say, "Many people think that science defines temperature as a measure of the vibrational speed of molecules, which is inaccurate," I can go along with that—I only learnt about the entropy definition from this thread—but to say that that's an inherently wrong definition of temperature of general is going too far, IMO. This is starting to sound like hair-splitting so I'll try to be clear. IMO, the "essence" of the word temperature—the place where all the definitions start—is " the property that leads to what we experience as 'hot' and 'cold'." 
Now, it turns out that there is more than one property that fits that description—in the ordinary run of things they all coincide, but in certain extraordinary cases there are ramifications between them. There may be good reasons why the scientific definition is most useful, but as long as one's definition of temperature satisfies the 'ur-definition' that I put in bold, it's a valid way to use the word "temperature", unless he's speaking in a specifically thermodynamic context. Could you agree with that? What you are saying sounds reasonable, but rather than calling it hair-splitting I'd say we are kinda going off the deep end here, all because I worded something somewhat ambiguously that is difficult to word clearly. This all began because some people were saying things which showed they don't really understand the scientific implications of the study in the article (this is why I thought a moderator note would be helpful, also).
There is a reason why we currently define temperature (mathematically) the way we do. It is quite a difficult topic to fully understand, though.
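For the curious, the standard two-level toy model shows why that mathematical definition, 1/T = dS/dE, can go negative once a system's energy spectrum is bounded above. This is my own illustrative sketch (units and the model are assumptions for demonstration, not taken from the article):

```python
import math

K_B = 1.0   # Boltzmann constant, natural units (illustrative choice)
EPS = 1.0   # energy of one excited unit, natural units

def entropy(N, n):
    """S = k ln(Omega) for n excited units out of N two-level units.

    Omega = C(N, n); lgamma keeps the computation stable for large N.
    """
    return K_B * (math.lgamma(N + 1) - math.lgamma(n + 1)
                  - math.lgamma(N - n + 1))

def inverse_temperature(N, n):
    """1/T = dS/dE, estimated by a central difference in n (E = n * EPS)."""
    return (entropy(N, n + 1) - entropy(N, n - 1)) / (2 * EPS)

N = 1000
# Below half-filling, entropy rises with energy -> positive temperature.
print(inverse_temperature(N, 100))
# Above half-filling, entropy falls with energy -> negative temperature,
# which is "hotter than infinite temperature" in this convention.
print(inverse_temperature(N, 900))
```

Nothing exotic happens to the measurement apparatus here: 1/T simply changes sign at the entropy maximum, which is the sense in which the gas in the article has a negative temperature.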
|
On January 08 2013 22:56 Evangelist wrote:Show nested quote +On January 08 2013 00:30 adwodon wrote:On January 07 2013 09:23 GGTeMpLaR wrote:On January 07 2013 07:47 adwodon wrote:On January 07 2013 07:22 Microsloth wrote:On January 07 2013 07:16 Chargelot wrote:On January 07 2013 06:12 Desertfaux wrote: Where's my flying car, goddamnit, its 2013 already. It's called a plane. On January 07 2013 06:08 remedium wrote: It's only a matter of time before physicists pull a Mines of Moria and unleash a Balrog on us. Just sayin'. No. Just sayin'. On January 07 2013 04:56 lumencryster wrote: now i'm going to wait till we can get from point A to point B faster than the speed of light. i mean, people didn't think it was possible to fly, seems ridiculous enough since we don't have wings, right? You cannot move faster than the speed of light. You cannot move at the speed of light. Any passage that you take to arrive somewhere, light will travel through it faster. Light will always win. Unless you're that particle that was accelerated beyond the speed of light. Words like cannot and always means you're predicting the future. To that I say: You'll never accurately predict the entire future. Ever. Suck it. You are correct when you say we can't predict the future, but what you're saying is akin to someone suggesting that one day everyone in America might wake up and start speaking Chinese out of the blue, you can't say it won't happen because the future hasn't happened yet, but it violates everything we know about language, learning, behavior etc so its a pretty solid bet (aka solid fact) that it definitely won't happen. I'd say your example is a lot less believable because it literally makes no sense that it could occur without a cause (which is what "out of the blue" implies). 
Having a higher velocity than the speed of light can at least make some sense if our current set of scientific paradigms is flawed, incomplete, or just wrong, which is definitely more possible than you're admitting. That was essentially the point, by out of the blue, I just mean suddenly, ie some crazy new learning tool allowing you to learn a new language overnight wasn't invented or some other explanation that would make sense with what we know. It would occur with a cause, but any cause would violate everything we understand about language, for instance that it is learned etc etc as a sudden, otherwise unexplainable, mass language 'shift' would imply. That's basically the same as discovering that we can move faster than the speed of light, and remember, this person wasn't talking about some kind of random new shiny particle with exotic properties, which I'll concede there may be a remote chance of discovering, they were arguing that you cannot say 'we' cannot ever travel faster than the speed of light because we can't predict the future. Considering we understand this far better than we do language / the human mind I'd say it was a perfectly reasonable statement, in fact I would say it would be far more believable that everyone would wake up speaking fluent Chinese than discovering normal particles can push past that barrier. Really? One of these problems is the spontaneous transmission of vast quantities of information to every single human being on the planet without an obvious vector capable of effecting such a change. The other is the violation of world lines. It is theoretically possible for faster than light particles but not this side of the relativistic barrier. We also have absolutely no idea how we might discover these particles.
I'll restate once more, seeing as people seemed to be confused by my example; it is rather vague, so I'll explain my thoughts behind it.
I basically meant that it is more likely that language is not actually learned; that all languages are ingrained in all of us and we 'learn' which one to use, or something equally bizarre, and somehow it switched so that we suddenly started to speak Chinese instead. Total nonsense, I know, but that kind of violation of everything we know about language, how it works, how the mind works, learning, etc., is still not as big as suggesting that NORMAL particles can break the relativistic barrier. I'm not talking about a hypothetical particle that doesn't exist and travels faster than light; the original post I was responding to was talking about us, we, people, macroscopic objects, travelling faster than light.
Could you also explain to me how it is theoretically possible to travel faster than light? The only ways I can find require you to violate Lorentz invariance, and seeing as no one has proved that possible, it's all just hypothesis and no solid theory.
On January 08 2013 22:51 Evangelist wrote: Newtonian mechanics are wrong. They are a generalisation which assumes a continuous energy distribution and a linear increase of energy with velocity which is of course wrong where v -> c as well as E -> 0 and m -> 0.
In fact they are so wrong that if we were to use Newtonian mechanics as they were originally intended we would not have the ability to treat cancer, amongst other things. Newtonian mechanics are a subset of relativistic mechanics where none of the above conditions apply. It's a simple exercise to derive the Newtonian force and energy equations from their relativistic expressions. It is not really the quantum regime where this applies - we use a different subset of mechanics for that based upon the quantization of properties of given particles where property distribution is no longer continuous.
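For anyone who wants that "simple exercise" spelled out, here is a sketch of the standard expansion (my working, not Evangelist's):

```latex
\gamma = \left(1 - \frac{v^2}{c^2}\right)^{-1/2}
       = 1 + \frac{1}{2}\frac{v^2}{c^2} + \frac{3}{8}\frac{v^4}{c^4} + \cdots
\quad\Rightarrow\quad
E_k = (\gamma - 1)\,m c^2
    = \frac{1}{2}mv^2 + \frac{3}{8}\frac{m v^4}{c^2} + \cdots
```

The leading term is exactly the Newtonian kinetic energy; the first relativistic correction is smaller by a factor of order (v/c)^2, which is why the two agree so closely at everyday speeds and diverge as v approaches c.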
It doesn't matter that they are useful. They are still wrong and in a lot of cases by several orders of magnitude or more. Relativistic equations will ultimately provide the more accurate answer even in traditionally Newtonian cases. However, there is little need for that kind of accuracy when dealing with macroscopic bodies as ultimately every model we use is a simplification of some sort.
What you are confusing the situation with is the problem of n-body simulation - something we don't deal with by simplifying equations in physics but by statistical averages and assumptions of stability.
Relativistic mechanics still makes generalizations and assumptions; if you use either of them to simulate a ball falling to the ground, neither will account for everything happening, but they are both accurate enough for all intents and purposes.
I'm not saying Newtonian mechanics are better, or that we should ignore relativistic mechanics, so I don't know why your little cancer point was relevant. Newtonian mechanics work, maybe not as all-encompassingly as Newton had thought, but maybe the same will be said about relativistic mechanics in a few hundred years.
To me that doesn't make them 'wrong' because I never saw them as something which tried to explain everything (as if that's some kind of goal of physics?) and they are still relevant today. If they were wrong, we could never have used them. If they were outdated we wouldn't still be using them. I could see an argument for saying Newton was wrong about the applications of his mechanics, that's fine, he was, but his mechanics are still valid.
They are simple, elegant, and provide accurate answers when used appropriately. Please explain to me how that makes them wrong in anything but an extremely pedantic sense?
Also, I know about statistical mechanics, thanks; I was merely demonstrating that we make assumptions to simplify systems, otherwise we'd be overwhelmed by the amount we had to deal with. That is why, at least as far as I see it, if you claim Newtonian mechanics are wrong then so is everything in physics: it all makes assumptions or generalizations which don't reflect the true nature of things but are there for our benefit when dealing with the maths.
So in short, why do you suggest that the assumptions / constraints Newtonian mechanics makes are invalid, yet accept the assumptions / constraints of other theories?
|
On January 08 2013 19:19 adwodon wrote:Show nested quote +On January 08 2013 07:59 GGTeMpLaR wrote: Newtonian physics has been falsified. Saying "it is only applicable within a certain set of boundaries" is exactly why it is fundamentally wrong (assuming you think that it is actually possible there is a right theory capable of holistically explaining the nature of reality).
A wrong theory can still yield correct predictions under some circumstances, but that doesn't make it any less wrong when it fails to accurately describe the nature of reality. Anyone can make an ad-hoc description of past events and claim they've discovered a theory that causally explains the nature of reality within certain boundaries (for example, Astrology is great at this).
Let's say ice cream sales increase and shortly following this trend, the rate of drowning increases. One might conclude that increased ice creams sales causes more people to drown. If you only look at a certain set of boundaries for the results yielded, this theory wouldn't be fundamentally wrong either. However, we all know that increases in ice cream sales doesn't actually cause more people to drown. Just because Newtonian physics is capable of yielding extremely accurate results in most every-day situations doesn't make it right. It is useful and pragmatic, but still fundamentally wrong (unless you're a hardcore pragmatist). I think you completely miss the point of physics and science in general making comments like that. No physicist will ever claim to know what truly happens in nature; this isn't something we aim for, because it's impossible, we can't know what happens. Instead we simply try to describe what we see. The general tool used for this is mathematics, as it has been shown over centuries to be an amazing tool for describing the world around us. This is basically physics: describing the behaviour and interactions of particles / objects through maths. This is where classical mechanics has been phenomenally successful, as it still provides us with accurate descriptions of the classical world. What you are confusing things with is that eventually we discovered there is more to the world than what we can immediately see. These generally concern extremes, like the extremely fast (relativistic mechanics) or the incredibly small (quantum mechanics). 
Classical mechanics falls apart at this point and these other forms of mechanics step in; however, you would never use quantum mechanics on macroscopic objects, etc. This does not mean classical mechanics is wrong; it is simply a collection of mathematical rules to describe the motions of macroscopic objects at non-relativistic speeds. This never changed, and it still works to make accurate predictions through simple models; the rules still apply, it is still correct. In fact it still does this better than anything else we have, but as mentioned above, it works within certain constraints / under certain assumptions; wander out of these and it will fall apart. Just think about it for a second: if you want a truly accurate description of even something simple, say a ball falling, you'd need to model every single particle, how it moves, how it interacts with the ball, and all the total outcomes. That's the only 'correct' model, at least according to how we understand the world now, which also makes assumptions and so is probably not entirely accurate. The whole thing is a massive waste of time, which is why physics is not about 100% guaranteed, totally accurate models of exactly what happens, but rather about making assumptions and applying constraints to make an extremely complex problem manageable (the ball falls with acceleration ~9.81 m/s^2). It's simply about modelling the world in ways we can understand. Some people might like to chase some kind of crazy equation that does it all, but most of us just simply want good equations to describe the parts we work on, and generally that's what we get.
I think you misinterpreted or misread what I was saying if you think I was advocating scientific realism. I'm well aware how science generally attempts to describe the nature of reality (and how such an approach may never be perfect, even if capable of yielding useful results).
You seem to be advocating a hard pragmatist stance of "if it works, it isn't wrong", which is what I was criticizing as naive. The fact that it works extremely well in everyday conditions is irrelevant to whether it is wrong or not. The fact that it fails to explain phenomena at extremities is all that is required to say it is in fact wrong and fails to accurately describe the true nature of reality. Whether it is even possible to come up with a general holistic theory capable of accurately describing the true nature of reality is irrelevant to the fact that Newtonian physics is specifically wrong.
You seem to be under the false impression that Newtonian physics yields perfect results at normal living conditions, which is not the case. Newtonian calculations are theorized in a vacuum that is oversimplified. The reason we say they "work" in everyday conditions is that the margin of error is negligible. This margin of error becomes much more noticeable at extremities, but it is always present. That is why it is wrong. Useful =/= Truth is my point, which you seem to disagree with.
Your last paragraph isn't why physics isn't 100% guaranteed either. The problem of induction is why physics will never guarantee absolute truth. Again though, this really has no impact on whether we can say Newtonian physics is wrong or not. Aristotelian physics works in some situations as well; it just makes many more metaphysical assumptions and its scope is much more limited than Newtonian physics. It seems like this wouldn't be enough to say it is wrong though, according to your pragmatist argument. You're right to note that we probably will never have 100% certainty through science in describing the nature of reality, but you're wrong to say that it follows that we can't say "X theory is wrong" when it clearly fails to describe reality, regardless of whether it is useful in some situations or not.
On January 08 2013 23:26 Douillos wrote:They aren't wrong, they just apply at a different scale. Comparing Quantum and Newtonian mecanics is just plain stupid to start with.
They are wrong. They continue to prove useful because the situations and scopes to which they are wrong are insignificant for everyday uses, but this doesn't grant them the sort of "diplomatic immunity" that makes them unable to be charged with being wrong.
On January 09 2013 00:20 synapse wrote:Show nested quote +On January 09 2013 00:14 Silvanel wrote:On January 09 2013 00:04 radscorpion9 wrote:On January 08 2013 23:26 Douillos wrote:On January 08 2013 22:51 Evangelist wrote: Newtonian mechanics are wrong. They aren't wrong, they just apply at a different scale. Comparing Quantum and Newtonian mecanics is just plain stupid to start with. I think the whole point is that Newtonian mechanics actually doesn't apply at *any* scale unless you want approximations of an answer. If you want an exact answer then you can't use Newtonian mechanics, because...why else? Its wrong. Einstein's theory of general relativity is something that does apply at a different scale, as it gives extremely precise answers on the macroscopic scale. It just doesn't work on the "quantum scale" obviously. You cant measure anything with infinite accuracy, therefore all calculations and results always bear some uncertainty born in intial measerment. The only real difference between Einsteins theory and Newtonian is level of accuracy when calculating something. Newtonian physics is just fine for most calculations. That is unless You are living in delusional world of theoretical physicist. It's not about accuracy, it's about Newtonian models not taking into account relativity. I wouldn't say that Newtonian physics are wrong, but they're certainly incomplete.
Newtonian physics provides an incomplete description of the nature of reality. It does not have perfect accuracy. That is precisely why it is wrong. Whether or not such a "perfect accuracy" exists is irrelevant to the fact that it doesn't have it.
|
On January 09 2013 07:46 adwodon wrote: Relativistic mechanics still makes generalizations and assumptions; if you use either of them to simulate a ball falling to the ground, neither will account for everything happening, but they are both accurate enough for all intents and purposes.
I'm not saying Newtonian mechanics are better, or that we should ignore relativistic mechanics, so I don't know why your little cancer point was relevant. Newtonian mechanics work, maybe not as all-encompassingly as Newton had thought, but maybe the same will be said about relativistic mechanics in a few hundred years.
To me that doesn't make them 'wrong' because I never saw them as something which tried to explain everything (as if that's some kind of goal of physics?) and they are still relevant today. If they were wrong, we could never have used them. If they were outdated we wouldn't still be using them. I could see an argument for saying Newton was wrong about the applications of his mechanics, that's fine, he was, but his mechanics are still valid.
They are simple, elegant, and provide accurate answers when used appropriately. Please explain to me how that makes them wrong in anything but an extremely pedantic sense?
Also, I know about statistical mechanics, thanks; I was merely demonstrating that we make assumptions to simplify systems, otherwise we'd be overwhelmed by the amount we had to deal with. That is why, at least as far as I see it, if you claim Newtonian mechanics are wrong then so is everything in physics: it all makes assumptions or generalizations which don't reflect the true nature of things but are there for our benefit when dealing with the maths.
"Relativity isn't entirely correct either!" doesn't make Newtonian physics any less wrong.
You've dug yourself into a hole where your definition of "wrong" really makes it hard to pin down anything as "wrong", and so the entire concept of right/wrong is useless. Anything has the potential to be useful.
Your italicized claim is entirely baseless. Just because something is wrong doesn't mean it can't be useful, which is what I tried to explain originally.
Why do you have a problem with your bold statement? That is literally the standard definition of "wrong" (making assumptions or generalizations which don't reflect the true nature of things).
The only way I can see your argument working is if you take up the position of an instrumentalist/pragmatist that just accepts truth as equivalent to utility. If that's the case, then we've just been talking over each other and there isn't really anything else to argue about relevant to this thread.
|
This still seems a big breakthrough no?
Kinda challenges how we measure temperature no?
|
Doesn't volume vary with temperature? Zero temperature = zero volume. And now negative temperature... I feel like negative volume is a good way to make a black hole or something lol.
|
dayum. that's all i gotta say
|
On January 09 2013 08:57 Luepert wrote: Doesnt volume vary with temperature? 0 temperature= zero volume. And now negative temperature, I feel like negative volume is a good way to make a black hole or something lol.
This would be good for one of those troll science comics with the ragefaces.
|
On January 09 2013 08:44 heroyi wrote: This still seems a big breakthrough no?
Kinda challenges how we measure temperature no? No. This doesn't change science's understanding of temperature or how to measure it. It merely shows a new type of material that can be made to have a negative temperature (a gas).
|
On January 09 2013 05:20 micronesia wrote: What you are saying sounds reasonable, but rather than calling it hair-splitting I'd say we are kinda going off the deep end here, all because I worded something somewhat ambiguously that is difficult to word clearly. This all began because some people were saying things which showed they don't really understand the scientific implications of the study in the article (this is why I thought a moderator note would be helpful, also).
There is a reason why we currently define (mathematically) temperature the way we do. It is quite difficult a topic to fully understand though.
To nitpick a bit further - I don't think it's correct to say that "science" (it's a bit of a dirty word) defines temperature like that. The "temperatures" differ somewhat between classical thermodynamics and statistical thermodynamics, for example, I think. Some of the quite common formulas would go bonkers if negative temperatures were allowed (pV=nRT, for example, wouldn't make much sense). It's not even a particularly good definition, to my fairly limited understanding (the math seems to get rather messy very quickly). Wouldn't it be better to define such a system as having negative entropy?
Edit: I suppose the negative Entropy thing is a bit tricky for the statistical mechanics people.
|
On January 09 2013 09:47 Hundisilm wrote:Show nested quote +On January 09 2013 05:20 micronesia wrote: What are you saying sounds reasonable, but in lieu of calling it hair splitting I'd say we are kinda going off the deep end here all because I worded something somewhat ambiguously that is difficult to word clearly. This all began because some people were saying things which showed they don't really understand the scientific implications of the study in the article (this is why I thought a moderator note would be helpful, also).
There is a reason why we currently define (mathematically) temperature the way we do. It is quite difficult a topic to fully understand though. To nitpick a bit further - I don't think it's correct to say that "science" (it's a bit of a dirty word) defines temperature like that. The "temperatures" differ somewhat between classical thermodynamics and statistical thermodynamics for example I think. Some of the quite common formulas would go bonkers if negative temperatures were allowed (pV=nRT for example wouldn't make much sense) although the temperature . It's not even a particularly good definition to my fairly limited understanding (the math seems to get rather messy very quickly). Wouldn't it be better to rather define such a system as having negative entropy? Edit: I suppose the negative Entropy thing is a bit tricky for the statistical mechanics people.
First, pV=nRT has only a limited domain where it is a good approximation. Second, they actually observed negative pressure, so you actually wouldn't have a problem with signs there (that doesn't mean that formula applies, though).
|