On December 09 2011 19:30 Cascade wrote: Hi, I'm doing research in physics, and these are my thoughts. Sorry for the wall of text; I bolded the "title" of each paragraph so you know what it's about.

- Got a bit curious about this thing with people referring to the US units as "standard", so I went to Wikipedia for US units (btw, googling "wikipedia standard units" gives the SI units as the first hit). I searched the page for "standard" and found no reference to the system as "the standard system". However, there is a disambiguation page for "standard units" that points both to units of measurement in general and to the US set of units. I interpret that as meaning the US system is sometimes referred to as "standard units", but that is probably not its official name.
And I have never heard of SI units not being "technically" (whatever that means) allowed to be divided by things. It seemed like only one (very biased) person claimed that was the case, so unless someone can back it up with a reference, I'm going to disregard that statement.

I have always divided my meters happily, and I have no intention of stopping.
Further, it is of course true that in science, and in any profession where you measure things regularly, the metric system is superior. (Although yes, a 12-base or 8-base system would be even superiorer.) People bring up single examples where an inch or a pint or whatever is very good for measuring something specific (my Irish friends claimed that half a liter of beer wasn't enough and a liter was too much, but a pint was "just right". Nonetheless they would spend the evening drinking 4-5 pints). Most of these cases are just because they are already used to the units, so they feel intuitive. My Irish friends would for sure argue that a pint was too much if they were used to drinking half a liter. And exactly what makes 3.15mm objectively a better interval for measuring [whatever it was] than 2mm? And even if there are some cases where you can make this kind of unbiased argument, there will be equally many cases where the metric system happens to hit a "good" number. Unless you want to measure the length of feet or thumbs, you cannot argue that those units are superior.

Actually, the meter was originally defined such that the distance from the equator to the North Pole is 10 Mm (that's right, that's 10 megameters! :D). From that I can estimate large distances: no two points on Earth can be more than half the 40 Mm circumference apart, so Sweden-Australia is a bit below 20 000 km. Cute, right?
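If you don't trust the estimate, here's a minimal Python sketch that derives the Earth radius from that old 10 Mm definition and plugs it into the standard haversine formula. The city coordinates are approximate and the function name is just something I made up:

```python
from math import asin, cos, pi, radians, sin, sqrt

# Earth radius implied by the original meter definition:
# equator to pole = 10 Mm, so the full circumference is 40 Mm.
R_KM = 40_000 / (2 * pi)  # about 6366 km

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance between two (lat, lon) points, in km."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * R_KM * asin(sqrt(a))

# Stockholm (59.33 N, 18.07 E) to Sydney (33.87 S, 151.21 E)
print(great_circle_km(59.33, 18.07, -33.87, 151.21))  # ~15 600 km, below the 20 000 km cap
```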
The one exception to SI units being useful is the kelvin, which I think is objectively a bit inconvenient for our everyday life. Why, you ask? Because the "0" of kelvin is not at all intuitive for us, and all our temperatures in everyday life end up between roughly 250 and 350 K, which is a bit stupid. We all know how long 0 meters (or 0 inches, for that matter) is, and what 0 kg and 0 seconds are. 0 kelvin? No idea. For us it gets kind of cold in the winter, super cold in the freezer, in Siberia it's crazy cold, and Antarctica is penguin cold. And it can get even colder!! But 0 K? No idea. Stupid unit for everyday use.
Celsius vs Fahrenheit, I'd say they are comparable. The Celsius zero makes a bit more sense imo, as that is when it can start snowing and water will freeze, which pops up frequently in everyday use (at least here in Sweden...). For the 100, though, I'd say body temperature is a bit more useful (105? that's a fever!!) than the boiling point of water (does anyone take the temperature of water that's about to boil?) in everyday life. This can ofc be discussed though.
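To make the comparison concrete, here's a tiny Python sketch of how the three scales relate; the example temperatures are just ones I picked:

```python
def c_to_k(c):
    """Kelvin is just Celsius shifted: same degree size, different zero."""
    return c + 273.15

def c_to_f(c):
    """Fahrenheit has both a different zero and a different degree size."""
    return c * 9 / 5 + 32

# From a cold Swedish winter day to a fever-hot one:
for c in (-30, 0, 37, 40):
    print(f"{c:>4} °C = {c_to_k(c):6.2f} K = {c_to_f(c):5.1f} °F")
# Everyday life lands roughly between 243 K and 313 K: nowhere near 0 K.
```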
Also, while SI units are superior for professionals, I can perfectly well understand non-professional US people who are used to their units and don't feel like relearning. For most everyday use, you don't have to do the kind of conversions between units that SI people like to bring up. So an ordinary US person who is already used to their units, and doesn't have a job where these conversions come up, probably would not benefit from relearning. Compare it to someone researching history and calendars figuring out that it would make more sense to redefine year 0 to the Second World War, as that would make history and bookkeeping easier, but wouldn't really affect everyday life much apart from you having to relearn your birth year etc. No matter how reasonable the argument was, I would not be very happy about having to relearn. And all you SI people going "l2SI" (I actually saw that very expression in a stream chat), would you like to start referring to time in terms of decidays (1 dd = 2.4 hours), centidays (1 cd = 14.4 minutes) and millidays (1 md = 86.4 seconds)? If you are not ready to start using those units, don't be too aggressive towards people being lazy about the US units.
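And in case anyone wants to try, here's a minimal decimal-clock sketch in Python; the function name is hypothetical, and the factors are just 86 400 seconds per day split into 1000 millidays:

```python
# 1 deciday = 2.4 h, 1 centiday = 14.4 min, 1 milliday = 86.4 s.
def to_millidays(hours=0, minutes=0, seconds=0):
    """Convert ordinary clock time to millidays (1 day = 1000 md)."""
    total_seconds = hours * 3600 + minutes * 60 + seconds
    return total_seconds / 86.4  # 86 400 s per day / 1000 md per day

print(to_millidays(hours=2, minutes=24))     # 100.0 md = 1 deciday
print(to_millidays(minutes=14, seconds=24))  # 10.0 md = 1 centiday
```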

Feel free to take down people trying to argue that US units are actually better though.
- See you in a centiday then!
- What? >_>
- L2SI!!!
or
- omg, it'll take at least a deciday to finish these chores...
- What? >_>
- L2SI!!!
or
- I went home with this guy last night after the club... he barely lasted a milliday... what a loser.
- What? >_>
- Yeah, incredible, I know. Only milliday...
- No, I mean, what is a milliday?
- L2SI!!!!
You get the point.
