Ask and answer stupid questions here! - Page 767
Sent.
Poland9084 Posts
Are there any non-English speaking countries other than mine that have their own equivalent of the American "Karen"? Recently I've read that Germans didn't have it until recently, and that surprised me a little because I thought developing your own version of the Karen meme is something that sooner or later should come naturally.

Our version is pretty old and by now we have a whole family tree of Karen equivalents in Poland. It started with Janusz, the stereotypical middle-aged Pole. Then came his wife, Grażyna. Then their son, Seba(stian). Then their daughter, Karyna. Then their grandkids Brian and Jessica (this is poking fun at people giving their kids foreign names to make them special).
Jockmcplop
United Kingdom9213 Posts
On July 31 2020 00:45 Sent. wrote: Are there any non-English speaking countries other than mine that have their own equivalent of the American "Karen"? Recently I've read that Germans didn't have it until recently, and that surprised me a little because I thought developing your own version of the Karen meme is something that sooner or later should come naturally. Our version is pretty old and by now we have a whole family tree of Karen equivalents in Poland. It started with Janusz, the stereotypical middle-aged Pole. Then came his wife, Grażyna. Then their son, Seba(stian). Then their daughter, Karyna. Then their grandkids Brian and Jessica (this is poking fun at people giving their kids foreign names to make them special).

In the UK we have Gammon, which is hilarious.
Harris1st
Germany6666 Posts
On July 31 2020 01:35 zatic wrote: Germany doesn't have a Karen but we do have other stereotypical names. "Kevin" is the term for a dumb young person, "Sandy/Mandy" for the female version. "Horst" is strictly speaking ageless but more fitting for middle-aged and older men. There are probably others, and some terms might be regional too.

There are more. We have Lisa, 19: someone who went abroad after school for a few months and as a result doesn't understand German anymore. And Manny: your typical bus driver (or train conductor?). Jodel has 20 more of these, I just can't remember them right now.
FiWiFaKi
Canada9858 Posts
If processing speeds increased 1,000,000x (plus all associated bottlenecks like memory speed, storage sizes, etc), what fundamental things would change in our society? You know, like more meaningful things than prettier textures in games and higher video quality. Would it allow us to do things that are currently unthinkable?
Uldridge
Belgium4470 Posts
First thing that springs to mind is that the number of simulations we can perform, from the molecular down to the sub-atomic level, would be so much higher and so much more reliable that we'd be able to base all our assumptions on them instead of doing all the necessary (and slow) experiments in the real world. Basically, all the infrastructure we have now to do these things, such as particle accelerators, would become obsolete imo. Why pour money into all these crazy engineering feats if you can rely on mathematics that will be proven to be correct anyway?
Simberto
Germany11258 Posts
On August 30 2020 10:50 Uldridge wrote: First thing that springs to mind is the amount of simulations that vary from molecular to sub-atomic level we can perform will be so much higher and so much more reliable, we'd be able to use to base all our assumptions on instead of doing all the necessary (and slow) experiments in the real world. Basically, all the infrastructure we have now to do things, which are all particle accelerators, would become obsolete imo. Why pour money into all these crazy engineering feats if you can rely on mathematics that will be proven to be correct anyway?

Some of this is true, but a lot of it is not. You can only simulate stuff that you know stuff about. You cannot just skip all real-world experiments and hope for the best, especially in situations where you have two or more competing theories. The reason we need huge particle accelerators, or huge telescopes, is precisely that we have awesome math, but we need to figure out whether that awesome math actually fits the reality we live in. Math can describe a lot of internally consistent worlds, and many of those are superficially similar to the one we live in. The job of real-world experiments is to figure out which of those worlds we actually live in.

One big thing that would change immediately is that basically all currently encrypted data would be very unsafe, and almost all passwords might as well be plain text. Also, far more big data stuff, because it would get a lot cheaper to analyze your data.
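The encryption point can be made concrete with back-of-envelope arithmetic. A minimal sketch in Python; the guesses-per-second figure is an invented illustration, not a benchmark of any real cracking setup:

```python
# Rough estimate: how a 1,000,000x speedup changes brute-force times.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def crack_time_years(keyspace: float, guesses_per_second: float) -> float:
    """Average brute-force time: half the keyspace at the given guess rate."""
    return (keyspace / 2) / guesses_per_second / SECONDS_PER_YEAR

baseline = 1e10           # assumed guesses/second on today's hardware
sped_up = baseline * 1e6  # the thread's hypothetical 1,000,000x machine

# 8-character lowercase+digit password: 36^8 combinations
pw_space = 36 ** 8
print(f"password today:   {crack_time_years(pw_space, baseline):.2e} years")
print(f"password sped up: {crack_time_years(pw_space, sped_up):.2e} years")

# 128-bit symmetric key: 2^128 combinations
key_space = 2 ** 128
print(f"128-bit today:    {crack_time_years(key_space, baseline):.2e} years")
print(f"128-bit sped up:  {crack_time_years(key_space, sped_up):.2e} years")
```

Under these made-up rates the arithmetic suggests a nuance: a flat 10^6 speedup demolishes short passwords (already weak today), while a 128-bit symmetric keyspace still takes astronomically long; the sharper threat to encryption would come from better algorithms rather than raw speed alone.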
Dangermousecatdog
United Kingdom7084 Posts
On August 30 2020 10:50 Uldridge wrote: First thing that springs to mind is the amount of simulations that vary from molecular to sub-atomic level we can perform will be so much higher and so much more reliable, we'd be able to use to base all our assumptions on instead of doing all the necessary (and slow) experiments in the real world. Basically, all the infrastructure we have now to do things, which are all particle accelerators, would become obsolete imo. Why pour money into all these crazy engineering feats if you can rely on mathematics that will be proven to be correct anyway?

Just because something is mathematically beautiful doesn't mean it's true. You cannot simulate reality if you don't know the rules of reality. A lot of mathematically beautiful physics gets thrown out as experiments show it doesn't model how reality actually works, which is why the discovery of a particle that seems to match the properties of the Higgs boson was celebrated a few years back. We simply don't know which maths actually models reality and what the actual properties may be. In the first place, the reason mathematical models work is that many millions of people have worked on the problem over a long period of time, adding to knowledge as part of a continuous iterative process of figuring out how and which mathematical equations can model and fit the real-world problem. On a fundamental level, nobody knows how anything "works". We just have models that closely follow reality. You use the maths to fit reality. It doesn't work the other way round.
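The "maths fits reality" direction can be shown with a toy calibration: a model's free parameter is recovered from measurements, not from computation alone. A minimal sketch; the "measurements" here are fabricated for the demo:

```python
# Toy illustration: a model (h = 0.5*g*t^2) has a free parameter g that
# only observation can pin down. We fabricate noisy fall-time "data" and
# recover g by least squares -- the computer never knew g in advance.
import random

random.seed(0)
TRUE_G = 9.81  # stands in for reality, which the model does not know

heights = [5.0, 10.0, 20.0, 40.0, 80.0]  # drop heights, metres
# "measured" fall times t = sqrt(2h/g), with 1% multiplicative noise
times = [(2 * h / TRUE_G) ** 0.5 * (1 + random.gauss(0, 0.01)) for h in heights]

# Least squares for h = g * x with x = 0.5*t^2:  g = sum(h*x) / sum(x^2)
num = sum(h * 0.5 * t * t for h, t in zip(heights, times))
den = sum((0.5 * t * t) ** 2 for t in times)
g_fit = num / den

print(g_fit)  # close to 9.81 -- but only because we had "observations"
```

A faster computer shrinks the fitting time, not the need for the measurements.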
Uldridge
Belgium4470 Posts
I thought our understanding of molecular interactions was quite 'figured out' in a sense, being able to make very sound propositions as to, for example, how well a certain molecule of interest binds in the pocket of a clinically significant protein. The pharma industry would be very happy to be able to simulate 1M times faster, since it would increase not only the simulated timescales (femto- to nanosecond simulations are often what can be done for high-fidelity, quantum-dynamics simulations), but also the number of simulations that can be done. The number of interesting molecules that can be put into the pipeline would be orders of magnitude greater. If our maths is close enough to reality, we wouldn't need verification any longer, or only very rarely (as a way to check whether we're still correct).
Acrofales
Spain17746 Posts
On August 31 2020 02:07 Uldridge wrote: I thought our understanding of molecular interactions was quite 'figured out' in a sense, being able to make very sound propositions as to, for example, how well a certain molecule of interest binds in the pocket of a clinically significant protein. The pharma industry would be very happy to be able to simulate 1M times faster, since it not only increases the simulation times (femto- to nanosecond simulations are often what can be done for high-fidelity simulations - quantum dynamics), but also the amount of simulations that can be done. The amount of interesting molecules that can be put into the pipeline would be orders of magnitude greater. If our maths is close enough to reality, we don't need verification any longer, or very rarely (kind of like a way to see if we're still correct or not).

Yeah, sure, and a millionfold increase in computing power (without a corresponding increase in power requirements and heat generation, which are the current bottlenecks rather than raw speed) would definitely allow simulations to improve by leaps and bounds, but particle accelerators wouldn't suddenly become obsolete. In fact, it might even require bigger, faster accelerators to figure out which of the myriad mathematically beautiful models continues to match empirical results as we smash particles together at higher and higher energies.

A very intriguing domain that would be blown wide open is social simulation. It is currently just getting off the ground, but requires data, social theory and lots of computation (or very simple models of human behavior). More computing power would rapidly accelerate scientific areas like behavioral economics, computational social science, and such. Obviously weather and climate modelling would also improve by leaps and bounds.
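The femto-to-nanosecond numbers translate directly into what a millionfold speedup would buy. A rough sketch; the timestep and throughput figures are illustrative assumptions, not benchmarks of any real molecular dynamics code:

```python
# Back-of-envelope for molecular dynamics reach. A typical MD integration
# step is on the order of femtoseconds; steps/day is an invented round
# number standing in for today's throughput on one machine.
TIMESTEP_FS = 2.0      # assumed integration step, femtoseconds
STEPS_PER_DAY = 5e8    # assumed steps per wall-clock day today
SPEEDUP = 1e6          # the thread's hypothetical 1,000,000x

def simulated_time_ns(steps_per_day: float, days: float = 1.0) -> float:
    """Physical time covered per wall-clock run, in nanoseconds."""
    return steps_per_day * days * TIMESTEP_FS * 1e-6  # fs -> ns

today = simulated_time_ns(STEPS_PER_DAY)             # ~1 microsecond/day
future = simulated_time_ns(STEPS_PER_DAY * SPEEDUP)  # ~1 second/day
print(today, "ns/day today vs", future, "ns/day sped up")
```

Under these made-up numbers, a day of wall-clock time goes from roughly a microsecond of simulated dynamics to roughly a second; alternatively, you could spend the same budget running the same nanoseconds across a million candidate molecules, which is the screening-pipeline point above.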
Uldridge
Belgium4470 Posts
Is there a point then where we can substitute mathematics for reality? Do we need all the details, or can we make some estimations and still get results that are relevant enough for what we want to achieve? Things where the tiny differences matter obviously should be as precise as possible.
Harris1st
Germany6666 Posts
With that kind of computing power I'm a bit afraid of what abusive minds could do with it. But also the possibilities! I read a book (fiction) a while back where a mathematical genius could basically see the future and all possible outcomes of a situation based on all the data available at that moment. The basis for this was Laplace's demon. Fascinating stuff.
Acrofales
Spain17746 Posts
On August 31 2020 18:44 Harris1st wrote: With that kind of computing power I'm a bit afraid of what abusive minds could do with it. But also the possibilities! I read a book (fiction) a while back where a mathematical genius could basically see the future and all possible outcomes of a situation based on all the data available at that moment. The basis for this was Laplace's demon. Fascinating stuff.

Was that mathematician actually a psychohistorian called Hari Seldon?

On August 31 2020 17:13 Uldridge wrote: Is there a point then where we can substitute mathematics for reality? Do we need all the details, or can we make some estimations and still get results that are relevant enough for what we want to achieve? Things where the tiny differences matter obviously should be as precise as possible.

Never? At least, not from a scientific point of view. You can use mathematical models to predict lots of things, but without actual observations you won't know which of the models is entirely correct. Until, of course, you have built in all possible observations and your simulation is an exact replica of the observable world, which, according to the laws of thermodynamics, is impossible.

From a pragmatic "it works well enough" point of view, it's obviously going to depend on the domain. Most new designs for almost anything nowadays are tested first in simulations before prototyping, as simulating material conditions, pharmaceutical effects, construction integrity, etc. is cheaper, faster and easier than building a prototype. Sometimes we can skip prototyping entirely (e.g. a lot of complex construction projects). Sometimes we still need it but maybe won't in the near future. Other times we might never reach the point where we don't need some empirical testing.
Harris1st
Germany6666 Posts
On August 31 2020 19:26 Acrofales wrote: Was that mathematician actually a psychohistorian called Hari Seldon?

I googled a bit cause I was curious myself. It was Improbable by Adam Fawer https://www.goodreads.com/book/show/145444.Improbable
Dangermousecatdog
United Kingdom7084 Posts
On August 31 2020 17:13 Uldridge wrote: Is there a point then where we can substitute mathematics for reality?

No. I already explained this. I don't really understand how you can think otherwise. Mathematical modelling is used to describe reality, not the other way round.

Example: using computers to simulate protein folding works because we understand molecular bonding to the level that we modelled it successfully, years before the microchip. Processing power just makes calculations faster; it doesn't change our fundamental understanding and modelling. First comes understanding, then comes simulation. It doesn't flow the other way round. The computer doesn't know how reality works; it only calculates what you tell it to calculate. How would the simulation know what to simulate? You cannot simply use processing power to discover your way to the secrets of the universe.

Another example. Say an engineer has a project that involves modelling the fluid flow around a wing or a ship hull under certain conditions. Ten years ago I would have had to leave the computer on for weeks. Nowadays that same simulation would take 24 hours. But someone has to go and input all the parameters, select which algorithms to use, which are applicable, and where to apply them, to get that result. Then build a test rig and find out that it doesn't perfectly model reality anyway. Sure, it might be faster, and sometimes it might be "good enough" unless you happen to be on the plane when the engine falls off, but a phenomenon that isn't perfectly modelled is still not perfectly modelled. You can't find out reality from virtual simulation, because the virtual simulation is based on inputs based on modelling reality.

Another, more banal, example. 200 years ago, someone built a train/boiler/metal strut to last 20 years, but it consistently and catastrophically broke down in 10 years. According to all known knowledge at the time this shouldn't happen. Sounds silly, doesn't it, that if you tap a piece of aluminium with your finger many times, eventually it will snap. Very unintuitive, but it turns out that reality is unintuitive sometimes. We know today that metals can break under fatigue, that is to say under small repeated loadings (it's a bit more complicated than that), but they didn't know that. Time-travelling a modern computer back to them wouldn't help them solve why the train is breaking down so early, because the failure is entirely inconsistent with their mathematical models of reality. It wouldn't matter what the computer does, because they cannot feed it a mathematical model that aligns more closely with reality, nor can a computer work out from sheer processing power that small repeated loadings will cause catastrophic failures. You can only find out by testing reality.
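For what it's worth, the fatigue phenomenon in that example is handled today with empirical S-N curves and Palmgren-Miner damage summation, both of which exist only because of exactly the kind of real-world testing described above. A toy sketch; the curve constants are invented for illustration, since real values come from fatigue tests:

```python
# Palmgren-Miner cumulative damage: each stress level S contributes
# (cycles applied) / (cycles to failure at S); failure is predicted as
# the sum approaches 1.0. Constants below are made up for the demo.

def cycles_to_failure(stress_mpa, a=1e12, b=3.0):
    """Basquin-style S-N curve: N = a * S^(-b). Constants are illustrative."""
    return a * stress_mpa ** (-b)

def miner_damage(load_blocks):
    """Sum n_i / N_i over (stress in MPa, applied cycles) blocks."""
    return sum(n / cycles_to_failure(s) for s, n in load_blocks)

# One service year: mostly small loads, occasional large ones
service_year = [(80.0, 2e5), (120.0, 3e4), (200.0, 1e3)]
damage = miner_damage(service_year)
print(damage, "damage/year ->", 1.0 / damage, "years predicted life")
```

The point stands either way: the equation is a curve fitted to broken test specimens, not something a computer could have derived from 1820s physics, however fast.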
zatic
Zurich15307 Posts
On August 30 2020 08:06 FiWiFaKi wrote: If processing speeds increased 1,000,000x (plus all associated bottlenecks like memory speed, storage sizes, etc), what fundamental things would change in our society? You know, like more meaningful things than prettier textures in games, and higher video quality. Would it allow us to do things that currently are unthinkable?

Honestly, very little. There is hardly anything fundamental in our society that is purely constrained by computation. Narrow AI would benefit the most, as there the constraint is indeed computation, so you would see more applications of things like GPT-3. But the thing about GPT-3 is exactly that it is not fundamentally different from previous narrow AI cases; it's just throwing more computation at the same problems. Research would benefit at the margins, but not fundamentally, as it is constrained by a lot more than just computing power.
AbouSV
Germany1278 Posts
On August 30 2020 08:06 FiWiFaKi wrote: If processing speeds increased 1,000,000x (plus all associated bottlenecks like memory speed, storage sizes, etc), what fundamental things would change in our society? You know, like more meaningful things than prettier textures in games, and higher video quality. Would it allow us to do things that currently are unthinkable?

A couple of other research domains would be affected, and they could potentially take a huge leap forward. But as Simberto mentioned, the biggest impact by far on our current society would be the uselessness of encryption for some time, which I could easily see becoming a terrible dystopia.

On the science side, an example of a domain that would benefit a lot from this is nuclear fusion. It is currently limited by two things:
- The amount of money people/countries are willing to put in (currently only VERY large scale projects are relevant).
- Computing power. There are some theories around, and we already have more than enough experiments to test them. But they are 7D and concern way too many particles, which makes them realistically impossible to run as is. So many simplifications are made to have them run in decent times on supercomputers, and so far none is good enough to help us with the global picture.
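The "7D" remark (presumably 3 space + 3 velocity dimensions plus time, as in kinetic plasma models; my reading) implies brutal scaling, which a quick estimate makes vivid. All resolutions and cost figures below are invented round numbers, not properties of any actual fusion code:

```python
# Why full kinetic fusion simulations are out of reach: a grid over
# 3 space + 3 velocity dimensions, advanced in time, costs N^6 per step.
def phase_space_cells(n_per_dim: int) -> int:
    """Cells in a 6D (space x velocity) phase-space grid."""
    return n_per_dim ** 6

n = 1000                 # assumed grid points per dimension
steps = 1_000_000        # assumed number of time steps
flops_per_cell = 100     # assumed cost of one cell update

total_flops = phase_space_cells(n) * steps * flops_per_cell  # 1e26
exaflop_machine = 1e18   # roughly a top supercomputer today, flop/s

SECONDS_PER_YEAR = 3600 * 24 * 365
print(total_flops / exaflop_machine / SECONDS_PER_YEAR, "machine-years today")
print(total_flops / (exaflop_machine * 1e6), "seconds at 1,000,000x")
```

Under these assumed figures, a run that would tie up an exascale machine for years drops to minutes, which is the sense in which a millionfold speedup changes what counts as a feasible simulation.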