|
I don't know if this has been discussed on these forums before. There are many views of what the technological singularity is, and it's kind of hard to explain, so I'll just quote the wiki to get us started 
http://en.wikipedia.org/wiki/Technological_singularity
The technological singularity is the hypothesized creation, usually via AI or brain-computer interfaces, of smarter-than-human entities who rapidly accelerate technological progress beyond the capability of human beings to participate meaningfully in said progress. Futurists have varying opinions regarding the timing and consequences of such an event.
Vernor Vinge originally coined the term "singularity" in observing that, just as our model of physics breaks down when it tries to model the singularity at the center of a black hole, our model of the world breaks down when it tries to model a future that contains entities smarter than human.
Statistician I. J. Good first explored the idea of an "intelligence explosion", arguing that machines surpassing human intellect should be capable of recursively augmenting their own mental abilities until they vastly exceed those of their creators. Vernor Vinge later popularized the Singularity in the 1980s with lectures, essays, and science fiction. More recently, some AI researchers have voiced concern over the Singularity's potential dangers.
Some futurists, such as Ray Kurzweil, consider it part of a long-term pattern of accelerating change that generalizes Moore's law to technologies predating the integrated circuit. Critics of this interpretation consider it an example of static analysis...(for longer explanation click the link above;) )
Here are some links if you want more information:
http://www.singinst.org/
"SIAI is a not-for-profit research institute in Palo Alto, California, with three major goals: furthering the nascent science of safe, beneficial advanced AI through research and development, research fellowships, research grants, and science education; furthering the understanding of its implications to society through the AI Impact Initiative and annual Singularity Summit; and furthering education among students to foster scientific research."
http://www.singinst.org/media/singularitysummit2006
Videos from the 2006 Singularity Summit. Audio from the 2007 Summit can also be found on the site.
http://www.ted.com/index.php/talks/view/id/38?gclid=CNb_4PeCvI8CFQXOQwod23vvdA
This is a TED talk given by Ray Kurzweil on accelerating change. I really recommend this 23-minute talk to anyone who is new to the subject of accelerating change and the singularity.
Let me continue by arguing for why I think that smarter-than-human intelligence will inevitably come around sooner or later. Let me start by saying that I am an atheist and a materialist in the philosophical sense. That is, I don't believe in a "soul" or any other form of dualism. I believe that everything can be explained by the laws of nature, and that includes our brains and minds.
One big reason for this is simply that I believe in Darwin's theory of natural selection. If we did have a soul, or if there was something beyond this world about humans and our minds, when along our evolution did we get it? Which generation had parents without souls but children who suddenly had them? Was it in the change from monkey to human? Or back when we were reptiles? Or bacteria? Or maybe the chemical molecules around the origin of life on Earth had souls?
Since there is nothing special about the human brain or mind, I see no reason why it can't be reproduced, given that it has been done at least once in nature. With our accelerating change I think we will overtake it soon. There is the argument that it might be impossible for an intelligence to understand itself: to understand itself it would have to be that much smarter, which would then be even harder to understand, and so on, never overcoming that barrier. I don't think this is true; a lot of progress has been made in brain scanning and the understanding of the brain over the last couple of years. And even if it were true, and a conscious creator couldn't create an intelligence greater than its own, we could still reach the singularity by simulating an accelerated version of evolution, either in computer software or organically.
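Just to make the "simulated evolution" idea concrete: it's the same loop as any basic evolutionary algorithm. Here's a toy Python sketch (everything here is an arbitrary illustration of mine, with a trivial bit-counting fitness function standing in for "intelligence", nothing to do with how real AI research works):

```python
import random

# Toy "simulated evolution" loop: candidate solutions mutate,
# the fitter half survive each generation, and fitness climbs.
# Fitness function and all parameters are arbitrary toy choices.

def fitness(genome):
    # Toy objective: number of 1-bits in the genome.
    return sum(genome)

def mutate(genome, rate=0.05):
    # Flip each bit independently with probability `rate`.
    return [bit ^ (random.random() < rate) for bit in genome]

def evolve(pop_size=20, genome_len=32, generations=200, seed=0):
    random.seed(seed)
    population = [[random.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[:pop_size // 2]   # keep the fitter half
        population = survivors + [mutate(g) for g in survivors]
    return max(population, key=fitness)

best = evolve()
print(fitness(best), "of 32 bits set")
```

The relevant point for the thread is the generation time: this loop runs thousands of "generations" per second, where biological evolution takes decades per generation.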
I think we live in interesting times. This whole thing might sound like the nerd's rapture or something, but the Singularity is an entirely secular, non-mystical process, not the culmination of any form of religious prophecy or destiny. Since this subject is so big and I can hardly give all the facts and links in a single post, I hope that if you are interested in artificial intelligence and/or technological change, you follow the links I gave above. And please discuss 
|
Sweden 33719 Posts
Too interesting a thread to have it drop out of sight; gonna bump it and hope people with more useful input than my own see it 
|
|
Artosis
United States 2140 Posts
|
How could I miss that thread? Oh well, it has been more than a year since then and it deserves to be discussed again. Great OP in that thread by travis; I hope he joins the discussion in this thread too. That last thread only got like 20 comments anyway. I hope that's not a sign that people don't find this interesting :S Can't really understand how you couldn't.
|
I saw a quote that went something like: 'the singularity is the intelligent design for people with high IQ'
I must admit I'm not an expert on the topic, though I am fascinated by it, and have read Kurzweil's book The Singularity Is Near.
I personally find it most plausible (assuming knowledge is not restricted for whatever reasons, political, economic, etc.) that for some time (how many years I don't know) we will enhance our biochemical intelligence through knowledge and manipulation of genetics, neural effectors, and environment. We may integrate silicon 'neurons' with biochemical neurons. This will go on for a while before we create brand new intelligence with silicon chips or whatever. My point is that maybe, instead of creating new intelligence that will dominate poor humans, humanity itself will be enhanced as knowledge progresses. As for reaching the singularity, it seems a plausible idea, but who can really know?
|
On November 02 2007 04:10 Blue wrote: I saw a quote that went something like: 'the singularity is the intelligent design for people with high IQ'
I must admit I'm not an expert on the topic, though I am fascinated by it, and have read Kurzweil's book The Singularity Is Near.
I personally find it most plausible (assuming knowledge is not restricted for whatever reasons, political, economic, etc.) that for some time (how many years I don't know) we will enhance our biochemical intelligence through knowledge and manipulation of genetics, neural effectors, and environment. We may integrate silicon 'neurons' with biochemical neurons. This will go on for a while before we create brand new intelligence with silicon chips or whatever. My point is that maybe, instead of creating new intelligence that will dominate poor humans, humanity itself will be enhanced as knowledge progresses. As for reaching the singularity, it seems a plausible idea, but who can really know?
I tend to think this scenario is much more likely to happen than creating AI that surpasses our own as well, but I am not read up on the subject yet, so I can still be swayed either way ;p
|
That is crazy, how AI would somehow surpass human minds when humans invented them. Reminds me of the movie I, Robot with Will Smith.
|
On November 02 2007 04:16 il0seonpurpose wrote: That is crazy, how AI would somehow surpass human minds when humans invented them. Reminds me of the movie I, Robot with Will Smith.
Probably because this was the basis for that movie ;p
edit: ok, obviously it is based on Isaac Asimov's book, but the idea for that was based on this.
|
You cannot stop Judgment Day... only delay it.
|
On November 02 2007 04:10 Blue wrote: I saw a quote that went something like: 'the singularity is the intelligent design for people with high IQ'
I must admit I'm not an expert on the topic, though I am fascinated by it, and have read Kurzweil's book The Singularity Is Near.
I personally find it most plausible (assuming knowledge is not restricted for whatever reasons, political, economic, etc.) that for some time (how many years I don't know) we will enhance our biochemical intelligence through knowledge and manipulation of genetics, neural effectors, and environment. We may integrate silicon 'neurons' with biochemical neurons. This will go on for a while before we create brand new intelligence with silicon chips or whatever. My point is that maybe, instead of creating new intelligence that will dominate poor humans, humanity itself will be enhanced as knowledge progresses. As for reaching the singularity, it seems a plausible idea, but who can really know?
Yeah, I think there are both technological and moral/philosophical differences between improving ourselves and creating new intelligence. One thing that comes to mind: humans improving the DNA of each next generation would take 20 or so years per "update", while source code changing itself could "update" many times every second. Humans who improve themselves might not run the same risk of a Matrix or 2001 scenario. But humans who improve themselves might do those same things to other humans who aren't improved... There is so much we really can't know; that's why it's called the singularity ;e
I think one of the most interesting things about the whole concept is that even if 80% of it is exaggerated or false, it would still change a lot of what people on average predict about the future. Let's forget about the singularity itself and just look at accelerating change. Even if we just follow Kurzweil's exponential graphs 10-20 years into the future, a LOT of things will change, far beyond what most people imagine.
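Just to put rough numbers on that: here's a quick back-of-the-envelope Python sketch. I'm assuming a Moore's-law-style doubling every 18 months purely for illustration; that figure is my assumption, not taken from Kurzweil's actual charts.

```python
# Illustrative only: project a quantity that doubles every 18 months
# (a Moore's-law-style assumption, not actual data from Kurzweil).

def doublings(years, doubling_period_years=1.5):
    """Number of doublings that fit in the given span of years."""
    return years / doubling_period_years

def growth_factor(years, doubling_period_years=1.5):
    """How many times larger the quantity is after `years` years."""
    return 2 ** doublings(years, doubling_period_years)

for years in (10, 20):
    print(f"{years} years -> ~{growth_factor(years):,.0f}x")
```

That works out to roughly 100x over 10 years and roughly 10,000x over 20, which is exactly the kind of compounding that linear intuition about the future misses.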
|
And I agree that it is just as plausible that IF we/what we create ever move beyond Earth, it might just as well be as nanobots or light waves as in classic sci-fi spaceships. But I'm sure it would be much harder to make an interesting sci-fi movie with nanobots or light waves moving around :p The only way to make a sci-fi movie interesting is space opera. But this is a bit off topic >)
|
Calgary 25977 Posts
Good read; something I've never heard about. That's crazy to think about, but it makes sense. I don't know how plausible it is, but it's scary to think of a nation controlling these "knowledge bots" and basically exploiting all their discoveries.
|
By the way, TED is a fantastic website in general, with lots of interesting talks. Here is one by Jeff Hawkins that is somewhat related to the present topic: http://www.ted.com/index.php/talks/view/id/125
He talks about brain theory, and about creating intelligent machines with silicon.
|
This is a bit off topic, but I'm just gonna plug some great sci-fi, as I assume people interested in topics like the technological singularity would appreciate it 
Ghost in the Shell
Both the movies and the series (Stand Alone Complex) are outstanding. They provide deep and relevant insight into what society and individual identity will become in the near future with the advent of brain-computer interfaces, cyborg technology, viable AIs, and an internet that has spread across the entire world.
Never mind that it's anime, if that turns you off; it's just great science fiction regardless. The original film inspired "The Matrix", and if you're interested, a lot of the ideas stem from William Gibson's cyberpunk novel "Neuromancer".
|
intrigue
Washington, D.C. 9933 Posts
Neuromancer is amazing, one of my all-time favorites :]
|
Oh, and I forgot the Dune series, though it veers more toward space opera.
At its core is a humanity that has abandoned the advance of technology after a massive, destructive, centuries-long galactic conflict between humanity and human-created "thinking machines" (AI), relying instead on superhuman powers developed by various organizations through the use of the superdrug melange spice.
|
I don't have time to explore the links in the OP right now, but I don't see how we could create anything "smarter" than us. Machines are fast, not smart. Unless they completely replace the Turing machine model that modern computing is based on, I don't see that changing.
I'm skeptical, but I'll explore the topic further when I have time.
edit: the posts in this thread revere machine intelligence like it's magic LOL
|
Iain M Banks. The Culture. Hence my name. Wikipedia it, people!
|
A lot of it is just speculation, and personally, I don't think anything like this will happen in our lifetime, at least.
|