Eve of Destruction: Terminator Time
by Roberto Rivera
"In the early 21st century, all of mankind united and marveled at our magnificence as we gave birth to AI [artificial intelligence], a singular construction that spawned an entire race of machines."
What Morpheus was describing to Neo sounds like what Ray Kurzweil calls The Singularity: "an era in which our intelligence will become increasingly nonbiological and trillions of times more powerful than it is today — the dawning of a new civilization that will enable us to transcend our biological limitations and amplify our creativity."
At the heart of this "new civilization" will be just machines that make big decisions, programmed, of course, by fellows with compassion and vision. Only by leveraging their abilities, embracing both the biological and the synthetic, can we become eternally free and eternally young.
As you might expect, "The Singularity" has more than a few detractors. The most obvious concern is that machines that are "smart" enough and powerful enough to usher in Kurzweil's utopia might one day decide that they will no longer take directions from their flesh-and-blood creators, or that humans are superfluous consumers of resources.
This Matrix scenario concerned Kurzweil enough that he took the time to comment on the movies. Aside from stating the obvious -- the second and third movies weren't nearly as good as the first -- he was mostly content to offer a technological critique of the movie ("There are problems and inconsistencies with the conception of virtual reality in the Matrix") and throw around adjectives like "dystopian," "Luddite" and "totalitarian."
Adjectives aren't assurances: Kurzweil never does tell us why we shouldn't fear our prospective machine overlords. Mind you, I don't. Not because I am put at ease by things like the Three Laws of Robotics (machines of the kind Kurzweil envisions would probably be smart enough to circumvent such limitations), but because I'm willing to bet that no machine will pass the Turing test in the foreseeable future.
In fact, someone has already made that bet: at Long Bets -- The Arena for Accountable Predictions, Mitchell Kapor has bet Kurzweil $20,000 (proceeds go to charity) that "by 2029 no computer -- or machine intelligence -- will have passed the Turing test."
Kapor's reasons, which are worth reading, boil down to this: no matter how fast a machine processes information, it lacks the most important thing -- a body. As he puts it, "we are embodied creatures; our physicality grounds us and defines our existence in a myriad of ways . . ." (Please read the rest. I gave you the link. Do I need to make a recording?) While he never uses the word, Kapor's skepticism derives, at least in part, from his rejection of the reductionism inherent in the whole AI/Singularity project. We have more than brains: we have minds. Thinking is about much more than electrochemical impulses.
That's why, while I'm bummed at the almost-certain cancellation of The Sarah Connor Chronicles, I give virtually no thought to the prospect of a real-life Skynet taking over. The Matrix may have gotten virtual reality wrong, but The Singularity gets a much more important subject wrong: humanity.
"Destruction" score: who knows? "Eve" score: less than a supervolcano, more than a wandering black hole.
Next on "Eve of Destruction": Pushing Daisies.
(Image © Fox)
It's a shame SCC is going to get canceled, but I can see why. We're still watching, but it's confusing and it's hard to tell if it's going anywhere. And it's slow much of the time. Great concept, poor execution.
Posted by: Mike D'Virgilio | March 23, 2009 at 03:48 PM
Heh - more to Roberto's point in this series, Mike, we could make a one-word substitution to your post and get this:
"It's a shame humanity is going to get canceled, but I can see why. We're still watching, but it's confusing and it's hard to tell if it's going anywhere. And it's slow much of the time. Great concept, poor execution."
Posted by: LeeQuod | March 23, 2009 at 04:45 PM
Sounds like the gospel according to Dawkins, Dennett, Hitchens, Harris, et al.
Posted by: Mike D'Virgilio | March 23, 2009 at 05:07 PM
"The Cylons were created by Man. They rebelled."
Somebody had to say it.
Posted by: labrialumn | March 23, 2009 at 11:29 PM
Well, at least it's more probable than a wandering black hole.
Posted by: Jason Taylor | March 24, 2009 at 01:20 AM
Believe me, labrialumn, I originally did. I had an entire section on the BSG series finale that I cut after I decided that I could write all day on the subject.
Posted by: Roberto Rivera | March 24, 2009 at 07:24 AM
Well...I wouldn't agree that movies 2 and 3 of the Matrix saga were worse than the first; but then, I tend to view the three together as a single story - practically a single movie (that was the intention of the authors, btw); and it takes a bit of thought to understand it all.
That said, I very much have to agree that it will be a long time before a computer will ever pass the Turing Test, if it ever does happen. Sure, we could probably design a computer to work in such a fashion as to seemingly pass (as was part of the story line in a recent episode of Numb3rs); but that's going to be it. Reason - not 'body', but 'soul'. There's a whole other dimension to life that computers can't even touch. We see it in all of God's creation - even animals - but nothing man-made; nor will that likely ever happen, as the breath of life is something that only God can provide. (Though I wouldn't be surprised if there were something of that nature, tricking people into thinking that life was created, intertwined with how Revelation plays out in this world.)
Whether it's Bicentennial Man, the Matrix Trilogy, Star Wars (C3PO, R2D2, etc.), Red Dwarf (Holly, Kryten), BSG (Cylons), Star Trek (Data, Borg), or a whole host of others - it's a fiction that will always remain fiction as the created strives to become like the Creator. It's in our nature, by design; but flawed in this sinful world.
Posted by: Benjamen R. Meyer | March 24, 2009 at 12:53 PM
True artificial intelligence is only 20 years away. And always will be.
Posted by: Howard | March 24, 2009 at 01:21 PM
If a machine ever passes the Turing test, it may turn out to be not so much because the computer has gained a "soul" as that the evaluator has lost his.
Posted by: David | March 24, 2009 at 09:41 PM