My Apocalyptic Vision is Very Narrow

More than ten years ago I read Kevin Kelly’s interview with Vernor Vinge in Wired (“Singular Visionary,” vol. 3, no. 6, June 1995), and I have been repeating Mr. Vinge’s formulation of the robot apocalypse almost word for word ever since. But I was never able to locate the original article. Then today, while reading around the Wikipedia page on the technological singularity, I came across a reference to Mr. Vinge and recognized it as the long-lost name. A few strokes of the keyboard at Google revealed my favorite dystopian vision:

Kelly: In your books, you sometimes focus on the idea of a singularity — the point at which a mathematical function goes infinite. What does that mean to you in terms of a cultural singularity?

Vinge: All sorts of apocalyptic visions are floating around, but mine is very narrow. It just says that if we ever succeed in making machines as smart as humans, then it’s only a small leap to imagine that we would soon thereafter make — or cause to be made — machines that are even smarter than any human. And that’s it. That’s the end of the human era — the closest analogy would be the rise of the human race within the animal kingdom. The reason for calling this a “singularity” is that things are completely unknowable beyond that point.

Kelly: Do you see any evidence that we are headed toward a singularity?

Vinge: I think the singularity may explain Fermi’s paradox: where is all the other intelligent life in the universe? For years, there have been two theories: the first is that civilizations exterminate themselves, and the second is that these outer civilizations are so weird there’s no way to interact with them. That second explanation has gained a lot of weight in my mind, because I can see us becoming weird — before my very eyes.

The striking thing to me is that qualification, “or cause to be made.” We won’t make the machine smarter than we are. We will only make the machine as smart as we are, and then that machine will make a machine more intelligent than we are. And each more intelligent machine will be capable of making another more intelligent still. Machine evolution will take over and, with software having reproductive cycles that make bacterial reproduction look glacial by comparison, will quickly outstrip human capability and comprehension.