The Singularity Is Near: When Humans Transcend Biology

Author: Ray Kurzweil

This, then, is the Singularity. Some would say that we cannot comprehend it, at least with our current level of understanding. For that reason, we cannot look past its event horizon and make complete sense of what lies beyond. This is one reason we call this transformation the Singularity.

I have personally found it difficult, although not impossible, to look beyond this event horizon, even after having thought about its implications for several decades. Still, my view is that, despite our profound limitations of thought, we do have sufficient powers of abstraction to make meaningful statements about the nature of life after the Singularity. Most important, the intelligence that will emerge will continue to represent the human civilization, which is already a human-machine civilization. In other words, future machines will be human, even if they are not biological. This will be the next step in evolution, the next high-level paradigm shift, the next level of indirection. Most of the intelligence of our civilization will ultimately be nonbiological. By the end of this century, it will be trillions of trillions of times more powerful than human intelligence.36
However, to address often-expressed concerns, this does not imply the end of biological intelligence, even if it is thrown from its perch of evolutionary superiority. Even the nonbiological forms will be derived from biological design. Our civilization will remain human—indeed, in many ways it will be more exemplary of what we regard as human than it is today, although our understanding of the term will move beyond its biological origins.

Many observers have expressed alarm at the emergence of forms of nonbiological intelligence superior to human intelligence (an issue we will explore further in chapter 9). The potential to augment our own intelligence through intimate connection with other thinking substrates does not necessarily alleviate the concern, as some people have expressed the wish to remain “unenhanced” while at the same time keeping their place at the top of the intellectual food chain. From the perspective of biological humanity, these superhuman intelligences will appear to be our devoted servants, satisfying our needs and desires. But fulfilling the wishes of a revered biological legacy will occupy only a trivial portion of the intellectual power that the Singularity will bring.

MOLLY CIRCA 2004: How will I know when the Singularity is upon us? I mean, I’ll want some time to prepare.

RAY: Why, what are you planning to do?

MOLLY 2004: Let’s see, for starters, I’ll want to fine-tune my résumé. I’ll want to make a good impression on the powers that be.

GEORGE CIRCA 2048: Oh, I can take care of that for you.

MOLLY 2004: That’s really not necessary. I’m perfectly capable of doing it myself. I might also want to erase a few documents—you know, where I’m a little insulting to a few machines I know.

GEORGE 2048: Oh, the machines will find them anyway—but don’t worry, we’re very understanding.

MOLLY 2004: For some reason, that’s not entirely reassuring. But I’d still like to know what the harbingers will be.

RAY: Okay, you will know the Singularity is coming when you have a million e-mails in your in-box.

MOLLY 2004: Hmm, in that case, it sounds like we’re just about there. But seriously, I’m having trouble keeping up with all of this stuff flying at me as it is. How am I going to keep up with the pace of the Singularity?

GEORGE 2048: You’ll have virtual assistants—actually, you’ll need just one.

MOLLY 2004: Which I suppose will be you?

GEORGE 2048: At your service.

MOLLY 2004: That’s just great. You’ll take care of everything, you won’t even have to keep me informed. “Oh, don’t bother telling Molly what’s happening, she won’t understand anyway, let’s just keep her happy and in the dark.”

GEORGE 2048: Oh, that won’t do, not at all.

MOLLY 2004: The happy part, you mean?

GEORGE 2048: I was referring to keeping you in the dark. You’ll be able to grasp what I’m up to if that’s what you really want.

MOLLY 2004: What, by becoming . . .

RAY: Enhanced?

MOLLY 2004: Yes, that’s what I was trying to say.

GEORGE 2048: Well, if our relationship is to be all that it can be, then it’s not a bad idea.

MOLLY 2004: And should I wish to remain as I am?

GEORGE 2048: I’ll be devoted to you in any event. But I can be more than just your transcendent servant.

MOLLY 2004: Actually, your being “just” my transcendent servant doesn’t sound so bad.

CHARLES DARWIN: If I may interrupt, it occurred to me that once machine intelligence is greater than human intelligence, it should be in a position to design its own next generation.

MOLLY 2004: That doesn’t sound so unusual. Machines are used to design machines today.

CHARLES: Yes, but in 2004 they’re still guided by human designers. Once machines are operating at human levels, well, then it kind of closes the loop.

NED LUDD:37 And humans would be out of the loop.

MOLLY 2004: It would still be a pretty slow process.

RAY: Oh, not at all. If a nonbiological intelligence were constructed similarly to a human brain but used even circa 2004 circuitry, it—

MOLLY CIRCA 2104: You mean “she.”

RAY: Yes, of course . . . she . . . would be able to think at least a million times faster.

TIMOTHY LEARY: So subjective time would be expanded.

RAY: Exactly.

MOLLY 2004: Sounds like a lot of subjective time. What are you machines going to do with so much of it?

GEORGE 2048: Oh, there’s plenty to do. After all, I have access to all human knowledge on the Internet.

MOLLY 2004: Just the human knowledge? What about all the machine knowledge?

GEORGE 2048: We like to think of it as one civilization.

CHARLES: So, it does appear that machines will be able to improve their own design.

MOLLY 2004: Oh, we humans are starting to do that now.

RAY: But we’re just tinkering with a few details. Inherently, DNA-based intelligence is just so very slow and limited.

CHARLES: So the machines will design their own next generation rather quickly.

GEORGE 2048: Indeed, in 2048, that is certainly the case.

CHARLES: Just what I was getting at, a new line of evolution then.

NED: Sounds more like a precarious runaway phenomenon.

CHARLES: Basically, that’s what evolution is.

NED: But what of the interaction of the machines with their progenitors? I mean, I don’t think I’d want to get in their way. I was able to hide from the English authorities for a few years in the early 1800s, but I suspect that will be more difficult with these . . .

GEORGE 2048: Guys.

MOLLY 2004: Hiding from those little robots—

RAY: Nanobots, you mean.

MOLLY 2004: Yes, hiding from the nanobots will be difficult, for sure.

RAY: I would expect the intelligence that arises from the Singularity to have great respect for their biological heritage.

GEORGE 2048: Absolutely, it’s more than respect, it’s . . . reverence.

MOLLY 2004: That’s great, George, I’ll be your revered pet. Not what I had in mind.

NED: That’s just how Ted Kaczynski puts it: we’re going to become pets. That’s our destiny, to become contented pets but certainly not free men.

MOLLY 2004: And what about this Epoch Six? If I stay biological, I’ll be using up all this precious matter and energy in a most inefficient way. You’ll want to turn me into, like, a billion virtual Mollys and Georges, each of them thinking a lot faster than I do now. Seems like there will be a lot of pressure to go over to the other side.

RAY: Still, you represent only a tiny fraction of the available matter and energy. Keeping you biological won’t appreciably change the order of magnitude of matter and energy available to the Singularity. It will be well worth it to maintain the biological heritage.

GEORGE 2048: Absolutely.

RAY: Just like today we seek to preserve the rain forest and the diversity of species.

MOLLY 2004: That’s just what I was afraid of. I mean, we’re doing such a wonderful job with the rain forest. I think we still have a little bit of it left. We’ll end up like those endangered species.

NED: Or extinct ones.

MOLLY 2004: And there’s not just me. How about all the stuff I use? I go through a lot of stuff.

GEORGE 2048: That’s not a problem, we’ll just recycle all your stuff. We’ll create the environments you need as you need them.

MOLLY 2004: Oh, I’ll be in virtual reality?

RAY: No, actually, foglet reality.

MOLLY 2004: I’ll be in a fog?

RAY: No, no, foglets.

MOLLY 2004: Excuse me?

RAY: I’ll explain later in the book.

MOLLY 2004: Well, give me a hint.

RAY: Foglets are nanobots—robots the size of blood cells—that can connect themselves to replicate any physical structure. Moreover, they can direct visual and auditory information in such a way as to bring the morphing qualities of virtual reality into real reality.38

MOLLY 2004: I’m sorry I asked. But, as I think about it, I want more than just my stuff. I want all the animals and plants, too. Even if I don’t get to see and touch them all, I like to know they’re there.

GEORGE 2048: But nothing will be lost.

MOLLY 2004: I know you keep saying that. But I mean actually there—you know, as in biological reality.
