
But because many properties of each atom could be exploited to store information—such as the precise position, spin, and quantum state of all of its particles—we can probably do somewhat better than 10^27 bits. Neuroscientist Anders Sandberg estimates the potential storage capacity of a hydrogen atom at about four million bits. These densities have not yet been demonstrated, however, so we’ll use the more conservative estimate.[63]
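As a rough sanity check on these figures, a short Python sketch can compare the two estimates; the kilogram-of-hydrogen framing and the use of Avogadro's number here are illustrative assumptions, not calculations spelled out above.

```python
import math

AVOGADRO = 6.022e23            # atoms per mole

# Illustrative assumption: ~2.2 pounds (1 kg) of hydrogen at ~1 gram per mole
atoms = 1000 * AVOGADRO        # ~6e26 atoms

conservative_bits = 1e27       # the conservative estimate used in the text
sandberg_bits = atoms * 4e6    # ~4 million bits per hydrogen atom (Sandberg)

print(f"conservative estimate: ~10^{math.log10(conservative_bits):.0f} bits")
print(f"per-atom (Sandberg) estimate: ~10^{math.log10(sandberg_bits):.0f} bits")
```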
As discussed above, 10^42 calculations per second could be achieved without producing significant heat. By fully deploying reversible computing techniques, using designs that generate low levels of errors, and allowing for reasonable amounts of energy dissipation, we should end up somewhere between 10^42 and 10^50 calculations per second.

The design terrain between these two limits is complex. Examining the technical issues that arise as we advance from 10^42 to 10^50 is beyond the scope of this chapter. We should keep in mind, however, that the way this will play out is not by starting with the ultimate limit of 10^50 and working backward based on various practical considerations. Rather, technology will continue to ramp up, always using its latest prowess to progress to the next level. So once we get to a civilization with 10^42 cps (for every 2.2 pounds), the scientists and engineers of that day will use their essentially vast nonbiological intelligence to figure out how to get 10^43, then 10^44, and so on. My expectation is that we will get very close to the ultimate limits.

Even at 10^42 cps, a 2.2-pound “ultimate portable computer” would be able to perform the equivalent of all human thought over the last ten thousand years (assumed at ten billion human brains for ten thousand years) in ten microseconds.[64] If we examine the “Exponential Growth of Computing” chart (p. 70), we see that this amount of computing is estimated to be available for one thousand dollars by 2080.
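A quick order-of-magnitude check of that claim, as a Python sketch (the 10^16 cps per brain figure is the functional-emulation estimate used later in this chapter; the variable names are illustrative):

```python
SECONDS_PER_YEAR = 3.15e7

brains = 1e10            # ten billion human brains
cps_per_brain = 1e16     # functional-emulation estimate used in this chapter
years = 1e4              # ten thousand years of human thought

total_human_thought = brains * cps_per_brain * years * SECONDS_PER_YEAR  # ~3e37 calculations
ultimate_laptop_cps = 1e42

# ~3e-5 seconds: a few tens of microseconds, the same order of magnitude as the figure above
print(total_human_thought / ultimate_laptop_cps, "seconds")
```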

A more conservative but compelling design for a massively parallel, reversible computer is Eric Drexler’s patented nanocomputer design, which is entirely mechanical.[65] Computations are performed by manipulating nanoscale rods, which are effectively spring-loaded. After each calculation, the rods containing intermediate values return to their original positions, thereby implementing the reverse computation. The device has a trillion (10^12) processors and provides an overall rate of 10^21 cps, enough to simulate one hundred thousand human brains in a cubic centimeter.

Setting a Date for the Singularity.
A more modest but still profound threshold will be achieved much earlier. In the early 2030s one thousand dollars’ worth of computation will buy about 10^17 cps (probably around 10^20 cps using ASICs and harvesting distributed computation via the Internet). Today we spend more than $10^11 ($100 billion) on computation in a year, which will conservatively rise to $10^12 ($1 trillion) by 2030. So we will be producing about 10^26 to 10^29 cps of nonbiological computation per year in the early 2030s. This is roughly equal to our estimate for the capacity of all living biological human intelligence.
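The arithmetic behind that 10^26-to-10^29 range can be sketched in a few lines of Python (an illustrative back-of-the-envelope calculation, not a formal model):

```python
annual_spending = 1e12        # ~$1 trillion per year on computation by 2030
price_per_unit = 1e3          # one thousand dollars

cps_per_unit_low = 1e17       # general-purpose hardware
cps_per_unit_high = 1e20      # ASICs plus harvested distributed computation

low = (annual_spending / price_per_unit) * cps_per_unit_low     # 1e26 cps per year
high = (annual_spending / price_per_unit) * cps_per_unit_high   # 1e29 cps per year
print(f"{low:.0e} to {high:.0e} cps of nonbiological computation per year")
```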

Even if just equal in capacity to our own brains, this nonbiological portion of our intelligence will be more powerful because it will combine the pattern-recognition powers of human intelligence with the memory- and skill-sharing ability and memory accuracy of machines. The nonbiological portion will always operate at peak capacity, which is far from the case for biological humanity today; the 10^26 cps represented by biological human civilization today is poorly utilized.

This state of computation in the early 2030s will not represent the Singularity, however, because it does not yet correspond to a profound expansion of our intelligence. By the mid-2040s, however, that one thousand dollars’ worth of computation will be equal to 10^26 cps, so the intelligence created per year (at a total cost of about $10^12) will be about one billion times more powerful than all human intelligence today.[66]
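The one-billion-times figure follows from the same kind of arithmetic, sketched below (the ~10^26 cps baseline for all biological human intelligence comes from the estimate above):

```python
annual_spending = 1e12            # ~$1 trillion per year
cps_per_thousand_dollars = 1e26   # mid-2040s price-performance

nonbio_cps_per_year = (annual_spending / 1e3) * cps_per_thousand_dollars  # 1e35 cps
all_human_cps_today = 1e26        # estimate for all biological human intelligence

print(nonbio_cps_per_year / all_human_cps_today)  # ~1e9: about one billion times
```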

That will indeed represent a profound change, and it is for that reason that I set the date for the Singularity—representing a profound and disruptive transformation in human capability—as 2045.

Despite the clear predominance of nonbiological intelligence by the mid-2040s, ours will still be a human civilization. We will transcend biology, but not our humanity. I’ll return to this issue in chapter 7.

Returning to the limits of computation according to physics, the estimates above were expressed in terms of laptop-size computers because that is a familiar form factor today. By the second decade of this century, however, most computing will not be organized in such rectangular devices but will be highly distributed throughout the environment. Computing will be everywhere: in the walls, in our furniture, in our clothing, and in our bodies and brains.

And, of course, human civilization will not be limited to computing with just a few pounds of matter. In chapter 6, we’ll examine the computational potential of an Earth-size planet and computers on the scale of solar systems, of galaxies, and of the entire known universe. As we will see, the amount of time required for our human civilization to achieve scales of computation—and intelligence—that go beyond our planet and into the universe may be a lot shorter than you might think.

I set the date for the Singularity—representing a profound and disruptive transformation in human capability—as 2045.

The nonbiological intelligence created in that year will be one billion times more powerful than all human intelligence today.

Memory and Computational Efficiency: A Rock Versus a Human Brain.
With the limits of matter and energy to perform computation in mind, two useful metrics are the memory efficiency and computational efficiency of an object. These are defined as the fractions of memory and computation taking place in an object that are actually useful. We also need to consider the equivalence principle: even if a computation is useful, if a simpler method produces equivalent results, then we should evaluate the computation against the simpler algorithm. In other words, if two methods achieve the same result but one uses more computation than the other, the more computationally intensive method will be considered to use only the amount of computation of the less intensive method.[67]
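Expressed as code, the two metrics reduce to simple ratios; the Python sketch below (function names are illustrative, not established terminology) also captures the equivalence principle by charging an object only for the cheapest method that produces the same result.

```python
def memory_efficiency(useful_bits: float, theoretical_bits: float) -> float:
    """Fraction of an object's theoretical storage capacity that is actually useful."""
    return useful_bits / theoretical_bits

def computational_efficiency(useful_cps: float, theoretical_cps: float) -> float:
    """Fraction of an object's theoretical computation that is actually useful."""
    return useful_cps / theoretical_cps

def equivalent_useful_cps(*method_cps: float) -> float:
    """Equivalence principle: count only the cheapest method that achieves the same result."""
    return min(method_cps)
```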

The purpose of these comparisons is to assess just how far biological evolution has been able to go from systems with essentially no intelligence (that is, an ordinary rock, which performs no useful computation) to the ultimate ability of matter to perform purposeful computation. Biological evolution took us part of the way, and technological evolution (which, as I pointed out earlier, represents a continuation of biological evolution) will take us very close to those limits.

Recall that a 2.2-pound rock has on the order of 10^27 bits of information encoded in the state of its atoms and about 10^42 cps represented by the activity of its particles. Since we are talking about an ordinary stone, assuming that its surface could store about one thousand bits is a perhaps arbitrary but generous estimate.[68] This represents 10^-24 of its theoretical capacity, or a memory efficiency of 10^-24.[69]

We can also use a stone to do computation. For example, by dropping the stone from a particular height, we can compute the amount of time it takes to drop an object from that height. Of course, this represents very little computation: perhaps 1 cps, meaning its computational efficiency is 10^-42.[70]
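Plugging the stone's numbers into those ratios gives the two efficiencies directly (a minimal sketch using the figures above):

```python
import math

rock_memory_efficiency = 1e3 / 1e27   # ~1,000 surface bits vs 10^27 theoretical bits
rock_compute_efficiency = 1 / 1e42    # ~1 cps vs 10^42 theoretical cps

print(f"memory efficiency: ~10^{math.log10(rock_memory_efficiency):.0f}")          # 10^-24
print(f"computational efficiency: ~10^{math.log10(rock_compute_efficiency):.0f}")  # 10^-42
```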

In comparison, what can we say about the efficiency of the human brain? Earlier in this chapter we discussed how each of the approximately 10^14 interneuronal connections can store an estimated 10^4 bits in the connection’s neurotransmitter concentrations and synaptic and dendritic nonlinearities (specific shapes), for a total of 10^18 bits. The human brain weighs about the same as our stone (actually closer to 3 pounds than 2.2, but since we’re dealing with orders of magnitude, the measurements are close enough). It runs warmer than a cold stone, but we can still use the same estimate of about 10^27 bits of theoretical memory capacity (estimating that we can store one bit in each atom). This results in a memory efficiency of 10^-9.

However, by the equivalence principle, we should not use the brain’s inefficient coding methods to rate its memory efficiency. Using our functional memory estimate above of 10^13 bits, we get a memory efficiency of 10^-14. That’s about halfway between the stone and the ultimate cold laptop on a logarithmic scale. However, even though technology progresses exponentially, our experiences are in a linear world, and on a linear scale the human brain is far closer to the stone than to the ultimate cold computer.
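The “about halfway on a logarithmic scale” observation, and the contrast with the linear view, can be checked in a few lines (exponents taken from the figures above):

```python
# Memory efficiencies as base-10 exponents
stone = -24        # ~10^-24
brain = -14        # 10^13 functional bits / 10^27 theoretical bits
cold_laptop = 0    # the ultimate cold computer uses essentially all of its capacity

log_midpoint = (stone + cold_laptop) / 2                 # -12: the brain's -14 is roughly halfway
linear_gap_to_stone = 10.0**brain - 10.0**stone          # ~1e-14
linear_gap_to_laptop = 10.0**cold_laptop - 10.0**brain   # ~1.0, vastly larger
print(log_midpoint, linear_gap_to_stone, linear_gap_to_laptop)
```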

So what is the brain’s computational efficiency? Again, we need to consider the equivalence principle and use the estimate of 10^16 cps required to emulate the brain’s functionality, rather than the higher estimate (10^19 cps) required to emulate all of the nonlinearities in every neuron. With the theoretical capacity of the brain’s atoms estimated at 10^42 cps, this gives us a computational efficiency of 10^-26. Again, that’s closer to a rock than to the laptop, even on a logarithmic scale.
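And the computational-efficiency comparison, in the same style (a sketch using the estimates above):

```python
brain_compute_efficiency = 1e16 / 1e42  # functional-emulation cps vs theoretical cps of the brain's atoms

# On a log scale: rock at -42, brain at -26, ultimate laptop at 0.
# -26 is 16 orders of magnitude above the rock but 26 below the laptop, so still nearer the rock.
print(f"brain computational efficiency: {brain_compute_efficiency:.0e}")  # ~1e-26
```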

Our brains have evolved significantly in their memory and computational efficiency from pre-biology objects such as stones. But we clearly have many orders of magnitude of improvement to take advantage of during the first half of this century.

Going Beyond the Ultimate: Pico- and Femtotechnology and Bending the Speed of Light.
The limits of around 10^42 cps for a one-kilogram, one-liter cold computer and around 10^50 for a (very) hot one are based on computing with atoms. But limits are not always what they seem. New scientific understanding has a way of pushing apparent limits aside. As one of many such examples, early in the history of aviation, a consensus analysis of the limits of jet propulsion apparently demonstrated that jet aircraft were infeasible.[71]

The limits I discussed above represent the limits of nanotechnology based on our current understanding. But what about picotechnology, measured in trillionths (10^-12) of a meter, and femtotechnology, at scales of 10^-15 of a meter? At these scales, we would require computing with subatomic particles. With such smaller size comes the potential for even greater speed and density.

We do have at least several very early-adopter picoscale technologies. German scientists have created an atomic-force microscope (AFM) that can resolve features of an atom that are only seventy-seven picometers across.[72] An even higher-resolution technology has been created by scientists at the University of California at Santa Barbara, who have developed an extremely sensitive measurement detector with a physical beam made of gallium-arsenide crystal and a sensing system that can measure a flexing of the beam of as little as one picometer. The device is intended to provide a test of Heisenberg’s uncertainty principle.[73]

In the time dimension, Cornell University scientists have demonstrated an imaging technology based on X-ray scattering that can record movies of the movement of a single electron. Each frame represents only four attoseconds (10^-18 seconds, each one a billionth of a billionth of a second).[74] The device can achieve spatial resolution of one angstrom (10^-10 meter, which is 100 picometers).

However, our understanding of matter at these scales, particularly in the femtometer range, is not sufficiently well developed to propose computing paradigms. An Engines of Creation (Eric Drexler’s seminal 1986 book that provided the foundations for nanotechnology) for pico- or femtotechnology has not yet been written. However, each of the competing theories for the behavior of matter and energy at these scales is based on mathematical models that are based on computable transformations. Many of the transformations in physics do provide the basis for universal computation (that is, transformations from which we can build general-purpose computers), and it may be that behavior in the pico- and femtometer range will do so as well.

Of course, even if the basic mechanisms of matter in these ranges provide for universal computation in theory, we would still have to devise the requisite engineering to create massive numbers of computing elements and learn how to control them. These are similar to the challenges on which we are now rapidly making progress in the field of nanotechnology. At this time, we have to regard the feasibility of pico- and femtocomputing as speculative. But nano-computing will provide massive levels of intelligence, so if it’s at all possible to do, our future intelligence will be likely to figure out the necessary processes. The mental experiment we should be making is not whether humans as we know them today will be capable of engineering pico- and femtocomputing technologies, but whether the vast intelligence of future nanotechnology-based intelligence (which will be trillions of trillions of times more capable than contemporary biological human intelligence) will be capable of rendering these designs. Although I believe it is likely that our future nanotechnology-based intelligence will be able to engineer computation at scales finer than nanotechnology, the projections in this book concerning the Singularity do not rely on this speculation.
