
Telepathic communication in most of Stapledon's societies led to a distributed communications network whereby information was shared freely throughout the species without producing a level of consciousness higher than that of the underlying minds. In other societies, such as the “18th Men,” individuals were “capable of becoming on some occasions mere nodes in a system of radiation which itself should then constitute the physical basis of a single mind.”[24]
The difference depends on bandwidth: the amount of information that can be conveyed over a given communication channel in a given time. If bandwidth begins to match the internal processing power of the individual nodes in a communications network, individuality begins to merge. Since human beings (usually) think at a higher speed than that at which they communicate, this situation does not occur in human society, though hints of it may be found in certain ritual activities in which thoughts are synchronized through music or dance and communication between individuals is speeded up. We can only speculate how mind might emerge, and perhaps has emerged, among aquatic creatures able to visualize and telecommunicate by means of underwater sound. Or among millions-of-instructions-per-second microprocessors linked by millions-of-bits-per-second fibers into a telecommunicating web.
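
The ratio at stake can be made concrete with a rough sketch. In the Python fragment below, every figure is an assumed order of magnitude rather than a measurement; a node whose channel rate approaches its internal processing rate is, on this argument, a candidate for merging.

```python
# Rough sketch: ratio of external channel rate to internal processing rate.
# Every figure here is an assumed order of magnitude, not a measurement.

nodes = {
    # name: (internal processing rate, external channel rate), bits/second
    "human in conversation": (1e9, 1e2),   # brain vs. ~100 bit/s of speech
    "MIPS-class CPU, modem": (1e6, 1e4),   # early networked microprocessor
    "MIPS-class CPU, fiber": (1e6, 1e6),   # megabit link matches MIPS machine
}

for name, (processing, channel) in nodes.items():
    ratio = channel / processing
    verdict = "individuality begins to merge" if ratio >= 1 else "individuality intact"
    print(f"{name:24s} ratio {ratio:7.0e} -> {verdict}")
```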

Astronomer Fred Hoyle, whose science fiction novel The Black Cloud was inspired by Stapledon's ideas, let two of his characters explain as they begin to realize that an electromagnetic, loosely distributed intelligence of interstellar origin is paying a visit to the vicinity of Earth:

‘The volume of information that can be transmitted radiatively is enormously greater than the amount that we can communicate by ordinary sound. We've seen that with our pulsed radio transmitters. So if this Cloud contains separate individuals, the individuals must be able to communicate on a vastly more detailed scale than we can. What we can get across in an hour of talk they might get across in a hundredth of a second.'

‘Ah, I begin to see light,' broke in McNeil. ‘If communication occurs on such a scale then it becomes somewhat doubtful whether we should talk any more of separate individuals!'

‘You're home, John!'[25]

Bandwidth alone does not imply intelligence. Television uses a large bandwidth (about six megahertz per channel), but there is not much intelligence at either end. On the other hand, all known systems that exhibit intelligent behavior rely on the communication of information—two people discussing a problem, exchanging less than 100 bits per second, or the unfathomable number of bits per second exchanged among the 100 billion neurons within an individual brain. Information does not imply intelligence, and communication does not imply consciousness. The implications go the other way.

Irving J. Good, colleague of Alan Turing at both Manchester and Bletchley Park, although shying away from a theory of consciousness, which he considered, in the words of Turing, to be more a matter of “faith” than proof, addressed the relations between consciousness and communication in 1962: “The identification of consciousness with the operation of certain types of communication system has various consequences. We seem obliged to attribute more consciousness to two communication systems than to either separately, in fact it is natural to assume additivity. If the two systems, apart from their own operation, are in communication, then this will perhaps increase the amount of consciousness still further. The extra consciousness may reasonably be identified with the rate of transmission of information from one system to the other. . . . There may be a different kind of consciousness at each level, each metaphysical to the others . . . the total consciousness may not have position in space, but it does have a sort of combinatorial topology, like a set of spheres with interconnecting tubes. . . . Thus, consciousness is a varying pattern of communication of great complexity [and] in order to decrease the complexity of the description of a communication network we may sum the flow of information over each given channel for a certain period of time.”[26]
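
Good's closing move, summing the flow over each channel for a period of time to simplify the description, can be sketched directly. In the Python fragment below the nodes, the traffic, and the time window are all invented for illustration; it simply reduces a web of messages to an average bit rate per channel, one number per interconnecting “tube.”

```python
# A minimal sketch of Good's simplification: describe a communication
# network by summing the flow of information over each channel for a
# certain period of time. Nodes, traffic, and window are invented.

from collections import defaultdict

# Observed transmissions: (source, destination, bits, time in seconds)
traffic = [
    ("A", "B", 1200, 0.1),
    ("B", "A",  300, 0.4),
    ("A", "B",  800, 0.9),
    ("B", "C", 5000, 1.2),
]

def summarize(traffic, t_start, t_end):
    """Average bit rate per directed channel over [t_start, t_end)."""
    flow = defaultdict(int)
    for src, dst, bits, t in traffic:
        if t_start <= t < t_end:
            flow[(src, dst)] += bits
    return {channel: bits / (t_end - t_start) for channel, bits in flow.items()}

print(summarize(traffic, 0.0, 2.0))
# {('A', 'B'): 1000.0, ('B', 'A'): 150.0, ('B', 'C'): 2500.0}
```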

In his 1965 speculations on the development of ultraintelligent machines, Good, who recalled that “at the latest, by 1948, I had read both the books by Olaf Stapledon,”[27] saw wireless communication as the best way to construct an ultraparallel information-processing machine. “In order to achieve the requisite degree of ultraparallel working it might be useful for many of the elements of the machine to contain a very short-range microminiature radio transmitter and receiver. The range should be small compared with the dimensions of the whole machine. A ‘connection' between two close artificial neurons could be made by having their transmitter and receiver on the same or close frequencies. The strength of the connection would be represented by the accuracy of the tuning. The receivers would need numerous filters so as to be capable of receiving on many different frequencies. ‘Positive reinforcement' would correspond to improved tuning of these filters.”[28]
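
Good's verbal description translates readily into a toy model. In the Python sketch below, the Gaussian overlap used for tuning accuracy, the learning rate, and all frequencies are assumptions of the sketch, not Good's specification; connection strength is represented by how closely a receiver's filter is tuned to a sender's transmitter, and “positive reinforcement” sharpens that tuning.

```python
# Toy model of Good's ultraparallel wireless machine: the strength of a
# "connection" is the accuracy of tuning between transmitter and receiver.
# The Gaussian overlap and learning rule are this sketch's assumptions.

import math

class RadioNeuron:
    def __init__(self, tx_freq, rx_freq, rx_bandwidth):
        self.tx_freq = tx_freq            # transmitter frequency (MHz)
        self.rx_freq = rx_freq            # receiver filter center (MHz)
        self.rx_bandwidth = rx_bandwidth  # receiver filter width (MHz)

    def connection_strength(self, sender):
        """Strength of the connection from sender to this neuron:
        how accurately its filter is tuned to the sender's transmitter."""
        detune = sender.tx_freq - self.rx_freq
        return math.exp(-(detune / self.rx_bandwidth) ** 2)

    def reinforce(self, sender, rate=0.1):
        """'Positive reinforcement': improved tuning of the filter."""
        self.rx_freq += rate * (sender.tx_freq - self.rx_freq)
        self.rx_bandwidth *= 1 - rate / 2   # a sharper, more selective filter

a = RadioNeuron(tx_freq=915.0, rx_freq=905.0, rx_bandwidth=5.0)
b = RadioNeuron(tx_freq=902.0, rx_freq=908.0, rx_bandwidth=5.0)

print(f"a -> b before: {b.connection_strength(a):.3f}")   # ~0.141
for _ in range(20):
    b.reinforce(a)
print(f"a -> b after:  {b.connection_strength(a):.3f}")   # ~0.798
```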

Good was writing at the dawn of high-speed data communications, when most bits were still stored or transferred by punching holes in paper cards or tape, as had been done at Bletchley Park. The thicket of wires that had to be untangled every time the cryptanalysts wished to change the programming of the Colossus convinced Good from the beginning that wiring could take intelligence only so far. As computers coalesce into larger and faster networks, they are behaving more and more like the elemental units in Good's wireless ultraintelligent machine. We see the wires plugged into the wall and think of the architecture as constrained by the hardwired topology that the physical connection represents, whereas computationally, our machines belong to a diffuse, untethered cloud of the kind that Good envisioned as the basis of an ultraintelligent machine. All our networking protocols—packet switching, token ring, Ethernet, time-division multiplexing, asynchronous transfer mode, and so on—are simply ways of allowing hundreds of millions of individual processors to tune selectively to each other's signals, free of interference, as they wish.

Paul Baran, pioneer of packet switching, sees the relations between computers and communications advancing along similar, wireless lines. You can plug only so many things at one time into your wall. As everything from taxicabs to telephones to televisions to personal digital assistants becomes connected to the network, universal—and microminiature—wireless is the only way to disentangle the communications web. “But there's not enough wireless bandwidth to go around,” say the skeptics, citing the billions of dollars raised whenever a few slivers of radio spectrum are auctioned off. Baran disagrees. “Tune a spectrum analyzer across a band of UHF frequencies and you encounter a few strong signals. Most of the band at any instant is primarily silence, or a background of weaker signals . . . much of the radio band is empty much of the time! The frequency shortage is caused by thinking solely in terms of dumb transmitters and dumb receivers. With today's smart electronics, even occupied frequencies could potentially be used.”[29]

Baran made a similar argument in 1960, advising the government to build an all-digital, packet-switched data network instead of throwing good money after bad trying to blast-harden the centralized, circuit-switched network developed for analog transmission of voice. In both cases, he was right. Yet the regulatory establishment continues to treat radio spectrum as real estate to be sold to the highest bidder, instead of as an ocean across which low-powered, agile small craft can deliver information efficiently by following a few simple rules to keep out of each other's way. “The number of geographically dispersed users that can be simultaneously accommodated by a fixed spectrum varies as the inverse square of the transmission distance,” Baran has pointed out, predicting that the meek and unlicensed may, in the end, inherit the earth. “Cut the range in half, and the number of users that can be supported is [quadrupled]. Cut the range by a factor of ten, and 100 times as many users can be served. . . . In other words, a mixture of terrestrial links plus shorter range radio links has the effect of increasing by orders of magnitude the usable frequency spectrum.”[30]
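
Baran's inverse-square rule is simple enough to verify numerically. In the Python sketch below, the baseline range and user count are arbitrary placeholders, not figures from Baran; only the scaling law comes from the text.

```python
# Baran's spectrum-reuse rule: for a fixed spectrum, the number of
# geographically dispersed users served varies as the inverse square
# of transmission distance. Baseline figures are illustrative only.

BASE_RANGE_KM = 10.0   # assumed range of a long-haul "dumb" transmitter
BASE_USERS = 1_000     # assumed users supported at that range

def users_supported(range_km):
    """Users a fixed spectrum can serve at a given transmission range."""
    return BASE_USERS * (BASE_RANGE_KM / range_km) ** 2

for r in (10.0, 5.0, 1.0, 0.4):   # 0.4 km ~ Ricochet's quarter-mile hops
    print(f"range {r:5.1f} km -> {users_supported(r):10,.0f} users")

# range  10.0 km ->      1,000 users
# range   5.0 km ->      4,000 users   (half the range, four times the users)
# range   1.0 km ->    100,000 users   (one-tenth the range, 100 times)
# range   0.4 km ->    625,000 users
```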

While technologies such as low-earth-orbit satellite networks have received wider attention, Baran has been working to eliminate telecommunications bottlenecks in down-to-earth ways. The growth of a communications network, like any other arborescent, dendritic system, is driven by what happens at the root hairs, the leaf tips, the nerve endings, or, in telecommunications jargon, the network tails. The limit to network growth is peripheral; it is known as the last-mile problem: how to make the final connection that reaches the subscriber at the end of the line.

A decade ago Baran saw two opportunities to grow new, or better utilize existing, tails. First was the cable-television network, which currently enters 63 percent of U.S. households and reaches the driveways of all but 7 percent. Coaxial television cable can transmit up to a gigahertz (one thousand megahertz) over short distances, enough for five hundred channels if digitally compressed. Most of this bandwidth is vacant most of the time. Baran founded Com21, Inc., in 1991 to develop the ultrafast packet switches and strategic alliances necessary to deliver a broadband digital communication spectrum over coaxial cable to the home, with a fiber-optic backbone linking the head ends of the local tails. Among the various schemes offering to provide broadband network growth, hybrid fiber-coaxial offers the path of least resistance because much of the infrastructure already is in place. What to do with all this bandwidth is a different problem, but history has shown that as bandwidth becomes available, the digital ecology swiftly takes root and grows.
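
The channel arithmetic behind the five hundred figure is easy to reproduce. In the sketch below, only the one-gigahertz capacity and the 6-megahertz analog channel width come from the text; the roughly 3:1 digital compression ratio is an assumption chosen to match the cited total.

```python
# Coaxial-cable channel arithmetic: ~1 GHz of short-distance capacity,
# 6 MHz per analog television channel, and an assumed ~3:1 gain from
# digital compression (the compression ratio is this sketch's assumption).

CABLE_BANDWIDTH_MHZ = 1000   # "up to a gigahertz" over short distances
ANALOG_CHANNEL_MHZ = 6       # one analog TV channel
COMPRESSED_PER_SLOT = 3      # assumed digital channels per 6 MHz slot

analog = CABLE_BANDWIDTH_MHZ // ANALOG_CHANNEL_MHZ
digital = analog * COMPRESSED_PER_SLOT

print(f"{analog} analog channels")     # 166
print(f"{digital} digital channels")   # 498 -- roughly the five hundred cited
```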

In 1985 Baran also founded Metricom, a company better known by the name of its wireless network, Ricochet. A wireless, packet-switched, spread-spectrum digital communications network, Ricochet takes an extreme grassroots approach. It operates over small, collectively intelligent digital repeater-transceivers, about the size of lunch boxes, which perch inconspicuously on lampposts, drawing a trickle of power from the utility grid. The repeaters are situated about a quarter of a mile apart. Wherever large numbers of users congregate, you add more boxes here and there, and an occasional gateway to the local backbone of the net.

The whole system operates, unlicensed, within the 902–928 megahertz frequency band, under Federal Communications Commission rules that forbid the use of any given frequency for more than four hundred milliseconds at a time. The same adaptive-routing techniques that Baran first suggested in 1960, breaking a message up into packets that hop from mainframe to mainframe across a military communications net, are now used to convey messages not only by hopping from lunch box to lunch box, but from frequency to frequency (about ten times per second) at the same time. The twenty-six available megahertz is divided into 162 channels of 160 kilohertz each, and messages are divided into packets of 4,096 bits. From the point of view of an individual packet, not only is there a huge number of physically distinct paths from A to B through the mesh of lunch boxes, but there are 162 alternative channels leading to the nearest lunch box at any given time. The packet chooses a channel that happens to be quiet at that instant and jumps to the next lamppost at the speed of light. The multiplexing of communications across the available network topology is extended to the multiplexing of network topology across the available frequency spectrum. Communication becomes more efficient, fault tolerant, and secure.
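
A packet's-eye view of such a mesh fits in a few lines. The Python sketch below is a simplified stand-in, not Metricom's protocol: the topology, the ambient traffic, and the quiet-channel test are invented, while the 162 channels and 4,096-bit packets come from the text.

```python
# Simplified packet's-eye view of a frequency-hopping mesh like Ricochet:
# 162 channels of 160 kHz, 4,096-bit packets, each hop carried on whatever
# channel happens to be quiet at that instant. The topology and the
# quiet-channel test are invented stand-ins, not Metricom's protocol.

import random

NUM_CHANNELS = 162   # 26 MHz divided into 160 kHz channels
PACKET_BITS = 4096   # message fragment carried on each hop

def quiet_channels(busy):
    """Channels not occupied at this repeater at this instant."""
    return [ch for ch in range(NUM_CHANNELS) if ch not in busy]

def send(packet_id, route):
    """Carry one packet along a route of repeaters, re-tuning at each hop."""
    for here, there in zip(route, route[1:]):
        busy = {random.randrange(NUM_CHANNELS) for _ in range(20)}  # ambient traffic
        channel = random.choice(quiet_channels(busy))
        print(f"packet {packet_id}: {here} -> {there} on channel {channel}")

send(1, ["modem", "lamppost-A", "lamppost-B", "gateway"])
```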

The way the system works now (in a growing number of metropolitan areas—hence the name) is that you purchase or rent a small Ricochet modem, about the size of a large candy bar and transmitting at about two-thirds of a watt. Your modem establishes contact with the nearest pole-top lunch box or directly with any other modem of its species within range. Your computer sees the system as a standard modem connection or an Internet node, and the network, otherwise transparent to the users, keeps track of where all the users and all the lunch boxes are. This knowledge is fully distributed; there is no centralized intelligence or control. The modems and repeaters communicate at up to 100 kilobits per second; the user nets about 20 kilobits per second of usable communications, at no marginal cost above the flat monthly subscription, now about a dollar a day. The modems and repeaters cost several hundred dollars each. The system scales gracefully; the more users, the less expensive and more efficient it gets. It does not have to acquire expensive spectra in new markets as it grows.

Whether Metricom succeeds or fails, it demonstrates a communication system that uses less power, yet becomes more powerful, as its nodes grow larger in number and smaller in physical scale. Life as we know it is composed of small, cellular units not because larger units are impossible to build, but because smaller, semiautonomous units are faster, cheaper, easier to program, and easier to replace. Animals have become larger, but cells have stayed small. Everything in one of the lunch boxes could coalesce into a single component, a microcommunicator that, if produced in quantities as large as those of microprocessors, might cost as much to package as to make. A telepathic, subvital unit. All the economies that Baran demonstrated by adaptive message-block switching over microwave links between defense command centers are applicable to exchanging data directly between microprocessors. The ultimate decentralization of network architecture is for every processor to become a node.

Semiconductor technology originated nearly a century ago with the “cat's-whisker” crystal detector, which allowed the reception of coded signals via radio-frequency disturbances in the electromagnetic field. A wireless web of microprocessors is now relentlessly taking form. The “last centimeter” problem is keeping us a healthy distance from telepathy among human beings, but we are fast approaching telepathy among machines. Every last bit of isolated intelligence—your living-room thermostat, your gas meter, your stereo system, the traffic light down the street—has an affinity for other bodies of intelligence, a faint attraction to all sources of information within reach.
