
* This term was first coined by Buckminster Fuller in 1973, but he used it to convey a completely different meaning.

2
THE GLOBAL MIND

Just as the simultaneous outsourcing and robosourcing of productive activity has led to the emergence of Earth Inc., the simultaneous deployment of the Internet and ubiquitous computing power has created a planet-wide extension of the human nervous system that transmits information, thoughts, and feelings to and from billions of people at the speed of light.

We are connecting to vast global data networks—and to one another—through email, text messaging, social networks, multiplayer games, and other digital forms of communication at an unprecedented pace. This revolutionary and still accelerating shift in global communication is driving a tsunami of change forcing disruptive—and creative—modifications in activities ranging from art to science and from collective political decision making to building businesses.

Some familiar businesses are struggling to survive: newspapers, travel agencies, bookstores, music, video rental, and photography stores are among the most frequently recognized early examples of businesses confronted with a technologically driven mandate to either radically change or disappear. Some large institutions are also struggling: national postal services are hemorrhaging customers as digital communication
displaces letter writing, leaving the venerable post office
to serve primarily as a distribution service for advertisements and junk mail.

At the same time, we are witnessing the explosive growth of new business models, social organizations, and patterns of behavior that would have been unimaginable before the Internet and computing: from Facebook and Twitter to Amazon and iTunes, from eBay and Google to Baidu, Yandex.ru, and Globo.com, to a dozen other businesses that have started since you began reading this sentence—all are
phenomena driven by the connection of two billion people (thus far) to the Internet. In addition to people, the number of digital devices connected to other devices and machines—
with no human being involved—already exceeds the population of the Earth. Studies project that by 2020, more than 50 billion devices will be
connected to the Internet and exchanging information on a continuous basis. When less sophisticated devices like radio-frequency identification (RFID) tags, capable of transmitting information wirelessly or transferring data to devices that read them, are included, the number of “connected things” is already much larger. (Some school systems, incidentally, have begun to require students to wear identification badges equipped with RFID chips in an effort to combat truancy, generating protests from many students.)

TECHNOLOGY AND THE “WORLD BRAIN”

Writers have used the human nervous system as a metaphor for electronic communication since the invention of the telegraph. In 1851, only seven years after Samuel Morse sent the message “What hath God wrought?” Nathaniel Hawthorne wrote: “By means of electricity, the world of matter has become a great nerve vibrating thousands of miles in a breathless point of time.
The round globe is a vast brain, instinct with intelligence.” Less than a century later, H. G. Wells modified Hawthorne’s metaphor when he offered a proposal to develop a “world brain”—which he described as a commonwealth of all the world’s information, accessible to all the world’s people as “a sort of mental clearinghouse for the mind: a depot
where knowledge and ideas are received, sorted, summarized, digested, clarified and compared.” In the way Wells used the phrase “world brain,” what began as a metaphor is now a reality. You can look it up right now on Wikipedia or search the
World Wide Web on Google for some of the estimated one trillion web pages.

Since the nervous system connects to the human brain and the brain gives rise to the mind, it was understandable that one of the twentieth century’s greatest theologians, Teilhard de Chardin, would modify Hawthorne’s metaphor yet again. In the 1950s, he envisioned the “planetization” of consciousness within a technologically enabled
network of human thoughts that he termed the “Global Mind.” And while the current reality may not yet match Teilhard’s expansive meaning when he used that provocative image, some technologists believe that what is emerging may nevertheless mark the beginning of an entirely new era. To paraphrase Descartes, “It thinks; therefore it is.”*

The supercomputers and software in use have all been designed by human beings, but as Marshall McLuhan once said,
“We shape our tools, and thereafter, our tools shape us.” Since the global Internet and the billions of intelligent devices and machines connected to it—the Global Mind—represent what is arguably far and away the most powerful tool that human beings have ever used, it should not be surprising that it is beginning to reshape the way we think in ways both trivial and profound—but sweeping and ubiquitous.

In the same way that multinational corporations have become far more efficient and productive by outsourcing work to other countries and robosourcing work to intelligent, interconnected machines, we as individuals are becoming far more efficient and productive by instantly connecting our thoughts to computers, servers, and databases all over the world. Just as radical changes in the global economy have been driven by a positive feedback loop between outsourcing and robosourcing, the spread of computing power and the increasing number of people connected to the Internet are mutually reinforcing trends. Just as Earth Inc. is changing the role of human beings in the production process, the Global Mind is changing our relationship to the world of information.

The change being driven by the wholesale adoption of the Internet as the principal means of information exchange is simultaneously disruptive and creative. The futurist Kevin Kelly says that our new technological world—infused with intelligence—more and more resembles
“a very complex organism that often follows its own urges.” In this case, the large complex system includes not only the Internet and the computers, but also us.

Consider the impact on conversations. Many of us now routinely reach for smartphones to find the answers to questions that arise at the dinner table by searching the Internet with our fingertips. Indeed, many now spend so much time on their smartphones and other mobile Internet-connected devices that oral conversation sometimes almost ceases. As a distinguished philosopher of the Internet, Sherry Turkle, recently wrote,
we are spending more and more time “alone together.”

The deeply engaging and immersive nature of online technologies has led many to ask whether their use might be addictive for some people. The Diagnostic and Statistical Manual of Mental Disorders (DSM), when it is updated in May 2013, will include “Internet Use Disorder” in its appendix for the first time, as a category targeted for further study. There are an
estimated 500 million people in the world now playing online games at least one hour per day. In the United States, the average person under the age of twenty-one now spends almost
as much time playing online games as in classrooms from the sixth through the twelfth grade. And it’s not just young people:
the average online social games player is a woman in her mid-forties. An estimated
55 percent of those playing social games in the U.S.—and 60 percent in the U.K.—are women. (Worldwide, women also
generate 60 percent of the comments and post 70 percent of the pictures on Facebook.)

OF MEMORY, “MARKS,” AND THE GUTENBERG EFFECT

Although these changes in behavior may seem trivial, the larger trend they illustrate is anything but. One of the most interesting debates among experts who study the relationship between people and the Internet is over how we may be adapting the internal organization of our brains—and the nature of consciousness—
to the amount of time we are spending online.

Human memory has always been affected by each new advance
in communications technology. Psychological studies have shown that when people are asked to remember a list of facts, those told in advance that the facts will later be retrievable on the Internet are not able to remember the list as well as a
control group not informed that the facts could be found online. Similar studies have shown that regular users of GPS devices begin to lose some of their innate sense of direction.

The implication is that many of us use the Internet—and the devices, programs, and databases connected to it—as an extension of our brains. This is not a metaphor; the
studies indicate that it is a literal reallocation of mental energy. In a way, it makes sense to conserve our brain capacity by storing only the meager data that will allow us to retrieve facts from an external storage device. Or at least Albert Einstein thought so, once remarking: “
Never memorize what you can look up in books.”

For half a century neuroscientists have known that specific neuronal pathways grow and proliferate when used, while
the disuse of neuron “trees” leads to their shrinkage and gradual loss of efficacy. Even before those discoveries, McLuhan described the process metaphorically, writing that when we adapt to a new tool that extends a function previously performed by the mind alone, we gradually lose touch with our former capacity because a “built-in numbing apparatus” subtly anesthetizes us to accommodate the attachment of a mental prosthetic
connecting our brains seamlessly to the enhanced capacity inherent in the new tool.

In Plato’s dialogues, when the Egyptian god Theuth told one of the kings of Egypt, Thamus, that the new communications technology of the age—writing—would allow people to remember much more than previously, the king disagreed, saying, “It will implant forgetfulness in their souls: they will cease to exercise memory because they rely on that which is written,
calling things to remembrance no longer from within themselves, but by means of external marks.”

So this dynamic is hardly new. What is profoundly different about the combination of Internet access and mobile personal computing devices is that the instantaneous connection between an individual’s brain
and the digital universe is so easy that a habitual reliance on external memory (or “exomemory”) can become an extremely common behavior. The more common this behavior becomes, the more one comes to rely on exomemory—and the less one relies on memories stored in the brain itself. What becomes more important instead are the “external marks” referred to by Thamus 2,400 years ago. Indeed, one of the new measures of practical intelligence in the twenty-first century is the ease with which someone can quickly locate relevant information on the Internet.

Human consciousness has always been shaped by external creations. What makes human beings unique among, and dominant over,
life-forms on Earth is our capacity for complex and abstract thought. Since the emergence of the
neocortex in roughly its modern form around 200,000 years ago, however, the trajectory of human dominion over the Earth has been defined less by further developments in human physical evolution and more by the evolution of our relationship to the tools we have used to augment our leverage over reality.

Scientists disagree over whether the use of complex speech by humans emerged rather suddenly
with a genetic mutation or whether it developed more gradually. But whatever its origin, complex speech radically changed the ability of humans to use information in gaining mastery over their circumstances by enabling us for the first time
to communicate more intricate thoughts from one person to others. It also arguably represented the first example of the storing of information outside the human brain. And for most of human history, the spoken word was the principal “information technology” used in human societies.

The long
hunter-gatherer period is associated with oral communication. The first use of written
language is associated with the early stages of the Agricultural Revolution. The progressive development and use of more sophisticated tools for written language—from stone tablets to papyrus to vellum to paper, from pictograms to hieroglyphics to phonetic alphabets—is associated with the emergence of complex civilizations in
Mesopotamia, Egypt, China and India, the Mediterranean, and Central America.
