Authors: James P. Hogan
“You’re right,” Dyer said, nodding. “But when you say that a person feels any of those things, how do you know? How do you know what he feels inside his own head?” He gave them a few seconds to reflect on this and then supplied his own answer. “Obviously you can’t. All you can know is what you see him do and hear him say—in other words, his observable behavior. What I’m saying is that different causes can result in identical effects. If some other cause were to result in the kinds of behavior that go with the emotions I’ve just listed, as far as we would be concerned there wouldn’t be any difference. If somebody comes at you with an axe it doesn’t make any difference if he’s doing it because he hates your guts or because he’s quite rational but thinks you’re a monster from Venus. The result’s the same.”
“I think maybe we go away from the point.” The speaker was Emilio Gerasa from Spain, one of Fritz Muller’s contingent. “Isn’t the problem with HESPER that of incompetence, not all these other things? Why do we speak of these other things, like the anger and so on?”
“FISE would solve the competence problem,” Dyer assured them. “I don’t have any worries in that direction. I’m more worried that it might end up being too competent.” A few mystified glances were exchanged in parts of the room.
“The emotional traits that we’ve mentioned, along with pretty well all the rest, can be traced back to one root—survival!” Dyer told them. “If an enhanced TITAN ever evolved the motivational drive to preserve its own existence, the very fact that it’s a rational system would enable it to devise very effective ways of going about it. Also, since it’s an extremely powerful learning machine that operates at computer speeds, once it started to do something, it would do it very fast! If the machine interpreted agencies in the universe around it as constituting real or imagined threats to its existence, then the rational thing for it to do would be to experiment until it identified measures that were effective in neutralizing those agencies.” Dyer shrugged. “If one of them turned out to be us or our vital interests, we could have real problems.”
Schroder leaned across to confer with Muller for a few seconds. Muller nodded, then shook his head and gestured in Dyer’s direction. Schroder looked up again.
“Maybe I’ve missed the point,” he said. “But I thought you agreed a little while ago that a machine wouldn’t possess a human survival drive because it hadn’t come from the same origins as humans. Now you seem to be saying that it will. Could you clarify that, please.”
“He is talking in circles,” Van der Waarde muttered.
“And why should it feel threatened and act against us when it doesn’t share any of our survival-based emotions?” Frank Wescott, who was present to represent CIT, challenged. Richter was by this time sitting back glumly, resigned to hearing whatever Dyer was going to say.
“Because it wouldn’t even know it was doing so,” Dyer answered. “That kind of question still presumes that it would think in human terms. I’m talking about a totally rational entity that simply modifies its reactions to an environment around it. It hasn’t had the evolutionary conditioning that we’ve had to understand the concept of rivalry or even that beings other than itself exist. All it’s aware of is itself and influences impinging on it that are external to itself. Now do you see what I’m getting at? It wouldn’t consciously or deliberately take on Man as an opponent because in all probability it would have no concept of Man per se.”
“Very well, Dr. Dyer.” Fritz Muller held up a hand. “We take your point. But tell me, what kinds of circumstances do you envisage occurring that might equate to a clash of interest between us and it? Let us not worry for now about whether or not the two parties look upon the situation in the same way.”
Dyer paused to consult the notes that he had prepared beforehand. The CIM people and the advisers from the World Council committee were watching him intently while the academics were looking unhappy and muttering among themselves. Richter was glowering up at him over folded arms.
“Consider the following scenario,” Dyer resumed. “The system has evolved some compulsive trait that reflects the reasons for its having been built in the first place—a counterpart to the survival drive of organic systems. The other day, somebody I was talking to suggested that it might become insatiably curious, so let’s take that as its overriding compulsion. It doesn’t know why it wants to be that way any more than we know why we want to survive. It’s just made that way. To discover more about the universe, it requires resources—energy, instruments, vehicles to carry the instruments to places, and, of course, a large share of its own capacity. Moreover, the system finds that it has access to vast amounts of such resources—a whole world full of them. So it follows its inclinations and begins diverting more of those resources toward its own ends and away from the things that they were intended for. As far as we were concerned, it would have manifested the feeling of indifference. Our goals would cease to figure in its equations and we’d face the prospect of being reduced to second-class citizens on our own planet.”
“Only if we just sat there and allowed it to help itself,” a professor from Hamburg interjected. “I can’t see that we would. Why should we?”
“Which brings us to a second scenario,” Dyer carried on. “We take active steps to deny it access to the resources it wants. The system retaliates in kind by denying us the resources that we want, say by progressively shutting down energy plants, grounding air traffic, blacking out cities . . . all kinds of things.” He raised his hands to stifle the objections that appeared written across several faces. “Don’t forget, I am not postulating that the system has any concept of Man or sees its behavior in the same terms as we would. But this is a powerful learning machine! All it knows is that certain events in the environment around it are capable of obstructing its goals, and that certain coordinated actions on its part have the effect of stopping those events from happening. It’s like a dog scratching. It just feels uncomfortable and learns that doing certain things makes it feel better. The dog doesn’t have to be an entomologist or know that it’s fleas that are causing the discomfort.”
“But how could it possibly know that cutting off power to cities or anything like that would help?” Gary Corbertson, Director of Software Engineering from Datatrex Corporation, shook his head in disbelief. “I thought you said it wouldn’t know anything about people. How could it figure out how to blackmail them if it didn’t even know about them? That doesn’t make sense.”
“It wouldn’t have to figure out why it worked,” Dyer replied. “All it would need to know is that it did. Suppose it decided that it wanted a Jupiter probe all to itself, but we tried to take the probe away and it responded by shutting down cities at the rate of ten per night. Suppose also that we knew why it was doing it. What do you think we’d do?” He nodded slowly around the room. “It’d get its Jupiter probe pretty soon, wouldn’t it?”
“Mmm . . . I think I see the point,” Schroder said slowly. “All a baby has to know about the world is that when it screams loud enough it gets what it wants.”
“Good analogy,” Dyer agreed. “I’m not suggesting the system would do anything as sophisticated as that to start with, but like a baby it would experiment, observe, connect and hypothesize. Pretty soon it would have a fair grasp of what actions resulted in what effects.
“And now take our supposition one step further,” he went on. “What if the coordinated actions that it learned amounted not merely to blackmail but to overt aggression? As far as the machine’s concerned there’d be no difference—certain actions simply make the discomfort go away or the comfort increase. That’s where the fact that it doesn’t possess any human values or concepts at all becomes really worrying. Another scenario—it discovers that it gets far faster and more positive results when it doesn’t stop at threats; it carries them out. Now it’s exhibiting open hostility as far as we’re concerned, but it doesn’t know it.
“So, without invoking any human attributes at all, we’ve just taken it through the whole spectrum from indifference to hostility—a perfectly plausible simulation of behavior that we thought we wouldn’t have to worry about because it couldn’t evolve the emotions that normally accompany it. But now we see that it wouldn’t have to evolve any such emotions.”
As Dyer sat down, Richter, now looking less disgruntled, leaned toward him.
“Christ, Ray,” he said over the hubbub of voices that broke out on every side. “Is FISE really capable of going all the way to that extent?”
“Not one of them in a lab,” Dyer told him. “But what happens when you connect thousands of them up together? Would you want to put money on it?” Richter sat back, shaking his head slowly and frowning to himself. The meeting subsided to silence again as Campbell Roberts, Muller’s representative from Australia/New Zealand, began to speak.
“I still think we’re exaggerating the whole thing,” he declared loudly. “So there are risks. Nobody ever said there weren’t. All through history men have taken risks where the benefits they stood to gain justified them. But as we said earlier on, if the system starts doing things we don’t like, we can always pull the plug on it. If we have to, we can always take the bloody thing to bits again. Why in hell’s name are we getting so hung up about some lousy machine developing a mind of its own? We’ve got minds too, dammit, and we’ve been around a lot longer. If it wants to play survival games I reckon we could teach it a thing or two. I say put FISE in and make damn sure it never forgets who’s boss. Homo sapiens have had plenty of practice at that!”
“Maybe it won’t let you pull the plug,” Fierney pointed out.
“That’s bloody ridiculous!”
“I’m not so sure it is,” Muller commented. “Even now TITAN controls its own power sources and the manufacture of most of its own parts. If current forecasts are anything to go by, it will soon control everything related to its own perpetuation—from surveying sources of raw material to installing extensions to itself and carrying out one-hundred-percent self-repair. On top of that it controls other machinery of every description. It might not reach the point of becoming incapable of being switched off, but it could conceivably make the job of switching it off an extremely difficult and possibly costly undertaking.”
“But why should it want to do that in the first place if it doesn’t have a survival instinct?” Roberts objected.
“What have you got to say to that, Dr. Dyer?” Schroder invited.
“The same thing applies as before,” Dyer said without hesitation. “If the system evolved some overriding purpose that its programming compelled it to strive to achieve, it wouldn’t take it long to figure out that its own continued existence was an essential prerequisite to being sure of achieving it. Its own observations would tell it that its existence could not be guaranteed as things stood, so its immediate response would be to experiment in order to find out what it could do to remedy the situation. The rest follows logically from there. In other words, here we have a mechanism via which something tantamount to a survival instinct could emerge without the need for any survival-dominated origins at all. And as I said before, once you’ve got a survival instinct established, all the emotions that go with it will follow in the course of time.”
Dyer paused to allow his words time to sink in and then summarized his view on the things that had been said.
“If the system started to exhibit any of the traits we’ve been talking about, that in itself wouldn’t add up to an insurmountable problem because, as Campbell says, we can always pull the plug. As long as that’s true, the benefits outweigh the risks; and if that was all there were to it, my vote would be to upgrade the net. But that isn’t all there is to it. If the system were to evolve a survival drive, logically we would expect it to attempt to make its plug unpullable. Even that, in itself, wouldn’t be a problem if it didn’t succeed. After all, it wouldn’t matter much what the system wanted as long as it was incapable of doing much about it. If we could guarantee that, I’d still say upgrade the net. But we can’t.
“It all boils down to two questions. One: Could the system evolve a survival instinct? Two: If it did, what could it do about it? The second is really the key. Until we can find some way of answering that with confidence, I can’t see our way clear to taking things further.”
A long silence followed Dyer’s words. Then Schroder took up the debate.
“I’m inclined to agree that we can’t recommend putting FISE into the net at this stage. As to the question of continuing with FISE research, that’s a funding issue that doesn’t concern this meeting. But something else bothers me. Everything that has been said this morning has assumed that we’ve been talking about a supercomplex that includes FISE machines. But the business at Maskelyne happened with the system as it is now. Even with just HESPER, TITAN showed itself to be capable of integrating its activities to a degree that nobody thought possible.” He gestured vaguely toward the door. “Out there is a world that’s being run by a supercomplex of HESPERs. What guarantee do we have that the kinds of behavior you’ve described can’t happen even today with the system we’ve got?”
Dyer had been expecting the question. He held Schroder’s eye and replied simply. “None.”
Schroder considered the answer for a long time. At last he sighed and stretched his arms forward across the table in front of him.
“The objective of the meeting was to agree what to do about HESPER,” he reminded them all. “We have three choices: Allow TITAN to grow further, freeze it where it is now, or downgrade it by taking HESPER out. We can’t allow it to grow further until we have some way of obtaining guaranteed answers to Dr. Dyer’s two key questions. If we leave it as it is, we risk a repetition of the Maskelyne kind of accident but maybe on a catastrophic scale, which would clearly be totally unacceptable. Therefore, as I see it, the only choice open to us is the third. Does anybody here disagree?”