Asimov's Future History Volume 1

“So I’ve heard. Is there something in connection with it you wish of me?”

“No-o. Still, the mere fact that it is in production and is doing well means that working with this messed-up specimen is useless. Shouldn’t it be scrapped?”

“In short, Alfred, you are annoyed that I am wasting my so-valuable time. Feel relieved. My time is not being wasted. I am working with this robot.”

“But the work has no meaning.”

“I’ll be the judge of that, Alfred.” Her voice was ominously quiet, and Lanning thought it wiser to shift his ground.

“Will you tell me what meaning it has? What are you doing with it right now, for instance?”

“I’m trying to get it to raise its hand on the word of command. I’m trying to get it to imitate the sound of the word.”

As though on cue, Lenny said, “Eh-uh,” and raised its hand waveringly.

Lanning shook his head. “That voice is amazing. How does it happen?”

Susan Calvin said, “I don’t quite know. Its transmitter is a normal one. It could speak normally, I’m sure. It doesn’t, however; it speaks like this as a consequence of something in the positronic paths that I have not yet pinpointed.”

“Well, pinpoint it, for Heaven’s sake. Speech like that might be useful.”

“Oh, then there is some possible use in my studies on Lenny?”

Lanning shrugged in embarrassment. “Oh, well, it’s a minor point.”

“I’m sorry you don’t see the major points, then,” said Susan Calvin with asperity, “which are much more important, but that’s not my fault. Would you leave now, Alfred, and let me go on with my work?”

 

Lanning got to his cigar, eventually, in Bogert’s office. He said, sourly, “That woman is growing more peculiar daily.”

Bogert understood perfectly. In the U. S. Robots and Mechanical Men Corporation, there was only one “that woman.” He said, “Is she still scuffing about with that pseudo-robot – that Lenny of hers?”

“Trying to get it to talk, so help me.”

Bogert shrugged. “Points up the company problem. I mean, about getting qualified personnel for research. If we had other robopsychologists, we could retire Susan. Incidentally, I presume the directors’ meeting scheduled for tomorrow is for the purpose of dealing with the procurement problem?”

Lanning nodded and looked at his cigar as though it didn’t taste good. “Yes. Quality, though, not quantity. We’ve raised wages until there’s a steady stream of applicants – those who are interested primarily in money. The trick is to get those who are interested primarily in robotics – a few more like Susan Calvin.”

“Hell, no. Not like her.”

“Well, not like her personally. But you’ll have to admit, Peter, that she’s single-minded about robots. She has no other interest in life.”

“I know. And that’s exactly what makes her so unbearable.”

Lanning nodded. He had lost count of the many times it would have done his soul good to have fired Susan Calvin. He had also lost count of the number of millions of dollars she had at one time or another saved the company. She was a truly indispensable woman and would remain one until she died – or until they could lick the problem of finding men and women of her own high caliber who were interested in robotics research.

He said, “I think we’ll cut down on the tour business.”

Peter shrugged. “If you say so. But meanwhile, seriously, what do we do about Susan? She can easily tie herself up with Lenny indefinitely. You know how she is when she gets what she considers an interesting problem.”

“What can we do?” said Lanning. “If we become too anxious to pull her off, she’ll stay on out of feminine contrariness. In the last analysis, we can’t force her to do anything.”

The dark-haired mathematician smiled. “I wouldn’t ever apply the adjective ‘feminine’ to any part of her.”

“Oh, well,” said Lanning, grumpily. “At least, it won’t do anyone any actual harm.”

In that, if in nothing else, he was wrong. The emergency signal is always a tension-making thing in any large industrial establishment. Such signals had sounded in the history of U. S. Robots a dozen times – for fire, flood, riot and insurrection.

But one thing had never occurred in all that time. Never had the particular signal indicating “Robot out of control” sounded. No one ever expected it to sound. It was only installed at government insistence. (“Damn the Frankenstein complex,” Lanning would mutter on those rare occasions when he thought of it.)

Now, finally, the shrill siren rose and fell at ten-second intervals, and practically no worker from the President of the Board of Directors down to the newest janitor’s assistant recognized the significance of the strange sound for a few moments. After those moments passed, there was a massive convergence of armed guards and medical men to the indicated area of danger and U. S. Robots was struck with paralysis.

Charles Randow, computing technician, was taken off to hospital level with a broken arm. There was no other damage. No other physical damage.

“But the moral damage,” roared Lanning, “is beyond estimation.”

Susan Calvin faced him, murderously calm. “You will do nothing to Lenny. Nothing. Do you understand?”

“Do you understand, Susan? That thing has hurt a human being. It has broken First Law. Don’t you know what First Law is?”

“You will do nothing to Lenny.”

“For God’s sake, Susan, do I have to tell you First Law? A robot may not harm a human being or, through inaction, allow a human being to come to harm. Our entire position depends on the fact that First Law is rigidly observed by all robots of all types. If the public should hear, and they will hear, that there was an exception, even one exception, we might be forced to close down altogether. Our only chance of survival would be to announce at once that the robot involved had been destroyed, explain the circumstances, and hope that the public can be convinced that it will never happen again.”

“I would like to find out exactly what happened,” said Susan Calvin. “I was not present at the time and I would like to know exactly what the Randow boy was doing in my laboratories without my permission.”

“The important thing that happened,” said Lanning, “is obvious. Your robot struck Randow and the damn fool flashed the ‘Robot out of control’ button and made a case of it. But your robot struck him and inflicted damage to the extent of a broken arm. The truth is your Lenny is so distorted it lacks First Law and it must be destroyed.”

“It does not lack First Law. I have studied its brainpaths and know it does not lack it.”

“Then how could it strike a man?” Desperation turned him to sarcasm. “Ask Lenny. Surely you have taught it to speak by now.”

Susan Calvin’s cheeks flushed a painful pink. She said, “I prefer to interview the victim. And in my absence, Alfred, I want my offices sealed tight, with Lenny inside. I want no one to approach him. If any harm comes to him while I am gone, this company will not see me again under any circumstances.”

“Will you agree to its destruction, if it has broken First Law?”

“Yes,” said Susan Calvin, “because I know it hasn’t.”

 

Charles Randow lay in bed with his arm set and in a cast. His major suffering was still from the shock of those few moments in which he thought a robot was advancing on him with murder in its positronic mind. No other human had ever had such reason to fear direct robotic harm as he had had just then. He had had a unique experience.

Susan Calvin and Alfred Lanning stood beside his bed now; Peter Bogert, who had met them on the way, was with them. Doctors and nurses had been shooed out.

Susan Calvin said, “Now – what happened?”

Randow was daunted. He muttered, “The thing hit me in the arm. It was coming at me.”

Calvin said, “Move further back in the story. What were you doing in my laboratory without authorization?”

The young computer swallowed, and the Adam’s apple in his thin neck bobbed noticeably. He was high-cheekboned and abnormally pale. He said, “We all knew about your robot. The word is you were trying to teach it to talk like a musical instrument. There were bets going as to whether it talked or not. Some said – uh – you could teach a gatepost to talk.”

“I suppose,” said Susan Calvin, freezingly, “that is meant as a compliment. What did that have to do with you?”

“I was supposed to go in there and settle matters – see if it would talk, you know. We swiped a key to your place and I waited till you were gone and went in. We had a lottery on who was to do it. I lost.”

“Then?”

“I tried to get it to talk and it hit me.”

“What do you mean, you tried to get it to talk? How did you try?”

“I – I asked it questions, but it wouldn’t say anything, and I had to give the thing a fair shake, so I kind of – yelled at it, and –”

“And?”

There was a long pause. Under Susan Calvin’s unwavering stare, Randow finally said, “I tried to scare it into saying something.” He added defensively, “I had to give the thing a fair shake.”

“How did you try to scare it?”

“I pretended to take a punch at it.”

“And it brushed your arm aside?”

“It hit my arm.”

“Very well. That’s all.” To Lanning and Bogert, she said, “Come, gentlemen.”

At the doorway, she turned back to Randow. “I can settle the bets going around, if you are still interested. Lenny can speak a few words quite well.”

 

They said nothing until they were in Susan Calvin’s office. Its walls were lined with her books, some of which she had written herself. It retained the patina of her own frigid, carefully ordered personality. It had only one chair in it and she sat down. Lanning and Bogert remained standing.

She said, “Lenny only defended itself. That is the Third Law: A robot must protect its own existence.”

“Except,” said Lanning forcefully, “when this conflicts with the First or Second Laws. Complete the statement! Lenny had no right to defend itself in any way at the cost of harm, however minor, to a human being.”

“Nor did it,” shot back Calvin, “knowingly. Lenny has an aborted brain. It had no way of knowing its own strength or the weakness of humans. In brushing aside the threatening arm of a human being it could not know the bone would break. In human terms, no moral blame can be attached to an individual who honestly cannot differentiate good and evil.”

Bogert interrupted, soothingly, “Now, Susan, we don’t blame. We understand that Lenny is the equivalent of a baby, humanly speaking, and we don’t blame it. But the public will. U. S. Robots will be closed down.”

“Quite the opposite. If you had the brains of a flea, Peter, you would see that this is the opportunity U. S. Robots is waiting for. That this will solve its problems.”

Lanning hunched his white eyebrows low. He said, softly, “What problems, Susan?”

“Isn’t the corporation concerned about maintaining our research personnel at the present – Heaven help us – high level?”

“We certainly are.”

“Well, what are you offering prospective researchers? Excitement? Novelty? The thrill of piercing the unknown? No! You offer them salaries and the assurance of no problems.”

Bogert said, “How do you mean, no problems?”

“Are there problems?” shot back Susan Calvin. “What kind of robots do we turn out? Fully developed robots, fit for their tasks. An industry tells us what it needs; a computer designs the brain; machinery forms the robot; and there it is, complete and done. Peter, some time ago, you asked me with reference to Lenny what its use was. What’s the use, you said, of a robot that was not designed for any job? Now I ask you – what’s the use of a robot designed for only one job? It begins and ends in the same place. The LNE models mine boron. If beryllium is needed, they are useless. If boron technology enters a new phase, they become useless. A human being so designed would be sub-human. A robot so designed is sub-robotic.”

“Do you want a versatile robot?” asked Lanning, incredulously.

“Why not?” demanded the robopsychologist. “Why not? I’ve been handed a robot with a brain almost completely stultified. I’ve been teaching it, and you, Alfred, asked me what was the use of that. Perhaps very little as far as Lenny itself is concerned, since it will never progress beyond the five-year-old level on a human scale. But what’s the use in general? A very great deal, if you consider it as a study in the abstract problem of learning how to teach robots. I have learned ways to short-circuit neighboring pathways in order to create new ones. More study will yield better, more subtle and more efficient techniques of doing so.”

“Well?”

“Suppose you started with a positronic brain that had all the basic pathways carefully outlined but none of the secondaries. Suppose you then started creating secondaries. You could sell basic robots designed for instruction; robots that could be modeled to a job, and then modeled to another, if necessary. Robots would become as versatile as human beings. Robots could learn!”

They stared at her. She said, impatiently, “You still don’t understand, do you?”

“I understand what you are saying,” said Lanning.

“Don’t you understand that with a completely new field of research and completely new techniques to be developed, with a completely new area of the unknown to be penetrated, youngsters will feel a new urge to enter robotics? Try it and see.”

“May I point out,” said Bogert, smoothly, “that this is dangerous. Beginning with ignorant robots such as Lenny will mean that one could never trust First Law – exactly as turned out in Lenny’s case.”

“Exactly. Advertise the fact.”

“Advertise it!”

“Of course. Broadcast the danger. Explain that you will set up a new research institute on the moon, if Earth’s population chooses not to allow this sort of thing to go on upon Earth, but stress the danger to the possible applicants by all means.”

Lanning said, “For God’s sake, why?”

“Because the spice of danger will add to the lure. Do you think nuclear technology involves no danger and spationautics no peril? Has your lure of absolute security been doing the trick for you? Has it helped you to cater to the Frankenstein complex you all despise so? Try something else then, something that has worked in other fields.”
